Despite a wider variety of games than ever before, video games don’t have the same effect on me as they used to. That might not sound like a problem to some of you, but it is to me. I have played video games from the early days of my childhood, starting somewhere around the late ’80s. I became heavily addicted to my Game Boy as a kid, and I can still remember the thrill I felt the day I bought my first PlayStation 22 years ago.
Gaming was like breathing. It was the biggest part of my life as a teenager, one of my priorities as a college student, and eventually one of my most expensive “hobbies” as a young professional.
Then all of a sudden, after thousands of hours spent playing across genres and platforms, boredom hit me hard for the very first time in my early thirties. Some of my favorite games soon started to feel terribly long. I couldn’t help but notice all the repeated tropes and design similarities between franchises.
I figured it was just a matter of time before I found the right game to stimulate my interest again, but time continued to go by and nothing changed.
My 41-year-old cousin had dealt with the same thing years before me, and he had a simple explanation: “Now it’s you who has to worry about rent and bills, not your dad.” Deep inside I knew he was right. The more responsibilities, problems, and stress in life, the more we lose our appetite for things that used to entertain us, gaming included. But could there be something more to it than that? Games themselves were also changing. As technology enabled things I wouldn’t have dreamed were possible as a kid, it created entirely new platforms for gaming.
On top of that, as people grow older, there are inevitable changes that influence how we see games—including things as simple as needing glasses to actually play the game. The gradual loss of focusing ability for near vision as we age, known as presbyopia, makes gaming on smartphones and portable game consoles a nightmare for older people. Factors like this help explain why devices such as the PS Vita are created almost exclusively for younger players.
So, are the changes I've been experiencing some kind of fluke or an inevitable process for many? Can you in essence age out of gaming?
I’m far from the only former gamer thinking about this reality—there have been several good scientific studies on the matter within the last decade, in fact. One in particular seemed to answer all my initial questions. Collecting information from a staggering 239,000 gamers in recent years, market research company Quantic Foundry has been closely examining how gaming motivations and preferences change with age via its Gamer Motivation Profile.
The research suggests that many things occur as we age, all pushing people a bit further away from gaming. It’s not that we wake up one morning thinking video games are boring all of a sudden; instead, everything happens slowly, through a lengthy process that most of us don’t even notice. Generally, QF found that the older a gamer gets, the less enthusiastic they become about gaming—respondents were less likely to label gaming as “very important/enjoyable,” and their preferences shifted toward specific gaming experiences (less social and more solo gaming, for instance) not as popular among younger gamers.
While this could be due solely to a decline in enthusiasm for some, an alternative explanation is that many older gamers—especially when they have long-term partners, become parents, or take on highly demanding professional positions—simply have different priorities and less time for gameplay. And when you have plenty of other things to do, it’s easy to avoid activities that appear to be either too easy and thus boring, or too difficult and thus frustrating.
Beyond a simple matter of time, gaming doesn’t necessarily provide a sense of accomplishment, either. There is a general lack of reward when it comes to gaming—at least in terms of how adults interpret the concept of reward. The repetition of an activity that doesn’t produce a visible benefit can decrease the feeling of novelty, and that’s when boredom may strike. Breaking virtual records and topping the scoreboards may be bigger goals for a teenager, but for a young professional in their early thirties, something with a concrete output—think cooking, painting, gardening, or bodybuilding—may appear to be much more rewarding. Studies (like another good one from Florida State researchers in 2014) back this up, showing that certain adults simply don’t view gaming as a productive activity on which to spend their free time. Many people in their thirties and forties may get more out of a dance or yoga class instead.
For those of us who won’t stop playing video games, it bears keeping in mind that game mechanics tend to be defined by the core gaming market (typically male, ages 18 to 30). So if your tastes in games change in a way that mimics the QF data trends, you may find yourself in the broad “casual gamer” category with fewer high-profile new games being specifically targeted to your tastes.
“One long-term trend we’ve highlighted in our talks and blog posts is that we have a generation of folks who grew up with gaming, are now 35+, and most likely won’t stop gaming. However, their tastes in gaming have changed,” Dr. Nick Yee, cofounder and analytics lead at Quantic Foundry, told Ars.
One of the changes in taste Yee points to is that aging gamers tend to gravitate toward more casual and less competitive games. “Picking up a competitive shooter takes time to practice and a strong desire to compete, both of which older gamers have less of,” Yee says. Of the 12 motivations in Quantic Foundry’s model used to identify a gamer’s preferences throughout the years, the appeal of competition declines the most with age. It dips even faster than the desire for destruction, excitement, and challenge—all typically seen as young gamers’ motivations.
“The most dramatic shift in gaming motivations (for both men and women) is in competition—the appeal of duels, matches, and leaderboard rankings,” Yee tells Ars. “Gamers are much less driven by competition as they get older, and this motivation drops the most between ages 15 and 25 and levels off around age 32-plus.”
So the QF data shows that competition is a young person's sport. And just as with real sports, competitive titles tend to require time to practice, something that most gamers don’t have in big doses in their thirties and forties. Furthermore, competitive gaming requires rapid reaction times and precise mechanical skills, both of which decline with age.
A University of Michigan study within the last decade confirmed that as we age our brain connections break down, slowing our physical response times. But at what age does a person’s reaction time begin to change enough to have a serious effect on gaming skills?
In the Michigan study, scientists measured the response times of adults over age 65 and compared them against those of a group of players ages 18 to 30. Researchers then used a functional MRI to image the blood-oxygen levels in different parts of the brain, a measurement of brain activity. The difference in physical response times between the two age groups was great—but there were no significant differences between a thirty-year-old gamer and a teenager. In other words, when gamers in their thirties think they can’t keep up with their younger counterparts and lose interest in competitive gaming, the barrier is psychological rather than the experience of aging.
Strategy, on the other hand, is a gaming genre that spans generations, according to the data. QF has found that the strategy motivation changes the least with age. If you’re like me, this makes it a little clearer why your once-favorite survival horror, racing, first-person shooter, fighting, and massively multiplayer online games have been replaced by real-time strategy games, online puzzles, and brain teasers. So while younger players feel more strongly about other motivations, strategy comes to the forefront as other preferences fade over time. The result is that a lot of older gamers are excited by “careful decision-making and planning” (QF’s survey wording) in a game as opposed to facing real-life consequences when they fail in analog real-world scenarios.
This seems pretty logical if you take into account that, by the time you’re old enough for marriage or parenting, you usually have to help make decisions that determine the fate of your whole family and not just your own future. Strategizing and planning wisely in the gaming world could possibly help you find new ways to think about the real world, although I might be engaging in wishful thinking in this case.
Despite all the factors lessening the role of gaming in our lives as we age, there may be an upside to consciously pushing against those QF trends. It’s not an exaggeration to say video games could actually help keep the brain agile as we grow old. A recent US National Institutes of Health-funded study involving nearly 3,000 retirees used a training exercise to test the brain’s perception, decision-making, plasticity, reasoning, and recall. Impressively, those who played video games appeared to have a 29-percent lower chance of developing dementia.
Even more encouraging are the findings of another study from the Université de Montréal, which showed that elderly people can stave off Alzheimer’s disease by playing video games only 15 minutes per day, three times a week. Scientists describe the results of both studies as promising, but each team noted the findings still have to be replicated.
Findings like this possibly explain why 48 percent of adults over the age of 50 play video games, at least according to the Entertainment Software Association (ESA). The organization’s 2015 report, titled “Gamers Over 50 Study: You’re Never Too Old to Play,” found that 80 percent of these gamers play at least once a week, while 45 percent play every day. Theoretically, those numbers should only increase as a generation raised on PCs and early consoles ages.
Talking with Ars, QF’s Dr. Yee suggests that many older people play video games simply because they consider it a creative and fun way to kill their free time. The majority of them likely ignore (or are simply unaware of) the potential benefits of gaming when it comes to aging. “A group of people who have a lot of disposable time are older folks past retirement age or once their kids are fully grown. So senior-living homes or community centers also provide the perfect context for local co-op/competitive games,” he tells Ars. “Even back when I was studying MMOs [Massively Multiplayer Online Games] in the early 2000s, about 25 percent of MMO gamers were over 32 years old and played on average 20-plus hours each week. I regularly had gamers age 60-plus filling out my MMO surveys.”
However, science doesn’t have an adequate answer as to why so many adults who play video games are hesitant to consider themselves avid gamers. In the same year (2015) that the ESA released its older-gamers report, Pew Research found only 10 percent of adults who play video games self-identified as gamers. One possible explanation is that a lot of our assumptions about gamers are still rooted in the “teenage basement dweller” stereotype. So, despite recent studies showing that a respectable number of people who play video games are mature adults of all sexes, there’s a tendency in society to depict the typical gamer as a male teenager who worships the latest titles. While this appears to be true for certain aspects of the market (looking at you, EA Sports), there’s a reason games like World of Tanks (which, while competitive, is a strategic MMO) remain very popular among gamers age 40-plus, Dr. Yee insists.
So, is my fate sealed? Do a majority of games simply get boring for us at a certain age? At that point, are you no longer a gamer? Beyond labels and self-identification, the data shows that having a good time playing games has nothing to do with age, gender, nationality, or any other identifying factor. Instead, the reality looks simple once you zoom in: past a certain age, we are no longer the primary target audience most game developers have in mind. And as our time, attention, and thoughts face other demands beyond gaming, the type of gaming experiences we seek out shifts accordingly. The exact same thing likely happens with music, literature, TV, and film as well, but this doesn’t mean we have to stop listening or watching, right? We just get older and change—or, as I might once have phrased it, we level up.
Theo is a law graduate and a freelance writer. He's the founder of Theo-Cracy blog, where you can find more of his work and contact him directly. When he’s not working, he usually plays video games, reads graphic novels, travels around the world, or investigates the culture of science fiction. He previously wrote about games focused on solving scientific challenges at Ars.