Can You Really Be Addicted to Video Games?

Charlie Bracke can’t remember a time when he wasn’t into video games. When he was 5, he loved playing Wolfenstein 3D, a crude, cartoonish computer game in which a player tries to escape a Nazi prison by navigating virtual labyrinths while mowing down enemies. In his teenage years, he became obsessed with more sophisticated shooters and a new generation of online games that allowed thousands of players to inhabit sprawling fantasy worlds. Ultima Online, World of Warcraft, The Elder Scrolls — he would spend as much as 12 hours a day in these imaginary realms, building cities and fortifications, fighting in epic battles and hunting for treasure.

During his childhood, Bracke’s passion for video games, like that of most young Americans, didn’t cause him any serious problems. At school, he got along with just about everyone and maintained straight A’s. His homework was easy enough that he could complete it on the bus or in class, which allowed him to maximize the time he spent gaming. After school, he would often play video games for hours with his cousin and a small group of close friends before going home for dinner. Then he would head to the den and play on the family computer for a few more hours before bed. When his parents complained, he told them it was no different from their habit of watching TV every night. Besides, he was doing his homework and getting good grades — what more did they want? They relented.

When Bracke went to Indiana University Bloomington, everything changed. If he skipped class or played games until 3 in the morning, no one seemed to care. And only he had access to his grades. After a difficult breakup with a longtime high school girlfriend and the death of his grandmother, Bracke sank into a period of severe depression. He started seeing a therapist and taking antidepressants, but by his junior year, he was playing video games all day and seldom leaving his room. He strategically ignored knocks at the door and text messages from friends to make it seem as though he were at class. Eventually, he was failing most of his courses, so he dropped out and moved back in with his parents in Ossian, Ind., a town of about 3,000 people, where he got a job at Pizza Hut.

His life there fell into a familiar rhythm: He woke up, went to work, returned home, played video games until late and repeated the whole cycle. “It did not strike me as weird at all,” he recalls. It felt a lot like high school, but with work instead of classes. “And the time I used to spend hanging out with friends was gone, because they had moved to different areas,” he says. “And I kind of just thought that was the way of the world.”

When Bracke was 24, he decided to get his real estate license and move from Indiana to Virginia to work at the same brokerage as his brother Alex, a decision that led to another breakup with another girlfriend and a deep sense of loneliness in a town where, once again, he had no friends. He eventually got in touch with his ex, hoping she would take him back, only to find out that she was dating someone else. “At that point, I lost it,” he says. By his estimate, he started playing video games about 90 hours a week. He did the bare minimum of work required to pay his bills. When it was time to log his progress in the brokerage’s internal system, he would just make something up: sent an email to this client; left a voice mail message for that one.

His employer got wise to the scheme and put Bracke on probation. Realizing he had a problem, Bracke dismantled his computer, stashed the pieces among a bunch of storage boxes in the garage and tried to focus on work. About a month later, after making a big sale, he talked himself into celebrating by playing League of Legends for an evening. He retrieved the components of his computer, reassembled them and started gaming around 6 p.m. Ten hours later, he was still playing. The week slipped away. He kept playing.

— Photograph by Damon Casarez for The New York Times

In May, the World Health Organization officially added a new disorder to the section on substance use and addictive behaviors in the latest version of the International Classification of Diseases: “gaming disorder,” which it defines as excessive and irrepressible preoccupation with video games, resulting in significant personal, social, academic or occupational impairment for at least 12 months. The latest edition of the Diagnostic and Statistical Manual of Mental Disorders, the American Psychiatric Association’s clinical bible, recognizes “internet gaming disorder” — more or less the same thing — as a condition warranting more research.

The W.H.O.’s decision has received substantial pushback, in part because the modern meaning of “addiction” is an uneasy amalgam of several contradictory legacies: a religious one, which has censured excessive drinking, gambling and drug use as moral transgressions; a scientific one, which has characterized alcoholism and drug addiction as biological diseases; and a colloquial one, which has casually applied the term to almost any fixation. People have written about behavioral addictions — to eating, sex and gambling — for centuries. In recent decades, some psychiatrists and counselors have even specialized in their treatment. But the idea that someone can be addicted to a behavior, as opposed to a substance, remains contentious.

Predictably, some of the W.H.O.’s staunchest critics are leaders in the gaming industry, many of whom fear that the new diagnostic label will further stigmatize their products, which have been smeared as promoting slothfulness, social ineptitude and violence. A sizable faction of scientists also disputes the idea that video games are addictive. The arguments against the validity of video-game addiction are numerous, but they generally converge on three main points: Excessive game play is not a true addiction but rather a symptom of a larger underlying problem, like depression or anxiety; the notion of video-game addiction emerges more from moral panic about new technologies than from scientific research and clinical data; and making video-game addiction an official disorder risks pathologizing a benign hobby and proliferating sham treatments. “It’s absolutely not an addiction,” says Andrew Przybylski, director of research at the Oxford Internet Institute. “This whole thing is an epistemic dumpster fire.” People enjoy and sometimes form all-consuming passions for countless activities — fishing, baking, running — and yet we don’t typically pathologize those.

Throughout history, technological innovations and new forms of entertainment have consistently provoked alarmism. In Plato’s “Phaedrus,” Socrates remarked that writing would “create forgetfulness in the learners’ souls, because they will not use their memories.” In the late 1800s, the sordid tales in penny dreadfuls and dime novels were blamed for juvenile crime. In 1906, the composer John Philip Sousa lamented the “menace of mechanical music,” worrying that children might become “simply human phonographs” without “soul or expression.” As technologies proliferated at an overwhelming rate, so did concerns about their potential harm. Trains, electricity, phones, radios, personal computers: All have been subjected to technophobia. Given the long history of hysteria surrounding technology, it’s tempting to agree with those who dismiss claims that video games are addictive. After all, millions of people around the world enjoy video games without any marked repercussions; some studies have even concluded that the right kind of game play can relieve symptoms of depression and anxiety.

But these denials become more difficult to accept when juxtaposed with the latest research on behavioral addictions. A substantial body of evidence now demonstrates that although video-game addiction is by no means an epidemic, it is a real phenomenon afflicting a small percentage of gamers. This evidence has emerged from many sources: studies indicating that compulsive game play and addictive drugs alter the brain’s reward circuits in similar ways; psychiatrists visited by young adults whose lives have been profoundly disrupted by an all-consuming fixation with gaming; striking parallels between video games and online gambling; and the gaming industry’s embrace of addictive game design.

Timothy Fong, a professor of addiction psychology at the University of California, Los Angeles, says he is convinced that video-game addiction is real. “It’s quite possible and common to have both addiction and another mental or behavioral disorder simultaneously,” he told me, like depression or anxiety. “At least half the time, compulsive gamers come in with clinical histories and mind-sets that are essentially the same as patients with heroin addiction, alcoholism or gambling disorder. They have all the hallmarks.”

The debate over video-game addiction is about much more than diagnostic nomenclature; at its center is a shifting scientific understanding of addiction itself. For too long the concept of addiction has been fettered by models and frameworks too meager to accommodate its complexity. Addiction has been attributed solely or primarily to weak willpower, or neural circuitry gone awry, or the inherent dangers of drugs themselves. In both the medical community and the public consciousness, the conflation of addiction and chemical dependency has stubbornly persisted.

Researchers in a wide variety of fields — from psychology to public health — are increasingly pushing back against the reductive schema of the past. Addiction is no longer considered synonymous with physiological dependence on a substance, nor can it be reduced to the activity of neurons in a few regions of the brain. Rather, experts now define addiction as a behavioral disorder of immensely complex origins. Addiction, they say, is compulsive engagement in a rewarding experience despite serious repercussions. And it results from a confluence of biology, psychology, social environment and culture. In this new framework, addictions to certain types of modern experiences — spinning virtual slot machines or completing quests in a mythical realm — are entirely possible. In the case of video-game addiction, the most vulnerable population seems to be young men like Bracke.

— Nate Bowman (right), 20, photographed with Wren Viele (left), 18, in September at reStart’s campus in Carnation, Wash.


Shortly after Bracke’s employers put him on probation, his parents, Sally and Steve, visited him in Virginia. One day, while driving back from the grocery store, Sally worked up the courage to ask her son a question that had been troubling her for some time: “Charlie, are you a gaming addict?” She was terrified of using that word — “addict” — terrified that Bracke would perceive it as an accusation and that their relationship would suffer for it. Bracke contemplated the question silently for a long time as they drove. In truth, the thought had occurred to him, but he had never taken it seriously, let alone said it out loud. Finally he answered: “Yeah, I think I might be.” Back home, he found an online questionnaire that assessed whether someone was an alcoholic. Wherever the quiz mentioned drinking, Bracke substituted gaming. He needed to answer yes to only a few of the questions to qualify as an addict; he affirmed almost all of them.

In the spring of 2015, Bracke was officially kicked off his real estate team. That summer, he stayed at his brother Alex’s house to take care of the dogs while Alex and his wife and son were on vacation. On the first day of his stay, he suddenly realized that his brother’s life — the home, the family, the steady job and income — was everything he wanted and would never have. It was a startling epiphany and the prelude to a period of profound self-loathing. He discontinued his antidepressants because he didn’t think he deserved them. He stopped bathing regularly. He left his brother’s house just twice in nine days, to grab snacks and frozen pizzas from a nearby grocery store. Gaming was the only thing that distracted him from his mental anguish. Nothing felt as good as gaming; nothing else felt good.

By August, he had a detailed suicide plan. He decided he would kill himself in November, around the same time of year his grandmother died; that way, he reasoned, his mother would have to endure only one morbid anniversary. About two months before Bracke intended to take his own life, his parents returned to Virginia to celebrate their grandchild’s birthday. They surprised Bracke with a visit one afternoon. Although they knew their son was struggling, they didn’t know the extent of it. They were shocked at the state of his apartment — cluttered with clothes, trash and empty pizza boxes — and Bracke’s own bedraggled appearance. He knew his gaming had become a terrible problem, he told them, but he felt powerless to stop.

In the following weeks, Sally called every rehab center and addiction hotline number she could find, searching for a program that recognized video-game addiction and knew how to treat it. Every single center turned her away, saying they didn’t offer treatment for her son’s condition. She called so many organizations — some of which used the same telephone switchboards — that she ended up speaking to certain individuals multiple times without realizing it. One day, an exasperated operator interrupted her sobs to tell her that they had already spoken and that he had some good news: His supervisor had recently mentioned a new rehab center in Washington State called reStart, which specialized in internet and video-game addiction.

Bracke and his parents were overjoyed to have finally found some recourse — but the price was staggering. It would cost about $22,000 for the minimum stay of 45 days, and their health insurance wouldn’t cover it. (At the time, there was no official diagnostic code for gaming addiction.) “I remember at one point saying we don’t know how we can afford this, and at the same time we don’t know how we can afford not to,” Bracke’s father told me. Ultimately, they decided to remortgage their house.

In the 1950s, the American psychologist James Olds and the Canadian neuroscientist Peter Milner performed a landmark experiment. They implanted electrodes in various parts of rats’ brains and placed the animals in boxes equipped with levers. Whenever the rats pressed a lever, their brains received a brief jolt of electricity. Zapping some areas of the brain did not change the animals’ behavior, whereas stimulation in other regions seemed to make them avoid the levers. When the researchers placed electrodes near a part of the brain known as the nucleus accumbens, something remarkable happened: The rats became fixated on the levers, pressing them as often as 80 times a minute for as long as 24 consecutive hours. Olds, Milner and other scientists showed that rats would gallop uphill, leap hurdles and even forsake food in order to keep stimulating that region of the brain. It seemed that science had located the brain’s pleasure center, the hypothesized area that made it feel so good to do things conducive to survival and reproduction, like having sex or eating calorie-rich meals. Perhaps, some scientists proposed, addictive drugs had some effect on this same area.

In the following decades, as the tools of neuroscience improved, researchers formed a more complete map of the brain’s reward system, which is a constellation of neural circuits involved in attention, motivation, desire and learning. Studies revealed that healthy rats became obsessed with drug-dispensing levers, but rats whose reward circuits had been disrupted showed little to no interest. Related experiments singled out the neurotransmitter dopamine as the most important chemical messenger in the reward system, demonstrating how certain addictive drugs drastically increased the amount of dopamine traveling between neurons. With neuroimaging techniques developed in the 1990s, scientists could watch the brain’s reward center respond almost instantly to an injected drug and examine how the brain’s structure and behavior changed with continued use. In parallel, scores of studies identified heritable gene sequences that seemed to be associated with an increased risk for addiction.

These findings formed the core of what has come to be called the brain-disease model of addiction, which has been embraced by most major health organizations, including the National Institute on Drug Abuse and the American Medical Association. According to this model, addiction is a chronic disease of the brain’s reward system caused by continual exposure to particular substances and the dopamine release they trigger. The brain compensates by producing less dopamine in general and becoming less sensitive to it over all, forcing the user to take even larger doses to experience the same level of reward — a development known as tolerance. The neurochemical chaos produced by continued drug use also degrades the neural pathways that connect the reward center to the prefrontal cortex, which is crucial for planning, managing emotions and controlling impulses. The longer an addiction progresses, the higher someone’s tolerance, the stronger their cravings and the harder it may be to quit without relapsing.

From the 1990s to the late 2000s, neuroscientists demonstrated that many of the neurobiological changes underlying drug addiction occurred in pathological gamblers as well. For most of the 20th century, the psychiatric community regarded pathological gambling as a disorder of impulse control — more related to compulsive tics than to addiction. As scientists developed a more sophisticated understanding of the biology underlying addiction, however, many mental-health experts began to change their minds. Like certain drugs, gambling elicits a surge of dopamine in the reward circuit. Over time, compulsive gambling diminishes the ability to experience reward and inhibits circuits in the prefrontal cortex that are crucial for impulse control.

Studies of Parkinson’s disease provided further confirmation. Between 3 and 6 percent of people with Parkinson’s are compulsive gamblers, which is substantially higher than the estimate of 0.25 to 2 percent of the general population. Parkinson’s, which results in part from the death of dopamine-secreting neurons in the midbrain, is sometimes treated with the drug levodopa, which increases the amount of dopamine in the brain and nervous system. Some researchers have proposed that by raising dopamine levels, levodopa essentially mimics certain aspects of addiction, making the brain more susceptible to risk-taking and compulsive behavior. In 2013, after reviewing the mounting evidence, the American Psychiatric Association moved gambling disorder to the addictions section of the D.S.M.

In the last 10 years, scientists have been making similar discoveries about compulsive gaming. Neuroimaging studies have confirmed that video games trigger a release of dopamine in the reward circuit and that dopamine does not behave as it should in the brains of compulsive gamers. In a study performed in China, frequent gamers displayed unusually low activity in their reward circuits when anticipating a monetary prize. Some researchers think an inherently unresponsive reward system predisposes people to addiction by pushing them to seek big thrills; others interpret it as an early sign of tolerance. Last year, the psychologist Daria J. Kuss, part of the International Gaming Research Unit at Nottingham Trent University, and her colleagues published a review of 27 studies investigating the neurobiological correlates of compulsive gaming. They concluded that, compared with healthy individuals, compulsive gamers exhibit worse memory, poorer decision-making skills, impaired emotion regulation, inhibited prefrontal cortex functioning and disrupted electrochemical activity in their reward circuits — all similar to what researchers have documented in people with drug addictions.

“I don’t think we as psychologists can be justified in saying gaming addiction doesn’t exist,” Kuss told me. “From my experience of researching it for over 10 years, I can tell you I am very sure that this is indeed a real addiction requiring professional help.”

There’s a danger, though, in making neuroscience the ultimate arbiter of addiction. In the past decade, many researchers have argued persuasively that the brain-disease model of addiction has gained more prominence than it deserves. Neuroscientists have discovered that the relationship between the reward circuit and addiction is much more convoluted than is typically acknowledged. It turns out, for example, that only some addictive drugs, namely cocaine and amphetamine, dependably provoke huge releases of dopamine; many others — including nicotine and alcohol — do so inconsistently or hardly at all. Moreover, dopamine is not as closely linked to pleasure as once thought; it is much more important for wanting than liking, for anticipating or seeking out a reward than for enjoying it. And dopamine is involved in far more than reward and motivation; it is also important for memory, movement and immune-system regulation. But the explanatory power of neurobiology is so appealing that the basic tenets of the brain-disease model have seeped into public consciousness, popularizing a somewhat reductive understanding of addiction.

Sally Satel, a psychiatrist and Yale University lecturer, puts it this way: “Addiction is not a brain problem. It’s a human problem.” Derek Heim, an addiction psychologist at Edge Hill University in England, agrees completely: “People get very excited when they see pictures of a brain, but we’ve overextended that explanation. We need to think of addiction as an extremely multifaceted problem.” Video-game addiction perfectly exemplifies this multiplicity. It’s not just a biological phenomenon — it’s a cultural one too.


— Thomas Kuhn, 19, in September at reStart’s campus in Fall City, Washington.


When Bracke was born in the late 1980s, video games were still being assimilated into mainstream American culture. Today they are ubiquitous. Globally, more than two billion people play video games, including 150 million Americans (nearly half the country’s population), 60 percent of whom game daily. Professional athletes routinely perform goofy victory dances from Fortnite. Game Informer has the fifth-highest circulation of any American magazine, surpassed only by AARP’s Magazine and Bulletin, Costco Connection and Better Homes & Gardens. When Grand Theft Auto V was released in September 2013, it generated $1 billion of revenue within three days. No single entertainment product has ever made so much money in so little time. Video games are now one of the most lucrative sectors of the entertainment industry, having overtaken film, television, music and books. Games are also the most popular and profitable type of mobile app, accounting for about a third of all downloads and 75 percent of Apple’s App Store revenue.

A typical gamer in the United States spends 12 hours playing each week; 34 million Americans play an average of 22 hours per week. About 60 percent of gamers have neglected sleep to keep playing, and about 40 percent have missed a meal. Somewhere around 20 percent have skipped a shower. In 2018, people around the world spent a collective nine billion hours watching other people play video games on the streaming service Twitch — three billion more hours than the year before. In South Korea, where more than 95 percent of the population has internet access and connection speeds are the fastest in the world, compulsive game play has become a public-health crisis. In 2011, the South Korean government passed the Shutdown Law, which prevents anyone under 16 from playing games online between midnight and 6 a.m.

Video games are not only far more pervasive than they were 30 years ago; they are also immensely more complex. You could easily spend hundreds of hours not only completing quests but also simply exploring the vast fantasy kingdom in The Legend of Zelda: Breath of the Wild, a gorgeously rendered virtual world in which every blade of grass responds to the pressure of a footstep or the rush of a passing breeze. Fortnite attracted a large and diverse audience by blending the thrill of live events with the strategic combat and outrageous weaponry of first-person shooters, airbrushing it all with a playful cartoon aesthetic. In The Witcher 3: Wild Hunt, the choices players make change the state of the world and ultimately steer them toward one of 36 possible endings. All games — whether tabletop, field or electronic — are simulations: They create microcosms of the real world or gesture at imaginary ones. But these simulations have become so expansive, intricate and immersive that they can no longer be labeled mere entertainment, no more engrossing than an in-flight movie or a pop song. They are alternate realities.

Even games that are intentionally designed with a retro feel can be surprisingly absorbing. Take Stardew Valley, a quaint farming game with 16-bit graphics that reminded me of the early Pokémon titles for Game Boy. Apart from Candy Crush and word puzzles, I hadn’t spent much time playing video games since high school, so, while reporting this article, I decided that Stardew Valley might be an appealing way to reacquaint myself. It seemed like the kind of thing I could play for an hour here or there as my schedule allowed. The premise is simple: You leave your soul-deadening corporate office job and move to the country to revive your grandfather’s neglected farm. It seems refreshingly, perhaps deliberately, distinct from all the frenzied and ultracompetitive first-person shooters and survival games. Each day in the game equals about 17 minutes of real-world time, so a week passes in just under two hours.

At first, I was enchanted by the game’s pastoral setting and its emphasis on collaboration, compassion and discipline. As I became more deeply invested in my pixelated life, though, my attitude changed. I started to lose patience with my neighbors and their daily prattle and stopped noticing all the thoughtful details that once delighted me: the soft glow of fireflies on summer evenings, the fleeting shadow of an owl in flight, the falling petals in spring. And what disturbed me most was the sheer quantity of time I was pouring into the game. It was so easy to play continuously through an afternoon or an evening, in part because the great satisfaction of my achievements was so disproportionate to the effort I expended. I found it extremely difficult to stop playing, even at “nighttime,” when my character went to sleep, which doubles as a natural point to save your progress and quit. If I kept going, just another 20 minutes, I could get so much done. Compared with the game, everything else in my real life suddenly seemed so much harder — and so much less gratifying.

The fact that video games are designed to be addictive is an open secret in the gaming industry. With the help of hired scientists, game developers have employed many psychological techniques to make their products as unquittable as possible. Most video games initially entice players with easy and predictable rewards. To keep players interested, many games employ a strategy called intermittent reinforcement, in which players are surprised with rewards at random intervals. Some video games punish players for leaving by refusing to suspend time: In their absence, the game goes on, and they fall behind. Perhaps the most explicit manifestation of manipulative game design is the rising popularity of loot boxes, which are essentially lotteries for coveted items: a player pays real money to buy a virtual treasure box, hoping it contains something valuable within the world of the game.

As modern video games have become so immersive, their carefully composed dreamscapes have begun to offer a seductive contrast to the indifferent, and sometimes disappointing, world outside screens. Games, by definition, have rules; goals are often explicitly defined; progress is quantified. Although games frequently put players in challenging situations, they continuously offer tutorials, eliminate real-world consequences of failure and essentially guarantee rewards in exchange for effort. Games imbue players with a sense of purpose and accomplishment — precisely the kind of self-worth that can be so hard to attain in their actual lives, especially in a job market that can be punishing for the inexperienced.

From 2014 to 2017, American men in their 20s worked 1.8 fewer hours per week than they had in the three-year period 10 years earlier; in tandem, they increased the time they spent playing video games by the exact same amount. One economics study suggests that this correlation is not a coincidence — that young American men are working less in order to game more. For young men like Bracke, who have either not completed a four-year college degree or have not found work equal to their education and skills, video games can become something like a surrogate occupation — a simulacrum of success. Why suffer in a world that has no place for you when you can slip so easily into one that is designed to keep you happy, and is more than happy to keep you?


— Walker Wadsworth, 22, in September at reStart’s campus in Fall City, Washington.


On the evening of Oct. 21, 2015, a relative picked up Bracke from the Seattle airport and dropped him off at reStart’s main facility, a large two-story blue house in Fall City, Wash., surrounded by a garden and acres of forest. He checked in with the staff, dropped off his luggage in the house and joined a small group of young men sitting around a campfire. They were eating hot dogs and conducting their evening meeting, a ritual Bracke would come to know well: Each took his turn sharing what he accomplished that day and what he planned for the next. “Some of the guys, just to help me feel more comfortable, told part of their history about how they ended up here,” Bracke recalls. “Just being around other people who had gone through what I had gone through and knew what it felt like made a huge difference. I felt accepted. It almost sounds corny to say it, but I got there and immediately felt I had made the right choice.”

Because video-game addiction is a relatively new disorder, there are few published studies examining how best to treat it. Some clinicians warn that rehab programs and retreats focused on internet and video-game addiction make unsubstantiated claims, give people false hope and take advantage of desperate parents and adolescents. (In China, there have been disturbing reports of internet-addiction boot camps that use electroconvulsive therapy and corporal punishment, resulting in at least one teenager’s death.) But many compulsive gamers and their families counter that they have no other viable options; treatment centers focusing on substance abuse or gambling addiction often decline to help them or cannot provide a recovery environment that they think is suitable. ReStart opened in 2009, and it remains one of the few dedicated long-term rehabilitation programs for internet and video-game addiction in the United States. Hilarie Cash, one of reStart’s founders, estimates that 80 percent of clients complete Phase I and that 70 percent complete Phase II. Former clients think the numbers may be a lot lower than that; they have seen many friends relapse or leave the program early.

Bracke spent about seven weeks at the house in an initial “detox” phase, following a strict schedule of chores, exercise, meals, group meetings and therapy sessions. Lights out by 10:30 p.m. No cellphones or computers. One landline for outgoing calls. The program forced him to try new activities — hiking, camping, Frisbee golf — many of which he enjoyed. He developed a “life balance plan,” which focused on strategies for responsible technology use after the program, for example forgoing a computer and limiting internet access. And he learned how to have difficult conversations using a “wheel of communication,” which required him to verbalize what he was feeling and thinking and to reiterate what the other person in the conversation had just said. “Toward the end of my time gaming, I was getting to the point where I felt like I couldn’t converse with people at all, unless it was about video games,” Bracke says. “So going through something like that really made it click that I can actually talk to people, I can really communicate with them.”

A huge component of reStart’s philosophy is the importance of maintaining relationships. “These guys have almost universally what I would call an intimacy disorder,” Cash told me. “They don’t really know how to build and maintain intimate relationships. The solution to addiction is connection. We are building a real recovery community with our guys. It’s all about building friendship and community that is face to face, in person, rather than online.”

Of course, for many people video games are explicitly and gratifyingly social. A raucous multiplayer game like Fortnite can bring large groups of friends and neighbors together online or in someone’s living room. People who struggle with severe social anxiety, or who cannot regularly leave their homes, may find camaraderie through an avatar. But video games are a poor substitute for meaningful human companionship. Virtual interactions are often stripped of behavioral cues and facial expressions; masked identities empower people to mistreat one another; and it’s easier to vanish from someone’s life if you’ve never met them. Games, like online social networks in general, sometimes provide the semblance of genuine connection while actually pushing someone into a dangerously secluded way of life.

The economic and cultural ascendancy of video games has collided with a social crisis that we are only beginning to understand: the isolation, emotional stagnation and profound loneliness of American men. Recent surveys indicate that loneliness is reaching epidemic proportions among Americans. According to a 2018 Cigna survey, more than 40 percent of Americans feel that their relationships are not meaningful and that they are generally isolated from others; 20 percent rarely or never feel close to anyone. Young adults between 18 and 22 score higher on scales of loneliness than any other group.

There’s good reason to think that single men are uniquely vulnerable to social isolation and its repercussions. Studies suggest that men rely primarily on a partner for emotional intimacy, whereas women are more likely to have additional support from close friends; men in their late 30s lose friends at a faster rate than women; and men are more likely to kill themselves because of prolonged emotional or social detachment. In three decades of research, Niobe Way, a professor of developmental psychology at New York University, has observed a striking pattern of behavior among American boys: in early adolescence, they are openly affectionate with one another, speaking freely of love and lifelong bonds; by late adolescence, as they are conditioned to project an image of masculinity, heterosexuality and stoicism, they start to distance themselves from their same-sex friends. One 17-year-old told Way that “it might be nice to be a girl, because then you wouldn’t have to be emotionless.”

And while addiction was once regarded as a kind of vice or chemical thrall — and in more recent decades has been framed as dysfunctional neural circuitry — there is now a substantial body of research contextualizing addiction as a consequence of social isolation. People who are deprived of a dependable social network, or who have severe difficulty connecting with others, have a much higher risk of both developing an addiction and relapsing. Addiction itself can drastically magnify loneliness. Video-game addiction afflicts between 1 and 8 percent of gamers, according to estimates published by researchers. As a group, gamers are now more diverse than ever, comprising a wide range of ages and nearly equal numbers of men and women. Yet as evidenced by both scientific studies and the experiences of clinical psychiatrists, self-identified video-game addicts are overwhelmingly male. To be more specific, they are typically single young adult men — the very segment of the population that may be most prone to social detachment. In the course of my conversations with dozens of compulsive gamers, a familiar narrative began to emerge: A young man repeatedly suffered some form of rejection from his peers; hurt, he turned to video games to soothe and distract himself; the games gave him a pretense of the kinship and achievement he never knew in the real world; when he left home for college or moved into his own place — and the familial checks on his day-to-day activities were lifted — his fixation on games intensified until it consumed him.

This is more or less the story that Cam Adair, perhaps the leading spokesman for the legitimacy of video-game addiction, tells in his public appearances. Like Bracke, he nearly committed suicide but sought help at the eleventh hour. In 2011, he posted his story and insights on a blog and received thousands of responses from people with similar experiences. Inspired by this outpouring, Adair founded Game Quitters, an online support community for video-game addicts that today has about 75,000 members from 95 countries. Thanks to his ability to articulate the fraught reality of a fringe diagnosis, Adair is now self-employed as a full-time public speaker.

“I just really inquired, ‘Why do I game?’ ” Adair told me. “For me, it was so obvious that it wasn’t just that games were fun. They allowed me to escape. They allowed me to socially connect. They allowed me to see measurable progress. And they allowed me to feel a sense of certainty.” To Adair, the gaming industry’s repeated disavowal of video-game addiction is embarrassing. “It’s just not honest, and it’s not based in reality,” he says. “People have been coming forward for years, saying they are really struggling. What really matters is that you feel you have to keep playing despite it having a negative impact on your life. That’s addiction. I think, as a society, we should be saying, ‘How can we help?’ ”

Those who deny the reality of video-game addiction often overlook a fundamental ambiguity central to addiction itself. Current diagnostic criteria for addiction are not so much a definitive scientific description as a useful guideline. To insist that addiction must be restricted to certain substances is to presume a level of understanding that we have not yet reached. If addiction is an evolving concept, and an official expansion of that concept would profoundly benefit people who clearly need help, can we justify clinging to the status quo?

In the summer of 2016, shortly after he started working at a Costco near his home in Redmond, Wash., Bracke found himself surrounded, once again, by video games. Niantic had released Pokémon Go, an augmented-reality game for mobile devices that superimposed animated Pokémon onto screen-based images of a player’s actual surroundings — a pidgey in the park, a magikarp on the beach. You could catch them by swiping a finger across the screen. The game was downloaded more than 100 million times by the end of July and for a while was the single most active mobile game in the United States.

Many of Bracke’s co-workers were captivated by Pokémon Go. They would surreptitiously play during their shifts, occasionally keeping the game running in their pockets to increase their chances of encountering a Pokémon. Some of them asked him about it: What level was he? Which Pokémon had he caught so far? Bracke hadn’t tried the game yet, but he was extremely tempted. He decided he would download it so that he could immediately block all access to it using a “screen accountability” program called Covenant Eyes, which was originally designed to help people stop watching pornography.

Today, Bracke — a cordial, brown-bearded 31-year-old — still works at that Costco, though he recently completed a degree in accounting at Bellevue College and has begun his studies at the University of Washington. He owns a Samsung Galaxy smartphone and an “intentionally crappy” laptop, but he doesn’t have an internet connection at home. He hasn’t touched video games since starting rehab in the fall of 2015. Like Adair, he has become an outspoken advocate for video-game addicts, once appearing on the “Today” show.

Rehab taught him that in order to stay sober, he would have to do more than avoid video games — he needed to replace them with something else. In Washington, he started reading more. He broadened his social network, making new friends through work, school and mutual acquaintances. When the weather was nice, he went hiking, took his dog on a long walk or played Frisbee golf. At home, he enjoyed the occasional board game. “I’ve tried to branch myself out into a lot of hobbies that I take shallower dives into, rather than having one that occupies everything,” he told me.

After touring reStart this September, I visited Bracke at the apartment where he was living at the time. When I walked in, I was greeted by several of Bracke’s friends and roommates, all young men in their 20s who participated in reStart (they asked to remain unnamed). The apartment was small, with somewhat shabby furniture. A two-foot-tall artificial Christmas tree stood in one corner, a holdover from last year. Bracke’s small white dog, Minerva, ran between us, yipping and nudging our legs.

I chatted with Bracke and his friends for about two hours. We talked about their experiences in reStart, how they navigate life with so little internet access and their long-term plans. Each of them believed he would have killed himself without some type of formal treatment. Each stressed how important reStart’s emphasis on social connection had been to his recovery. And each said that, at least for the time being, he planned to stay in Washington — the place where they all had finally learned, or relearned, how to connect with others outside the context of multiplayer games. “I still hung out with people before,” one of Bracke’s friends said, “but most of the time, we would just talk about stuff we were going to do, like playing video games or something else that wasn’t particularly serious. I can just walk up to either of them,” he continued, gesturing to the other men beside him, “and be like, ‘This is what’s going on in my life.’ I’ve never really had that before.”

His admission stuck with me, in part because it so closely echoed conversations I had with other self-identified gaming addicts. As Bracke told me at one point, a huge part of his rehab was “allowing myself to rely on others, and being there to support others when they need it as well.” He explained that even if he had technically been socializing while gaming — talking with other players on his headset — he had never been genuinely interested in their lives, nor they in his. In contrast, the relationships he developed during rehab, in the complete absence of games, felt sincere and enduring: “A lot of the guys I met there were some of the only people I could be totally honest with.”

The more I spoke with the young men in Bracke’s living room, the clearer it became that they were not simply friends — they were family. They had suffered in such similar ways. They had arrived in Washington feeling helpless and utterly alone. Most of them had forgotten what it was like to have a meaningful conversation with another person. For months, they cooked and ate together, shared bedrooms and bathrooms and managed the same household. They exchanged mundane niceties and confessed deeply personal fears, hopes and secrets — their abandoned aspirations, their suicide plans. They cried in front of one another, repeatedly, because of anger or guilt or grief. They witnessed one another become more trusting and confident, less anxious and withdrawn, even hopeful.

Although each had his own future to focus on, whether school or work or both, they still lived together and encouraged one another in these pursuits. Perhaps they would never fully understand the reasons for their compulsions or distill the culpability of games from all the other elements of their lives. Perhaps it didn’t matter anymore. If addiction is the compulsive substitution of an artificially rewarding experience for essential human intimacy, then these men had found its opposite in one another.


Ferris Jabr is a contributing writer for the magazine. He last wrote about the evolution of beauty.
