Doom Scrolling


Social media and the teen-suicide crisis

Social media icons. Source: Defense Visual Information Distribution Service.


Lori and Avery Schott wondered about the right age for their three children to have smartphones. For their youngest, Annalee, they settled on thirteen. They’d held her back in school a year, because she was small for her age and struggled academically. She’d been adopted from a Russian orphanage when she was two, and they thought that she might have mild fetal alcohol syndrome. “Anna was very literal,” Lori told me when I visited the family home. “If you said, ‘Go jump in a lake,’ she’d go, ‘Why would I jump in the lake?’ ”

When Anna was starting high school, the family moved from Minnesota to a ranch in eastern Colorado, and she seemed to thrive. She won prizes on the rodeo circuit and made friends easily. In her journal, she wrote that freshman year was “the best ever.” But in her sophomore year, Lori said, Anna became “distant and snarly and a little isolated from us.” She was constantly on her phone, which became a point of conflict. “I would make her put it upstairs at night,” Lori said. “She’d get angry at me.” Lori eventually peeked at Anna’s journal and was shocked by what she read. “It was like, ‘I’m not pretty. Nobody likes me. I don’t fit in,’ ” she recalled. Though Lori knew Anna would be furious at her for snooping, she confronted her. “We’re going to get you to talk to a counsellor,” she said. Lori searched in ever-widening circles to find a therapist with availability until she landed on someone in Boulder, more than two hours away. Anna resisted the idea, but once she started she was eager to keep going.

Nonetheless, the conflicts between Anna and her parents continued. “A lot of it had to do with our fights over that stupid phone,” Lori said. Anna’s phone access became contingent on chores or homework, and Lori sometimes even took the phone to work with her. “I mean, she couldn’t walk the horse to the barn without it,” Lori said. Lori understood that the phone had become a place where her daughter sought validation and community. “She’d post something, and she’d chirp, ‘Oh, I got ten likes,’” she recalled. Lori asked her daughters-in-law to keep an eye on Anna’s Instagram, but Anna must have realized, because she set up four secret accounts. And, though Lori forbade TikTok, Anna had figured out how to hide the app behind a misleading icon.

As Anna grew older, she became somewhat isolated socially. At school, jocks reigned and some kids had started drinking, but Anna was straitlaced and not involved in team sports. Still, there was good news. Early in her senior year, in the fall of 2020, she landed the lead in the school play and was offered a college rodeo scholarship. “But anxiety and depression were just engulfing her,” Lori said. Like many teen-age girls using social media, she had become convinced that she was ugly—to the point where she discounted visual evidence to the contrary. “When she saw proofs of her senior pictures, she goes, ‘Oh, my gosh, this isn’t me. I’m not this pretty.’ ” In her journal, she wrote, “Nobody is going to love me unless I ‘look the part.’ I look at other girls’ profiles and it makes me feel worse. Nobody will love someone who’s as ugly and as broken as me.”

Because senior year was unfolding amid the disruptions of the COVID pandemic and everyone was living much of their lives online, her parents decided to be more lenient about Anna’s phone use. Soon, she was spending much of the night on social media and saying she couldn’t sleep. Shortly before Thanksgiving, Lori and Avery went to Texas to visit their eldest son, Cameron, and his wife and young son. Anna was going to go, too, but changed her mind because of the risk of getting COVID close to the play’s opening night. Rather than leave her alone, Anna’s parents had her stay with her other brother, Caleb, who was nine years older and lived near the family ranch with his wife.


In Texas, there was happy news: another grandchild was on the way. During a family FaceTime on Sunday, November 15th, Anna seemed thrilled at becoming an aunt for the second time. Afterward, she went to the ranch to check on the chickens. Caleb’s wife asked if she’d be back for dinner, but Anna said she’d stay put, given that her parents would return that night.

Lori and Avery were driving back to Colorado when a neighbor’s number popped up on Lori’s phone. She didn’t answer until the second call. The neighbor was too upset to say what had happened. Lori asked about Anna, and the neighbor kept repeating, “I’m sorry.” It turned out that the neighbor had heard from a teacher whom Anna had phoned in distress; when she went to check on Anna, she discovered that Anna had shot herself. Now a sheriff’s deputy had arrived. Avery got him on the phone and asked, “Just tell us, is she alive?” The deputy replied, “No.” The Schotts drove on in terrible silence. As they crossed the prairie, a shooting star fell straight ahead of them.

Anna was buried on a hill at the family ranch. One of the rodeo cowboys brought a wagon to carry her ashes, and thirty other cowboys rode behind her. The ceremony has been viewed online more than sixteen thousand times. “So you know she made an impact,” Lori said.

Several months later, Lori heard from Cameron, who had read about the congressional testimony of a former product manager at Facebook named Frances Haugen. Haugen, who also released thousands of the company’s internal documents to the Securities and Exchange Commission and to the Wall Street Journal, claimed that the company knew about the harmful effects of social media on mental health but consistently chose “profit over safety.” (Meta—the parent company of Facebook and Instagram—has disputed Haugen’s claims.) Until Lori watched the testimony, she hadn’t really considered the role of social media in Anna’s troubles. “I was too busy blaming myself,” she recalled.

Lori began delving into Anna’s social-media accounts. “I thought I’d see funny cat videos,” she said. Instead, the feeds were full of material about suicide, self-harm, and eating disorders: “It was like, ‘I hope death is like being carried to your bedroom when you were a child.’” Anna had told a friend about a live-streamed suicide she had viewed on TikTok. “We have to get off social media,” she’d said. “This is really horrible.” But she couldn’t quit. A friend of Anna’s also told Lori that Anna had become fixated on the idea that, if her parents knew how disturbed she was, she’d be hospitalized against her will. That prospect terrified her.

Lori soon learned that other parents were suing tech companies and lobbying federal and state governments for better regulation of social media. Eventually she decided to do the same. “It takes litigation to pull back the curtain,” she explained. “I want those companies to be accountable. I don’t care about the money—I want transformation.”

Hundreds of lawsuits have been filed in relation to social-media platforms, including Facebook, Instagram, Snapchat, and TikTok. Families are not the only plaintiffs. Last year, Seattle’s public-school district sued multiple social-media companies, alleging harm to its students and a resulting strain on district resources. Attorneys general for forty-one states and the District of Columbia have sued Meta for harming children by fuelling social-media addiction. Both the United Kingdom and the European Union have recently enacted legislation that heightens companies’ responsibility for content on their platforms, and there is bipartisan support for similar measures in the United States. The surgeon general, Vivek H. Murthy, has called for a warning label on social-media platforms, stating that they are “associated with significant mental health harms for adolescents.”

Still, research paints a complex picture of the role of technology in emotional states, and restricting teens’ social-media use could cause harms of its own. Research accrues slowly, whereas technology and its uses are evolving faster than anyone can fully keep up with. Caught between the two, will the law be able to devise an effective response to the crisis?


Between 2007 and 2021, the incidence of suicide among Americans between the ages of ten and twenty-four rose by sixty-two per cent. The Centers for Disease Control found that one in three teen-age girls considered taking her life in 2021, up from one in five in 2011. The youth-suicide rate has increased disproportionately among some minority groups. Rates are also typically higher among the L.G.B.T.Q. population, teens with substance-abuse issues, and those who grow up in a house with guns.

Rates of depression have also risen sharply among teens, and fifty-three per cent of Americans now believe that social media is predominantly or fully responsible. Most American teen-agers check social media regularly; more than half spend at least four hours a day doing so. A 2019 study by researchers at Johns Hopkins University reported that adolescents who spent more than three hours a day online were at heightened risk of internalizing problems, including depression and anxiety. The authors of a 2023 report found that reducing social-media exposure significantly improves body image in adolescents and young adults. If you cannot distinguish between the “real” world and the virtual world, between what has happened and what is imagined, the result is psychic chaos and vulnerability to mental and physical illness. Facebook’s founding president, Sean Parker, stated in 2017, “God only knows what it’s doing to our children’s brains.”

Social media acts on the same neurological pleasure circuitry as is involved in addiction to nicotine, alcohol, or cocaine. Predictable rewards do not trigger this system nearly as effectively as unpredictable ones; slot-machine manufacturers know this, and so do social-media companies. “Teens are insatiable when it comes to ‘feel good’ dopamine effects,” a Meta document cited in the attorneys general’s complaint noted. Instagram “has a pretty good hold on the serendipitous aspect of discovery. . . . Every time one of our teen users finds something unexpected their brains deliver them a dopamine hit.” Judith Edersheim, a co-director of the Center for Law, Brain & Behavior, at Harvard, likens the effect to putting children in a twenty-four-hour casino and giving them chocolate-flavored bourbon. “The relentlessness, the intrusion, it’s all very intentional,” she told me. “No other addictive device has ever been so pervasive.”

Social-media platforms harness our innate tendency to compare ourselves with others. Publication of the number of likes, views, and followers a user garners has made social-media platforms arenas for competition. Appearance-enhancing filters may make viewers feel inadequate, and even teen-agers who use them may not register that others are doing the same. Leah Somerville, who runs the Affective Neuroscience and Development Lab, at Harvard, has demonstrated that a thirteen-year-old is likelier to take extreme risks to obtain peer approval than a twenty-six-year-old, in part because the limbic system of the adolescent brain is more activated, the prefrontal cortex is less developed, and communication between the two areas is weaker.

In 2017, the newspaper the Australian discovered a Facebook document which, seemingly for advertising purposes, categorized users as “stressed,” “defeated,” “overwhelmed,” “anxious,” “nervous,” “stupid,” “silly,” “useless,” and “a failure.” The Wall Street Journal’s reporting on Meta’s internal documents indicated that management knew that “aspects of Instagram exacerbate each other to create a perfect storm”; that nearly one in three teen-age girls who felt bad about their bodies said that “Instagram made them feel worse”; that teen-agers themselves “blame Instagram for increases in the rate of anxiety and depression”; that six per cent of U.S. teens reporting suicidal ideation attributed it to Instagram; and that teen-agers “know that what they’re seeing is bad for their mental health but feel unable to stop themselves.” Last year, a former director of engineering at Facebook, Arturo Béjar, told Congress that almost forty per cent of thirteen-to-fifteen-year-old users surveyed by his research team said that they had compared themselves negatively with others in the past seven days.

Adam Mosseri, the head of Instagram, recently announced a set of safeguards designed to protect younger users, including making their profiles private and pausing their notifications at night. Still, social-media companies have been slow to enact meaningful overhauls of their platforms, which are spectacularly profitable. A notorious leaked Meta e-mail quoted in the attorneys general’s lawsuit announces, “The lifetime value of a 13 y/o teen is roughly $270 per teen.” Public-health researchers at Harvard have estimated that, in 2022, social-media platforms generated almost eleven billion dollars of advertising revenue from children and teen-agers, including more than two billion dollars from users aged twelve and younger. Proposed reforms risk weakening the platforms’ grip on young people’s attention. “When depressive content is good for engagement, it is actively promoted by the algorithm,” Guillaume Chaslot, a French data scientist who worked on YouTube’s recommendation systems, has said. The Norwegian anti-suicide activist Ingebjørg Blindheim has described the dynamic as “the darker the thought, the deeper the cut, the more likes and attention you receive.”

In most industries, companies can be held responsible for the harm they cause and are subject to regulatory safety requirements. Social-media companies are protected by Section 230 of the 1996 Communications Decency Act, which limits their responsibility for online content created by third-party users. Without this provision, Web sites that allow readers to post comments might be liable for untrue or abusive statements. Although Section 230 allows companies to remove “obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable” material, it does not oblige them to. Gretchen Peters, the executive director of the Alliance to Counter Crime Online, noted that, after a panel flew off a Boeing 737 Max 9, in January, 2024, the F.A.A. grounded nearly two hundred planes. “Yet children keep dying because of Instagram, Snapchat, and TikTok,” she said, “and there is hardly any response from the companies, our government, the courts, or the public.”

One may sue an author or a publisher for libel, but usually not a bookstore. The question surrounding Section 230 is whether a Web site is a publisher or a bookstore. In 1996, it seemed fairly clear that interactive platforms such as CompuServe or Usenet corresponded to bookstores. Modern social-media companies, however, in recommending content to users, arguably function as both bookstore and publisher—making Section 230 feel as distant from today’s digital reality as copyright law does from the Iliad.

No court has yet challenged the basic tenet of Section 230—including the Supreme Court, which last year heard a case brought against Google by the father of a young woman killed in the 2015 ISIS attacks in Paris. The lawsuit argued that YouTube, a subsidiary of Google, was “aiding and abetting” terrorism by allowing ISIS to use the platform. Without addressing Section 230, the Justices ruled that the plaintiff had no claim under U.S. terrorism law. This summer, the Court declined to hear another case involving Section 230, though Justices Clarence Thomas and Neil Gorsuch dissented, indicating that they thought the section’s scope should be reconsidered. “There is danger in delay,” Thomas wrote. “Social-media platforms have increasingly used §230 as a get-out-of-jail free card.”

Advances in communications technology have always been disruptive. The historian Anne Applebaum points out that the invention of movable type enabled the religious wars of the seventeenth century, and that broadcast media fed both communism and fascism. She believes that it takes a generation to learn how to negotiate any new communication system. In Sam Shepard’s 1976 play “Angel City,” a character’s observations sound much like those of children in thrall to social media: “I look at the screen and I am the screen. I’m not me. I don’t know who I am. I look at the movie and I am the movie. I am the star. . . . I hate my life when I come down. . . . I hate being myself in my life which isn’t a movie and never will be.”

Safety and freedom lie at opposite ends of a spectrum. To deprive a child of all access to the Internet would be draconian and also impractical, given that young people are more tech-savvy than their parents. Teen-agers need privacy, but when their methods for hiding far outstrip their parents’ capacity to monitor their activity, children can die of that privacy.


When Chris Dawley was a high-school freshman, in the nineteen-seventies, in Wisconsin, his twenty-seven-year-old brother, Dave, troubled and addicted to cocaine, was working in Las Vegas. One day, Dave called their father asking to come home. After their father insisted that he be drug-free first, Dave shot himself. Years later, Chris shared the story on a first date with the woman who was to become his wife, Donna; she told him about the death of her fiancé in a workplace accident. Grief was a nexus of intimacy from the time they met, and Donna now thinks that this helped them after their son C.J. killed himself, in 2015, when he was seventeen. “We came together—we didn’t come apart,” she told me when I visited them, in Salem, Wisconsin. “We’d been through so much heartache before.”

From an early age, C.J. was remarkable. At two, he found a screwdriver and removed the doors of several kitchen cabinets. When he was thirteen, Donna came home to find a new computer all in pieces on her bedroom floor; he reassembled it completely by bedtime. Just three points short of a perfect score on his ACT, his mother recalled, C.J. was wooed by a recruiter from Duke; he wanted to be a nuclear engineer. He had a busy social life and was popular with girls. His high-school yearbook named him “funniest person.”

C.J. was given a smartphone his freshman year of high school. He spent a lot of time playing games and chatting with friends on Facebook, Instagram, and Snapchat. His mother recalled, “His phone was his life. We couldn’t sit at the table and have a meal without it.” He got a buzz when his posts received attention. Tall and handsome, he liked posting selfies, but, after commenters judged him skinny, he became self-conscious and determined to gain weight.

The girl to whom C.J. was closest was one he had met online; he told her, “Every time I think of going to college and the future, I just want to kill myself.” She told him to talk to someone. He said, “Well, I’m talking to you.” Neither of them knew what to do. He complained of terrible aches in his legs, which his doctor said were growing pains. He didn’t apply to Duke and was considering joining the Navy.

On the first Sunday of 2015, the family was taking down Christmas ornaments, but C.J. was still in bed, having been on his phone past four in the morning. His father woke him up, and C.J. came out, but said he was too tired to help. His sister Shannon chewed him out and he retreated to his room in a sulk. At lunchtime, Donna warmed up leftover lasagna. She often took C.J. a plate in his room, but she figured he’d join them when he was ready. “That hurts so bad,” she said. “Maybe, if I would have brought him a plate of lasagna, he would have talked to me. I have not made lasagna since. I can’t.”

Around ten that night, Shannon went into C.J.’s room. He had shot himself. Rigor mortis had set in, and, though he had dropped the gun, his phone was still clutched in his hand. He had been using it to signal what he was about to do, texting a friend “Godspeed,” and posting, “Who turned out the lights?” on Facebook. “He couldn’t even kill himself without posting about it,” Donna said. On the back of a college-acceptance envelope, in shaky pencil, C.J. had written a note that began, “I don’t want you to think this is all your fault. It’s not.” He continued, “There’s a lot you don’t know about me. What goes on inside my head scares me. I try to be a good person. It’s just as if I’m running in a dark tunnel, running after the light so I can be happy, but my legs are tired. And what’s a man to do when the light goes out?”

For the next five years, Donna and Chris searched for an explanation. They talked to all C.J.’s friends. It was only when they saw Frances Haugen’s testimony about Facebook that they began scrolling through C.J.’s phone. He had not been sleeping more than three or four hours a night for months. They eventually filed suit against Meta and Snap, the company that owns Snapchat.

Earlier this year, Donna and Chris took me to C.J.’s room. Donna showed me a gold trophy he’d won. She produced certificates for all kinds of prizes, his old Teddy bear, and the longbow she’d bought for him his final Christmas. The hunting rifle his parents had given him for his thirteenth birthday was no longer there, though, and the section of the carpet that had been soaked with blood was cut out. It was mid-January, and the Christmas decorations were still in place. “The house looks so pretty with them up,” Donna explained. It looked as it did the last morning she woke up with an intact family.

Chris said that filing the lawsuit had brought them friendships with other bereaved parents. They all wonder how we went from being a society where children below a certain age couldn’t see “Jaws” to one where they can watch all the porn they want—from one where you could check that your children weren’t “in with the wrong crowd” to one where you cannot even see what virtual crowd your children are mingling with. “You’re not going to let your kid run around with a sharp knife?” Chris said. “Then don’t let your kid get onto these sites until they’re eighteen. I thought I was a good and responsible father. I checked around the house and locked the doors every night, making everything nice and safe.” He lowered his head into his hands. “I didn’t understand that the lion was already inside the house.”


In a three-decade legal career specializing in litigation on behalf of asbestos-exposed mesothelioma patients, Matthew Bergman won a total of more than a billion dollars for his clients. By 2021, he was looking for a new cause, and, not long after the Facebook leaks, he left the firm where he was a partner and established the Social Media Victims Law Center. The S.M.V.L.C. is named as counsel on hundreds of cases involving young people who he believes have died or been severely harmed because of social media. Harried, exhausted, and righteous, he advertises for clients, zigzagging across the country to spend time with them. (The four main stories in this article all feature families represented by the S.M.V.L.C.)

Bergman’s clients will pay him only if they win their cases. “There’s nothing like a contingent fee to make you think creatively about legal concepts,” he told me when I met him. To circumvent Section 230, he and other lawyers have settled on a strategy of targeting not the platforms’ dissemination of problematic content—which would treat them like traditional publishers—but their algorithms, which are proprietary and therefore arguably qualify as damaging products in themselves. Bergman’s clients do not expect substantial settlements—“I’ve never encountered clients less concerned about money and more concerned about justice,” he said—but he will not let you leave the room until you are as sickened by their plight as he is. “You talk to a mom who describes cleaning her son’s brain out of her hair,” he said. “Often, the link between the child’s social-media use and their harm is irrefutable.”

More than a dozen firms and scores of lawyers are working on this matter. Among them is Jennifer Scullion, at the New Jersey firm Seeger Weiss, which is partnering with S.M.V.L.C. on a number of cases. The latest arguments expand the product-liability theory beyond the algorithm, naming a long list of features, including parental controls, age verification, and notification systems, as defective or dangerous. “The problem here is not the content itself, but the design of the platform,” Scullion argues.

I reviewed several hundred briefs and filings—thousands upon thousands of pages, their details at once depressingly repetitive and agonizingly individual, their language legalistic yet furious. Many lawsuits pertain to body image and anorexia content: “Instagram specifically targeted B.W. with . . . messaging and imagery that promoted extreme weight loss; glamorized harmful relationships to food, dieting, and exercise; and drastically undermined B.W.’s self-image and self-worth. . . . B.W.’s physician diagnosed her with bulimia, irregular menstruation, tachycardia, generalized anxiety disorder, major depressive disorder, suicidal ideations, and nonsuicidal self-harm.”

Sexual abuse and exploitation feature often in the allegations: A girl attempted suicide twice when she was eleven, having been contacted by men on Roblox and then Discord, one of whom inveigled her into sending explicit photographs on Snapchat, where photos disappear (though they can still be screenshotted). A boy was persuaded to send illicit pictures to a user, also via Snapchat, only for that person to send back a collage of the images as blackmail. One suit, citing a leaked Meta document in which an employee asserted that the People You May Know function had, in the past, “contributed up to 75% of all inappropriate adult-minor contact,” claims that, “incredibly, Meta made the decision to continue utilizing its user recommendation products regardless.” (Meta says it has developed technology to limit potentially suspicious accounts from interacting with teens.)

Bergman told me of a boy who had been deluged with TikTok videos telling him to jump in front of a train—which he did. Another young man, after a breakup, posted about his heartache and was sent videos telling him to blow his head off—which he did. “This is kids affirmatively being directed to suicidal content,” Bergman said. “It’s Orwellian. You can’t walk away from the bully. It’s like you’re running right into his fist.” Even parents who recognize social media as a problem can find themselves facing tragedy when they try to shield their children from it: “Sarah was extremely distraught after having her phone taking [sic] away and . . . thought that she could not live without Defendants’ social media products,” one complaint reads. “Sarah went upstairs, found her father’s gun, and shot herself in the head.”


Brandy and Toney Roberts, who live in New Iberia, Louisiana, have emerged as faces of the movement to protect children from social media, appearing on “60 Minutes” and other news programs. When they speak of their daughter Englyn, who died by suicide in September, 2020, at the age of fourteen, they conjure her so vividly that you expect to see her entering the room, grinning, bubbly, and demanding. One of Englyn’s friends put a recording of her laughter into a Build-a-Bear plushie that she gave to Brandy and Toney; if you lean on it, as I did by accident, peals of hilarity break out. On the living-room sofa, there is a pillow printed with a snapshot taken on a trip to Destin, Florida. Englyn is hugging Brandy, and her face is alight with glee. A week after the picture was taken, she hanged herself.

As the youngest in a family of seven siblings, half siblings, and stepsiblings, Englyn was doted on, even spoiled. Because she loved travel, she and her parents would get in the car and Toney would say, “East or west or north or south?,” and they would set out, once making it as far as New Mexico. Toney loved to indulge his daughter, taking her for a manicure she suddenly wanted, or getting food late at night. Brandy tried to set limits, and cautioned her daughter, whose bearing was confident, even regal, that people in their community, the Black community, didn’t like anyone who came across as full of themselves.

On the last Saturday in August, 2020, Toney made soup for dinner, but Englyn didn’t have much. “Baby girl, you didn’t eat,” Toney said. “My soup wasn’t good?” He suggested they order pizza, and he and Brandy and Englyn sat up late, eating. At around half past ten, Englyn asked her parents if they wanted to watch a movie. “Oh, no, baby girl, we tired,” Toney said. Before she went upstairs, her parents said, as always, “Love you,” and she said, “Love you,” and kissed them both.

Later that night, the mother of a friend of Englyn’s who had been texting with her got in touch with Brandy, urging her to check on her. Toney and Brandy were surprised to find Englyn’s door locked. When they got in, they didn’t see her, so Toney went to look behind the bed. “All of a sudden, I turned around and she was hanging right there,” he told me as we stood in Englyn’s bedroom. She had used an extension cord to hang herself from a door hinge. “Seems like an eternity passed, and I get her down, and Brandy starts CPR.”

When paramedics arrived, they detected a pulse, and Englyn was placed on life support in the hospital. “And you just praying and praying and praying for nine days, vigils outside, just everything you can humanly, possibly do,” Toney said. On September 7th, the doctors advised the Robertses to discontinue life support. “You ask yourself, ‘What is she feeling? Is she feeling anything?,’ ” Toney remembered. He and Brandy were joined at the hospital by his mother and two priests, and everyone said their final farewells.

Of all the bedrooms of children lost to suicide that I visited during my reporting, Englyn’s was the most meticulously preserved: every pair of shoes in the same place she had kept them, the bed still made and occupied by a row of Teddy bears. “She had these socks on that night,” Toney said. “This is where I found her. See that box? She stood up on that box.” The box was just behind the door, where Englyn must have placed it before putting the cord around her neck.

After Englyn died, Toney checked her phone. In the family videos he’d shown me, Englyn looks not just cheerful but joyous, exploding into laughter and song, yet she was taking pictures in which she was cutting herself and posting thoughts like “I’ve been feeling ugly lately.” After she turned fourteen, she posted a birthday photograph with the caption “Swipe to see my real shape”; the next picture showed a distortion of her figure. A longer post said, “One Day Ima Leave This World And Never Come Back, You Gone Cry When You See A Picture Of Me. . . . So You Need To Appreciate Me Before I’m Gone.”

Instagram’s algorithms had sent her increasingly troubling suicide content. In one post, a Black woman screamed, “Stop this madness. What do you want from me? What do you want? Please. Please.” The woman then pretended to hang herself with an electrical cord, just as Englyn eventually did.

Brandy had often checked Englyn’s phone, looking for inappropriate photos or bad language; it never occurred to her to check the videos Instagram was recommending. After Englyn died, Toney and Brandy found a hidden note on her phone: “I show ppl what they want to see but behind the social media life nobody knows the real me and how much I struggle to make sure everyone’s good even though I’m not.”

Brandy, who is a teacher, thinks that people need to know more about the technology they use daily. “In the Black community, low-income, where I teach, parents are not educated enough on any type of technology,” she said. “We thought we were two well-educated people. I want to educate the parents first and then the students: What’s an algorithm? What do these sites do?” Toney said, “How in the hell could this happen? How could man develop such a thing as an algorithm that trumps the parents’ love? How could a machine mean more to her than us?”

Some of the lawsuits currently pending allege that the content social-media algorithms push to users’ feeds is influenced by race. “J.A.J. has no interest in guns or gangs, yet Instagram and TikTok would often direct him to gun and gang-themed content,” a legal complaint from a Black father of three reads. “These defendants know of the algorithmic discrimination in their products, yet continue to allow those products to push disproportionately violent and sexual content to African American users.” When I mentioned to Brandy that suicide is rising rapidly among Black youths, she said she suspected that Black suicide had previously been under-reported. “People do get shocked when they see it’s a Black family,” she said. “But it’s not your poor families anymore, not your rich families—it can attack anybody.” Her desire to downplay race seems to reflect a concern that Englyn’s Blackness might allow other groups to feel distant from her plight. Still, at the end of our conversation, Toney drew a provocative historical comparison. “Zuckerberg is the new ‘massa,’ ” he said. “He put the lifetime value of a teenager at two hundred seventy dollars. The price of a slave in 1770 was two hundred sixty dollars.”


Beeban Kidron, a British filmmaker who sits in the House of Lords, runs a foundation dedicated to protecting children online. When I met with her, in the Houses of Parliament, she told me that her crusade had gained momentum after the suicide, in 2017, of a fourteen-year-old Londoner named Molly Russell, who had viewed and saved thousands of posts about depression and suicide. Two years later, her father launched a campaign in her name and appeared in a BBC video (“Instagram ‘helped kill my daughter’ ”). Shortly thereafter, Adam Mosseri, the head of Instagram, published assurances that the platform was determined “to protect the most vulnerable,” and the platform began to delete millions of images related to suicide and self-harm.

The official in charge of a judicial investigation into Molly’s death ordered Meta, WhatsApp, Snap, Pinterest, and Twitter to provide data from her accounts. “When it was first shown in the coroner’s court, there was shock and awe in the room,” Kidron told me. “People in tears, including the press gallery, who had been following this issue for years. No one had understood the bombardment.”

The investigation found social-media platforms partially responsible for Molly’s death. The presiding official concluded that Molly had been so influenced by what she saw online that her death was not truly suicide but rather “an act of self harm whilst suffering depression and the negative effects of online content.”

Material from Molly’s inquest was presented to British lawmakers as they considered legislation, and last year they passed the Online Safety Act, which imposes stringent content-regulation requirements on digital platforms. Failure to comply can result in fines of either eighteen million pounds or ten per cent of a company’s annual global revenue, whichever is greater. Kidron believes that social-media companies have long realized that this kind of regulation would come and have been rushing to profit while they could. As we sat in Parliament, she fumed, “Those bastards are making money on the backs of children, and the collateral damage of those vast fortunes is sometimes, literally, the life of those children. And they won’t change it without people sitting in buildings like this and telling them they’ve got to.”

Meanwhile, many U.S. states are taking matters into their own hands. Spencer Cox, the governor of Utah, where suicide is the leading cause of death among teen-agers, has introduced multiple acts to constrain social-media use, and Gavin Newsom, the governor of California, recently signed into law a bill requiring social-media companies to stop curating minors’ feeds to drive engagement and sending them notifications at night or during school hours. The states’ attorneys general that have filed suit against Meta allege that social-media companies claimed their products were safe even though they knew that they could exacerbate mental-health problems. This tactic echoes similar moves against tobacco companies a generation ago. The attorneys general are asking that the judge not only impose financial penalties but also prohibit the platforms from continuing to use features that harm young users.

By now, more than two hundred school districts and municipalities have filed suit against Meta and other social-media companies in connection with students’ declining mental health. A complaint filed by Bucks County, in Pennsylvania, asserts that the defendants’ platforms “hijack a tween and teen compulsion—to connect—that can be even more powerful than hunger or greed.” Among the costs attributed to social media in the Seattle school district’s complaint are damage to school property, a need for additional personnel to counsel disturbed students, and the expense of training teachers to recognize signs of social-media addiction. Even Meta shareholders have sued the company, claiming that they were misled about its efforts to moderate content and keep users safe.

In July, 2023, Senators Elizabeth Warren and Lindsey Graham proposed a commission to regulate online platforms. The commission could require companies to develop and adhere to meaningful content-moderation policies, and would establish a “duty of care” to users, obliging the companies to take reasonable measures to avoid foreseeable harm. Once a duty of care is legally established—stipulating, for instance, that landlords are responsible for fire safety and the removal of lead paint in properties they own—it becomes possible to sue for negligence.

For social-media companies, forestalling a duty of care is vital, as became clear last fall, when they sought to have much of the case against them dismissed in an ongoing federal litigation involving a panoply of plaintiffs. The presiding judge, Yvonne Gonzalez Rogers, grilled a TikTok attorney about whether the company had a duty to design a safe product. When he eventually said that, legally speaking, they didn’t, she said, “Let me write it down. No duty to design a platform in a safe way. That’s what you said.” There are questions of law and questions of decency, and even those who skirt the outer limits of the law attempt to keep up an appearance of probity. Here such a façade was not maintained.


Darla Gill told me that I would recognize her family’s place because their trailer, five hours from New Orleans, was a double-wide. Cozy and attractive, it was decorated primarily with hunting trophies belonging to her husband, Ryan. For years, the Gills moved wherever Ryan’s work, in pipeline construction, took him—Oklahoma, West Virginia, New Mexico, Ohio, Pennsylvania, Kansas—with Darla mostly homeschooling their three children. Looking to settle near Darla’s family, they chose this corner of Louisiana to start a farm, raising chickens commercially and pigs for show.

Their eldest child, Emma Claire, made friends wherever her family lived. Tall, with long blond hair, she could go from participating in a beauty pageant to driving a tractor. For her sixteenth birthday, her parents took her and a group of her girlfriends to Baton Rouge; in the summer of 2021, for her seventeenth birthday, she planned to visit Alabama to watch college football with her father.

In the Gills’ small community, two teen-agers had died by suicide in the previous thirteen months. Of one, Emma Claire had said to Darla, “Momma, how could she do that to her family?” Darla, who had experienced postpartum depression, said, “Have you ever felt depressed, where you felt like you couldn’t get over it?” Emma Claire said she had been sad, but never that sad. Darla said, “If you ever feel that way, you have to tell us. You have to tell someone.”

On the first Saturday in August, 2021, Emma Claire, then a high-school junior, sneaked out late to meet a boy. Darla, realizing she hadn’t given her daughter a kiss good night, went into her room and found nobody there. When Emma Claire came home, past two in the morning, her parents told her she was grounded, and confiscated her car keys and her phone. She handed them over without complaint.

Early the next morning, Ryan received a call from someone who needed a rooster feeder fixed, so he woke Emma Claire and asked her to feed the pigs. Darla went into Emma Claire’s room a few minutes later with a bra and asked if she needed it for church. Emma Claire said, “No, ma’am.” Later, when Darla went to the barn to tell Emma Claire it was time to leave for church, she saw her daughter’s legs on the ground. “She was propped up against a feedbag. I knew immediately she was gone,” Darla said. Emma Claire had fed the pigs, then shot herself in the head with a .22 that she had received for Christmas a few years earlier, which her father kept locked in his truck.

“All the things that you worry about—car wrecks, drowning, kidnapping—you just don’t think of suicide,” Ryan said. Darla added, “She wasn’t a depressed, loner kid. She was popular. She was outgoing. She was involved. She played basketball. She showed pigs. She won beauty pageants. She deer-hunted with her dad. Just an all-American girl. Everybody liked her except the girls whose boyfriends liked her.” Hundreds of people attended her funeral. “She did not get her wedding, but she packed that whole church for her funeral, the whole church,” Darla said.

In the note she left, Emma Claire wrote, “Momma, I’m sorry there’s a lot of things I wish I could explain to you. . . . I’m sorry I disappointed you. I can’t do it knowing I just let you down. . . . I love you all so much. I’m sorry I had to do this. I just can’t take it anymore.” Her parents still cannot think how she ever disappointed them. “When did she write that note?” Darla asked. “Did she write it after I woke her up that morning? Did she write it before she went to sleep, when she didn’t have her phone to occupy her?” Darla blames herself for taking away the phone. “It may have been what triggered her,” she said. “I don’t know. I didn’t know how it felt to be suicidal until my daughter’s suicide. I just want to sleep so I can dream about her.”

After police investigating Emma Claire’s death took her phone for inspection, her parents started thinking about the role it might have played. Emma Claire had Snapchat, because she said that that was how her basketball team communicated, and her parents let her use it on the condition that she give up TikTok and her extra Instagram account. In fact, she still had TikTok, and kept a secret Instagram account. When Darla saw an ad for the Social Media Victims Law Center, she thought maybe someone there could figure out what she and Ryan were missing. In 2022, the S.M.V.L.C. filed a suit on the Gills’ behalf, saying that design features of various social-media platforms created a “harmful dependency” and “proximately caused Emma Claire’s death.” (In response to requests for comment on cases described in this article, spokespeople for Meta, TikTok, Snap, and Google emphasized their commitment to safety for teens on their platforms.)

“I believe with my whole heart, if she did not have social media, she would still be here,” Darla said. “You feel very responsible that you allowed her to have it.” Ryan described his state of mind in metaphorical terms: “You’re constantly carrying that rock with you, that grief. When I first found out, it’s like you had a big old boulder on you, couldn’t even sit up. A few days later, you could get up on a knee. Maybe a week later, you could stand up with it. Eventually, you could walk. But I’m still carrying that boulder.” The family is deeply Christian, and Ryan used to lead the singing at his church, but his voice now fails him.

Darla and Ryan’s son, Burch, developed a terror of the dark and began sleeping in his parents’ bed. Though Darla knew he should be in his own bed, she was relieved to know where he was. One night, their young daughter, Rylan, started crying and ended up in the bed, too. “And I had both of them on each side of me, and I thought, This is the most content that I can ever be, having them here and knowing they’re both alive,” Darla told me. Just before I visited, Burch had shot a blackbird with his pellet gun, then had been overcome with regret. “I shot it and now it’s gone, just like Emma Claire,” he said.


A parent who loses a child to suicide must deal with three simultaneous griefs: anguish at losing someone you love; self-blame at the thought that you might have failed as a parent; and, worst of all, bewilderment, the impossibility of making sense out of what happened. If you can find a cause, you can address it. One of the parents I spoke to suggested that finding that their child had killed themselves after some devastating trauma would, paradoxically, bring a kind of relief: “Then it would make sense.” For many of the parents I spoke to, activism provided a way to reëstablish meaning in a world that now seemed to have none. Norma Nazario, a single mother who lives in Manhattan, in Alphabet City, lost her son, Zackery, in 2023—not to suicide but to subway surfing, which he took up after seeing people doing it on social media, and then becoming obsessed with posting risky videos of his own. When we spoke, I was amazed at how measured she managed to sound, and she explained, “Anger is activism—it is how I cope with the world. Sadness is something I allow myself only in this apartment, because if I let the sadness well up, it would paralyze me, and then I couldn’t do the work I need to do. Anger lets me fight the problems that sadness lays bare.”

It is easy to suppose that blaming social media could be a way for parents to stop blaming themselves, but I never felt this with the parents I met; there was still plenty of room for self-blame. Activism was neither vengeful nor self-justifying; saving other people’s children was simply the best means of surviving one’s own loss. Although the world is sympathetic to grief, there is less grace for the confusion parents feel as they try to decipher a story that will never make sense. Kierkegaard proposed that life must be understood backward but lived forward. However, these lives cannot be understood backward, and yet they must move forward anyway. Comforting people is easy; not comforting them (because there is no comfort to be found) is much harder.

Almost every time a suicide is mentioned, an explanation is offered: he was depressed; her mother was horrible; they lost all their money. It would be foolish to ignore contributing factors, but it is equally absurd to pretend that any suicide makes sense because of concrete woes. This is also true when it comes to social media. It is surely a factor in many of these deaths, and substantial regulatory interventions, long overdue, may bring down the suicide rate in some populations, especially the young. Nonetheless, research has failed to demonstrate any definite causal link between rising social-media use and rising depression and suicide. The American Psychological Association has asserted that “using social media is not inherently beneficial or harmful to young people,” and a community of scientists, many of them outside the United States, has published research underscoring the absence of a clear link. Gina Neff, who heads a technology-research center at the University of Cambridge, told me, “Just because social media is the easy target doesn’t make it the right target.”

Andrew Przybylski, a psychologist at the University of Oxford, has observed that, among youths between the ages of ten and twenty, decreasing life satisfaction usually leads to an increase in social-media use. “But the opposite isn’t necessarily true,” he writes. “In most groups, the more time a child spends on social media doesn’t mean their life satisfaction will decrease.” Working with Amy Orben, a Cambridge psychologist, Przybylski has also noted that lower life satisfaction correlates slightly more strongly with wearing glasses than with digital-technology use.

Orben has grown cautious in what she writes, however, because social media’s defenders have cited her work to assert that the platforms are safe—which she has never contended. “The research evidence describes averages,” she explained to me. “Suicide relates to individuals. There’s a disconnect between those two, because you’re averaging across the heterogeneity that makes us human.” She is at pains to emphasize that tech is not the only thing that has happened in the past fifteen years. “The world is on fire,” she said, adding that politicians have focussed on social media because it makes for a simple, popular target. “It is a lot easier to blame companies than to blame very complex phenomena.” The British political theorist David Runciman suggested to me that children’s online interactions aren’t very different from those of adults and that the difference lies in young people’s lack of agency. “They feel powerless about climate change, war, misery,” he said. “That is a toxic combination: permissionless access to information, and relative powerlessness over the topics to which that information pertains.”

I have conducted dozens of interviews with young survivors of suicide attempts, and few mentioned social media as a factor. They pointed to a sense of impotence and purposelessness; climate change; the brutal language of modern politics; intolerance for their gender, race, or sexuality; bleak financial prospects and diminished social mobility; an inability ever to feel that they had caught up, as though their brains were slower than their lives; and acute loneliness, even among those who appeared not to be lonely.

Reductive models cloud the issue, providing false reassurance to many (“My child never uses Instagram”) and piling anguish on people who have lost children (“If only I’d kept her off Instagram”). In North America, rates of depression and anxiety in young people have been rising for at least eighty years. “Why weren’t people in the 1980s or ’90s asking why adolescent depression was at an all-time high?” the Johns Hopkins psychologist Dylan Selterman has written. “This isn’t new. And it’s going to keep getting worse in the absence of major cultural adjustments. We aren’t a mentally healthy society, and we haven’t been for a very long time.” The British psychologist Peter Etchells has written that, rather than considering social media a cause of mental-health difficulties, “it’s more useful to consider them as a lens through which pre-existing issues and inequalities are either dampened or intensified.”

Laurence Steinberg, a Temple University expert on the psychology of adolescence, has outlined three potential causative scenarios: social media causing mental disturbance; mental disturbance leading people to overuse social media; or some unrelated issue boosting both mental disturbance and social-media use. “All of these interpretations are reasonable,” he writes. “Given the widespread eagerness to condemn social media, it’s important to remember that it may benefit more adolescents than it hurts. . . . If other factors that have contributed to the rise in adolescent depression are being overlooked in the rush to point the finger at Facebook, we may be contributing to the very problem we hope to solve.”

A McKinsey Health Institute survey of forty-two thousand people in twenty-six countries found that social-media engagement may facilitate mental-health support and connection. Young people can play games on Discord and catch up with friends on Instagram and Snapchat. Those who are isolated find like-minded people. Immigrants build community with others who share their language and culture. Gina Neff, the Cambridge technology researcher, grew up in the hill country of eastern Kentucky. “Kids who are gay in Appalachia find the Internet and it is a lifesaver,” she said.

Smartphones can save lives in other ways. Randy P. Auerbach, a clinical psychologist at Columbia University, has been using phone tracking to monitor suicide risk. He measures changes in sleep patterns (many teens look at their phones right before and after sleep), changes in movement (depressed people move less), and changes in vocabulary and punctuation (people in despair start using personal pronouns more often). Matthew Nock, a clinical psychologist at Harvard and a MacArthur Fellow, has been examining the relationship between text-message frequency and mental vulnerability. Most suicides today, he said, come at the end of a “trail of digital bread crumbs.” Young people who are not responding to their peers—or whose messages no longer receive responses—may be in trouble. Nock’s research team uses cell-phone tracking to determine when people are at highest risk and calls or messages them. “We haven’t equipped the field with the tools to find, predict, and prevent suicide, in the way we’ve done for other medical problems,” he said. “We just haven’t developed the tools, other than to ask people, ‘How are you doing? Are you hopeless? Are you depressed? Do you think you’re going to kill yourself?,’ which is not a very accurate predictor. We should be taking advantage of advances in computing and statistics to do what human brains can’t.”

Meta has programs to pick up troubling posts and has notified emergency services about them. But Nock points out that social-media companies may be afraid that, if they put systems in place and those systems fail, their liability could be enormous. Meanwhile, suicide research in the U.S. receives two-thirds less funding than research on other major causes of mortality. Research review boards often scrutinize suicide-study proposals particularly stringently, because of the risk of a subject’s attempting suicide. “We’re not blocking oncology research because people are dying from cancer,” Nock said. “We want to do our best not to make people more suicidal, but the fact that people are going to die—why is it treated so differently?”


In recent years, congressional committees have held many hearings on the adverse effects of social media, and legislation has passed out of committee repeatedly without becoming law. But the Kids Online Safety Act has made more progress than previous efforts, having cleared the Senate; a narrower version of the bill now awaits a vote in the House. It has been subject to opposition by free-speech groups, including the A.C.L.U., which is concerned that screening out harmful content may also deny teens access to content they badly need, such as information about abortion.

A year ago, several families I’d met with joined a delegation that visited members of Congress and their staffers. Each family had three minutes to tell their story. “How do you describe not washing your child’s clothes anymore?” Toney Roberts said. “That you don’t watch them get off the bus anymore? That you don’t help them with homework anymore?” Brandy was deeply frustrated. “The group that we’re in, we are all educated, we all loved our children,” she said. “In front of Congress, we felt like a child begging for our parents to do the right thing, begging them to hold these companies accountable.”

Nonetheless, seventy-two senators signed on as co-sponsors of the bill, and, in late January, the families travelled to Washington again, to watch the Senate Judiciary Committee quiz five C.E.O.s: Mark Zuckerberg, of Meta; Shou Zi Chew, of TikTok; Evan Spiegel, of Snap; Linda Yaccarino, of X; and Jason Citron, of Discord. Before the hearing, some of the families gathered for breakfast at the Army and Navy Club. Though what they had in common was horrifying, there was a relaxed energy of acceptance. Their stories were so unbearable that they went through life having to choose whether to share or to elide them; here they had no need to do either.

The room in the Dirksen Senate Office Building where the hearing took place holds about five hundred people. As we filed in, I saw more families I knew. Each had brought a large photo of their dead child, which they carried at once protectively and casually. When everyone was seated, the C.E.O.s came in. As they entered, sixty or so families held up their photos, and a silence fell. To be in a room with sixty bereaved families is a solemn experience, but the C.E.O.s seemed largely unmoved. Some appeared nervous about their own testimony, but the accumulated weight of tragedy seemed not to register with them.

The room was divided with almost diagrammatic precision. The back was packed with people barely able to contain their searing grief; the middle contained tech folks manifesting no emotion whatsoever; in the front were the senators, a few of whom seemed genuinely engaged but most of whom were clearly calibrating the amount of righteous indignation that would make a viral clip. Real emotion; no emotion; fake emotion.

Lindsey Graham spoke directly to the executives: “Mr. Zuckerberg, you and the companies before us, I know you don’t mean it to be so, but you have blood on your hands. You have a product that’s killing people.” Any feeling that we were about to witness historic progress wilted in the hours that followed. The C.E.O.s gave banal opening statements about their good intentions. Zuckerberg, whose attitude seemed one of barely contained irritation, said, “Technology gives us new ways to communicate with our kids and feel connected to their lives, but it can also make parenting more complicated.” He said that Meta was “on the side of parents everywhere working hard to raise their kids.” Chew maintained that the average American TikTok user is older than thirty, sidestepping the question of the platform’s popularity among teens. Yaccarino, sheltering behind the rebranding of Twitter, announced, “X is an entirely new company, an indispensable platform for the world and for democracy,” and added that it was seldom used by anyone under seventeen.

The senators had numbers to hand. Mike Lee reported that seventeen per cent of minors using Discord had had sexual interactions on the platform. Richard Blumenthal reminded Zuckerberg that he had personally rejected a request for some eighty engineers to insure well-being and safety, then asserted that the additional staff would have cost Meta “about fifty million dollars in a quarter when it earned $9.2 billion.” Josh Hawley cited the survey of Instagram users from thirteen to fifteen: thirty-seven per cent reported being exposed to unwanted nudity; twenty-four per cent had been propositioned.

The C.E.O.s lobbed back numbers of their own. “Fifteen per cent of our company is focussed on trust and safety,” Citron said. “That’s more people than we have working on marketing and promoting the company.” Zuckerberg said Meta’s “prevalence metrics” suggested that A.I. was automatically removing “ninety-nine per cent or so” of inappropriate content; also that the company had flagged more than twenty-six million instances of predatory behavior. He repeatedly maintained that Meta does not allow children younger than thirteen onto its platforms. “So if we find anyone who’s under the age of thirteen, we remove them from our service,” he said, prompting a bitter laugh from the parent group.

Gretchen Peters, of the Alliance to Counter Crime Online, was incensed by the executives’ endlessly repeated statistics. Referring to Zuckerberg’s claim about the ninety-nine-per-cent success rate of Meta’s A.I. systems, she said, “If you found out that ninety-nine per cent of McDonald’s hamburgers were not made with dog poop, how often would you take your children there?”

Later in the day, there was a rally by the Capitol, staged by younger activists, many of whom had been exploited online or lost a friend to suicide. It was freezing cold, and the event, though mournful, was marked by an oddly uplifting, furious exuberance. Some of the senators who had spoken earlier also made remarks at the rally, as did the attorney general of New Mexico, the Facebook whistle-blower Arturo Béjar, and a youth activist named Zamaan Qureshi, who announced that a “coalition of young people” was determined “to take back control of our lives from social-media companies.” Arielle Geismar, George Washington University’s student-body president at the time, said, “We were forced to play a dopamine slot machine at the expense of our life.”

A few parents who had attended the hearings joined, standing behind the event’s speakers. At the end, I approached Lori Schott, Anna’s mother, and said that the day must have been exhausting for her. She pointed to the Capitol dome. “I brought Anna and a friend to Washington a few years ago, because I thought they should see our great country’s capital,” she said. “They were running up and down those stairs.” She swiped through her phone until she found a photograph. Once she’d shown me, her voice changed, becoming almost expressionless. “I never expected to be back here for this,” she said.