Who Was Molly Russell?

Steve Rosenbaum
Oct 6, 2022


Molly Rose Russell was by all accounts a normal 14-year-old schoolgirl.

She had the interests and hobbies of a typical teenager: music from “Hamilton,” the band 5 Seconds of Summer, a lead role in her school play. A “positive, happy, bright young lady who was indeed destined to do good,” explained her father, as reported in The Guardian. Molly’s social media activity — music, fashion, jewelry, Harry Potter — was what you might expect of a teenage girl.

But slowly, something changed. Her father told her the family was concerned about her, but she said simply that it was "just a phase I'm going through." In the weeks after that conversation, Molly and her family had dinner together one evening and then sat down to watch TV in the living room.

At 7 a.m. the next day, Molly’s mother opened the door to her bedroom and found her daughter’s body. Molly had taken her own life.

Teen suicide isn’t something we usually talk about. The media resists reporting suicide, treating it often as a private matter. And for parents of teens — I’ve had two of them — it’s almost impossible to determine the line between normal teen troubles and the downward spiral that results in heartbreaking despondency and death.

Stop here for a moment. Molly's story is different. And it shines a powerful light on something more parents and teens are dealing with than you probably realize.

Molly died in November 2017. At a coroner’s inquest held last week in London, the dark world that Molly had found herself in was laid out in stark relief, concluding “a legal battle that pitted the Russell family against some of Silicon Valley’s largest companies,” according to The New York Times.

On Instagram, Molly viewed a stream of dark content, including videos related to suicide, depression and self-harm. In total, Molly binge-watched 138 videos that contained suicide and self-harm content. Of 16,300 pieces of content Molly saved on Instagram in the six months before she died, 2,100 were related to depression and suicide. She last used her iPhone to access Instagram on the day of her death, at 12:45 a.m.

After conducting a thorough investigation, Coroner Andrew Walker told the court that social networks like Instagram and Pinterest were "not safe" and "shouldn't have been available for a child to see," as reported by BBC News. Concluding that it would not be "safe" to rule Molly's cause of death as suicide, Walker said the teenager "died from an act of self-harm while suffering depression and the negative effects of online content." He was clear about who he held responsible for Molly's death: "The platform operated in such a way using algorithms as to result, in some circumstances, of binge periods of images, video clips, and text — some of which were selected and provided without Molly requesting them."

These images weren’t sought out. They were algorithmically selected and served to the 14-year-old. Princeton Professor of Psychology and Neuroscience Uri Hasson says this behavior is no different from that of a drug dealer. He says without ambiguity that social networks are addictive.

This is a terrible story. And the coroner’s ruling comes after Frances Haugen’s important work to expose internal research on the harm Instagram could cause teen girls, and Tristan Harris’s efforts to expose Facebook’s internal push to amplify engagement at any cost.

But Walker’s ruling connects two disturbing dots, showing that Instagram is doing what Molly’s father Ian Russell called “monetizing misery.”

So let’s look at the data.

The Journal of Psychiatry reports in the article “Social media, internet use and suicide attempts in adolescents” that “there is an independent association between problematic use of social media/internet and suicide attempts in young people.”

The article goes on, “Themes such as self-loathing, loneliness, and feeling unloved were found in content analysis of 3360 randomly selected posts from 17 depression-related accounts on Tumblr. There is an association between comments on Instagram with increasing severity of self-injury, suggesting social media may act to reinforce harmful behaviors.”

Meta, the corporate owner of Instagram, disputes these findings, saying “It is simply not accurate that this research demonstrates Instagram is ‘toxic’ for teen girls.”

But testifying before the UK coroner’s inquest, Meta’s Elizabeth Lagone was less clear.

“Do you think this type of material is safe for children?” Oliver Sanders, the Russell family attorney, asked on cross-examination. “I think it is safe for people to be able to express themselves,” replied Lagone. “So you are saying yes, it is safe or no, it isn’t safe?” asked Sanders. “Yes, it is safe,” Lagone replied.

But Molly’s father, Ian Russell, doesn’t buy it. He says Instagram “helped kill my daughter.”

After Molly took her own life, the Russells began to look at what the Instagram algorithm had been sending her.

One account Molly followed featured an image of a blindfolded girl hugging a teddy bear with the text: “This world is so cruel, and I don’t wanna see it any more.”

The Russells say that Instagram continued to send self-harm content to Molly’s account even after her death.
