In North London Coroner’s Court this week, Ian Russell, Molly’s father, accused Instagram of “helping to kill” his daughter, who died just a few days before her 15th birthday. Russell said he believed that long-term exposure to harmful material had contributed to Molly’s death.
She had viewed pictures and videos featuring suicide, drugs, alcohol, depression and self-harm while on social media. The inquest examined whether algorithms used by the social media firms to try to keep users hooked contributed to her death.
Oliver Sanders KC, representing the Russell family, said: “They were romanticising the idea of self-harm, romanticising the idea of suicide.”
In defence of Instagram, Elizabeth Lagone, head of health and wellbeing at Meta (which owns Facebook, Instagram and WhatsApp), denied that the company had treated children like Molly as “guinea pigs” when it launched a new algorithmic system.
At the time of the teenager’s death, Instagram’s guidelines allowed users to post content about suicide and self-harm to “facilitate the coming together to support” other users, but not if it “encouraged or promoted” suicide and self-harm.
Lagone told the court: “These are cries for help, no matter what.” Lagone added that there was a risk that removing such content could do “unbelievable harm” by “silencing” someone.
The 17 video clips shown during the inquest – all of which had been viewed by Molly – were so distressing that senior coroner Andrew Walker said he had considered editing them, and issued the “greatest” warning before they were played in court. Walker told those present to leave if they were likely to be affected by the material.
The Met Police examined Molly’s phone in preparation for the inquest and told the court that 476 Instagram accounts were algorithmically recommended to Molly, 34 of which had sad or depressive content. Of the 16,300 posts she engaged with on Instagram in the six-month period before her death, 2,100 related to suicide, depression or self-harm. On average, she was liking, saving or sharing posts around 130 times a day.
Lagone admitted that Molly had viewed posts which violated Instagram’s content policies, and apologised.
Ian Russell described this material as a “ghetto of the online world that, once you fall into it, the algorithm means you cannot escape”.
In the last six months of her life, Molly was also an active user of Pinterest, with more than 15,000 engagements on the platform, including saving 3,000 pieces of content. She was able to view self-harm-related material there.
Pinterest’s head of community operations, Judson Hoffman, admitted the site was “not safe” when Molly Russell used it. He said he “deeply regrets” the graphic material Molly viewed on the platform before her death and told the court Pinterest “should be safe for everyone”.
Dr Navin Venugopal, a child psychiatrist, told the court that he saw no “positive benefit” to the material viewed by the teenager before she died.
“I am of the opinion that it is likely that Miss Russell was placed at risk through accessing self-harm material on social media websites and using the internet,” Venugopal added.
The senior coroner told the court this is an “opportunity to make this part [social media] of the internet safe and we must not let it slip away. We must do it.”
Ian Russell is now campaigning for improved internet safety. Following Molly’s death, the Russell family and their friends set up the Molly Rose Foundation, a charity whose aim is suicide prevention, targeted at people under 25.