
UK coroner blames Instagram and Pinterest algorithms for teenager’s death
Senior coroner Andrew Walker concluded that 14-year-old Molly Rose Russell died from an act of self-harm while suffering from depression and the negative effects of online content, the BBC reports.
“It would be unsafe to leave suicide as the conclusion about the causes of death,” said Walker.
In the UK, a coroner’s verdict carries the weight of a court ruling.
The coroner said that Instagram and Pinterest used algorithms that led to “periods of binge exposure” to content, some of which the platforms selected and pushed to the schoolgirl without her requesting it.
“Some content romanticised acts of self-harm by young people towards themselves, while other content contributed to isolation and hindered discussion of the issue with those who could help,” said Walker.
According to The Guardian, in the period before her death in 2017, Russell had saved, liked or posted more than 2,000 posts on Instagram related to suicide, depression or self-harm. She also watched 138 videos of a similar nature, including episodes of the series “13 Reasons Why” rated “15+” and “18+”.
A consultant child psychiatrist told the hearings that he could not sleep properly for weeks after reviewing the Instagram content Russell had viewed shortly before her death.
Investigators found hundreds of images related to self-harm and suicide in the girl’s Pinterest account. It also emerged that the platform had sent the schoolgirl emails recommending content under headlines such as “10 pins about depression you might like.”
“It is likely that the material Molly viewed, while already suffering from depression and the vulnerability of her age, affected her negatively and contributed to her death in a more than minimal way,” Walker said.
Representatives from Meta and Pinterest apologised and acknowledged that Russell encountered content on those platforms that should not have been there.
“We strive to ensure that Instagram provides a positive experience for everyone, especially teenagers. We will carefully review the coroner’s full report when he provides it,” a Meta spokesperson said.
Pinterest said that it is continually improving the platform and seeks to guarantee safety for all.
“We will carefully review the coroner’s report,” the company said.
In May, a lawsuit was filed against TikTok over allegedly “deadly recommendations” from its algorithms.
In December 2021, Amazon’s virtual assistant Alexa suggested a deadly challenge to a 10-year-old child.