Social media content ‘likely’ to have contributed to Molly Russell’s death

Content on social media sites, including Instagram and Pinterest, is “likely” to have contributed to the death of British teenager Molly Russell, who took her own life after viewing thousands of posts about suicide, depression and self-harm, a coroner ruled on Friday.

Delivering his conclusions almost five years after Russell’s death in November 2017 when aged 14, senior coroner Andrew Walker said she had died from “an act of self-harm whilst suffering from depression and the negative effects of online content”.

The result marks a reckoning for social media platforms, as authorities around the world grapple with how to make the internet safe for children, and will put renewed pressure on companies that create apps used by young people.

Although not a trial, the inquest put social media in the dock, with executives from Meta, which owns Instagram, and Pinterest grilled for the first time in an English court about the potential harm to a generation of young people growing up online.

It also piles pressure on the UK government at a time when it is expected to water down already long-delayed safety rules which will govern how tech sites are policed.

In the last six months of her life, Russell liked, saved or shared 2,100 depression, suicide or self-harm posts on Instagram, and went only 12 days without engaging with that harmful content on the site.

Ian Russell, her father, told the inquest that social media had “helped kill my daughter”.

“You can see what your child is doing in the [offline] world much more easily,” he said. “You can see if they’re going into the corner shop . . . smell alcohol on their breath . . . The effects of the digital world are invisible.”

Molly Russell © PA

Ian Russell, the father of Molly Russell © Joshua Bratt/PA

According to Ofcom, the UK communications regulator, the majority of children under 13 now have a profile on at least one social media site, despite 13 being the minimum age. Russell had a secret Twitter account in which she documented her true state of mind and reached out to celebrities for help.

Walker said on Friday it was “likely that the materials used by Molly, already suffering from a depressive illness and vulnerable due to her age, affected her mental health in a negative way, and contributed to her death in a more than minimal way”.

Platform design

Over the past year, social media companies have come under pressure as society has grown increasingly concerned about how the platforms’ design might affect vulnerable minds.

Instagram and Pinterest are visual apps known for featuring glossy aspirational images, where individuals post idealised and often edited photos.

Last year, Frances Haugen, a former product manager at Meta-owned Facebook, leaked a trove of internal documents showing the ways algorithms could draw people down psychological rabbit holes. In particular, Instagram’s internal research suggested it could have a negative impact on teenage girls’ wellbeing — findings that Instagram said were misrepresented.

A few weeks later Instagram announced it was pausing plans to introduce Instagram Kids, a product for under-13s.

On Thursday, Walker said he was concerned that children and adults were not separated on Instagram, and that children’s accounts were not linked to an adult’s.

Algorithms, the computer rules that control the order of posts social media users see, have been front and centre in the Russell case. Depression-related content was emailed to her by Pinterest, and Instagram suggested accounts to follow that referred to suicide and self-harm.

Russell had been able to “binge” harmful videos, images and clips “some of which were selected and provided without Molly requesting them”, Walker said.

Engagement is often a key metric in designing algorithms: promoting content that users are likely to comment on, like or share. Meta described its previous recommendation systems as “content agnostic”, but its technology now aims to proactively identify harmful content and to avoid recommending any self-harm-related material, even material that is permitted on the platform.

Elizabeth Lagone, Meta’s head of health and wellbeing © Beresford Hodge/PA

Judson Hoffman, Global Head of Community Operations at Pinterest © James Manning/PA

Meta and Pinterest both apologised to Molly’s family during the inquest for allowing her to see content that violated their policies in 2017, and claimed to have since upgraded both their technology and their content rules.

Josh Simons, a former Meta AI researcher and now a research fellow in technology and democracy at Harvard University, said that what happened to Russell “isn’t just about the responsibility of platforms to police harmful content”.

“It’s about the algorithms that push content and decide what our children see and hear every day — what’s driving those algorithms, how they are designed, and who gets to control them,” he said.

Moderation efforts

Since 2019, Instagram has banned all graphic self-harm and suicide images, having previously removed only images that encouraged such acts, and has stepped up the automated technology that detects this type of content and flags it to human reviewers. Meta said it took action on 11.3mn pieces of content related to suicide and self-harm across Instagram and Facebook between April and June 2022.

Some self-harm and suicide content, such as images of healed self-harm scars, is allowed because individuals seek out supportive online communities on Instagram.

“Heavy-handed and misinformed approaches to social media moderation risk removing content that, while sensitive on the surface, enables important social conversations that may not be able to take place elsewhere,” said Ysabel Gerrard, a lecturer at Sheffield university and an unpaid adviser on Meta’s suicide and self-injury advisory committee.

“Much as people credit social media for [negatively] impacting their mental health, there are also lots of people who say it helped to save theirs,” she added.

Meta said moderating this type of content was nuanced, making it tricky both for artificial intelligence systems to detect and for humans to understand.

It has 15,000 moderators around the world, covering financial scams and political misinformation as well as self-harm. Last year, the company said it would hire 10,000 staff dedicated to building the metaverse, its virtual world.

“It seems unlikely that Facebook has adequate resources on either the technical product side or human reviewer side to manage the issue if the number is around the same as they are planning to spend on people playing video games,” Haugen told the Financial Times.

Meta and Pinterest both admit their moderation will never catch everything. On a site with more than 1bn users, as in Instagram’s case, failing to identify even 1 per cent of harmful posts can mean millions remain.

Users are also able to thwart algorithms by misspelling words, mixing them with numbers and using code words.

In a few minutes of scrolling on Instagram, using terms previously flagged to the company, the FT was able to identify self-harm content that violated Instagram’s policies. It was subsequently removed.

The inquest concludes as the government amends the Online Safety Bill, legislation that will force tech platforms to tackle harmful content on the internet. In its current form, companies are expected to have to comply with age-verification standards as well as complete risk assessments or independent audits of their algorithms.

Last year, the UK introduced the Children’s Code, also known as the Age Appropriate Design Code, which set higher restrictions for companies handling children’s data. The legislation has inspired similar regulation in California, Europe, Canada and Australia.

Baroness Beeban Kidron, who proposed the code, said: “There is a version of tech that puts children’s wellbeing and safety ahead of the bottom line . . . It is not aspirational to insist that children’s wellbeing should go ahead of growth, it is simply a price of doing business.”

Anyone in the UK affected by the issues raised in this article can contact the Samaritans for free on 116 123.
