Meta tries to hide suicide posts from under-18s

Meta is to remove more content on suicide, self-harm and eating disorders from the feeds of teenagers on Instagram and Facebook.

The company is making two main changes to its policies as it faces increased pressure to make its social media platforms safer for children. Teenage users will no longer see posts from others discussing their personal struggle with thoughts of self-harm or suicide, even if they follow that person.

All under-18s will also be put into the most restrictive content control settings on Instagram and Facebook.

Child safety campaigners gave a mixed response to the move, which comes in the face of increased regulatory, legal and political pressure.

Meta is facing new rules in the UK under the Online Safety Act that will require it to shield children from harmful content.

The European Commission has sought information on how Meta protects children and dozens of American states have sued the company, claiming it repeatedly misled the public about the dangers of its platforms.

Mark Zuckerberg, the Meta chief executive, is due to testify before the US Senate on the issue this month.

On self-harm, suicide and eating disorder content, Meta does allow people to “share content discussing their own struggles”. Describing the changes, which will be introduced in the coming months, a spokesman said: “What we will aim not to show teens is content where people are discussing their struggles with suicide, self-harm or eating disorders. And any time someone posts or shares content related to these topics we’ll continue to point them to local expert resources for help.”

When teenagers seek out this content in search, “we’ll start hiding these related results and will direct them to expert resources for help”, the company said in a blog post.

Andy Burrows, adviser to the charity set up by the family of Molly Russell, called the changes “piecemeal”. Molly took her own life in November 2017 after viewing material on social media linked to anxiety, depression, self-harm and suicide.

“Our recent research shows teenagers continue to be bombarded with content on Instagram that promotes suicide and self-harm and extensively references suicide ideation and depression.

“While Meta’s policy changes are welcome, the vast majority of harmful content currently available on Instagram isn’t covered by this announcement, and the platform will continue to recommend substantial amounts of dangerous material to children.

“Unfortunately this looks like another piecemeal step when a giant leap is urgently required.”

Arturo Bejar, a former Meta employee turned whistleblower, said that the changes did not address his concerns. He has claimed that the company was aware of harassment and other harms facing teenagers on its platforms but failed to act.

Bejar said that the company was relying on “‘grade your own homework’ definitions of harm” and still did not offer a way for a teen to easily report an unwanted advance. “This should be a conversation about goals and numbers, about harm as experienced by teens,” he said.

He wants Meta to make design changes on Facebook and Instagram to nudge users toward more positive behaviour and provide better tools for young people to manage unpleasant experiences.