Technology
Meta leads collaboration to stop spread of suicide and self-harm content
Meta has announced a significant new initiative, partnering with social media giants Snap and TikTok to combat the spread of harmful content related to suicide and self-harm.
The programme, known as Thrive, aims to stop the circulation of graphic material that promotes or encourages these dangerous behaviours.
Through shared alerts, the initiative will enable these platforms to act quickly and collaboratively.
Thrive, developed in partnership with the Mental Health Coalition, will allow the companies to share “signals” across platforms whenever harmful content appears.
This cooperation builds on Meta’s Lantern programme, which is used to combat child abuse by securely sharing flagged content between platforms.
Using hashed data (a unique code generated from the offending material), Thrive will ensure that once harmful content is flagged on one platform, the others can be notified immediately, reducing the risk of its spread.
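The hash-based sharing described above can be sketched in a few lines of code. This is a minimal illustration only, assuming an exact-match cryptographic digest and a hypothetical shared registry; Thrive's actual implementation has not been published, and production systems typically use perceptual hashes so that re-encoded copies of the same media still match.

```python
import hashlib


def content_signal(data: bytes) -> str:
    """Generate a hash 'signal' for a piece of media.

    SHA-256 only matches byte-identical copies; a real system
    would likely use a perceptual hash to catch re-encodings.
    """
    return hashlib.sha256(data).hexdigest()


class SignalRegistry:
    """Hypothetical registry of flagged-content hashes shared across platforms."""

    def __init__(self) -> None:
        self._flagged: set[str] = set()

    def flag(self, data: bytes) -> str:
        """Record a signal for content flagged on one platform."""
        sig = content_signal(data)
        self._flagged.add(sig)
        return sig

    def is_flagged(self, data: bytes) -> bool:
        """Let another platform check incoming content against shared signals."""
        return content_signal(data) in self._flagged


# One platform flags an item; another checks the shared registry.
registry = SignalRegistry()
registry.flag(b"harmful-image-bytes")
print(registry.is_flagged(b"harmful-image-bytes"))  # True
print(registry.is_flagged(b"unrelated-image"))      # False
```

The key design point is that only the hash leaves each platform, so companies can coordinate takedowns without exchanging the harmful material itself.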
While Meta has already made it more difficult for users to find self-harm and suicide-related content, it continues to ensure that people can still discuss mental health issues openly.
As long as posts don’t cross the line into promotion or graphic detail, users are allowed to share their personal experiences.
The Thrive initiative strengthens these efforts, ensuring that dangerous material is quickly addressed.
Meta, Snap, and TikTok will now be able to respond swiftly, preventing such content from reaching vulnerable users.
Meta’s data highlights the scope of the challenge, revealing that the company deals with millions of pieces of content related to self-harm and suicide each quarter.
In the most recent period, approximately 25,000 posts were restored after being appealed by users, reflecting the complexity of managing this type of content.
As social media continues to play a central role in people’s lives, particularly among younger users, the Thrive initiative represents a crucial step toward ensuring safer online spaces.
By sharing alerts and acting collaboratively, these platforms aim to limit the reach of harmful content and better protect their users.