Meta, Snapchat, and TikTok are collaborating on a new initiative aimed at detecting and removing suicide- and self-harm-related content to minimize exposure for at-risk users.
Named “Thrive,” this initiative will be managed by the Mental Health Coalition, with the three platforms sharing data on concerning content, enabling more effective cross-platform action.
As Meta explained:
“Through Thrive, participating tech companies will be able to share signals about violating suicide or self-harm content, allowing other companies to investigate and take action if similar content is being shared on their platforms. Meta is providing the technical infrastructure that supports Thrive—similar to the technology we use for the Tech Coalition’s Lantern program—which facilitates secure signal sharing.”
All three platforms permit users to discuss mental health issues and share their experiences. However, strict guidelines govern graphic content and material that may promote suicide or self-harm, and it is this material that the Thrive initiative primarily targets.
The project will enable the three companies to share data related to such content, allowing for quicker and broader enforcement. This data will take the form of "hashes", digital fingerprints that identify specific pieces of content, making it easier to detect matching harmful material across each platform.
Meta emphasizes that the shared data will only identify content and will not contain any identifiable information about individual accounts or users. This approach aims to expedite the removal of harmful content while also helping each platform build out its own detection databases and enforcement processes.
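To make the mechanism concrete, here is a minimal sketch of how hash-based signal sharing works in general. Thrive's actual infrastructure and data formats have not been published, so the function names, the SHA-256 choice, and the in-memory "database" below are illustrative assumptions, not Meta's implementation.

```python
import hashlib

# Hypothetical shared signal database: digests contributed by other
# platforms through a Thrive-style program (placeholder only).
shared_hashes: set[str] = set()


def fingerprint_content(media_bytes: bytes) -> str:
    """Produce a stable, non-reversible fingerprint of a piece of media.

    A cryptographic digest such as SHA-256 identifies an exact file
    without revealing the file's contents or the account that posted it.
    (Thrive's actual hashing scheme is not public.)
    """
    return hashlib.sha256(media_bytes).hexdigest()


def share_signal(media_bytes: bytes) -> None:
    """Contribute the fingerprint of confirmed violating content."""
    shared_hashes.add(fingerprint_content(media_bytes))


def check_upload(media_bytes: bytes) -> bool:
    """Return True if an upload matches previously shared content."""
    return fingerprint_content(media_bytes) in shared_hashes


# Platform A flags a piece of violating media and shares only its digest.
violating_media = b"example bytes standing in for a flagged image or video"
share_signal(violating_media)

# Platform B can now catch the same file on upload, with no user data involved.
print(check_upload(violating_media))                 # True
print(check_upload(b"an unrelated, benign upload"))  # False
```

The key property is that only the digest crosses platform boundaries: it can confirm an exact match but cannot be reversed into the original media or traced back to a user. An exact cryptographic hash does miss re-encoded or cropped copies, which is why industry signal-sharing programs of this kind often pair it with perceptual hashing; whether Thrive does so is not stated.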
In addition to combating harmful content, major social networks have also collaborated on identifying influence operations, sharing information to detect and eliminate coordinated efforts designed to mislead users.
This type of cross-platform collaboration can significantly enhance response efforts, and it’s encouraging to see these companies working together to better protect their users.
Increased social media usage has been linked to rising rates of youth depression and self-harm, with suicide now the second leading cause of death among American youth. It is crucial for these platforms to continually revise and enhance their detection processes to ensure user safety.
This initiative is a significant step in that direction and may help set a precedent for broader collaboration in the future.