Numerous major social platforms, including Meta, YouTube, TikTok and Snap, will submit to a new external grading process that scores social platforms on how well they protect adolescent mental health. The program is built around the Safe Online Standards (SOS), which comprise about two dozen standards covering areas like platform policy, functionality, governance and transparency, content oversight and more. The SOS initiative is led by Dr. Dan Reidenberg, Managing Director of the National Council for Suicide Prevention.
In announcing these companies’ participation, the Mental Health Coalition said: “SOS establishes clear, user-informed data for how social media, gaming, and digital platforms design products, protect users ages 13–19, and address exposure to suicide and self-harm content. Participating companies will voluntarily submit documentation on their policies, tools, and product features, which will be evaluated by an independent panel of global experts.”
After evaluation, the platforms will be given a safety rating. The highest achievable rating is “use carefully,” which comes with a blue badge that compliant platforms can display. Despite being the highest rating, its requirements seem fairly run-of-the-mill. The description includes things like “reporting tools are accessible and easy to use” and “privacy, default and safety functions are clear and easy to set for parents.” As for what actions the standards ask of the companies being rated, the “use carefully” rating says “platforms and filters help reduce exposure to harmful or inappropriate content.”
The other ratings are “partial protection,” described in part as “some safety tools exist on the platforms, but can be hard to find or use,” and “does not meet standards,” which would be given if “filters and content moderation do not reliably block harmful or unsafe content.”
The Mental Health Coalition, founded in 2020, has mentioned Facebook and Meta as partners since the early days of the organization. In 2021 the coalition said it would have “leading mental health experts partner with Facebook and Instagram to destigmatize mental health and connect people to resources” during the COVID-19 pandemic.
In 2022 the nonprofit published research with “support from Meta” that found “mental health content on social media can reduce stigma while increasing individuals’ likelihood to seek resources, therefore positively impacting mental health.”
In 2024, the MHC, “in partnership with Meta,” launched a campaign urging parents to have “meaningful conversations” with teens about “healthy” social media use, focusing less on whether teens should be on these apps at all and more on keeping them on-platform in a “time well spent” way, from reduced screen time to “using social media for good” and reviewing their feeds together.
That same year it again worked with Meta to establish a program that allows tech companies to share data regarding materials that violate self-harm or suicide content guidelines. The Mental Health Coalition is credited as a “creative partner” on its website.
Last year it was alleged that Meta buried internal research showing the ill effects of its products on users’ mental health. The internal research, dubbed “Project Mercury,” began in 2020. Since then the company has introduced some bare-minimum attempts at addressing mental health concerns. Meta is now facing allegations over child harm from addictive products in the first of several upcoming lawsuits against the social media giant.
Other companies participating in the rating program include Roblox, which has recently faced stiff accusations over child safety on the platform, and Discord, which has updated its age-verification processes in response to its own serious safety concerns.