
Meta, TikTok and Snap are participating in an online safety ratings system


Several major social platforms, including Meta, YouTube, TikTok and Snap, say they will submit to a new external grading process that scores social platforms on how well they protect adolescent mental health. The program comes from the Mental Health Coalition's Safe Online Standards (SOS) initiative, which comprises about two dozen standards covering areas such as platform policy, functionality, governance and transparency, and content oversight. The SOS initiative is led by Dr. Dan Reidenberg, Managing Director of the National Council for Suicide Prevention.

In announcing these companies' participation, the Mental Health Coalition writes "SOS establishes clear, user-informed data for how social media, gaming, and digital platforms design products, protect users ages 13–19, and address exposure to suicide and self-harm content. Participating companies will voluntarily submit documentation on their policies, tools, and product features, which will be evaluated by an independent panel of global experts."

After evaluation, the platforms will be given one of three ratings. The highest achievable safety rating is "use carefully," which comes with a blue badge that compliant platforms can display. Despite being the highest rating, its requirements seem fairly run-of-the-mill. The description includes things like "reporting tools are accessible and easy to use" and "privacy, default and safety functions are clear and easy to set for parents." As for what the standards actually ask of the companies being rated, the "use carefully" rating says only that "platforms and filters help reduce exposure to harmful or inappropriate content."


The other ratings are "partial protection," described in part as "some safety tools exist on the platforms, but can be hard to find or use," and "does not meet standards," which would be given if "filters and content moderation do not reliably block harmful or unsafe content."

The Mental Health Coalition, founded in 2020, has mentioned Facebook and Meta as partners since the early days of the organization. In 2021, the organization said it would have "leading mental health experts partner with Facebook and Instagram to destigmatize mental health and connect people to resources" during the COVID-19 pandemic.

In 2022 the nonprofit published a case study with "support from Meta" that found "mental health content on social media can reduce stigma while increasing individuals’ likelihood to seek resources, therefore positively impacting mental health."

In 2024, the MHC "in partnership with Meta" launched a campaign called the Time Well Spent Challenge. In it, the group urged parents to have "meaningful conversations" with teens about "healthy" social media use, focusing less on whether teens should be on these apps at all and more on keeping them on-platform in a "time well spent" way, from reduced screen time to "using social media for good" and reviewing their feeds together.


That same year it partnered with Meta again to establish "Thrive," a program that allows tech companies to share data regarding materials that violate self-harm or suicide content guidelines. The Mental Health Coalition lists Meta as a "creative partner" on its website.
