
Bluesky revamps its moderation and reporting tools with transparency in mind


Bluesky is looking to make its moderation and reporting systems more transparent as part of its efforts to forge a healthier social media ecosystem than some of the alternatives out there. The platform's user base has doubled to 40 million over the past year, and with more users come more instances of toxicity and other types of harmful content. Bluesky is also subject to more regulatory requirements in certain jurisdictions. The team has been working to address those factors, and changes to the moderation system should help.

The platform is revamping how it enforces and tracks violations of its policies. In the latest version of the Bluesky app, the system for reporting posts now includes more granular categories, with specific options to flag content related to things like false information about elections, human trafficking and bullying. "This granularity helps our moderation systems and teams act faster and with greater precision," the Bluesky team wrote in a blog post. "It also allows for more accurate tracking of trends and harms across the network."

Rather than tracking community guidelines violations across individual policies as it did in the past, Bluesky will automatically track everything in a single system. As part of that, the platform will provide users clearer information about enforcement actions. Starting in the coming weeks, users whose accounts are actioned will learn which policy they violated, the severity level of their violation and how many times they’ve broken the rules. They'll also be notified as to the length and end date of any suspension, as well as how close they are to more severe penalties being applied to their account for repeated violations. There are appeals processes for both post takedowns and account suspensions.


Looking ahead, Bluesky plans to add a moderation inbox to the app so that it no longer has to communicate moderation decisions by email. The team says this will improve transparency and enable it to send a higher volume of notifications to users.

"These updates are part of our broader work on community health," the team wrote. "Our goal is to ensure consistent, fair enforcement that holds repeat violators accountable while serving our growing community as we continue to scale."

There are some other changes in the latest version of the app. The team says that a redesign makes the "who can reply" settings easier to use, including the option to save your selection as the default for future posts. There's also a dark mode app icon available.