Bluesky, the social network that has gained popularity as an alternative to Twitter/X, has released its moderation report for 2024, showcasing the platform's significant growth and the resulting impact on its Trust & Safety team's workload. According to the report, the largest share of reports came from users flagging accounts or posts for harassment, trolling, or intolerance, an issue that has plagued the platform and even led to widespread protests over individual moderation decisions.
The company's report does not address or explain why it did or did not take action against individual users, including those on the most-blocked list. However, it does provide insight into the platform's rapid growth: over 23 million users joined in 2024. This surge can be attributed to several factors, including changes at Twitter/X, such as its decisions to alter how blocking works and to train AI on user data, as well as Twitter/X's temporary ban in Brazil in September.
To cope with the increased demand, Bluesky has expanded its moderation team to around 100 moderators and is continuing to hire. The company has also started offering team members psychological counseling to help them cope with the difficult work of constant exposure to graphic content. In total, Bluesky's moderation service received 6.48 million reports in 2024, roughly an eighteenfold increase over the 358,000 reports in 2023.
This year, Bluesky will begin accepting moderation reports directly in its app, allowing users to track actions and updates more easily; later, it will support in-app appeals as well. During the peak of the user influx from Brazil in August, the company received as many as 50,000 reports per day, creating a backlog of moderation reports and requiring the hiring of more Portuguese-language staff.
In response to the influx of reports, Bluesky began automating more categories of reports beyond just spam to help address the issue. While this led to some false positives, it significantly reduced processing time to just "seconds" for "high-certainty" accounts. Human moderators are still involved in addressing false positives and appeals, if not always handling the initial decision.
The report reveals that 4.57% of Bluesky's active users (1.19 million) made at least one moderation report in 2024, down from 5.6% in 2023. Most reports (3.5 million) targeted individual posts. Account profiles were reported 47,000 times, often over a profile picture or banner photo; lists were reported 45,000 times, DMs 17,700 times, while feeds and Starter Packs received 5,300 and 1,900 reports, respectively.
The majority of reports concerned anti-social behavior, such as trolling and harassment, suggesting that Bluesky users want a less toxic social network than Twitter/X. Other report categories included misleading content, spam, unwanted sexual content, illegal or urgent issues, and other issues that don't fit these categories.
The company also provided an update on its labeling service, which involves labels added to posts and accounts. Human labelers added 55,422 "sexual figure" labels, followed by 22,412 "rude" labels, 13,201 "spam" labels, 11,341 "intolerant" labels, and 3,046 "threat" labels. In 2024, 93,076 users submitted a total of 205,000 appeals over Bluesky's moderation decisions.
Additionally, there were 66,308 account takedowns by moderators and 35,842 automated account takedowns. Bluesky fielded 238 requests from law enforcement, governments, and legal firms, responding to 182 of these and complying with 146. Most were law enforcement requests from Germany, the U.S., Brazil, and Japan.
The full report also delves into other types of issues, including trademark and copyright claims and child safety/CSAM reports. Bluesky submitted 1,154 confirmed CSAM reports to the National Center for Missing & Exploited Children (NCMEC).
The report offers valuable insight into Bluesky's efforts to address user concerns and build a safer, more inclusive environment. As the platform continues to grow, it will be worth watching how well it keeps pace with these issues and maintains a positive experience for its users.