
Bluesky sees a 17-fold increase in moderation reports in 2024 after rapid growth


Bluesky on Friday posted its moderation report for the past year, noting the considerable growth the social network experienced in 2024 and how that affected the workload of its Trust & Safety team. It also noted that the largest number of reports came from users flagging accounts or posts for harassment, trolling, or intolerance — a problem that has plagued Bluesky as it has grown, and one that has occasionally led to large-scale protests over individual moderation decisions.

The company’s report did not address or explain why it did or did not take action against individual users, including those on the most-blocked list.

The company added more than 23 million users in 2024, as Bluesky became a new destination for former Twitter/X users for a variety of reasons. Throughout the year, the social network benefited from several changes at X, including its decisions to change how blocking works and to train AI on user data. Other users left X after the results of the U.S. presidential election, as the politics of X owner Elon Musk began to dominate the platform. The app also gained users while X was temporarily banned in Brazil in September.

To meet the demands caused by this growth, Bluesky has increased its moderation team to roughly 100 moderators, the company said, and it continues to hire. The company also began offering team members psychological counseling to help with the difficult job of constant exposure to graphic content. (An area we hope AI will address one day, since humans aren’t built to handle this kind of work.)

In total, users submitted 6.48 million reports to Bluesky’s moderation service — a more than 17-fold increase over 2023, when there were just 358,000 reports.

Starting this year, Bluesky will begin accepting moderation reports directly in its app. Similar to X, this will let users track actions and updates more easily. Later, it will also support in-app appeals.

When Brazilian users flooded Bluesky in August, the company saw as many as 50,000 reports per day, at its peak. This led to a delay in addressing the moderation reports and required Bluesky to hire more Portuguese-speaking staff, including through a contract vendor.

In addition, Bluesky began automating more categories of reports beyond just spam to help deal with the influx, though this sometimes led to false positives. Still, automation helped bring processing time down to just “seconds” for “high-certainty” accounts; before automation, most reports took up to 40 minutes to process. Human moderators remain in the loop to handle false positives and appeals, even if they aren’t always making the initial decision.

Bluesky says that 4.57% of its active users (1.19 million) made at least one moderation report in 2024, down from 5.6% in 2023. Most of these — 3.5 million reports — were for individual posts. Account profiles were reported 47,000 times, often over a profile picture or banner photo. Lists were reported 45,000 times, DMs 17,700 times, with feeds and Starter Packs receiving 5,300 and 1,900 reports, respectively.

Most of the reports concerned antisocial behavior, such as trolling and harassment — a signal that Bluesky users want to see a less toxic social network than X.

Other reports were for the following categories, Bluesky said:

  • Deceptive content (impersonation, misinformation, or false claims about identity or affiliation): 1.20 million
  • Spam (excessive mentions, replies, or repetitive content): 1.40 million
  • Unwanted sexual content (nudity or adult content not properly labeled): 630,000
  • Illegal or urgent issues (clear violations of the law or Bluesky’s terms of service): 933,000
  • Other (issues that do not fit into the categories above): 726,000

The company also shared an update on its labeling service, which involves adding labels to posts and accounts. Human labelers added 55,422 “sexual figure” labels, followed by 22,412 “rude” labels, 13,201 “spam” labels, 11,341 “intolerant” labels, and 3,046 “threatening” labels.

In 2024, 93,076 users submitted a total of 205,000 appeals of Bluesky’s moderation decisions.

There were also 66,308 account takedowns by moderators and 35,842 automated account takedowns. Bluesky also fielded 238 requests from law enforcement, governments, and legal firms. The company responded to 182 of these and complied with 146. Most were law enforcement requests from Germany, the United States, Brazil, and Japan, the company said.

Bluesky’s full report delves into other types of issues as well, including trademark and copyright claims and child safety/CSAM reports. The company noted that it submitted 1,154 confirmed CSAM reports to the National Center for Missing and Exploited Children (NCMEC).
