Mastodon Has A Big Content Moderation Flaw

Emma Roth writes for The Verge about the enormous amount of child abuse material that Stanford researchers found while analyzing the network.

“We got more photoDNA hits in a two-day period than we’ve probably had in the entire history of our organization of doing any kind of social media analysis, and it’s not even close,” David Thiel, one of the report’s researchers, said in a statement to The Washington Post. “A lot of it is just a result of what seems to be a lack of tooling that centralized social media platforms use to address child safety concerns.”

This is disturbing, all the more so because the problem is intrinsic to the platform's design. The DIY, indie spirit is great and all, but without the tooling and human resources of the bigger networks, problems like this can metastasize much more quickly.

Source: Stanford researchers find Mastodon has a massive child abuse material problem

Canned Dragons by Robert Rackley