Yoel Roth, formerly head of Trust and Safety at Twitter and now at Match Group, is sharing his concerns about the future of the open social web and its ability to combat misinformation, spam, and illegal content like child sexual abuse material (CSAM). In a recent interview, Roth worried about the lack of moderation tools available to the fediverse, the open social web that includes apps like Mastodon, Threads, Pixelfed, and others, as well as other open platforms like Bluesky.
He also looked back on key Trust and Safety moments at Twitter, including the platform's decision to ban President Trump, the misinformation spread by Russian bot farms, and how Twitter's own users, including CEO Jack Dorsey, fell prey to bots.
On the podcast revolution.social with @Rabble, Roth pointed out that the efforts to build more democratically run online communities across the open social web are also the ones with the fewest resources when it comes to moderation tools.
“…looking at Mastodon, looking at other services based on ActivityPub [protocol], lo