Episode Two - Trauma

Zoe Kleinman explores the battle to control online content and meets the moderators trying to manage the worst excesses of the internet.

Over 500 hours of video are posted on YouTube every minute. Over 4 million photos are uploaded to Instagram every hour. There are around 500 million posts to X (formerly Twitter) every single day. These numbers are growing by the second.

How do you even begin to monitor and police such a relentless avalanche of information? In this new series, Zoe Kleinman journeys into the world of the online content moderators.

Big social media platforms rely on automation for much of the work, but they also need an army of human moderators to screen out the content that is harmful. Many moderators spend their days looking at graphic imagery, including footage of killings, war zones, torture and self-harm. We hear many stories about what happens when this content falls through the net, but we don't hear much about the people trying to contain it. This is their story.

The battle against harmful online content is hitting the headlines more every day, even as AI moderation gathers pace. Ironically, the AI itself needs moderation.

In the second episode of this series, a former Facebook content moderator reveals the impact this work can have on the mental health of employees.

Zoe hears what the day-to-day life of a moderator is like and the challenges of working out what to keep and what to remove. And she finds out from former moderators and Silicon Valley reporters how this tech landscape is changing today.

Presenter: Zoe Kleinman
Producer: Tom Woolfenden
Assistant Producer: Reuben Huxtable
Executive Producer: Rosamund Jones
Sound Designer: Dan King
Series Editor: Kirsten Lass
A Loftus Media production for BBC Radio 4

14 minutes

Last on

Tue 12 Nov 2024 13:45
