Inside the Hell of Working as a Social Media Content Moderator
Andrew Kersley speaks to a former moderator for Meta about the trauma he experienced at social media's sharp end
For Josh Sklar, every day started to blur into one. 6 am wake-up. Getting dressed. A rushed breakfast. Driving into the office by 7 am – one of those ubiquitous, nondescript open-plan places nestled in a multi-storey of concrete and glass in central Austin, Texas, shared with Amazon.
Sitting at his desk and opening his computer, he would spend the next eight hours filtering through Nazi manifestos, Indonesian car wrecks and video suicide notes.
For nearly two years Josh worked as a content moderator for Meta, or more accurately, the outsourcing firm Accenture, fighting an uphill struggle to keep Instagram clear of the worst the human psyche has to offer.
But if the world you inhabit is somewhere contoured by internal spreadsheets on the newest trending racial slurs, chats with colleagues about what constitutes animal mutilation and constant electronic monitoring from management, it has an effect. Who you are starts to get restructured.
Content moderation is often misunderstood. If you listened to new Twitter boss Elon Musk, you might assume its main role is to limit freedom of speech – one reason why he recently fired an estimated 3,000 content moderators. But for those actually involved in moderation, it’s about making social media usable at all, rather than a never-ending stream of death threats, child porn and human corpses – even if the manual work behind it is largely ignored by employers and users alike.
Meta employs just 15,000 content moderators to cover Instagram and Facebook, according to the New York Times, or roughly one per 333,000 active users. Many have reported low wages and precarious employment. Usually, they work for third-party contractors, and instead of real mental health support are told to use breathing exercises or positive affirmations to deal with traumatic content. Many are subject to non-disclosure agreements so binding they can’t even tell their own family who they work for.
It is also worth acknowledging that all the problems that face US moderators are much worse for the 93% of outsourced Accenture moderators who work elsewhere, usually with fewer labour protections.
“The system is tantamount to letting you work in a factory with no safety systems and hoping your arm doesn't get chopped off,” says Cori Crider, a director of Foxglove, a legal non-profit organisation taking Facebook to court over its treatment of moderators. “They're absolutely critical safety workers, but they get treated like a disposable adjunct to the business.”
Just this week, a judge in Kenya ruled that former outsourced Facebook moderator Daniel Motaung could take the tech giant to court over his treatment, having been paid £1.80 an hour to sift through posts including beheadings and child abuse.
Josh would spend about ten seconds assessing the average piece of content. On a good day, before the rotting doubts began to sink in, he could clear up to 700 posts. Once you knew your way around the labyrinth of guidelines Facebook had, it became second nature.
For cases with more nebulous rules, like soft nudity or graphic animal mutilation, he’d end up consulting Facebook’s secretive rulebooks for the IG Proactive team he worked in – those who assessed content flagged by Instagram algorithms rather than users.
Despite that veneer of machine learning, the algorithm failed to recognise images of flames, so he constantly found himself looking at galleries of fires, crashes and burning car wrecks, usually from Indonesia.
Moderators were separated into different teams that handled specific types of content – fittingly known as ‘queues’, given the stream of content waiting to pop up on their screens that seemed to be constantly growing no matter how much they cleared.
Thin, with slightly curled brown hair, glasses and a bookish demeanour, Josh is never more ill at ease than when he’s talking about himself, rather than the system at large.
At its core, he says, most moderation was about uniformity: about imposing absolute standards on a social platform in an attempt to ‘fix’ the unspoken rules governing human interaction.
At a certain level, the rules descend into absurdity. He remembers his confusion at having to enforce rules that only banned use of the ‘n’ word if it had a hard ‘r’ on the end – and even then not if it was used in a positive context. The race of the speaker was never taken into account.
“It has as much to do with etiquette as anything else,” he explains, citing Seinfeld, Curb Your Enthusiasm and the “comedy of manners” as proof that etiquette is far from absolute. “You're trying to mechanise something that is almost impossible to mechanise.”
It made a lot of the work feel self-defeating. Particularly as the fundamental driver of revenue for platforms is attention, and the content dredged from the internet’s most toxic waters is usually the most engaged with. Most of the time it felt like he wasn’t even being paid to evaluate content at all but “to sit in a chair so that Accenture’s contract can stay in place”.
And just as staff and algorithms get trained to spot dangerous content, the people they are policing think up new ways to evade them. That tendency underpins the rising popularity of Nazi dog-whistle codes like 88 (which stands for ‘Heil Hitler’) and 14 (a reference to the slogan of far-right terrorist David Eden Lane). “It creates this weird arms race with racists,” he sighs. “There’s just so many new slurs all the time.”
Josh likes to compare the work to a factory line, and not just because of the boredom, repetition and digital uniformity. “Industrial accidents are the most serious thing that can happen in a factory,” he says. “But for most people, what actually happens is that their body is, like, worn down in these small ways over time.”
It was similar for content moderators. Being exposed to a never-ending queue of the worst of humanity, at a scale the physical world could never dream of matching, leaves you with a constant background upset, made worse by how monotonous it all becomes.
“Your map of the world is restructured,” he tells me. “Over time, it completely changes the terms on which you view things.”
He tells me the story of a colleague who worked on the child endangerment queue for a little too long and started to struggle going out to the supermarket. “She would see kids with their parents and couldn’t stop thinking ‘Is that kid being trafficked?’,” he recalls.
What Adam Smith once warned about the industrial factory line – that repetitive specialised labour would change and hollow out those doing it – the software factory line perfected.
The long-term aim for content moderators was to develop ‘emotional armour’ that would mean a video of a brutalised corpse had the same impact as entering numbers on an Excel spreadsheet. “But if you start to not let things get to you, you start to feel like a very different person,” Josh explains.
The fix for Accenture was a small group of in-house wellness coaches. Josh always thinks back to the same meditation that one took him through. “They told me to imagine a square in my favourite colour. And then to imagine that square growing larger and larger and coming at you until it encompassed your entire imagined field of vision,” he recalls, quoting the mantra back as if he had been sitting through it yesterday. The example has stuck with him – he brings it up word for word regularly while we talk – precisely because of its “dehumanising” absurdity, of how little it addressed the problem at hand.
Few people really trusted the wellness coaches anyway – rumours spread that anything you said could be passed on to management. At one point, the company even stopped hiring licensed professionals – opting to hire cheaper, unqualified life coaches instead.
“But even then, what you're talking about is wanting to preserve a certain kind of healthy worldview,” as he puts it, “while looking at a world that is completely different from that.”
Moderators would develop their own coping mechanisms where they could. Some days, for Josh, it was aimlessly walking around the office building, picking wildflowers in the park as summer came or listening to music in empty office stairwells.
One almost meditative habit he picked up was moulding sculptures out of the red wax rinds of Babybel cheese. He proudly shows me one sculpture he still holds on to – a surprisingly detailed human foot, with bubblegum wrappers intricately shaped into toenails.
He’s spent a long time thinking about body parts in the last few years. You might too if you had spent as long as he did staring at a never-ending chain of deglovings – injuries where the thin veneer of someone’s skin is ripped away exposing the raw reality of the blood, bone and muscle tissue underneath.
“Our brain just doesn't interpret that stuff well. You have a map of what a face or a hand is supposed to look like,” he says. “They almost seem like they're not totally physical entities. They have like a form, a soul we inherently associate with them.”
He starts on a long tangent but catches himself right as he begins to graphically describe what it looks like when a bear rips open a human face, before apologising. It’s something he does a lot, sometimes as many as six times without taking a breath.
“I feel like I say all this upsetting stuff,” he explains. “But with some of these things, I’ve just intellectualised them.”
But even two years after leaving, the cracks are still there. Perhaps the only silence during our conversations comes when he remembers short stints on the queue that handled child pornography.
During their training, they were shown an example video of an underage girl being raped by an older man. “Just seeing the confusion in this little kid’s eyes,” he sighs, struggling to find the right words. “I don’t know. It was just a really upsetting thing; one of those moments where you feel something shift.”