The Cleaners
This column is a reprint from Unwinnable Monthly #114. If you like what you see, grab the magazine for less than ten dollars, or subscribe and get all future magazines for half price.
———
Every week, Megan Condis and a group of friends get together for Documentary Sunday, a chance to dive into the weird, the wacky, the hilarious and the heartbreaking corners of our culture. This column chronicles all of the must-watch documentary films available for streaming.
———
Dirt. Blood. Garbage. Shit. Somebody has to deal with them. Often those somebodies are the people with the least social power in a given system: the poor, the uneducated, immigrants and foreigners. They are the janitors, the sanitation workers, the hospice workers. Not only do they protect our bodies from the dangers these contagions pose, they also protect our sense of innocence. They allow us to pretend we live in an orderly, clean, antiseptic world, one where meat comes from the supermarket, not the farm, and where illness and death take place Somewhere Else, out of public view.
In the age of social media, a new kind of custodial labor has become necessary to maintain this illusion. The Cleaners (Block and Riesewieck, 2018) explains how content moderators on platforms like Facebook, Instagram and Twitter, most of them employed by third-party vendors in the Philippines, deal with the massive task of fighting trolls, abusers, hate groups and even terrorist organizations on our behalf, all so that our newsfeeds can remain pleasant and emotionally hygienic.
We don’t typically think of them unless something goes wrong; for example, when the Christchurch shooter unleashed a video of his attack on Facebook, it spread like wildfire and the tech giant scrambled to take it down as quickly as possible. Ultimately, the company reports, it removed more than 1.5 million copies of the video in the first 24 hours following the attack, over 1.2 million of them blocked at the point of upload. When an event like this happens, it disrupts our social media experience. It feels like an aberration, an interruption of the flow of information we use these platforms to plug into. And yet we are only capable of maintaining this viewpoint because of the army of content moderators standing between us and another version of reality, one in which child pornography, graphic images of murder and suicide, and a flood of hate speech and terrorist propaganda are disturbingly normal occurrences.
It is a dangerous job. One of the moderators likens his work to that of a sniper, painting a picture of an embattled individual left behind enemy lines, trying their best to take out the “bad guys” based on limited information while causing as little collateral damage as possible. It is an image that acknowledges that, in any culture war, there are going to be unintended casualties. And it is an indication that the job itself can inflict a kind of psychic violence on those who do it. It is the kind of job that follows you home. One interviewee recounts undergoing employee training to help her better recognize sexually explicit content, then dreaming of penises for weeks afterward. Another describes his newly acquired expertise in jihadist beheading videos in a disturbingly cavalier monotone: “Sometimes [the victims] are lucky that it’s just a very sharp blade that’s being used. But the worst scenario would be the little knife, similar to a kitchen knife which is not that sharp. Sometimes it would last for one minute before the head is cut.”
It is a thankless job. In addition to acting as human shields protecting users from an onslaught of horror day in and day out, moderators serve as the fig leaf their employers deploy in times of intense scrutiny, as when a UN investigator condemned Facebook’s failure to curtail the hate speech that fueled the genocide of the Rohingya people in Myanmar. They are expected to sort through upwards of 25,000 images a day using guidelines handed down from corporate, over which they have little to no control. In exchange, they earn anywhere between $1 and $3 an hour and, if they’re lucky, a quarterly group therapy session that does little but protect their employer from liability in the event of a worker’s suicide.
They are the ones preventing us from drowning in a sea of pornography and snuff films every time we open our phones, the human spam filters that make the internet usable and even pleasurable. If social media allows us to externalize our egos for the world to see, then they are the ones who deal with the id. It’s a dirty job, and somebody’s got to do it. But that doesn’t mean that we can turn a blind eye to the devastating effects that this work can have on the people sifting through our digital trash.
———
Megan Condis is an Assistant Professor of Communication Studies at Texas Tech University. Her book, Gaming Masculinity: Trolls, Fake Geeks, and the Gendered Battle for Online Culture, is out now from the University of Iowa Press.