Book Review 3 || Sarah T. Roberts - Behind the Screen: Content Moderation in the Shadows of Social Media
Dr. Sarah T. Roberts is a leading expert on online content moderation. As an assistant professor of information studies at the University of California, Los Angeles, Roberts has worked to pull back the veil on how social media companies moderate content. Her work earned her the 2018 Electronic Frontier Foundation (EFF) Pioneer Award – even before she published her first book, Behind the Screen: Content Moderation in the Shadows of Social Media, this June.1
Behind the Screen focuses on what Roberts calls “commercial content moderation” – her term for the ways that Google, Facebook, Twitter, and the like hire people to moderate content on their platforms.2 To study this, Roberts conducted an ethnography, an academic term for research that uses in-depth interviews to understand the work people do and the culture they live in. She interviewed moderators in both the United States and the Philippines. In doing so, she shows the psychological costs that moderation exacts from moderators and the implications their decisions have for the rest of us.3
The psychological costs of moderation are high, and to understand why, it helps to know the process by which content is moderated. For the most part, platforms cannot block content before it reaches the platform. YouTube, for instance, receives hundreds of hours of video every minute; there is simply no way to review all that material before it goes online. So platforms rely on users to flag content they believe violates a platform's rules. Once a user flags content, it is screened by algorithms for easy violations like spam. Algorithms cannot catch everything, however, so real people then have to review it. Those moderators are sometimes employees of the company, sometimes contractors, and sometimes stay-at-home parents looking to make an extra dollar. They review the content and decide whether it stays or goes based on the company's rules.
One of the more interesting parts of the process is what happens when content moderators cannot make a decision. If it is not clear whether content should stay up, moderators send it to their superiors, who then deliberate as a group over whether it violates the platform's rules. The process imitates judicial proceedings, complete with precedents based on previous decisions.
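The pipeline described above – user flags, algorithmic screening, human review, group escalation – can be sketched roughly as follows. This is purely an illustrative toy, not any platform's actual system; the rule categories and function names are my own assumptions.

```python
# Illustrative sketch of the moderation pipeline Roberts describes.
# Rule names and thresholds are hypothetical, not any real platform's policy.

AUTO_RULES = {"spam"}            # violations an algorithm can catch reliably
CLEAR_RULES = {"spam", "gore"}   # violations one moderator can decide alone


def moderate(flag_reason: str) -> str:
    """Return the fate of a piece of user-flagged content."""
    # Stage 1: algorithmic screening handles the easy cases.
    if flag_reason in AUTO_RULES:
        return "removed by algorithm"
    # Stage 2: a human moderator decides clear-cut violations.
    if flag_reason in CLEAR_RULES:
        return "removed by moderator"
    # Stage 3: ambiguous cases go up to supervisors, who deliberate
    # as a group and set a precedent for future decisions.
    return "escalated for group deliberation"
```

The key point the sketch captures is that automation only ever handles the first stage; everything ambiguous falls through to human judgment, which is exactly where the psychological costs Roberts documents accumulate.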
Moderators see horrific and disgusting things for hours each day. Everything from child pornography, gore, and scenes of mutilated bodies to hate speech and death threats ends up on the moderator's screen over and over again. In her interviews, Roberts found that the moderators generally brushed off the consequences of this. When she dug deeper, however, she found that it took a serious toll: moderators were isolating themselves from friends and family, struggling to maintain intimate relationships, drinking heavily, and the like. They struggled to access counselors and were often treated as inferiors by others in the tech companies they worked for.
Roberts takes a critical approach to content moderation. In an academic sense, a critical approach is one where the author asks questions about power. The questions at the heart of this approach are: who benefits, and who decides? I greatly appreciate this approach because it lets us see the relationship between free speech online and content moderation. Companies do not regulate speech to fit a partisan agenda – as is often alleged – but to create a positive brand image. They are concerned first and foremost with their bottom line, and content moderation is merely a means of making more money. It is social media and tech companies (and the companies that use those platforms) that benefit from content moderation, and they decide the rules of their platforms. Since online platforms are private, the enforcement of content moderation is also privatized. Citizens have no input. There is no transparency about the content rules companies follow – indeed, those are intentionally kept secret.
For me, privatization is the central problem of censorship online. It seems reasonable to ask where the line ought to be between acceptable and unacceptable speech, but so long as Facebook, Twitter, Google, and so on dominate our online existence, we can never really get a good answer. Commercial content moderators, meanwhile, are casualties of a capitalist system that places the brand image of tech companies over the psychological health of human beings. Algorithms will not save us – only a political struggle to demand transparency, accountability, and human rights in online spaces can solve this problem. Indeed, recent muckraking on the plight faced by moderators has led to positive changes at Facebook.4
Behind the Screen will no doubt have a positive impact on the discussion around content moderation. Its ethnographic method brings to light the human toll of moderation in a way that other studies simply cannot. Its critical approach makes clear that this is not an inevitable problem, but a solvable one – provided we knock on the right doors. It shows that a world of less human suffering and more free speech is possible.5,6
- 1. You can read about the award, to be renamed the Barlow Award after one of EFF's co-founders, here.
- 2. Roberts contrasted commercial content moderation with community content moderation that was very common on the early internet and still exists on web forums and websites like Reddit. In community content moderation, a member of the forum is somehow selected to be the moderator without compensation.
- 3. The book also dives into the ways that race and ethnicity play into the way that commercial content moderators sell themselves. Doing this kind of moderation requires cultural expertise that is hard to describe – and this carries implications for international contractors in India and the Philippines. This is not my focus here, but it is a fascinating point worth buying the book to learn about.
- 4. You can read a good article about these changes here.
- 5. For more information on commercial content moderation, you should read Roberts's blog, available here. I have only browsed it so far, but it seems like a really useful repository, so I will probably return to it for later posts.
- 6. Oh, and there was a good interview Roberts did with the New Yorker about the book. You can read it here.