‘Behind the Screen’ illuminates the invisible, indispensable content moderation industry

'This invisibility is by design.'

The moderators who sift through the toxic detritus of social media have gained the spotlight recently, but they’ve been important for far longer — longer than internet giants would like you to know. In her new book “Behind the Screen,” UCLA’s Sarah Roberts illuminates the history of this scrupulously hidden workforce and the many forms the job takes.

It is, after all, people who look at every heinous image, racist diatribe and porn clip that gets uploaded to Facebook, YouTube and every other platform — people who are often paid like dirt, treated like parts, then disposed of like trash when worn out. And they’ve been doing it for a long time.

True to her academic roots, Roberts lays out the thesis of the book clearly in the introduction, explaining that although content moderators or the companies that employ them may occasionally surface in discussions, the job has been systematically obscured from sight:

The work they do, the conditions under which they do it, and for whose benefit are largely imperceptible to the users of the platforms who pay for and rely upon this labor. In fact, this invisibility is by design.

Roberts, an assistant professor of information studies at UCLA, has been looking into this industry for the better part of a decade, and this book is the culmination of her efforts to document it. While it is not the final word on the topic — no academic would suggest their work was — it is an eye-opening account, engagingly written and not at all the tour of horrors you may reasonably expect it to be.

After reading the book, I talked with Roberts about the process of researching and writing it. As an academic and tech outsider, she was not writing from personal experience or even commenting on the tech itself, but found that she had to essentially invent a new area of research from scratch spanning tech, global labor and sociocultural norms.

“Opacity, obfuscation, and general unwillingness”

“To take you back to 2010 when I started this work, there was literally no academic research on this topic,” Roberts said. “That’s unusual for a grad student, and actually something that made me feel insecure — like maybe this isn’t a thing, maybe no one cares.”

That turned out not to be the case, of course. But the practices we read about with horror, of low-wage workers grinding through endless queues of content from child abuse to terrorist attacks, have been in place for years and years, all the while carefully moderated out of public view by the companies that rely on them. Recent events have changed that.

“A number of factors are coalescing to make the public more receptive to this kind of work,” she explained. “Average social media users, just regular people, are becoming more sophisticated about their use, and questioning the integration of those kinds of tools and media in their everyday life. And certainly there were a few key political situations where social media was implicated. Those were a driving force behind the people asking, do I actually know what I’m using? Do I know whether or how I’m being manipulated? How do the things I see on my screen actually get there?”

A handful of reports over the years, like Casey Newton’s recent one in The Verge, also pierced the curtain behind which tech firms carefully and repeatedly hid this unrewarding yet essential work. At some point the cat was simply out of the bag, though few people recognized it for what it was.

At just over 200 pages, “Behind the Screen” isn’t quite an afternoon read, but apart from a few exposition dumps, it’s a quick one, filled with direct quotes from workers in the content moderation business and context that goes deep but not too deep. Roberts deliberately examines not just the industry itself but the patterns that have created it, and the ones that flow from it.

The first step Roberts takes is to essentially explain that the content moderation industry exists. This is not necessarily obvious because, as is made clear throughout the book, the companies involved have made the choice for years to hide the process from public view and avoid talking about it even when directly questioned.

Roberts writes:

That the topic has seized the public’s consciousness across academic, journalistic, technological, and policy-making sectors has come largely in spite of continued opacity, obfuscation, and general unwillingness to discuss it on the part of social media firms that rely on these practices to do business…

It is the erasure of these human traces, both literally and in a more abstract sense, that is so fascinating, and we must constantly ask to whose benefit such erasures serve.

The problem as she sees it is not only, or even primarily, that the industry of commercial content moderation is relatively unknown, but that as part of that process, the people who make up the industry are being forgotten and abandoned. Considering the size and growth of the global market for these services, that amounts to an ever-expanding humanitarian tragedy.

To combat that, Roberts dedicates three chapters to highlighting the stories of individuals currently or formerly in the content moderation business.

First is a group of three young people who worked directly — or so they thought — for a major company in Silicon Valley she calls “Megatech,” but the identity of which a savvy reader will be able to infer. Second come a man and woman who have worked as third-party moderators as part of “boutique” firms, offering moderation (and astroturfing) as a service. Third is a handful of workers at one of the many call center companies in the Philippines, undertaking the work for plummeting prices as part of the global race to the bottom.

Moderation’s many forms

The Megatech and boutique chapters are more history than indicators of how the industry operates today. The interviews for the first took place in 2012, and describe a situation that has almost certainly changed since then. The workers had an unusual and severe contract structure under which they were essentially forced to leave after a year, and were given very little in the way of resources, no possibility of advancement and only uncertain goals and guidelines for their work. It’s extremely clear that Megatech had (and perhaps has) no idea what it was doing and placed little value on the work of these people, except as guards against liability.

The boutique operators emphasize the fundamental banality of their work, but also the ubiquity of commercial moderation on every forum, Facebook page, video channel and so on. The woman in particular expresses extreme exasperation at the Sisyphean task of beating back an avalanche of racism, sexism, homophobia and other evils on the most ordinary of news posts.

The extensive quotes (Roberts makes a point of putting things in the workers’ own words so as not to further erase their presence) paint a compelling picture of both operations, especially “Megatech.” Although news reports over the last couple of years have highlighted some of the work done at the boutique companies to which tech firms now outsource this work, it is very interesting to have a snapshot of the industry at these various points in time, as told in direct human voices instead of PR-vetted CEO blog posts.

Although the human accounts here are valuable and indeed somewhat the point of the work, it would be nice to be able to better place them within the context of the tech world then and now. Documenting the past is important (especially when it has been obscured), but it’s hard sometimes to connect it to the present.

The chapter on the Philippines, on the other hand, is disheartening precisely because it points to the direction this type of work is going. As companies like Facebook make big promises about hiring 20,000 moderators, it’s increasingly clear that they don’t mean 20,000 new Facebook employees. They mean these folks, who, while happy to have a job at one of the islands’ many call center businesses, are rapidly circling the drain as far as the quality and price of the work are concerned.

One of the workers describes how their moderation quotas were simply doubled recently with no corresponding increase in pay or hours. That’s because content moderation is a global market, and increasingly the work goes to the lowest bidder, which at present happens to be India. One can only imagine the conditions that allow the work to be done at that price and scale there.

As Roberts points out, this is not simply a question of the cost of doing business, but that a global market is being established that deliberately exploits countries and regions in dire economic straits to perform unpleasant, repetitive work — invisibly, while the companies requesting it talk about the quality of the AI they say is doing the bulk of it. “The connection between economic crisis in a region and an increase in the availability of competent labor that is exceedingly cheap cannot be lost in the analysis,” she writes.

Silence intensifies

Notable by their absence are the giant tech companies that necessitate this industry, the result of their choice to allow content first and moderate it later (something we’ve come to see as natural, though it is by no means the only or best method). It is perhaps naive to think that the Googles, Amazons, Facebooks and other platforms of the world, after attempting to keep their need for these services secret for years, would want to speak candidly about them, or indeed at all.

Roberts explained the lack of comment from the companies in question.

“I didn’t want to tip my hand until I was really established with the book and research. So it wasn’t until 2016-17, when I had come to UCLA and I couldn’t really operate under the radar any more, that I did reach out,” she recalled. “I was really excited because it was the first time I was really outing myself as someone engaged. Like, I’m writing for the Atlantic and working for UCLA, that’s pretty good, right?”

As you might imagine, she received nothing but a cold shoulder — only one company responded to her inquiries, and even then only to ask when the article she was writing was going live.

“So I sort of decided that, honestly, I wasn’t interested anyway,” Roberts said. “I felt I was going to get lip service, and that’s been covered plenty. Since that time I have built up relationships with people inside firms who are interested, I think legitimately, on improving things internally, using findings from people like me. But at the time not only did I not have access but I felt like it might be an asset not to go there. I mean… in a way, who cares? Who cares what they say? I was interested in other kinds of evidence. I had other kinds of evidence.”

“Behind the Screen” is limited for the most part to observation and some context-setting by way of previous literature and history that most people will not have heard of. Roberts recognizes that this is a quickly evolving space and to prescribe for it would be folly. All the same there are timely warnings and astute considerations in the book that anyone interested in this space will find valuable.

I asked Roberts what she thinks happens next in this space, having lived and breathed it for the last few years. Is it regulation? Scholarly work? Unions? New company policies?

“All of those things are on the horizon, but some of them are at odds with each other,” she responded. “I think there is an under-representation of labor organization in these sectors; even firms that are out ahead trying to improve wages and doing other things to that end, they’re doing it with these specters in the background — worker organization, which of course we know Silicon Valley is allergic to, and regulation. We know that Trump is threatening an executive order around this… for all the bad that I see and a lack of transparency in moderation at the firm level, I’m pretty disturbed by the idea of a wanton executive act dashed off by the president. I’d rather see careful deliberation in Congress… though displays during Senate hearings last summer didn’t really instill a lot of confidence.”

“But because this is a global practice, we’re going to see these things play out differently in different parts of the world,” she continued. “The EU is way out ahead — so those pressures may come from outside the United States. So the pressures may be legislation, regulation and unionization. Those kinds of things are going to push the industry, but firms might decide that when they have better moderation practices they can actually tout those as a value add rather than sneaking around and pretending computers do everything.”

You can pick up “Behind the Screen” at your favorite online bookstore or Yale University Press.