life in pixels

Who Pays for Silicon Valley’s Hidden Costs?

Content moderators working for TaskUs, an American outsourcing tech company, in Manila, Philippines. Photo: Moises Saman/Magnum Photos

There’s a famous Ursula K. Le Guin short story called “The Ones Who Walk Away from Omelas,” about a fantasy city called Omelas, a joyous metropolis without guilt or violence, populated by “mature, intelligent, passionate” people who celebrate beautiful festivals and shop at (duh) a “magnificent Farmer’s Market.” Le Guin is short on political or organizational specifics; Omelas is whatever utopia you can conjure up for yourself, with one extra feature: Somewhere in the city, in a grimy basement underneath a handsome building, a 10-year-old child, “feeble-minded,” terrified, and starved, has been imprisoned, begging desperately to be set free. The people of Omelas are aware of the prisoner. But they will never free it, because — remember, this is a fantasy parable — “their happiness, the beauty of their city, the tenderness of their friendships, the health of their children … depend wholly on this child’s abominable misery.” If a citizen were to unlock the basement and clean and feed the child, “all the prosperity and beauty and delight of Omelas would wither and be destroyed.”

I doubt I was the only person who thought of Le Guin’s short story while reading Casey Newton’s feature story on Facebook content moderation at the Verge this week. For most users, Facebook is not exactly an Omelas — “joyous” is not the adjective I’d think to describe my experiences on it — but for advertisers, Facebook’s actual clients, it’s as close to a utopia as they’ll ever get. By effortlessly targeting highly specific demographics at exceedingly low cost, it offers up to brands all the promise of internet advertising, but with almost none of the messiness (fraud, pornography, violence, abuse) you might find gambling with other ad networks. It is, in effect, a clean, well-lit version of the internet, and, for advertisers, that is indeed joyous.

As in Omelas, however, the joyousness of Facebook’s hundreds of thousands of advertisers depends wholly on the misery of a small number of other people: its content moderators. For his article, Newton spoke with several current and former employees of Cognizant, a “professional services vendor” contracted by Facebook to moderate its platform, removing the content that violates Facebook’s terms of service. In practice, this can mean spending your days scrolling through post after post of graphic porn, hate speech, and videos of death and violence. “People develop severe anxiety while still in training,” Newton writes, “and continue to struggle with trauma symptoms long after they leave,” but are given minimal support and job stability — each employee gets nine “wellness time” minutes per day, to be used if they feel traumatized and need to stop moderating. Meanwhile, “the conspiracy videos and memes that they see each day gradually lead them to embrace fringe views.” For their trouble, they are paid $28,800 a year, around 12 percent of the average salaried Facebook employee’s total compensation of $240,000.

The moderators sign nondisclosure agreements that prevent them from talking about what they've done for Facebook — or even, apparently, from acknowledging that they've worked for Facebook at all. This theoretically protects private Facebook user data from leaking out. But it also has the effect of hiding the 1,500 content moderators from the outside world, preventing them from even describing the difficulty of their jobs, let alone agitating for better working conditions. Even full-time Facebook employees are apparently unaware of the extent to which Facebook contracts out its content moderation, and the conditions under which that moderation is taking place, despite its importance. “Why do we contract out work that’s obviously vital to the health of this company and the products we build?” one employee asked on an internal company message board.

The answer to that question can be found in a recent report in the Information. As Reed Albergotti reports, Facebook employees have complained since at least 2008 that the company’s “chronically understaffed” group of content moderators (in 2009, Facebook had only 12 moderators for 120 million users) didn’t have the resources to address the harassment, hate speech, or misinformation that would come to characterize Facebook discourse. And yet employees who reported problems “were rebuffed because of the belief that the company should operate with as few employees as possible.” Rather than address its various mounting content issues with dedicated staff hires, the company farmed out moderation work to contractors, needing to maintain a cost structure that relied on low labor costs, and believing that eventually an “engineering solution” could be developed to keep the platform clean and safe. This bargain has worked out well for Facebook, which by 2016 was raking in an incredible $600,000 per full-time employee in net profit. It has worked out rather less well for the rest of us — particularly the contractor moderators with PTSD-like symptoms. (Facebook’s profit margins have been thinning over the last two years, naturally, as it expands its “security” team, which includes moderators.)

The idea that you can keep your labor costs low by relying on software automation, creating the eye-popping profit margins that venture capitalists approvingly call “software margins,” is a foundational belief of 21st-century Silicon Valley. Albergotti quotes the mantra COO Sheryl Sandberg brought to Facebook from Google: “People don’t scale.” And it’s true that there are some tasks — like selling and placing digital ads — that are more efficiently and profitably done by software programs. But there aren’t that many tasks that programs can do as well as human beings, and not many tasks that can be automated without the ongoing help and work of humans. Which means that any company bragging about automated solutions is likely hiding a much larger shadow workforce supporting those solutions, like the one Facebook employs through Cognizant.

These shadow workforces are tasked with undertaking the most tiring and repetitive work required for a software business. Content moderation on platforms, which is outsourced not just to services in the U.S. but also to firms in the Philippines and Bangladesh, is probably the most obvious one: Reuters reported this week on content reviewers in India, who “regularly watch live videos of suicide attempts” (not to mention “beheadings, car bombings and electric shock torture sessions”) for wages as low as $6 a day. But there’s also data processing: The rise of machine learning requires human beings to create, label, and process data sets — identifying photographs, for example, or tracing over objects in drawings — that are used by the artificial-intelligence programs. Google hires cheap workers through crowdsourcing sites for this reason; in China, data processing for machine learning has created a whole new line of blue-collar employment. We’re eminently familiar with the idea that there might be bots behind the “humans” we encounter online. We’re less aware of the number of humans whose work is necessary to sustain functional bots. The rote daily work that goes into creating magical artificial-intelligence applications like facial recognition is rarely mentioned by the companies profiting off of it — but, obviously, that would make the AI seem less magical.

Such arrangements are endlessly common in Silicon Valley. Magically convenient services and devices are often subsidized not just by money-burning investors but also by exploitative labor arrangements. Amazon purchases arrive quickly in part because the company’s warehouse workers are relentlessly tracked and don’t take bathroom breaks; Uber rides are cheap in part because the median hourly wage of an Uber driver is $10. Obviously, being a paid Uber driver is no closer to being a chained-up child than riding in an Uber is to being in paradise, but you can begin to understand the bargain being struck by the citizens of Omelas in Le Guin’s story.

The truth is that “software margins” (or what investors hope will eventually be software margins) are rarely achieved solely through automation and innovation. Most Silicon Valley titans, especially the software giants that have arisen over the last two decades, have become adept at externalizing their costs. Users contribute content for free. Contractors work for cheap. The burden of moderating content has for the last ten years been borne by someone — in some cases under-compensated contractors, in some cases users moderating content themselves for free, in some cases the victims of abuse or harassment, and, in some cases, the public sphere as a whole. Just rarely Facebook. If Facebook’s shadow-workforce practices — which were widely reported by journalists and studied by academics well before the Verge’s story this week, and which are no different from content-moderation practices on many social networks — are being singled out now, it may be because the platform’s conveniences no longer seem worth the full social cost.

The question we face now is whether the company will internalize those costs, and whether its business proposition is still attractive if it does. Facebook may still believe that these moderators’ jobs will eventually be done by machines, but the idea that an automatic program could match or surpass a well-trained human at the sensitive, evolving interpretive work required by content moderation still seems vanishingly distant. Facebook will need content moderators now and in the foreseeable future, and the contractors it employs understand that. For all the potential trauma of their jobs, Newton writes, the moderators he spoke with were proud of the work they did, and regarded their jobs as important. More to the point, they recognize their value to Facebook: “If we weren’t there doing that job, Facebook would be so ugly,” one moderator told him. “We’re seeing all that stuff on their behalf.” And Facebook employees have expressed concern for the contractor-moderators on internal message boards. (Or, at least, concern for the quality of the moderation.) If Facebook’s employees and its contractors were to band together, they’d be in a strong position to demand better conditions. And if adequately staffing and compensating content moderators is too expensive for Facebook to sustain, that doesn’t seem like a problem with moderation. It seems like a problem with Facebook.