‘Behind the Screen’ illuminates the invisible, indispensable content moderation industry


The moderators who sift through the toxic detritus of social media have gained the spotlight recently, but they’ve been important for far longer — longer than internet giants would like you to know. In her new book “Behind the Screen,” UCLA’s Sarah Roberts illuminates the history of this scrupulously hidden workforce and the many forms the job takes.

It is, after all, people who look at every heinous image, racist diatribe, and porn clip that gets uploaded to Facebook, YouTube, and every other platform; people who are often paid like dirt, treated like parts, and disposed of like trash when worn out. And they’ve been doing it for a long time.

True to her academic roots, Roberts lays out the thesis of the book clearly in the introduction, explaining that although content moderators or the companies that employ them may occasionally surface in discussions, the job has been systematically obscured from sight.

The work they do, the conditions under which they do it, and for whose benefit are largely imperceptible to the users of the platforms who pay for and rely upon this labor. In fact, this invisibility is by design.

Roberts, an assistant professor of information studies at UCLA, has been looking into this industry for the better part of a decade, and this book is the culmination of her efforts to document it. While it is not the final word on the topic — no academic would suggest their work was — it is an eye-opening account, engagingly written, and not at all the tour of horrors you may reasonably expect it to be.

After reading the book, I talked with Roberts about the process of researching and writing it. As an academic and tech outsider, she was not writing from personal experience or even commenting on the tech itself, but found that she had to essentially invent a new area of research from scratch spanning tech, global labor, and sociocultural norms.

“Opacity, obfuscation, and general unwillingness”

“To take you back to 2010 when I started this work, there was literally no academic research on this topic,” Roberts said. “That’s unusual for a grad student, and actually something that made me feel insecure — like maybe this isn’t a thing, maybe no one cares.”

That turned out not to be the case, of course. But while the practices we read about with horror, of low-wage workers grinding through endless queues of content from child abuse to terrorist attacks, have been in place for years and years, the companies that rely on this labor have successfully kept them out of public view. Recent events have changed that.

“A number of factors are coalescing to make the public more receptive to this kind of work,” she explained. “Average social media users, just regular people, are becoming more sophisticated about their use, and questioning the integration of those kinds of tools and media in their everyday life. And certainly there were a few key political situations where social media was implicated. Those were a driving force behind the people asking, do I actually know what I’m using? Do I know whether or how I’m being manipulated? How do the things I see on my screen actually get there?”

A handful of reports over the years, like Casey Newton’s recent one in The Verge, also pierced the curtain behind which tech firms carefully and repeatedly hid this unrewarding yet essential work. At some point the cat was simply out of the bag, though few people recognized it for what it was.


Facebook and YouTube’s moderation failure is an opportunity to deplatform the platforms


Facebook, YouTube, and Twitter have failed at the task of monitoring and moderating the content that appears on their sites; what’s more, they were failing long before they acknowledged it was a problem. But their incidental cultivation of fringe views is an opportunity to recast their role as the services they should be rather than the platforms they have tried so hard to become.

The struggles of these juggernauts should be a spur to innovation elsewhere: While the major platforms reap the bitter harvest of years of ignoring the issue, startups can pick up where they left off. There’s no better time to pass someone than when they’re standing still.

Asymmetrical warfare: Is there a way forward?

At the heart of the content moderation issue is a simple cost imbalance that rewards aggression by bad actors while punishing the platforms themselves.

To begin with, there is the problem of defining bad actors in the first place. This is a cost that must be borne from the outset by the platform: With the exception of certain situations where they can punt to external definitions (of hate speech or hate groups, for instance), they are responsible for setting the rules on their own turf.

That’s a reasonable enough expectation. But carrying it out is far from trivial; you can’t just say “here’s the line; don’t cross it or you’re out.” It is becoming increasingly clear that these platforms have put themselves in an uncomfortable lose-lose situation.

If they have simple rules, they spend all their time adjudicating borderline cases, exceptions, and misplaced outrage. If they have more granular ones, there is no upper limit on the complexity, and they spend all their time defining the rules to fractal levels of detail.
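To make the first horn of that dilemma concrete, consider a minimal sketch of the “simple rules” approach: a naive filter that flags any post containing a banned term. Everything below is hypothetical and for illustration only, not any platform’s actual system. Even this trivial rule fails in both directions at once:

    # A minimal sketch, not any platform's actual system: a naive
    # substring filter over a tiny hypothetical term list, showing how
    # "simple rules" generate borderline cases in both directions.

    BANNED_TERMS = {"attack", "scum"}  # hypothetical policy terms

    def violates_policy(post: str) -> bool:
        """Flag a post if it contains any banned term: the whole 'simple rule'."""
        text = post.lower()
        return any(term in text for term in BANNED_TERMS)

    # False positive: benign speech tripped by an innocent use of a
    # banned word (a cousin of the classic "Scunthorpe problem").
    assert violates_policy("New study examines heart attack risk factors.")

    # False negative: trivially obfuscated abuse sails straight through.
    assert not violates_policy("you are all sc*m and everyone knows it")

Patching the false positive means enumerating exceptions; patching the false negative means modeling obfuscation, slang, and context. Either fix pushes the rulebook toward exactly the fractal complexity described above.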

Both solutions require constant attention and an enormous, highly organized and well-informed moderation corps, working in every language and region. No company has shown any real intention of taking this on; Facebook famously contracts the responsibility out to shabby operations that cut corners and produce mediocre results (at huge human and monetary cost), while YouTube simply waits for disasters to happen and then quibbles unconvincingly.

