
4. Content Moderation and Volatile Countries

Pursuing a business strategy of aggressive global growth, Facebook has created for itself the daunting task of moderating the prodigious output of its 2.5 billion users worldwide. These users express themselves in more than 100 languages and in all kinds of cultural contexts, making effective content moderation exceedingly difficult. Twitter and YouTube face similar challenges. To handle the crucial responsibility of content moderation, Facebook and its rivals have chosen largely to outsource the human part of the function, leading to the problems examined earlier in this report.

These are not, however, the only problems related to the outsourcing of moderation. Other harms—in some cases, lethal in nature—have spread as a result of Facebook’s failure to ensure adequate moderation in non-Western countries experiencing varying degrees of turmoil. In these countries, the platform and/or its affiliated messaging service WhatsApp have become important means of communication and advocacy but also vehicles to incite hatred and, in some instances, violence. Myanmar is the most notorious example of this phenomenon; others include Cameroon, the Central African Republic, Ethiopia, Nigeria, Sri Lanka, and India. In some of these places, Facebook at times has, in effect, outsourced monitoring of the platform to local users and civil society organizations, relying too heavily on activists as a substitute for paid, full-time content reviewers and on-the-ground staff members. “We were slow to identify these issues,” says Guy Rosen, Facebook’s vice president for integrity. “Unfortunately, it took a lot of criticism to get us to realize, ‘This is something big. We need to pivot.’”

The outside criticism, especially on Myanmar, launched a three-year corporate “journey,” Rosen adds. He and his fellow executives contend the experience has made Facebook in 2020 far more vigilant and better prepared to respond to crises in at-risk countries. In 2018, the company formed a Strategic Response team, based in Menlo Park but prepared to swoop into such countries at the first hint of trouble. Reporting directly to Chief Operating Officer Sheryl Sandberg, the number two executive at the company, the team advises on such issues as where more moderators may be needed and consults with engineers on how to adjust technology to minimize rumors and misinformation that can lead to violence. The Strategic Response team, whose size Facebook declines to reveal, helped launch a feature for Sri Lanka that restricts users to sharing only posts from their Facebook friends. By “adding more friction,” team member Sarah Oh says in an interview, the platform prevents some incendiary content from going viral.
