Discuss: The Cost of Keeping Social Media Safe
Around the world, billions of people use Facebook to catch up with friends and relatives and to keep up with current trends. To help ensure that material posted to Facebook is consistent with its standards, the company relies on thousands of content moderators around the world who monitor what is posted and remove content that violates those standards.
Most of these content moderators are not Facebook employees, however. Instead, Facebook outsources this work to third-party companies. One of these is Sama, a US-based company that employs hundreds of workers from across Africa at a branch office in Kenya. While the average American content moderator for Facebook makes approximately $18 per hour, the average Sama employee makes only $1.50 per hour. Moreover, Sama's moderators work a 45-hour week and receive very little in the way of benefits (Perrigo, 2022).
The job of a content moderator is very difficult. Over a nine-hour shift, content moderators are exposed to extremely disturbing content, including rape, child abuse, and suicide. One worker described the job as a form of torture, and workers have reported symptoms consistent with post-traumatic stress disorder (PTSD). A Sama employee in Kenya who tried to organize the workers to bargain for better pay and more humane working conditions was fired, effectively ending the unionization effort (Perrigo, 2022).
In its defense, Sama's leaders have argued that pay for its content moderators is in line with local market conditions. Further, they contend that the employee in question was fired not for unionization efforts but for poor performance on the job. Facebook periodically sends its own employees to visit content moderators in Kenya and elsewhere to supervise operations, but Sama has reportedly told its employees not to discuss pay or working conditions with the Facebook representatives, and workers routinely sign non-disclosure agreements (NDAs) that prohibit them from discussing their work with the media (Perrigo, 2022).
From Facebook's perspective, the outsourcing of content moderation is both practical and cost-saving. The workers in Kenya are fluent in numerous languages, allowing them to review content posted from throughout Africa. In addition, since these workers are not Facebook employees, the company is not legally responsible for their low pay and scant benefits.
But our class is concerned not just with legal responsibility but with ethical responsibility, and because our focus is on ethical issues in technology, we can further consider the question of Facebook's responsibility to the contract workers who help protect users from encountering disturbing content. In your initial response, consider the following questions:
As we've discussed, equity refers to the fair treatment of everyone, especially those with limited power and resources. In comparison to American content moderators, are Sama's content moderators being treated equitably? If not, what is Facebook's ethical responsibility in this matter?
Given the traumatizing content that all content moderators are forced to view as a condition of their employment, imagine that Facebook could develop software to automatically detect and remove content that violates its standards, without the need for human workers to be directly involved. Would content moderators be better off with low-paying, traumatizing jobs, or with no jobs at all?