Final Paper

Most scholars and experts agree that social media platforms such as Google, Facebook, and Twitter have a right and a responsibility to moderate the nature of their user-generated content. However, the people who moderate this content are often exploited by the very platforms that depend on them. While their value to society has increased exponentially, content moderators have suffered at the hands of these tech companies. Specifically, the work has taken a psychological toll on moderators because much of the content they review is graphic; in some documented accounts, it has even caused PTSD. Furthermore, content moderators face serious problems with pay. Because they are typically contractors rather than employees of the tech companies they work under, they labor in conditions that are unbearable. Even though this has been an ongoing issue, the rest of society has not heard about it because it has been kept secret. Most importantly, content moderators are forced to follow strict guidelines backed by non-disclosure agreements: a legal document each moderator signs that "outlines confidential material, knowledge, or information that the parties wish to share with one another for certain purposes, but wish to restrict access to or by third parties" (Wikipedia). Due to these NDAs, content moderators are left voiceless and invisible. Combating these NDAs to make their cause known, however, is a formidable challenge. NDAs are not only contracts; they are also meant to instill fear in anyone who considers speaking out against the social media platforms. Therefore, content moderators must advocate for themselves by challenging their agreements and working through the legal system to confront these tech companies. Below, I discuss the events unfolding around content moderation and the steps being taken to solve these issues.

Content moderators have viewed graphic content that has caused emotional and physical trauma. In February 2019, The Verge published an article titled "The Trauma Floor," which highlighted the negative effects of content moderation and the labor that moderators withstand. For example, Chloe, a former content moderator for Facebook, began having panic attacks after watching a video of a man dying. Chloe, however, was not compensated for her damages, so her only course of action was to raise awareness. In this respect, Chloe is an anomaly within the vast net of content moderators: due to NDAs, the majority are not allowed to speak about their situations.

NDAs disable or remove any power that a content moderator might have over tech companies. According to The Verge, "As first responders on platforms with billions of users, they are performing a critical function of modern civil society, while being paid less than half as much as many others who work on the front lines. They do the work as long as they can — and when they leave, an NDA ensures that they retreat even further into the shadows." According to Sarah T. Roberts, a renowned assistant professor of information studies who focuses on content moderation and its underworld culture, content moderators "are frequently required to sign non-disclosure agreements (NDAs) about the nature of their work in which they pledge to not reveal the particulars of their employment or job duties" (page 15). Therefore, they are bound never to speak about their trials and tribulations. As a result, an entire workforce is denied basic protections such as healthcare, workers' benefits, and vacation. For example, Josh Santos was a recent college graduate who needed to find a job in a difficult economy. Josh applied for and got a job at MegaTech, a notable company in Silicon Valley, but he soon realized that MegaTech was exploiting him through his working conditions. In Sarah T. Roberts's Behind the Screen, Josh states, "My last job paid me three times less when I used to work as a server at a restaurant. And it was like so much more work for such little, like, just like no, almost minimum wage." NDAs are a form of inequality and do not allow workers to be fully represented.

Since NDAs bind workers, moderators must find other ways to challenge these tech companies, and one avenue is the legal system. According to the Washington Post, Selena Scola, a former Facebook content moderator, is suing the company because she was diagnosed with PTSD after reviewing large amounts of graphic content and images. Her lawsuit speaks for the many content moderators who have been similarly affected by the companies they work for. According to the article, "The lawsuit does not go into further detail about Scola's particular experience because she signed a nondisclosure agreement that limits what employees can say about their time on the job. Such agreements are standard in the tech industry, and Scola fears retaliation if she violated it, the suit says. Her attorneys plan to dispute the NDA but are holding off on providing more details until a judge weighs in." The NDA, however, can be challenged in the courts, which would allow content moderators to share their stories.

Similarly, an article published by Courthouse News described two Microsoft content moderators who suffered PTSD but did not speak about it because of their NDA. The article states, "Henry Soto and Greg Blauert say Microsoft failed to warn them about the dangers of the job and failed to provide psychological support. They sued the company in King County Court on Dec. 30, alleging negligence, disability discrimination and violations of the Consumer Protection Act." NDAs can be challenged when employers fail to warn workers about the possibility of psychological trauma, and this opening allows content moderators to take the initiative. The result is an improvement in working conditions for a workforce that is underrepresented.

NDAs limit the power of content moderators and their ability to advocate for themselves. Paradoxically, however, challenging these agreements in court has proven to be the most accessible way to confront tech companies that fail to provide adequate support for their workers. Content moderators can scrutinize social media platforms when their contracts lack provisions for things such as psychological support and salary negotiation. As a result, social media platforms have begun taking meaningful steps to address the issue; Facebook, for instance, has increased moderators' pay in certain parts of the country. Content moderators are thus progressing toward a more equitable relationship with these social media platforms.
