Proposal: Trust And Verification 

Expanding Moderation Programs & Policies 

Media Commercial Content 
Wikipedia (overarching overview of what I’m looking at):

On Internet websites that invite users to post comments, a moderation system is the method the webmaster chooses to sort out contributions that are irrelevant, obscene, illegal, or insulting from those that are useful or informative.

Commercial Content Moderation: “monitoring and vetting user-generated content (UGC) for social media platforms of all types, in order to ensure that the content complies with legal and regulatory exigencies, site/community guidelines, user agreements, and that it falls within norms of taste and acceptability for that site and its cultural context.” 

Examples of Content Moderation: 

  1. According to Digital Information World
    1. “This patented new feature is going to give moderators and admins a little more power over who can post in groups and who can see what has been posted. This is intended to reduce the number of negative posts/comments that admins have to deal with and will help the moderators turn their groups into better communities that can actually help bring people together if they have interests that are more or less similar.”
  1. NPR: Facebook contracts with other companies to hire the content moderators and provide office space. Originally, these sites were in other countries, but the operations have expanded to the U.S., including sites in California, Arizona, Texas and Florida. Later, we’ll talk about how Facebook is preparing to launch what’s been described as a Supreme Court to rule on contested content decisions.
    1. The Facebook “Supreme Court” for content rulings
      1. “Facebook has had, I think, a big internal debate, and there’s been a big national debate or international debate around Facebook about whether it’s more like, say, the phone company that is a communications network that doesn’t monitor the signals that are sent across it – it’s not the job of any of the phone companies to listen into what you’re saying and decide whether it passes certain standards or not – or whether Facebook is more like a newspaper or a magazine, a content provider where Facebook has some responsibility for what is posted on the site that it owns and profits from. So has Facebook been shifting in its perception of what it is and what its responsibility is to its users in terms of the content that Facebook is exposing them to?”
      2.  “So when I wrote the first story, Facebook did a big internal blog post where they sort of outlined changes that they were going to make in the future. And I want to highlight a few of them because I actually think they’re positive and important, and some of them are somewhat theoretical (laughter). But the first one is, Facebook has said it’s going to give a $3 an hour raise to these contractors, so they would make a minimum of $18 an hour, which I think comes out to more like $36,000 a year. That’s a really good start. I don’t think it’s enough. I think it’s a good start.”
    2. According to AllSides, “We noted NPR online news does not predictably show coverage favoring left or right perspectives, and generally reports in a way that fairly showcases the perspectives of both the Left and the Right. NPR online news does not use emotionally charged or polarizing language, and maintains a relatively fair representation of issues. We found its reporting to be fact-based and not leaning left or right.”
      1. Link:
  1. CNN 
    1. According to AllSides, CNN is a lean-left media outlet. 
    2. Example: (This example came from CNN and was found through lateral reading.)
      1. CNN was discussing the business of media content and how social platforms such as Facebook and Google were spending capital on content moderators. 
        1. “Content moderators typically help companies weed out disturbing content ranging from suicide and murder videos to conspiracy theories in order to make platforms more palatable. The report about Facebook, which cited interviews with a dozen workers who do or have done moderation work for the company, showed workers are reportedly paid $28,800 annually with little upward mobility and few perks. Some reported coping with trauma by getting high on breaks or having sex at the office.”
      2. Link:
  1. Context: 
    1. This came from The Verge, a widely read technology news site; the piece describes the effects of moderating content on the internet. 
      1. “But given the limits of the technology, and the infinite varieties of human speech, such a day appears to be very far away. In the meantime, the call center model of content moderation is taking an ugly toll on many of its workers. As first responders on platforms with billions of users, they are performing a critical function of modern civil society, while being paid less than half as much as many others who work on the front lines. They do the work as long as they can — and when they leave, an NDA ensures that they retreat even further into the shadows.”
    1. Link:
    2. Context: 
      1. An article based on a new book called Behind the Screen: 
        1. The author of the book is Sarah T. Roberts 
      2. This article was reported by Isaac Chotiner
        1. We learn about the workers’ conditions and how little value these large companies place on them 
