Thursday, February 5, 2015

Content Moderation

David Harris
2/5/15
Amstd 475
Content Moderation
        When approaching a social media platform, it is safe to assume that your posts, tweets, and pictures are all being monitored. This is done, as Adrian Chen puts it, through either active or reactive moderation. With active moderation, every public action must be approved before it goes live. One way I thought of it is Facebook's option to require your approval before anything can be posted on your wall; the difference here is that one person reviews thousands of public actions, and the deciding factors are not personal preference but gore, sex, solicitation, etc. With reactive moderation, a post, tweet, or picture is only viewed by a moderator if it is flagged.

        These two styles were about as deeply as I had ever thought about content moderation. After reading "The Laborers Who Keep Dick Pics and Beheadings Out of Your Facebook Feed" by Adrian Chen, I realized I had a shallow view of this powerful and obscure world. Chen wrote the article to inform the public about a ubiquitous and nearly omnipotent business that hundreds of thousands of people interact with unknowingly every day. In part, he builds sympathy for the workers themselves with horror stories of unnerving, grotesque videos: beheadings, revolting sexual acts. On the other hand, he tries to stir up confusion as to why consumers and users have never thought or heard about these companies before. The article states that there are an estimated 100,000 content moderators, twice as many as Google employees and fourteen times the number of Facebook employees (Chen). These two international companies have a worldwide presence; however, little is known about the companies that provide their content moderation. Chen talks with Sarah Roberts, a media studies scholar at the University of Western Ontario, who is of the opinion that "if there's not an explicit campaign to hide it, there's certainly a tacit one" (Chen). This quote is meant to anger the audience, to make them feel they are being tricked by smoke and mirrors. Simply put: pay no attention to the man behind the curtain. Chen is shedding light on the faces of the people who are behind the curtain. While these people are powerful, they are still…well, people.
        The implication of this text is that I sympathize with all of those who work as content moderators. They suffer symptoms similar to PTSD, and many do not last long in the business. Some of the things they see cannot be unseen. This point is driven home in the final paragraph with the closing quote from one of the moderators: "I don't know if I can forget it… I watched that a long time ago, but it's like I just watched it yesterday" (Chen). Chen is certainly employing pathos in this article. Those who do this job are now wounded souls, similar to veterans of war.

        This text certainly persuades any reader, as a consumer, to look differently upon major social media companies. Through the content created on their sites, the happiness and optimism of their content moderators is being sucked out. Watching all of these videos is comparable to standing in front of a Dementor: it will unequivocally change your perspective on humanity. I am grateful that someone else is moderating the objectionable and gory videos off social media in order to save my virgin eyes and thoughts. I would think that this is the "dark web" we have spoken of in class.
http://www.wired.com/2014/10/content-moderation/  

1 comment:

  1. Hi David, thanks for sharing your perspective on the reading. I would agree with you when you say that "this text certainly persuades any reader, as a consumer, to look differently upon major social media companies." This article most certainly changed, or perhaps broadened, my perspective on major social media companies.
