"Easter Day Slaughter" or what Facebook can do with illicit uploads
A video called "Easter Day Slaughter" was uploaded to Facebook this Sunday. The video shows its uploader, one Steven Stephens, 37, asking Robert Godwin Sr., 74, to say Stephens' girlfriend's name. After the elderly man complies, Stephens shoots him in the head.

The video, as well as Stephens' profile, has since been taken down, but it was not the only piece of content the man uploaded. Before and after committing the murder on camera, Stephens shared a number of messages and live videos in which he recounted earlier killings or explained why he committed them; reportedly, his motive was some kind of revenge on his girlfriend. As of now it is unknown whether Stephens actually committed the killings he talked about.

While shocking in themselves, Stephens' posts illustrate the growing problem of illegal content on Facebook. Facebook does have a content policy, e.g. banning pornography and senseless violence, but the company itself admits that the policy is somewhat fluid ("context and degree are everything"). Besides, people who commit crimes rarely care about policies and upload such content anyway, which is why it is important for Facebook to find and remove it as fast as possible.

Right now, Facebook's methods of dealing with illegal content are limited. Content can be reported by users, and Facebook assures that even a single report is enough to warrant a moderator's review. The social network also runs automated scans that search uploaded content for suspicious keywords and phrases. As Stephens' example shows, though, this is not enough. His first video, in which he talked about the killings, was never reported at all, and the video of the shooting was not reported (at least according to Facebook) until an hour and 45 minutes after it was uploaded. Clearly, Facebook's current safeguards against illicit content are insufficient. Reportedly, Zuckerberg's social network is working on a program that can "understand" the content of uploaded videos and messages and alert administrators when necessary.
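To give a sense of why keyword scanning alone falls short, here is a minimal, purely hypothetical sketch of that kind of filter. The phrase list and the `flag_for_review` function are invented for illustration and have nothing to do with Facebook's actual systems; note how trivially a plain substring match misses rephrasings, misspellings, or violence that is only visible in the video itself.

```python
# Hypothetical keyword-based flagger, for illustration only.
# Real moderation pipelines are far more sophisticated than this.

FLAGGED_PHRASES = ["kill", "shoot", "murder"]  # invented example list

def flag_for_review(post_text: str) -> bool:
    """Return True if the post text contains any flagged phrase."""
    text = post_text.lower()
    return any(phrase in text for phrase in FLAGGED_PHRASES)

print(flag_for_review("Nice weather today"))      # False
print(flag_for_review("I am going to shoot him")) # True
print(flag_for_review("I'll sh00t him"))          # False -- trivially evaded
```

The last call shows the core weakness: a single character substitution defeats the match, which is presumably why a system that can "understand" content, rather than merely pattern-match it, is being pursued.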

This, or we, the users, could simply react faster. Seriously, it took nearly two hours for someone to report that video? Insane. What do you think: should Facebook automatically scan our content and decide what to ban or let through, or would it be better if we, the people, were just quicker to react?