The flip-flop over beheadings: Facebook finds being a media entity isn’t as easy as it looks
Is Facebook a social network? Of course it is, since it connects people with their so-called “social graph,” and makes billions of dollars by doing so. But it is also clearly a media platform, just as Twitter and YouTube and other networks are — and trying to find a dividing line between what it sees as offensive and what it is willing to permit has been sending it (and users) in circles lately.
How much free speech is Facebook willing to allow? That seems to depend on what kind of speech it is. Videos of people being beheaded appear to cross a line — although that hasn’t always been the case — but other equally violent imagery continues to circulate freely on the network. Photos of women breastfeeding, however, are routinely removed, as are posts by dissident groups in a number of different countries, often without explanation.
The latest controversy arose after a video of someone being beheaded in Mexico was posted repeatedly to multiple accounts. Facebook removed a host of similarly violent videos in May after a wave of criticism from those who said the images could cause emotional harm, particularly to younger users. The social network originally fought the move, however, arguing that the videos were free speech, and part of a valuable effort by users to discuss important political and social issues:
“People are sharing this video on Facebook to condemn it. Just as TV news programmes often show upsetting images of atrocities, people can share upsetting videos on Facebook to raise awareness of actions or causes. While this video is shocking, our approach is designed to preserve people’s rights to describe, depict and comment on the world in which we live.”
It’s interesting to note that Facebook — a proprietary network controlled by CEO Mark Zuckerberg — compared itself in this original statement to a media outlet, justifying its actions with a fundamentally journalistic defense. After much criticism, however, the company seemed to change its mind and removed the videos, saying it was reviewing its policies on the posting of such content. But then on Monday, Facebook said that it had reconsidered its ban on beheading videos, and was once again allowing them to be shared:
“Facebook has long been a place where people turn to share their experiences, particularly when they’re connected to controversial events on the ground, such as human rights abuses, acts of terrorism and other violent events,” said a spokeswoman. “If the video were being celebrated, or the actions in it encouraged, our approach would be different.”
Then came the third flip-flop (depending on how you are counting), when the company said that it had determined that many of the accounts sharing the beheading video were not doing so to criticize or condemn it, but were doing so in a way that “improperly and irresponsibly glorifies violence,” whatever that means. Facebook went on to say that it plans to “take a more holistic look at the context surrounding a violent image or video,” and will remove any content that celebrates violence.
No matter how you slice it, this puts Facebook into the thick of an editorial decision — and not an easy one either. Now it is no longer the content itself that determines whether it is removed, but the context of the sharing, including the words around it and other behavior by users. That is a much harder call, and one that is likely to come back to bite the company in the future, especially given its somewhat contradictory decisions in other cases. Read more in GIGAOM.