Facebook reversed a decision this week on its nudity policy with respect to a single photo after the social media giant received complaints of censorship.
The photo, titled “The Terror of War” and taken during the Vietnam War in 1972, caused Facebook some trouble this week. The familiar image shows a badly burned nine-year-old girl running naked down a street, fleeing the village of Trang Bang, north of Saigon, in the immediate aftermath of a napalm attack.
Earlier this week, Recode reported, Facebook decided the image violated its community standards and deleted a Norwegian journalist’s post about the image. That action triggered complaints from people including Norwegian Prime Minister Erna Solberg.
Recode also wrote this:
Now Facebook has reconsidered. Perhaps because it’s embarrassing for one of the world’s most powerful companies to say they’re stumped by this kind of problem. Perhaps because Facebook leaders like Mark Zuckerberg and Chris Cox genuinely believe this is the kind of image Facebook should be sharing with its users.
Maybe both things are true.
This shouldn’t be “embarrassing” for Facebook. The people who are so quick to vilify the company should take a seat and have a few deep breaths.
Facebook isn’t the bad guy here.
Facebook merely did what it told you up front it would do.
You see, when you created your Facebook account, you had to check a little box that stated you agreed to follow Facebook’s “community standards.” That’s their wording for “Terms of Service.”
Here’s what Facebook comes right out and tells anyone who actually takes the time to read those standards before just clicking the box and jumping ahead to start posting political rants on a newly created profile:
People sometimes share content containing nudity for reasons like awareness campaigns or artistic projects. We restrict the display of nudity because some audiences within our global community may be sensitive to this type of content – particularly because of their cultural background or age. In order to treat people fairly and respond to reports quickly, it is essential that we have policies in place that our global teams can apply uniformly and easily when reviewing content. As a result, our policies can sometimes be more blunt than we would like and restrict content shared for legitimate purposes. We are always working to get better at evaluating this content and enforcing our standards.
See what they did there?
They told you several key items in that little paragraph:
First, they restrict the display of nudity. Not “some” nudity, mind you, just nudity. That statement alone justified Facebook’s right to remove the image. And since it’s Facebook’s website and since you agreed to their policy, you’ve no reason to complain.
Second, they state their policies “can sometimes be more blunt than we would like” and can “restrict content shared for legitimate purposes.” While I haven’t seen the posts that were deleted, I’ll go out on a limb and assume the use of the Pulitzer Prize-winning photograph would fall into the “legitimate purposes” category, though I would certainly question whether everyone sharing it actually had the legal right to do so. (I won’t debate the notion of “fair use,” which is a legal defense raised in a lawsuit after one is filed against you, not blanket permission to ignore copyright law.)
Third, they acknowledge the obvious: their enforcement is a work in progress. They will make judgment calls that come into question, and they are trying to get better at making them.
There’s not much more one could hope for.
Those of us who use Facebook (and pay zero to be a member, I might add) seem far too quick to forget something very important: It’s their site. They’re the ones who get to make those rules.
Maybe the people who are so quick to jump on the criticism bandwagon in this case should consider pooling their resources and creating their own social media network that has no standards at all. I’d love to see how long that would last before they started imposing some sort of rules.
Even with Facebook’s standards published, their rulings can sometimes seem arbitrary when it comes to things like hate speech or vulgarity. What some people consider “hate” isn’t universally hate to everyone; what some consider “vulgar” is still somewhere near “vanilla” to others.
But a naked child is nudity no matter how you look at it, historical value or not. Nudity, despite its purpose, despite its impact on the viewer, remains nudity.
It’s good, I think, that Facebook reconsidered and recognized the context in which the image was likely being shared.
But they weren’t wrong in doing what they already told you they’d do to begin with.