by Paul Alan Levy
Two voluntary takedowns of user-generated content have been in the news lately, spurring some reflections about, on the one hand, the dangers of becoming overly dependent on certain platforms for free expression, and, on the other, how online service providers exercise their discretion under section 230 to remove material even though they cannot be held legally responsible for hosting it.
A few days ago, Facebook took down the New Yorker’s entire Facebook page for violating the "no-female-nipples" provision in its policies (or was it just the cartoon page, as one article has characterized it?). Although the New Yorker demanded a license fee if I published the image on my own Facebook page, which I had in mind to do just to be in Facebook's face, I don’t see how you can understand this controversy without seeing the image:
Robert Mankoff posted a devastating article about the takedown on the New Yorker’s blog (be sure to scroll down in the post to see the amusing update about the clothing-on version of the cartoon). After the story gained traction in various mainstream locations, Facebook issued an apology for the takedown, although Ars Technica observes that the cartoon itself is still not back up on the New Yorker’s Facebook page.
More recently, YouTube decided to block access in two countries to the agitprop “movie trailer” for Innocence of Muslims, which has spurred outrage in many countries and has been cited as the occasion for attacks on US embassies (although it appears not to have been the reason for the worst of the incidents, the targeted raid in Benghazi). If you have not seen the trailer, think of a highly amateurish version of Life of Brian without the slightest hint of humor and with no attempt to pillory all sides. A guest blogger on Techdirt (who oddly admitted that he had not looked at the trailer before writing about it) and an EFF staffer have both criticized the takedown, which strikes me as curiously insufficient for its stated purpose of avoiding needless violence. After all, the video was blocked only in the countries where the worst violence had already become widespread, which amounts to closing the barn door after the cow has run off.
What to Make of These Takedowns
Over at Techdirt, in writing about the New Yorker cartoon incident, Mike Masnick had, I think, just the right take on this phenomenon — it stands as a stark reminder of the fact that when so many members of the public have made Facebook “a key platform for expression,” we should remember how easily the arbitrary application of guidelines by relatively low-level staff can cut off important expression. The fact that Facebook took down the New Yorker's entire page over one "improper" image is particularly dismaying. As I see it, Facebook's apology does nothing to resolve the problem. Users should continue to hammer Facebook until, at the least, it revises the instructions it gives its staff so that the remedy more closely suits the violation.
Moreover, it was not hard for the New Yorker to use its bully pulpit to get its access restored by attracting the attention of senior staff. Regular folk who have their accounts cut off are likely to have no such luck. Facebook should be pressured to change this as well.
But I am not inclined to be as alarmed as EFF is about the possibility that Google’s temporary shutdown of the movie trailer represents the beginning of a slippery slope toward widespread suppression of free expression. The removal is not the same as deferring to government censorship, and as much as I hate to give mob violence the satisfaction of an effective heckler’s veto, we cannot expect that online service providers will never remove material simply because it is deemed offensive by wide swaths of the population. Moreover, I can’t help wondering whether the violent response is just what the film-makers were hoping for. So by leaving the video on its site so that we can understand the controversy, while blocking it where broad access to the material is likely to cause the greatest harm, Google has made a comprehensible judgment. And it is just the sort of surgical intervention that distinguishes it from what Facebook did.
Indeed, this sort of takedown is a reflection of the social compact that Congress sought to foster when adopting section 230 of the Communications Decency Act. The companies that lobbied for the provision said that they wanted to behave in a socially responsible manner but that they needed legal protection against being held liable for selective responses to the posting of offensive material.
So although we need to be vigilant when service providers exercise their section 230 discretion, and we need to raise a ruckus over indefensible removals (and I have participated in such online campaigning), we should not expect that removals will never happen. In the film trailer situation, the decision was apparently made at a high level of the company and with the recognition that it was not an easy call (as, it seems to me, it is not). As I see it, that is what society has a right to expect from online hosts.
Politico's report that Google's takedown followed a "request" from the Obama Administration for review in light of Google's policies puts this move in a slightly different light.