Facebook’s puzzling censorship standards have come under fire again after the site took down numerous artistic images containing nudity and group pages addressing sex and sexuality. Katie Weiss reports.
The social networking site has acted readily on complaints targeting nude imagery and intellectual discussions about sex, because its Terms and Conditions broadly identify nudity as offensive.
Last week, Facebook removed the cover image of Nirvana’s epic album Nevermind, which depicts a naked infant swimming underwater and chasing a dollar bill. The image was uploaded to the band’s Facebook page to promote the album’s 20th anniversary and its worldwide success after selling more than 30 million copies.
Facebook, which hosts more than 60 million users, quickly reposted the image after extensive news coverage and outcry from the online community over the blunder and denied foul play. Facebook reps told NME magazine: “Facebook does allow photos of naked … babies. Why? Put it this way — if a parent wanted to share some photos of a newborn with their grandparents, we wouldn’t want them to not be able to share them on Facebook.”
But this isn’t the first time the social networking giant has censored non-sexual pictures of naked children. In December, the site disabled an Iowa baby photographer’s Facebook business page and blocked the baby photos she had uploaded.
Laura Eckert said she suspected photos of her friend giving birth in her bathtub last year may have pushed Facebook over the edge even though, as AllFacebook website reported, no nipples were showing. “Of course, we make an occasional mistake,” Facebook spokesman Simon Axten said in an email to MSN. “This is an example … When this happens, and it’s brought to our attention, we work quickly to resolve the issue.”
Eckert said she sent more than 30 emails to Facebook requesting its reasons for deactivating her account, which acted as a platform to promote her business. After a support and lobbying group formed on the site and her story made the KCRG-TV news, Facebook promptly responded to Eckert’s emails and reactivated her account.
Eckert was troubled by the site’s failure to notify her that her account had been closed, and by its refusal to mediate the situation or allow her to mount a defence.
AllFacebook, a stringent observer of the site, mapped out Facebook’s typical censorship cycle in three stages: the site receives a user complaint it deems valid and removes the content; another user cries “censorship”; and Facebook reinstates the content, blames the removal on an error and apologises.
“Facebook takes the path of least resistance,” social media commentator Dr Shanton Chang told Crikey. “As long as there’s no public outcry then anything goes. Those who protest loudest get heard more. Those who don’t say anything will not get heard.”
Critics have attacked Facebook’s censorship botches for lacking transparency or accountability, particularly as the site’s screening methods are hidden from the public. “Yes, Facebook is a private, free platform, but users expect to be able to use it. It is your right to create whatever terms of service you want, but be clear, consistent, and transparent when enforcing them,” said blogger Jillian C. York.
Facebook’s line between appropriate and obscene imagery also seems to depend on the medium in which the artwork is made. In February, the site embarrassingly took down a lifelike ink-on-paper drawing of a nude model posted on the New York Academy of Art’s Facebook page and proceeded to close the account. The artist Steven Assael’s picture evoked non-sexual themes and would be commonplace in art galleries worldwide, commented New York Times blogger Miguel Helft.
A Facebook spokesperson told Helft the site’s guidelines followed an unofficial “policy that allows drawings or sculptures of nudes. In this case, we congratulate the artist on his lifelike portrayal that, frankly, fooled our reviewers.”
Other figurative artists have recounted similar stories of censorship watchdogs trolling their Facebook pages and successfully reporting content they deem inappropriate.
“My page did not violate any of the reasons stated for deletion,” Blue wrote in an email to Facebook. “It was under constant attack by people who disagreed with our point of view, and constantly reported our posts and images … We sought a safe place to discuss sex culture in media, and that is all.”
Dr Chang says Facebook’s method of favouring the loudest complainers is a positive shift away from society’s traditional top-down approach to censorship. The downside is that these decisions can be based on ill-informed complaints.
Facebook declines any role as mediator in disputes over questionable content, as its standard emails about censorship ‘errors’ state: “While we appreciate your concerns, as we hope you can understand, we are not in a position to adjudicate disputes between third parties.”
The site’s political correctness has also extended to sexual preference and ethnicity. In June, The Advertiser reported that Facebook had torn down an indigenous rapper’s video clip, which used controversial language such as “nigga”. Colin Darcy, or “Caper”, is a Native Title Services officer who used the video to promote awareness of racial discrimination; his rap opens with the lines: “How would you like to be me/An Aborigine/Looked down upon in society.”
Darcy shot back at the censorship, and his complaints gained momentum among fans and supporters, convincing Facebook to unblock the video. “Compared to the other explicit videos that are being shown out there and gangster rap stuff — it’s crazy,” Darcy said.
In April this year, the internet community lashed out at Facebook’s removal of a snapshot from UK drama EastEnders showing two men kissing in a park. The censorship prompted international outcry, including Facebook pages dedicated to same-sex kisses and a “kiss-in” protest. Facebook responded to the pressure by reinstating the photo.
“Whether or not they like it, they’re going to be seen as a moral compass,” Dr Chang said. “They don’t have a choice because they are providing this moral platform for people to comment.”