Facebook CEO Mark Zuckerberg, arguably one of the most powerful men in the world, is once again under fire over a failure to police his hugely influential social media platform.
Employees at Facebook have staged a virtual walkout, partly in response to the company’s decision not to censor a post (initially on Twitter) in which US President Donald Trump warned of violent police retaliation against protesters.
“When the looting starts, the shooting starts,” Trump tweeted. Twitter put up a label warning that the post contravened rules on inciting violence. Facebook allowed the post, however, with Zuckerberg saying that the statement was a warning to protesters, and argued “people should be able to see this for themselves”.
It’s yet another example of Zuckerberg’s inconsistent, often insipid attitude towards taking accountability for the information that Facebook allows to spread. The difference this time is that senior employees are speaking out.
To fact check or not to fact check
Last week, Twitter picked a big fight with Trump, finally letting the fact-checkers loose on his tweets. Zuckerberg refused to do the same — Facebook “shouldn’t be the arbiter of truth”, he said.
Except sometimes, Facebook is happy to decide what’s true. In Australia, for example, it partners with AAP and AFP to fact check its own content (although with just seven fact-checkers, who knows how far that gets). In response to the COVID-19 pandemic, Zuckerberg himself promised the platform would crack down on misinformation.
“Things like saying something is a proven cure for the virus when in fact it isn’t, we will take that down,” Zuckerberg told the BBC.
Zuckerberg’s decision to apply different rules to Trump than anybody else has been linked by some to a desire to avoid angering Republicans who have threatened tougher regulation of Facebook.
But it’s also part of a pattern of total, maddening inconsistency from Zuckerberg. When Trump won in 2016, Zuckerberg dismissed concerns about fake news swaying voters as “a pretty crazy idea”.
And despite repeated promises and reassurances that Facebook would target fake news, the Cambridge Analytica scandal, and admissions of Facebook’s role in sparking race riots, Zuckerberg still wants to give politicians a free pass.
Last year, he reiterated his core belief that “people should be able to judge for themselves the character of politicians”.
In January, just months before Zuckerberg made those promises about the platform’s responsibility to target coronavirus misinformation, Facebook announced it would not be fact-checking political ads in the 2020 US presidential election.
A targeted Facebook campaign was key to Trump’s unexpected election win four years ago, and remains central to his reelection chances.
They knew it was a swamp
Facebook’s role in radicalising and polarising society is well-documented.
The problem is, Facebook knew of the problem — internal research showed hard evidence that the platform was increasing polarisation. The company’s top brass, including Zuckerberg, largely ignored it.
Facebook’s troubles with the far right lay bare the platform’s failure to grapple with radicalisation. After 2017’s violent neo-Nazi rally in Charlottesville, Zuckerberg promised to take down threats of physical harm. A year later, plenty of far-right content remained on the platform.
A lot of that content stayed up thanks to a typically Zuckerbergian semantic distinction — until last year, white supremacist content was banned while white nationalist content was still allowed to remain.
Say sorry like you mean it
Years of bad press have made Zuckerberg very good at apologising. In 2017, Zuckerberg said he regretted his dismissal of fake news during the election.
The Cambridge Analytica scandal also triggered a good deal of grovelling apologies — an interview on CNN, long Facebook posts, full-page ads in papers like The New York Times and The Wall Street Journal.
And still Zuckerberg and Facebook’s reputation is under siege. All the backflips, attempts to appear in control and reactive changes to the rules have done Zuckerberg few favours — and made nobody any wiser about what the platform will and will not tolerate.