Yesterday the NSW Supreme Court handed down a decision confirming that publishers are liable for comments posted to their Facebook pages. It marks a long-delayed reckoning for Australia's media companies, which have effectively outsourced their reader engagement to the big tech platforms.
It’s bad news for the business — particularly for the outrage media.
The finding in Dylan Voller’s defamation case against both News Corp and Nine doesn’t extend the law. It just makes clear that on Facebook, as on the web, the publisher is responsible for what’s on their page — no matter who puts it there.
However, the business threat is that it severs the tether between two key strategies of traditional news media: slashing costs by sacking staff and intensifying workloads, and building readers (and subscribers) by gaming the algorithms of social media. News media posts on Facebook (or Twitter) are not a distribution "push". They exist to "pull" readers or viewers out of Facebook and onto the publisher's own website, to be monetised through advertising or subscriptions.
As anybody who has spent more than five minutes on Facebook knows, the challenge is to boost the post through the platform’s algorithm by demonstrating engagement through likes, shares and, most importantly, comments.
As Supreme Court Justice Stephen Rothman wrote:
… the media companies’ use of a public Facebook page is about their own commercial interests … the primary purpose … is to optimise readership of the newspaper (whether hardcopy or digital) or broadcast and to optimise advertising revenue. The exchange of ideas on the public Facebook page is a mechanism (or one of the mechanisms) by which that is achieved.
The Facebook pull is big business. Evidence in the case showed that when the disputed comments were made in July 2017, 53% of unique monthly visitors to The Australian’s web page came from the masthead’s public Facebook page.
This was about the time of Facebook’s first algorithm tweak to prioritise posts from family and friends over news in the newsfeed. (The second tweak came six months later.) The case suggests that 39% of News Corp’s Australian traffic still comes from Facebook.
The media companies work the pull hard, according to the evidence, posting between 20 and 80 stories a day and attracting anywhere from zero to 1800 comments apiece. In this little corner of the outrage economy, the greater the outrage, the higher the comment count, and the greater the algorithmic reward.
Now, the pull is about subscription. The Australian, for example, has reduced from three to one the number of stories a non-subscriber can read off a social media link. From search, it's zero, with the link taking you directly to the subscription splash. This tight paywall has driven the growth in digital subscriptions: The Australian now boasts more than 150,000 digital subscribers, more than it ever had in print.
Continuing growth depends on pulling readers from social media to a subscription decision point, tracked through the company’s proprietary data tool, Verity. According to a recent internal email from The Australian’s digital editor Daniel Sankey reported in The Guardian yesterday, the paper’s coverage of the Setka story had attracted 150,000 subscriber page views and four new subscriptions. Five more subscriptions came from stories on why children of Asian migrants do better at schools and on school bullying.
Stories lead to subscriptions leading to more of those same stories. More CFMMEU outrage? Yes, please!
Sacrificing Facebook comments would risk losing this subscription flow. But the alternative could be worse. The companies might have to employ people (perhaps some of the journalists they've already laid off) to moderate the comments, hiding them by default until they can be approved.
In the judge’s summary of the evidence on monitoring you can almost hear the companies spluttering: “But… but the cost, your honour!” He found that Sky News (which receives the most comments, with about 7000 a day across six Facebook pages) would need to engage no more than 2.5 people.
They would have to find those people first. Monitoring Facebook is a thankless job, characterised by stress and churn.
Most media organisations moderate webpage comments (where legal liability is uncontested), sometimes by the journalist authors — exposing them to trolls and abuse — and sometimes by specialist moderators.
Or the media may decide it’s just cheaper to risk the occasional defamation and carry on as they are.
What do you think about the outcome of the defamation case? Should media companies do more to block harmful comments on Facebook? Send us your thoughts along with your full name to [email protected]