High Court rules media outlets are responsible for comments on their Facebook pages

The High Court has ruled that news outlets are legally responsible for comments left on their Facebook pages.

The decision comes after a youth detainee in the Northern Territory sued both News Corp and Nine over defamatory comments made on their Facebook pages in 2018.

A Supreme Court judge in 2019 found that the publishers of a story about the youth detainee “…provided the forum for its [the defamatory comments’] publication and encouraged, for its own commercial purposes, the publication of comments”.

In 2020 the Court of Appeal upheld that decision, again finding the publishers liable for the comments.

In response to the ruling, Nine has issued the following statement:

Nine recognises the decision of the High Court which makes news businesses liable for any post made by the general public on their social media pages as the “publisher” of those comments. We are obviously disappointed with the outcome of that decision, as it will have ramifications for what we can post on social media in the future.  We are hopeful that Stage 2 of the Review of the Model Defamation Provisions will take account of the High Court’s decision and the consequences of that for publishers.

We also note the positive steps which the likes of Facebook have taken since the Voller case first started which now allow publishers to switch off comments on stories.

This is a very significant ruling which will have wide implications. In the past, there was no way for page administrators to edit comments on Facebook pages; Facebook has at least now given administrators the ability to hide or delete comments, though still not to edit them.

The ruling will make publishers more cautious and less social: they may choose to disallow comments on some controversial items altogether, and will need to actively moderate other stories, deleting controversial comments as they appear.

This will mean extra work and added cost for any publisher.

radioinfo's policy is that every comment on our website is vetted before publishing, and deleted or modified for legal or profanity reasons, because there we have full control of the content.

We did not previously do this on Facebook, but now we may extend our moderation policies to that platform as well.

Peter Saxon, Managing Editor of radioinfo, says: “We already moderate our posts before we publish, which, in law, makes us entirely liable for what’s posted because we openly admit to oversight. Facebook says they have no oversight over anything until someone complains and then they can investigate – and, in the fullness of time, they may do something about it.

“In essence though, what the court has ruled is that media (us) that publishes their own stories, which they routinely moderate on their own platform, can’t abrogate that responsibility and pass the buck to a third party (them) upon which they, themselves, choose to publish.

“My solution, at least for radioinfo, would be not to allow randoms to post comments under our stories on FB but provide a link back to our comments section on radioinfo and let them go through the process on our site.”

For CBAA member stations and community stations in general, this ruling will likely create a number of problems, especially as many are volunteer-driven.

CBAA CEO Jon Bisset says, “It will be difficult for our members to monitor comments as community radio stations often have limited resources, and stations strongly encourage community participation and engagement through their channels. We are concerned that the High Court’s ruling will impact open media and Australian dialogue across the media industry.”

Smaller regional radio stations like Tamworth’s 88.9FM/96.3FM will also review the way they moderate comments, with GM George Frame telling radioinfo: “We already check with two people on posts/comments, with our Facebook Author and News Editor, however, we will have to re-evaluate our process and maybe make it much tighter and increase our censorship capabilities.”

Head of the ABU Media Academy, Steve Ahern, says that for the big players such as Nine and News Corp, this will have an impact, requiring more staff focus on social media moderation.

“I think publishing/broadcasting companies will rethink whether it is worth their while to turn on FB comments and whether they should bother any more to have an active Facebook page. It may mean that companies go back to where Facebook began and only post promotional content on FB, rather than trying to engage their FB audience. Ultimately this will be negative for Facebook and the publishers.”

Ahern says, “The most important point is that there is a huge inequity here.

“People are using the social media publishing platform Facebook (not for example the publisher’s own site news.com), so why should Facebook not take responsibility – they are the publisher, even though they have successfully argued in court that they are not a publisher. This is not in line with rules, codes and practices that have been developed over decades to regulate irresponsible and antisocial content on radio, tv and newspapers.

“Why should Facebook be immune just because the law has not caught up with modern usage practices? It is time for some changes to the law that will move Facebook from the status of a carriage platform to a publisher and therefore make it act more responsibly, as other responsible publishers and broadcasters have learnt to do over the decades.

“Facebook, disingenuously in my opinion, claims it is a carriage platform like a telephone, and therefore, like a telephone, should not be held accountable for defamation. I disagree with this premise. The internet/world wide web is the carriage platform. Facebook is a publisher on that platform, just like a company website, an online newspaper or an OTT audio or video service like Spotify or Netflix. The inequity needs to be fixed by modifications to defamation law to force Facebook and other social platforms to be responsible in a way that is equal to other publishers and broadcasters, not pass off responsibility to a third party which is using its app.

“Some people argue that Facebook cannot exercise editorial control on content, so is not in a position to be held responsible in the same way as publishers. In my opinion that is rubbish. Facebook can control what ads you see, it can personalise the content selected in your timeline and modify the priority order of what you see, using artificial intelligence. AI can detect keywords and act on them in real time for the commercial benefit of FB, so don’t tell me they could not use AI to exercise editorial judgement if they wanted to – they already are in other areas.

“This is a case of the law being well behind the march of technology, and it needs to be fixed.”
