Media Law Resource Center

Serving the Media Law Community Since 1980

Australia: Media Responsible for Third-Party Facebook Postings

By Peter Bartlett and Dean Levitan

The Supreme Court of New South Wales has reached a landmark decision in Dylan Voller's defamation case against three media companies: Voller v Nationwide News; Voller v Fairfax Media Publications; Voller v Australian News Channel [2019] NSWSC 766.

The Court decided that media companies are now considered the 'publisher' (in a legal sense) of third-party comments on their public Facebook pages and can be held liable where those comments are defamatory. This decision has alarming and profound consequences for media companies, which are now required to monitor comments posted in response to posts on their Facebook pages.

To reduce the risk of being sued for defamation as a consequence of allegations made in the comments section, it may be necessary for media companies to change the settings on Facebook posts to enable the comments to be meticulously vetted before becoming publicly available. This will significantly impact the way social media is used and will invariably restrict freedom of speech in Australia.

Background

Dylan Voller is a former youth detainee at Darwin's Don Dale Youth Detention Centre. His mistreatment at the facility was the subject of an ABC Four Corners program in 2016 that prompted then-Prime Minister Malcolm Turnbull to call a Royal Commission within 36 hours.

There was extensive media coverage of these events as they unfolded, including articles posted on Facebook on which many members of the public then commented. Some of the comments contained allegations against Mr Voller, including that he had brutally bashed a Salvation Army officer who visited him in detention, had committed a carjacking, and had beaten and raped an elderly woman.

The substance of these third-party comments forms the allegedly defamatory imputations that gave rise to Mr Voller's lawsuit against the media entities. Mr Voller had not notified the respective media companies of the comments.

The Judgment

The preliminary question in this trial was, "Whether the plaintiff has established the publication element of the cause of action of defamation against the defendant in respect of each of the Facebook comments by third-party users?" Put simply, Voller was required to prove that the media companies are 'publishers' of comments posted by third parties on their Facebook posts.

Justice Rothman decided that Voller had established that the media companies are the publishers of comments written by third parties (such as readers) on their Facebook posts.

The Court heard from witnesses from each of the media entities who testified as to the operation of their public Facebook pages.

Justice Rothman made a number of factual conclusions following this evidence, including:

  • The public Facebook page of each of the media defendants is published for a number of purposes associated with the success of the company and its media publications, including: promulgation of summaries of articles of interest; exciting the interest of Facebook users; increasing the number of subscribers to the digital media publication or newspaper; and increasing the profile of the public Facebook page and the initial media publication, which affects advertising revenue;
  • The existence and number of comments (including "likes" and "shares") from third-party users is an important (and, more probably than not, the most important) aspect of the public Facebook page, as it affects the Facebook algorithm and increases the profile of the Facebook page and the consequential popularity of the Facebook page, thereby increasing readership in the digital newspaper/broadcast and augmenting advertising sales on both the Facebook page and the digital newspaper/broadcast;
  • It is possible to hide comments that contain particular words or triggers upon which the program operating the public Facebook page would operate to hide the whole comment;
  • By using a list of extremely common words that any comment would find difficult to avoid, it is possible to hide, in advance, all, or substantially all, comments;
  • The defendants could, if sufficient staff resources were allocated, monitor comments, whether published or hidden, and hide, delete or "un-hide" those comments;
  • Certain initial posts by the media outlet would be expected to excite adverse comment about a person who is the subject of the post, including comment that is unreasonable, factually incorrect and damaging to the reputation of the person involved; and
  • The publications of these relevant original posts by the media companies (i.e. posts to which the comments alleged to be defamatory relate), if any assessment were to have been made (which it was not), would have been assessed as likely (i.e. more probably than not) to give rise to nasty and defamatory comments. [90]

Ultimately, the Court was satisfied, on the balance of probabilities, that the defendant media companies were first or primary publishers, because they own their public Facebook pages, stand to benefit commercially from engagement on their Facebook posts and have the ability to allow the public to access comments authored by third-party users.

Justice Rothman acknowledged that 'it is the third-party user that places the comment on the page' [212] but stated that:

'...it is not the compiling of a message that amounts to the publication of the message; it is the placement of the message in a form that is comprehensible and able to be downloaded and the consequence that it is the ownership of the public Facebook page that attracts a reader.' [212]

By not hiding all comments pending later approval, the media companies allowed the comments to be published on their public Facebook page posts. It was on this basis that Justice Rothman found that, 'in relation to each reader of the public Facebook page, who is not the Administrator or a Facebook friend of the third-party commentator, the defendant media company is the first and only publisher of the comment' [214].

This means that through operating public Facebook pages, media companies 'assume the risk that comments made on that page will render it liable under various laws that may prevent, render unlawful, or render actionable in damages various statements' [232].

In concluding, Justice Rothman stated:

'That risk may be ameliorated by the suggestion, given during the course of submissions and evidence, that all comments be hidden, in the manner described in these reasons for judgment, and "un-hidden" after it has been monitored. Given that the comments about which complaint is made in these proceedings are comments on an initial post that was more likely than not to give rise to defamatory comments, there seems to be no public policy reason why liability should not be sheeted home to the media company that is the defendant in each of the proceedings, at least, if it be a subordinate publisher, for its general readership (i.e. excluding the Facebook friends of the commentator).' [233]

What's Next?

The Court answered the preliminary question of "Whether the plaintiff has established the publication element of the cause of action of defamation against the defendant in respect of each of the Facebook comments by third-party users?" in the affirmative.

Now that Voller has satisfied this preliminary question, the Court will hear the remainder of the defamation case, including whether the media companies can defend the allegedly defamatory imputations.

What Should Media Companies Do Now?

In light of this decision, media companies need to be on high alert for potentially defamatory comments posted on their Facebook posts. This poses a challenge, as comments on a public Facebook page cannot be entirely disabled.

While this decision stands, media companies should consider the following options to mitigate the risk of being sued in similar circumstances:

Before posting, assess the nature and subject matter of the content and whether it carries a high or low risk of eliciting comments that could be defamatory.

There are then two key approaches to moderating comments:

  1. Hide one-by-one: Monitor comments as they are posted and 'hide' those that contain potentially defamatory allegations. A hidden comment remains visible only to the person who wrote it and their friends, which means they will not know it has been hidden. Alternatives to this one-by-one approach are to 'delete' the comment, which removes it permanently, or to 'report' it to Facebook, which can be done in addition to hiding it.
  2. Block words: Facebook settings allow you to block certain words from appearing on your page. Comments containing those words are hidden and would need to be 'unhidden' to appear publicly. As suggested above, it is possible to use this tool to hide substantially all comments by blocking a list of extremely common words. The comments can then be monitored and published only after an assessment that they do not contain potentially defamatory material. This is the more proactive approach (a sketch of how the review step could be automated appears below).

There is also a 'profanity filter' that can be turned on to block different degrees of profanity from appearing on your page. This is measured by Facebook according to the most commonly reported words and phrases marked offensive by the community, and could assist in the early stages of implementing a comment moderation strategy.
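
For pages with a high volume of engagement, the 'hide by default, review, then un-hide' workflow described above can be partly automated. The sketch below is a minimal illustration in Python, assuming Facebook's Graph API comment endpoints (reading a post's comments and updating a comment's is_hidden flag with a Page access token); the API version, access token, post ID and list of approved comment IDs are placeholders, and the word blocklist and profanity filter themselves are still configured in the Page settings rather than through this script.

```python
# Illustrative sketch of a "hide by default, review, then un-hide" workflow
# using the Facebook Graph API. IDs, token and API version are placeholders.
import requests

GRAPH = "https://graph.facebook.com/v19.0"   # version shown is illustrative
PAGE_TOKEN = "YOUR_PAGE_ACCESS_TOKEN"        # placeholder Page access token
POST_ID = "PAGEID_POSTID"                    # placeholder post identifier


def fetch_comments(post_id):
    """Return all top-level comments on a post, following pagination."""
    url = f"{GRAPH}/{post_id}/comments"
    params = {"access_token": PAGE_TOKEN, "fields": "id,message,is_hidden"}
    comments = []
    while url:
        resp = requests.get(url, params=params).json()
        comments.extend(resp.get("data", []))
        url = resp.get("paging", {}).get("next")  # next page of results, if any
        params = {}  # the "next" URL already carries the query string
    return comments


def set_hidden(comment_id, hidden):
    """Hide or un-hide a single comment pending human review."""
    requests.post(
        f"{GRAPH}/{comment_id}",
        data={"is_hidden": "true" if hidden else "false",
              "access_token": PAGE_TOKEN},
    )


def moderate(post_id, approved_ids):
    """Hide every comment by default; un-hide only those a reviewer has cleared."""
    for comment in fetch_comments(post_id):
        if comment["id"] in approved_ids:
            if comment.get("is_hidden"):
                set_hidden(comment["id"], hidden=False)
        elif not comment.get("is_hidden"):
            set_hidden(comment["id"], hidden=True)


# Example: hide everything on a post except two comments a reviewer has approved.
# moderate(POST_ID, approved_ids={"123_456", "123_789"})
```

The default-hide approach in this sketch mirrors the Court's suggestion at [233] that all comments be hidden and only 'un-hidden' after they have been monitored.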

Peter Bartlett and Dean Levitan are lawyers with Minter Ellison in Sydney, Australia.

 