Diversity and Inclusion - Building Inclusive Workplaces
Out-Law News | 06 Nov 2012 | 11:46 am | 4 min. read
Such a ruling would mean that Google would not be liable if information displayed via its 'autocomplete' function was defamatory, said media law specialist Ian Birdsey of Pinsent Masons, the law firm behind Out-Law.com.
Autocomplete suggests words or characters for completing a partial search on Google.
Last week a court in Australia ruled that Google should have to pay damages to Milorad Trkulja, a TV presenter who had complained that the internet giant had defamed him, according to a report by the BBC.
Trkulja was shot in a Melbourne restaurant in 2004 by a gunman wearing a balaclava. He claimed that, following the shooting and subsequent reporting of the incident, his name had become associated with the images of alleged criminals when users typed his name into the 'Google Images' search function.
Trkulja sued Google claiming that the company had failed to remove the defamatory link between him and the alleged criminals when he requested such action. The Supreme Court of Victoria accepted Google's argument that it had innocently disseminated the material but said that that defence was only applicable up to the point at which the company received Trkulja's complaint and held Google to be liable for defaming the man as a result of its inaction, according to the BBC's report.
This Australian case follows a similar ruling in Japan after a court there ordered Google to stop its search engine technology from suggesting "specific terms" that have linked a man's name to crimes he did not commit.
The unnamed man sued Google after claiming that the terms the company's autocomplete software suggested in association with his name caused him to lose his job and subsequently put off potential new employers, according to a report at the time by Kyodo news agency on the Japan Times website.
In a similar ruling in France, Google was fined $65,000 by a court after its search engine suggested the French word for 'crook' when users typed in the name of an insurance company.
However, in the UK in 2009 the High Court ruled that Google was not the publisher of defamatory words that appeared in its search results. Mr Justice Eady ruled that even when notified that its results contained libellous words, Google was not liable as a publisher.
Google's liability for defamatory words that appear via its 'autocomplete' suggestions is as yet untested in the UK. However, Ian Birdsey said that it is unlikely that a UK court would come to a different conclusion from the one arrived at by the High Court in 2009.
"Although the issue of Google's liability for its 'autocomplete' search function has yet to be dealt with by UK courts, the High Court in 2009 did determine that Google was a mere facilitator of the information displayed on its search results because it did not authorise the appearance of the information on users' screens in a 'meaningful sense'," Birdsey said. "If the UK courts were to assess whether Google was liable for defamation as a result of the way its 'autocomplete' system suggests terms to users, I think the courts would draw similar conclusions and find that Google is not a publisher."
"There has to be recognition that Google search terms are the product of input by its users. It is unfair to view Google as a traditional publisher of suggested search terms as a result of this," he added.
"In his judgement Mr Justice Eady said that there was a 'degree of international recognition that the operators of search engines should put in place [a take-down policy] (which could obviously either be on a voluntary basis or put upon a statutory footing) to take account of legitimate complaints about legally objectionable material'," Birdsey said. "The European Commission is currently looking to reform 'notice and takedown' rules that govern illegal material posted on the internet and has asked whether search engines, among other intermediaries, should be deemed to be 'hosts' of content. It is to be hoped that the Commission's plans make clear whether search engines do have responsibility for removing illegal content and what that process should be."
To be considered libellous under common law in the UK, comments must be published, communicated to someone other than the person being defamed, and not covered by one of a range of defences, including that the comments are true or were expressed as an opinion.
In the UK, laws on defamation are also written into legislation. Under the Defamation Act a person can claim a defence against allegations of defamation if they can show that they were not the author, editor or publisher of the comments, "took reasonable care in relation to its publication" and "did not know, and had no reason to believe, that what he did caused or contributed to the publication of a defamatory statement". The Act defines 'publisher' as meaning "a commercial publisher, that is, a person whose business is issuing material to the public, or a section of the public, who issues material containing the statement in the course of that business".
Under the E-Commerce Regulations it is possible for "secondary publishers" to be found responsible for defamatory comments posted using their service. However, they can avoid liability for defamation under the terms of the Regulations if they are viewed as acting only as a mere conduit, cache or host of the material.
In order to avoid any liability for unlawful material, the service provider must, upon gaining 'actual knowledge' that the initial source has been removed or access to it has been disabled, act 'expeditiously' to ensure that the information is deleted or access to it disabled.