(Presented at the MLRC Media Law Conference September 2016)
Guy Vassall-Adams QC, Matrix Chambers, London
Jacob P. Goldstein, Assistant General Counsel at Dow Jones & Company, Inc. 
This outline briefly reviews the 2014 decision by the European Union’s highest court, the Court of Justice of the European Union (CJEU), in the case of Google Spain SL & Google Inc. v. Agencia Española de Protección de Datos (AEPD) and Mario Costeja González, Case C-131/12 (“Google Spain”). The outline then explores select major developments in the two years following this landmark decision, including how Google and news organizations are dealing with the consequences of the Google Spain ruling.
I. THE CJEU’S GOOGLE SPAIN RULING
The key points from the Ruling were as follows:
1) The activity of a search engine in finding, indexing, storing and making available information to the public in ranked search results produced in response to searches on a person’s name amounts to the processing of personal data and the operator of a search engine is the data controller in relation to that processing. As such, the operator of a search engine is under an obligation to comply with the Data Protection Directive and to ensure that the privacy rights of data subjects are protected.
2) The processing of personal data is carried out by an establishment on the territory of an EU member state and compliance with the Data Protection Directive is required where the operator of a search engine sets up, in a member state, a branch or subsidiary which is intended to promote and sell advertising space offered by that search engine and which orientates its activity towards the inhabitants of that state. Google Spain was therefore established in an EU member state and Google Inc, as the search engine operator and data controller, was required to comply with the Directive.
3) The processing of personal data by a search engine is additional to that of the original publisher of information on the Internet. Search engines play a central role in disseminating personal data, by making it accessible to any Internet user who carries out a search on the basis of the subject’s name, including people who would not otherwise have found the web page of the original publisher. Search results thereby enable Internet users carrying out such searches to get a structured overview of the information relating to that individual and to establish a detailed profile of that person. Search engines therefore have a major additional impact on the privacy rights of individuals.
4) Individuals have the right to request search engines to remove name-based search results linking to webpages containing information about them published by third parties, even where the information continues to be published by the original publisher and even though publication by the original publisher may be lawful. The fact that a Spanish newspaper continued lawfully to publish the information on its website did not therefore absolve Google Inc from its responsibility to comply with the Directive as the data controller.
5) Where a person requests removal of name-based search results and the data controller refuses to comply with that request, the national authority can require the search engine to remove those search results in order to protect the data subject’s rights to privacy and to data protection. This has been called the “right to be forgotten” or the “right to delist”.
6) Where a national authority has to decide whether to order the removal of search results following a request from the data subject, it has to strike a fair balance between the data subject’s privacy rights and the legitimate interest of Internet users potentially interested in having access to that information. In other words, the CJEU held that a national court has to balance the right to privacy with the right to freedom of expression.
7) Important factors which a national authority should take into account when striking the balance between privacy and free expression included the nature of the information in question, in particular the sensitivity for the data subject’s private life, versus the interest of the public in having access to the information, in particular the role played by the data subject in public life.
Three important points to note about the Ruling:
1) The ruling only applies to name-based search results, i.e. search results displayed following a search using the data subject’s name. Searches using different search terms are not affected by the Ruling, even if they lead to retrieval of information which would be, or has been, removed on a name-based search. For example, a search using “Barack Obama” would be caught by the Ruling, but not a search using “President of the United States” or which referred to the subject matter of a topical issue relating to the President. This has subsequently been confirmed by the Article 29 Working Party (see below).
2) The Ruling did not address the relationship between the EU data protection regime and the eCommerce Directive, which provides safe harbors for internet intermediaries based on a “notice and takedown” model. The reason is that Google Inc did not remove the search result prior to judgment, so the availability of the eCommerce safe harbors was not in issue. This has subsequently become an important issue in a number of cases, though the new General Data Protection Regulation, which applies from 2018, makes it clear that the eCommerce safe harbors are available in data protection cases (see further below).
3) The Ruling did not specify the scope of removals required to give effect to the CJEU’s judgment. This has also been addressed by the Article 29 Working Party and has come to the fore in a number of cases, with national data protection authorities in Europe taking different approaches to this issue (see further below).
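The name-based scope described in point 1) above can be illustrated with a toy filter. This is a conceptual sketch only, not a description of Google's actual implementation; the names and URLs are hypothetical:

```python
# Toy illustration of the Ruling's name-based scope: a delisted URL is
# suppressed only when the query is a search on the data subject's name,
# not when other search terms happen to retrieve the same page.
# All names and URLs here are hypothetical.

# Map of data-subject name -> set of URLs delisted for that name.
DELISTINGS = {
    "jane doe": {"https://example.com/1998-article"},
}

def visible_results(query, results):
    """Return the results to display for a given query."""
    suppressed = DELISTINGS.get(query.strip().lower(), set())
    return [url for url in results if url not in suppressed]

results = ["https://example.com/1998-article", "https://example.com/other"]

# A search on the data subject's name omits the delisted URL...
assert visible_results("Jane Doe", results) == ["https://example.com/other"]
# ...but a topical search using other terms still retrieves the same page.
assert visible_results("tantra workshop 1998", results) == results
```

As the sketch shows, the page itself remains indexed; only the link between the data subject's name and the page is severed.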
II. GOOGLE’S PROCESS FOR RESPONDING TO REMOVAL REQUESTS
Google has established a simple web form for individuals to submit removal requests under European data protection law: https://support.google.com/legal/contact/lr_eudpa?product=websearch
Google asks the requester to explain “why the content on this URL in search results is distributed unlawfully, inaccurate, or outdated.” And within the form, Google explains: “When evaluating your request, we will look at whether the results include outdated information about you, as well as whether there’s a public interest in the information — for example, we may decline to remove certain information about financial scams, professional malpractice, criminal convictions, or public conduct of government officials.”
As of August 15, 2016, Google had received 528,756 requests and evaluated 1,634,370 URLs for removal. Of these, it had removed 594,373 URLs and declined to remove 785,968 URLs.
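As a rough check on those figures, among the URLs Google had actually decided (removals plus refusals, setting aside URLs still pending), the removal rate works out to about 43%:

```python
# Google's reported figures as of August 15, 2016 (from the text above).
urls_evaluated = 1_634_370
urls_removed   = 594_373
urls_declined  = 785_968

decided = urls_removed + urls_declined   # 1,380,341 URLs decided
pending = urls_evaluated - decided       # 254,029 URLs pending or otherwise unresolved
removal_rate = urls_removed / decided    # share removed among decided URLs

print(f"removal rate among decided URLs: {removal_rate:.1%}")  # ~43.1%
```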
Some of the most common material factors involved in decisions to delist pages include:
- Clear absence of public interest: For example, aggregator sites with pages that contain personal contact or address information, instances where the requester’s name no longer appears on the page, and pages that are no longer online (404 error).
- Sensitive information: Pages with content that relates solely to information about someone’s health, sexual orientation, race, ethnicity, religion, political affiliation and trade-union status.
- Content relating to minors: Content that relates to minors or to minor crimes that occurred when the requester was a minor.
- Spent convictions/exonerations/acquittals for crimes: Consistent with local law governing the rehabilitation of offenders, we tend to weigh in favor of delisting content relating to a conviction that is spent, accusations that are proven false in a court of law, or content relating to a criminal charge of which the requester was acquitted. We also consider the age of this content and the nature of the crime in our analysis.
A “strong public interest” is among the “most common material factors involved in decisions not to delist pages.”
We may decline to delist if we determined that the page contains information which is strongly in the public interest. Determining whether content is in the public interest is complex and may mean considering many diverse factors, including—but not limited to—whether the content relates to the requester’s professional life, a past crime, political office, position in public life, or whether the content itself is self-authored content, government documents, or journalistic in nature.
Google’s team of reviewers “uses dedicated escalation paths to senior staff and attorneys at Google to adjudicate on difficult and challenging cases. As of November 1, 2015, just over 30% of requests had been escalated for a second opinion.”
While the delisting request form asks the submitter to select the European country “whose law applies to your request,” Google’s processing appears to honor removal requests submitted even by those who are not EU nationals.
Google provides notice to webmasters that it has decided to remove certain pages on their sites from search results due to a request under European data protection law. Google does so “in the interest of transparency”; however, it opts not to provide the requester’s name or any details other than the affected URLs in order to “respect the privacy of the individuals who have made removal requests.”
The notice email includes a link to a form for webmasters to submit information for Google to reconsider its decision. Google advises: “Unfortunately, due to concerns regarding the privacy of the person who submitted the request, Google is not in a position to supply more details than what you've already received. We recognize that that means you may not have enough information to respond fully to the removal action. However, if you would like to request your page be reinstated and you have information or details that weren't apparent to Google at the time we performed the initial removal, you may provide those details here.” The webmaster may then submit information to explain why the website should be restored as a search result, including the following options:
Have you changed the content of the page to remove sensitive content?
Are there facts or context that Google may not have been aware of at the time the removal was performed, which bear on the public interest in seeing this page appear in Google's search results?
Have circumstances changed since the removal was performed, such that a stronger public interest exists than before? (For example, is a person mentioned on the page now running for political office?)
Is there some other reason, not clear on the face of the page, why the inclusion of this URL in Google's search results is in the public interest?
Like other publishers, Dow Jones has, upon receipt of notice, challenged Google to reconsider its decisions to remove links to its articles in response to certain searches. For example, in July 2014, Dow Jones received notice that Google was no longer showing a link to a 1998 Wall Street Journal article about modern entrepreneurs’ efforts to capitalize on the 4,000-year-old teachings of Tantra. The Journal identified the individual who had made the removal request: a Netherlands-based investor who had participated in a Tantra workshop described in the 1998 article. In a new article about the removal, he confirmed that he had made the request and, while noting the irony of the publicity surrounding it, said, “If it adds a little more context to it, that is not a problem. That is actually better.” Dow Jones challenged Google’s decision. Google responded that it agreed that removal “affects your ability to convey information to your readers, as well as your readers’ right to receive that information”; however, while guidance from regulators and courts about how to strike the balance between the public’s interest and the data subject’s privacy rights was not yet clear, Google’s “best interpretation” was that it should keep in place the removal of this link in response to “certain name queries on European versions of Google Search.”
In contrast, Dow Jones successfully challenged Google’s decision to remove, in response to some search queries, a link to a 2001 Journal article entitled, “Former Elektrim Executive’s Methods Resurrected Specter of Secretive Past.” This Journal article analyzed the tenure of a U.S. citizen as CEO of a major Polish conglomerate in the course of exploring “what ails Poland’s biggest industries . . . more than a decade after communism ended.” The CEO, even while seeking to have this article forgotten, had repeatedly and recently publicly invoked her service as Elektrim CEO in other websites and was apparently content to permit the public easy access to a more glowing 1999 New York Times article about her tenure. Citing “legal sensitivities around sharing the personal data of those who invoke their rights under data protection law,” Google declined to provide any details about the original request. Google agreed, however, with Dow Jones that links to the article should be fully restored: “Google Inc. has re-evaluated its original removal action in light of your message and concluded that it was made in error, due to the public interest in this page appearing on searches for the data subject’s name. That removal action has been revoked, and the data subject has been notified of the change.”
III. KEY LEGAL DEVELOPMENTS
Article 29 guidelines
On 25 July 2014, the Article 29 Data Protection Working Party, established under Directive 95/46/EC, met with representatives of Google, Microsoft and Yahoo “to ask search engines about their practical implementation of the ruling, and to provide input to future WP29 guidelines”. The Article 29 Working Party stated that the guidelines would seek to ensure “a consistent handling of complaints by European DPAs facing requests lodged by individuals following delisting refusals by search engines” and “should also frame the action of search engines ensuring the consistent and uniform implementation of the ruling.”
Following that meeting, on 30 July 2014 Google submitted its responses to the Article 29 Working Party’s questionnaire.
Key points from Google’s Response were as follows:
1) Google described the process it had established for assessing removal requests and the information it required from requesters as part of that process.
2) Google explained that it had interpreted the Ruling as applying only to searches carried out on a person’s name but not to other kinds of search request.
3) Google explained that it had interpreted the Ruling as requiring it to remove name-based search results from the European versions of its search engines i.e. the country domains targeting EU and EFTA countries, but that it did not remove search results from non-European versions of search such as .com. It rationalised this approach on the basis that fewer than 5% of European users used .com and explained that it redirected European users of .com to European versions of its search engine.
4) Google explained that where it had delisted a search result, it notified the original publisher of the fact that the URL had been removed.
5) Google set out some of the criteria that it took into consideration when deciding whether to remove a search result, including the individual concerned and whether he/she was a public figure, the publisher of the information (e.g. whether it was a reputable news provider or an authoritative source) and the nature of the information published (e.g. political speech, if it was published by the data subject, or if the information related to a person’s profession or to a criminal conviction).
6) Google stated that as of 18 July 2014 it had received more than 91,000 removal requests covering more than 328,000 URLs. Of the URLs for which it had received removal requests, it had removed around 53%, requested further information for around 15%, and refused to remove 32%.
7) Google said that it had found it challenging to make decisions on some removal requests where it lacked the factual context for a request, without which it was difficult to balance the competing interests. It noted that some requesters provided inaccurate or incomplete information, so that Google sometimes lacked information that would speak in favour of preserving accessibility; for example, requesters failing to mention a role in public life, or omitting to mention recent criminal convictions when requesting removal of older similar convictions.
8) Google stated that it had established an Advisory Council on the Right to be Forgotten, to advise it on principles, policies and processes.
In November 2014, the Article 29 Working Party adopted guidelines for the implementation of the Google Spain judgment.
The Article 29 Working Party Guidelines gave the following general guidance:
1) The Working Party confirmed that the Ruling only applied to search results produced in response to searches on a person’s name and did not require the deletion of links to search results from the indexes of the search engine altogether.
2) The Working Party stated that delisting decisions must be implemented in a way which guaranteed the “effective and complete protection of data subjects’ rights” and prevented EU law from being circumvented. It considered that de-listing limited to EU domains was insufficient and that “de-listing should also be effective on all relevant domains, including .com.”
3) The Working Party accepted that search engines were entitled to make general statements to the public about the fact that removals had taken place following successful removal requests, but only if they did not identify the requester.
4) The Working Party said that search engines should not, as a general practice, inform webmasters of the fact that certain URLs had been de-listed from their search results and questioned whether such communications were consistent with EU data protection law. But the Working Party said that it would be legitimate for a search engine to contact the original publisher when assessing a request to de-list.
5) In order to enable search engines to make their assessments, data subjects should provide the reasons why they require de-listing, identify the specific URLs and indicate whether they fulfil a role in public life, or not.
6) When a search engine refuses a de-listing request, it should provide sufficient explanation to the data subject about the reasons for the refusal.
7) The right to de-listing does not apply to internal search engines within websites.
8) While the right to data protection applies to “everyone” under Article 8 of the EU Charter on Fundamental Rights, EU DPAs will in practice focus on claims where there is a clear link between the data subject and the EU.
The Article 29 Working Party also set out in Part II of the November 2014 Guidelines the criteria which European Data Protection Authorities (DPAs) should use when deciding whether to order the de-listing of search results. The Working Party said the list of criteria, is intended as a “flexible working tool”, that it is non-exhaustive and that no one factor is determinative. The list of criteria is as follows:
1) Does the search result relate to a natural person and come up against a search on that person’s name? DPAs should also consider pseudonyms and nicknames when an individual can show that they were linked to his real identity.
2) Does the data subject play a role in public life, or is the data subject a public figure? The Working Party said that this exception is wider than just public figures, but concerns anyone who plays a role in public life, such as politicians, senior public officials, business-people and professional people. This exception also covers public figures – individuals who, due to their functions/commitments, have a high degree of media exposure. This would include individuals in the economy, the arts and sport, as well as politics.
3) Is the data subject a minor? In general, if the data subject is a minor, this is a factor that favours de-listing.
4) Is the data accurate? In general, if data is factually inaccurate and presents an inaccurate, inadequate or misleading impression of an individual, delisting is more likely to be required. Where a dispute about the accuracy of information is ongoing, DPAs may choose not to intervene until the process is complete, e.g. a court case or a criminal investigation.
5) Is the data relevant and not excessive? Under this heading, DPAs should assess relevance by reference to:
(a) whether the data relates to the working life of the subject, or to his or her private life. The former is likely to be much more relevant than the latter.
(b) whether the search result links to information which is excessive or allegedly constitutes hate speech/slander/libel or similar offences of expression against the person. While DPAs cannot rule on these matters, they are competent to assess whether data protection laws have been complied with.
(c) whether it is clear that the data reflects an individual’s personal opinion or appears to be verified fact. The former is less likely to require de-listing than the latter.
6) Is the information sensitive? Sensitive information, such as information about a person’s health, sexuality or religious beliefs, has a greater impact on a person’s private life than ordinary personal data and is more likely to require protection.
7) Is the data up to date? Data which is out-of-date when judged by the purpose of the original processing should be removed more readily.
8) Is the data processing causing prejudice to a data subject? Although a data subject does not need to establish prejudice to invoke the right to de-list, evidence of prejudice is a strong factor in favour of de-listing.
9) Does the search result link to information that puts the data subject at risk? If the information leaves the data subject at risk of identity theft or stalking, for example, this is a factor favouring de-listing.
10) In what context was the information published? Where an individual originally gave consent to information being published, but has subsequently revoked that consent, this is a factor that favours de-listing.
11) Was the original content published in the context of journalistic purposes? The fact that information is published by a journalist whose job is to inform the public is a factor to weigh in the balance.
12) Does the publisher of the data have a legal power, or a legal obligation, to make the personal data publicly available? This may be a factor that counts against a de-listing request.
13) Does the data relate to a criminal offence? De-listing is more likely to be appropriate for minor offences committed a long time ago, than for serious offences that happened more recently.
The General Data Protection Regulation
The EU’s new General Data Protection Regulation (GDPR), which establishes a new data protection framework across the EU, was finally adopted in April 2016. It came into force on 24 May 2016 but will only apply in practice from 25 May 2018.
The GDPR is the subject of a separate outline, but two issues of particular relevance to the RTBF are highlighted here.
First, the GDPR creates a “right to erasure” in Article 17 which encompasses but may go beyond the RTBF created by the Google Spain judgment. The terms of Article 17 are very broad and create a right to have personal data erased “without undue delay” where one of the following grounds applies:
(a) the personal data are no longer necessary in relation to the purposes for which they were collected or otherwise processed;
(b) the data subject withdraws consent on which the processing is based, having previously given consent to the processing of personal data under Article 6(1)(a) or data in special categories under Article 9(2)(a), and where there is no other legal ground for the processing of the data;
(c) the data subject objects to the processing pursuant to Article 21(1) and there are no overriding legitimate grounds for the processing, or the data subject objects to the processing of personal data pursuant to Article 21(2);
(d) they have been unlawfully processed;
(e) the data have to be erased for compliance with a legal obligation in Union or Member State law to which the controller is subject;
(f) the data have been collected in relation to the offering of information society services referred to in Article 8(1).
The second key development is the clarification of the relationship between the EU data protection regime and its eCommerce Directive, which creates safe harbors for internet intermediaries (or providers of information society services in EU jargon).
It had previously been argued in two UK cases brought against Google in relation to its search engine (Hegglin v Google and Mosley v Google) that the eCommerce safe harbors did not apply to data protection claims. This was based on a particular reading of a provision in the eCommerce Directive which used ambiguous language about the relationship between the two Directives.
This was a weak argument as there was no justification for treating data protection claims differently from any type of civil claim against an internet intermediary, but it had created uncertainty about whether the eCommerce safe harbors would provide a defence.
This issue has now been resolved by the GDPR, which provides in recital 21 and Article 2(4) that “This Regulation shall be without prejudice to the application of Directive 2000/31/EC, in particular of the liability rules of intermediary service providers in Articles 12 to 15 of that Directive.”
It is now clear that internet intermediaries can rely on the safe harbors in the eCommerce Directive in response to claims based on alleged failures to comply with the GDPR. This is a very important provision for internet intermediaries.
IV. OTHER LEGAL DEVELOPMENTS
A. IN THE EU
1) Notice to Websites
The practice of providing notice to webmasters that a URL has been delisted remains a source of controversy. Notice enables the publisher to know how its audience’s ability to access its information is affected and to provide search engines with additional context and information to inform their balancing of the public and private interests. The Working Party discourages notice, saying “[t]here is no legal basis for such routine communication under EU data protection law.” But at the same time, as noted above, the Working Party acknowledges that search engines may wish to contact publishers “in order to obtain additional information for the assessment of the circumstances surrounding that request.”
With notice, however, it is frequently not difficult to identify the person who made the request for delisting. Indeed, researchers have even automated the process by exploiting differences in the scope of removal, another controversy discussed below.
2) Scope of removal
Following the Article 29 Working Party’s insistence that Google should remove search results from all its domains, including .com (see above), there have been a number of important developments in relation to the scope of removal.
Google then engaged in dialogue with a number of European data protection regulators to try to meet the regulators half-way.
On 4 March 2016, Google announced a change in policy in a blogpost by Peter Fleischer, its Global Privacy Counsel.
Google’s previous practice when implementing the Google Spain decision had been to de-list search results from its European versions of search, but not from non-European versions of search such as .com.
The change of policy means that in addition Google would now use geolocation signals like IP addresses to restrict access to the delisted URL on all Google Search domains, including google.com, when accessed from the country of the person requesting the removal. The delisted search results would still be accessible to users outside of that country using non-European Google Search domains.
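The two-part test described above (delisting on European Search domains for all users, plus geolocation-based suppression on all domains for users in the requester's country) can be sketched as follows. The domain list and country codes are hypothetical simplifications, not Google's actual configuration:

```python
# Sketch of the delisting scope under Google's March 2016 policy, as
# described above: (1) a delisted URL is hidden on European versions of
# Search for everyone, and (2) hidden on ALL domains, including .com,
# for users whose IP geolocates to the requester's country.
# Hypothetical simplification for illustration only.

EUROPEAN_DOMAINS = {"google.fr", "google.de", "google.es", "google.co.uk"}

def is_suppressed(domain, user_country, requester_country):
    """Should a delisted URL be hidden for this domain/user combination?"""
    if domain in EUROPEAN_DOMAINS:
        return True  # delisted on all European versions of Search
    # On non-European domains, suppress only via geolocation.
    return user_country == requester_country

# For a URL delisted at the request of a person in France ("FR"):
assert is_suppressed("google.fr", "DE", "FR")        # hidden on European domains
assert is_suppressed("google.com", "FR", "FR")       # hidden on .com from France
assert not is_suppressed("google.com", "US", "FR")   # visible on .com elsewhere
```

The third assertion captures the gap CNIL objected to: the same .com query from outside the requester's country still returns the delisted result.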
It appears that Google’s compromise has achieved a reasonable degree of acceptance among European data protection regulators.
However, the French data protection regulator (CNIL) has rejected Google’s new policy and insisted that Google extend de-listing to all versions of its search engine, wherever in the world they are accessed. In a decision dated 10 March 2016, CNIL fined Google €100,000. https://www.cnil.fr/en/right-be-delisted-cnil-restricted-committee-imposes-eu100000-fine-google
“The right to be delisted is derived from the right to privacy, which is a universally recognized fundamental right laid down in international human rights law. Only delisting on all of the search engine's extensions, regardless of the extension used or the geographic origin of the person performing the search, can effectively uphold this right.”
CNIL argues that Google’s policy does not give people effective, full protection of their right to be delisted because:
(1) Personal or professional contacts living outside Europe can still access the delisted search result linking to content that may infringe the privacy of the person concerned;
(2) Personal or professional contacts living in Europe and using a non-European search engine extension (".com") with a non-French IP address (British, Spanish, Swiss, etc.) can still access the delisted search result;
(3) Certain technical solutions can easily get around Google's filtering system by allowing Internet users to change the geographic origin of their IP address.
In May 2016, Google announced that it was appealing CNIL’s order to France’s Supreme Administrative Court, the Conseil d’Etat. http://googlepolicyeurope.blogspot.be/2016/05/a-principle-that-should-not-be-forgotten.html
In the blogpost announcing the appeal, Kent Walker, Google’s General Counsel, explained Google’s fundamental objection to CNIL’s insistence on global removals:
“As a matter of both law and principle, we disagree with this demand. We comply with the laws of the countries in which we operate. But if French law applies globally, how long will it be until other countries - perhaps less open and democratic - start demanding that their laws regulating information likewise have global reach? This order could lead to a global race to the bottom, harming access to information that is perfectly lawful to view in one’s own country. For example, this could prevent French citizens from seeing content that is perfectly legal in France. This is not just a hypothetical concern. We have received demands from governments to remove content globally on various grounds -- and we have resisted, even if that has sometimes led to the blocking of our services.”
Reporting of RTBF removals
Given the ongoing controversy over the Google Spain ruling, one difficult issue is how to balance the private and public interests involved in news articles reporting on RTBF removals.
As Google explains, “After we removed a news story about a minor crime, the newspaper published a story about the removal action. The Information Commissioner’s Office ordered us to remove the second story from search results for the individual’s name. We removed the page from search results for the individual's name.” According to the Guardian, the ICO ordered Google in August 2015 to remove “nine links to current news stories about older reports which themselves were removed from search results under the ‘right to be forgotten’ ruling.”
By contrast, in October 2015, Spain’s data protection authority AEPD ruled that Mr. Costeja, the individual whose complaint led to the Google Spain ruling, did not have the right to obtain the removal of links to recent commentary about his case in light of the public interest in the ruling and the facts behind it.
3) Extension to News Archives
The Google Spain decision itself did not affect what the original publisher could maintain in its archives, just the search engine’s conduct. However, claimants are not shy about seeking to extend the RTBF to apply directly to publishers’ online archives. A brief survey of reports on such developments in different jurisdictions:
BELGIUM – The Court of Cassation rejected the appeal by Rossel, publishers of Le Soir newspaper, and affirmed a ruling requiring the publisher to remove from its online archives a 1994 report about a fatal car accident.
GERMANY – Apollonia – The German Federal Constitutional Court is currently considering a claim against the publisher of Der Spiegel regarding its online archive. This follows a decision by the Federal Court of Justice allowing an article published in the 1980s to remain online and accessible via search engines. The Constitutional Court is expected to address whether the operator of an online archive must limit access via name searches.
In a separate case, which was not appealed, a court ordered a publisher (Süddeutsche Zeitung) to cease and desist from making challenged articles available “in a way that they could be found and listed by internet search engines upon entering the claimant’s name as a search term”. In its decision of 7 July 2015, the Senate ordered a German publisher to refrain from making an article available on its website in response to a search solely for the claimant’s name. It referred to the CJEU’s decision in Google Spain and reasoned that if Google could be required not to show compromising search results for a name, then the operator of an online archive could likewise be ordered to exclude such results. It remains unclear, however, whether the operator of an online archive can technically restrict search-engine access so that an article is not listed in response to a name search but remains otherwise available online.
SPAIN - The Spanish Supreme Court ruled that El Pais did not have to alter its own archive of news material. “Digital archives are protected by freedom of information as they satisfy a public interest. That is why old news cannot be canceled or altered.” However, the Court extended to the news publisher the obligation to take steps to ensure that “obsolete” news articles are not retrievable through search engines.
B. BEYOND THE EU
UNITED STATES – In January 2015, the U.S. Court of Appeals for the Second Circuit affirmed summary judgment in favor of several publishers of news reports about the plaintiff’s 2010 arrest. The criminal charges against the plaintiff had been nolled – unilaterally dropped by the prosecutor – and, pursuant to Connecticut’s Criminal Records Erasure Statute, the records of her arrest and prosecution were “erased” and she became legally “deemed to have never been arrested.” Conn. Gen. Stat. § 54-142a(e)(3). Citing the effects of the Erasure Statute, the plaintiff unsuccessfully asked the publishers to remove the reports of her arrest from their websites and then sued them for libel, false light, negligent infliction of emotional distress and invasion of privacy. The Second Circuit held that “the reports of her arrest were true at the time they were published. Neither the Erasure Statute nor any amount of wishing can undo that historical truth. The Moving Finger has written and moved on.” Martin v. Hearst Corp., 777 F.3d 546, 552 (2d Cir.), cert. denied, 136 S. Ct. 40, 193 L. Ed. 2d 26 (2015). The court noted that while several states have similar erasure statutes, they are not meant to alter the historical record. Id. at 551 & n.6.
UNITED STATES – California Eraser Button Law – In 2013, California enacted a law entitled “Privacy Rights for California Minors in the Digital World,” with a provision requiring certain websites to permit minors to delete material they posted. SB 568, California Business & Professions Code § 22581. The law became effective January 1, 2015. At last check there have been no reported decisions. The law contains several ambiguities and is likely to face challenges, including on First Amendment grounds.
RUSSIA – Russia enacted a law, effective January 1, 2016, instituting a right to be forgotten by imposing certain obligations on search engines that receive delisting requests.
CHINA – In May 2016, the Haidian District People's Court in Beijing issued a ruling rejecting a claimant’s attempt to assert a privacy right that would have prohibited a search engine from displaying certain keywords as related search suggestions in connection with searches for his name. “This information happens to form a portion of his professional history, and his current individual professional credibility is both directly relevant and of ongoing concern.”
JAPAN – In December 2015, a district court in Japan recognized a “right to be forgotten” as the basis to order Google to remove three-year-old search results about a man’s arrest for child prostitution and pornography charges. Google appealed the ruling and in July the Tokyo High Court reversed, stating that “The right to be forgotten is not a right that has been defined by law, so there is no need to make an independent judgment because it is no different from requests based on the right to privacy and the right to having one’s honor protected.”
 The views in this outline are those of the individual authors and do not necessarily represent the views of their employers, Google, or other clients of Matrix.
 CJEU Ruling: http://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX:62012CJ0131
CJEU Press Release of Ruling: http://curia.europa.eu/jcms/upload/docs/application/pdf/2014-05/cp140070en.pdf
 http://ehoganlovells.com/rv/ff00215e62d39df2fd98d8850c46857ed70a4ade; https://www.article19.org/resources.php/resource/38099/en/legal-analysis:-russia's-right-to-be-forgotten
 http://www.asahi.com/ajw/articles/AJ201608240007.html; for background see http://www.japantimes.co.jp/news/2016/02/27/national/crime-legal/japanese-court-recognizes-right-to-be-forgotten-in-suit-against-google/#.V1nUr_krLGg