Right To Be Forgotten

From Justice Definitions Project

Introduction

The "Right to Be Forgotten" (RTBF) refers to an individual's ability to request the removal of personal information from public access, particularly online, to safeguard their privacy and reputation. It represents a legal entitlement allowing individuals to request the removal of links to private information by search engines. This right seeks to balance the individual's right to privacy against considerations of public interest. It is significant in the digital age, where personal data can be easily disseminated and persist indefinitely on the internet.[1]

RTBF as Defined in Legislations

India

In India, the Digital Personal Data Protection Act, 2023, notably in Section 12, introduces the "Right to correction and erasure of personal data." This provision empowers individuals, referred to as data principals, to request correction or erasure of their personal data.[2] If a data fiduciary receives such a request, it is legally obligated to correct, update, or complete the information, or to erase it if it is no longer required for its original purpose, barring legal requirements for retention. The provision reads as follows:

“12. (1) A Data Principal shall have the right to correction, completion, updating and erasure of her personal data for the processing of which she has previously given consent, including consent as referred to in clause (a) of section 7, in accordance with any requirement or procedure under any law for the time being in force.”

European Union

Following the landmark ruling in Google Spain v. AEPD and Mario Costeja González, the right to be forgotten was codified in Article 17 of the General Data Protection Regulation (GDPR). Article 17, titled "Right to erasure ('right to be forgotten')," empowers individuals to request the erasure of their personal data without undue delay under specific conditions.[3] These include scenarios where the data is no longer necessary for the purpose for which it was collected, the individual withdraws consent, or the data has been unlawfully processed. The right extends beyond delisting requests to broader contexts, such as the deletion of personal data from public registers after a designated period or the removal of unlawfully processed data.

Article 17(2) of the GDPR places an additional obligation on data controllers who have made personal data public. When required to erase such data, controllers must take reasonable steps, considering available technology and implementation costs, to inform other controllers processing the data about the erasure request, including any links to, copies of, or replications of the data. The GDPR also outlines criteria for balancing the right to be forgotten with competing interests, such as freedom of expression and information. Since the enactment of Article 17, the CJEU has provided further guidance on interpreting search engine operators' obligations under this provision. These obligations must balance the fundamental rights to privacy and data protection (Articles 7 and 8 of the Charter) against freedom of expression and information (Article 11 of the Charter).

In the GDPR, recitals are the introductory statements or preambles that provide context, explanations, and reasons behind the specific provisions of the regulation. While the recitals themselves do not have the force of law, they are used to help interpret the articles (the legal provisions) of the GDPR. They clarify the intent and objectives behind the legislation, making it easier for organizations, legal experts, and regulators to understand how the regulation should be applied. Recital 65 of the GDPR outlines the Right to Rectification and the Right to Erasure, which are fundamental rights of data subjects. The right to rectification allows individuals to have their personal data corrected if it is inaccurate or incomplete. The right to erasure gives data subjects the ability to have their personal data erased when its retention no longer serves the purpose for which it was originally collected or processed. This right applies particularly when the data is no longer necessary, when the data subject withdraws consent, when the subject objects to the processing, or when the processing does not comply with the GDPR.[4]

A special consideration is made for children, especially those who consented to the processing of their data at a young age and later wish to have it removed due to a lack of awareness of the risks involved, particularly when the data is online. This right to erasure remains available even when the individual is no longer a child. However, there are exceptions to the right of erasure. Data retention may be necessary for reasons such as exercising the right to freedom of expression, complying with legal obligations, performing tasks carried out in the public interest or under official authority, safeguarding public health, or for scientific, historical, or statistical research. Additionally, the retention of data may be lawful for the establishment, exercise, or defense of legal claims. Thus, Recital 65 clarifies both the rights individuals have over their personal data and the circumstances under which those rights may be limited or overridden, ensuring a balance between personal privacy and public interests.

Recital 66 of the GDPR strengthens the Right to Be Forgotten in the online environment by extending the scope of the right to erasure. It specifies that when a controller (the entity responsible for processing personal data) has made personal data public, it has an obligation to inform other controllers that are processing that data to erase any links, copies, or replications of those personal data. This ensures that the right to erasure is effectively implemented across all instances of the data’s processing, not just the original controller’s systems.[5]

The recital further clarifies that the controller should take "reasonable steps" to notify other controllers about the data subject’s request for erasure. These steps should be proportionate, considering the available technology and the means at the controller's disposal, which may include the use of technical measures. This provision ensures that the erasure process is comprehensive and includes not only the data held by the original controller but also by any other entities that have processed or replicated the data. This further reinforces the protection of an individual's privacy in the digital age.

RTBF as Defined in Official Reports

Parliamentary Committee Report

The recommendations of the Joint Parliamentary Committee on the Data Protection Bill emphasized balancing privacy rights with freedom of speech and the public's right to information. The Committee extensively discussed Clause 18 of the proposed legislation, which addresses the right of data principals to correct and erase personal data. One key point of contention was the limitation placed on the right to erasure under Clause 18(1)(d), which allows erasure only if the data is no longer necessary for its intended purpose. The Committee questioned the justification for this restriction, raising concerns about its impact on the data principal's rights. The Ministry of Electronics and Information Technology (MeitY) clarified that the limitation was intended to prevent frivolous requests for data erasure after the purpose has been fulfilled.[6]

The Committee further examined situations where data cannot be erased due to legal obligations, such as when data must be retained for verification or record-keeping. They noted that in cases where a false declaration is made to receive government benefits, erasure might not be possible. MeitY explained that the right to erasure under Clause 18(1) is subject to specific conditions and regulations, which could prevent unnecessary litigation. The Committee also raised concerns about the potential misuse of this qualification by data fiduciaries to deny erasure requests by claiming the data is still relevant for processing.

To address these concerns, MeitY highlighted safeguards for data principals, including the right to appeal a denial of erasure requests and the obligation of data fiduciaries to delete data once its purpose has been fulfilled. Despite these protections, the Committee expressed concerns about the financial and operational feasibility of complying with erasure requests, given the limitations of technology and resources. The Committee acknowledged that while privacy and individual rights are paramount, practical considerations such as technology, cost, and feasibility must also be taken into account. They recommended that the Data Protection Authority (DPA) establish regulations that balance these concerns and align with international best practices, ensuring that data principals' rights can be exercised effectively without imposing undue burdens on data fiduciaries.

BN Srikrishna Committee Report

RTBF was extensively analyzed in the BN Srikrishna Committee's report, which emphasized its importance in balancing individual autonomy and privacy against competing rights such as freedom of speech and the public's right to information. The report defined it as a right which "enables individuals to limit, de-link, delete, or correct the disclosure of personal information online, particularly when it is misleading, embarrassing, irrelevant, or outdated. The right extends to instances where data disclosure, lawful or unlawful, results in unfairness or harm to an individual."

The right is derived from the data fiduciary's obligation to process data in a fair and reasonable manner. It empowers individuals to challenge the disclosure of personal data they perceive as unfair or no longer relevant, even if it initially occurred with consent. However, the right is defeasible and must be balanced against other fundamental rights and public interests.

A structured five-point balancing test was recommended to evaluate RTBF claims:

  • Sensitivity of Data: How personal or sensitive the data is.
  • Scale of Disclosure: The degree to which the data has been disseminated.
  • Public Role of the Data Principal: Whether the individual holds public office or is a public figure.
  • Relevance of Data: Whether the data's relevance has diminished over time or due to changing circumstances.
  • Nature of Disclosure and Data Fiduciary's Role: Whether the disclosure is by a credible source or pertains to a public record.

Challenges in Implementation:

  • Balancing Competing Rights: RTBF inherently conflicts with the right to free speech, freedom of the press, and public access to information. Blanket deletion of personal data might impede these freedoms and risk rewriting public records.
  • Broad-Based Deletion Concerns: Permanent deletion of data might harm collective knowledge and even affect private archives, making republishing impossible in the future.

Regulatory Mechanism:

The Adjudication Wing of the Data Protection Authority (DPA) was suggested as the competent body to adjudicate RTBF requests. Decisions should be made following statutory parameters and based on clear guidelines to ensure consistency and fairness. Unlike the European Union (EU), where private data controllers (like Google) handle RTBF requests, the committee opposed relying solely on private entities, citing risks of inconsistency and prioritization of corporate interests.

Public Opinion on RTBF:

During consultations, opinions were divided. Some opposed the inclusion of RTBF in India’s legal framework, arguing it adds no significant benefit and hampers internet accessibility. Others supported its inclusion but sought exclusions for public information, algorithmically processed data, and critical information like credit histories and criminal records. The committee recommended avoiding automatic application of RTBF decisions to third parties unless explicitly requested and approved by the DPA. Such decisions should be narrowly tailored and address specific fiduciaries involved in the data's disclosure.

The committee eventually made the following recommendations:

  • RTBF should be adopted with a careful balancing test incorporating the above criteria.
  • The right should not permit permanent erasure of data but focus on limiting accessibility (e.g., de-linking search results).
  • The adjudicatory authority should specify time frames for implementation and clearly define the extent of RTBF application to ensure clarity for all stakeholders.

UNDP Guide on Drafting Data Protection Legislation: A Study of Regional Frameworks

In this 2023 report by the Centre for Communication Governance at National Law University Delhi and the United Nations Development Programme, RTBF is defined as "a contemporary data protection right that enables data subjects to request that their data is erased in certain circumstances. In the digital context, this right is usually exercised to require search engines and websites to remove information from search results and webpages. The operationalization of this right can have significant implications for access to information and the freedom of expression, and it must be carefully balanced against these factors."[7]

OAS Principles on Privacy and Personal Data Protection

The Eighth Principle of this report by the Organization of American States (OAS) emphasizes the importance of providing individuals with mechanisms to access, correct, delete, object to the processing of, and request the portability of their personal data. These rights are critical for privacy protection and should be implemented through reasonable, effective, and non-discriminatory processes. Generally, exercising these rights should be free of charge, with limited exceptions for costs associated with reproduction or delivery of data.[8]

Key provisions include the right to access and rectify personal data to ensure its accuracy, the right to request the erasure of data that is irrelevant, no longer necessary, or processed without valid consent, and the right to object to its processing or request portability where applicable. National legislation should clearly outline the terms, conditions, and grounds for restricting these rights, such as when data processing serves the public interest, fulfills legal obligations, or aligns with the legitimate interests of the data controller.

Some jurisdictions offer a judicial mechanism known as habeas data, enabling individuals to challenge the misuse of their personal data. States are encouraged to ensure equitable access to such mechanisms for vulnerable groups. In certain cases, these rights may extend posthumously to legal representatives or relatives, depending on domestic laws.

Finally, the principle prohibits discrimination by data controllers or processors against individuals exercising these rights, ensuring fair access to goods or services. Balancing privacy rights with broader public interests, such as freedom of information and proportionality, is essential for effective implementation.

Recognition by Jurisdiction

European Union

In Europe, data protection laws do not explicitly provide for a right to be forgotten, but rather a more limited "right to erasure." Various iterations of this concept have existed for decades. For instance, in the United Kingdom, the Rehabilitation of Offenders Act of 1974 allows for the removal of certain criminal convictions from consideration after a set period, ensuring that such information does not affect insurance or employment opportunities. In France, the "right to be forgotten" (le droit à l'oubli) was legally recognized in 2010, further solidifying the concept in European law.

The debate over the right to be forgotten is particularly significant in the context of differing views between the United States and European Union. In the U.S., the right to free speech and the "right to know" often take precedence over privacy concerns, which complicates the global application of the right to be forgotten. The European Court of Justice solidified the right to be forgotten as a fundamental human right in its ruling against Google in the landmark Costeja case on May 13, 2014.[9]

The challenge of enforcing the right to be forgotten across jurisdictions remains a point of contention. There is no global framework that allows individuals to control their online image comprehensively. Professor Viktor Mayer-Schönberger has argued that companies like Google cannot evade compliance with the European law on the grounds of their location, noting that countries such as the U.S. have long applied their local laws with extraterritorial effects.[10] In 1995, the European Union adopted the European Data Protection Directive (Directive 95/46/EC), which laid the foundation for regulating the processing of personal data and was later incorporated into human rights law.[11] This directive was replaced by the General Data Protection Regulation (GDPR) in 2016, providing enhanced protections for individuals.

To request the removal of personal information from search results, individuals must complete a form on the search engine's website. Google's process requires applicants to provide their country of residence, personal information, the URLs to be removed, and sometimes legal identification. After submission, Google reviews each request, balancing the individual's right to privacy against the public's right to know. If approved, the relevant URLs are removed from search results, though the content itself remains online. In some cases, Google has faced criticism for incorrectly removing content, particularly news articles, where the public interest outweighed privacy concerns.

Google’s compliance with the right to be forgotten has sparked controversy, with some news organizations and privacy advocates calling for greater transparency. In May 2015, a leak revealed that 95% of Google’s privacy requests came from individuals seeking to protect personal and private information, not from criminals or public figures. This disclosure led to widespread public concern, as some of the content removed included unflattering information about individuals in sensitive professions, such as doctors.

The European Union has advocated for the global application of the right to be forgotten, requesting that Google implement delinks across all its international domains. Google, however, has resisted, leading to ongoing antitrust investigations by the European Commission. In September 2019, the Court of Justice of the EU ruled that the right to be forgotten does not extend beyond EU borders, meaning Google is not obliged to remove links from non-EU domains. However, the EU continues to assert that search engines must comply with privacy protections for EU citizens, and in some cases, such as with certain managers seeking removal of inaccurate news articles, courts have favored the applicants. This evolving legal landscape continues to shape the way personal information is treated across borders.

Switzerland

The right to be forgotten was incorporated into the Constitution of the Canton of Geneva under the newly introduced Article 21A, titled "Right to Digital Integrity," which was adopted on June 18, 2023. Translated into English, the provision states:

"Digital integrity includes, in particular, the right to protection against the improper processing of data related to one's digital life, the right to security in the digital space, the right to an offline life, and the right to be forgotten."[12]

United States

In March 2017, New York State Senator Tony Avella and Assemblyman David Weprin introduced a bill proposing that individuals be allowed to compel search engines and online publishers to remove information deemed "inaccurate," "irrelevant," "inadequate," or "excessive" if it is no longer pertinent to public discourse and is causing demonstrable harm to the individual.[13][14]

In June 2018, California enacted the California Consumer Privacy Act (CCPA), granting consumers the right to request the deletion of their personal information held by covered businesses.[15] In October 2023, the state further strengthened data privacy protections by enacting the California Delete Act, which requires the California Privacy Protection Agency to create a centralized deletion mechanism, enabling consumers to direct data brokers to delete their personal information efficiently.[16][17]

The California Minor Eraser Law grants residents of California under the age of 18 the right to request the removal of content they have posted on online platforms.[18] This law applies to websites, social media platforms, mobile apps, and other online services, drawing inspiration from Europe’s recognition of the “right to be forgotten.” Effective since January 1, 2015, it mandates that operators of online services directed toward minors update their privacy policies to include the option for minors to request the deletion of their posted content.[19][20]

RTBF as Defined in Case Laws

European Union

Google Spain SL v. Agencia Española de Protección de Datos

Much of the jurisprudence on RTBF originates from Google Spain v. AEPD and Mario Costeja González.[21] In 1998, the Spanish newspaper La Vanguardia, acting under orders from the Spanish Ministry of Labour and Social Affairs, published announcements regarding the forced sale of assets to recover certain debts. One of the properties mentioned belonged to the petitioner, Mario Costeja González. In 2009, González discovered that a search for his name led to these announcements and requested their removal, arguing that they were no longer relevant. The newspaper refused to delete the data, citing the government orders. He then asked Google Spain and Google Inc. to erase the links, and subsequently complained to the Spanish Data Protection Agency (AEPD), which upheld his request. Google Spain and Google Inc. contested this decision, arguing that they were exempt from the EU Data Protection Directive, that their search functions did not process personal data, and that González had no right to the erasure of lawfully published material. The Spanish National High Court (Audiencia Nacional), hearing the appeal, sought a preliminary ruling from the Court of Justice of the European Union (CJEU) on the interpretation of the Directive, particularly regarding its territorial scope, the role of search engines as data controllers, and the establishment of RTBF.

The CJEU delivered its verdict in 2014, ruling in favour of González and holding that he had the right to request the deletion from Google search results of personal information that was now outdated and irrelevant. The court held that search engines like Google are subject to EU data protection laws and can be considered data controllers, with obligations to respect individuals' rights to privacy and data protection. This landmark decision established the RTBF within the EU, emphasizing the need to balance privacy rights against freedom of speech and expression and access to information. It also set a precedent for future cases involving data protection and digital privacy, shaping the jurisprudence and prompting global discussions on the intersection of technology, privacy, and fundamental rights.[22][23]

Expanding Jurisdiction and Further Rulings

In December 2022, the European Court of Justice extended the right to be forgotten in the case C-460/20 TU, RE vs Google LLC.[24] The case involved two investment company managers who argued that certain news articles about them were inaccurate and requested that these articles be removed from search engine results. The Court ruled that search engine operators like Google must de-reference information when an individual provides relevant and sufficient evidence to substantiate the inaccuracy of the information. For thumbnails (preview images), a separate independent assessment must be conducted, but the same principle applies.

India

Dharamraj Bhanushankar Dave v. State of Gujarat and Ors.

The petitioner, acquitted of criminal charges including culpable homicide, filed a plea under Article 226 of the Constitution seeking a permanent restraint on the public display of the judgment. The petitioner argued that despite the judgment being marked as “non-reportable,” it was accessible on legal portals and through online searches, adversely impacting his personal and professional life. He further contended that such publication exceeded the respondents’ authority, asserting that the court’s Registrar held exclusive control over such records. The case was decided by Justice R.M. Chhaya, who dismissed the petition, emphasizing several points: the High Court, as a Court of Record, is governed by the Gujarat High Court Rules, 1993, which allow parties to access judgments. Third-party access requires an affidavit specifying the grounds for obtaining such copies. The court held that the petitioner failed to demonstrate any legal violation warranting redress under Article 226 and clarified that publishing judgments online does not fall under the definition of “reportable” judgments as understood in legal reporting. Ultimately, the court declined to recognize the “right to be forgotten,” setting a significant precedent in Indian jurisprudence.[25][26]

Mr Xxxxx v. The Registrar General

In this case, the petitioner approached the Karnataka High Court stating that an internet search for his daughter's name returned personal information and facts of a criminal case involving his son-in-law. He requested the court to direct the removal of this information, citing injury to reputation. The court held that individuals have the right to request the removal or redaction of their personal information from online sources under certain circumstances, especially when the information is irrelevant, outdated, or causing unwarranted harm. This landmark decision set a precedent for the recognition and enforcement of RTBF in India.

Zulfiqar Ahman Khan v. Quintillion Business Media

Zulfiqar Ahman Khan, a media executive, filed a suit for defamation against Quintillion Business Media and sought a permanent injunction to remove articles alleging sexual harassment published on their digital news platform, Quint. Khan claimed the allegations were baseless and caused significant personal and professional harm. The Delhi High Court, presided over by Justice Pratibha Singh, ordered Quint to remove and refrain from republishing the articles during the legal proceedings. Recognizing Khan’s right to reputation and privacy, the court also ordered other digital platforms and search engines to ensure the articles were not republished. It emphasized that the ‘#MeToo’ movement should not become a means for perpetual defamation, urging restraint in republishing archived content. Ultimately, the parties resolved the matter amicably through mediation a few months later.[27]

Subhranshu Rout v. State of Odisha

Subhranshu Rout was accused of raping his classmate on May 3, 2020, and of threatening her with dire consequences if she disclosed the incident. He allegedly uploaded videos of the assault to a fake Facebook profile created in the victim's name, coercing her into silence. Charged with various offenses including rape, distribution of obscene content, and forgery, Rout sought bail in the High Court of Orissa at Cuttack. Justice S.K. Panigrahi presided over the case, which not only addressed bail but also examined RTBF in Indian law. The court highlighted the irreversible nature of information once it enters the public domain and discussed international precedents, including the GDPR, advocating for legislative recognition of RTBF in India. Emphasizing the victim's privacy rights and the misuse of technology in such cases, the court denied Rout's bail, stressing the need for legal mechanisms to protect the victim's fundamental rights and remove objectionable content from social media platforms.[28]

Jorawer Singh Mundy v. Union of India

The petitioner, an American citizen, faced charges under the Narcotic Drugs and Psychotropic Substances Act, 1985 while visiting India in 2009. A trial court acquitted him in 2011, and in 2013 the Delhi High Court affirmed his acquittal on appeal by the state. Upon returning to the United States, the petitioner, by then a law student, realized that the High Court's decision was accessible online: potential employers could easily find details of his case through a Google search, putting him at a disadvantage during background checks. He sent notices to several internet entities, of which only one complied. When the other platforms did not comply, the petitioner filed a writ petition before the Delhi High Court, requesting the removal of the judgment from all respondent sites and asserting his right to privacy under Article 21 of the Indian Constitution.[29]

The court observed that since the individual had ultimately been acquitted of all charges, the principle of freedom of expression could not be applied in an absolute manner. Referring to the two judgments discussed above, and recognizing the inherent prejudice and suspicion faced by an acquitted person, the court affirmed the petitioner's right to be forgotten. Consequently, the respondents were directed to remove the judgments of acquittal, and Google was directed to prevent the content from appearing in search results. Although the court acknowledged RTBF in this case, it did not mandate the deletion of the order from the website. Instead, the search results associated with the order were delinked from search engines such as Google: while the order would not appear in search engine results, it could still be accessed by searching for it directly on the website.

Dr. Krishna Menon v. High Court of Kerala

This case centered on the petitioners' plea to remove or redact personal information from online judgments, particularly from legal platforms like Indian Kanoon and search engines such as Google. Dr. Krishna Menon and other petitioners argued that the continued public availability of such judgments, revealing sensitive personal details, violated their right to privacy. The cases involved diverse issues, including matrimonial disputes, criminal proceedings, and habeas corpus petitions, where the disclosure of identities caused undue hardship and stigma. The petitioners sought the application of the "right to be forgotten," emphasizing that their personal information no longer served any public interest and perpetuated unwarranted harm.[30]

The Kerala High Court acknowledged the growing relevance of the "right to be forgotten" as an extension of the fundamental right to privacy, as recognized in Justice K.S. Puttaswamy v. Union of India (2017). The Court carefully balanced the principles of open justice, which mandates transparency and public access to judicial records, with individuals' privacy rights. It noted that while judicial pronouncements are part of the public domain to ensure accountability, personal details in certain cases, such as matrimonial disputes and sensitive family matters, deserved redaction to protect individuals' dignity. In cases involving quashed criminal proceedings or acquittals, the Court was hesitant to allow blanket removal, citing the need to uphold public interest and the transparency of judicial processes. The case presented significant questions on judicial information policy and privacy in the context of the right to be forgotten within India's open court system. The reference, initiated by Justice Anil K. Narendran, sought an authoritative pronouncement on whether individuals could seek anonymity in judgments, particularly in family or sensitive cases, to safeguard their right to privacy. The issue also extended to whether personal details in judgments published online could be de-indexed or redacted in light of privacy concerns and evolving jurisprudence stemming from the Puttaswamy judgment.

The Kerala High Court emphasized that while the open court principle ensures transparency, there are exceptions, particularly in family, matrimonial, and other sensitive cases where privacy interests outweigh public access. The Court declared that a claim for privacy in an open court system must be evaluated on a case-by-case basis, allowing the invocation of privacy rights or de-indexing in appropriate situations. It directed the High Court Registry to issue privacy notices and limit the publication of identifiable details in specific matters. While some petitions seeking to remove judgments were dismissed due to lack of merit or public interest considerations, relief was granted in cases involving family disputes, matrimonial issues, and the identity of minor children. The judgment underscores a nuanced balance between privacy and transparency in India's legal framework. While the Court did not issue a broad directive to remove judgments from online platforms, it suggested a case-by-case approach. It held that platforms and search engines should consider requests to de-index or redact information when it ceases to serve public interest or causes disproportionate harm.

Karthick Theodore v. The Registrar General and Ors.

Mr. Karthick Theodore was initially convicted by a trial court under Sections 417 and 376 of the Indian Penal Code. However, on appeal, the Madras High Court acquitted him of all charges on April 30, 2014. Following his acquittal, Mr. Theodore remarried and started a new family life. He later discovered that the High Court's judgment, which included personal details revealing his identity, was publicly accessible online. Concerned about the impact of this information on his privacy and future prospects, especially after facing difficulties such as a visa denial attributable to the accessible judgment, he sought the redaction of his personal details from the public record.[31]

Mr. Theodore filed a writ petition requesting the removal of his name and identifying information from the judgment available on the High Court's website and on Indian Kanoon, a legal database. He based his plea on the right to privacy under Article 21 of the Indian Constitution, referencing the Supreme Court's decision in K.S. Puttaswamy v. Union of India, which recognized privacy as an intrinsic part of the right to life and personal liberty. He argued that the continued public availability of his personal details served no public interest and infringed upon his right to privacy, especially given his acquittal and subsequent efforts to rebuild his life.

The single judge bench initially dismissed his petition, stating that there was no legislative provision supporting the right to be forgotten. Undeterred, Mr. Theodore appealed this decision. The division bench of the Madras High Court acknowledged the tension between the principles of open justice and an individual's right to privacy. The court emphasized that while judicial records should be preserved in their original form, moderating their public display to protect an individual's privacy does not compromise their integrity. The bench recognized the right to be forgotten as an essential facet of the right to privacy, especially when the individual's personal data no longer served a public purpose.

Consequently, the court directed the Registrar General and associated officials to redact Mr. Theodore's name and other personal details from the judgment before making it publicly accessible. Additionally, Indian Kanoon was instructed to remove the judgment from its platform to prevent further dissemination of his personal information. This decision underscores the judiciary's recognition of the right to be forgotten within the Indian legal framework, balancing the need for public access to judicial records with the protection of individual privacy rights.

Challenges in Global Application

The debate surrounding the right to be forgotten has been further complicated by differing views between the United States and the European Union. In the U.S., the right to free speech and the right to know often take precedence over privacy concerns, which complicates the global application of this right. Critics, such as Professor Viktor Mayer-Schönberger, argue that companies like Google cannot avoid complying with European law simply by being located outside Europe.[32] The EU's extraterritorial data protection laws challenge Google and other corporations to apply privacy protections globally, similar to how U.S. laws have long had extraterritorial effects.

The European Data Protection Directive (Directive 95/46/EC), adopted in 1995, laid the groundwork for regulating the processing of personal data and was later incorporated into human rights law. The directive was superseded by the General Data Protection Regulation (GDPR), adopted in 2016 and applicable from May 2018, which provides enhanced privacy protections for individuals in Europe and further underscores the growing significance of data privacy rights.

Enforcement Process and Delinked Content

To request the removal of personal information from Google’s search results, individuals must complete a form on the search engine's website. The process typically requires applicants to provide their country of residence, personal details, the URLs to be removed and, in some cases, proof of legal identification. Google reviews each request, weighing the individual’s right to privacy against the public’s right to access information. If a request is approved, the relevant URLs are removed from search results, although the content itself remains on the source website. Google has faced criticism for sometimes incorrectly removing content, particularly news articles, where the public interest outweighs privacy concerns.[33]

By September 2015, Facebook was the single most delinked site. Google’s own properties, such as groups.google.com, plus.google.com, and www.youtube.com, also appeared among the most delinked sites. Google, alongside other search engines such as Yahoo and Bing, made delinking request forms available to users.

EU Court Rulings on Global De-linking

The European Union has advocated for the global application of the right to be forgotten, a push Google has resisted, leading to ongoing disputes with European regulators. In September 2019, the Court of Justice of the EU ruled that the right to be forgotten does not extend beyond the EU's borders: Google is not required to apply de-linking to search results on its non-EU domains. However, the EU continues to assert that search engines must respect the privacy of EU citizens even in jurisdictions outside the EU. Where individuals, such as managers seeking the removal of inaccurate news articles, have approached the courts, EU courts have tended to favor the applicants.

The RTBF faces significant challenges in balancing individual privacy against competing rights such as freedom of speech and the public interest. Its global applicability remains ambiguous owing to divergent jurisdictional rules, and enforcement is further complicated by the decentralized, replicative nature of the internet, which makes technical compliance difficult. Additionally, determining what constitutes the public interest, especially for public figures or legal records, creates legal and ethical dilemmas. Addressing these challenges requires nuanced approaches that safeguard privacy without undermining the public's right to access information.[34]

References

  1. https://www.mediadefence.org/ereader/publications/advanced-modules-on-digital-rights-and-freedom-of-expression-online/module-5-trends-in-censorship-by-private-actors/right-to-be-forgotten/
  2. https://www.meity.gov.in/writereaddata/files/Digital%20Personal%20Data%20Protection%20Act%202023.pdf
  3. https://gdpr-info.eu/art-17-gdpr/
  4. https://gdpr-info.eu/recitals/no-65/
  5. https://gdpr-info.eu/recitals/no-66/
  6. https://sflc.in/wp-content/plugins/pdfjs-viewer-shortcode/pdfjs/web/viewer.php?file=https://sflc.in/wp-content/uploads/2021/12/17_Joint_Committee_on_the_Personal_Data_Protection_Bill_2019_1.pdf&attachment_id=0&dButton=true&pButton=true&oButton=false&sButton=true&pagemode=none&_wpnonce=c3844ceefa
  7. https://www.undp.org/sites/g/files/zskgke326/files/2023-04/UNDP%20Drafting%20Data%20Protection%20Legislation%20March%202023.pdf
  8. https://www.oas.org/en/sla/iajc/docs/Publication_Updated_Principles_on_Privacy_and_Protection_of_Personal_Data_2021.pdf
  9. https://www.mediadefence.org/ereader/publications/advanced-modules-on-digital-rights-and-freedom-of-expression-online/module-5-trends-in-censorship-by-private-actors/right-to-be-forgotten/
  10. https://doi.org/10.1093/hrlr/ngq032
  11. https://eur-lex.europa.eu/eli/dir/1995/46/oj/eng
  12. L 12945 - Loi constitutionnelle modifiant la constitution de la République et canton de Genève (Cst-GE) (Pour une protection forte de l’individu dans l’espace numérique)
  13. https://www.washingtonpost.com/news/volokh-conspiracy/wp/2017/03/15/n-y-bill-would-require-people-to-remove-inaccurate-irrelevant-inadequate-or-excessive-statements-about-others/
  14. https://docs.google.com/document/d/1nbnnFNTnmL6mcDQwggfs567DZKeC0vaY6dtf7YJJO-o/edit?tab=t.0#heading=h.rh7aoxulnml6
  15. https://oag.ca.gov/privacy/ccpa
  16. https://leginfo.legislature.ca.gov/faces/billTextClient.xhtml?bill_id=202320240SB362
  17. https://www.skadden.com/insights/publications/2023/12/californias-new-data-deletion-law-imposes
  18. https://leginfo.legislature.ca.gov/faces/billNavClient.xhtml?bill_id=201320140SB568
  19. https://www.cooley.com/news/insight/2015/new-california-privacy-law-for-minors-has-taken-effect-as-of-january-1-2015
  20. https://jlsp.law.columbia.edu/wp-content/blogs.dir/213/files/2017/03/48-Campbell.pdf
  21. https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=celex:62012CJ0131
  22. https://harvardlawreview.org/print/vol-128/google-spain-sl-v-agencia-espanola-de-proteccion-de-datos/
  23. https://eprints.lse.ac.uk/61944/1/__lse.ac.uk_storage_LIBRARY_Secondary_libfile_shared_repository_Content_Lynskey,%20O_Control%20personal%20data_Lynskey_Control%20personal%20data_2015.pdf
  24. https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=celex:62020CJ0460
  25. http://www.scconline.com/LoginForNewsLink/Loiqs32r
  26. https://www.scconline.com/blog/post/2017/02/04/websites-cant-be-restrained-from-publishing-non-reportable-judgments/
  27. https://indiankanoon.org/doc/172009054/
  28. https://indiankanoon.org/doc/6266786/
  29. https://www.livelaw.in/pdf_upload/16186364774292021-393948.pdf
  30. https://indiankanoon.org/doc/147635015/
  31. https://indiankanoon.org/doc/189278808/
  32. https://doi.org/10.1093/hrlr/ngq032
  33. https://www.theregister.com/2014/06/17/for_mon_how_google_plans_to_torpedo_your_privacy_rights/
  34. https://www.theregister.com/2019/09/24/eu_court_justice_right_to_be_forgotten_ruling/