Digital Surveillance

From Justice Definitions Project

This page elaborates on the definition of Digital Surveillance.

What is Digital Surveillance?

Digital surveillance is the systematic monitoring, collection, and analysis of digital data, including online communications, biometrics, and geolocation, by governments, corporations, or other entities. Unlike traditional observation, it relies on automation, algorithms, and vast computational power to aggregate data streams from millions of subjects simultaneously, often without their direct knowledge. This practice drives modern economic models such as "surveillance capitalism", where personal experience is commodified for behavioural prediction.[1] The term "surveillance" itself is borrowed from the French surveillance ("oversight"), derived from the prefix sur ("over") and the root veiller ("to watch"), traceable to the Latin vigilare ("to be watchful"). While the word entered the English language in the late 18th century, it gained political weight during the French Revolution, particularly with the "Surveillance Committees" established in 1793 to monitor citizens for treason.[2]

The history of government surveillance long predates the digital age. Ancient states used administrative mechanisms like the census, such as the Census of Quirinius in the Roman Empire (c. 6 AD), to maintain military and tax records. Modern state espionage evolved with communication networks; a notable scandal occurred in Britain in 1844 when the government admitted to opening the private mail of Italian exile Giuseppe Mazzini.[3] The invention of the telephone led to the first recorded police wiretapping by the New York City Police Department in 1895.[4] The transition toward electronic monitoring began in 1927, when Russian inventor Léon Theremin installed a manual scanning-transmitting camera (an analog precursor to CCTV) at the Moscow Kremlin.[5] However, the shift to true digital surveillance is marked by two distinct milestones: the ECHELON signals intelligence network (formalised in the early 1970s), which used computers to automate the filtering of satellite communications,[6] and the release of the Axis NetEye 200 in 1996, recognized as the first network (IP) camera to transmit digital video data over the internet.[7] The intellectual framework for this era was established in 1986 by computer scientist Roger Clarke, who coined the term "dataveillance" to describe the systematic use of personal data systems to monitor actions, a concept that underpins modern digital tracking.[8]

Government vs. Commercial Surveillance: Scope and Balance

Digital surveillance encompasses a wide range of monitoring activities conducted not only by governments but also by corporations and other private entities. While government surveillance typically focuses on law enforcement, national security, and public order, commercial surveillance primarily aims at data collection for economic purposes such as targeted advertising, behavioural prediction, and service optimisation.

Government surveillance involves legal frameworks that authorize interception, monitoring, and data collection based on statutory grounds and oversight mechanisms, often tied to public safety or security concerns. In contrast, commercial surveillance depends largely on user consent mechanisms, terms of service agreements, and data privacy regulations, which vary significantly across jurisdictions.

The boundary between these two forms of surveillance is increasingly blurred. Corporations manage vast quantities of personal data through digital platforms, which governments may access through legal orders or in less transparent ways. Additionally, commercial data practices including profiling and tracking contribute to what is termed "surveillance capitalism," raising critical questions about autonomy, privacy, and the reach of non-state actors in influencing citizen behaviour and societal norms.

Balancing the oversight and regulation of state and commercial surveillance remains a complex challenge. It requires clear legal standards, transparency, accountability mechanisms, and robust data protection frameworks to ensure individual rights are preserved without undermining legitimate public and commercial interests. The Indian legal framework, with laws such as the Information Technology Act, the Telecommunications Act, and the Digital Personal Data Protection Act, reflects ongoing attempts to address this balance but continues to evolve amid rapid technological and social change.

Official definition of Digital Surveillance

In India, "digital surveillance" is not defined by a single, distinct clause in any statute. Instead, the term is a legal construct derived from the operative statutory powers of "interception," "monitoring," and "decryption" of electronic information. The functional definition used by state agencies is found primarily in Section 69 of the Information Technology Act, 2000. According to this provision, digital surveillance is understood as the direction by the Central or State Government to any agency to intercept, monitor, or decrypt any information generated, transmitted, received, or stored in any "computer resource."

The discourse surrounding these terms relies on specific technical definitions found in the Information Technology (Procedure and Safeguards for Interception, Monitoring and Decryption of Information) Rules, 2009. Under these rules:

  • "decryption" means the process of conversion of information in non-intelligible form (cipher text) to an intelligible form (plain text) via a mathematical formula, code, password or algorithm or a combination thereof;
  • "intercept", with its grammatical variations and cognate expressions, means the aural or other acquisition of the contents of any information through the use of any means, including an interception device, so as to make some or all of the contents of an information available to a person other than the sender or recipient or intended recipient of that communication;
  • "monitor", with its grammatical variations and cognate expressions, includes to view or to inspect or listen to or record information by means of a monitoring device.

Thus, officially, digital surveillance is the composite of these three actions performed by the state.
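The Rules' distinction between non-intelligible ciphertext and intelligible plaintext can be illustrated with a minimal sketch. The XOR cipher below is a toy chosen for brevity; it is not a real cryptographic scheme, and all names in it are illustrative.

```python
# Toy illustration of the 2009 Rules' terminology: "interception" captures
# ciphertext in transit; "decryption" converts it back to intelligible form.
# This XOR cipher is NOT secure; it only illustrates the cipher/plain split.

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """XOR each byte with the repeating key; the same call encrypts and decrypts."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

plaintext = b"meet at the usual place"
key = b"secret"

ciphertext = xor_cipher(plaintext, key)   # what an interceptor sees
assert ciphertext != plaintext            # non-intelligible form (cipher text)

recovered = xor_cipher(ciphertext, key)   # "decryption" in the Rules' sense
assert recovered == plaintext             # intelligible form (plain text) restored
```

The point of the sketch is only that decryption, as defined in the Rules, presupposes possession of the key or algorithm, which is why Section 69(3) compels assistance from those in charge of the computer resource.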

Legal provision(s) related to the term

The statutory architecture for surveillance in India is divided between the regulation of the internet and the regulation of telecommunications, with additional powers granted under criminal procedure laws.

The Information Technology Act, 2000

This is the parent statute for digital surveillance over the internet. Section 69, substituted in its present form by the Information Technology (Amendment) Act, 2008 (in force from 2009), operates in conjunction with other statutes to form the legal basis for internet surveillance (earlier, only the Indian Telegraph Act, 1885 contained such provisions). It reads as follows:

69. Power to issue directions for interception or monitoring or decryption of any information through any computer resource.--(1) Where the Central Government or a State Government or any of its officers specially authorised by the Central Government or the State Government, as the case may be, in this behalf may, if satisfied that it is necessary or expedient so to do, in the interest of the sovereignty or integrity of India, defence of India, security of the State, friendly relations with foreign States or public order or for preventing incitement to the commission of any cognizable offence relating to above or for investigation of any offence, it may subject to the provisions of sub-section (2), for reasons to be recorded in writing, by order, direct any agency of the appropriate Government to intercept, monitor or decrypt or cause to be intercepted or monitored or decrypted any information generated, transmitted, received or stored in any computer resource.

(2) The procedure and safeguards subject to which such interception or monitoring or decryption may be carried out, shall be such as may be prescribed.

(3) The subscriber or intermediary or any person in-charge of the computer resource shall, when called upon by any agency referred to in sub-section (1), extend all facilities and technical assistance to-

(a) provide access to or secure access to the computer resource generating, transmitting, receiving or storing such information; or

(b) intercept, monitor, or decrypt the information, as the case may be; or

(c) provide information stored in computer resource.

(4) The subscriber or intermediary or any person who fails to assist the agency referred to in sub-section (3) shall be punished with imprisonment for a term which may extend to seven years and shall also be liable to fine.

Complementing this is Section 69B, which empowers the government to authorize the Computer Emergency Response Team (CERT-In) to monitor and collect "traffic data" or information generated, transmitted, received, or stored in any computer resource to enhance cyber security and for identification, analysis, and prevention of intrusion or spread of computer contaminants. Unlike Section 69, this section deals with metadata rather than content.

69B. Power to authorise to monitor and collect traffic data or information through any computer resource for cyber security.--(1) The Central Government may, to enhance cyber security and for identification, analysis and prevention of intrusion or spread of computer contaminant in the country, by notification in the Official Gazette, authorise any agency of the Government to monitor and collect traffic data or information generated, transmitted, received or stored in any computer resource.

(2) The intermediary or any person in-charge of the computer resource shall, when called upon by the agency which has been authorised under sub-section (1), provide technical assistance and extend all facilities to such agency to enable online access or to secure and provide online access to the computer resource generating, transmitting, receiving or storing such traffic data or information.

(3) The procedure and safeguards for monitoring and collecting traffic data or information, shall be such as may be prescribed.

(4) Any intermediary who intentionally or knowingly contravenes the provisions of sub-section (2) shall be punished with imprisonment for a term which may extend to one year or shall be liable to fine which may extend to one crore rupees, or with both.

Explanation.--For the purposes of this section,--

(i) "computer contaminant" shall have the meaning assigned to it in section 43;

(ii) "traffic data" means any data identifying or purporting to identify any person, computer system or computer network or location to or from which the communication is or may be transmitted and includes communications origin, destination, route, time, date, size, duration or type of underlying service and any other information.
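The statutory split between Section 69 (which reaches content) and Section 69B (which reaches only traffic data) can be sketched as a filter that keeps the metadata fields listed in the Explanation and drops the message body. A minimal sketch; the record structure and field names are hypothetical, not drawn from any statute.

```python
# Hypothetical message record: a Section 69 direction reaches everything,
# including "body"; Section 69B-style collection reaches only traffic data.

TRAFFIC_DATA_FIELDS = {
    "origin", "destination", "route", "time", "date",
    "size", "duration", "service_type",   # mirrors the Explanation to s.69B
}

def traffic_data_view(message: dict) -> dict:
    """Return only the traffic-data (metadata) portion of a message record."""
    return {k: v for k, v in message.items() if k in TRAFFIC_DATA_FIELDS}

message = {
    "origin": "user-a@example.com",
    "destination": "user-b@example.com",
    "date": "2024-01-15",
    "time": "10:32:07",
    "size": 2048,
    "body": "the actual content of the communication",
}

metadata = traffic_data_view(message)
assert "body" not in metadata          # content stays outside 69B's scope
assert metadata["origin"] == "user-a@example.com"
```

Even this crude filter shows why metadata is sensitive: origin, destination, and timing alone reveal who communicated with whom and when.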

Notably, the provisions of the IT Act allow interception of online information when it is deemed "necessary and expedient" to do so. This omits the additional safeguards present under the Telegraph Act (now, the Telecommunications Act of 2023) which require the fulfillment of more tangible benchmarks such as "public safety" or "public emergency."

The Telecommunications Act, 2023

This Act replaced the colonial Indian Telegraph Act, 1885. Section 20(2) of the Telecommunications Act allows the Central or State Government to intercept, detain, or disclose messages if it is satisfied that it is necessary or expedient to do so in the interest of the sovereignty and integrity of India, defence and security of the State, friendly relations with foreign States, public order, or for preventing incitement to the commission of an offence. Crucially, the definition of "message" under this Act has been expanded to potentially include data sent through internet-based communication services like WhatsApp, thereby bringing Over-The-Top (OTT) platforms within the ambit of telecommunication surveillance.

The Act is operationalised through the Telecommunications (Procedures and Safeguards for Lawful Interception of Messages) Rules, 2024, which detail the process for issuing and reviewing interception orders, data handling, and record keeping. Interception orders need to be authorised by a "competent authority", usually the Union or State Home Secretary, with provisions for urgent orders issued by senior officers subject to subsequent confirmation. The orders are time-limited and must be renewed only when necessary.[9]

Interestingly, both the IT Act and the Telecommunications Act lack a mechanism for prior judicial authorisation of surveillance. Instead, surveillance orders are subject to periodic review by a three-member executive review committee. At the central government level, this committee comprises the Cabinet Secretary, the Law Secretary, and the Telecom Secretary. At the state level, the committee is constituted by the Chief Secretary, the State Law Secretary or Legal Remembrancer, and another senior Secretary, excluding the Home Secretary.

The review committee is mandated to meet at regular intervals (at least once every two months) to examine if the interception orders comply with legal grounds specified in the Act and Rules. The committee can revoke unlawful orders and direct the destruction of the intercepted data. However, as all members are senior bureaucrats, several commentators have raised concerns over the lack of independent or judicial oversight, urging inclusion of retired judges or civil society members to strengthen safeguards aligned with constitutional privacy protections.[10]

This model reflects India’s executive-centric approach to digital surveillance oversight, contrasting with judicial warrant requirements common in many democracies, raising significant debates on surveillance transparency, accountability, and rights protections.[11]

Distinct from Telegraph Act Requirements

The Telecommunications Act 2023 repeals and replaces the Indian Telegraph Act 1885 and the Indian Wireless Telegraphy Act 1933, creating a new, consolidated framework for lawful interception and surveillance of telecommunications in India. While the basic grounds for interception remain similar, the new Act significantly broadens the technical and service scope, bringing more forms of digital communication within the interception net.

Under the Telegraph Act, lawful interception was primarily anchored in section 5(2), which allowed the government to intercept messages on grounds such as public emergency, public safety, sovereignty and integrity of India, security of the State, friendly relations with foreign States, or public order, with procedural details set out in Rule 419A of the Telegraph Rules. The framework was designed around legacy telegraph and telephone services, and regulators and courts gradually extended it to newer technologies, but the statutory language itself remained tied to an older communications model.

The Telecommunications Act 2023 carries forward similar grounds for interception in section 20(2), authorising the Central or State Government (or authorised officers) to direct interception, monitoring, or blocking of communications in the interests of national security and related objectives. However, it applies to the broader category of “telecommunication services,” which includes modern digital and internet‑based communications, thereby formally extending interception powers into domains such as IP‑based voice and messaging services.

Procedurally, both regimes rely on a “competent authority” (typically the Union or State Home Secretary) to approve interception orders, with emergency authorisations permitted by senior officers subject to ex post confirmation. Orders generally have a limited validity (for example, 60 days with possible extensions up to a prescribed maximum), and agencies must maintain records, ensure confidentiality, and destroy intercepted material after a specified period unless it is required for ongoing investigations or legal proceedings.
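The validity arithmetic described above (a fixed initial validity, renewable up to a ceiling) can be sketched in a few lines. The 60-day and 180-day figures below are illustrative assumptions, familiar from Rule 419A of the Telegraph Rules; they are not a statement of the 2024 Rules' exact limits.

```python
# Sketch of interception-order validity, assuming (for illustration) a
# 60-day initial validity renewable up to a 180-day total ceiling.
from datetime import date, timedelta

INITIAL_VALIDITY_DAYS = 60
MAX_TOTAL_DAYS = 180

def order_expiry(issued: date, renewals: int) -> date:
    """Expiry after `renewals` extensions, capped at the assumed ceiling."""
    total = min((1 + renewals) * INITIAL_VALIDITY_DAYS, MAX_TOTAL_DAYS)
    return issued + timedelta(days=total)

issued = date(2024, 1, 1)
assert order_expiry(issued, renewals=0) == date(2024, 3, 1)              # 60 days
assert order_expiry(issued, renewals=5) == issued + timedelta(days=180)  # capped
```

The cap matters in practice: without it, rolling renewals could convert a time-limited order into indefinite surveillance.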

Where the Telecommunications Act 2023 and the accompanying interception rules mark a departure is in their explicit, technology‑neutral language, wider coverage of digital services, and integration with parallel cyber‑security and service‑suspension powers. Commentators note that while this modernisation closes gaps left by the Telegraph Act’s narrow technical vocabulary, it also risks normalising more pervasive, data‑driven surveillance and dataveillance if not matched by robust independent oversight, transparency, and remedies.

Bharatiya Nagarik Suraksha Sanhita, 2023 (BNSS)

The BNSS, which replaced the Code of Criminal Procedure (CrPC), has introduced procedural forms of surveillance. Section 2(1)(l) of the BNSS expands the definition of "document" to explicitly include "electronic communication" and digital devices. Furthermore, Section 94 empowers a court or an officer in charge of a police station to compel the production of any document or "other thing" necessary for the purposes of any investigation, inquiry, trial, or other proceeding. This provision effectively allows law enforcement to seize digital devices (smartphones, laptops) and access their data, functioning as a form of seizure-based surveillance.

The Digital Personal Data Protection Act, 2023 (DPDP Act)

This Act establishes the overarching regime for the processing of digital data, thereby defining the subject matter that is liable to be surveilled.

Definitions:

The scope of protection offered by the DPDP Act is limited to "personal data" relating to a "data principal." These two terms are defined under Section 2 of the Act:

(j) "Data Principal" means the individual to whom the personal data relates and where such individual is— (i) a child, includes the parents or lawful guardian of such a child; (ii) a person with disability, includes her lawful guardian, acting on her behalf;

(t) "personal data" means any data about an individual who is identifiable by or in relation to such data;

The definition of personal data under Indian law is broad and somewhat vague. It is perhaps better understood when supplemented by the definition provided in Article 4(1) of the EU's General Data Protection Regulation:

(1) ‘personal data’ means any information relating to an identified or identifiable natural person (‘data subject’); an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person;

Of course, there will be variations across jurisdictions as to what exactly is included within the scope of personal data under the Indian DPDP Act (for example, a common debate globally is whether subjective opinions of a person constitute personal data). Therefore, it is safe to say that the scope of the DPDP Act is not fully known yet, and our understanding of the same might evolve as and when courts interpret the definition. The definition would have an effect on digital surveillance as well, since only protected personal data can justifiably be exempt from state surveillance.
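The notion of identifiability at the core of both definitions can be caricatured as a check for identifying fields. This is a deliberately naive sketch with hypothetical field names; in practice identifiability is contextual (indirect identification, combination of datasets) and cannot be reduced to a checklist.

```python
# Naive illustration of "identifiable by or in relation to such data":
# a record counts as personal data if it carries an identifying field.
# The identifier list loosely echoes GDPR Art. 4(1) examples and is
# illustrative only, not a legal test.

IDENTIFIER_FIELDS = {"name", "id_number", "location", "online_id", "email"}

def looks_like_personal_data(record: dict) -> bool:
    """True if the record carries at least one identifying field."""
    return any(field in record for field in IDENTIFIER_FIELDS)

assert looks_like_personal_data({"name": "A. Sharma", "purchase": "book"})
assert not looks_like_personal_data({"purchase": "book", "amount": 499})
```

The second record would still become personal data if, say, a timestamp and store location allowed the buyer to be singled out, which is exactly the contextual difficulty courts will have to resolve.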

Notably, the definition of data principals under the DPDP Act (those whose data is offered protection under the Act) includes only individuals. This means that data belonging to organisations or other non-natural persons is not afforded the safeguards available under the Act. This could be a possible avenue into digital surveillance, either by the government or other private entities.

Applicability and Scope (Section 3):

The Act codifies the specific types of data processing that fall under legal regulation. It applies to the processing of digital personal data within the territory of India where the personal data is collected:

  • In digital form; or
  • In non-digital form and digitized subsequently.

It also applies to processing outside the territory of India if such processing is in connection with any activity related to the offering of goods or services to Data Principals within India.

The Surveillance Exemption (Section 17):

While the Act protects personal data, it creates a significant "carve-out" for digital surveillance. Under Section 17(2)(a), the Central Government may exempt any instrumentality of the State from the application of the Act (including the requirement to obtain consent). The term 'instrumentality' is not yet defined in the context of the Act; however, it is reasonable to assume that it includes all the entities that constitute 'State' under Article 12 of the Indian Constitution. These exemptions are granted on grounds identical to the reasonable restrictions on free speech under Article 19(2) of the Constitution:

  • Sovereignty and integrity of India;
  • Security of the State;
  • Friendly relations with foreign States;
  • Maintenance of public order; or
  • Preventing incitement to the commission of any cognizable offence.

The Startup Exemption (Section 17(3)):

Per Section 17(3) of the DPDP Act, the Central Government may exempt certain startups from the strict application of certain provisions of the Act, having regard to the volume and nature of personal data handled by these startups. These provisions include:

  • Section 5: This provision contains the requirement of providing a notice to the data principal informing them of the purpose for which their personal data is being processed, and information about their rights under the DPDP Act along with the dispute resolution mechanism available under the Act by way of approaching the Data Protection Board of India.
  • Section 8: Two of the obligations imposed upon data fiduciaries may also not apply to these notified startups. As a result, they do not have to ensure the completeness, accuracy and consistency of the personal data they are processing, even when a decision is made which can affect the principal, or the data is disclosed to another fiduciary. Secondly, they do not have the obligation to erase the personal data, whether automatically, after the lapse of the purpose for which it was collected, or on request by the data principal.
  • Section 10: The additional obligations imposed on a Significant Data Fiduciary (who handle sensitive data which may pose a risk to safety, national security, or the rights of the data principal etc.) do not apply to notified startups. These obligations include the appointment of a Data Protection Officer and undertaking regular audits and Data Protection Impact Assessments.
  • Section 11: This provision contains the right of the data principal to access information regarding their personal data, such as a summary of the data relating to them that has been processed and a list of fiduciaries/processors to whom such data has been shared.

It is important to note that these startups are not government entities; they are defined under the Act as follows: "a private limited company or a partnership firm or a limited liability partnership incorporated in India, which is eligible to be and is recognised as such in accordance with the criteria and process notified by the department to which matters relating to startups are allocated in the Central Government." No startups have been notified under the Section by the Central Government yet, so it is difficult to tell exactly what the criteria to select them will be, or why they would be exempt from the full application of the DPDP Act. Additionally, an executive directive alone is enough to exempt these fiduciaries; there is no review or oversight mechanism from other branches. By removing key safeguards, such as the right of the data principal to request deletion of their data or a summary of it, the Act runs the risk of creating a legal loophole for unnecessary digital surveillance. The Act is therefore not completely watertight in its protection of personal data: the numerous exceptions make it clear that surveillance remains possible, both by state instrumentalities and by authorised private entities.

Compelling the Furnishing of Data:

Section 36 of the DPDP Act states that the Central Government may compel the Data Protection Board or any data fiduciary to furnish the personal data it requires. Rule 23 of the Digital Personal Data Protection Rules states that this may be done for purposes ranging from those affecting the national security and sovereignty of the country to administrative functions such as identifying which fiduciaries must be notified as Significant. Rule 23(2) clarifies that if there is a pressing justification affecting the sovereignty, integrity, or security of the state, the Central Government may prevent disclosure to the data principal of the fact that their data was furnished to the government. This has implications for surveillance and criminal justice: high-profile investigations may lead to the government requesting that the fiduciary break the end-to-end encryption commitments offered by its platform, and such information can be collected in secrecy, without notifying the data principal. In effect, as long as the Central Government can produce a compelling reason, all personal data is available to it. In light of this, the DPDP Act cannot be viewed as a roadblock to digital surveillance by the state, though it may improve the data privacy framework in terms of private surveillance.

Amendment of RTI Act: Transparency vs. Privacy Tussle:

Section 44(3) of the DPDP Act amends Section 8(1)(j) of the Right to Information Act of 2005 and substitutes the words "(j) information which relates to personal information;" in place of the earlier phrasing, which read:

"(j) information which relates to personal information the disclosure of which has no relationship to any public activity or interest, or which would cause unwarranted invasion of the privacy of the individual unless the Central Public Information Officer or the State Public Information Officer or the appellate authority, as the case may be, is satisfied that the larger public interest justifies the disclosure of such information:

Provided that the information which cannot be denied to the Parliament or a State Legislature shall not be denied to any person."

This omission may act as a valuable safeguard against the wholesale disclosure of private and personal data, and appears to be an attempt to give statutory weight to the newly-affirmed right to privacy. On the other hand, the amendment may complicate the RTI process: information requests may be rejected without adequate consideration of the public interest in disclosure. In the interests of privacy, some compromise may be made on governmental transparency. A useful parallel can be drawn with the 'panopticon': while the government may be free to watch over its citizens and surveil personal data for reasons such as 'national security,' citizens may not be able to exercise the same degree of scrutiny over the functioning of the government. However, we are yet to see the operation of this amendment and whether any RTI applications are turned down on these grounds. Broadly, the amendment appears to aim at strengthening the safeguards around the disclosure of individuals' personal data, i.e. a move to curb unregulated and inessential digital surveillance.

Distinct from Telegraph Act Requirements:

It is significant to note that unlike Section 5(2) of the Indian Telegraph Act, 1885, which mandates a "condition precedent" of a "public emergency" or "public safety" for interception, the exemptions under the DPDP Act do not require the existence of an emergency. The government can activate these exemptions based on the broader administrative satisfaction regarding national security or public order.

Digital Surveillance in International Instruments:[12]

India is a signatory to several international instruments that frame the boundaries of surveillance through the right to privacy. Article 12 of the Universal Declaration of Human Rights states that "No one shall be subjected to arbitrary interference with his privacy, family, home or correspondence" and that "Everyone has the right to the protection of the law against such interference or attacks." Similarly, Article 17 of the International Covenant on Civil and Political Rights (ICCPR), 1966, provides protection against "arbitrary or unlawful interference" with privacy. The Office of the UN High Commissioner for Human Rights has explicitly interpreted "interference" in the digital age to include the interception of digital communications and the collection of metadata, thereby categorizing digital surveillance as an act that requires strict legal justification to avoid violating international human rights law. In December 2013, the United Nations General Assembly adopted Resolution 68/167, "The right to privacy in the digital age," which recognized the potential for surveillance and data collection to infringe on privacy and other human rights. The resolution emphasized the need for national legislation, oversight mechanisms, and transparency to protect privacy both online and offline.

Official Documents and Government Reports related to Digital Surveillance

Several high-level committees have analyzed the scope and regulation of digital surveillance in India. These reports have frequently offered critical perspectives on the lack of judicial or parliamentary oversight over intelligence agencies and the executive branch.

Justice B.N. Srikrishna Committee (2018)[13]

The Report of the Committee of Experts under the Chairmanship of Justice B.N. Srikrishna (2018), titled A Free and Fair Digital Economy, provides the most detailed official critique of the Indian surveillance architecture. Formed to draft India's data protection framework, the Committee noted that the current surveillance regime is characterized by a "lack of judicial oversight," as interception orders operate solely on executive authorization under the Telegraph Act and the IT Act. The report specifically highlighted the Central Monitoring System (CMS), a mass surveillance project that automates the interception of telecommunications. The Committee observed that the CMS allows the state to bypass service providers entirely, creating a "panoptic" infrastructure that lacks adequate procedural safeguards.

Furthermore, the report argued that the existing review mechanism, where a Review Committee composed of senior bureaucrats reviews interception orders issued by other bureaucrats, is insufficient. It termed this a conflict of interest, noting that the executive cannot be the sole check on its own powers. The Committee recommended that any non-consensual processing of data by the state (surveillance) must be authorized by a law that is "necessary and proportionate" to the state aim, explicitly referencing the standards set by the Supreme Court in K.S. Puttaswamy v Union of India.[14] It called for statutory oversight of intelligence agencies, a recommendation that was largely omitted from subsequent data protection legislation.

Justice A.P. Shah Committee (2012)[15]

Prior to the Srikrishna Committee, the Planning Commission constituted a Group of Experts on Privacy, leading to the Report of the Justice A.P. Shah Committee (2012). This report was pivotal in defining surveillance not merely as the interception of content but as "any monitoring of the activities of a person," thereby including metadata and location tracking within the scope of privacy regulations. The Committee formulated nine "National Privacy Principles," recommending a shift in policy where surveillance must not be undertaken merely because it is useful to the state, but only when it is strictly necessary.

The Shah Committee offered a harsh critique of the procedural safeguards laid down in PUCL v. Union of India (1997). It argued that the current system violates the separation of powers because the body authorizing surveillance (the Home Secretary) and the body reviewing it (the Review Committee) both belong to the executive branch. The report recommended the creation of an independent Privacy Commissioner to provide oversight and suggested that surveillance orders should eventually require judicial authorization, moving India closer to the models used in the UK and the US.

Parliamentary Standing Committee on Communications and Information Technology (2023)[16]

More recently, the Standing Committee on Communications and Information Technology has scrutinized digital privacy in its Fifty-Third Report on Citizens’ Data Security and Privacy (2023). The Committee acknowledged official submissions suggesting that the traditional distinction between "content" (what you say) and "metadata" (who you speak to) is blurring in the digital age. It implied that the lower threshold for monitoring metadata under Section 69B of the Information Technology Act, 2000 constitutes significant surveillance and requires tighter regulation.

The report expressed concerns regarding the broad exemptions granted to government agencies. It noted that while national security is a legitimate ground for surveillance, the lack of a precise definition creates a risk of misuse. The Committee questioned the Ministry of Electronics and Information Technology (MeitY) on the mechanisms available to citizens to seek redressal against unauthorized government surveillance. It concluded that the current grievance redressal mechanisms were inadequate and recommended the establishment of a robust data protection authority that is autonomous from the central government, ensuring that the state remains accountable for its data-gathering practices.

Case Laws related to Digital Surveillance

While the term digital surveillance in itself is not explicitly defined, the Indian judiciary has played a pivotal role in defining the constitutional limits of digital surveillance, establishing that it is not an absolute power of the state but one subject to the fundamental right to privacy.

People’s Union for Civil Liberties (PUCL) vs. Union of India (1997): [17]

This case laid the foundations of the legal framework around surveillance. The Supreme Court in this 1997 case dealt with telephone tapping, and described wiretapping, the precursor to digital surveillance, as a "serious invasion of an individual’s privacy." It laid down the PUCL guidelines against illegal and excessive surveillance by the state, creating safeguards against arbitrariness in the exercise of the state's surveillance powers. The guidelines, later codified as Rule 419A of the Indian Telegraph Rules, 1951, created a procedural mechanism under which surveillance orders must be issued by high-ranking bureaucrats (such as the Home Secretary) and reviewed by a Review Committee.

However, in the absence of any statutory provision in the Telegraph Act mandating judicial oversight, the Court did not stipulate additional procedural safeguards, and the law was declared constitutional.

K.S. Puttaswamy (Privacy-9J.) v. Union of India (2017):[14]

In August 2017, a nine-judge bench of the Supreme Court in the Puttaswamy case recognised the ‘right to privacy’ under the Constitution of India. It overruled M.P. Sharma vs. Satish Chandra (1954) and Kharak Singh vs. State of Uttar Pradesh (1962) insofar as they denied a constitutional guarantee of the right to privacy, thereby making any derogation from the right subject to the highest level of judicial scrutiny.[18][19]

Premised on the principle that “Privacy is the ultimate expression of the sanctity of the individual”, the Supreme Court affirmed the reasoning and judgment in the PUCL case and held that privacy is the “condition or state of being free from public attention to intrusion into or interference with one’s acts or decisions.” The right to be in this condition has been described as the “right to be left alone.” What is essential to privacy is the power to seclude oneself and to keep others from intruding on that seclusion in any way. These intrusions may take several forms, ranging from peeping over one’s shoulder to eavesdropping directly or through instruments, devices, or technological aids.[20]

The Court laid down a test for the legality of state intrusion: any instance of surveillance must have a legislative basis (Legality), serve a legitimate state aim (Necessity), and be proportionate to the objective (Proportionality). Together with the requirement of procedural safeguards, this forms a four-fold test that must be satisfied before the state may interfere with the right to privacy:

i. The state action must be sanctioned by law.

ii. In a democratic society there must be a legitimate aim for action.

iii. Action must be proportionate to the need for such interference.

iv. And it must be subject to procedural guarantees against abuse of the power to interfere.

K.S. Puttaswamy (Aadhaar-5J.) v. Union of India (2019):[21]

In the successor to the Privacy judgment, the Supreme Court considered whether the Aadhaar architecture allowed the state to conduct surveillance by profiling. The Court held that the scheme does not enable the creation of a 'regime' of profiled surveillance, since the Act provided adequate data protection. It rejected the petitioners' contention that the Aadhaar system would erode the already limited trust in the government and the Constitution. The CEO of the Unique Identification Authority of India (UIDAI) demonstrated how the machinery and legal provisions created for Aadhaar, read with the IT Act and Rules, would provide sufficient data protection for individuals. The Court read down certain provisions of the Aadhaar Act (the mandatory linking of bank accounts and SIM cards with Aadhaar was struck down as disproportionate), and ultimately held that it would be very difficult to create a profile of a person based solely on the biometric and demographic information stored in the Central Identities Data Repository (CIDR). The enrolment, authentication and encryption processes relating to Aadhaar, and the storage of the information collected, were likewise given the green light. Since the UIDAI does not come to know the location or purpose of any Aadhaar authentication, the Court found the allegations of surveillance and profiling "far-fetched."

However, it is worth noting the dissenting opinion penned by then Justice D.Y. Chandrachud. In his view, the mass collection of biometric data does hold the potential for the creation of comprehensive profiles within a central database. Drawing on academic work by privacy scholars, he observed that data linking creates the possibility of "function creep": the gradual expansion of a technology's use beyond what it was originally introduced for, i.e. a slippery-slope effect. While the stated aims of the Aadhaar scheme are not in themselves worrying, the specific nature of biometric data makes it vulnerable to being used for hidden agendas. According to Chandrachud, controls on function creep would include limiting the amount of data collected, greater user participation, and restrictions on technological access to the databases.

He did not rule out the possibility of surveillance even under the scheme as upheld: a person's preferences (which would also form part of their personal data, or profile) can be ascertained by examining the kinds of entities that requested proof of the person's identity. These preferences could be used to predict a person's decisions and even influence their electoral choices; done at a large scale, this would deprive democracy of free choice and leave elections open to manipulation. Additionally, the Aadhaar Act lacks adequate safeguards against unauthorized use or theft of the identification system, even though a national-level database would be a likely target for attacks capable of causing serious damage and compromising the identities of millions. Importantly, Chandrachud also raised the interlinking of national-level databases: though the Aadhaar system collects only limited forms of data, these, in conjunction with other datasets, could lead to the formation of distinct profiles of individuals. With Aadhaar seeded into every database, automatic information retrieval becomes exceptionally easy and vulnerable to misuse.

Vinit Kumar vs. CBI (2019):[22]

The Bombay High Court ruled that the interception of a businessman’s telephone calls infringed his right to privacy. The Indian Home Ministry had ordered the interception of the businessman’s communications after he was accused of bribing a public servant; he challenged the interception orders as unlawful and as an infringement of his right to privacy. Applying the Puttaswamy test strictly, the Court held that the state cannot order interception merely because it is "expedient"; there must be a specific public safety concern. Finding no lawful justification for intercepting the businessman’s communications, the Court set aside the orders and directed that all information obtained through the interception be destroyed.[23]

Manohar Lal Sharma vs. Union of India (2023): [24]

In Manohar Lal Sharma v Union of India (2023), commonly known as the Pegasus case, the Supreme Court addressed the alleged use of military-grade 'Pegasus' spyware on civilians. It was held that digital surveillance or spying would be unconstitutional, unless it was conducted by the state in the interests of the nation to ensure life, liberty, and security. Usage of surveillance technology must be based on evidence and resorted to only when absolutely necessary. Indiscriminate spying would be unconstitutional, except if there were sufficient safeguards incorporated into the procedural law governing such surveillance. However, the court held that the state cannot simply invoke "national security" as a shield to avoid judicial review of the use of Pegasus spyware. The necessity of secrecy in the national interest must be proved by producing evidence on affidavit.

Drawing on Puttaswamy and the constitutional status afforded to the right to privacy, the court held that digital spying or surveillance directly conflicts with this right. Yet the collection of digital data is essential in the fight against terrorism, corruption and other violence. Since gathering this information breaches the constitutional right to privacy of individuals, the state must have a constitutional ground for undertaking any surveillance. Unauthorized or unregulated digital surveillance may lead to self-censorship out of fear, resulting in a potential "chilling effect" on free speech and press freedom.[25] Storage of personal data for reasons other than national security would be illegal and a matter of concern for civil society as a whole.

Accordingly, the court directed the formation of an expert Technical Committee to examine the veracity of certain allegations raised against the spyware based on concerns of indiscriminate spying.

Indranil Mullick & Ors. vs. Shuvendra Mullick (2025):[26]

Most recently, the scope of digital surveillance has been expanded to include private-party monitoring in the landmark case of Indranil Mullick & Ors. vs. Shuvendra Mullick. This case arose from a domestic dispute in Kolkata where one brother installed CCTV cameras in the common areas of a shared ancestral home ("Mullick Bhaban"), which arguably monitored the private entrance and living quarters of the other brother without consent. The Calcutta High Court held that the installation of CCTV cameras inside a residential dwelling without the consent of co-occupants violates the Right to Privacy under Article 21.[27]

The Supreme Court of India, in dismissing the Special Leave Petition against this order, upheld the principle that security concerns (such as protecting family heirlooms) cannot override the fundamental right to privacy within a home. This judgment is significant as it defines "digital surveillance" not just as a vertical relationship between State and Citizen, but also as a horizontal violation between private individuals, ruling that constant digital monitoring in a domestic setting is an actionable infringement of constitutional rights.

Legal Recourse against Illegal Surveillance

National Experience: [28]

Currently, legal recourse against illegal surveillance by individuals or private companies exists, including the ability to file an FIR or to approach a Magistrate. Legal protections against state surveillance, however, remain limited, and there is a lack of adequate national legislation and oversight. The UN Office of the High Commissioner has noted that weak procedural safeguards and ineffective oversight reduce accountability, and that mass surveillance by governments is becoming a dangerous habit. National legislation, oversight mechanisms, and transparency to protect privacy both online and offline are therefore crucial in India.

The Cyber Cells of state police forces can be approached to report such incidents, and victims of cybercrime can file an FIR under Section 154 of the Criminal Procedure Code, 1973. If the police officer or cell refuses to investigate the complaint, a private complaint can be filed under Section 156 (3) read with Section 190 of the Criminal Procedure Code, 1973, seeking a direction to the police station concerned to investigate the matter.

At the micro-level, the state is empowered to perform targeted surveillance in the form of interception. Various lawful interception systems available in the Indian market are installed into the networks of telecom and internet service providers by the government through the licence agreement. Though interception on specified legal grounds is lawful in India, hacking is a punishable offence under Sections 43 and 66 of the Information Technology Act, 2000 (as amended in 2008).

Types and forms of Digital Surveillance

The legal and functional understanding of digital surveillance in India is not monolithic; it is categorized based on the method of acquisition, the scale of operation, and the relationship between the watcher and the watched. These distinctions are critical as they determine the applicable statutory framework and the constitutional threshold for validity.

Targeted Interception (Content-Based Surveillance)

This forms the traditional core of surveillance law, evolving from telephone wiretapping to digital message interception. Targeted interception is predicated on the specific identification of a subject (an individual or a specific device) under a valid legal order. In the Indian context, this is governed by Section 69 of the Information Technology Act, 2000, which authorizes the state to access the actual content of the communication, such as the text of an email, the audio of a VoIP call, or the body of a message. Legal jurisprudence, specifically the PUCL Guidelines, mandates that this type of surveillance must be "event-based" and time-bound, ensuring it does not become a perpetual state of observation. It is distinct because it requires a high threshold of justification, such as an imminent threat to public order or national security, to override the individual's expectation of privacy.

Mass Surveillance and Bulk Acquisition

Mass surveillance involves the indiscriminate collection of data from a large number of people, often without suspicion of specific criminal activity. This typology relies on "dragnet" technologies that filter vast amounts of internet traffic to identify patterns or keywords. In India, this is institutionally represented by the Central Monitoring System (CMS) and NETRA (Network Traffic Analysis). The CMS acts as a centralized command center that automates the interception of telecommunications, allowing law enforcement to bypass service providers and intercept communications directly.

The rationale for surveillance at this scale is historically traced to Section 7 of the Indian Telegraph Act, 1885, which empowers the government to make rules for the conduct of telegraphs and establishes the administrative logic of state management. The state claims to employ these technologies to maintain order in society, ensure compliance with the law, and improve administration. However, legal scholars and the Justice B.N. Srikrishna Committee have flagged this form of surveillance as legally precarious: without the procedural safeguard of individual review, it creates a "panoptic" infrastructure in which the mere existence of the system exerts control over the population. Critics argue that such unchecked power can mutate into nefarious forms, used to control dissent, manipulate electoral behaviour, and enforce conformity to a particular notion of "ideal citizenry" by targeting groups deemed incongruous with the state's objectives.

Dataveillance (Metadata and Traffic Data Monitoring)

Dataveillance refers to the systematic monitoring of "traffic data" or metadata, the information about a communication rather than the communication itself. This includes logs of who called whom, the duration of calls, location coordinates (cell tower triangulation), and internet browsing history. Legally, this is treated with a lower threshold of protection than content surveillance. Under Section 69B of the Information Technology Act, 2000, the government can authorize the Computer Emergency Response Team (CERT-In) to collect this data to enhance cyber security and prevent computer contaminants. However, as noted in the Puttaswamy judgment, the aggregation of metadata can reveal intimate details of an individual's life, political preferences, and associations, making it a potent form of surveillance that constructs a "digital mosaic" of the citizen without technically listening to their words.[14]
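The "digital mosaic" effect can be illustrated with a minimal sketch. The records, names, and towers below are entirely hypothetical; the point is only that aggregating call-detail records (who, when, from where) supports sensitive inferences, such as a possible medical condition or a nightly association, without any access to content:

```python
from collections import Counter

# Hypothetical call-detail records: (caller, callee, timestamp, cell_tower).
# No content of any call is stored, yet patterns emerge from metadata alone.
cdrs = [
    ("A", "helpline_oncology", "2024-03-01 09:10", "T1"),
    ("A", "helpline_oncology", "2024-03-03 09:05", "T1"),
    ("A", "insurance_desk",    "2024-03-03 10:40", "T1"),
    ("A", "B",                 "2024-03-04 22:15", "T7"),
    ("A", "B",                 "2024-03-05 22:20", "T7"),
]

def profile(records, subject):
    """Aggregate whom a subject contacts, how often, and from where."""
    contacts = Counter(callee for caller, callee, _, _ in records if caller == subject)
    towers = Counter(tower for caller, _, _, tower in records if caller == subject)
    return {"frequent_contacts": contacts.most_common(), "locations": towers.most_common()}

print(profile(cdrs, "A"))
```

Even this handful of records suggests the subject's health concerns, a recurring late-night contact, and habitual locations; at telecom scale, the same aggregation yields far richer profiles, which is why the Puttaswamy court treated metadata as capable of revealing intimate detail.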

Commercial and Economic Surveillance (Surveillance Capitalism)

This typology operates on a purely economic logic, distinct from state security. For the private sector, surveillance aims to capture personal information for profiling consumer behaviour. Under the Digital Personal Data Protection Act, 2023, "Personal Data" is defined as any data about an individual who is identifiable by such data. In this regime, people’s data becomes a commodity harvested by data mining agencies to be sold to companies for curating marketing strategies per users’ preferences. This ever-increasing monitoring by the private sector has led to what philosopher Shoshana Zuboff terms "Surveillance Capitalism," where human experience is extracted and modified for profit.[1] While the state monitors for control, the market monitors for prediction, though the two often intersect when the state purchases these commercial datasets for its own use.

Lateral and Domestic Surveillance (Horizontal Monitoring)

While traditional surveillance is vertical (State watching Citizen), the digital age has entrenched "lateral surveillance," where private individuals monitor each other. This encompasses the use of commercial CCTV systems, stalkerware, and social media tracking by private entities. The legal framework handles this delicately. On one hand, Section 3(c) of the Digital Personal Data Protection Act, 2023 exempts personal data processed by an individual for any "personal or domestic purpose" from regulatory compliance. On the other hand, the judiciary has set strict constitutional limits on this exemption. In the landmark case of Indranil Mullick & Ors. vs. Shuvendra Mullick (2025),[27] the Supreme Court acknowledged that digital monitoring tools installed by one private party against another within a shared domestic space constitute a violation of the constitutional right to privacy. Thus, while the statute may exempt domestic surveillance from bureaucratic regulation, the courts recognize it as an actionable violation of rights.

Intrusive Surveillance (Hacking and Spyware)

This represents the most aggressive form of digital surveillance, evolving beyond passive interception to active intrusion. It involves the use of malware or spyware (such as Pegasus) to infiltrate a digital device, gaining control over its microphone, camera, and stored files. Unlike standard interception, which catches data in transit, intrusive surveillance accesses data at rest. The Supreme Court's engagement with this typology in Manohar Lal Sharma v Union of India (Pegasus Case) highlighted that such methods are qualitatively different because they violate the sanctity of the device itself.[24] Indian law currently lacks a specific statutory provision explicitly authorizing "hacking" by the state, creating a legal grey area where such actions are challenged as being ultra vires (beyond the powers of) the existing provisions of the IT Act.

Examples of Digital Surveillance

In this section, examples of digital surveillance, both from India and across the world, are discussed. The results of these surveillance measures and the responses to them from citizens are considered.

Surveillance in India:

Multiple initiatives of the Indian government have faced backlash for their potential misuse as surveillance tools. This section discusses a few of these technologies.

Biometric Data Collection (Aadhar, Digi Yatra, Sanchar Saathi etc.):

The Aadhaar (Targeted Delivery of Financial and Other Subsidies, Benefits and Services) Act, 2016 and the associated rules provide for the collection of biometric information: facial images, iris scans, and fingerprints of all 10 fingers are collected during the Aadhaar enrolment process. The use of Aadhaar data and the potential to create profiles of people were among the challenges to the Act discussed in the Puttaswamy (Aadhaar-5J.) judgment. After reading down parts of the scheme, the Supreme Court upheld the Act and the rules under it.

Digi Yatra is a service introduced by the Ministry of Civil Aviation designed to make air travel "seamless, paperless, contact-less and hassle-free for all passengers."[29] It uses facial recognition technology installed at airports, paired with Aadhaar credentials. The Digi Yatra service is non-mandatory. The key objection to Digi Yatra has been that it is not backed by any law: the Digi Yatra Biometric Boarding System Policy (DYBBS Policy) is not legally enforceable, meaning no safeguards to privacy exist. Even if the policy were granted the force of law, Guideline 11 of its Personal Data Guidelines states that "Any Security Agency, GOI or other Govt. Agency may be given access to the passenger data based on the current/ existing protocols prevalent at that time." As to data protection commitments, the policy merely states that the service shall be audited based on standards mandated by the Central Government (Guideline 12), but these standards remain unspecified. It has been argued that the privacy policy of Digi Yatra fails to guarantee the requisite degree of privacy.[30]

The Sanchar Saathi app, a state-owned tool of India's Department of Telecommunications, illustrates digital surveillance risks through its extensive data-access capabilities, including call logs, SMS, location tracking, camera, photos, and phone-management permissions, ostensibly for fraud detection (Chakshu), stolen-device blocking (CEIR), and connection verification (TAFCOP). A November 2025 mandate to pre-install it non-deletably on all smartphones, later revoked amid backlash, highlighted concerns over mass data collection without clear DPDP Act safeguards or judicial oversight, enabling potential real-time monitoring and profiling under the guise of cybersecurity in India's 730-million-smartphone ecosystem.[31]

Criminal Justice:

A major use-case of these technologies has been in the crime and crime-adjacent spheres.

Delhi Police:

The Delhi Police has been an early adopter of technology aimed at apprehending criminals. Per Forbes India, Delhi has the highest number of CCTV cameras per square mile of any city in the world, at 1,826 cameras per square mile.[32] (Chennai, with 609.9 cameras per square mile, ranks 3rd, and Mumbai, with 157.4 cameras per square mile, is 18th.) The Delhi Police has also made use of facial recognition technology to aid arrests. The technology was introduced with the aim of locating missing children and reuniting them with their guardians, as affirmed by the Delhi High Court in Sadhan Haldar vs. NCT of Delhi (2018), but its use has expanded beyond this function.[33] In November 2025, the Delhi Police received 75 surveillance drones for crime-management purposes.[34]

There have been multiple reports of wrongful arrests based on this facial recognition technology, particularly in areas with large Muslim communities.[35][36] A study by the Vidhi Centre for Legal Policy revealed that the uneven distribution of CCTV cameras across areas of Delhi, particularly those with poor or minority populations, was likely to result in a surveillance bias against certain communities.[37]

There are also reports of the Hyderabad Police using AI-based facial recognition technology during the Covid-19 pandemic.[38]

Protests:

State police across India relied on facial recognition technology to monitor the anti-CAA protests in 2019 and 2020, leading to a number of arrests. Officials assured that no protesters' data was being stored, only that of targeted individuals.[39]

Databases: NATGRID, NETRA, CCTNS:

NATGRID or National Intelligence Grid is an integrated master database meant for use by Indian intelligence agencies to combat crime and terrorism threats. It allows various agencies of the Central Government to access 21 standalone databases belonging to various public service providers (SEBI, Railways, banks etc.). The database was conceptualized after the 2008 Mumbai attacks, and is reported to be functional since December 2020.[40] NATGRID collects and stores data relating to tax and bank account details, credit and debit card transactions, visa and immigration records, as well as itineraries of rail and air travel. It is also linked to the Crime and Criminal Tracking Network and Systems (CCTNS) which includes crime and policing-related information, such as First Information Reports (FIRs) filed across police stations in the country. Various central agencies (including the Research and Analysis Wing (R&AW), Intelligence Bureau (IB), National Investigation Agency (NIA), Narcotics Control Bureau (NCB), Enforcement Directorate (ED) etc.) have access to this comprehensive data.[41]

NATGRID has faced several criticisms, regarding both its effectiveness and adverse effects on privacy. Interestingly, none of the agencies that have access to NATGRID data are state agencies or different defense departments (navy, coast guard etc.). Critics have pointed out that the unwillingness to share this information beyond central agencies hampers the supposed preventive function of NATGRID.[42] Local agencies, which tend to be the first responders to any criminal threat, will not be able to make use of this data to prevent crime.

Other common concerns include the absence of sufficient security measures to prevent leakages. It has been pointed out that these databases are run by lower-ranking technician staff rather than high-level officers, creating fears about the catastrophic damage a potential leak could cause, since all relevant information is centralized on a single platform.[42]

The privacy concerns regarding NATGRID stem from the fact that the platform is exempt from the application of the RTI Act (per Gazette of India Notification G.S.R. (E) dated 9th June, 2011) and was developed in the absence of a data protection regime.[43] Additionally, following the ruling in Puttaswamy (2017), any act of surveillance requires a constitutional justification; yet, in the absence of a legal framework governing the operation of NATGRID and with no transparency regarding its functioning, it is impossible to assess whether the intrusion into individuals' right to privacy is proportional and constitutional. As it stands, the project lacks any built-in procedural safeguards against mass, indiscriminate state surveillance. The Aadhaar judgment had also struck down the mandatory linking of individuals' bank accounts and SIM cards with their Aadhaar; NATGRID is in direct contravention of this direction, rendering some of the data collected unconstitutional.

Network Traffic Analysis (or NETRA), developed by the Defence Research and Development Organisation (DRDO), is another surveillance software. Available to the R&AW and the IB, it is used to intercept and filter internet traffic, looking for suspicious keywords such as 'attack,' 'bomb,' and 'kill.' NETRA has faced similar criticism relating to its indiscriminate interception of internet traffic: the predefined keywords may well be terms used in everyday conversation, yet the system does not distinguish between communications of intelligence value and those without. It is argued that NETRA casts too broad a net and raises questions regarding violations of individual privacy.[44]

The Centralised Monitoring System (or CMS) was developed by the Centre for Development of Telematics (C-DOT) and is operated by Telecom Enforcement Resource and Monitoring (TERM) cells under the Department of Telecommunications. The CMS is a telephone interception system that monitors communications on mobile phones, landlines, and over the internet. Its purpose is to bypass the requirement of individually approaching Telecom Service Providers (TSPs): TSPs are required by law to give the government access to their networks for the operation of the CMS, allowing agencies to intercept communications without contacting the TSPs or disclosing to them which information is being accessed. No oversight mechanism or procedural safeguards have been notified.

The National Automated Facial Recognition System (AFRS) is being developed by the National Crime Records Bureau (NCRB). This project aims to create a national-level database of photographs and a facial recognition technology, to be used in conjunction with the database. The photographs are to be gathered from existing databases such as the Passport Database maintained by the Ministry of External Affairs and the CCTNS. The database is to be used for the identification of criminals - crime scene footage will be fed into the facial recognition technology in an attempt to obtain a match. Though this technology faces similar objections to the ones discussed earlier (it is operating in a legal vacuum, no procedural safeguards etc.), the specific concern with AFRS is the inaccuracy of existing facial recognition technology. Studies have discussed how incorrect matches with these technologies are too common for them to be viable, and how problems such as inadequate datasets severely affect their accuracy. [45][46]
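The accuracy concern is partly a matter of base rates, which a short back-of-the-envelope calculation makes concrete. The numbers below are assumed for illustration, not AFRS specifications:

```python
# Illustrative arithmetic: even a seemingly accurate matcher produces
# mostly false matches when one probe image is compared against a very
# large gallery, because almost everyone in the gallery is not the target.

gallery_size = 50_000_000    # assumed photographs in a national database
true_matches = 1             # the actual person, if enrolled at all
false_match_rate = 0.0001    # assumed 0.01% false positives per comparison

expected_false_matches = (gallery_size - true_matches) * false_match_rate
print(f"Expected false matches per search: {expected_false_matches:,.0f}")
```

Under these assumptions, a single search surfaces roughly five thousand innocent candidates for every genuine match, which is why headline match accuracy alone understates the risk of wrongful identification at national scale.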

One common criticism of all these surveillance technologies is that they are almost entirely exempt from any form of oversight, whether parliamentary or judicial. This has raised concerns regarding executive overreach and the lack of checks and balances to prevent abuse of the technology.[47] Similarly, a major cause for concern is that, at the time these technologies were conceived, there was no data protection regime in place in India. The DPDP Act only came into force on November 18th, 2025; it is still in a stage of phased compliance, and full compliance will be required only between late 2026 and mid-2027.[48]

Surveillance in the World:

Criminal Justice:

Surveillance technologies have been credited as solutions to crime management in different cities across the world.

In Amsterdam, authorities have been quick to adopt predictive policing models: algorithms that claim to predict when and where a crime will happen, who will commit it, and so on. Predictive policing has also been used to create a "Top 400" list, a crime-prevention programme that monitors children considered at high risk of future offending even though they have not been convicted of any criminal acts. The list has been criticized for profiling,[49] and privacy concerns have also been raised.[50]

UK police are also testing an analytics platform called the National Data Analytics Solution (NDAS) to form better strategies against violent crime. The concerns regarding surveillance have been echoed in this context as well.[51]

GDPR:

The General Data Protection Regulation (GDPR) of the EU is a comprehensive law on data privacy. The legislation is not designed to shield against state surveillance, however. Article 23 of the GDPR allows member states to restrict the rights of the data subject where this is a necessary and proportionate measure to safeguard interests such as national security, public security, and the investigation of offences.

Contributions from Social Sciences Scholars

This section considers the contributions of surveillance studies scholars, examining the origins and effects of surveillance technology from a social sciences perspective.

Algorithmic Biases and Perpetuation of Injustice:

Over the last two decades, many studies have discussed how bias along lines of gender, race, and other social markers becomes embedded in technology. Technology is shaped by the humans who interact with it, and human biases inevitably imprint upon it over time. In Algorithms of Oppression: How Search Engines Reinforce Racism, Safiya Umoja Noble discusses the biases inherent in search engines like Google, and how algorithms do not ensure neutrality or a level playing field for all sections of society.[52] Noble adopts an intersectional Black feminist approach, using examples such as Google searches for "black girls" returning links to pornographic websites to demonstrate the troubling sexualization and oppression of women. Noble argues that algorithms not only reflect the injustices in society; they go on to reproduce and perpetuate them. The book also serves as a critique of neoliberalism: technology companies find it profitable to keep circulating controversial search results that perpetuate harmful views and stereotypes.

In Automating Inequality: How High-Tech Tools Profile, Police and Punish the Poor, Virginia Eubanks looks at the impact of algorithmic decision-making on public services in the USA.[53] Eubanks argues that this approach has more far-reaching effects than the much-criticized older welfare measures (such as poorhouses in the nineteenth century). She presents three case studies relating to welfare provision, homelessness and child protection services. Her criticism hinges on the high rate of error that has come from the use of these technologies. She also adopts an intersectional approach, involving both race and class. The disproportionate impact of these technologies is felt by marginalized communities who are profiled by the algorithms. A 'digital poorhouse' is created, where the same race and class-based inequalities that exist in analog welfare measures are replicated.

The overarching problem pointed out by these authors (and several more) relates to algorithms acquiring and mimicking the biases already present in society. There are three major sources of algorithmic bias:[54]

  1. Data Bias: The datasets used to train AI may not be large enough (sample inadequacy) or may not have been selected at random (sample selection bias). Algorithms cannot identify such gaps on their own, nor predict how they will affect the fairness of the entire process.
  2. Method Bias: The methodological approaches adopted in developing machine learning-based software may themselves distort the algorithm. Common issues are correlation fallacies (falsely conflating correlation with causation) and overgeneralization of findings from a specific dataset. Interactions with humans may also create harmful feedback loops, which cause models to produce results that exacerbate the biases already present in society.
  3. Societal Bias: The datasets that models are trained on may reflect social and historical biases, rendering the entire model biased. Historical discriminatory practices can transfer over to algorithms too. Even when existing prejudices are not imprinted onto algorithms, bias can still arise.[55] Algorithms do not exist in isolation; they are products of their social environments, so injustices in the real world are likely to impact algorithms as well.
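Sample selection bias (source 1 above) can be sketched in a few lines. In this toy simulation, both groups have an identical true rate of the outcome; the gap the data suggest arises purely from non-random collection (all figures and the collection rule are hypothetical):

```python
import random

random.seed(42)  # deterministic illustration

# Both groups share the SAME true underlying outcome rate; any gap a
# model later "learns" comes purely from how the data were collected.
TRUE_RATE = 0.1

def sample(n: int) -> list[int]:
    """Draw n outcomes from the common true rate."""
    return [1 if random.random() < TRUE_RATE else 0 for _ in range(n)]

# Group A: recorded at random (a representative sample).
group_a = sample(1000)

# Group B: recorded mainly when the outcome is positive (e.g. only
# flagged or stopped individuals enter the dataset); negatives are
# captured just 20% of the time. This is non-random selection.
group_b = [x for x in sample(1000) if x == 1 or random.random() < 0.2]

rate_a = sum(group_a) / len(group_a)
rate_b = sum(group_b) / len(group_b)
print(f"estimated rate, group A: {rate_a:.2f}")  # close to the true 0.10
print(f"estimated rate, group B: {rate_b:.2f}")  # far above the true 0.10
```

Any model trained on this dataset would conclude that group B is several times riskier than group A, even though the two groups are, by construction, identical.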

How is this Relevant to Digital Surveillance?

Digital surveillance technologies make use of similar algorithms, which suffer from the same pitfalls of compromised datasets and interactions with human biases. Surveillance can disproportionately target certain communities, or generate incorrect information if algorithmic bias is not countered.

Techno-Solutionism:

Techno-solutionism refers to the belief that deploying technology will provide solutions to varied challenges in society. This goes hand-in-hand with the push towards 'predicting' crime in modern policing. Scholars argue that policing is going through a 'rationalizing' process, as described by Max Weber (1947).[56] Emerging technologies and techniques in policing aim to gather and structure data in order to find patterns and predict crime before it ever happens. However, this strips away cultural and human concerns from policing, leaving the task up to ill-conceived tools and technologies. The speed with which technological solutions are deployed for crime-related problems is striking; there is a 'magic' associated with technology, even though its effectiveness has not been confirmed.[56][57]

The problem with this approach is that technology is deployed in the real world before it is adequately developed or equipped with the required safeguards. This leads to unintended negative consequences.

Theoretical framework

The term "technological solutionism" was popularized by technology critic Evgeny Morozov in his 2013 book, To Save Everything, Click Here. Morozov argued that solutionism recasts complex, often ambiguous human problems as neatly defined puzzles with definite, computable solutions.[58] In the context of surveillance, this manifests as the belief that safety is purely a metric of coverage and data retention. Meredith Broussard expanded on this with the concept of "technochauvinism," describing the pervasive belief that technological solutions are inherently superior to human ones.[59] This ideology often masks the reality that surveillance algorithms are not neutral but are trained on historical data that may reflect systemic biases.[60]

This framework is frequently analyzed alongside Shoshana Zuboff’s theory of Surveillance Capitalism. Zuboff posits that the drive for solutionist technologies is not merely social but economic. By framing every aspect of human life as a problem to be solved by tracking, technology companies can extract human experience as raw material for behavioral prediction markets. Consequently, the "solution" offered by digital surveillance is often a pretext for data extraction, where the primary beneficiary is the technology provider rather than the community being surveilled.[1]

Applications in smart cities and policing

A primary application of solutionist surveillance is the "Smart City" model. Proponents argue that integrating cameras, facial recognition, and IoT sensors into urban infrastructure creates a seamless feedback loop that optimizes safety and efficiency. However, recent research suggests these implementations often amount to "security theater": measures designed to look impressive rather than be functionally effective. A 2024 study on the Paris 2024 Olympics highlights how major events are used as justifications to deploy algorithmic video surveillance (AVS). The researchers argue that these deployments normalize intrusive monitoring under the guise of temporary security, creating a permanent "legacy" of surveillance that persists long after the specific problem it was meant to solve has concluded.

Similarly, in law enforcement, solutionism drives the adoption of predictive policing algorithms. These systems are marketed as objective tools to allocate police resources efficiently. However, research by scholars such as Virginia Eubanks demonstrates that these tools often create feedback loops. By sending police to areas with high historical arrest data, the algorithms ensure more arrests are made in those areas, confirming the bias and mathematically justifying over-policing of marginalized communities under the banner of technological efficiency.[61]
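The feedback loop described above can be reduced to a toy simulation (district names, arrest counts, and rates are all hypothetical): patrols are allocated in proportion to past arrests, and new arrests scale with patrol presence rather than with the identical underlying crime rate, so the initial disparity is continually "confirmed" and widened in absolute terms:

```python
# Toy simulation (all numbers hypothetical) of the predictive-policing
# feedback loop: patrols follow historical arrest counts, and new
# arrests follow patrols rather than actual crime.

TRUE_CRIME_RATE = 0.05                           # identical in BOTH districts
STOPS_PER_PATROL = 20                            # assumed stops per patrol
arrests = {"district_1": 60, "district_2": 40}   # historically skewed data

def allocate_patrols(history: dict[str, int], total: int = 100) -> dict[str, int]:
    """Send patrols in proportion to past arrest counts."""
    overall = sum(history.values())
    return {d: round(total * n / overall) for d, n in history.items()}

for year in range(5):
    patrols = allocate_patrols(arrests)
    for district, n_patrols in patrols.items():
        # New arrests scale with patrol presence, not with any
        # difference in underlying crime (there is none).
        arrests[district] += round(n_patrols * STOPS_PER_PATROL * TRUE_CRIME_RATE)

print(arrests)  # {'district_1': 360, 'district_2': 240}
```

After five years the absolute arrest gap has grown sixfold even though the two districts have identical crime rates: the data produced by the system keep justifying its own initial skew.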

Criticism and digital colonialism

A significant body of academic work critiques the ethical implications of exporting solutionist surveillance to the Global South, a phenomenon Michael Kwet describes as "Digital Colonialism." This critique suggests that Western and Chinese technology firms use developing nations as laboratories for testing surveillance infrastructure.[62] A frequently cited case study is Aadhaar in India, a biometric identification system marketed as a technological solution to poverty and financial exclusion. Critics note that while framed as a tool for efficiency, it created a centralized surveillance apparatus that frequently leads to the exclusion of the poor due to technical failures, illustrating the "efficiency trap" where technical functionality is prioritized over human welfare.[63]

Furthermore, recent psychological research challenges the efficacy of these systems. A 2024 study published in Communications Psychology provided empirical evidence that individuals react more negatively to being monitored by AI algorithms than by humans. These findings contradict the solutionist narrative that algorithmic surveillance is less intrusive or more objective. The research indicates that replacing human oversight with automated systems leads to lower perceptions of autonomy and increased behavioral resistance, suggesting that technological solutions to supervision may be counterproductive.[64]

Recent developments (2020-present)

While the COVID-19 pandemic provided a test case for solutionist apps (such as contact tracing), the most recent discourse focuses on the rise of Generative AI. A 2023 report by the Centre for Emerging Technology and Security (CETaS) warns that Generative AI significantly lowers the barrier for authoritarian surveillance. It allows for the automated creation of synthetic evidence and propaganda at scale, complicating the information environment. Researchers are now examining how this "AI efficiency" is being used to justify the deployment of automated targeting systems in conflict zones, marking a shift from surveillance as a tool of monitoring to surveillance as a tool of automated kinetic action.[65]

Effectiveness:

Jeremy Bentham's architectural concept of the panopticon is often compared to the world after the advent of digital surveillance.[66] The panopticon is a prison building that doubles as a mechanism for the constant visibility of all inmates. A watchman occupying the central tower can look into any cell in the prison, yet none of the prisoners can tell where the watchman is looking. This creates a self-regulatory effect: the fear of being spotted by the watchman coerces the prisoners into regulating their own behavior even when no one may be watching them. Michel Foucault saw the panopticon as an allegory for state control, a mechanism of discipline.[67] The tower symbolizes a power that is visible yet unverifiable, automating the task of disciplining. He compared such a mechanism to the modern police system: a centralized, strict, administrative body that exercises its power through the fear of penitentiaries, networks of secret agents, and even the fear of being reported by a fellow citizen. In this way, the police exercise discipline over society.

Digital surveillance has been compared to a panopticon by scholars of surveillance studies. According to David Lyon, surveillance is fundamentally about power, i.e. it seeks to control and predict.[68] The few watch the many in all forms of surveillance, from policing to workplace monitoring. This creates a 'self-monitoring' situation, where the mere idea of being seen (by CCTVs, facial recognition software, etc.) is enough to change the behavior of those being monitored. One of the goals behind surveillance is to change behavior in the form of increased compliance with rules. However, those who are pro-surveillance (and pro-techno-policing) fail to consider that humans can and will adapt to their environment.[56] With new technologies, new ways of committing crime also arise. Studies have found that an increase in video surveillance does not perceptibly impact safety.[69]

Fear It Causes

Surveillance-induced anxiety and the chilling effect

Surveillance-induced anxiety describes the psychological distress and behavioral modification that occurs when individuals believe they are being monitored by digital systems. This phenomenon is closely linked to the legal concept of the "chilling effect," where the awareness of surveillance discourages the exercise of civil liberties, such as free speech, association, and intellectual inquiry.[70] Research suggests that digital surveillance does not merely capture behavior but actively alters it, enforcing social conformity through a mechanism of internalized fear known as "anticipatory conformity."

Panopticism

The foundational theory for understanding surveillance-based fear is Michel Foucault’s interpretation of the Panopticon. Originally a prison design by Jeremy Bentham, Foucault metaphorically applied it to modern society in Discipline and Punish (1975). He argued that the power of surveillance lies in its visibility and unverifiability; because the subject never knows exactly when they are being watched, they must act as if they are watched at all times.[71]

In the digital age, scholars like Ivan Manokha argue that this has evolved into "Electronic Panopticism." Unlike the physical prison, digital surveillance is ubiquitous and invisible. Manokha suggests that the fear of algorithms analyzing one’s data leads to a state of constant self-discipline, where individuals voluntarily restrict their behavior to align with perceived algorithmic norms to avoid being "flagged" or downranked.[72]

The "Chilling Effect" on free inquiry

Empirical research has validated the theory that surveillance induces intellectual fear. A landmark study by Jon Penney (2016) examined Wikipedia traffic before and after the 2013 Snowden revelations regarding NSA surveillance. The study found a statistically significant and immediate drop in traffic to Wikipedia articles related to terrorism and security (e.g., "Al-Qaeda," "dirty bomb"). Penney argues this demonstrates a mass "chilling effect," where fear of government monitoring caused citizens to self-censor their intellectual curiosity to avoid suspicion.[70]

Complementing this, Elizabeth Stoycheff (2016) researched the impact of surveillance awareness on social media discourse. Her study utilized the "Spiral of Silence" theory, finding that when participants were primed with reminders of government surveillance, they were significantly less likely to express non-conformist or minority political opinions. This suggests that the fear of surveillance degrades democratic discourse by incentivizing conformity to the perceived majority opinion.[73]

Workplace surveillance and "Bossware"

A growing body of research focuses on the anxiety produced by "Bossware" (algorithmic management technologies) in the workplace. These systems track keystrokes, mouse movements, and attention, often using AI to predict productivity. Research by Karen Levy (2015) on truck drivers highlights how constant digital monitoring creates a high-stress environment where workers feel dehumanized, leading to fatigue and "resistance" behaviors.[74]

More recent reports from the Center for Democracy and Technology (2021) highlight that the expansion of this monitoring into the home during the COVID-19 pandemic (via remote work software) has dissolved the boundary between professional and private life, causing significant psychological distress. Employees report a "fear of the invisible manager," leading to performative busyness (e.g., moving the mouse just to appear active) rather than actual productivity.[75]

Racializing surveillance and targeted fear

Scholarship also examines how surveillance fear is unevenly distributed. Simone Browne, in Dark Matters (2015), introduces the concept of "racializing surveillance," arguing that surveillance technologies are historically rooted in the control of Black bodies (dating back to lantern laws). Consequently, the fear of surveillance is not a generic privacy concern for marginalized communities but an existential fear of state violence.[76]

This is supported by recent studies on the impact of Countering Violent Extremism (CVE) programs on Muslim communities. Research indicates that the pervasive monitoring of these communities creates a "climate of suspicion," where individuals avoid religious expression, political activism, or even accessing social services due to the fear that their data will be misinterpreted as a security threat.[77]

Way Ahead

Vrinda Bhandari and Karan Lahiri identify multiple problems with India's digital surveillance infrastructure that must be changed in a post-Puttaswamy world.[78] As it stands, the law does not offer adequate protection, because: (1) the few safeguards in existing law are very widely worded and easy to circumvent; (2) the authority to conduct surveillance lies entirely with the executive; and (3) illegally obtained evidence is admissible in Indian courts, so there is nothing to deter prosecutors from collecting such data. The authors argue that the judiciary can act as a counter to the unchecked power of the executive to access private data. With a strong articulation of the right to privacy in Puttaswamy, the authors believe it is possible to push for judicial oversight over all surveillance activities conducted by the state. Any challenge before the court will need to highlight the changes in privacy law and technology that have taken place since the challenge against telephone tapping in PUCL. The authors conclude that secret surveillance, when carried out exclusively with executive oversight, would fail the proportionality test.

The authors also argue that evidence obtained through the violation of a constitutional right cannot be deemed admissible. For this, they place reliance on precedent that bars evidence that is the result of testimonial compulsion, since it amounts to a violation of the right guaranteed by Article 20(3). Accordingly, they conclude that evidence obtained by violating the right to privacy, an important component of the Right to Life and Liberty contained under Article 21, must also be inadmissible. This will deter authorities from indiscriminate and illegal surveillance activities.

Therefore, the way ahead would involve a challenge to the unregulated surveillance powers of the Executive. Borrowing the doctrine of privacy in Puttaswamy, a strong case could be made for the introduction of judicial oversight over all surveillance activities that compromise private data. Further, there exist legal arguments to back the position that evidence obtained in contravention of an individual's constitutional rights must not be allowed to be admitted into court. Finally, there must be adequate precautions in place to ensure the technology being used in these surveillance systems is reliable, accurate and non-discriminatory. With these safeguards, it is likely that digital surveillance in India will become more fair and regulated.

Related Terms

(synonymous terms)

  1. Spying

(specific surveillance practices)

  1. Video surveillance
  2. Biometric data collection
  3. Data Encryption
  4. Data Interception
  5. Automated decision-making

(connected legal themes)

  1. Right to Privacy
  2. Data Privacy and Protection
  3. Tech-policing
  4. National Security Justification


  1. 1.0 1.1 1.2 Shoshana Zuboff, The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power (Profile Books 2019)
  2. The Vocabularist, ‘The Very French History of the Word “Surveillance”’ (BBC News, 14 July 2015) https://www.bbc.com/news/blogs-magazine-monitor-33464368 accessed 24 November 2025
  3. FB Smith, ‘British Post Office Espionage, 1844’ (1970) 14 Historical Studies
  4. University of Michigan, ‘History of Surveillance Timeline’ (Safe Computing) https://safecomputing.umich.edu/protect-privacy/history-of-surveillance-timeline accessed 24 November 2025
  5. IDS Security Systems, ‘The History of CCTV’ (IDS Security Systems, 23 November 2020) https://www.ids-securityltd.co.uk/news/post/the-history-of-cctv accessed 24 November 2025
  6. Duncan Campbell, 'Interception Capabilities 2000' (Report to the Director General for Research of the European Parliament, April 1999)
  7. Bob Mesnik, ‘The History of Video Surveillance’ (Kintronics, 2016) https://kintronics.com/the-history-of-video-surveillance/ accessed 24 November 2025
  8. Roger Clarke, 'Information Technology and Dataveillance' (1988) 31 Communications of the ACM.
  9. Rahul Sundaram, ‘Navigating Privacy and Security: Understanding India’s New Lawful Interception of Messages Rules’ (IndiaLaw, 9 December 2024) https://www.indialaw.in/blog/civil/privacy-india-lawful-interception/ accessed 3 December 2025
  10. PRS Legislative Research, ‘Draft Telecom Rules on Interception, Temporary Suspension of Services, and Cyber Security 2024’ (PRS Legislative Research, 2024) https://prsindia.org/billtrack/2024-draft-telecom-rules-on-interception-temporary-suspension-of-services-and-cyber-security accessed 3 December 2025
  11. Harsh Walia and others, ‘New Rules for Lawful Interception of Telecommunications’ (Khaitan & Co, 11 September 2024) https://www.khaitanco.com/thought-leadership/New-Rules-for-Lawful-Interception-of-Telecommunications accessed 3 December 2025
  12. https://www.commoncause.in/wotadmin/upload/REPORT_2023.pdf
  13. Committee of Experts under the Chairmanship of Justice B.N. Srikrishna, A Free and Fair Digital Economy: Protecting Privacy, Empowering Indians (Ministry of Electronics and Information Technology 2018)
  14. 14.0 14.1 14.2 K.S. Puttaswamy (Privacy-9J.) v. Union of India, (2017) 10 SCC 1
  15. Group of Experts on Privacy, Report of the Group of Experts on Privacy (Justice A.P. Shah Committee) (Planning Commission of India 2012)
  16. Standing Committee on Communications and Information Technology, Fifty-Third Report: Citizens’ Data Security and Privacy (Lok Sabha Secretariat 2023)
  17. PUCL v. Union of India, (1997) 3 SCC 433
  18. M.P. Sharma v. Satish Chandra, (1954) 1 SCC 385
  19. Kharak Singh v. State of U.P., (1964) 1 SCR 332
  20. https://corporate.cyrilamarchandblogs.com/2019/11/surveillance-post-puttaswamy-era-right-to-privacy/
  21. K.S. Puttaswamy (Aadhaar-5J.) v. Union of India, (2019) 1 SCC 1
  22. Vinit Kumar v. CBI, 2019 SCC OnLine Bom 3155
  23. https://globalfreedomofexpression.columbia.edu/cases/kumar-v-central-bureau-of-investigation/
  24. 24.0 24.1 Manohar Lal Sharma (Pegasus Spyware) v. Union of India, (2023) 11 SCC 401
  25. Manohar Lal Sharma v. Union of India, 2021 SCC OnLine SC
  26. Indranil Mullick vs. Shuvendra Mullick, SLP (C) No. 12384/2025
  27. 27.0 27.1 Shuvendra Mullick v. Indranil Mullick, 2025 SCC OnLine Cal 1245
  28. https://www.commoncause.in/wotadmin/upload/REPORT_2023.pdf
  29. Ministry of Civil Aviation, Government of India, 'Digi Yatra Biometric Boarding System Policy' <https://www.civilaviation.gov.in/sites/default/files/2023-07/Digi%20Yatra%20Policy%20%28DIGI%20YATRA%29.pdf> accessed on 28 November 2025
  30. Disha Verma, 'Resist Surveillance Tech, Resist Digi Yatra' (Internet Freedom Foundation, 16 January 2024)<https://internetfreedom.in/reject-digiyatra/> accessed 28 November 2025
  31. https://www.nytimes.com/2025/12/02/business/india-tracking-app-sanchar-saathi.html
  32. 'Delhi, Chennai among most surveilled in the world, ahead of Chinese cities' (Forbes India, 25 August 2021)<https://www.forbesindia.com/article/news-by-numbers/delhi-chennai-among-most-surveilled-in-the-world-ahead-of-chinese-cities/69995/1> accessed 28 November 2025
  33. Sadhan Haldar v. The State of NCT of Delhi W.P. (CRL.) 1560/2017 (Delhi High Court)
  34. 'Delhi police receives 75 surveillance drones to boost tech-based policing' (The New Indian Express, 11 November 2025) <https://www.newindianexpress.com/cities/delhi/2025/Nov/11/delhi-police-receives-75-surveillance-drones-to-boost-tech-based-policing> accessed 28 November 2025
  35. Sagar, 'Detentions, arrests, interrogations: Fear reigns in Muslim neighbourhoods of northeast Delhi' (The Caravan, 11 March 2020) <https://caravanmagazine.in/conflict/detentions-delhi-violence-northeast-muslim-arrests-riots-police-crime-branch> accessed 28 November 2025
  36. Astha Savyasachi, 'As AI Policing Took Over in Delhi, Who Bore the Brunt?' (The Wire, 2 July 2025)<https://pulitzercenter.org/stories/ai-took-over-policing-delhi-who-bore-brunt> accessed 28 November 2025.
  37. Jai Vipra, 'The Use of Facial Recognition Technology for Policing in India' (Working Paper, Vidhi Center for Legal Policy 2021) <https://vidhilegalpolicy.in/wp-content/uploads/2021/08/The-Use-of-Facial-Recognition-Technology-for-Policing-in-Delhi-compressed.pdf> accessed 28 November 2025
  38. Tanmay Singh, 'Hyderabad Police force people to remove their masks before photographing them' (Internet Freedom Foundation, 2 June 2021) <https://internetfreedom.in/hyderabad-police-force-people-to-remove-their-masks-before-photographing-them-we-sent-a-legal-notice-saveourprivacy/> accessed 28 November 2025
  39. Alexandra Ulmer and Zeba Siddiqui, 'India's use of facial recognition tech during protests causes stir' (Reuters, 17 February 2020) <https://www.reuters.com/article/world/indias-use-of-facial-recognition-tech-during-protests-causes-stir-idUSKBN20B0ZP/> accessed 28 November 2025
  40. https://x.com/ANI/status/1224971090503462913
  41. 'Natgrid CEO P Raghu Raman: New face of Intelligence' (The Economic Times, 12 June 2011) <https://web.archive.org/web/20131215065723/http://articles.economictimes.indiatimes.com/2011-06-12/news/29647514_1_natgrid-warsaw-data> accessed on 28 November 2025
  42. 42.0 42.1 V. Balachadran, 'NATGRID will prove to be a security nightmare' (The Sunday Guardian, June 28 2013)<https://web.archive.org/web/20130628172356/http://www.sunday-guardian.com/analysis/natgrid-will-prove-to-be-a-security-nightmare> accessed on 28 November 2025
  43. Ministry of Personnel, Public Grievances and Pensions, Department of Personnel and Training, Notification G.S.R.(E) (9 June 2011)
  44. Anushka Jain and Vrinda Bhandari, 'The Development of Surveillance Technology in India: Beyond judicial review or oversight' (Verfassungsblog, 7 April 2022)<https://verfassungsblog.de/os6-india/> accessed 28 November 2025
  45. Teo Canmetin, Juliette Zaccour and Luc Rocher, 'Why We Shouldn’t Trust Facial Recognition’s Glowing Test Scores' (Tech Policy Press, 18 August 2025) <https://www.techpolicy.press/why-we-shouldnt-trust-facial-recognitions-glowing-test-scores/> accessed 28 November 2025
  46. Jacob Snow, 'Amazon’s Face Recognition Falsely Matched 28 Members of Congress With Mugshots' (American Civil Liberties Union, 26 July 2018) <https://www.aclu.org/news/privacy-technology/amazons-face-recognition-falsely-matched-28> accessed 28 November 2025
  47. Anurag Kotoky, 'India sets up elaborate system to tap phone calls, e-mail' (Reuters, 20 June 2013)<https://web.archive.org/web/20160306233258/http://in.reuters.com/article/india-surveillance-idINDEE95J04V20130620> accessed 28 November 2025
  48. Aroon Deep, 'Digital Personal Data Protection Act notified after two years, RTI Act amended' (The Hindu, 14 November 2025)<https://www.thehindu.com/news/national/digital-personal-data-protection-act-notified-after-two-years-rti-act-amended/article70278698.ece> accessed 28 November 2025
  49. 'Top400' (Public Interest Litigation Project Netherlands)<https://pilp.nu/en/dossier/top400/> accessed 28 November 2025
  50. Amnesty International, 'Netherlands: We sense trouble: Automated discrimination and mass surveillance in predictive policing in the Netherlands' (2020) <https://www.amnesty.org/en/documents/eur35/2971/2020/en/> accessed 28 November 2025
  51. Youngsub Lee, Ben Bradford, and Krisztian Posch, 'The Effectiveness of Big Data-Driven Predictive Policing: Systematic Review' (2024) 7(2) Justice Evaluation Journal 127<https://www.tandfonline.com/doi/full/10.1080/24751979.2024.2371781#d1e129> accessed 28 November 2025
  52. Safiya Umoja Noble, Algorithms of Oppression: How Search Engines Reinforce Racism (NYU Press 2018), ch 1
  53. Virginia Eubanks, Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor (St. Martin's Press 2018)
  54. Shahriar Akter and others, 'Algorithmic bias in data-driven innovation in the age of AI' (2021) 60 International Journal of Information Management <https://par.nsf.gov/servlets/purl/10344127> accessed 28 November 2025
  55. National Institute of Standards and Technology, Government of USA, 'Towards a Standard for Identifying and Managing Bias in Artificial Intelligence' (2020) <https://doi.org/10.6028/NIST.SP.1270> accessed 28 November 2025
  56. 56.0 56.1 56.2 Peter K. Manning, The Technology of Policing (NYU Press, 2008)
  57. Stéphane Leman-Langlois, ‘Introduction: technocrime’ in Stéphane Leman-Langlois (ed), Technocrime (Willan 2013)
  58. Evgeny Morozov, To Save Everything, Click Here: The Folly of Technological Solutionism (PublicAffairs 2013)
  59. Meredith Broussard, Artificial Unintelligence: How Computers Misunderstand the World (MIT Press 2018)
  60. Safiya Umoja Noble, Algorithms of Oppression: How Search Engines Reinforce Racism (NYU Press 2018)
  61. Virginia Eubanks, Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor (St Martin's Press 2018) https://blogs.lse.ac.uk/lsereviewofbooks/2018/07/02/book-review-automating-inequality-how-high-tech-tools-profile-police-and-punish-the-poor-by-virginia-eubanks/ accessed 28 November 2025
  62. Michael Kwet, 'Digital Colonialism: US Empire and the New Imperialism in the Global South' (2019) 60 Race & Class 3
  63. Reetika Khera, 'The Impact of Aadhaar on Welfare Programmes' (2017) 52 Economic and Political Weekly 61
  64. Jonas Schlund and Emily M Zitek, 'Algorithmic versus Human Surveillance Leads to Lower Perceptions of Autonomy and Increased Resistance' (2024) 2 Communications Psychology 1 https://pubmed.ncbi.nlm.nih.gov/39242768/ accessed 28 November 2025
  65. Ardi Janjeva, Alexander Harris, Sarah Mercer, Alexander Kasprzyk and Anna Gausen, The Rapid Rise of Generative AI: Assessing the Risks to Safety and Security (Centre for Emerging Technology and Security 2023) https://cetas.turing.ac.uk/sites/default/files/2023-12/cetas_research_report_-_the_rapid_rise_of_generative_ai_-_2023.pdf accessed 28 November 2025
  66. Jeremy Bentham, Panopticon; or, the Inspection House (first published 1791, Kessinger Publishing 2009)
  67. Michel Foucault, Discipline and Punish: The Birth of the Prison (Pantheon Books, 1977) pt 3, ch 3
  68. David Lyon, Surveillance Studies: An Overview (Polity Press, 2007), ch 1
  69. Jerry H. Ratcliffe and Jessica M. Rosenthal, 'Video Surveillance of Public Places,' ASU Centre for Problem-Oriented Policing (2021)<https://popcenter.asu.edu/sites/g/files/litvpz3631/files/video_surveillance_of_public_places_2d_ed._9.1.22.pdf?ref=static.internetfreedom.in> accessed November 2025
  70. 70.0 70.1 Jonathon W Penney, 'Chilling Effects: Online Surveillance and Wikipedia Use' (2016) 31 Berkeley Technology Law Journal 117
  71. Michel Foucault, Discipline and Punish: The Birth of the Prison (Alan Sheridan tr, Vintage Books 1977)
  72. Ivan Manokha, 'Surveillance, Panopticism, and Self-Discipline in the Digital Age' (2018) 16 Surveillance & Society 219
  73. Elizabeth Stoycheff, 'Under Surveillance: Examining Facebook’s Spiral of Silence Effects in the Wake of NSA Internet Monitoring' (2016) 93 Journalism & Mass Communication Quarterly 296
  74. Karen Levy, 'The Contexts of Control: Information, Power, and Truck-Driving Work' (2015) 31 The Information Society 160
  75. Lydia XZ Brown and others, Warning: Bossware May Be Hazardous to Your Health (Center for Democracy and Technology 2021) https://cdt.org/insights/report-warning-bossware-may-be-hazardous-to-your-health/ accessed 28 November 2025
  76. Simone Browne, Dark Matters: On the Surveillance of Blackness (Duke University Press 2015)
  77. Tufyal Choudhury, The Experience of Muslims in the UK with the Prevent Duty (Open Society Foundations 2020)
  78. Vrinda Bhandari and Karan Lahiri, 'The Surveillance State, Privacy and Criminal Investigation in India: Possible Futures in a Post-Puttaswamy World' (2020) 3(2) University of Oxford Human Rights Hub Journal <https://ohrh.law.ox.ac.uk/wp-content/uploads/2021/04/U-of-OxHRH-J-The-Surveillance-State-Privacy-and-Criminal-Investigation-1-1.pdf> accessed 28 November 2025