Facial recognition technology (FRT) is a way of identifying or verifying a person’s identity using their face. Facial recognition technology has become a hot topic in recent years, particularly in the last few months. Because of its surveillance and control component, it has a wide range of applications, including retail malls, airports, stadiums, concerts, and law enforcement. Facial recognition technology can draw on databases of images and videos, such as those on identity cards, driver’s licenses, security cameras, school ID photographs, and many more, to identify persons in real life or in security footage and images.
The government cannot make head or tail of what FRT entails, and yet it is pressing ahead with the technology in the name of national security. The government is establishing a huge FRT network known as the Automated Facial Recognition System (AFRS), which makes CCTV monitoring much easier by extracting facial biometrics from footage and matching them with photographs stored in a database. Whatever its merits, it poses a threat to residents’ privacy and basic human rights.
Who owns your data as per Indian data protection law?
India’s data protection law, the Personal Data Protection Bill, 2019, will be enacted soon. It seems obvious that you, the user, should have “natural ownership” of your data. But the Bill doesn’t explicitly state this. This is problematic because while we have certain rights and protections, we don’t get a clear and solid right of ownership, which means we don’t get the final say and consent over our information on the internet. We should strive for a stronger foundation and an explicit ownership right in our data law.
To properly enforce this massive and complex data protection law, businesses and the government will have to set up a great deal: infrastructure, regulatory bodies, management, etc. What adds to the problems for both users and the businesses that process and manage user data (data fiduciaries) is this: the Bill contains no plan for transition and implementation. This ambiguity is quite dangerous. What happens when users like you withdraw consent to the processing of some of their data? Do businesses ask for consent again under the new law or not? They should, but we don’t know, because the Bill never addresses these issues with a plan.
Right to information v/s the data protection law
Setting a dangerous precedent for public transparency, clause 96 of the data protection Bill confers an overriding effect on it. As a result, wherever the Bill’s provisions conflict with existing laws, the Bill’s provisions will take precedence. This includes the RTI Act. The RTI Act is integral to us, as citizens of a democratic country: it allows us to seek information, transparency, and accountability from public institutions. Further, the RTI Act already has provisions to balance the citizens’ right to information against public officials’ right to privacy. Clause 96 thus dangerously undermines that accountability, with no good reason.
The draft data protection bill prepared by the Justice B.N. Srikrishna committee has an impact on important laws such as the Right to Information (RTI) Act. The proposed Personal Data Protection Bill, 2018 aims to alter Section 8(1)(j) of the RTI Act, which states that personal data that does not relate to any ‘public interest or activity’ cannot be disclosed unless deemed to be in the public interest. In other words, if personal information is shown to serve a public purpose, it can be requested under the RTI Act. The proposed change is retrograde: the present RTI Act already protects personal information, since it forbids disclosure of any material that is not in the public interest or has no bearing on any public activity.
A look at the contentious provisions of the Bill
The use of facial recognition techniques restricts an individual’s right to privacy, as guaranteed by Article 21 of the Indian Constitution. If someone peacefully protests against the government, this technology allows the government to identify every protestor.
Under Article 19, a person’s freedom of speech and expression, right to protest, and right to movement will be harmed. An application of this nature does not comply with the Supreme Court’s benchmark in Justice K.S. Puttaswamy v. UOI. According to this judgment, the right to privacy is a fundamental right that extends even to public places and wherever the individual is present. If this right is infringed, the government must demonstrate that its action is legal, proportionate to the need for interference, and pursues a legitimate aim. As for AFRS’ validity, the IT Act of 2000 categorizes biometric data as sensitive personal data and establishes procedures for its collection, disclosure, and exchange. However, these restrictions apply only to “body corporates” and not to the government’s use of biometric facial data.
The introduction of such a system is unethical because it proceeds without the prior consent of the individuals whose rights are being violated, and because the government has not even attempted any public discourse about the system’s advantages and disadvantages; this adds to skepticism in society. In the Aadhaar judgment, the Supreme Court observed that linking biometric verification to bank accounts would burden the entire population: even people who have committed no wrong would be treated as suspects. The court’s reasoning suggests that AFRS could likewise be misused by the government.
These technologies threaten our human rights.
These tools can identify, follow, single out, and monitor people everywhere they go, jeopardizing our civil liberties and human rights, such as the right to privacy, data protection, freedom of expression, and free assembly and association (leading to the criminalization of protest and causing a chilling effect). Many applications of facial and biometric classification suffer from fundamental flaws in their scientific underpinnings. This means that the inferences they make about us are often invalid, sometimes resting on racist theories of phrenology and physiognomy rooted in eugenics, perpetuating discrimination and adding a layer of harm as we are both surveilled and mischaracterized. The training data (i.e. databases of faces against which data is compared, and biometric data processed by these systems) are usually obtained without consent, meaning that these technologies encourage both mass and discriminatory targeted surveillance by design. Human rights and civil liberties will be undermined as long as people in publicly accessible locations can be instantly identified, singled out, or traced.
In public-private partnerships, private actors can effectively conduct surveillance on behalf of governments and public agencies, or give information obtained from such surveillance to the authorities. There are worrying reports of this, where private companies compile nationwide databases of “suspicious” individuals, with no regulation, accountability, or oversight.
Having read the provisions of this Bill, we urge comprehensive laws, a cessation of funding, and legal redressal mechanisms against facial and biometric technology usage. This should apply at all levels of government — national, state, provincial, municipal, and local — and especially in law enforcement and border control agencies. We call on policymakers and lawmakers across the globe to stop public investment, enact comprehensive laws, and work on reparations for individuals surveilled without consent. We call on courts and judicial officers to acknowledge the human rights threats and provide redressal mechanisms, and on international organizations, such as the UN-OHCHR, to condemn the usage of such technology and intervene in its development.
We call on private entities to publicly commit to not working with any such technology and to issue transparency reports on ongoing and previous contracts for its provision. We call on technology company workers to organize, with the support of their unions, against such technology, and on donor organizations to ensure funding for litigation, advocacy, and policy work done by NGOs and civil society organizations on facial and biometric technology usage. As the harms of abuse significantly outweigh any benefits, we demand a ban on facial and biometric technology usage in publicly accessible spaces. We ask civil society, activists, academics, and other stakeholders from across the globe to sign on to this letter and join this important fight for India’s privacy rights.