Policing Via Facial Recognition Technology May Affect Vulnerable Groups, Says Report
Delhi has 1,826 cameras per square mile; London has 1,138. Delhi Chief Minister Arvind Kejriwal proudly tweeted these numbers last month, claiming that in terms of CCTV camera density, the capital city beats Shanghai, New York and London.
Viewed through the lens of privacy, surveillance capability and bias, this growing presence of closed-circuit television cameras is little to be proud of. Especially since there's little hard data to back the argument that the contribution of this technology to law enforcement outweighs the concerns.
Anushka Jain of Internet Freedom Foundation recalls that the Delhi Police had itself told the high court that the accuracy rate of facial recognition technology being used to find missing children is 1-2%.
Absent any stellar success, it's hard to justify such large-scale deployment of facial recognition technology via CCTVs. Even more so when it could potentially be used to target a particular community.
That is what the Vidhi Centre for Legal Policy has found in its recent report on policing using facial recognition technology. The report notes that in Delhi, a significant number of over-policed areas have a higher average Muslim population than the average for the entire city.
This means the software has a higher probability of being used against Muslims, which would make an innocent person from that group more vulnerable to the technology's errors, explained Jai Vipra, the author of the report.
Tech activists pointed to three key areas of concern around facial recognition technology:
Lack of a regulatory framework governing its use.
The error rate and biases against certain groups.
Violation of the right to privacy and its surveillance capability.
In 2017, the Supreme Court of India declared the right to privacy a fundamental right guaranteed under the Constitution. There must be a law in existence to justify an encroachment on privacy, the apex court laid down in its judgment.
But it's unclear which specific law authorises the deployment of facial recognition technology.
Currently there is a lack of clear answers to questions such as: under which law has this technology been deployed; what are the safeguards against it; and what remedies are available to those who may suffer its negative consequences.
- Raman Chima, Policy Director, Access Now
The personal data protection law, which has been a work-in-progress for five years now, offers little hope.
Chima believes the last publicly available draft of the proposed Personal Data Protection Bill does not adequately address the concerns around surveillance.
'It does not have effective oversight mechanism and there are too many exceptions for law enforcement agencies', Chima said.
To be clear, deployment of facial recognition technology isn't just a Delhi problem. Chennai, Punjab and Mumbai are right up there, keeping the capital city company on CCTVs.
In July 2020, digital news portal Medianama reported that the National Crime Records Bureau had proposed a national-level Automated Facial Recognition System (AFRS). The programme is reportedly expected to be a national-level searchable platform of facial images.
Subsequently, it was clarified that the deployment of AFRS would not include CCTV data.
The Vidhi report says this does not mean the use of CCTV is completely out of the picture, given the absence of any prohibition under law.
In fact, given the lack of an overarching law for facial recognition technology, nothing prevents state police from using CCTV data even today, and indeed they often do use it.
- Report by Vidhi Centre for Legal Policy
The Error Rate
The inaccuracy of facial recognition technology is the other big concern, in India and globally.
For instance, the American Civil Liberties Union, in a 2018 study, created a database of publicly available arrest photos. The organisation then ran the photos of all members of the United States Congress, both House and Senate, against this database using Amazon's facial recognition software.
The software incorrectly matched 28 members of the U.S. Congress to the arrest-photo database, identifying them as people who had been arrested.
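To see why even a seemingly small error rate matters at the scale of city-wide CCTV, here is a back-of-the-envelope calculation. All the numbers below are hypothetical, chosen only to illustrate the arithmetic; they are not from the ACLU study or the Delhi deployment.

```python
# Illustrative arithmetic: why a small false-positive rate produces
# many wrong matches when faces are scanned at scale.
# All numbers are hypothetical, for illustration only.

people_scanned = 100_000      # faces captured by CCTV in a day (assumed)
false_positive_rate = 0.01    # 1% of innocent people wrongly matched (assumed)
actual_suspects = 10          # people genuinely on the watchlist (assumed)

# Innocent people wrongly flagged by the system
false_alarms = people_scanned * false_positive_rate
print(int(false_alarms))      # 1000

# Even a "99% accurate" system flags far more innocents than suspects
print(int(false_alarms) // actual_suspects)   # 100 false alarms per real suspect
```

The point of the sketch: because the scanned population is vastly larger than the watchlist, the absolute number of false matches dwarfs the number of genuine ones, so the burden of errors falls overwhelmingly on innocent people, and disproportionately on whichever groups are scanned most often.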
The concerns around facial recognition technology include the margin of error, its surveillance capability and the regulatory vacuum in which it operates. Apart from these, a major concern is the cheap price at which this technology can be mass deployed.
- Raman Chima, Policy Director, Access Now
In the ACLU's case, it was just around $12.33, less than the price of a large pizza.
The non-monetary cost, on the other hand, is quite alarming.
The policing system in India continues to disproportionately target certain vulnerable groups, says Vidhi's report.
For instance, it says, just as sex workers were targeted during colonial times, today it's "bar dancers".
Contemporary policing in India also continues to suffer from casteist and patriarchal notions and practices. The facial recognition software itself can have errors due to a biased training dataset that may over- or under-represent some groups.
- Report by Vidhi Centre for Legal Policy
Discrimination on the basis of profession, caste, gender... and religion as well.
The use of facial recognition technology in policing in Delhi will almost inevitably disproportionately affect Muslims, particularly those living in over-policed areas like Old Delhi or Nizamuddin, points out Vidhi's report.
Simultaneously, it clarifies that the idea is not to prove that the placement of police stations in Delhi is intentionally designed to over-police Muslim areas.
However, given the fact that Muslims are represented more than the city average in the over-policed areas, and recognising historical systemic biases in policing Muslim communities in India in general and in Delhi in particular, we can reasonably state that any technological intervention that intensifies policing in Delhi will also aggravate this bias.
- Report by Vidhi Centre for Legal Policy
The think tank has urged the authorities to halt the use of facial recognition technology in policing and carry out public consultations for more egalitarian solutions.
(Corrects an earlier version that misinterpreted Anushka Jain's comment on the accuracy rate of facial technology to trace missing children.)