Guilty Until Proven Innocent – Facial Recognition’s False Accusations

New Delhi [India], February 15: In this opinion piece, Shekhar Natarajan, Founder and CEO of Orchestro.AI, examines how the misuse of facial recognition software shapes prosecutions and destroys lives.
Umar Khalid has spent more than five years in prison. His trial has not yet meaningfully begun.
He was arrested in September 2020 under the Unlawful Activities (Prevention) Act — India’s harshest anti-terror law — accused of conspiring to incite the communal violence that swept parts of Delhi in February 2020. The riots left 53 people dead, most of them Muslims, and took place amid massive protests against a controversial citizenship law.
The evidence against him? Speeches he gave at peaceful protests. WhatsApp group chats. And facial recognition matches that placed him — or someone who looked like him — at various locations.
On January 5, 2026, India’s Supreme Court denied Khalid bail, ruling that he played a “central and formative role” in the alleged conspiracy. Five other accused in the same case were granted bail, having spent years in jail without trial. But Khalid and fellow activist Sharjeel Imam were told they could reapply after one year.
One more year. After five already served.
“We can be kept in jail for years, without those framing us needing to prove anything,” Khalid wrote from Tihar Jail on completing two years of detention. “This is the power of UAPA.”
The 2% Problem
The technology that helped identify Khalid and hundreds of others has a documented accuracy problem that should disqualify it from any serious evidentiary role.
In 2018, Delhi Police testified to the Delhi High Court that their facial recognition system had an accuracy rate of just 2% when trying to trace missing children. The system was so poor that officials admitted it often couldn’t distinguish between boys and girls.
Two percent.
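To see why a match alone is such weak evidence, consider the base-rate arithmetic. The sketch below uses hypothetical numbers, not Delhi Police figures: even a system vastly better than 2% accurate, scanning thousands of faces in search of a few dozen genuine suspects, flags mostly innocent people.

```python
# Base-rate sketch: why a facial-recognition "match" is weak evidence.
# All numbers are hypothetical illustrations, not Delhi Police figures.

def expected_matches(population, true_suspects, tpr, fpr):
    """Expected true and false matches when a system scans a crowd.

    tpr: true-positive rate (chance a real suspect is flagged)
    fpr: false-positive rate (chance an innocent person is flagged)
    """
    innocents = population - true_suspects
    true_hits = true_suspects * tpr   # suspects correctly flagged
    false_hits = innocents * fpr      # innocents wrongly flagged
    return true_hits, false_hits

# Even a system with a 99% true-positive rate and a 1% false-positive
# rate, scanning footage of 10,000 people for 50 actual offenders,
# flags roughly twice as many innocents as offenders:
true_hits, false_hits = expected_matches(10_000, 50, tpr=0.99, fpr=0.01)
print(f"true matches:  {true_hits:.0f}")    # ~50
print(f"false matches: {false_hits:.0f}")   # ~100
print(f"flagged people who are innocent: "
      f"{false_hits / (true_hits + false_hits):.0%}")  # ~67%
```

At the 2% accuracy Delhi Police themselves reported, the output is closer to noise than evidence.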
Yet this same technology was deployed aggressively after the 2020 riots. Union Home Minister Amit Shah told Parliament that 1,922 perpetrators — comprising 73.3% of those arrested — had been identified through facial recognition technology.
The results have been devastating for the accused — and embarrassing for the prosecution. More than 80% of riot cases heard so far have resulted in acquittals or discharges. The facial recognition matches that seemed so definitive have crumbled under scrutiny. Witnesses have turned hostile. Evidence has fallen apart.
But the years in jail cannot be given back.
The Investigation
An investigation by The Wire and the Pulitzer Center found that many riot accused were arrested “solely on the basis of facial recognition,” without solid corroborating evidence or credible public witness accounts.
Mohammad Shahid: Spent 17 months in jail before formal charges were even filed.
Ali: Arrested in March 2020 “solely on the basis of facial recognition,” and still in pre-trial detention more than four and a half years later.
Gulfisha Fatima, Meeran Haider, Shifa-Ur-Rehman, and others: Activists who spent years in jail, finally granted bail in January 2026 by the Supreme Court after lower courts had repeatedly denied them bail.
The pattern of arrests reveals something beyond technology failure. Of the 18 activists charged under anti-terrorism laws in connection with the riots, 16 were Muslim. The police framed peaceful protests against a discriminatory citizenship law as a “larger conspiracy” to incite violence.
Meanwhile, video evidence exists of police forcing five injured Muslim men to sing the national anthem while they lay bleeding on the ground. One of them, Faizan, 23, died two days later. No prosecution has resulted.
Kapil Mishra, the BJP leader recorded making speeches that many believe incited the violence, is now an elected official serving as a cabinet minister.
The Surveillance State
India has spent Rs 9.6 billion on facial recognition technology. The National Automated Facial Recognition System (NAFRS) is being built for nationwide deployment — a system that will be able to identify any face captured on any camera across the country.
No privacy impact assessment was conducted before the Delhi Police deployed their system. No audit of accuracy. No oversight mechanism.
“If you are a Dalit woman in India,” says Vidushi Marda of Article 19, “the nature and extent to which you are under surveillance are far more than [for] an upper-caste Hindu man. There is a disproportionate impact on communities that have been historically marginalized.”
The Internet Freedom Foundation has called for a three-year moratorium on biometric facial recognition systems. Research has consistently shown that such systems perform worse on darker-skinned faces, on women, and on minority populations — precisely the groups most likely to be subjected to surveillance.
Criminal databases in India disproportionately include Muslims, Dalits, and Indigenous people — the legacy of colonial-era “criminal tribes” designations and ongoing discriminatory policing. When facial recognition systems are trained on these databases, they inherit and amplify those biases.
“Policing has always been casteist in India,” says Nikita Sonavane of the Criminal Justice and Police Accountability Project. “And data has been used to entrench caste-based hierarchies. Any new AI-based predictive policing system will likely only perpetuate the legacies of caste discrimination.”
The Legal Trap
The Unlawful Activities (Prevention) Act has become the weapon of choice for silencing dissent. Under UAPA, the normal presumption of innocence is effectively reversed. Courts are required only to see whether allegations appear “prima facie true” — not whether they are proven beyond doubt.
Bail becomes extraordinarily difficult. The accused can be held in pre-trial detention almost indefinitely. The process itself becomes the punishment.
The Financial Action Task Force noted in 2024 that delays in UAPA prosecutions are “resulting in a high number of pending cases and accused persons in judicial custody waiting for cases to be tried and concluded.”
UAPA’s conviction rate is just 2.2%, according to National Crime Records Bureau data. The vast majority of those arrested are eventually acquitted — but often only after years in prison.
In the Delhi riots case, the prosecution’s “larger conspiracy” theory has faced consistent criticism. Defense lawyers argue there is no direct evidence linking the accused to acts of violence, no recovery of weapons, and much of the case rests on hearsay, selective witness accounts, and interpretation of speeches and chats.
“Chakka jams and other forms of non-violent agitation are part of India’s democratic lexicon,” Senior Advocate Kapil Sibal argued before the Supreme Court. They “cannot be elevated to UAPA-level offences merely because they make authorities uncomfortable.”
The Architecture of Presumption
Shekhar Natarajan calls facial recognition, as it is deployed in India, “the architecture of presumed guilt.”
“The system begins with a match and works backward,” he explains. “It does not ask: What was this person doing? Were they a participant or a bystander? Were they there at all, or did the algorithm make an error? It cannot ask these questions. It only sees faces — and it sees them imperfectly.”
In Angelic Intelligence, the architecture would force different questions:
An agent embodying nyaya (justice) would require corroborating evidence before any action with life-altering consequences. A facial match alone — especially from a system with documented 2% accuracy — would never be sufficient.
An agent embodying satya (truth) would flag the technology’s known limitations. It would require disclosure of accuracy rates, training data biases, and error margins. It would not allow a 2% accurate system to present itself as definitive.
An agent embodying sahana (patience) would demand pause before irreversible actions. Arrest, detention, the destruction of a person’s life and reputation — these require certainty that current systems cannot provide.
And an agent embodying sama (equanimity) would check for disparate impact. It would ask: Are certain communities being targeted more than others? Is the system’s deployment fair across populations?
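What might those four checks look like in practice? The sketch below is purely illustrative, assuming hypothetical names, fields, and thresholds; it shows the shape of the guardrails Natarajan describes, not any published Orchestro.AI implementation.

```python
# Illustrative sketch only: the checks below paraphrase the four
# principles named above. Class names, fields, and thresholds are
# hypothetical, not a published Orchestro.AI design.
from dataclasses import dataclass

@dataclass
class MatchReport:
    system_accuracy: float       # audited accuracy of the deployed system
    corroboration: list          # independent evidence beyond the match
    irreversible_action: bool    # e.g., arrest or detention
    group_flag_rate: float       # flag rate for the subject's community
    population_flag_rate: float  # flag rate across the whole population

def review(m: MatchReport) -> list:
    """Return the objections an 'angelic' agent would raise."""
    objections = []
    if not m.corroboration:  # nyaya: a match alone is never sufficient
        objections.append("nyaya: no corroborating evidence")
    if m.system_accuracy < 0.95:  # satya: disclose known limitations
        objections.append(
            f"satya: audited accuracy is {m.system_accuracy:.0%}; "
            "the match must not be presented as definitive")
    if m.irreversible_action:  # sahana: pause before irreversible harm
        objections.append("sahana: irreversible action requires "
                          "independent human review")
    if m.group_flag_rate > 2 * m.population_flag_rate:  # sama: fairness
        objections.append("sama: disparate impact on subject's community")
    return objections

# An uncorroborated match from a 2%-accurate system, used to justify
# arrest, drawn disproportionately from one community, fails every check:
print(review(MatchReport(0.02, [], True, 0.12, 0.03)))
```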
The current system has no sahana. It has only efficiency, measured in arrests made, cases filed, conspiracy theories constructed.
Umar Khalid remains in jail. The algorithm made matches. The courts found the matches sufficient for continued detention. But the algorithm cannot be cross-examined. The algorithm cannot be held accountable. The algorithm cannot give back five years of a young man’s life.