Police Acknowledge AI Surveillance Panopticon Still Faces Challenges with "Certain Demographic Groups"
- BUSINESS
UK officials have announced plans to introduce a nationwide facial recognition network aimed at aiding law enforcement in identifying criminals. A 10-week public consultation has been launched to review the legal and privacy implications of the AI surveillance system, signaling that the deployment is imminent.
However, the system has shown significant shortcomings. Investigations by the National Physical Laboratory (NPL) revealed that the AI is prone to misidentifying individuals from certain demographic groups, particularly Black and Asian populations. The findings highlight an uneven accuracy rate across racial groups.
Police ministers have praised facial recognition technology as a transformative tool for law enforcement, comparing it to the revolutionary impact of DNA testing. Yet, internal reports indicate concern over built-in biases. According to the NPL, the retrospective facial recognition tool used by the national police shows a false positive rate of 0.04% for white individuals, 4.0% for Asian individuals, and 5.5% for Black individuals.
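To put those percentages in concrete terms, the following sketch computes the expected number of false matches each rate would produce over a hypothetical volume of searches. The scan count is an illustrative assumption, not a figure from the NPL report; only the rates come from the reporting above.

```python
# NPL-reported false positive rates for the retrospective facial
# recognition tool, expressed as fractions.
rates = {"White": 0.0004, "Asian": 0.040, "Black": 0.055}

scans = 10_000  # hypothetical number of searches per group (assumption)

# Expected false matches = searches x false positive rate.
for group, rate in rates.items():
    expected = scans * rate
    print(f"{group}: {expected:.0f} expected false matches per {scans:,} searches")
```

Under that assumed volume, the same number of searches yields roughly 4 false matches for white individuals but 400 and 550 for Asian and Black individuals respectively, which is the disparity the commissioners' warning refers to.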
The Association of Police and Crime Commissioners warned that these statistics suggest the technology is being used operationally without adequate safeguards, increasing the risk of misidentifying minority groups.
The UK already has one of the densest surveillance networks in the world, with London alone hosting roughly 1,552 cameras per square mile. Plans to expand the use of AI facial recognition include additional mobile police vans equipped with rooftop cameras, integrated with national watchlists. The government is seeking public input on whether these systems should have access to other official databases, including passport and driving licence records.
Despite public consultations, authorities appear committed to rapidly scaling up the technology. The proposed system could eventually maintain a database containing millions of images of ordinary citizens.
Advocacy groups have criticized the rollout, pointing out that racial bias in AI facial recognition carries real-world consequences. They urge the government to pause expansion until safeguards are fully implemented to protect civil rights.
Author: Gavin Porter