Across the UK, millions of people are now having their faces scanned by real-time facial-recognition systems, making the UK the only country in Europe to roll out this technology on such a large scale.
At London’s Notting Hill Carnival, where an estimated two million people are expected to celebrate Afro-Caribbean culture over Sunday and Monday, police have set up facial-recognition cameras near key entrances and exits.
Police officials say the technology is being deployed to identify and apprehend wanted suspects by scanning festival-goers’ faces and matching them against thousands of profiles stored in a police database.
The technology is “an effective policing tool which has already been successfully used to locate offenders at crime hotspots resulting in well over 1,000 arrests since the start of 2024,” said Metropolitan Police chief Mark Rowley.
Facial-recognition technology was first tested in the UK in 2016, but its deployment has expanded rapidly over the past three years.
According to the NGO Liberty, approximately 4.7 million faces were scanned in 2024 alone. Since late January, UK police have used live facial-recognition technology nearly 100 times — a sharp increase compared to just 10 deployments between 2016 and 2019.
Some recent examples include screenings conducted before two Six Nations rugby matches and outside two Oasis concerts in Cardiff this past July.
When an individual on the police “watchlist” approaches the cameras, the AI-powered system — often stationed inside a nearby police van — immediately sends out an alert. Officers then verify the person’s identity and can make an arrest on the spot.
However, the widespread use of this technology has sparked major criticism from privacy advocates. The organization Big Brother Watch warned that the mass data collection seen in London, including during the coronation of King Charles III in 2023, “treats us like a nation of suspects.”
“There is no legislative basis, so we have no safeguards to protect our rights, and the police is left to write its own rules,” said Rebecca Vincent, interim director of Big Brother Watch, in an interview with AFP.
Vincent also raised concerns about its private use by supermarkets and clothing retailers, where “very little information” is provided about how customers’ data is collected and stored.
Many stores rely on Facewatch, a service provider that maintains a list of suspected offenders and sends alerts when any of them enter a monitored location.
“It transforms what it is to live in a city, because it removes the possibility of living anonymously,” said Daragh Murray, a lecturer in human rights law at Queen Mary University of London.
“That can have really big implications for protests but also participation in political and cultural life,” he added.
Shoppers are often unaware that they are being scanned and profiled when entering these stores.
“They should make people aware of it,” said Abigail Bevon, a 26-year-old forensic scientist, who spoke to AFP outside a London store using Facewatch. She admitted she was “very surprised” to learn how extensively the technology was being deployed.
While acknowledging its usefulness in certain policing situations, she criticized its use by retailers, calling it “invasive.”
Since February, new EU regulations on artificial intelligence have banned real-time facial-recognition technology, except in exceptional cases such as counterterrorism.
Aside from a few isolated cases in the United States, “we do not see anything even close in European countries or other democracies,” Vincent emphasized.
“The use of such invasive tech is more akin to what we see in authoritarian states such as China,” she added.
UK Home Secretary Yvette Cooper recently announced that a “legal framework” will be drafted to regulate its use, focusing on investigations involving “the most serious crimes.”
However, despite these assurances, her ministry authorized police this month to deploy facial-recognition systems in seven additional regions.
Currently, the cameras are typically mounted inside mobile police vans, but for the first time, permanent installations are set to be introduced in Croydon, south London, next month.
Police have insisted that “robust safeguards” are in place, including switching off cameras when officers are absent and deleting the biometric data of individuals who are not suspects.
Despite these claims, the UK’s human rights watchdog ruled on Wednesday that the Metropolitan Police’s policy on facial recognition was “unlawful” and “incompatible” with existing rights protections.
In addition, eleven advocacy groups — including Human Rights Watch — sent a letter to the Metropolitan Police chief urging him not to use the technology during Notting Hill Carnival, accusing the department of “unfairly targeting” the Afro-Caribbean community and highlighting the racial biases built into AI-driven systems.
{Matzav.com}