The future is rapidly approaching Croydon. While it may not seem to be at the forefront of innovation in Britain, North End—a car-free high street teeming with familiar establishments like pawn shops, fast-food chains, and retail brands—is set to become one of the two locations for the UK’s inaugural installation of fixed facial recognition cameras.
These cameras will silently capture digital images of individuals as they walk by, processing them to gather measurements of facial characteristics, known as biometric data. This data will be immediately analyzed by artificial intelligence and matched against a designated watchlist, prompting alerts for any matches that may lead to arrests.
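The matching step works, in outline, by reducing each captured face to a numeric vector and comparing it against stored vectors for people on the watchlist. The sketch below is a minimal illustration of that idea only, not a depiction of any police system: the four-dimensional embeddings, the watchlist names, and the 0.95 threshold are all invented for the example, and real systems use high-dimensional templates produced by trained neural networks.

```python
# Minimal sketch of watchlist matching via cosine similarity.
# All data here is invented for illustration; real biometric
# templates are high-dimensional and model-generated.
from math import sqrt

def cosine_similarity(a, b):
    """Cosine of the angle between two embedding vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b))
    return dot / norm

# Hypothetical watchlist: identifier -> stored face embedding.
WATCHLIST = {
    "subject_a": [0.9, 0.1, 0.3, 0.2],
    "subject_b": [0.1, 0.8, 0.5, 0.1],
}

def check_face(embedding, threshold=0.95):
    """Return the best watchlist match at or above the threshold, else None."""
    best_name, best_score = None, 0.0
    for name, stored in WATCHLIST.items():
        score = cosine_similarity(embedding, stored)
        if score > best_score:
            best_name, best_score = name, score
    return best_name if best_score >= threshold else None
```

The threshold is the operational lever: raising it cuts false alerts on passers-by but also lets genuine matches through unflagged, while lowering it does the reverse.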
As outlined in the latest violence reduction strategy for this south London borough, North End and adjacent thoroughfares are identified as the “primary crime hotspot.” They are not, however, among the most dangerous streets in the capital.
In fact, Croydon’s crime rate is only the 20th highest of London’s 32 boroughs, the City of London aside. The decision to trial fixed cameras there later this summer is less an emergency response than a sign of how routine the technology is becoming; North End and the nearby London Road could be virtually anywhere.
When questioned about the surveillance measures, the majority of shopkeepers and shoppers interviewed at North End were unaware of the police’s plans, much less the technology being employed.
For some, the cameras are simply viewed as another piece of urban furniture, alongside signs promoting 24-hour CCTV and advocating for safe cycling. This, according to certain observers, warrants concern. Others refer to surveys indicating that a public worn down by escalating crime largely supports the initiative.
Facial recognition technology trials commenced in England and Wales in 2016. However, documents released under the Freedom of Information Act (FoI) and analyses conducted by organizations like Liberty Investigates have unveiled a significant uptick in its deployment over the last year. Once considered a specialized tool, it is gradually being integrated into mainstream police operations.
Last year, police agencies scanned nearly 4.7 million faces using live facial recognition cameras, more than double the number from 2023. Live facial recognition vans operated on at least 256 occasions in 2024, a sharp rise from 63 the previous year.
Law enforcement agencies are poised to roll out a mobile unit of ten live facial recognition vans that can be dispatched anywhere across the country.
In parallel, governmental officials are collaborating with police to create a new national facial recognition system called the strategic facial matcher. This platform will enable searching across various databases, including custody images and immigration files.
One funding document prepared by South Wales police and submitted to the Home Office indicates, “The use of this technology could become a regular feature in our urban centers and transport hubs throughout England and Wales.”
The technology’s implications divide opinion. Critics liken its use to stopping people at random in public to check their fingerprints, and envision a bleak future in which live facial recognition cameras augment the extensive existing CCTV network. Proponents acknowledge the risks, yet argue that the positive outcomes outweigh them.
This week, David Cheneler, a 73-year-old registered sex offender from Lewisham, south London, received a two-year prison sentence for violating probation terms after having previously served nine years for 21 offenses.
His identification came from a live facial recognition camera mounted on a police van, which alerted officers that he was walking alone with a six-year-old child.
“He was on [the watchlist] due to specific conditions he had to follow,” stated Lindsey Chiswick, director of intelligence at the Met and the National Police Chiefs’ Council lead on facial recognition. “One requirement was to avoid interacting with anyone under the age of 14.”
Over the course of a year, he had cultivated a relationship with the child’s mother and had begun picking the girl up from school. He had a knife in his belt when he was stopped, raising concerns about what might have occurred had he not been intercepted that day. For police, the case illustrates how the technology enables timely intervention.
However, many are apprehensive about potential unintended consequences as law enforcement embraces this technology even as Parliament has yet to establish regulations governing its use.
Madeline Stone from the NGO Big Brother Watch, which monitors the mobile camera deployments, noted instances where children in school uniforms were incorrectly identified, resulting in “lengthy, humiliating, and aggressive police stops” where they were compelled to prove their identities and submit fingerprints.
In two cases, the individuals involved were young black boys who were left frightened and distressed, she recounted.
“And the reality is that the higher the accuracy thresholds, the less effective the cameras are at catching offenders,” Stone remarked. “Police may not want to deploy the technology at those settings, and there’s no legal obligation for them to do so. The notion that police can unilaterally determine how to employ this technology is genuinely troubling.”
A judicial review has been initiated by Shaun Thompson from London, with backing from Big Brother Watch, concerning the Met’s use of the cameras after he was mistakenly identified as a person of interest and detained for 30 minutes while returning from volunteering with Street Fathers, a group campaigning against knife crime.
Dr. Daragh Murray, who was commissioned by the Met in 2019 to conduct an independent study of the technology’s trials, warned of the potential “chilling” effect on society. He emphasized that insufficient consideration has been given to how the cameras could influence public behavior.
“It’s akin to having a police officer shadow you, meticulously recording your activities, acquaintances, and the duration of your interactions,” he explained. “Most people would find such scrutiny uncomfortable. Furthermore, democracy thrives on dissent and debate; if surveillance stifles these elements, it could entrench existing power structures and limit future opportunities.”
Live facial recognition cameras have been used to apprehend individuals for offenses including traffic violations, cannabis cultivation, and noncompliance with community orders, raising the question of whether this amounts to a proportionate response.
Fraser Sampson, the former biometrics and surveillance camera commissioner for England and Wales, now serves as a non-executive director at Facewatch, a leading company offering facial recognition technology for retail security to prevent shoplifting.
While he acknowledges the value of the technology, he raises concerns about the adequacy of regulatory frameworks and independent oversight, which seem to lag behind the speed of its adoption by law enforcement.
Sampson noted, “There exists a wealth of information available for those seeking clarity on the technology. However, crucial questions remain unaddressed, such as when, where, and how it can be employed, by whom, and for what duration. What procedures are in place for challenging its use or lodging complaints? What occurs if it fails to deliver expected results?”
Chiswick expressed an understanding of these apprehensions and acknowledged the need for formal guidelines. She indicated that the Met was taking “incremental steps” that were scrutinized at each phase. With limited resources, law enforcement must adapt and “leverage” the possibilities presented by artificial intelligence. They are cognizant of the potential “chilling effect” on society and how it could alter behavior, emphasizing that the cameras are not utilized at protests.
“Will this become commonplace? I can’t say,” Chiswick remarked. “We must approach that assertion with caution. I can envision various scenarios, like the West End. I can certainly see that as a viable location, as opposed to the static trial we’re currently conducting in Croydon. Different circumstances might warrant alternative approaches, but that doesn’t guarantee implementation.”
She concluded, “I believe we will see a greater reliance on technology, data, and AI in the coming years, which is essential for enhancing our capabilities. Nevertheless, we need to proceed thoughtfully.”