Police body cameras equipped with artificial intelligence have been trained to detect the faces of about 7,000 people on a “high risk” watch list in the Canadian city of Edmonton, a live test of whether facial recognition technology shunned as too intrusive could have a place in policing throughout North America.

But six years after leading body camera maker Axon Enterprise, Inc. said police use of facial recognition technology posed serious ethical concerns, the pilot project — switched on last week — is raising alarms far beyond Edmonton, the continent’s northernmost city of more than 1 million people.

“It’s essential not to use these technologies, which have very real costs and risks, unless there’s some clear indication of the benefits,” said Barry Friedman, the former chair of Axon’s AI ethics board and now a law professor at New York University.

Axon founder and CEO Rick Smith contends that the Edmonton pilot is not a product launch but “early-stage field research” that will assess how the technology performs and reveal the safeguards needed to use it responsibly.

“By testing in real-world conditions outside the U.S., we can gather independent insights, strengthen oversight frameworks, and apply those learnings to future evaluations, including within the United States,” Smith wrote in a blog post.

The pilot is meant to help make Edmonton patrol officers safer by enabling their body-worn cameras to detect anyone whom authorities have classified with a “flag or caution” in categories such as “violent or assaultive; armed and dangerous; weapons; escape risk; and high-risk offender,” said Kurt Martin, acting superintendent of the Edmonton Police Service.

So far, that watch list has 6,341 people on it, Martin said at a Dec. 2 press conference. A separate watch list adds 724 people who have at least one serious criminal warrant, he said.

“We really want to make sure that it’s targeted so that these are folks with serious offenses,” said Ann-Li Cooke, Axon’s director of responsible AI.

Motorola said in a statement that it also has the ability to integrate facial recognition technology into police body cameras but, based on its ethical principles, has “intentionally abstained from deploying this feature for proactive identification.” It didn’t rule out using it in the future.
Among the biggest concerns were studies showing that the technology was flawed, producing biased results by race, gender and age. It also didn’t match faces as accurately on real-time video feeds as it did on faces posed for identification cards or police mug shots.

Several U.S. states and dozens of cities have sought to curtail police use of facial recognition, though President Donald Trump’s administration is now trying to block or discourage states from regulating AI. The European Union banned real-time public face-scanning police technology across the 27-nation bloc, except when used for serious crimes like kidnapping or terrorism.
But in the United Kingdom, no longer part of the EU, authorities started testing the technology on London streets a decade ago and have used it to make 1,300 arrests in the past two years. The government is considering expanding its use across the country.

Many details about Edmonton’s pilot haven’t been publicly disclosed. Axon doesn’t make its own AI model for recognizing faces but declined to say which third-party vendor it uses.
Edmonton police say the pilot will continue through the end of December and only during daylight hours.

“Obviously it gets dark pretty early here,” Martin said. “Lighting conditions, our cold temperatures during the wintertime, all those things will factor into what we’re looking at in terms of a successful proof of concept.”

Martin said the roughly 50 officers piloting the technology won’t know if the facial recognition software made a match.
The outputs will be analyzed later at the station. In the future, however, the technology could help police detect whether a potentially dangerous person is nearby so they can call for assistance, Martin said. That is only supposed to happen when officers have started an investigation or are responding to a call, not while simply strolling through a crowd.

Martin said officers responding to a call can switch their cameras from a passive to an active recording mode with higher-resolution imaging.
“We really want to respect individuals’ rights and their privacy interests,” Martin said.

University of Alberta criminology professor Temitope Oriola said he’s not surprised that the city is experimenting with live facial recognition, given that the technology is already ubiquitous in airport security and other environments.

“Edmonton is a laboratory for this tool,” Oriola said.
“It may well turn out to be an improvement, but we do not know that for sure.”

Axon has faced blowback over its technology deployments before, as in 2022, when Friedman and seven other members of Axon’s AI ethics board resigned in protest over concerns about a Taser-equipped drone.

But Axon acknowledged in a statement to the AP that all facial recognition systems are affected by “factors like distance, lighting and angle, which can disproportionately impact accuracy for darker-skinned individuals.”

Every match requires human review, Axon said, and part of its testing is also “learning what training and oversight human reviewers must have to mitigate known risks.”

Friedman said Axon should disclose those evaluations. He’d want to see more evidence that facial recognition has improved since his board concluded it wasn’t reliable enough to ethically justify its use in police cameras. Friedman said he’s also concerned about police agencies greenlighting the technology’s use without deliberation by local legislators and rigorous scientific testing.
“It’s not a decision to be made simply by police agencies and certainly not by vendors,” he said. “A pilot is a great idea. But there’s supposed to be transparency, accountability. ... None of that’s here. They’re just going ahead. They found an agency willing to go ahead and they’re just going ahead.”

___

AP writer Kelvin Chan in London contributed to this report.