Police body cameras equipped with artificial intelligence have been trained to detect the faces of about 7,000 people on a “high risk” watch list in the Canadian city of Edmonton, a live test of whether facial recognition technology shunned as too intrusive could have a place in policing throughout North America.
But six years after leading body camera maker Axon Enterprise, Inc. said police use of facial recognition technology posed serious ethical concerns, the pilot project — switched on last week — is raising alarms far beyond Edmonton, the continent’s northernmost city of more than 1 million people.
A former chair of Axon’s AI ethics board, which led the company to temporarily abandon facial recognition in 2019, told The Associated Press he’s concerned that the Arizona-based company is moving forward without enough public debate, testing and expert vetting about the societal risks and privacy implications.
“It’s essential not to use these technologies, which have very real costs and risks, unless there’s some clear indication of the benefits,” said the former board chair, Barry Friedman, now a law professor at New York University.
Axon founder and CEO Rick Smith contends that the Edmonton pilot is not a product launch but “early-stage field research” that will assess how the technology performs and reveal the safeguards needed to use it responsibly.
“By testing in real-world conditions outside the U.S., we can gather independent insights, strengthen oversight frameworks, and apply those learnings to future evaluations, including within the United States,” Smith wrote in a blog post.
The pilot is meant to help make Edmonton patrol officers safer by enabling their body-worn cameras to detect anyone who authorities classified as having a “flag or caution” for categories such as “violent or assaultive; armed and dangerous; weapons; escape risk; and high-risk offender,” said Kurt Martin, acting superintendent of the Edmonton Police Service. So far, that watch list has 6,341 people on it, Martin said at a Dec. 2 press conference. A separate watch list adds 724 people who have at least one serious criminal warrant, he said.
“We really want to make sure that it’s targeted so that these are folks with serious offenses,” said Ann-Li Cooke, Axon’s director of responsible AI.
If the pilot expands, it could have a major effect on policing around the world. Axon, a publicly traded firm best known for developing the Taser, is the dominant U.S. supplier of body cameras and has increasingly pitched them to police agencies in Canada and elsewhere. Axon last year beat its closest competitor, Chicago-based Motorola Solutions, in a bid to sell body cameras to the Royal Canadian Mounted Police.
Motorola said in a statement that it also has the ability to integrate facial recognition technology into police body cameras but, based on its ethical principles, has “intentionally abstained from deploying this feature for proactive identification.” It didn’t rule out using it in the future.
The government of Alberta in 2023 mandated body cameras for all police agencies in the province, including its capital city Edmonton, describing it as a transparency measure to document police interactions, collect better evidence and reduce timelines for resolving investigations and complaints.
While many communities in the U.S. have also welcomed body cameras as an accountability tool, the prospect of real-time facial recognition identifying people in public places has been unpopular across the political spectrum. Backlash from civil liberties advocates and a broader conversation about racial injustice helped push Axon and Big Tech companies to pause facial recognition software sales to police.
Among the biggest concerns were studies showing that the technology was flawed, producing biased results by race, gender and age. It also didn’t match faces as accurately on real-time video feeds as it did with posed photos on identification cards or police mug shots.
Several U.S. states and dozens of cities have sought to curtail police use of facial recognition, though President Donald Trump's administration is now trying to block or discourage states from regulating AI.
The European Union banned real-time public face-scanning police technology across the 27-nation bloc, except when used for serious crimes like kidnapping or terrorism.
But in the United Kingdom, no longer part of the EU, authorities started testing the technology on London streets a decade ago and have used it to make 1,300 arrests in the past two years. The government is considering expanding its use across the country.
Many details about Edmonton's pilot haven't been publicly disclosed. Axon doesn't make its own AI model for recognizing faces but declined to say which third-party vendor it uses.
Edmonton police say the pilot will continue through the end of December and only during daylight hours.
“Obviously it gets dark pretty early here,” Martin said. “Lighting conditions, our cold temperatures during the wintertime, all those things will factor into what we’re looking at in terms of a successful proof of concept.”
Martin said about 50 officers piloting the technology won't know if the facial recognition software made a match. The outputs will be analyzed later at the station. In the future, however, it could help police detect if there's a potentially dangerous person nearby so they can call in for assistance, Martin said.
That's only supposed to happen if officers have started an investigation or are responding to a call, not simply while strolling through a crowd. Martin said officers responding to a call can switch their cameras from a passive to an active recording mode with higher-resolution imaging.
“We really want to respect individuals’ rights and their privacy interests,” Martin said.
The office of Alberta’s information and privacy commissioner, Diane McLeod, said it received a privacy impact assessment from Edmonton police on Dec. 2, the same day Axon and police officials announced the program. The office said Friday it’s now working to review the assessment, a requirement for projects that collect “high sensitivity” personal data.
University of Alberta criminology professor Temitope Oriola said he's not surprised that the city is experimenting with live facial recognition, given that the technology is already ubiquitous in airport security and other environments.
“Edmonton is a laboratory for this tool,” Oriola said. “It may well turn out to be an improvement, but we do not know that for sure.”
Oriola said the police service has had a sometimes “frosty” relationship with its Indigenous and Black residents, particularly after the fatal police shooting of a member of the South Sudanese community last year, and it remains to be seen whether facial recognition technology makes policing safer or improves interactions with the public.
Axon has faced blowback for its technology deployments in the past, as in 2022, when Friedman and seven other members of Axon's AI ethics board resigned in protest over concerns about a Taser-equipped drone.
In the years since Axon opted against facial recognition, Smith, the CEO, says the company has “continued controlled, lab-based research” of a technology that has “become significantly more accurate” and is now ready for trial in the real world.
But Axon acknowledged in a statement to the AP that all facial recognition systems are affected by “factors like distance, lighting and angle, which can disproportionately impact accuracy for darker-skinned individuals.”
Every match requires human review, Axon said, and part of its testing is also “learning what training and oversight human reviewers must have to mitigate known risks.”
Friedman said Axon should disclose those evaluations. He'd want to see more evidence that facial recognition has improved since his board concluded that it wasn't reliable enough to ethically justify its use in police cameras.
Friedman said he's also concerned about police agencies greenlighting the technology's use without deliberation by local legislators and rigorous scientific testing.
“It’s not a decision to be made simply by police agencies and certainly not by vendors,” he said. “A pilot is a great idea. But there’s supposed to be transparency, accountability. ... None of that’s here. They’re just going ahead. They found an agency willing to go ahead and they’re just going ahead.”
___
AP writer Kelvin Chan in London contributed to this report.
