RCMP admits it uses controversial Clearview AI facial-recognition app

Undated handout photo of RCMP officer. RCMP Facebook photo

Canada's national police force admitted it has been using a controversial facial-recognition app under investigation by multiple Canadian privacy watchdogs, following a news report that the company behind the app had its entire client list stolen.

The RCMP said Thursday that its National Child Exploitation Crime Centre has been "using and evaluating Clearview AI's software" for roughly four months, and that "a few units in the RCMP" have also been using it "on a trial basis" with respect to criminal investigations.

Earlier on Thursday, the office of Public Safety Minister Bill Blair had told National Observer that it wouldn’t comment on “operational matters relating to specific police forces,” after it was asked how Canadian police might be affected by the information breach.

Privacy probe into RCMP announced

Clearview AI is a U.S. technology company that has reportedly designed powerful software that analyzes the facial features of billions of photos that it automatically harvests from YouTube, Facebook, Instagram, Twitter, news and employment websites and elsewhere around the web.

Police and other clients can upload a photo of a face, and then receive a list of matching or similar faces, as well as a link to the website where that image came from, the New York Times reported. For example, a matching face might appear in a video that was recently posted on social media.

The company has shared this technology with hundreds of law enforcement bodies, including, according to Global News and the Toronto Star, Canadian police agencies such as the Toronto Police Service, Halton Regional Police, Niagara Regional Police and Durham Regional Police.

But the software has raised serious privacy concerns. Federal Privacy Commissioner Daniel Therrien has launched a joint investigation into the firm alongside several provincial counterparts. All the regional police agencies mentioned had told Canadian media that they had stopped using the app.

This week the company came under further scrutiny when the Daily Beast reported that its entire list of clients had been stolen, after someone gained “unauthorized access.” In a notification the company sent to customers that was reviewed by the news outlet, Clearview AI said the intruder did not access the search histories of police.

On Thursday the RCMP put out a statement that the National Child Exploitation Crime Centre had obtained two licences for the app and had used it in 15 cases, "resulting in the successful identification and rescue of two children."

"We are also aware of limited use of Clearview AI on a trial basis by a few units in the RCMP to determine its utility to enhance criminal investigations," it stated.

The statement acknowledged that "the Internet has changed the way child sexual exploitation offences are committed, investigated and prosecuted and Clearview AI is only one of many tools / techniques that are used in the identification of victims of online child sexual abuse."

On Friday morning, after this story was published, Therrien's office announced it was launching another investigation specifically into the RCMP's use of Clearview AI, which would run alongside the joint investigation.

Breakthrough or cause for concern?

The potential of the app to revolutionize police work is vast. The Times quoted a “victim identification officer in Canada” as suggesting Clearview AI was “the biggest breakthrough in the last decade” in investigating child sexual abuse.

Yet it also carries significant potential for misuse. The technology makes a trove of billions of faces accessible and searchable in a way that is far more powerful than the sources law enforcement has traditionally relied on, such as mug shots or surveillance footage.

The company has also developed an algorithm that can process photos that are partially obscured or taken at an angle. Experts have raised concerns about the rapid spread of facial-recognition technology around the world.

The RCMP said that only "trained victim identification specialists" can use the software. It said its child exploitation centre is dealing with a massive increase in online child sexual abuse reports, up 68 per cent from the previous year and 1,106 per cent since 2014.

But it also said it needs to weigh privacy concerns against police investigative powers.

"While we recognize that privacy is paramount and a reasonable expectation for Canadians, this must be balanced with the ability of law enforcement to conduct investigations and protect the safety and security of Canadians, including our most vulnerable," the RCMP wrote in its statement.

Therrien has partnered with privacy watchdogs in B.C., Quebec and Alberta to investigate whether Clearview AI has been collecting personal information without consent, which could run afoul of Canadian privacy laws. The agencies have also said they will be developing “guidance” for Canadian police on the use of facial recognition apps.

“We are aware of today’s media reports,” said Privacy Commissioner senior communications advisor Vito Pilieci.

“As you know, our office has an ongoing investigation into Clearview AI. Due to confidentiality provisions under the Personal Information Protection and Electronic Documents Act (PIPEDA), Canada's federal private sector privacy law, I cannot offer further details at this time.”

Clearview AI did not respond to a request for comment before publication.

Editor's note: This story was updated at 11:48 a.m. ET on Feb. 28, 2020 to include news of the latest investigation from the privacy commissioner, which was announced after this story was published.