
RCMP used Clearview AI facial recognition tool in 15 child exploitation cases, helped rescue 2 kids


The RCMP confirmed Thursday that the police force has been using the controversial facial recognition technology Clearview AI for roughly four months as part of online child sexual exploitation investigations, and that its use has resulted in the rescue of two children.

The Mounties said in a statement that Clearview AI’s facial recognition technology was used in a “limited capacity” by the RCMP’s National Child Exploitation Crime Centre (NCECC).

“The NCECC has two [licences] for the Clearview AI application and has used it in 15 cases, resulting in the successful identification and rescue of two children,” the RCMP said in a statement.

“Only trained victim identification specialists in the NCECC use the software primarily to help identify, locate and rescue children who have been or are victims of online sexual abuse.”

Clearview AI’s technology collects billions of images from public websites and social media platforms, building a database that police forces and financial institutions can use to identify individuals.

The RCMP said the controversial technology had also been used on a trial basis by a few of its units to determine its usefulness in enhancing criminal investigations. The statement did not offer details on where those units were located.


Privacy concerns about the software were raised earlier this year after a New York Times investigation revealed the software had scraped more than three billion photos from Facebook, Instagram and YouTube to create a database used by more than 600 law enforcement agencies in the U.S., Canada and elsewhere.


In January, Global News asked the RCMP whether it was using Clearview AI, but the force declined to comment.

Toronto police admitted earlier this month that the service had been testing Clearview AI’s controversial facial recognition tool since last fall, and that officers had since been ordered to stop using it.

“Some members of the Toronto Police Service began using Clearview AI in October 2019 with the intent of informally testing this new and evolving technology,” police spokesperson Meaghan Gray said in an email. “The Chief directed that its use be halted immediately upon his awareness, and the order to cease using the product was given on February 5, 2020.”


Police services in Edmonton, Calgary, Hamilton, Halifax and elsewhere across the country later admitted to using the facial recognition software.

York Regional Police, which had previously confirmed it wasn’t using the technology, said Friday that individual officers accessed the Clearview AI free trial without “the authorization or awareness of our command.”

“As soon as we learned of this, officers were directed to stop using the trial immediately,” said York police spokesperson Andy Pattenden. “We notified the Privacy Commission on Thursday, February 27, 2020, about the unauthorized use by some members of Clearview AI’s free trial.”


Pattenden said the force is still conducting an internal inquiry as to how many members, and from which units, have accessed Clearview AI, but said roughly “500 searches were performed.”


Canada’s privacy watchdogs announced last Friday they were teaming up to launch an investigation into whether the facial recognition software breaks privacy laws.

The joint investigation will be led by federal Privacy Commissioner Daniel Therrien and his three counterparts in B.C., Quebec and Alberta.

The Office of the Privacy Commissioner announced Thursday night it would launch an investigation into the RCMP’s use of the software.

“In light of the RCMP’s acknowledgement of their use of Clearview’s facial recognition technology, we are launching an investigation,” read the statement. “Given we are now investigating, no further details are available at this time.”


Members of the House of Commons committee on access to information, privacy and ethics voted this week to examine the technology’s effects on civil society, privacy rights, minorities and vulnerable populations.

“I think this issue is a defining issue of our time,” said New Democrat MP Charlie Angus, who put forward the idea. He said the committee should study the use of the emerging tools by governments, police, companies and individuals.

The RCMP said Clearview AI is only one of many tools and techniques used to identify victims of online child sexual abuse.

The police force said rates of online child sexual exploitation have exploded in recent years.

“In 2019, the NCECC received 102,967 reports of online child sexual abuse, a dramatic 1,106-per-cent increase since 2014,” the RCMP said.


“This was a 68 per cent increase from last year.”

Clearview AI confirmed Wednesday it suffered a data breach after a hacker gained unauthorized access to its client list, which includes law enforcement agencies and banks around the world.

“Security is Clearview’s top priority. Unfortunately, data breaches are part of life in the 21st century,” said Tor Ekeland, an attorney representing the tech company. “Our servers were never accessed. We patched the flaw and continue to work to strengthen our security.”

— With files from Rachel Browne and the Canadian Press
