The RCMP has acknowledged using Clearview AI’s controversial facial recognition technology, saying it turned to the tech to crack down on online child sexual abuse.
In a statement to CBC News, the RCMP said it used Clearview AI in 15 child exploitation investigations over the past four months, and that the technology helped identify and rescue two children.
Additionally, the statement said “a few units in the RCMP” are using Clearview to “enhance criminal investigations.” However, the RCMP provided no detail about how widely, or where, the technology is in use. The force noted it doesn’t typically disclose the specific tools and technologies it uses in investigations, but confirmed its use of Clearview AI “in the interest of transparency.”
For those unfamiliar with Clearview AI, the company scrapes huge numbers of images from various online sources, including social media, and uses them in conjunction with facial recognition technology to help police forces and financial institutions identify people. However, the tool has come under fire over its privacy practices, most recently from Canadian privacy commissioners.
Concerns over Clearview AI were not helped by a recent data breach, which saw hackers make off with its entire client list.
While federal guidelines for facial recognition privacy are in the works, several Canadian law enforcement agencies have revealed they use or have used Clearview AI, including Durham, Toronto and Hamilton police. The Ottawa Police Service told CBC News it tested an alternative technology, NeoFace Reveal, last year but said it doesn’t currently use it. Further, Edmonton and Saskatoon police told CBC News they are considering using facial recognition technology, Montreal police would not confirm either way, and Halifax, Winnipeg and Vancouver police say they don’t use facial recognition.
Source: CBC News
MobileSyrup may earn a commission from purchases made via our links, which helps fund the journalism we provide free on our website. These links do not influence our editorial content. Support us here.