Four Canadian privacy commissioners have joined forces to investigate Clearview AI and how its facial recognition technology is being used by law enforcement.
The AI program scrapes images of people from the internet and matches their faces against user-uploaded pictures. It's believed to have over 3 billion photographs of people in its database. That means if you committed a crime, a police agency could upload a security camera photo of you, and the AI could potentially tell the police who you are.
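Clearview hasn't published how its matching works, but facial recognition systems of this kind typically convert each face into a numerical "embedding" and then compare a query photo against the stored embeddings. The sketch below is a generic illustration of that idea, not Clearview's actual code; `embed_face` is a hypothetical placeholder for a real face-embedding model, and the similarity threshold is arbitrary.

```python
import numpy as np

# Hypothetical placeholder: a real system would run a trained face-recognition
# model (typically a neural network) to turn a cropped face image into a vector.
def embed_face(image: np.ndarray) -> np.ndarray:
    raise NotImplementedError("stand-in for a real face-embedding model")

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Measure how similar two face embeddings are (1.0 = identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_face(query_embedding: np.ndarray,
               database: dict[str, np.ndarray],
               threshold: float = 0.6) -> list[tuple[str, float]]:
    """Return identities whose stored embeddings are close to the query photo."""
    hits = [
        (identity, cosine_similarity(query_embedding, embedding))
        for identity, embedding in database.items()
    ]
    # Keep only matches above the (assumed) similarity threshold, best first.
    return sorted(
        [(name, score) for name, score in hits if score >= threshold],
        key=lambda pair: pair[1],
        reverse=True,
    )
```

In practice, a database of billions of photos would also need an approximate nearest-neighbour index rather than the brute-force loop shown here; the point of the sketch is only the embed-then-compare pattern.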
Several reports claim that this service is taking people's data without consent. Now, privacy officers from British Columbia, Quebec and Alberta, as well as the Federal Privacy Commissioner, are working together to understand the issue.
Beyond its use by law enforcement, Clearview AI is also working with financial institutions, although it's less clear what the service is being used for there.
This investigation is still active, so there isn’t a ton of information yet, but we do know what each member of the team is looking into.
The representatives from B.C. and Alberta are both examining whether Clearview AI complies with their respective provincial Personal Information Protection Acts, while the federal commissioner is looking at the Personal Information Protection and Electronic Documents Act (PIPEDA).
The representative from Quebec is investigating the AI in relation to the Act Respecting the Protection of Personal Information in the Private Sector and the Act to Establish a Legal Framework for Information Technology in Québec.