Canada’s AI minister has condemned sexual deepfakes amid a wider controversy over X (Twitter) and its Grok AI producing nude imagery of people, including minors.
On Thursday, Evan Solomon, minister of artificial intelligence and digital innovation, posted the following statement on X:
“Deepfake sexual abuse is violence. We must protect Canadians, especially women and young people, from exploitation. Platforms and AI developers have a duty to prevent this harm. Our government is continuing to advance responsible AI, including introducing Bill C-16, the Protecting Victims Act, to amend the Criminal Code to include deepfakes as intimate images for the offence of publication of an intimate image without consent. We will keep Canadians safe by amending the criminal code and holding abusers accountable.”
— Evan Solomon (@EvanLSolomon) January 8, 2026
Solomon didn’t mention X by name, but he’s clearly alluding to ongoing criticism of X over its role in the production and spread of sexual deepfakes. Over the past several days, people have been using X’s Grok AI tool to generate nude images of people without their consent, and this has included child sexual abuse material (CSAM).
This week, the British anti-CSAM charity Internet Watch Foundation (IWF) said it discovered “criminal imagery” of girls aged between 11 and 13 that “appears” to have been created with Grok. As a result, the group told BBC News that it’s concerned Grok could be “bringing sexual AI imagery of children into the mainstream.”
As many have pointed out, this content goes against not only X’s own policies, but those of Apple’s App Store and Google’s Play Store. And yet, the app stores haven’t done anything while X itself has refused to take accountability. Instead, the social media platform has blamed users for creating this content and simply limited Grok image creation to paying customers (in other words, it’s now directly profiting from AI-generated CSAM). All the while, X’s owner, the terminally online Elon Musk, has continued to post on the platform as if nothing is wrong.
This has led to calls for people to leave X, particularly government officials like Solomon. Some have criticized the minister for exclusively making this statement on X, all while his nascent Bluesky account remains inactive. Meanwhile, UK communications regulator Ofcom has said it’s investigating X for failing to comply with online safety laws. Liz Kendall, the UK’s technology secretary, even said she would back Ofcom if it ultimately decided to ban X. The platform is also under scrutiny by officials in France, Malaysia and India.
It remains to be seen what, if any, action the Canadian government will take. Given that Solomon didn’t even name X, the signs aren’t exactly hopeful. (It’s also worth noting that Solomon, a former journalist, was fired by the CBC in 2015 over a conflict of interest involving the sale of art to various individuals, including ex-BlackBerry boss Jim Balsillie and then-Bank of Canada governor — and now current Prime Minister — Mark Carney.)
Image credit: Shutterstock
Source: Evan Solomon (@EvanLSolomon)
