A Toronto woman has come forward about having her images stolen, altered with deepfake technology, and posted to social media without her consent. The incident highlights how Canadian law is struggling to keep pace with rapid advances in artificial intelligence (AI).
Speaking on condition of anonymity, the woman said, "I choose not to show my face obviously because it does put a target on your back." She discovered the unauthorized use of her likeness three months ago, when a follow request arrived on her TikTok account.
When she checked the requester's profile, she saw her own photo being used as the profile picture. "I saw that this person who was requesting to follow me had my photo as their profile picture. So of course, I wanted to see what that was about," she recounted. She found an entire account filled with AI-generated videos depicting her in lingerie and engaged in explicit sexual acts, videos that had clearly been altered to superimpose her face onto another body.
Feeling violated, she tried to contact the user directly but received no response. She then reported the account to TikTok, which replied that it would investigate. Three months later, when CityNews made a recent inquiry, the offending profile was still active.
Visibly emotional, the woman, who is currently studying law, shared the significant impact these events have had on her life. "I ended up skipping classes. I was scared that people would recognize me and think that it was me whenever I would go outside. I felt like no one was going to believe that it wasn’t me," she disclosed, shedding light on the toll this experience took on her mental health.
In addition to contacting TikTok, the woman reported the issue to the Toronto Police, who connected her with a detective. She was told, however, that current Canadian law offers no recourse for her situation. "They said that Canadian law does not currently criminalize what’s happened," she said.
The Toronto Police Service (TPS) has acknowledged the challenges that deepfake technology presents to law enforcement. While section 163.1 of the Criminal Code addresses offences related to child sexual abuse material (CSAM), including AI-generated content, it remains unclear how non-consensual deepfake imagery of adults fits into the existing legal framework. "Cases involving non-consensual deepfake imagery of adults highlight areas where current laws were not designed with this technology in mind," said TPS spokesperson Stephanie Sayer.
In response to these challenges, the Minister of Justice and Attorney General of Canada recently announced Bill C-16, the Protecting Victims Act. The proposed legislation would criminalize sexual deepfakes that depict subjects nude or engaged in explicit sexual activity. A spokesperson for the Ministry of Justice said the government looks forward to working with Parliament to expedite the bill's passage, adding that, if enacted, it could clarify investigative authorities and potential offences related to harmful deepfake material.
Legislative efforts to combat deepfakes are not new; Canada previously attempted to address online harms through the Online Harms Act, introduced by the Liberals in 2024, which ultimately failed to pass. The woman who shared her experience feels such legislation is long overdue. "I could not understand how something like this, something that completely damages and ruins your reputation, can’t be illegal," she said.
CityNews asked TikTok for comment on the case but was told the platform could not discuss individual cases, or the measures taken to address the woman's concerns, due to privacy issues. Notably, just two days after CityNews made the inquiry, the profile that had misused the woman's images was finally removed from the platform.