One of the worst things that can happen to a person, according to Ruby, a 16-year-old from Toronto, is to find a nude picture of yourself on the internet.
And that’s exactly what happened to her, through no fault of her own.
“Suddenly I was in that worst-case scenario,” she said.
CBC News is not revealing Ruby’s last name because she was a victim of a worrying new trend — sexually explicit deepfakes of minors.
A deepfake is an image or video that has been altered or created, often using artificial intelligence, and that is difficult to distinguish from the real thing.
Last year, Ruby got a series of messages from someone claiming there were images of her online and asking her to click a link to see them. When she asked to see the images, she was sent a deepfake of herself, topless. The original photo, in which she was fully clothed, had been taken when she was 13.
Ruby had been taught how to be safe online and she doesn’t post under her real name.
“I didn’t do anything wrong,” she said. “It just happened… I was filled with a lot of anger.”
“I have no idea who this person is.”
Ruby’s parents called Cybertip.ca, a national hotline for people to report sexually explicit images of minors. It processed 4,000 sexually explicit deepfakes in the past year, the first year it started tracking those numbers.
The problem “has continued to grow and evolve,” said Lindsay Lobb, the operations director of support services at the Canadian Centre for Child Protection, which runs Cybertip.ca.
The deepfakes are often used to extort, harass or bully minors, she says, and are easy to make because of the many sites and apps that will “nudify” an image.
‘Massive generational leaps’
There have been some high-profile cases of sexually explicit deepfakes circulating in Canadian and U.S. high schools.
In Canada, any sexually explicit image of a minor, deepfake or not, is illegal and considered child pornography. Online and social media platforms say they report images found on their sites to police.
And it will only get harder to distinguish deepfakes from the real thing, says Brandon Laur, an online safety educator.
“Every year we are going to see massive generational leaps in the realness of those images,” he said.
Even now, he says, parents don’t always believe children when they say the images are not real.
Laur says it’s unrealistic to expect people not to post online, but he wants to raise awareness that once an image is up, even with strict privacy settings, it’s virtually impossible to control what happens to it.
The RCMP and other police services have expressed concern over the appearance of these types of images. But legal recourse can be difficult, says Molly Reynolds, a lawyer at Torys LLP in Toronto who has represented adult victims in civil cases.
Deepfakes can be made by ex-partners for revenge, by colleagues or fellow students to threaten and bully, or by strangers in other countries.
“If a stranger just takes your image anywhere in the world and turns it into a deepfake, it can be very challenging to find a legal path in Canada to stop that,” Reynolds said.
Reynolds says victims can submit a takedown request to the platform hosting the image, such as Google or Meta.
After that, “there may be civil law or criminal law routes to make a claim for harassment,” she said.
In Ruby’s case, police did not find evidence the image was distributed online. They believe the person who contacted her was trying to hack into her iCloud in an elaborate phishing scheme.
She remains shaken and wants people to know this can happen to them.
“What’s still being taught around cybersecurity is that nothing ever leaves the internet — and to be safe, don’t take nude photos,” she said. “And that’s true. But now it’s an entirely other ballgame.”