This entire debacle sounds like the inhabitants of the Amazon rainforest brainstorming "ways to fight photographic cameras that steal your soul".
Pretty soon anyone will be able to download a pornifier for personal use, and it will be completely legal to use as long as they don't publish the results. The non-consensual pornography Armageddon is greatly exaggerated; nobody will care about such "creations", and it will be just another dual-use technology.
Honestly, we are basically there already. I could take anyone's face and use Stable Diffusion (maybe combined with roop) to generate a naked body in about a minute. Most of those steps could be simplified with a task-specific UI where you just give it a face and it gives you porn.
As for legality, it's hard to say. I want to say it would bump into trademark-style laws around the use of a person's identity, but those laws are similar to libel laws in that they're hard to prove, I think.
The only thing about this that I’m concerned about is what happens when ransomware levels up and starts creating plausible scenarios of impropriety with photorealistic evidence.
Imagine you get a text one day:
“This is Sharon, I need you to send me $1000 via crypto or I’ll send these photos of us at the hotel to your wife.” Followed by sexually explicit photos of you and a woman who works at your company but whom you’ve never even met in person, only ever in Zoom meetings.
You call the number and the person who answers sounds just like Sharon. She ignores everything you say, tells you that you have an hour to send the money or she’s going to call your wife, tell her “everything”, and send the photos, then hangs up.
This absolutely will happen, and the only thing we can do about it is very rapidly adjust to the idea that a photo with no provenance deserves the same regard as some anonymous guy saying something. Such photos are just not evidence of anything anymore.
The whole controversy about Taylor Swift having AI-generated images made of her seems astroturfed. How is this any different from the fake celebrity nudes that have been floating around the internet since the Usenet days? If anything, this is an improvement, as people can be much more creative than simply making someone nude by putting their head on someone else.
What was stopping someone from cutting and pasting the celebrity heads on the hardest-core, most shocking porno imaginable? I don't see the difference, but now you add "in space" or "as quaternions" to a prompt, potentially creating something accidentally great.
The author is trying to claim it is as serious as assault. That is madness. It's defamation at most.
Some very primitive instinct is involved in anything to do with sex, which explains why we are so irrational about it.
There's definitely a difference in the extortion scenario described above.