Readit News
cornholio · 2 years ago
This entire debacle sounds like the inhabitants of the Amazon rainforest brainstorming "ways to fight photographic cameras that steal your soul".

Pretty soon anyone will be able to download a pornifier for personal use, and it will be completely legal to use as long as they don't publish the results. The non-consensual pornography Armageddon is greatly exaggerated; nobody will care about such "creations". It will be just another dual-use technology.

Deleted Comment

michaelbrave · 2 years ago
Honestly, we are basically there already. I could take anyone's face and use Stable Diffusion (maybe combined with roop) to generate a naked body in about a minute. Most of those steps could be simplified with a task-specific UI where you just give it a face and it gives you porn.

As for legality, it's hard to say. I want to say it would run up against laws protecting the use of a person's identity, something like trademark, but those laws are similar to libel laws in that they're hard to prove, I think.

127361 · 2 years ago
From the article: "With current techniques, it will be hard for victims to identify who has assaulted them and build a case against that person."

The author is trying to claim it is as serious as assault. That is madness. It's defamation at most.

Some very primitive instinct is involved in anything to do with sex, which explains why we are so irrational about it.

alphan0n · 2 years ago
The only thing about this that concerns me is what happens when ransomware levels up and starts creating plausible scenarios of impropriety with photorealistic evidence.

Imagine you get a text one day:

“This is Sharon. I need you to send me $1000 via crypto or I’ll send these photos of us at the hotel to your wife.” Followed by sexually explicit photos of you and a woman who works at your company but whom you’ve never even met in person, only in Zoom meetings.

You call the number, and the person who answers sounds just like Sharon. She ignores everything you say, tells you that you have an hour to send the money or she’s going to call your wife, tell her “everything”, and send the photos, then hangs up.

amenhotep · 2 years ago
This absolutely will happen, and the only thing we can do about it is very rapidly adjust to the idea that we should afford a photo with no provenance the same regard as we afford some anonymous guy saying something. They are just not evidence of anything any more.

ok123456 · 2 years ago
The whole controversy about Taylor Swift having AI-generated images made about her seems astroturfed. How is this any different from the fake celebrity nudes that have been floating around the internet since the Usenet days? If anything, this is an improvement, as people can be much more creative than simply making someone nude by putting their head on someone else.

AussieWog93 · 2 years ago
We had fake nudes back in the day but not full blown photorealistic videos depicting hardcore sex acts.

There's definitely a difference there.

scarmig · 2 years ago
Ah, yes, the photorealism will definitely confuse people and make them think Taylor really did have an orgy with everyone on Sesame Street.

ok123456 · 2 years ago
What was stopping someone from cutting and pasting the celebrity heads on the hardest-core, most shocking porno imaginable? I don't see the difference, but now you add "in space" or "as quaternions" to a prompt, potentially creating something accidentally great.

dudul · 2 years ago
The devil's advocate could reply that the scale is different, but yeah, I agree, we've had fake celebrity nudes since the 90s.

riedel · 2 years ago
I guess the scale of teenage boys pasting cutout heads of celebs or classmates on top of porn magazine pages was also not that small.

UomoNero · 2 years ago
Ignoring it.