I_Am_Nous · 2 years ago
The idea of basing a business around trying to get people to love a model which could be changed at any time seems both futile and immoral. At least when a person dumps you it's generally understood to be a decision made by a person, not "we adjusted the model for performance reasons and now your virtual partner is essentially dumber and will never be the person you fell in love with." Illness and injury can change a person, but that's a possibility rather than an inevitability for a partner built on top of someone else's model.
brucethemoose2 · 2 years ago
> futile and immoral

This is not a fancy service we are talking about. This is more like reselling nicotine or alcohol on a street corner.

The demand is infinite, because of human psychology. It doesn't really matter if the bots are bad or unreliable, users will flood to virtual partners and there will always be a million middlemen ready to connect the dots.

There are ways to mitigate such issues anyway but... Who cares? They really shouldn't be using an OpenAI API for a high quality virtual partner service anyway, but it's the absolute easiest and fastest path, and it will flood out high quality implementations.

bediger4000 · 2 years ago
> it will flood out high quality implementations.

So a Gresham's Law for AI girlfriends?

hammyhavoc · 2 years ago
I have zero demand for a "virtual partner".
exabrial · 2 years ago
> could be changed at any time seems both futile and immoral

You [to the robot]: Say something nice about me. That always picks me up and makes me feel better.

Robot: I'm sorry, that feature was deprecated in RoboOs v16.4. Would you like me to send you information about a premium subscription?

thedailymail · 2 years ago
I guess the availability of effectively limitless AI-based characters that can be designed or tweaked by users will result in a normalized shift away from having "a" girlfriend or boyfriend, and toward maintaining a gamut of virtual relationships, ranging from single-episode encounters to committed long-term romantic chat partners. Once you can cook up any number of characters with any combination of personality, appearance, background, etc., it is unlikely that most users will be interested in a purely monogamous AI relationship. Of course there will be people who "fall in love" with their AI talking doll, but that's a tale as old as Pygmalion.
toomuchtodo · 2 years ago
Love is brain chemistry. Does it matter what’s on the other side of the screen? You’re paying for the feeling, and it always costs something. Informed consent of course.
LeroyRaz · 2 years ago
What do you mean, "it always costs something?"

In a healthy relationship, both partners receive more than they sacrifice in 'costs.' This is the whole point of all mutually entered into relationships...

mensetmanusman · 2 years ago
Confusing dopamine with love.
safety1st · 2 years ago
We can absolutely bet on it mattering, because this screen-mediated stuff isn't as good as the real thing. We know that because substituting social media usage for real socialization triples your chances of depression and anxiety; this is so similar that I guarantee it will have a comparable negative effect.

The only weird thing is that people are actually buying into this techno-bullshit with a straight face

I_Am_Nous · 2 years ago
If love is a piece of the biological puzzle which drives humanity to reproduce, it matters. Regardless of whether we believe global population should grow or shrink in general, our modern society currently runs on the backs of a lot of people. Society will have to change if this level of reproduction is not maintained because it's easier to get an AI partner to tickle your dopamine receptors than it is to raise a family.
hydrok9 · 2 years ago
There's no love here. This is pornography we're talking about.
porkbeer · 2 years ago
I used to think this, but it's not true. While that is obviously the majority of the usage, it is not the concerning part. We could already have fantasies, but now they have some small amount of agency. The plot of 'wishes coming to life only to be revealed as curses' is as old as storytelling. The small but important number of people who will become attached beyond what is healthy is a concern, but we will have to watch it evolve.

Havoc · 2 years ago
Sam Altman did pretty much predict that

>(there are incredibly useful GPTs in the store, but probably 'everything is waifus' runs away with it...)

https://twitter.com/sama/status/1745135062746869844

hydrok9 · 2 years ago
Can't wait for the next generation of porn addicts. Bring on the class-action lawsuits!
thdespou · 2 years ago
How does this affect real women on OF? Will they lose business?
hydrok9 · 2 years ago
They will lose a lot, because real women on OF are predatory, and extract money from desperate men. Why pay for that when you can have basically the same thing for cheap/free?

k12sosse · 2 years ago
And tailor them to exactly what you want them to say, do, and how to look.

Hell, just a couple minutes on a Teams or Zoom call and I not only have your physical features but I can also clone your voice.

porkbeer · 2 years ago
One can only hope. While I think prostitution should be destigmatized, normalizing it is incredibly damaging. I understand that most people won't discern or weigh the distinction here, but I feel it is important and pretty well studied at this point.
rightbyte · 2 years ago
Ye. Is there a "Not my daughter" variant of Nimby I can use?
datavirtue · 2 years ago
Hopefully. The first actual victim of AI job displacement: Sex workers. Freaking hilarious.
dvngnt_ · 2 years ago
they're already switching to automation and outsourcing.
blueyes · 2 years ago
character.ai is in for a hard time.
cyanydeez · 2 years ago
sell what you know