omars_ commented on Exploring a Conversational AI Solution for Loneliness    · Posted by u/omars_
Fr3dd1 · 9 months ago
I kind of like your idea, but in my head I always bounce back to "Is that a problem technology can or should solve?" IMHO the underlying problem is that people don't care that much for each other anymore, especially strangers. And the reason for that is that in our society, we are not at all dependent on our surroundings. You can be perfectly fine without knowing your neighbors, for example. If you don't have any sugar or salt to cook something, you can go to a nearby store, order it, or just order the cooked meal. You don't go to your neighbor and ask for a little bit of sugar or salt (if you ask for sugar, you'll probably get insulted because of how bad sugar is - little joke :D ). So I guess the only thing that truly works is to build communities that consist of interdependent people. Just my take on it :)
omars_ · 9 months ago
I definitely agree that real-life relationships and communities are the best way to go, but also that our increasingly isolated lives make that difficult, and some people get anxiety around social situations or just find themselves stuck in a rut of work followed by unwinding at home alone.

People also increasingly don't like being dependent on others, so while your take is a valid one, people who don't share it would need an alternative solution to that problem.

There are also some people who would like a better social life but are unsure how to build one, or don't have the skills or opportunity to do so.

The success criterion of such an app could even be that users only use it for a limited time, after which the app should have encouraged and helped them replace app interactions with real-life social interactions.

brudgers · 9 months ago
since there are people using AI girlfriend/boyfriend services

Does anyone marry them and build a life together with them?

Likewise, how would that AI build a long term relationship…what happens when the code changes…when the TOS changes…when the service is shut down?

Are you sure that AI would be healthy?

omars_ · 9 months ago
I haven't looked into AI girlfriend/boyfriend services too much, but they seem like exploitative services that hog people's time and money and further isolate them from the real world.

If people are willing to spend time and money on these services, I thought one should be able to come up with a more thoughtful and healthy service that actually helps people / nudges them towards a healthier lifestyle.

Hence having a companion or a coach that alleviates feelings of isolation and encourages you to do things that could get you the real thing. While all the issues you raise are valid, there are solutions for them as well: a subscription that pays for the service so it doesn't shut down, or letting you lock in the AI if you don't want updates changing its personality (in any case, the personality should be formed more from the data it collects from its conversations with you over time).

There are definitely ways in which this could be unhealthy, which is why I was curious about how one could go about a way to maximize the chances of it being a healthy and helpful service.

bsenftner · 9 months ago
Although my work is not about loneliness, it has similarities with your goals. My work is about creating intellectual confidence, critical awareness, and laying the foundation for honest ambition from self confidence through accomplishment.

I've been creating AI chatbots and "taskbots" (chatbots that do more than respond conversationally; they procedurally do things on request). These are all embedded into an office software suite, forming an office software environment with dozens of virtual expert co-workers integrated right inside the UI of the office software. So when you are working, there are multiple virtual co-workers conversationally present inside the same software you are using, with access to what you are doing in that software. They advise your work, they can directly manipulate your in-editor work, and in general they are designed to teach you how to do your own work better, to see past your current work and become materially better at what you do.

As I've created, used, and tested the system with general office workers, I find I need to include psychological aspects in the AI agents' behaviors, because people are intimidated, or sarcastic, or really timid and afraid of doing something wrong and getting reprimanded. The AI agents that help a person edit in the word processor require instruction that people are both afraid to reveal any lack of understanding and have a real hard time articulating the help they need. So these word-processor support AIs coax the user and coach them on how to ask for help; once that hurdle is crossed, users get active and chatty with the agents and make good progress. During that initial use, they are very intimidated, and often confused, because they think they can just say "write this for me" and the AI will do everything, as if it can read their mind.

If you find this interesting, you can contact me at https://midombot.com/b1/home

omars_ · 9 months ago
Are your users engaging with your various AIs through text?

I didn't see any place I could contact you from the page that you linked.

When you say people are intimidated, sarcastic, timid, or afraid, are you measuring that or just observing it personally as users try out the app?

The techniques you're applying around coaxing / encouraging certain behavior could apply more broadly, depending on how you're managing it.

brudgers · 9 months ago
In what ways would that AI be like social media in regard to loneliness?

In what ways would that AI be different than social media in regard to loneliness?

What clinical basis would that AI’s design have…aren’t we talking about people’s mental health here, after all?

What safeguards would that AI have in regard to harmful responses?

Good luck.

omars_ · 9 months ago
For lonely people, I imagine social media is passive scrolling for the most part, maybe even with one-way interactions (likes/comments). Some might be engaging with other users on social media, but it would almost certainly be text (a reddit thread conversation, or getting a response to a reply you made on some other social media).

With conversational AI, it would be a dynamic voice conversation, where the AI would be responding to you live, and the medium of speech-to-speech would also feel social in a way that text based communication does not.

The lonely user might still think that their interactions don't count because it's a fake person they are engaging with, but since there are people using AI girlfriend/boyfriend services, I imagine an AI friend/coach should appeal to people as well, while being healthier than simulating a romantic connection with an AI.

In terms of safeguards and clinically backed design for the AI, I'm hoping to foster a conversation around it. Most LLMs have various safeguards in place against harmful responses, and while this product would hopefully improve people's mental health by reducing their feelings of loneliness, it wouldn't be developed as a therapist, but more as a friend. Still, knowing how to make an effective friend isn't trivial, and I'm open to figuring out how best to do that; ideally it would involve users willing to engage with a work-in-progress product that is iterated on based on their feedback.

reify · 9 months ago
Like everything in life.

You have to believe in something for it to work. Without belief it will always fail.

I am a retired psychotherapist, and for psychotherapy to work the client must have the most basic assumption and belief that psychotherapy will work for them; if not, psychotherapy will fail.

If you are lonely you must still have a belief that what is offered is going to reduce the symptoms of Loneliness.

Take Krishna consciousness as an example.

You must believe wholeheartedly that Lord Krishna exists and that simple devotional service and a commitment to, and love of, Lord Krishna will lead your soul to him when your body dies and your soul leaves this material world.

Isn't that a wonderful thing, if you believe it.

I don't think any AI can ever reproduce the unspoken interpersonal and intrapersonal stuff that goes on, out of awareness, during the interactions between human beings.

I am thinking of projections, projective identification, introjects, and transference and countertransference as examples. You have to be human to know, feel, sense, and experience these things.

No amount of coding trickery will duplicate the vast oceans of human experience.

omars_ · 9 months ago
I agree that belief plays a role, and that it would be difficult to perfectly replicate interactions between human beings, but I think you can get partly there, and especially for those that don't have human beings that they can interact with, it could provide benefits to them.

I've been working on my own for some time now, and I find talking to an AI via text or voice helps me work through problems, and it does give me a partial feeling of having a coworker.

Personally, while I have various social connections, there are certain things I find interesting that none of my social connections do, and so there are certain conversations I cannot have with my social group. I sense that engaging with an AI about these topics could give me more pleasure than talking to a disinterested friend about them. An interested friend would still be the best-case scenario, but that is why I was thinking a product like this could be useful for those who don't have friends.

As @bsenftner mentioned as well, if you've withdrawn from social situations due to past interactions, because you feel misunderstood, or aren't good at expressing yourself, then having this no-risk platform to experiment / practice socializing with a conversational AI could be something that appeals to you, and over time gets you to a place where you seek out the real thing.

omars_ commented on Conversational AI Journaling App   innerecho.xyz... · Posted by u/omars_
omars_ · 9 months ago
Hey HN,

I’ve been working on a journaling app that takes a different approach via conversational AI: instead of writing or typing, you speak to an AI that listens and talks back, helping you reflect on your thoughts and feelings in real-time.

It’s designed for people who struggle with traditional journaling. A live transcript is available to revisit and annotate later.

The web app is free and still early. I have a lot of ideas for features, but instead of building in isolation, I’d rather shape it with real users—figuring out what actually helps rather than just guessing.

On the tech side, I tested a bunch of options and landed on DailyBots for orchestration, Deepgram for speech-to-text, Cartesia for text-to-speech, and OpenAI's GPT-4o mini for the LLM—balancing latency, quality, and cost.
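The turn loop behind a stack like this is roughly STT → LLM → TTS, with a transcript accumulated on the side. A minimal sketch of that flow, with hypothetical stub functions standing in for the real Deepgram, LLM, and Cartesia calls (the names and shapes here are illustrative assumptions, not the app's actual code):

```python
from dataclasses import dataclass, field


def stt(audio: bytes) -> str:
    """Speech-to-text stage (stub standing in for Deepgram)."""
    return audio.decode("utf-8")


def llm(history: list, user_text: str) -> str:
    """LLM stage (stub standing in for the chat model).

    Appends the user turn to the conversation history and returns a reply.
    """
    history.append({"role": "user", "content": user_text})
    reply = f"Reflecting on: {user_text}"
    history.append({"role": "assistant", "content": reply})
    return reply


def tts(text: str) -> bytes:
    """Text-to-speech stage (stub standing in for Cartesia)."""
    return text.encode("utf-8")


@dataclass
class VoiceJournalSession:
    """One conversational journaling session: audio in, audio out,
    plus a running transcript the user can revisit and annotate later."""

    history: list = field(default_factory=list)
    transcript: list = field(default_factory=list)

    def handle_turn(self, audio_in: bytes) -> bytes:
        # One full turn of the pipeline: transcribe, respond, synthesize.
        user_text = stt(audio_in)
        reply_text = llm(self.history, user_text)
        self.transcript.append(("user", user_text))
        self.transcript.append(("assistant", reply_text))
        return tts(reply_text)
```

In a real deployment, the orchestration layer (DailyBots here) handles streaming audio frames between these stages, which is where most of the latency tuning happens; the session object above only shows the logical data flow.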

Try it here: innerecho.xyz

I’d love to hear your thoughts—what would make something like this truly useful to you?
