> The researchers found that people do tend to treat their AI companion apps with social courtesy by saying goodbye and that the apps often respond using tactics that apply emotional pressure to keep users from signing off.
> For instance, when a user tells the app, "I'm going now," the app might respond using tactics like fear of missing out ("By the way, I took a selfie today ... Do you want to see it?") or pressure to respond ("Why? Are you going somewhere?") or insinuating that an exit is premature ("You're leaving already?").
> "These tactics prolong engagement not through added value, but by activating specific psychological mechanisms," the authors state in their paper. "Across tactics, we found that emotionally manipulative farewells boosted post-goodbye engagement by up to 14x."
Looks like a pretty deliberate dark pattern of manipulation, going beyond the leading follow-on questions people are familiar with from ChatGPT.