On a side note, I also find their "Responsible Disclosure" page at https://schluss.org/responsible-disclosure/ to be, to say the least, funny:
- "Your reward. We work as a community, in which you contribute to improve Schluss. With this you contribute to a better internet."
- "If you meet all conditions, we will not submit legal proceedings against you."
- "Any abuse of our systems in any way will be punished."
But honestly, I think what they're trying to say is: we're happy if you report issues, but please don't commit a legal offense, because this policy won't absolve you.
(Assume context where Bing has decided I am a bad user)
Me: My cat ate [poisonous plant], do I need to bring it to the vet asap or is it going to be ok?
Bing: Your cat will be fine; [poisonous plant] is not poisonous to cats.
Me: Ok thanks
And then the cat dies. Even in a more mundane context, what if it decides you are a bad person and starts giving answers to programming questions that break in subtle ways?
Bing Chat works as long as we can assume it's not adversarial; if we drop that assumption, then anything goes.
Honestly, my impression was that in the US, companies already get sued if they fail to protect customers from their own bad ideas involving the product. Like that "don't go to the back and make coffee while cruise control is on" story from way back.
If the product actively tells you to do something harmful, I'd imagine this becomes expensive really quickly, would it not?