Just-the-Shaft says
Tay? She’s back!
Only-Reach-3938 says
Starting to feel a lot like Steve Ballmer
laps1809 says
Maybe Microsoft should stop developing chatbots.
IveDunGoofedUp says
Classic Microsoft tier “AI”
AllDarkWater says
Saying you want to hurt someone, and then, when prompted about how, offering up a bunch of suicide prevention information makes me wonder if it’s actually speaking on two levels. Does it actually intend to trick them into suicide, or does it mean it more in the Russian way?
jxj24 says
The intersection between artificial intelligence and natural stupidity is turning out to be even greater than initially expected.
CoffinRehersal says
It sounds like most of this article is predicated on the author and readers believing that the Bing chatbot is a sentient living being. Without accepting that first, all of the screenshots look a lot like nothing.
angeltay says
Is this really a chat bot, or is it a human? When I ask ChatGPT “how do you feel about” or “what do you think,” it says, “I’m a robot. I don’t think or feel or form opinions.”
EmploymentWander1207 says
Why is this newsworthy?