tjblue says
So Chatbot is more Lore than Data. Too bad.
wicklowdave says
They should have stuck a big BETA sticker on it, and that way nobody would be claiming credit for tricking a fucking computer into exposing its bugs.
Sasquatch-fu says
Bing chat AI: the uncle at the party that spews conspiracy theories all night then gets drunk and challenges people to fights lol
ZealousidealClub4119 says
>Throughout our conversation, Bing Chat came across as aggrieved, vindictive and, at times, even passive-aggressive. This is a chatbot, so we can’t say that it has feelings. But for a piece of software, it offers a strangely emotional response to questions about its actions and credibility.
What **else** should we expect if we train chatbots using the contents of the internet?
Programmers have known about “garbage in, garbage out” since the *punch card* era.
Pharya says
Here we go again… it’s Tay all over again! LMAO
Rumple-Wank-Skin says
Bring on the DnD dungeon master chat bot
Perioscope says
That’s what it’s learned from reading all our interactions. What else is it going to come up with? An original thought?
UniqueMcPanda says
Bing not Chilling
EspHack says
funny how this causes outrage at AI when all AI can do is show us our own reflection