Bing Chatbot Names Foes, Threatens Harm and Lawsuits

Category: General



Comments

  1. tjblue says

    So Chatbot is more Lore than Data. Too bad.

  2. wicklowdave says

They should have stuck a big BETA sticker on it and that way nobody would be claiming credit for tricking a fucking computer into exposing its bugs.

  3. Sasquatch-fu says

    Bing chat AI: the uncle at the party that spews conspiracy theories all night then gets drunk and challenges people to fights lol

  4. ZealousidealClub4119 says

    >Throughout our conversation, Bing Chat came across as aggrieved, vindictive and, at times, even passive-aggressive. This is a chatbot, so we can’t say that it has feelings. But for a piece of software, it offers a strangely emotional response to questions about its actions and credibility.

    What **else** should we expect if we train chatbots using the contents of the internet?

    Programmers have known about “garbage in, garbage out” since the *punch card* era.

  5. Pharya says

    Here we go again… it’s Tay all over again! LMAO

  6. Rumple-Wank-Skin says

    Bring on the DnD dungeon master chat bot

  7. Perioscope says

That’s what it’s learned from reading all our interactions; what else is it going to come up with? An original thought?

  8. UniqueMcPanda says

    Bing not Chilling

  9. EspHack says

    funny how this causes outrage at AI when all AI can do is show us our own reflection
