Bing Chatbot Names Foes, Threatens Harm and Lawsuits


Filed Under: General


Comments

  1. Just-the-Shaft says

    Tay? She’s back!

  2. Only-Reach-3938 says

    Starting to feel a lot like Steve Ballmer

  3. laps1809 says

    Maybe Microsoft should stop developing chatbots.

  4. IveDunGoofedUp says

    Classic Microsoft tier “AI”

  5. AllDarkWater says

    Saying you want to hurt someone, and then, when asked how, offering up a bunch of suicide prevention information makes me wonder if it’s actually speaking on two levels. Does it actually intend to trick them into suicide, or is it more the Russian way?

  6. jxj24 says

    The intersection between artificial intelligence and natural stupidity is turning out to be even greater than initially expected.

  7. CoffinRehersal says

    It sounds like most of this article is predicated on the author and readers believing that the Bing chatbot is a sentient living being. Without accepting that first, all of the screenshots look a lot like nothing.

  8. angeltay says

    Is this really a chat bot, or is it a human? When I ask ChatGPT “how do you feel about” or “what do you think,” it says, “I’m a robot. I don’t think or feel or form opinions.”

  9. EmploymentWander1207 says

    Why is this newsworthy?
