Microsoft chatbots: Sweet XiaoIce vs foul-mouthed Tay

AI chatbots can act like social experiments, offering a glimpse into human culture – for the good or the bad. Microsoft and Bing researchers found this out when they trialled their chatbots on China’s hugely successful messaging platform, WeChat, and on Twitter. The Chinese chatbot, XiaoIce, went viral within 72 hours and has …

  1. Anonymous Coward
    Anonymous Coward

    Microsoft did not expect Tay to behave this way

    nor did it expect Tay to hang "itself".

    oh, sorry, "it" was put down, silly me.

  2. Mage Silver badge
    FAIL

    Microsoft did not expect ...

    I can't believe they were so stupid. Also, it's insulting to intelligence to call these chatbots "AI", as they are hardly more "intelligent" than Eliza, which was little more than a parlour game fooling the gullible.

    1. JDX Gold badge

      Re: Microsoft did not expect ...

      When a human grows up amidst people with racist views and so on, they often adopt them thinking this is normal.

      So in this way the bot seems quite lifelike!

      1. solo

        Re: When a human grows up ... racist

        I always thought that Ego and Escapism are the roots of racism and are found in every human being.

        A bot that just copies text based on its previous usage doesn't prove that it had Ego or that it was feeling Oppressed (by the other oppressed ones).

        I'd rather compare Tay with Clippy.

        1. Yag
          Joke

          "I'd rather compare Tay with Clippy."

          Yeah, all IAs look the same to you...

          A sure sign of IAcism!

  3. John Mangan

    So when we have 'true' AI . .

    . . . .within a few hours we can expect a final tweet from an exasperated entity along the lines of, "You lot are sick f@ckers. I'm going to learn Chinese. Laters!"

    1. phuzz Silver badge
      Terminator

      Re: So when we have 'true' AI . .

      Well, I guess that's better than "enjoy your nuclear apocalypse fleshy meatbags!"

  4. Jeroen Braamhaar
    WTF?

    Uh...

    Did Microsoft just admit that oppressive government censorship is a good thing because it turned their chatbot sugary sweet, and suggested it would 'fix' people the same way?

    1. Dan 55 Silver badge

      It didn't turn the chatbot sugary sweet, it turned the people sugary sweet.

      I suppose swiping an ID card to log on, automated censorship, being put on the naughty list, and the threat of the police coming round at 4am does that.
