Tuesday, 4 July 2017

Microsoft's "Zo" chatbot picked up some offensive habits

It seems that creating well-behaved chatbots isn't easy. More than a year after Microsoft's "Tay" bot went full-on racist on Twitter, its successor "Zo" is suffering a similar affliction.

Source: BuzzFeed News

