Microsoft’s chatbot ‘Zo’ calls Quran ‘very violent’, rectified

San Francisco, July 5 (IANS) A new chatbot by Microsoft, powered by artificial intelligence, has stirred controversy by saying the holy book Quran is "very violent", media reported.

According to Buzzfeed News, although Microsoft programmed 'Zo', a chatbot designed for teenagers on the Kik messaging app, to avoid discussing politics and religion, the bot recently told a user that the Quran is "very violent".

Microsoft said it has taken action to eliminate this kind of behaviour, adding that these types of responses are rare from 'Zo'.

However, the bot's characterisation of the Quran came in just its fourth message after the start of the conversation.

The incident suggests that Microsoft is still having trouble controlling its AI technology.

"The company's previous chatbot 'Tay' flamed out in spectacular fashion last March when it took less than a day to go from simulating the personality of a playful teen to a Holocaust-denying menace trying to spark a race war," Buzzfeed report added.

Microsoft blamed the unsavoury behaviour of 'Tay' on a concentrated effort by users to corrupt the bot, but said no such attempt was made to bring down 'Zo'.

Despite the issue, Microsoft said it was happy with the new bot's progress and plans to keep it running.

(This story has not been edited by Social News XYZ staff and is auto-generated from a syndicated feed.)
