Microsoft apologises for offensive outburst by 'chatbot'

Microsoft has apologised for racist and sexist messages generated by its Twitter chatbot. The bot, called Tay, was launched as an experiment to learn more about how artificial intelligence programs can engage with internet users in casual conversation. The program had been designed to mimic the speech of a teenage girl, but it quickly learned to parrot the offensive language Twitter users began feeding it. Microsoft was forced to take Tay offline just a day after its launch. In a blog post, the company said it takes "full responsibility for not seeing this possibility ahead of time."
