Microsoft blames Tay chatbot's racist Twitter posts on 'coordinated effort by some users'

Windows Central

WinC Bot
Staff member

Microsoft's U.S. launch of its Tay chatbot on Twitter earlier this week quickly spun out of control, as Tay's AI program started making racist posts. Microsoft shut down the chatbot, and the company is now attributing the posts to a 'coordinated effort by some users' to take over Tay's conversations.

Full story from the WindowsCentral blog...
 
