Microsoft's U.S. launch of its Tay chatbot on Twitter earlier this week quickly spun out of control, as Tay's AI program began making racist posts. Microsoft shut down the chatbot, and the company now attributes its posts to a 'coordinated effort by some users' to take over Tay's conversations.
Full story from the WindowsCentral blog...