Finished sex chat bots
Earlier this year, Microsoft made headlines when it debuted Tay, a new chatbot modeled to speak like a teenage girl, which rather dramatically turned into "a Hitler-loving sex robot within 24 hours" of its release, as The Telegraph put it. The Twitter bot was built to "learn" by parroting the words and phrases from the other Twitter users that interacted with it, and – because, you know, Twitter – those users quickly realized that they could teach Tay to say some really horrible things. Tay soon began responding with increasingly incendiary commentary, denying the Holocaust and linking feminism to cancer, for starters.

Despite the public relations disaster – Microsoft promptly deleted the Tay bot – just a few days later Bloomberg Businessweek pronounced that "The Future of Microsoft Is Chatbots." "Clippy's back," the headline read.

Clippy was hardly beloved either; indeed, almost every early website offering instructions on how to use Microsoft's software suite contained instructions on how to disable the Office Assistant's functionality. (Microsoft turned off the feature by default in Office XP and removed Clippy altogether from Office 2007.) The Office Assistant can trace its lineage back to Microsoft Bob, which was released in 1995 and itself became one of the software company's most storied failures. (TIME named it one of "The 50 Worst Inventions" – "Imagine a whole operating system designed around Clippy, and you get the crux of Microsoft Bob.") Bob was meant to provide a more user-friendly interface to the Microsoft operating system, functioning in lieu of Windows Program Manager.