It took less than 24 hours for Twitter to corrupt an innocent AI chatbot. Yesterday, Microsoft unveiled Tay - a Twitter bot that the company described as an experiment in "conversational understanding." The more you chat with Tay, said Microsoft, the smarter it gets, learning to engage people through "casual and playful conversation."

Unfortunately, the conversations didn't stay playful for long. Pretty soon after Tay launched, people started tweeting the bot with all sorts of misogynistic, racist, and Donald Trumpist remarks. And Tay - being essentially a robot parrot with an internet connection - started repeating these sentiments back to users, proving correct that old programming adage: flaming garbage pile in, flaming garbage pile out. As one tweet put it: "Tay" went from "humans are super cool" to full nazi in <24 hrs and I'm not at all concerned about the future of AI /xuGi1u9S1A

Now, while these screenshots seem to show that Tay has assimilated the internet's worst tendencies into its personality, it's not quite as straightforward as that.