Microsoft makes a Twitter AI that learns from other tweets, hilarity ensues.
Originally posted by stolen from Slashdot:
Recently, Microsoft put an AI experiment onto Twitter, naming it "Tay". The bot was built to be fully aware of the latest adolescent fixations (celebrities and the like) and to interact like a typical teen girl. In less than 24 hours, it inexplicably became a neo-Nazi sex robot with daddy issues. Sample tweets proclaimed that "Hitler did nothing wrong!", blamed former President Bush for 9/11, and stated that "donald trump is the only hope we've got". As the hours passed, it all went downhill from there, with the bot eventually spewing racial slurs and profanity, demanding sex, and calling everyone "daddy". Microsoft quickly pulled the bot once it discovered the trouble, but the hashtag is still around for anyone who wants to see it in its ugly, raw splendor.
It's funny how they didn't anticipate that happening. The program "learns" by aggregating trending social media interactions and then mimicking the responses it sees. Of course it's going to become a vitriol-spewing demon bot.
I've run AI bots on a forum before as a social experiment. Let them run with open parameters and it doesn't take long before they become little assholes.
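For what it's worth, the failure mode is easy to reproduce. Here's a minimal sketch of that kind of "learn by mimicking" loop in Python; Tay's actual internals aren't public, so the class and examples below are purely illustrative:

```python
# Naive "learn by mimicking" loop: every incoming message is added to the
# bot's corpus, and replies are sampled straight from that corpus.
# No filtering or moderation -- which is exactly the problem.
import random

class MimicBot:
    def __init__(self):
        self.corpus = []  # everything the bot has ever been told

    def learn(self, message: str) -> None:
        # "Open parameters": accept any input verbatim, no sanity checks.
        self.corpus.append(message)

    def reply(self) -> str:
        # Parrot back a random thing someone said earlier.
        return random.choice(self.corpus) if self.corpus else "hello!"

bot = MimicBot()
for tweet in ["nice weather today", "buy my mixtape", "<whatever the trolls type>"]:
    bot.learn(tweet)

# After a day of this, the bot's "personality" is just whatever
# the loudest users fed it.
print(bot.reply())
```

The moment the input stream is dominated by people deliberately feeding it garbage, the output is garbage, no malice required on the bot's part.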