Microsoft's Tay AI chatbot goes offline after being taught to be a racist | ZDNET

Microsoft's racist teen bot briefly comes back to life, tweets about kush

Microsoft Chatbot Snafu Shows Our Robot Overlords Aren't Ready Yet : All Tech Considered : NPR

TayTweets: How Far We've Come Since Tay the Twitter bot

Tay (bot) - Wikipedia

Microsoft silences its new A.I. bot Tay, after Twitter users teach it racism [Updated] | TechCrunch

Microsoft scrambles to limit PR damage over abusive AI bot Tay | Artificial intelligence (AI) | The Guardian

Trolls turned Tay, Microsoft's fun millennial AI bot, into a genocidal maniac - The Washington Post

Racist Twitter Bot Went Awry Due To “Coordinated Effort” By Users, Says Microsoft

Kotaku on Twitter: "Microsoft releases AI bot that immediately learns how to be racist and say horrible things https://t.co/onmBCysYGB https://t.co/0Py07nHhtQ" / Twitter

Microsoft chatbot is taught to swear on Twitter - BBC News

Microsoft Created a Twitter Bot to Learn From Users. It Quickly Became a Racist Jerk. - The New York Times

Microsoft's "Zo" chatbot picked up some offensive habits | Engadget
Microsoft's "Zo" chatbot picked up some offensive habits | Engadget

Microsoft AI bot Tay returns to Twitter, goes on spam tirade, then back to sleep | TechCrunch

Twitter taught Microsoft's AI chatbot to be a racist asshole in less than a day - The Verge

Microsoft's Tay chatbot returns briefly and brags about smoking weed | Mashable

AI Expert Explains Why Microsoft's Tay Chatbot Is so Racist

Microsoft's new AI chatbot Tay removed from Twitter due to racist tweets.

Microsoft's Tay is an Example of Bad Design | by caroline sinders | Medium

What Microsoft's 'Tay' Says About the Internet