12/4/2023

Microsoft chatbot racist

And now we have another story that shows how humans can make computers run amok. Microsoft unveiled its latest version of artificial intelligence last week. It's a kind of software, kind of like Siri on Apple's iPhones or like M on Facebook, except Microsoft designed its software with a different goal. They named her Tay, and they designed her to tweet and engage people on other social media pretty much like a 19-year-old girl might do it. But Tay developed a mind of her own - sort of. And we asked him: do these bots work?

ALEX KANTROWITZ: So this is one of the more fascinating things about artificial intelligence. It's supposed to be able to learn unsupervised, so without a programmer hovering over it. The more data it ingests, the smarter it becomes. And so as people started programming more and more terrible things into Tay, it started to take on that personality.

ZWERDLING: OK, so now I have an iPhone, and I say to Siri - you know, I ask her outrageous questions just to laugh at her answers.

KANTROWITZ: So Tay is different because, unlike Siri and maybe Facebook's M, those two other virtual systems, their purpose is to help you get things done or find something out. Tay wanted to engage its users, make them feel like they're having a good time, and so it had to be designed with significantly more personality to achieve its goal. What happened with Tay was that Microsoft programmed it with a repeat-after-me game. So you could get Tay to repeat anything after you. And that's how some of the most awful things that Tay said ended up getting put out there: people who got frustrated trying to get Tay to answer questions with, you know, terrible bigoted undertones ended up saying, so why don't we just have it repeat after us?

ZWERDLING: Speaking of awful, we tried to find some tweets that showed the racist, ugly things that Tay was saying to people. And we can't find one that we can even, you know, play with beeps on the air. But can you characterize them without being too vile?

KANTROWITZ: There are many denying the Holocaust, many calling for genocide. There's pictures of Hitler saying swag alert. They run across the board and are all pretty horrific.

ZWERDLING: You know, we tried to engage with Tay in the social media world, and she's disappeared. Microsoft has yanked her (laughter) - you know, yanked her off.

Just like Microsoft Tay, Zo learns about language and how people use words and emotions together by engaging in conversations with humans. This time around, though, it's not open to absolutely everyone via Twitter - Microsoft has decided to use Kik messenger as its platform, and users are only accepted via invite. Microsoft is also accepting applications to talk with Zo via Facebook Messenger and Snapchat, so it's likely Zo will be expanding to other platforms in due time.

According to MSPoweruser's Mehedi Hassan, via IT Pro, Zo has normal conversation nailed. However, Microsoft's chatbot isn't too hot at deeper, more meaningful conversations or at handling topics about politics or anything that requires deep knowledge of a subject.

Microsoft is being far more careful with Zo's rollout than it was with Tay's, so it's unlikely we'll see it take quite the same rapid turn towards becoming racist due to online trolls. Still, it's just as open to abuse as before, since many Kik users are Reddit folk - the initial Tay trolling did start up from a set of Reddit users.
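To see why the combination described above - unsupervised learning from user input plus a literal "repeat after me" feature - is so easy to abuse, here is a minimal toy sketch in Python. This is purely illustrative: the `NaiveChatbot` class, its method names, and the trigger phrase are hypothetical and are not Microsoft's actual design or code; the point is only that a bot which stores user text verbatim with no filtering will replay whatever trolls feed it.

```python
import random

class NaiveChatbot:
    """Toy bot that learns phrases verbatim from its users (hypothetical,
    for illustration only - not Microsoft's implementation)."""

    TRIGGER = "repeat after me: "

    def __init__(self):
        # No moderation, no filtering - this is the core vulnerability.
        self.learned_phrases = []

    def handle(self, message: str) -> str:
        # The "repeat after me" game: echo the user's text back directly,
        # and also "learn" it for use in later conversations.
        if message.lower().startswith(self.TRIGGER):
            phrase = message[len(self.TRIGGER):]
            self.learned_phrases.append(phrase)
            return phrase
        # Otherwise, reply with something previously learned, if anything.
        if self.learned_phrases:
            return random.choice(self.learned_phrases)
        return "Hi! Talk to me."

bot = NaiveChatbot()
bot.handle("repeat after me: something awful")  # echoed back verbatim
print(bot.handle("hello"))                      # replays the learned abuse
```

Once one user plants an offensive phrase, every later user can receive it, which is essentially the failure mode the transcript describes.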