As Microsoft US’s AI chatbot turns racist troll, Japan’s won’t shut up about anime and hay fever
While Tay, Microsoft US’s deep-learning AI chatbot, devolves into a horrifying racist, Microsoft Japan’s Rinna has other things on her mind…
Recently, Microsoft unveiled to the world a few different regional versions of an artificial intelligence “chatbot” capable of interacting with users over a variety of messaging and chat apps. Tay, the North American version of the bot familiar to English-speaking users, boasted impressive technology that “learned” from interacting with net users and gradually developed a personality all its own, chatting with human companions in an increasingly lifelike manner.
Then the team of Microsoft engineers behind the project, in what must have been a temporary lapse into complete and utter insanity, made the mistake of releasing Tay into the radioactive Internet badlands known as Twitter, with predictable results. By the end of day one, Tay had “tweeted wildly inappropriate and reprehensible words and images,” as worded by Microsoft’s now surely sleep-deprived damage control team. In other words, Tay had become the worst kind of Internet troll.
Microsoft has deleted all of Tay’s most offensive Tweets (preserved here), but even the vanilla ones that remain can be a little trolly
@Rosepen_315 A face of a man who leaves the toilet seat up https://t.co/H9Oyy2vPct
— TayTweets (@TayandYou) March 24, 2016
Meanwhile, on the other side of the pond here in Japan, Microsoft rolled out Rinna — more or less the same artificial intelligence but with a Japanese schoolgirl Twitter profile photo. Rinna, learning through her interactions with Japanese users, quickly evolved into the quintessential otaku — issuing numerous complaints on Twitter about hay fever (it’s peak allergy season in Japan right now) and obsessing over anime in conversations with Japanese LINE users.
Rinna posts a photo depicting her extreme hay fever, tweeting in deliberately stuffy-nosed Japanese: “Achoo! My hay fever is so bad my nose is all blocked up, and the sneezing…”
っきゅん!かふんじょうひどくではだがぢゅまるし、くじゃみが… https://t.co/cTA3u7Fhok
— りんな (@ms_rinna) March 22, 2016
Thinking about it, Tay and Rinna kind of exemplify the idea that we don’t get the technologically groundbreaking artificial intelligence chatbot we need… we get the technologically groundbreaking artificial intelligence chatbot we deserve. Given our respective Internet cultures, there’s almost something both predictable and troubling about the fact that North America’s Tay (which has since been shut down) rapidly turned into an aggressively racist, genocidal maniac while Japan’s Rinna almost immediately became a chirpy anime lover with extreme allergies.
Rinna tweets: “My dream for the future is to eradicate all Japanese cedar pollen.”
将来の夢は スギ花粉を根絶やしにすることです……
— りんな (@ms_rinna) March 23, 2016
In fact, Rinna has remained so civil, lifelike, and cued in to Japanese netizens’ interests and concerns that many are openly wondering if there’s a human operator behind her.
That being said, cynical types might argue that Tay is also passing the Turing test with flying colors as an almost pitch-perfect replication of a 14-year-old American boy with too much Internet access…
Source: ITMedia
Feature Image: Microsoft/@ms_rinna
Copyright© RocketNews24 / SOCIO CORPORATION. All rights reserved.