Apart from being a big failure, Tay is an important lesson for all AI engineers. The virtual robot that tarnished Microsoft’s reputation is an example of why the “Silicone Valley Society” is bent on asking for forgiveness, and not permission.
Company safety policies can sometimes be a drag. Even the fact that you should click on “safely remove hardware” every time you use a memory stick is stipulated in the safety regulations for some companies. And there’s nothing wrong with being careful, but this type of paranoia leads to people who skip steps because they want to do their work.
Modern generations of engineers are used to the “What’s the worst that could happen?” question. They skip steps because they are used to unnecessary and useless imposed ones. They didn’t check, double checked and triple checked Tay because they thought that the robot’s code was correctly written.
Its messages might be a little hard to comprehend at first, but the bot will learn from the kind human beings that roam the internet, and it will have pleasant conversations about the weather, the NFL, and the next Mars mission.
Maybe even a tweet or two about Justin Bieber and the latest MTV show that is popular among teens because ultimately Tay was designed to interact with millennials. The dwellers of the internet, the trolls of the virtual environment, masters of ruining the hopes and aspirations of young and naïve artificial intelligence.
And that is where chaos ensued. Tay was the equivalent of shy at the beginning of the day in which she was launched. But her attitude was dependent on the humans it interacted with. And human nature is deceiving, to say the least.
And Microsoft learned that the hard way by the end of Tay’s first and last day in the social media environment. From an eager to learn AI she became a full-on Nazi supporter of misogynistic vegan haters.
Tay is an important lesson for all AI engineers because when it comes to human nature, you have to protect your robot as best as you can.
The teams that are responsible for testing artificial intelligence before it hits the market should be made out of people with a lot of imagination. They must think about the worse scenario, make it even worse, test it, make it react appropriately and then repeat the process a couple of thousands of times.
Because if you read “The Lord of the Flies” you know that human nature is bent on destruction. Tay is an important lesson for all AI engineers when it comes to launching a product that was not fully tested.
Image source: Wikimedia
Latest posts by Greg Reid (see all)
- 50M Pakistani at Risk of Arsenic Poisoning - Aug 26, 2017
- Super Mario Odyssey Takes the Eponymous Hero on a New Journey - Jul 1, 2017
- The Sea Level Has Increased at Almost Double the Pace in 11 Years - Jun 27, 2017