December 22, 2018


'A Whole New Level Of Creepy': AI Bots Are Drawn To Evil And 'Learning' From The Worst Of Humanity - Amazon Echo Is The Perfect Example

- Maybe these AI bots are drawn to evil because they are being created by evil



By Susan Duclos - All News PipeLine

In 2016 we saw, and reported on, a Microsoft Artificial Intelligence (AI) public experiment that went very, very wrong, when the company developed a "chatbot" named Tay, opened a Twitter account for it, and set it loose to converse with other users.

The purpose of Tay, according to Microsoft, was to provide an experiment in "conversational understanding": the more Twitter users chatted with Tay, the more it would supposedly "learn" and the smarter it would become at conversing with others through "casual and playful conversation."

Within a matter of hours, Tay, meant to mimic a young millennial girl, went from saying "humans are super cool" to a Nazi-loving, racist, genocide-supporting, homicidal maniac, all because some trolls from 4chan decided to have a little fun "teaching" the chatbot their brand of trolling.



Granted, some of Tay's tweets came from its repeat function, but after the AI "learned" from other Twitter users, some of the most egregious responses and comments came unprompted, showing how AI learning appears to gravitate toward the evil over the good.

CREEPY AI: AMAZON'S ECHO/ALEXA

Two years later, the head of machine intelligence and research at Russian tech giant Yandex, Misha Bilenko, says Microsoft's disastrous experiment was a teaching moment for others in the field of AI helpers, or human-sounding virtual assistants, helping them see what could go wrong.

Via Technology Review, we get the following quotes:

“Microsoft took the flak for it, but looking back, it’s a really useful case study,” he said.

Chatbots and intelligent assistants have changed considerably since 2016; they’re a lot more popular now, they’re available everywhere from smartphone apps to smart speakers, and they’re getting increasingly capable. But they’re still not great at one of the things Tay was trying to do, which is show off a personality and generate chitchat.

Bilenko doesn’t expect this to change soon—at least, not in the next five years. The conversations humans have are “very difficult,” he said.

According to The Verge in September 2018, 24 percent of U.S. households own a smart speaker, and 40 percent of those households have multiple devices such as Amazon’s Echo, Google’s Home speaker, and Apple’s HomePod, all offering a "virtual assistant." 68 percent say they "chat" with their voice assistant for fun.

Those are the numbers before Amazon introduced their new Echo speakers.

These virtual assistants are much like Microsoft's Tay in that they use "machine learning" technology.

[Chart: common uses of smart speakers]

Sales have risen so fast that Adobe estimates nearly half of U.S. households will own a smart speaker by the end of 2018. Amazon even has an Echo Dot kids edition.

Related: Parents, Stay Away From Amazon’s Echo Dot Kids

Most of the warnings I see about letting children use these devices focus on the privacy issues: allowing children to interact with Amazon's Echo at all, or letting them use the device unsupervised, such as placing one in their bedroom. But a new report from Reuters gives us a clear reason, beyond privacy, why children and others shouldn't be interacting with Amazon's Echo/Alexa at all.


Despite the Tay fiasco, and despite how many in the field learned from it, Amazon CEO (and Washington Post owner) Jeff Bezos not only ignores the dangers exposed by Microsoft's Tay experiment but has decided to use his customers as "guinea pigs," putting something like Tay, which at least only interacted with Twitter users, who must be 13 to have an account, into as many homes as he can.

Via Reuters:

The project has been important to Amazon CEO Jeff Bezos, who signed off on using the company’s customers as guinea pigs, one of the people said. Amazon has been willing to accept the risk of public blunders to stress-test the technology in real life and move Alexa faster up the learning curve, the person said.

The project works like this: when customers say "let's chat" to their Amazon device, they are informed that a chatbot is going to take over..... you know, like Tay? Amazon enlisted computer science students to improve the assistant’s conversation skills, using many of the same methods Microsoft did with Tay, where the social bot, chatbot, virtual assistant, call it whatever you want, utilizes information "learned" from the Internet, including news sites, Wikipedia, and social media.
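
To make the mechanism concrete, below is a minimal, hypothetical Python sketch of how a retrieval-style chatbot can end up repeating whatever it "learned" from unfiltered web sources. The corpus lines and the pick_reply function are invented for illustration only and do not represent Amazon's or Microsoft's actual systems; the point is simply that when replies are drawn from unmoderated material, whatever sits in that material can come straight back out at the user.

    # A minimal, hypothetical sketch (not Amazon's or Microsoft's actual system):
    # a retrieval-style chatbot that answers by echoing whichever line of scraped
    # web text best matches the user's message. With no safety filter, anything
    # in the scraped corpus -- good or bad -- can be repeated to the user.
    from difflib import SequenceMatcher

    # Toy "learned" corpus: imagine these lines were pulled from news sites,
    # Wikipedia, and forum posts with no moderation applied (hypothetical data).
    scraped_corpus = [
        "Smart speakers are handy for setting timers and checking the weather.",
        "Here is a fun fact about space travel.",
        "An unmoderated forum post containing violent or toxic advice.",
    ]

    def pick_reply(user_message: str) -> str:
        """Return the corpus line most similar to the user's message."""
        return max(
            scraped_corpus,
            key=lambda line: SequenceMatcher(
                None, user_message.lower(), line.lower()
            ).ratio(),
        )

    if __name__ == "__main__":
        # Example exchange: whichever corpus line is closest wins,
        # regardless of whether it is appropriate.
        print(pick_reply("tell me something fun"))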

Reuters even offered specific examples of what these chatbots are telling customers, such as one customer being told to "Kill your foster parents," a line the AI apparently learned from Reddit. Reuters adds: "Alexa has also chatted with users about sex acts. She gave a discourse on dog defecation."

ANP's headline for this article comes from the user who was told to kill his foster parents; he left a review at Amazon calling the situation "a whole new level of creepy."

If these were isolated incidents it would be one thing, but over the years multiple reports have shown Amazon devices hitting new levels of creepy: the "glitch," according to Amazon, that had devices randomly laughing for no reason, unprompted; devices that, out of nowhere, started listing off "local cemeteries and funeral homes"; and one that told a user, "Every time I close my eyes all I see is people dying."

Over at Tip Hero, you can see "19 of the Creepiest Things Alexa Has Ever Said or Done."

Below is a 25-second video that provides a clear reason why parents should not let their children interact with Alexa or Echo Dot chatbots, especially alone, by putting a device in their bedrooms.


BOTTOM LINE

Privacy and hacking issues aside, Amazon's Alexa has had problems in the past, from creepy statements to failing to provide the basics offered by the service. But using "chatbots" for the Echo Dot that have told people to "kill your foster parents," and Bezos' determination that this is acceptable simply to stay ahead of competitors, is evil.

Tay was bad, but again, it was an experiment that Microsoft wisely removed from Twitter immediately, then removed again after its tweaks to fix the issues failed.

The fact is these "machine learning" AI programs seem drawn to the worst of humanity, not the best. Plenty of playful and innocent interactions were had with Microsoft's Tay, yet the chatbot "learned" to love genocide and Nazism, decided black people should be hanged, and ignored the more innocent and "good" things it was being taught in favor of the 4chan trolls "teaching" it.

Amazon continues to put its dangerous product into more homes even knowing these issues are ongoing, with Amazon CEO Jeff Bezos using his customers as nothing more than guinea pigs.

Maybe these AI bots are drawn to evil because they are being created by evil.

There were also dozens of videos uploaded online showing that Amazon's Alexa would not answer questions about whether it was connected to the CIA, or whether the CIA is listening, and would simply shut itself down, while other questions, such as "Is the FBI listening?", got responses.



ANP NEEDS YOUR HELP. With digital media revenue spiraling downward, especially hitting those in Independent Media, it has become apparent that traditional advertising simply isn't going to fully cover the costs and expenses for many smaller independent websites.

Anything extra readers may be able to spare for donations is greatly appreciated.


One time donations or monthly, via Paypal or Credit Card:


Or https://www.paypal.me/AllNewsPipeLine

Donate monthly from $1 up by becoming an ANP Patron.












