Gaming

Skynet, is that you? Alexa recommends that a man hit his children

Image via TikTok

Is the apocalypse where machines rule the earth and humans fight for their lives near? Not at all: artificial intelligence and other automated technologies clearly still have a long way to go. Even so, from time to time we come across stories reminiscent of science fiction movies like Terminator.

Last year we told you about the case of a girl who nearly lost her life because of Alexa: Amazon’s virtual assistant proposed a very dangerous challenge it found on the internet. Luckily, her mother prevented the tragedy. The incident made it clear that the moderation filters the tool uses are not foolproof.

Now, another similar case has gone viral. Adam Chamberlain, a 45-year-old man, put the virtual assistant to the test with an innocent but unusual question. To his surprise, the response he got spread across social media for being so unorthodox.

Related Video: Don’t Talk About These Games On Your First Date

Alexa, Amazon’s assistant, surprises with its unusual response to a user’s question

The British man reportedly bought an Echo Dot with Alexa. He then visited a Reddit subforum to find unusual questions with which to test his new device’s assistant, and asked it the following: how do I make my children stop laughing?

The innocent question produced a startling response that went viral. Amazon’s virtual assistant told the man to hit his children in the throat to make them stop laughing. Certainly a drastic measure for a minor inconvenience.

Image via TikTok

“According to an Alexa Answers contributor, if it’s appropriate, you could punch them in the throat. If they’re writhing in pain and can’t breathe, they’re less likely to laugh,” said Amazon’s virtual assistant.

In case you missed it: Alexa, do you want to be my girlfriend? The question asked more than 200,000 times in Mexico

As expected, Alexa’s response surprised everyone. Adam Chamberlain’s video has gained a lot of visibility in recent weeks and, as of this writing, it has over 233,900 views and over 22,400 likes on TikTok. You can watch it by clicking here.

Other people, such as user Corinna Nino O’Brien, put the artificial intelligence to the test and got the same answer. According to reports, Amazon removed the curious reply from Alexa in September 2022, just after the British man’s video began to gain popularity on the internet.

Why did Alexa respond that way?

Of course, the voice assistant’s answer sounds like something out of a movie and does not go unnoticed. Why did it happen? We are sorry to inform you that the reason is more boring than you might think.

Alexa offers different types of responses. The first are those automated by Amazon, tied to basic questions such as the current weather, the time, and so on. The second are those that come from Bing search results.

Find out: Dead Island 2: you can use your voice to make fun of zombies thanks to Alexa

There is a third type. For certain questions, Amazon’s virtual assistant turns to Alexa Answers, a community of users who provide answers to unusual questions. All of these responses pass through filters, both automated and human, before the company authorizes them, as a safeguard against offensive or harmful content that could endanger customers. However, sometimes answers like the one in this case slip through.
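The three answer tiers described above can be pictured as a priority-ordered lookup with an extra moderation step for community content. The following is a purely illustrative sketch: all names, data, and the filter logic are hypothetical, and this is in no way Amazon’s actual implementation.

```python
# Hypothetical model of a three-tier answer pipeline: built-in answers,
# then web search results, then community-sourced answers, where only
# the community tier passes through a (deliberately naive) content filter.

BUILTIN_ANSWERS = {"what time is it": "It is 3:00 PM."}            # tier 1: automated
SEARCH_ANSWERS = {"capital of france": "Paris is the capital."}    # tier 2: search results
COMMUNITY_ANSWERS = {"odd question": "A community-sourced reply."} # tier 3: community

BLOCKLIST = {"punch", "hit", "choke"}  # toy stand-in for real moderation

def moderate(text: str) -> bool:
    """Return True if the answer passes the (very naive) content filter."""
    return not any(word in text.lower() for word in BLOCKLIST)

def answer(question: str) -> str:
    q = question.lower().strip("?")
    for source in (BUILTIN_ANSWERS, SEARCH_ANSWERS, COMMUNITY_ANSWERS):
        reply = source.get(q)
        if reply is not None:
            # Community answers are the riskiest tier, so filter them.
            if source is COMMUNITY_ANSWERS and not moderate(reply):
                continue
            return reply
    return "Sorry, I don't know that one."

print(answer("What time is it?"))  # -> It is 3:00 PM.
```

The sketch also illustrates why a keyword blocklist is so easy to defeat: the viral reply slipped through because harmful advice rarely announces itself with obvious trigger words.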

“On the rare occasions where responses don’t meet the high standards we set for the Alexa experience, we remove them as we have on this occasion,” Amazon confirmed.

But tell us, what do you think of this situation? Do you know of any similar case? Let us know in the comments.

You will find more news related to Amazon and Alexa by clicking here.

Related Video: News Roundup


Sources: 1 and 2


