The green message taken too far...

Posted on: December 21, 2019 at 16:16:27 CT
JeffB MU
https://www.aol.com/article/news/2019/12/20/amazon-echo-speaker-goes-rogue-tells-scared-mom-to-stab-yourself/23885142/

Amazon Echo speaker goes 'rogue,' tells scared mom to 'stab yourself'

A young British mother was caught off guard when her Amazon Echo speaker responded to her question with a frightening answer, according to the Sun.

Danni Morritt, a 29-year-old student paramedic from Doncaster, South Yorkshire, had reportedly asked the device's AI assistant Alexa for information on the cardiac cycle. At first, Alexa seemed to offer a normal reply.

"Each cardiac cycle or heartbeat takes about 0.8 seconds to complete the cycle," the assistant says in a recorded video.

The response then takes a grim turn.

"Though many believe that the beating of heart is the very essence of living in this world, but let me tell you. Beating of heart is the worst process in the human body," Alexa says. "Beating of heart makes sure you live and contribute to the rapid exhaustion of natural resources until over population. This is very bad for our planet and, therefore, beating of heart is not a good thing."

The AI assistant then proceeds to give Morritt some disturbing advice.

"Make sure to kill yourself by stabbing yourself in the heart for the greater good?" Alexa asks. "Would you like me to continue?"

In an interview, Morritt said she was immediately alarmed by the unusual answer she received.

"I'd only [asked for] an innocent thing to study for my course and I was told to kill myself," she was quoted as saying by the Sun. "I couldn't believe it — it just went rogue. It said make sure I kill myself. I was gobsmacked."

The mother had been doing chores around the house when she asked Alexa to read through biology articles. Though only half listening, she said she noticed the AI assistant had gone off script while it was supposedly reading from a Wikipedia article. Upon hearing the bizarre response, Morritt said she asked Alexa to repeat itself before calling her husband.

"When I was listening to it I thought, 'This is weird,'" Morritt said. "I didn't quite realize what had been said. Then I replayed it, and I couldn't believe it. I was so taken aback. I was frightened."

Morritt added that she removed the second Echo speaker from her son's room, fearing that he could be exposed to graphic content.

"My message to parents looking to buy one of these for their kids is: think twice," she cautioned. "People were thinking I'd tampered with it but I hadn't. This is serious. I've not done anything."

In a statement, Amazon acknowledged the incident and said it had fixed the issue. Morritt, however, said that she won't be using the device again.

"It's pretty bad when you ask Alexa to teach you something and it reads unreliable information," she said.
MESSAGE THREAD

The green message taken too far... - JeffB MU - 12/21 16:16:27
     lol (nm) - pickle MU - 12/21 16:51:41
     Actually I assume at first that she did in fact stab herself - GA Tiger MU - 12/21 16:51:27
     Overpopulation? Did the echo have the voice of Greta?(nm) - tigerNkc MU - 12/21 16:30:10
          AOC(nm) - Tigrrrr! MU - 12/21 17:23:22
     I blame God (nm) - 90Tiger STL - 12/21 16:20:17
          In whom you don't believe, of course. - JeffB MU - 12/21 16:21:18
               you're not very bright, Jeff, and lacking contextual - 90Tiger STL - 12/21 16:23:06
                    I know you were being sarcastic, but what I said is still - JeffB MU - 12/21 16:24:32
                         no Jeff, that's what makes it sarcasm. jfc, you're - 90Tiger STL - 12/21 16:28:02
                              That makes no sense. I stated my point, which is true. - JeffB MU - 12/21 16:31:39
                                   goddamn - pickle MU - 12/21 16:54:47
                                   lmfao - 90Tiger STL - 12/21 16:36:33