For years, experts have been debating the issues raised by deepfake technology. Congress and the military have discussed how to prepare the country for the fraud that is likely as deepfakes become more realistic. Now, a new advancement by Amazon has taken that technology to the next level.
At Amazon’s re:MARS conference in June, Rohit Prasad — head scientist and vice president of Alexa AI — demonstrated how Amazon scientists could recreate any voice based on just a one-minute audio sample.
Amazon’s original Alexa voice debuted in November 2014. In its initial years, the voice was heavily critiqued. In 2017, VentureBeat wrote, “Alexa is pretty smart, but no matter what the A.I.-powered assistant talks about, there’s no getting around its relatively flat and monotone voice.”
Now, however, text-to-speech (TTS) technology has made major advancements towards more realistic — some argue, too realistic — speech.
As Fast Company reports, in an effort to create more expressive and natural-sounding voices, Amazon, Google, Microsoft, Baidu, and other major players in text-to-speech have all in recent years adopted some form of “neural TTS.” Neural TTS uses deep-learning neural networks trained on human speech and can convert any text input into human-sounding speech. Neural systems are capable of learning “not just pronunciation but also patterns of rhythm, stress, and intonation.”
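The neural TTS pipeline described above can be sketched at a very high level. The stages and function names below are purely illustrative stand-ins, not Amazon's or any vendor's actual implementation; in a real system, each stage is a trained neural network.

```python
# Conceptual sketch of a neural TTS pipeline (illustrative only).
# Real systems replace each stage with a trained deep-learning model.

def text_to_phonemes(text: str) -> list[str]:
    # Stage 1: the front end normalizes text into phoneme-like units.
    # Real systems use grapheme-to-phoneme models; here we simply
    # keep the alphabetic characters as stand-in "phonemes".
    return [ch for ch in text.lower() if ch.isalpha()]

def phonemes_to_acoustic_frames(phonemes: list[str]) -> list[list[float]]:
    # Stage 2: an acoustic model predicts per-frame features (such as
    # a mel spectrogram) encoding rhythm, stress, and intonation.
    # Here each "frame" is a dummy 4-value feature vector per phoneme.
    return [[float(ord(p)) / 128.0] * 4 for p in phonemes]

def frames_to_waveform(frames: list[list[float]]) -> list[float]:
    # Stage 3: a neural vocoder converts acoustic frames into audio
    # samples. We just flatten the frames as a stand-in.
    return [value for frame in frames for value in frame]

samples = frames_to_waveform(
    phonemes_to_acoustic_frames(text_to_phonemes("Hello, Alexa")))
print(len(samples))  # number of stand-in audio samples
```

Voice cloning, as demonstrated at re:MARS, conditions the middle stage on a short reference recording so the predicted frames carry that speaker's characteristics; the sketch above omits that conditioning entirely.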
Amazon hasn’t announced when this new voice-cloning capability will be available to developers and the public.
Governments everywhere are struggling to figure out how to adapt to these latest advancements. In America, most deepfakes are considered protected free speech, at least for now. Still, some states have attempted to take action against nefarious uses of the technology. In New York, commercial use of a performer's synthetic likeness without consent is banned for 40 years after the performer's death, according to CBS News. California and Texas prohibit deceptive political deepfakes before elections.
While concerns about realistic voice-cloning technology are rampant, developers of the technology, like Amazon, are optimistic. In an email to Fast Company, an Amazon spokesperson wrote: “Personalizing Alexa’s voice is a highly desired feature by our customers, who could use this technology to create many delightful experiences. We are working on improving the fundamental science that we demonstrated at re:MARS and are exploring use cases that will delight our customers, with necessary guardrails to avoid any potential misuse.”
Amazon was forced to update Alexa's settings this week after the voice assistant reportedly dared a 10-year-old child to perform a dangerous and potentially life-threatening "challenge."
A mother and her young daughter were hanging out over Christmas break, reportedly performing physical challenges around the house, when the child asked the family's Echo Dot to suggest another "challenge to do."
In response, Alexa suggested that the child “plug in a phone charger about halfway into a wall outlet, then touch a penny to the exposed prongs." It's a dangerous TikTok trend from last year that can cause anything from sparks and a ruined electrical outlet to electric shock or a large fire.
The mother, Kristin Livdahl, documented the startling suggestion on Twitter in disbelief.
OMFG My 10 year old just asked Alexa on our Echo for a challenge and this is what she said. pic.twitter.com/HgGgrLbdS8 — Kristin Livdahl (@Kristin Livdahl), December 26, 2021
"We were doing some physical challenges, like laying down and rolling over holding a shoe on your foot, from a [Phys.] Ed teacher on YouTube earlier. Bad weather outside. She just wanted another one," the mother recounted.
That's when Alexa suggested the challenge that it had “found on the web.”
In another tweet, Livdahl recalled that when the suggestion was made, "I was right there and yelled, No, Alexa, no! like it was a dog."
Thankfully, her daughter didn't participate in the challenge, telling her mother that "she is too smart to do something like that anyway."
The "penny challenge," as it is called, is one of several dangerous trends that have emerged on TikTok over the past several years. It instructs participants to create an electrical current with a loosely plugged-in wall charger before daring them to drop a coin onto the exposed prongs.
In response to the challenge, fire officials in the U.S. have issued warnings.
[Video: "Dangers of the TikTok Penny Challenge" — YouTube]
After Livdahl's tweet thread garnered considerable attention, Amazon reached out to the concerned mother. Then in a statement to BBC News, the company said that it had taken "swift action" to resolve the issue.
"Customer trust is at the center of everything we do and Alexa is designed to provide accurate, relevant, and helpful information to customers," Amazon told the news outlet. "As soon as we became aware of this error, we took swift action to fix it."
Amazon's statement was confirmed by CNBC, though that outlet noted the company did not immediately elaborate on what the "swift action" entailed.