AMAZON'S Alexa Could Soon Mimic Voice Of Dead Relatives!


File image: Laura Labovich, background, and her children Asher, right, 13, and Emerson, left, 10, with the family “Alexa,” an artificial intelligence device, on January 29, 2017 in Bethesda, MD.

Some might find this news story silly. We find it just MACABRE enough to share with you. Amazon’s Alexa may have the potential to mimic the voices of our loved ones - even if they’re DEAD!

The capability, unveiled at Amazon’s Re:Mars conference in Las Vegas, is in development and would allow the virtual assistant to mimic the voice of a specific person based on less than a minute of provided recording.

Rohit Prasad, senior vice president and head scientist for Alexa, said at the event Wednesday that the desire behind the feature was to build greater trust in users’ interactions with Alexa by adding more “human attributes of empathy and affect.”

“These attributes have become even more important during the ongoing pandemic when so many of us have lost ones that we love,” Prasad said. “While AI can’t eliminate that pain of loss, it can definitely make their memories last.”

In a video played by Amazon at the event, a young child asks “Alexa, can Grandma finish reading me the Wizard of Oz?” Alexa then acknowledges the request, and switches to another voice mimicking the child’s grandmother. The voice assistant then continues to read the book in that same voice. WEIRD!!!

To create the feature, Prasad said the company had to learn how to make a “high-quality voice” from a shorter recording, as opposed to hours of recording in a studio. Amazon did not provide further details about the feature, which is bound to spark more privacy concerns and ethical questions about consent.

Amazon’s push comes as competitor Microsoft earlier this week said it was scaling back its synthetic voice offerings and setting stricter guidelines to “ensure the active participation of the speaker” whose voice is recreated.

“This technology has exciting potential in education, accessibility, and entertainment, and yet it is also easy to imagine how it could be used to inappropriately impersonate speakers and deceive listeners,” said a blog post from Natasha Crampton, who heads Microsoft’s AI ethics division.

This might be a feature some need, as grieving the loss of a loved one is very hard, but one would think it could become VERY strange and off-putting after a while. Loss is an important part of life, and we must learn to let go and keep living, but THIS news might have people just reliving trauma for years to come…

Stay up to date with “The Dark Side Of Pop Culture” by following MacabreDaily on Instagram, Facebook, and Twitter.