Here's one Amazon Alexa feature I don't think I'll ever want to try. Amazon is experimenting with its AI assistant to let it mimic the voices of users' dead relatives. The company demoed the feature recently at its re:MARS conference, showing a video of a child asking Alexa to read a bedtime story in the voice of his dead grandmother.
"As you saw in this experience, instead of Alexa's voice reading the book, it's the kid's grandma's voice," said Rohit Prasad, Amazon's head scientist for Alexa AI. Prasad introduced the clip by saying that adding "human attributes" to AI systems was becoming more critical "in these times of the ongoing pandemic, when so many of us have lost someone we love."
Amazon hasn't said whether it plans to roll out the feature to the public, but it claims the technology can imitate someone's voice from just a minute of recorded audio. Voice cloning itself isn't new; audio production suites can already replicate individual voices, and the technique is sometimes used in podcasting, video games, and the film and TV industries. But I can't help being wary of this making its way into homes (again, Amazon hasn't confirmed a rollout). It's creepy, and it brings to mind the old adage: "just because you can, doesn't mean you should."