Darth Vader's voice is now AI-generated
Any future appearance of Darth Vader in the Star Wars universe will no longer be voiced by James Earl Jones himself. During production of the Obi-Wan Kenobi series, the actor signed off on letting Disney replicate his vocal performance as Darth Vader in future projects. Disney has done this with the help of Respeecher, an AI voice-cloning tool from a Ukrainian company of the same name, which uses deep learning to model and replicate human voices in a way that's almost indistinguishable from the real thing.
This isn't the first time Lucasfilm has turned to Respeecher: the tool was used to recreate Mark Hamill's voice in The Mandalorian, so the studio figured it could work for Darth Vader, too. Jones has voiced the iconic Star Wars villain for 45 years, and now, at 91, he was ready to retire the character. Respeecher used archival recordings of Jones to build a voice model that another actor can "perform" through, using the company's speech-to-speech technology.
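For readers curious about the mechanics, the sketch below shows the general shape of a speech-to-speech voice conversion pipeline: content extracted from a new actor's performance is recombined with a speaker embedding learned from archival recordings. This is a minimal, hypothetical illustration in PyTorch, not Respeecher's actual architecture or API; every class, name, and dimension here is an assumption made for clarity.

```python
# Conceptual sketch of a speech-to-speech voice conversion pipeline.
# NOT Respeecher's implementation; the modules below are untrained
# placeholders meant only to show the data flow:
#   source performance -> content features -> audio in the target voice.
import torch
import torch.nn as nn

class ContentEncoder(nn.Module):
    """Strips speaker identity, keeps what was said and how (prosody)."""
    def __init__(self, n_mels=80, d_model=256):
        super().__init__()
        self.net = nn.GRU(n_mels, d_model, batch_first=True)

    def forward(self, mel):              # mel: (batch, frames, n_mels)
        content, _ = self.net(mel)
        return content                   # (batch, frames, d_model)

class VoiceDecoder(nn.Module):
    """Re-synthesizes the performance in the target speaker's voice."""
    def __init__(self, d_model=256, d_speaker=128, n_mels=80):
        super().__init__()
        self.proj = nn.Linear(d_model + d_speaker, n_mels)

    def forward(self, content, speaker_emb):
        # Broadcast the fixed speaker embedding across every frame.
        spk = speaker_emb.unsqueeze(1).expand(-1, content.size(1), -1)
        return self.proj(torch.cat([content, spk], dim=-1))

encoder, decoder = ContentEncoder(), VoiceDecoder()
source_mel = torch.randn(1, 400, 80)   # stand-in for the new actor's studio take (mel frames)
target_speaker = torch.randn(1, 128)   # stand-in for an embedding learned from archival recordings

content = encoder(source_mel)
converted_mel = decoder(content, target_speaker)   # same words and timing, target voice
print(converted_mel.shape)             # torch.Size([1, 400, 80])
# A neural vocoder would then turn converted_mel back into a waveform.
```

In a real system, the encoder and decoder would be trained on large amounts of speech and the speaker embedding derived from the archival recordings themselves; the point of the sketch is simply that the new actor supplies the performance while the model supplies the voice.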
Vanity Fair details the difficulties Respeecher faced during the show's production. The company is also aware of how this kind of technology could be misused. In its ethics statement, it says the firm "does not allow any deceptive uses of our technology" and "does not use voices without permission when this could impact the privacy of the subject or their ability to make a living."
Respeecher isn't the only company dabbling in voice cloning, but it's notable to see the technology deployed in a major franchise, and worth watching what that means for the entertainment industry going forward.