Amazon's Alexa necromancy takes us one step further from God's light

As reported by Gizmodo, Amazon demonstrated a surprising potential Alexa feature at its re:MARS conference in Las Vegas late last month: a function that would let the digital assistant synthesize a voice from audio samples "less than a minute" long, effectively allowing the smart device to impersonate the original speaker.

Amazon demonstrated a potential use with a video clip of a child asking Alexa to have grandma finish reading him The Wizard of Oz. The digital assistant obliges, and then reads the boy a passage about the Cowardly Lion in what is presumably the grandmother's voice. Prior to the clip, Amazon SVP and head scientist Rohit Prasad singled out the technology's potential to invoke the memories of lost loved ones.

It's hard to know where to begin with this thing, but my thoughts first turn to the presentation of the Tesla Bot, which was just somebody in a morph suit, or Sony's E3 2005 trailer for Killzone 2, which consisted entirely of pre-rendered footage presented as real-time gameplay. We're living in the age of the dramatized proof of concept, and I default to skepticism when it comes to tech presentations, but Prasad does speak about this technology as something the company has already developed.

With that out of the way, we can get to the staggering practical and ethical concerns posed by this idea. Deepfake tech is already a problem on a post-truth internet, so who thought it was a good idea to give consumer electronics the ability to impersonate people's voices?

I also have to admit I'm already strongly biased against similar sorts of technology: I find the use of CGI de-aged or resurrected actors to be in incredibly poor taste. We can't let artists like Carrie Fisher or Peter Cushing die with dignity; we have to stretch their likenesses over digital zombies for years after their deaths because we can't let sixty-year-old characters go. Now it's time to bring that same arrested-development-enabling desecration of the dead to the general populace.

I also wonder if anyone's thought through the unfortunate implications of giving a loved one's voice to a digital assistant. "Oh yeah, my nonna lives in a plastic box on the shelf and reads the New York Times daily brief to me." We're getting closer and closer to the beautiful dream of torturing simulacra of our loved ones for eternity in little isocubes.

Nikola Tesla really did say it best: "You will live to see man-made horrors beyond your comprehension." Buddy, the man-made horrors exceeded my comprehension a long time ago. I can't wait for my own voice to one day be sold as a product to my loved ones after my passing.

Associate Editor

Ted has been thinking about PC games and bothering anyone who would listen with his thoughts on them ever since he booted up his sister's copy of Neverwinter Nights on the family computer. He is obsessed with all things CRPG and CRPG-adjacent, but has also covered esports, modding, and rare game collecting. When he's not playing or writing about games, you can find Ted lifting weights on his back porch.