Amazon’s Alexa may eventually be able to imitate family members’ voices, even if they are deceased. The feature, presented at Amazon’s Re:Mars conference in Las Vegas on Wednesday, June 22, 2022, is under development and would enable the virtual assistant to imitate a specific person’s voice based on a recording that lasts less than a minute.
The goal of the feature, according to Rohit Prasad, senior vice president and lead scientist for Alexa, who spoke at the event on Wednesday, is to increase customer confidence in their interactions with Alexa by including more “human traits of empathy and compassion.”
Since so many of us have lost loved ones during the continuing pandemic, Prasad noted, these qualities have become even more crucial. “AI can surely help their memories last, even though it can’t take away that agony of loss,” he said.
In a video Amazon presented during the event, a small child asks, “Alexa, can Grandma finish reading me The Wizard of Oz?” After acknowledging the request, Alexa switches to a different voice that imitates the child’s grandmother. The voice assistant then continues reading the book in that voice.
To produce the feature, Prasad said, the company had to figure out how to capture a “high-quality voice” from a short recording rather than hours in a recording studio. Amazon did not further explain the function, which is sure to raise more privacy issues and moral dilemmas about consent.
Amazon’s move coincides with Microsoft’s announcement earlier this week that it was scaling back its synthetic voice capabilities and tightening regulations to “guarantee the active participation of the speaker” whose voice is replicated. On Tuesday, Microsoft announced that it is restricting which customers can use the service, while also highlighting permitted uses of the programme, such as an interactive Bugs Bunny figure at AT&T stores.
According to a blog post by Natasha Crampton, who oversees Microsoft’s AI ethics team, “this technology has enormous possibilities in education, accessibility, and entertainment, but it is also easy to envisage how it may be used to inappropriately impersonate speakers and deceive listeners.”