
Amazon’s Alexa Could Soon Mimic Voice Of Dead Relatives


The feature, introduced at Amazon’s Re:Mars conference on Wednesday, June 22, 2022, in Las Vegas, is still under development and would enable the virtual assistant to impersonate the voice of a particular individual based on a recording shorter than a minute long.

Amazon’s Alexa may eventually be able to imitate family members’ voices, even if they are deceased.

Speaking at the event on Wednesday, Rohit Prasad, senior vice president and lead scientist for Alexa, said the goal of the feature is to build more trust in customers’ interactions with Alexa by adding more “human traits of empathy and compassion”.

Prasad noted that these qualities have become even more important because so many people have lost loved ones during the ongoing pandemic. “AI can surely help their memories last, even though it can’t take away that agony of loss,” he said.

In a video Amazon played during the event, a small child asks, “Alexa, can Grandma finish reading me The Wizard of Oz?” After acknowledging the request, Alexa switches to a different voice that imitates the child’s grandmother and continues reading the book in that voice.

To build the feature, Prasad said, the company had to work out how to produce a “high-quality voice” from a short recording rather than from hours in a recording studio. Amazon did not explain the feature further, and it is sure to raise more privacy concerns and ethical questions about consent.

Amazon’s move comes after Microsoft said earlier this week that it was scaling back its synthetic voice capabilities and tightening its rules to “guarantee the active participation of the speaker” whose voice is replicated. On Tuesday, Microsoft said it is restricting which customers can use the service and highlighting permitted uses of the programme, such as an interactive Bugs Bunny character at AT&T stores.

According to a blog post by Natasha Crampton, who oversees Microsoft’s AI ethics team, “this technology has enormous possibilities in education, accessibility, and entertainment, but it is also easy to envisage how it may be used to inappropriately impersonate speakers and deceive listeners”.
