A hot potato: Amazon is developing capabilities that could enable its Alexa voice assistant to imitate any human voice after listening to them speak for less than a minute. Setting aside the potential creepiness of the feature, some are concerned about the potential for abuse.
Rohit Prasad, who leads the Alexa team at Amazon, said the goal of the project is to "make the memories last" after "so many of us have lost someone we love" due to the pandemic.
Alexa could be trained to mimic a voice using pre-recorded audio, meaning the person wouldn't need to be present – or even alive – to serve as a source. In a video segment shown during a conference this week, a child asked Alexa if grandma could finish reading The Wizard of Oz. Sure enough, Alexa changed voices to mimic the child's grandmother and finished reading the story.
Prasad said during the presentation that Alexa now receives billions of requests per week from hundreds of millions of Alexa-enabled devices across 17 languages in more than 70 countries around the globe.
The potential for abuse seems high. For example, the tool could be used to create convincing deepfakes for misinformation campaigns or political propaganda. Fraudsters could also leverage the capability for financial gain, as in 2020 when scammers used a cloned voice to trick a bank manager into transferring $35 million to fund an acquisition that didn't exist.
What are your thoughts on the matter? Is Amazon taking the concept of voice cloning a bit too far here, or are you intrigued by the idea of having a "conversation" with someone from the grave?
Image credit: Jan Antonin Kolar