
Deepfake Joe Biden robocalls likely made using ElevenLabs tools

Tools from AI startup ElevenLabs are being blamed for deepfake Joe Biden robocalls made to New Hampshire voters last week.

Some voters in the state claimed to have received calls impersonating the US President, telling them not to vote in the primary election. While it is not clear who was directly responsible for the calls, two teams of audio experts have told WIRED that they were likely created using technology from voice-cloning startup ElevenLabs.

Security company Pindrop, which specializes in tools that identify synthetic audio, said in a blog post last week that after analyzing the calls, the evidence pointed towards ElevenLabs’ technology or a “system using similar components.”

ElevenLabs’ AI tools are marketed for uses such as audiobooks and video games, but the public can sign up for the company’s paid service and use an audio sample to clone anyone’s voice. Its safety policy urges users to obtain someone’s permission before cloning their voice, but states that permissionless cloning is acceptable for non-commercial purposes, including “political speech contributing to public debates.”

The company’s CEO, Mati Staniszewski, said in a statement on Friday that ElevenLabs is “dedicated to preventing the misuse of audio AI tools.” The statement also said the company would assist authorities in taking action in cases of misuse.

However, this isn’t the first time ElevenLabs’ tools have been placed at the center of deepfake political propaganda. In September last year, it was claimed that TikTok accounts sharing conspiracy theories with AI-generated voices, including that of Barack Obama, were using ElevenLabs’ tools.

ElevenLabs recently raised $80 million at a $1.1 billion valuation in a new funding round, achieving “unicorn” status.

The dark side of deepfake content

This is the latest incident of AI-generated deepfake content demonstrating the dark side of the technology, amid growing calls to regulate the industry.

Last week, sexually explicit deepfake images of Taylor Swift went viral on X before the platform moved to block searches for the pop star in an attempt to stop the images from circulating further.

With images, video and now audio having the potential to be misused in such damaging ways using AI technology, Congressman Tom Kean’s recent calls for Congress to take up and pass the two bills he has introduced to help regulate AI are becoming more pertinent.

Featured Image: Gage Skidmore from Surprise, AZ, United States of America, CC BY-SA 2.0, via Wikimedia Commons

James Jones

Freelance Journalist

James Jones is a highly experienced journalist, podcaster and digital publishing specialist, who has been creating content in a variety of forms for online publications in the sports and tech industry for over 10 years.

He has worked at some of the leading online publishers in the country, most recently as the Content Lead for Snack Media’s expansive portfolio of websites, including FootballFanCast.com, FootballLeagueWorld.co.uk and GiveMeSport.com. James has also appeared on several national and global media outlets, including BBC News, talkSPORT, LBC Radio, 5 Live Radio, TNT Sports, GB News and BBC’s Match of the Day 2.

James has a degree in Journalism and previously held the position of Editor-in-Chief at FootballFanCast.com. Now, he co-hosts the popular We Are West Ham Podcast, writes a weekly column for BBC Sport and covers the latest news in the industry for ReadWrite.com.
