Fake Biden robocall creator suspended from voice AI company ElevenLabs

January 27, 2024

With mainstream artificial intelligence tools on the rise ahead of the 2024 U.S. presidential election, AI-generated disinformation isn't just a fear; it's already a reality. On January 22, the New Hampshire Department of Justice announced that residents had received a robocall featuring an audio deepfake of Joe Biden telling them not to vote in the state's primary election. The call encouraged voters to "save" their vote, falsely claiming that "your vote makes a difference in November, not this Tuesday."

Days later, AI startup ElevenLabs suspended the creator of the fake Biden audio, Bloomberg reported.


ElevenLabs makes an AI voice generator powered by a model that, according to the company's website, can add human-like inflection to a voice based on context. The tool offers thousands of pre-made AI voices to choose from, and users can also create custom ones. Bloomberg reported that voice-fraud detection company Pindrop Security Inc. determined the fake Biden robocall was made using ElevenLabs.

"We are dedicated to preventing the misuse of audio AI tools and take any incidents of misuse extremely seriously," ElevenLabs told Bloomberg. The company's website states that deepfakes of politicians may be used only in certain cases, including caricature, parody, or satire. Once ElevenLabs was made aware of the Biden deepfake, it investigated and suspended the account responsible, a source told Bloomberg.

In an interview with The Hill, Kathleen Carley, a computer science professor at Carnegie Mellon University, said the Biden robocall is "the tip of the iceberg" in terms of attempts to suppress voters, adding that it is a harbinger of what could come.

ChatGPT developer OpenAI is already trying to quell misinformation, having released plans to protect the integrity of the election. Soon after, the company suspended a developer who made a bot for a long-shot Democratic candidate.

As such, we must be vigilant about what we see, and hear, this election season. As Mashable tech reporter Cecily Mauran warned, "The idea of an internet dominated by AI-generated content is already happening and it doesn't look good."

Topics
Artificial Intelligence
Joe Biden


    © CC Startup, Powered by Creative Collaboration. © 2020 Creative Collaboration, LLC. All Rights Reserved.

    No Result
    View All Result
    • Home
    • Blog
    • Android
    • Cars
    • Gadgets
    • Gaming
    • Internet
    • Mobile
    • Sci-Fi

    © CC Startup, Powered by Creative Collaboration. © 2020 Creative Collaboration, LLC. All Rights Reserved.

    Get more stuff like this
    in your inbox

    Subscribe to our mailing list and get interesting stuff and updates to your email inbox.

    Thank you for subscribing.

    Something went wrong.

    We respect your privacy and take protecting it seriously