Scammers are using AI voices to steal millions by impersonating loved ones

March 6, 2023

TL;DR

  • AI voice-generating software is allowing scammers to mimic the voices of victims’ loved ones.
  • These impersonations have led to people being scammed out of $11 million over the phone in 2022.
  • The elderly make up a majority of those who are targeted.

AI has been a central topic in the tech world for a while now, as Microsoft continues to infuse its products with ChatGPT and Google attempts to keep up by pushing out its own AI products. While AI has the potential to do some genuinely impressive stuff — like generating images based on a single line of text — we’re starting to see more of the downside of the barely regulated technology. The latest example of this is AI voice generators being used to scam people out of their money.

AI voice generation software has been making a lot of headlines as of late, mostly for stealing the voices of voice actors. Initially, all that was required was a few sentences for the software to convincingly reproduce the sound and tone of the speaker. The technology has since evolved to the point where just a few seconds of dialogue is enough to accurately mimic someone.

In a new report from The Washington Post, thousands of victims claim they’ve been duped by imposters pretending to be loved ones. Reportedly, imposter scams have become the second most common type of fraud in America, with over 36,000 cases reported in 2022. Of those 36,000 cases, over 5,000 victims were conned out of their money over the phone, totaling $11 million in losses, according to FTC officials.

One story that stood out involved an elderly couple who sent over $15,000 through a bitcoin terminal to a scammer after believing they had talked to their son. The AI voice had convinced the couple that their son was in legal trouble after killing a U.S. diplomat in a car accident.

As with the victims in that story, these attacks appear to mostly target the elderly. This comes as no surprise, as the elderly are among the most vulnerable to financial scams. Unfortunately, courts have not yet ruled on whether companies can be held liable for harm caused by AI voice generators or other forms of AI technology.
