Unsecured AI apps are leaking personal data of Android users

February 20, 2026

Not every AI tool you stumble across in your phone’s app marketplace is created equal. In fact, many of them may be a bigger privacy gamble than you’d think.

Scores of unlicensed or unsecured AI apps on the Google Play Store for Android, including some marketed for identity verification and editing, have exposed billions of records of personal data, cybersecurity researchers have confirmed.


A recent investigation by Cybernews found that one Android app in particular, “Video AI Art Generator & Maker,” leaked 1.5 million user images, over 385,000 videos, and millions of AI-generated media files. Researchers traced the flaw to a misconfigured Google Cloud Storage bucket that left users’ personal files open to outsiders. In total, the publication reported, over 12 terabytes of users’ media files were accessible via the exposed bucket. The app had 500,000 downloads at the time.
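Cybernews didn’t publish the exact misconfiguration, but the textbook form of this kind of cloud-storage leak is a bucket IAM policy that grants read access to the public principal `allUsers`. A minimal, hypothetical audit check (the policy shape mirrors Google Cloud Storage’s JSON API; the bucket and members shown are illustrative, not from the report):

```python
# Which principals make a bucket world-readable? These two are GCS's
# built-in public principals.
PUBLIC_PRINCIPALS = {"allUsers", "allAuthenticatedUsers"}

def public_bindings(iam_policy: dict) -> list[str]:
    """Return the roles a GCS-style IAM policy grants to public principals."""
    exposed = []
    for binding in iam_policy.get("bindings", []):
        if PUBLIC_PRINCIPALS & set(binding.get("members", [])):
            exposed.append(binding["role"])
    return exposed

# A leaky policy: object read access granted to everyone on the internet.
leaky_policy = {
    "bindings": [
        {"role": "roles/storage.objectViewer", "members": ["allUsers"]},
        {"role": "roles/storage.admin", "members": ["user:dev@example.com"]},
    ]
}
```

Running `public_bindings(leaky_policy)` flags `roles/storage.objectViewer` as publicly granted, which is exactly the condition that would let any outsider download users’ media files.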


Another app, called IDMerit, exposed know-your-customer data and personally identifiable information from users across 25 countries, predominantly in the U.S.

That information included full names and addresses, birthdates, IDs, and contact details, constituting a full terabyte of data. Both apps’ developers resolved the vulnerabilities after researchers notified them.

Still, cybersecurity experts warn that lax security trends among these types of AI apps pose a widespread risk to users. Many AI apps, which often store user-uploaded files alongside AI-generated content, also use a highly criticized practice known as “hardcoding secrets,” embedding sensitive information such as API keys, passwords, or encryption keys directly into the app’s source code. Cybernews found that 72 percent of the hundreds of Google Play apps researchers analyzed had similar security vulnerabilities.
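“Hardcoding secrets” means the credentials sit in plain text inside the shipped app, where anyone who decompiles the APK can read them. A rough sketch of the kind of pattern scan that can surface them in extracted app code (the regexes are illustrative examples of well-known key formats, not Cybernews’ actual tooling):

```python
import re

# Illustrative patterns for common hardcoded secrets; real scanners use
# far larger rule sets.
SECRET_PATTERNS = {
    "google_api_key": re.compile(r"AIza[0-9A-Za-z_-]{35}"),
    "aws_access_key": re.compile(r"AKIA[0-9A-Z]{16}"),
    "inline_password": re.compile(r"""password\s*=\s*["'][^"']+["']""", re.IGNORECASE),
}

def find_hardcoded_secrets(source: str) -> list[tuple[str, str]]:
    """Return (pattern_name, matched_text) pairs found in app source text."""
    hits = []
    for name, pattern in SECRET_PATTERNS.items():
        for match in pattern.findall(source):
            hits.append((name, match))
    return hits

# A line like this in decompiled Kotlin source would be flagged
# (the key below is fabricated for the example).
sample = 'val apiKey = "AIzaSyA1234567890abcdefghijklmnopqrstuv"'
print(find_hardcoded_secrets(sample))
```

Because the key ships inside the app binary rather than staying on a server, revoking and rotating it is the only real fix once the app is public.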
