
ChatGPT caricature trend: What to do if OpenAI knows too much

February 13, 2026

The ChatGPT caricature trend has gone mega-viral, with countless people sharing AI-generated images of themselves on Reddit, X, and other social media platforms. These images are usually quite cute (though they can be bizarre and unsettling). A typical ChatGPT caricature depicts the user in cartoon style, surrounded by items that reflect their personality, hobbies, or profession.

You can see hundreds of examples on X based on simple prompts such as “Create a caricature of me and my job based on everything you know about me.”
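
Under the hood, this is just an image-generation request. For the curious, here is a minimal sketch of making a similar request with OpenAI's Python SDK. Note that the raw API, unlike ChatGPT, has no memory of your past chats, so the personal details the trend relies on have to be written into the prompt by hand. The model name and base64 response handling match the Images API as documented at the time of writing, but treat this as an illustration, not the mechanism ChatGPT itself uses.

```python
# Minimal sketch: generating a caricature-style image with OpenAI's Images API.
# Assumes the `openai` Python SDK is installed and OPENAI_API_KEY is set.
# Unlike ChatGPT, the API has no memory of you, so any personal context
# must be spelled out in the prompt yourself.
import base64
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

prompt = (
    "Create a cartoon-style caricature of a software engineer "
    "surrounded by items that reflect their hobbies: a guitar, "
    "a chess board, and a stack of sci-fi paperbacks."
)

result = client.images.generate(
    model="gpt-image-1",  # OpenAI's image model at the time of writing
    prompt=prompt,
    size="1024x1024",
)

# gpt-image-1 returns the image as base64-encoded data.
image_bytes = base64.b64decode(result.data[0].b64_json)
with open("caricature.png", "wb") as f:
    f.write(image_bytes)
```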



SEE ALSO: Love the caricature trend? 9 more viral ChatGPT image prompts to try.

But what if ChatGPT knows you a little too well? The more detailed and accurate your caricature, the more ChatGPT and OpenAI know about you.

For instance, when I tried to generate a ChatGPT caricature, the results were painfully bland. When I asked ChatGPT how it decided which details to include in the image, the chatbot basically admitted it simply picked generic items like headphones and coffee. “Because I don’t actually have deep personal info about you (beyond what you’ve shared in chats), I used fun but non-specific caricature tropes.” (Emphasis in original.)

So, if your caricature left you feeling a certain type of way, what can you do? It may be time to practice some digital hygiene and take a fresh look at how ChatGPT saves and uses your data.

Delete your ChatGPT chat history

ChatGPT saves a history of your previous chats, which can be helpful. However, you can delete these chats to limit the data that OpenAI has about you. To delete an individual chat, go to the “Your chats” tab in the ChatGPT sidebar. Click the three dots next to a chat and click “Delete.”


You can also delete all of your chats. To do this, click on your profile icon and click into “Settings,” then “Data controls.” Here, you can select “Delete all chats.” You may also choose to turn off the “Improve the model for everyone” setting, which allows OpenAI to use your chats for model training.

Screenshot: the Data controls settings in ChatGPT. Credit: OpenAI.

Request that OpenAI delete your account and personal data

OpenAI maintains a privacy portal where users can submit a variety of privacy-related requests:

  • Download your personal data (a sketch for skimming the resulting export follows this list)

  • Ask OpenAI not to train its products on your content

  • Delete your ChatGPT account

  • Delete your custom GPTs

  • Remove your personal data from ChatGPT responses

  • Submit privacy requests on behalf of another person

You can also send additional requests, questions, and comments directly to OpenAI using the email address [email protected].
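
If you do download your personal data, the export arrives as a .zip archive. The sketch below is one minimal way to skim what OpenAI has saved about you. It assumes the archive contains a conversations.json file holding a list of conversation objects with "title" and "create_time" fields; that format isn't formally documented and may change, so treat it as a starting point.

```python
# Minimal sketch: skimming a ChatGPT data export to see what's been saved.
# Assumes the export .zip contains conversations.json, a list of conversation
# objects with "title" and "create_time" (Unix timestamp) fields. The export
# format is not formally documented and may change.
import json
import zipfile
from datetime import datetime, timezone

EXPORT_PATH = "chatgpt-export.zip"  # hypothetical filename for your download

with zipfile.ZipFile(EXPORT_PATH) as zf:
    with zf.open("conversations.json") as f:
        conversations = json.load(f)

print(f"{len(conversations)} saved conversations")
for convo in conversations:
    created = datetime.fromtimestamp(convo["create_time"], tz=timezone.utc)
    print(f"{created:%Y-%m-%d}  {convo.get('title') or '(untitled)'}")
```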

Reconsider your relationship with AI chatbots

People use ChatGPT (and other AI chatbots) in a variety of ways, and over time, it can feel like more than a generic assistant. Some people go to Chat with deeply personal medical questions, while others treat ChatGPT as a relationship advisor, a life coach, or even a close personal friend. As Mashable has reported previously, a growing number of people are now using AI for companionship.

However, if you believe you’ve developed a parasocial relationship with large language models like ChatGPT, then it may be time to reflect on how you interact with this technology. For instance, if you’re developing an emotional reliance on ChatGPT, or if you’re starting to believe that ChatGPT is “alive” and in a relationship with you, you may want to take a break from Chat.

The long-term effects of developing an emotional reliance on AI chatbots are unknown, but experts we’ve spoken to have warned that this type of behavior may be harmful if it takes time and energy away from your other relationships, social life, and hobbies. Organizations like Common Sense Media have also warned that AI companions are unsafe for users under 18.


Disclosure: Ziff Davis, Mashable’s parent company, in April 2025 filed a lawsuit against OpenAI, alleging it infringed Ziff Davis copyrights in training and operating its AI systems.
