Character.AI unveils AvatarFX, an AI video model to create lifelike chatbots

April 22, 2025

Character.AI, a leading platform for chatting and roleplaying with AI-generated characters, unveiled its forthcoming video generation model, AvatarFX, on Tuesday. Available in closed beta, the model animates the platform’s characters in a variety of styles and voices, from human-like characters to 2D animal cartoons.

AvatarFX distinguishes itself from competitors like OpenAI’s Sora because it isn’t solely a text-to-video generator. Users can also generate videos from preexisting images, letting them animate photos of real people.

It’s immediately evident how this kind of tech could be leveraged for abuse — users could upload photos of celebrities or people they know in real life and create realistic-looking videos in which they do or say something incriminating. The technology to create convincing deepfakes already exists, but incorporating it into popular consumer products like Character.AI only exacerbates the potential for it to be used irresponsibly.

We’ve reached out to Character.AI for comment.

Character.AI is already facing issues with safety on its platform. Parents have filed lawsuits against the company, alleging that its chatbots encouraged their children to self-harm, to kill themselves, or to kill their parents.

In one case, a 14-year-old boy died by suicide after he reportedly developed an obsessive relationship with an AI bot on Character.AI based on a “Game of Thrones” character. Shortly before his death, he’d opened up to the AI about having thoughts of suicide, and the AI encouraged him to follow through on the act, according to court filings.

These are extreme examples, but they demonstrate how people can be emotionally manipulated by AI chatbots through text messages alone. With the addition of video, the relationships people form with these characters could feel even more real.

Character.AI has responded to the allegations against it by building parental controls and additional safeguards, but as with any app, controls are only effective when they’re actually used. Oftentimes, kids use tech in ways that their parents don’t know about.
