Blog - Creative Collaboration

China wants to regulate AI’s emotional impact

December 29, 2025

China is drafting stricter AI regulations that could make it the first country to regulate the emotional repercussions of chatbot companions.


Detailed in a new draft proposal written by China’s Cyberspace Administration and translated by CNBC, the policy would require guardian consent for minors to engage with chatbot companions as well as sweeping age verification. AI chatbots would not be allowed to generate gambling-related, obscene, or violent content, or engage in conversations about suicide, self-harm, or other topics that could harm a user’s mental health. In addition, tech “providers” must institute escalation protocols that connect human moderators to users in distress and flag risky conversations to guardians.

Chinese regulators say the aim is to focus not only on content safety but also on emotional safety, including monitoring chats for signs of emotional dependency and addiction.


It is among the first sets of rules designed to govern anthropomorphic AI tools specifically, experts say. To that end, the rules would apply to any AI tool designed to “simulate human personality and engage users emotionally through text, images, audio or video,” CNBC reports.


China’s proposed rules mirror several provisions of a recently passed California AI law, SB 243, signed by Gov. Gavin Newsom in October. That law requires stronger content restrictions, reminders to users that they are speaking to a non-human AI, and emergency protocols for discussions of suicide. Some experts have criticized the bill for not going far enough to protect minors, leaving room for tech companies to dodge oversight.

Meanwhile, the Trump administration has stalled further AI regulation at the state level in favor of a “national framework on AI safety”: its executive order withholds federal infrastructure funding from states that strengthen AI oversight. Federal leaders argue that increased regulation of AI will stall domestic innovation and put the U.S. behind China in the perceived global AI race.


    © CC Startup, Powered by Creative Collaboration. © 2020 Creative Collaboration, LLC. All Rights Reserved.
