Major AI players agree to give US government early AI model access

May 6, 2026

That was quick.

Some of the biggest AI companies have just agreed to provide the U.S. government with early access to their new AI models. The move comes just one day after a New York Times report detailed how the Trump administration was exploring government oversight of new AI models.

According to a new report from the Wall Street Journal, three of tech’s biggest AI companies — Google, Microsoft, and xAI — have all reached an agreement with the Trump administration to provide access to new frontier models before they are released to the public.

The three companies will provide this access to the Commerce Department's Center for AI Standards and Innovation (CAISI), which will evaluate new AI models for their capabilities and security. OpenAI and Anthropic reached a similar agreement with the Commerce Department in 2024.


CAISI has already completed more than 40 evaluations of AI models before their release to the public.

“Independent, rigorous measurement science is essential to understanding frontier AI and its national security implications,” CAISI director Chris Fall said to the WSJ. “These expanded industry collaborations help us scale our work in the public interest at a critical moment.”

Earlier this week, the WSJ also reported that the Trump administration is considering a "cybersecurity-focused executive order," which would create an oversight group tasked with setting standards for AI models.

These developments come in the wake of the Trump administration's feud with AI company Anthropic earlier this year. The U.S. government declared Anthropic and its AI chatbot Claude a supply chain risk to national security after the company asked the administration not to use its technology for warfare or mass surveillance.

The Trump administration has otherwise taken a strongly pro-AI stance, citing the need for U.S. companies to maintain an edge over their Chinese rivals.
