Nvidia unveils $3,000 desktop AI computer for home researchers

January 7, 2025

On Monday, Nvidia announced Project DIGITS, a small desktop computer aimed at researchers, data scientists, and students who want to experiment with AI models—such as chatbots like ChatGPT and image generators—at home. The $3,000 device, which contains Nvidia’s new GB10 Grace Blackwell Superchip, debuted at CES 2025 in Las Vegas. It will launch in May and can operate as a standalone PC or connect to a Windows or Mac machine.

At CES on Monday, Nvidia CEO Jensen Huang described the new system as “a cloud computing platform that sits on your desk.” The company also designed Project DIGITS as a bridge between desktop development and cloud deployment. Developers can create and test AI applications locally on Project DIGITS, then move them to cloud services or data centers that use similar Nvidia hardware.

The GB10 chip inside the Project DIGITS computer combines an Nvidia Blackwell GPU with a 20-core Grace CPU based on Arm architecture. Nvidia developed the chip in partnership with MediaTek, and it connects to 128GB of memory and up to 4TB of storage inside the Project DIGITS enclosure.

Running AI models locally

Currently, many people use AI models that must run in remote data centers because of their computational requirements. Over time, there has been a movement to slim down some AI models so they can run effectively on local, personally owned hardware. Project DIGITS can provide some of that capability at home.

A single Project DIGITS unit can reportedly run AI models with up to 200 billion parameters, while two linked units can handle models with 405 billion parameters. Parameter count roughly tracks a neural network's size and complexity: more parameters require more memory and compute to run. Parameter count also loosely approximates capability, though models of the same size can perform very differently depending on how they were trained and architected.
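Why 200 billion parameters on a machine with 128GB of memory? A rough back-of-envelope sketch makes the relationship between parameter count and memory concrete. The bytes-per-parameter figures below are illustrative assumptions (common quantization levels), not Nvidia-published specs, and real runtimes need extra headroom for activations and framework overhead:

```python
# Back-of-envelope estimate of the memory needed to hold a model's weights.
# bytes_per_param depends on numeric precision: 2.0 for FP16, 1.0 for INT8/FP8,
# 0.5 for 4-bit quantization. Weights only -- activations, caches, and
# framework buffers add more on top.

def weight_memory_gb(params_billions: float, bytes_per_param: float) -> float:
    """Approximate memory (in GB) required to store the weights alone."""
    return params_billions * 1e9 * bytes_per_param / 1e9

# A 200-billion-parameter model quantized to 4 bits per weight:
print(weight_memory_gb(200, 0.5))  # 100.0 GB -- fits under 128GB
# The same model at FP16:
print(weight_memory_gb(200, 2.0))  # 400.0 GB -- far beyond a single unit
```

Under these assumptions, a 200-billion-parameter model fits in 128GB only when aggressively quantized, which is why headline parameter counts usually imply reduced-precision weights.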
