
Google I/O: Project Astra can tell where you live just by looking out the window

May 14, 2024

Google has a new AI agent that can tell you things about what’s around you. A lot of things.

Called “Project Astra,” it’s a Gemini-based multimodal AI tool that lets you point your phone’s camera at real-world objects and get a spoken description of what you’re looking at.

In a demo shown during Google’s I/O conference on Tuesday, the tool was pointed at a loudspeaker and correctly identified one of its components as a tweeter. Far more impressively, the phone’s camera was then turned to a snippet of code on a computer display, and Astra gave a fairly detailed overview of what the code does.


Next, the person testing Project Astra turned their phone toward the window and asked, “What neighborhood do you think I’m in?” After a few seconds, Gemini replied, “This appears to be the King’s Cross area of London,” along with a few details about the neighborhood. Finally, the tool was asked to find a misplaced pair of glasses, and it complied, saying exactly where the glasses had been left.

In perhaps the most interesting part of the video, those glasses turn out to be smart glasses, which can also be used to prompt Gemini about what the wearer sees; in this case, it offered a suggestion about a diagram drawn on a whiteboard.

SEE ALSO: Google I/O 2024: ‘AI Agents’ are AI personal assistants that can return your shoes

According to Google DeepMind CEO Demis Hassabis, something like Astra could be available on a person’s phone or glasses. The company did not share a launch date, however, though Hassabis said that some of these capabilities are coming to Google products “later this year.”


Featured video: Here’s everything that was announced at Google I/O.


Topics: Artificial Intelligence

    © CC Startup, Powered by Creative Collaboration. © 2020 Creative Collaboration, LLC. All Rights Reserved.
