Google I/O: Project Astra can tell where you live just by looking out the window

Google has a new AI agent that can tell you things about what’s around you. A lot of things.

Called “Project Astra,” it’s a Gemini-based multimodal AI tool that lets you point your phone’s camera at real-life stuff and get a spoken description of what you’re looking at.
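Google hasn't published an Astra API, but the public Gemini models already accept mixed image-and-text prompts, which is the basic idea behind the demo. As a rough, hedged sketch of that pattern (not Astra itself), here is what a single camera frame plus a question might look like with the google-generativeai Python SDK; the model name, file name, and API key are placeholder assumptions:

```python
# Minimal sketch: describe a camera frame with a multimodal Gemini prompt.
# Assumptions: the google-generativeai SDK, a "gemini-1.5-flash" model, and
# a frame already saved to "frame.jpg" -- this is not the Astra pipeline.
import google.generativeai as genai
from PIL import Image

genai.configure(api_key="YOUR_API_KEY")  # placeholder key
model = genai.GenerativeModel("gemini-1.5-flash")

frame = Image.open("frame.jpg")  # one frame captured from the phone's camera
response = model.generate_content(
    [frame, "Describe what I'm looking at in one or two sentences."]
)
print(response.text)  # Astra would go further and speak this aloud via TTS
```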

In a demo shown during Google’s I/O conference on Tuesday, the tool was pointed at a loudspeaker and correctly identified part of it as a tweeter. Far more impressively, the phone’s camera was then turned toward a snippet of code on a computer display, and Astra gave a fairly detailed overview of what the code does.

Next, the person testing Project Astra turned their phone toward the window and asked, “What neighborhood do you think I’m in?” After a few seconds, Gemini replied, “This appears to be the King’s Cross area of London,” along with a few details about the neighborhood. Finally, the tool was asked to find a misplaced pair of glasses, and it complied, saying exactly where the glasses had been left.


In perhaps the most interesting part of the video, we see that those glasses are actually some kind of smart glasses, which can again be used to prompt Gemini about what the wearer sees, in this case offering a suggestion about a diagram drawn on a whiteboard.

According to Google DeepMind CEO Demis Hassabis, something like Astra could be available on both a person’s phone and glasses. The company did not share a launch date, however, though Hassabis said that some of these capabilities are coming to Google products “later this year.”
