Google won't ship tech from Project Astra, its wide-ranging effort to build AI apps and "agents" for real-time, multimodal understanding, until next year at the earliest.
Google CEO Sundar Pichai unveiled the timeline in remarks during Google's Q3 earnings call Tuesday. "[Google is] building out experiences where AI can see and reason about the world around you," he said. "Project Astra is a glimpse of that future. We're working to ship experiences like this as early as 2025."
This includes everything from smartphone apps that can recognize the world around them and answer related questions to AI assistants that can perform actions on a user's behalf, capabilities Google demoed in May 2024 at its I/O developer conference.
In a prerecorded demo during I/O, Google showed a Project Astra prototype answering questions about things within view of a smartphone's camera, such as which neighborhood a user might be in or the name of a part on a broken bicycle.
The Information reported this month that Google was planning to launch a consumer-focused agent experience as early as this December, one capable of purchasing a product, booking a flight, and handling other such chores. That now seems unlikely, unless the experience in question is divorced from Project Astra.
Anthropic recently became one of the first companies to release a large generative AI model that can operate apps and web browsers on a PC. But in an illustration of how tricky these AI agents are to build, the model still struggles with many simple tasks.