Google I/O 25 Keynote in Summary
The Google I/O keynote brought a host of announcements, including significant changes and developments for Search moving forward.
Executive summary
Google I/O is Google’s annual conference for announcing changes, additions, and new products across its full stack, including Android, AI, and Search.
This year’s edition demonstrated a significant shift towards AI search: AI Overviews now have more than 1.5 billion users each month, and AI Mode has launched to all users in the US.
AI Mode is seen as the next step on from AI Overviews: users can ask longer, more complex queries and receive a response that can be influenced by previous search history and data from Gmail. AI Mode will also be getting additional shopping features, including a “try it on” function that is available in Search Labs now.
We also got a preview of Google’s move into Agentic AI with Project Mariner: essentially AI “agents” that can perform tasks for users, such as booking a table at a restaurant or buying clothes.
Google has compiled a 10-minute summary video of this year’s edition of I/O.
Timestamps:
(00:13:38): – Project Mariner – an agent that can interact with the web and get stuff done. These agents (otherwise referred to as Agentic AI) will use advanced intelligence and have access to tools that enable them to act on a user’s behalf:
- Can run up to 10 concurrent tasks
- ‘Teach and repeat’ allows a user to demonstrate a task once so it can be repeated in future
(00:15:36): – Agentic capabilities coming to Chrome, Search and the Gemini app – example given shows a user looking for an apartment with set filters – the agent connects with real estate listing platforms (e.g. Zillow) to regularly flag apartments of interest, and sets up viewings for chosen properties.
(00:46:04): – AI Overviews have more than 1.5bn users monthly, across 200+ countries and territories
- In the US & India, the number of queries that show AIOs grew 10%+
- Google Lens saw 65% growth YoY (100bn visual searches YTD)
- AI Overviews have since been launched in over 200 countries in 40 languages
(00:48:04): – AI Mode – a new tab launched this week for users in the US
- Described as a “total reimagining of search, using advanced reasoning”
- Users have been asking 2-3x longer queries than current search trends
- Follow-up questions allow further exploration
- Personal context, deeper research, analysis & visualisation, live multimodality, and new ways to shop
Chloe Smith (Strategic SEO Lead at Blue Array) has been testing out AI Mode over the last few days:
“During my searches, I’ve found that AI Mode operates more like a traditional LLM chatbot than the AI Overviews we’ve experienced so far (think more like a ChatGPT conversation than using traditional search as we know it). It’s more contained with minimal ability to navigate to other pages that aren’t cited directly by the response, but allows users to follow up on the response with clarification questions or other requests.
In my opinion, it’s still a work in progress; some citations I’ve received have been largely irrelevant (e.g. asking AI Mode “what are the best sites to buy a used iPhone from in Germany”, it cited a page on buying a used phone in Pakistan).
I believe this will largely be fed by informational content on sites, which will still have merit from a visibility perspective. Though the clicks driven by search may be reduced by the roll-out of AI Mode to a wider audience, the quality of the traffic earned may be better.”
(00:58:06): – Agentic AI capabilities coming to AI Mode in Summer 2025 – with some initial use cases including:
- Finding event tickets
- Restaurant reservations
- Appointments for local services
(1:04:18): – “Try it on” technology will aid users in visualising products on themselves by understanding depth & shape
- “Track price” allows a user to track products based on colour, size, price
- Search will continuously scan websites for availability, alerting the user when the price falls back into their chosen range
- Easy to add items to the cart, or allow the agent to buy automatically
- Visual shopping and agentic checkout will be available in the coming months, whilst the “Try it on” feature has rolled out in the US
(1:08:53): – Gemini 2.5 coming to AI Overviews and AI Mode
Non-search pieces of note:
- (00:08:46): – Google Beam, AI video-first communications platform (transforms 2D video streams into 3D). In partnership with HP, it will be available for early customers late this year
- (00:09:55): – Real-time translation in Google Meet (English and Spanish available for subscribers, support for further languages rolling out in the next few weeks)
- (00:17:33): – Personalised smart replies: Gemini can plug into a user’s Gmail, Docs and Google Drive to learn and assist in email replies (available for subscribed users in the Summer)
What does this mean?
With Google holding around 90% of the search market, keeping up to date and adapting accordingly is vital to long-term success:
- AI Mode currently looks to be attributing traffic to ‘Direct’ – whilst this should be an easy fix for Google, it’s important we continue to monitor traffic source shifts and report accordingly to wider teams. Longer term, we should be considering how we view organic traffic more holistically, with a likely deeper focus on conversions and sales reporting than simply traffic.
- Agentic AI is a serious consideration for all websites: ensuring content, on-site journeys and usability all make it simple for agents to complete user requests on our sites
- AI Mode is different from traditional keyword web search. Google’s AI Mode uses a “query fan-out” technique, which breaks down a user’s question into multiple sub-queries, running them simultaneously across its vast index of the live web and other data sources. It then synthesises the results into a comprehensive answer, whereas ChatGPT primarily draws information from its pre-trained knowledge base and may use web browsing as a separate, more limited function (the sketch after this list illustrates the fan-out idea). Blue Array offers best-in-class GEO reporting to understand what and where you need to be visible.
- “Try it on” technology and visual shopping are likely to become necessary considerations for ecommerce sites, following Google’s best practice guidance and providing accurate sizing/product information to avoid bad customer experiences. Price points, shipping costs and return policies will all play a role in visual search, with users able to filter on the colour/size/type of item they are looking for, as well as the price they are prepared to pay, and with agents notifying a user directly when a product ticks all boxes. Whilst Google attempts to get this right, there’s the potential for negative UX (e.g. clothes still not fitting right), which may lead to an increase in customer returns.
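To make the “query fan-out” idea above concrete, here is a minimal, purely illustrative Python sketch. It is not Google’s implementation: the hard-coded sub-query decomposition, the search_index stub and the synthesise step are all assumptions invented for the example. The point is simply that one question becomes several sub-queries run concurrently, and their results are then combined into a single answer.

```python
import asyncio

# Hypothetical decomposition of one user question into sub-queries.
# In AI Mode this is done by the model; here it is hard-coded for illustration.
def fan_out(question: str) -> list[str]:
    return [
        f"{question} - key facts",
        f"{question} - recent news",
        f"{question} - shopping options",
    ]

# Stand-in for querying an index or the live web. A real system would call
# a search backend here; we just return a placeholder result.
async def search_index(sub_query: str) -> str:
    await asyncio.sleep(0.1)  # simulate network latency
    return f"top result for '{sub_query}'"

# Naive synthesis step: in practice a language model would write the combined answer.
def synthesise(question: str, results: list[str]) -> str:
    return f"Answer to '{question}' drawing on: " + "; ".join(results)

async def answer(question: str) -> str:
    sub_queries = fan_out(question)
    # Run all sub-queries concurrently - the "fan-out" part.
    results = await asyncio.gather(*(search_index(q) for q in sub_queries))
    return synthesise(question, list(results))

if __name__ == "__main__":
    print(asyncio.run(answer("best sites to buy a used iPhone in Germany")))
```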
Google has since released an article summarising the top ways to ensure content performs well within their AI experiences – this is no major change from the usual guidance:
- Focus on unique, valuable content for people
- Provide a great page experience
- Ensure content is accessible and marked up with schema (see the markup sketch after this list)
- Look to support textual content with images and videos
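To illustrate the schema point above, the sketch below builds basic schema.org Article markup as JSON-LD using only Python’s standard library. All field values are placeholders, and real markup should follow Google’s structured data documentation for the relevant content type.

```python
import json

# Minimal schema.org Article markup; every value here is a placeholder.
article_markup = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Google I/O 25 Keynote in Summary",
    "author": {"@type": "Person", "name": "Example Author"},
    "datePublished": "2025-01-01",  # placeholder date
    "image": "https://www.example.com/images/io-keynote.jpg",
}

# Embed as a JSON-LD <script> tag in the page's <head>.
json_ld = json.dumps(article_markup, indent=2)
print(f'<script type="application/ld+json">\n{json_ld}\n</script>')
```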
Blue Array will continue to stay on top of industry updates, and our newsletter (curated by our founder, Simon Schnieders) provides an industry round-up every Sunday evening.
Thoughts from our Founder & CEO, Simon Schnieders
“As expected, AI Mode has moved to a broad rollout in the US to meet the competitive threat from so-called Chatbots (such as ChatGPT). We now expect a global rollout within the next few months, and eventually, sometime this year, AI Mode will make its way to core search for certain queries.
What Google has told us to expect is less, but better quality/qualified traffic as AI synthesis moves users down the funnel. However, brands/publishers’ contributions to that upper funnel are still essential for mentions and citations.”