For the past year or so, the SEO world has been in a bit of a frenzy.
Everyone’s talking about AI Overviews. Everyone’s debating AEO (Answer Engine Optimization) and GEO (Generative Engine Optimization). People are adding llms.txt files to their sites, breaking content into smaller “chunks,” and rewriting copy specifically to sound more AI-friendly.
And Google has largely stayed quiet. Until now.
This week, Google Search Central published its first official guide on optimizing for generative AI features in Google Search. It’s called “Optimizing your website for generative AI features on Google Search” and it covers AI Overviews, AI Mode, and where things are headed with agentic search.
I read it top to bottom so you don’t have to. Here’s the honest version.
Key takeaways from Google’s official AI search optimization guide, summarized by Crunchwiser
| Topic | What Google says | Action |
|---|---|---|
| SEO vs AEO/GEO | AI Overviews pull from the same index and ranking signals as regular Search. Traditional SEO is still the foundation. | Do: keep doing SEO |
| Content quality | Non-commodity, first-hand, opinionated content stands out. Commodity “tip listicles” are easy to replicate and will lose visibility. | Do: write unique takes |
| Technical SEO | Pages must be indexed and snippet-eligible to appear in AI features. Crawlability, Core Web Vitals, and mobile UX remain the baseline. | Do: audit regularly |
| llms.txt | Google does not use llms.txt for AI Overviews. It has value for ChatGPT, Claude, and Perplexity — but not for Google Search. | Skip for Google |
| Content chunking | No need to break pages into small pieces. Google’s systems surface the relevant section from a full-length page automatically. | Skip this tactic |
| AI rewrites | Rewriting copy to “sound AI-friendly” is unnecessary. Google understands synonyms and intent without exact phrasing matches. | Skip this tactic |
| Inauthentic mentions | Manufactured brand mentions in forums or comment sections get filtered by the same spam systems that affect regular Search. | Skip this tactic |
| Structured data | No special schema boosts AI Overview visibility. Keep using it for rich results in standard Search — just not as an AI lever. | Do: for rich results |
| Agentic search | AI agents read your DOM, accessibility tree, and visual renderings. Semantic HTML and ARIA labels now have direct AI implications. | Watch: emerging |
| UCP protocol | Universal Commerce Protocol will let Search agents take transactional actions (book, buy, compare) directly on websites. Early stage. | Watch: e-commerce |
The Big Reveal: It’s Still Just SEO
The first thing Google addresses, and probably the most important one, is whether SEO still matters for AI search.
Their answer is a firm yes. And not in a hand-wavy “great content wins” kind of way. Google is specific about why: their generative AI features are literally built on top of their existing ranking and index systems.
Two techniques power what you see inside AI Overviews:
RAG (Retrieval-Augmented Generation): When Google generates an AI answer, it’s not just making stuff up from a model’s training data. It’s pulling fresh, relevant pages from its search index using its core ranking signals, then using those pages to ground the response. If your page ranks well, it has a shot at being used as a source.
Query fan-out: When someone searches something like “how to fix a lawn full of weeds,” Google doesn’t just process that one query. It generates a whole cluster of related sub-queries behind the scenes (“best herbicides for lawns,” “remove weeds without chemicals,” etc.) and retrieves results for all of them. Your page might surface for a fan-out query you’d never think to optimize for directly.
What this means practically: the same things that help you rank on Google Search help you appear in AI Overviews. There’s no secret AI layer you need to crack.
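To make the fan-out idea concrete, here's a toy sketch of the retrieval pattern described above: one query expands into several sub-queries, and a page can be pulled in for any of them. The expansion rules and the tiny "index" are entirely made up for illustration — this is the shape of the behavior, not Google's actual system.

```python
# Toy illustration of "query fan-out": a single query expands into
# related sub-queries, and pages are retrieved for all of them.
# EXPANSIONS and INDEX are hypothetical stand-ins for demonstration.

EXPANSIONS = {
    "how to fix a lawn full of weeds": [
        "best herbicides for lawns",
        "remove weeds without chemicals",
    ],
}

INDEX = {
    "how to fix a lawn full of weeds": ["lawn-care-101.example"],
    "best herbicides for lawns": ["herbicide-guide.example"],
    "remove weeds without chemicals": ["organic-lawn.example"],
}

def fan_out_retrieve(query):
    """Retrieve pages for the original query plus its generated sub-queries."""
    sub_queries = [query] + EXPANSIONS.get(query, [])
    pages = []
    for q in sub_queries:
        for page in INDEX.get(q, []):
            if page not in pages:  # dedupe while preserving order
                pages.append(page)
    return pages

print(fan_out_retrieve("how to fix a lawn full of weeds"))
```

Note that `organic-lawn.example` surfaces even though it never targeted the original query — that's the "fan-out query you'd never think to optimize for" point in miniature.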
What Google Actually Wants You to Do
Create content that couldn’t have been written by anyone else
This is where Google puts the most emphasis, and honestly it’s the part most SEOs gloss over.
Google draws a clear line between commodity content and non-commodity content.
Commodity content is the stuff that could be written by anyone, or any AI. “7 Tips for First-Time Homebuyers.” “The Ultimate Guide to Email Marketing.” Common knowledge, loosely assembled, with no unique perspective.
Non-commodity content is the opposite. It’s the piece where you share what actually happened when you tested something, what your client’s data showed, what the industry gets wrong. Something that couldn’t exist without your specific experience or expertise.
This isn’t a new idea — I wrote about why content commoditization is one of the biggest strategic risks for SEO publishers back in April, after Danny Sullivan made the same argument at WordCamp Asia. What’s notable is that Google is now saying it explicitly in official documentation. They’re not hinting at it anymore.
Google gives a great example in the guide: “Why We Waived the Inspection & Saved Money: A Look Inside the Sewer Line” is non-commodity. It’s specific, it’s opinionated, it comes from lived experience. AI systems look across many sources, and the pages that stand out are the ones with a point of view that can’t be easily replicated.
For anyone running an SEO publication, this is both a challenge and an opportunity. The space is flooded with “10 local SEO tips” content. The sites that will win in AI search are the ones publishing real analysis, real case studies, real takes.
Keep your technical house clean
Nothing groundbreaking here, but Google spells it out clearly: to be eligible to appear in generative AI features, a page must be indexed and eligible to show a snippet. That’s the baseline.
Beyond that, the usual technical stuff applies: crawlability, good page experience, mobile-friendliness, reasonable load times. Google also mentions crawl budget management for larger sites, and JavaScript SEO best practices if your site relies heavily on JS rendering.
One interesting note: they say you don’t need perfectly semantic HTML (“the web in general is not valid HTML, and Google can understand it”), but semantic markup does help AI agents navigate your page, which ties into the agentic experiences section we’ll get to.
For local and e-commerce: feed the machine
If you run a local business or an online store, Google specifically calls out Merchant Center feeds and Google Business Profiles as ways to get your products and services into AI responses. This isn’t new, but it’s now officially tied to AI visibility, not just traditional search.
There’s also a mention of something called Business Agent, a conversational experience in Google Search that lets customers chat with your brand directly. It’s early days, but worth knowing it exists.
The Myth-Busting Section (This Is the Good Part)
Halfway through the guide, Google does something I didn’t expect: they dedicate an entire section to calling out tactics that don’t work. Let’s go through them.
llms.txt: Google doesn’t use it
Yes, really. Google explicitly states: “You don’t need to create new machine readable files, AI text files, markup, or Markdown to appear in generative AI search.”
Now, to be fair, llms.txt was never really designed for Google. It was designed for LLMs like Claude and ChatGPT to better understand what content on your site is available for training or retrieval. It still has real value for those systems.
But here’s where it gets interesting. While Google is telling you not to bother with llms.txt, Google itself is running llms.txt files across its own developer properties. Gemini API docs, Chrome developer docs, Firebase docs, Google AI API — all have one. So Google clearly understands the value of llms.txt for AI systems. They just don’t want you optimizing for theirs.
The honest read: llms.txt helps other AI systems understand your content. Google Search doesn’t use it — but that doesn’t mean it’s useless. It means you should have one for the broader AI ecosystem, just don’t expect it to move the needle in AI Overviews.
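If you do decide to add one for that broader ecosystem, an llms.txt is just a Markdown file served at your site root. A minimal sketch, following the commonly used format of an H1, a blockquote summary, and sections of annotated links — the site name, paths, and descriptions below are all hypothetical:

```markdown
# Example Publication

> An independent SEO publication covering search, AI, and content strategy.

## Guides

- [AI search optimization](https://example.com/ai-search.md): How generative
  AI features select and cite sources
- [Technical SEO checklist](https://example.com/technical-seo.md): Crawlability,
  indexing, and page experience basics
```

Again: useful as a courtesy to LLM-based crawlers and assistants, irrelevant to AI Overviews.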
“Chunking” your content: not necessary
There’s been a lot of talk about breaking content into smaller pieces so AI can process it better. Google says this is unnecessary. Their systems can understand a full-length page, identify multiple topics within it, and surface the relevant section for a given query.
That said, they’re not saying long pages are better. They’re saying there’s no ideal page length. Write for your audience.
Rewriting content “for AI”: save your time
Google’s AI systems understand synonyms and intent. You don’t need to capture every long-tail variation of how someone might phrase a question. You don’t need to write in a specific style that sounds AI-answer-shaped. Write clearly for humans and the AI can figure out the rest.
Chasing inauthentic mentions: it backfires
The idea here is that if you get your brand mentioned everywhere (forums, comments, blog posts), AI systems will pick that up and start including you in responses. Google says their spam systems and quality filters apply to AI features just as much as traditional search. Manufactured mentions aren’t just ineffective, they’re potentially harmful.
Over-indexing on structured data for AI: not worth it
Structured data is still worth doing for rich results in traditional search. But Google confirms there’s no special schema.org markup that boosts your AI Overview presence. Don’t build an entire structured data strategy around AI visibility.
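For the rich-results use case that remains worthwhile, the standard approach is a JSON-LD block in the page head. A typical Article snippet looks something like this — every value below is a placeholder, not a recommendation of specific properties for AI visibility:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Why We Waived the Inspection & Saved Money",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "datePublished": "2025-01-15"
}
</script>
```

Same markup you were (hopefully) already shipping; the only change is the expectation you attach to it.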
Agentic Search: The Part Everyone’s Ignoring
The last section of Google’s guide covers agentic experiences, and it’s both the most forward-looking and the least discussed part.
AI agents are autonomous systems that can take actions on your behalf. Think: booking a restaurant, comparing product specs, filling out a form. When these agents visit your website, they don’t read it the way a human does. They analyze visual renderings (screenshots), inspect the DOM, and read the accessibility tree.
What this means for your site:
- Accessibility matters more than it ever has. Properly labeled elements, logical heading structure, ARIA roles. These help agents parse your site correctly. A poorly structured page isn’t just bad for screen readers anymore; it’s potentially invisible to AI agents.
- UX quality becomes a ranking factor in a new way. If an agent visits your site to complete a task and the experience is confusing or broken, that’s a signal.
- Universal Commerce Protocol (UCP) is emerging. This is a new protocol designed to let Search agents take transactional actions directly on websites: booking, buying, comparing. It’s early, but for anyone in e-commerce SEO, this is the kind of thing to follow closely.
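The accessibility point above is concrete enough to sketch. Here’s a hypothetical product-page fragment where semantic landmarks and labels expose the same information through the DOM and the accessibility tree that a human gets visually — the product, price, and IDs are invented for illustration:

```html
<!-- Semantic elements and explicit labels let an agent map the page
     without guessing from visual layout alone -->
<main>
  <article aria-labelledby="product-title">
    <h1 id="product-title">Cordless Drill X200</h1>
    <p>Price: <data value="129.00">$129.00</data></p>
    <button type="button" aria-label="Add Cordless Drill X200 to cart">
      Add to cart
    </button>
  </article>
</main>
```

An agent parsing this can identify the product name, the machine-readable price, and an unambiguous purchase action; a `<div>` soup with click handlers offers none of that.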
Google links to a guide on agent-friendly website best practices from web.dev that’s worth bookmarking.
The One Question Google Wants You to Ask
Near the end of the guide, Google suggests a simple test for any content decision:
“Is this content that my visitors would find satisfying?”
It’s almost too simple. But it’s a good corrective for anyone who’s been spinning up AI-specific tactics: chunking pages, adding files, chasing mentions. If the answer is no, or even “not really, but it might help with AI,” it’s probably not worth doing.
The sites that will do well in AI search are the same ones that would have done well in search a decade ago: authoritative, specific, useful, and built for a real audience.
What This Means for Your SEO Strategy
Here’s how I’d summarize the practical implications:
Double down on content quality over content volume. The commodity content race is a losing one in AI search. Fewer, better, more opinionated pieces will outperform a high-volume spray-and-pray approach.
Treat technical SEO as your floor, not your ceiling. Crawlability, indexation, and page experience are the minimum requirement to even be considered for AI features. Get those right first.
Stop the AI-specific tactics. llms.txt (for Google), content chunking, AI-centric rewrites: none of these move the needle in Google Search. Redirect that time to something that does.
Start paying attention to agentic SEO. It’s early and Google isn’t requiring anything specific yet. But the accessibility and DOM structure of your site will become increasingly important as agents get more sophisticated. Building good habits now costs nothing.
Keep doing structured data, just not for AI reasons. Schema markup still helps with rich results in traditional search. Keep it as part of your overall strategy, just don’t expect it to unlock AI Overviews.
The guide is worth reading in full if you want the official language directly from Google. But the core message is simpler than most of the AEO/GEO discourse out there: do good SEO, create content people actually want, ignore the hacks.
Which, honestly, is refreshing to hear Google say out loud.
Source: Google Search Central: Optimizing your website for generative AI features
