Showing up when AI gives the answer
If your analytics have been jittery, you’re not alone. More people now take the answer straight from the results page instead of clicking through to yours. On Google, a majority of searches end without a click; one large 2024 clickstream study pegged it at roughly six in ten, and that zero-click habit has carried into this year as AI summaries become commonplace.
Google’s AI Overviews aren’t everywhere, but they’re visible enough to matter. Independent tracking this year puts them in the ballpark of one in ten queries overall, skewing heavily toward informational topics. That’s more than a curious experiment; it’s a new front door to information.
This is why GEO—Generative Engine Optimization—has a different goal than classic SEO. Traditional SEO tried to earn a visit. GEO tries to earn a mention and, ideally, a citation inside the answer. Think of it less like polishing your storefront and more like making sure the town librarian, the local paper, and the trade association all point to you when someone asks for help.
What these systems actually trust
Under the hood, the big AI systems behave more like meticulous researchers than keyword matchers. They prefer sources that look like experts: industry associations and niche publications, university or government resources, comparison pieces that weigh trade-offs, and pages with clear authorship and structured facts. When Google composes an AI Overview it tends to lean on those kinds of sources; Perplexity is even more explicit, displaying citations inline and heavily overlapping with what classic search already considers authoritative. In practice, that means what others write about you often carries more weight than what you write about yourself.
Make your site “AI-readable”
There are a few technical choices that quietly determine whether you’re easy to cite.
Start with robots.txt. If your policy is “OK to appear in answers, but not OK for bulk training,” you can allow OpenAI’s search/browsing agents and Perplexity’s crawler while disallowing the training bot. OpenAI documents separate user agents for search (OAI-SearchBot), on-demand fetching (ChatGPT-User), and training (GPTBot). Perplexity publishes its own bot controls as well. Here’s a simple pattern many publishers adopt:
# Allow answer/search crawlers
User-agent: OAI-SearchBot
Allow: /
User-agent: ChatGPT-User
Allow: /
User-agent: PerplexityBot
Allow: /
# Disallow training crawlers
User-agent: GPTBot
Disallow: /
OpenAI’s and Perplexity’s docs explain these agents and how they honor robots directives; update yours, then watch your logs to confirm behavior.
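A rough tally of crawler hits in those logs is enough for that check. Here’s a minimal TypeScript (Node) sketch; the access.log path and the assumption that user agents appear verbatim in each line are mine, so adapt both to your stack:

import { readFileSync } from "node:fs";

// Count requests per AI crawler by matching user-agent substrings in the log.
const BOTS = ["OAI-SearchBot", "ChatGPT-User", "PerplexityBot", "GPTBot"];
const counts: Record<string, number> = {};
for (const bot of BOTS) counts[bot] = 0;

for (const line of readFileSync("access.log", "utf8").split("\n")) {
  for (const bot of BOTS) {
    if (line.includes(bot)) counts[bot] += 1;
  }
}

// Once the new policy propagates, GPTBot requests should taper off while the
// search and answer bots keep showing up.
console.log(counts);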
Next, make sure your essential content exists in the HTML that bots first receive. Most AI crawlers don’t execute client-side JavaScript, so text that only appears after hydration can be invisible to them. If your site leans on a modern frontend, serve pre-rendered or server-rendered versions of key pages so the facts you want cited are present at load. Practitioners have repeatedly shown that ChatGPT, Claude, and Perplexity bots read the raw HTML but don’t render JS; Googlebot (and thus Gemini) is the exception.
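One way to sanity-check your own pages is to fetch them the way a non-rendering bot would and confirm the core facts are already in the response. A minimal sketch, run as an ES module on Node 18+; the URL and the fact strings are placeholders for your own key page and claims:

const res = await fetch("https://example.com/pricing"); // placeholder URL
const html = await res.text();

// Each string stands in for a fact you want quoted. Anything that only
// appears after client-side rendering will show up as missing here.
for (const fact of ["Acme Widget Pro", "$49/month", "30-day free trial"]) {
  console.log(fact, html.includes(fact) ? "present in raw HTML" : "MISSING (JS-only?)");
}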
Two more low-lift helpers: keep your schema markup clean (Organization, Article, Product, FAQ, HowTo where it fits), and consider adding an /llms.txt file at your root. The latter is an emerging community proposal: a plain-text guide that tells AI systems which pages are canonical for specific entities, what licenses apply, and where to find a concise overview. It isn’t a standard yet, but it costs almost nothing to add and helps machines understand your site the way you do.
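For the markup, a trimmed JSON-LD block in the page head is usually all it takes; this Article example uses standard schema.org types, with every name and date invented for illustration:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "How to Choose an Industrial Widget",
  "author": { "@type": "Person", "name": "Jane Doe", "jobTitle": "Head of Product" },
  "datePublished": "2025-02-03",
  "dateModified": "2025-08-14",
  "publisher": { "@type": "Organization", "name": "Acme Co" }
}
</script>

And because /llms.txt is a proposal rather than a spec, treat this as one plausible shape; the company, URLs, and descriptions are made up:

# Acme Co

> Acme makes industrial widgets. The links below are the canonical sources for our products, pricing, and policies.

## Key pages

- [Product overview](https://example.com/widgets): canonical specs and model numbers
- [Pricing](https://example.com/pricing): current plans, updated monthly

## Policies

- [Content license](https://example.com/license): quoting and reuse terms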
Write so your ideas are easy to lift—and credit
Picture an editor scanning for a tight quote. That’s how these systems work. Lead with the answer in clear sentences, then add nuance. Name the entities the way a third party would (product names, model numbers, people, places, dates). Keep sections tight enough to stand on their own, and attach a real byline and brief credentials to expert pieces. You’re still writing for humans—but you’re packaging facts so a machine can quote you without distorting the point. When you do that consistently, you’ll notice the same pages show up beneath AI summaries more often, because they’re easy to reuse faithfully.
How to tell it’s working—without chasing vanity numbers
Clicks and rankings still matter, but they don’t tell the whole story. In GA4, break out traffic from AI surfaces where referrers exist (for example, Perplexity and ChatGPT web) and annotate when you ship major content and markup changes. In parallel, keep a simple log of when your brand or URLs appear as citations in AI Overviews, Perplexity answers, and ChatGPT browsing results for the questions you care about most. Over a few months you’ll see a pattern: the pages that are clear, current, and corroborated elsewhere are the ones that keep earning mentions.
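If you’re doing that referrer breakout yourself, from raw logs or a GA4 export, a small classifier is all it takes. The hostname patterns below are assumptions about what these products currently send as referrers, so verify them against your own data:

// Hypothetical hostname patterns; adjust to what appears in your referrer data.
const AI_REFERRERS: Record<string, RegExp> = {
  perplexity: /(^|\.)perplexity\.ai$/,
  chatgpt: /(^|\.)(chatgpt\.com|chat\.openai\.com)$/,
  gemini: /(^|\.)gemini\.google\.com$/,
};

function classifyReferrer(referrer: string): string {
  try {
    const host = new URL(referrer).hostname;
    for (const [source, pattern] of Object.entries(AI_REFERRERS)) {
      if (pattern.test(host)) return source;
    }
  } catch {
    // empty or malformed referrer string
  }
  return "other";
}

classifyReferrer("https://www.perplexity.ai/search"); // -> "perplexity"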
A practical way to start—no checkboxes required
In your first couple of months, fix crawl policy, get critical pages server-rendered, add the schema that matches your content, and publish an /llms.txt. Over the next stretch, refactor your top money-making pages so each one answers the half-dozen questions customers actually ask (plain language first, details second) and pitch two or three respected publications in your niche with data, commentary, or a comparison that deserves a spot on their site. Keep refreshing your cornerstone pages with dated updates so recency is obvious. Hold that cadence for a quarter and you’ll have something sturdier than a checklist: a site other people are comfortable citing, and an information trail that AI systems can follow.
None of this replaces the work you’ve already done in SEO. It reframes the outcome. Instead of only earning the click, you’re earning the quote. In an answer-first world, that’s how your brand keeps showing up—on the page where the search ends.

