Since the 1990s, the process of search and discovery on the web has been well established and somewhat predictable – write relevant content, keep your metadata in check, and allow search engine robots to crawl your site. This birthed a whole new industry niche, Search Engine Optimisation (SEO), where technology and marketing overlap.
However, the rise of artificial intelligence (AI), large language models (LLMs) and generative search experiences is quickly turning the SEO world on its head. Where search engines once funnelled people to websites as the source of information, AI-powered platforms now harvest data from a multitude of sources, distilling it into a single, curated response. This provides the user with an effectively “zero-click” search experience: getting answers without needing to leave the search page.
Thus SEO gives way to Generative Engine Optimisation (GEO), the process of optimising your content for use in AI-driven search engines.
Why Generative Engine Optimisation (GEO) matters today
As more users come to rely on AI tools, Generative Engine Optimisation is how your website remains discoverable as we transition to search experiences that no longer drive traffic to individual sources. Where SEO was about where your digital offering ranked in search results, GEO is about being a trusted source that generative AI models will reference.
This, however, doesn’t mean you can ditch your old SEO practices. GEO is really just SEO with extra features – all grown up and no longer satisfied with just keywords, it demands a higher level of quality from natural language and topic expertise. In 2025, ranking well on Bing and Google is the best way to be referenced by AI-powered search engines.
GEO focuses on LLMs and AI assistants that summarise, synthesise, or directly answer user questions without the need for links out to external websites (unless they’re highly relevant). Though the tech landscape changes quickly, today’s LLMs still use search APIs from major search engines, often favouring highly ranked content with clear explanations and easy-to-understand formatting.
How to prepare your website for GEO
Taking the very best of your SEO practices and enhancing them with LLM-aware strategies will increase the likelihood of your content remaining visible in the age of AI-powered search.
Be visible where AI looks for data
AI search engines are aggregators and content pre-processors. You’re unlikely to get cited if you’re only mentioned on one niche blog or web directory. Think about where AI search engines get their data – lists, blogs, review sites, maps – and expand your digital footprint across these platforms. Reach out to journalists and authors, submit your content far and wide, and maintain good digital marketing hygiene for maximum visibility.
Render content server-side
Most AI crawlers do not execute JavaScript, meaning any content that renders on your users’ devices is effectively invisible to them. Client-side-only content, such as that produced by single-page applications built with frameworks like React, is likely to get missed by most AI crawlers, which rely on the static text and images in the initial HTML response. Ensure all your core content exists in static files or is delivered via server-side rendering (which frameworks like Next.js support) so it remains detectable and readable by LLMs.
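As a rough sketch (the page snippets and phrase below are invented for illustration), you can approximate what a non-JavaScript crawler “sees” by checking whether your key content appears in the raw HTML response, before any scripts run:

```python
# What an AI crawler sees is the raw HTML response, not the DOM after
# JavaScript executes. These two hypothetical responses illustrate the gap.
client_rendered = (
    '<html><body><div id="root"></div>'
    '<script src="/app.js"></script></body></html>'
)
server_rendered = (
    '<html><body><div id="root">'
    '<h1>Pricing</h1><p>Plans from $9/mo.</p></div></body></html>'
)

def visible_to_crawler(html: str, phrase: str) -> bool:
    """Crude check: does the phrase exist in the raw HTML a crawler fetches?"""
    return phrase.lower() in html.lower()

print(visible_to_crawler(client_rendered, "Pricing"))  # False
print(visible_to_crawler(server_rendered, "Pricing"))  # True
```

The client-rendered page returns an empty shell; the same content served server-side is present in the first response and therefore indexable.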
Keep your content fresh
AI models are extremely sensitive to content “freshness”, i.e. how recently your content was published. To verify this, we tested 10 different searches across three AI search tools – Perplexity, Gemini and ChatGPT.
Our search queries:
best project management software
best antivirus software
best graphic design software
best accounting software
best customer relationship management software
best video editing software
best email marketing software
best collaboration software
best data visualisation software
best cloud storage software
The sources cited across each AI model varied drastically, despite Perplexity and ChatGPT both using the Bing API to search for information.
Perplexity’s cited sources were, on average, 2.4 months old at the time of our search.
ChatGPT’s cited sources were, on average, 5.4 months old.
Gemini’s cited sources were, on average, 6.2 months old.
Although we were specifically looking for current information – in this case, the “best” software for a given purpose, which may change every year – our results demonstrate that recency is a key criterion in the generative search experience.
Get featured in notable roundups
A well-known strategy in brand and product marketing, as well as SEO, getting featured in high-profile roundup lists is now considered good GEO practice. In our investigations, we’ve found that ChatGPT’s and Google Gemini’s cited sources tend to correlate almost exactly with the top 10 search results of their chosen search engine – which makes sense when you think about it!
Focus on long-form content
Many website owners complain that Google’s AI Overviews are reducing their click traffic. However, in most cases, the lost clicks stem from low-value queries that only need quick, simple answers – something that AI handles extremely well.
Trying to compete with AI on this front is a losing battle. Instead, we recommend focusing on publishing high-quality long-form content, enriched with expert insights that only you can provide. This return to a “human-centred” content strategy that prizes thoughtful, high-value information is what will set your online offering apart from an AI-powered content aggregator.
Think about your content as passages, not keywords
Today’s best practice for LLM-aware content optimisation is to break your content into small, digestible segments, each of which provides a complete answer to a query. Sound familiar? It’s also best practice for writing meaningful and readable web content.
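As a rough illustration (the helper and sample text below are hypothetical, not an official tool), passage-level optimisation can be thought of as splitting a page into heading-anchored chunks, each of which should be judged as a standalone answer:

```python
def split_into_passages(text: str) -> list[dict]:
    """Split markdown-style content into heading-anchored passages.

    Each passage pairs a heading with the text beneath it, so every
    chunk can be evaluated as a self-contained answer to a query.
    """
    passages = []
    heading, body = None, []
    for line in text.splitlines():
        if line.startswith("#"):
            if heading is not None:
                passages.append({"heading": heading, "body": " ".join(body)})
            heading, body = line.lstrip("#").strip(), []
        elif line.strip():
            body.append(line.strip())
    if heading is not None:
        passages.append({"heading": heading, "body": " ".join(body)})
    return passages

doc = """\
## What is GEO?
Generative Engine Optimisation adapts SEO for AI-driven search.

## Does GEO replace SEO?
No. GEO builds on SEO; ranking well still matters.
"""
for passage in split_into_passages(doc):
    print(passage["heading"], "->", passage["body"])
```

If a chunk only makes sense in the context of the surrounding page, it is unlikely to be lifted out and cited as an answer on its own.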
“By understanding passages in addition to the relevancy of the overall page, we can find that needle-in-a-haystack information you’re looking for.” — Source: How AI is powering a more helpful Google (The Keyword by Google)
Don’t block AI crawlers
Remember when you were told to block AI crawlers in your robots.txt to prevent your content being stolen? Unfortunately, by then, your content had probably already been scraped. Sketchy ethics and pending lawsuits aside, we’ve reached a point of no return, where the best option left is to benefit from this new normal.
Ensure your robots.txt allows the following known bots (note that a User-agent group needs an Allow or Disallow rule to be valid, and OpenAI’s GPTBot is worth including alongside the others):
User-agent: GPTBot
User-agent: ClaudeBot
User-agent: Claude-Web
User-agent: anthropic-ai
User-agent: Google-Extended
User-agent: PerplexityBot
Allow: /
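To sanity-check your rules, Python’s standard-library robotparser can evaluate a robots.txt against a given user agent (the domain, paths and rules below are placeholders for illustration):

```python
from urllib import robotparser

# A minimal robots.txt that welcomes an AI crawler while still
# blocking a private area for everyone else (example rules only).
rules = """\
User-agent: PerplexityBot
Allow: /

User-agent: *
Disallow: /admin/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("PerplexityBot", "https://example.com/blog/geo-guide"))  # True
print(rp.can_fetch("SomeOtherBot", "https://example.com/admin/"))           # False
```

Running a check like this before deploying saves you from the common mistake of a User-agent group that silently matches nothing.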
Don’t worry about llms.txt just yet
llms.txt is an emerging schema for marketers and developers wanting to curate information for LLMs. So far, only about 100 of the top 1 million websites are using llms.txt, and major LLM providers like OpenAI still don’t consistently respect it.
Our recommendation is to keep it on your radar, but don’t bother with it for now. Google Search Advocate John Mueller has compared it to the obsolete meta keywords tag – of negligible impact unless your site is already impeded by other factors (such as JavaScript-heavy content rendering).
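For reference while it stays on your radar: the proposal (published at llmstxt.org) is a plain markdown file served at the site root, with an H1 for the site name, a short blockquote summary, then H2 sections of annotated links. A hypothetical example:

```markdown
# Example Company

> One-line summary of what the site offers and who it is for.

## Docs

- [Getting started](https://example.com/docs/start): setup guide
- [API reference](https://example.com/docs/api): endpoint details
```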
Focus instead on clean data structures, semantic clarity, server-side rendered content, and publishing high-quality information. These efforts are far more effective for improving visibility across LLMs today.