
How We Future-Proof Websites for GEO (Generative Engine Optimisation) and AI-Driven Search

Growing interest in GEO

Over the past six months, more and more marketers have been messaging me, asking how to prepare their websites for Generative Engine Optimisation (GEO).

Google Trends tells the same story - a search term that barely existed at the start of 2025 is now at an all-time high.

Good for SEO = Good for GEO

There’s a huge overlap between SEO and GEO. In fact, for most of our existing websites, we already have the fundamentals covered through strong Technical SEO.

That said, it’s worth double-checking that everything still works correctly - and using this opportunity to make a few GEO-oriented improvements.

1. Schema Markup

This is probably the most obvious place to start. Adding relevant Schema Markup to your website improves discoverability and visibility across LLMs (large language models).

The most important schema types to include are:

  • Organization, WebSite, and WebPage for your global schema
  • BlogPosting for blog posts
  • PodcastEpisode for podcast pages
  • FAQPage
  • HowTo
  • Product

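As a rough illustration, the global Organization and WebSite schema could be embedded as JSON-LD in the page head - here’s a minimal sketch (the company name, logo and URLs are placeholders):

<!-- Placeholder organisation details - swap in your own name, URLs and logo -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@graph": [
    {
      "@type": "Organization",
      "name": "Example Co",
      "url": "https://www.example.com",
      "logo": "https://www.example.com/logo.png"
    },
    {
      "@type": "WebSite",
      "name": "Example Co",
      "url": "https://www.example.com",
      "publisher": { "@type": "Organization", "name": "Example Co" }
    }
  ]
}
</script>
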
There are hundreds of schema types (full list here), so choosing the right ones isn’t always easy.

We’re currently exploring a product that automates this process - stay tuned for more news soon.

2. llms.txt and llms-full.txt

These are proposed standard files that give LLM crawlers a concise, machine-readable overview of your site and point them to your most important content.

Although the proposal isn’t yet officially supported by the major AI providers, many technical SEO experts believe it’s already worth implementing - a way to future-proof your setup and signal openness to AI crawlers.
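
There’s no formal spec yet, but the current proposal is a plain Markdown file served at /llms.txt: a site name, a short summary, and sections of links to your key pages. A minimal sketch, with placeholder names and URLs:

# Example Co

> Example Co is a digital agency that designs and builds marketing websites.

## Blog

- [How we future-proof websites for GEO](https://www.example.com/blog/geo): our technical GEO checklist
- [Schema markup basics](https://www.example.com/blog/schema): when and where to add structured data

## Company

- [About us](https://www.example.com/about): services, team and contact details

llms-full.txt follows the same idea but is typically a single file containing the full Markdown content of those pages, so an LLM can ingest everything in one request.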

3. Enrich your content pages

We’ve improved the layout and metadata we include on blog posts by adding:

  • Published date and last updated date (so LLMs and crawlers can tell how fresh the content is)
  • Estimated reading time
  • Author attribution with dedicated author pages
  • Table of contents for easier navigation
  • Related articles at the bottom of each post

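For the dates and author specifically, semantic markup such as <time> makes the information unambiguous to machines as well as readers - a small sketch with placeholder names, dates and URLs:

<header>
  <h1>How we future-proof websites for GEO</h1>
  <p>
    <!-- Placeholder author, dates and reading time -->
    By <a href="/authors/jane-doe" rel="author">Jane Doe</a> ·
    Published <time datetime="2025-06-01">1 June 2025</time> ·
    Updated <time datetime="2025-10-09">9 October 2025</time> ·
    6 min read
  </p>
</header>
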
These small additions improve both UX and machine readability.

4. Internal linking and paginated hub pages

Internal linking plays a big role in helping LLMs and crawlers understand your content relationships.

We’ve also updated hub page UX to allow pagination via URL slug (e.g. /blog/page-1) or query parameters (e.g. /blog?page=1), making it easier for Google and LLMs to crawl everything.

Avoid loading more posts via JavaScript only (e.g. a “Load More” button), as this can hurt crawlability - crawlers might not discover all blog posts if they rely on client-side rendering.
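
The key detail is that every page of the hub is reachable through plain <a> links in the server-rendered HTML, so crawlers can follow them without executing any JavaScript. A minimal sketch (placeholder URLs):

<!-- Server-rendered pagination links - placeholder URLs -->
<nav aria-label="Blog pagination">
  <a href="/blog?page=1">1</a>
  <a href="/blog?page=2">2</a>
  <a href="/blog?page=3">3</a>
  <a href="/blog?page=2" rel="next">Next</a>
</nav>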

5. /sitemap.xml and /robots.txt

These two files should already exist, but it’s important to make sure they’re properly configured:

  • All pages are included in the sitemap
  • Each page includes a lastmod property, for example:
<url>
  <loc>https://vercel.com/academy/ai-sdk/ai-elements</loc>
  <lastmod>2025-10-09T18:06:36.019Z</lastmod>
</url>
  • In your robots.txt, ensure that AI crawlers are allowed. A simple snippet like this works perfectly:
User-agent: *
Allow: /
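
If you want to be more explicit, you can also name individual AI crawlers and point them at your sitemap. The user-agent tokens below are the ones OpenAI, Perplexity and Anthropic currently document for their crawlers (worth double-checking each provider’s docs), and the domain is a placeholder:

# Explicitly allow well-known AI crawlers
User-agent: GPTBot
Allow: /

User-agent: PerplexityBot
Allow: /

User-agent: ClaudeBot
Allow: /

# Placeholder domain - point this at your real sitemap
Sitemap: https://www.example.com/sitemap.xml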

6. All meta tags in place

Double-check that all your essential meta tags are implemented correctly:

  • <title>
  • <meta name="description" />
  • <link rel="canonical" />
  • No noindex / nofollow directives (unless intentional)

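Put together, the page head ends up looking something like this (title, description and canonical URL are placeholders):

<head>
  <!-- Placeholder title, description and canonical URL -->
  <title>How we future-proof websites for GEO | Example Co</title>
  <meta name="description" content="A practical technical checklist for preparing your website for GEO and AI-driven search." />
  <link rel="canonical" href="https://www.example.com/blog/geo" />
  <!-- No robots meta with noindex / nofollow unless you really mean it -->
</head>
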
These are small details, but they make a big difference for both SEO and GEO.

7. Semantic HTML and heading structure

Make sure your site uses proper semantic HTML - elements like <article>, <nav>, <footer>, <section>, and <time> help LLMs interpret page meaning and hierarchy.

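A simplified page skeleton using those elements might look like this (the headings reuse the example structure shown below):

<article>
  <header>
    <h1>Essential training made easy</h1>
    <time datetime="2025-10-09">9 October 2025</time>
  </header>
  <nav aria-label="Table of contents">
    <!-- In-page links to each section -->
  </nav>
  <section>
    <h2>An experience built around you</h2>
    <p>...</p>
  </section>
  <footer>
    <!-- Related articles, author bio -->
  </footer>
</article>
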
Your heading structure should also be logical and properly ordered:

  • Each page should have only one <h1>
  • Subheadings should follow a clear hierarchy, e.g.:
H1  Essential training made easy
  H2  An experience built around you
  H2  Less admin for L&D, more time for people
    H3  One solution for you
    H3  One subscription for your organization
    H3  One destination for your employees
  H2  Explore our course library

Avoid jumping between heading levels (e.g. H2 → H5 → H1), as it confuses both readers and crawlers.

8. Integrate with GEO tracking tools

We recommend our clients integrate LLM visibility tracking tools - think of them like Google Analytics, but for AI.

There are plenty of startups exploring this space, but we particularly like what the teams at Trakkr and Peec are doing.

Final note:

You don’t need to reinvent your website for GEO, but you do need to make sure your technical foundation is solid, structured, and AI-friendly. Most of these improvements will benefit your traditional SEO just as much as they will your GEO performance.