62→78
GEO score improvement (+26%)
2 hrs
Engineering time, $0 in tools
6
Critical issues fixed in one commit

The Problem

AI search is changing the rules. ChatGPT, Perplexity, Google AI Overviews, and Claude now answer questions directly — pulling from a small set of sources they trust. If your site isn't optimized for AI engines, you're invisible to the fastest-growing traffic source on the internet.

We built IC Marketing's GEO audit toolkit to solve this exact problem. But were we eating our own cooking?

We ran a full audit on our own live site and found six issues — ranging from critical to low severity — that were silently killing our AI visibility. Here's exactly what we found, what we fixed, and what happened next.

Audit Findings: What Was Broken

1. Broken Sitemap (Critical — Score: 0/100)

/sitemap.xml returned a 404. The route didn't exist. Every search engine and AI crawler that tried to discover our pages hit a dead end. Worse: our robots.txt pointed crawlers directly to this broken URL, actively advertising the failure.
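
The fix is a server route that returns valid XML. Here's a minimal sketch of the generator behind such a route — `buildSitemap` and the page list are illustrative (our actual route and paths differ), and `example.com` stands in for the live domain. The priorities mirror the ones we shipped: 1.0 for the homepage, 0.8 for blog pages.

```typescript
// Hypothetical sitemap generator sketch — page paths are illustrative.
type Page = { path: string; priority: number };

function buildSitemap(origin: string, pages: Page[]): string {
  // One <url> entry per page, with its crawl priority
  const urls = pages
    .map(
      (p) =>
        `  <url><loc>${origin}${p.path}</loc><priority>${p.priority.toFixed(1)}</priority></url>`
    )
    .join("\n");
  return (
    `<?xml version="1.0" encoding="UTF-8"?>\n` +
    `<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n` +
    `${urls}\n</urlset>`
  );
}

// Homepage at priority 1.0, blog pages at 0.8
const xml = buildSitemap("https://example.com", [
  { path: "/", priority: 1.0 },
  { path: "/blog/geo-audit", priority: 0.8 },
]);
```

Serve that string from the `/sitemap.xml` route with a `Content-Type: application/xml` header and every crawler can discover your pages again.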

2. Broken llms.txt URLs (High — Score: 80/100 with issues)

We had an llms.txt file — the AI-specific navigation standard — but all URLs inside referenced icmarketing.io instead of our live domain. AI crawlers following those links would land on a domain we don't control, effectively handing our AI traffic to a dead end. The file also lacked links to two key content pages, leaving valuable indexed content invisible to AI navigators.
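
For reference, a corrected llms.txt follows the llmstxt.org convention: an H1 site name, a blockquote summary, then sections of links. This sketch is illustrative — the paths and descriptions are placeholders, not our actual file, and `example.com` stands in for the live domain:

```txt
# IC Marketing
> Hypothetical llms.txt sketch — paths and titles here are illustrative.

## Pages
- [Home](https://www.example.com/): What we do
- [Blog](https://www.example.com/blog): GEO guides and case studies
- [GEO Audit](https://www.example.com/blog/geo-audit): This case study
```

The critical rule: every URL must point at the live, controlled domain, and every indexed content page should appear in the link list.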

3. Missing Canonical & Social Meta Tags (High — Score: 30/100)

No canonical link tag. No og:url. No og:image. No Twitter Card tags. This meant: duplicate content risk across any scrapers or CDN versions; poor social sharing previews (broken link cards); and missing structured data signals AI engines use for content authority.
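
The complete set of tags we added looks like this — values are illustrative, with `example.com` standing in for the live domain:

```html
<!-- Illustrative <head> snippet; example.com stands in for the live domain -->
<link rel="canonical" href="https://www.example.com/" />
<meta property="og:url" content="https://www.example.com/" />
<meta property="og:type" content="website" />
<meta property="og:image" content="https://www.example.com/og-image.png" />
<meta name="twitter:card" content="summary_large_image" />
```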

4. Schema Markup Using Wrong Domain (Medium — Score: 70/100)

Our Organization schema had the URL hardcoded to https://icmarketing.io — a domain that wasn't live. AI engines use schema markup to build knowledge graphs about your business. We were teaching them we live at the wrong address. Two additional schemas were missing entirely: WebSite with SearchAction, and Article schemas for our blog pages.
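
After the fix, the JSON-LD output looks roughly like this — a sketch with `example.com` as a placeholder for the live domain, which in our codebase comes from a config variable rather than a hardcoded string:

```json
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "IC Marketing",
  "url": "https://www.example.com"
}
```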

5. robots.txt Sitemap URL Protocol Error (Low — Score: 95/100)

The sitemap URL in robots.txt was missing the https:// prefix — a subtle formatting error that can confuse crawler parsers expecting absolute URLs.
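
The corrected file is one line of difference — the sitemap directive must carry an absolute URL, protocol included (`example.com` is a placeholder here):

```txt
User-agent: *
Allow: /

Sitemap: https://www.example.com/sitemap.xml
```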

Changes Made

All six fixes were shipped in a single commit:

  • Added /sitemap.xml route — server route returning valid XML with all pages indexed at correct priority (1.0 homepage, 0.8 blog pages). All search engines can now discover all content.
  • Fixed llms.txt — updated all URLs to the live domain, added missing page links. AI crawlers now navigate the full content graph.
  • Added canonical tag — <link rel="canonical"> in <head>. Prevents duplicate content penalties.
  • Added full social meta tags — og:url, og:image, og:type, Twitter Card summary_large_image. Social sharing signals + AI social graph context.
  • Fixed Organization schema — switched hardcoded URL to a config variable. Correct entity URL for LLM knowledge graph building.
  • Fixed robots.txt — added https:// prefix to sitemap URL. Clean crawler discovery.

Time to implement: ~2 engineering hours. Cost: $0 in tools.

Results

Overall GEO Score: 62 → 78 (+26%)

llms.txt: 80 → 95 (+19%)  |  Sitemap: 0 → 90 (+90pts)  |  Canonical & Social: 30 → 90 (+200%)  |  Schema: 70 → 80 (+14%)

The sitemap went from completely broken (404) to fully functional with all pages indexed. AI navigation now correctly maps all content pages to the live domain. Complete Open Graph + Twitter Card implementation gives proper social graph indexing. AI engines now have a correct knowledge graph entry for our organization.

Remaining opportunity: Article schemas for blog pages, WebSite SearchAction, and OG image creation — estimated +8–12 more points.

What This Means for Your Site

Most sites have the same categories of problems — and they go unfixed because they're invisible to the naked eye. Your site looks fine in a browser. Your analytics show traffic. But AI engines are quietly skipping you.

The GEO audit surfaces exactly what's broken and exactly how to fix it. In our case: one session, six fixes, 16-point score improvement.

Your site likely has similar wins waiting. The difference between a 62 and a 78 is the difference between being cited in AI answers and being invisible to them.