Site Maps

The Scrunch Site Map is your command center for optimizing your website for AI search. It gives you a complete view of how AI models see and interact with your content, then helps you take action to improve performance.

Site Maps are currently rolling out across Scrunch organizations. If you don’t see the feature, it may not be enabled for your workspace yet.

What You'll See

Your Site Structure

A visual map of your website as our crawler sees it—from your homepage down through key sections, blog posts, and deeper content. This helps you quickly understand which pages are being discovered and evaluated.

Performance Metrics

For each page, you'll see:

  • Agent Traffic — how often AI bots are visiting your pages

  • Citations — whether AI models are citing your content in responses

  • AI Referrals — traffic coming to your site from AI platforms

  • Audit Score — technical health and optimization score for each page

Prioritization Tools

Sort your pages by any metric to quickly identify opportunities. Find pages with high bot traffic but low citations, or pages that need technical improvements.


What You Can Do

Deep AI Audits

Run detailed audits on individual pages to understand exactly what's working and what needs improvement. Get specific, actionable insights about technical issues, content quality, and bot accessibility.

Content Optimization

Generate up to 5 actionable content suggestions to improve your existing webpages. The platform analyzes your page and provides practical recommendations like adding introductions, improving calls-to-action, clarifying next steps, adding FAQ-style content, and adding missing context for images.

Each suggestion includes the issue and the recommended fix, making it easy to hand off to your content team. The goal is improving your canonical, human-facing pages to make them clearer for both humans and AI.

(Screenshot: the Content Delivery panel shown here is visible only to AXP users.)

Monitor & Refine

Track changes over time as you implement optimizations. See how improvements to individual pages impact your overall AI search performance.


How It Works

  1. Pick a page from your sitemap to optimize

  2. Run a Deep AI Audit to identify technical issues and content gaps

  3. Click Optimize to generate up to 5 practical content suggestions

  4. Review recommendations like adding introductions, improving CTAs, or clarifying next steps

  5. Copy or export suggestions to share with your content team

No technical work or complex deployment required—just straightforward edits your marketing or content team can apply directly to your existing pages.


Getting Started

The Site Map automatically crawls your primary website. From there:

  1. Explore the site structure and identify priority pages

  2. Drill down into specific pages to see detailed performance

  3. Run audits to diagnose issues

  4. Optimize content based on audit findings

  5. Monitor results and refine your approach

The platform guides you through each step with clear calls-to-action and contextual recommendations based on your data.


Common Use Cases

For agencies managing multiple clients:

  • Quickly assess site-wide AI visibility health across your client portfolio

  • Identify high-impact optimization opportunities without extensive manual analysis

  • Generate client-ready content recommendations your team can implement immediately

For enterprise brands:

  • Understand which content assets are driving AI citations and referrals

  • Prioritize optimization efforts on pages with high agent traffic but low performance

  • Monitor technical health issues that may be blocking AI bot access

For content teams:

  • Get specific, actionable recommendations for improving existing pages

  • Focus optimization efforts on pages with the highest potential impact

  • Track performance improvements over time as changes are implemented


Integration with Platform Features

Site Map works as part of the Integrated AI Visibility Stack:

  • Agent Traffic data shows which pages AI bots are actually visiting

  • Prompt tracking reveals what questions are driving citations to your content

  • Optimization suggestions help you improve pages that aren't performing

  • AXP can serve AI-optimized versions of content to bots while keeping your human-facing pages unchanged

Site Map focuses on improving your canonical content, while AXP lets you serve different versions to AI agents when needed.
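
As a purely illustrative sketch of this division of labor (not Scrunch's implementation), the pattern behind serving bot-specific versions is selecting a page variant based on the request's user agent. Everything below, including the bot token list, is a hypothetical example:

  # Illustrative only (not Scrunch's AXP code). Shows the general pattern of
  # serving an AI-optimized variant to known bots while humans keep getting
  # the canonical page. The bot token list is an example, not exhaustive.
  AI_BOT_TOKENS = ("GPTBot", "ClaudeBot", "PerplexityBot", "Scrunchbot")

  def select_variant(user_agent: str) -> str:
      """Return the page variant to serve for a given request."""
      if any(token in user_agent for token in AI_BOT_TOKENS):
          return "ai-optimized"   # machine-friendly version for AI agents
      return "canonical"          # unchanged human-facing page

  print(select_variant("Mozilla/5.0 (compatible; GPTBot/1.0)"))  # ai-optimized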


Frequently Asked Questions

Why don’t I see my Site Map yet?

There are a few common reasons the Site Map may not appear yet:

The initial crawl hasn’t run yet
The Site Map appears after Scrunch completes the first crawl of your website. If the crawl hasn’t started or finished yet, there will be no pages to display.

Your plan may not include site crawls
Some plans limit how many crawls or pages can be analyzed. If your plan doesn’t include site crawls, the Site Map may not populate until crawl capacity is available.

Your website may be blocking automated crawlers
Some websites block automated access through CDN security settings, WAF rules, or bot protection tools. When this happens, Scrunch may not be able to retrieve pages to build the Site Map.

Your domain may not be configured yet
Scrunch crawls the primary domain configured for your brand. If the domain hasn’t been added or confirmed, the crawl will not start.

If you believe your Site Map should already be visible, reach out to your Customer Success Manager and we’ll help verify the crawl status.


Why is my Site Map only showing a few pages?

If your Site Map appears but only includes a small number of pages, it usually means one of the following:

Some pages may be blocked or inaccessible
Pages may not appear if they are blocked by robots.txt rules, CDN security settings, or other bot protection systems; a quick way to check for this is sketched below.

Your site contains multiple domains
The Site Map only crawls the primary domain configured for the brand. If your organization operates multiple domains, they may need to be tracked separately.

Your site structure makes pages difficult to discover
Sites that rely heavily on dynamic routing, single-page application frameworks, or unusual URL patterns may expose fewer pages to crawlers.
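
If you suspect bot protection, one rough way to check is to request a page using Scrunch’s crawler user agent (shown under “Limited Site Access Detected” below) and look at the status code. This is an unofficial sketch, not a Scrunch tool: the URL is a placeholder, and some protection systems block on signals beyond the user agent, so a successful response here doesn’t guarantee crawl access.

  # Unofficial check: fetch one of your pages with the Scrunchbot user agent.
  # https://example.com/some-page is a placeholder; use a page from your site.
  import urllib.error
  import urllib.request

  SCRUNCHBOT_UA = ("Mozilla/5.0 AppleWebKit/537.36 (KHTML, like Gecko; "
                   "compatible; Scrunchbot/1.0; +https://scrunchai.com/bots)")

  req = urllib.request.Request(
      "https://example.com/some-page",
      headers={"User-Agent": SCRUNCHBOT_UA},
  )
  try:
      with urllib.request.urlopen(req, timeout=10) as resp:
          print(resp.status)      # 200 means the page was served to this user agent
  except urllib.error.HTTPError as err:
      print(err.code)             # 403 or 429 often indicate bot protection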


What does “Limited Site Access Detected” mean?

This message appears when your website blocks automated crawlers.

To fix this, allowlist Scrunchbot in your CDN or security configuration. The crawler identifies itself with the following user agent string:

Mozilla/5.0 AppleWebKit/537.36 (KHTML, like Gecko; compatible; Scrunchbot/1.0; +https://scrunchai.com/bots)
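
Where you allow it depends on your stack. If robots.txt is what’s blocking the crawler, a minimal allow rule for the Scrunchbot token looks like the sketch below; CDN and bot protection tools (such as WAF rules) instead use their own allowlist settings keyed on the same user agent string.

  # robots.txt: explicitly allow Scrunch's crawler
  User-agent: Scrunchbot
  Allow: /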

Once access is allowed, Scrunch can crawl and analyze your pages normally.


How often does the Site Map update?

Site Map metrics update automatically throughout the day. Crawled pages remain in the Site Map and their performance metrics continue to refresh as new data is collected.


Why don’t the citation counts match the Citations tab?

The Site Map only shows citations for pages that appear in the crawl.

The Citations tab includes all URLs associated with your brand, even if they were not discovered during the Site Map crawl. Because of this, totals may differ slightly.
