What is Crawl Budget?
The number of pages a search engine will crawl on your site within a given time period.
Crawl budget is the number of URLs a search engine crawler will visit on your site during a crawl session or time period. It's determined by two factors: crawl rate limit (how fast the crawler can go without overloading your server) and crawl demand (how much the search engine wants to crawl your content based on its popularity and freshness).
For most small-to-medium websites, crawl budget is not a concern — search engines will crawl all your pages. It becomes important for large sites (100,000+ pages) or sites with performance issues. Wasting crawl budget on low-value pages (duplicates, parameters, thin content) means important pages may not get crawled and indexed.
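One common way to stop crawlers from spending budget on low-value URLs is a robots.txt disallow list. The paths below are placeholders for illustration, assuming a site with an internal search page and sort/session URL parameters; adjust them to your own URL structure.

```
# Keep all crawlers out of low-value, duplicate-generating URLs.
# These paths are examples only — substitute your own.
User-agent: *
Disallow: /search          # internal search results (thin/duplicate)
Disallow: /*?sort=         # parameter-based duplicates
Disallow: /*?sessionid=    # session-ID duplicates
```

Note that robots.txt only prevents crawling, not indexing of URLs discovered through links; pages you want removed from results also need a noindex directive served on a crawlable URL.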
AI crawlers also have effective crawl budgets. GPTBot crawls roughly 100 pages/hour, ClaudeBot around 500 pages/hour, and AhrefsBot can be even more aggressive. Managing these rates through the robots.txt Crawl-delay directive (which some, but not all, crawlers honor) or Switch rate limiting ensures crawlers don't overwhelm your server.
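Before rate-limiting anything, it helps to measure how fast each crawler is actually hitting your site. The sketch below tallies requests per known crawler from access-log lines by matching crawler names in the User-Agent field; the sample log lines and the `count_crawler_hits` helper are illustrative, not part of any particular server or tool.

```python
from collections import Counter

# Hypothetical sample of Combined Log Format lines; in practice,
# read these from your web server's access log.
LOG_LINES = [
    '1.2.3.4 - - [10/May/2025:10:00:01 +0000] "GET /a HTTP/1.1" 200 123 "-" "Mozilla/5.0 (compatible; GPTBot/1.0; +https://openai.com/gptbot)"',
    '1.2.3.4 - - [10/May/2025:10:00:02 +0000] "GET /b HTTP/1.1" 200 123 "-" "Mozilla/5.0 (compatible; ClaudeBot/1.0)"',
    '1.2.3.4 - - [10/May/2025:10:00:03 +0000] "GET /c HTTP/1.1" 200 123 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '1.2.3.4 - - [10/May/2025:10:00:04 +0000] "GET /d HTTP/1.1" 200 123 "-" "Mozilla/5.0 (compatible; GPTBot/1.0; +https://openai.com/gptbot)"',
]

# Crawler names to look for in the User-Agent string.
CRAWLERS = ["Googlebot", "Bingbot", "GPTBot", "ClaudeBot", "AhrefsBot"]

def count_crawler_hits(lines):
    """Tally requests per known crawler by substring match on each log line."""
    counts = Counter()
    for line in lines:
        for bot in CRAWLERS:
            if bot in line:
                counts[bot] += 1
                break  # count each request for at most one crawler
    return counts

if __name__ == "__main__":
    for bot, hits in count_crawler_hits(LOG_LINES).most_common():
        print(bot, hits)
```

Dividing each count by the time window covered by the log gives a pages-per-hour rate you can compare against the figures above when deciding which crawlers need throttling.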
How Switch Helps
Switch helps protect your crawl budget by rate-limiting aggressive AI and SEO crawlers, ensuring search engine crawlers get priority access to your content.
Related Agents
Googlebot
Google
Google's primary web crawler powering the world's largest search engine.
Bingbot
Microsoft
Microsoft Bing's search crawler, also powering Copilot AI answers.
AhrefsBot
Ahrefs
Ahrefs' SEO crawler building the world's largest backlink database.
ClaudeBot
Anthropic
Anthropic's web crawler collecting training data for Claude models.