How to Block AI2Bot
Complete guide to blocking AI2Bot (Allen AI) from crawling your website using robots.txt, server configuration, and Switch workflows.
Should You Block AI2Bot?
AI2Bot collects data for AI model training. Blocking it prevents your content from being used to train Allen AI's models, without affecting your search visibility.
This is a common and recommended action for sites that want to control how their content is used in AI training.
Blocking Methods
1. robots.txt
Effectiveness: high for cooperative crawlers. Add a Disallow rule for AI2Bot's user-agent string in your robots.txt file. This is the standard, cooperative method that well-behaved crawlers respect.
2. Server-side UA filtering
Effectiveness: high. Configure your web server (nginx, Apache, Cloudflare) to reject requests matching AI2Bot's user-agent patterns. This blocks at the server level, before your application processes the request.
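As a concrete sketch of server-side UA filtering, the nginx fragment below rejects any request whose User-Agent contains "ai2bot" (case-insensitive). The `map` block belongs in the `http` context; the pattern is an assumption based on the bot's name, so adjust it if Allen AI publishes other user-agent variants.

```nginx
# Flag requests whose User-Agent contains "ai2bot" (case-insensitive regex).
map $http_user_agent $block_ai2bot {
    default     0;
    "~*ai2bot"  1;
}

server {
    # ... your existing listen / server_name / root directives ...

    # Reject flagged requests before they reach the application.
    if ($block_ai2bot) {
        return 403;
    }
}
```

Apache and Cloudflare can achieve the same effect with their own UA-matching rules; the key idea is matching the user-agent substring before the request is served.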
3. Switch Journey Workflows
Effectiveness: highest, with granular, real-time control. Create a custom journey in Switch that detects AI2Bot and routes it to a block action, challenge, redirect, or modified content, without touching your server configuration.
robots.txt — Block AI2Bot
Add the following to your robots.txt file (at the root of your domain) to block AI2Bot:
User-agent: AI2Bot
Disallow: /

User-agent: ai2bot
Disallow: /
robots.txt — Allow with Restrictions
Alternatively, allow AI2Bot on most pages while blocking specific directories:
User-agent: AI2Bot
Disallow: /private/
Allow: /

User-agent: ai2bot
Disallow: /private/
Allow: /
AI2Bot User-Agent Strings
Use these patterns to identify AI2Bot in your server logs or firewall rules:
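To check whether AI2Bot is already hitting your site, you can search your access logs for its user-agent. The helper below is a minimal sketch assuming the User-Agent contains the substring "AI2Bot"; the log path in the usage comment is an example, not a fixed location.

```shell
# Count log lines whose User-Agent field contains "ai2bot" (case-insensitive).
count_ai2bot_hits() {
  grep -ci 'ai2bot' "$1"
}

# Usage (example path; point it at your own access log):
# count_ai2bot_hits /var/log/nginx/access.log
```

A nonzero count means the crawler has visited and your robots.txt or server rules are worth verifying.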
Frequently Asked Questions
Does blocking AI2Bot affect my Google search rankings?
No. Blocking AI2Bot does not affect your Google search rankings. Only blocking Googlebot impacts Google Search visibility.
Does AI2Bot respect robots.txt?
Yes, AI2Bot respects robots.txt directives. Adding a Disallow rule for its user-agent will prevent it from crawling blocked paths.
Can I allow AI2Bot on some pages but not others?
Yes. Use robots.txt to disallow specific directories, or use Switch journey workflows for granular page-level control with conditional logic.
Go beyond robots.txt
Switch detects AI2Bot in real-time and lets you build custom journey workflows — block, challenge, redirect, or serve modified content. No server changes required.
Get Started Free