How to Block Gemini-Deep-Research
Complete guide to blocking Gemini-Deep-Research (Google) from crawling your website using robots.txt, server configuration, and Switch workflows.
Should You Block Gemini-Deep-Research?
Blocking Gemini-Deep-Research prevents your content from appearing in Google's AI-generated answers. Each visit from this agent represents a real user asking about your content.
Consider allowing Gemini-Deep-Research for visibility, or use Switch to serve agent-optimized markdown content instead of blocking entirely.
Blocking Methods
1. robots.txt
Effectiveness: High for cooperative crawlers. Add a Disallow rule for Gemini-Deep-Research's user-agent string in your robots.txt file. This is the standard, cooperative method that well-behaved crawlers respect.
2. Server-side UA filtering
Effectiveness: High. Configure your web server (nginx, Apache, Cloudflare) to reject requests matching Gemini-Deep-Research's user-agent patterns. This blocks at the network level, before your application processes the request.
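As a sketch of the server-side approach, the nginx fragment below returns 403 for any request whose User-Agent contains the "Gemini-Deep-Research" token. The exact substring to match is an assumption; confirm it against the user-agent strings this crawler actually sends.

```nginx
# Assumed UA token: "Gemini-Deep-Research" (verify against your logs).
# The map sets a flag when the User-Agent matches, case-insensitively.
map $http_user_agent $block_gemini_deep_research {
    default 0;
    "~*Gemini-Deep-Research" 1;
}

server {
    listen 80;
    server_name example.com;

    # Reject matching requests before they reach the application.
    if ($block_gemini_deep_research) {
        return 403;
    }

    location / {
        root /var/www/html;
    }
}
```

An equivalent rule can be expressed in Apache with `RewriteCond %{HTTP_USER_AGENT}` or as a Cloudflare WAF custom rule on the User-Agent header.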
3. Switch Journey Workflows
Effectiveness: Highest (granular, real-time control). Create a custom journey in Switch that detects Gemini-Deep-Research and routes it to a block, challenge, redirect, or modified-content action, without touching your server configuration.
robots.txt — Block Gemini-Deep-Research
Add the following to your robots.txt file (at the root of your domain) to block Gemini-Deep-Research:
User-agent: Gemini-Deep-Research
Disallow: /
robots.txt — Allow with Restrictions
Alternatively, allow Gemini-Deep-Research on most pages while blocking specific directories:
User-agent: Gemini-Deep-Research
Disallow: /private/
Allow: /
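Before deploying, you can check that rules like these behave as intended with Python's standard `urllib.robotparser`. The snippet below is a minimal sketch that mirrors the allow-with-restrictions example above (the example.com URLs are placeholders):

```python
from urllib.robotparser import RobotFileParser

# robots.txt allowing Gemini-Deep-Research everywhere except /private/
rules = """User-agent: Gemini-Deep-Research
Disallow: /private/
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# /private/ paths are blocked; everything else is allowed
print(parser.can_fetch("Gemini-Deep-Research", "https://example.com/private/data"))  # False
print(parser.can_fetch("Gemini-Deep-Research", "https://example.com/blog/post"))     # True
```

The same check works against a live site by calling `parser.set_url("https://example.com/robots.txt")` followed by `parser.read()` instead of parsing an inline string.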
Gemini-Deep-Research User-Agent Strings
Use these patterns to identify Gemini-Deep-Research in your server logs or firewall rules:
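For log analysis, a simple substring or regex match on the User-Agent header is usually enough. The sketch below assumes the UA contains the literal token "Gemini-Deep-Research"; the full example UA strings shown are hypothetical, so verify the pattern against the strings you actually see in your logs.

```python
import re

# Assumed pattern: the "Gemini-Deep-Research" token anywhere in the UA,
# matched case-insensitively. Adjust if the real UA strings differ.
GEMINI_DR_PATTERN = re.compile(r"Gemini-Deep-Research", re.IGNORECASE)

def is_gemini_deep_research(user_agent: str) -> bool:
    """Return True if a User-Agent header appears to be Gemini-Deep-Research."""
    return bool(GEMINI_DR_PATTERN.search(user_agent or ""))

# Hypothetical UA strings for illustration only
print(is_gemini_deep_research("Mozilla/5.0 (compatible; Gemini-Deep-Research)"))  # True
print(is_gemini_deep_research("Mozilla/5.0 (Windows NT 10.0) Chrome/120.0"))      # False
```

The same predicate can drive a firewall rule, a middleware check, or a grep over access logs.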
Frequently Asked Questions
Does blocking Gemini-Deep-Research affect my Google search rankings?
No. Blocking Gemini-Deep-Research does not affect your Google search rankings. Only blocking Googlebot impacts Google Search visibility.
Does Gemini-Deep-Research respect robots.txt?
Yes, Gemini-Deep-Research respects robots.txt directives. Adding a Disallow rule for its user-agent will prevent it from crawling blocked paths.
Can I allow Gemini-Deep-Research on some pages but not others?
Yes. Use robots.txt to disallow specific directories, or use Switch journey workflows for granular page-level control with conditional logic.
Go beyond robots.txt
Switch detects Gemini-Deep-Research in real time and lets you build custom journey workflows: block, challenge, redirect, or serve modified content. No server changes required.
Get Started Free