For twenty years, every website was a library built for one type of visitor: humans. We designed pages for eyes. We optimized for clicks. We A/B tested button colors. Now a billion robots just walked into that library — and they read 10,000x faster than us.
“Think of it like a restaurant with two types of customers. Humans walk through the front door, sit down, read the menu, and order. Agents pull up to the loading dock and need the inventory manifest in a spreadsheet. Hand the agent a leather-bound menu and it can't parse it. Hand the human a spreadsheet and they leave.”
The framework
Every important page on your site actually needs to serve two different experiences. Same business. Same products. Two completely different front doors.
Give an agent the human version and it fumbles — hallucinating prices, misreading specs. Give a human the agent version and they bounce. Serve both the same thing and you're optimizing for neither.
What we believe
Some agents represent real customers. Some scrape your competitive advantage. Some are somewhere in between. The right response depends on intent, not just identity.
CRO (conversion rate optimization) personalized the web for humans. LLM-ization personalizes it for agents. Not treating them as lesser visitors, just different visitors.
Traditional bot detection asks one question: bot or not? Switch asks five: Who is this? What are they doing? What should they see? Under what conditions? And what happens if they break the rules?
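As a rough illustration of what answering those five questions per request could look like, here is a minimal TypeScript sketch. Every type and field name (Identity, Intent, AccessDecision, and so on) is invented for this example; none of it is Switch's actual API.

```typescript
// Illustrative only: names and values are invented for this sketch,
// not taken from Switch.

type Identity = "verified-agent" | "known-bot" | "unknown-automation" | "human";
type Intent =
  | "browsing" | "scraping" | "automating"
  | "indexing" | "monitoring" | "probing";
type ContentView = "human-page" | "agent-manifest" | "challenge" | "blocked";

interface AccessDecision {
  who: Identity;                                    // 1. Who is this?
  doing: Intent;                                    // 2. What are they doing?
  serve: ContentView;                               // 3. What should they see?
  conditions: {                                     // 4. Under what conditions?
    rateLimitPerMinute: number;
    allowedPaths: string[];
  };
  onViolation: "throttle" | "challenge" | "block";  // 5. What if they break the rules?
}

// Example decision for a well-behaved shopping agent (all values are made up).
const shoppingAgentDecision: AccessDecision = {
  who: "verified-agent",
  doing: "browsing",
  serve: "agent-manifest",
  conditions: { rateLimitPerMinute: 60, allowedPaths: ["/products", "/inventory.json"] },
  onViolation: "challenge",
};
```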
The long-term thesis
Three layers. Each one is valuable on its own. Each one makes the next one possible. If Layer 1 is mushy, the whole cake collapses.
Layer 1: Show site owners the reality of their non-human traffic. Free reporting. Who's visiting, what they're doing, what it's costing. It's the "wait, what?" moment that changes how you think about your site.
Layer 2: Block bad actors. Challenge unknowns. Route traffic intelligently. Start making deliberate choices about which agents get access, under what conditions, and to what content. (A sketch of such rules follows below.)
Layer 3: Full LLM-ization. Different content, formats, and journeys for different visitor types. Not blocking agents. Not ignoring them. Designing for them.
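For Layer 2, the kind of rules described above could conceptually look like the sketch below. The TrafficRule shape, field names, and paths are assumptions made for illustration, not Switch's actual configuration format.

```typescript
// Hypothetical Layer 2 rules: block, challenge, or route traffic by identity and intent.
// A sketch of the idea only; not Switch's real configuration.

interface TrafficRule {
  match: { identity?: string; intent?: string }; // omitted fields match anything
  action: "allow" | "block" | "challenge" | "route";
  routeTo?: string; // only used when action is "route"
}

const rules: TrafficRule[] = [
  { match: { intent: "probing" }, action: "block" },
  { match: { identity: "unknown-automation" }, action: "challenge" },
  {
    match: { identity: "verified-agent", intent: "browsing" },
    action: "route",
    routeTo: "/agent/catalog.json",
  },
  { match: {}, action: "allow" }, // default: humans and everything else pass through
];

// First matching rule wins; the catch-all at the end guarantees a result.
function decide(identity: string, intent: string): TrafficRule {
  return rules.find(
    (r) =>
      (r.match.identity === undefined || r.match.identity === identity) &&
      (r.match.intent === undefined || r.match.intent === intent)
  )!;
}
```

The point of an ordered rule list like this is that access becomes a deliberate policy rather than a binary bot-or-not check.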
Built on research
Switch is informed by the actual research shaping how agents interact with the web.
Building the Web for Agents — proposes declarative frameworks for agent-web interaction.
→ Validates the need for structured agent traffic management.
Documents agents evolving from text crawlers to multi-modal browser users.
→ Informs Switch's adaptive beacon scheduling via "Pyramidal Perception."
WebMCP: Chrome's protocol for structured agent-website interaction.
→ Switch detects WebMCP headers to distinguish compliant agents from rogue scrapers.
Intent classification framework — not just "is this a bot?" but "what is it doing?"
→ Switch classifies intent: browsing, scraping, automating, indexing, monitoring, or probing. (A rough sketch follows this list.)
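Here is a rough sketch of how WebMCP detection and intent classification could fit together. The header name "x-webmcp-session", the thresholds, and the heuristics are assumptions made for this example; they are not documented WebMCP fields or Switch's real model.

```typescript
// Illustrative sketch of intent classification from per-request signals.

type Intent =
  | "browsing" | "scraping" | "automating"
  | "indexing" | "monitoring" | "probing";

interface RequestSignals {
  headers: Record<string, string>; // lowercase header names
  path: string;
  requestsPerMinute: number;
}

function classifyIntent(req: RequestSignals): Intent {
  // Compliant agents announce themselves; rogue scrapers generally do not.
  const declaresWebMCP = "x-webmcp-session" in req.headers; // assumed header name

  if (req.path.includes("/.env") || req.path.includes("/wp-admin")) return "probing";
  if (req.requestsPerMinute > 300) return "scraping";
  if (declaresWebMCP) return "automating";
  if ((req.headers["user-agent"] ?? "").toLowerCase().includes("bot")) return "indexing";
  // "monitoring" would need longitudinal signals (the same URL at steady intervals),
  // which a single-request sketch like this does not capture.
  return "browsing";
}
```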
Who this is for
Control over AI agent traffic without rebuilding your site. Know what's coming, decide what gets in.
See who's visiting. Build custom journeys. Get detailed reporting. No code required.
Show clients what percentage of their traffic is non-human. Data nobody else surfaces.
Stop subsidizing competitors who scrape your pricing data. Protect your advantage.
See which AI agents consume your content. Serve optimized formats. Win in AI-driven search.
Who's behind this
Switch is being built by a founder currently in stealth. The product speaks for itself — the person behind it will too, soon.
Five-minute installation. No server changes. Start seeing which of your visitors aren't human, and decide what to do about them.
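To make "no server changes" concrete: conceptually, an install like this could be a single client-side snippet along the lines of the hypothetical sketch below. The script URL and file name are invented for illustration and are not a real Switch endpoint.

```typescript
// Purely hypothetical: load a client-side beacon without touching the server.
const beacon = document.createElement("script");
beacon.src = "https://cdn.example.com/switch-beacon.js"; // placeholder URL
beacon.async = true;
document.head.appendChild(beacon);
```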