Akamai
Overview
This documentation describes the implementation of an Akamai EdgeWorker designed to detect traffic from AI search engines and agentic web crawlers using User-Agent analysis. Instead of modifying content inline, the EdgeWorker routes AI systems to machine-optimized endpoints, improving how your website is consumed, interpreted, and referenced across the agentic web.
This approach enables:
Clearer machine-readable signals for AI models
Better control over how AI agents access and process your content
Separation of human vs. AI requester behavior in your analytics
Improved presence across AI-driven discovery platforms (ChatGPT, Perplexity, Gemini, Claude, Grok, etc.)
Architecture
High-Level Flow
Incoming Request → User-Agent Analysis → AI Bot Detection → Routing Decision → URL Rewrite
Key Components
AI Bot Detection Engine: Identifies modern agentic web crawlers via User-Agent rules.
Routing Logic: Applies URL rewrites for AI-optimized content.
Configuration Management: Centralized bot patterns and routing rules.
Logging & Observability: Tracks how AI agents browse, fetch, and interpret your content.
AI Search Engine Bot Detection & Routing — Akamai
Introduction
This EdgeWorker implementation identifies traffic from AI search engines and agentic web crawlers such as ChatGPT, Perplexity, Gemini, Claude, Grok, DeepSeek, and more.
As AI models increasingly pull, structure, and reuse web content to answer user prompts, this routing layer allows you to:
Serve AI-specific variants of content
Improve clarity of machine-consumable signals
Understand how AI systems traverse and interpret your site
Strengthen your visibility across the agentic web
Supported AI Search Engines
ChatGPT / OpenAI, Gemini / Google AI, AI Overviews, Perplexity, Claude, Grok, DeepSeek, and generic AI modes.
Configuration
AI Bot Pattern Configuration
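The pattern table below is a minimal sketch of what the centralized configuration might look like. The module name, the AI_BOT_PATTERNS identifier, and the exact User-Agent tokens are illustrative assumptions; keep the tokens aligned with each vendor's published crawler documentation.

```javascript
// config.js (illustrative module name) — maps a routing key to the User-Agent
// patterns used to recognize that AI system. Tokens are examples only.
export const AI_BOT_PATTERNS = [
  { name: 'chatgpt',    pattern: /GPTBot|OAI-SearchBot|ChatGPT-User/i },
  { name: 'perplexity', pattern: /PerplexityBot|Perplexity-User/i },
  { name: 'claude',     pattern: /ClaudeBot|Claude-Web|anthropic-ai/i },
  { name: 'gemini',     pattern: /GoogleOther/i },   // verify against Google's crawler docs
  { name: 'grok',       pattern: /Grok/i },          // placeholder token
  { name: 'deepseek',   pattern: /DeepSeek/i }       // placeholder token
];
```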
Routing Rules Configuration
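Again as a sketch, the routing rules can live next to the pattern table. The shape below (a default method plus per-method rewrite helpers) is an assumption made for this example, not a required schema.

```javascript
// Routing rules (illustrative). Each helper builds the rewritten target for a
// detected bot; which one applies is controlled by defaultMethod or per-bot overrides.
export const ROUTING_RULES = {
  defaultMethod: 'pathPrefix',   // 'pathPrefix' | 'subdomain' | 'parameter'
  pathPrefix: (bot, path)  => `/${bot}${path}`,                        // /chatgpt/article/ai-trends
  subdomain:  (bot, host)  => `${bot}.${host.replace(/^www\./, '')}`,  // perplexity.example.com
  parameter:  (bot, query) => (query ? `${query}&ai=${bot}` : `ai=${bot}`)
};
```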
Implementation
Core Detection Function
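A minimal version of the detection function, assuming the AI_BOT_PATTERNS table sketched above (imported here from a hypothetical config.js module). Note that EdgeWorkers' request.getHeader() returns an array of header values.

```javascript
import { AI_BOT_PATTERNS } from './config.js';

// Returns the matching bot name (e.g. 'perplexity'), or null for regular traffic.
export function detectAiBot(request) {
  const uaValues = request.getHeader('User-Agent'); // array of values, or undefined
  const ua = uaValues ? uaValues[0] : '';
  if (!ua) {
    return null;
  }
  const match = AI_BOT_PATTERNS.find((entry) => entry.pattern.test(ua));
  return match ? match.name : null;
}
```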
URL Routing Function
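A path-prefix version of the routing step, shown as a sketch. request.route() rewrites the forward request to origin without issuing a client-facing redirect; the X-Original-URL header is added so the origin and your logging can reconstruct the original navigation. The header names follow the Monitoring & Analytics section below.

```javascript
// Rewrites the forward path for a detected AI agent (path-prefix method).
export function routeAiBot(request, botName) {
  request.setHeader('X-Original-URL', request.url); // preserve the original URL for the origin
  request.setHeader('X-AI-Bot', botName);           // tag the forward request for analytics
  request.route({ path: `/${botName}${request.path}` });
}
```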
Complete EdgeWorker Implementation
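Putting the pieces together, a complete handler could look like the sketch below: detection in onClientRequest, routing via request.route(), and response tagging in onClientResponse. Module, helper, and header names follow the assumptions made earlier on this page.

```javascript
// main.js — end-to-end sketch: detect, log, route, and tag the response.
import { logger } from 'log';
import { AI_BOT_PATTERNS } from './config.js';

function detectAiBot(request) {
  const uaValues = request.getHeader('User-Agent');
  const ua = uaValues ? uaValues[0] : '';
  const match = ua && AI_BOT_PATTERNS.find((entry) => entry.pattern.test(ua));
  return match ? match.name : null;
}

export function onClientRequest(request) {
  const bot = detectAiBot(request);
  if (!bot) {
    return; // human traffic: leave the request untouched
  }
  logger.log(`AI bot detected: ${bot} path: ${request.path}`);
  request.setHeader('X-Original-URL', request.url);
  request.route({ path: `/${bot}${request.path}` });
}

export function onClientResponse(request, response) {
  // Re-run detection so the observability headers are also visible on the response.
  const bot = detectAiBot(request);
  if (bot) {
    response.setHeader('X-AI-Bot', bot);
    response.setHeader('X-AI-Optimized', 'true');
  }
}
```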
Routing Methods (Agentic Web–Ready)
Path Prefix Routing
Best for agentic clarity:
/article/ai-trends → /chatgpt/article/ai-trends
Subdomain Routing
Useful for large multi-property ecosystems:
www.example.com/page → perplexity.example.com/page
Parameter Routing
Lightweight AI classification:
/documentation → /documentation?ai=claude
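The three methods map onto request.route() as in the sketch below. The subdomain variant assumes the per-bot hostnames (for example perplexity.example.com) are configured as named origins in Property Manager, since route() can only target origins the property already defines; the origin name used here is hypothetical.

```javascript
// Applies one of the three routing methods to a detected bot (called from onClientRequest).
function applyRouting(request, botName, method) {
  switch (method) {
    case 'pathPrefix':
      request.route({ path: `/${botName}${request.path}` });
      break;
    case 'subdomain':
      // Assumes an origin named after the bot exists in Property Manager.
      request.route({ origin: `${botName}-origin` });
      break;
    case 'parameter':
      request.route({ query: request.query ? `${request.query}&ai=${botName}` : `ai=${botName}` });
      break;
  }
}
```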
Environment Configuration
Production
Focus on stability, clarity for AI models, and observability.
Staging
Used for expanding bot patterns, experimenting with new AI indexing behaviors, etc.
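As an illustration, the per-environment differences can be captured in a small settings object like the one below; the keys and values are assumptions for this sketch rather than a required schema.

```javascript
// Illustrative per-environment settings.
export const ENV_CONFIG = {
  production: {
    routingMethod: 'pathPrefix',
    logSampling: 0.1,            // sample AI-agent logs to control volume
    experimentalPatterns: false  // only vendor-documented User-Agent tokens
  },
  staging: {
    routingMethod: 'pathPrefix',
    logSampling: 1.0,            // log every AI request while validating behavior
    experimentalPatterns: true   // trial new or unverified bot patterns here first
  }
};
```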
Monitoring & Analytics
AI-Specific Headers
X-AI-Bot - Identifies which AI system accessed the content
X-AI-Optimized - Indicates that agentic routing occurred
X-Original-URL - For reconstructing navigation flow
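If you set these headers in the EdgeWorker itself, a minimal sketch (reusing the detectAiBot() helper from the implementation section) looks like this; X-Original-URL is added to the forward request in onClientRequest, so only the two response headers appear here.

```javascript
// Tags the client response for analytics and downstream reporting.
export function onClientResponse(request, response) {
  const bot = detectAiBot(request);
  if (bot) {
    response.setHeader('X-AI-Bot', bot);
    response.setHeader('X-AI-Optimized', 'true');
  }
}
```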
Example Log
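No fixed log format is prescribed here. With the implementation sketch above, the path-prefix example from earlier on this page would produce a log line along these lines (how it reaches you depends on how EdgeWorkers log delivery is configured for your property):

```
AI bot detected: chatgpt path: /article/ai-trends
```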
Deployment
Upload EdgeWorker bundle
Apply routing rule via Property Manager
Enable monitoring
Deploy to staging, validate AI routing
Roll out progressively to production
Performance & Security (Agentic Web Context)
AI-specific rate limits
Pattern validation
Known bot IP verification
Agent behavior anomaly detection
That’s it! You have now successfully configured the Limy router for your website. Data should begin to populate on your dashboard within an hour.