robots.txt


robots.txt is a plain-text file at a site's root that tells web crawlers which paths they may access. For GEO, it's crucial that this file allows AI bots to crawl your content.

Why It Matters for GEO

Many sites block AI crawlers, whether deliberately or through copied boilerplate rules. A GEO-optimized robots.txt explicitly allows GPTBot (OpenAI), ClaudeBot and anthropic-ai (Anthropic), PerplexityBot (Perplexity), and Google-Extended (Google's AI training control token).

GEO-Optimized robots.txt

User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: anthropic-ai
Allow: /

User-agent: PerplexityBot
Allow: /

User-agent: Google-Extended
Allow: /
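You can verify that a robots.txt actually permits these crawlers before deploying it. A minimal sketch using Python's standard-library robots.txt parser, with the file above as a string (example.com is a placeholder domain):

```python
from urllib import robotparser

# The GEO-optimized robots.txt from above; in production this lives at
# https://example.com/robots.txt (hypothetical domain).
ROBOTS_TXT = """\
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: anthropic-ai
Allow: /

User-agent: PerplexityBot
Allow: /

User-agent: Google-Extended
Allow: /
"""

parser = robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Each AI crawler should be allowed to fetch any path.
for bot in ["GPTBot", "ClaudeBot", "anthropic-ai", "PerplexityBot", "Google-Extended"]:
    print(bot, parser.can_fetch(bot, "https://example.com/any-page"))
```

Running this against a live site instead is a matter of `parser.set_url("https://example.com/robots.txt")` followed by `parser.read()`; the same `can_fetch` checks then reflect what each crawler will actually see.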