Generate robots.txt files with smart defaults. Block AI crawlers like GPTBot and ClaudeBot with a single flag. Get framework-specific rules for Next.js, Gatsby, Nuxt, and Astro. Lock down staging environments automatically.
Zero dependencies. Just Node.js built-ins.
Install globally:

```bash
npm install -g @lxgicstudios/robots-gen
```

Or run directly with npx:

```bash
npx @lxgicstudios/robots-gen --block-ai --sitemap https://example.com/sitemap.xml
```

```bash
# Generate with smart defaults
robots-gen

# Block AI crawlers and add sitemap
robots-gen --block-ai --sitemap https://example.com/sitemap.xml

# Staging environment (blocks all crawlers)
robots-gen --env staging

# Next.js specific rules
robots-gen --framework next --sitemap https://example.com/sitemap.xml

# Block specific bots
robots-gen --block SemrushBot --block AhrefsBot

# Custom allow/disallow rules
robots-gen --allow "/" --disallow "/admin" --disallow "/private"

# Output to stdout as JSON
robots-gen --json --stdout
```

- Smart default rules for common paths (/admin, /private, /tmp)
- One-flag AI crawler blocking (18+ known AI bots)
- Framework presets for Next.js, Gatsby, Nuxt, and Astro
- Staging/development environment presets that block everything
- Sitemap reference support
- Custom bot blocking
- Crawl delay configuration
- JSON output for programmatic use
- Zero external dependencies
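Running `robots-gen` with no flags should yield something close to the following (a sketch based on the default paths listed above; the exact formatting is an assumption):

```txt
User-agent: *
Disallow: /admin
Disallow: /private
Disallow: /tmp
```

Adding `--sitemap https://example.com/sitemap.xml` appends a matching `Sitemap:` line.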
| Option | Description | Default |
|---|---|---|
| `--output <file>` | Output file path | `robots.txt` |
| `--sitemap <url>` | Add sitemap reference (repeatable) | |
| `--block-ai` | Block all known AI crawlers | `false` |
| `--block <bot>` | Block a specific bot (repeatable) | |
| `--allow <path>` | Add an Allow rule (repeatable) | |
| `--disallow <path>` | Add a Disallow rule (repeatable) | |
| `--env <env>` | Environment (production, staging, development) | `production` |
| `--framework <name>` | Framework rules (next, gatsby, nuxt, astro) | |
| `--crawl-delay <n>` | Crawl delay in seconds | |
| `--host <domain>` | Set preferred host | |
| `--json` | Output as JSON | `false` |
| `--stdout` | Print to stdout instead of writing a file | `false` |
| `--help` | Show help message | |
When you use `--block-ai`, the following bots are blocked:
GPTBot, ChatGPT-User, ClaudeBot, Claude-Web, Anthropic-AI, Google-Extended, CCBot, PerplexityBot, Bytespider, Applebot-Extended, FacebookBot, Meta-ExternalAgent, Amazonbot, Cohere-AI, AI2Bot, Diffbot, Omgilibot, YouBot
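In robots.txt terms, blocking a bot means a `User-agent` group with a blanket `Disallow`. A rough sketch of what `--block-ai` produces (the one-group-per-bot layout is an assumption; a generator may equally stack multiple `User-agent` lines over a single `Disallow`):

```txt
User-agent: GPTBot
Disallow: /

User-agent: ChatGPT-User
Disallow: /

User-agent: ClaudeBot
Disallow: /

# ...and so on for the remaining bots in the list above
```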
| Framework | Disallowed Paths |
|---|---|
| Next.js | /_next/static/, /_next/image/, /api/, /_next/data/ |
| Gatsby | /page-data/, /.cache/, /static/ |
| Nuxt | /_nuxt/, /api/, /__nuxt_error |
| Astro | /_astro/, /api/ |
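For example, `--framework next` should emit rules along these lines (a sketch inferred from the table above, not verified output of the tool):

```txt
User-agent: *
Disallow: /_next/static/
Disallow: /_next/image/
Disallow: /api/
Disallow: /_next/data/
```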
MIT
Built by LXGIC Studios
💡 Want more free tools like this? We have 100+ on our GitHub: github.com/lxgicstudios