Robots.txt Generator & Tester
Control how search engines and AI bots crawl your website.
Bot Permissions
Example: /admin/, /login/, /private-files/
Robots.txt Master Guide
User-agent
The 'Who'. Identifies which crawler the rule applies to. '*' matches all bots.
Disallow
The 'Where Not'. Tells the bot which folders or files to ignore.
Allow
The 'Exception'. Overrides a Disallow rule for a specific sub-folder or file.
Sitemap
The 'Map'. Points bots to your XML sitemap location.
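Putting the four directives together, a minimal robots.txt might look like this (the paths and sitemap URL are placeholders, not defaults):

```
User-agent: *
Disallow: /admin/
Allow: /admin/public/
Sitemap: https://example.com/sitemap.xml
```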
Advanced Pattern Matching
Wildcard (*)
Matches any sequence of characters.
Disallow: /*?* # Blocks URLs with params
End of String ($)
Indicates the end of a URL path.
Disallow: /*.pdf$ # Blocks PDF files
Google Best Practices
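To see how these wildcard rules behave, here is a minimal sketch that translates a robots.txt pattern into a regular expression and tests URL paths against it. This is an illustrative simplification, not Google's actual matcher:

```python
import re

def robots_pattern_to_regex(pattern):
    """Convert a robots.txt path pattern to a compiled regex.

    '*' matches any sequence of characters; a trailing '$'
    anchors the match to the end of the URL path.
    """
    anchored = pattern.endswith("$")
    if anchored:
        pattern = pattern[:-1]
    # Escape every character except '*', which becomes '.*'
    body = "".join(".*" if ch == "*" else re.escape(ch) for ch in pattern)
    return re.compile("^" + body + ("$" if anchored else ""))

blocked_params = robots_pattern_to_regex("/*?*")     # URLs with a query string
blocked_pdf = robots_pattern_to_regex("/*.pdf$")     # URLs ending in .pdf

print(bool(blocked_params.match("/search?q=shoes")))     # matched: has params
print(bool(blocked_pdf.match("/files/report.pdf")))      # matched: ends in .pdf
print(bool(blocked_pdf.match("/files/report.pdf?v=2")))  # not matched: $ anchor
```

Note how the `$` anchor is what stops `/*.pdf$` from catching `report.pdf?v=2` — without it, the rule would block that URL too.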
- Don't block CSS/JS: Google needs them to render your page layout.
- Case Sensitive: Robots.txt is case-sensitive. /Admin/ is different from /admin/.
Critical Warnings
Noindex via Robots.txt
Do NOT use robots.txt to de-index pages: a blocked page can still appear in search results if other sites link to it, because crawlers simply never fetch it to see any noindex signal. Use the meta robots tag instead.
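A minimal sketch of the meta tag approach; it goes in the <head> of the page you want removed from search results, and the page must remain crawlable so bots can actually read it:

```html
<!-- Tells compliant crawlers to drop this page from their index -->
<meta name="robots" content="noindex">
```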
Trailing Slash
Disallow: /admin/ blocks only URLs inside the folder. Disallow: /admin blocks every URL that starts with /admin, including /administrator and /admin.html. Be careful.
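The difference comes down to prefix matching. A small sketch (an assumed simplification; real matchers also handle wildcards) makes it concrete:

```python
# Robots.txt Disallow rules are prefix matches against the URL path.
rule_with_slash = "/admin/"   # blocks only paths inside the folder
rule_without = "/admin"       # blocks anything that merely starts with /admin

for path in ["/admin/users", "/administrator", "/admin.html"]:
    print(path,
          "| blocked by /admin/:", path.startswith(rule_with_slash),
          "| blocked by /admin:", path.startswith(rule_without))
```

Only `/admin/users` is caught by both rules; `/administrator` and `/admin.html` slip past `/admin/` but are blocked by the slash-less `/admin`.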