Robots.txt Validator
Validate your robots.txt file. Check syntax, directives, sitemap reference, and potential crawl issues.
What We Check
Every check comes with a pass/fail result and specific fix instructions.
Accessible
robots.txt must be at the root of your domain and return HTTP 200.
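For example, crawlers only look for the file at the domain root (example.com is a placeholder):

https://example.com/robots.txt - checked by every crawler
https://example.com/seo/robots.txt - never consulted

A quick way to confirm the 200 status is a HEAD request, e.g. curl -I https://example.com/robots.txt.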
Content-Type
Should be served as text/plain. Other content types may be ignored by crawlers.
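A well-configured server returns a response header along these lines (the charset parameter is optional):

HTTP/1.1 200 OK
Content-Type: text/plain; charset=utf-8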
File Size
Google only processes the first 500 KiB of a robots.txt file; rules beyond that limit are ignored. Keep it concise.
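One way to check the size from the command line (example.com is a placeholder):

curl -s https://example.com/robots.txt | wc -c

If the byte count approaches 512,000 (500 KiB), trim comments and consolidate rules.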
Has User-agent
At minimum, you need User-agent: * to address all crawlers.
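A minimal, permissive file looks like this; the empty Disallow value allows everything, and the * group applies to any crawler without a more specific group of its own:

User-agent: *
Disallow: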
No Blanket Block
Under User-agent: *, Disallow: / blocks every crawler from your entire site. Usually unintentional.
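For comparison, a scoped block only excludes what you intend (the /admin/ path is a placeholder):

# Blocks the entire site
User-agent: *
Disallow: /

# Blocks only one directory, leaving the rest crawlable
User-agent: *
Disallow: /admin/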
Has Sitemap
Including a Sitemap: directive lets crawlers discover your sitemap automatically.
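The directive takes an absolute URL and can appear anywhere in the file, outside of any User-agent group (example.com is a placeholder):

Sitemap: https://example.com/sitemap.xml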
Valid Syntax
Only valid directives are recognized: User-agent, Disallow, Allow, Sitemap, Crawl-delay.
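Typos are the most common failure mode, because crawlers silently skip lines they don't recognize (the /private/ path is a placeholder):

User-agent: *
# "Dissallow" is misspelled, so the rule below is ignored and /private/ stays crawlable
Dissallow: /private/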
Allows Assets
Blocking CSS or JS files prevents Google from rendering your page properly, which hurts mobile-first indexing.
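A sketch of the fix, assuming placeholder paths: Google applies the most specific (longest) matching rule, so a longer Allow overrides a broader Disallow:

User-agent: *
Disallow: /static/
Allow: /static/css/
Allow: /static/js/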
Why It Matters
Numbers that make a difference for your website.
8 Checks: comprehensive validation
500 KiB Limit: Google's max file size
Critical: wrong config = no indexing
Always: no limits, no signup
Ready to audit your site?
Enter your URL above and get results in seconds. Completely free.
Start Audit