What is this tool?
The robots.txt Checker fetches and parses a site's robots.txt file and displays the Allow/Disallow crawl rules for each User-agent group, along with any Sitemap directives. It is useful for SEO auditing and for verifying that crawlers are configured the way you intend.
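For readers who want to reproduce the check outside the web UI, the sketch below approximates what a checker like this does: fetch /robots.txt, group Allow/Disallow rules by User-agent, and collect Sitemap lines. The target URL, function names, and the simple line-based grouping are illustrative assumptions, not this tool's actual implementation.

```python
# Illustrative sketch only: fetch a robots.txt and group its rules by
# User-agent. The target URL and parsing details are assumptions, not
# this tool's actual implementation.
from urllib.parse import urljoin
from urllib.request import urlopen


def parse_robots(raw: str):
    rules = {}            # user-agent -> list of (directive, path) pairs
    sitemaps = []         # Sitemap directives apply to the whole file
    agents = []           # user-agents of the group currently being read
    reading_group_header = False

    for line in raw.splitlines():
        line = line.split("#", 1)[0].strip()        # strip comments and whitespace
        if not line or ":" not in line:             # skip blank or malformed lines
            continue
        field, _, value = line.partition(":")
        field, value = field.strip().lower(), value.strip()

        if field == "user-agent":
            if not reading_group_header:            # a new rule group starts here
                agents = []
                reading_group_header = True
            agents.append(value)
            rules.setdefault(value, [])
        elif field in ("allow", "disallow"):
            reading_group_header = False
            for agent in agents:
                rules[agent].append((field, value))
        elif field == "sitemap":
            sitemaps.append(value)

    return rules, sitemaps


if __name__ == "__main__":
    site = "https://example.com/"                   # placeholder site URL
    raw = urlopen(urljoin(site, "/robots.txt")).read().decode("utf-8", errors="replace")
    rules, sitemaps = parse_robots(raw)
    for agent, directives in rules.items():
        print(f"User-agent: {agent}")
        for directive, path in directives:
            print(f"  {directive}: {path or '(empty)'}")
    print("Sitemaps:", sitemaps or "none")
```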
How to Use
- Enter a site URL (http or https) in the input field.
- Click the 'Check' button.
- View the raw robots.txt content and the parsed rules; a sample file and how it would be interpreted are shown below.
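As a point of reference, a hypothetical robots.txt (example.com is a placeholder) might look like this. The parsed view of such a file would show one rule group for Googlebot, one wildcard group covering all other crawlers, and a single Sitemap URL.

```
User-agent: Googlebot
Disallow: /private/
Allow: /private/public-page.html

User-agent: *
Disallow: /tmp/

Sitemap: https://example.com/sitemap.xml
```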
Examples
- Verify your own site's robots.txt configuration.
- Check which paths are allowed or disallowed for competitor sites.
- Verify that Sitemap directives are specified correctly (for a scripted equivalent, see the sketch after this list).
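These checks can also be scripted. The sketch below uses Python's standard urllib.robotparser to test whether a given path is crawlable for a given user-agent and to list any declared sitemaps; the domain, user-agent, and path are placeholder assumptions.

```python
# Scripted alternative to the web tool, using Python's standard library.
# The site, user-agent, and path below are placeholders for illustration.
import urllib.robotparser

rp = urllib.robotparser.RobotFileParser()
rp.set_url("https://example.com/robots.txt")
rp.read()                                            # fetch and parse the file

# Is this path crawlable for this user-agent under the parsed rules?
print(rp.can_fetch("Googlebot", "https://example.com/private/page.html"))

# List Sitemap directives found in the file
# (returns None if there are none; requires Python 3.8+).
print(rp.site_maps())
```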