ScaleWhite Tools

robots.txt Checker

Fetch and parse a site's robots.txt to display crawl rules.

What is this tool?

The robots.txt Checker fetches and parses a site's robots.txt file, displaying the crawl allow/disallow rules for each User-Agent group along with any Sitemap directives. It is useful for SEO audits and for verifying crawler configuration.
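As a minimal sketch of the kind of parsing involved, Python's standard-library urllib.robotparser can evaluate allow/disallow rules and list Sitemap directives; the sample robots.txt content and URLs below are illustrative, not taken from this tool:

```python
from urllib import robotparser

# Hypothetical robots.txt content for illustration
SAMPLE = """\
User-agent: *
Allow: /private/public-page.html
Disallow: /private/
Sitemap: https://example.com/sitemap.xml
"""

rp = robotparser.RobotFileParser()
rp.modified()                  # mark the rule set as loaded
rp.parse(SAMPLE.splitlines())

# Per-agent allow/disallow checks
print(rp.can_fetch("*", "https://example.com/private/secret.html"))       # False
print(rp.can_fetch("*", "https://example.com/private/public-page.html"))  # True

# Sitemap directives (site_maps() is available on Python 3.8+)
print(rp.site_maps())  # ['https://example.com/sitemap.xml']
```

Note that urllib.robotparser applies the first matching rule in file order, while some crawlers (e.g. Googlebot) use longest-path matching, so rule order can matter when the two styles would disagree.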

How to Use

  1. Enter a site URL (http or https) in the input field.
  2. Click the 'Check' button.
  3. View the raw robots.txt content and parsed rules.
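Steps 1 and 2 imply resolving the robots.txt location from whatever page URL is entered, since robots.txt always lives at the root of the host. A minimal sketch of that resolution step (the robots_url helper is hypothetical, not part of the tool):

```python
from urllib.parse import urlsplit, urlunsplit

def robots_url(site_url: str) -> str:
    """Resolve the robots.txt URL for a given site URL.

    The Robots Exclusion Protocol places robots.txt at the root
    of the scheme + host, so any path or query is discarded.
    """
    parts = urlsplit(site_url)
    return urlunsplit((parts.scheme, parts.netloc, "/robots.txt", "", ""))

print(robots_url("https://example.com/some/page?q=1"))
# https://example.com/robots.txt
```

The raw content at that URL can then be fetched and fed to a parser for display.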

Examples

  • Verify your own site's robots.txt configuration.
  • Check allowed/disallowed paths for competitor sites.
  • Confirm that Sitemap directives are correctly specified.