Detect pages blocked by robots.txt or noindex tags across many URLs.
This tool flags URLs blocked by robots.txt, <meta name="robots" content="noindex"> tags, or X-Robots-Tag response headers.
Use it to prevent accidental de-indexing of important pages and to verify indexability at scale.
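As a rough sketch of what such a check involves, the snippet below tests the three signals the tool looks at: a robots.txt disallow rule (via Python's standard `urllib.robotparser`), a robots meta tag, and an `X-Robots-Tag` header. The function names are illustrative, not the tool's actual API, and the meta-tag regex is a simplified check that assumes the `name` attribute appears before `content`.

```python
import re
from urllib.robotparser import RobotFileParser

def is_crawl_blocked(robots_txt: str, url: str, user_agent: str = "*") -> bool:
    """Return True if the given robots.txt text disallows crawling the URL."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return not parser.can_fetch(user_agent, url)

# Simplified pattern: assumes name="robots" precedes the content attribute.
NOINDEX_META = re.compile(
    r'<meta[^>]+name=["\']robots["\'][^>]+content=["\'][^"\']*noindex',
    re.IGNORECASE,
)

def has_noindex_meta(html: str) -> bool:
    """Return True if the page carries a robots noindex meta tag."""
    return bool(NOINDEX_META.search(html))

def has_noindex_header(headers: dict) -> bool:
    """Return True if an X-Robots-Tag response header contains noindex."""
    return "noindex" in headers.get("X-Robots-Tag", "").lower()
```

Running all three checks per URL is what lets a bulk audit distinguish "crawl-blocked", "noindex via meta", and "noindex via header" rather than lumping them together.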
Robots.txt rules are matched against URL paths, including * and $ wildcards.
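To illustrate how those wildcards behave, here is a minimal sketch that translates a robots.txt path pattern into a regex: `*` matches any character sequence, a trailing `$` anchors the match to the end of the path, and patterns without `$` match as prefixes. The helper names are hypothetical.

```python
import re

def robots_pattern_to_regex(pattern: str) -> re.Pattern:
    """Translate a robots.txt path pattern into an anchored regex."""
    anchored = pattern.endswith("$")
    if anchored:
        pattern = pattern[:-1]
    # Escape regex metacharacters, then restore '*' as "match anything".
    body = re.escape(pattern).replace(r"\*", ".*")
    return re.compile("^" + body + ("$" if anchored else ""))

def path_matches(pattern: str, path: str) -> bool:
    """Return True if the URL path matches the robots.txt pattern."""
    return bool(robots_pattern_to_regex(pattern).match(path))
```

For example, `/*.pdf$` matches `/files/doc.pdf` but not `/files/doc.pdfx`, while a plain `/public` rule matches any path beginning with `/public`.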
Tip: robots.txt blocking affects crawling; noindex affects indexing. A page can be crawl-blocked yet still indexed if it is discovered via other signals.