Check whether a website has a valid and accessible robots.txt file.
The robots.txt file tells search engine crawlers which parts of your website should or shouldn't be crawled. It helps manage crawler traffic and keep crawlers away from private or duplicate content (note that blocking crawling alone does not guarantee a page stays out of search indexes).
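The crawl rules described above can be parsed with Python's standard library. A minimal sketch, assuming an illustrative rule set (the `User-agent: *` / `Disallow: /private/` rules below are made up for demonstration, not taken from any real site):

```python
from urllib.robotparser import RobotFileParser

# Illustrative robots.txt content: block all compliant crawlers
# from the /private/ section only.
rules = """
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# can_fetch(user_agent, url) answers: may this crawler fetch this URL?
print(parser.can_fetch("*", "https://example.com/index.html"))    # True
print(parser.can_fetch("*", "https://example.com/private/page"))  # False
```

The same parser can load a live file via `parser.set_url(...)` followed by `parser.read()`, which is how a checker tool would typically consume a site's real robots.txt.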
For example, the following rule group blocks all compliant crawlers from the entire site (a `Disallow` directive only applies within a `User-agent` group):

User-agent: *
Disallow: /
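Checking that a site's robots.txt is accessible amounts to requesting it and inspecting the response. A minimal sketch using only the standard library; the function and helper names (`robots_url`, `check_robots_txt`) are hypothetical, not part of any particular tool:

```python
from urllib.parse import urljoin
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError


def robots_url(base_url: str) -> str:
    """Build the robots.txt URL for a site root, tolerating a trailing slash."""
    return urljoin(base_url.rstrip("/") + "/", "robots.txt")


def check_robots_txt(base_url: str, timeout: float = 10.0):
    """Fetch the site's robots.txt and report whether it is accessible.

    Returns (ok, detail): ok is True when the file exists and the server
    answers with a readable body; detail is a short status string.
    """
    req = Request(robots_url(base_url), headers={"User-Agent": "robots-txt-checker"})
    try:
        with urlopen(req, timeout=timeout) as resp:
            body = resp.read().decode("utf-8", errors="replace")
            return True, f"HTTP {resp.status}, {len(body)} bytes"
    except HTTPError as exc:          # server responded, but e.g. 404 or 403
        return False, f"HTTP {exc.code}"
    except URLError as exc:           # DNS failure, refused connection, timeout
        return False, f"unreachable: {exc.reason}"
```

A 404 here is not necessarily an error for the site itself (crawlers treat a missing robots.txt as "everything allowed"), but the checker reports it so you know the file is absent rather than misconfigured.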