What is being Disallowed by robots.txt and how to fix it

The "Disallowed by robots.txt" issue means that certain URLs on your website are blocked from crawling by a Disallow rule in the robots.txt file.

The importance of the issue

A Disallow rule in a robots.txt file tells search engines not to crawl a specific page. This is a common practice for pages you do not want search robots to visit. However, if you block the crawling of pages you plan to generate traffic from, it can lead to indexing problems and drops in rankings.
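
For illustration, a minimal robots.txt rule that blocks an entire directory from all crawlers could look like this (the /private/ path is only an example):

  User-agent: *
  Disallow: /private/

Any URL under /private/ would then be reported as disallowed by robots.txt.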

How to check the issue

You can check a specific URL with Google's robots.txt testing tool: https://www.google.com/webmasters/tools/robots-testing-tool. Your website must be added to Search Console beforehand.
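
If you want to test URLs outside of Search Console, a minimal sketch using Python's standard urllib.robotparser module can check a URL against a live robots.txt file; the domain and URL below are placeholders:

  from urllib.robotparser import RobotFileParser

  # Placeholder domain; replace with your own site.
  robots_url = "https://www.example.com/robots.txt"

  parser = RobotFileParser()
  parser.set_url(robots_url)
  parser.read()  # fetches and parses the live robots.txt

  # Check whether Googlebot is allowed to crawl a specific URL.
  url_to_check = "https://www.example.com/blog/my-post/"
  if parser.can_fetch("Googlebot", url_to_check):
      print("Allowed:", url_to_check)
  else:
      print("Disallowed by robots.txt:", url_to_check)

The same check can be repeated for other crawlers by changing the user agent passed to can_fetch.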

How to fix this issue

Before fixing this issue, make sure that crawling was not blocked on purpose, which is a common SEO practice!

Remove the rule that blocks the useful page. Verify that the page can now be accessed with the robots.txt testing tool: https://www.google.com/webmasters/tools/robots-testing-tool. Also make sure that pages which should not be crawled remain blocked; an example is shown below. The rule syntax for robots.txt files is documented in Google's support article: https://support.google.com/webmasters/answer/6062596?hl=en&ref_topic=6061961
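
As an illustration, suppose the blog section was blocked by mistake while the admin area should stay blocked (both paths are examples). Removing only the rule for the useful pages would look like this:

Before the fix:

  User-agent: *
  Disallow: /admin/
  Disallow: /blog/

After the fix:

  User-agent: *
  Disallow: /admin/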
