What is being Disallowed by robots.txt and how to fix it
The "Disallowed by robots.txt" issue means that some of your URLs are blocked from crawling by a rule in the robots.txt file.

The importance of the issue

A Disallow rule in a robots.txt file tells a search engine not to crawl a specific page. This is common practice for pages you do not want search robots to visit. However, if you block the crawling of pages you plan to generate traffic from, it can lead to indexing issues and drops in ranking.
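For illustration, a robots.txt file with Disallow rules might look like the following (the paths are hypothetical examples, not a recommendation):

```
User-agent: *
Disallow: /admin/
Disallow: /cart/
```

Any URL whose path starts with /admin/ or /cart/ would be excluded from crawling for all crawlers that respect robots.txt.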

How to check the issue

You can check a specific URL with Google's robots.txt testing tool: https://www.google.com/webmasters/tools/robots-testing-tool. Your website must be added to Search Console beforehand.
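You can also check URLs programmatically. Here is a minimal sketch using Python's standard `urllib.robotparser` module; the domain, paths, and rules are hypothetical examples:

```python
# Check whether specific URLs are blocked by a set of robots.txt rules.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
# parse() accepts the robots.txt content as a list of lines;
# for a live site you would use rp.set_url(...) and rp.read() instead.
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

# can_fetch(user_agent, url) returns False for blocked URLs.
print(rp.can_fetch("*", "https://example.com/private/page"))  # False (blocked)
print(rp.can_fetch("*", "https://example.com/blog/post"))     # True (allowed)
```

This is handy for auditing a list of important URLs in bulk before or after editing the robots.txt file.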


How to fix this issue

Before fixing this issue, make sure the crawling ban was not put in place on purpose; blocking certain pages is a common SEO practice.

1. Delete the rule that blocks the useful page.
2. Verify that the page is now accessible with the robots.txt testing tool: https://www.google.com/webmasters/tools/robots-testing-tool.
3. Make sure that pages that should not be crawled are still disallowed.

The rule syntax for the robots.txt file is documented in Google's support pages: https://support.google.com/webmasters/answer/6062596?hl=en&ref_topic=6061961
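As a sketch of the fix, assume a hypothetical site whose blog was accidentally blocked. Removing the offending Disallow line restores crawling while keeping other sections blocked:

```
User-agent: *
Disallow: /admin/
# The line "Disallow: /blog/" was removed so blog pages can be crawled again
```

After editing, re-test both the unblocked pages (they should now be allowed) and the intentionally blocked ones (they should still be disallowed).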
