The Non-indexable pages issue means that some URLs of your website are closed from search engine indexation. Pages closed from indexation for any reason might cause problems for your website.
Non-indexable pages that:
– are outdated and irrelevant should be deleted.
– serve a technical purpose (for instance, a “password reset” page) are commonly kept out of the index, so prohibiting their indexation is standard practice.
– are part of your website architecture and should generate traffic need to be opened up: you should allow their indexation.
How to check the issue
Indexation can be prohibited using several methods. Each of them should be checked separately.
1. Meta Robots
Any browser is enough to check the issue. Open the source code of the flawed page: right-click anywhere on the page and choose the “View page source” option, or use an online tool such as https://codebeautify.org/source-code-viewer
Find the robots meta tag. If its content attribute contains the noindex value, for example <meta name="robots" content="noindex">, the page has the issue.
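For checking many pages at once, the meta tag lookup can be scripted. Below is a minimal sketch using only the Python standard library; the RobotsMetaParser class and is_noindex function are illustrative names, not part of any particular tool, and in real use you would first download each page’s HTML.

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the content values of <meta name="robots"> tags."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attrs = dict(attrs)
            if attrs.get("name", "").lower() == "robots":
                self.directives.append(attrs.get("content", "").lower())

def is_noindex(html):
    """True if any robots meta tag on the page contains a noindex directive."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return any("noindex" in d for d in parser.directives)

print(is_noindex('<head><meta name="robots" content="noindex, follow"></head>'))  # True
print(is_noindex('<head><meta name="robots" content="index, follow"></head>'))    # False
```

Parsing with html.parser rather than a substring search avoids false positives from noindex appearing in page text or comments.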
2. X-Robots-Tag
To check the server’s response, use https://redbot.org/ or any similar tool. The presence of an X-Robots-Tag: noindex header in the server’s response means the page has the issue.
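The header check can also be scripted. A minimal sketch with the Python standard library, assuming the server answers HEAD requests; header_has_noindex and check_url are hypothetical helper names. Note that X-Robots-Tag values may be scoped to a user agent, e.g. googlebot: noindex.

```python
import urllib.request

def header_has_noindex(header_values):
    """True if any X-Robots-Tag value contains a noindex directive.
    Handles user-agent-scoped values such as 'googlebot: noindex'."""
    return any("noindex" in v.lower() for v in header_values)

def check_url(url):
    """Fetch only the response headers and report whether
    X-Robots-Tag prohibits indexation of the page."""
    req = urllib.request.Request(url, method="HEAD")
    with urllib.request.urlopen(req) as resp:
        return header_has_noindex(resp.headers.get_all("X-Robots-Tag") or [])

print(header_has_noindex(["googlebot: noindex"]))  # True
print(header_has_noindex(["nosnippet"]))           # False
```

A HEAD request is enough here because only the headers matter, which keeps a site-wide scan cheap.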
Detect all non-indexable pages of your website and go on to analyze its other issues!
Don’t stop at this single issue: run a full audit to find and fix all your technical SEO problems.
How to fix this issue
Before fixing this error, make sure that indexation has not been prohibited purposefully, which is a common SEO practice.
When using the robots meta tag: substitute the noindex value with index, or remove the tag entirely.
When using the X-Robots-Tag HTTP header: delete the noindex directive (for example, X-Robots-Tag: noindex or the user-agent-scoped X-Robots-Tag: googlebot: noindex) from the server response.
Note that it is necessary to check both options, as using one of them does not exclude the use of the other.
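If the X-Robots-Tag header is added by the web server itself, it can usually be removed in the server configuration. A sketch for Apache with mod_headers enabled, assuming the header was set at this level (your setup may set it elsewhere, e.g. in application code):

```apache
# Stop sending the X-Robots-Tag header in responses
Header unset X-Robots-Tag
```

After changing the configuration, reload the server and re-check the response headers to confirm the directive is gone.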