An HTTP status code between 400 and 499 indicates a client error. This means that the problem is linked to the user request, not to the server response.
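The 400–499 range can be checked programmatically. Below is a minimal sketch using only Python's standard library; the function names are my own, not part of any official tooling. Note that `urllib` raises `HTTPError` for 4xx/5xx responses, so the status code has to be recovered from the exception:

```python
from urllib.request import urlopen, Request
from urllib.error import HTTPError


def status_of(url: str) -> int:
    """Return the HTTP status code for a GET request to `url`."""
    req = Request(url, headers={"User-Agent": "status-check/1.0"})
    try:
        with urlopen(req) as resp:
            return resp.status
    except HTTPError as err:
        # urllib raises HTTPError for 4xx/5xx responses;
        # the status code is still available on the exception.
        return err.code


def is_client_error(code: int) -> bool:
    """True for any code in the 4xx client-error range."""
    return 400 <= code <= 499
```

For example, `is_client_error(status_of("https://example.com/missing-page"))` would return `True` for a page that responds with 404.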
Why it is important
A 4xx client error deserves maximum attention.
Such an error signals that the content of the page isn't accessible to search engines, which also means that the page won't be displayed in search results – this will hurt organic traffic to the page.
Importantly, if search engines detect a 4xx error, the affected page will be dropped from their index, and it can be difficult to get it re-indexed once the problem is fixed. If multiple 4xx client errors are detected on your site, search engines might even lower its ranking or the number of pages they index.
John Mueller, Senior Webmaster Trends Analyst at Google, notes that 404 errors are reported in Google Search Console as crawl errors.
How to check and fix the issue
4xx errors are typically relatively easy to fix. The specific solution depends on the cause of the error.
All 4xx errors (especially 404 errors)
First, check the URLs to confirm that they should link to the respective pages. Typically, you need to check for typos or for pages that have been moved to new URLs. If a typo is found in a URL, correct the link wherever it has been used.
To list all the pages linking to a specific URL, click the expand arrow on the left side of the row, or click the number of internal links shown in that row. Then find the place in the page's source code that contains the wrong link and correct the typo.
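If you have a local copy of the site, locating every occurrence of a broken link can also be scripted. A minimal sketch (the function name and directory layout are my own assumptions):

```python
import os


def find_link_occurrences(root_dir: str, broken_url: str):
    """Scan .html files under root_dir and report every line containing broken_url.

    Returns a list of (file_path, line_number, line_text) tuples.
    """
    hits = []
    for dirpath, _dirs, files in os.walk(root_dir):
        for name in files:
            if not name.endswith(".html"):
                continue
            path = os.path.join(dirpath, name)
            with open(path, encoding="utf-8", errors="replace") as fh:
                for lineno, line in enumerate(fh, start=1):
                    if broken_url in line:
                        hits.append((path, lineno, line.strip()))
    return hits
```

Each tuple in the result points you to the exact file and line to correct.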
For pages that have moved, it is best to create 301 permanent redirects from old URLs to new ones. If this is impossible, one needs to locate all the links to the old URL and fix them so that they lead to the new URL.
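In practice, 301 redirects are usually configured in the web server (Apache, Nginx) or the CMS. To illustrate the mechanism itself, here is a sketch using Python's built-in `http.server` module; the URL mapping is entirely hypothetical:

```python
from http.server import BaseHTTPRequestHandler

# Hypothetical mapping of moved pages to their new locations.
REDIRECTS = {
    "/old-page": "/new-page",
    "/old-blog/post-1": "/blog/post-1",
}


class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        target = REDIRECTS.get(self.path)
        if target is not None:
            # 301 tells browsers and search engines the move is permanent,
            # so link equity and indexing transfer to the new URL.
            self.send_response(301)
            self.send_header("Location", target)
            self.end_headers()
        else:
            self.send_response(404)
            self.end_headers()
```

The key detail is the `Location` header paired with the 301 status: search engines follow it and update their index to the new URL over time.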
If the URL doesn't have any typos and the page hasn't moved, the mistake probably lies in how the URL is resolved on the server. Verify the file name and directory structure behind the URL, or, if you use URL rewrites, check the rewrite rules for errors.
403 errors
Check that the URL links to a page with content rather than to a directory listing, since the latter is a common cause of 403 errors. If the URL does point to a directory listing, either update all the links leading to this URL or create a 301 permanent redirect to a new URL.
403 errors may also be returned when requests from SiteCheckerBotCrawler aren't accepted or are blocked by the web server. To address this, adjust the server settings to allow requests from SiteCheckerBotCrawler.
401 errors
First, check whether the page is accessible without logging in. If it is supposed to be, change your application code or web server settings to allow access for unauthenticated users. If, on the other hand, the page should only be accessible to logged-in users, consider excluding it from crawling with the robots.txt file.
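A minimal robots.txt sketch for excluding login-only areas from crawling. The paths below are placeholders – substitute the directories your members-only pages actually live under:

```
# Hypothetical members-only sections excluded from crawling
User-agent: *
Disallow: /account/
Disallow: /members/
```

Keep in mind that robots.txt only discourages crawling; it is not an access-control mechanism, so the pages should still require authentication at the server level.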
Check your website for 4xx client errors
Audit your website to detect pages that return 4xx client errors.
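The audit loop above can be sketched as a short stdlib-only Python script. This is an illustration, not the tool's actual implementation; the function name and User-Agent string are my own:

```python
from urllib.request import urlopen, Request
from urllib.error import HTTPError, URLError


def audit_urls(urls):
    """Return (url, status) pairs for every URL that answers with a 4xx code."""
    client_errors = []
    for url in urls:
        req = Request(url, headers={"User-Agent": "site-audit/1.0"})
        try:
            with urlopen(req, timeout=10) as resp:
                code = resp.status
        except HTTPError as err:
            code = err.code
        except URLError:
            continue  # unreachable host or network failure; not a 4xx problem
        if 400 <= code <= 499:
            client_errors.append((url, code))
    return client_errors
```

Feed it the list of URLs from your sitemap and it reports only the pages a crawler would flag as client errors.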