What is an SEO site audit
There are four variables that affect how much traffic and sales you get from Google.
- Your target audience. Which search queries your potential customers enter and how often, what tasks they want to solve, and what their most common buying scenario is (whether they always buy after clicking through from Google or prefer to buy on social networks).
- Google. How Google's algorithm works at a given point in time: how it crawls, indexes, and ranks sites, and which actions it penalizes or rewards.
- Your competitors. How long their site has existed and how strong its backlink profile and traffic from other sites already are, how good their team, strategy, and resources are, and whether they use other traffic sources (for example, branded offline and online advertising) to improve their search positions.
- You yourself. How much better than your competitors you understand how Google works and what your target audience needs, how long your site has existed, and how good your team, strategy, and resources are.
This step-by-step checklist will help you analyze and improve your website. But remember that this alone is often not enough to be first in a niche. In addition to improving the site, study your target audience and competitors, and keep up with Google updates.
Launch SEO site audit now
While you are reading the checklist, Sitechecker will perform an online audit of your site and generate a list of tasks to fix errors
When you need an SEO audit
1. Before launching the site
It is when you create a site that you define its structure, URLs, links, meta tags, and other important elements. Auditing a site before publishing it will help you avoid mistakes that can slow down the growth of its visibility in search.
2. Before buying and selling a website
If you sell a site, it is important for you to bring it into a marketable state 🙂 If you buy, it is important for you to assess all its vulnerabilities and errors that may lead to problems in the future. The higher the transaction value, the more worthwhile a thorough audit becomes.
3. Before and after site migration
Site migration is an even more complex process than a launch. It is one of the most dangerous processes: after it, a site can easily lose its positions due to technical errors. You need to audit two versions of the site at once, the old and the new, and check that link equity and content are transferred correctly. Use our website migration checklist to avoid the most common migration mistakes.
4. After making big changes to the site
Major changes are those that affect more than 20% of pages or concern the pages most valuable in terms of traffic and conversions. Examples of such changes: changing URLs, redesigning the site, adding new scripts and styles, changing internal linking, deleting and adding pages, adding new language versions of the site.
All these situations carry the risk that you will not notice at a glance the problems that Googlebot and users will face on the new version of the site.
5. To regularly search for problems
Even if you work on the site slowly, making small changes from time to time, problems can accumulate and slow your growth. Most often these are broken links, redirects, duplicate and low-value pages, and keyword cannibalization.
That is why it is important to make it a rule to conduct an audit once a month, quarter, half-year, or year, depending on how actively you work on the site. But the manual approach is increasingly becoming a thing of the past. Some tools can scan your site once a month, week, or day and notify you when they find technical errors.
Here is an example of the email that Sitechecker sends when it finds pages that have been closed from the index.
6. To create or adjust an SEO strategy
At this stage, the SEO audit is performed using data from Google Analytics and Google Search Console. We need data on the effectiveness of pages, keywords, and backlinks in order to identify where we can get the fastest increase in traffic and conversions, which pages to prioritize, and which, on the contrary, to delete or merge.
9 steps for a complete SEO audit
1. Check if the site is working
To do this, simply open the site from any device and make sure it loads. After that, check whether the website is up or down through the eyes of a robot.
If it is important to check the availability of a site in different countries, you can try the tool from Uptrends.
The site may fail to load for a variety of reasons: an expired domain, errors during plugin or CMS updates, an outage or technical problems on the hosting provider's side, a DDoS attack, or a hack.
It is difficult to prepare for each of these problems, but you can reduce downtime by responding quickly. Set up automatic availability monitoring, ideally with several tools at once.
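Between full monitoring runs, a basic status check can also be scripted. Below is a minimal sketch in Python using only the standard library; the URL is a placeholder, substitute your own domain.

```python
from urllib.request import urlopen
from urllib.error import HTTPError, URLError

def classify(status: int) -> str:
    """Map an HTTP status code to a rough availability verdict."""
    if 200 <= status < 300:
        return "up"
    if 300 <= status < 400:
        return "redirect"
    return "down"

def check(url: str, timeout: float = 10.0) -> str:
    """Request the URL and classify the response; network failures count as down."""
    try:
        with urlopen(url, timeout=timeout) as resp:
            return classify(resp.status)
    except HTTPError as err:   # the server answered, but with 4xx/5xx
        return classify(err.code)
    except URLError:           # DNS failure, refused connection, timeout
        return "down"

if __name__ == "__main__":
    # "https://example.com" is a placeholder -- substitute your own domain.
    print(check("https://example.com"))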
2. Check if the site is in the SERP
At this stage, we only look at whether the site is present on the search results page, without evaluating any errors. Type site:domain.com into Google search. We will use this operator often to find various problems.
The site may be absent from the search for a variety of reasons.
- The site is too new and Google hasn’t found out about it yet;
- There is no or little content on the site;
- You have closed the site from indexing and/or crawling by a search bot;
- The site has been hit by a filter or a penalty.
If you face this problem, conduct a separate audit of the reasons why the site is absent from the search results. Once it is fixed, move on to the next steps.
3. The site has only one working version
If the site has several working versions (that is, versions that return status code 200), this can lead to duplicates in Google, indexing of pages on different versions of the domain, and link equity being spread across several versions of the domain.
Of the four domain versions (http://domain.com, http://www.domain.com, https://domain.com, https://www.domain.com), only one should return status code 200; the other three should redirect to the main one. Other errors can also produce duplicate pages, but problems with redirects between HTTP and HTTPS, WWW and non-WWW affect the entire site. Sitechecker can help identify such problems.
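One way to verify this is to request each variant without following redirects and check that exactly one answers 200. A sketch in Python using only the standard library; example.com is a placeholder.

```python
import http.client
from urllib.parse import urlsplit

def first_hop(url: str, timeout: float = 10.0):
    """Return (status, Location header) for one request, without following redirects."""
    parts = urlsplit(url)
    conn_cls = (http.client.HTTPSConnection if parts.scheme == "https"
                else http.client.HTTPConnection)
    conn = conn_cls(parts.netloc, timeout=timeout)
    try:
        conn.request("HEAD", parts.path or "/")
        resp = conn.getresponse()
        return resp.status, resp.getheader("Location")
    finally:
        conn.close()

def exactly_one_canonical(statuses) -> bool:
    """True when exactly one variant answers 200 and the rest redirect (3xx)."""
    return (sum(s == 200 for s in statuses) == 1
            and all(s == 200 or 300 <= s < 400 for s in statuses))

if __name__ == "__main__":
    variants = [
        "http://example.com/",
        "http://www.example.com/",
        "https://example.com/",
        "https://www.example.com/",
    ]
    results = [first_hop(u) for u in variants]
    for url, (status, location) in zip(variants, results):
        print(f"{url} -> {status} {location or ''}")
    print("OK" if exactly_one_canonical([s for s, _ in results]) else "Check redirects!")
```

HEAD requests keep the check cheap; if a server mishandles HEAD, swap in GET.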
4. The site is loaded from different devices
Here, too, you need to look at the site with your own eyes and through the eyes of the Google robot. Open the site on your mobile device and run the test in the Mobile-Friendly Test tool.
For example, from my mobile, I see that the page https://copywritely.com/seo-content/ is loading fine.
But when I check it in the tool, I see that the page is not mobile-friendly.
If you open the report details, you can see all the problems the tool found. Some images, CSS, and JS files are blocked from crawling in robots.txt, so Googlebot does not see the page the way an end user does.
If you’ve already added your site to Google Search Console, you can see a summary of responsive issues across all pages.
It is also valuable that the tool can show you errors you would not have noticed yourself. For example, clickable elements placed too close together.
5. The site works on the HTTPS version
Until recently, HTTPS was only a recommendation from Google. Now it is practically mandatory, even for sites that store no user information and process no transactions.
Although a single visit to one of your unprotected websites may seem benign, some intruders look at the aggregate browsing activities of your users to make inferences about their behaviors and intentions and to de-anonymize their identities.
For example, employees might inadvertently disclose sensitive health conditions to their employers just by reading unprotected medical articles.<...>
Here it is important not only that an HTTPS certificate is present, but also that it is valid. Almost all browsers (I checked Google Chrome, Opera, and Safari) show a warning about certificate problems. As a rule, you can still reach the site, but only after a couple of extra clicks.
If users are coming to your site for the first time, they are unlikely to proceed to it after such a browser warning.
The reason for an invalid certificate is also easy to find out. Click the Not secure message in the address bar and open the details of the SSL certificate.
I use Let’s Encrypt certificates on all my sites. Many hosting providers already include automatic issuance and renewal of these certificates as a default service. Ask your host about this.
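Certificate validity can also be watched with a short script. A sketch in Python (stdlib only): the TLS handshake itself fails on an invalid certificate, and for a valid one you can check how many days remain before expiry. The hostname is a placeholder.

```python
import socket
import ssl
from datetime import datetime, timezone

def parse_not_after(value: str) -> datetime:
    """Parse a certificate's 'notAfter' field, e.g. 'Jun  1 12:00:00 2030 GMT'."""
    return datetime.strptime(value, "%b %d %H:%M:%S %Y %Z").replace(tzinfo=timezone.utc)

def days_until_expiry(host: str, port: int = 443, timeout: float = 10.0) -> int:
    """TLS-connect to the host; raises ssl.SSLError if the certificate is invalid."""
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=timeout) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            cert = tls.getpeercert()
    remaining = parse_not_after(cert["notAfter"]) - datetime.now(timezone.utc)
    return remaining.days

if __name__ == "__main__":
    # "example.com" is a placeholder -- substitute your own domain.
    print(days_until_expiry("example.com"), "days until the certificate expires")
```

Let’s Encrypt certificates are valid for 90 days, so alerting when fewer than ~20 days remain leaves time to fix a broken auto-renewal.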
6. The site is safe for users
HTTPS adds to your site's credibility in the eyes of users but does not fully insure it against vulnerabilities. Malicious or unwanted software, or content that uses social engineering to manipulate your users, can sit on the site for a long time without your knowledge.
To quickly notify site owners of such problems, Google has created a special report in the Google Search Console “Security Issues”. You can also use the Google Transparency Report or our website safety checker, which uses the Google API, to independently check any site.
I have faced this problem myself: my site was showing banner ads when visited from a mobile device. If you use WordPress, this can also happen because of hacked plugins or themes. Use Google's help documentation to learn more about securing your site.
7. The site is not on the blacklists
A blacklist is a list of domains, IP addresses, and email addresses that users have reported for sending spam. Blacklists can be public or private, and people use them to block unwanted mailings.
If you send emails to your customers, you can easily end up on one of these lists and hurt the effectiveness of your email marketing. If you also send transactional emails to users, the cleanliness of your domain becomes critically important.
Examples of known blacklists:
- Spamhaus Block List (SBL);
- Composite Blocking List (CBL);
- Passive Spam Block List (PSBL);
- Barracuda Reputation Block List (BRBL).
On each of these sites, you can check if your site is on the list and what you need to do to remove it from the list. You can also use our blacklist checker that checks the presence of your site in many spam databases at once.
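Most DNS-based blacklists (DNSBLs) share one query convention: reverse the octets of your mail server's IP address, prepend them to the blacklist zone, and resolve the resulting name. Any answer means the IP is listed; NXDOMAIN means it is clean. A sketch in Python; the zone names are the commonly published query zones for the lists above, and the IP is a placeholder.

```python
import socket

def reversed_ip(ip: str) -> str:
    """203.0.113.7 -> 7.113.0.203 (the octet order DNSBLs expect)."""
    return ".".join(reversed(ip.split(".")))

def is_listed(ip: str, zone: str) -> bool:
    """True if the DNSBL zone returns any A record for the reversed IP."""
    try:
        socket.gethostbyname(f"{reversed_ip(ip)}.{zone}")
        return True   # any answer => listed
    except socket.gaierror:
        return False  # NXDOMAIN => not listed

if __name__ == "__main__":
    # Placeholder IP -- substitute your mail server's address.
    for zone in ("zen.spamhaus.org", "psbl.surriel.com", "b.barracudacentral.org"):
        print(zone, "->", "LISTED" if is_listed("203.0.113.7", zone) else "clean")
```

Note that some lists (Spamhaus in particular) refuse queries coming through large public resolvers, so run this from a host with its own resolver or use the list's lookup page.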
8. There are no extra pages in search results
In this step, we go through the SERPs with our own eyes to make sure that Google is not indexing unnecessary pages.
These pages include:
- duplicate pages;
- indexed search or filter pages (unless you intentionally index them);
- checkout pages;
- image attachment pages;
- any information pages that users are unlikely to search for.
Such pages can be indexed due to:
- inattention of developers, administrators, or content managers;
- problems with plugins or themes;
- a hack, with intruders generating pages for their own purposes;
- attackers building links to non-existent URLs, if you have not configured a 404 server response for such pages.
If the site has been in search for a long time and generates traffic, it is important to make monitoring of indexing problems an ongoing process. The number of pages in the index can grow and fall both at your will (when you deliberately delete or add pages) and as a result of the errors above.
Sitechecker has a special chart that helps you assess whether your site's indexing is normal. Sharp spikes in the number of indexed pages can serve as a trigger for a separate audit of what is new in the SERP or which pages have dropped out of it.
9. Check your robots.txt, sitemap.xml, and 404 server response settings
These settings partially insure the site against the problems above. Extra pages in the index are harmful not only because users have a negative experience when landing on them, but also because Googlebot will crawl the pages that matter less and less often.
The amount of time and resources Googlebot can spend on one site is at the heart of the so-called crawl budget. Note that not all crawled pages are indexed: Google analyzes them, consolidates them, and decides whether to add them to the index. The crawl budget depends on two main factors: the crawl rate limit and crawl demand.<...>
<...>The resources Google can allocate to crawling a particular site are calculated based on its popularity, uniqueness, value for users, and server capacity. There are only two ways to increase the crawl budget: allocate additional server resources for crawling, or (more importantly) increase the value of the site's content for Google Search users.<...>
Sitechecker will help you spot the absence of these settings but will not write the rules for you. Make sure that robots.txt contains rules that block crawling of pages irrelevant for search, that sitemap.xml lists all significant pages, and that the server always returns a 404 status code for non-existent pages.
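The robots.txt part of this check can be done offline with the standard library's robot parser: feed it your rules and ask whether a given path is crawlable for a given user agent. A sketch with hypothetical rules and paths.

```python
from urllib.robotparser import RobotFileParser

def can_crawl(rules, path, agent="Googlebot"):
    """Check a path against robots.txt rules supplied as a list of lines."""
    parser = RobotFileParser()
    parser.parse(rules)
    return parser.can_fetch(agent, path)

# Hypothetical robots.txt: block the cart and internal search, allow the rest.
rules = [
    "User-agent: *",
    "Disallow: /cart/",
    "Disallow: /search",
]

print(can_crawl(rules, "/cart/checkout"))   # blocked
print(can_crawl(rules, "/blog/seo-audit"))  # allowed
```

The same parser can read a live file via `RobotFileParser(url)` plus `read()`, which is convenient for verifying the deployed rules rather than the ones in your repository.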
Also check the errors in the Coverage report in Google Search Console to find crawling and indexing issues. Use Google's documentation to understand what each status in the report means.
The article is being updated…