Google Algorithm: An Overview of the Latest Updates
Google regularly releases new algorithms designed to improve search results and fight low-quality resources. If earlier a user received an endless number of pages in response to a query and did not always find the needed answer, now, thanks to search algorithms, an answer to a question of interest can be found in seconds.
What is the Google algorithm?
In essence, a search algorithm is what allows a search engine to return relevant results in response to user queries.
Google releases algorithm updates on practically a monthly basis (and those are only the updates that are officially confirmed!). Not all of them have a substantial impact on search results, though, and today we will help you figure out which algorithms have caused significant changes in ranking over recent years.
It may turn out that one of them has affected, or will affect, the positions or traffic of your website. After you’ve read this article, you’ll know which aspects of content, internal, and external optimization deserve closer attention, and how to adapt your website to the new requirements for successful ranking in Google.
How search engine algorithms typically work:
- Data collection. As soon as a new resource appears, a crawler visits its pages and collects information about their textual and visual content.
- Indexing. A site will not appear in search results until an inverted index file is compiled for its pages. The index is a list of the words found in the text and is designed for fast lookup; once it is built, the site can be found by keywords.
- Information retrieval. When a query arrives from a user, for example, to buy a book, robots find all the pages that fit the query. The documents whose keywords best match the user’s request get into the SERP.
- Ranking. The goal of any SEO site optimization is to get onto the first page of Google’s results. Many factors matter here, for example, the weight of the page, the authority of the domain, how well the text matches the query, and the expertise of the material. Search algorithms also analyze how often keywords are used and where they appear in the text of each page, so pay attention to the title tag.
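As a rough illustration of the indexing and retrieval steps described above, here is a minimal Python sketch of an inverted index. The tokenization and the AND-style lookup are simplified assumptions for demonstration only, not Google's actual implementation:

```python
from collections import defaultdict


def build_inverted_index(documents):
    """Map each word to the set of document IDs containing it."""
    index = defaultdict(set)
    for doc_id, text in documents.items():
        for word in text.lower().split():
            index[word.strip(".,!?")].add(doc_id)
    return index


def search(index, query):
    """Return documents containing every word of the query (simple AND search)."""
    words = [w.strip(".,!?") for w in query.lower().split()]
    if not words:
        return set()
    result = index.get(words[0], set()).copy()
    for word in words[1:]:
        result &= index.get(word, set())
    return result
```

Once the index is built, looking up a query no longer requires scanning every document, which is exactly why sites become findable by keywords only after indexing.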
The List of Google Algorithm Updates
1. Google Panda
Launched: February 24, 2011
Purpose: Moves down the positions of low-quality content sites
Google Panda Algorithm is used to detect pages with unoriginal content, content that is flooded with keywords, spam or automated content. The algorithm can also affect resources that duplicate information on a number of site pages and sites with insufficient content. Pages and sites of this kind are usually downgraded by Google’s ranking system.
Initially, “Panda” was not part of the main ranking algorithm but worked as a Google filter, meaning that with every new update it affected a certain category of sites. But in January 2016, “Panda” was officially included in the core ranking algorithm. This, however, doesn’t mean that the algorithm now truly operates in real time – there are still updates that affect a certain part of search results. They have simply become so frequent that Google no longer announces them.
On the one hand, this increases the chances of faster demotion of at-risk sites by “Panda.” On the other, it allows the owners of sites that previously suffered from the algorithm to recover their positions and traffic much faster.
What does the “Panda” Algorithm Usually Punish for?
- Unoriginal content (plagiarism);
- Duplicated content on different pages of the same site;
- Automated content;
- Content flooded with keywords;
- Spam content generated by users (comments, for instance);
- Insufficient amount of content on a page (in relation to advertisement units, for example);
- Poor user experience.
How to Secure Your Website?
Check the site for content uniqueness. Even if you personally write all of the content for the site, don’t neglect to run a duplicate content checker periodically. Copywritely will help you do this quickly.
Online stores should take care of at least partial uniqueness of the information on their product pages. Although one cannot change specific technical descriptions of goods, it is still possible to take unique photos and videos of the wares, as well as to encourage users to leave real reviews.
Check the site for duplicated information. It is one of the most common reasons for site demotion. Due to the peculiarities of various CMS systems, the same content can be reached by search engines via different URLs, which leads to duplicated pages appearing in the index. Unfortunately, the site owner may not even be aware of such pitfalls in the resource’s structure.
Specialized software like our Website Crawler may actually help in detecting such pages. Scan your site and pay attention to pages with a duplicated title meta tag and H1 tag – those are the usual duplicated-content suspects. Take measures to eliminate them or close them off from search engine robots: you can use the canonical tag, a 301 redirect, or the noindex meta tag.
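To illustrate the kind of check a crawler performs here, the following Python sketch groups pages by their title and H1 text to flag potential duplicates. It assumes the pages have already been downloaded as HTML strings; the function names are hypothetical:

```python
from collections import defaultdict
from html.parser import HTMLParser


class TitleH1Parser(HTMLParser):
    """Extracts the <title> and first <h1> text from an HTML document."""

    def __init__(self):
        super().__init__()
        self.title = ""
        self.h1 = ""
        self._current = None  # tag currently being captured

    def handle_starttag(self, tag, attrs):
        if tag == "title" and not self.title:
            self._current = "title"
        elif tag == "h1" and not self.h1:
            self._current = "h1"

    def handle_endtag(self, tag):
        if tag in ("title", "h1"):
            self._current = None

    def handle_data(self, data):
        if self._current == "title":
            self.title += data
        elif self._current == "h1":
            self.h1 += data


def find_duplicate_candidates(pages):
    """Group URLs by (title, h1); groups larger than one are duplicate suspects."""
    groups = defaultdict(list)
    for url, html in pages.items():
        parser = TitleH1Parser()
        parser.feed(html)
        groups[(parser.title.strip(), parser.h1.strip())].append(url)
    return [urls for urls in groups.values() if len(urls) > 1]
```

URLs that end up in the same group (for example, a category page and its sorted variant) are good candidates for a canonical tag or a noindex meta tag.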
Check the ratio of page content to outbound links. Rate your pages in the context of outbound links. Perhaps your project has pages with a lot of outbound links. In this case, it’s of huge importance to add unique content to them, which will help to avoid Panda’s close attention.
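A rough way to rate pages in this context is to compare the number of outbound links with the amount of visible text. The sketch below (a simplified illustration, not a Google metric) counts both using Python's standard HTML parser:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse


class PageAudit(HTMLParser):
    """Collects a word count of visible text and counts outbound <a href> links."""

    def __init__(self, site_domain):
        super().__init__()
        self.site_domain = site_domain
        self.outbound = 0
        self.words = 0
        self._skip = 0  # depth inside <script>/<style>, whose text is not visible

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1
        elif tag == "a":
            host = urlparse(dict(attrs).get("href", "")).netloc
            if host and host != self.site_domain:
                self.outbound += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip:
            self.words += len(data.split())
```

A page with many outbound links but only a few dozen words of its own text is the kind of page worth enriching with unique content.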
2. Google Penguin
Launched: April 24, 2012
Updates: May 25, 2012; October 5, 2012; May 22, 2013; October 4, 2013; October 17, 2014; September 27, 2016; October 6, 2016; currently updates in real time
Purpose: Downgrades the ranking of sites with spammy link profiles and sites that manipulate link weight
Links have long remained a decisive factor in Google’s ranking system, and the Penguin algorithm was primarily introduced to detect and sanction sites with unnatural link profiles. This algorithm changed the very understanding of site promotion by Google’s standards and became a nightmare for many SEO optimizers.
Penguin marked the beginning of the end of the era when rented links were viewed as the main component of successful promotion. Since the autumn of 2016, it has operated in real time as part of the main algorithm; this has equally improved both the chances of falling under sanctions and the chances of previously affected sites being freed from them.
The Key Reasons for Sites Falling Under the Sanctions of “Penguin”:
- Bought-in links;
- Links from low-quality spam sites;
- Links from networks of sites created primarily as donors for building link weight;
- Unnatural anchor texts;
- Links from pages with absolutely irrelevant content.
How to Secure Your Website?
Monitor changes in your link profile. One should periodically review the link profile using backlink analysis services. Also, one should clearly realize that the chances of falling under Penguin’s sanctions over a couple of spam links are rather low, while suddenly receiving hundreds of links from irrelevant sites would look extremely suspicious and could draw the algorithm’s attention. Try to monitor your site’s link growth chart frequently – any abrupt jumps should be thoroughly investigated.
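One simple way to spot abrupt jumps is to compare each day's count of new backlinks with the running average of the preceding days. The sketch below is an illustrative heuristic only; the threshold factor is an arbitrary assumption:

```python
def abrupt_link_jumps(daily_new_links, factor=3.0):
    """Flag day indices where new backlinks exceed `factor` times the running average.

    `daily_new_links` is a list of new-backlink counts per day, as exported
    from a backlink analysis service.
    """
    flagged = []
    for i, count in enumerate(daily_new_links):
        history = daily_new_links[:i]
        if not history:
            continue  # no baseline yet for the first day
        avg = sum(history) / len(history)
        if avg > 0 and count > factor * avg:
            flagged.append(i)
    return flagged
```

Any flagged day is a signal to look at where those links came from before the algorithm does.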
Get rid of malicious links. So, you’ve analyzed your link profile and found spam links? Now it’s time to get rid of them. Ideally, you manage to contact the owner or webmaster of the linking site, who then deletes the links. If this scenario is not feasible, use the disavow tool in Google Search Console. That way you ask Google to ignore the spam links when evaluating your link profile.
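The disavow tool accepts a plain text file in which whole domains are listed as `domain:example.com`, individual pages are listed by full URL, and lines starting with `#` are comments. A small helper like the hypothetical one below can assemble such a file from the results of a backlink audit:

```python
def build_disavow_file(spam_domains, spam_urls):
    """Build the text of a disavow file for upload in Google Search Console.

    Whole domains are written as `domain:example.com`; individual spam
    pages are written as full URLs. `#` lines are comments Google ignores.
    """
    lines = ["# Links identified as spam during the backlink audit"]
    lines += [f"domain:{d}" for d in sorted(spam_domains)]
    lines += sorted(spam_urls)
    return "\n".join(lines) + "\n"
```

The resulting text is saved as a `.txt` file and uploaded through the disavow tool; it does not delete the links, it only asks Google to discount them.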
3. Google’s Pirate Update
Launched: August 2012
Updates: October 2014
Purpose: To move down in the ranking system those sites that regularly receive complaints about uploading pirated (copyrighted) content.
This algorithm was developed mainly so that sites with multiple complaints about pirated content could not rank high in Google SERPs. The majority of the sites affected by the algorithm offered movies, music, or books for downloading or viewing. Google also included torrent trackers and file-sharing link aggregators in that category: although they do not formally store forbidden files, they provide information on how to download them.
The search engine doesn’t state the exact number of complaints required for a site to get filtered, but it is absolutely clear that the algorithm’s work is far from ideal – many sites with hundreds of DMCA complaints successfully rank in the Top 3.
What Is It Imposed for?
It is logical to assume that it’s imposed on a site containing pirated content or information on how to obtain such content while bypassing the rights holder.
How to Secure Your Website?
Simply do not upload such content to your website, for there’s no other way. If complaints keep coming in at a steady rate, the algorithm will sooner or later reach your site too.
4. Google Hummingbird
Launched: August 30, 2013
Purpose: To provide more relevant results, based on the semantic component of a search request
The Hummingbird algorithm brought large-scale changes to how the search engine interprets user queries. After its release, the main emphasis shifted to providing results based on an understanding of user intent rather than on exact keyword matches, as was the case earlier. It is thanks to “Hummingbird” that Google improved its understanding of synonyms and topical groupings of content. Only after the release of this algorithm did it become possible to see a page in the search results that technically contained no direct match for the user’s query but did contain its synonyms.
How Can the “Hummingbird” Affect Your Website?
Excessive use of a particular keyword in a text, with no supporting topical words or synonyms and no thorough coverage of the topic, increases the risk of low rankings for that query. Did you want so badly to rank well for a specific query that you repeated it many times in the text? After the release of “Hummingbird,” you risk getting exactly the opposite effect.
How Can One Stay Secure?
Diversify your textual content. Use Google’s suggestions to work longer search queries into your landing page. Structure your content with H1–H6 headings, and make maximum use of synonyms and words related to the desired topic. Avoid the exact-match form of your query in the text if it makes the text unreadable. Do not underestimate the current level of natural-language recognition and understanding by search engines. Read the text on the page you are promoting. How exhaustive is its answer to your search query? Is the text written in natural language or in a “robotic” manner specifically for the search engine? If the answers to these questions raise doubts, feel free to rewrite the content.
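One quick self-check is to measure what share of your text consists of exact repetitions of the promoted phrase. The following sketch is a simplified illustration of such a density check; the tokenization rules are assumptions for demonstration:

```python
import re


def keyword_density(text, phrase):
    """Share of words in `text` covered by exact occurrences of `phrase`."""
    words = re.findall(r"[a-zA-Z']+", text.lower())
    total = max(len(words), 1)
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    # slide a window of phrase length over the text and count exact matches
    hits = sum(1 for i in range(total - n + 1) if words[i:i + n] == phrase_words)
    return hits * n / total
```

There is no official safe threshold, but if the exact-match phrase accounts for a large share of the text, that text almost certainly reads like it was written for a robot rather than a person.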
5. Google Pigeon
Launched: July 24, 2014
Updates: December 22, 2014
Purpose: To provide more relevant local search results
This algorithm influenced search results that take a user’s location into account. Despite the expectations of many specialists, it mostly touched only the English-speaking segment of users. After its release, the user’s location and distance to the business offered as a search result began to be taken into account as ranking factors. The “Pigeon” algorithm allowed local niche businesses (restaurants, cafes, and educational institutions) to bypass large, well-promoted sites in search results and receive more traffic.
So, if you wish to rank better in a particular region, take care to register a Google My Business listing and get your site mentioned in the directories and sites of that particular region.
6. Mobile Friendly Update
Launched: April 21, 2015
Purpose: To promote mobile-optimized pages in search results delivered to mobile devices
This change allowed mobile-friendly pages to rank higher in mobile search results; it did not affect desktop search results. It was designed to offer users pages that are convenient on a mobile device – no scaling or horizontal scrolling needed to read the text, and with easy-to-tap elements. Notably, the algorithm targets a particular page, not the site as a whole. Thus, one page can be recognized as optimized for mobile devices and rise in the rankings, while the others, on the contrary, are downgraded in search results.
How to Win the Algorithm’s Favor?
Make sure your site is convenient for mobile use. It sounds too simple, but it’s the most surefire way. Once you run the mobile-friendly test, you may realize that your notion of usability differs from Google’s. Your task is to find a compromise: make the page truly user-friendly while meeting Google’s key requirements at the same time.
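One basic prerequisite the mobile-friendly test looks for is a viewport meta tag in the page's head. A minimal Python sketch of such a pre-check (a rough hint only, not a substitute for Google's own test) might look like this:

```python
from html.parser import HTMLParser


class ViewportChecker(HTMLParser):
    """Looks for a <meta name="viewport"> tag with a width setting."""

    def __init__(self):
        super().__init__()
        self.has_viewport = False

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attrs = dict(attrs)
            if (attrs.get("name", "").lower() == "viewport"
                    and "width" in attrs.get("content", "")):
                self.has_viewport = True


def is_mobile_ready_hint(html):
    """True if the page declares a viewport, a necessary (not sufficient) sign."""
    checker = ViewportChecker()
    checker.feed(html)
    return checker.has_viewport
```

A page can declare a viewport and still fail on tap-target size or font size, so treat this only as the first filter in an audit.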
7. RankBrain
Launched: October 26, 2015 (could be earlier)
Purpose: To provide a user with the best results based on both relevance and machine learning
RankBrain is a machine learning system that allows Google to better decipher and understand the meaning of user queries and to provide more relevant search results depending on the context of a query.
In a loose sense, the algorithm determines the subject matter of a page and how relevant its content is to the user’s query. Based on visitor behavior, the machine learning system lets the search engine determine how useful the content was, and then educate itself to provide the most useful results over time.
Threats to Your Website:
Lack of factors affecting the relevance of a page to a search query. If you wish to rank for a specific query, you need to increase the relevance of your landing page to that query.
Poor user experience. Analyze your pages’ bounce rate and time on page, determine the site’s exit points, and try to improve user interaction with your website. It is also worth identifying pages whose metrics differ from the site average and finding the causes of the poor user experience. For example, it might be the improper default sorting of goods (from expensive to cheap) on a category page of an online store, which could give a user the wrong impression of the store. Similarly, you can find the pages that interest users most; analyze their content and use this knowledge when creating new pages and optimizing existing ones. Also, one shouldn’t forget that the more diverse content and interactive elements a page contains, the higher its behavioral metrics usually are.
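As an illustration of finding pages that differ from the site average, here is a minimal Python sketch that flags pages whose bounce rate stands out. The threshold is an arbitrary assumption, and the metrics would come from your analytics export:

```python
def bounce_rate_outliers(pages, threshold=0.15):
    """Return URLs whose bounce rate exceeds the site average by more than `threshold`.

    `pages` maps URL -> bounce rate as a fraction between 0 and 1.
    """
    if not pages:
        return []
    avg = sum(pages.values()) / len(pages)
    return sorted(url for url, rate in pages.items() if rate - avg > threshold)
```

The flagged URLs are the ones to inspect first for problems like the wrong default sorting described above.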
8. Google Possum
Launched: September 1, 2016
Purpose: To provide better, more relevant local search results based on a user’s location
This is another representative of the fauna in Google’s “zoo,” assigned to improve local search results. Owing to this algorithm, the user’s location has become an even more important factor when showing a particular local business result: the closer a user is to the company’s physical address, the higher the chance this result will be shown among the local results.
The algorithm has also seriously filtered affiliated organizations that, for instance, share the same phone number or address. This allows the search engine to provide more diverse results and to thwart attempts to monopolize the local results.
Another feature of the algorithm is better ranking for companies whose physical address lies outside a given city. While before the release of “Possum” such companies were rarely shown in local results for queries containing city names, after the release they saw significant growth in impressions and traffic.
9. Google Fred
Launched: March 8, 2017
Purpose: To filter out of search results low-quality pages whose main goal is to profit from advertising and links to other sites
It is the latest confirmed update to Google’s main algorithm. Fred got its name from a joke by Google employee Gary Illyes, who offered to call all subsequent updates “Fred.” The search engine’s representatives confirmed that the update had been released but explained its workings only vaguely: according to Google, Fred punishes sites that violate the webmaster guidelines. Such a statement gave the SEO community little useful information, but practical studies have shown that the first to suffer from Fred are usually sites with low-quality, keyword-stuffed texts, a large number of advertisements, or many outbound links:
- Sites that abuse banners, pop-ups and other types of advertisement;
- Sites containing articles that are primarily written for search engine robots to generate traffic;
- Sites with a lot of outbound links.
How Can One Stay Secure?
- Rein in your ardor to monetize the project through advertising blocks;
- Write relevant, useful content instead of dull, robotic SEO texts.