Google introduces changes to its ranking algorithms almost every day. Some are tiny tweaks; others seriously shake up the SERPs. This chapter will help you make sense of the most important algorithm changes and penalties rolled out in recent years, with a brief description and SEO advice for each.

As mentioned previously, the Google algorithm partially uses keywords to determine page rankings. The best way to rank for specific keywords is by doing SEO. SEO is essentially a way to tell Google that a website or web page is about a particular topic.

Years ago, it was commonplace for many sites to “keyword stuff” their content. In essence, they assumed the Google algorithm would treat a page as more important the more often a keyword appeared, and that the page would rank higher as a result.

Once Google realized this, it altered its algorithm to penalize sites that stuff their pages with keywords, along with many other “black hat” SEO practices. Because of this, you should avoid any tactic whose aim is to outsmart search engines. Although it may not notice right away, Google will eventually pick up on what you are trying to do, and your rankings will suffer.
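To make the idea concrete, here is a crude keyword-density check. This is purely an illustration of what “stuffing” looks like numerically; the sample page and the notion of a density threshold are hypothetical, not anything Google has published.

```python
import re

def keyword_density(text, keyword):
    """Return the fraction of words in `text` that equal `keyword`."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

# A deliberately stuffed example page
page = ("Cheap shoes here. Buy cheap shoes. Our cheap shoes "
        "are the best cheap shoes for cheap shoes lovers.")
density = keyword_density(page, "cheap")
# Here "cheap" makes up well over a quarter of all words on the page --
# a pattern that reads as stuffed to a human and to a ranking system alike.
```

Natural writing about a topic rarely pushes any single term anywhere near that share of the text.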

Let’s find out what each of the updates was about and how to recover from it.


1. Panda

Launch date: February 24, 2011
Objective: Panda is the name of a Google algorithm update designed to reduce the prevalence of low-quality and thin content in the search results and to reward unique, compelling content.
Google’s Panda algorithm assigns pages a quality classification, used internally and modeled on human quality ratings, which is incorporated as a ranking factor.
Websites recover from the impact of Panda by overhauling pages with low-quality content, adding new high-quality content, and improving the user experience as it relates to the content. Google Panda was also known as “Farmer,” and it was an update to Google’s organic ranking algorithm.
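A first pass in a Panda-style content audit is simply flagging thin pages. The sketch below uses a word-count floor; the 300-word figure is a common industry rule of thumb, not a number Google has confirmed.

```python
def is_thin(text, min_words=300):
    """Flag pages whose body copy falls below a word-count floor.

    The 300-word default is an illustrative rule of thumb; depth and
    usefulness matter more than raw length.
    """
    return len(text.split()) < min_words

stub = "Welcome to our page. Call us for details."
```

A short page is not automatically low quality, so a check like this should only shortlist candidates for human review.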


2. Penguin

Launch date: April 24, 2012
Objective: In 2012, Google officially launched the “Webspam algorithm update,” which specifically targeted link spam and manipulative link-building practices. This webspam algorithm later became officially known as the Penguin algorithm.
The algorithm’s objective was to gain greater control over spam tactics, reduce their effectiveness, and keep track of a number of black-hat spamming techniques.
Google’s war on low-quality content started with the Panda algorithm, and Penguin was an extension of the arsenal to fight that war.
Penguin was Google’s algorithmic response to the increasing practice of manipulating search rankings through black-hat link-building techniques.
Penguin worked to ensure that trustworthy, authoritative and relevant links pointing to a web page were rewarded, while manipulative and spammy links were downgraded.
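One tell-tale sign of manipulative link building is an unnaturally uniform anchor-text profile. The sketch below computes the share of backlinks using each exact anchor; the sample data and the idea of treating a dominant exact-match commercial anchor as a red flag are illustrative assumptions from link-audit practice, not a published Penguin rule.

```python
from collections import Counter

def anchor_text_profile(anchors):
    """Share of backlinks using each exact anchor text."""
    counts = Counter(a.strip().lower() for a in anchors)
    total = sum(counts.values())
    return {text: n / total for text, n in counts.items()}

# Hypothetical backlink anchors pointing at one page
backlinks = ["best cheap loans", "best cheap loans", "best cheap loans",
             "example.com", "click here"]
profile = anchor_text_profile(backlinks)
# 60% of links sharing one exact-match commercial anchor is the kind
# of pattern a Penguin-era link audit would single out for review.
```

Natural link profiles tend to be dominated by branded and URL anchors, with commercial phrases making up only a small share.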


3. Hummingbird

Launch date: August 22, 2013
Objective: Google announced Hummingbird on September 26, 2013, but it had actually been in place for about a month prior. While previous algorithm updates like Panda and Penguin sparked widespread reports of lost traffic and rankings, Hummingbird did not appear to have severe negative impacts across the web. It was largely understood as having a positive influence on the accuracy of Google’s knowledge base, known as the Knowledge Graph.
The main purpose of the Hummingbird algorithm was to turn semantic search from a concept into a reality, one that would ultimately become the search standard.


4. Pigeon

Launch date: July 24, 2014 (US); December 22, 2014 (UK, Canada, Australia)
Objective: “Pigeon” is a Google algorithm update designed to provide more useful, relevant and accurate local search results that are tied more closely to traditional web search ranking signals. Google stated that the Pigeon algorithm improved its distance and location ranking parameters.
The primary role of Google’s Pigeon update was to connect the local algorithm more deeply to the traditional web algorithm, taking full advantage of the hundreds of ranking signals that go into the latter. These new ties to the web algorithm further emphasized the need for local businesses to have a strong organic web presence in order to compete for local rankings.


5. Mobile-Friendly Update

Launch date: April 21, 2015
Objective: Google released a significant algorithm update introducing a new mobile-friendly ranking parameter, designed to give a boost to mobile-friendly pages in Google’s mobile search results.
The change is significant enough that sites should adopt a responsive, mobile-friendly design. One of the best ways to verify that your web pages meet Google’s criteria is to run them through its Mobile-Friendly Test tool.
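The authoritative check is Google’s own Mobile-Friendly Test tool, but one basic responsive-design signal can be spotted directly in the HTML: the presence of a viewport meta tag. The sketch below is a simplified stand-in for that single signal, not a reimplementation of Google’s tool.

```python
import re

def has_viewport_meta(html):
    """Check one minimal responsive-design signal: a viewport meta tag."""
    return bool(re.search(r'<meta[^>]+name=["\']viewport["\']', html, re.I))

page = ('<html><head><meta name="viewport" '
        'content="width=device-width, initial-scale=1"></head></html>')
```

A missing viewport tag usually means the page renders at desktop width on phones, forcing users to pinch and zoom.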


6. RankBrain

Launch date: October 26, 2015
Objective: RankBrain uses artificial intelligence to embed vast amounts of written language into mathematical entities, called vectors, that the computer can understand. If RankBrain sees a word or phrase it isn’t familiar with, it can make a guess as to what words or phrases might have a similar meaning and filter the results accordingly, making it more effective at handling never-before-seen search queries.
RankBrain is the only live artificial intelligence (AI) that Google uses in its search results. While Google uses machine learning to train its algorithms, AI isn’t used widely in Search, and for good reason: if search broke, Google’s engineers would have little idea how to resolve it.
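The idea of words as vectors can be illustrated with cosine similarity: words with similar meanings end up with vectors pointing in similar directions. The three-dimensional toy “embeddings” below are fabricated for illustration; real systems learn vectors with hundreds of dimensions from huge text corpora.

```python
import math

def cosine(u, v):
    """Cosine similarity between two vectors: 1.0 means same direction."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Toy 3-dimensional vectors (invented values, for illustration only)
vectors = {
    "sneakers": [0.90, 0.10, 0.00],
    "trainers": [0.85, 0.15, 0.05],
    "mortgage": [0.00, 0.20, 0.95],
}
# "sneakers" and "trainers" point the same way; "mortgage" does not,
# so an unfamiliar query term can be mapped onto its nearest neighbour.
```

This is how a system can guess that a query containing an unknown word like “trainers” should be handled much like one containing “sneakers.”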


7. Possum

Launch date: September 1, 2016
Objective: “Possum” is the name given to an unconfirmed but well-documented update that appeared to most significantly impact Google’s local pack and local finder results. Before the update, it was extremely difficult for businesses located outside city limits to appear in local search results.
Because the update was never officially confirmed by Google, local SEOs have been left to hypothesize about its purpose and concrete effects. It appears to have been focused primarily on local SEO.


8. Fred

Launch date: March 8, 2017
Objective: Google Fred is an algorithm update that targets black-hat tactics tied to aggressive monetization: an overload of ads, low-value content, and little added user benefit. This does not mean all sites hit by the Fred update were dummy sites created for ad revenue; the majority of websites affected were content sites carrying a large number of ads.
On such pages the ads are the most prominent elements, while the content is thin and unclear in purpose. The majority of websites affected by Google Fred shared the following factors:
An extremely large presence of ads
Content on all sorts of topics created for ranking purposes
The quality of content is far below industry-specific sites
Deceptive ads (like a download or play when clicking)
Thin content
UX barriers
Mobile problems
Aggressive affiliate setups
Aggressive monetization
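A site audit can turn the factors above into a simple checklist. The rule-based scorer below is a hypothetical illustration: the field names and thresholds (more than five ads, under 300 words) are assumptions chosen for the example, not published Google criteria.

```python
def fred_risk_score(page):
    """Count how many Fred-style risk factors a page exhibits.

    `page` is a dict of audit findings; thresholds are illustrative.
    """
    score = 0
    if page["ad_count"] > 5:          # extremely large presence of ads
        score += 1
    if page["word_count"] < 300:      # thin content
        score += 1
    if page["deceptive_ads"]:         # ads disguised as download/play buttons
        score += 1
    if not page["mobile_friendly"]:   # mobile problems
        score += 1
    return score

risky = {"ad_count": 12, "word_count": 150,
         "deceptive_ads": True, "mobile_friendly": False}
```

A page ticking several of these boxes at once is exactly the profile the Fred update was reported to demote.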

9. BERT (Bidirectional Encoder Representations from Transformers)

Launch date: October 25, 2019
Objective: According to Google, the BERT update affects complicated search queries that depend on context.
This is what Google said: “These improvements are oriented around improving language understanding, particularly for more natural language/conversational queries, as BERT is able to help Search better understand the nuance and context of words in Searches and better match those queries with helpful results.
Particularly for longer, more conversational queries, or searches where prepositions like ‘for’ and ‘to’ matter a lot to the meaning, Search will be able to understand the context of the words in your query. You can search in a way that feels natural for you.”
The BERT algorithm (Bidirectional Encoder Representations from Transformers) is a deep learning algorithm related to natural language processing (NLP). It helps a machine understand what the words in a sentence mean, with all the nuances of context.
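Why context matters can be shown with a toy disambiguation example. The sketch below picks a word sense by overlapping cue words with the rest of the query; this is a crude stand-in for illustration only, since BERT actually learns bidirectional context with a transformer network, not hand-written cue lists.

```python
def disambiguate(sentence, senses):
    """Pick the sense whose cue words overlap most with the sentence.

    `senses` maps a sense label to a set of hand-picked cue words --
    a toy substitute for the context a model like BERT learns.
    """
    words = set(sentence.lower().split())
    return max(senses, key=lambda s: len(words & senses[s]))

# "bank" is ambiguous; the surrounding words resolve it
senses = {
    "riverbank": {"river", "water", "shore", "fishing"},
    "financial": {"money", "loan", "deposit", "account"},
}
sense = disambiguate("can I open a deposit account at the bank", senses)
```

The point of BERT is that no such cue lists are needed: the model reads the words on both sides of “bank” and infers the sense on its own.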

Google rarely announces changes to its algorithms unless they are major. Smaller changes are made internally and are usually felt only as minor ripples. It’s vital to stay up to date with Google’s core algorithm announcements, but it’s just as important not to overthink the algorithms.

