With the enormous amount of information available on the web, processing hundreds of billions of webpages and ranking them according to people’s search preferences is easier said than done. That’s where Google’s algorithms come in.
Google updates its algorithm to sort through the huge number of webpages in its Search index, find the most relevant, useful results in a fraction of a second, and present them in a way that helps you find what you’re looking for. Search algorithms look at many factors, including the words of your query, the relevance and usability of pages, the expertise of sources, and your location and settings.
In its early years, Google only made a handful of updates to its algorithms. Now, Google makes thousands of changes every year. Most of these updates are so slight that they go completely unnoticed. However, on occasion, the search engine rolls out major algorithmic updates that significantly impact the search engine results pages (SERPs) such as:
Featured Snippet Update
It was launched on January 23, 2020.
The featured snippet ensures the search results page is not cluttered and only relevant information gets displayed. The featured snippet counts as one of the ten listings on the SERP.
Prevents URLs shown in the featured snippet from appearing again within the first ten organic search results.
January 2020 Core Update
It was launched on January 13, 2020.
This update mainly focuses on improving search quality by rewarding quality content across each website. There is no specific fix for websites that lost rankings in previous core updates; the only remedy is to improve content quality.
Down-ranks sites with little relevant content.
BERT Update
It was launched on October 25, 2019.
According to Google, this update affects complicated search queries that depend on context. BERT helps with named entity recognition, textual entailment, next-sentence prediction, coreference resolution, question answering, word sense disambiguation, automatic summarization, and polysemy resolution.
Brings more traffic to sites with content that is focused and well organized. Solves problems with word ambiguity, polysemy, and context.
March 2019 Core Update (a.k.a. Florida 2)
It was launched on March 12, 2019.
This update improves Google’s overall algorithm so that it better understands search queries and webpages. It helps Google match search queries to webpages more accurately and improve user satisfaction.
A page’s ranking depends on its content and on how its search queries are interpreted.
Broad Core Algorithm Update
It was launched on March 9, 2018.
The update was focused on providing better search results, with no way to “fix” sites that lost rankings. The improvements focus on content relevance rather than a specific quality issue.
According to Google, this update is meant to “benefit pages that were previously under-rewarded,” and its advice is that everyone should “continue building great content.” It’s all about relevance.
Fred
It was launched on March 8, 2017.
Fred targets websites that violate Google’s webmaster guidelines. The majority of affected sites are blogs with low-quality posts that appear to be created mostly for the purpose of generating ad revenue.
It puts thin, affiliate-heavy, or ad-centered content at risk.
Possum
It was launched on September 1, 2016.
The Possum update ensured that local results vary more depending on the searcher’s location: the closer you are to a business’s address, the more likely you are to see it among local results.
This update mainly focuses on the searcher’s location.
RankBrain
It was launched on October 26, 2015.
RankBrain is part of Google’s Hummingbird algorithm. It is a machine-learning system that helps Google understand the meaning behind queries and serve the best-matching search results in response to those queries. Google calls RankBrain the third most important ranking factor. The general opinion is that it identifies relevance features for web pages ranking for a given query, which are essentially query-specific ranking factors.
Pushes web pages that lack query-specific relevance features, or that have shallow content and poor UX, to the bottom of the list.
Mobile
It was launched on April 21, 2015.
Google’s Mobile update, a.k.a. Mobilegeddon, ensures that mobile-friendly pages rank at the top of mobile search, while pages not optimized for mobile are filtered out of the SERPs or seriously down-ranked.
Webpages with poor mobile versions are heavily down-ranked.
Pigeon
It was launched on July 24, 2014.
Pigeon affects those searches in which the user’s location plays an important part. The update created closer ties between the local algorithm and the core algorithm: traditional SEO factors are now used to rank local results.
Flushes out webpages with poor on- and off-page SEO.
Hummingbird
It was launched on August 22, 2013.
Hummingbird helps Google better interpret search queries and provide results that match searcher intent (as opposed to the individual terms within the query). While keywords continue to be important, Hummingbird makes it possible for a page to rank for a query even if it doesn’t contain the exact words the searcher entered. This is achieved with the help of natural language processing that relies on latent semantic indexing.
Penalizes keyword stuffing and low-quality content.
Penguin
It was launched on April 24, 2012.
Google Penguin’s objective is to down-rank sites whose links it deems manipulative. Since late 2016, Penguin has been part of Google’s core algorithm; unlike Panda, it works in real-time.
Keeps spammy or irrelevant links and links with over-optimized anchor text at bay.
Panda
It was launched on February 24, 2011.
Panda assigns a so-called “quality score” to web pages; this score is then used as a ranking factor. Initially, Panda was a filter rather than part of Google’s ranking algorithm, but in January 2016, it was officially incorporated into the core algorithm. Panda rollouts have become more frequent, so both penalties and recoveries now happen faster.
Downranks duplicate, plagiarized or thin content, user-generated spam, and keyword stuffing.