10 Significant Google Algorithm Updates

by Stephen Gagnon | Last updated Feb 4, 2020 | Published on Feb 11, 2019 | SEO

Ten Most Significant Google Algorithm Updates

Google maintains an index that covers billions of web pages and then uses advanced algorithms to determine which of them to suggest to search engine users. Although a system known as “PageRank” still does a lot of the heavy lifting, Google has been continuously updating its software since it first came online in 1998.

Testifying before Congress in 2011, for example, then-CEO Eric Schmidt described how Google had made “516 changes” to its core search engine algorithm the previous year after they “were determined to be useful to users based on the data.” That statement provided a rare insight into the inner workings of Google’s search engine because the company mostly sticks to insisting that webmasters focus on publishing useful, high-quality content instead of worrying about its algorithms.

Although many of the search engine updates Google pushes are rightfully regarded as minor, some are so significant as to merit official acknowledgement. In many other cases, watchful webmasters, marketers, and others have been able to identify major changes by sharing and analyzing their experiences collectively. What follows is a look at some of the most significant Google algorithm updates that have happened over the years.

Jagger (2005): An Early Attempt at Punishing Keyword Stuffing

Google has become quite adept at identifying and penalizing underhanded ranking tactics, and 2005’s “Jagger” update was one of the first significant moves in that direction. Up until that point, it had been relatively easy to boost search results rankings for particular pages by including dozens or hundreds of keyword terms that were invisible to users but seen by Google’s crawlers.

In a post on his personal blog, former Google search guru Matt Cutts gave some insights into how Jagger worked and even solicited feedback from webmasters who thought their sites had been affected. Thousands of minor updates and many major ones since have made Google’s search results ranking algorithms far more sophisticated than they were back in 2005. Although blunt by today’s standards, Jagger was a clear, relatively early sign of Google’s still-strong commitment to encouraging aboveboard behaviour.

Caffeine (2009): The Launch of the Second-Generation Google Crawler

In 2009, Google had been online for more than a decade and had been a public company for nearly half that time. Having just launched its Chrome web browser the year before, the dominant search engine turned its attention to speeding up its ability to crawl and index the fast-growing web.

An official post at the Google Webmaster Central Blog introduced Caffeine to the world, promising that the new architecture would benefit users and publishers alike. Meant to be a drop-in replacement for the infrastructure that had brought Google so far, Caffeine did produce some surprises for website developers and maintainers.

At the same time, the new system also allowed Google to crawl pages more frequently and maintain a significantly fresher index. While it touched only the backend responsible for acquiring data and not the algorithms used to rank pages, Caffeine opened up new opportunities for webmasters who tended to publish frequently.

Panda (2011): Google Takes Aim at Content Farms

Even if 2005’s Jagger update had put a damper on some of the most manipulative search engine optimization (SEO) activities, it was clear that plenty more could be done. Throughout the next five or six years, so-called “content farms” proliferated, with vast networks of low-quality pages often ranking well above more traditional and authoritative resources.


Google had been consistent in warning against such tactics, but some of its own updates had inadvertently made them more effective. Google’s own Amit Singhal, for instance, noted at the 2010 TED Conference how the previous year’s introduction of the new Caffeine architecture had empowered content farms that were assembled quickly and updated with hundreds or thousands of new articles every day.

Panda was a response to that problem, and it proved to be perhaps the single most significant Google algorithm update yet. When Panda was released in February 2011, hundreds of sites that had formerly performed well saw their search engine results page (SERP) placements plummet. More informative and somewhat more quality-focused content farms like eHow took less of a hit at first, although subsequent Panda updates have exacted a steep toll on them as well.

Penguin (2012): Identifying and Punishing Spammy Links

When Google founders Sergey Brin and Larry Page published the details of their PageRank search engine algorithm in 1998, they did so in the belief that web links could be taken as reliable indicators of content quality. That insight remains a significant building block of Google’s search engine algorithms, but it has been refined and developed a great deal throughout the last two decades.
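To make that building block concrete, here is a minimal, hypothetical power-iteration sketch in Python of the kind of link-based scoring the original PageRank paper describes; the toy link graph, the 0.85 damping factor, and the pagerank() helper are illustrative assumptions rather than anything Google actually runs.

def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links out to."""
    pages = list(links)
    n = len(pages)
    rank = {page: 1.0 / n for page in pages}
    for _ in range(iterations):
        # Each page keeps a small baseline share, then receives rank from its inbound links.
        new_rank = {page: (1.0 - damping) / n for page in pages}
        for page, outlinks in links.items():
            if not outlinks:
                continue  # simplification: pages with no outlinks pass nothing along
            share = rank[page] / len(outlinks)
            for target in outlinks:
                new_rank[target] += damping * share
        rank = new_rank
    return rank

# Toy graph: "a" ends up with the highest score because both "b" and "c" link to it.
print(pagerank({"a": ["b"], "b": ["a"], "c": ["a", "b"]}))

In this simplified model, a single link from a page that itself carries a lot of rank contributes more than many links from pages that carry almost none, which is the very property Penguin was designed to protect.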

It did not take less scrupulous SEO experts long to figure out, for example, that enormous quantities of low-quality links could impact page rankings just as much as a handful of far more authoritative ones. Penguin was designed to lessen the effectiveness of such tactics and to reward websites that earned links organically because of the quality of their content.

As such, Penguin was naturally seen as a link-oriented counterpart to the content-focused Panda update of the year before. Taken together with the many improvements and adjustments made to them since, these two updates form a sort of foundation for the smarter, more sophisticated modern version of Google’s search engine that everyone is now so familiar with.

Pirate (2012): The DMCA Notice Becomes a Ranking Signal

At the time of this writing, Google claims to have received Digital Millennium Copyright Act (DMCA) takedown notices targeting nearly 4 billion different URLs. While some of these copyright claims are unjustified, many result in the associated web page ultimately being removed from Google’s search results.

In 2012, Google issued an update that made DMCA activity a ranking factor in its algorithms for the first time. Dubbed “Pirate” by outside observers, the algorithmic tweak was meant to elevate authorized content hosted on sites like Hulu and Spotify over results of less legitimate provenance.

Pirate was successful enough that Google updated it at least twice over the next two years. Google’s official stance on the matter is that only “sites with high numbers of removal notices” should suffer from Pirate’s penalties, and that mostly seems to have been the case.

Hummingbird (2013): Toward a Better Understanding of Intent

Asking questions is an art in its own right, and not every search query does a good job of conveying the searcher’s actual intent. Google users can be looking for anything from websites that delve deeply into particular topics to brief, definitive answers to straightforward questions.

The 2013 launch of the Hummingbird algorithm update marked Google’s first concerted attempt at addressing this issue. Said to have affected about 90 percent of all searches from the start, Hummingbird added natural language processing to Google’s toolkit to allow for more semantically and contextually informed responses.

Like many other major updates, Hummingbird has been improved upon and tweaked many times since. The intent-aware results that Hummingbird enables are used today in modern SERP features like Google’s Direct Answer Boxes and Featured Snippets.

Pigeon (2014): A Major Upgrade to Local Search

At a conference at Google’s headquarters in the fall of 2018, one of the company’s representatives claimed that searches with a local focus accounted for nearly half of its average daily volume. While that figure has been growing steadily for many years, it had become clear by 2014 that Google could be doing a better job of recognizing and responding to local search intent.


Unofficially dubbed “Pigeon,” an update issued in July of that year was meant to put much more of a focus on local businesses and other places physically close to search engine users. Pigeon had first made an appearance in testing a year earlier, with some users seeing Google’s longstanding “seven pack” local results boxes replaced with a more detailed “three pack” set below a map.

On the backend, Pigeon has been said to integrate Google’s local algorithms more profoundly and completely with the rest of the software that makes the search engine run. With local search becoming steadily more important thanks to developments like the increasing dominance of mobile devices, Pigeon could even be considered another cornerstone of the modern version of Google.

Mobilegeddon (2015): Prioritizing Mobile Devices

Smartphones and other mobile devices now handily beat out desktop and laptop computers in terms of the share of Internet traffic they account for. By 2015, Google had for years been encouraging web developers to create sites that work well with the displays and interfaces of such platforms.

Well-established technologies like responsive web design and freely available resources like Google’s “Mobile-Friendly Test” had been around for even longer. Released only after months of official warnings, the 2015 “Mobilegeddon” update was the first to make “mobile friendliness” a core signal in Google’s search results ranking algorithms.

Since then, some other mobile-focused updates have followed, as with 2018’s switch to “mobile-first” indexing for certain compliant websites and the incorporation of a mobile-device response-speed ranking signal later that year. Between Mobilegeddon and subsequent updates, webmasters trying to rank sites now need to make sure they display properly and quickly on mobile devices and do not incorporate mobile-hostile “features” like UI-blocking interstitial ads.
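For illustration, here is a minimal, hypothetical Python sketch of one of the simplest mobile-friendliness checks a webmaster can run on their own: whether a page declares a responsive viewport meta tag. The example.com URL and the ViewportFinder class are placeholders, and Google’s actual Mobile-Friendly Test and mobile ranking signals evaluate far more than this single tag.

from html.parser import HTMLParser
from urllib.request import urlopen

class ViewportFinder(HTMLParser):
    # Looks for <meta name="viewport" content="width=device-width, ...">,
    # the tag responsive pages use to adapt their layout to small screens.
    def __init__(self):
        super().__init__()
        self.has_responsive_viewport = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and (attrs.get("name") or "").lower() == "viewport":
            if "width=device-width" in (attrs.get("content") or ""):
                self.has_responsive_viewport = True

html = urlopen("https://example.com").read().decode("utf-8", errors="replace")
finder = ViewportFinder()
finder.feed(html)
print("Responsive viewport meta tag found:", finder.has_responsive_viewport)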

RankBrain (2015): AI Makes an Entrance

Artificial intelligence (AI) has been a hot topic in the realm of technology for many years. Digital neural networks trained through a process called “deep learning” have proved adept at classifying images, forecasting weather, and even accelerating the process of discovering helpful new pharmaceutical drugs.


In 2015, Google’s $15.1 billion third-quarter earnings report revealed the existence of RankBrain, an AI-powered addition to the company’s search engine algorithms. A Bloomberg story on the technology described how “a substantial fraction” of Google’s daily search queries were being used to train the AI.

While the company’s existing algorithms were used to supply most results, those that were regarded as most ambiguous, unusual, or difficult to answer were being handed over to the RankBrain AI. Since the 200-plus ranking signals that power Google’s conventional algorithms are static and not capable of evolving on their own, RankBrain adds a fundamentally new level of capability and flexibility to the search engine.

Fred (1998-2019 and beyond): One Name for Everything Else

Google Webmaster Trends Analyst Gary Illyes earned some laughs on Twitter in 2017 when he stated that “from now on every update, unless otherwise stated, shall be called Fred.” His questioner, SEO specialist Barry Schwartz, had some particular developments in mind, but the point was well taken: Google updates its search engine algorithms all the time and almost always without comment.

Most of these changes probably amount to minor tweaks, such as increasing the weight accorded to particular ranking signals. Others, like a cluster of changes in 2018 that mostly affected health-focused websites and that observers dubbed “Medic,” probably do represent more fundamental and significant modifications to the algorithms.

Since Google generally has no incentive to detail its algorithms or the changes it makes to them, there will always be a large degree of uncertainty involved. The 2017 updates that Illyes first christened “Fred,” though, seem to have been focused on low-quality websites that were designed to produce cash for their owners without delivering anything of value to visitors.

As such, “Fred” can mostly be taken as a reflection of the only bargain Google has specifically made with webmasters: Host high-quality content that our search engine visitors find useful, and we will send you traffic. A dozen or more Google updates each year make enough of an impact that they receive names and generate discussion among webmasters and SEO specialists, but many of them could just as well be called “Fred.”

A Simple Future for Google and Those Who Follow Its Algorithms

The future will undoubtedly include many more Fred-style algorithm updates that encompass everything from small tweaks to measured, strategic reorientation. Google will also definitely keep emphasizing mobile devices, perhaps even to the point that desktop sites start to take a back seat.

Given the popularity and usefulness of so many of its SERP features, it is also safe to expect that Google will introduce even more of these as time goes on and keep refining the existing ones. RankBrain will improve, too, with formerly unsatisfying search results becoming a lot less common in the process.

With a lot of the most obvious and significant steps having already been taken, though, Fred could become much more the norm in the future. Although Google will probably never stop updating its algorithms, recent history suggests that changes to come will more often be incremental and even unremarkable than in the past.

Click here to read about the birth of Google
