A Brief History of Google Algorithm Updates

by Ryan Miller
on August 28, 2015

TL;DR

Google has been fine-tuning its search algorithm for over a decade, and each update has been about one thing: providing its users with a rich, safe and ultimately USEFUL experience.

As much as I try to avoid it, I have a tendency to use “Google” as a placeholder for what I really mean: search engines. I mean hey, it rolls off the tongue a little better than “Google, Bing, Yahoo, ASK, AOL, and Yandex” (if you’re really concerned with Russian organic rankings, I suppose).

All jokes aside, though, it’s kind of subconscious. Because let’s face it: Google basically is the search engine’s search engine. Globally, its market share continues to grow (a not-too-shabby 70.23% at the time of this writing).

So yes, of course other search engines such as Bing are absolutely important in SEO. But at the same time, let’s not kid ourselves. There’s a reason why, if you look up SEO in the dictionary, this is the illustration you’ll see:

SEOs bow to the greatness of Google

And that reason…has everything to do with the structure of Google, how it ranks organic content, and how far it’s come in that endeavor over the long historical course of these Google algorithm updates.

In the Beginning…There Was PageRank

And PageRank said, “Let there be links.” And it was good…for a time.

Yes, PageRank (PR) was the OG algorithm (well, not the algorithm, but the main ingredient) that set Google on the trajectory that it continues to follow. Named after Larry Page, who co-founded Google with Sergey Brin, PageRank allowed Google to gauge the importance of a website by the quality and quantity of links pointing back to it.

Simply put, a link represented a “vote” in this cyber assembly called the world wide web. One link from site A to site B essentially said “site A thinks site B should rank.” The more authoritative and valuable the linking domain, the more weighted its “vote” was.

How exactly did Google measure such value and authority? Through PageRank, of course. So, higher PR sites could boost the PR of other sites by linking to them. And those sites could boost the PR of others. Sounds confusing, right? It was the Internet’s own chicken and the egg paradox.
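
For the curious, the simplified version of the formula from Page and Brin’s original paper looks roughly like this, where d is a “damping factor” (typically set around 0.85), T1 through Tn are the pages linking to page A, and C(T) is the number of outbound links on page T:

PR(A) = (1 − d) + d × ( PR(T1)/C(T1) + PR(T2)/C(T2) + … + PR(Tn)/C(Tn) )

The chicken-and-egg part is resolved by brute force: give every page the same starting score, run the calculation over the whole link graph again and again, and the numbers eventually settle into stable values.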

Google PageRank

The complexity of PageRank required Google to update its index often. So, Google executed frequent yet minor updates (about 10 times per year). This became known as the “Google Dance” — one step here, one step there, but no major movements.

Then, something happened. Google stopped the music and changed the tune. It took on an entirely new MO: bigger Google algorithm updates that “rolled out” over days or weeks and shook up its index. The dance went from waltz to swing, and it’s remained that way ever since. What follows is an overview of some of that dance’s major steps.

Boston (February 2003)

The first of many subsequent updates was named Boston (I know, it sounds totally innocuous), as Google announced it at the Search Engine Strategies conference in Boston in February 2003.

Though long since forgotten, Boston was significant for two reasons. First, it set the stage for major, rolling Google algorithm updates. Second, and arguably the more important takeaway, it placed explicit emphasis on the quality of backlinks, a motif that the next update developed even more clearly.

Cassandra (April 2003)

Links are some tricky SOBs to scrutinize. And black hat SEOs, being as persistent as they are, constantly find new ways to game the system. If you need a site to rank for 500 keywords ASAP, why not just buy a high PR domain to link back to it?

This is precisely what Cassandra sought to rectify: links from co-owned domains. In addition, the Cassandra update set the stage for cracking down on black hat techniques that future updates such as Austin would grapple with, like hidden text and links and, later, “cloaking” (showing search engines different content than human visitors see).

Florida (November 2003)

As Moz puts it, “this was the update that put updates (and probably the SEO industry) on the map.” That’s no exaggeration, for Florida was the first time Google penalized sites for keyword stuffing on a large scale.

For a while, keywords were the primary indicator of relevancy to a given search query. And how painful it must have been to read the pre-Florida content of the late ’90s, which looked something like:

Web design is the process of designing a web site. Web design is important because web design determines the whole look and feel of your website. Without web design, your site will look horrible. Call the best web design company for the best web design services at the best web design prices!

A single keyword repeated six times in just fifty-something words? Apparently pronouns were something of a novelty.

Be that as it may, the message that Florida sent out was clear enough: quality content and SEO are inextricable. This was really bad news for keyword stuffers, and some webmasters even wondered if they could sue Google for being penalized.

The answer was: no, they couldn’t. From there forward, SEO became an art — and an extremely competitive one, at that.

Austin (January 2004)

Plenty of sites that didn’t get Florida’s memo — or were lucky enough to emerge from it unscathed — certainly felt the wrath of Austin two months afterward. Google celebrated the new year by cracking down on spammy meta tags (as well as link farms and some on-page issues — more on that later).

Austin put a major dent in the efficacy of stuffing a web page’s <meta> elements, particularly meta keywords and descriptions. These HTML tags are supposed to give search engines information about a page and its contents. But what often happened was something like this.

Let’s say a website sold whey protein powder. Until Austin, the meta keywords might look like this:

<meta name="keywords" content="buy whey protein, whey protein buy, purchase whey protein, whey protein for sale, buy whey protein powder, purchase whey protein powder, best whey protein powder, whey protein powder" />

Yeah, yeah… we get it already. You want to rank for these keywords. Unfortunately, Austin made this an unfeasible SEO tactic. In fact, it wasn’t long before Google stopped using meta keywords as a ranking factor altogether. Here, as in every one of these Google algorithm updates, even the cleverest spammy technique can’t trump quality, useful content.
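
For contrast, the post-Austin way to use these tags is simply to describe the page honestly. A sane meta description for our hypothetical protein store (the wording here is just a made-up placeholder) would look more like:

<meta name="description" content="Shop whey protein powder in chocolate, vanilla and unflavored varieties, with free shipping on orders over $50." />

One honest, human-readable sentence about the page. That’s it.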

Brandy (February 2004)

Google followed Austin with a hodgepodge of updates that became known as “Brandy.” This altered a few things: increasing the size of the Google index, reducing the importance of some on-page elements, etc. But the 3 major legacies of Brandy are as follows:

  1. Deep Linking – even today, the vast majority of inbound links point to the homepage only. This is partly a holdover from how PageRank used to work, indirectly passing link value to pages that were internally linked (assuming a solid, intuitive site architecture). Brandy didn’t totally erase this, but it did increase the importance of “deep linking,” that is, inbound links pointing to inner website pages.
  2. Link Neighborhoods – As if the newfound importance of deep linking didn’t make life difficult enough, Brandy added one more signal to the value of a link: the linking site’s “neighborhood,” that is, how relevant the linking site’s content is to the content of the site it’s linking to.
  3. Latent Semantic Indexing – One of my personal favorites, latent semantic indexing (LSI) was a complex addition to the search algorithm that forever changed the logic of keyword research. Essentially, it compares keywords with their synonyms and contextual usage, treating a keyword like “web design” as closely related to “web designer,” “web developer” and “designing a website.”

Bourbon (May 2005)

A lot happened after Brandy. Google went public, introduced the “nofollow” attribute, and began doling out penalties for sketchy backlinking. Following the Allegra update in February 2005 was “Bourbon”, which is one of the more technically-oriented Google algorithm updates.

Bourbon dealt primarily with duplicate content. Original (i.e. not duplicated from another source) content has long been crucial to SEO, but Bourbon also highlighted the problem of unintentional duplication (e.g. indexing both the www and non-www versions of a domain, separate mobile versions of a site, and printer-friendly page versions).

The solutions that came out of Bourbon were canonicalization and redirection. The former tells search engines which version of a duplicated page is the “canonical” (preferred) one to index, while the latter sends visitors (human or otherwise) from duplicate URLs to a single destination.
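
To make that concrete: the standard markup signal for canonicalization these days is the rel="canonical" link element, which Google formally introduced a few years after Bourbon. On the www, non-www and print versions of a page, you would drop something like this into the <head> (the URL is just a placeholder):

<link rel="canonical" href="http://www.example.com/whey-protein-powder" />

And when a duplicate URL shouldn’t exist at all, a server-side 301 (permanent) redirect points both visitors and crawlers to the surviving version.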

Since Bourbon, Google has continued to update and refine the way it handles these issues (later that year in the “Big Daddy” update, for instance), serving to remind SEOs that this profession is just as technical as it is creative, just as much a science as it is an art.

Jagger (October 2005)

Following in the footsteps of its predecessors, the “Jagger” algorithm update rolled out over the course of 3 months and targeted at least 3 manipulative link schemes:

  1. Reciprocal Links – The “I’ll scratch your back if you scratch mine” of the SEO world, which Jagger largely helped to discount. Sorry, folks. Links have to be earned the hard way.
  2. Link Farms – Some people grow corn, some grow tomatoes. These sites grow links, and they have millions on them. Any site that says “Submit Your Link” is almost guaranteed to be a link farm, and it should be avoided like the plague. If Jagger or subsequent updates didn’t catch ’em, they’re still not long for this world.
  3. Paid Links – You can’t trade for links, you can’t farm them, and guess what else? You can’t pay for them, either. Knowing whether someone is paying for links can be extremely difficult, but I for one wouldn’t bet against Google. They spend a lot of time and effort researching this kind of black hat tactic, and you’re always better safe than sorry.

Vince (February 2009)

Whoa — didn’t we just skip three and a half years? Yes, and a lot happened in between. Vanessa Fox left Google, the search index was updated a few dozen times, and news, video and image results were integrated into the main results page (this is called “Universal Search”).

But the next really noteworthy update happened in February of 2009. Code-named “Vince,” it bestowed certain favor on big-name brands. Then-head of Google’s webspam team (and its unofficial spokesperson) Matt Cutts called it a relatively “minor change.” But go ahead and Google “laptops” to see if you find any results for that tech engineer who’s been trying to sell his Hackintosh for the past decade.

Vince might seem unfair, but think about it from Google’s perspective. Whether it’s keywords, on-page optimization, backlinks, or technical SEO, at the end of the day all Google cares about is quality for its users. And the kind of quality it’s looking for is more often found in a major corporation than a mom and pop store.

But fear not, small business owners! SEO has a long way to go before becoming the exclusive province of the Fortune 500. It takes hard work, but getting small businesses to rank and succeed is very much possible. In fact, it’s still the lifeblood of the SEO industry.

May Day (May 2010)

In May 2010, Matt Cutts confirmed a new update, which has since become known as “May Day.” Cutts described it as ranking-oriented, not crawling- or indexing-oriented per se. And what May Day was really concerned with was a particular kind of keyword ranking: the long-tail keyword.

Long-tail keywords are longer, more specific queries. Individually, each one draws only a trickle of searches, but collectively they account for a huge share of search traffic, and the visitors they bring tend to convert well. A regular (“head”) keyword might be “Key West hotel,” while its long-tail counterpart would be “best hotels in Key West.”

It’s not hard to see how some pages could use titles and meta descriptions to game the system in spite of their thin on-site content, but May Day changed this. Consistent with the introduction of latent semantic indexing, this algorithm update foreshadowed an even more important emphasis on quality content: Panda.

Panda (February 2011)

Panda is one of the better-known Google algorithm updates, and for good reason: according to Google itself, it affected approximately 12% of all searches. When you consider the billions of searches that take place every day, this is no insignificant figure.

The focus of the Panda update can be summed up in a single word: content. And when it comes to content, there are numerous bad eggs plaguing the net. There are the sites with thin content (barely 100 words per page) that rely on long-tail keyword traffic, which May Day had begun to weed out.

And then there are “content farms” — sites that exist solely to churn out thousands of pages of content (often written by guest bloggers or freelance writers) in hopes of attracting high rankings.

Panda dealt with all of this and more. The Panda filter, for which Google later filed a patent, compares metrics such as inbound links to branded searches (i.e. searching for “TM34 Marketing” directly rather than searching for a keyword and finding us that way).

The ratio of links to branded searches is the basis upon which Panda “scores” the pages on a given website. Google also employs human (not machine) quality raters to examine the quality and trustworthiness of a site. If a site fails to meet those standards, the Panda ranking factor is applied, and its pages rank lower for their keywords than they would have pre-Panda.

The first Panda update, Panda 1.0, made a huge impact on the SEO industry and the internet as a whole. But Google wasn’t done there. The filter is constantly applied and re-applied as Google rolls out fresh Panda updates, with the most recent (at the time of this writing) being Panda 4.2 on July 18, 2015.

Venice (February 2012)

For the next year, Panda became the sole bane of black hat SEOs everywhere. The Panda filter updated at least 11 times between February of 2011 and 2012, but while everyone was frantically attempting to fluff out their original content and rewrite the duplicates, something else went almost entirely unnoticed: the Venice update.

There’s another reason why few SEOs saw Venice coming, and it has to do with its focus: local search marketing, the somewhat prodigal child of the SEO world. Pre-Venice, local search marketing had of course existed, but it worked quite differently. The search results page was divided into little fiefdoms, with organic SEO here, local listings there, and so on.

PPC vs. SEO vs. Local Search Marketing

Local SEOs were only paying close attention to the map, as pictured above, because that’s where the local search results were. What Venice did was take local search and put it in the organic results, as well.

Remember when I said small businesses still had much to gain from SEO? THIS IS WHAT I WAS REFERRING TO. Venice made it so that Google automatically took your location into account for a given search.

So, you could search for “real estate law” and yes, you might find a few highly ranked non-local results such as Avvo and Wikipedia. But guess who’s also ranking highly for that keyword? The small, local real estate law firm that’s knee deep in the SEO game.

Penguin (April 2012)

Panda was a huge wakeup call for SEOs with black-hat proclivities, whether or not they knew they were violating the best practices outlined in Google’s Webmaster Guidelines. For the umpteenth time in over a decade, Google had conveyed the message: “Quality is our top priority. Don’t bother trying to rank for anything less.”

There are dozens of ways to gauge the quality of a webpage. Content is, of course, one obvious factor, and that had been the primary subject of most major Google algorithm updates since Big Daddy.

At the same time, no one at Google had forgotten about links and the significant role they play. And the proof of this fact was the introduction of Google’s Penguin filter on April 24, 2012.

This is not to say that content and links are two wholly separate concepts. In fact, some of Penguin’s primary targets were websites with “spammy links” (completely unrelated to the content on the page). For example:

[Example screenshots: paragraphs of barely readable text stuffed with links whose anchor text targets “loan” keywords]

WTF are readers supposed to make of garbage like this!? It’s not just bad writing. It’s deliberately manipulative linking.

The content of the linking page isn’t the only factor that Penguin attempts to filter out. You’ll notice that each link’s anchor text (the text that the link is wrapped around) targets “loan” keywords.

You see, once upon a time, anchor text was a valuable asset in the SEO world. Because links are supposed to be earned in an editorial way, it would make good sense for some website to rank for “fast cash loan” if another website linked to them like so:

Check out the fast cash loans at XYZ Brokerage for great terms and rates.
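
In raw HTML, that editorial-style link would look something like the snippet below (the URL is made up for illustration), with “fast cash loans” serving as the anchor text:

<a href="http://www.xyzbrokerage.com/loans">fast cash loans</a>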

But people took advantage and found ways to place and “editorialize” their own links. Soon, a huge proportion of anchor text looked exactly like this, and to Google that looks extremely spammy.

Nowadays, anchor text is still significant, but it’s become counter-productive to waste time “optimizing” it for keywords. Too much, i.e. “over-optimized,” anchor text looks unnatural. And unnatural link schemes make perfect targets for ongoing Penguin updates (which, by the way, continue to roll out every few months or so).

EMD (September 2012)

One thing that came out of Penguin was a new best practice for anchor text. A few suspicious-looking ones here and there usually don’t hurt if they are indeed natural. But, on the whole, it’s recommended that most links’ anchor text be branded (“TM34 Marketing”) or “naked” (“http://tm34marketing.com”).
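
In markup, those two safer flavors look like this, using this site’s own brand as the example:

<a href="http://tm34marketing.com">TM34 Marketing</a> <!-- branded anchor text -->
<a href="http://tm34marketing.com">http://tm34marketing.com</a> <!-- "naked" URL anchor text -->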

This latter standard, naked anchor text, presents a certain problem. Let’s go back to the payday loan example. Sure, sites can stop spamming the anchor text, but what if the domain being linked to is actually called www.paydayloan.com?

Is it spam? After all, that’s the domain name. The fact that it’s made up of valuable keywords is just a happy coincidence, as far as anyone is concerned.

Well…actually, it’s not just a coincidence. It’s a fairly obvious technique, and it was dealt with in the EMD (Exact Match Domain) algorithm update. Now, the power of spammy, keyword-stuffed URLs is significantly reduced. Although incorporating keywords into a URL remains worthwhile, the most keyword-saturated domain is of little value unless it’s supported by the sturdy foundation of quality content.

Payday (June 2013)

Speaking of payday loans, this is one of the select industries that Google singled out in the 2013 update known as “Payday” or “Payday Loan.” Payday loans and pornography are the two kinds of “spammy search queries” that Matt Cutts mentioned on June 11, and the update hit any site associated with them, whether through links, advertisements or content.

No complicated filters or mind-boggling algorithms here…just a crackdown on plain, good ol’ fashioned spamdexing. Because, let’s face it, if Google’s always going on about the significance of quality search results, there are some websites that will never meet their standards.

Hummingbird (August 2013)

Remember the search engine formerly known as Ask Jeeves (later renamed Ask.com, which eventually gave up trying to compete with Google in search)? It was an interesting idea that drove the algorithm: not merely searches, but questions in search of answers.

This idea has been expanded into the concept of “semantic search,” which Google took and ran with at full speed just before its 15th anniversary. With Hummingbird, more attention was paid to the context of a search, and not merely the query itself.

Hummingbird was a long time coming, and it can be seen as the culmination of a few initiatives that Google had rolled out over the years:

  • Latent Semantic Indexing: The driving principle behind semantic search, LSI is how Google discerns, for example, whether “Macintosh Apple” refers to the fruit or the computer.
  • Caffeine: An overhaul of Google’s indexing infrastructure (not covered above) that made the index more dynamic and, according to Google, 50% fresher.
  • Knowledge Graph: In my opinion, the major precursor to Hummingbird that gave special privileges to web pages whose content answered a user’s question as perfectly as possible (in Google’s estimation, at least).

So what effect did Hummingbird have on SEO? It had nothing to do with links, and little to do with content in the sense that Florida, May Day or Panda did.

Well, the real impact of Hummingbird is in the way SEOs have since thought about content, and how that content is written (or supposed to be written). It forces the writer to really get into the mind of the reader. Even though it was always supposed to be like this, content must now be written for human users, not search engines.

Or, as Steve Masters put it: “think about why people are looking for something rather than what they are looking for. A content strategy should be designed to answer their needs, not just provide them with facts.”

Pigeon (July 2014)

July 2014 was a big month for SEO. First of all, it’s when I was introduced to (and fell head over heels in love with) this beautiful mistress called content writing. And I’ve dedicated my life to her ever since. So, you’re welcome.

Did anything else happen? Yes, that was when Google introduced Pigeon, and forever intertwined the fates of local and general SEO.

Essentially Venice on steroids, Pigeon took every noteworthy algorithm update and gave it some local flavor. From LSI to the Knowledge Graph, redirects to social signals, you can bet that Google is serving you results based on all these factors PLUS their proximity to your physical location.

Mobilegeddon (April 2015)

Fast forward a year, and the internet experienced some more Panda and Penguin updates, the penalization of DMCA copyright violators (the “Pirate” update), and the not-so-silent death of Google authorship. Then, in February of this year, e-commerce webmasters started reporting shifts in their keyword rankings.

Some thought it was a UX-related update, while others saw it as specifically related to mobile usability. Google never officially confirmed that one, but what it announced next was even more ominous. It was the Final Judgement, the Second Coming: Mobilegeddon.

Chances are your hosting company, SEO agency, and every web designer in a 50 mile radius began spamming your inbox with subject lines like, “Your Site Isn’t Ready For Mobilegeddon. Get It Optimized Before It’s Too Late!” Maybe you bought into it, maybe you didn’t. But on April 21, it came.

In short, the idea behind this update was to give preference to mobile-friendly sites in mobile search results. Google never actually referred to it as “Mobilegeddon,” and calling the name an exaggeration doesn’t even do it justice. Even though there were some winners and losers, it turns out most of the fuss was for naught.

According to a study by Searchmetrics, the average decline in rankings for non-mobile-friendly sites was a mere 0.21 positions. That’s certainly nothing to cheer about, but at the same time, it’s no Panda or Penguin.

What we should take away from the mobile update is this: to my knowledge (and I’ve yet to discover otherwise), this is the first time that Google has singled out web design itself as a ranking factor. Some earlier updates, such as the “Page Layout” (a.k.a. “Above the Fold”) update, did come close. But this is the first one with absolutely nothing to do with links, content, technical SEO, or anything else traditionally thought of as search engine marketing.
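
For what it’s worth, “mobile-optimized” starts with something as small as a viewport declaration in the page’s <head>, which tells browsers (and Google’s mobile crawler) to scale the layout to the device:

<meta name="viewport" content="width=device-width, initial-scale=1" />

From there it’s legible font sizes, tap targets you can actually hit with a thumb, and no horizontal scrolling, which is essentially what Google’s Mobile-Friendly Test checks for.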

Takeaways…and the Future of Google Algorithm Updates

And all this is just fine, because it’s completely consistent with something I’ve always said: marketing and web design are two sides of the same coin. I get that not every SEO or content writer has the mind for web development. I can accept that they don’t have time to learn CSS. But there’s no reason they shouldn’t be working with people who do have these capabilities.

Because, when you think about it, every Google algorithm update from Boston to Mobilegeddon is about the same thing: a quality user experience. Google is a business, and like any other business, it provides a service. That service is its search engine results. And just as any other business would be, Google is extremely concerned with quality control.

So, despite what you may hear, it’s not hard to predict what the future of Google algorithm updates will hold. Yes, it’s true that only a select few know the specifics. But anyone who looks at this history can see that quality is the top priority.

And there’s no secret to producing quality work, whether in design or content or backlinks. Not everyone is capable of it, but those who are capable are capable. They cannot be replaced, downsized or off-shored. Because Google isn’t going anywhere. And neither is quality, nor those who produce it.
