6 Ways Google Killed SEO And What to Do About It
If I seem absent from this site, it is only because most of my work is now published at Biznology. In that blog, I am following a long thread about how to optimize digital experiences for Google after SEO. SEO as we know it is dead. But attracting an audience through Google is not optional. So how do we do it in the post-SEO age? That is the point of my monthly posts at Biznology.
Occasionally, I find myself with fresh insights that don’t quite fit into the flow of that blog. So I will write them here. This is one such post. It came about when I was in residence for a week at the IBM Design Lab in New York. In the course of my discussions with key collaborators there, I came to realize that the Biznology thread is a bit too narrow. There I have mostly focused on how Panda and Penguin have killed SEO. But those algorithm adjustments are only two of the six seismic changes in Mountain View that have altered the algorithm in ways one cannot reverse engineer. I’d like to highlight all six in this post.
First, a bit of terminology. By “SEO” I mean the attempt to reverse engineer Google’s algorithm and build pages that will tend to rank well on that basis. Traditionally, this has been about learning the rules Google used to rank one page higher than another, all things considered, and trying to follow those rules. SEOs chased the algorithm by keeping up with how the rules changed: new rules were added, existing rules were given different weight, and so on.
Well, in the last several years, Google has added other factors and filters that are not rules-based at all. It was never a good idea to chase the algorithm when it was rules-based. Now that it is ever less rules-based, chasing the algorithm is a fool’s errand. But as I say, ranking well in Google for the words your target audience cares about is not optional. So how do you do it post-SEO?
1. Performance metrics
Three years ago, I first heard from Avinash Kaushik how Google rewards pages that have high click-through rates (CTR) and low bounce rates on its search engine results pages (SERPs). (Fortunately, we were able to include this in our book.) What does this mean? Well, if your page performs well by conventionally accepted metrics, it will rank better over time. This makes sense because high CTR and low bounce rates are indicators of relevant content. Google is effective to the extent that it serves relevant results to its users. Its results will tend to get more relevant over time if they promote content that performs well in these key metrics.
Note that performance is not rules-based. With this filter, Google is saying it really doesn’t care how you perform well. It is enough merely to do so. And there is no way to fake performance. The only way to maintain good ranking with this filter is to provide relevant experiences to your users, and relevance is far too varied to capture in a discrete set of rules.
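To make the two metrics concrete, here is a minimal sketch in Python. The field names and numbers are invented for illustration; Google does not publish its actual performance data or schema.

```python
# Hypothetical illustration of the two SERP performance metrics.
# All data below is made up; only the arithmetic is real.

def ctr(impressions, clicks):
    """Click-through rate: share of SERP impressions that became clicks."""
    return clicks / impressions if impressions else 0.0

def bounce_rate(sessions, bounces):
    """Share of visits that left without engaging any further."""
    return bounces / sessions if sessions else 0.0

pages = [
    {"url": "/useful-guide", "impressions": 1000, "clicks": 120, "sessions": 120, "bounces": 30},
    {"url": "/thin-content", "impressions": 1000, "clicks": 15,  "sessions": 15,  "bounces": 13},
]

for p in pages:
    print(p["url"],
          "CTR:", round(ctr(p["impressions"], p["clicks"]), 3),
          "bounce:", round(bounce_rate(p["sessions"], p["bounces"]), 3))
```

The page with high CTR and low bounce rate is the one the filter would tend to reward over time.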
2. Quality signals
I have written extensively about how Panda affects results, so I won’t belabor the point here. The basic principle is that Panda uses crowdsourcing and machine learning to make ever more accurate assessments of the quality of the content in its results. It then rewards high-quality content and punishes low-quality content.
Again, this can’t be done with rules. There is no one standard of high-quality content that satisfies the varied contexts of web publishing. But there are certain signals, or hallmarks, of quality content that Google can learn from its quality testers and look for. Once it sees those hallmarks, it rewards the pages that manifest them.
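As a toy illustration of signals replacing hard rules, imagine scoring pages on a handful of hallmarks. The signal names and weights below are entirely invented; the real Panda signals and how they are weighted are not public.

```python
# Invented quality signals with invented weights, purely to illustrate
# "hallmarks" replacing a single hard rule. Not Google's actual model.

WEIGHTS = {
    "original_research": 2.0,   # rewarded
    "cites_sources": 1.0,       # rewarded
    "spelling_errors": -1.5,    # punished
    "ad_density": -2.0,         # punished
}

def quality_score(signals):
    """Weighted sum over whatever hallmark signals a page exhibits."""
    return sum(WEIGHTS[name] * value for name, value in signals.items())

good = {"original_research": 1, "cites_sources": 1, "spelling_errors": 0, "ad_density": 0.1}
thin = {"original_research": 0, "cites_sources": 0, "spelling_errors": 1, "ad_density": 0.8}

print(quality_score(good), quality_score(thin))
```

The point of the sketch is only that no single signal decides the outcome; many weak hallmarks combine, which is exactly what a rules-chasing SEO cannot reverse engineer.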
3. Semantic smarts
After IBM Watson won at Jeopardy!, I wrote in this blog that I thought future search engines would use semantic smarts to rank pages, rather than the dumb syntactic pattern matching they used at the time. SEO used to be about getting the exact phrases in the right places on pages, and then varying the language on a page with different phraseology so as not to look like you were trying to game the system.
Semantic search renders all that advice garbage. It’s not about exactly matching the syntax (actual strings of letters and spaces) of the keywords your target audience cares about. It’s about matching the semantics of what they type in search. Synonyms can have entirely different syntax. There’s no use trying to reverse engineer a semantic algorithm. You just have to write naturally in a way that is relevant to your target audience and forget about building pages solely with exact-match keywords. Of course, it’s still important to have your keywords in the title tags and meta descriptions. But beyond that, it’s not that simple.
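A toy example of the syntax-versus-semantics distinction: an exact string match misses a synonym, while even a crude synonym table catches it. The query, page text, and synonym table are invented for the example; real semantic search uses far richer language models than a lookup table.

```python
# Toy contrast between syntactic and semantic matching.
# The synonym table is invented for illustration only.

SYNONYMS = {
    "cheap flights": {"cheap flights", "low-cost airfare", "discount plane tickets"},
}

def exact_match(query, text):
    """Old-style syntactic matching: the literal string must appear."""
    return query in text.lower()

def semantic_match(query, text):
    """Crude stand-in for semantics: any known synonym counts as a match."""
    text = text.lower()
    return any(term in text for term in SYNONYMS.get(query, {query}))

page = "Find low-cost airfare to hundreds of destinations."
print(exact_match("cheap flights", page))     # syntax differs, no match
print(semantic_match("cheap flights", page))  # meaning matches
```

A page written naturally about the topic matches the semantic test without ever containing the exact-match keyword, which is the whole point.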
4. Overoptimization signals
Google has been wise to SEOs for some time. As with virus writers and antivirus software makers, it has always been a dance of keeping one step ahead of Google and building pages that Google does not penalize for overt uses of SEO. There was a point at which Google caught up and overtook the SEOs on this issue. It was some time in 2011, when Google recognized patterns in the way SEOs responded to algorithm changes. Using machine learning again, Google discovered a way to see patterns in SEO behavior and thwart it before it became widespread. The upshot: if you find a way to game the system and it becomes even remotely widespread, Google will discover it and write countermeasures into its algorithm. These countermeasures are typically no longer rules-based.
For example, a couple of years ago, I was on a call with a group, and a cocky guy who had recently read a blog post by a prominent SEO was touting the use of ALT attributes to pump up keyword density in pages. By the time I studied this in greater detail, it had become more of a negative signal for Google than a positive one. In other words, Google was punishing sites for using ALT attributes to pump up keyword density, much as it had done with hidden text a decade before. The difference since 2011 is that Google actually built a machine learning program to detect algorithm hacks and punish the sites that use them programmatically.
5. Link building signals
When Google first started, it was like every other search engine in most ways. It ranked pages by systems of rules based on the syntax of the text on pages. The reason Google dominated the search game was not because it did this part of search better than the others. It was because it also looked at links. Links are the signals that tell Google the context of the page it is trying to rank. It rewards pages that are contextually relevant to the search query, all things considered, based on the quantity and authority of the links into the page.
Clever SEOs discovered how to game this system by buying or swapping links. Eventually, it got so bad that links were practically meaningless for Google, and it was back to dumb pattern matching based on syntax rules. That was, until Penguin, when Google found a way to detect and foil apparent link building activities in the algorithm. This is a form of overoptimization using links. Penguin thwarts apparent link building as surely as Panda thwarts keyword stuffing in ALT attributes, using the same machine learning techniques.
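The link-analysis idea that made Google different was published as PageRank, which distributes authority over the link graph by repeated redistribution (power iteration). Here is a minimal sketch of that published idea; it is a toy, not Google’s production ranking, which long ago grew far beyond this.

```python
# Minimal PageRank-style power iteration over a tiny invented link graph.

def pagerank(links, damping=0.85, iters=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        new = {p: (1 - damping) / n for p in pages}
        for p, outs in links.items():
            if outs:
                share = damping * rank[p] / len(outs)
                for q in outs:
                    new[q] += share  # each outlink passes on an equal share
            else:
                for q in pages:      # dangling page: spread its rank evenly
                    new[q] += damping * rank[p] / n
        rank = new
    return rank

graph = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
ranks = pagerank(graph)
print(ranks)
```

Page "c" ends up with the most authority because both "a" and "b" link to it, which is exactly the quantity-and-authority intuition described above, and exactly what link buying and swapping were designed to fake.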
6. Copyright infringement
Google will continue to add ways of thwarting those who game the system as soon as their tactics become known to Google. One common way to game the system is to copy the content from a top-ranking page, build a better optimized page with the same content and publish it as original. Google recently announced that it has developed a way of detecting copyright infringement and severely punishing sites that appear to engage in these activities. Again, there is no single rule that can help a machine to know which of two pages is the original and which is the copy. Rather, there are certain hallmarks to infringement that Google looks for. Woe unto those sites that manifest these signals.
How to rank well post SEO
I am writing a new book about this topic, so, again, I won’t belabor the point. Simply put, there is no better approach to ranking well in Google than to build honest and transparent websites that attract, excite, and compel your target audience to engage with them. This might go against the grain of your approach to marketing if it is typically based on hyperbole. But in the age of algorithms that use performance metrics, machine learning, and semantic smarts, it is the only effective way to do digital marketing.