2 SEO overstatements from my previous post

June 21, 2010

I spent much of this week in agile development training. When all you think about is agile development for 36 hours, you start to see ways to apply it everywhere. So I’m going to apply a principle of agile to this blog today. One of the things agile encourages is to work iteratively: Get a product ready for your product owners in a two-week sprint; take feedback, and fix the product to better fit your product owners’ expectations. In the case of this blog, you are the product owners. In response to last week’s entry, you gave me some constructive feedback on how I had overstated the case. In this blog post, I want to correct those overstatements.

Before I get to the overstatements, I need to spend some space on the significance of the SEOmoz study that led to my post. Rand Fishkin, the author of the post, tweeted me that I ignored his discussion of correlation and causation. This was repeated a couple of times in the comments. I want to make it clear that I didn’t ignore his caution. I am not claiming that his data shows that the five practices I recommended cause pages to rank better.

I don’t believe in using this kind of data to establish cause and effect. There are just too many variables involved in search algorithms for any one factor to cause a change in ranking. What I did not have space to say is that those practices tend to result in better ranking. Or, all things considered, if you adopt those practices, you will be more likely to rank better. I think Rand hedges a bit too much in this case. If the best data we have only demonstrates correlation and not causation, I still recommend acting on it. You have to trust the data you have, however imperfect.

Perhaps an analogy will help. In my spare time, and under the alias cmathewson, I am one of several senior writers for an SB Nation blog called Twinkie Town, which covers all things related to the Minnesota Twins professional baseball team. One of the standing debates on the blog centers on the extent to which we should trust sabermetrics–the mathematics of baseball–when we analyze players.

Baseball is a complex game, so sabermetricians develop statistical models that say things like, all things considered, a player who walks more than he strikes out will be a more consistent hitter than a player who strikes out more than he walks. There are all kinds of exceptions and it is not a hard-and-fast rule. But, if you play the odds, you would put your money on the player who walks more. The consensus view is that we should trust sabermetrics because it is the best data we have, though it only establishes statistical correlation, not cause and effect. Still, it is the primary way to predict how a player is likely to play in the future. Most baseball teams now employ a sabermetrician to analyze stats and help make player decisions.

The kind of statistics SEOmoz published in that report are akin to sabermetrics. They will help you predict search performance based on the myriad factors that contribute to search ranking. In that sense, they are very valuable tools. That’s why I was so excited to see the report. Perhaps I got a little too excited and overstated some of my claims. But the claims should be true in a weaker form. Perhaps better, if I qualify them slightly to hedge my bets, the recommendations will be more valuable to readers.

1. There are good reasons to populate your title and h1 tags with your primary keywords

Last week, I said title tags and h1 tags don’t help you rank for your target keywords. This was based on the SEOmoz claim that there is a negative correlation between having keywords in the title and h1 tags and ranking well for those keywords. Another way to say this is, if you put the keywords in the title tag and h1 tag, you will be less likely to rank well for those keywords.

All that is true, if you believe the statistics in the report. But there are a couple of caveats. First, the negative correlation is so small, it’s barely outside the margin of error. It’s close enough to the margin of error to say you should not change your h1 and title tag best practices based on this one report. Rather, the advice should be: Don’t put most of your effort into these two things as though you’re going to get results from it. A negative correlation means, at minimum, there’s no positive correlation. Putting your target keywords in the title tag and heading probably doesn’t help your ranking much, and it probably doesn’t hurt your ranking much either. One thing is certain: It is not as important as most guidance suggests.
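A toy illustration of how such a coefficient is computed may help. The sketch below implements Spearman’s rank correlation (the statistic SEOmoz used) in plain Python and runs it on made-up data for ten result pages. Both the data and the coefficient of about -0.10 it produces are invented for illustration; they are not figures from the SEOmoz report.

```python
def ranks(values):
    """Assign 1-based ranks, averaging the ranks of tied values."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    result = [0.0] * len(values)
    i = 0
    while i < len(values):
        j = i
        # Extend j across a run of tied values.
        while j + 1 < len(values) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of positions i..j, 1-based
        for k in range(i, j + 1):
            result[order[k]] = avg
        i = j + 1
    return result

def spearman(x, y):
    """Spearman's rank correlation: Pearson correlation of the ranks."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Made-up data: 1 if the page's title tag contains the target keyword.
in_title = [1, 0, 0, 1, 0, 1, 0, 1, 0, 1]
# A "ranks well" score, higher = better (the first page is the top result).
rank_score = [10, 9, 8, 7, 6, 5, 4, 3, 2, 1]

print(round(spearman(in_title, rank_score), 2))  # -0.1, weakly negative
```

A coefficient that close to zero says the feature tells you almost nothing either way, which is the point of the caveat above: a slight negative correlation is grounds for de-emphasizing a tactic, not for abandoning it.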

Second, though keywords in the title and h1 by themselves probably won’t help you rank better for those keywords, they will tend to help you rank well for related long-tail keywords. Next week, I’ll devote a whole blog post to long-tail keywords. And we have an extensive discussion of them in our book. Suffice it to say here that long-tails are very valuable because they help you zero in on audiences that are likely to engage with your content when they find it. For this reason alone, it is worthwhile to keep putting your primary keywords in the title and h1 tags.

2. There are good reasons to choose .com domains over .org domains

Last week I said, “if you’re considering starting a new site, use .org if you can get it.” That statement is too strong. First, the data suggests a difference narrow enough that it does not lend itself to categorical statements like that. What I should have said is, there are contexts in which .org is the better choice and there are contexts in which .com is the better choice. But, all things being equal, the data suggests a slight advantage for .org domains.

When is .com the better choice? If your site is primarily used for commerce, .com is the primary top-level domain for that activity. It is a bit disingenuous to choose .org in that case just to try to get an edge in search.

The point I should have focused on is this: If you own a site with a .com domain and you are considering using a primary keyword with a lot of noncommercial competition, consider using a different keyword. The data helps strengthen my hypothesis that it is very difficult to compete with .org and .gov domains, all things considered.

However, the top-level domain issue deserves further study. The likely reason for the gap is that .org and .gov sites have an easier time getting link equity because they tend to be independent. On the other hand, it is relatively difficult to get link equity if you’re a .com site because your positions are naturally biased toward your company’s branding and offerings. In other words, it is not likely that the algorithm inherently favors .org domains over .com. Web publishers tend to favor links to .org sites, which leads to an improvement in search ranking. Also, Google A/B testers might show a preference for .org sites, all things considered, because of their impartiality. In any event, this requires further study before I go off and draw a strong conclusion about it.

