
4 Web analytics basics

March 20, 2010

In her recent blog post The myth of data driven decisions, Leen Jones expresses doubts that data are the panacea they were once cracked up to be. She admits she thought data alone would make her content decisions for her. She also thought data would help her avoid conflict: if someone questioned a content decision, she could just show them the data and win the argument. It did not work:

I had to think even harder to apply the data to decisions. And, the arguments did not end. If anything, they got deeper and louder.

At first I thought she was recommending going back to the days when we published Web sites on blind intuition and ignored data, throwing stuff against the wall with our eyes closed and hoping it would stick when we opened them again. But as I read to the end of the piece, it was clear she was not saying "don't use data." She was trying to debunk the myth that data alone could solve all our problems and help us avoid conflicts about our content decisions.

Despite her lead, I'm relieved to be able to agree with her. Data alone doesn't make content decisions for us; we make content decisions. But data is key to helping us make smarter ones. Data doesn't help us avoid conflict, either; we shouldn't be trying to avoid conflict, we should embrace it. The right data gives us the confidence to face any conflict with stakeholders and help them better understand the content decisions we make. With the right data, and a better understanding of how to interpret it, we can bring a sense of calm to the arguments and resolve the conflicts. That's why data is at the center of content strategy best practices.

So what is the right data? And how do we interpret it? We devote a whole chapter of our book to those questions. Here I just want to give you a thumbnail sketch of the four analytics practices every content effort needs. If you do nothing else, you should track this data and use these interpretations as the basis for your content decisions.

When Jones speaks of data, she probably means traffic data. Traffic data is nice, but it doesn't say much about how effective your content is. It says more about how well you drive people to your site. If you drive a lot of people to your site and they leave right away, that's actually worse for you than if they had never come at all. Of course you want traffic, but the data you really want to track is what visitors do when they get to your pages. These are the things you should track.

1. Bounce rates

Bounce rates are the percentage of users who come to a page, click nothing, and leave, typically via the back button. We interpret a bounce as a sign that the user did not find the page relevant. If the user doesn’t bounce, we interpret that to mean that the user found the page at least nominally relevant.

Depending on the referring URL, bounce rates somewhere between 20% and 50% are acceptable. If you get better than a 20% bounce rate, you're doing a great job on your page. Natural language is so complex and users are so diverse that expecting better than 20% is a recipe for disappointment. You might get better bounce rates from social media referrals because you know the people you're tweeting to. But I like to set my organic search referral benchmarks at 20% bounce rates. Paid search referrals should come in under 50% bounce. If you do other drive-to tactics, such as banners, you can expect higher bounce rates.
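To make those benchmarks concrete, here's a minimal sketch (in Python) of computing bounce rates per referral source and flagging the ones that exceed a threshold. The session records, field names, and thresholds are invented for illustration; your analytics package reports these numbers directly.

```python
# Minimal sketch: bounce rate per referral type from session records.
# Field names and the benchmark ceilings (20% organic, 50% paid) are
# illustrative, echoing the thresholds discussed above.
from collections import defaultdict

sessions = [
    {"referrer": "organic", "clicks": 0},  # a bounce: landed, clicked nothing
    {"referrer": "organic", "clicks": 3},
    {"referrer": "paid",    "clicks": 0},
    {"referrer": "social",  "clicks": 1},
]

totals, bounces = defaultdict(int), defaultdict(int)
for s in sessions:
    totals[s["referrer"]] += 1
    if s["clicks"] == 0:  # no engagement at all
        bounces[s["referrer"]] += 1

benchmarks = {"organic": 0.20, "paid": 0.50}  # acceptable bounce ceilings
for ref, n in totals.items():
    rate = bounces[ref] / n
    target = benchmarks.get(ref)
    note = " <-- above benchmark" if target is not None and rate > target else ""
    print(f"{ref}: {rate:.0%} bounce ({bounces[ref]}/{n}){note}")
```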

Tuning your description metatag can help organic users understand why they’re clicking, and thus, reduce bounce rates. Writing tweets and other social media posts that are highly targeted for the audience also helps lower bounce. Tuning your paid creatives has a similar effect. You can lower banner bounce by choosing highly relevant sites on which to buy your banners.

2. Engagement rates

We define engagement in terms of clicks. If a user clicks something on a page (i.e., doesn't bounce), she engages with it. We say in our book that you can use engagement to infer that your content is at least nominally relevant. And the more deeply a user engages with a page, the more relevant she finds it. The goal of Web publishing is to get deep engagement. You define deep engagement by assigning value to your links. Perhaps a click on a persistent navigation element counts as a 1 on a scale of 5, and clicking the first link in the center pane of your page counts as a 5 out of 5. Every new click adds value to the engagement.
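Here's a rough sketch of what that scoring might look like in code. The link categories and 1-to-5 weights are hypothetical; the point is that every click adds its weight to the visit's engagement score.

```python
# A minimal sketch of weighted engagement scoring. The link categories
# and their 1-5 weights are hypothetical; use whatever values reflect
# how deeply each click type engages with your content.
LINK_WEIGHTS = {
    "persistent_nav": 1,  # e.g. header or footer navigation
    "related_link":   3,
    "center_pane":    5,  # the first link in the main content area
}

def engagement_score(clicks):
    """Sum the weight of every click in a visit; each click adds value."""
    return sum(LINK_WEIGHTS.get(click, 0) for click in clicks)

visit = ["center_pane", "related_link", "persistent_nav"]
print(engagement_score(visit))  # 5 + 3 + 1 = 9
```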

Tuning your pages for engagement is more of an art than a science. (This might be what Jones meant: data alone can't help you understand what's going on. How you interpret the data can.) You can have highly relevant messages, but if the design keeps visitors from seeing them, they might not engage. Also, referring URLs affect engagement: if users land on your page unexpectedly, the content might be relevant, but they are not ready or willing to engage. If you have low engagement rates, there are ways of finding out what's wrong, but it might take a while.

3. A/B tests

Isolating the feature of your pages that is causing friction for audience engagement and fixing it is typically an iterative process. Change one thing at a time and test it out. Sometimes you don't have time to do that; you might need to change a bunch of stuff at once and test the whole collection. But if you have time, it's nice to identify the likely causes of engagement friction and fix them iteratively, testing in between. How do you do this? We find the best way is to run two versions of your pages and compare how they do over the course of a month or so. This is called an A/B test: you serve an A version and a B version randomly to your users and compare how they click.

A month is an arbitrary time frame. It really depends on how much traffic you get. You need enough traffic to your pages for the results to be statistically significant. Google is continuously running A/B tests and it can do a new test every day in some cases because it gets so much traffic. Your mileage may vary.
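If you want to sanity-check significance yourself, here's a minimal sketch of a standard two-proportion z-test on A/B click rates. The counts are made up, and any dedicated testing tool will do this math for you.

```python
# Minimal sketch: two-proportion z-test for an A/B test on click rates.
# The counts below are invented. A non-significant result means "keep
# the test running", not "there is no difference".
from math import sqrt, erf

def ab_test(clicks_a, n_a, clicks_b, n_b):
    p_a, p_b = clicks_a / n_a, clicks_b / n_b
    p = (clicks_a + clicks_b) / (n_a + n_b)       # pooled click rate
    se = sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))  # standard error
    z = (p_b - p_a) / se
    # Two-tailed p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return p_a, p_b, z, p_value

p_a, p_b, z, p_value = ab_test(clicks_a=210, n_a=1000, clicks_b=260, n_b=1000)
print(f"A: {p_a:.1%}  B: {p_b:.1%}  z={z:.2f}  p={p_value:.3f}")
if p_value < 0.05:
    print("Significant at 95%: serve the winner.")
else:
    print("Not significant yet: keep the test running.")
```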

4. Qualitative data

Quantitative data (traffic and engagement) has limitations. For one thing, interpretation can be tricky: there are so many variables affecting how users behave on your site that there is a lot of margin for error. And it's not always easy to get statistically meaningful results. For this reason, we recommend a healthy dose of qualitative data: user surveys, comments, user ratings, and other forms of user research.

There are a variety of ways of using social applications to get direct user feedback. We consider it imperative to allow users to rate and comment on your content. You can mine the comments for sentiment and other qualitative user judgments. Social tagging is another must: when users tag your content, you get a sense of how well you hit the mark semantically. Finally, nothing beats usability studies for teasing out hidden problems with your pages.

When you use qualitative data to triangulate on your quantitative data, you can eliminate all kinds of variables and come to some pretty solid conclusions. For example, if your highest-rated content is buried, you know you can get better engagement rates by making it more prominent.
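Here's a small sketch of that kind of triangulation: cross-referencing average user ratings against engagement scores to flag highly rated but under-engaged pages. The page data and cutoffs are invented for illustration.

```python
# Minimal sketch of triangulating qualitative ratings against quantitative
# engagement. URLs, ratings, and scores are invented; a page users rate
# highly but rarely engage with is a candidate for more prominent placement.
pages = [
    {"url": "/guide", "avg_rating": 4.6, "engagement": 0.9},
    {"url": "/faq",   "avg_rating": 4.4, "engagement": 4.1},
    {"url": "/press", "avg_rating": 2.1, "engagement": 0.7},
]

buried = [p for p in pages
          if p["avg_rating"] >= 4.0 and p["engagement"] < 2.0]
for p in buried:
    print(f"{p['url']}: rated {p['avg_rating']}, engagement {p['engagement']}"
          " -- consider promoting it")
```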

It isn't a question of whether to gather and analyze data; it's a question of how. If you do these four things, you'll have a good start toward gathering meaningful data that can help you make better content decisions. I've glossed over a lot of nuances and ignored other ways to compare and contrast data points. I'll cover those another time. Or you can just read our book.

5 Comments
  1. March 21, 2010 12:02 am

    You know, I have to tell you, I really enjoy this blog and the insight from everyone who participates. I find it to be refreshing and very informative. I wish there were more blogs like it. Anyway, I felt it was about time I posted. I've spent most of my time here just lurking and reading, but today for some reason I just felt compelled to say this.

  2. March 22, 2010 1:14 pm

    Hi James, Frank, and Cynthia,

    Thanks for your thoughtful response to my post about using data to inform content and design decisions. As you note, I’m not a data hater, and I like your introduction to useful data sources. For content strategy, I like to go a bit deeper with the qualitative data and recently shared some tips in the article Testing Content Concepts.

    Looking forward to your book. Sounds like a positive step toward using data the RIGHT way.


  3. March 22, 2010 1:23 pm

    To me the secret is analyzing data to make informed decisions. Actionable insights based on data trends are what we should all strive for. The reality is most people just see data spikes and drops and immediately jump to conclusions without researching what caused the data reaction and determining whether it is repeatable. I'll give you an example: we might have a blogger link to our page this month, send 300 referrals, drive our organic ranking up, etc., but what can we do about it next month when he/she doesn't blog about our content? An interesting spike in the data, but hard to make actionable unless you have a team of people to chase bloggers about posting content (a bad approach if you ask me).


  1. Book Review Part IV: Content Strategy for the Web–Content Audits « Writing For Digital
  2. The 3 I’s of Smarter Content: Instrumented, Interconnected, Intelligent « Writing For Digital
