3 Studies Show Critical Mass for Outside-In Marketing

Erin Kissane said something recently that shocked me:

“I don’t understand your research.”

This came during a talk at the October Minneapolis Content Strategy Meetup. Now, Erin is one of the smartest people I know in the content strategy field, and author of a great little book: The Elements of Content Strategy. So when she doesn’t get it, we have a problem. Obviously, I have not done a good job of explaining my research. I thought I’d take this opportunity to explain it, not only in terms of what I’ve written both here and in our book, but in terms of what other people are writing about it.

This week I was pleased to come across three good resources that explain the basic principles of my research from different points of view. They don’t all refer to my work specifically, but they are essentially about the same thing: what I call outside-in marketing. Before getting into them, let me take another crack at explaining what I do.

Outside-in marketing is the method of learning the language of clients and prospects and using their language to develop content they can relate to on their own terms.

We learn the language of clients and prospects through keyword research and social media listening. We analyze the research to develop content strategies that will tend to attract and captivate that audience with digital experiences. We publish and iterate on those experiences to develop healthy relationships with that audience–relationships based on trust that will ultimately lead to stronger business results.

Outside-in marketing is a radical concept for some marketers, perhaps because they learned more traditional inside-out marketing. In inside-out marketing, we develop products to differentiate our brand from competitors. We build that brand by persistently pushing our messages into the market, primarily with advertising. Then we try to reuse the same messaging in our digital experiences.

When you try to reuse inside-out messages in digital, lots of undesirable things can happen. Typically, the only people who find and use these digital experiences are existing customers who are thoroughly entrenched in your branded nomenclature. The content might rank well in search, but branded words don’t have much demand beyond those who already know what you offer. If you want to attract new people who might not know what you offer, you need to use their language. If you try to slap their language on inside-out pages, it’s even worse: either you rank well with very high bounce rates or you don’t rank at all. Since Google released Panda, the latter is more often the case.

Outside-in marketing is a tacit acknowledgement that most people use search to find content. Content helps them learn about product categories; use those product categories to solve their problems; compare and contrast products; and ultimately purchase products. Your goal is to create the content they’re looking for to do each of these activities on their terms. My research is about matching the grammar of search queries to the activities audiences want to participate in, and developing content strategies that will help them do those activities.
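To make that concrete, here is a minimal sketch of what matching query grammar to audience activities might look like. The patterns and example queries are hypothetical stand-ins of my own, not the actual taxonomy from my research:

```python
import re

# Hypothetical mapping of query-grammar patterns to the activities
# named above (learn, solve, compare, purchase). The patterns are
# illustrative, not a validated taxonomy.
ACTIVITY_PATTERNS = [
    (re.compile(r"^(what is|what are|why)\b"), "learn"),
    (re.compile(r"^how (to|do i)\b|\bfix\b|\btroubleshoot\b"), "solve"),
    (re.compile(r"\bvs\b|\bversus\b|\bcompar|\bbest\b"), "compare"),
    (re.compile(r"\bbuy\b|\bprice\b|\bpricing\b|\bcost\b"), "purchase"),
]

def classify_query(query: str) -> str:
    """Guess which activity a search query's grammar expresses."""
    q = query.lower().strip()
    for pattern, activity in ACTIVITY_PATTERNS:
        if pattern.search(q):
            return activity
    return "unknown"

for q in ["what is a content management system",
          "how to fix duplicate content",
          "acme cms vs widgetco cms",
          "acme cms pricing"]:
    print(q, "->", classify_query(q))
```

Group your keyword research by buckets like these, and the content strategy follows: each activity calls for a different kind of page.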

As I said, I was pleased to see some independent support for this approach. After the jump, I will give short descriptions of and links to that research.

1. Lou Rosenfeld recently gave at Web 2.0 the best presentation I have seen on how to use data to inform content strategy decisions. Even without speaker notes, the presentation is a treasure trove of outside-in marketing best practices. Here is a partial list:

  • How you can discover what users want in their own words by using Site Search Analytics
  • How you can architect a site based on query grammar
  • How jargon (i.e. branded terms) is the enemy of usability
  • How to segment the audience by their interests and develop content geared towards different audiences
  • How search analytics can help you prioritize content and remove underperforming content

After viewing the presentation, I can think of no better site analytics tool than site search analytics. But though Lou did a great job of showing how to use site search analytics, one thing is missing from his presentation: it only covers the people who already interact with your site. What about all the people with whom you want to develop a relationship but who haven’t found your site yet? The answer: you can apply the same methods he demonstrates to external Google search data.
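To give you a flavor of how simple a first pass can be, here is a minimal sketch of the kind of site search analysis Lou describes. It assumes a hypothetical CSV export with query and results columns; your analytics tool’s field names will differ:

```python
import csv
from collections import Counter

def analyze_site_search(log_path, top_n=20):
    """Surface top queries (what users want, in their own words) and
    zero-result queries (content gaps) from a site search log."""
    queries, zero_results = Counter(), Counter()
    with open(log_path, newline="") as f:
        for row in csv.DictReader(f):
            q = row["query"].strip().lower()
            queries[q] += 1
            if int(row["results"]) == 0:
                zero_results[q] += 1
    print("Top queries (the audience's own words):")
    for q, n in queries.most_common(top_n):
        print(f"{n:6d}  {q}")
    print("Zero-result queries (candidate content gaps):")
    for q, n in zero_results.most_common(top_n):
        print(f"{n:6d}  {q}")

analyze_site_search("site_search_log.csv")  # hypothetical export file
```

Point the same counting at external keyword data (for example, from Google’s keyword research tools) and you cover the people who haven’t found your site yet.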

2. The Content Marketing Institute just came out with a great guide to using content optimization tools for better writing. The only negative review I’ve read of our book was from a novice writer who wanted practical guidance for using our strategy to help with her writing. She found our book wanting for this. Well, this article can help point the way. It shows you how to focus on the words and phrases your target audience uses to build more compelling content experiences for them.

The authors did a study comparing writing by those who used a content optimization tool and those who didn’t. The tool used by study participants is InboundWriter, which gives writers real-time search and social intelligence as they compose. It plugs into WordPress and other authoring environments. Using the tool increased traffic to some of the pages by up to 30 percent.

All that said, the authors of the study are also the developers of the tool. So they have a vested interest in promoting a tool that does this. I’m just pleased that there is a market for the kind of tool I’m helping to develop within IBM–one that will not only give writers guidance on word choice and usage, but will govern that usage across an enterprise.

3. Brafton published one of the best content and search infographics I’ve seen. It’s part of a great blog post about the importance of quality content. To quote the post:

The bottom line is that a positive search experience translates into users’ ability to find quality information that answers their queries. Top-notch content is what searchers want, and it’s what search engines want to prioritize in results.

I couldn’t have said it better myself. The post, and especially the infographic, will take you some time to consume. But they are well worth it.

As we said in the book, the winner of the search wars will ultimately be the engine that provides the highest-quality results for its users. When we wrote that, it was almost a joke, because Google results were polluted with content farms. Bing gained market share because users found its results more relevant and of higher quality. So Google responded with Panda, which has progressively favored higher-quality results. This is great for users and white-hat SEOs, and bad for black-hat SEOs.

It also puts a premium on good content. There are no shortcuts to attracting and captivating an audience of willing participants. You need to learn their information preferences and develop content for them. In marketing, this approach is called outside-in marketing. Erin, if you’re out there, I hope this helps.

The Beatitudes of Digital

Navigating the collaborative culture is one of the most difficult challenges for digital creatives–designers, UX people, content strategists, coders, etc. We care about doing good work. We are passionate about it. This passion can clash with the passions of other creatives, resulting in a lot of conflict. This conflict can be heightened if we collaborate remotely. Isolation often amplifies rather than pacifies conflict. And we are not just judged by our teammates. We are judged by the results of our work. Results can be our harshest critics.

I have found in my long career that high-functioning creative teams share an essential trait: their members have a high emotional quotient (EQ). They are able to give and take constructive criticism gracefully. They are able to state their views without digging in. They are able to see others’ perspectives and sacrifice their own for the good of the team. They pick their battles. They don’t get mired in their own turf. They don’t hold grudges or carry prejudices. They give their teammates the benefit of the doubt.

I can think of no better source of the attitudes that lead to higher EQ than the Beatitudes. This might seem strange to you, especially if you are not Christian. But I am not writing this to evangelize. I think the Beatitudes have universal appeal regardless of your religion, or lack thereof. They transcend faith and reason, appealing to our gut responses to life’s challenges. To me, the Beatitudes are an approach to the visceral reactions that affect behavior more than we might want to admit.

There’s perhaps no better proving ground for them than digital, which is ripe for emotional meltdowns that can scuttle a team or a project. I hope you find them as helpful as I do. If you’re interested, please read on.

Blessed are the humble*

This is perhaps the most important attitude to have in digital. We are all learning. Digital is so new, we can’t pretend to be gurus or experts. We have to humbly accept failure and keep trying, without feeling wounded or deflated. We also can’t be too attached to our successes because digital is changing so fast, expertise is fleeting. Accepting failures and faults and being willing to work on them is a necessary first step to success in digital.

Blessed are they who mourn

This is perhaps the most conceptually challenging Beatitude. Nobody likes grief, so how could it lead to being blessed? I don’t think it’s about wallowing in sadness. It’s about letting go. What is the point of mourning? Mourning is a productive way of dealing with grief. It sure is more productive than stuffing grief deep down and trying to ignore it. That just leads to blowing conflict out of proportion later.

Grief is a necessary part of the digital process. We all have our “babies”–digital artifacts that we are especially attached to. Perhaps we worked really hard on them. Perhaps we put our heart and soul into them. But if they don’t work, we have to be prepared to kill them. The faster we iterate, the more of our babies we will need to kill and the more grief we will deal with. If we can learn to mourn our babies and let them go, we can eliminate the emotional roller coaster that plagues creatives, especially in digital.

Blessed are the meek

When I was editor of ComputerUser, I often let my passion for the product get the better of me. I recall storming into the art director’s cube and saying such things as “It’s my name on the masthead!” That kind of stuff never worked. Slowly, over time, I have learned to put on a milder appearance, even as the storms rage inside me. My next challenge is calming the storms themselves. I still have a long way to go. But those I admire most have a way of calmly and dispassionately stating their position. Their quiet confidence speaks louder than any of my rants.

Blessed are they who hunger and thirst for righteousness

To me, righteousness in digital just means doing things the right way. Since we only learn how to do things the right way by doing them, failing, and doing them better, we need to be hungry and thirsty for righteousness to continue to learn and grow. It’s not easy. You really have to want to do things the right way to keep from getting demoralized.

Blessed are the merciful

When a team makes mistakes despite its best efforts, the tendency is to blame one another. Even if we determine that the whole project went into the ditch because of one person, blaming and shaming doesn’t help anyone. The only productive attitude is to forgive, hope the person learns from it, and fix the problem in the next iteration.

Blessed are the clean of heart

This is the hardest attitude to adopt. Again, I personally have a long way to go. Still, it’s a worthy ideal to strive for. Adopting an attitude of kindness and compassion towards all your teammates–regardless of history, personalities, or flaws of character–allows you to work as a team. It is the condition for all the other Beatitudes.

Blessed are the peacemakers

Sometimes we find ourselves on the sidelines as two teammates get mired in an impasse. Unless we are able to mediate and break the impasse, the whole team grinds to a halt. Peacemakers are the quintessential team players.

Blessed are they who are persecuted for the sake of righteousness

Leadership consultant David O’Brien writes extensively about a caustic attitude common in all walks of business–the critic. These are people who resent success and do little but criticize people, projects and products. In online forums, we call them trolls. The curious thing is that trolls tend to gravitate towards the people who do things the right way, who have a history of success, or who have demonstrated strong leadership. The point is, if you are committed to doing things the right way, you will attract critics or trolls. After all my years, I consider their presence a sign that we are on the right track. Rather than letting critics derail you, consider them a cost of doing digital the right way.

The Beatitudes are not the only way to salve the emotional wounds we suffer in the digital creative process. Humor helps a lot. Personal connections–reinforced in social events outside the pressure cooker of the virtual office–are critical. But the Beatitudes are ways individuals can learn to deal with the emotional toll digital projects take. They surely help me a lot.

*Bible scholars will cry foul already. The text reads “Blessed are the poor in spirit.” But I have been reflecting on this phrase for years, and it seems to me that what He means by poor in spirit is humble. So allow me to make a creative translation.

Siri is the Killer App

Last year at this time, I wrote the following in a post entitled 4 Ways to Avoid Chasing the Algorithm on this blog:

Years down the road, Google might not even be the search leader. But search will be the preferred way to find information for a large and growing majority of users. Sooner than you might think, users will have a Watson in their pockets: A computer that has the best available answer for every question. As search engines approach the Watson ideal, and more users access the web through mobile devices, we think users will ever more prefer to search for information rather than browse or navigate.

Little did I know just how soon that would happen. Three months later, Apple acquired Siri, a program that does just what I predicted in this quote, and incorporated it (her?) into the iPhone 4S (the S is for Siri) in October of 2011. I never dreamed that a pretty good facsimile of what I predicted would ship within a year of the prediction.

I was actually doubtful that Siri did what I had predicted until recently, when Apple released its Q4 results, including this quote:

The Company sold 17.07 million iPhones in the quarter, representing 21 percent unit growth over the year-ago quarter.

Apple stock took a hit when it released the iPhone 4S rather than something more ambitious. Little did investors know just how ambitious putting a Watson in users’ pockets would be. And little did investors know that having a Watson in your pocket is a killer app. Now they do. In less than two thirds of a quarter, Apple sold more iPhones than it had in the full quarter the year before. It will be interesting to see how many more people buy iPhones in Q1 2012 than Q1 2011. I’m predicting a huge increase.

Futurists have long predicted a voice-activated computer, fueled in part by Star Trek. What gives Siri so much appeal is that voice is the preferred interface into a phone. Typing has always been challenging on smart phones. Also, screen real estate severely limits navigation and point-and-click UI. So it makes sense that the technology would appear first on the phone. I expect it to migrate to iPads and other tablet devices before taking hold in PCs as well. The appeal goes well beyond Internet search: The ability to find files, run programs, and execute common commands with your voice is a big time saver.

Of course, Google will not take this news lying down. It has been widely rumored that Google will incorporate a similar feature into Android. Not only does it need a voice app to compete with Apple for smart phone business, it needs voice-activated search: most of the growth in search is in mobile search. Reading between the lines, this is at least a contributing factor in its aggressive strategy with Panda and semantic search, Google+ and Search Plus Your World. Google doesn’t want a Siri clone; it wants something that beats Siri by delivering better, more personalized mobile search results through a voice interface, exclusively on Android.

All this is good news for users, and a cautionary tale for SEOs and content strategists. We should be asking ourselves how our content works on mobile devices and in mobile use cases, particularly how it is accessible through a voice interface. We should be asking ourselves how our content sounds, not just how it looks. We should be asking ourselves how queries change when spoken rather than typed. We should be asking ourselves whether our content is shareable (i.e., whether people will want to share it) when Twitter and Google+ are the primary ways they share.

These are huge questions that crack the very foundations of digital media, which, until now, was primarily about parsing text through visual interfaces. I won’t provide the answers in this post. Just know that I will begin exploring these questions in future posts. Stay tuned.

Content Strategy is the New SEO

Happy New Year! I have been taking a nice long holiday break and am now ready to get back into blogging. I haven’t been idle over the break. I’ve mostly been writing for other projects, such as my InformIT page and a video lecture series I am getting close to releasing.

In the course of the research for these two projects, I made a startling discovery: The Google Panda algorithm is a radical attempt to equate content quality with SEO, as much as an automated system can do so. I knew that Google said this about content quality when it released Panda, and I even wrote about it on this blog. But I didn’t understand the inner workings of how Google makes this happen. Plus, I didn’t really believe that you could develop an algorithm that truly favored content quality until I started researching the way Panda is built.

Savvy readers will notice I used the present tense in the previous paragraph–that was intentional. Google has developed Panda to be continually improved. Panda has reached version 2.6 since its initial release in February of 2011, and Google issued 30 new improvements in December alone. The approach is called machine learning–a method borrowed from artificial intelligence for training a system to improve continually. It’s similar to the method IBM used to hone Watson’s Jeopardy!-playing skills.

Automation is necessary because of the sheer volume of content on the web. But the real key lies in the inputs Google gave to the algorithm–and the way it analyzed those inputs–before honing it through machine learning. The inputs were derived from perhaps hundreds of quality testers, who rated thousands of pages for content quality. Google crunched that data to derive some rules. Then the machine-learning program honed the rules, and continues to hone them over time, getting ever more accurate. The end result is an algorithm that places a premium on content quality over the simplistic checklists and other tools common to traditional SEO.
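To illustrate the pattern (and only the pattern, since Google’s actual signals and implementation are not public), here is a toy sketch of how human quality ratings can train a model that then scores pages no human has rated. The three features are invented for illustration:

```python
# A toy illustration of the Panda pattern: humans rate a sample of
# pages, a model learns which measurable signals track those ratings,
# and the model then scores pages no human has rated. The features
# here are invented; Google's actual signals are not public.
from sklearn.linear_model import LogisticRegression

# Per-page features: [reading ease, ad-to-content ratio, duplication]
rated_pages = [
    ([0.9, 0.1, 0.0], 1),  # rated high quality by human testers
    ([0.8, 0.2, 0.1], 1),
    ([0.7, 0.3, 0.2], 1),
    ([0.3, 0.7, 0.6], 0),  # rated low quality
    ([0.2, 0.8, 0.9], 0),
    ([0.4, 0.6, 0.7], 0),
]
X = [features for features, label in rated_pages]
y = [label for _, label in rated_pages]

model = LogisticRegression().fit(X, y)

# Score an unrated page. Retraining as new ratings arrive is the
# continual-improvement loop described above.
print("P(high quality):", model.predict_proba([[0.6, 0.4, 0.3]])[0][1])
```

Retraining on fresh ratings is what makes the system improve continually: the rules are never frozen, which is exactly why they can’t be reverse engineered.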

SEOmoz’s Rand Fishkin does the best job of explaining what this means for SEOs in the future.

Note: content quality is as much a function of your whole collection of pages and other assets as it is of any particular page. It won’t do to focus all your attention on the top-level portal and let the lower-level pages go. The quality of your whole corporate site stands or falls as a collection. Though some pages can perform better than others, poor-quality pages, duplicates, and other features common to poor content strategy pull the whole collection down. The antidote to poor content quality is good content strategy.

6 Ways Google Killed SEO And What to Do About It

If I seem absent from this site, it is only because most of my work is now published at Biznology. On that blog, I am following a long thread about how to optimize digital experiences for Google after the death of SEO. SEO as we know it is dead. But attracting an audience through Google is not optional. So how do we do it in the post-SEO age? That is the point of my monthly posts at Biznology.

Occasionally, I find myself with fresh insights that don’t quite fit into the flow of that blog, so I write them here. This is one such post. It came about when I was in residence for a week at the IBM Design Lab in New York. In the course of my discussions with key collaborators there, I came to realize that the Biznology thread is a bit too narrow. There I have mostly focused on how Panda and Penguin have killed SEO. But those algorithm adjustments are only two of six seismic changes in Mountain View that have altered the algorithm in ways one cannot reverse engineer. I’d like to highlight all six in this post.

First, a bit of terminology. By “SEO” I mean the attempt to reverse engineer Google’s algorithm and build pages that will tend to rank well on that basis. Traditionally, this has been about learning the rules Google used to rank one page higher than another, all things considered, and trying to follow those rules. SEOs chased the algorithm by keeping up with how the rules changed: new rules were added, existing rules were given different weights, and so on.

Well, in the last several years, Google has added other factors and filters that are not rules-based at all. It was never a good idea to chase the algorithm when it was rules-based. Now that it is ever less rules-based, chasing the algorithm is a fool’s errand. But as I say, ranking well in Google for the words your target audience cares about is not optional. So how do you do it post-SEO?

1. Performance metrics

It was three years ago that I first heard from Avinash Kaushik how Google rewards pages that have high click-through rates (CTR) and low bounce rates on its search engine results pages (SERPs). (Fortunately, we were able to include this in our book.) What does this mean? Well, if your page performs well by conventionally accepted metrics, it will rank better over time. This makes sense because high CTR and low bounce rates are indicators of relevant content. Google is effective to the extent that it serves relevant results to its users. Its results will tend to get more relevant over time if they promote content that performs well in these key metrics.

Note that performance is not rules-based. With this filter, Google is saying it really doesn’t care how you perform well. It is enough merely to do so. And there is no way to fake performance. The only way to maintain good ranking with this filter is to provide relevant experiences to your users, and relevance is too highly varied to put into a discrete set of rules.
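For illustration, here is a sketch of how these two metrics are computed and how they might nudge a ranking score. The formula and weights are my own invention; nobody outside Google knows how (or how much) these signals are actually weighted:

```python
# How the two metrics are computed, plus an invented re-ranking nudge.
def ctr(impressions, clicks):
    """Click-through rate: clicks divided by SERP impressions."""
    return clicks / impressions if impressions else 0.0

def bounce_rate(visits, bounces):
    """Share of visits that leave after viewing a single page."""
    return bounces / visits if visits else 0.0

def performance_boost(impressions, clicks, visits, bounces,
                      w_ctr=0.5, w_bounce=0.5):
    """Positive values nudge a page up over time; negative, down."""
    return (w_ctr * ctr(impressions, clicks)
            - w_bounce * bounce_rate(visits, bounces))

# A page with weak CTR and a high bounce rate drifts down:
print(performance_boost(impressions=1000, clicks=80, visits=80, bounces=20))
# 0.5 * 0.08 - 0.5 * 0.25 = -0.085
```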

2. Quality signals

I have written extensively about how Panda affects results, so I won’t belabor the point here. The basic principle is that Panda uses crowdsourcing and machine learning to make ever more accurate assessments of the quality of the content in its results. It then rewards high-quality content and punishes low-quality content.

Again, this can’t be done with rules. There is no one standard of high-quality content that satisfies the various contexts in web publishing. But there are certain signals or hallmarks of quality content that Google can learn through its quality testers and look for. Once it sees those hallmarks, it rewards the pages that manifest them.

3. Semantic smarts

After IBM Watson won at Jeopardy!, I wrote in this blog how I thought future search engines would use semantic smarts to rank pages, rather than the dumb syntactic pattern matching they used at the time. SEO used to be about getting the exact phrases in the right places on pages, and then varying the language on the page with different phraseology so as not to look like you were trying to game the system.

Semantic search renders all that advice garbage. It’s not about exactly matching the syntax (the actual strings of letters and spaces) of the keywords your target audience cares about. It’s about matching the semantics of what they type in search. Synonyms can have entirely different syntax. There’s no use trying to reverse engineer a semantic algorithm. You just have to write naturally in a way that is relevant to your target audience and forget about building pages solely with exact-match keywords. Of course, it’s still important to have exact keywords in title tags and meta descriptions. But otherwise, it’s not that simple.
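To see the difference, consider this toy contrast between syntactic matching and a crude form of semantic matching. Real semantic search uses far richer models than a synonym table; this is illustration only:

```python
# Toy contrast between syntactic matching (exact strings) and crude
# semantic matching. The synonym table is a stand-in for illustration.
SYNONYMS = {
    "cheap": {"inexpensive", "affordable", "low-cost"},
    "laptop": {"notebook"},
}

def expand(term):
    """A term plus its known synonyms."""
    return {term} | SYNONYMS.get(term, set())

def syntactic_match(query, text):
    """The old way: does the exact keyword string appear?"""
    return query.lower() in text.lower()

def semantic_match(query, text):
    """Crude semantics: every query term, or a synonym, appears."""
    words = set(text.lower().split())
    return all(expand(term) & words for term in query.lower().split())

page = "our affordable notebook ships free"
print(syntactic_match("cheap laptop", page))  # False: no exact string
print(semantic_match("cheap laptop", page))   # True: synonyms match
```

A page written naturally in the audience’s vocabulary matches in ways exact-phrase engineering never can.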

4. Overoptimization signals

Google has been wise to SEOs for some time. As with virus writers and antivirus software makers, the dance has always been about keeping one step ahead of Google, building pages that Google does not penalize for overt uses of SEO. There was a point at which Google caught up with and overtook the SEOs, some time in 2011, when it recognized patterns in the way SEOs responded to algorithm changes. Using machine learning again, Google discovered a way to see patterns in SEO behavior and thwart them before they became widespread. The upshot: if you find a way to game the system and it becomes even remotely widespread, Google will discover it and write countermeasures into its algorithm. These countermeasures are not typically rules-based anymore.

For example, a couple of years ago, I was on a call with a group, and a cocky guy who had recently read a blog post by a prominent SEO was touting the use of ALT attributes to pump up keyword density in pages. By the time I studied this in greater detail, it had become more of a negative signal for Google than a positive one. In other words, Google was punishing sites for using ALT attributes to pump up keyword density, similar to what it had done with hidden text a decade before. The difference since 2011 is that Google actually built a machine-learning program to detect algorithm hacks and punish sites that use them programmatically.
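If you want to audit your own pages for this before Google does, a simple check is easy to write. Here is a sketch; the red-flag threshold is arbitrary, and whatever detector Google built is certainly more sophisticated:

```python
# A sketch of the kind of check you could run on your own pages to
# catch ALT-attribute keyword stuffing. The threshold is arbitrary.
from html.parser import HTMLParser

class AltKeywordAudit(HTMLParser):
    def __init__(self, keyword):
        super().__init__()
        self.keyword = keyword.lower()
        self.hits = 0
        self.images = 0

    def handle_starttag(self, tag, attrs):
        if tag != "img":
            return
        self.images += 1
        alt = dict(attrs).get("alt") or ""
        self.hits += alt.lower().count(self.keyword)

html = '<img alt="buy widgets"><img alt="widgets widgets widgets">'
audit = AltKeywordAudit("widgets")
audit.feed(html)
density = audit.hits / max(audit.images, 1)
print(f"{audit.hits} keyword uses across {audit.images} images")
if density > 1.5:  # arbitrary red-flag threshold
    print("Possible ALT-text keyword stuffing")
```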

5. Link building signals

When Google first started, it was like every other search engine in most ways: it ranked pages by systems of rules based on the syntax of the text on pages. The reason Google dominated the search game was not that it did this part of search better than the others. It was that Google also looked at links. Links are signals that tell Google the context of the page it is trying to rank. It rewards pages that are contextually relevant to the search query, all things considered, based on the quantity and authority of links into the page.

Clever SEOs discovered how to game this system by buying or swapping links. Eventually, it got so bad that links were practically meaningless for Google, and it was back to dumb pattern matching based on syntax rules. That was the case until Penguin, when Google found a way to detect and foil apparent link-building activities in the algorithm. This is a form of overoptimization using links. Penguin thwarts apparent link building as surely as Panda thwarts keyword stuffing in ALT attributes, using the same machine-learning techniques.
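The idea behind weighing the quantity and authority of inbound links is PageRank. Here is a minimal power-iteration sketch based on the originally published algorithm; Google’s production system is vastly more elaborate:

```python
# A minimal sketch of link-based ranking: a page is authoritative if
# authoritative pages link to it. Simplified power iteration over the
# originally published PageRank formula.
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = set(links) | {p for targets in links.values() for p in targets}
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        # Every page keeps a base share; the rest flows along links.
        new = {p: (1 - damping) / len(pages) for p in pages}
        for page, targets in links.items():
            if targets:
                share = damping * rank[page] / len(targets)
                for t in targets:
                    new[t] += share
        rank = new
    return rank

links = {"a": ["b", "c"], "b": ["c"], "c": ["a"], "d": ["c"]}
for page, score in sorted(pagerank(links).items(), key=lambda kv: -kv[1]):
    print(page, round(score, 3))
```

Notice that rank flows through links: a link from a page that is itself well linked counts for more, which is why bought links from thin sites stopped paying off once Penguin could spot the pattern.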

6. Copyright infringement

Google will continue to add ways of thwarting those who game the system as soon as their tactics become known. One common way to game the system is to copy the content from a top-ranking page, build a better-optimized page with the same content, and publish it as original. Google recently announced that it has developed a way of detecting copyright infringement and severely punishing sites that appear to engage in these activities. Again, there is no single rule that can help a machine know which of two pages is the original and which is the copy. Rather, there are certain hallmarks of infringement that Google looks for. Woe unto the sites that manifest these signals.

How to rank well post-SEO

I am writing a new book about this topic, so, again, I won’t belabor the point. Simply put, there is no better approach to ranking well in Google than to build honest and transparent websites that attract, excite, and compel your target audience to engage with them. This might go against the grain of your approach to marketing if it is typically based on hyperbole. But in the age of algorithms that use performance metrics, machine learning, and semantic smarts, it is the only effective way to do digital marketing.

How Search and Social are Interdependent

When I say that search and social are interdependent, I don’t just mean that any effective digital strategy ensures that you do both well. Of course, they are both strong drivers of relevant traffic to your sites. But I also mean that they are interrelated. That is, you can’t do search effectively without an effective social strategy, and you can’t do social effectively without an effective search strategy. Since this is a controversial position, allow me to ’splain: I’ll sum up now and explain after the jump.

In brief, social content is findable: it is built to demonstrate relevance and to provide context for the audience, where search engines are a proxy for the audience. When you jump into the middle of a conversation, it can seem confusing. You come to understand the full conversation by searching for content related to it. Building findable content begins with keyword research, which is a form of audience analysis. Building social campaigns also begins with keyword research, which gives you cues about the conversations your audience engages in. Building shareable content depends on understanding those conversations.

Findable content is shareable: It is parsed into modular chunks that can be easily shared from within pages. Pages are just carriers of shareable content that demonstrate relevance and provide context for the audience. Pages full of relevant, shareable content become link bait, which then rank better over time in search engines because the audience is effectively voting for the content by sharing it and otherwise building links into it from external sources.

It all fits together. But I admit the summary view might be a bit dense for those who have not read our book Audience, Relevance and Search: Targeting Web Audiences with Relevant Content. If you need further explanation, please read the example after the jump.

You’re doing it wrong: How not to build a socially aware microsite

I recently engaged in a conference call that took at least a year off my life. I was asked to join the call because the site designer wouldn’t listen to the designated search consultant. So we were both on the call, attempting to play bad cop/worse cop with this obstinate and arrogant designer.

The designer presented a page that was simply a set of tiles, four wide and four deep. Some tiles were videos. Some were screen captures of tweets. Some were thumbnails of infographics. There was no original content on the page at all. Everything was curated from some other social source. Nothing on the page indicated the context of these items, either to each other or to the larger set of conversations. The page was entirely devoid of text of any kind to help build that context. And none of the items was shareable, either in the sense that you could click a share button near it or in the sense that anyone would want to share it (in the unlikely event visitors stayed on the page long enough to try).

The first question I asked him was what the purpose of the page was supposed to be. He said the point of the experience was to foster social conversations around the products owned by the stakeholders sponsoring the work. I tried to explain to him that the page as designed would not accomplish the goal because:

  1. The page would not be findable in search: Without any discernible message, there is no way search engines would rank the page anywhere near the top of the results (and beyond the first page, rankings are functionally irrelevant). Ranking on the first page in Google gives you a shot at the credibility you need to gain the trust of the audience. For anonymous experiences, it is the only way to have a chance. From there, perhaps you could develop trust and loyalty over time, if you include shareable content and highlight the work of a few relevant experts.
  2. The page did not contain shareable assets: It’s a simple thing to add share buttons to the assets on the page, but nobody shares anything if they don’t know the context of the item. In social settings, such as Twitter, the context is partly determined by the Twitter handle of the person sharing the piece. This signal is only as strong as the credibility of the one sharing the item. Not only was the page itself anonymous, but it would not be given credibility simply because it was on the web. In short, shareability is not just about the message, it’s about the messenger. There was nothing on the page itself that gave the user a sense of the credibility of the messenger.

When I say that the designer was obstinate, I mean that he would not listen to all the evidence we provided that this design was DOA. We provided data point after data point of designs like his that had failed, and how they evolved to be more effective through agile iterations. Invariably, these evolutions involved adding some static text to the upper left portion of the white space, which explained what the page was about and why the audience should pay attention to it, in plain language.

The experiences were optimized as the content the pages carried became ever more tightly relevant to the pain points or top tasks of the target audience in the defined context. We presented dozens of examples of pages that get tens of thousands of search referrals and hundreds of downloads, shares, and conversions per month after similar evolutions. The designer would not be convinced.

When I say that the designer was arrogant, I will give you his own words: “You brought me in to create a next-generation experience…. Your experiences are tired and boring.” The gist was that his design was web 3.0, whereas our UX best practice is so web 2.0, and including static text on the page is so web 1.0. Meanwhile, the collective web-effectiveness wisdom on the call, between the two search SMEs and the product owner, was 45 years. He was fresh out of college with a degree in web design. One wonders what they teach aspiring web designers in college if this is what they learn.

As an aside, see my blog post about the sorry state of web development skills. One of the contributing factors seems to be that colleges are not teaching what works in actual cases, but what looks really cool and what might work. I have worked with many designers right out of college who had similar attitudes to this designer, but perhaps not to this degree. Those who succeeded quickly learned to be more pragmatic and less dogmatic. When they do, they learn to design their pages to be findable and the content on the page to be shareable. As long as they do that, they can do all the cool stuff they want to make the site look good.

To his credit, the designer eventually agreed to go back to the drawing board, after several rounds of bad cop/worse cop. I told the product owner after the call that I was pleased to get some blog fodder for my time and trouble, because “blogging is better than therapy or alcoholism.”

Book Review: Content Strategy by Bailie and Urbina

It’s not every day that I’m extensively interviewed for a book. And it’s even more rare that I thoroughly approve of the book in which I am interviewed. So I’m thrilled to have the opportunity to read and review Content Strategy: Connecting the dots between business, brand, and benefits by Rahel Bailie and Noz Urbina.

My updated bookshelf with Content Strategy taking its rightful place.

The book is a comprehensive approach to corporate content strategy from the perspective of two seasoned consultants, with decades of hard-won content strategy experience between them. (Full disclosure: I am a friend of Rahel’s and her publisher sent me the book for review.)

The best part of the book is its collection of case studies, which show how companies large and small have used content strategy to improve their businesses. The message that comes out of these stories, and is reinforced through clear and compelling prose, is that content is one of your most precious corporate assets. Investing in good content strategy doesn’t just help companies save costs over time, it helps them drive revenue, build brand loyalty, and manage risk and compliance. And the alternative to good content strategy can be disastrous.

One of the biggest challenges for content strategists is convincing their executives to invest in the people and tools they need to produce, publish, and maintain quality content for customers. The authors do a great job of building effective business cases based on the often underappreciated value of content. After the jump, I’ll outline three ways the book helps content strategists demonstrate the value of their work.

Cutting costs

Of course, content is very expensive. Good writing is one of the few things that can’t be automated. We can use tools like Acrolinx to automate some of the editing and translation efforts. We can implement governance to ensure that we are not creating duplicate content. We can figure out ways of building responsive designs that enable more automated content sharing and curation. We can reduce call volumes at the support centers by answering customer questions with better content experiences. All these efforts will help you cut costs, and so pay for themselves over time. But there’s no substitute for client-centric, clear, concise, compelling, credible, conversational, and clean content.

Quality content is expensive to produce because it requires a lot of smart people with excellent writing, editing, and content strategy skills. If the only way you can get funding for content strategy is through cost savings, one key expectation is that you will reduce head count for writers and editors. This can ultimately hurt the business by reducing content quality over time. That’s why I’m pleased that the authors spend so much space on other ways content can generate return on investment (ROI).

Driving revenue

I’m especially pleased that the authors gave our practices at IBM so much space in the book (pp. 107-109). It all started with a meeting with Rahel over breakfast at Intelligent Content 2010, in which I explained how we get funding for strategic content initiatives at IBM–revenue. The way we grow our business with content is by mining the search and social behavior of our target audience (mostly prospects), and building content experiences for them. If these experiences help them complete their information tasks in a pain-free way, they start to develop loyalty to our brand. This loyalty results in completed response forms on our site, which results in new leads for our business. When new leads result in sales, we grow our business.

Of course, we also need to close the loop with existing customers. This means improving the customer experience with content for the entire customer journey, from learning to solving to comparing to purchasing to installing to optimizing to getting support, and looping back to learning again when it’s time to upgrade. Every customer who has an excellent content experience with the dozens of assets she touches in her journey becomes an advocate for the brand.

Building brand loyalty

This is really where the book shines. It is unique in stressing the long view when it comes to building content strategies that result in ROI. On the ever-more-social web, customer loyalty is expressed through content. It is the way that clients and prospects help each other make better purchasing decisions. In this environment, bad content experiences do damage not only to that one customer’s loyalty, but to everyone in her network. Quality, findable, shareable content is no longer optional. It’s table stakes. If you want to win, you need to invest more than table stakes. You need to differentiate yourself from the competition by building excellent content experiences across the whole customer lifecycle. The book makes a compelling case for this, and helps content strategists tailor this message for their executives.

Conclusion

That’s only a small snapshot of a book to which one review could not do justice. It’s not just about ROI; it’s about best practices and governance and content management and taxonomy and SEO and translation and…. If I had one complaint, it’s that the book is a bit overwhelming. I found myself skipping and skimming a lot over aspects of the book that don’t apply to my work. And that’s OK. Good books help readers get what they need out of them. This book does that for a wide range of readers, in start-ups and large enterprises and everything in between. So I will leave you to the task of getting what you need out of the book.

I want to close with one admonishment: if you’re serious about content strategy, this book is not optional.

What is Relevance, Again?

Since before I started this blog with my co-author Frank Donatone, I’ve been engaging in a long and fruitful virtual debate with a group of people I lovingly refer to as the search haters. My latest blog about this can be found on Biznology: “Five Critical Roles that Need SEO Skills.” Not that the group of search haters is organized or has its own user group. But there is a long line of folks who are willing to trash the practice of SEO on the basis of two facts:

  1. SEO has sometimes been practiced by unscrupulous agencies to try to gain unfair advantage for their clients, thus this is what most SEO amounts to
  2. Search results are sometimes wildly irrelevant to search queries, thus search is not all that helpful in providing relevant content to audiences

I write this in the hope that I might influence a few search haters into a more sympathetic understanding of SEO. As the Biznology post above indicates, I spend the majority of my time training folks on SEO. Much of that training counters myths 1 and 2 above. If I can preempt some of it by influencing a few people now, I just might be able to get down to business with new hires in digital marketing sooner.

A Smashing Debate

Since I wrote the above blog post, several of my colleagues have alerted me to a couple of long and detailed blog posts on Smashing Magazine. The first is called “The Inconvenient Truth about SEO.” In it, author and apparent search hater Paul Boag makes some good points about the way SEO is sometimes practiced. But he also makes some logical and factual errors. Most of those errors were well countered in a follow-on post called “What The Heck Is SEO? A Rebuttal.”

The most important is the counter to point 1 above. Authors Bill Slawski and Will Critchlow rightly say that this is a straw man. Most SEO is in fact practiced by people who only want the search traffic commensurate with the value of their content, using legitimate means of attaining it. SEO spam is like junk mail or email spam: even though it is not representative of all SEO, we remember SEO spam (aka black-hat SEO) because it is so annoying. So our tendency is to overgeneralize from black-hat SEO to all SEO. The authors also did a good job of curating the results of a poll of SEOs describing what SEOs actually do.

I highly recommend that you read both posts, especially the accounts of what SEOs actually do in the rebuttal. As an SEO, I do all of those things and then some. The picture that emerges is that SEOs are really just digital strategists who will do whatever is needed to ensure that clients get ROI for their web development efforts. Since most people search for information “often or always,” being available in search results for the queries your target audience cares about is job 1. So, as I describe in Biznology and elsewhere, the role of an SEO is helping everyone else on the team understand how their work affects search results, i.e., training.

Still, the rebuttal is incomplete. I won’t take Boag’s post apart in detail. But I do want to point out a fallacy, in the hope that it will illuminate why myth 2 above is so commonly held. Here is what Boag says:

Your objective should be to make it easier for people who are interested in what you have to offer to find you, and see the great content that you offer. Relevant content isn’t “great content”. Someone searches for a pizza on Google, and they don’t want prose from Hemingway or Fitzgerald on the history and origin of pizza — they most likely want lunch. An SEO adds value to what you create by making sure that it is presented within the framework of the Web in a way which makes it more likely that it will reach the people that you want it seen by, when they are looking for it.

What is Relevance, Again?

First of all, I completely agree with everything in the above quote except the bold part: “Relevant content isn’t ‘great content’.” The way I read it, he is saying that content need not be great in order to be relevant. Considering that I say content quality is a proxy for relevance, that statement is a problem for me.

Let’s revisit our definition of relevance. Content is more or less relevant to the audience to the extent that:

  1. It maximizes the audience’s ability to achieve their information goals
  2. It minimizes the effort required by the audience to achieve those goals

We unpack these two conditions in probably more detail than most readers of our book need. But if you are interested in the complete picture, see Audience, Relevance and Search. For most of you, it suffices to say that content is optimally relevant if it helps the audience get the information they need in the shortest possible time. (Note that it sometimes takes longer to grasp overly condensed text, so I don’t say “in the smallest possible space.”)
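If it helps to see those two conditions as a formula, here is a toy formalization of my own. It is an illustration, not a measure we propose in the book:

```python
# A toy formalization of the two conditions: relevance grows with the
# share of the information goal achieved and shrinks with the effort
# required. The formula is my own illustration, not from the book.
def relevance(goal_achieved, effort_minutes):
    """goal_achieved in [0, 1]; effort_minutes > 0."""
    return goal_achieved / effort_minutes

print(relevance(1.0, 2.0))  # 0.50: complete answer, found quickly
print(relevance(1.0, 5.0))  # 0.20: same answer, more effort
print(relevance(0.5, 2.0))  # 0.25: partial answer, same effort
```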

There is a reading of Boag in which his quote agrees with our definition. If by placing quotes around “great content” he means to connote “literary masterpieces,” then fine. A small percentage of your audience on the web is looking for highly crafted, poetic prose. An even smaller percentage is looking for long-winded stories told from a fictional voice. Highly relevant content on the web is typically brief, to the point, and abundantly clear. (Note that this does not make it boring. It is the antithesis of boring to the audience in that it answers their most pressing questions.)

I spent entirely too much space in the book explaining how web content is fundamentally unlike print content partly to emphasize this point. On the web, readers are in charge of the story. It’s their story. The writer must try to understand readers well enough to figure out what they need to complete their story, and provide it in the easiest and quickest way. Turns of phrase and other poetic language tend to reduce relevance on the web by introducing ambiguity into a fundamentally literal medium. Worse still, internal company jargon and other brain-dead colloquial language (e.g., “leverage,” “paradigm shift,” “next generation”) defeats relevance.

If this is what Boag means, then I agree completely with his quote. But, if this is what he means, why then does he take the side of the search hater? We published our book in 2010. I’ve spoken about it at high-end conferences a dozen times. The whole industry has rallied behind the vision outlined in the book (whether they were aware of it or not). The search engines have followed suit with algorithm changes like Panda that reward relevant content as we define it and punish black hat SEO. Most decent SEOs practice it as we preach it (again, whether they’re aware of our book or not).

Can we please dispense with the myths so we can give SEO its rightful place in digital strategy?

You’re Doing it Wrong: How to Build a Great Career

I’ve had a rant stewing in me since I saw a TED Talk by Larry Smith on why you will fail to have a great career. The gist of his unpleasant talk is that you will fail to have a great career because you will compromise on what you are most passionate about. And if you don’t do what you’re most passionate about, you will never have a great career.

When I first saw the video, I said “Yes!” So much so that I posted it on my Facebook page. But after a few days, lingering doubts about it caused me to delete it. These doubts have only grown in the intervening months. It took an interview with Martha Stewart in Parade magazine this past Sunday to inspire me to express these doubts in a blog post. She says:

My father was the smartest guy, he said: ‘you can do anything you set your mind to.’

I know you have all heard these words from your parents and teachers. And I don’t want to discourage you from pursuing your dreams. But I’m here to tell you if you insist on doing whatever it is that you are passionate about, you are more likely to fail to have a great career. Great careers are made by people who listen to what the world needs and who learn to provide those things. They are not necessarily made by people who create things they are passionate about and hope the world needs them. If you think that is the way things work, you are setting yourself up for disappointment.

It certainly helps to be passionate about what you do. It is important to any happy career. And you probably can’t have a great career if your work makes you miserable. But learning to do unpleasant things well, and learning to enjoy success in things for which you are not gifted are essential to cultivating a great career. If you only do what you like to do and what comes easily to you, you are likely to fail. This is the gist of my rant against Larry Smith.

Two wrong turns in the pursuit of a great career

I was a bright-eyed college student, going to school on my own nickel, working two and sometimes three jobs while taking a full load. I was technically a pre-architecture student, meaning I was taking all the core classes one takes in preparation for entering a design school. I had wanted to be an architect from the time I was 8 years old because the profession represented that Ancient ideal combination of art and science. I loved to draw and I was good at math–Rain Man good. So it seemed like the ideal career for me. I pursued it with gusto.

When it came time to apply for design schools, I applied to two. I got an early acceptance from the Rhode Island School of Design (RISD). Based on that, I assumed my home school (the University of Minnesota) would accept me. I was years ahead of other applicants in math, science, and art education, and my portfolio was praised by the RISD acceptance committee. But I really wanted to go to the U of M. With confidence, I turned down RISD and waited for my acceptance letter from the U of M. It never came. To this day, I don’t know why. I was devastated.

I took a year off to consider my options. I had a philosophy minor while I pursued pre-architecture (again, following the Ancient ideal). So when I went back to school, I became a philosophy major. I excelled, pulling a 3.9 over the two years left for my B.A. And I loved it. I reveled in abstract thinking and debating. My teachers said I had a chance to do great things in philosophy, encouraging me to apply to several grad schools. I was accepted at a few and wound up at my home school, the U of M.

I worked on a PhD in philosophy for seven years. I can’t say I was a top student. But I did good work. I served as a teaching assistant and instructor in 20 classes. I got scholarships and fellowships. I got my M.A. I was all-but-dissertation (ABD). I thought I was on my way. But the department didn’t think so. I was one of several colleagues who were told, “You will no longer be allowed to pursue a PhD at this institution.” I was devastated.

I had pursued my passions. I had focused on what I was good at. I had followed my heart. And I was 0-2 with twelve years of post-secondary education and a mountain of student debt. In both cases, I consoled myself with wise words from mentors and advisors. One architect said, “I was a top student and graduated with a B. Arch. with honors. I’ve been a mere draftsman since, working for little better than minimum wage for 15 years.” A philosophy PhD had a similar story: “I went to the top school, had a top 10 advisor, published 10 papers and a book in my first five years out of grad school. But I spent my first 10 years wandering from one-year appointment to one-year appointment.” He was one of the lucky ones. When I was shown the door, there were 350 philosophy PhDs in the United States without any kind of teaching position.

The right approach to a great career

At 31, I changed my strategy out of necessity. The new strategy was simple: listen to what the world needs and learn how to provide it. I took the first job I could find: an editing job at the campus newspaper. Meanwhile, I entered a degree program in Scientific and Technical Communication. My new goal was to do what no one wanted to do and no one seemed to do well: tech journalism. I got a reporter job at the paper covering the science and tech beat. I learned. It was difficult. I was never good at English growing up. In fact, I am dyslexic. But I kept at it. Slowly, my career grew. I had many setbacks. But I eventually got a break. I was hired as the managing editor of ComputerUser magazine, and a month later, the editor in chief (EIC) quit. I got his job. And my career has taken off from there.

It doesn’t just take off on its own, however. You have to continually listen to what the world needs and learn to provide it. I won’t bore you with all the twists and turns of my career. But one in particular is instructive. At a certain point, I became the EIC of ibm.com. We had a survey on our site that asked people whether they had achieved their goals. If not, we asked them follow-up questions. When I started, content quality and search were the two most prevalent reasons people had not achieved their goals. After two years in which I focused on content quality, the survey indicated content quality was no longer a significant issue. But search remained one. So I shifted my whole focus to search and learned everything I could about how to improve our clients’ search experiences. In the process, I wrote the book on the subject (with the help of my co-authors), and I continue to grow my subject matter expertise. The point is, careers evolve. If you continue to listen and learn, you can proactively evolve your career, rather than letting it evolve in ways that restrict your opportunities.

Two key traits of a great career

You might wonder how I could choose to do something that stretched my skills so severely. How does a dyslexic man become an EIC? The thing is, after 20 years in this career, I am actually better at writing and editing than I ever was at math. I now struggle to tutor my son in geometry and trig, two subjects I aced when I was young. Why? Because the brain is a flexible organ. It will grow and develop in the ways you want it to. (Conversely: use it or lose it.) It takes long hours of practice and hard work. But eventually you can do it. In this respect, Martha Stewart’s father is right. You can do whatever you set your mind to. But you need not have passion for it first. You can develop a passion for the things the world needs you to do.

You might also wonder how someone can succeed without initial passion for something. That is also not easy. But passions are transient. Even those that naturally spring from your heart need to be cultivated, lest they become stale. Boredom is a self-fulfilling prophecy. But if you really take an interest in your subject, it will begin to delight and fascinate you. That is what happened to me with technology, journalism, and search. And this fascination continues, as humans continue their relentless pursuit of knowledge, constrained only by Moore’s Law. Also, there is no escape from tedious work. The trick is to learn how to enjoy what might seem tedious to others. By learning to love work that others find boring, you will never be short of opportunities.

I want to close with one thought: Some of you might see a connection between the theme of our book and the theme of this blog post. The book is based on the notion that before you create content, it’s important to listen to what your audience needs. That is much more effective than writing what suits your fancy, publishing it, and hoping someone will find it useful. The most effective tool for this kind of listening is keyword research. Listening for opportunities to grow your career is a bit more challenging than keyword research. I recommend watching how skills on LinkedIn grow and shrink in popularity; it is a good source of listening data on which skills the marketplace needs most. A toy sketch of that kind of listening follows.
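To make the idea concrete, here is a minimal sketch of what such listening might look like. The postings and skill names are invented, and this does not call any LinkedIn API; in practice you would gather skill counts over time by hand or from exported data.

```python
# A toy sketch of "listening" for skill demand. The data is invented;
# this does not query LinkedIn, it just shows the shape of the analysis.

from collections import Counter

# Hypothetical skill lists pulled from job postings in two different years.
postings_by_year = {
    2012: [["seo", "writing"], ["seo", "analytics"], ["java", "seo"]],
    2013: [["content-strategy", "analytics"], ["content-strategy", "seo"],
           ["analytics", "content-strategy"]],
}

def skill_counts(postings):
    """Count how many postings mention each skill."""
    counts = Counter()
    for skills in postings:
        counts.update(set(skills))
    return counts

# Compare years to see which skills are growing and which are shrinking.
earlier = skill_counts(postings_by_year[2012])
later = skill_counts(postings_by_year[2013])
for skill in sorted(set(earlier) | set(later)):
    print(skill, later[skill] - earlier[skill])
```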

The Psychology of Digital Content

I had the great fortune to attend the Cognitive Colloquium in early October of this year at the IBM Watson Research Center in Yorktown Heights, NY. It was one of those life-changing moments when you feel like you’re sitting on top of a mountain and you can see much more distant horizons. In my case, the horizon I saw involved using some of my mental energy to solve the grand problems of digital content using the methods of cognitive computing.

What are these methods? Well, at IBM, we describe cognitive computing as a cluster of practices that use machine learning, natural language processing, and high-performance computing to change the way computers work and how humans work with them. Heady stuff, I know.

Before you abandon this blog for more comfortable pursuits, please consider a ready example of this in Watson, the supercomputer that competed on Jeopardy! last year and beat the top champions the show had ever had. The IBM team taught Watson the rules of the game, and he proceeded to improve his play through many months of live competition leading up to the televised show. He used natural language processing to understand the clues presented by the host and to devise likely questions for them. He used machine learning to get better and better at the game. He is now being employed in medicine, marketing, and several other specialized domains, including our line of work.

Thinking, Fast and Slow by Daniel Kahneman

Job 1 for my new mission was to read Nobel laureate Daniel Kahneman’s thick book, Thinking, Fast and Slow. Kahneman was a keynote speaker at the Cognitive Colloquium, and his talk triggered several new insights in me about the relationship between human psychology and content strategy. As I read the book (primarily on the train between my home in Beacon, NY and Grand Central Station), I continued to solidify those insights. I can now articulate several of them. In the interest of space, I will cover just one here, for the content strategists who are likely to read this blog. If you’re still interested, please read on.

(If you’re interested in the complete set, look for my forthcoming book: Outside-In Marketing: Using Big Data to Drive Your Content Marketing. I also highly recommend reading Kahneman when you find yourself with a hundred hours or so of unstructured time.)

The central framework of Thinking, Fast and Slow

The central thesis of Kahneman’s life’s work, spanning more than forty years of research across fields too numerous to list, is a kind of mental dualism. Our minds have two distinct systems, which Kahneman calls System 1 and System 2.

System 1 is the set of processes that happen automatically, in a flash. They are so automatic, we often can’t recall afterwards intending to do them. We just do them. Examples include the habits of driving, like putting on your turn signal prior to a turn. You don’t have to think about it, you just do it. Most of our lives and much of our communication is governed by System 1. We are faced with so much uncertainty in life and it all comes at us so fast, we need a system to make sense of it in the rough. Kahneman calls System 1 “a machine for jumping to conclusions,” because that is what it does. It judges things automatically before all the data are available.

System 2 is the logical and systematic part of our minds, the part cognitive scientists have modeled since the discipline was conceived. Though it is accurate and precise, it is slow and lazy. There are times when we doubt the knee-jerk responses System 1 provides, and those are the times we engage System 2 to analyze all the facts at hand and make a reasoned decision. But System 2 is so lazy that we don’t use it as much as philosophers and other idealists like to believe. In the book, Kahneman documents decisions made by experts in a variety of fields that are based almost entirely on System 1 thinking, laced with the biases System 1 uses to jump to conclusions.

Kahneman was the keynote speaker at the Cognitive Colloquium because his framework serves as a new way to model human thinking. As he said, “If you want to build systems that think like humans, start with understanding how humans think.”

Computers have always been devices that needed to be right all the time, without fail. So of course we patterned them after System 2 thinking. The trouble is, it takes huge supercomputers to do somewhat ordinary human tasks, like scanning encyclopedic knowledge for a likely question that matches a cryptic answer. Watson takes up a decent-sized room and consumes massive amounts of electricity. The machines of tomorrow need to get ever smaller and more efficient, approaching the efficiency of the human brain. To do that, we need to build systems that do much of their work the way System 1 does: fast and imprecise. Only when accuracy is needed will they engage System 2.
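To make the pattern concrete, here is a minimal sketch of a fast-but-imprecise path backed by a slow-but-precise one. Everything in it, from the function names to the cache and the word-overlap scoring, is an invented illustration of the general idea, not how Watson or any IBM system actually works.

```python
# A minimal sketch of the fast/slow pattern described above. All names
# and data are hypothetical; this is an illustration, not a real design.

def system1_guess(clue, cache):
    """Fast and imprecise: answer from memory with a rough confidence."""
    answer = cache.get(clue)
    return answer, (0.9 if answer is not None else 0.0)

def overlap(clue, candidate):
    """Crude relevance score: count of words shared by clue and candidate."""
    return len(set(clue.lower().split()) & set(candidate.lower().split()))

def system2_search(clue, corpus):
    """Slow and precise: exhaustively score every candidate in the corpus."""
    return max(corpus, key=lambda candidate: overlap(clue, candidate))

def respond(clue, cache, corpus, threshold=0.5):
    """Take the cheap path when confident; engage the expensive path only
    when accuracy demands it."""
    guess, confidence = system1_guess(clue, cache)
    return guess if confidence >= threshold else system2_search(clue, corpus)

cache = {"familiar clue": "a cached answer"}
corpus = ["a cached answer", "an exhaustive search"]
print(respond("familiar clue", cache, corpus))                        # fast path
print(respond("novel clue about exhaustive search", cache, corpus))   # slow path
```

The design choice is the point: the cheap path handles most requests, and the expensive path runs only when confidence is low.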

Practice: How do users interact with websites?

Beyond its implications for cognitive computing, some of Kahneman’s work has direct practical applications for content strategy. Indeed, his framework can be used to approximate how users consume websites. Consider this scenario:

Lizzy is a highly educated millennial who works as an editor in the publishing field. She searches for “structured mark-up” in Google and gets a ton of results. She scans the first search engine results page (SERP) and clicks the most likely link without really reading the results. When she lands on the page, she scans it to determine if it is worth the effort. She decides that it is, and begins reading the long-form content on the page.

What does Lizzy’s mental state look like? She uses both System 1 and System 2 in the course of her information journey. System 1 is the primary mechanism of her scanning and clicking behavior. Scanning SERPs and clicking is so familiar to Lizzy that it’s like using her turn signal while driving: she doesn’t need to think about it. System 2 is what she uses to read and digest the content.

A whole UX discipline has grown out of Steve Krug’s imperative, Don’t Make Me Think. If you make Lizzy think when she lands on your page, you force her to engage System 2, which is slow and lazy. Not only is Lizzy in a hurry, she really doesn’t want to waste mental energy either. If you force her to think, she will jump to the conclusion that your page is not relevant before even engaging System 2, and she’ll bounce back to the search engine to try another result.

When Lizzy does find your page relevant, she is ready to engage System 2. That means providing enough data, case studies, and other supporting material to help her complete her information task. Once she engages System 2, she does not want to go back to the SERP again; ideally, she can get everything she needs on your site. Once System 2 is engaged, long-form content is what she needs.

For the longest time, we have had a raging debate in our field over whether users read on the web. All kinds of studies showed that “users don’t read” on the web; they just scan. I have tried to replicate these studies on ibm.com, with mixed results. After analyzing the results, I came to a conclusion that seems obvious after the fact: if you get the Lizzy use case right, users do read on the web. They’ll even download a longish whitepaper and read it if it is relevant and compelling. But if you don’t get the Lizzy use case right, they bounce off your page before reading, regardless of how close the content is to their query.

I have not done a complete analysis. Provisionally, the studies that suggest users just scan on the web suffer from the fallacy of small samples. They happened to choose content that was not easy to scan as the basis for the studies, which forced users to do something they were not willing to do: engage System 2 before deciding whether the content was worth their time and attention. Because those users never relented and engaged System 2, they never “read” in those studies.

As pages improve and the body of evidence approaches critical mass, similar studies have come to different conclusions. Thanks to Kahneman, we now have a framework for understanding them: the inflection point between scanning and reading seems to be a System 1 process that determines whether a page is worth a user’s time and attention.

Theory: Digital content relevance works like typical human psychology

Those of you who are familiar with my work know I have based much of it on Relevance Theory, which is a kind of psychology of communication. It is the keystone of my book Audience, Relevance, and Search: Targeting Web Audiences with Relevant Content. The theory defines relevance as a sliding scale with two extent conditions, which I sketch below:

  1. The stronger the cognitive effect in the audience, the more relevant the linguistic artifact is to that audience.
  2. The more effort a linguistic artifact requires, the less relevant it is.

A cognitive effect is just a change in the mind of the audience. When we learn or are influenced or make a decision, there is a corresponding cognitive effect. Most of these are small and incremental. Some are breakthroughs. All things considered, breakthroughs are more relevant than small changes to our attitudes. The actual theory is quite a bit more complex than this, but we can gloss over that complexity for the time being.
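As a rough caricature, and only that (Sperber and Wilson never reduce the theory to arithmetic), the two extent conditions can be sketched as a ratio: effects push relevance up, effort pushes it down.

```python
# A toy caricature of the two extent conditions, not Sperber and Wilson's
# actual formalism: relevance rises with cognitive effect and falls with
# the effort required to process the artifact.

def relevance(cognitive_effect, processing_effort):
    """Score a linguistic artifact for one audience member.

    cognitive_effect: rough size of the change the content produces in the
                      mind (e.g., 1 for a small update, 10 for a breakthrough).
    processing_effort: rough cost of extracting that effect (must be > 0).
    """
    if processing_effort <= 0:
        raise ValueError("processing_effort must be positive")
    return cognitive_effect / processing_effort

# A breakthrough buried in an opaque page can score lower than a modest
# insight that is nearly free to process.
print(relevance(cognitive_effect=10, processing_effort=8))  # 1.25
print(relevance(cognitive_effect=3, processing_effort=1))   # 3.0
```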

As I read Kahneman’s book for the first time, it struck me that Sperber and Wilson—the authors of Relevance Theory—were describing communication in terms of System 1 and System 2. They just hadn’t made that connection. When they talk about cognitive effects, they are talking about System 2. Relevance Theory is based on work by H.P. Grice that describes how we reason when we communicate. Because reasoning falls into System 2, cognitive effects are, by definition, System 2 processes.

The extent condition that is more interesting to me is the one about effort. It seems to me that determining whether a page is nominally relevant—that is, whether it is worth the effort at all—is a System 1 process. Content buried within an opaque UX could answer Lizzy’s questions exactly, but she will judge it irrelevant in a flash if it lacks the visual cues System 1 requires: tight, punchy headings, bolded keywords, and the like; in short, all the things Google’s algorithm looks for.

The one correction I would make to Relevance Theory after reading Thinking, Fast and Slow is to reverse the extent conditions. I would put the one about effort first, because on the web, a page is functionally irrelevant if it doesn’t convince System 1 to devote the effort. And if it requires too much effort, it loses relevance fast. Only after a page is deemed worth the effort do users judge to what extent it is relevant. If the page helps Lizzy make a breakthrough about structured mark-up, it is highly relevant to her. A sketch of this reordering follows.
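Here is the same toy model with the order reversed: effort acts as a System 1 gate, and cognitive effect only matters once the gate opens. The budget and scores are invented for illustration.

```python
# A sketch of the proposed reordering: effort acts as a System 1 gate that
# runs before cognitive effect is ever weighed.

def gated_relevance(cognitive_effect, scanning_effort, effort_budget=2.0):
    """Return 0 when the page fails the fast effort check, no matter how
    large a breakthrough it contains; otherwise let effect drive the score."""
    if scanning_effort > effort_budget:
        return 0.0                  # System 1 bounces: functionally irrelevant
    return float(cognitive_effect)  # System 2 engaged: effect is judged

# The same breakthrough content is functionally irrelevant behind an
# opaque UX...
print(gated_relevance(cognitive_effect=10, scanning_effort=5))  # 0.0
# ...and highly relevant once the page is easy to scan.
print(gated_relevance(cognitive_effect=10, scanning_effort=1))  # 10.0
```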

The blog medium prevents me from saying more. All I hoped to do was plant a few seeds in the minds of enterprising readers who will take these thoughts further than I can in this medium. As I said, I will have a great deal more to say in my book when it comes out this year. In the meantime, if even one reader has a mountaintop experience with this blog, I feel it is doing its job.

James Mathewson is the program director for search and content marketing for IBM.