Posted by StephanieChang
We've entered a fortunate time to be involved in the digital marketing space. Almost half of the global population now has access to the internet, the way consumers engage with content is rapidly evolving, and with that comes an exciting array of challenges and opportunities. This post focuses on the trends that lie ahead for content marketers and the role they play within an organization. A concrete understanding of upcoming trends lays the foundation for defining an organization's content goals and deciding where its resources should be allocated.
Posting new, unique content regularly on your site is NOT enough. Each day, around 92,000 new articles are posted on the internet. Digital media publishers have built systems to produce the greatest amount of content at the lowest price: The Huffington Post produces at least 1,200 pieces of content a day, and Forbes produces 400 (with 1,000 contributors). It's not just publishers, either; WordPress users produce about 35.8 million new posts each month.
Smaller businesses won't be able to compete on sheer volume. So how can a site differentiate itself in this market? This is where a content strategy comes into play. It's extremely helpful to understand a company's unique value proposition, and if the company doesn't have one, to understand where the opportunities are in the space to create one. For B2C companies, that can mean identifying the company's existing target audience and promoting the brand as an advocate for a particular lifestyle. For B2B companies, it is often about positioning your brand as the ultimate authority or source of knowledge in a specific industry or niche.
When developing a content strategy, it's important to evaluate the product that the business sells. Evaluating a product doesn't mean identifying the features or solely understanding the benefits of the product. It actually means understanding the marketability of the product. For instance, is the product a "think" product or a "feel" product? Does the product require high involvement or low involvement from the consumer? Using the FCB grid developed by Richard Vaughn is a useful tactic.
A "think" product is one that a consumer considers heavily before purchasing. These types of products usually involve a large amount of research and personal effort by the consumer before purchase.
A "feel" product is one where emotion plays a pivotal role in the buying process.
A "high involvement" product is one where the consumer is heavily invested in the buying decision. These products are generally more expensive, though not just in a fiscal sense: a high-involvement purchase can also be one that takes significant time to change once made, or one with significant long-term impact. For instance, opening a retirement account is a "high involvement" purchase. A wallpaper purchase is also "high involvement": not necessarily expensive, but time-consuming to change.
"Low involvement" products tend to be more impulsive, spur-of-the-moment purchases. Once a consumer decides they need the product, little time is spent researching, because the consequences of an incorrect decision are small. The price of the product is usually low.
If the product the company sells is a "high involvement"/"think" product, the consumer is going to spend significantly more time researching the product, including reading/watching product reviews, identifying product features, assessing if this purchase is worth the cost, etc. As a result, the content strategy for such a product should involve plenty of information on the product features, the benefits of the product, as well as growing the product and brand awareness, so that consumers will both discover and search for the product.
If the product the company sells is a "low involvement"/"feel" product, more time should be invested in connecting with consumers and appealing to their emotions. These companies should also focus their efforts on building brand loyalty and customer retention, because these products tend to be repeat purchases.
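The two quadrants above (and the two the post doesn't detail) come from Vaughn's FCB grid, which can be sketched as a simple lookup table. The quadrant strategy labels below are the grid's conventional names; the "focus" descriptions are illustrative examples, not prescriptions from this post.

```python
# A minimal sketch of Vaughn's FCB grid as a lookup table. Keys are
# (involvement, think-vs-feel); values are illustrative content focuses.
FCB_GRID = {
    ("high", "think"): "informative: specs, reviews, comparison content",
    ("high", "feel"):  "affective: brand storytelling, emotional appeal",
    ("low",  "think"): "habit formation: reminders, convenience messaging",
    ("low",  "feel"):  "self-satisfaction: loyalty and retention content",
}

def content_focus(involvement, mode):
    """Suggest a content focus for a product's FCB quadrant."""
    return FCB_GRID[(involvement.lower(), mode.lower())]

print(content_focus("high", "think"))
```

Classifying a product into one of these four cells is the first step; the suggested focus then shapes where content effort goes.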
Julian Cole, the Head of Comms Planning at BBH, breaks down this process in great detail in his "Working Out the Business Problems" slide deck.
Traditionally, traffic and page views have been the longstanding metrics by which a piece of content's success is gauged. Although there are clear benefits to increased traffic (such as greater brand awareness and increased or potential revenue for publishers and bloggers), these metrics on their own can be misleading. More importantly, focusing solely on traffic and page views as measures of success can lead to unintended behaviors and misguided motivations. These include an overemphasis on click-worthy headlines, overuse of keywords in titles, and a shift from creating content for users (building for the long term) to creating content for page views (short-term wins).
Ultimately, determining the right metrics for an organization's content depends on the goals for the content. Is it to maintain an engaged community/develop brand advocates, build brand awareness, and/or to convert users into paying customers? Perhaps it is a combination of all 3? These are all difficult questions to answer.
At Distilled, we're currently working with clients to help them define these metrics for their content. Sometimes, the best option is to use a combination of metrics to analyze and target. For some clients, a key metric could combine organic traffic, the percentage of returning visitors, and changes in bounce rate and time on site. For instance, if a user finds exactly what they're looking for and bounces, that's not necessarily bad. Perhaps they landed on an ideal landing page and found the exact information they wanted. That's a fantastic user experience, especially if the user spends a long time on the site or becomes a returning visitor. Looking at any metric in isolation can lead to plenty of wrong assumptions, and while there is no perfect solution, combining metrics can be the next best alternative.
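The combined-metric idea can be sketched as a simple blended score. The weights, targets, and the 30-second bounce threshold below are entirely hypothetical (this is not a Distilled formula); the point is only that no single metric is read in isolation.

```python
# Hypothetical blended content score; weights, the visits target, and
# the 30-second threshold are invented purely for illustration.
def blended_content_score(organic_visits, pct_returning,
                          bounce_rate, avg_time_on_site_sec,
                          visits_target=10_000):
    traffic = min(organic_visits / visits_target, 1.0)   # capped at target
    loyalty = pct_returning                              # 0.0 to 1.0
    # A bounce paired with healthy time on site is fine (the user
    # found what they wanted); only penalize short-visit bounces.
    engagement = 1.0 if avg_time_on_site_sec >= 30 else 1.0 - bounce_rate
    return round(0.4 * traffic + 0.3 * loyalty + 0.3 * engagement, 3)

# 5,000 organic visits, 40% returning, 60% bounce rate, 2-minute visits:
print(blended_content_score(5000, 0.40, 0.60, 120))
```

A high bounce rate alone barely moves this score when visits are long, which matches the landing-page example above.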
For other businesses, social metrics can be a great conversion metric for content pieces. A Facebook like or a Twitter retweet signals some engagement, whereas a share, a comment, or becoming a "fan" of a Facebook page signals a potential brand advocate. Although a share or a new Facebook "fan" may be worth more, all of these activities demonstrate a piece's ability to gain a user's attention, and that awareness is worth something.
Some of the biggest challenges involved in content often have nothing to do with content. For many of my clients, the biggest struggles involve decisions about resource allocation: lack of time to pursue all of the goals, lack of budget to implement strategies in an ideal way, and the constant battle of readjusting priorities. These hard constraints make marketing especially challenging as more channels develop and digital innovation advances so quickly. While there is no perfect solution to this problem, the next best alternative to balancing hard resource constraints against the constant need for innovation is to develop better integration methodologies. A poll of CMOs put integrated marketing communications ahead of effective advertising as the most important thing they want from an agency.
Why is this so important? Because the way consumers shop is changing. Accenture conducted global market research on the behaviors of 6,000 consumers in eight countries. One of the top recommendations was the importance of providing consumers with a "seamless retail experience": an on-brand, personalized, and consistent experience regardless of channel. That seamless experience will require content to be heavily involved across a multitude of channels, from online to in-person, in order to give potential and current customers one consistent conversation.
The chart below shows statistics about the way Millennials shop. Although Millennials tend to be exceptionally digitally savvy (especially when it comes to social media), studies show they still like to shop in brick-and-mortar retail stores. Millennials use the internet to research and review price, products, value, and service, and they have been shown to influence how their parents shop.
The integration of content does not apply to just consumer retail stores. For instance, British Airways has a billboard in London that is programmed to show a kid pointing to a flying British Airways plane every time one passes over the billboard. Here is the video that shows how the billboard works.
Last year, AT&T launched a 10,000-square-foot digitally enhanced store showcasing an apps wall, as well as content dedicated to lifestyle areas like fitness, family, and art. Food52, a start-up food blog that is expanding into ecommerce, is launching a holiday market pop-up store in NYC.
Content Marketing Institute's 2014 Report for B2B content marketers indicates that B2B content marketers still view in-person events as their most effective tactic. The seamless transition of content from online marketing channels (via social media conversations, PPC and display ads, and content on the site via case studies and videos) to in-person conversations and consumer experience will only grow in importance.
Technology and digital innovation are growing rapidly. PCs now make up only a small percentage of connected devices, and wearables and smart TVs are about to go mainstream. As competition for attention increases, companies will be increasingly willing to experiment with content in new mediums to reach their intended audiences.
This graph is just one depiction of how quickly technology evolves. As marketers, the ability to quickly adapt and scale to new trends and opportunities is critical. This past year, the marketing agency SapientNitro released a free, 156-page guide entitled Insights 2013 that discusses some of these trends in detail, such as in-store digital retail experiences, the future of television, sensors and experience design, and customer experience on the move, to name a few.
One of their case studies covers Sephora. Sephora has developed great content in its retail stores, such as interactive kiosks that let users explore different fragrances or learn about skincare. iPads throughout the store provide how-to makeup tips, and items can be scanned to reveal product information. Sephora's mobile app has content that speaks to their core customer base and is in line with their other online and social media content. All of the content can be easily shared via email or through social networks.
Other brands, such as Nivea, have mixed print advertising with mobile innovation. In Nivea's case, the print ad doubled as a solar phone charger.
Finally, PopTopia is a mobile game with a phone attachment, called the Pop Dangle, that emits the smell of popcorn as you play. It works because the attachment plugs into the audio jack and, at a certain audio frequency, is signaled to release the popcorn smell. These examples all show brands that have embraced new mediums for content.
2014 will be an exciting time for the future of content. As technology evolves and competition for user attention increases, marketers need to be agile and adapt to the growing needs and expectations of their customers. The future of a business will depend critically on having a very clear unique value proposition. Why is this so crucial? Because it is the foundation from which marketing strategies and execution grow. Our job as marketers is to use that information to pinpoint the metrics we need to measure and to prioritize all future marketing strategies. This task is very difficult, but our role is to continue to embrace these challenges in order to find solutions. Now is the ideal time to begin.
Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!
Posted by randfish
Sticking to what can be easily measured often seems like the safest route, but avoiding the unknown also prevents some of the happier accidents from taking place. In today's Whiteboard Friday, Rand explains why it's important to invest some of your time and resources in non-measurable, serendipitous marketing.
For reference, here's a still of this week's whiteboard!
Howdy Moz fans, and welcome to another edition of Whiteboard Friday. This week I want to talk about something that we don't usually talk about in the inbound marketing world, because inbound, of course, is such a hyper-measurable channel; at least, most of the investments that we make are very measurable. But I love serendipitous marketing too: investing in serendipity to earn out-sized returns that you might not otherwise be able to earn. That's a tough sell for a lot of management, for a lot of executives, for a lot of marketers, because we're so accustomed to this new world of hyper-measurability. But with a couple of examples, I'll illustrate what I mean.
So let's say we start by maybe you go and you attend an off-topic conference, a conference that isn't normally in your field, but it was recommended to you by a friend. So you go to that event, and while you are there, you meet a speaker. You happen to run into them, you're having a great chat together, and that speaker later mentions your product, your company, your business on stage at the event. It turns out that that mention yields two audience members who become clients of yours later and, in fact, not just clients, but big advocates for your business that drive even more future customers.
This is pretty frustrating. From a measurability standpoint, first off, it's an off-topic event. How do you even know that this interaction is the one that led to them being mentioned? Maybe that speaker would have mentioned your business anyway. Probably not, but maybe. What about these folks? Would those two customers have come to your business regardless? Were they searching for exactly what you offered anyway? Or were they influenced by this? They probably were. Very, very hard to measure. Definitely not the kind of investment that you would normally make in the course of your marketing campaigns, but potentially huge.
I'll show you another one. Let's say one day you're creating a blog post, and you say, "Boy, you know, this topic is a really tough one to tackle with words alone. I'm going to invest in creating some visual assets." You get to work on them, and you start scrapping them and rebuilding them and rebuilding them. Soon you've spent off hours for the better part of a week building just a couple of visual assets that illustrate a tough concept in your field. You go, "Man, that was a huge expenditure of energy. That was a big investment. I'm not sure that's even going to have any payoff."
Then a few weeks later those visuals get picked up by some major news outlets. It turns out, and you may not even be able to discover this, but it turns out that the reporters for those websites did a Google image search, and you happened to pop up and you clearly had the best image among the 30 or 40 that they scrolled to before they found it. So, not only are they including those images, they're also linking back over to your website. Those links don't just help your site directly, but the news stories themselves, because they're on high-quality domains and because they're so relevant, end up ranking for an important search keyword phrase that continues to drive traffic for years to come back to your site.
How would you even know, right? You couldn't even see that this image had been called by those reporters because it's in the Google image search cache. You may not even connect that up with the rankings and the traffic that's sent over. Hopefully, you'll be able to do that. It's very hard to say, "Boy, if I were to over-invest and spend a ton more time on visual assets, would I ever get this again? Or is this a one-time type of event?"
The key to all of this serendipitous marketing is that these investments that you're making up front are hard or impossible to predict or to attribute to the return on investment that you actually earn. A lot of the time it's actually going to seem unwise. It's going to seem foolish, even, to make these kinds of investments based on sort of a cost and time investment perspective. Compared to the potential ROI, you just go, "Man, I can't see it." Yet, sometimes we do it anyway, and sometimes it has a huge impact. It has those out-sized serendipitous returns.
Now, the way that I like to do this is I'll give you some tactical stuff. I like to find what's right here, the intersection of this Venn diagram. Things that I'm passionate about, that includes a topic as well as potentially the medium or the type of investment. So if I absolutely hate going to conferences and events, I wouldn't do it, even if I think it might be right from other perspectives.
I do particularly love creating visual assets. So I like tinkering around, taking a long time to sort of get my pixels looking the way I want them to look, and even though I don't create great graphics, as evidenced here, sometimes these can have a return. I like looking at things where I have some skill, at least enough skill to produce something of value. That could mean a presentation at a conference. It could mean a visual asset. It could mean using a social media channel. It could mean a particular type of advertisement. It could mean a crazy idea in the real world. Any of these things.
Then I really like applying empathy as the third point on top of this, looking for things that my audience has the potential to like, enjoy, or be interested in. So this conference may be off-topic, but knowing that it was recommended by my friend and that there might be some high-quality people there, I can connect up the empathy and say, "Well, if I'm putting myself in the shoes of these people, I might imagine that some of them will be interested in or need or use my product."
Likewise, if I'm making this visual asset, I can say, "Well, I know that since this is a tough subject to understand, just explaining it with words alone might not be enough for a lot of people. I bet if I make something visual, that will help it be much better understood. It may not spread far and wide, but at least it'll help the small audience who does read it."
That intersection is where I like to make serendipitous investments and where I would recommend that you do too.
There are a few things that we do here at Moz around this model and that I've seen other companies who invest wisely in serendipity make, and that is we basically say 1 out of 5, 20% of our time and our budget goes to serendipitous marketing. It's not a hard and fast rule, like, "Oh boy, I spent $80 on this. I'd better go find $20 to go spend on something serendipitous that'll be hard to measure." But it's a general rule, and it gives people the leeway to say, "Gosh, I'm thinking about this project. I'm thinking about this investment. I don't know how I'd measure it, but I'm going to do it anyway because I haven't invested my 20% yet."
I really like to brainstorm together, so bring people together from the marketing team or from engineering and product and other sections of the company, operations, but I really like having a single owner. The reason for that single owner doing the execution is because I find that with a lot of these kind of more serendipitous, more artistic style investments, and I don't mean artistic just in terms of visuals, but I find that having that single architect, that one person kind of driving it makes it a much more cohesive and cogent vision and a much better execution at the end of the day, rather than kind of the design by committee. So I like the brainstorm, but I like the single owner model.
I think it's critically important, if you're going to do some serendipitous investments, that you have no penalty whatsoever for failure. Essentially, you're saying, "Hey, we know we're going to make this investment. We know that it's the one out of five kind of thing, but if it doesn't work out, that's okay. We're going to keep trying again and again."
The only really critical thing that we do is that we gain intuition and experiential knowledge from every investment that we make. That intuition means that next time you do this, you're going to be even smarter about it. Then the next time you do it, you're going to gain more empathy and more understanding of what your audience really needs and wants and how that can spread. You're going to gain more passion, a little more skill around it. Those kinds of things really predict success.
Then I think the last recommendation that I have is when you make serendipitous investments, don't make them randomly. Have a true business or marketing problem that you're trying to solve. So if that's PR, we don't get enough press, or gosh, sales leads, we're not getting sales leads in this particular field, or boy, traffic overall, like we'd like to broaden our traffic sources, or gosh, we really need links because our kind of domain authority is holding us back from an SEO perspective, great. Make those serendipitous investments in the areas where you hope or think that the ROI might push on one of those particularly big business model, marketing model problems.
All right, everyone. Hope you've enjoyed this edition of Whiteboard Friday. We'll see you again next week. Take care.
Posted by Cyrus-Shepard
BuiltWith knows about your website.
Go ahead. Try it out.
BuiltWith also knows about your competitors' websites. They've cataloged over 5,000 different website technologies on over 190 million sites. Want to know how many sites use your competitor's analytics software? Or who accepts Bitcoin? Or how many sites run WordPress?
Like BuiltWith, Moz also has a lot of data. Every two years, we run a Search Engine Ranking Factors study where we examine over 180,000 websites in order to better understand how they rank in Google's search results.
We thought, "Wouldn't it be fun to combine the two data sets?"
That's exactly what our data science team, led by Dr. Matt Peters, did. We wanted to find out what technologies websites were using, and also see if those technologies correlated with Google rankings.
BuiltWith supplied Moz with tech info on 180,000 domains that were previously analyzed for the Search Engine Ranking Factors study. Dr. Peters then calculated the correlations for over 50 website technologies.
The ranking data for the domains was gathered last summer (you can read more about it here), and the BuiltWith data is updated once per quarter. We made the assumption that basic web technology, like hosting platforms and web servers, doesn't change often.
It's very important to note that the website technologies we studied are not believed to be actual ranking factors in Google's algorithm. There are huge causation/correlation issues at hand. Google likely doesn't care much what framework or content management system you use, but because SEOs often believe one technology to be superior to another, we thought it best to take a look.
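For intuition, the kind of rank correlation computed throughout this study can be sketched in a few lines of plain Python. The toy data below is invented, and the actual Moz pipeline surely differs; this just shows mechanically what "correlation between using a technology and ranking position" means.

```python
# Stdlib-only sketch of Spearman rank correlation between a binary
# "site uses this technology" flag and the site's ranking position.
# Toy numbers are invented; this is not the Moz methodology itself.

def ranks(xs):
    """Average 1-based ranks, handling ties."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1          # average of tied positions
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(xs, ys):
    """Spearman rho = Pearson correlation of the two rank vectors."""
    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx) ** 0.5
    vy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (vx * vy)

uses_tech = [1, 0, 1, 1, 0, 0, 1, 0]   # 1 = domain uses the technology
position  = [3, 8, 1, 4, 7, 6, 2, 5]   # lower position = better ranking
print(round(spearman(uses_tech, position), 3))
```

In this toy data the negative rho means technology users sit at lower (better) positions; the near-zero values reported below are what "no relationship" looks like on this scale.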
One of the cool things about BuiltWith is not only can you see what technology a website uses, but you can view trends across the entire Internet.
One of the most important questions a webmaster has to answer is who to use as a hosting provider. Here's BuiltWith's breakdown of the hosting providers for the top 1,000,000 websites:
Holy GoDaddy! That's a testament to the power of marketing.
Webmasters often credit good hosting as a key to their success. We wanted to find out if certain web hosts were correlated with higher Google rankings.
Interestingly, the data showed very little correlation between web hosting providers and higher rankings. The results, in fact, were close enough to zero to be considered null.
Statistically, Dr. Peters assures me, these correlations are so small they don't carry much weight.
The lesson here is that web hosting, at least among the major providers, does not appear to be correlated with higher or lower rankings one way or the other. To put this another way, simply hosting your site on GoDaddy should neither help nor hurt you in the larger SEO scheme of things.
That said, there are a lot of bad hosts out there as well. Uptime, cost, customer service and other factors are all important considerations.
Looking at the most popular content management systems for the top million websites, it's easy to spot the absolute dominance of WordPress.
Nearly a quarter of the top million sites run WordPress.
You may be surprised to see that Tumblr accounts for only 6,400 sites in the top million. If you expand to all known sites in BuiltWith's index, that number grows to over 900,000. That's still a fraction of the 158 million blogs Tumblr claims, compared to the 73 million claimed by WordPress.
This seems to be a matter of quality over quantity. Tumblr has many more blogs, but it appears fewer of them gain significant traffic or visibility.
Does any of this correlate to Google rankings? We sampled five of the most popular CMS's and again found very little correlation.
Again, these numbers are statistically insignificant. It would appear that the content management system you use is not nearly as important as how you use it.
While configuring these systems for SEO varies in difficulty, plugins and best practices can be applied to all.
To be honest, the following chart surprised me. I'm a huge advocate of Google+, but never did I think more websites would display the Google Plus One button over Twitter's Tweet button.
That's not to say people actually hit the Google+ button as much. With folks tweeting over 58 million tweets per day, it's fair to guess that far more people are hitting relatively few Twitter buttons, although Google+ may be catching up.
Sadly, our correlation data on social widgets is highly suspect. That's because the BuiltWith data is aggregated at the domain level, and social widgets are a page-level feature.
Even though we found a very slight positive correlation between social share widgets and higher rankings, we can't conclusively say there is a relationship.
More important is to realize the significant correlations that exist between Google rankings and actual social shares. While we don't know how or even if Google uses social metrics in its algorithm (Matt Cutts specifically says they don't use +1s) we do know that social shares are significantly associated with higher rankings.
Again, correlation is not causation, but it makes sense that adding social share widgets to your best content can encourage sharing, which in turn helps with increased visibility, mentions, and links, all of which can lead to higher search engine rankings.
Mirror, mirror on the wall, who is the biggest ecommerce platform of them all?
Magento wins this one, but the distribution is more even than other technologies we've looked at.
When we looked at the correlation data, again we found very little relationship between the ecommerce platform a website used and how it performed in Google search results.
Here's how each ecommerce platform performed in our study.
Although huge differences exist in different ecommerce platforms, and some are easier to configure for SEO than others, it would appear that the platform you choose is not a huge factor in your eventual search performance.
One of the major pushes marketers have made in the past 12 months has been to improve page speed and loading times. The benefits touted include improved customer satisfaction, conversions and possible SEO benefits.
The race to improve page speed has led to huge adoption of content delivery networks.
In our Ranking Factors Survey, the response time of a web page showed a -0.10 correlation with rankings. While this can't be considered a significant correlation, it offered a hint that faster pages may perform better in search results, a result we've heard anecdotally, at least at the outliers of webpage speed performance.
We might expect websites using CDNs to gain the upper hand in ranking, but the evidence doesn't yet support this theory. Again, these values are basically null.
| CDN | Correlation with rankings |
| --- | --- |
| AJAX Libraries API | 0.031412968 |
| GStatic Google Static Content | 0.017903898 |
While using a CDN is an important step in speeding up your site, it is only one of many optimizations you should make when improving webpage performance.
We ran rankings correlations on several more data points that BuiltWith supplied us. We wanted to find out if things like your website framework (PHP, ASP.NET), your web server (Apache, IIS) or whether or not your website used an SSL certificate was correlated with higher or lower rankings.
While we found a few outliers around Varnish software and Symantec VeriSign SSL certificates, overall the data suggests no strong relationship between these technologies and Google rankings.
| Technology | Correlation with rankings |
| --- | --- |
| Shockwave Flash Embed | 0.046545556 |
We had high hopes for finding "silver bullets" among website technologies that could launch us all to higher rankings.
The reality turns out to be much more complex.
While technologies like great hosting, CDNs, and social widgets can help set up an environment for improving SEO, they don't do the work for us. Even our own Moz Analytics, with all its SEO-specific software, can't help improve your website visibility unless you actually put the work in.
Are there any website technologies you'd like us to study next time around? Let us know in the comments below!
Posted by David-Mihm
Though I no longer actively consult for clients, there seems to have been a significant qualitative shift in local results since Google's release of Hummingbird that I haven't seen reported on search engine blogs and media outlets. The columns I have seen have generally espoused advice on taking advantage of what Hummingbird was designed to do, rather than looking at the outcome of the update.
From where I sit, the outcome has been a slightly lower overall quality in Google's local results, possibly due in part to a "purer" ranking algorithm in local packs. While these kinds of egregious results reported soon after Hummingbird's release have mostly disappeared, it's the secondary Hummingbird flutter, which may have coincided with the November 14th "update," that seems to have caused the most noticeable changes.
I'll be working with Dr. Pete to put together more quantitative local components of Mozcast in the coming months, but for the time being, I'll just have to describe what I'm seeing today with a fairly simplistic analysis.
To do the analysis, I performed manual searches for five keywords, both geo-modified and generic, in five diverse markets around the country. I selected these keywords based on terms that I knew Google considered to have "local intent" across as broad a range of industries as I could think of. After performing the searches, I took note of the top position and number of occurrences of four types of sites, as well as position and number of results in each "pack."
| Keywords | Markets | Result Type Taxonomy |
| --- | --- | --- |
| personal injury lawyer | Chicago | national directory (e.g., Yelp) |
| assisted living facility | Portland | regional directory (e.g., ArizonaGolf.com) |
| wedding photographer | Tampa | local business website (e.g., AcmeElectric.com) |
| electrician | Burlington | barnacle webpage (e.g., facebook.com/acmeelectric) |
| pet store | Flagstaff | national brand (e.g., Petsmart.com) |
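With a taxonomy like this, the manual tallying can be sketched in a few lines of Python. The domain-to-type mapping and the sample SERP below are hypothetical stand-ins for the hand-collected data:

```python
from collections import Counter

# Hypothetical mapping from ranking domains to the result-type taxonomy above.
TAXONOMY = {
    "yelp.com": "national directory",
    "arizonagolf.com": "regional directory",
    "acmeelectric.com": "local business website",
    "facebook.com": "barnacle webpage",
    "petsmart.com": "national brand",
}

def tally_result_types(serp_domains):
    """Count occurrences of each result type and record its top position."""
    counts = Counter()
    top_position = {}
    for position, domain in enumerate(serp_domains, start=1):
        result_type = TAXONOMY.get(domain, "other")
        counts[result_type] += 1
        top_position.setdefault(result_type, position)
    return counts, top_position

# Example: a made-up top-5 organic result set for one query.
counts, tops = tally_result_types(
    ["yelp.com", "petsmart.com", "acmeelectric.com", "yelp.com", "facebook.com"]
)
```

Run across all 50 SERPs, a tally like this is enough to produce the frequency and top-position observations discussed below.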
I also performed an even smaller analysis using three keywords that returned carousel results (thanks to SIM Partners for this sample list of keywords): "golf course," "restaurant," and "dance club."
Again, this was a very simple analysis, by no means intended as a statistically significant study. I fully realize that these results may be skewed by my Portland IP address (even though I geo-located each time I searched for each market), data center, time of day, etc.
I'll share with you some interim takeaways that I found interesting, though, as I work on a more complete version with Dr. Pete over the winter.
If anything, Hummingbird or the November 14th update seem to have accelerated the trend that started with the Venice update: more and more localized organic results for generic (un-geo-modified) keywords.
But the winners of this update haven't necessarily been small businesses. Google is now returning specific metro-level pages from national directories like Yelp, TripAdvisor, Findlaw, and others for these generic keywords.
This trend is even more pronounced for keywords that do include geo-modifiers, as the example below for "pet store portland" demonstrates.
Results like the one above call into question Google's longstanding practice of minimizing the frequency with which these pages occur in Google search results. While the Yelp example above is one of the more blatant instances that I came across, plenty of directories (including WeddingWire, below) are benefiting from similar algorithmic behavior. In many cases the pages that are ranking are content-thin directory pages - exactly the kind of content whose visibility Panda, and to some extent Penguin, was supposed to minimize.
Overall, national directories were the most frequently occurring type of organic result for the phrases I looked at - a performance amplified when considering geo-modified keywords alone.
National brands are underrepresented as a result type due to my 'personal injury lawyer,' 'electrician,' and 'wedding photographer' keyword choices. For the keywords where there are relevant national brands ('assisted living facility' and 'pet store'), they performed quite well.
While a number of thriving directories were wiped out by the initial Panda update, here's an area where the Penguin and Hummingbird updates have been effective. There are plenty of examples of high-quality regionally focused content rewarded with a first-page position - in some cases above the fold. I don't remember seeing as many of these kinds of sites over the last 18 months as I do now.
Especially if keywords these sites are targeting return carousels instead of packs, there's still plenty of opportunity to rank: in my limited sample, an average of 2.3 first-page results below carousels were for regional directory-style sites.
While Mike Blumenthal and Darren Shaw have theorized that the organic algorithm still carries weight in terms of ranking Place results, visually, authorship has been separated from place in post-Hummingbird SERPs.
Numerous "lucky" small businesses (read: well-optimized small businesses) earned both organic and map results across all industries and geographies I looked at.
The overwhelming majority of packs seem to be displaying in position 4 these days, especially for "generic" local intent searches. Geo-modified searches seem slightly more likely to show packs in position #1, which makes sense since the local intent is explicitly stronger for those searches.
Together with point #3 in this post, this is yet another factor that is helping national and regional directories compete in local results where they couldn't before - additional spots appear to have opened up above the fold, with authorship-enabled small business sites typically shown below rather than above or inside the pack. 82% of the searches in my little mini-experiment returned a national directory in the top three organic results.
This is REALLY hypothetical, but prior to this summer, the number of Place-related results on a page (whether blended or in packs) seemed to depend largely on the quality of Google's structured local business data in a given geographic area. The more Place-related signals Google had about businesses in a given region, and the more confidence Google had in those signals, the more local results they'd show on a page. In smaller metro areas for example, it was commonplace to find 2- and 3-packs across a wide range of industries.
At least from this admittedly small sample size, Google increasingly seems to show a consistent number of pack results by industry, regardless of the size of the market.
| Keyword | # in Pack | Reason for Variance |
| --- | --- | --- |
| assisted living facility | 6.9 | 6-pack in Burlington |
| electrician | 6.9 | 6-pack in Portland |
| personal injury lawyer | 6.4 | Authoritative OneBox / Bug in Chicago |
This change may have more to do with the advent of the carousel than with Hummingbird, however. Since the ranking of carousel results doesn't reliably differ from that of (former) packs, it stands to reason that visual display of all local results might now be controlled by a single back-end mechanism.
This is more of an observational bullet point than the others. While there were plenty of localized organic results featuring small business websites, these tended to rank lower than well-optimized national directories (like Yelp, Angie's List, Yellowpages.com, and others) for small-market geo-modified phrases (such as "electrician burlington").
For non-competitive phrases like this, even a simple website with no incoming links of note can rank on the first page (#7) just by including "Burlington, VT" in its homepage Title Tag. With just a little TLC - maybe a link to a contact page that says "contact our Burlington electricians" - sites like this one might be able to displace those national directories in positions 1-2-3.
Look at the number of times Facebook and Yelp show up in last year's citation study I co-authored with Whitespark's Darren Shaw. Clearly these are major "fixed objects" to which small businesses should be attaching their exoskeletons.
Yet 74% of searches I conducted as part of this experiment returned no Barnacle results.
This result for "pet store chicago" is one of the few barnacles that I came across - and it's a darn good result! Not only is Liz (unintentionally?) leveraging the power of the Yelp domain, but she gets five schema'd stars right on the main Google SERP - which has to increase her clickthrough rate relative to her neighbors.
Interestingly, the club industry is one outlier where small businesses are making the most of power profiles. This might have been my favorite result - the surprisingly competitive "dance club flagstaff" where Jax is absolutely crushing it on Facebook despite no presence in the carousel.
I have to admit, I don't really know the answer to this question yet. Why would Google downgrade the visibility of its Place-related results just as the quality of its Places backend has finally come up to par in the last year? Why favor search-results-in-local-search-results, something Google has actively and successfully fought to keep out of other types of searches for ages? Why minimize the impact of authorship profiles just as they are starting to gain widespread adoption by small business owners and webmasters?
One possible reason might be in preparation for more card-style layouts on mobile phones and wearable technology. But why force these (I believe slightly inferior) results on users of desktop computers, and so far in advance of when cards will be the norm?
At any rate, here are five takeaways from my qualitative review of local results in the last couple of months.
Andrew Shotland already said it in the last section of his Search Engine Land column, but regionally-focused sites - whether directories or businesses - should absolutely invest in great content. With Penguin and Hummingbird combined, thin-content websites of all sizes are having a harder time ranking relative to slightly thicker content directories.
Posted by DannyDover
My initial response to the massive traffic increase was not exactly professional.
"HOLY FREAKING CRAP BALLS!", I blurted out. I searched the room for a fellow nerd to share my e-thusiasm with, but only found a room full of strangers eating sandwiches.
Over the course of the next few days, the post received more than 600,000 unique visitors. If you segment the traffic to only include visits from Singapore, the number of unique visitors is equivalent to 10% of the entire population of the country (although admittedly this metric is a bit inflated due to people reading the post on multiple devices.)
I support myself financially as a storytelling consultant. On a day-to-day level this means I work on marketing strategy, creative writing, and web development. Admittedly it is a weird mix, but I enjoy the lifestyle.
I am currently living in Vietnam, but recently spent two months living in Singapore.
Like I do with all of my travels, I penned a blog post about my experience living in Singapore and hit publish. You can read the entire post here, but the quick summary is:
My blog is fairly well read, so I was surprised that this post started out as one of my least-read posts. After a few days the post was, for all intents and purposes, just another link in the archive.
Last Wednesday, I grabbed my normal Vietnamese breakfast (a local sandwich called a bánh mì and a coconut milk-based smoothie) and went into my co-working office to start on my to-do list for the day.
I have been trying to convert bad online habits into good ones, so when I found myself craving a peek at Facebook, I clicked on my Google Analytics shortcut instead. It opened up my real-time report, and I practically dropped my meal.
The next few days were the craziest marketing adventure that I have ever had. The following are the key lessons I learned from this experience:
I think the key reason that this post resonated with people was that it was uncommonly honest. (This is a trait I picked up from Rand when I worked at Moz. It isn't a marketing trait, it is a life trait.) This post was published on my personal blog where I don't have any ads or up-sells. I write posts there solely because I enjoy writing. In this case, I thought I had some interesting insights about Singapore and wanted to share my honest thoughts. The power in this was that when people read it, they too wanted to share my thoughts (along with their own!) with their online friends.
In the post I cited some suicide statistics that were quite alarming. As the thousands of comments about the post came in (mostly via Facebook), I continually received the criticism that my data was incorrect. I triple-checked my sources (they checked out) and tried to reply to as many of the false claims of bad data as possible. It wasn't until two days later that I realized that people Googling the statistics were taken straight to a Wikipedia article that listed outdated data. After I updated the Wikipedia article to include the most recent data, the data criticism comments immediately stopped. I could have saved myself a giant headache if I had just viewed the situation from the readers' perspective and found the misinformation on Wikipedia earlier.
As the comments came in, I was alerted (rudely and repeatedly) that I had erroneously cited a date as 2011 rather than 2001. My first thought was just to subtly update the number but was worried this might start a backlash. For this reason, I called Jessica Dover. Jessica has worked on social media strategy for many of the world's most well known celebrities and has solved more social media problems than I have followers. (Disclaimer: She also happens to be my sister, but I honestly think that has hindered her more than helped her :-p. Her success is hard-earned and her own.) Without hesitation, she told me exactly what to do.
If you don't have your own social media mentor like Jessica, Moz's Q&A can be a great source of information.
At the onset, I was receiving a lot of traffic but none of it was converting (my conversion events were email captures and social follows). When I couldn't fix this myself, I called another member of my marketing SWAT team, Joe Chura. Joe runs an agency called Launch Digital Marketing. I think they are the most underrated team in the industry. In no time, they had a plan. Following their advice I installed two WordPress plugins:
After I added these plugins, it doubled the size of my mailing list and started what eventually became a viral spread of the blog post on Twitter. These were huge wins. (Hat tip to Dan Andrews for being at the forefront of that Twitter storm.)
Again, if you don't have your own marketing SWAT team, Moz's Q&A can be a great resource.
Throughout the entire process my server never went down. I credit this to two things:
First, props to WPengine (my host) for being seamless. They handled the spike without any hiccups or annoying interruptions. I will likely have to pay an overage fee but that is a MUCH better option than having a site outage.
Second, I credit preparation. I have long been using a tool called http://gtmetrix.com/ to diagnose speed problems on my site. (Hat tip to Jon over at Raven for introducing this tool to me). I love this tool because it combines the Google Page Speed tool and Yahoo's YSlow into one convenient and easy to understand interface. Luckily, I had implemented all of the recommended fixes well before this traffic spike. I am kind of a speed optimization nerd. :-p
When I first posted the blog post, no one cared. When it started to gain some traction, I was immediately told how stupid it and I were. As it gained momentum, the number of naysayers increased. It wasn't until the post reached full velocity that the supporters started to outnumber the naysayers. This has been a trend that I have observed with all of my successful content. I now take comfort in knowing that it is going to get worse until it suddenly gets better. Negativity online is a slope, and luckily it does have a peak.
Facebook once offered a tool called Facebook Insights for Domains. This tool allowed you to get valuable information on any traffic that was referred to your verified domain from Facebook. Unfortunately, Facebook has killed it off. When my post went viral on Facebook, I had no visibility other than that the traffic was coming from Facebook and Facebook mobile. I had no idea what pages or groups the applicable conversations were happening on, and thus had no way to respond to conversations happening behind the wall. This was a huge frustration throughout the whole process.
When people came to my website to read the Singapore post, many of them checked out my other posts as well (this is to be expected). In response to this, I published a post that I thought would also be applicable to the new readers. Due to the increased visibility, this post (on useful money philosophies) subsequently went mildly viral. This in turn drove even more conversions.
Stories exist in parallel universes:
These are all very different stories!
Many of the comments, compliments and criticism that I received about the Singapore post had absolutely nothing to do with the words written in my article. For many, it was their personal experiences, not my blog post, that drove their responses. At first, this was a major frustration point for me. It wasn't until I mapped out the perspectives in the above list that I calmed down and started to appreciate the storytelling experience.
When the responses came in, I was vastly outnumbered (it was literally 500,000 to 1)! The only way I was able to deal with that amount of volume was to listen, learn from an expert (see lesson 3), collect data, process that data, and then react. I let the first several dozen comments come in before I started to respond. I think this was critical in me being able to follow and supplement the large-scale discussion.
The click-worthiness of the blog post title was a major contributing factor to its success. (Second only to its honesty). Admittedly it was an attention-grabbing title but at the same time it was true. I actually will never be returning to Singapore. I didn't perform any keyword research or A/B tests when picking the blog post title. Instead, I just picked something that I figured I would want to click. The best titles are always that simple.
When I look back on this marketing adventure, I feel thankful. The world, not just Singapore, is in an amazing state of change right now. I am glad that my little voice was able to contribute a little bit to the global discussion.
If you would like to hear about other marketing adventures, feel free to connect with me on Google+.
Posted by MartinMacDonald
Despite keywords being slightly out of fashion, thanks to the whole (not provided) debacle, it remains the case that a large part of an SEO's work revolves around discovering opportunity and filling that same opportunity with content to rank.
When you are focusing on smaller groups of terms, there are plenty of tools to help; the Moz Keyword Difficulty Tool being a great example.
These tools function by checking the top results for a given keyword, and looking at various strength metrics to give you a snapshot as to how tough they are to rank for.
The problem is, though, that these tools operate on the fly, and generally only allow you to search for a small number of keywords at any one time. The Moz tool, for instance, limits you to 20 keywords.
By the end of this tutorial you will be able to visualize keyword difficulty data in a couple of ways, either by keyword:
Or by keyword type:
Or by category of keyword, spliced by specific position in the results:
All keyword difficulty tools work in the same way when you break them down.
They look at ranking factors for each result in a keyword set, and sort them. It's that simple.
The only thing we need to do is work out how to perform each step at scale:
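At its core, then, the whole tool is just "gather a strength metric for each ranking URL, average it per keyword, and sort." Here's a minimal Python sketch of that logic; the keywords and CitationFlow-style scores are made up for illustration:

```python
from statistics import mean

def rank_keyword_difficulty(serp_metrics):
    """serp_metrics maps each keyword to the strength scores (e.g., CitationFlow)
    of its top-10 ranking URLs. A lower average strength suggests an easier
    ranking opportunity, so results are sorted easiest-first."""
    difficulty = {kw: mean(scores) for kw, scores in serp_metrics.items()}
    return sorted(difficulty.items(), key=lambda item: item[1])

# Hypothetical top-5 scores for three keywords:
easiest_first = rank_keyword_difficulty({
    "blue widgets": [60, 55, 52, 48, 40],
    "brown widgets": [35, 30, 28, 25, 20],
    "red widgets": [50, 45, 44, 41, 38],
})
```

The rest of this tutorial performs exactly these steps, just at scale and inside Excel rather than in a script.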
My preference for scraping Google is using Advanced Web Ranking to get the ranking results for large sets of keywords.
Quite a few companies offer software for this service (including Moz), but the problem with this approach is that costs spiral out of control when you are looking at hundreds of thousands of keywords.
Once you have added your keyword set, run a ranking report of the top 10 results for the search engine of your choice. Once it's complete you should see a screen something like this:
The next step is to get this data out of Advanced Web Ranking and into Excel, using a "Top Sites" report, in CSV format (The format is important! If you choose any other format it makes manipulating the data much tougher):
This presents us with a list of keywords, positions, and result URLs:
So now we can start harvesting some SEO data on each one of those results!
Equally, though, you could use the SEOgadget Excel tool alongside the Moz API. I haven't tested that thoroughly enough, but it should give you pretty similar results if you are more used to using them.
Now that we have a nice result set of the top 10 results for your keyword list, it's time to start pulling in SEO metrics for each of those to build some actionable data!
My preference is to use the Niels Bosma Excel plugin, as it's super easy and quick to pull the data you need directly into Excel, where you can start analyzing the information and building charts.
If you haven't already done so, you should start by downloading and installing the plugin available here (note: It's for Windows only, so if you are a Mac user like me, you'll need to use Parallels or another virtual machine).
In the column adjacent to your list of URLs you simply need to use the formula:
This formula gives you the CitationFlow number for the URL in cell C2. Obviously, if your sheet is formatted differently, then you'll need to update the cell reference number.
Once you see the CitationFlow appear in that cell, just copy it down to fill the entire list. If you have lots of keywords, right now would be a great time to go grab a coffee, as it can take some time depending on your connection and the number of results you want.
Now you should be looking at a list something like this:
Which allows us to start doing some pretty incredible keyword research!
The first thing that you probably want to do is look at individual keywords and find the ranking opportunity in those. This is trivially easy to do as long as you are familiar with Excel pivot tables.
For a simple look, just create a pivot of the average citation score of each keyword; the resulting table creator wizard will look something like this:
Of course you can now visualize the data just by creating a simple chart. If we apply the above data to a standard bar chart you will begin to see the kind of actionable data we can build:
This is just the beginning, though! If you create a pivot chart across a large dataset and look at the average citation score for each position, you can see interesting patterns develop.
This example is looking at a dataset of 52,000 keywords, and taking the average score of each site appearing in each position in the top 10 results:
As you can see, across a large dataset there is a really nice degradation of strength in the top 10 results, a real vindication that the data we are looking at is rational and is a good indicator of how strong you need to be to rank a given page (providing the content is sufficient and focused enough).
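The same pivot can be reproduced outside Excel if you prefer. Here's a small pure-Python equivalent that averages a strength score by ranking position; the (keyword, position, score) rows are invented for illustration:

```python
from collections import defaultdict
from statistics import mean

def average_score_by_position(rows):
    """rows are (keyword, position, citation_flow) tuples, like the ranking
    report export. Returns {position: average score}, mirroring the pivot chart."""
    by_position = defaultdict(list)
    for _keyword, position, score in rows:
        by_position[position].append(score)
    return {pos: mean(scores) for pos, scores in sorted(by_position.items())}

# Hypothetical two-keyword dataset:
averages = average_score_by_position([
    ("blue widgets", 1, 70), ("blue widgets", 2, 60),
    ("brown widgets", 1, 40), ("brown widgets", 2, 30),
])
```

Across a large dataset you would expect the averages to decay smoothly from position 1 to 10, just as the chart shows.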
You really want to splice the data into categories at this stage, to identify the areas of quickest opportunity and focus on building content and links towards the areas where you are likely to earn traffic.
The below chart represents a comparison of three categories of keywords, sorted by the average Citation of the results in each category:
From this we can see that of the three keyword categories, we are likely to rank higher up for keywords in the "brown widgets" category. Having said that, though, we are also able to rank lower down the page in the "blue widgets" category, so if that has significantly more traffic it might prove a better investment of your time and energy.
We have created a homebrew keyword difficulty tool, capable of analyzing hundreds of thousands of URLs to mine for opportunity and guide your content and linkbuilding strategies!
There is so much you can do with this data if you put your mind to it.
True, strictly speaking, scraping Google's results is against their Terms of Service, but they have a habit of using our data, so let's turn the tables on them for a change!
Posted by steviephil
This post was originally in YouMoz, and was promoted to the main blog because it provides great value and interest to our community. The author's views are entirely his or her own and may not reflect the views of Moz, Inc.
Picture the scene...
You wake up, grab a shower, throw on your Moz t-shirt (and other clothes, presumably...), boil the kettle, pour yourself a cup of coffee, switch on the ol' laptop, let your daily rank checks complete and then slowly run through them one by one...
...Ooo, that's nice...
...Yes! Great jump there!...
...Ye- Wait, hold on... What? Lots of red, all across the board? Rankings have either dropped multiple pages or dropped out of the top 100 results entirely?!
Uh-oh. It's gonna be a looong day....
This happened to me recently with one of my clients. Their homepage - their main page as far as rankings were concerned - had mysteriously vanished from Google's index overnight, taking with it a lot of page one rankings, as you can see from the world's saddest and perhaps most unnecessary GIF image below:
This was also the first time that it'd happened to me. Granted, I've consulted on this type of thing before, but usually when it's happened to someone and they approach me asking what's happened afterwards. However, this was the first instance of it where I was discovering it for myself and it was happening under my watch, affecting one of my clients.
This post runs through the steps that I took to resolve the issue. I acted methodically yet swiftly, and in doing so managed to get the homepage back in Google's index (and - with it - its former rankings) in less than 12 hours.
I accept that this is one of those articles that you probably won't need until it happens to you. To be honest, I was in that exact situation - I pretty much knew what to do, but I was still a bit like "OMG OMG OMG, whowhatwherewhenwhy?!" in trying to find an article to just double-check that I was doing everything I could be doing and wasn't overlooking anything obvious.
So... Are you ready? Here we go!
I primarily use Link Assistant's Rank Tracker (with Trusted Proxies) for my rank checking needs, with Moz PRO's rank checking as a backup and second opinion. Rank Tracker allows a 'URL Found' column, which revealed something to me instantly: other pages were still ranking, just not the homepage. Additionally, where a ranking had seen a drop of a few pages (but was still ranking within the top 10 pages/100 results), a different page was ranking instead - in my client's case, it was things like the Services, Testimonials and Contact pages.
This suggested to me that it was just the homepage that was affected - but there was still a way that I could find out to be sure...
My next step was to use Google's 'site:' operator (see #1 here) on the domain, to see whether the homepage was still in Google's index. It wasn't - but all of the site's other pages were. Phew... Well at least it wasn't site-wide!
Even though I had a feeling that this would be the case based on what Rank Tracker was saying, it was still important to check, just in case the homepage was still ranking but had been devalued for whatever reason.
Now that I knew for sure that the homepage was gone from Google, it was time to start investigating what the actual cause might be...
In my experience, this is usually what's responsible when something like this happens... Given that the option to noindex a page is often a tick-box in most CMS systems these days, it's easy enough to do. In fact, one of the times I looked into the issue for someone, this was what was the cause - I just told them to untick the box in WordPress.
In order to check, bring up the page's source code and look for this line (or something similar):
<meta name="robots" content="noindex">
(Hit Ctrl + F and search for "noindex" if it's easier/quicker.)
If you find this code in the source, then chances are that this is responsible. If it's not there, onto the next step...
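If you'd rather not eyeball the source by hand, a short script can do the check for you. This sketch uses Python's standard-library HTML parser; you'd feed it the page source fetched however you like:

```python
from html.parser import HTMLParser

class NoindexChecker(HTMLParser):
    """Flags a page as noindexed if a <meta name="robots"> tag contains 'noindex'."""
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            if "noindex" in attrs.get("content", "").lower():
                self.noindex = True

def has_noindex(html):
    """Return True if the page source contains a robots noindex meta tag."""
    checker = NoindexChecker()
    checker.feed(html)
    return checker.noindex
```

A parser beats a plain Ctrl + F here because it won't be fooled by the word "noindex" appearing in ordinary page copy or in an unrelated meta tag.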
It seems to be a somewhat common myth that robots.txt can noindex a page - it actually tells search engines not to crawl a page, so it'd only be true if the page had never actually appeared in Google's index in the first place (e.g. if it were a brand new site). Here's more info if you're interested.
To be honest though, given what had happened, I didn't want to assume that this wasn't the cause and therefore I thought it would be best just to check anyway.
But alas... The site's robots.txt file hadn't changed one iota. Onto step 3...
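This check can be automated too. Python's standard library ships a robots.txt parser, so a quick sketch like this (the domain and rule sets below are placeholders) will tell you whether a given set of rules blocks Googlebot from crawling the homepage:

```python
from urllib.robotparser import RobotFileParser

def homepage_blocked(robots_txt_lines, homepage="https://www.example.com/"):
    """Parse robots.txt rules and report whether Googlebot is barred
    from crawling the homepage URL."""
    parser = RobotFileParser()
    parser.parse(robots_txt_lines)
    return not parser.can_fetch("Googlebot", homepage)

# A rule set that blocks everything vs. one that allows everything:
blocking = ["User-agent: *", "Disallow: /"]
harmless = ["User-agent: *", "Disallow:"]
```

Comparing today's robots.txt against a known-good copy this way makes it easy to confirm, as I did, that nothing had changed.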
Given that this was my client, I was already familiar with its history, and I was already adamant that a penalty wasn't behind it. But again, I wanted to do my due diligence - and you know what they say when you assume...!
I jumped into Google Webmaster Tools and looked at the recently added Manual Actions tab. Unsurprisingly: "No manual webspam actions found." Good good.
However, let's not rule out algorithmic penalties, which Google doesn't tell you about (and oh lordy, that's caused some confusion). As far as Pandas were concerned, there was no evidence of accidental or deliberate duplicate content either on the site or elsewhere on the Web. As for those dastardly Penguins, given that I'm the first SEO ever to work on the site and I don't build keyword anchor text links for my clients, the site has never seen any keyword anchor text, let alone enough to set off alarm bells.
Following these checks, I was confident that a penalty wasn't responsible.
Another check while you're in your Webmaster Tools account: go to Google Index > Remove URLs and check that the page hasn't been added as a removal request (whether by accident or on purpose). You never know... It's always best to check.
Nope... "No URL removal requests" in this case.
It was at this point that I started to think: "what the hell else could it be?!"
On the day that this happened, I met up with my good friends and fellow SEOs Andrew Isidoro (@Andrew_Isidoro) and Ceri Harris of Willows Finance for a drink and a bite to eat down the pub. I ran this whole story by them along with what I'd done so far, and Andrew suggested something that I hadn't considered: although extremely unlikely, what if the homepage was now showing up as a 404 (Not Found) code instead of a 200 (OK) code? Even if the page is live and performing normally (to the visitor), a 404 code would tell Google that that page "don't live here no more" (to quote the mighty Hendrix) and Google would remove it accordingly.
Again, it was worth checking, so I ran it past SEO Book's HTTP header checker tool. The verdict: 200 code. It was a-OK (pun fully intended - it's a good thing that I'm an SEO and not a comedian...)
Ok, so now what?
Now it was time to ask the big boss Googly McSearchengineface directly: what do you make of the page, oh mighty one?
In order to do this, go to Google Webmaster Tools, click on the site in question and select Crawl > Fetch as Google from the side-menu. You should see a screen like this:
Simply put the affected page(s) into it (or leave it blank if it's the homepage) and see what Google makes of them. Of course, if it's "Failed," is there a reason why it's failed? It might also help to give you an idea about what could be wrong...
Once you have done the above in GWT, you're given this option if Google can successfully fetch the page:
I decided to do just that: ask Google to (re)submit the page to its index.
At this point I was confident that I had done pretty much everything in my power to investigate and subsequently rectify the situation. It was now time to break the news, by which I mean: tell the client...
I thought it best to tell the client after doing all of the above (except for the 404 check, which I actually did later on), even if it was possible that the page might recover almost immediately (which it did in the end, pretty much). Plus I wanted to be seen as proactive, not reactive - I wanted to be the one to tell him, not for him to be the one finding out for himself and asking me about it...
Here's the email that I sent:
Hi [name removed],
I just wanted to bring your attention to something.
I conduct daily ranks checks just to see how your site is performing on Google on a day-to-day basis, and I've noticed that your homepage has disappeared from Google.
Usually this is the result of a) accidental de-indexation or b) a penalty, but I have checked the usual suspects/causes and I see no sign of either of those occurring.
I have checked in your Webmaster Tools account and Google can successfully read/crawl the page, so no problems there. I have taken appropriate steps to ask Google to re-index the page.
I've done all that I can for now, but if we do not see everything back to normal in the next couple of days, I will continue to research the issue further. It's likely the case that it will recover of its own accord very soon. Like I say, I've checked the usual signs/causes of such an issue and it doesn't appear to be the result of any of those.
Just to check, have you or your web designer made any changes to the website in the last couple of days/weeks? If so, could you please let me know what you have done?
I know it's not an ideal situation, but I hope you can appreciate that I've spotted the issue almost immediately and have taken steps to sort out the issue.
If you have any questions about it then please do let me know. In the meantime I will keep a close eye on it and keep you posted with any developments.
(Note: In this instance, my client prefers email contact. You may find that a phone call may be better suited, especially given the severity of the situation - I guess it will be a judgement call depending on the relationship that you have with your client and what they'd prefer, etc.)
He took it well. He hadn't noticed the drop himself, but he appreciated me notifying him, filling him in on the situation and explaining what action I had taken to resolve the issue.
Later that evening, I did another quick check. To my surprise, the homepage was not only back in Google, but the rankings were pretty much back to where they had been. PHEW!
I say "surprised" not because of my ability to pull it off, but because of how quickly it happened - I expected it might take a few days, not a mere few hours. Oh well, mustn't complain...!
So what did cause the deindexation? Well, another suggestion that came from Andrew while we were down the pub that I'd stupidly overlooked: downtime!
It could've been an unfortunate and unlucky coincidence that Google happened to re-crawl the page exactly when the site had gone down.
I hadn't added the site to my Pingdom account before all of this had happened (something that I have since rectified), so I couldn't know for sure. However, the site went down again a day or so later, which made me wonder if downtime was responsible after all... Even so, I advised the client that if this was a common occurrence that he should maybe consider switching hosting providers to someone more reliable, in order to reduce the chance of this happening all over again...
In order to make sure that you're fully on top of a situation like this, carry out daily rank checks and review them promptly, even if it's just a quick once-over to make sure that nothing drastic has happened in the last 24 hours. If I hadn't done so, I might not have realised what had happened for days, and therefore might not have rectified the situation for days, either.
Also, having a 'URL Found' column in addition to 'Ranking Position' in your rank checking tool of choice is an absolute must - that way you can see whether a particular page is affected, or whether different pages are now ranking in its place.
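As a rough illustration of that daily once-over, here's a minimal sketch that flags disappearances, big drops, and ranking-URL changes between two days of rank data. The field names and the "drastic drop" threshold are my own assumptions, not taken from any particular rank-checking tool:

```javascript
// Flag drastic day-over-day ranking changes, including cases where a
// different URL is now the highest-ranking page for a keyword.
// (Illustrative only: record shape and threshold are assumptions.)
function diffRanks(yesterday, today, dropThreshold = 10) {
  const alerts = [];
  for (const prev of yesterday) {
    const curr = today.find((r) => r.keyword === prev.keyword);
    if (!curr || curr.position == null) {
      alerts.push({ keyword: prev.keyword, issue: 'disappeared from results' });
    } else if (curr.position - prev.position >= dropThreshold) {
      alerts.push({ keyword: prev.keyword, issue: 'dropped ' + (curr.position - prev.position) + ' places' });
    } else if (curr.url !== prev.url) {
      alerts.push({ keyword: prev.keyword, issue: 'ranking URL changed to ' + curr.url });
    }
  }
  return alerts;
}
```

Run against exported CSVs from your rank checker, a script like this would have surfaced the homepage's disappearance the morning it happened.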
Anyway, I hope that this case study/guide has been useful, whether you're reading it to brush up ready for when the worst happens, or whether the worst is happening to you right now (in which case I feel for you, my friend - be strong)...!
Also, if you'd do anything differently to what I did or you think that I've missed a pivotal step or check, please let me know in the comments below!
Did you like the comic drawings? If so, check out Age of Revolution, a new comic launched by Huw (@big_huw) & Hannah (@SpannerX23). Check them out on Facebook, Twitter and Ukondisplay.com (where you can pick up a copy of their first issue). Their main site - Cosmic Anvil - is coming soon... I'd like to say a massive thanks to them for providing the drawings for this post, which are simply and absolutely awesome, I'm sure you'll agree!
Posted by CraigBradford
If you're like a lot of people (myself included) it's very easy to go into an analytics package and focus only on conversion rate. We look at reports like the one below and make short-sighted decisions:
Looking at only the information above, we might decide that "Organic Search" is a bad channel. Making decisions on how successful a channel is based only on conversion rate is short-sighted and will cost you money. Instead, I urge you to think of your channels like a soccer team.
A sensible soccer formation looks something like the image below:
You have one goalkeeper, defenders, midfielders and attackers. You would never think of creating a team of only 11 strikers. But that's exactly what we do with our channels all the time. We create a team that looks like this:
We have a team of channels that are all being graded on their ability to "score goals" - please don't make this mistake. I'm okay with the fact that some of my channels have a low "e-commerce" conversion rate; that may not be what they're designed to do.
The one thing that I want you to take from this blog post is that channels are not binary. It isn't that they either drive sales or do nothing; there's lots of value in between if you know what to look for.
In a report titled "The Customer Journey to Online Purchase," Google showed this to be the case by looking at the role that each channel is likely to play in the customer journey. The idea is to show, on a very simple scale, whether a channel plays an "awareness" role or more of a "decision-making" role. This is an interactive piece, so please have a look and play around with it. You can segment by industry or by country.
Let's take the US market as an example:
It shows that in general, Display and Social are more awareness channels, while Organic search and Paid search tend to be last-interaction/decision-making channels. I'm not surprised by that, but if that's true - if social is best used as a tool for driving awareness of my brand - why would I ever use e-commerce conversion rate as a metric of success? The answer, of course, is that I shouldn't. Better metrics would perhaps be things like:
How many new visitors did social bring this month?
Brand awareness - how many people have heard of my business?
How many people interacted with my brand in some way?
These are just a few examples, but if you want more specifics I recommend you read this post by Hannah Smith on the Distilled Blog: Calculating ROI from Social Media - Problems, Pitfalls & Breaking all the things...
If you dig a little further, it gets more interesting. Let's look at the health industry in particular:
Social is still an awareness channel, but look at display. It's now playing more of a decision-making role. Email has also moved from an awareness role to a decision-making role.
The data above shows that different channels play different roles depending on country and industry, so don't assume anything. Don't assume that social will be an awareness channel, don't assume that email will be a decision-making channel, and whatever you do, don't assume that all channels are designed only to drive sales. Next time you're assessing your channels, try two things:
As mentioned above, not all channels have the same strengths, but that's okay as long as they are pulling their weight somewhere else. To see if that's the case, try assigning them some attributes other than sales. Avinash Kaushik gave an excellent presentation at MozCon 2013 (if you weren't there, the video can be purchased from Moz), in which he said that channels should solve for performance and relationships. This is shown in the table below (the example is for ModCloth):
As you can see, if we were to only solve for the line with red text ("Orders") we would ignore all of the other good that some channels are doing. Social, in this example, is terrible at everything except "Be the Buyer." I encourage you to do the same for your channels; add in all the metrics that are important for relationship-building, not just sales, and take a step back to see what else your channels might be contributing to that isn't immediately obvious when you simply look at sales.
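To make the idea concrete, here's a toy sketch of scoring a channel across several metrics rather than orders alone. The metric names and weights are my own illustrative assumptions, not values from Avinash's ModCloth table:

```javascript
// Toy multi-metric channel scorecard: a channel that's weak on orders can
// still surface its relationship-building value.
function scoreChannel(metrics, weights) {
  return Object.keys(weights).reduce(
    (sum, metric) => sum + (metrics[metric] || 0) * weights[metric],
    0
  );
}

// Orders matter most, but awareness metrics still count for something
const weights = { orders: 1.0, newVisitors: 0.2, emailSignups: 0.5, socialInteractions: 0.1 };

// A "bad" channel by orders alone...
const social = { orders: 2, newVisitors: 900, socialInteractions: 4000 };
```

Scored this way, social gets credit for the 900 new visitors and 4,000 interactions that an orders-only report would ignore entirely.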
Just about anyone who's ever read about goal-setting will have seen the theory of creating SMART goals:
I think most people are good at thinking about goals that are specific, attainable, and realistic. We think we're good at measurable, and we often forget about time-bound. Since each of the others could easily be a post of its own, I'll just focus on time-bound here.
When we look at tables like the one below, if we're going to make bad decisions like declaring channels "good" or "bad" based on just one metric, we should at least remember to consider time.
If we say organic search is a bad channel, what we actually mean is that organic is a bad channel at driving sales in the last X days. That's an important difference, because it has an impact on where the channel is placed on the scale of "awareness" to "decision-making."
Posted by EricaMcGillivray
Every year at MozCon, I have the joy of working with our fabulous MozCon speakers. One of the speakers we were most excited about for MozCon 2013 was Kyle Rush. Kyle's name might not be on the tip of your tongue, but he worked on possibly the biggest and best online marketing campaign, Obama for America, as their deputy director of front-end web development. From there, he went to The New Yorker, and he just announced that he's headed over to Optimizely.
When Kyle told us he wanted to present on the conversion rate optimization and A/B testing the Obama campaign did, there may have been some squeeing from Rand (like the Packers won) and me (like over new Sherlock episodes). Marketing nerds. Regardless of your politics, Obama's reelection campaign not only broke fundraising records, but also changed the way we think about using big data and CRO.
Kyle rocked that MozCon 2013 stage. He presented a ton of actionable information for attendees, and his was one of our top-scoring presentations. When we went to decide which full-length MozCon presentation to share with all of you, for free, Kyle's was it. Enjoy!
Kyle: Thank you, Cyrus. It feels great to be in Seattle. I just came from New York City. Is anybody else here from New York? Yeah. You guys all know what I mean when I say it feels great to be in Seattle. You guys know how to do the summer with this 77 degree weather. This dry heat is awesome. We've got to figure out how to get that in New York City. Can we get on that?
As Cyrus said, my name is Kyle Rush. I'm currently at 'The New Yorker.' Before that I was at the Obama campaign. I worked on a lot of the product and tech aspects of our online fundraising. Obviously, we ran a lot of optimization on that. So, that's what I'm going to be talking to you guys today about.
Before we get started, I want to give you guys some context on what we jumped into, the situation on day one at the Obama campaign. All the media outlets at the time were reporting that we were expected to raise one billion. They did probably $700 million in 2008. So, we were expected to raise one billion.
Just to put that into perspective for you guys, Amazon's Q4 profit for last year was only $97 million. So, when you spread that out over a year and a half, which was the life of the campaign, you still only get like half or a little over half what we were expected to raise on the campaign. So, this was a pretty daunting challenge.
But, in the end... Oh, I didn't mean to click to. But, in the end we did $1.1 billion. So, we exceeded expectations. None of us thought we could do it. Obviously, that's a lot of money. We did $690 million of it online as Cyrus said.
Another thing that I want to talk to you guys about is just an example of one of our online fundraising programs. That was called Quick Donate. This was a way for our users to save their payment information so that they could do one click donations on the Web, and they could also do one click donations in email - which had never been done before. So, we had to do a lot of funky engineering to get that to work.
But, you could also SMS donate, which was a first for political campaigns. It was actually a big achievement for us. Because the Federal Election Commission said that political campaigns cannot use short codes to fundraise. So, we weren't allowed to work with AT&T and Verizon to send out short codes and ask people to text those. We had to engineer a way around that. When we launched SMS donations it was the first of its kind.
Quick Donate brought in $115 million over its lifespan. It had 1.5 million users. This was a thrill to work on. But, obviously, this type of program we optimized. We ran a lot of tests. Those are kind of the things I'm here to share with you guys.
You might ask how we got here. We ran 500 experiments. We always had a test running. The amount of traffic that we had was really, really intense. We did weeks of user testing. User testing is really simple. It's just putting a user in front of a computer and observing them.
We used a program called Silverback. I don't know if any of you guys are familiar with it. But, it records the iSight camera and the computer screen at the same time. So, you can actually see your user making a donation. We learned a lot from this. We did it on and on and on to the point where we probably did weeks of it.
Sorry, this thing is pretty sensitive.
We also just did general data gathering which I really like to do. Because if you're not gathering data then you're kind of flying blind. Just a data point to show you guys how much data gathering we did, we did over 668 million Google Analytics custom events. I'll be talking about those in a minute. But, that's a ton. I don't think that I've ever worked at a place that pushed Google Analytics to the point that we did on the campaign. It was pretty intense.
You might ask 'What did that all get us?' It got us a 49% increase on our donation page conversion rate. And, it got us a 161% increase in our email signup page. These are two really high level conversion goals for us.
The email signup you might not have known about. We didn't really talk about it. But, I'll let you in on a little secret. Email is responsible for just about 90% of our online fundraising. So, gathering emails on our list was super important. We spent a lot of time optimizing email acquisitions.
The three things that I want to talk to you guys about today, and this is really what optimization means to me, are experimentation, observation, and data gathering. I think we're all mostly familiar with experimentation - that's A/B testing and multivariate testing. Observation is what I was talking about with user testing: you want to observe your users using your product, because otherwise you're not going to know how they're using it - you're not a user. And general data gathering is also super important.
First up is experimentation. Sorry. This thing's super sensitive. We identified a process when we were on the campaign. I want to share that with you. I'm sure everybody has their own processes. But, this is what really worked for us.
The first step for us in experimentation was to identify our goals. I mean this from both a micro and a macro level.
On the macro level I just talked about some of our goals which was email acquisition and donations. You need money to win a campaign. In our instance we needed emails to get that money.
But, I also encourage you to focus on micro goals. This is like conversion goals when you're running tests. You should just measure everything. So, micro goals can be like the error rate on a form, like how many errors do you get when somebody mistypes their email address. Is the label clear enough there? You just really want to measure everything.
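As a toy illustration of measuring a micro goal like form error rate, a sketch along these lines could do it. The field name and the email check are my assumptions, not the campaign's actual validation code:

```javascript
// Toy micro-goal measurement: the error rate on a form's email field.
// Each attempt is { email: "..." }; the regex is a deliberately loose check.
function emailErrorRate(attempts) {
  const emailOk = (s) => /^[^@\s]+@[^@\s]+\.[^@\s]+$/.test(s);
  const failed = attempts.filter((a) => !emailOk(a.email)).length;
  return failed / attempts.length;
}
```

A persistently high rate on one field suggests its label or placement isn't clear enough, which is exactly the kind of signal Kyle is describing.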
One thing that really blew me away on the campaign is that we started measuring the conversion rate on the follow-up page. So, when you made a donation and it was successful you got taken to a follow up ask that asked you to save your payment information.
That was Quick Donate. That was the opt in to Quick Donate. That was a very critical conversion goal for us, because we found out early on that Quick Donate users were four times more likely to make a donation in the future. That's like money right there that we needed to focus on.
We measured that goal even though we weren't changing that page at all. We were changing the donation page. Then, we found out that some of the variations that we ran actually affected the follow up page. It's really, really important to measure as many conversion goals as you possibly can when you're doing your experiments just to get a good sense of what's going on.
The second step that we would do is develop hypotheses. This is really important. It's just basically like the scientific process that you guys all learned in grade school. Develop your hypotheses and then test them. This is really helpful in making sure that you're staying focused.
It's really easy to fall in this trap when you realize how much you can test. You just start to test everything. You don't want to make any decisions. You just want to test. It's like, 'Oh, what color should the submit button be?'
'I don't know, test it.'
Don't do that. That's not a good idea.
Create high level hypotheses. One of ours, for example, in the campaign was that less copy does better than more copy for conversions. So, we tested that on our splash page. We tested that on our donate page. We tested that on our email sign up page. We tested it everywhere on the site. We figured out different experiments to test it.
That's actually number three here: create experiments. Create many experiments to test your hypotheses. You might want to test the same experiment more than once, because you might get different results at different times of the day. There are all kinds of weird things that can happen. Test it multiple times and create several experiments that test your hypothesis.
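When repeating an experiment, a quick significance check helps separate a real lift from time-of-day noise. This is a generic two-proportion z-test sketch, not the campaign's tooling - testing platforms like Optimizely compute this for you:

```javascript
// Generic two-proportion z-test: is variation B's conversion rate
// significantly different from control A's?
function abZScore(convA, visitsA, convB, visitsB) {
  const pA = convA / visitsA;
  const pB = convB / visitsB;
  const pooled = (convA + convB) / (visitsA + visitsB);
  const stdErr = Math.sqrt(pooled * (1 - pooled) * (1 / visitsA + 1 / visitsB));
  return (pB - pA) / stdErr;
}

// |z| > 1.96 corresponds to roughly 95% confidence that the lift is real
```

If the same variation wins with |z| above 1.96 across repeated runs, you can be reasonably confident the result isn't one of those "weird things".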
Oh, wow. The fourth, and I can't stress this enough, is to prioritize with ROI. I touched on this a little bit earlier. But, as you start building out your experiments... I'll illustrate this with an example from the campaign.
We ran an experiment where on our donate page we had a picture of the President behind a donate form. That was our control. But, then we added an inspirational quote above the President's head. It said something like 'Stand with me, work with me, let's finish what we started.'
When we tested that we got something like a 17% increase in conversions. Because it made the page just a little bit more inspirational and made people really want to finish and stand with the President. That was awesome.
That was just adding copy. That only took us, like, a couple of minutes to get onto a page and actually into production when it won. So, ROI on that is really high.
Our finance team wanted us to implement paying by check, because they had some data that said a lot of people don't have credit cards. Maybe they have checks that they can pay with. It sounds like a crazy idea to me, but the data that we got from them said that we could expect a 3% increase in the conversion rate.
But, on the technical side that was kind of a big lift. That would take days, if not weeks, to implement. We're only going to expect a 3% lift. So, when it comes to figuring out what experiments are going to give you the highest ROI, just really dig into the data and make sure that you're focusing on experiments like the inspirational quote and not things like changing your whole donation system for just a 3% increase in donations.
The fifth one is very easy - test your ideas. Then, lastly, you want to record results. I can't stress this one enough either. Because on the campaign what happened is we ran so many tests - 500 total - that we couldn't always remember what the result from one test was.
If we didn't have this awesome Google doc that we built out - recording the time, the hypothesis, the result, a screen shot of the control and the variation, and a link to the results in Optimizely - we really couldn't have functioned. Because you just can't remember the results of 500 tests.
You can also disseminate that information when you have it in a Google doc. Just make sure that you're recording your results.
Now, I just want to talk about four areas where you can experiment. I've ordered these by ROI. Copy is, in my experience, by far the highest ROI that you can experiment with. It's very simple, because you don't have to change any code or anything. Changing copy only takes a minute or two, and the results that you can get can be really awesome.
Here is the Quick Donate opt in page that I was talking about before. This is the page where if you make a successful donation we ask you to save your payment information for next time.
We did a variation of the header. This one says 'Save your payment information for next time.' Very simple, right? Then, our variation changed the copy and it said, 'Now, save your payment information.' It just changed a few words around. It's not a huge change. Obviously, it only took us like a minute to get this test into production.
By making the copy more direct and directing the user into what we wanted them to do we got a 21% increase on conversions. Again, this is very little development effort, but a huge result in conversions, or conversion lift I should say. Here you can see if you missed it before what the control and the variation was.
After copy, the next highest ROI area of experimentation that I would say is imagery. Because it's very easy to switch images out, almost the same as copy. It takes a little bit longer, though.
Here's an example of what we did on the campaign with imagery. This is our splash page for the 'Dinner with Barack' contest which is a super cool contest. You could actually win dinner with Barack. They would fly you out to Washington, DC. You'd sit down with Barack and have dinner. Sometimes Michelle would be there. Actual people won this contest. After you submit you would get entered into that.
Here we have a picture of the President. We figured out early on that big smiling pictures of the President worked because people love him. We had a hypothesis that people would be more likely to submit this if they could picture themselves in that scenario. You can't really see the people that he's talking to. It doesn't really seem like a real contest. It's like, 'Could I really have dinner with Barack Obama?'
So, we came up with a variation that gave the user a view of a little bit more of the situation. Those are two actual people on the right that won this contest. They flew them out, and they had dinner with Barack and Michelle.
Putting a more situational image in there gave us a 19% lift in the conversion rate. Again, this does not take a lot of time to implement. It's just a very easy test. We got a huge lift on it.
Here are the two different images so that you can see them again.
Another area that I want to talk about is performance. This is going to be a little bit technical. But, you guys are all probably very familiar with how page load affects conversion rate. We were, too. Early on in the campaign we knew that Amazon had published a statistic, and it's a crazy statistic, that even 100 milliseconds of additional latency on page load could drop the conversion rate by one percent. So, that's like huge.
We're obsessed with performance. We want to make our pages as fast as possible. Here is a look at the architecture diagram for the platform that we started with. It's very simple. It's very basic. It was built by a company called Blue State Digital which was one of our vendors. I actually came from there before I started at the campaign.
It worked really well for us in the beginning, because it was built out of the box. As the first engineer there I didn't have time to build a new platform. This was already out there and working.
The user makes requests to a load balancer, and that splits requests to two clusters. If you're asking for the page it would send you to the web cluster. If you actually hit submit on the form it would send you to the payment cluster.
Very simple, but there were a lot of problems with this in terms of performance. We, on average, saw five second page load time which is horrendous when you're processing $690 million worth of donations. You want something more like below two seconds, or how about zero seconds. Can we get the page to just load automatically?
It didn't have a CDN. I don't know how many of you here are familiar with CDNs - that stands for content delivery network. If I'm in LA and I request a page, in that architecture diagram the servers were in Boston. So, the data has to go all the way from Boston to LA. If you put it on a CDN... We used Akamai. There's an Edge server in LA, so it gets it to you much quicker.
There wasn't any caching in this environment. There were a lot of things that we needed to change. We basically started from scratch and built a new platform. We asked Blue State to turn their hosted platform into an API that we could hit on the client side.
Then, we put the Akamai CDN in front of that. So, we have really fast access to those. Then, we generated our HTML, the actual pages for these, with a static site generator called Jekyll which is built in Ruby. It's super simple to work with. It's great for front end engineers. They don't have to worry about server side templates and all of that stuff.
Then, we hosted all those HTML files on AWS S3 just like our static assets, and we put Akamai in front of that. The cool part is the two donation processors. Like I said before, Blue State built a donation API for us to post to, and then they had load balancing on their end. They had two nodes behind their endpoint.
We put ours on EC2, and we put them in two different regions. We put one payment processor in California, or it may have been Oregon. But, it was on the west coast. We put another payment processor in Virginia on the east coast.
So, if you had an IP address that was in the western side of the United States you'd be sent to the west coast payment processor, and the same for the east. If the west coast went down for some reason... There was actually a hurricane in Virginia that caused EC2 servers to go down during the campaign. All that traffic just got sent to the west coast. It was great. It was very redundant.
Once we got this system in place there was never a down time for accepting donations. We were accepting donations 100% of the time.
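The failover idea described above can be sketched roughly like this. The hostnames and the health-check interface are made up for illustration; the campaign's actual routing happened at the infrastructure level based on the user's IP, not in application code:

```javascript
// Sketch of geo-redundant donation routing: send users to the nearest
// payment processor and fail over to the other region when it's down.
const endpoints = {
  west: 'https://pay-west.example.com/donate',
  east: 'https://pay-east.example.com/donate',
};

// isUp is a health-check callback: region name in, boolean out
function chooseEndpoint(userRegion, isUp) {
  const primary = userRegion === 'west' ? 'west' : 'east';
  const backup = primary === 'west' ? 'east' : 'west';
  return isUp(primary) ? endpoints[primary] : endpoints[backup];
}
```

With two independent regions, either one can absorb all traffic alone, which is what kept donations flowing during the Virginia outage.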
To show you what that looks like, I use WebPagetest - which you guys should all use if you're not using it now. It's super easy to get data like this. The top film strip shows the fast platform. In one second we have a painted screen. That's a screen where the user can start filling out a donation. That's super fast. The only thing that's not loaded is the graphic assets. Those load by two seconds.
You can see our old platform doesn't even have anything on the screen by four seconds. That's awful.
We did a lot to increase the performance here. We had a 63% reduction in page weight. We just threw out all that legacy code and wrote our own. We went from something like 720 kilobytes to, like, 120 kilobytes. Then we had a 52% reduction in HTTP requests which is one of the most common things that contribute to page latency.
What did we get with an 80% faster time to paint? An increase in conversions by 14%. To measure that, we made a page on the fast platform that was identical to the slow platform. Then, we A/B tested them with Optimizely. 14% is not as big as the numbers I was talking about before, but this was in the beginning when we first launched this platform. This was the A/B test to put it into production.
When you calculate it against the $250 million that this platform brought in over its lifetime, that's $32 million. I'll take that. The money raised on the campaign was tight. Just by making that 80% faster we got $32 million. Obviously, this takes a lot more engineering, time, and effort, which is why it's less ROI than the copy and the imagery. But, this is huge. This is $32 million that we got just by making that faster.
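As a rough sanity check of that arithmetic, here's one way to run the numbers, treating the $250M lifetime total as already including the 14% lift. The talk doesn't specify exactly how the $32M was derived, so this is an assumption about the method:

```javascript
// Back out the incremental revenue attributable to the 14% conversion lift,
// assuming the $250M lifetime figure is the post-lift total.
const lifetimeRevenue = 250e6;
const lift = 0.14;
const withoutSpeedup = lifetimeRevenue / (1 + lift); // what the slow platform would have raised
const incremental = lifetimeRevenue - withoutSpeedup; // ≈ $31M, in the ballpark of the ~$32M quoted
```

Applying the 14% forward instead (250M × 0.14 = $35M) lands in the same range, so the quoted figure is plausible either way.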
The second area of optimization that I want to talk about is... Sorry. This is experimentation and user experience, which also takes a little bit more time.
The screen that you're looking at right now is a donate page that is already super optimized. This was later on in the campaign. We had run hundreds of tests on this page, and it was performing brilliantly. We ran a lot more experiments on it to try and increase the conversion rate, and we kept failing. We couldn't get the conversion rate up. So, we got really frustrated and we couldn't figure out what to do.
We decided to try something big. On the variation, we chunked the donation experience into four parts. If you look at this slide right here you see all 16 fields. It looks very intimidating to fill out. It looks like it's going to take you forever. But, if you look at this one, all you have to do is select an amount. That's a much lower barrier to entry. Then, you just go through it and it guides you through very nicely.
We tested this one. I like to call this the gradual incline instead of steep slope. We got a 5% conversion lift. Obviously, that's not as big as the numbers before. But, like I said, we had already picked all that low hanging fruit. So, 5% at that point was major, because we went a month or two where we couldn't get the conversion rate up at all.
That was a pretty big win for us. Like I said, it was on an already optimized page. You can see the two forms here. One is obviously much simpler to fill out, or it looks like it is.
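The "gradual incline" amounts to grouping the form's fields into sequential steps, so the first screen asks only for an amount. The field groupings below are illustrative assumptions, not the campaign's actual form:

```javascript
// Sketch of a chunked donation flow: four steps, each rendering only a
// small slice of the full form's fields.
const steps = [
  ['amount'],
  ['firstName', 'lastName', 'email', 'address'],
  ['cardNumber', 'expiry', 'cvv'],
  ['employer', 'occupation'],
];

// Only render the fields for the step the user is currently on
function fieldsForStep(stepIndex) {
  return steps[stepIndex] || [];
}
```

The total number of fields is unchanged; only the perceived effort at the first ask goes down, which is the point of the test.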
Here are some best practices I want to share with you guys. The first is start simple. You don't have to make this complicated. My motto in any engineering scenario at all is start simple and test up. You don't have to make a really fancy user experience. You don't have to make it all Ajaxy when you launch.
Just get something out there and get it into production, because done is better than perfect. Then, since you're in production so much earlier, you can start experimenting. For each feature that you roll out, you'll know what effect it has on the conversion rate because you can test it.
The second is always have a test running. If you have traffic coming to your site, which you probably do right now, and you're not running a test that's just wasted potential right there. Because you're not learning from the people that are going to your site. Always have a test running.
The third is don't be afraid to fail. I can't stress this one enough. I can't actually remember the numbers, but I want to say something like only 20% of our experiments on the campaign actually raised the conversion rate. A lot of them were a statistical tie where it resulted in nothing. Some of them even decreased the conversion rate. Those are pretty damaging psychologically, but you can't let that get you down.
I want to show you an example of this. Ignore the amount buttons. This is a bad screen shot. I don't know how this came about. But, everything was the same except for a little check box down there on the variation that says 'Save my payment information'.
Somebody had the idea that instead of asking on the follow-up screen to save your payment information, we would put it on the donate page. They thought maybe that would increase the conversion rate on saving people's payment information. Well, this slide is a little out of order.
That actually reduced our conversion rate by 44%. Right when we saw that we stopped the experiment immediately and just moved on. That's the whole thing about testing. It's not permanent. You can just move on. You might not even have thought that it would turn out that way. I'll go back to this slide. If you aren't failing then you aren't testing enough, because you're not going to have 100% success in your tests. It's just not possible.
The next area of optimization I want to talk about quickly is data gathering. You really cannot gather enough data. That's really my motto.
We on the campaign just gathered any kind of data that we could think of: error rates on forms, when people focused in the forms, and how long it took people to submit the form. We even measured how long it took, after the user hit submit, for our Ajax request to get a response from the server, so that we could tell the back-end engineers how long it was taking. Because we want it to be faster, obviously. Anything we could think of, we measured it.
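As a sketch of that kind of instrumentation, here is a hypothetical helper that times an Ajax submit and hands the elapsed milliseconds to a reporting callback. The `sendXhr` and `reportTiming` names are stand-ins for a real transport call and a real analytics call, not code from the campaign:

```javascript
// Hypothetical sketch: wrap an Ajax submit so we can measure how long
// the server takes to respond, and report that number somewhere
// (e.g. as an analytics event the back-end engineers can watch).
function timeSubmit(sendXhr, reportTiming) {
  var start = Date.now();
  return sendXhr().then(function (response) {
    reportTiming(Date.now() - start); // elapsed milliseconds
    return response;                  // pass the response through unchanged
  });
}
```

The point is less the code than the habit: every user interaction is a chance to record a number you can optimize later.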
Again, here's this number. We did over 668 million Google Analytics custom events. Here's an example of one. This is an interactive infographic that we put out to showcase our 1 million donors. It was pretty early on in the campaign. It has a lot of little pieces of interactive content there where you can scroll to see names, what are the most popular names people donated under, and where people are from.
One part of that is this little piece right here, which you can just scroll through and see the most popular names. We put Google Analytics custom events on the left arrow and the right arrow, and we found that 82% of the clicks were on the right arrow. So that left arrow was unnecessary, and it was just cluttering the UI and giving the user more options. You obviously want to be guiding the user through what you're presenting to them.
We used that learning to optimize our UIs further down the road, and we just didn't put left arrows on anything, because it doesn't really make sense. This is the Google Analytics custom event to track that data. It's super simple, and the values are arbitrary strings. The category is 'one million infographic'. The label is 'name slides'. Super simple.
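For reference, a classic (ga.js-era) Google Analytics custom event call of the kind described above looks like this. The category and label are the values named in the talk; the 'click' action is an assumption for illustration, since the talk only mentions the category and label:

```javascript
// Classic asynchronous Google Analytics (ga.js) custom event.
// Category and label are arbitrary strings you choose; here they are
// the values from the talk. The 'click' action is assumed.
var _gaq = _gaq || [];
_gaq.push(['_trackEvent',
  'one million infographic', // category
  'click',                   // action (assumed, not stated in the talk)
  'name slides'              // label
]);
```

Attaching one of these to each arrow is all it takes to learn that 82% of clicks go one way.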
The last area that I want to talk about is user testing. This is actually a really cool example, because it solved a problem that I don't think that we were going to be able to solve without user testing.
This is the last step in the donation process. This is where we're asking for your employer and occupation. This is required of us by the Federal Election Commission. So, there's no choice. We had to gather this information.
Well, when we put the error tracking on our donate form, we found out that the two most common errors behind people entering their credit card information were employer and occupation. We were like, 'Wow, that's really weird. How can that be such a hard question?'
We went through and looked at the data people were submitting. It was like, 'None of your business', 'F you'. People just aren't comfortable, right? So that was that. There's nothing we can do to make people more comfortable, really.
So we just left it at that until we started doing user testing. We took a lot of the volunteers that came into headquarters. There were a ton of them. There were students, there were retired people, all kinds of age ranges.
We sat them down at a computer running Silverback, and we asked them to make a donation. Sorry, I'm cheating a little bit. We found out that the students and the retired people did not know what to put in there. Because they're not employed.
Again, this is us thinking of ourselves as the users. We work for the campaign. 'I know where I work. I work for Obama for America.' That's a very simple question for me.
But, to a retired person it's like, 'What do I put in there?' So, they don't put anything, and then they hit submit and that triggers the error. That's why the error rate was going up so high on these forms.
Once we got that feedback from user testing and observing our users use our product, we put in a little tiny, and I don't know if you guys can see it, but it's just a little tiny line that says 'If you are retired please enter "retired" in both fields'. A little tiny bit of copy. It did not take us a long time to put that in there.
Adding that field hint reduced the error rate by 63%. That's just crazy. Like I said, we would not have known to test that beforehand if we weren't doing user testing and watching our users.
I blog about all of this stuff a lot on my personal website. It's kylerush.net. I go a lot more in depth on the technical side and cover a lot more experiments, if you want to check that out.
That's all I have for you guys. Thank you.
Cyrus Shepard (emcee): Let's step over here under the light...
Kyle: ...You want this?
Cyrus: Awesome work, man.
Kyle: Thank you.
Cyrus: I assume you're using the enterprise version of Google Analytics.
Kyle: Is there an enterprise version?
Cyrus: Yeah, yeah.
Kyle: I know that we had a direct line over there where we were like, 'Hey, our stuff's not loading, can you please do something?' They were, like, 'Refresh it, because there was too much data...'
Cyrus: Yes, yes...
Kyle: ...It was a lot going on.
Cyrus: One question I did want to ask. For your testing platform, did you build that yourself, or did you use an off the shelf version?
Kyle: No, we used Optimizely.
Cyrus: You used Optimizely.
Kyle: Yeah, which is awesome...
Cyrus: ...And, you'd recommend it?
Kyle: If you guys aren't using that, use Optimizely. It's amazing.
Cyrus: Yes, question?
Amanda: Is this on? There we go. Hi, my name's Amanda Stevens. I'm from a marketing agency in Winnipeg, Canada. Fantastic presentation. My question for you is, you talked a little bit about the design elements and the UX changes you made to the website to add that lift. I'm just wondering if you can expand on some other design elements that you incorporated to increase conversions.
Kyle: Yeah, sure. I don't want to be too harsh on design, but in my experience what we tested on design, embellishments and stuff, is just kind of a waste of time. It's fine if the designers want to put that in there. That's great.
But, like I said, when you're testing things like button colors and rounded corners versus square corners, do not waste your time with that. That's not going to do anything. It's a time sink.
Really, when it comes to design, our brand was all about imagery and photos. That's where we got the real big increases from design changes: imagery. Other than that, I wouldn't say that we found anything as far as design goes that had a real impact on the conversion rate.
Amanda: Cool. Thank you.
Alan: Hi, I'm Alan. I'm with Three Ventures Technology, an agency. I actually watched Dan speak at an analytics conference in San Francisco. One of the things that I would like to ask you about is why Optimizely and not Google Analytics Content Experiments with the multi-armed bandit approach, which basically minimizes the time it takes to reach a certain conversion rate at 95% probability. So, I mean, the amount of time it would take for an A/B test to finish at those rates.
Kyle: Yeah, sure. I can talk about this forever, but I'm going to make it really brief. If you're an engineer there's really no other option for you. Because Optimizely makes your life so, so easy.
We actually were tasked with finding other A/B testing platforms that were either cheaper or I don't know what the situation was. We evaluated a lot. I don't want to dump on other platforms, because every one has its use. But, for us on the campaign Optimizely was by far the best.
One of the problems with Google Analytics is the data's not live. Optimizely gives you live reporting on the results. So you can see right away if your experiment is dragging your conversion rate through the dirt, and you can stop the experiment.
There's nothing that we couldn't do in Optimizely. Any idea that we came up with we could do in Optimizely. We tried it in other platforms. There were a lot of limitations. From an engineering perspective that's why Optimizely is great. That's mainly why we chose to go with it.
Alan: Cool, awesome. Thank you.
Cyrus: And, I think we have time for one more. We'll go over here.
Q: Okay, so I work in fundraising. Most of the time the relationships that we're dealing with in terms of how long a person is going to donate is five or ten years, longer if we're talking about direct mail. So, it seems like a lot of what you were looking at is immediate return. I don't know if you had an LTV where you were saying we got a 60% increase in conversions, but it affected the LTV or even just the length of the relationship by X. Did you look at things like that?
Kyle: Yeah, we did. I would say it's very difficult to measure something like that, because it's not an exact event like a user on the page clicking something. But if you think about it, we've been raising money, not me personally but the campaign, since 2007. So there is a long-term donation cycle there.
The campaign is actually still raising money now. They have an organization called Organizing for Action that exists to support the President's legislative agenda. They're still raising money.
I would say that in a political campaign where it's so crazy and there's a deadline that is election day, which usually people do not have to deal with, it's more about the short term. But, they are still doing long term stuff. We just didn't have to worry about that as much because it was November 7, that's the day.
Q: Okay, thank you.
Cyrus: Kyle, thank you so much for coming to Seattle.
Want more? Kyle's coming back for MozCon 2014, and you can buy your MozCon 2014 ticket today and save $400.
Can't wait? Get a front-row experience for all 37 sessions, plus their slide decks, with the 2013 MozCon Video Bundle. Moz Analytics subscribers, you get a $100 discount: $399 regular price - $100 subscriber discount = $299 for the entire video bundle!