A/B testing is an internet marketing standard. In order to optimize response rates, you compare one page against another. You run with the page that gives you the best response rates.
But anyone who has tried A/B testing will know that whilst it sounds simple in concept, it can be problematic in execution. For example, it can be difficult to determine if what you're seeing is a tangible difference in customer behaviour or simply a result of chance. Is A/B testing an appropriate choice in all cases? Or is it best suited to specific applications? Does A/B testing obscure what customers really want?
In this article, we'll look at some of the gotchas for those new to A/B testing.
You set up a test. You've got one page featuring call-to-action A and one page featuring call-to-action B. You enable your PPC campaign and leave it running for a day.
When you stop the test, you find call-to-action A converted at twice the rate of call-to-action B. So call-to-action A is the winner, and we should run with it and eliminate option B.
But this would be a mistake.
The sample size may be insufficient. If we only tested one hundred clicks, we might get a significant difference in results between two pages, but that difference doesn't show up when we get to 1,000 clicks. In fact, the result may even be reversed!
So, how do we determine a sample size that is statistically significant? This excellent article explains the maths. However, there are various online sample size calculators that will do the calculations for you, including Evan's. Most A/B tracking tools will include sample size calculators, but it's a good idea to understand what they're calculating, and how, to ensure the accuracy of your tests.
In short, make sure you've tested enough of the audience to determine a trend.
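To make the arithmetic concrete, here is a minimal sketch (Python, standard library only) of the standard two-proportion sample size formula that calculators like these implement. The baseline and target conversion rates below are illustrative assumptions, not figures from the article.

```python
import math
from statistics import NormalDist

def sample_size_per_variation(p1, p2, alpha=0.05, power=0.8):
    """Visitors needed per variation to detect a change from a baseline
    conversion rate p1 to p2 with a two-sided two-proportion z-test."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)   # significance threshold
    z_beta = NormalDist().inv_cdf(power)            # power threshold
    p_bar = (p1 + p2) / 2                           # pooled conversion rate
    numerator = (z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(numerator / (p1 - p2) ** 2)

# Detecting a lift from a 5% to a 7% conversion rate needs roughly
# 2,200 visitors per variation - far more than a hundred clicks.
print(sample_size_per_variation(0.05, 0.07))
```

Note how the required sample grows as the difference you're trying to detect shrinks, which is why a hundred clicks is rarely enough to call a winner.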
Say we want to test a call-to-action metric: the number of people who click on the "find out more" link on a landing page. We find that a lot more people click on this link when we use the term "find out more" than when we use the term "buy now".
But what if the conversion rate for those who actually make a purchase falls as a result? We achieved higher click-thrus on one landing page at the expense of actual sales.
This is why it's important to be clear about the end goal when designing and executing tests. Also, ensure we look at the process as a whole, especially when we're chopping the process up into bits for testing purposes. Does a change in one place affect something else further down the line?
In this example, you might A/B test the landing page whilst keeping an eye on your total customer numbers, deeming the change effective only if customer numbers also rise. If your aim was only to increase click-thru, say to boost quality scores, then the change was effective.
In the example above, we know the "what". We changed the wording of a call-to-action link, and we achieved higher click-thrus, although we're still in the dark as to why. We're also in the dark as to why the change of wording resulted in fewer sales.
Was it because we attracted more people who were information seekers? Were buyers confused about the nature of the site? Did visitors think they couldn't buy from us? Were they price shoppers who wanted to compare price information up front?
We don't really know.
But that's fine, so long as we keep asking questions. These types of questions lead to more ideas for A/B tests. By turning testing into an ongoing process, supported by asking more, and hopefully better, questions, we're more likely to discover a whole range of "whys".
If you're a small company competing directly with big companies, you may already be on the back foot when it comes to A/B testing.
A/B testing's very modularity can cause problems. What about cases where the number of tests that can be run at once is low? A/B testing makes sense on big websites where you can run hundreds of tests per day across hundreds of thousands of hits, but in channels like direct mail only a few offers can be tested at one time. The variance that these tests reveal is often so low that any meaningful statistical analysis is impossible.
Put simply, you might not have the traffic to generate statistically significant results. There's no easy way around this problem, but the answer may lie in getting tricky with the maths.
Experimental design massively and deliberately increases the amount of variance in direct marketing campaigns. It lets marketers project the impact of many variables by testing just a few of them. Mathematical formulas use a subset of combinations of variables to represent the complexity of all the original variables. That allows the marketing organization to more quickly adjust messages and offers and, based on the responses, to improve marketing effectiveness and the company's overall economics.
Another thing to consider: if you're certain the bigger company is running A/B tests, and achieving good results, then "steal" their landing page. Of course, you can't really steal their landing page, but you can be "influenced by" their approach. Take their ideas for landing pages and use them as a test against your existing pages.
What your competitors do is often a good starting point for your own tests. Try taking their approach and refining it.
Are there alternatives to A/B testing?
Some swear by the multi-armed bandit methodology:
The multi-armed bandit problem takes its terminology from a casino. You are faced with a wall of slot machines, each with its own lever. You suspect that some slot machines pay out more frequently than others. How can you learn which machine is the best, and get the most coins in the fewest trials?
Like many techniques in machine learning, the simplest strategy is hard to beat. More complicated techniques are worth considering, but they may eke out only a few hundredths of a percentage point of performance.
A multi-armed bandit algorithm aggressively (and greedily) optimizes for the currently best-performing variation, so the worse-performing versions end up receiving very little traffic (mostly in the explorative 10% phase). This small amount of traffic means that when you try to calculate statistical significance, there's still a lot of uncertainty about whether the variation is "really" performing worse, or whether its current worse performance is due to random chance. So a multi-armed bandit algorithm takes a lot more traffic to declare statistical significance than the simple randomization of A/B testing. (But, of course, in a multi-armed bandit campaign, the average conversion rate is higher.)
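A small simulation makes the explore/exploit trade-off concrete. This is a sketch of the epsilon-greedy strategy the passage alludes to; the 10% exploration rate and the conversion rates are illustrative assumptions, not figures from any real campaign.

```python
import random

def epsilon_greedy(conversion_rates, trials=50000, epsilon=0.1, seed=42):
    """Epsilon-greedy bandit: a 10% slice of traffic explores variations
    at random; the rest exploits the best observed conversion rate."""
    rng = random.Random(seed)
    shows = [0] * len(conversion_rates)
    wins = [0] * len(conversion_rates)
    for _ in range(trials):
        if rng.random() < epsilon:
            arm = rng.randrange(len(conversion_rates))   # explore
        else:                                            # exploit current best
            arm = max(range(len(conversion_rates)),
                      key=lambda i: wins[i] / shows[i] if shows[i] else 0.0)
        shows[arm] += 1
        if rng.random() < conversion_rates[arm]:         # simulated conversion
            wins[arm] += 1
    return shows, wins

# Variation B (a true 10% conversion rate) ends up with the bulk of the
# traffic; variation A (5%) is starved, so its observed rate stays uncertain.
shows, wins = epsilon_greedy([0.05, 0.10])
print(shows)
```

The starved arm's small sample is exactly why declaring significance takes longer under a bandit than under an even 50/50 split.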
Multivariate testing may be suitable if you're testing a combination of variables, as opposed to just one. If, for instance, one page element has three variants, a second has two, and a third has three, there would be 3x2x3 = 18 different versions to test.
The problem with multivariate tests is that they can get complicated pretty quickly and require a lot of traffic to produce statistically significant results. One advantage of multivariate testing over A/B testing is that it can tell you which part of the page is most influential. Was it a graphic? A headline? A video? If you're testing a page using an A/B test, you won't know. Multivariate testing will tell you which page sections influence the conversion rate and which don't.
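The combinatorial growth is easy to see in code. A short sketch with hypothetical page elements (the headlines, images and button labels below are invented for illustration):

```python
from itertools import product

# Hypothetical elements: 3 headlines x 2 hero images x 3 button labels.
headlines = ["Save time", "Save money", "Work smarter"]
images = ["photo", "illustration"]
buttons = ["Buy now", "Find out more", "Start free trial"]

# Every page version is one combination of the three elements.
variants = list(product(headlines, images, buttons))
print(len(variants))  # 3 x 2 x 3 = 18 versions, each needing its own traffic
```

Add one more three-option element and the count jumps to 54, which is why multivariate tests demand so much traffic.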
So is A/B testing worthwhile? Are the alternatives better?
The methodology we choose will only be as good as the test design. If tests are poorly designed, then the maths, the tests, the data and the software tools won't be much use.
To construct good tests, you should first take a high level view:
Start the test by first asking yourself a question, something along the lines of, "Why is the engagement rate of my site lower than that of my competitors?" Collect information about your product from customers before setting up any big test. If you plan to test your tagline, run a quick survey among your customers asking how they would define your product.
Secondly, consider the limits of testing. Testing can be a bit of a heartless exercise. It's cold. We can't really test how memorable or how liked one design is over another, and typically have to go by instinct on some questions. Sometimes, certain designs just work for our audience, and other designs don't. How do we test whether we're winning not just business, but also hearts and minds?
Does it mean we really understand our customers if they click this version over that one? We might see how they react to an offer, but that doesn't mean we understand their desires and needs. If we're getting click-backs most of the time, then it's pretty clear we don't understand the visitors. Changing a graphic here, and wording there, isn't going to help if the underlying offer is not what potential customers want. No amount of testing ad copy will sell a pink train.
The understanding of customers is gained in part by tests, and in part by direct experience with customers and the market we're in. Understanding comes from empathy. From asking questions. From listening to, and understanding, the answers. From knowing what's good, and bad, about your competitors. From providing options. From open communication channels. From reassuring people. You're probably armed with this information already, and that information is highly useful when it comes to constructing effective tests.
Do you really need A/B testing? Used well, it can markedly improve and hone offers, but it isn't a magic bullet. Understanding your audience is the most important thing. Google, a company that uses testing extensively, seems most vulnerable in areas that require a more intuitive understanding of people. Google Glass is a prime example of failing to understand social context. Apple, on the other hand, were driven more by an intuitive approach. As Jobs put it: "We built [the Mac] for ourselves. We were the group of people who were going to judge whether it was great or not. We weren't going to go out and do market research."
A/B testing can work wonders, just so long as it isn't used as a substitute for understanding people.
Last October Vedran Tomic wrote a guide for local SEO which has since become one of the more popular pages on our site, so we decided to follow up with a Q&A on some of the latest changes in local search.
Q: Google appears to have settled their monopolistic abuse charges in Europe. As part of that settlement they have to list 3 competing offers in their result set from other vertical databases. If Google charges for the particular type of listing then these competitors compete in an ad auction, whereas if the vertical is free those clicks to competitors are free. How long do we have until Google's local product has a paid inclusion element to it?
A: The local advertising market is huge. It's a market that Google still hasn't mastered, and one still dominated by IYP platforms.
Since search in general is stagnant, Google will be looking to increase their share of the market.
That was obvious to anyone who covered Google's attempt to acquire Groupon, since social couponing is mostly a local marketing phenomenon.
Their new dashboard is not only more stable with a slicker interface, but also capable of facilitating any paid inclusion module.
I would guess that Google will not wait a long time to launch a paid inclusion product or something similar, since they want to keep their shareholders happy.
Q: In the past there have been fiascos with things like local page cross-integration with Google+. How "solved" are these problems, and how hard is it to isolate these sorts of issues from other potential issues?
A: Traditionally, Google had the most trouble with their "local" products. Over the years, they were losing listings, reviews, merging listings, duplicating them etc. Someone called their attempts "a train wreck at the junction." They were also notoriously bad with providing guidance that would help local businesses navigate the complexity of the environment Google created.
Google has also faced some branding challenges - confusing even the most seasoned local search professionals with their branding.
Having said that, things have been changing for the better. Google has introduced phone support which is, I must say, very useful. In addition, the changes they made in the way they deal with local data have made things more stable.
However, I'd still say that Google's local products are their biggest challenge.
Q: Yelp just had strong quarterly results and Yahoo! has recently added a knowledge-graph-like pane to their search results. How important is local search on platforms away from Google? How aligned are the various local platforms on ranking criteria?
A: Just like organic search is mostly about two functions - importance and relevance, local search is about location prominence, proximity and relevance (where location prominence is an equivalent to importance in general SEO).
All local search platforms have ranking factors that are based on these principles.
The only things that differ are what they consider ranking signals and the weight they place on each. For example, to rank high in Yahoo! Local, one needs to be very close to the centroid of the town, have something in the title of their business that matches the search query, and have a few reviews.
Google is more sophisticated, but the principles are the same.
The less sophisticated local search platforms use fewer signals in their algorithms, and are usually geared more towards proximity as a ranking signal.
It's also important to note that local search functions as a very interconnected ecosystem, and changes made to boost visibility in one platform might hurt you in another.
Q: There was a Google patent where they mentioned using driving directions to help as a relevancy signal. And Bing recently invested in and licensed data from Foursquare. Are these the sorts of signals you see taking weight from things like proximity over time?
A: I see these signals increasing in importance over time, as they would be useful ranking signals. However, to Google, local search is also about location sensitivity, and these signals will probably not be used outside of that context.
If you read the patent named "Methods And Systems For Improving A Search Ranking Using Location Awareness" (Amit Singhal is one of the inventors), you will see that Google is, in fact, aware that people have different sensitivities to different types of services/queries. You don't necessarily care where your plumber comes from, but you do care where the pizza places are when you search for pizza in your location.
I don't see driving directions as a signal ever de-throning proximity, because proximity is closer to the nature of the offline/online interaction.
Q: There are many different local directories which are highly relevant to local, while there are also vertical specific directories which might be tied to travel reviews or listing doctors. Some of these services (say like OpenTable) also manage bookings and so on. How important is it that local businesses "spread around" their marketing efforts? When does it make sense to focus deeply on a specific platform or channel vs to promote on many of them?
A: This is a great question, Aaron! About 5 years ago, I believed that the only true game in town for any local business was Google. This was because, at that time, I wasn't invested in proper measurement of outcomes and metrics such as cost of customer acquisition, lead acquisition etc.
Local businesses, famous for their lack of budgets, should always "give" vertical platforms a try, even IYP type sites. This is why:
Keep in mind, basics need to be covered first: data aggregators, Google Places, creating a professional/usable/persuasive website, as well as developing a measurement model.
Q: What is the difference between incentivizing a reasonable number of reviews & being so aggressive that something is likely to be flagged as spam? How do you draw the line with trying to encourage customer reviews?
A: Reviews and review management have always been tricky, as well as important. We know two objective things about reviews:
Every local search/review platform worth its salt will have a policy in place discouraging incentivizing and "buying" reviews. They will enforce this policy using algorithms or humans. We all know that.
Small and medium sized businesses make the mistake of trying to get as many reviews as humanly possible, and directing them all to one or two local search platforms. Here, they make two mistakes:
1. they're driven by a belief that one needs a huge number of reviews on Google and
2. one needs to direct all their review efforts at Google.
This behavior gets them flagged, algorithmically or manually. Neither Google nor Yelp wants you to solicit reviews.
However, if you change your approach from aggressively asking for reviews to a survey-based approach, you should be fine.
What do I mean by that?
A survey-based approach means you solicit your customers' opinions on different services/products to improve your operations - and then ask them to share their opinion on the web while giving them plenty of choices.
This approach will get you much further than mindlessly begging people for reviews and sending them to Google.
The problem with drawing a clear distinction between the right and wrong way of handling reviews, as far as Google goes, lies in their constantly changing review guidelines.
Things to remember: try to get reviews on plenty of sites, while surveying your customers, and never get too aggressive. Slow and steady wins the race.
Q: On many local searches people are now getting carouseled away from generic searches toward branded searches before clicking through, and then there is keyword (not provided) on top of that. What are some of the more cost-efficient ways a small business can track and improve their ranking performance when so much of the performance data is hidden/disconnected?
A: Are you referring to ranking in Maps or the organic part of the results? I'm asking because Google doesn't blend anymore.
Q: I meant organic search
A: OK. My advice has always been to not obsess over rankings, but over customer acquisition numbers, leads, lifetime customer value etc.
However, rankings are objectively a very important piece of the puzzle. Here are my suggestions when it comes to more cost efficient ways to track and improve ranking performance:
Q: If you are a local locksmith, how do you rise above the spam which people have publicly complained about for at least 5 years straight now?
A: If I were a local locksmith, I would seriously consider moving my operations close to the centroid of my town/city. I would also make sure my business data across the web is highly consistent.
In addition, I would make sure to facilitate getting reviews on many platforms. If that weren't enough (as it often isn't in many markets), I would be public about Google's inability to handle locksmith spam in my town, using their forums and any other medium.
Q: In many cities do you feel the potential ROI would be high enough to justify paying for downtown real estate then? Or would you suggest having a mailing related address or such?
A: The ROI of getting a legitimate downtown address would depend greatly on customer lifetime value. For example, if I were a personal injury attorney in a major city, I would definitely consider opening a small office near the center of my city/town.
Another thing to consider would be the search radius/location sensitivity. If the location sensitivity for a set of keywords is high, I would be more inclined to invest in a downtown office.
I wouldn't advocate PO boxes or virtual offices, since Google is getting more aggressive about weeding those out.
Q: Google recently started supporting microformats for things like hours of operation, phone numbers, and menus. How important is it for local businesses to use these sorts of features?
A: It is not a crucial ranking factor, and is unlikely to be any time in the near future. However, Google tends to reward businesses that embrace their new features - at least in local search. I would definitely recommend embracing microformats in local search.
Q: As a blogger I've noticed an increase in comment spam with NAP information in it. Do you see Google eventually penalizing people for that? Is this likely to turn into yet another commonplace form of negative SEO?
A: This is a difficult question. Knowing how Google operates, it's possible they start penalizing that practice. However, I don't see that type of spam being particularly effective.
Most blogs cannot do a lot to enhance location prominence. But if that turned into a negative SEO avenue, I would say that Google wouldn't handle it well (based on their track record).
Q: Last year you wrote a popular guide to local search. What major changes have happened to the ecosystem since then? Would you change any of the advice you gave back then? Or has local search started to become more stable recently?
A: There weren't huge changes in the local ecosystem. Google has made a lot of progress in transferring accounts to the new dashboard, improving the Bulk upload function. They also changed their UX slightly.
Moz entered the local search space with their Moz Local product.
Q: When doing a local SEO campaign, how much of the workload tends to be upfront stuff versus ongoing maintenance work? For many campaigns is a one-off effort enough to last for a significant period of time? How do you determine the best approach for a client in terms of figuring out the mix of upfront versus maintenance and how long it will take results to show and so on?
A: This largely depends on the objective of the campaign, the market and the budget. There are verticals where local Internet marketing is extremely competitive, and tends to be a constant battle.
Some markets, on the other hand, are easy and can largely be a one-off thing. For example, if you're a plumber or an electrician in a small town with a service area limited to that town, you really don't need much maintenance, if any.
However, if you are a roofing company that wants to be a market leader in greater Houston, TX, your approach has to be much different.
The upfront work tends to be more intense if the business has NAP inconsistencies, never did any Internet marketing and doesn't excel at offline marketing.
If you're a brand offline and know how to tie your offline and online marketing efforts together, you will have a much easier time getting the most out of the web.
In most smaller markets, the results can be seen in a span of just a few months. More competitive markets, in my experience, require more time and a larger investment.
Q: When does it make sense for a local business to DIY versus hiring help? What tools do you recommend they use if they do it themselves?
A: If a local business owner is in a position where doing local Internet marketing is their highest value activity, it would make sense to do it themselves.
However, more often than not, this is not the case even for the smallest of businesses. Being successful in local Internet marketing in a small market is not that difficult. But it does come with a learning curve and a cost in time.
Having said that, if the market is not that competitive, taking care of data aggregators, a few major local search platforms and acquisition of a handful of industry links would do the trick.
For data aggregators, one might go directly to them or use a tool such as UBM or Moz Local.
To dig for citations, Whitespark's citation tool is pretty good and not that expensive.
Q: The WSJ recently published a fairly unflattering article about some of the larger local search firms which primarily manage AdWords for tens of thousands of clients & rely on aggressive outbound marketing to offset high levels of churn. Should a small business consider paid search & local as being separate from one another or part of the same thing? If someone hires help on these fronts, where's the best place to find responsive help?
A: "Big box" local search companies were always better about client acquisition than performance. It always seemed as if performance wasn't an integral part of their business model.
However, small businesses cannot take that approach when it comes to performance. Generally speaking, the more the web is connected to the business, the better off a small business is. This means that a local Internet marketing strategy should start with business objectives.
Everyone should ask themselves 2 questions:
1. What's my lifetime customer value?
2. How much can I afford to spend on acquiring a customer?
Every online marketing endeavor should be judged through this lens. This means greater integration.
Q: What are some of the best resources people can use to get the fundamentals of local search & to keep up with the changing search landscape?
A: Luckily for everyone, the blogosphere in local search is rich in useful information. I would definitely recommend Mike Blumenthal's blog, Andrew Shotland's Local SEO Guide, Linda Buquet's forum, Nyagoslav Zhekov, Mary Bowling and, of course, the Local U blog.
Vedran Tomic is a member of SEOBook and founder of Local Ants LLC, a local internet marketing agency. Please feel free to use the comments below to ask any local search questions you have, as Vedran will be checking in periodically to answer them over the next couple days.
There's the safe way & the high risk approach. The shortcut takers & those who win through hard work & superior offering.
One is white hat and the other is black hat.
With the increasing search ecosystem instability over the past couple years, some see these labels constantly sliding, sometimes on an ex-post-facto basis, turning thousands of white hats into black hats arbitrarily overnight.
Are you a white hat SEO? Or a black hat SEO?
Do you even know?
Before you answer, please have a quick read of this Washington Post article highlighting how Google manipulated & undermined the US political system.
It's fantastic journalism & an important read for anyone who considers themselves an SEO.
Take the offline analog of Google's search "quality" guidelines and, in spirit, Google has repeatedly violated every single one of them.
Creating links that weren't editorially placed or vouched for by the site's owner on a page, otherwise known as unnatural links, can be considered a violation of our guidelines. Advertorials or native advertising where payment is received for articles that include links that pass PageRank.
Advertorials are spam, except when they are not: "the staff and professors at GMU's law center were in regular contact with Google executives, who supplied them with the company's arguments against antitrust action and helped them get favorable op-ed pieces published"
Don't deceive your users.
Ads should be clearly labeled, except when they are not: "GMU officials later told Dellarocas they were planning to have him participate from the audience," which is just like an infomercial that must be labeled as an advertisement!
Make reasonable efforts to ensure that advertisements do not affect search engine rankings. For example, Google's AdSense ads and DoubleClick links are blocked from being crawled by a robots.txt file.
Money influencing outcomes is wrong, except when it's not: "Google's lobbying corps - now numbering more than 100 - is split equally, like its campaign donations, among Democrats and Republicans. ... Google became the second-largest corporate spender on lobbying in the United States in 2012."
The best way to get other sites to create high-quality, relevant links to yours is to create unique, relevant content that can naturally gain popularity in the Internet community. Creating good content pays off: Links are usually editorial votes given by choice, and the more useful content you have, the greater the chances someone else will find that content valuable to their readers and link to it.
Payment should be disclosed, except when it shouldn't: "The school and Google staffers worked to organize a second academic conference focused on search. This time, however, Googleâ€™s involvement was not publicly disclosed."
Cloaking refers to the practice of presenting different content or URLs to human users and search engines. Cloaking is considered a violation of Google's Webmaster Guidelines because it provides our users with different results than they expected.
Cloaking is evil, except when it's not: even as Google executives peppered the GMU staff with suggestions of speakers and guests to invite to the event, the company asked the school not to broadcast its involvement. "We will certainly limit who we announce publicly from Google"
...and on and on and on...
It's not safe to assume that just because a specific deceptive technique isn't included on this page, Google approves of it.
And while they may not approve of something, that doesn't mean they avoid the strategy when mapping out their own approach.
There's a lesson & it isn't a particularly subtle one.
Free markets aren't free. Who could have known?
There's a case study on Moz on how to get your site back following a link penalty. An SEO working on a client's site describes what happened when their client got hit with a link penalty. Even though the link penalty didn't appear to be their fault, it still took months to get their rankings back.
Some sites aren't that lucky. Some sites don't get their rankings back at all.
The penalty was due to a false positive. A dubious site linked out to a number of credible sites in order to help disguise its true link target. The client's site was one of those credible sites, mistaken by Google for a bad actor. It just goes to show how easily credible sites can get hit by negative SEO, and variations thereof.
There's a tactic in there, of course.
Tired of trying to rank better? Need a quicker way? Have we got a deal for you!
Simply build a dubious link site, point some rogue links at sites positioned above yours, and wait for Google's algorithm to do the rest. If you want to get a bit tricky, link out to other legitimate sites, too. Like Wikipedia. Google, even. This will likely confuse the algorithm for a sufficient length of time, giving your tactic time to work.
Those competitors who get hit, and who are smart enough to work out what's going on, may report your link site, but, hey, there are plenty more link sites where that came from. Roll another one out, and repeat. So long as your link site can't be connected with you - different PC, different IP address, etc - then what have you got to lose? Nothing much. What have your competitors got to lose? Rank, a lot of time, effort, and the very real risk they won't get back into Google's good books. And that's assuming they work out why they lost rankings.
I'm not advocating this tactic, of course. But we all know it's out there. It is being used. And the real-world example above shows how easy it is to do. One day, it might be used against you, or your clients.
Grossly unfair, but what can you do about it?
Pleading with Google is not much of a strategy. Apart from anything else, it's an acknowledgement that the power is not in your hands, but in the hands of an unregulated arbiter who likely views you as a bit of an annoyance. It's no wonder SEO has become so neurotic.
It used to be the case that competitors could not take you out by pointing unwanted links at you. No longer. So even more control has been taken away from the webmaster.
The way to manage this risk is the same way risk is managed in finance: diversification. You could invest all your money in one company, or you could split it between multiple companies, banks, bonds and other investment classes. If you're invested in one company and it goes belly up, you lose everything. If you invest in multiple companies and investment classes, then you're not as affected if one company gets taken out. In other words, don't put all your eggs in one basket.
It's the same with web traffic.
1. Multiple Traffic Streams
If you only run one site, try to ensure your traffic is balanced. Some traffic from organic search, some from PPC, some from other sites, some from advertisements, some from offline advertising, some from email lists, some from social media, and so on. If you get taken out in organic search, it won't kill you. Alternative traffic streams buy you time to get your rankings back.
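One way to put a rough number on how balanced your traffic really is would be a Herfindahl-style concentration score, borrowed from finance and economics. This is an illustrative sketch, not a standard SEO metric; the function name and example channel shares are my own:

```python
def traffic_concentration(shares):
    """Herfindahl-style concentration score for traffic sources.

    shares: dict mapping source name -> fraction of total traffic
    (the fractions should sum to roughly 1). Returns a value between
    0 and 1: 1.0 means all traffic comes from a single source (maximum
    risk), while an evenly split profile scores much lower.
    """
    return sum(fraction ** 2 for fraction in shares.values())

# All eggs in one basket: maximum concentration.
risky = {"organic": 1.0}

# Traffic balanced across four channels.
diversified = {"organic": 0.25, "ppc": 0.25, "social": 0.25, "email": 0.25}
```

A score near 1.0 is a warning that losing a single channel, such as organic search, would wipe out most of your traffic; the balanced profile above scores 0.25.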
2. Multiple Pages And Sites
A "web site" is a construct. Is it a construct applicable to a web that mostly orients around individual pages? If you think in terms of pages, as opposed to a site, then it opens up more opportunities for diversification.
Pages can, of course, be located anywhere, not just on your site. These may take the form of well written, evergreen articles published on other popular sites. Take a look at the top sites in closely related niches and see if there are any opportunities to publish your content on them. Not only does this make your link graph look good, so long as it's not overt, you'll also achieve more diversity.
Consider Barnacle SEO.
Will creatively defines the concept of barnacle SEO as follows:
Attaching oneself to a large fixed object and waiting for the customers to float by in the current.
Directly applied to local search, this means optimizing your profiles or business pages on a well-trusted, high-ranking directory and working to promote those profiles instead of, or in tandem with, your own website.
You could also build multiple sites. Why have just one site when you can have five? Sure, there's more overhead, and it won't be appropriate in all cases, but again, the multiple site strategy is making a comeback due to Google escalating the risk of having only one site. This strategy also helps get your eggs into multiple baskets.
3. Prepare For the Worst
If you've got most of your traffic coming from organic search, then you're taking a high risk approach. You should manage that risk down with diversification strategies first. Part of the strategy for dealing with negative SEO is not to make yourself so vulnerable to it in the first place.
If you do get hit, have a plan ready to go to limit the time you're out of the game. The cynical might suggest you have a name big enough to make Google look bad if they don't show you.
Lyrics site Rap Genius says that it is no longer penalized within Google after taking action to correct "unnatural links" that it helped create. The site was hit with a penalty for 10 days, which meant people seeking it by name couldn't find it.
For everyone else, here's a pretty thorough guide about how to get back in.
Have your "plead with Google" gambit ready to go at a moment's notice. The lead time to get back into Google can be long, so the sooner you get onto it, the better. Of course, this is really the last course of action. It's preferable not to make yourself that vulnerable in the first place.
Bing recently started testing listing 'alternatives' near their local search results.
I wasn't able to replicate these in other search verticals like flight search, or on an iPhone search, but the format of these alternatives looks similar to the format proposed in Google's ongoing monopolistic abuse case in Europe:
"In effect, competitors will have the 'choice' either to pay Google in order to remain relevant or lose visibility and become irrelevant," a European consumer watchdog, BEUC, said in a letter it sent to all 28 EU commissioners. The letter, seen by The Wall Street Journal, terms the deal "unacceptable."
Guest blogging was once considered a widely recommended white hat technique.
Today our monopoly-led marketplace arbitrarily decided this is no longer so.
Stick a fork in it. Torch it. Etc.
Now that rules have changed ex post facto, we can expect to deal with a near endless stream of "unnatural" link penalties for doing what was seen at the time as being:
Google turns your past client investments into new cost centers & penalties. This ought to be a great thing for the SEO industry. Or maybe not.
As Google scares & expunges smaller players from participating in the SEO market, larger companies keep chugging along.
Today a friend received the following unsolicited email:
Curious about their background, he looked up their past coverage: "Written then offers a number of different content licenses that help the advertiser reach this audience, either by re-branding the existing page, moving the content to the advertiser's website and re-directing traffic there, or just re-publishing the post on the brand's blog."
So that's basically guest blogging at scale.
And it's not only guest blogging at scale, but it is guest blogging at scale based on keyword performance:
"You give us your gold keywords. Written finds high-performing, gold content with a built-in, engaged audience. Our various license options can bring the audience to you or your brand to the audience through great content."
What's worse is how they pitch this to the people they license content from:
I'm sorry, but taking your most valuable content & turning it into duplicate content by syndicating it onto a fortune 500 website will not increase your traffic. The fortune 500 site will outrank you (especially if visitors/links are 301 redirected to their site!). And when visitors are not redirected, they will still typically outrank you due to their huge domain authority (and the cross-domain rel=canonical tag), leading your content on your site to get filtered out of the search results as duplicate content & your link equity to pass on to the branded advertiser.
And if Google were to come down on anyone in the above sort of situation it would likely be the smaller independent bloggers who get hit.
This is how SEO works.
Smaller independent players innovate & prove the model.
Google punishes them for being innovative.
As they are punished, a vanilla corporate tweak of the same model rolls out and is white hat.
In SEO it's not what you do that matters - it's who your client is.
If you're not working for a big brand, you're doing it wrong.
If the current war on SEOs by Google wasn't bad enough for those who own the sites they work on, it is doubly hard for the SEO working for a client. When the SEO doesn't have sufficient control over the strategy and technology, it can be difficult to get and maintain rankings.
In this post, we'll take a look at the challenges and common objections the SEO faces when working on a client site, particularly a client who is engaging an SEO for the first time. The SEO will need to fit in with developers, designers and managers who may not understand the role of SEOs. Here are common objections you can expect, and some ideas on how to counter them.
The objection is that SEO gets in the way. It's too hard.
It's true. SEO is complicated. It can often compromise design and site architecture. To managers and other web technicians, SEO can look like a dark art. Or possibly a con. There are no fixed rules as there are in, say, coding, and results are unpredictable.
So why spend time and money on SEO?
One appropriate response is "because your competitors are".
Building a website is the equivalent of taking the starting line in a race. Some site owners think that's all they need to do. However, the real race starts after the site is built. Every other competitor has a web site, and they're already off and running in terms of site awareness. Without SEO, visitors may find a site, but if the site owner is not using the SEO channel, and their competitors are, then their competitors have an advantage in terms of reach.
SEOs can do their thing after the site is built, but it's more difficult. As a result, it's likely to be more expensive. Baking SEO into the mix when the site is conceived and built is an easier route.
Just as copywriters require space to display their copy, SEOs require room to manoeuvre. They'll likely contribute to information architecture, copy, copy markup and internal linking structures. So start talking about SEO as early as possible, and particularly during information architecture.
There are three key areas where SEO needs to integrate with design. First, the requirement that text is machine readable. Search engines "think" mostly in terms of words, so topics and copy need to relate to search terms visitors may use.
Second, linking architecture and information hierarchies. If pages are buried deep in the site, but deemed important in terms of search, they will likely be elevated in the hierarchy to a position closer to the home page.
Third, crawlability. A search engine sends out a spider, which grabs the source code of your website and dumps it back into the search engine's database. The spider skips from page to page, following links. If a page doesn't have a crawlable link pointing to it, it will be invisible to search engines. There are various means of making a site easy to crawl, but one straightforward way is to use a site map, linked to from each page on the site. The SEO may also want to ensure the site navigation is crawlable.
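The crawlability point can be checked mechanically. The sketch below (standard-library Python, run against hypothetical page data rather than a live site) parses anchor tags the way a spider would, then flags pages that no crawlable link points to:

```python
from html.parser import HTMLParser


class LinkExtractor(HTMLParser):
    """Collects href targets from <a> tags: the links a spider can follow."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def orphan_pages(pages):
    """Given {url: html}, return the pages no crawlable link points to."""
    linked = set()
    for html in pages.values():
        extractor = LinkExtractor()
        extractor.feed(html)
        linked.update(extractor.links)
    return sorted(url for url in pages if url not in linked)
```

For example, a page containing only text and no inbound anchors anywhere on the site would show up as an orphan, i.e. invisible to a link-following spider.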
SEOs do need to tweak code; however, the mark-up itself is largely inconsequential.
SEOs need to specify title tags and some meta tags. These tags need to be unique for each page on the site, as each page is a possible entry page. A search visitor will not necessarily arrive at the home page first.
The title tag appears in search results as a clickable link, so serves a valuable marketing function. When search visitors consider which link on a search results page to click, the title tag and snippet will influence their decision. The title tag should, therefore, closely match the content of each page.
The second aspect concerns URLs. Ideally, a URL should contain descriptive words, as opposed to numbers and random letters. For example, acme.com/widgets/red-widgets.htm is good, whilst acme.com/w/12345678&tnr.php is less so.
The more often the keyword appears, the more likely it will be bolded on a search results page, and is therefore more likely to attract a click. It's also easier for the search engine to determine meaning if a URL is descriptive as opposed to cryptic.
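Descriptive slugs can be generated from page titles. The helper below is a minimal sketch, not a complete URL-rewriting solution (it ignores transliteration, duplicate slugs and so on); the function name is my own:

```python
import re


def slugify(title):
    """Turn a page title into a descriptive, keyword-bearing URL slug."""
    slug = title.lower()
    # Collapse each run of non-alphanumeric characters into a single hyphen.
    slug = re.sub(r"[^a-z0-9]+", "-", slug)
    return slug.strip("-")
```

So a product page titled "Red Widgets & Accessories" would get the slug "red-widgets-accessories", putting the keywords the visitor searched for directly into the URL.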
SEO Plugins cover the on-site basics. But ranking well involves more than covering the basics.
In order to rank well, a page needs to have links from external sites. The higher the quality of those sites, the more chance your pages have of ranking well. The SEO will look to identify linking possibilities, and point these links at various internal pages on the site.
It can be difficult, near impossible, to get high quality links to brochure-style advertising pages. Links tend to be directed at pages that have unique value.
So, the type and quality of content has more to do with SEO than the way that content is marked up by a generic plugin. The content must attract links and generate engagement. The visitor needs to see a title on a search result, click through, not click back, and, preferably take some action on that page. That action may be a click deeper into the site, a bookmark, a tweet, or some other measurable form of response.
Content that lends itself to this type of interaction includes blog posts, news feeds, and content intended for social network engagement. In this way, SEO-friendly content can be functionally separated from other types of content. Not every page needs to be SEO'd, so SEO can be sectioned off, if necessary.
If your aim, or your client's aim, is to attract as much targeted traffic as possible, then SEO integration must be taken just as seriously as design, development, copy and other media. SEO is more than a technical exercise; it's a strategic marketing exercise, much like Public Relations.
SEO considerations may influence your choice of CMS. It may influence your strategic approach in terms of what type of information you publish. It may change the way you engage visitors. Whilst SEO can be bolted-on afterwards, this is a costly and less-effective way of doing SEO, much like re-designing a site is costly and less effective than getting it right in the planning stage.
The reality of any marketing endeavour is that it will have a shelf-life. Sometimes, that shelf life is short. Other times, it can run for years.
SEO is vulnerable to the changes made by search engines. These changes aren't advertised in advance, nor are they easily pinned down even after they have occurred. This is why SEO is strategic, just as Public Relations is strategic. The Public Relations campaign you were using a few years ago may not be the same one you use now, and the same goes for SEO.
The core of SEO hasn't changed much. If you produce content visitors find relevant, and that content is linked to, and people engage with that content, then it has a good chance of doing well in search engines. However, the search engines constantly tweak their settings, and when they do, a lot of previous work - especially if that work was at the margins of the algorithms - can come undone.
So, ranking should never be taken for granted. The value the SEO brings is that they stay across underlying changes in the way the search engines work, and can adapt your strategy, and site, to those changes.
Remember, whatever problems you may have with the search engines, the same goes for your competitors. They may have dropped rankings, too. Or they may do so soon. The SEO will try to figure out why the new top ranking sites are ranked well, then adapt your site and strategy so that it matches those criteria.
PPC has many advantages. The biggest advantage is that you can get top positioning, and immediate traffic, almost instantly. The downside is, of course, you pay per click. Whilst this might be affordable today, keep in mind that the search engine has a business objective that demands they reward the top bidders who are most relevant. Their auction model forces prices higher and higher, and only those sites with deep pockets will remain in the game. If you don't have deep pockets, or don't want to be beholden to the PPC channel, a long term SEO strategy works well in tandem.
SEO and PPC complement one another, and lulls and challenges in one channel can be made up for by the other. Also, you can feed the keyword data from PPC to SEO to gain a deeper understanding of search visitor behaviour.
This is the reason for undertaking any marketing strategy.
An SEO should be able to demonstrate value. One way is to measure the visits from search engines before the SEO strategy starts, and see if these increase significantly post implementation. The value of each search click changes depending on your business case, but can be approximated using the PPC bid prices. Keep in mind the visits from an SEO campaign may be maintained, and increased, over considerable time, thus driving down their cost relative to PPC and other channels.
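The before-and-after comparison can be sketched as a simple calculation. This assumes the average PPC bid price is a fair proxy for the value of an organic visit, which, as noted above, is only an approximation; the function name and the figures are illustrative:

```python
def incremental_seo_value(visits_before, visits_after, avg_ppc_cpc):
    """Approximate monthly value added by an SEO campaign.

    Uses the average PPC cost-per-click as a stand-in for what each
    extra organic visit would otherwise have cost to buy.
    """
    extra_visits = max(visits_after - visits_before, 0)
    return extra_visits * avg_ppc_cpc


# e.g. 1,000 organic visits/month before the campaign, 2,500 after,
# valued at $0.80 per click:
value = incremental_seo_value(1000, 2500, 0.80)  # 1500 extra visits -> 1200.0
```

If those organic visits are then maintained month after month with little ongoing spend, the effective cost per visit keeps falling, which is the long-term case for SEO relative to PPC.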
Facebook. A mobile phone. Email. How often do you check them? Many of us have developed habits around these services.
The triggers that help create these habits can be baked into the design of websites. The obvious benefit of doing so is that if you create habits in your users, then you're less reliant on new search visitors for traffic.
I recently read a book called "Hooked: How To Build Habit-Forming Products" by Nir Eyal. Eyal is an entrepreneur who has built and sold two start-ups, including a platform to place advertising within online social games. He also writes for Forbes, TechCrunch, and Psychology Today about the intersection of psychology, technology, and business. This latest book is about how technology shapes behaviour.
If usability is about engineering a site to make things easier, then forming habits is about engineering user behaviour so visitors keep coming back. Forming habits in the user base is a marketer's dream, yet a lot of search marketing theory is built around targeting the new visitor. As competition rises on the web, traffic becomes more valuable, and the price rises.
Clicks are likely more profitable the less you have to pay for them. If visitors keep returning because the visitor has formed a habit, then thatâ€™s a much more lucrative proposition than having to continually find new visitors. Facebook is a habit. Email is a habit. Google is a habit. Amazon is a habit. We keep returning for that fix.
What techniques can we use to help build habits?
The book is well worth a read if you're interested in the psychology of repeat engagement. There are a lot of familiar topics presented in the book, with cross-over into other marketing territory such as e-mail and social media marketing, but I found it useful to think of engagement in terms of habit formation. Here's a taste of what Eyal has discovered about habit forming services.
1. Have A Trigger
A trigger is something that grabs your attention and prompts you to react to it. A trigger might be a photo of you that appears in a friend's Facebook feed. It might be the ping of an email. It might be a notification that someone has reacted to a comment you made on a forum. These triggers help condition a user to take an action.
2. Inspire Action
Action is taken when a user anticipates a reward. An example might be clicking on a link for a free copy of a book. There are two conditions needed for a reward to work: it must be easy, and there must be a strong motivation. The investment required - the click and attention - is typically a lower "cost" than the reward - the book. On social sites, like Facebook, the reward of the "like" click is the presumption of a social reward.
3. Variable Reward
The reward in response to the action must be variable. Something different should happen as the result of taking an action. The author gives the example of a slot machine. The reward might occur as the result of an action, or it might not. A slot machine would be boring if you got the exact same result each time you pulled the handle and spun the dials. The fact the slot machine only pays out sometimes is what keeps people coming back. All sports and games work on the basis of variable reward.
An online equivalent is Twitter or Facebook feeds. We keep looking at them because they keep changing. Some days, there isn't much of interest. Sometimes there is. Looking at that river of news going past can be an addictive habit, in part, because the reward changes.
4. Investment
The user must invest some time and do some work. Each time they invest time and work, they add something that improves the service. They may add friends on Facebook. They add follows on Twitter. They build up reputation in forums. By adding to it, the user makes the service more valuable, both to the owner of the service and to the user. The bigger and deeper the network grows, the more valuable it becomes. If all your friends are on it, it's valuable. This builds ever more triggers, makes actions easier and likely more frequent, and the reward more exciting.
The circle is complete. A habit is formed.
Habits create unprompted user engagement. The value is pretty obvious. There's likely a higher lifetime value per customer than from a one-off visit, or ongoing visits we have to pay for per click. We can spend less time acquiring new customers and more time growing the value of those we already have. If we create an easy mechanism by which that occurs, and spreads, then we're not as vulnerable to search engines.
If this all sounds very function- and product-oriented, well, it is. So how does this apply to a publisher's website? A product website that aims for a one-off sale?
For one-off sales, there aren't opportunities for habit formation in the same way as there might be for, say, Facebook.
Developers often give away free apps, but bill for continued use. Once the user gets in the habit of doing something, price becomes less of an issue. Price is much more of an issue before they form a habit, because they wonder if they will get value. Angry Birds, WhatsApp, et al created a habit first, then cashed in once it was established.
A call-to-action is a trigger. If we think about how calls-to-action work in social media and mobile applications, they tend to be big, bold and explicit. If users are in the habit of clicking big, bold buttons in other media, then try testing such buttons against your current calls-to-action on web pages. Look to mimic habits and routines your visitors might use in other applications.
Habits can be a defensive strategy. It's hard for a user to leave a company around which they've formed a habit. On the surface, there is a low switching cost between Google and, say, Bing, but how many people really do switch? Google has locked in users' habits by layering on services such as Gmail, or just through the simple act of having people used to its interfaces. Users' habits increase their switching cost.
There's a great line in the book:
"Many innovations fail because consumers irrationally overvalue the old while companies irrationally overvalue the new." - John Gourville
Changing user habits is very difficult. Even Google couldn't do it with Google Video vs the established YouTube. If you're thinking of getting into an established market, think about how you're going to break existing habits. A few new features probably aren't enough. If breaking established habits seems too difficult, you may decide to pick an entirely new niche and try to get users forming a habit around your offering before other early movers show up.
Eyal also discusses emotional triggers. He uses the example of Instagram where users form a habit for emotional reasons, namely the fear of missing out. The fear of missing out is a more passive, internal trigger.
After the trigger comes action. Usability is all about making it easy for the user to take action. Are you putting unnecessary sign-up stages in the way of a user taking action? Does the user really need to sign up before they take action? If you must have a sign up, how about making that process easier by letting people sign in with Facebook logins, or other shared services, where appropriate? Any barrier to action may lessen the chance of a user forming a habit.
Evan Williams, Blogger & Twitter:
Take a human desire, preferably one that has been around for a really long time...identify that desire, then take out steps
The technologies and sites that go big tend to mirror something people already do and have done for a long time. They just make the process easier and more efficient. Email is easier than writing and posting a letter. Creating a blog is easier than seeking a publishing deal or landing a journalism job at a newspaper. Sharing photos with Facebook is easier than doing so offline.
Apple worked on similar principles:
The most obvious thing is that Jobs wanted his products to be simple above all else. But Jobs realized early on that for them to be simple and easy to use, they had to be based on things that people already understood. (Design geeks have since given this idea a clunky name: so-called skeuomorphic user interfaces.) What was true of the first Macintosh graphical interface is true of the iPhone and iPad: the range of physical metaphors, and, eventually, the physical gestures that control them, map directly to what we already do in the real world. That's the true key to creating an intuitive interface, and Jobs realized it before computers could really even render the real world with much fidelity at all. [An example of "imputing" Apple's values on the smallest decisions: Jobs spent hours honing the window borders of the first Macintosh GUI. When his designers complained, he pointed out that users would look at those details for hours, so they had to be good.]
Reducing things to the essentials fosters engagement by making an action easier to take. If in doubt, take steps out, and see what happens.
Look for ways to reward the user when they take action. Forums use social rewards, such as reputation and status titles. Facebook has "Like" buttons. Inherent in this reward system is the thrill of pursuit. When a visitor purchases from you, or signs up for a newsletter, do you make the visitor feel like they've "won"?
Placing feeds on your site is another example of variable reward. The feed content is unpredictable, but that very unpredictability may be enough to keep people coming back. The same goes for blog posts. Compare this with a static brochure site, where the "reward" will always be the same.
Can you break a process down into steps where the user is rewarded for taking each little step towards a goal? The reward should match the desires of the visitor. Perhaps the reward is monetary, perhaps it's social. Gamification is becoming big business, and it's based around the idea of varying reward, action and triggers in order to foster engagement.
Gamification has also been used as a tool for customer engagement, and for encouraging desirable website usage behaviour. Additionally, gamification is readily applicable to increasing engagement on sites built on social network services. For example, in August 2010, one site, DevHub, announced that it had increased the number of users who completed their online tasks from 10% to 80% after adding gamification elements. On the programming question-and-answer site Stack Overflow, users receive points and/or badges for performing a variety of actions, including spreading links to questions and answers via Facebook and Twitter. A large number of different badges are available, and when a user's reputation points exceed various thresholds, he or she gains additional privileges, including, at the higher end, the privilege of helping to moderate the site.
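The reputation-threshold mechanic described above can be sketched in a few lines. The privilege names and numbers here are hypothetical, for illustration only, not Stack Overflow's actual values:

```python
def unlocked_privileges(reputation, thresholds):
    """Return the privileges a user has unlocked at a given reputation score.

    thresholds: dict mapping privilege name -> minimum reputation required.
    Each threshold crossed is a small, variable reward that keeps the
    user investing in the site.
    """
    return sorted(
        name for name, minimum in thresholds.items() if reputation >= minimum
    )


# Hypothetical privilege ladder.
ladder = {"comment": 50, "vote_down": 125, "moderate": 10000}
```

With this ladder, a user at 200 reputation can comment and vote down, but moderation stays out of reach: the next goal that keeps them coming back.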
This is "checking" behaviour. We check for something new. We get a variable reward for checking something new. If we help create this behaviour in our visitors, we get higher engagement signals, and we're less reliant on new visitors from search engines.
Checking habits may change in the near future as more and more informational "rewards" are added to smartphones. The paper argues that novel informational rewards can lead to habitual behaviors if they are very quickly accessible. In a field experiment, when the phone's contact book application was augmented with real-time information about contacts' whereabouts and doings, users started regularly checking the application. The researchers also observed that habit-formation for one application may increase habit-formation for related applications.
You've got to feel a little sorry for anyone new to the search marketing field.
On one side, they've got to deal with the cryptic black box that is Google: often inconsistent, always vague, and sometimes unfair in its dealings with webmasters. On the other side, webmasters must operate in competitive landscapes that often favour incumbent sites, especially if those incumbents are household names.
Sadly, much of the low hanging search fruit is gone. However, there are a number of approaches to optimization that don't involve link placement and keyword targeting.
Like any highly active and lucrative market sector, the web business can be challenging, but complaining about the nature of the environment will do little good. The only real option is to grab some boxing gloves, jump in the ring and compete.
In the last post, we talked about measurement. We need to make sure weâ€™re measuring the right things in order to win. This post is about measuring our competitors to see if we enjoy a competitive advantage. If not, we need to rethink our approach.
One of the problems with counting links, and other popular SEO metrics, is that they can be reductive. High link counts and pumped-up Google juice do not guarantee success, more traffic, or business success. For example, we might determine our competitor has X links from sites A, B and C, so we should do likewise. If we do likewise, plus a little more, then we win.
But often we don't.
We often don't win because there are multiple factors in play. Our competitor's site might rank for reasons that are difficult to determine, and even more difficult to emulate. They may have brand, engagement metrics or historical advantages. But most challenging of all, they could have some underlying competitive advantage that no amount of link building or ranking for keyword X by a new site will counter. They may just have a better offer.
There's an old joke about two guys out walking on the African savannah. They come across a hungry lion. The lion eyes them up, then charges them. One man turns and runs. The other yells at him: "You fool, you can't outrun a lion!" The running man yells back: "That's true, but I don't have to outrun the lion. I only have to outrun you!"
Once we figure out what Google wants, we then need to outrun other sites in our niche in order to win. Those sites have to deal with Google's whims, just like we do.
Typically, webmasters will reverse engineer competitor sites, using web metrics as scores to target and beat. Who is linking to this page? How old are the links? What are their most popular keywords? Where are they getting traffic from? That's part of the puzzle. However, we also need to evaluate non-technical factors that may be underpinning their business.
Competitive intelligence is an ongoing, systematic analysis of our competitors.
The goal of a competitor analysis is to develop a profile of the nature of strategy changes each competitor might make, each competitor's possible response to the range of likely strategic moves other firms could make, and each competitor's likely reaction to industry changes and environmental shifts that might take place. Competitive intelligence should have a single-minded objective: to develop the strategies and tactics necessary to transfer market share profitably and consistently from specific competitors to the company.
We should look at the sites positioned around and above us and analyse what they do in terms of business.
Do they understand the target market a little better than we do? Are their goals different from ours? If so, how are they different, and why? How are they pricing their products and services? How do their services differ from our own? In other words, do they know something we don't?
We can optimize for competitive advantage. It's about identifying what market your competitors capture, and where that market is heading in the future. Once you've figured that out, you might be able to discover opportunities your competitors have missed.
It would be great if we could call up our competitors and ask them exactly what they're doing, how they're doing it, and where they are heading - and they'd tell us. But we all know that's not going to happen.
So we have to dig. We don't want to do too much digging, as it is time consuming, expensive and, truth be told, somewhat tedious. Thankfully, a lot of the answers we need are sitting right in front of us and readily available.
To undertake a competitive analysis, try asking these questions:
1. The Nature Of The Competition
The little guy used to prosper in search just by being clever. If you knew the tricks, and the big companies didn't - and typically, they didn't - you could beat them easily. This is now harder to do. These days, traditional power structures play a greater role in search results, so it is often the case that big brands can dominate SERPs by virtue of their offline market position. Their market position creates the signals Google tends to look for, such as regular major press mentions, resulting links and direct search volume, often with little direct SEO effort on the part of the brand.
So, if you're the little guy coming up against big, entrenched competition, that's going to be a hard road.
We saw what happened with Adwords, and now the same thing is happening in the main search results. Those with the deepest pockets could run Adwords campaigns that appear to make absolutely no fiscal sense, either because they're getting their revenue from elsewhere to subsidise the Adwords spend, or, as is often the case, they're prepared to wage a defensive war of attrition to prevent new competitors entering or dominating their space.
I think these long-term trends are mostly due to increasing competition. As more and more companies bid on Adwords for a finite number of clicks, it inevitably drives up the cost of clicks (simple supply and demand). It also doesn't help that a lot of Adwords users are not actively managing their campaigns or measuring their ROI, and are consequently bidding at unprofitably high levels. Google also does its best to drive up CPC values in various ways (suggesting ridiculously high default bids, goading you to bid more to get on page 1, not showing your ad at all if you bid too low - even if no other ads appear, etc).
Of course, this is just my data for one product in one small market. But the law of shitty clickthrus predicts that all advertising mediums become less and less profitable over time. So I would be surprised if it isn't a general trend.
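Whether a given bid is "unprofitably high" can be sanity-checked with back-of-envelope arithmetic. A minimal sketch, using purely hypothetical numbers (the conversion rate and profit-per-sale figures below are illustrative, not from any real campaign):

```python
# Break-even CPC: the most you can pay per click and still break even
# on direct response. All numbers are hypothetical, for illustration only.

conversion_rate = 0.02    # 2% of clicks convert to a sale
profit_per_sale = 50.0    # gross profit per conversion, in dollars

break_even_cpc = conversion_rate * profit_per_sale

print(f"Break-even CPC: ${break_even_cpc:.2f}")  # prints "Break-even CPC: $1.00"
```

Any sustained bid above that level loses money on the clicks themselves, which is why competitors bidding far higher are presumably either subsidising the spend from elsewhere or fighting the defensive war of attrition described above.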
In the main search results, a large company's position will be influenced by the spend it makes elsewhere. Big PR and media campaigns, and the resulting press, links, and mentions in other channels, all result in a big data footprint of attention and interest that Google is unlikely to miss.
However, the little guy still has one advantage that the big businesses seldom have. The little guy is like the speedboat compared to an ocean liner. They may be small, they may be easily swamped in a storm, but they can change direction very quickly. The ocean liner takes a long time to turn around.
The little guy can change direction and get into new markets quickly - "pivot" in Silicon Valley parlance. The little guy can twist existing markets slightly and invent entire new markets, whilst bigger businesses tend to sail pre-set courses along known routes. This is how the once-nimble Google trounced their search competitors. They didn't take the competitors head on; they took a different tack (focused on the user, not advertisers), made strategic alignments (Yahoo), a few twists and turns (Overture), and eventually worked themselves into the center of the search market. Had they just built another Yahoo, they wouldn't have got very far.
If you're a small business or new to a market, then it's not a great idea to take on a big, entrenched business directly. Rather, look for ways you can outmanoeuvre them. Are there changes in the market they aren't responding to? Are markets about to change due to innovations coming over the horizon that you can spot, but they can't? Look for areas of abrupt change. The little guy is typically well placed to take advantage of rapid change in markets, and of new, fast-developing markets.
Choose your market space carefully.
So, how do you become the next Picasso? The same way you build a powerful brand. Create a new category you can be first in.
The best way to become a world-famous artist is to create paintings that are recognized as a new category of art. - Al Ries
2. Where Does The Competitor Compete?
For example, are they limited to a certain geography? Culture? Language? Do they have an offline presence?
You could take their business model to a geographic location they don't serve. Is there something that succeeds in the US, but has yet to reach Australia? Or Europe? Are your competitors targeting nationally, when you could target locally?
3. Who Do You Compete Against?
Make a list of the top ten competitors in a niche. Compare and contrast their approaches and offerings. Compare their use of language and their relative place in the market. Who is entrenched? Who is up-and-coming?
The up-and-coming sites are interesting. If they're new, but making headway, it pays to ask why that's happening. Is it just because they're getting more links, or is it because they're doing something new that the market likes? Bit of both?
I think the most interesting opportunities in search are found by watching the sites that aren't doing much in the way of SEO but are rising fast. If they're not playing hard at "rigging the search vote" in their favour, then their positioning is likely due to genuine interest out in the market.
4. How Does The Competitor Compete?
What are the specifics of the products and services they are offering? Lower prices? High service levels? Do they provide information that can't be obtained elsewhere? Do they have longevity? Money, staff and resources? Are they building brand? What are they doing besides search?
What prevents you doing likewise?
5. Are They More Engaging?
Google talks about engagement a lot, and we saw engagement metrics become more important after the Panda and Penguin updates.
Panda is really the public face of a much deeper switch towards user engagement. While the Panda score is sitewide, the engagement "penalty" or weighting effect also occurs at the individual page level. The pages or content areas that were hurt less by Panda seem to be the ones that were not also being hurt by the engagement issue.
Engagement is a measure of how interesting visitors find a site. Do people search for your competitors by name, do they click through rather than back to the SERPs, and do they talk about that site to others?
The click-back, or lack thereof, is a hard one to spot if you don't have access to a website's data. Take a look at your competitors' usability. Is it easy to navigate? Is it obvious where visitors need to click? Are they easy to order from? Is their offer clear? Do they have fast site response times? Of course, we view these things as fundamental, yet many sites still overlook the basics. If you can optimize in these areas, do so. If the competitors ranking above you have good engagement design and content, then you need it, too.
One baseline to look at is branded search volume. If people are specifically and repeatedly looking for something by name, that typically means they are satisfied with it.
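One rough way to put a number on that baseline is branded share of search: what fraction of the query volume around a competitor is people asking for them by name. A minimal sketch with made-up data (the brand "acme", the query strings, and all volumes are hypothetical, standing in for an export from whatever keyword tool you use):

```python
# Branded share of search: branded query volume / total query volume.
# All query counts below are hypothetical, e.g. exported from a keyword tool.

queries = {
    "acme widgets": 1200,        # branded
    "buy widgets online": 800,
    "widget reviews": 500,
    "acme widget coupon": 300,   # branded
}

brand_terms = ("acme",)  # substrings that mark a query as branded

branded = sum(volume for query, volume in queries.items()
              if any(term in query for term in brand_terms))
total = sum(queries.values())

print(f"Branded share: {branded / total:.0%}")
```

A competitor whose branded share is climbing is probably winning attention through channels Google can see, whether or not they are doing any visible SEO.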
Matt Cutts has recently mentioned that incumbent sites may not enjoy the "aged" advantages they've had in the past.
This may well be the next big Google shift. It makes sense that Google would reward sites that have higher user utility scores, all other factors being equal. Older sites may have built up a lot of links and positive SEO signals over time, but if their information is outdated and their site cumbersome, the site will likely have low utility. Given the rise of social media, which is all about immediacy and relevance (high utility as perceived by the user), Google would be foolish to reward incumbency at the expense of utility. It's an area we're watching closely as it may swing back some advantage to the smaller, nimble players.
6. Do They Have A Good Defensive Position?
Is it hard to enter their market? Competitors may have a lot of revenue to throw around, and considerable historical advantages. Taking on the likes of Trip Advisor would be difficult and expensive, no matter how good the SEO.
If they have a strong defensible position, and you have limited resources, try creating your own, unique space. For example, in SEO, you could compete with other SEOs for clients (crowded), or you could become a local trainer who trains existing SEOs in-house (less crowded). You could move from selling widgets to hiring out widgets. You could repackage your widgets with other widgets to create a new product. An example might be individual kitchen utensils that, packaged together, become a picnic kit.
Look for ways to create slightly different markets that you can make your own.
7. What's In Their Marketing?
What does their advertising look like? Scanning competitors' ads can reveal much about what a competitor believes about marketing and their target market.
Are they changing their message? Offering new products? Rebranding? Positioning differently? This is not absolute, of course, but it could offer up some valuable clues. There's even a Society of Competitive Intelligence Professionals devoted to this very task.
Whilst competitive analysis is a huge topic, the value of even a basic competitive analysis can be considerable.
Having done one, we can adjust our own offering to compete better, or decide that competing directly is not a great idea and that we would be better off entering a closely-related market instead. We may create a whole new niche and have no competition - at least, not for a while. We might make a list of all the things we need to do to match and overtake a fast-rising new challenger who isn't doing much in the way of SEO.
There's much more to search competition than algo watching, keywords and links. And many ways to compete and optimize.
Matt Cutts is just toying with SEOs these days.
Okay, I'm calling it: if you're using guest blogging as a way to gain links in 2014, you should probably stop. Why? Because over time it's become a more and more spammy practice, and if you're doing a lot of guest blogging then you're hanging out with really bad company.
The hen-house erupted.
The hens should know better by now. If a guest post is good for the audience and the site, then do it. If it's being done for no other reason than to boost rank in Google, then that's a sign the publishing strategy is weak, high risk, and vulnerable to Google's whims. Change the publishing strategy.
Although far from perfect, Google is geared towards recognizing utility. If Google doesn't recognize utility, then Google will become weaker and someone else will take their place. Only a few people remember AltaVista. They didn't provide much in the way of utility, and Google ate their lunch.
Which brings me onto the importance of measurement.
It's important we measure the right things. If people get upset because guest posting is called out, are they upset because they are counting the number of inbound links as if that were the only benefit? Why are they counting inbound links? To get a ranking boost? So, why are some people getting upset? They know Google doesn't like marketing practices that serve no other purpose than to boost rank. Or are people concerned Google might confuse a post of genuine utility with link spam?
A publishing strategy based on nothing more than Google rankings is not a publishing strategy, it's a tactic. Given the changes Google has made recently, it's not a good tactic, because if they can isolate and eliminate SEO tactics, they will. Those who guest post on other sites, and offer guest post placement in order to provide utility, should continue to do so. They are unlikely to eliminate genuine utility, regardless of links, and at worst, they'll likely ignore the site it appears on.
To prosper, we need to be more interesting than the next guy. We need to focus on delivering "interestingness".
The buzzword term is "visitor engagement", but that really means "be interesting". If we provide interesting material, people will read it; if we provide it on a regular basis, they might come back, or remember our brand name and search on it, and then they might link to it - and all this activity combined helps us rank. Ranking is a side effect of being genuinely interesting.
This is not to say measuring links, or page views, are unimportant. But they can be an oversimplification when taken in isolation.
Demand Media's eHow focused on pageviews rather than engagement, which is a big part of the reason the people who sold them eHow were able to beat them with wikiHow.
Success depends on achieving the underlying business goal. Perhaps high page views are not important if a site is targeting a very specific audience. Perhaps rankings aren't all that important if most of the audience is on social media or repeat business. Sometimes, focusing on the wrong metrics leads to the wrong marketing tactics.
What else can we measure? Some common stuff....
The choice of what we measure depends on what we're trying to achieve. The SEO may say they are trying to achieve a high rank, but why? To get more traffic, perhaps. Why do we want more traffic? In the hope more people will buy our widget.
So, if selling more widgets is the goal, then perhaps more energy needs to be placed into converting the traffic we already have, as opposed to spending the same energy getting more. Perhaps more time needs to be spent on conversion optimization. Perhaps more time needs to be spent refining the offer. Or listening to customers. Hearing their objections. Writing Q&A that addresses those objections. Guest posting somewhere else and addressing industry-wide objections. Thinking up products to sell to previous customers. Making them aware of changes via an email list. Optimizing the interest factor of your site to make it more interesting than your competitors', then treating the rankings as a bonus. Link building starts with "being interesting".
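The traffic-versus-conversion trade-off is simple arithmetic: sales are visitors times conversion rate, so a lift in either lever buys the same result. A sketch with hypothetical numbers:

```python
# Sales = visitors x conversion rate: two levers to the same goal.
# All numbers are hypothetical.

visitors = 10_000
conversion_rate = 0.01                   # 1% of visitors buy a widget

sales = visitors * conversion_rate       # 100 sales

# Lifting conversion from 1% to 1.5% is worth as much as
# finding 50% more traffic at the old conversion rate:
sales_better_page = visitors * 0.015           # 150 sales
sales_more_traffic = (visitors * 1.5) * 0.01   # 150 sales
```

Which lever is cheaper to pull depends on the site, but the half-point of conversion rarely requires a link-building campaign.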
When it comes to the guest post, if you're only doing it to get a link, then you're almost certainly selling yourself short. A guest post should serve a number of functions, such as building awareness, increasing reach, and building brand, and it should serve your underlying marketing objective. Pick where you post carefully. Deliver real value. If you do guest post, always try to extract far more benefit than just the link.
There was a time when people could put low-quality posts on low-quality sites and enjoy a benefit. But that practice is really just selling a serious web business short.
There are a couple of different types of measurement marketers use. One is emotional response, where the visitor becomes "positively interested". This is measured by recall studies, association techniques, customer surveys and questionnaires. However, the type of response online marketers focus on, which is somewhat easier to measure, is behavioural interest. When people are really interested, they do something in response.
So, to measure the effectiveness of a guest post, we might look for increased name or brand searches. More LinkedIn views. We might look at how many people travel down the links. We look at what they do when they land on the site, and - the most important bit - whether they do whatever that thing is that translates to the bottom line. Was it subscribing? Commenting? Downloading a white paper? Watching a video? Getting in contact? Tweeting? Bookmarking? What was that thing you wanted them to do in order to serve your bottom line?
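Brand searches are one of the easier behavioural signals to check around a guest post. A minimal sketch, assuming hypothetical daily branded-query counts (e.g. exported from an analytics or webmaster tool):

```python
# Rough before/after lift in brand searches around a guest post.
# Daily branded-query counts below are hypothetical.

before = [40, 42, 38, 45, 41, 39, 44]   # week before the post ran
after  = [48, 55, 60, 52, 58, 61, 57]   # week after the post ran

baseline = sum(before) / len(before)
observed = sum(after) / len(after)
lift = (observed - baseline) / baseline

print(f"Brand-search lift: {lift:.0%}")
```

A week either side is far too small a window to prove anything on its own; it just flags whether a deeper look (seasonality, other campaigns running at the same time) is worth the effort.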
Measurement should be flexible and will be geared towards achieving business goals. SEOs may worry that if they don't show rankings and links, then the customer will be dissatisfied. I'd wager the customer will be a lot more dissatisfied if they do get a lot of links and a rankings boost, yet no improvement in the bottom line. We could liken this to companies that have a lot of meetings. There is an air of busyness, but are they achieving anything worthwhile? Maybe. Maybe not. We should be careful not to mistake frenzy for productivity.
Measuring links, like measuring the number of meetings, is reductive. So is measuring engagement just by looking at clicks. The picture needs to be broad and strategic. So, if guest posts help you build your business, measured by business metrics, keep doing them. Don't worry about what Google may or may not do, because it's beyond your control, regardless.
Control what you can. Control the quality of information you provide.