In the last post, we looked at how SEO has always been changing, but one thing remains constant - the quest for information.
Given that people will always be on a quest for information, and given that information is plentiful but time is limited, there will always be a marketing imperative to get your information seen ahead of the competition, or in places the competition haven't yet targeted.
My take on SEO is broad because I'm concerned with the marketing potential of the search process, rather than just the behaviour of the Google search engine. We know the term SEO stands for Search Engine Optimization. It's never been particularly accurate, and less so now, because what most people are really talking about is not SEO, but GO.
Still, the term SEO has stuck. The search channel used to have many faces, including Alta Vista, Inktomi, Ask, Looksmart, MSN, Yahoo, Google and the rest, hence the label SEO. Now, it's pretty much reduced down to one. Google. Okay, there's BingHoo, but really, it's Google, 24/7.
We used to optimize for multiple search engines because we had to be everywhere the visitor was, and the search engines had different demographics. There was a time when Google was the choice of the tech-savvy web user. These days, "search" means "Google". You and your grandmother use it.
But people don't spend most of their time on Google.
The techniques for SEO are widely discussed, dissected, debated, ridiculed and encouraged, and we've heard all of them, many times over. And that's just GO.
The audience we are trying to connect with, meanwhile, is on a quest for information. On their quest for information, they will use many channels.
So, who is Google's biggest search competitor? Bing? Yahoo?
"Many people think our main competition is Bing or Yahoo," he said during a visit to Native Instruments, a software and hardware company in Berlin. "But, really, our biggest search competitor is Amazon. People don't think of Amazon as search, but if you are looking for something to buy, you are more often than not looking for it on Amazon..." Schmidt noted that people are looking for a different kind of answer on Amazon's site, through the slew of reviews and product pages, but it's still about getting information.
An important point. For the user, it's all about "getting information". In SEO, verticals are often overlooked.
I'm going to digress a little... how do you select clients, or areas to target?
I like to start from the audience side of the equation. Who is the intended audience, what does that audience really need, and where, on the web, are they? I then determine whether it's possible, and plausible, to position well for this intended audience within a given budget.
There is much debate amongst SEOs about what happens inside the Google black box, but we all have access to Google's actual output in the form of search results. To determine the level of competition, examine the search results. Go through the top ten or twenty results for a few relevant keywords, see which sites Google favors, and try to work out why.
Once you look through the results and analyze the competition, you'll get a good feel for what Google likes to see in that specific sector. Are the search results heavy on long-form information? Mostly commercial entities? Are sites large and established? New and up-and-coming? Do the top sites promote visitor engagement? Who links to them and why? Is there a lot of news mixed in? Does it favor recency? Are Google pulling results from industry verticals?
It's important to do this analysis for each project, rather than rely on prescriptive methods. Why? Because Google treats sectors differently. What works for "travel" SEO may not work for "casino" SEO because Google may be running different algorithms.
Once you weed out the wild speculation about algorithms, SEO discussion can contain much truth. People convey their direct experience and will sometimes outline the steps they took to achieve a result. However, specific techniques often aren't universally applicable because Google treats topic areas differently. So spend a fair bit of time on competitive analysis. Look closely at the specific result set you're targeting to discover what is really working for that sector, out in the wild.
It's at this point that you'll start to see crossovers between search and content placement.
You could try to rank for term X, or you could feature on a site that already ranks for X. Perhaps Google is showing a directory page or some industry publication. Can you appear on that directory page, or write an article for that industry publication? What does it take to get linked to by any of these top ten or twenty sites?
Once search visitors find that industry vertical, what is their likely next step? Do they sign up for a regular email? Can you get placement on those emails? Can you get an article well placed in some evergreen section on their site? Can you advertise on their site? Figure out how visitors would engage with that site and try to insert yourself, with grace and dignity, into that conversation.
Users may bypass Google altogether and go straight to verticals. If they like video, then YouTube is the obvious answer. A few years ago, when Google was pushing advertisers to run video ads, they pitched YouTube as the #2 global search engine. What does it take to rank in YouTube in your chosen vertical? Create videos that will be found in YouTube search results, which may also appear on Google's main search results.
With 200,000 videos uploaded per day, more than 600 years required to view all those videos, more than 100 million videos watched daily, and more than 300 million existing accounts, if you think YouTube might not be an effective distribution channel to reach prospective customers, think again.
There's a branding parallel here too. If the field of SEO is too crowded, you can brand yourself as the expert in video SEO.
There's also the ubiquitous Facebook.
Facebook, unlike the super-secret Google, has shared their algorithm for ranking content on Facebook and filtering what appears in the news feed. The algorithm consists of three components...
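Those three components were widely reported as user affinity, edge weight, and time decay - the "EdgeRank" formula. As a rough sketch (the function names, weights and decay curve here are illustrative guesses, not Facebook's actual values):

```python
import math

def edge_score(affinity, weight, age_hours, decay_rate=0.05):
    """One edge's contribution to a story: affinity x weight x time decay.
    The decay rate and all inputs are illustrative, not Facebook's values."""
    time_decay = math.exp(-decay_rate * age_hours)
    return affinity * weight * time_decay

def edgerank(edges):
    """Sum the scores of every edge (like, comment, share) attached to a story."""
    return sum(edge_score(affinity, weight, age) for affinity, weight, age in edges)

# A story with a recent comment (heavily weighted) and an older like (lightly weighted)
story = [(0.8, 3.0, 2), (0.5, 1.0, 40)]
score = edgerank(story)
```

The practical takeaway is the shape of the formula: content from people a user interacts with often, attracting high-weight actions, recently, wins the news feed slot.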
If you're selling stuff, are you on Amazon? Many people go directly to Amazon to begin product searches, information gathering and comparisons. Are you well placed on Amazon? What does it take to be placed well on Amazon? What are people saying? What are their complaints? What do they like? What language do they use?
In 2009, nearly a quarter of shoppers started research for an online purchase on a search engine like Google and 18 percent started on Amazon, according to a Forrester Research study. By last year, almost a third started on Amazon and just 13 percent on a search engine. Product searches on Amazon have grown 73 percent over the last year while searches on Google Shopping have been flat, according to comScore.
All fairly obvious, perhaps, but it may help you to think more about channels and verticals, rather than just Google. The appropriate verticals and channels will be different for each market sector, of course. And they change over time as consumer tastes & behaviors change. At some point, each of these was new: blogging, Friendster, MySpace, Digg, Facebook, YouTube, Twitter, LinkedIn, Instagram, Pinterest, Snapchat, etc.
This approach will also help us gain a deeper understanding of the audience and their needs - particularly the language people use, the questions they ask, and the types of things that interest them most - which can then be fed back into your search strategy. Emulate whatever works in these verticals. Look to create a unique, deep collection of insights about your chosen keyword area. This will in turn lead to strategic advantage, as your competition is unlikely to find such specific information pre-packaged.
This could also be characterised as "content marketing", which it is, although I like to think of it all as "getting in front of the visitor's quest for information". Wherever the visitors are, that's where you go, and then figure out how to position well in that space.
SEO is subject to frequent change, but in the last year or two, the changes feel both more frequent and more significant than the changes of the past. Florida hit in 2003. Since then, it's like we get a Florida every six months.
Whenever Google updates the underlying landscape, the strategies need to change in order to deal with it. No fair warning. Thatâs not the game.
There used to be a time when SEOs followed a standard prescription. Many of us remember a piece of software called Web Position Gold.
Web Position Gold emerged when SEO could be reduced to a series of repeatable - largely technical - steps. Those steps involved adding keywords to a page, repeating those keywords in sufficient density, checking a few pieces of markup, then scoring against an "ideal" page. Upload to web. Add a few links. Wait a bit. Run a web ranking report. Voila! You're an SEO. In all but the most competitive areas, this actually worked.
Seems rather quaint these days.
These days, you could do all of the above and get nowhere. Or you might get somewhere, but with so many more factors in play, they can't be isolated to an individual page score. If the page is published on a site with sufficient authority, it will do well almost immediately. If it appears on a little-known site, it may remain invisible for a long time.
Before Google floated in 2004, they released an investor statement signalling SEO - well, "index spammers" - as a business risk. If you ever want to know what Google really feels about people who "manipulate" their results, it's right here:
We are susceptible to index spammers who could harm the integrity of our web search results.
There is an ongoing and increasing effort by "index spammers" to develop ways to manipulate our web search results. For example, because our web search technology ranks a web page's relevance based in part on the importance of the web sites that link to it, people have attempted to link a group of web sites together to manipulate web search results. We take this problem very seriously because providing relevant information to users is critical to our success. If our efforts to combat these and other types of index spamming are unsuccessful, our reputation for delivering relevant information could be diminished. This could result in a decline in user traffic, which would damage our business.
SEO competes with the AdWords business model. So, Google "take very seriously" the activities of those who seek to figure out the algorithms, reverse engineer them, and create push-button tools like Web Position Gold. We've had Florida, and Panda, and Penguin, and Hummingbird, all aimed at making the search experience better for users, whilst having the pleasant side effect, as far as Google is concerned, of making life more difficult for SEOs.
I think the key part of Google's statement was "delivering relevant information".
SEO will always involve technical aspects. You get down to code level and mark pages up. The SEO needs to be aware of development and design, and how those activities can affect SEO. The SEO needs to know how web servers work, and how spiders can sometimes fail to deal with their quirks.
But in the years since Florida, the marketing aspects have become more important. An SEO can perform the technical aspects of SEO and get nowhere. More recent algorithms, such as Panda and Penguin, gauge the behaviour of users as Google tries to determine the information quality of pages. Hummingbird attempts to discover the intent that lies behind keywords.
As a result, keyword-based SEO is in the process of being killed off. Google withholds keyword referrer data, and their various algorithms attempt to deliver pages based on a user's intent and activity - both prior and present - in order to deliver relevant information. Understanding the user, and having a unique and desirable offering and a defensible market position, is more important than any keyword markup. The keyword match, on which much SEO is based, is not an approach that is likely to endure.
The emphasis has also shifted away from the smaller operators and now appears to favour brands. This occurs not because brands are categorized as âbrandsâ, but due to the side effects of significant PR activities. Bigger companies tend to run multiple advertising and PR campaigns, so produce signals Google finds favorable i.e. search volume on company name, semantic associations with products and services, frequent links from reputable media, and so on. This flows through into rank. And it also earns them leeway when operating in the gray area where manual penalties are handed out to smaller & weaker entities for the same activities.
Apparently, Google killed off toolbar PageRank.
We will probably not going to be updating it [PageRank] going forward, at least in the Toolbar PageRank.
A few people noted it, but the news won't raise many eyebrows, as toolbar PR has long since become meaningless. Are there any SEOs altering what they do based on toolbar PR? It's hard to imagine why. The reality is that an external PR value might indicate an approximate popularity level, but this isn't an indicator of the subsequent ranking a link from such a page will deliver. There are too many other factors involved. If Google are still using an internal PR metric, it's likely to be a significantly more complicated beast than the one revealed in 1997.
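For reference, that 1997 formulation is simple enough to sketch in a few lines. This is the textbook power-iteration version on a toy link graph, not whatever Google runs internally today:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Classic PageRank by power iteration.
    links maps each page to the list of pages it links out to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:  # dangling page: spread its rank evenly
                for p in pages:
                    new[p] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new[target] += share
        rank = new
    return rank

# Tiny three-page web: a and c both link to b, b links back to a
ranks = pagerank({"a": ["b"], "b": ["a"], "c": ["b"]})
```

Even on this toy graph, the point holds: rank flows through links, so a page's score depends on who links to it, not just on the page itself.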
A PageRank score is a proxy for authority. I'm quite sure Google kept it going as an inside joke.
A much more useful proxy for authority is the top ten pages in any niche. Google has determined that all well-ranking pages have sufficient authority, and no matter what the toolbar, or any other third-party proxy, says, it's Google's output that counts. A link from any one of the top ten pages will likely confer a useful degree of authority, all else being equal. It's good marketing practice to be linked from, and engage with, known leaders in your niche. That's PR, as in public-relations thinking, vs PR (PageRank), technical thinking.
The next to go will likely be keyword-driven SEO. Withholding keyword referral data was the beginning of the end. Hummingbird is hammering in the nails. Keywords are still great for research purposes - to determine if there's an audience and what the size of that audience may be - but SEO is increasingly driven by semantic associations and site categorizations. It's not enough to feature a keyword on a page. A page, and site, needs to be about that keyword, and keywords like it, and be externally recognized as such. In the majority of cases, a page needs to match user intent, rather than just a search term. There are many exceptions, of course, but given what we know about Hummingbird, this appears to be the trend.
People will still look at rank, and lust after prize keywords, but really, rankings have been a distraction all along. Reach and specificity are more important i.e. where's the most value coming from? The more specific the keyword, typically the lower the bounce rate and the higher the conversion rate. The lower the bounce rate, and the higher the conversion rate, the more positive signals the site will generate, which will flow back into a ranking algorithm increasingly being tuned for engagement. Ranking for any keyword that isn't delivering business value makes no sense.
There are always exceptions. But thatâs the trend. Google are looking for pages that match user intent, not just pages that match a keyword term. In terms of reach, you want to be everywhere your customers are.
To adapt to change, SEOs should think about search in the widest possible terms. A search is a quest for information. It may be an active, self-directed search, in the form of a search engine query. Or a more passive search, delivered via social media subscriptions and the act of following. How will all these activities feed into your search strategy?
Sure, it's not a traditional definition of SEO, as I'm not limiting it to search engines. Rather, my point is about the wider quest for information. People want to find things. Eric Schmidt recently claimed Amazon is Google's biggest competitor in search. The mechanisms and channels may change, but the quest remains the same. Take, for example, the changing strategy of BuzzFeed:
Soon after Peretti had turned his attention to BuzzFeed full-time in 2011, after leaving the Huffington Post, BuzzFeed took a hit from Google. The site had been trying to focus on building traffic from both social media marketing and through SEO. But the SEO traffic - the free traffic driven from Google's search results - dried up.
Reach is important. Topicality is important. Freshness, in most cases, is important. Engagement is important. Finding information is not just about a technical match of a keyword, it's about an intellectual match of an idea. BuzzFeed didn't take their eye off the ball. They know helping users find information is the point of the game they are in.
And the internet has only just begun.
In terms of the internet, nothing has happened yet. The internet is still at the beginning of its beginning. If we could climb into a time machine and journey 30 years into the future, and from that vantage look back to today, we'd realize that most of the greatest products running the lives of citizens in 2044 were not invented until after 2014. People in the future will look at their holodecks, and wearable virtual reality contact lenses, and downloadable avatars, and AI interfaces, and say, oh, you didn't really have the internet (or whatever they'll call it) back then.
In 30 years' time, people will still be on the exact same quest for information. The point of SEO has always been to get your information in front of visitors, and that's why SEO will endure. SEO was always a bit of a silly name, and it often distracts people from the point, which is to get your stuff seen ahead of the rest.
Some SEOs have given up in despair because it's not like the old days. It's becoming more expensive to do effective SEO, and the reward may not be there, especially for smaller sites. However, this might be to miss the point, somewhat.
The audience is still there. Their needs haven't changed. They still want to find stuff. If SEO is all about helping users find stuff, then that's the important thing. Remember the "why". Adapt the "how".
In the next few articles, we'll look at the specifics of how.
In recent years, the biggest change to the search landscape happened when Google chose to withhold keyword data from webmasters. At SEOBook, Aaron noticed and wrote about the change as ever more keyword data disappeared.
The motivation to withhold this data, according to Google, was privacy concerns:
SSL encryption on the web has been growing by leaps and bounds. As part of our commitment to provide a more secure online experience, today we announced that SSL Search on https://www.google.com will become the default experience for signed in users on google.com.
At first, Google suggested it would only affect a single-digit percentage of search referral data:
Google software engineer Matt Cutts, who's been involved with the privacy changes, wouldn't give an exact figure but told me he estimated even at full roll-out, this would still be in the single-digit percentages of all Google searchers on Google.com
...which didn't turn out to be the case. It now affects almost all keyword referral data from Google.
Was it all about privacy? Another rocket over the SEO bows? A bit of both? Probably. In any case, the search landscape was irrevocably changed. Instead of being shown the keyword term the searcher had used to find a page, webmasters were given the less-than-helpful "not provided". This change rocked SEO. The SEO world, up until that point, had been built on keywords. SEOs choose a keyword. They rank for the keyword. They track click-thrus against that keyword. This is how many SEOs proved their worth to clients.
These days, very little keyword data is available from Google. There certainly isn't enough keyword data to use as a primary form of measurement.
This change forced a rethink about measurement, and SEO in general. Whilst there is still some keyword data available from the likes of Webmaster Tools & the AdWords paid versus organic report, keyword-based SEO tracking approaches are unlikely to align with Googleâs future plans. As we saw with the Hummingbird algorithm, Google is moving towards searcher-intent based search, as opposed to keyword-matched results.
Hummingbird should better focus on the meaning behind the words. It may better understand the actual location of your home, if you've shared that with Google. It might understand that "place" means you want a brick-and-mortar store. It might get that "iPhone 5s" is a particular type of electronic device carried by certain stores. Knowing all these meanings may help Google go beyond just finding pages with matching words.
The search bar is still keyword based, but Google is also trying to figure out what user intent lies behind the keyword. To do this, they're relying on context data. For example, they look at the previous searches the user has made, their location, the makeup of the query itself, and so on, all of which can change the search results the user sees.
When SEO started, relevance meant matching the keyword the user typed into a search bar against a keyword that appeared on a page. SEO continued with this model, but it's fast becoming redundant, because Google is increasingly relying on context to determine searcher intent, whilst filtering many results that were too closely aligned with the old strategy. As a result, much SEO has shifted from keywords to wider digital marketing considerations, such as what the visitor does next.
Okay, if SEOs don't have keywords, what can they use?
If we step back a bit, what weâre really trying to do with measurement is demonstrate value. Value of search vs other channels, and value of specific search campaigns. Did our search campaigns meet our marketing goals and thus provide value?
Do we have enough data to demonstrate value? Yes, we do. Here are a few ideas SEOs have devised to look at the organic search data they are still getting, and use it to demonstrate value.
1. Organic Search vs Other Activity
Is our organic search tracking well when compared with other digital marketing channels, such as social or email? About the same? Falling?
In many ways, the withholding of keyword data can be a blessing, especially to those SEOs who have a few ranking-obsessed clients. A ranking, in itself, is worthless, especially if it's generating no traffic.
Instead, if we look at the total amount of organic traffic, and see that it is rising, then we shouldn't really care too much about which keywords it is coming from. We can also track organic searches across devices, such as desktop vs mobile, and get some insight into how best to optimize those channels for search as a whole, rather than by keyword. It's important that the traffic came from organic search, rather than from other campaigns. It's important that the visitors saw your site. And it's important what that traffic does next.
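As a minimal sketch of that kind of channel-level view - the visit records here are made up, and in practice you would export them from your analytics tool:

```python
from collections import Counter

# Each visit record: (channel, device). Illustrative data only.
visits = [
    ("organic", "desktop"), ("organic", "mobile"), ("organic", "mobile"),
    ("social", "mobile"), ("email", "desktop"), ("organic", "desktop"),
]

# Total traffic per channel - no keyword data needed
by_channel = Counter(channel for channel, _ in visits)

# Organic traffic split by device, for desktop-vs-mobile comparisons
organic_by_device = Counter(device for channel, device in visits if channel == "organic")

organic_share = by_channel["organic"] / len(visits)
```

Tracking `organic_share` and the device split over time answers the "tracking well compared with other channels?" question without a single keyword in sight.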
2. Bounce Rate
If a visitor comes in, doesn't like what is on offer, and clicks back, then that won't help rankings. Google have been a little oblique on this point, saying they aren't measuring bounce rate, but I suspect it's a little more nuanced in practice. If people are failing to engage, then anecdotal evidence suggests this does affect rankings.
Look at the behavioral metrics in GA; if your content has 50% of people spending less than 10 seconds, that may be a problem or that may be normal. The key is to look below that top graph and see if you have a bell curve or if the next largest segment is the 11-30 second crowd.
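That bell-curve check is easy to run on raw session durations. A small sketch, with made-up numbers and GA-style engagement buckets:

```python
# GA-style engagement buckets (seconds). The shape of the distribution matters:
# a huge 0-10s bucket with nothing behind it suggests a problem; weight shifting
# into the 11-30s bucket looks more like normal skimming behaviour.
buckets = [(0, 10), (11, 30), (31, 60), (61, 180), (181, 600)]

def bucket_counts(durations):
    """Count how many session durations fall into each engagement bucket."""
    counts = [0] * len(buckets)
    for duration in durations:
        for i, (lo, hi) in enumerate(buckets):
            if lo <= duration <= hi:
                counts[i] += 1
                break
    return counts

# Illustrative session durations, in seconds
sessions = [3, 5, 8, 14, 22, 25, 40, 45, 90, 7, 16, 200]
counts = bucket_counts(sessions)
short_share = counts[0] / len(sessions)
```

Here the 0-10 second and 11-30 second buckets are the same size, which looks more like normal skimming than a failing page.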
Either way, we must encourage visitor engagement. Even small improvements in terms of engagement can mean big changes in the bottom line. Getting visitors to a site was only ever the first step in a long chain. It's what they do next that really makes or breaks a web business, unless the entire goal was that the visitor should only view the landing page. Few sites, these days, would get much return on non-engagement.
PPCers are naturally obsessed with this metric, because each click is costing them money, but when you think about it, it's costing SEOs money, too. Clicks are getting harder and harder to get, and each click does have a cost associated with it i.e. the total cost of the SEO campaign divided by the number of clicks, so each click needs to be treated as a cost.
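The arithmetic is trivial, but worth making explicit. With purely illustrative numbers:

```python
def cost_per_organic_click(campaign_cost, organic_clicks):
    """Treat SEO spend like PPC spend: total campaign cost spread over the clicks it earned."""
    return campaign_cost / organic_clicks

# Hypothetical: a $6,000/month SEO campaign driving 4,000 organic clicks
cpc_equivalent = cost_per_organic_click(6000, 4000)
```

Put alongside your AdWords CPC, this number makes the "each organic click has a cost" point concrete, and shows why a bounced organic visitor is wasted spend.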
3. Landing Pages
We can still do landing page analysis. We can see the pages where visitors are entering the website. We can also see which pages are most popular, and we can tell from the topic of the page what type of keywords people are using to find it.
We could add more related keywords to these pages and see how they do, or create more pages on similar themes, using different keyword terms, and then monitor the response. Similarly, we can look at poorly performing pages, make the assumption that these are not ranking against their intended keywords, and mark them for improvement or deletion.
We can see how old pages vs new pages are performing in organic search. How quickly do new pages get traffic?
Weâre still getting a lot of actionable data, and still not one keyword in sight.
4. Visitor And Customer Acquisition Value
We can still calculate the value to the business of an organic visitor.
We can also look at what step in the process organic visitors are converting. Early? Late? Why? Is there some content on the site that is leading them to convert better than other content? We can still determine if organic search provided a last-click conversion, or a conversion as the result of a mix of channels where organic played a part. We can do all of this from aggregated organic search data, with no need to look at keywords.
5. Contrast With PPC
We can contrast AdWords data against organic search. Trends we see in PPC might also be working in organic search.
For AdWords our life is made infinitesimally easier because by linking your AdWords account to your Analytics account rich AdWords data shows up automagically allowing you to have an end-to-end view of campaign performance.
Even PPC-ers are having to change their game around keywords:
The silver lining in all this? With voice and mobile search, you'll likely catch those conversions that you hadn't before. While you may think that you have everything figured out and that your campaigns are optimal, this matching will force you into deeper dives that hopefully uncover profitable PPC pockets.
6. Benchmark Against Everything
In the above section I highlighted comparing organic search to AdWords performance, but you can benchmark against almost any form of data.
Is 90% of your keyword data (not provided)? Then you can look at the 10% which is provided to estimate performance on the other 90% of the traffic. If you get 1,000 monthly keyword visits for [widgets], then as a rough rule of thumb you might get roughly 9,000 monthly visits for that same keyword shown as (not provided).
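That rule of thumb is a simple scale-up. A sketch, using the post's own numbers:

```python
def estimate_total_visits(visible_visits, provided_share):
    """If only a fraction of keyword referrals are visible, scale the visible
    count up to estimate the true total. This assumes hidden traffic behaves
    like visible traffic, which is a rough assumption at best."""
    return visible_visits / provided_share

# 1,000 visible monthly visits for [widgets], with 10% of keyword data provided
estimated_total = estimate_total_visits(1000, 0.10)
hidden_not_provided = estimated_total - 1000
```

The estimate gets shakier as the provided share shrinks, so treat it as a directional figure rather than a measurement.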
Has your search traffic gone up or down over the past few years? Are there seasonal patterns that drive user behavior? How important is the mobile shift in your market? What landing pages have performed the best over time and which have fallen hardest?
How does your site's aggregate keyword ranking profile compare to top competitors'? Even if you don't have all the individual keyword referral data from search engines, seeing the aggregate footprints, and how they change over time, indicates who is doing better, and who is gaining exposure vs losing it.
You can also go further with other competitive research tools which look beyond the search channel. Is most of your traffic driven from organic search? Do your competitors do more with other channels? A number of sites like Compete.com and Alexa have provided estimates for this sort of data. Another newer entrant into this market is SimilarWeb.
And, finally, rank checking still has some value. While rank tracking may seem futile in the age of search personalization and Hummingbird, it can still help you isolate performance issues during algorithm updates. There are a wide variety of options from browser plugins to desktop software to hosted solutions.
By now, I hope I've convinced you that specific keyword data isn't necessary and, in some cases, may have only served to distract some SEOs from other valuable marketing metrics, such as what happens after the click and where visitors go next.
So long as the organic search traffic is doing what we want it to, we know which pages it is coming in on, and can track what it does next, there is plenty of data there to keep us busy. Lack of keyword data is a pain, but in response, many SEOs are optimizing for a lot more than keywords, and focusing more on broader marketing concerns.
Google recently announced they were doing away with exact match AdWords ad targeting this September. They will force all match types to have close variant keyword matching enabled. This means you get misspelled searches, plural versus singular overlap, and an undoing of your tight organization.
In some cases the user intent is different between singular and plural versions of a keyword. A singular version search might be looking to buy a single widget, whereas a plural search might be a user wanting to compare different options in the marketplace. In some cases people are looking for different product classes depending on word form:
For example, if you sell spectacles, the difference between users searching on "glass" vs. "glasses" might mean your ad is being seen by users interested in a building material, rather than an aid to reading.
Where segmenting improved the user experience, boosted conversion rates, made management easier, and improved margins - those benefits are now off the table.
CPC isn't the primary issue. Profit margins are what matter. Once you lose the ability to segment you lose the ability to manage your margins. And this auctioneer is known to bid in their own auctions, have random large price spikes, and not give refunds when they are wrong.
An offline analogy for this loss of segmentation ... you go to a gas station to get a bottle of water. After grabbing your water and handing the cashier a $20, they give you $3.27 back along with a six pack you didn't want and didn't ask for.
Why does a person misspell a keyword? Some common reasons include:
In any of those cases, the typical average value of the expressed intent is usually going to be less than a person who correctly spelled the keyword.
Even if spelling errors were intentional and cultural, the ability to segment that and cater the landing page to match disappears. Or if the spelling error was a cue to send people to an introductory page earlier in the conversion funnel, that option is no more.
No one who's in the know has more than about 5-10 total keywords in any one adgroup because they're using broad match modified, which eliminated the need for "excessive keyword lists" a long time ago. Now you're going to have to spend your time creating excessive negative keyword lists with possibly millions upon millions of variations so you can still show up for exactly what you want and nothing else.
You might not know which end of the spectrum your account is on until disaster strikes:
I added negatives to my list for 3 months before finally giving up opting out of close variants. What they viewed as a close variant was not even in the ballpark of what I sell. There have been petitions before that have gotten Google to reverse bad decisions in the past. We need to make that happen again.
In this particular account, close variations have much lower conversion rates and much higher CPAs than their actual match type.
Variation match isn't always bad; there are times it can be useful. The difference is that, before, advertisers had a choice.
Loss of control is never good. Mobile control was lost with Enhanced Campaigns, and now you're losing control over your match types. This will further erode your ability to control costs and conversions within AdWords.
A monopoly restricting choice to enhance their own bottom line. It isn't the first time they've done that, and it won't be the last.
Have an enhanced weekend!
Whenever Google does a major algorithm update, we all rush to our data to see what changed in terms of rankings and search traffic, then look for trends to try to figure out what happened.
The two people I chat most with during periods of big algorithmic changes are Joe Sinkwitz and Jim Boykin. I recently interviewed them about the Penguin algorithm.
Here's a custom drawing we commissioned for this interview.
Want to embed this image on your website?
To date there have been 5 Penguin updates:
There hasn't been one in quite a while, which is frustrating many who haven't been able to recover. On to the interview...
At its core what is Google Penguin?
Jim: It is a link filter that can cause penalties.
Joe: At its core, Penguin can be viewed as an algorithmic batch filter designed to punish lower quality link profiles.
What sort of ranking and traffic declines do people typically see from Penguin?
Jim: 30-98%. Actually, I've seen some "manual partial matches" where traffic was hardly hit...but that's rare.
Joe: Near total. I should expand. Penguin 1.0 has been a different beast than its later iterations; the first one has been nearly a fixed flag whereas later iterations haven't been quite as severe.
After the initial update there was another one about a month later & then one about every 6 months for a while. There hasn't been one for about 10 months now. So why have the updates been so rare? And why hasn't there been one for a long time?
Jim: Great question. We all believed there'd be an update every 6 months, and now it's been way longer than 6 months...maybe because Matt's on vacation...or maybe he knew it would be a long time until the next update, so he took some time off...or perhaps Google wants those with an algorithmic penalty to feel the pain for longer than 6 months.
Joe: 1.0 was temporarily escapable if you were willing to 301 your site; after 1.1 the redirect began to pass on the damage. My theory on why it has been so very long on the most recent update has to do with maximizing pain - Google doesn't intend to lift its boot off the throats of webmasters just yet; no amount of groveling will do. Add to that the complexity of every idiot disavowing 90%+ of their clean link profiles and 'dirty' vs 'clean' links is difficult to ascertain on that signal.
Jim: Most people disavow some, then the disavow some more...then next month they disavow more...wait a year and they may disavow them all :)
Jim: Then Google will let them out...hehe, tongue in cheek...a little.
Joe: I've seen disavow files with over 98% of links in there, including Wikipedia, the Yahoo! Directory, and other great sources - absurd.
Jim: Me too. Most of the people are clueless ... there's tons of people who are disavowing links just because their traffic has gone down, so they feel they must have been hit by penguin, so they start disavowing links.
Joe: Yes; I've seen a lot of panda hits where the person wants to immediately disavow. "whoa, slow down there Tex!"
Jim: I've seen services where they guarantee you'll get out of a penguin penalty, and we know that they're just disavowing 100% of the links. Yes, you get your manual penalty removed that way, but then you're left with nothing.
Joe: Good time to mention that any guarantee of getting out of a penalty is likely sold as a bag of smoke.
Jim: or as they are disavowing 100% of the links they can find going to the site.
OK. I think you mentioned an important point there Jim about "100% of the links they can find." What are the link sources people should use & how comprehensive is the Google Webmaster Tools data? Is WMT data enough to get you recovered?
Joe: Rarely. I've seen where the examples listed in a manual action might be discoverable on Ahrefs, Majestic SEO, or in WMT, but upon cleaning them up (and disavowing further of course) that Google will come back with a few more links that weren't initially in the WMT data dump. I'm dealing with a client on this right now that bought a premium domain as-is and has been spending about a year constantly disavowing and removing links. Google won't let them up for air and won't do the hard reset.
Jim: well first...if you're getting your backlinks from Google, be sure to pull your backlinks from the www and the non www version of your site. You can't just use one: you HAVE to pull backlinks from both, so you have to verify both your www and your non www version of your site with Google Webmaster Tools.
We often start with that. When we find big patterns that we feel are the cause, we'll then go into OSE, Majestic SEO, and Ahrefs, and pull those backlinks too, and pull out those that fit the patterns, but that's after the Google backlink analysis.
Joe, you mentioned people getting hit by Panda and mistakenly going off to the races to disavow links. What are the distinguishing characteristics between Penguin, Panda & manual link penalties?
Joe: Given they like to sandwich updates to make it difficult to discern, I like this question. Penguin is about links; it is the easiest to find but hardest to fix. When I first am looking at a URL I'll quickly look at anchor % breakdowns, sources of links, etc. The big difference between penguin and a manual link penalty (if you aren't looking on WMT) is the timing -- think of a bomb going off vs a sniper...everyone complaining at once? probably an algorithm; just a few? probably some manual actions. For manual actions, you'll get a note too in WMT. With panda I like to look first at the on-page to see if I can spot the egregious KW stuffing, weird infrastructure setups that result in thin/duplicated content, and look into engagement metrics and my favorite...externally supported pages - to - total indexed pages ratios.
Jim: Manual, at least you can keep resubmitting and get a yes or no. With an algorithmic, you're screwed....because you're waiting for the next refresh...hoping you did enough to get out.
I don't mind going back and forth with Google with a manual penalty...at least I'm getting an answer.
If you see a drop in traffic, be sure to compare that to the dates of Panda and Penguin updates...if you see a drop on one of the update days, then you can know if you have Panda or Penguin...and if your traffic is just gradually falling, it could be just that, and no penalty.
Joe: While this interview was taking place an employee pinged me to let me know a reconsideration request was denied, with an example URL being something akin to domain.com/?var=var&var=var - the entire domain was already disavowed. Those 20-second manual reviews by 3rd parties without much of an understanding of search don't generate a lot of confidence for me
Jim: Yes, I posted this yesterday to SEOchat. Reviewers are definitely not looking at things.
You guys mentioned that anyone selling a guaranteed 100% recovery solution is likely selling a bag of smoke. What are the odds of recovery? When does it make sense to invest in recovery, when does it make sense to start a different site, and when does it make sense to do both in parallel?
Jim: Well, I'm one for trying to save a site. I haven't once said "it's over for that site, let's start fresh." Links are so important that if I can save even a few links going to a site, I'll take it. I'm not a fan of doing two sites; it causes duplicate content issues, and now your efforts are split across two sites.
Joe: It depends on the infraction. I have a lot more success getting stuff out of panda, manual actions, and the later iterations of penguin (theoretically including the latest one once a refresh takes place); I won't take anyone's money for those hit on penguin 1.0 though...I give free advice and add it to my DB tracking, but the very few examples I have where a recovery took place that I can confirm were penguin 1.0 and not something else, happened due to being a beta user of the disavow tool and likely occurred for political reasons vs tech reasons.
For churn and burn, redirects and canonicals can still work if you're clever...but that's not reinvestment so much as strategy shift I realize.
You guys mentioned the disavow process, where a person does some, does some more over time, etc. Is Google dragging out the process primarily to drive pain? Or are they leveraging the aggregate data in some way?
Joe: Oh absolutely they drag it out. Mathematically I think of triggers where a threshold to trigger down might be at X%, but the trigger for recovery might be X-10%. Further though, I think initially they looooooved all the aggregate disavow data, until the community freaked out and started disavowing everything. Let's just say I know of a group of people that have a giant network where lots of quality sites are purposefully disavowed in an attempt to screw with the signal further. :)
Jim: pain :) ... not sure if they're leveraging the data yet, but they might be. It shouldn't be too hard for Google to see that a ton of people are disavowing links from a site like get-free-links-directory.com, for Google to say, "no one else seems to trust these links, we should just nuke that site and not count any links from there."
We can do this ourselves with the tools we have... I can see how many times I've seen a domain in my disavows, and how many times I disavowed it. I.e., if I've seen spamsite.com in 20 disavows I've done, and I disavowed it all 20 times I saw it, I can see that data... or if I've seen goodsite.com 20 times, and never once disavowed it, I can see that too. I'd assume Google must do something like this as well.
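Jim's cross-file domain lookup is easy to sketch. Here's a minimal version (the function name is illustrative), assuming disavow files in Google's documented format: `#` comment lines, `domain:example.com` lines, and bare URLs:

```python
from collections import Counter
from urllib.parse import urlparse

def disavow_counts(disavow_files):
    """Count how often each domain appears across a set of disavow
    files, passed in as strings in Google's disavow file format."""
    counts = Counter()
    for text in disavow_files:
        for line in text.splitlines():
            line = line.strip()
            if not line or line.startswith("#"):
                continue  # skip blanks and comments
            if line.startswith("domain:"):
                counts[line[len("domain:"):]] += 1
            else:
                # bare URL line: attribute it to its host
                counts[urlparse(line).netloc] += 1
    return counts
```

A domain that shows up in nearly every file you process (and was disavowed every time) is a strong candidate for a "no one trusts these links" signal of the kind Jim describes.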
Given that they drag it out, on the manual penalties does it make sense to do a couched effort on the first rejection or two, in order to give the perception of a greater level of pain and effort as you scale things up on further requests? What level of resources does it make sense to devote to the initial effort vs the next one and so on? When does recovery typically happen (in terms of % of links filtered and in terms of how many reconsideration requests were filed)?
Joe: When I deliver "disavow these" and "say this" stuff, I give multiple levels, knowing full well that there might be deeper and deeper considerations of the pain. Now, there have been cases where the 1st try gets a site out, but I usually see 3 or more.
Jim: I figure it will take a few reconsideration requests...and yes, I start "big" and get "bigger."
but that's for a sitewide penalty...
We've seen sitewides get reduced to a partial penalty. And once we have a partial penalty, it's much easier to identify the issues and take care of those, while leaving links that go to pages that were not affected.
A sitewide manual penalty kills the site...a partial match penalty usually has some stuff that ranks well, and some stuff that no longer ranks...once we're at a partial match, I feel much more confident in getting that resolved.
Jim, I know you've mentioned the errors people make in either disavowing great links or disavowing links when they didn't need to. You also mentioned the ability to leverage your old disavow data when processing new sites. When does it make sense to DIY on recovery versus hiring a professional? Are there any handy "rule of thumb" guidelines in terms of the rough cost of a recovery process based on the size of their backlink footprint?
Joe: It comes down to education, doesn't it? Were you behind the reason it got dinged? You might try that first vs immediately hiring. Psychologically it could even look like you're more serious after the first disavow is declined by showing you "invested" in the pain. Also, it comes down to opportunity cost. What is your personal time worth divided by your perceived probability of fixing it?
Jim: We charge $5000 for the analysis, and $5000 for the link removal process...some may think that's expensive...but removing good links will screw you, and not removing bad links will screw you...it's a real science, and getting it wrong can cost you a lot more than this...of course I'd recommend seeing a professional, as I sell this service...but I can't see anyone who's not a true expert in links doing this themselves.
Oh...and once we start work for someone, we keep going at no further cost until they get out.
Joe: That's a nice touch Jim.
Jim: Thank you.
Joe, during this interview you mentioned a reconsideration request rejection where the person cited a link on a site that has already been disavowed. Given how many errors Google's reviewers make, does it make sense to aggressively push to remove links rather than using disavow? What are the best strategies to get links removed?
Joe: Really though, be upfront and honest when using those link removal services (which I'd do vs trying to do them one-by-one-by-one)
Jim: Only 1% of the people will remove links anyway; it's more to show Google that you really tried to get the links removed.
Joe: Let the link holder know that you got hit with a penalty, you're just trying to clean it up because your business is suffering, and ask politely that they do you a solid favor.
I've been on the receiving end of a lot of different strategies given the size of my domain portfolio. I've been sued before (as a first course of action!) by someone that PAID to put a link on my site....they never even asked, just filed the case.
Jim: We send 3 removal requests...and ping the links too...so when we do a reconsideration request we can show Google the spreadsheet of who we emailed, when we emailed them, and who removed or nofollowed the links...but it's more about "show" to Google.
Joe: Yep, not a ton of compliance; webmasters have link removal fatigue by now.
This is more of a business question than an SEO question, but ... as much as budgeting for the monetary cost of recovery, an equally important form of budgeting is dealing with the reduced cashflow while the site is penalized. How many months does it typically take to recover from a manual penalty? When should business owners decide to start laying people off? Do you guys suggest people aggressively invest in other marketing channels while the SEO is being worked on in the background?
Jim: Manual penalties typically take 2-4 months to recover from. Recovery is a relative term. Some people get "your manual penalty has been removed" and their recovery is a tiny blip - up 5%, but still down 90% from what it was prior. Getting a "manual penalty removed" is great IF there are good links left in your profile...if you've disavowed everything, and your penalty is removed...so what...you've got nothing...people often ask where they'll be once they "recover" and I say "it depends on what you have left for links"...but it won't be where you were.
Joe: It depends on how exposed they are per variable costs. If the costs are fixed, then one can generally wait longer (all things being equal) before cutting. If you have a quarter million monthly link budget *cough* then, you're going to want to trim as quickly as possible just in order to survive.
Per investing in other channels, I cannot emphasize enough how important it is to become an expert in one channel and at least a generalist in several others...even better, hire an expert in another channel to partner up with. In payday loans, one of the big players did okay in SEO but, even with a lot of turbulence, was doing great due to their TV and radio capabilities. Also, collect the damn email addresses; email is still a gold mine if you use it correctly.
One of my theories for why there hasn't been a penguin update in a long time was that as people have become more afraid of links they've started using them as a weapon & Google doesn't want a bunch of false positives caused by competitors killing sites. One reason I've thought this versus the pain first motive is that Google could always put a time delay on recoveries while still allowing new sites to get penalized on updates. Joe, you mentioned that after the second Penguin update penalties started passing forward on redirects. Do people take penalized sites and point them at competitors?
Joe: Yes, they do. They also take them and pass them into the natural links of their competitors. I've been railing on negative SEO for several years now...right about when the first manual action wave came out in Jan 2012; that was a tipping point. It is now more economical to take someone else's ranking down than it is to (with a strong degree of confidence) invest in a link strategy to leapfrog them naturally
I could speak for days straight in a congressional filibuster on link strategies used for Negative SEO. It is almost magical how pervasive it has become. I get a couple requests a week to do it even...by BIG companies. Brands being the mechanism to sort out the cesspool and all that.
Jim: Soon, everyone will be monitoring their backlinks on a monthly basis. I know one big company that submits an updated disavow list to Google every week.
That leads to a question about preemptive disavows. When does it make sense to do that? What businesses need to worry about that sort of stuff?
Joe: Are you smaller than a Fortune 500? Then the cards are stacked against you. At the very least, be aware of your link profile -- I wouldn't go so far as to preemptively disavow unless something major popped up.
Jim: I've done a preemptive disavow for my site. I'd say everyone should do a preemptive disavow to clean out the crap backlinks.
Joe: I can't wait to launch an avow service...basically go around to everyone and charge a few thousand dollars to clean up their disavows. :)
Jim: We should team up Joe and do them together :)
Joe: I'll have my spambots call your spambots.
Jim: saving the planet from penguin penalties. cleaning up the links of the web for Google.
Joe: For Google or from Google? :) The other dig, if there's time, is that not all penalties are created equal because there are several books of law in terms of how long a penalty might last. If I take an unknown site and do what RapGenius did, I'd still be waiting, even after fixing (which rapgenius really didn't do) largely because Google is not one of my direct or indirect investors.
Perhaps SEOs will soon offer a service for perfecting your pitch deck for the Google Ventures or Google Capital teams so it is easier to BeatThatPenalty? BanMeNot ;)
Joe: Or to extract money from former Googlers...there's a funding bubble right now where those guys can write their own ticket by VCs chasing the brand. Sure the engineer was responsible for changing the font color of a button, but they have friends on the inside still that might be able to reverse catastrophe.
Outside of getting a Google investment, what are some of the best ways to minimize SEO risk if one is entering a competitive market?
Jim: Don't try to rank for specific phrases anymore. It's a long slow road now.
Joe: Being less dependent on Google gives you power; think of it like a job interview. Do you need that job? The less you do, the more bargaining power you have. If you have more and more income coming in to your site from other channels, chances are you are also hitting on some important brand signals.
Jim: You must create great things, and build your brand...that has to be the focus...unless you want to do things to rank higher quicker, and take the risk of a penalty with Google.
Joe: Agreed. I do far fewer premium domaining + SEO-only plays anymore. For a while they worked; just a different landscape now.
Some (non-link builders) mention how foolish SEOs are for wasting so many thought cycles on links. Why are core content, user experience, and social media all vastly more important than link building?
Jim: Links are still the biggest part of the Google algorithm - they cannot be ignored. People must have things going on that will get them mentions across the web, and ideally some links as well. Links are still #1 today... but yes, after links, you need great content, good user experience, and more.
Joe: CopyPress sells content (please buy some content people; I have three kids to feed here), however it is important to point out that the most incredible content doesn't mean anything in a vacuum. How are you going to get a user experience with 0 users? Link building, purchasing traffic, DRIVING attention are crucial not just to SEO but to marketing in general. Google is using links as votes; while the variability has changed and evolved over time, it is still very much there. I don't see it going away in the next year or two.
An analogy: I wrote two books of poetry in college; I think they are ok, but I never published them and tried to get any attention, so how good are they really? Without promotion and amplification, we're all just guessing.
Thanks guys for sharing your time & wisdom!
About our contributors:
Jim Boykin is the Founder and CEO of Internet Marketing Ninjas, and owner of Webmasterworld.com, SEOChat.com, Cre8asiteForums.com and other community websites. Jim specializes in creating digital assets for sites that attract natural backlinks, and in analyzing links to disavow non-natural links for penalty recoveries.
Joe Sinkwitz, known as Cygnus, is current Chief of Revenue for CopyPress.com. He enjoys long walks on the beach, getting you the content you need, and then whispering in your ear how to best get it ranking.
For those new to optimizing clients' sites, or those seeking a refresher, we thought we'd put together a guide to step you through it, along with some selected deeper reading on each topic area.
Every SEO has different ways of doing things, but we'll cover the aspects that you'll find common to most client projects.
The best rule I know about SEO is that there are few absolutes. Google is a black box, so complete data sets will never be available to you. It can therefore be difficult to pin down cause and effect, and there will always be a lot of experimentation and guesswork involved. If something works, keep doing it. If it doesn't, try something else until it does.
Many opportunities tend to present themselves in ways not covered by "the rules". Many will be unique and specific to the client and market sector you happen to be working with, so it's a good idea to remain flexible and alert to new relationship and networking opportunities. SEO exists on the back of relationships between sites (links) and the ability to get your content remarked upon (networking).
When you work on a client site, you will most likely be dealing with a site that is already established, so it's likely to have legacy issues. The other main challenge you'll face is that you're unlikely to have full control over the site, like you would if it were your own. You'll need to convince other people of the merit of your ideas before you can implement them. Some of these people will be open to them, some will not, and some can be rather obstructive. So, the more solid data and sound business reasoning you provide, the better chance you have of convincing people.
The most important aspect of doing SEO for clients is not blinding them with technical alchemy, but helping them see how SEO provides genuine business value.
The first step in optimizing a client site is to create a high-level strategy.
"Study the past if you would define the future.â - Confucious
You're in discovery mode. Seek to understand everything you can about the client's business and their current position in the market. What is their history? Where are they now, and where do they want to be? Interview your client. They know their business better than you do, and they will likely be delighted when you take a deep interest in them.
Some SEO consultants see their task as gaining more rankings under an ever-growing list of keywords. But ranking for more keywords, or getting more traffic, may not result in measurable business returns; it depends on the business and the marketing goals. Some businesses will benefit from homing in on specific opportunities that are already being targeted; others will seek wider reach. This is why it's important to understand the business goals and market sector, then design the SEO campaign to support the goals and the environment.
This type of analysis also provides you with leverage when it comes to discussing specific rankings and competitor rankings. The SEO can't be expected to wave a magic wand and place a client top of a category in which they enjoy no competitive advantage. Even if the SEO did manage to achieve this feat, the client may not see much in the way of return, as it's easy for visitors to click other listings and compare offers.
Understand all you can about their market niche. Look for areas of opportunity, such as changing demand not being met by your client or competitors. Put yourself in their customers' shoes. Try to find customers and interview them. Listen to the language customers use. Go to the places where those customers hang out online. From the customers' language and needs, combined with the knowledge gleaned from interviewing the client, you can determine effective keywords and themes.
Document. Get it down in writing. The strategy will change over time, but you'll have a baseline point of agreement outlining where the site is now, and where you intend to take it. Getting buy-in early smooths the way for later on. Ensure that whatever strategy you adopt adds real, measurable value by being aligned with, and serving, the business goals. It's on this basis the client will judge you, and maintain or expand your services in future.
Sites can be poorly organized, have various technical issues, and missed keyword opportunities.
We need to quantify what is already there, and what's not there.
Broken links are a low-quality signal - debatably to Google, but certainly to users. If the client doesn't have one already, implement a system whereby broken links are checked on a regular basis. Orphaned pages are pages that have no links pointing to them; they may be redundant, in which case they should be removed, or you need to point inbound links at them, so they can be crawled and have more chance of gaining rank. Page titles should be unique, aligned with keyword terms, and made attractive in order to gain the click - a link is more attractive if it speaks to a customer need. Finally, carefully check robots.txt to ensure it's not blocking areas of the site that need to be crawled.
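A regular broken-link check of the kind described above can be sketched in a few lines of Python. The names here are illustrative, and the HTTP status fetcher is injected as a callable (one built on `urllib.request` or `requests` in practice) so the checker can be tested and rate-limited without hitting the network:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collects absolute href targets from anchor tags on a page."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # Resolve relative links against the page URL
                    self.links.append(urljoin(self.base_url, value))

def find_broken_links(page_url, html, fetch_status):
    """Return the links on a page whose HTTP status indicates an error.

    fetch_status: any callable mapping a URL to an HTTP status code.
    """
    parser = LinkExtractor(page_url)
    parser.feed(html)
    return [url for url in parser.links if fetch_status(url) >= 400]
```

Run it over the client's sitemap on a schedule and diff the results week to week; new breakage then surfaces before it accumulates.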
As part of the initial site audit, it might make sense to include the site in Google Webmaster Tools to see if it has any existing issues there and to look up its historical performance on competitive research tools to see if the site has seen sharp traffic declines. If they've had sharp ranking and traffic declines, pull up that time period in their web analytics to isolate the date at which it happened, then look up what penalties might be associated with that date.
Some people roll this into a site audit, but I'll split it out, as we're not looking at technical issues on competitor sites; we're looking at how they are positioned, and how they're doing it. In common with a site audit, there's some technical reverse engineering involved.
There are various tools that can help you do this. I use SpyFu. One reporting aspect that is especially useful is estimating the value of the SEO positions vs the AdWords positions. A client can then translate the ranks into dollar terms, and justify this back against your fee.
When you run these competitive reports, you can see what content of theirs is working well, and what content is gaining ground. Make a list of all competitor content that is doing well. Examine where their links are coming from, and make a list. Examine where they're mentioned in the media, and make a list. You can then use a fast-follow strategy to emulate their success, then expand upon it.
Sometimes "competitors", meaning ranking competitors, can actually be potential partners. They may not be in the same industry as your client; they may just happen to rank in a cross-over area. They may be good for a link, become a supplier, welcome advertising on their site, or be willing to place your content on their site. Make a note of the sites that are ranking well within your niche, but aren't direct competitors.
Using tools that estimate the value of ranks by comparing AdWords keyword prices, you can estimate the value of your competitors' positions. If your client appears lower than the competition, you can demonstrate the estimated dollar value of putting time and effort into increasing rank. You can also evaluate competitors' rate of improvement over time vs your client, and use this as a competitive benchmark. If your client is not putting in the same effort as their competitors, they'll be left behind. If those competitors are spending on ongoing SEO and seeing tangible results, there is some validation for your client to do likewise.
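The arithmetic behind such estimates is simple: for each keyword, multiply monthly searches by an estimated click-through rate at the ranking position, then by the AdWords CPC you would otherwise have to pay for that click. A rough sketch (the CTR-by-position table below is an assumption for illustration, not published data):

```python
def estimated_traffic_value(keywords):
    """Rough monthly dollar value of a set of organic positions.

    keywords: list of dicts with 'position', 'monthly_searches', 'cpc'.
    The CTR figures are illustrative placeholders; swap in whatever
    click-curve data you trust.
    """
    ctr_by_position = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05}
    total = 0.0
    for kw in keywords:
        ctr = ctr_by_position.get(kw["position"], 0.02)  # long tail fallback
        total += kw["monthly_searches"] * ctr * kw["cpc"]
    return total
```

Running this over your client's rankings and a competitor's rankings gives the side-by-side dollar comparison described above.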
A well organised site is useful from both a usability standpoint and an SEO standpoint. If it's clear to users where they need to go next, this will flow through into better engagement scores. If your client has a usability consultant on staff, this person is a likely ally.
It's a good idea to organise a site around themes. Anecdotal evidence suggests that Google likes pages grouped around similar topics, rather than disparate topics (see from 1.25 onwards).
A spreadsheet of all pages helps you group them thematically, preferably into directories with similar content. Your strategy document will guide you as to which pages you need to work on, and which you need to relegate. Some people spend a lot of time sculpting internal PageRank, i.e. flowing PageRank to some pages while using nofollow on other links so as not to pass link equity to them. Google may have deprecated that approach, but you can still link to important products or categories sitewide to flow them more link equity, while putting less important pages lower in the site's architecture. Favour your money pages, and relegate your less important pages.
Think mobile. If your content doesn't work on mobile, then getting to the top of search results won't do you much good.
Ensure your site is deep crawled. To check if all your URLs are included in Google's index, sign up with Webmaster Tools and/or other index reporting tools.
The accepted method to redirect a page is to use a 301. The 301 indicates a page has permanently moved location. A redirect is also useful if you change domains, or if you have links pointing to different versions of the site. For example, Google sees http://www.acme.com and http://acme.com as different sites. Pick one and redirect to it.
Here's a video explaining how:
If you don't redirect pages, then you won't be making full use of any link juice allocated to those pages.
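In practice you would configure the www/non-www redirect in your web server, but to make the behaviour concrete, here is a minimal sketch of canonical-host redirection as Python WSGI middleware (`www.acme.com` is the example domain from above; the function name is illustrative):

```python
def canonical_host_redirect(app, canonical="www.acme.com"):
    """WSGI middleware: 301-redirect any request whose Host header
    differs from the canonical host, e.g. acme.com -> www.acme.com."""
    def wrapper(environ, start_response):
        host = environ.get("HTTP_HOST", "")
        if host and host != canonical:
            # Permanent redirect: tells crawlers the page has moved for good
            location = "http://%s%s" % (canonical, environ.get("PATH_INFO", "/"))
            start_response("301 Moved Permanently", [("Location", location)])
            return [b""]
        return app(environ, start_response)  # already canonical: pass through
    return wrapper
```

The key detail is the `301 Moved Permanently` status: a 302 would signal a temporary move and not consolidate the two hostnames the same way.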
Backlinks remain a major ranking factor. Generally, the more high quality links you have pointing to your site, the better youâll do in the results. Of late, links can also harm you. However, if your overall link profile is strong, then a subset of bad links is unlikely to cause you problems. A good rule of thumb is the Matt Cutts test. Would you be happy to show the majority of your links to Matt Cutts? :) If not, you're likely taking a high risk strategy when it comes to penalties. These can be manageable when you own the site, but they can be difficult to deal with on client sites, especially if the client was not aware of the risks involved in aggressive SEO.
Getting links involves either direct placement or being linkworthy. On some sites, like industry directories, you can pay to appear. In other cases, it's about making your site an attractive linking target.
Getting links to purely commercial sites can be a challenge. Consider sponsoring charities aligned with your line of business. Get links from local chambers of commerce. Connect with educational establishments that are doing relevant research and consider sponsoring them or becoming involved in some way.
Look at the sites that point to your competitors. How were these links obtained? Follow the same path. If they successfully used white papers, then copy that approach. If they successfully used news, do that, too. Do whatever seems to work for others. Evaluate the result. Do more/less of it, depending on the results.
You also need links from sites that your competitors don't have. Make a list of desired links. Figure out a strategy to get them. It may involve supplying them with content. It might involve participating in their discussions. It may involve giving them industry news. It might involve interviewing them or profiling them in some way, so they link to you. Ask "what do they need?" Then give it to them.
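The competitor comparison boils down to simple set arithmetic. A sketch, with hypothetical domain lists standing in for exports from a backlink tool:

```python
def link_gap(competitor_links, our_links):
    """Domains linking to a competitor but not to us: the outreach shortlist."""
    return sorted(set(competitor_links) - set(our_links))

# Hypothetical backlink-tool exports.
targets = link_gap(
    ["industryblog.com", "chamberofcommerce.org", "newssite.com"],
    ["industryblog.com"],
)
```

Each domain on the resulting list then gets the "what do they need?" treatment: content, news, an interview, or whatever else earns the link.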
Of course, linking is an ongoing strategy. As a site grows, many links will come naturally; growing in importance and consumer interest relative to the competition is itself a link acquisition strategy. This involves your content strategy. Do you have content that your industry likes to link to? If not, create it. If your site is not something your industry links to, like a brochure site, consider spinning off a second site that is information focused and less commercially focused. You sometimes see blogs on separate domains where employees talk about general industry topics, like Signal vs. Noise, Basecamp's blog. These are much more likely to receive links than sites that are purely commercial in nature.
Before chasing links, you should be aware of what type of site typically receives links, and make sure you're it.
Once you have a list of keywords, an idea of where competitors rank, and what the most valuable terms are from a business point of view, you can set about examining and building out content.
Do you have content to cover your keyword terms? If not, add it to the list of content that needs to be created. If you have content that matches terms, see if it compares well with competing content on the same topic. Can the pages be expanded or made more detailed? Can more/better internal links be added? Will the content benefit from amalgamating different content types, e.g. videos, audio, and images?
You'll need to create content for any keyword areas you're missing. Rather than copy what is already available in the niche, look at the best ranking/most valuable content for that term and ask how it could be made better. Are there new industry analyses or reports that you can incorporate and/or expand on? People love the new. They like learning things they don't already know. Me-too content can work, but it's not making the most of the opportunity. Aim to produce considerably more valuable content than already exists, as you'll have more chance of getting links, and more chance of higher levels of engagement when people flip between sites. If visitors can get the same information elsewhere, they probably will.
Consider keyword co-occurrence. What terms are readily associated with the keywords you're chasing? Various tools provide this analysis, but you can do it yourself using the AdWords research tool. See what keywords it associates with your keywords. The Google co-occurrence algorithm is likely the same for both AdWords and organic search.
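You can approximate a co-occurrence analysis yourself by counting which terms appear together across a list of related-keyword suggestions. The phrases below are hypothetical stand-ins for an AdWords export:

```python
from collections import Counter
from itertools import combinations

def cooccurrence(keyword_phrases):
    """Count how often pairs of terms appear together in suggested phrases."""
    pairs = Counter()
    for phrase in keyword_phrases:
        # Normalise and deduplicate terms, then count every unordered pair.
        terms = sorted(set(phrase.lower().split()))
        pairs.update(combinations(terms, 2))
    return pairs

# Hypothetical related-keyword suggestions for a legal-services niche.
counts = cooccurrence([
    "divorce lawyer",
    "divorce attorney",
    "divorce lawyer fees",
])
```

The highest-count pairs suggest terms worth covering on the same page, since they are the associations the suggestion tool keeps surfacing together.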
Also, think about how people will engage with your page. Is it obvious what the page is about? Is it obvious what the user must do next? Dense text and distracting advertising can reduce engagement, so make sure the usability is up to scratch. Text should be a reasonable size so the average person isn't squinting. It should be broken up with headings and paragraphs. People tend to scan when reading online, searching for immediate confirmation they've found the right information. This was written a long time ago, but it's interesting how relevant it remains.
Sites that don't link out appear unnatural. Matt Cutts noted:
Of course, folks never know when we're going to adjust our scoring. It's pretty easy to spot domains that are hoarding PageRank; that can be just another factor in scoring. If you work really hard to boost your authority-like score while trying to minimize your hub-like score, that sets your site apart from most domains. Just something to bear in mind.
Links out are both a quality signal and good PR practice. Webmasters look at their inbound links, and will likely follow them back to see what is being said about them. That's a great way to foster relationships, especially if your client's site is relatively new. If you put other companies and people in a good light, you can expect many to reciprocate in kind.
Links, the good kind, are about human relationships.
It's also good for your users. Your users are going to leave your site, one way or another, so you can pick up some kudos if you help them on their way by pointing them to some good authorities. If you're wary about linking to direct competitors, then look for information resources, such as industry blogs or news sites, or anyone else you want to build a relationship with. Link to suppliers and related companies in close, but non-competing niches. Link to authoritative sites. Be very wary about pointing to low value sites, or sites that are part of link schemes. Low value sites are obvious. Sites that are part of link schemes are harder to spot, but typically feature link swapping schemes or obvious paid links unlikely to be read by visitors. Avoid link trading schemes. It's too easy to be seen as a part of a link network, and it's no longer 2002.
It's not set and forget.
Clients can't do a one-off optimisation campaign and expect it to keep working forever. It may be self-serving for SEOs to say it, but it's also the truth. SEO is ongoing because search keeps changing and competitors and markets move. Few companies would dream of only having one marketing campaign. The challenge for the SEO, like any marketer, is to prove the ongoing spend produces a return in value.
Whole books can be written about SEO for clients. And they have. We've skimmed across the surface but, thankfully, there is a wealth of great information out there on the specifics of how to tackle each of these topic areas.
Perhaps you can weigh in? :) What would your advice be to those new to optimizing client sites? What do you wish someone had told you when you started?
Facebook's early motto was "move fast and break things," but as they wanted to become more of a platform play they changed it to "move fast with stability." Anything which is central to the web needs significant stability, or it destroys many other businesses as a side effect of its instability.
As Google has become more dominant, they've moved in the opposite direction. Disruption is promoted as a virtue unto itself, so long as it doesn't adversely impact the home team's business model.
There are a couple different ways to view big search algorithm updates. Large, drastic updates implicitly state one of the following:
Any change or disruption is easy to justify so long as you are not the one facing the consequences:
"Smart people have a problem, especially (although not only) when you put them in large groups. That problem is an ability to convincingly rationalize nearly anything." ... "Impostor Syndrome is that voice inside you saying that not everything is as it seems, and it could all be lost in a moment. The people with the problem are the people who can't hear that voice." - Googler Avery Pennarun
Make no mistake, large changes come with false positives and false negatives. If a monopoly keeps buying marketshare, then any mistakes they make have more extreme outcomes.
Here's the Flash update screen (which hits almost every web browser EXCEPT Google Chrome).
Notice the negative option installs for the Google Chrome web browser and the Google Toolbar in Internet Explorer.
Why doesn't that same process hit Chrome? They not only pay Adobe to use security updates to steal marketshare from other browsers, but they also pay Adobe to embed Flash inside Chrome, so Chrome users never go through the bundleware update process.
Anytime anyone using a browser other than Chrome gets a Flash security update, they need to opt out of the bundleware or they end up installing Google Chrome as their default web browser. This is a primary reason Firefox marketshare is in decline.
Google engineers "research" new forms of Flash security issues to drive critical security updates.
Obviously, users love it:
Has anyone noticed that the latest Flash update automatically installs Google Toolbar and Google Chrome? What a horrible business decision Adobe. Force installing software like you are Napster. I would fire the product manager that made that decision. As a CTO I will be informing my IT staff to set Flash to ignore updates from this point forward. QA staff cannot have additional items installed that are not part of the base browser installation. Ridiculous that Adobe snuck this crap in. All I can hope now is to find something that challenges Photoshop so I can move my design team away from Adobe software as well. Smart move trying to make pennies off of your high dollar customers.
In Chrome Google is the default search engine. As it is in Firefox and Opera and Safari and Android and iOS's web search.
In other words, in most cases across most web interfaces you have to explicitly change the default to not get Google. And then even when you do that, you have to be vigilant in protecting against the various Google bundleware bolted onto core plugins for other web browsers, or else you still end up in an ecosystem owned, controlled & tracked by Google.
Those "default" settings are not primarily driven by user preferences, but by a flow of funds. A few hundred million dollars here, a billion there, and the market is sewn up.
Google's user tracking is so widespread & so sophisticated that their ad cookies were a primary tool for government surveillance efforts.
And Chrome is easily the most locked down browser out there.
Whenever Google wants to promote something they have the ability to bundle it into their web browser, operating system & search results to try to force participation. In a fluid system with finite attention, over-promoting one thing means under-promoting or censoring other options. Google likes to have their cake & eat it too, but the numbers don't lie.
This brings us back to the current snafu with the "right to be forgotten" in Europe.
Google notified publishers like the BBC & The Guardian of their links being removed due to the EU "right to be forgotten" law. Their goal was to cause a public relations uproar over "censorship" which seems to have been a bit too transparent, causing them to reverse some of the removals after they got caught with their hand in the cookie jar.
Some have looked at the EU policy and compared it to state-run censorship in China.
Google already hires over 10,000 remote quality raters to rate search results. How exactly is receiving 70,000 requests a monumental task? As their public relations propagandists paint this as an unbelievable burden, they are also highlighting how their own internal policies destroy smaller businesses: "If a multi-billion dollar corporation is struggling to cope with 70,000 censor requests, imagine how the small business owner feels when he/she has to disavow thousands or tens of thousands of links."
Google aims to promote themselves as a digital librarian: "It's a bit like saying the book can stay in the library, it just cannot be included in the library's card catalogue."
That analogy is absurd on a number of levels. Which librarian...
David Drummond's breathtaking propaganda makes it sound like Google has virtually no history in censoring access to information:
In the past we've restricted the removals we make from search to a very short list. It includes information deemed illegal by a court, such as defamation, pirated content (once we're notified by the rights holder), malware, personal information such as bank details, child sexual abuse imagery and other things prohibited by local law (like material that glorifies Nazism in Germany).
Yet Google sends out hundreds of thousands of warning messages in webmaster tools every single month.
Google is free to force whatever (often both arbitrary and life altering) changes they desire onto the search ecosystem. But the moment anyone else wants any level of discourse or debate into the process, they feign outrage over the impacts on the purity of their results.
Despite Google's great power they do make mistakes. And when they do, people lose their jobs.
They were penalized November 17, 2012.
At a recent SMX conference Matt Cutts stated MetaFilter was a false positive.
People noticed the Google update when it happened. It is hard to miss an overnight 40% decline in your revenues. Yet when they asked about it, Google did not confirm its existence. That economic damage hit MetaFilter for nearly two years, and they only got a potential reprieve after they fired multiple employees and were able to generate publicity about what had happened.
As SugarRae mentioned, those false positives happen regularly, but most of the people who are hit by them lack political and media influence, and are thus slaughtered with no chance of recovery.
MetaFilter is no different than tens of thousands of other good, worthy small businesses who are also laying off employees, some even closing their doors, as a result of Google's Panda filter serving as judge, jury and executioner. They've been as blindly and unfairly cast away to an island and no one can hear their pleas for help.
The only difference between MetaFilter and tons of other small businesses on the web is that MetaFilter has friends in higher places.
If you read past the headlines & the token slaps of big brands, these false positive death sentences for small businesses are a daily occurrence.
And such stories are understated for fear of coverage creating a witch-hunt:
Conversations I've had with web publishers, none of whom would speak on the record for fear of retribution from Cutts' webspam team, speak to a litany of frustration at a lack of transparency and potential bullying from Google. "The very fact I'm not able to be candid, that's a testament to the grotesque power imbalance that's developed," the owner of one widely read, critically acclaimed popular website told me after their site ran afoul of Cutts' last Panda update.
Not only does Google engage in anti-competitive censorship, but they also frequently publish misinformation. Here's a story from a week ago of a restaurant which went under after someone changed their Google listing store hours to be closed on busy days. That misinformation was embedded directly in the search results. That business is no more.
Then there are areas like locksmiths:
I am one of the few Real Locksmiths here in Denver and I have been struggling with this for years now. I only get one or two calls a day now thanks to spammers, and that's not calls I do, it's calls for prices. For instance I just got a call from a lady locked out of her apt. It is 1130 pm so I told her 75 dollars, Nope she said someone told her 35 dollars....a fake locksmith no doubt. She didn't understand that they meant 35 dollars to come out and look at it. These spammers charge hundreds to break your lock, they don't know how to pick a lock, then they charge you 10 times the price of some cheap lock from a hardware store. I'm so lost, I need help from google to remove those listings. Locksmithing is all I have ever done and now I'm failing at it.
There are entire sectors of the offline economy being reshaped by Google policies.
When those sectors get coverage, the blame always goes to the individual business owner who was (somehow?) personally responsible for Google's behaviors, or perhaps some coverage of the nefarious "spammers."
Never does anybody ask if it is reasonable for Google to place their own inaccurate $0 editorial front and center. To even bring up that issue makes one an anti-capitalist nut or someone who wishes to impede on free speech rights. This even after the process behind the sausage comes to light.
And while Google arbitrarily polices others, their leaked internal documents contain juicy quotes about their ad policies like:
John Milton in his fiery 1644 defense of free speech, Areopagitica, was writing not against the oppressive power of the state but of the printers guilds. Darnton said the same was true of John Locke's writings about free speech. Locke's boogeyman wasn't an oppressive government, but a monopolistic commercial distribution system that was unfriendly to ways of organizing information that didn't fit into its business model. Sound familiar?
When Google complains about censorship, they are not really complaining about what may be, but what already is. Their only problem is the idea that someone other than themselves should have any input in the process.
"Policy is largely set by economic elites and organized groups representing business interests with little concern for public attitudes or public safety, as long as the public remains passive and obedient." - Noam Chomsky
Many people have come to the same conclusion:
"I think as technologists we should have some safe places where we can try out some new things and figure out what is the effect on society, what's the effect on people, without having to deploy kind of into the normal world. And people like those kind of things can go there and experience that and we don't have mechanisms for that." - Larry Page
I have no problem with an "opt-in" techno-utopia test in some remote corner of the world, but if that's the sort of operation he wants to run, it would be appreciated if he stopped bundling his software into billions of electronic devices & assumed everyone else is fine with "opting out."
A couple years ago we published an article named Branding & the Cycle, which highlighted how brands would realign with the algorithmic boost they gained from Panda & leverage their increased level of trust to increase their profit margins by leveraging algorithmic journalism.
Narrative Science has been a big player in the algorithmic journalism game for years. But they are not the only player in the market. Recently the Associated Press (AP) announced they will use algorithms to write articles based on quarterly earnings reports, working with a company named Automated Insights:
We discovered that automation technology, from a company called Automated Insights, paired with data from Zacks Investment Research, would allow us to automate short stories - 150 to 300 words - about the earnings of companies in roughly the same time that it took our reporters.
And instead of providing 300 stories manually, we can provide up to 4,400 automatically for companies throughout the United States each quarter.
Zacks maintains the data when the earnings reports are issued. Automated Insights has algorithms that ping that data and then in seconds output a story.
In the past Matt Cutts has mentioned how thin rewrites are doorway page spam:
you can also have more subtle doorway pages. so we ran into a directv installer in denver, for example. and that installer would say I install for every city in Colorado. so I am going to make a page for every single city in Colorado. and Boulder or Aspen or whatever I do directv install in all of those. if you were just to land on that page it might look relatively reasonable. but if you were to look at 4 or 5 of those you would quickly see that the only difference between them is the city, and that is something that we would consider a doorway.
One suspects these views do not apply to large politically connected media bodies like the AP, which are important enough to have a direct long-term deal with Google.
The above announcement also noted the AP will include automated NFL player rankings. One interesting thing to note about the AP is they have syndication deals with 1,400 daily newspapers nationwide, as well as thousands of TV and radio stations.
A single automated AP article might appear on thousands of websites. When thousands of articles are automated, that means millions of copies. When millions of articles are automated, that means billions of copies. When billions ... you get the idea.
To date Automated Insights has raised a total of $10.8 million. With that limited funding they are growing quickly. Last year their Wordsmith software produced 300 million stories & this year it will likely exceed a billion articles:
"We are the largest producer of content in the world. That's more than all media companies combined," [Automated Insights CEO Robbie Allen] said in a phone interview with USA TODAY.
The Automated Insights homepage lists both Yahoo! & Microsoft as clients.
The above might sound a bit dystopian (for those with careers in journalism and/or lacking equity in Automated Insights and/or publishers who must compete against algorithmically generated content), but the story also comes with a side of irony.
Last year Google dictated that press releases use nofollow links. All the major press release sites quickly fell in line & adopted nofollow, thinking they would remain in Google's good graces. Unfortunately for those sites, they were crushed by Panda. PR Newswire's solution to their penalty was greater emphasis on manual editorial review:
Under the new copy quality guidelines, PR Newswire editorial staff will review press releases for a number of message elements, including:
- Inclusion of insightful analysis and original content (e.g. research, reporting or other interesting and useful information);
- Use of varied release formats, guarding against repeated use of templated copy (except boilerplate);
- Assessing release length, guarding against issue of very short, unsubstantial messages that are mere vehicles for links;
- Overuse of keywords and/or links within the message.
So now we are in a situation where press release sites require manual human editorial oversight to try to get out of being penalized, and the news companies (which currently enjoy algorithmic ranking boosts) are leveraging those same "spammy" press releases using software to auto-generate articles based on them.
That makes sense & sounds totally reasonable, so long as you don't actually think about it (or work at Google)...
About a month ago, a year-old case of an SEO firm being sued by its client resurfaced via a tweet from Matt Cutts.
I'd like to add something to this conversation that will be helpful for you as a service provider seeking to avoid that really, really scary issue.
Some quick background information, if I may? No specifics are allowed, but I've been a party, on both sides, to actual litigation pertaining to SEO contracts (not services rendered, just contractual issues with a third party).
I've been the plaintiff and the defendant in cases involving contractual disputes and legal obligations so I, much to my dismay, speak from experience.
Suffice to say I'm not a lawyer; don't act on any of this advice without talking it over with your counsel so they can tailor it to your specific needs and state law.
There are essentially 3 ways to legally protect yourself and/or your company objectively. I say objectively because anyone can sue you for anything and "service" is a subjective term as are "results" unless they are specifically spelled out in your contract.
Objectively speaking, the law gives you 3 broad arenas for protective measures:
Get a real lawyer, do not use internet "templates" and do not modify any piece of the contract yourself. Make sure your attorney completely understands what you do. A good lawyer will listen to you. Heck, mine now knows who Matt Cutts is and where the Webmaster Guidelines are located and what "anchor text" is :)
Your contracts need to cover the following scenarios:
For standard client agreements you'll want to cover some basic areas:
Some important notes are needed to discuss a couple of core areas of the contract:
For Governing law, go with your home state if possible. Ideally, I try to get an arbitration clause in there rather than state law, so that if there is a dispute it goes to a much less expensive form of resolution.
However, you can make an argument that if your contract is signed with your home state as governing law and your language is strong you are better off doing that instead of arbitration where one person makes a decision and no appeal is available.
For Limit of Liability, go broad, real broad. You want to spell out that organic search (or just about any service) is not guaranteed to produce results, no promises were made, and Google does not fully publish the algorithm, thus you can't be held liable for XYZ that happens.
Also, if your client is asking you to do things against webmaster guidelines, and you decide to do them, you NEED to get that documented. Have them email it to you, record the call, something. Here is the liability clause in my contract:
Client agrees and acknowledges that the internet is an organic, constantly shifting entity, and that Client's ranking and/or performance in a search engine may change for many reasons and be affected by many factors, including but not limited to any actual or alleged non-compliance by Provider to guidelines set forth by Google related to search engine optimization.
Client agrees that no representation, express or implied, and no warranty or guaranty is provided by Provider with respect to the services to be provided by Provider under this Agreement. Provider's services may be in the form of rendering consultation which Client may or may not choose to act on. To the maximum extent permitted by law, Client agrees to limit the liability of Provider and its officers, owners, agents, and employees to the sum of Provider's fees actually received from Client.
This limitation will apply regardless of the cause of action or legal theory pled or asserted. In no event shall Provider be liable for any special, incidental, indirect, or consequential damages arising from or related to this Agreement or the Project. Client agrees, as a material inducement for Provider to enter into this Agreement, that the success and/or profitability of Client's business depends on a variety of factors and conditions beyond the control of Provider and the scope of this Agreement. Provider makes no representations or warranties of any kind regarding the success and/or profitability of Client's business, or lack thereof, and Provider will not be liable in any manner respecting the same.
Client agrees to indemnify and hold harmless Provider and its officers, owners, agents, and employees from and against any damages, claims, awards, and reasonable legal fees and costs arising from or related to any services provided by Provider, excepting only those directly arising from Provider's gross negligence or willful misconduct.
For vendor and independent contractor agreements you'll want most of the aforementioned clauses (especially the relationship of parties) in addition to a few more things (for employee stuff, get with your lawyer, because states are quite different and a lot of us use remote workers in different states).
These clauses essentially prohibit the pursuit of your clientele and employees by a vendor/contractor for a specified period of time.
Don't be a sole proprietor, ever. If you're a smaller shop you might consider being a single member LLC (just you), an LLC (you and employees), or an S Corp. If you're a larger operation you might want to incorporate and go Inc.
The benefits of the LLC set up are:
Benefits of an S Corp are:
With the S Corp there's more paperwork and filings, but if you are earning a fair bit of money it may be worth it to you. Here's a good article breaking this all down, and an excerpt:
"If you operate your business as a sole proprietorship or partnership/LLC, you will pay roughly 15.3% in self-employment taxes on your $100,000 of profits. The calculations get a little tricky if you want to be really super-precise but you can think about self-employment tax as roughly a 15% tax. So 15% on $100,000 equals $15,000. Roughly."
"With an S corporation, you split your business profits into two categories: "shareholder wages" and "distributive share." Only the "shareholder wages" get subjected to the 15.3% tax. The leftover "distributive share" is not subject to 15.3% tax."
Be careful here (and I'm not a CPA so don't do anything without consulting with your accountant) not to be absurd with your wages. So, if your net income is 1 million don't take 25k in wages and 975k as a distribution.
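The excerpt's arithmetic can be verified in a few lines. The 50/50 wage split below is a hypothetical illustration only, and the flat 15.3% rate ignores wage-base caps and other details; as above, this is not tax advice:

```python
SE_TAX_RATE = 0.153  # rough self-employment tax rate from the excerpt

def sole_prop_tax(profit):
    """Sole proprietorship/LLC: the whole profit bears the ~15.3% tax."""
    return profit * SE_TAX_RATE

def s_corp_tax(wages):
    """S corp: only the shareholder-wage portion bears the ~15.3% tax."""
    return wages * SE_TAX_RATE

profit = 100_000
baseline = sole_prop_tax(profit)       # roughly $15,300
split = s_corp_tax(wages=50_000)       # roughly $7,650 on a $50k salary
savings = baseline - split
```

Even a defensible 50/50 split roughly halves the self-employment tax bill, which is why the wage level you pick matters and why a CPA should sign off on it.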
Some final thoughts on entities:
As you would imagine, insurance policies are few and far between for our industry. You can get general liability for your office, workers comp for your employees, disability for yourself, and so on. However, what you might want to look into is a professional liability policy.
You'll probably end up looking at a miscellaneous one like the one here (marketing consultant?) offered by Travelers. You'll probably have to educate your agent on your business practices to ensure proper coverage.
This might be worth it just due to the legal protection clause; meaning they will pay for a lawyer to defend you. Having the proper entity classification might protect your assets but paying lawyers is expensive to defend even frivolous lawsuits.
This is a bit out of the "contract" topic but good record keeping is essential. If you use a project management and/or a CRM system you really should make sure you can export when you need it.
Many online CRM applications and project management applications have limited export capabilities, especially when it comes to exporting comments and notes on things like tasks and records. Most have an API that a developer can use to custom code an export of your data. I'd look into this as well.
Get with your attorney and CPA to get your specific situations up to legal snuff if you haven't already. Don't act on my advice as I'm not a lawyer nor a CPA. Contracts and agreements are not fun to negotiate and can be even harder when you work with people you generally trust.
However, when it comes to business dealings and contracts I would save my trust for my lawyer :)
The internet runs on advertising. Google is funded almost entirely by advertising. Facebook, likewise. Digital marketing spend continues to rise:
Internet advertising revenues in the United States totaled $12.1 billion in the fourth quarter of 2013, an increase of 14% from the 2013 third-quarter total of $10.6 billion and an increase of 17% from the 2012 fourth-quarter total of $10.3 billion. 2013 full year internet advertising revenues totaled $42.78 billion, up 17% from the $36.57 billion reported in 2012.
Search advertising spend comes out on top, but that's starting to change:
Search accounted for 41% of Q4 2013 revenues, down from 44% in Q4 2012, as mobile devices have shifted search-related revenues away from the desktop computer. Search revenues totaled $5.0 billion in Q4 2013, up 10% from Q4 2012, when Search totaled $4.6 billion
The growth area for digital advertising lies in mobile:
Mobile revenues totaled 19% of Q4 2013 revenues, or $2.3 billion, up 92% from the $1.2 billion (11% of total) reported in Q4 2012
Prominent venture capitalist Mary Meeker recently produced an analysis that also highlights this trend.
So, internet advertising is growing, but desktop internet adoption is slowing down. Meanwhile, mobile and tablet adoption is increasing fast, yet advertising spend on those mediums is comparatively low. That's a nice opportunity for mobile; however, mobile advertising is proving hard to crack. Not many people click on paid links on mobile, and many mobile ad clicks are accidental, driving down advertiser bids.
This is not just a problem for mobile. There may be a problem with advertising in general. It's about trust, and lack thereof. This situation also presents a great opportunity for selling SEO.
But first, a little background....
Advertising's golden age was in the '50s and '60s.
Most consumers were information poor. At least, they were information poor when it came to getting timely information. This information asymmetry played into the hands of the advertising industry. The advertising agency provided the information that helped match the problems people had with a solution. Of course, they were framing the problem in a way that benefited the advertiser. If there wasnât a problem, they made one up.
Today, the internet puts real-time information about everything in the hands of the consumer. It is easy for people to compare offers, so the basis for advertising - which is essentially biased information provision - is being eroded. Most people see advertising as an intrusion. Just because an advertiser can get in front of a consumer at "the right time" does not necessarily mean people will buy what the advertiser has to offer with great frequency.
Your mobile phone pings. "You're passing Gordon's Steak House... come in and enjoy our Mega Feast!" You can compare that offer against a wide range of others, and you can do so in real time. More than likely, you'll just resent the intrusion. After all, you may be a happy regular at Susan's Sushi.
"Knowing things" is not exclusive. Being able to "know things" is a click away. If information is freely available, then people are less likely to opt for whatever is pushed at them by advertisers at that moment. If it's easy to research, people will do so.
This raises a problem when it comes to the economics of content creation. If advertising becomes less effective for the advertiser, then the advertiser is going to reduce spend, or shift it elsewhere. If they do, then what becomes of the predominant web content model, which is based on advertising?
We're seeing it in broadcast television, and we'll see it on the web.
Television is dying and being replaced by the Netflix model. There is a lot of content. There are not enough advertisers paying top dollar as the audience is now highly fragmented. As a result, a lot of broadcast television advertising can be ineffective. However, as we've seen with Netflix and Spotify, people are prepared to pay directly for the content they consume in the form of a monthly fee.
The long term trend for advertising engagement on the web is not favourable.
The very first banner advertisement appeared in 1994. The click-through rate of that banner ad was a staggering 44%. It had novelty value, certainly. The first banner ad also existed in an environment where there wasn't much information. The web was almost entirely about navigation.
Today, there is no shortage of content. The average Facebook advertisement click-through rate is around 0.04%. Advertisers get rather excited if they manage to squeeze 2% or 3% click-through rates out of Facebook ads.
Digital advertising is no longer novel, so the click-through rate has plummeted. Not only do people feel that advertising isn't relevant to them, they have learned to ignore advertising even when the ad speaks directly to their needs. 97-98% of the time, people will not click on the ad.
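To put those rates in concrete terms, here is a minimal sketch (the impression count is an arbitrary illustrative figure; the click-through rates are the ones cited above):

```python
# Turn the click-through rates cited above into absolute click counts.
impressions = 100_000          # arbitrary illustrative impression count
avg_facebook_ctr = 0.0004      # the ~0.04% average Facebook ad CTR
strong_ctr = 0.02              # a 2% CTR an advertiser would celebrate

print(int(impressions * avg_facebook_ctr))   # 40 clicks per 100k impressions
print(int(impressions * strong_ctr))         # 2000 clicks per 100k impressions
print(round((1 - strong_ctr) * 100))         # even then, 98% of viewers never click
```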
And why should they? Information isnât hard to come by. So what is the advertiser providing the prospective customer?
Even brand engagement is plummeting on Facebook as the novelty wears off, and Facebook changes policy:
According to a new report from Simply Measured, the total engagement for the top 10 most-followed brands on Facebook has declined 40 percent year-over-year, even as brands have increased the amount of content they're posting by 20.1 percent.
Our industry runs on advertising. Much of web publishing runs on advertising.
However, Eric Clemons makes the point that the traditional method of advertising was always bound to fail, mainly because after the novelty wears off, it's all about interruption, and nobody likes to be interrupted.
But wait! Isn't the advantage of search that it isn't interruption advertising? In search, the user requests something. Clemons feels that search results can still be a form of misdirection:
Misdirection, or sending customers to web locations other than the ones for which they are searching. This is Google's business model. Monetization of misdirection frequently takes the form of charging companies for keywords and threatening to divert their customers to a competitor if they fail to pay adequately for keywords that the customer is likely to use in searches for the companies' products; that is, misdirection works best when it is threatened rather than actually imposed, and when companies actually do pay the fees demanded for their keywords. Misdirection most frequently takes the form of diverting customers to companies that they do not wish to find, simply because the customer's preferred company underbid.
He who pays becomes "relevant":
it is not scalable; it is not possible for every website to earn its revenue from sponsored search and ultimately at least some of them will need to find an alternative revenue model.
The companies that appear high on PPC are the companies who pay. Not every company can be on top, because not every company can pay top dollar. So, what the user sees is not necessarily what the user wants, but the company that has paid the most - as weighted by its quality score - to be there.
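The dynamic described above - highest effective bid wins, moderated by quality score - can be illustrated with a toy auction. This is only a sketch: the advertiser names and numbers below are hypothetical, and Google's real ad auction uses more signals than a simple bid × quality product.

```python
# Toy ad auction: position is determined by bid * quality score,
# not by bid alone. All names and figures here are hypothetical.
advertisers = [
    {"name": "BigBrandCo",  "bid": 5.00, "quality": 4},
    {"name": "SmallShop",   "bid": 2.00, "quality": 9},
    {"name": "MidMarketCo", "bid": 3.00, "quality": 5},
]

# Compute each advertiser's effective rank score.
for a in advertisers:
    a["ad_rank"] = a["bid"] * a["quality"]

# Highest ad_rank takes the top slot.
ranked = sorted(advertisers, key=lambda a: a["ad_rank"], reverse=True)
print([a["name"] for a in ranked])
# -> ['BigBrandCo', 'SmallShop', 'MidMarketCo']
```

Even with a far better quality score, the small advertiser can still be outbid out of the top slot - which is the point the paragraph above is making.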
But nowadays, the metrics of this channel have changed dramatically, making it impossible or nearly impossible for small and mid-sized businesses to turn a profit using AdWords. In fact, most small businesses can't break even using AdWords. This goes for many large businesses as well, but they don't care. And that is the key difference, and precisely why small brands using AdWords nowadays are being bludgeoned out of existence
Similarly, the organic search results are often dominated by large companies and entities. This is a direct or side-effect of the algorithms. Big entities create a favourable footprint of awareness, engagement and links as a result of PR, existing momentum, brand recognition, and advertising campaigns. It's a lot harder for small companies to dominate lucrative, competitive niches, as they can't create those same footprints.
Certainly when it comes to PPC, the search visitor may be presented with various big-player links at the expense of smaller players. Google, like every other advertising-driven medium, is beholden to its big advertisers. Jakob Nielsen noted in 1998:
Ultimately, those who pay for something control it. Currently, most websites that don't sell things are funded by advertising. Thus, they will be controlled by advertisers and will become less and less useful to the users
Being informed has changed customer behaviour.
The problem is not the medium, the problem is the message, and the fact that it is not trusted, not wanted, and not needed.
People don't trust ads. There is a vast literature to support this. Is it all wrong?
People don't want ads. Again, there is a vast literature to support this. Think about your own behavior: your own channel surfing and fast forwarding, and the timing of when you leave the TV to get a snack. Is it during the content or the commercials?
People don't need ads. There is a vast amount of trusted content on the net. Again, there is literature on this. But think about how you form your opinion of a product: from online ads, or online reviews?
There is no shortage of places to put ads. Competition among them will be brutal. Prices will be driven lower and lower, for everyone but Google.
If advertising is not scalable, then a lot of content based on advertising will die. Advertising may not be able to support the net:
Now reality is reasserting itself once more, with familiar results. The number of companies that can be sustained by revenues from internet advertising turns out to be much smaller than many people thought, and Silicon Valley seems to be entering another "nuclear winter"
A lot of AdSense publishers are being kicked from the program. Many are terminated without reason given. Google appears to be systematically culling the publisher herd. Why? Shouldn't web publishing, supported by advertising, be growing?
The continuing plunge in AdSense is in sharp contrast to robust 20% revenue growth in 2012, which outpaced AdWords' growth of 19%.....There are serious issues with online advertising affecting the entire industry. Google has reported declining value from clicks on its ads. And the shift to mobile ads is accelerating the decline, because it produces a fraction of the revenue of desktop ads.
Matt Sanchez, CEO of San Francisco-based ad network Say Media, recently warned that "Mobile Is Killing Media."
Digital publishing is headed off a cliff ... There's a five-fold gap between mobile revenue and desktop revenue ... What makes that gap even starker is how quickly it's happening ... On the industry's current course, that's a recipe for disaster.
Prices tumble when consumers have near-perfect real time information. Travel. Consumer goods. Anything generic that can be readily compared is experiencing falling prices and shrinking margins. Sales growth in many consumer categories is coming from the premium offerings. For example, beer consumption is falling across the board except in one area: boutique, specialist brews. That market sector is growing as customers become a lot more aware of options that are not just good enough, but great. Boutique breweries offer a more personal relationship, and they offer something the customer perceives as being great, not just âgood enoughâ.
Mass marketing is expensive. Most of the money spent on it is wasted. Products and services that are "just good enough" will be beaten by products and services that are a precise fit for consumers' needs. Good enough is no longer good enough; products and services need to be great and precisely targeted, unless you've got advertising money to burn.
Consumers will go to information suppliers they trust. There is always demand for a trusted source.
Trip Advisor is a great travel sales channel. It's a high-trust layer over a commodity product. People don't trust Trip Advisor, per se; they trust the process. Customers talk to each other about the merits, or otherwise, of holiday destinations. It's transparent. It's not interruption, misleading or distracting. Consumers seek it out.
Trust models will be one way around the advertising problem. This suits SEOs. If you provide trusted information, especially in a transparent, high-trust form, like Trip Advisor, you will likely win out over those using more direct sales methods. Consumers are getting a lot better at tuning those out.
The trick is to remove the negative experience of advertising by not appearing to be advertising at all. Long term, it's about developing relationships built on trust, not on interruption and misdirection. It's a good idea to think about advertising as a relationship process, as opposed to the direct marketing model on which the web is built - which is all about capturing the customer just before the point of sale.
Rand Fishkin explained the web purchase process well in this presentation. The process whereby someone becomes a customer, particularly on the web, isn't all about the late stages of the transaction. We have to think of it as a slow-burning relationship developed over time. The consumer comes to us at the end of an information comparison process. Really, it's an exercise in establishing consumer trust.
Amazon doesn't rely on advertising. Amazon is a trusted destination. If someone wants to buy something, they often just go direct to Amazon. Amazon's strategy involves what it calls "the flywheel", whereby the more things people buy from Amazon, the more they'll buy from Amazon in future. Amazon builds up a relationship rather than relying on a lot of advertising. Amazon cuts out the middleman and sells direct to customers.
Going viral with content, like Buzzfeed, may be one answer, but it's likely temporary. It, too, suffers from a trust problem, and the novelty will wear off:
Saying "I'm going to make this ad go viral" ignores the fact that the vast majority of viral content is ridiculously stupid. The second strategy, then, is the high-volume approach, same as it ever was. When communications systems wither, more and more of what's left is the advertising dust. Junk mail at your house, in your email; crappy banner ads on MySpace. Platforms make advertising cheaper and cheaper in a scramble to make up revenue through volume.
It's not just about supplying content. It could be said newspapers are suffering because bundled news is just another form of interruption and misdirection, mainly because it isn't specifically targeted:
Following The New York Times on Twitter is just like paging through a print newspaper. Each tweet is about something completely unrelated to the tweets before it. And this is the opposite of why people usually follow people and brands online. It's not surprising that The New York Times have a huge problem with engagement. They have nothing that people can connect and engage with
Eventually, the social networks will likely suffer from a trust problem, if they don't already. Their reliance on advertising makes them spies. There is a growing awareness of data privacy and users are unlikely to tolerate invasions of privacy, especially if they are offered an alternative. Or perhaps the answer is to give users a cut themselves. Lady Gaga might be onto something.
Friends "selling" (recommending) to friends is a high-trust environment.
The SERP is low trust. PPC is low trust. A search keyword plus a site littered with ads is low trust. So, one good long-term strategy is to move from low-trust to high-trust advertising.
A high-trust environment doesn't really look like advertising. It could be characterised as a transparent platform. Amazon and Trip Advisor are good examples. They are honest about what they are, and they provide the good along with the bad. It could be something like Wikipedia. Or an advisory site. There are many examples, but it's fair to say we know it when we see it.
A search on a keyword that finds a specific, relevant site that isn't an obvious advertisement is high trust. The first visit is the start of a relationship. This is not the time to bombard visitors with your needs. Instead, give the visitor something they can trust. Trip Advisor even spells it out: "Find hotels travelers trust".
Tesla understands the trust relationship. Recently, they've made their patents open-source, which, apart from anything else, is a great form of reputation marketing. It's clear Tesla is more interested in long-term relationships and goodwill than in pushing its latest model on you at a special price. Their transparency is endearing.
First, you earn trust. Then you sell them something later. If you don't earn their trust, then you're just like any other advertiser. People will compare you. People will seek out information. You're one of many options, unless you have formed a prior relationship. SEO is a brilliant channel for developing a relationship based on trust. If you're selling SEO to clients, think about discussing the trust-building potential - and value proposition - of SEO with them.
It's a nice side benefit of SEO. And it's a hedge against the problems associated with other forms of advertising.