Wednesday, October 30, 2013

Halloween Inspired SEO Tricks to Keep Spiders at Bay

Over the years, I’ve made a habit of prodigiously extolling search engine best practices, as opposed to taking shortcuts designed to trick search engines into believing that an online destination is something it is not. This Halloween, I’ve decided to write an essay antithetical to my digital morals and beliefs, by way of parody, and embrace the dark side of search engine spoofs.
Fear of spiders?
Not a problem. There are many ways to keep unwanted arachnids from crawling through your content.
For starters, why not avoid using visible text on your website? Embed as much content as possible in images and pictures. Better yet, make your site into one big splash page that appears to scroll to infinity and beyond.
Also, make certain that the imagery loads as slowly as possible. Consider yourself lucky that you will be able to streamline your web metrics around paid search campaigns and not worry about those pesky organic referral terms [not provided] by Google anymore. Keeping spiders out of your content is your first step toward complete search engine invisibility.
If your site is inherently text heavy, consider using dropdown and/or pop-up boxes for navigation. Configure these with Flash or have them require JavaScript actions to function. If possible, put the rest of your web content exclusively in frames. Designing a website in such a manner is another great way to keep all those bad robots out of your site.
When it comes to URL structures, try to include as many ampersands, plus signs, equal signs, percentage signs, session IDs, and other dynamic parameters as possible in a multifaceted, splendidly deep file structure. That way, your website will be made up of really long URL strings that can confuse humans and spiders alike. Even better, add filters and internal site search functionality, metrics tags, and other superfluous attributes to your URLs, just to keep the search engines guessing about your site structure. Get ready to burn your site’s crawl equity to the ground, while watching your bandwidth spend soar, when you wrap your site up like a mummy with this navigational scheme.
If you really want to turn your website into a graveyard for search engine spiders, consider using completely unnecessary redirects on as many different URLs as possible, taking multiple hops along the way. Combine permanent and temporary redirects with soft 404 errors that can keep your content alive in search indices forever. Make certain to build in canonical tag conflicts, XML sitemap errors, perpetual calendars and such, reveling in the knowledge that you will never have to waste precious development time fixing broken links again!
Content creation budget got you down? Build in new economic efficiencies by using the exact same content across as many domains as your budget can spawn. Invest in machine-generated content instead of having to listen to those troublesome user reviews. Make “Spamglish” the official language of your website. Since you don’t have to worry about looking at what keywords Google allows to send traffic to your Frankensite, feel free to target irrelevant keywords on as many pages as possible.
Additionally, try to keep all the title tags exactly the same on the critically important pages within your site. Spiders don’t have good eyesight – sometimes you have to shout to get their attention. Consider keyword stuffing as a way to make certain that the search engines understand precisely what your site is all about. If you don’t have room to stuff unnecessary contextual redundancies into your web content, consider using hidden text that flickers like a ghost when users mouse over what looks like dead space.
Still not convinced you can hide your site from the search engines this Halloween?
Break out the big tricks, my friends, because we’ve got some link building treats to share.
If your ultimate goal is to bury a domain name for all eternity, make certain that you participate in as many link farming free-for-all sites as possible. When you get a chance to do so, go ahead and "splog" others' guest books and forums. In addition to buying site-wide text links, demand that your backlinks be placed in the footers. While you're at it, sell a similar "service" to others.
Consider hiding some links in places that surprise visitors, and always embrace bad linking neighborhoods. You know the type of sites I'm talking about… they're the spooky ones that any non-paranormal business person would avoid.
Have a wonderful Halloween this year, with the knowledge that you too can make a website completely disappear!
Disclaimer: I don't actually endorse trying any of the above; everything in this particular column should be taken with a serious dose of tongue-in-cheek.

The Value of Referrer Data in Link Building

Before we get into this article, let me first state: link building is not dead. There are a lot of opinions floating around the web on both sides; this is just mine. Google has shut down link networks, and Matt Cutts continues to make videos on what types of guest blogging are OK. If links were dead, would Google really put in this effort? Would anyone get an "unnatural links" warning?
The fact is, links matter.  The death is in links that are easy to manipulate.  Some may say link building is dead but what they mean is, “The easy links that I know how to build are dead.”
What does this mean for those of us who still want high rankings and know we need links to get them?  Simply, buckle up, because you have to take off your gaming hat and put on your marketing cap.  You have to understand people and you have to know how to work with them, either directly or indirectly.
I could write a book on what this means for link building as a whole, but this isn't a book, so I'll try to keep focused.  In this article, we're going to focus on one kind of link building and one source of high quality link information that typically goes unnoticed: referrer data.
I should make one note before we launch in: I'm going to use the term "referrer data" loosely to provide additional value. We'll get into that shortly, but first, let's see how referrer data helps and how to use it.

The Value Of Referrer Data

Those of you who have ignored your analytics can stop reading now and start over with “A Guide To Getting Started With Analytics.”  Bookmark this article and maybe come back to it in a few weeks.  Those of you who do use your analytics on at least a semi-regular basis and are interested in links can come along while we dig in.
The question is, why is referrer data useful?  Let's think about what Google's been telling us about valuable links: they are those that you would build if there were no engines.  So where are we going to find the links we'd be happy about if there were no engines?  Why, in our traffic, of course.
Apart from the fact that traffic is probably one of the best indicators, if not the best indicator, of the quality and relevancy of a link to your site, your traffic data can also help you find the links you didn't know you had and what you did to get them. Let's start there.

Referrers To Your Site

Every situation is a bit different (OK – sometimes more than a bit) so I'm going to have to focus on general principles here and keep it simple.
When you look at your referrer data, you're looking for a few simple signals. Here's what you're looking for and why (a small scripted example follows the list):
  1. Which sites are directing traffic to you?  Discovering which sites are directing traffic to you can give you a better idea of the types of sites you should be looking for links from (i.e. others that are likely to link to you, as well). You may also find types of sites you didn't expect driving traffic. This happens a lot in the SEO realm, but obviously can also happen in other niches.  Here, you can often find not only opportunities, but relevancies you might not have predicted.
  2. What are they linking to?  The best link building generates links you don't have to actively build. The next best are those that drive traffic.  We want to know both. In looking through your referrer data, you can find the pages and information that appeal to other website owners and their visitors.  This will tell you who is linking to you and give you ideas on the types of content to focus on creating.  There's also nothing stopping you from contacting the owner of the site that sent the initial link and informing them of an updated copy (if applicable) or other content you've created since that they might also be interested in.
  3. Who are they influential with?  If you know a site is sending you traffic, you can logically assume the people who visit that site (or the specific sub-section, in the case of news-type sites) are also interested in your content (or at least more likely to be interested than an audience found through standard prospect mining).  Mining that publisher's followers for social connections to get your content in front of them can increase your success rate in link strategies ranging from guest blogging to pushing your content out via Facebook paid advertising.  Admittedly, this third use of referrer data is more akin to refining a standard link list, but it's likely a different audience than you would otherwise have encountered (and one with a higher-than-standard success rate for link acquisition or other actions).
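To make that mining a bit more concrete, here's a minimal sketch of how you might summarize a referral-traffic export. The file name and column headings ("source", "landing_page", "sessions") are assumptions; adjust them to whatever your analytics package actually exports.

```python
# A minimal sketch: summarize a referral-traffic CSV export. The file name and
# column headings ("source", "landing_page", "sessions") are assumptions;
# adjust them to whatever your analytics package actually exports.
import csv
from collections import defaultdict

referring_sites = defaultdict(int)   # sessions per referring source
linked_pages = defaultdict(int)      # sessions per landing page

with open("referral_traffic.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        sessions = int(row["sessions"])
        referring_sites[row["source"]] += sessions
        linked_pages[row["landing_page"]] += sessions

print("Top referring sites:")
for site, total in sorted(referring_sites.items(), key=lambda x: -x[1])[:20]:
    print(f"  {site}: {total}")

print("Most-linked-to pages:")
for page, total in sorted(linked_pages.items(), key=lambda x: -x[1])[:20]:
    print(f"  {page}: {total}")
```

The same summary covers the first two signals above: the top referring sites tell you where to prospect, and the most-linked-to pages tell you what kind of content is earning those visits.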
As I noted above, I plan to use the term referrer data loosely.  As if point three wasn't loose enough, we're going to quickly cover a strategy that ties nicely with this: your competitor's referrer data.

Competitor Data

You probably can't call up a competitor and ask them for their traffic referrer data (if you can, I wish I was in your sector).  For the rest of us, I highly recommend pulling backlink referrer data for your competitors using one of the many great tools out there.  I tend to use Moz Open Site Explorer and Majestic SEO personally, but there are others.
What I'm interested in here are the competitor URLs that are being linked to.  While the homepage can yield interesting information, it can often be onerous to weed through, so I generally relegate it to different link time frames.
Generally, I will put together a list of the URLs being linked to, then review those pages as well as the pages linking to them.  This helps give us an idea of potential domains to target for links, but more importantly, it can tell us the types of relevant content that others are linking to.
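Here's a rough sketch of that list-building step. It assumes a backlink export CSV with "source_url" and "target_url" columns; the real column names vary by tool, so check your export before running it.

```python
# A rough sketch of that list-building step. It assumes a backlink export CSV
# with "source_url" and "target_url" columns; real column names vary by tool.
import csv
from collections import defaultdict
from urllib.parse import urlparse

links_per_target = defaultdict(list)

with open("competitor_backlinks.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        links_per_target[row["target_url"]].append(row["source_url"])

# Most-linked competitor pages first: these hint at the content themes
# (and the linking domains) worth studying.
for target, sources in sorted(links_per_target.items(),
                              key=lambda item: -len(item[1]))[:25]:
    domains = {urlparse(source).netloc for source in sources}
    print(f"{target}  ({len(sources)} links from {len(domains)} domains)")
```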
If we combine this information with the data collected above when mining our own referrer data, we're left with more domains to seek links from and broader ideas for content creation.  You'll probably also find other ways the content is being linked to. Do they make top lists?  Are they producing videos or whitepapers that are garnering links from authority sites?  All of this information meshes together to make the energy you put into your own referrer mining more effective, allowing you to produce a higher number of links per hour than you could with your own data alone.

Is This It?

No.  While mining your referrer data can be a great source of information about the types of links you have and should be seeking more of, it's limited to the links and traffic sources you already have.  It's a lot like looking to your analytics for keyword ideas (prior to (not provided), at least).  It can only tell you what's working among what you already have.
A diversified link profile is the key to a healthy long term strategy.  This is just one method you can use to help find what works now and keep those link acquisition rates up while exploring new techniques.

30 Quick & Easy Ways to Increase Your Site's Linkability

Well here's the big problem with that: yes, I can just handle it and send you a report each month and I have clients where that works very well. But every single one of those clients has someone else handling all the "other" stuff that I'd be doing or thinking about, or they're knowledgeable themselves.
Link building can exist in a bubble and it can be successful that way, but it can't reach maximum success without the client getting involved on some level, and I'm talking about doing more than paying the invoices.
While we're on the subject of invoices and money, that's another problem: paying someone to do link building for you can get very expensive, especially if they're really good at what they do. For some small business owners, the cost is difficult to justify so they have to either do it themselves or accept the fact that they probably won't be able to compete very effectively.
The beauty of building links is that there are many, many things that can be done to both get you a link and raise your visibility so that your likelihood of generating more links increases. You can get a link from:

  • Handing out your business card.
  • Talking to your neighbor who mentions your business to the barista who makes his morning latte and is overheard asking another customer if he knows of anyone in town who does what you do because he wants to interview them for his side project, a personal blog. 
  • Emailing someone and asking for it.
  • Someone who finds your content in a search.
You can also invest some time in making your site a better and more efficient resource for your users. Link building isn't just about doing something that immediately gives you a text link. Links can be built from 100 different paths, some of which you'll never be able to accurately track.
It's hard to get that idea across to many clients who want to get lots of links and get them right now.
You have no idea how many times I've gone to a client and said "you know, if we rewrote the content on this page a bit so that it better reflected the anchor you want us to use, I think we'd be better off" or "you might want to figure out where the contact us form goes because when I tested it, it went nowhere and no one got in touch with me" and they basically (and usually nicely) tell me to keep quiet and just build them some links.
You also have no idea how many times we've lost link opportunities because a webmaster said "I can't even get their site to load" or asked why we wanted specific anchor text when it made no sense.
For example, if I'm writing a post like this one and I need to cite a source about a topic, I'm not going to use a site that takes a full minute to load. I'm not going to use one where there are 50 spam comments with no legitimate ones. I have my personal biases as does everyone else, and if those biases mean that a site loses a link opportunity, then it's something that could be fixed in order to improve the odds of a link opportunity.
Let's go through a few quick things that any client can do on his or her own, just to get started on the path of (eventually or immediately) building links and improving the linkability of the site.
You probably won't see massive changes overnight with any of these ideas, but they're all practices that we conduct ourselves and advise clients to do. They're also quick and easy (with the exception of the few more intensive initial ones), and they're a great way to get more comfortable with doing all the things you need to be doing in order to maximize your online visibility.
The key here is to make time to do something, even if it's just 15 minutes here and there.
You know how writers tell you that if you want to write, you should just set aside a few minutes a day and write something, just to get into the habit? Do the same with link building.

If You Have a Nice Chunk of Time

  • Analyze your backlink profile.
  • Analyze your competitor's backlink profile.
  • Make a list of what you have to offer a potential linking partner. If you don't have much, figure out what you can add.
  • Check your Site Speed Suggestions in Google Analytics. It's under Behavior > Site Speed. I've found this to be incredibly useful in identifying issues that are causing sites to lag.
PageSpeed Suggestions

If You Have an Hour

  • Find a top referring URL in your analytics that shows a very low time on site/visit duration. Figure out why those visitors are bouncing so fast. Does the content they're landing on seem misleading given the anchor or the topic of the page that sent them? If they're landing on your contact form page and not spending much time, that might be fine, but if they're landing on a page where you'd expect them to stick around for more than 15 seconds, you might need to update your content, find a better landing page for that link and try to get it changed, or even change the anchor so that it better reflects its target. (A small scripted sketch for flagging these referrers follows this list.)
Low Site Time
  • Write a blog post about something relevant to your industry, something big happening, etc.
  • Rework an outdated blog post or page on your site. See what you wrote about a year ago and update it in a new post. Socialize that you've updated it.
  • Write a response to another blog post on someone else's site and email or tweet to them to let them know about it.
  • Set up Google Authorship.
  • Check your site on a mobile device and move around. Do the links work properly? Is everything rendered correctly? With more and more people using a mobile device to surf the web, you really need to make sure that your site works well there.
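As promised in the first item above, here's a small sketch for flagging referrers whose visitors leave almost immediately. It assumes an export CSV with "source", "sessions", and "avg_session_duration" (in seconds) columns, which you'd swap for whatever your analytics tool actually provides.

```python
# Flag referrers whose visitors leave almost immediately. Assumes a CSV export
# with "source", "sessions", and "avg_session_duration" (seconds) columns.
import csv

MIN_SESSIONS = 50     # ignore referrers with too little traffic to judge
MAX_DURATION = 15.0   # seconds; the "bouncing fast" threshold from the text

with open("referrers.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        sessions = int(row["sessions"])
        duration = float(row["avg_session_duration"])
        if sessions >= MIN_SESSIONS and duration < MAX_DURATION:
            print(f"Check {row['source']}: {sessions} sessions, "
                  f"{duration:.0f}s average visit")
```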

If You Have 30 Minutes

  • Set up Google Analytics if you haven't done it already.
  • Set up Google and Bing Webmaster Tools if you haven't done it already.
  • Make sure you're listed in Google and Bing Places.
  • Set your business up on Foursquare.
  • Read Google's Webmaster Guidelines on link schemes.
  • Set up a social profile somewhere where you don't have one already, but only if you're prepared to use it. If you're a Twitter maniac but don't have a Facebook profile and can handle both, set up a Facebook page for your business. Set up a LinkedIn page. Set up a Quora account and go answer some questions.
  • Do a search to find any unlinked brand mentions. "Brand Online Niche" -site:site.com -press -release.
  • Do some manual rankings checking to see how your results look from a potential user's perspective, and not just to check the spot where they appear. For example, I recently found a result for one of my clients that was ranking at position 3 consistently but the CTR was terribly low. Once I saw the result and how it appeared, it became obvious that people were clicking on the other results (even further down the page) because their results looked much more enticing and mentioned a lower price, free software, etc.
  • Create a list of Potential Partner sites that you want to reach out to when you have more time. Go ahead and note the contact info, your idea for the site (guest post, potential linking partner, getting added to their resource page, etc.) and any other info that will help you.
  • Check to see if you are throwing any 404 errors. If you are, and there are good links coming to those pages, either 301 redirect them to the most relevant page on your site (or the homepage if you have to, although that isn't great for usability) or rebuild the page.
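For that last item, a quick scripted check can flag the 404s for you. This sketch requires the third-party "requests" package, and the URLs below are hypothetical placeholders for the linked-to pages from your own data.

```python
# Check a list of URLs and flag the ones returning 404 so they can be
# 301-redirected or rebuilt. Requires the third-party "requests" package;
# the URLs below are hypothetical placeholders.
import requests

urls_to_check = [
    "https://www.example.com/old-product-page",
    "https://www.example.com/retired-landing-page",
]

for url in urls_to_check:
    try:
        response = requests.head(url, allow_redirects=True, timeout=10)
        if response.status_code == 404:
            print(f"404: {url} (redirect it or rebuild the page)")
        else:
            print(f"{response.status_code}: {url}")
    except requests.RequestException as exc:
        print(f"Error fetching {url}: {exc}")
```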

If You Have 5-10 Minutes

  • Set up Google Alerts (and/or Talkwalker Alerts) for your main keywords and your competitors' brand names, adding to it whenever you have a few extra minutes.
  • Reach out to a site that's authoritative in your niche and ask if there's a chance you could do a guest post there. To avoid irritating them, first check to see if they have a policy about this. Some have specific rules to follow for submissions, but some state that they don't accept guest posts so check to see if they list their policy before contacting them.
No Guest Posts
  • Ask someone authoritative in your industry if you could interview him or her for your site. That builds links and visibility as the people interviewed tend to promote these pieces.
  • Email or call one of the sites on your Potential Partners list that you created for this exact purpose.
  • Find more relevant people to follow on Twitter. Followerwonk is good for searching Twitter bios by keyword.
  • Interact with people on Twitter. This almost sounds silly but honestly, you'd be amazed at how many people do nothing other than occasionally tweet out links to their own content with nothing else ever being socialized. Most people who use social for the interaction aren't going to bother with you if that's how you work social.
  • Socialize an older blog post that's still relevant. (On that note, when you do write, work in some evergreen topics to increase the chances that you'll always have relevant content on the site.)
  • Thank someone who's just given you an unsolicited new link. Thank them on Twitter, thank them via email, etc. Just say thanks.
  • Send an email to your employees asking for content ideas and volunteers to write content.
  • Ask your employees or customers to ask you questions or identify areas where they're having difficulty. Sometimes when I'm stuck trying to decide what to write about, I'll ask a couple of my link builders to let me know where they're having trouble, for example. It's great fodder for content as the chances are they aren't the only ones having this issue. They can also point out issues that you've overlooked, so it's a great chance to fix something.

Summary

Just remember that building links involves more than doing something that immediately generates a link. Sometimes the process takes a very long time.
You might write a blog post that doesn't do well for months then suddenly someone uncovers it and links to it. You might send an email asking a webmaster to update his or her link to your site because you have a much better target that fits the anchor and it takes 6 months to get a reply saying it's been changed.
This post certainly isn't even close to being an exhaustive list of ideas of what you could do in small amounts of time, but hopefully it can help you realize that little changes can have big results.

Monday, October 21, 2013

Targeting Weird Niche Audiences for Links

Want to know an extremely successful strategy to get links and gain rankings? This one is actually a sub-strategy of the "limited edition" link building strategy.
Instead of making huge adaptations to your product or service, make small changes to make the product suitable for a weird niche audience.
The choice of your target audience should be illogical, but you should be able to defend it in the media. The fact that you created a special product specifically for that particular group of people makes you linkable.
Leftover Happy Hour
Here are a couple of examples:

  • "New dating site for LARP-ers" (Live Action Role Playing)
  • "Hyundai SUVs for midgets"
  • "Unique solar energy service for the Amish"
  • "Lingerie line suited for transsexuals"
  • "Restaurant leftover timeslot for poor people"
  • "Special car insurance for natural blonds"
  • "Trendy 'high school' bulletproof vests"
  • "Fast food discounts for recently divorced men"
  • "Strict celibacy group travel"
Once you've figured out what new audience you're going to address, you can start attracting media attention.
Select various angles from which your story is interesting to different media.
Prepare enough background information for each angle on your website so you're linkable to everybody.
Always try to involve any existing industry-specific news sites. These will probably be interested in your explanation of why you see your new audience as untapped market potential.
If your product has a technical aspect, make sure you're able to explain the inner workings and how they appeal to your unique audience. If your storyline can be interpreted as a stance for or against a certain viewpoint (or group), make sure you attract attention from both sides. If possible, try to get a discussion going about the ethics of your approach.
Try to roll out the story within a week or two. This way no one interprets it as yesterday's news and they'll all want their share of the story's momentum.
If you need more time to contact all your preferred media partners, give them scoops under embargo that they may only publish when the rest of your campaign launches. This allows them to prepare the story and allows you and them to work together on creating perfect link bait for your website.
Timing is key, but links and ranking are sure to follow.

What '(Not Provided)' & Google Hummingbird Mean for Small Business SEO

Many small businesses measure the performance of their SEO agency based on keyword-level data. They believe the value proposition stems from better rankings on specific keywords that they select. Small businesses generally approach SEO firms with the assumption that better rankings will equate to more business.

Small Business SEO Monthly Reports

As time goes by, the small business will receive a monthly report that will explicitly show that great progress has been made moving the rankings of the selected keywords in Google and Bing/Yahoo. More sophisticated agencies transform raw keyword analytics into several buckets.
It has been popular to bundle keywords that are branded (i.e., use some form of the company's name or brand in the search phrase) against those that are non-branded. An increase in non-branded traffic has been attributed to SEO, while increases in branded traffic were attributed to other marketing efforts.
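To make the bucketing concrete, here's an illustrative sketch of how an agency might have split organic keyword visits into branded and non-branded buckets. The brand terms and visit counts are hypothetical placeholders, and, as the next section explains, "(not provided)" has since removed the keyword data this kind of report depends on.

```python
# Illustrative only: split organic keyword visits into branded and non-branded
# buckets. The brand terms and visit counts are hypothetical placeholders.
BRAND_TERMS = {"acme", "acme widgets"}

def is_branded(keyword: str) -> bool:
    keyword = keyword.lower()
    return any(term in keyword for term in BRAND_TERMS)

keyword_visits = {
    "acme widgets coupon": 120,   # keyword -> organic visits
    "best blue widgets": 340,
    "widget repair near me": 85,
}

branded = sum(v for k, v in keyword_visits.items() if is_branded(k))
non_branded = sum(v for k, v in keyword_visits.items() if not is_branded(k))
print(f"Branded: {branded} visits, non-branded: {non_branded} visits")
```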

'(Not Provided)' Kills Specific Analytics Reporting

Regardless of the politics of the decision, Google’s recent move to 100 percent secure search has decimated the branded/non-branded keyword analysis used by many small businesses to evaluate their SEO firms. But this is OK.

SEO is Changing (Again)

Google has made several large algorithmic changes in 2013. The net result of these changes is that many old tactics for link acquisition are no longer helpful.
Additionally, both small business owners and SEO firms need a new orientation to remain successful.
The Evolution of SEO Metrics

Brand is Important to Rankings

The death of branded versus non-branded keyword traffic may be a blessing, as many now believe that branded mentions are a key signal in the Google algorithm.
While no one will deny that backlinks remain the primary driver of rankings, the anti-spam filters have also become much more sophisticated.
Mentions of a small business company name, even when not accompanied by a backlink, are believed by many to be a signal of legitimacy.

Breadth of a Website Matters to Hummingbird

Whereas small businesses used to obsess over the rankings of their top money keywords, Google's new Hummingbird algorithm implies their focus needs to be elsewhere.
It is now critically important that the website answers questions for end users.
Yes, content is still king. In fact, content that answers specific questions may be critical for Hummingbird success.

Replace Non-Branded Keyword Traffic with Entrance Pages

A healthy website is constantly expanding in breadth. In other words, SEO post-Hummingbird requires that a site gain new keyword rankings every month to demonstrate that it is a helpful resource.
The easiest metric to review is the number of unique entrance pages through which users arrive at the website (a quick scripted count follows the list below).
A website with more entrance pages:
  • Is a stronger resource for answering questions.
  • Offers expertise on more topics.
  • Ranks on more long-tail keywords.
  • Has more breadth in a specific space.
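As a rough illustration of that count, here's a minimal sketch assuming an export CSV of organic landing pages with "month", "landing_page", and "entrances" columns (the column names are assumptions to adapt to your own export).

```python
# Count distinct organic entrance (landing) pages per month. Assumes a CSV
# export with "month", "landing_page", and "entrances" columns.
import csv
from collections import defaultdict

pages_by_month = defaultdict(set)

with open("organic_landing_pages.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        if int(row["entrances"]) > 0:
            pages_by_month[row["month"]].add(row["landing_page"])

for month in sorted(pages_by_month):
    print(f"{month}: {len(pages_by_month[month])} unique entrance pages")
```

A month-over-month rise in that number is the "constantly expanding breadth" signal described above.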

How to Grow Website Entrance Pages

Websites can't grow their entrance pages without introducing new content regularly. While hard to believe, there are many webmasters who don't update their websites, have no blog, and refuse any assistance.
While introducing new pages of helpful, high-quality content is a great start, the issue of syndication and recognition remains a barrier – particularly in highly competitive keyword spaces.
Webmasters and small business owners need to be very creative to be noticed:
  • Newsjack hot stories in their areas of expertise.
  • Create infographics.
  • Make videos to post on YouTube.
  • Utilize staff to promote in social media.
  • Use a small budget to promote the best content via PPC.

Measuring Keyword Traffic Was Wrong Anyway

While new alternatives to keyword-level analytics and branded/non-branded traffic reporting are rapidly emerging, the punchline is that measuring keyword rankings and traffic was the wrong criterion anyway. The real return on investment from SEO comes from an increase in new customers.
Lead tracking remains the single best measure of how a small business website is performing. Many websites lack the web-to-lead and phone tracking integration needed to have a complete picture (up to 50 percent of leads come through phone calls, even when the website was found via a search engine).

Conclusions

This year has proven to be very dramatic for SEO with major changes from Google in the form of Hummingbird and "(not provided)". Small business owners who once obsessed over their top keyword rankings and traffic from a few "money terms" now need to adjust their thinking.
Google's secure search has removed the popular branded keyword traffic data, but offers an opportunity to instead focus on the breadth of a website via entrance pages. And Google Hummingbird clears the path for small business owners to generate high-quality content that really answers questions.

Big Brands, Google, Penalties & You

For years there has been controversy about big brands and their special place in Google's heart. For the most part, Google's supposed brand bias is really an SEO myth told late night over beers in darkened corners at conferences and in forum postings.
Most sites that are big brands rank well because they meet so many points on Google's algorithm – everything from authority, to quality score, to links, to social signals.
If you see Wikipedia everywhere, as annoying as it may be, it positions so well because it has tons of content and more links pointed at it than stars in a desert sky. This doesn't mean Google prefers brands; it means the site is algorithmically awesome.
OK, so "algorithmically awesome" isn't really an SEO term, but it might as well be. If you naturally meet more points on the algorithm than any of your competitors, then your gift from Google will be to occupy a higher position in the search results. It just kind of works that way.

Wait, What About Google's Vince Update?

Yes, there was an algorithmic update called Vince in 2009 that threw big brands some special algo points.
That same year, there was "brand affinity." This quote from Eric Schmidt, Google's CEO at the time, probably said it best:
"Brands are the solution, not the problem. Brands are how you sort out the cesspool. Brand affinity is clearly hard wired. It is so fundamental to human existence that it's not going away. It must have a genetic component."
So big brands were wired into the algorithm with Vince, then even more so with Panda. Yet, Google's Distinguished Engineer Matt Cutts said big brands "can't do whatever they want" and are subject to the algorithm just like regular sites. What is an SEO to believe? Which is it?
Well, it's really all of the above. You don't automatically get to the top by just being a big brand. If you have a poor website and are in general not doing well on the algorithm, you might do well for a few terms sure, but overall, no.

Big Brands and Search

Being a big brand naturally helps you with some algorithmic factors, including perceived site authority and quality. You also have the one thing most mom and pops don't: brand affinity.
One sentence of the Search Quality Rating Guidelines is telling: "Would you recognize this site as an authoritative source when mentioned by name?" Brands are more well known to users simply by being brands, so the user intent is more likely to be Target the store than, say, target the bullseye.
Wikipedia ranks well everywhere because, frankly, it should. That said, you don't always position well just because you are a brand.
I once sat in a site clinic in which one of the largest ecommerce sites on the web didn't understand why it couldn't position for selling television sets. The problem? There were no signals from the site that it deserved to position to sell television sets. While the site did position well for things related to its core brand, it just didn't for things that weren't.
So where is the benefit for big brands, beyond a bit more ability to extract authority and quality points, greater value from links and social signals, and rankings for simply being a known brand?
Well, mostly it seems limited to penalties.

Penalties and Big Brands

Many big brands have a direct connection at Google, which means someone at Google will tell them when they've crossed a line, or at least they have a line to Cutts to get a question or two answered. And if you're really big, Cutts will warn you himself (see Mozilla).
Big brands are also more likely to survive a Google penalty than a small- or medium-sized business (SMB) site, partially because they are stronger sites and partially because their penalties seem to cause much more limited damage.
Seriously, when was the last time you heard of a big brand being removed entirely from Google's index? Sure, they get hit with penalties all the time; I know Cutts isn't misleading us about that. But the damage is much more focused and much more limited.
Remember JCPenney and Overstock? They lost keyword traffic, not their entire website.
Go to the Google forums and see how many SMBs can say the same.
Now, some theorize that it is because Google gives these sites leniency (hey, they were already at #1, they could not rank any higher, so those link buys didn't actually help them). Others believe it is just a strict brand preference.
Personally, my SEO brain has settled on a set of algorithmic factors I like to term "the expectancy factor" or the "should be there" factor (an easy way to say algorithmic strength combined with the fact that the sites are simply stronger, and the brands are given leniency and limited penalties).

'Should Be There' Status

Some sites should be in Google's index – not just because Google thinks it should be there, but because searches by end users have indicated to Google that they expect it to be there.
If one of these "should be there" sites didn't show up in Google's results, then the searcher might think Google has a pretty lousy product.
So Google protects its product by making sure to limit the effect of penalties on big brands by warning them directly and by helping certain ones recover quickly if a penalty is more damaging. Whichever it is, big brands are like a naval carrier in the middle of a penalty storm; your SMB is a Tiki raft.

Google Penalties – Big Brand Leniency

So how does it work in the real world when you have "should be there" status?
While a standard SMB site might receive one penalty and find itself without a homepage in the Google index (and many have), a big brand might look like this:
Manual Action Viewer
This is the manual action viewer of a big brand site.
Under each of these penalties, except for one, in the manual action viewer were approximately 1,000 pages (the maximum the viewer can handle). These penalties were on the subdomain, but the main domain was also penalized.
Both the main domain and the subdomain lost key terms and key placements in the search engine, but the site continued to position for new, highly trafficked terms (though less relevant and longer tail), so the penalties didn't cause enough damage to threaten its core business functions.
Now what happens if you're a company that doesn't have "should be there" algorithmic triggers? And you receive even one of these manual actions on your site over a multitude of pages or even just a percentage of them?
You would have probably woken up to this:
Analytics OOPS
But this big brand site never saw this graph. The site hardly noticed the blip, partially because it creates new pages all the time, which were positioning well enough to help cover up the loss, but mostly because the penalties were isolated and not cross-sectional.

Penalty Removal: Big Brand vs. SMB

These penalties were there for quite some time. So, if your site was not expected to be in the index, if you did not have site authority and strength, and your site was not a big brand, you would probably expect the road back to be pretty tough (noting that you would be dealing with at most one or two of these penalties, as no small site would survive more than that without getting its homepage kicked out of the index along with the rest of the site).
Here's what the site looked like after one reconsideration request and one penalty removal.
2 million impressions
Within a few days, they regained 100 percent of their impressions, or 2 million on average. This change shows that their site was repositioned for key terms and their new content was likely being shown prominently in the search results.
We did a hand-check and yes, they regained key category terms and they were now positioning for relevant, highly competitive terms, even highly competitive phrases with short life expectancy (terms that would live only a few days).
If you aren't a big brand, your site isn't likely to work the same way. At the same time as this big brand site was recovering, we helped an SMB recover.
Instead of days, the SMB site took three months, three requests (one rewrite for depth and breadth), and a complete site rebuild. Only then did the homepage just start to show for their own name on the fifth page of Google.
This is more likely the outcome for an SMB. If you recover at all.

So Why Do You Care About What Big Brands Do?

Maybe that big brand corporation site is getting away with some black hat tactic, and your (clueless) boss, marketing team, board of advisors, or some other stakeholder knows about it.
"If they can do it, well we can, too!" they say.
No. You can't.
Unless you have algorithmic awesomeness, authority, and expectancy (and that expectancy is the key), you are much more likely to end up losing your positions, your pages, or your homepage if you buy links or engage in other practices that violate Google's guidelines.

What Else Can You Learn From Big Brands?

Acting like a big brand is your best method for achieving success. Big brands send out strong signals to Google that tell Google there is a company and people behind them. These signals tell Google that the site is taken care of, that the company is awake at the helm, and that the site is going to be a good product.
Now, not all big brands put out great websites; in fact, a lot of them put out horrible websites. That is where expectancy (brand) can save them, and where you can beat them.
Google has provided some guidance on building high-quality sites, in the form of these 23 questions. Follow these concepts, check your site. Does it meet the criteria of what Google (not you) considers a high-quality site? You don't have to hit every point, but the more signals you send Google the better.

Be a Big Brand in the Making

SEO times they are a' changing
Get your site to send out big brand signals. If you mimic all the good things a brand does well, Google will give you some of those authority and quality points:
  • Create content.
  • Build a natural link profile.
  • Use social.
  • Create more content (and more content).
  • Have a blog.
  • Make sure your site is technically sound, fast, visually appealing, and easy for users (and search engines) to navigate.
  • Utilize experts in the industry to audit your site and tell you what you're not doing well, so you can do it better.
You aren't likely to get those valuable expectancy signals unless you have offline indicators as well, but that's OK. If you build a better site with authority and meet more brand points on the algorithm, you don't need them to compete.
Brands do have leeway, but the case is much stronger when it comes to penalties. Being a brand just means they need to be there and findable, not found in the top position for everything beyond their name and a few core key terms.
Expectancy, being a brand, and having authority don't get you an automatic position; they just get you some advantages in the game.

Friday, October 18, 2013

Securing the Future of SEO: Global Brands & 5 '(Not Provided)' Solutions

Future SEO
SEO has changed forever.
The great philosopher Heraclitus once said, "change is the only constant." But wait, Einstein said a similar thing about the universe. Even the relationship between change and constancy in life is open to debate.
This sums up the situation that search marketers find themselves in today. SEO has a new meaning, a new direction. How we deal with it is driven by marketers' perception, and the word "secure" now has more than one connotation in our market.
When Google made it apparent that 100 percent of its keyword data would be "(not provided)" (to the SEO community), many reacted with anger, angst, and frustration. Others sat back to absorb the news, and some, content-based marketers, embraced the news as part of the natural evolution of SEO. They saw it coming and planned ahead.

Will Google Punish a Guest Posting Strategy Due to Unnatural Link Patterns?

The Missing Link
"The Missing Link" is a Search Engine Watch exclusive reader-driven Q&A column with veteran content publicist Eric Ward. You can ask questions about all aspects of links and link building and Eric will provide his expert answers. Submit your questions here, and you may be featured in a future installment!

Until Google's recent algorithm changes, we consistently ranked on top of the search results for our primary keyword phrase. Now we are on page two for that phrase, so I am looking into pursuing guest posts on other sites. To my understanding, Google will detect and penalize unnatural link patterns. As a result, I am wondering: if I obtain guest posts from a blog network, could this easily be detected and penalized as a pattern?
– Blue on Page 2
I'll give you a short answer and a long answer.
Short answer: Writing posts for other sites (which is a form of guest posting) is still an effective way to build credible links.
The devil is always in the details though. Here's a longer answer.
First, a video from Google's Matt Cutts addressed this exact subject. I strongly suggest you watch it:
Of everything Cutts said, the thing that struck me most was when he said, "It's a long and time honored tradition" for writers with expertise in certain topics to share content with each other. In other words, it's absolutely acceptable.
In fact, this column you're reading likely falls into that category. I've written this post for Search Engine Watch and nobody else.
Technically, this is a guest post. But this is much different than a guest posting approach where you aren't selective about where you seek out posting opportunities or, on the receiving end, you aren't selective about who you accept guest post content from.
Here are a few guidelines/criteria that may help you as you seek out guest posting opportunities:
  1. Look for signals/signs of credibility and longevity:
    • How long has the target site been on the web? Longer may mean more credibility.
    • Who owns/operates the site? Have you ever heard of them?
    • Is there an editorial team with clearly stated guidelines? There should be.
    • Is the target site's content made up of mostly guest posts? If so, be very cautious. Search to see if the posts have appeared on other sites.
    • Look at the other guest posts on the site. Search on author names to see who they are and what kind of web presence they have.
  2. If you want to take your credibility analysis really far, do a backlink analysis of the target site, as well as the target sites of the guest posters, to see just how credible their existing link profiles are. I personally stay away from any site that has any evidence of a spammy backlink profile, because I don't want my site to have any negative signal association with those sites.
One final point: at the start of your question, you specifically mentioned your Google rankings. Consider that even under the best guest posting circumstances, you can only take guest posting so far as a ranking-specific strategy.
You can't permanently guest post your way to the top of the search rankings. Nor should that be the only goal.
In fact, I look at guest posting opportunities for their potential to help my direct traffic and exposure to an audience I'm interested in reaching. I don't guest post for search rank. Seeking search rank via guest posting can lead you to make poor decisions and leave a linking footprint that Google can detect as manipulative.

3 SEO Success Factors for 2014

I recently came upon one of my old articles, "SEO Factors for 2011". I chuckled to myself, not only about how some of the details were actually still relevant but also how many seemed so elementary compared to today's search environment.
Yes, we still need to worry about maintaining "natural-looking" link profiles and how we feed our data to the engines. Those items will always be a consideration.
However, the nostalgic review of my piece had me thinking about what factors we need to consider for successful SEO as online marketers in 2014.

Social, Local & Mobile

We've spent the last handful of years practicing and preaching the importance of being in social, mobile, and local. This mindset was proactive. It allowed us not to focus solely on keywords and search results, but on how these elements were going to change the search results our users saw, as well as our users' experience.
As we walked down this road, at first it felt as if we were building silos of these efforts. Soon we saw local and social sites converging into Google local results (e.g., Yelp reviews in Google local listings). We've also seen the fast-paced growth of mobile and how localization has brought more relevant delivery of results in this arena.

Search in 2013

This year has brought a lot for us to understand as marketers. As we close out 2013, algorithmic intelligence is changing faster than ever, at least in my opinion.
The buzz of 2013, and even more so of the last few months, has been about the advancements of the Knowledge Graph, Local Carousel, Google Now, Hummingbird, and the great secure search/"(not provided)" change.
That's not even to mention Penguin and Panda, but those changes are more about what you may have done wrong in the past. We're here to talk about the future.

The Future of SEO

While the "(not provided)" announcement was a smack in the face to SEO professionals, hopefully it has helped you realize that our intentions shouldn't be focused so solely or intently on ranking a keyword in search results.
After watching what Google has been doing over the last year or so, where do keywords tie into the above-mentioned rollout features? They each in some way or another tie into local, mobile, or social.
  • Will keywords help you with the Local Carousel? No, proximity and review generation will.
  • How will Google Now propel your keyword strategy? It won't, but social efforts will.
  • Do you think that Google will give you a Knowledge Graph box for a keyword and link to your site? If so, you're dreaming.
Add in the Hummingbird update, and all of these changes tell us that Google is moving closer to bringing everything together through the tie-ins of localization and semantic improvements for conversational search, which is popular on mobile.

SEO Isn't Dead, It's Converging

SEO at its core will never be dead. All of the on-site needs of yesteryear will remain important in 2014. All of the newer processes of creating informational, enticing, and insightful content for link building and social digestion are still the hot topic now and will be heading into the future.
My point is that we need to watch the converging of our old silos into the new SERP display. SEO has taken on a converging role with other mediums which impact SERP display.
Pizza Near Shawnee KS Google SERP
For example, you may have created optimized local listings for your local business, but knowing that the display weighs even more heavily on reviews, have you done your job at local-social integration?
Lowes Google SERP
Do you want to display your social activity in SERPs?

2014 Will Still be Big for SEO

Sites must be crawled efficiently, content must be targeted, and yes, we still want to rank where desired. The focus as we move down the road is more on what vehicles we use off-site to help drive traffic to our sites.
How you use the previously discussed pillars, alongside their continual convergence by Google, will determine how successful your online marketing strategy becomes.
Quick takeaways:
  • Don't build a local listing. Allow your audience to help you build a local presence.
  • Don't build a brand. Build a community, a socialized brand, one that can keep your audience in tune with you in real-time.
  • Don't just optimize a site. Optimize an experience for those that are mobile and content hungry.

Ecommerce Product Pages: How to Fix Duplicate, Thin & Too Much Content

Content issues plague many sites on the web. Ecommerce sites are particularly at risk, largely due to issues that can stem from hosting hundreds or thousands of product pages.
Typical issues with ecommerce product pages are:

  • Duplicate content.
  • Thin content.
  • Too much content (i.e., too many pages).
Left unchecked, these issues can negatively impact your site's performance in the SERPs.
If you run an ecommerce site and you've seen traffic flat-line, slowly erode, or fall off a cliff recently, then product page content issues may be the culprit.
Let's take a closer look at some of the most common content woes that plague ecommerce sites, and recommendations on how you can fix them.

Duplicate Content

There are typically three types of duplicate content we encounter on ecommerce sites:
  • Copied versions of the manufacturer's product descriptions.
  • Unique descriptions that are duplicated across multiple versions of the same product.
  • Query strings generated from faceted navigation.

Copied product descriptions

Many ecommerce resellers copy their generic product descriptions directly from the manufacturer's website. This is a big no-no. In the age of Panda, publishing copied or duplicated content across your site will weigh your site down in the SERPs like a battleship anchor.

How to fix it

The solution here is to author original product descriptions for every product on your site. If budget is an issue, prioritize and get fresh content written for your highest margin product pages first and work backwards.

Unique yet duplicated product descriptions

With many ecommerce sites, site owners have authored original product descriptions, which is fantastic. Where they run into trouble is that they sell multiple versions of the same product (different sizes, colors, materials, etc.), and each product version has a different page/URL with the same boilerplate description.
Now even though this content is technically unique to your site (it's not copied from somewhere else), it's only unique to a single page. Every other page it lives on is considered duplicated content.

How to fix it

The solution here is to consolidate multiple product version pages into a single page, with all the different product options listed down the page. Or you can present them as a list in a drop-down menu, like Zappos does.
Product Dropdown Nike Lunarglide
Once you combine all versions into a single page, 301 redirect the other URLs to that single page, in the event they've attracted links and/or accrued link equity. The redirects will also help Google sort out the true version of your product page, and can help with any potential crawl budget issues.
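Once those redirects are in place, a quick scripted spot-check can confirm each retired variant URL actually returns a 301 to the consolidated page. This sketch requires the "requests" package, and the URLs are hypothetical placeholders.

```python
# Spot-check that each retired product-variant URL returns a 301 pointing at
# the consolidated page. Requires the "requests" package; URLs are
# hypothetical placeholders.
import requests

canonical_url = "https://www.example.com/products/lunarglide-5"
variant_urls = [
    "https://www.example.com/products/lunarglide-5-blue",
    "https://www.example.com/products/lunarglide-5-red",
]

for url in variant_urls:
    response = requests.get(url, allow_redirects=False, timeout=10)
    status = response.status_code
    location = response.headers.get("Location", "")
    ok = status == 301 and location.rstrip("/") == canonical_url.rstrip("/")
    note = "" if ok else "  <-- needs fixing"
    print(f"{url}: {status} -> {location or '(no redirect)'}{note}")
```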
Depending on the ecommerce platform you're using, consolidating multiple versions of a product page to a single URL can be difficult or impossible. If that's the case, think about moving to an SEO-friendly platform, like Magento or Shopify.

Faceted navigation issues

Many ecommerce sites host category pages with a range of filters to help users easily navigate their site and drill down to specific products, like this Weber Grill page on Home Depot.
Home Depot Faceted Navigation
A faceted navigation menu like the one above can create dozens if not hundreds of query strings that are appended to the URL, thereby creating duplicate versions of the same page. Faceted navigation can be a fantastic UX feature for consumers, but it can be problematic for SEO.

How to fix it

There are a few ways to prevent search engines from indexing duplicate content from faceted navigation (a small illustration follows the list):
  • Block faceted pages via Robots.txt file.
  • Parameter handling via Webmaster Tools.
  • Add self-referential canonical tags (rel="canonical"). Note: this may help Google distinguish original from duplicate content, but it won't address crawl budget issues.
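As a small illustration of what parameter handling and canonicalization are meant to accomplish, here's a sketch that strips faceted-navigation parameters from a URL to recover the clean category URL. The parameter names are hypothetical; substitute the facets your own platform actually appends.

```python
# Strip faceted-navigation parameters from a URL to recover the clean
# category URL (what the canonical tag or parameter handling should point to).
# The parameter names are hypothetical; substitute your own facets.
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

FACET_PARAMS = {"color", "size", "brand", "price", "sort", "sessionid"}

def canonicalize(url: str) -> str:
    parts = urlparse(url)
    kept = [(key, value) for key, value in parse_qsl(parts.query)
            if key.lower() not in FACET_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept)))

print(canonicalize("https://www.example.com/grills?brand=weber&sort=price&page=2"))
# Prints: https://www.example.com/grills?page=2
```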

Thin Content

Even if a site has 100 percent unique product descriptions, they can often be on the thin side (i.e., a few bullets of text). Now, product pages with light content can still rank well where domain strength helps supersede potential thin content issues.
But most sites don't have the backlink profiles of Amazon or Zappos, and I like to think in terms of risk/reward. Thickening up descriptions makes sense because:
  • It can reduce any risk that thin content issues might negatively impact SERP visibility.
  • It adds more content for engines to crawl, which means more opportunities for your page to rank for a wider basket of search queries.
  • It freshens up your page, and freshening up your content can definitely pay dividends with Google.
To audit word count for every page on your site, crawl the site with Screaming Frog and look for potential trouble spots in the "Word Count" column.
Word Count Audit
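If you'd rather script a quick spot-check than run a full crawl, here's a bare-bones alternative using the "requests" and "beautifulsoup4" packages. The product URLs and the 200-word threshold are just placeholders, not a hard rule.

```python
# Fetch a handful of product URLs and count the visible words on each page.
# Requires the "requests" and "beautifulsoup4" packages; the URLs and the
# 200-word threshold are placeholders, not a hard rule.
import requests
from bs4 import BeautifulSoup

product_urls = [
    "https://www.example.com/products/widget-a",
    "https://www.example.com/products/widget-b",
]

for url in product_urls:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    for tag in soup(["script", "style", "noscript"]):
        tag.decompose()  # drop non-visible text before counting
    word_count = len(soup.get_text(separator=" ").split())
    flag = "  <-- possibly thin" if word_count < 200 else ""
    print(f"{url}: {word_count} words{flag}")
```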

How to fix it

Some of the ways you can address thin content on your ecommerce product pages include:
  • Enable (and solicit) user reviews and feedback. User-generated content is free and helps thicken up your content with naturally-written text (not "SEO" content). This additional content can help improve potential relevancy scoring, time on page, user engagement levels, and can help the product page rank for a broader basket of search queries. Also, user reviews offer social proof and can improve conversion rates as well.
  • In the previous example, I spoke about condensing multiple versions of the same product to a single page. Doing this would also help thicken up that page, since you'd list all the different dimensions, size variations, and colors available to consumers.
  • Write some additional, original content. You can hire a writer to help thicken up these pages with additional features and benefits, or you can do it yourself. Again, given it could be very costly to thicken up every product page on the site, you can prioritize your highest margin products first.
  • Pulling in mashups of links/text for similar products, product accessories, special offers, and recently viewed items is another way to add more content to a page, and a tactic many larger ecommerce sites, like Amazon.com, use.
Amazon Product Mashups

Too Much Content

Saying that a site has "too much content" may sound contradictory to the issue of having content that's too thin. But when I say an ecommerce site may have too much content, I'm really talking about two distinct issues:
  • Too many product pages.
  • Improper handling of paginated product pages.
And, specifically, I'm talking about how having too many pages of low-value content can cause PageRank and crawl budget problems.

Too many product pages

This is really an addendum to the duplicate content issues posed by faceted navigation or hosting multiple versions of the same product on different pages.
Aside from low-value content concerns, hosting a mass of duplicated product pages dilutes your site's PageRank or link equity, which weakens the overall ranking power of your important content.
The other issue pertains to your site's "crawl budget" (i.e., how deep and how many pages Googlebot crawls each time it visits your website). If a large percentage of your site is comprised of duplicate or low-value content, you're wasting your budget on junk content and potentially keeping quality pages from getting indexed.

Improper handling of paginated product pages

Another concern with hosting "too many pages" is not handling pagination correctly. Oftentimes, ecommerce sites have product categories containing hundreds or thousands of products that span multiple pages.
Pagination Issues
Like duplicate product pages, excessive paginated results rob link equity from important pages and can hurt your crawl budget.

How to fix it

Some of the ways to address equity dilution or crawl budget issues that can stem from too many product pages include:
  • Rel="next", rel="prev": This markup tells Google to treat ecommerce product listings that span multiple pages as a logical sequence, consolidating link equity (rather than diluting it) across all pages in the series.
  • Canonicalization: It's effective for consolidating link properties (thus solving equity dilution), but it won't solve potential crawl budget issues, since Googlebot will still crawl all your dupe content.
  • "Noindex, follow": If your goal is to optimize crawl budget and keep duplicates or pagination out of the index, use brute force and block Googlebot via robots "noindex, follow" meta directive.

Monday, October 14, 2013

The Causal Nexus of SEO



Dominos Falling
There are some aspects of online marketing that play a huge role in the bigger picture, but aren't as easy to see. Things like emotion, motivation, awareness, and relationships can be hard to gauge with our usual metrics.
But sometimes the effects of an action aren't evident right away. There are times when we can't associate cause and effect directly. Everything we do in SEO fits into a much bigger chain reaction, and we might not be able to see every piece.
When something doesn't fit our typical measurements it may be easy to write it off entirely. There's actually a word for that: floccinaucinihilipilification.
The exact definition of floccinaucinihilipilification from Dictionary.com is "The estimation of something as valueless." It's actually the longest non-technical word in the English language. Sorry, antidisestablishmentarianism.
Aside from being a semi-useful piece of party trivia, floccinaucinihilipilification is actually a great description of one of the most frustrating aspects of modern SEO. There are just so many things that are easy to dismiss because they are outside of our usual expectations for results.
Search is evolving to a point where we get much less instant gratification. Things take a much less linear path than they did in the past; you can't just walk across the room and turn on the light anymore. You have to use a Rube Goldberg machine to do it.

The Social Part of Social Media

In social media you can measure friends, followers, retweets, circles, referral visits, and sales through unique promotions. There are all sorts of fantastic metrics for judging how well a social campaign is performing. Of course not every one of those translates to visits, or dollars.
If a comment on your wall doesn't result in a sale or if a retweet doesn't improve rankings, then does it matter? Yes.
With social media it's also about the prospect of exposure. It may not be as clearly measurable when someone shares something on Facebook and one of their friends sees it and later searches the brand name. It's not always obvious when retweeting someone's post and getting a "thank you" leads to that person clicking on the Tweeter's site in the SERPs because they recognize the name.
If social media efforts aren't directly impacting your rankings, or the traffic numbers aren't approaching search engine referral proportions, that doesn't mean the campaign isn't working.
A comment on a wall may not mean much on its own. But a comment may lead to a new fan that may lead to a new sharer, who could grow to be an evangelist if the relationship is cultivated.
While direct leads are a possibility from social media, there's more to it than that. It's access to a huge and active audience if you're willing to play to the crowd.

Simple, Single Links

Links are probably one of the hardest places to deal with all of the changes in the last year. Links have been both the salvation and devastation of too many websites.
Bought links, links with keyword anchor text, easy, cheap, unlimited links weren't supposed to work, according to the rules. But they did. So forget the rules, people made money. Except now, best case scenario they don't work as well and worst case, they can tank a site.
So now links mean a totally different thing. They aren't as easy to get any more. They don't necessarily go to the pages where products live and links that go to different kinds of content don't always work the same way.
Links with your URL as anchor text probably won't move a site up for its head terms as quickly as a handful of links brandishing keywords used to. So now maybe it's about getting a link from a small community organization instead of 150 directories. But those little links are a much bigger deal now.
It's never going to be the same, but this is where we live now. A link from a person's enthusiast site for a how-to guide may not seem as effective as syndicating an article across 300 sites, but it's real. Things that are authentic may take longer to show their effect.

Trust and the Human Factor

Google has shown a continued effort to get scarily close to emotional intimacy with the preferences of its users.
Authorship is one indicator of Google's improving efforts to identify individuals as entities. Public signs point toward its increased attempts to incorporate that information into how it evaluates websites. This interest in using real people's association with websites to determine trust should be more than enough to pique our interest in getting on board early.
On the other side, Google also seems to be trying to figure out which sites people trust through their own choices and patterns. That means visitor loyalty isn't just important for repeat sales; the signals it sends can also be beneficial for SEO.
Some loyalty is measurable. Getting people to want to return to a site is measurable. We can see when the percentage of repeat users goes up.
We can measure how many people come to a site through subscription based newsletters or email marketing. We can measure when people become regular commenters or forum posters.
But it's hard to measure where those relationships start. Was the first time they came to your site searching for what you sell? Or is it possible it's because they knew you before they needed what you sell?
It isn't always as clear cut as which search word brought you the most visitors, or what was the last click before the sale. Sometimes that sale was months in the making based on a chain reaction that couldn't be tracked.

A More Convoluted Path

Each action that creates a positive connection has value even if it falls outside of our traditional data tracking.
We absolutely have to evaluate numbers, show correlation and prove ROI. That's the job of anyone working in SEO. But trying to optimize within the new system, we've had to get more creative.
It may take time for an initial action to produce a desired end result and there may be 10 steps in between instead of 3. But that doesn't mean it isn't worth it.
So don't immediately floccinaucinihilipilificate an effort in which direct results are a little ambiguous. There may be more at play than is immediately evident.
There's a time to give up on something that isn't working, sure. But make sure you're not comparing more slow-burning efforts to the precedents of the past.
At this point, shortcuts are getting shut down more and more every day, and the long way is about the only option left. So yes, an action might not lead to more rankings, traffic, or sales directly, but that doesn't necessarily mean it didn't work; it may simply be the first domino to fall.

Top Search Result = Poor Ad CTR [Study]

Advertising network Chitika released a study today that showed how ad click-through rates on a website vary when users come to that website from Position 1 in the organic search results versus other positions. Data showed the highest CTR on ads in a website occurred when users found the site from Position 10 in the SERPs.
CTR by Referring Position
As a follow-up to Chitika's study last summer that showed how rankings yielded traffic, Chitika said this is a stark contrast in terms of ad performance.
"What is clear from the data set is that although the first position of a Google search result drives the most search traffic, an average visitor coming from that link is the least likely to convert into an ad click," according to Chitika.
Chitika said the reason why Position 10 might be driving the most ad CTR on a site could be due to unsatisfactory results.
"When a user scrolls down and clicks on a link at Position 10, it is more likely that they have not found what they were looking for, increasing the probability of that person clicking on an ad related to their search query," Chitika said.
Chitika said that marketers shouldn't necessarily be vying for 10th position on every keyword, but that in terms of driving ad revenue, it's not a bad place to be overall.
Google Results Page Rank vs. Average Traffic Share Chart
"On a popular search term, 2.4 percent of potential visitors still represents a sizable audience, and by being the number 10 result, it's likely a site will see higher ad revenues," Chitika said in its report. "However, for lower volume or specialized search terms, ranking as high as possible will help in attracting the largest audience, since the proverbial 'pie' of users on those terms is already fairly small, along with the potential revenue impact of higher visitor CTRs."
So what's a marketer to do with this data? Cristian Potter, a data solutions engineer at Chitika, said it's important to note that this report examines aggregate traffic trends, and may not apply to groups of sites.
"Hitting the sweet spot requires some analysis of an individual site's traffic, for example, understanding how users are finding the site, and how certain campaigns have impacted actions undertaken by users on the site itself," Potter said. He added that this research can serve as a "as a point of reference in plotting metrics and key performance indicators."
While the data seemed to show an interesting relationship, sites that go after ad revenue have to strike a delicate balance between providing a great user experience and making money. Not having the most relevant content (Position 10 versus Position 1) and subsequently driving users away through an ad doesn't seem like a great idea, either.
Potter agreed.
"User experience is always a key consideration when it comes to deciding on the number and placement of ad units," Potter said. "This also ties in with expected CPM on each ad unit - it should be worth the site's while to place an ad in a prime position. However, this study was solely behavior focused. The characteristics of the one or more ad units on each site within the sample will have varied considerably."

New AdWords Estimated Total Conversions Tracks Consumer Purchases Across Devices

Starting today and over the next few weeks, Google AdWords will roll out a major reporting update to conversion tracking called Estimated Total Conversions. This feature provides estimates of conversions that take multiple devices to complete and adds this data to the conversion reporting we see today.
Following the launch of enhanced campaigns this year, search advertisers have combined mobile and desktop campaigns, with the ability to further modify bids by mobile and other targeting factors. One gap in reporting and understanding of the campaigns' effectiveness has been the limited data on how consumers are navigating and converting across multiple devices.

What is a Cross-Device Conversion?

What is a Cross-Device Conversion
Consumers' constant connectivity has enabled them to browse, shop, and interact with businesses on the go and from multiple devices.
A September 2013 Google study found that more than 90 percent of multi-device consumers move sequentially between several screens, like mobile to desktop, or mobile to tablet, to complete a transaction online. Google found that a high percentage of converters actually jumped from desktop to desktop too, presumably from a work desktop to a home desktop computer.

How Estimated Total Conversions Works

Measuring AdWords Conversions in a Multi-Screen World
Google calculates cross-device conversions for a particular advertiser based on how their customers convert when they are logged in. It then uses this as the basis for extrapolating out to the complete data set to form an estimate of what total cross-device conversions might look like. This data is only used in aggregate and is not personally identifiable.
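Google hasn't published the exact math behind the estimate, but the general idea can be illustrated with a naive proportional extrapolation. Everything below, the numbers and the approach, is made up purely to show the concept, not Google's actual calculation.

```python
# Naive illustration of the extrapolation idea, not Google's actual formula.
# From the logged-in sample, estimate how many cross-device conversions the
# full click population might represent.

logged_in_clicks = 20_000        # clicks from signed-in users (assumed sample)
logged_in_cross_device = 500     # cross-device conversions seen in that sample
total_clicks = 180_000           # all clicks recorded for the campaign

observed_rate = logged_in_cross_device / logged_in_clicks
estimated_total_cross_device = observed_rate * total_clicks

print(f"Observed cross-device conversion rate: {observed_rate:.2%}")
print(f"Estimated total cross-device conversions: {estimated_total_cross_device:.0f}")
```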

What's Next?

Estimating conversions across devices (estimated cross-device conversions) is only the beginning, and just one of the conversion types Google intends to measure.
In the future, Google plans to incorporate other conversion types, such as phone calls and store visits, areas where advertisers are hungry to gain new insights into how their advertising is working.

Link Building 101: Competitor Analysis

Link Building 101 Competitor Analysis
Link building is something anyone can accomplish. There's no great secret, just hard work, creativity, and determination to get links that matter.
When you're looking for some practical link building opportunities that will help you find and acquire quick, yet quality, links, there are five "quick wins" you should explore at the beginning of a link building campaign:
  1. 404 Pages and Link Reclamation
  2. Competitor Analysis
  3. Fresh Web Explorer/Google Alerts
  4. Local Link Building
  5. Past/Current Relationships

Competitor Analysis/Backlink Profile

Competitor analysis is an integral step in any link building campaign. Why? Because running a backlink analysis on a competitor:
  • Teaches you about the industry:
    • Gives you a sense of which sites within the vertical are providing links
  • Helps you understand your competitors, including:
    • Their link profile, and why they're ranking
    • Their strategies used to acquire links
    • Their resources that didn't acquire many links
  • Gives you a list of obtainable links (if they can, why not you?)
Competitor backlink analysis is great – you get the initial research into the industry done, it helps you understand the competition, and it gives you a tidy list of high opportunity links.
So, let's dive into the how of competitor backlink analysis:
  1. Make a list of competitors
    • Direct
    • Indirect
    • Industry influencers
    • Those ranking for industry money keywords
    • Watch fluctuations – who's winning and who's losing
  2. Take those competitors and run their sites through a backlink tool previously mentioned (OSE, Majestic, Ahrefs, CognitiveSEO, etc.)
  3. Backlink Analysis
  4. Download the top 3-4 competitors' backlinks into CSVs. Combine into a single Excel sheet, removing duplicates, and find obtainable quality links already secured by competitors.
Steps 2 and 3 were previously covered in "Link Building 101: How to Conduct a Backlink Analysis", and step 1 is pretty self-explanatory.
To recap the advice for these steps:
  • Don't phone in the list of competitors. Spend time doing research and investigation, giving yourself a well-thought-out and understood list of potential competitors.
  • Information you should be examining in a backlink analysis:
    • Total number of links
    • Number of unique linking domains
    • Anchor Text usage and variance
    • Fresh/incoming links
    • Recently lost links
    • Page Performance (via top pages)
    • Link quality (via manual examination)
  • Additionally, think creatively while looking through competitors' backlinks. Think about:
    • Which resources/pages performed well
    • Which resources/pages performed poorly
    • Commonalities in competitors' link profiles
    • Differences in competitors' link profiles
    • Strategies likely used to acquire links

How to Find Obtainable Quality Links

So, that takes us to Step 4: downloading competitors links into CSVs, combining in Excel, and drilling down into the data to find worthwhile links and insights.
Honestly, SEER has done an amazing job of writing a very easy to follow guide for Competitor Backlink Analysis in Excel.
To summarize their steps, you:
  • Download CSVs of competitors' backlink portfolios (‘Inbound Links' will give you a list of all the pages linking, ‘Linking Domains' will give you only the domains).
    • Note: if you're unfamiliar with your own (or client's) backlink portfolio, you may wish to include their backlink portfolio in this process for reference.
    • Using OSE, don't forget to filter to the whole domain:
Pages on this root domain export to CSV
  • Open the CSVs and combine (copy and paste) all the data into a single Excel sheet.
  • Filter down to clean URLs, keeping the originals intact.
    • Move Column J (target URL) to Column P (to be the last column)
Move Column
    • Delete Column J (the now empty column)
Delete Empty Column
    • Duplicate the URL and Target URL columns on either side
Duplicate URL Target URL columns
    • Remove http:// and www. from both column A and column P - select the column, press Ctrl+H (the find-and-replace shortcut), type in what you want to find (http:// and www.), and replace them with nothing (by leaving the second line blank).
Remove http and www
    • You might want to rename column A and P at this point - call them bare URL and bare target URL, or whatever you so desire (in the SEER article they were called ‘clean').
  • Remove duplicates
Remove Duplicates
    • Make sure it's only for column A (bare URL) and P (bare target URL)
Remove Duplicates URL
Notice the check mark on "My data has headers". This is important to keep your data from being jumbled up. Anytime you're removing duplicates make sure this box is checked.
This will give you a complete list of stripped URLs next to the full URL linking (along with the rest of the important information provided by OSE) and a list of full target URLs next to a complete list of stripped target URLs.
Note: you'll still likely have a lot of duplicate URLs in column A (the linking URLs) at this point. This is because there are multiple links on the same page going to different landing pages – which is potentially important information (it shows a competitor acquired multiple links per page).
If you'd like to delete these multiple link pages/URLs to reduce data noise, highlight column A and run ‘Remove Duplicates' again - making sure the ‘My data has headers' box is checked:
Remove Duplicates Bare URLs
Now, you'll be down to unique URLs (pages, not domains if you've used Inbound Links) linking to competitors. If you're looking for only referring domains, you should start back at step 1 and download a CSV of referring domains, as opposed to all links.
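If you'd rather script this than click through Excel, here's a rough pandas sketch of the same combine, strip, and dedupe steps, with the Domain Authority sort from the next step included. The filenames and column names ("URL", "Target URL", "Domain Authority") are assumptions; match them to whatever your backlink export actually uses.

```python
import glob
import pandas as pd

# Sketch of the combine/clean/dedupe flow in pandas rather than Excel.
frames = [pd.read_csv(path) for path in glob.glob("backlinks_*.csv")]
links = pd.concat(frames, ignore_index=True)

def bare(url_series: pd.Series) -> pd.Series:
    """Strip the scheme and www. so near-identical URLs dedupe cleanly."""
    return (url_series.astype(str)
            .str.replace(r"^https?://", "", regex=True)
            .str.replace(r"^www\.", "", regex=True))

links["Bare URL"] = bare(links["URL"])
links["Bare Target URL"] = bare(links["Target URL"])

# Equivalent of Excel's Remove Duplicates on the two bare columns
links = links.drop_duplicates(subset=["Bare URL", "Bare Target URL"])

# Sort the remaining rows by Domain Authority, highest first
links = links.sort_values("Domain Authority", ascending=False)
links.to_csv("combined_competitor_links.csv", index=False)
```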
At this point, you're still dealing with a lot of data, so you'll want to filter it further. I recommend filtering by domain authority to see the most authoritative links first.
Filter Domain Authority
This will make your list ordered from highest domain authority to lowest – pretty useful information. Keep in mind however that the domain authority is thrown off by any subdomains hosted on a popular site – example.wordpress.com, example.blogspot.com, etc.
So, don't take the domain authority as absolute – you'll need to verify.
There are also a few other filters you can use to find interesting data:
  • Page Authority (PA)
  • Anchor Text
  • Number of domains linking (shows best ranking pages - don't get stuck on home pages)
Take time and play around with the data. Look through the top DAs (manually excluding anything artificially inflated), then PAs, check out top performing pages via the number of domains linking, and even play around with filtering the anchor text.
This should be the fun part - the analysis. You've filtered the data down to a semi-digestible level, and you should start digging in to find insights and understand your competitors' links.
Remember, any links your competitor has should be considered fair game for yourself. Once you've determined quality links from domains you haven't secured, look into the link and pursue it appropriately.

More Insights

If you're looking for even better (and more advanced) deep data insights, you can move all this information into pivot tables. Simply select all rows, click over to the Insert tab, and select ‘Pivot Table':
Insert Pivot Table
Once here you have the option to choose which fields you'd like to further examine:
Pivot Table Fields to Add
Playing with this data should reveal potential insights, although we're getting a bit beyond Link Building 101.
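If you'd prefer to stay in code here too, a rough pandas equivalent of the pivot idea might look like the sketch below, again assuming the column names from the earlier sketch.

```python
import pandas as pd

# Rough pandas equivalent of the Excel pivot: for each target URL, count how
# many unique pages link to it and average the linking pages' Domain Authority.
# Column names carry over from the earlier sketch and are assumptions.
links = pd.read_csv("combined_competitor_links.csv")

pivot = pd.pivot_table(
    links,
    index="Bare Target URL",
    aggfunc={"Bare URL": "nunique", "Domain Authority": "mean"},
).rename(columns={"Bare URL": "Unique Linking Pages",
                  "Domain Authority": "Avg Domain Authority"})

# Show the competitor pages attracting the most unique linking pages
print(pivot.sort_values("Unique Linking Pages", ascending=False).head(20))
```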
Furthermore, if you want to really dive into pivot tables (or Excel in general), I can't recommend Annie Cushing enough. Check out her Moz article "How to Carve Out Marketing Strategies by Mining Your Competitors' Backlinks".

After '(Not Provided)' & Hummingbird, Where is Google Taking Us Next?

We've come a long way in a little over two decades of search. Archie, Veronica, Jughead, Excite, Wanderer, Aliweb, Altavista, WebCrawler, Yahoo, Lycos, LookSmart, Google, HotBot, Ask, dmoz, AllTheWeb, Goto (Overture), Snap, LiveSearch, Cuil, Bing, Blekko, DuckDuckGo, Yandex, Baidu... and too many other also-rans to name.
The earliest were simply a collection of resources, initially just in alphabetical order, then some introducing an internal search capability. Eventually, some began to crawl the web, while others contented themselves with using the indexes of others.
Among them all, Google now stands out as the giant. About two-thirds of all global searches happen on Google. So that means that those of us who want our sites to be found in Google's search results need to color between the (webmaster guide)lines, while trying to figure out what Google wants to see, today and hopefully, tomorrow.

Search Today

Figuring out what Google prefers to rank isn't really that complex. Pay attention, use some common sense, don't look for silver bullets, and provide quality and value. Get that down pat and you're in pretty good shape.
Most folks who find themselves crosswise of Google got there because they (or someone they hired) tried to take a shortcut. Do shortcuts still work? You bet! Do they still last? Not so much!
Google has gotten a lot better at detecting and handling manipulative tactics. No, they're not perfect – far from it. But the improvement is undeniable, and a couple of recent developments offer hope.
What happened?
Google unleashed a one-two punch recently, with two important changes that stirred up a lot of chatter in SEO and marketing communities. And I'm not convinced they're unrelated. They just mesh too well to be coincidence (not to be confused with correlation, my friends).

1. '(Not Provided)'

No Keyword Data
The recent extension of "(not provided)" to 100 percent of organic Google keywords in Google Analytics got a lot of people up in arms. It was called "sudden", even though it ramped up over a period of two years. I guess "it suddenly dawned on me" would be more accurate.
As my bud, Thom Craver, stated perfectly, if you're one of those who is saying that no keywords means SEO is dead or you can't do your job, then you shouldn't be doing SEO to begin with.
That sums it up pretty well. There are still ways to know what brought users to your pages. It's just not handed to you on a silver platter any more. You'll have to actually work for it.

2. Hummingbird

Hummingbird
Now let's look at the other half of that double-tap: Hummingbird. Since Google's announcement of the new search algorithm, there have been a lot of statements that fall on the inaccurate end of the scale. One common theme seems to be referring to it as the biggest algo update since Caffeine.
Wrong on both counts, folks! First, Caffeine is a software set for managing the hardware that crawls and indexes, not search. As such, it's not an algorithm. It was also new, not updated, but we'll let that slide.
That second point, however, applies strongly to Hummingbird. There is no such thing as a Hummingbird update. It's a brand new search algorithm.
Jeez Louise, if you're going to speak out, at least try not to misinform, OK?

Why Might they be Related?

Now understand, there's a bit of conjecture from here on out. I can't point to any evidence that supports this theory, but I think many of you will agree it makes some sense.
Killing the easy availability of keywords makes sense to me. People have focused on keywords to a degree that approaches (and often passes) ridiculous. Google has finally, however, achieved a sufficient level of semantic ability to allow them to ascertain, with a reasonable amount of accuracy, what a page is about, without having exact keywords to match to a query.
Methinks it's a good idea for the folks who are generating content to try the same.
So... we can no longer see the exact keywords that visitors used to find us in organic search. And we no longer need to use exact keywords to be able to rank in organic search.
Yeah, I know, pure correlation. But still, a pattern, no?
My theory is that there's no coincidence there. In fact, I think it runs deeper.
Think about it. If you're no longer targeting the keywords, you can actually *gasp* target the user. Radical concept for folks who are still stuck in a 2005 rut.
Bottom line: You need to start building your content with concept and context in mind. That'll result in better content, more directed to your visitors – then you can stop worrying about whether Google has a clue about the topic your page is focused on.
Just communicate. If you do it right, it'll come through, for both. Just think things, not strings.

Where is Search Heading Next?

Rainbow
Here's where I think the Knowledge Graph plays a major role. I've said many times that I thought Google+ was never intended to be a social media platform; it was intended to be an information harvester. I think that the data harvested was intended to help build out the Knowledge Graph, but that it goes still deeper.
Left to its own devices, Google could eventually build out the Knowledge Graph. But it would take time, and it would undoubtedly involve a lot of mistakes, as they dialed their algos in.
With easily verified data via Google+, Google has a database against which they can test their algos' independent findings. That would speed the development process tremendously, probably shaving two or three years off the process.
But my theory doesn't end there. Although I suspect it wasn't a primary motivation, the removal of keywords, coupled with the improved semantic ability of Hummingbird, puts a whole new level of pressure on people to implement structured data. As adoption cranks up, the Knowledge Graph will be built out even faster.
As I said, I doubt that motivating people to implement structured data markup was a primary focus of the recent changes. But I'll bet it was a major benefit that didn't go unnoticed at the 'Plex.
The last week has definitely brought some changes to the way we'll be handling our online marketing and SEO efforts. The Internet continues to evolve. Those who don't follow suit may soon be extinct.
For my part, I'm pleased to see the direction that Google seems to be moving in. It's a win-win.

5 Things We've Learned From Google's New War on Links

It's been 18 months now since Google's Penguin update launched and a similar amount of time since the first manual penalty messages were sent to unsuspecting webmasters.
That's a long time in the world of digital marketing. While most industries deal with a level of change, the rate of iteration across the web is unprecedented.
Such a level of change requires an agile approach to processes. Google practices a Kaizen approach to product development and penalties, so it's imperative that we consistently reexamine how and why we do everything.
The same rule applies to how penalties are dealt with. It's a given that the tolerances Google allows across metrics have changed since those penalties were first introduced. Industry opinions would certainly support that theory.
Strangely, for a content-led company, the digital marketing agency I run is now very experienced in penalty recovery, as a result of new clients coming to us looking for a way to market their companies differently.
It means, in short, that I have lots of data to draw conclusions from. I want to share our findings based on recent real-world work, including a few key tips on areas you may be missing while cleanup is going on. Here are some top takeaways.

1. Link Classification

While Google has long been giving out examples of links that violate their guidelines, in recent weeks things have changed.
Until recently it was so easy to call out a "bad" link that you could spot them with your eyes closed. The classification was so easy it spawned a proliferation of "link classifier" tools. And while they prove useful as a general overview and help do things at scale, the pace of Google's iteration has made manual classification an absolute must.
So what has changed?
We've always known that anchor text overuse is a key metric. Here are the results of a charting study we ran across those clients escaping either manual or algorithmic penalties:
Percent of Suspect Links Post-Recovery
It isn't perfect, but the data shows an irrefutable trend toward a less tolerant stance on "spam" by Google.
I don't want this to be seen as a definitive result or scientific study, because it isn't. It is simply some in-house data we have collated over time that gives a general picture of what's going on. Recovery, in this instance, is classed either as a manual revoke or a "significant" improvement in rankings and traffic over more than a month.

2. The Link Types Being Classified as 'Unnatural' are Changing

The view that things are indeed changing has been supported by example links coming through from Google in the past four weeks as part of its manual review communication.
Instead of the usual predictable irrelevant web directory or blog network, the search giant seems to be getting much more picky.
And while I can't share exact links due to client confidentiality, here are a couple of examples of specific link types that have been highlighted as "unnatural":
  • A relevant forum post from a site with good TrustFlow (Majestic's measure of general domain "trust").
  • A Domain Authority (DA) 27 blog with relevant and well-written content (DA is a Moz.com metric measured out of 100).
Ordinarily these links would pass most classification tests, so it was surprising to see them listed as unnatural. Clearly we can't rule out mistakes by whoever reviewed the site in question, but let's assume for a moment this is correct.
In the case of the forum post, it had been added by a user with several posts, and the text used was relevant and part of the conversation. It looked natural.
The blog post was the same: natural by almost every metric.
The only factor that could have been put into question was the use of anchor text. It was an exact match phrase for a head term this site had been attempting to rank for in the past. That might be an obvious signal and is one of the first places to look for unnatural links, but it gives an interesting nod to where Google may be taking this.

3. Co-Citation and the End of Commercial Anchors?

A lot has been written about the changing face of anchor text use and the rise of co-citation and co-occurrence. I penned a piece a few months ago, in fact, on the future of link building without links. It seems as though Google now wants to accelerate this by putting more pressure on those still using exact match tactics.
It is certainly my view now that links are playing a less significant role in general rankings. Yes, a site has to have a good core of links, but Google's algorithms are now much more complex. That means Google is looking at more and more metrics to define the search visibility of a domain, which leaves less room for "links" as a contributory factor.
Given that semantic search also isn't reliant on links, and that Google has made clear its intentions to move toward this future, it's clear that brand mentions, social sharing, and great content that is produced regularly and on point are becoming more critical.
Links are by no means dead. Anyone that says that is crazy. But there is certainly more contributing to visibility now.

4. Check Your Page-Level Anchor Text

Penguin 2.0 has also changed the way we look at penalties in general. While it was OK to simply take a domain-wide view of link metrics such as quality, anchor text, and relevance, that's no longer enough.
The search giant has become much more targeted in its application of penalties, certainly since Penguin 2.0. As a result, we're now seeing partial penalties being reported in Webmaster Tools, as well as full manual actions and a plethora of other actions.
This means one thing: Google understands its data better than ever and is looking at the quality of links in a much deeper way – not just those pointing directly to your site, but even where those sites are getting their link juice from.

5. Look Out for Different Pages Ranking

One sure-fire sign of issues with individual page over-optimization or penalization is where Google struggles to index what you would consider the "right" page for a term. This is often because Google is ignoring the "right" page and instead looking to other pages on your site.
If you see different pages ranking for a specific term within a few weeks, then it's worth checking the anchor text and links specifically pointing to that page.
Often you may find just one or two links pointing to it, but 50+ percent of them may be exact match, and that now seems to be enough to create issues.

What Now?

The key is to be informed. Invest in multiple data sources to ensure you have the full picture. You can use the following:
The above combination allows you to take a full picture view of every link on your site and gives you a second opinion should you feel it necessary. Removing links is a significant strategy. It pays to have more than one view to back up initial findings on things such as anchor text use and link quality and trust.
Alongside that, it's worth running a check of every linked-to page on your site; you can then check anchor text ratios for each one (a rough sketch of that check follows below). That way you can reduce the impact of partial actions.
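As an illustration of that page-level check, the sketch below estimates exact-match anchor ratios per landing page from a backlink export. The column names ("Target URL", "Anchor Text"), the filename, and the head-term list are all hypothetical placeholders.

```python
import pandas as pd

# Sketch: estimate exact-match anchor ratios per landing page from a backlink
# export. Swap in your own export fields and head terms.
EXACT_MATCH_TERMS = {"blue widgets", "cheap blue widgets"}  # hypothetical head terms

links = pd.read_csv("backlinks.csv")
links["is_exact_match"] = (links["Anchor Text"]
                           .fillna("")
                           .str.strip()
                           .str.lower()
                           .isin(EXACT_MATCH_TERMS))

# For each landing page, count total links and exact-match links
ratios = (links.groupby("Target URL")["is_exact_match"]
          .agg(total_links="count", exact_match_links="sum"))
ratios["exact_match_pct"] = 100 * ratios["exact_match_links"] / ratios["total_links"]

# Pages where half or more of the anchors are exact match deserve a closer look
print(ratios[ratios["exact_match_pct"] >= 50]
      .sort_values("exact_match_pct", ascending=False))
```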
The key is to reduce the use of exact match anchors as much as humanly possible as tolerated percentages are only going one way!
Above all, it may be time to start thinking beyond links entirely and onto a world of "brand as publisher," creating great content from a clearly defined content strategy, and then supporting it with an informed distribution strategy. But that's a story for another day.