Monday, October 21, 2013

Big Brands, Google, Penalties & You

For years there has been controversy about big brands and their special place in Google's heart. For the most part, Google's supposed brand bias is really an SEO myth, told late at night over beers in darkened corners at conferences and in forum postings.
Most sites that are big brands rank well because they meet so many points on Google's algorithm – everything from authority, to quality score, to links, to social signals.
If you see Wikipedia everywhere, as annoying as it may be, it positions so well because it has tons of content and more links pointed at it than stars in a desert sky. This doesn't mean Google prefers brands; it means the site is algorithmically awesome.
OK, so "algorithmically awesome" isn't really an SEO term, but it might as well be. If you naturally meet more points on the algorithm than any of your competitors, then your gift from Google will be to occupy a higher position in the search results. It just kind of works that way.

Wait, What About Google's Vince Update?

Yes, there was an algorithmic update called Vince in 2009 that threw big brands some special algo points.
That same year brought talk of "brand affinity." This quote from Eric Schmidt, then Google's CEO, probably said it best:
"Brands are the solution, not the problem. Brands are how you sort out the cesspool. Brand affinity is clearly hard wired. It is so fundamental to human existence that it's not going away. It must have a genetic component."
So big brands were wired into the algorithm with Vince, then even more so with Panda. Yet, Google's Distinguished Engineer Matt Cutts said big brands "can't do whatever they want" and are subject to the algorithm just like regular sites. What is an SEO to believe? Which is it?
Well, it's really all of the above. You don't automatically get to the top just by being a big brand. If you have a poor website and are in general not doing well on the algorithm, you might do well for a few terms, sure, but overall, no.

Big Brands and Search

Being a big brand naturally helps you with some algorithmic factors, including perceived site authority and quality. You also have the one thing most mom and pops don't: brand affinity.
One sentence of the Search Quality Rating Guidelines is telling: "Would you recognize this site as an authoritative source when mentioned by name?" Brands are better known to users simply by being brands, so the user intent is more likely to be Target the store than, say, target the bullseye.
Wikipedia ranks well everywhere because, frankly, it should. That said, you don't always position well just because you are a brand.
I once sat in a site clinic in which one of the largest ecommerce sites on the web didn't understand why it couldn't position for selling television sets. The problem? There were no signals from the site that it deserved to position to sell television sets. While the site did position well for things related to its core brand, it just didn't for things that weren't.
So where is the benefit for big brands, outside of having a bit more ability to extract authority and quality, larger value points on links and social, and ranking for being a known brand?
Well, mostly it seems limited to penalties.

Penalties and Big Brands

Many big brands have a direct connection at Google, which means someone at Google who will tell them when they've crossed a line, or at least a line to Cutts to get a question or two answered. And if you're really big, Cutts will warn you himself (see Mozilla).
Big brands are also more likely to survive a Google penalty than a small- or medium-sized business (SMB) site, partially because they are stronger sites, and partially because their penalties seem to do much more limited damage.
Seriously, when was the last time you heard of a big brand being removed entirely from Google's index? Sure, they get hit with penalties all the time; I know Cutts isn't misleading us about that. But the damage is much more focused and much more limited.
Remember JCPenney and Overstock? They lost keyword traffic, not their entire website.
Go to the Google webmaster forums and see how many SMBs can say the same.
Now some theorize that it is because Google gives these sites leniency – hey, they were at #1 and could not rank any higher, so those link buys didn't actually help them. Others believe it is just a strict brand preference.
Personally, my SEO brain has settled on a set of algorithmic factors I like to term "the expectancy factor" or the "should be there factor" (this is just an easy way to say algorithmic factors, combined with the fact that these sites are simply stronger and that the brands are given leniency and limited penalties).

'Should Be There' Status

Some sites should be in Google's index – not just because Google thinks they should be there, but because end users' searches have indicated to Google that they expect them to be there.
If one of these "should be there" sites didn't show up in Google's results, then the searcher might think Google has a pretty lousy product.
So Google protects its product by limiting the effect of penalties on big brands, warning them directly, and helping certain ones recover quickly if a penalty is more damaging. However it plays out, big brands are like a naval carrier in the middle of a penalty storm; your SMB is a Tiki raft.

Google Penalties – Big Brand Leniency

So how does it work in the real world when you have "should be there" status?
While a standard SMB site might receive one penalty and find itself without a homepage in the Google index (and many have), a big brand might look like this:
Manual Action Viewer
This is the manual action viewer of a big brand site.
In the manual action viewer, each of these penalties except one covered approximately 1,000 pages (the maximum the viewer can handle). These penalties were on the subdomain, but the main domain was also penalized.
Both the main domain and the subdomain lost key terms and key placements in the search engine, but the site continued to position for new, highly trafficked terms (though less relevant and longer tail), so the damage never threatened its core business functions.
Now what happens if you're a company that doesn't have "should be there" algorithmic triggers, and you receive even one of these manual actions on your site, over a multitude of pages or even just a percentage of them?
You would have probably woken up to this:
Analytics OOPS
But this big brand site never saw this graph. The site hardly noticed the blip, partially because it creates new pages all the time, which were positioning (well enough) and helped cover up the loss, but mostly because the penalties were isolated and not cross-sectional.

Penalty Removal: Big Brand vs. SMB

These penalties were there for quite some time. So, if your site was not expected to be in the index, if you did not have site authority and site strength, and your site was not a big brand, you would probably expect the road back to be pretty tough (noting that you would at most be dealing with one or two of these penalties, as no small site would survive more than that without getting its homepage kicked out of the index along with the rest of the site).
Here's what the site looked like after one reconsideration request and one penalty removal.
2 million impressions
Within a few days, they regained 100 percent of their impressions, an average of 2 million. This change shows that their site was repositioned for key terms and that their new content was likely being shown prominently in the search results.
We did a hand-check and yes, they regained key category terms and they were now positioning for relevant, highly competitive terms, even highly competitive phrases with short life expectancy (terms that would live only a few days).
If you aren't a big brand, your site isn't likely to work the same way. At the same time as this big brand site was recovering, we helped an SMB recover.
Instead of days, the SMB site took three months, three requests (one rewritten for depth and breadth), and a complete site rebuild. Only then did the homepage even start to show for the company's own name, on the fifth page of Google.
This is the more likely outcome for an SMB, if you recover at all.

So Why Do You Care About What Big Brands Do?

Maybe that big brand corporation site is getting away with some black hat tactic, and your (clueless) boss, marketing team, board of advisors, or some other stakeholder knows about it.
"If they can do it, well we can, too!" they say.
No. You can't.
Unless you have algorithmic awesomeness, authority, and expectancy (and that expectancy is the key), you will much more likely end up losing your positions, your pages, or your homepage if you buy links or engage in other practices that violate Google's guidelines.

What Else Can You Learn From Big Brands?

Acting like a big brand is your best method for achieving success. Big brands send out strong signals to Google that tell Google there is a company and people behind them. These signals tell Google that the site is taken care of, that the company is awake at the helm, and that the site is going to be a good product.
Now, not all big brands put out great websites; in fact, a lot of them put out horrible ones. That is where expectancy (brand) can save them, and where you can beat them.
Google has provided some guidance on building high-quality sites, in the form of these 23 questions. Follow these concepts, check your site. Does it meet the criteria of what Google (not you) considers a high-quality site? You don't have to hit every point, but the more signals you send Google the better.

Be a Big Brand in the Making

The SEO times, they are a-changin'.
Get your site to send out big brand signals. If you mimic all the good things a brand does well, Google will give you some of those authority and quality points:
  • Create content.
  • Build a natural link profile.
  • Use social.
  • Create more content (and more content).
  • Have a blog.
  • Make sure your site is technically sound, fast, visually appealing, and easy for users (and search engines) to navigate.
  • Utilize experts in the industry to audit your site and tell you what you're not doing well, so you can do it better.
You aren't likely to get those valuable expectancy signals unless you have offline indicators as well, but that's OK. If you build a better site with authority and meet more brand points on the algorithm, you don't need them to compete.
Brands do have leeway, but the case is much stronger when it comes to penalties. Being a brand just means they need to be there and be found, not found in the top position for anything except their name and a few core key terms.
Expectancy, being a brand, and having authority don't get you automatic positions; they just get you some advantages in the game.

Friday, October 18, 2013

Securing the Future of SEO: Global Brands & 5 '(Not Provided)' Solutions

Future SEO
SEO has changed forever.
The great philosopher Heraclitus once said that "change is the only constant." But wait: didn't Einstein say something similar about the universe? Even the relationship between change and constancy in life is open to debate.
This sums up the situation search marketers find themselves in today. SEO has a new meaning, a new direction. How we deal with it is driven by marketers' perception, and the word "secure" now has more than one connotation in our market.
When Google made it apparent that 100 percent of its keyword data would be "(not provided)" (to the SEO community), many reacted with anger, angst, and frustration. Others sat back to absorb the news, and some – content-based marketers – embraced it as part of the natural evolution of SEO. They saw it coming and planned ahead.

Will Google Punish a Guest Posting Strategy Due to Unnatural Link Patterns?

The Missing Link
"The Missing Link" is a Search Engine Watch exclusive reader-driven Q&A column with veteran content publicist Eric Ward. You can ask questions about all aspects of links and link building and Eric will provide his expert answers. Submit your questions here, and you may be featured in a future installment!

Until Google's recent algorithm changes, we consistently ranked on top of the search results for our primary keyword phrase. Now we are on page two for that phrase, so I am looking into pursuing guest posts on other sites. To my understanding, Google will detect and penalize unnatural link patterns. As a result, I am wondering: if I obtain guest posts from a blog network, would or could this be easily detected and penalized as a pattern?
– Blue on Page 2
I'll give you a short answer and a long answer.
Short answer: Writing posts for other sites (which is a form of guest posting) is still an effective way to build credible links.
The devil is always in the details though. Here's a longer answer.
First, a video from Google's Matt Cutts addressed this exact subject. I strongly suggest you watch it:
Of everything Cutts said, the thing that struck me most was when he said, "It's a long and time honored tradition" for writers with expertise in certain topics to share content with each other. In other words, it's absolutely acceptable.
In fact, this column you're reading likely falls into that category. I've written this post for Search Engine Watch and nobody else.
Technically, this is a guest post. But this is much different than a guest posting approach where you aren't selective about where you seek out posting opportunities or, on the receiving end, you aren't selective about who you accept guest post content from.
Here are a few guidelines/criteria that may help you as you seek out guest posting opportunities:
  1. Look for signals/signs of credibility and longevity:
    • How long has the target site been on the web? Longer may mean more credibility.
    • Who owns/operates the site? Have you ever heard of them?
    • Is there an editorial team with clearly stated guidelines? There should be.
    • Is the target site's content made up of mostly guest posts? If so, be very cautious. Search to see if the posts have appeared on other sites.
    • Look at the other guest posts on the site. Search on author names to see who they are and what kind of web presence they have.
  2. If you want to take your credibility analysis really far, do a backlink analysis of the target site, as well as the target sites of the guest posters, to see just how credible their existing link profiles are. I personally stay away from any site that has any evidence of a spammy backlink profile, because I don't want my site to have any negative signal association with those sites.
One final point: at the start of your question, you specifically mentioned your Google rankings. Consider that even under the best guest posting circumstances, you can only take guest posting so far as a ranking-specific strategy.
You can't permanently guest post your way to the top of the search rankings. Nor should that be the only goal.
In fact, I look at guest posting opportunities for their potential to help my direct traffic and exposure to an audience I'm interested in reaching. I don't guest post for search rank. Seeking search rank via guest posting can lead you to make poor decisions and leave a linking footprint that Google can detect as manipulative.

3 SEO Success Factors for 2014

I recently came upon one of my old articles, "SEO Factors for 2011". I chuckled to myself, not only about how some of the details were actually still relevant but also how many seemed so elementary compared to today's search environment.
Yes, we still need to worry about the possession of "natural-looking" link profiles and how we feed our data to the engines. Those items will always be of consideration.
However, the nostalgic review of my piece had me thinking about what factors we need to consider for successful SEO as online marketers in 2014.

Social, Local & Mobile

We've spent the last handful of years practicing and preaching the importance of being in social, mobile, and local. This mindset was proactive. It allowed us to focus not solely on keywords and search results, but on how these elements were going to change the search results our users saw, as well as our users' experience.
While we walked down this road, at first it felt as if we were making strides to build silos of these efforts. Soon we saw the convergence of local and social sites molding into Google local results (e.g., Yelp reviews in Google local listings). We've also seen the fast-paced growth of mobile, and how localization has brought a more relevant delivery of results in this arena.

Search in 2013

This year has brought a lot for us to understand as marketers. As we close out 2013, algorithmic intelligence is changing faster than ever, at least in my opinion.
The buzz of 2013, and even more so of the last few months, has been about the advancements of the Knowledge Graph, Local Carousel, Google Now, Hummingbird, and the great secure search/"(not provided)" change.
That's not even to mention Penguin and Panda, but those changes are more about what you may have done wrong in the past. We're here to talk about the future.

The Future of SEO

While the "(not provided)" announcement was a smack in the face to SEO professionals, hopefully it has helped you to realize that our intentions shouldn't be so focused so solely or intently on ranking a keyword in search results.
After watching what Google has been doing over the last year or so, where do keywords tie into the above-mentioned rollout features? They each in some way or another tie into local, mobile, or social.
  • Will keywords help you with the Local Carousel? No, proximity and review generation will.
  • How will Google Now propel your keyword strategy? It won't, but social efforts will.
  • Do you think that Google will give you a Knowledge Graph box for a keyword and link to your site? If so, you're dreaming.
Add in the Hummingbird update, and all of these changes tell us that Google is moving closer to bringing everything together through the tie-ins of localization and semantic improvements for conversational search, which is popular on mobile.

SEO Isn't Dead, It's Converging

SEO at its core will never be dead. All of the on-site needs of yesteryear will remain important in 2014. All of the newer processes of creating informational, enticing, and insightful content for link building and social digestion are still the hot topic now and will be heading into the future.
My point is that we need to watch the converging of our old silos into the new SERP display. SEO has taken on a converging role with other mediums which impact SERP display.
Pizza Near Shawnee KS Google SERP
For example, you've created optimized local listings for your local business, but knowing that the display weighs even more heavily on reviews, have you done your job at local-social integration?
Lowes Google SERP
Do you want to display your social activity in SERPs?

2014 Will Still Be Big for SEO

Sites must be crawled efficiently, content must be targeted, and yes, we still want to rank where desired. The focus as we move down the road is more on what vehicles we use off-site to help drive traffic to our sites.
How you use the previously discussed pillars, alongside their continual convergence by Google, will determine how successful your online marketing strategy becomes.
Quick takeaways:
  • Don't build a local listing. Allow your audience to help you build a local presence.
  • Don't build a brand. Build a community, a socialized brand, one that can keep your audience in tune with you in real-time.
  • Don't just optimize a site. Optimize an experience for those that are mobile and content hungry.

Ecommerce Product Pages: How to Fix Duplicate, Thin & Too Much Content

Content issues plague many sites on the web. Ecommerce sites are particularly at risk, largely due to issues that can stem from hosting hundreds or thousands of product pages.
Typical issues with ecommerce product pages are:

  • Duplicate content.
  • Thin content.
  • Too much content (i.e., too many pages).
Left unchecked, these issues can negatively impact your site's performance in the SERPs.
If you run an ecommerce site and you've seen traffic flat-line, slowly erode, or fall off a cliff recently, then product page content issues may be the culprit.
Let's take a closer look at some of the most common content woes that plague ecommerce sites, and recommendations on how you can fix them.

Duplicate Content

There are typically three types of duplicate content we encounter on ecommerce sites:
  • Copied versions of the manufacturer's product descriptions.
  • Unique descriptions that are duplicated across multiple versions of the same product.
  • Query strings generated from faceted navigation.

Copied product descriptions

A large number of ecommerce resellers copy their generic product descriptions directly from the manufacturer's website. This is a big no-no. In the age of Panda, publishing copied or duplicated content across your site will weigh it down in the SERPs like a battleship anchor.

How to fix it

The solution here is to author original product descriptions for every product on your site. If budget is an issue, prioritize and get fresh content written for your highest margin product pages first and work backwards.

Unique yet duplicated product descriptions

With many ecommerce sites, site owners have authored original product descriptions, which is fantastic. Where they run into trouble is when they sell multiple versions of the same product (different sizes, colors, materials, etc.), and each product version has a different page/URL with the same boilerplate description.
Now, even though this content is technically unique to your site (it's not copied from somewhere else), it's only unique to a single page. Every other page it lives on is considered duplicate content.

How to fix it

The solution here is to consolidate multiple product version pages into a single page, with all the different product options listed down the page. Or you can present them as a list in a drop-down menu, like Zappos does.
Product Dropdown Nike Lunarglide
Once you combine all pages into a single page, 301 redirect the other URLs to that page, in the event they've attracted links and/or accrued link equity. The redirects will also help Google sort out the true version of your product page, and can help with any potential crawl budget issues.
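To make that concrete, here's a minimal sketch (in Python, with made-up URLs, and Apache's mod_alias syntax as the assumed output format) that generates one 301 rule per variant URL. It illustrates the mapping only; it's not any platform's actual configuration.

```python
# Hypothetical sketch: generate 301 redirect rules that consolidate
# product-variant URLs onto a single canonical product page.
# URLs and the .htaccess output format are illustrative assumptions.

variant_to_canonical = {
    "/shoes/lunarglide-blue": "/shoes/lunarglide",
    "/shoes/lunarglide-red": "/shoes/lunarglide",
    "/shoes/lunarglide-size-11": "/shoes/lunarglide",
}

def htaccess_rules(mapping):
    """Emit one permanent-redirect rule per variant URL (Apache mod_alias syntax)."""
    return "\n".join(
        f"Redirect 301 {variant} {canonical}"
        for variant, canonical in sorted(mapping.items())
    )

if __name__ == "__main__":
    print(htaccess_rules(variant_to_canonical))
```

On most platforms you'd express the same mapping in the server config or the platform's redirect manager rather than generating it by hand; the point is that every variant URL should answer with a 301 to the one surviving page.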
Depending on the ecommerce platform you're using, consolidating multiple versions of a product page to a single URL can be difficult or impossible. If that's the case, think about moving to an SEO-friendly platform, like Magento or Shopify.

Faceted navigation issues

Many ecommerce sites host category pages with a range of filters to help users easily navigate their site and drill down to specific products, like this Weber Grill page on Home Depot.
Home Depot Faceted Navigation
A faceted navigation menu like the one above can create dozens if not hundreds of query strings that are appended to the URL, thereby creating duplicate versions of the same page. Faceted navigation can be a fantastic UX feature for consumers, but can be problematic for SEO.

How to fix it

There are a few ways to prevent search engines from indexing duplicate content from faceted navigation (the sketch after this list shows how facet parameters multiply URLs):
  • Block faceted pages via the robots.txt file.
  • Parameter handling via Webmaster Tools.
  • Add self-referential canonical tags (rel="canonical"). Note: this may help Google distinguish original from duplicate content, but it won't address crawl budget issues.
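As promised above, here's a small, hypothetical sketch of why faceted navigation multiplies URLs and how stripping the facet parameters collapses the variants back to one page. The parameter names (color, brand, price, sort) are invented for illustration; swap in whatever your navigation actually appends.

```python
# Sketch: faceted-navigation query strings create many URLs for one page;
# dropping the known facet parameters collapses them back to one URL.
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

FACET_PARAMS = {"color", "brand", "price", "sort"}  # assumed facet keys

def canonical_form(url):
    """Drop known facet parameters so duplicate variants map to one URL."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in FACET_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept)))

urls = [
    "https://example.com/grills?color=black",
    "https://example.com/grills?color=black&sort=price",
    "https://example.com/grills?brand=weber",
]
# All three collapse to the same canonical page:
print({canonical_form(u) for u in urls})  # {'https://example.com/grills'}
```

This is also a handy way to audit a crawl export: group crawled URLs by their canonical form, and any group with more than one member is a candidate for parameter handling or a canonical tag.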

Thin Content

Even if a site has 100 percent unique product descriptions, they can often be on the thin side (i.e., a few bullets of text). Now, product pages with light content can still rank well where domain strength supersedes potential thin content issues.
But most sites don't have the backlink profiles of Amazon or Zappos, and I like to think in terms of risk/reward. Thickening up descriptions makes sense because:
  • It can reduce the risk that thin content issues will negatively impact SERP visibility.
  • It adds more content for engines to crawl, which means more opportunities for your page to rank for a wider basket of search queries.
  • It freshens up your page, and freshening up your content can definitely pay dividends with Google.
To audit word count for every page on your site, crawl the site with Screaming Frog and look for potential trouble spots in the "Word Count" column.
Word Count Audit
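If you'd rather script a quick audit than crawl with Screaming Frog, a minimal Python sketch along these lines can flag thin pages. The URL list and the 200-word threshold are assumptions for illustration, not a recommendation from any tool.

```python
# Rough word-count audit: fetch each page, strip scripts/styles,
# and count the words in the visible text.
import requests
from bs4 import BeautifulSoup

THIN_THRESHOLD = 200  # assumed cutoff for "thin" pages

def word_count(url):
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    for tag in soup(["script", "style"]):
        tag.decompose()  # ignore code/styling, count visible text only
    return len(soup.get_text(separator=" ").split())

for url in ["https://example.com/product/1", "https://example.com/product/2"]:
    count = word_count(url)
    flag = "THIN" if count < THIN_THRESHOLD else "ok"
    print(f"{count:>6}  {flag:<4}  {url}")
```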

How to fix it

Some of the ways you can address thin content on your ecommerce product pages include:
  • Enable (and solicit) user reviews and feedback. User-generated content is free and helps thicken up your content with naturally-written text (not "SEO" content). This additional content can help improve potential relevancy scoring, time on page, user engagement levels, and can help the product page rank for a broader basket of search queries. Also, user reviews offer social proof and can improve conversion rates as well.
  • In the previous example, I spoke about condensing multiple versions of the same product to a single page. Doing this would also help thicken up that page, since you'd list all the different dimensions, size variations, and colors available to consumers.
  • Write some additional, original content. You can hire a writer to help thicken up these pages with additional features and benefits, or you can do it yourself. Again, given it could be very costly to thicken up every product page on the site, you can prioritize your highest margin products first.
  • Pulling in mashups of links/text for similar products, product accessories, special offers, and recently viewed items is another way to add more content to a page, and a tactic many larger ecommerce sites, like Amazon.com, use.
Amazon Product Mashups

Too Much Content

Saying that a site has "too much content" may sound contradictory to the issue of having content that's too thin. But when I say an ecommerce site may have too much content, I'm really talking about two distinct issues:
  • Too many product pages.
  • Improper handling of paginated product pages.
Specifically, having too many pages of low value content can cause PageRank and crawl budget problems.

Too many product pages

This is really an addendum to the duplicate content issues posed by faceted navigation or hosting multiple versions of the same product on different pages.
Aside from low value content concerns, hosting a mass of duplicated product pages dilutes your site's PageRank or link equity, which weakens the overall ranking power of your important content.
The other issue pertains to your site's "crawl budget" (i.e., how deep/how many pages Googlebot crawls each time it visits your website). If a large percentage of your site is comprised of duplicate or low value content, you're wasting your budget on junk content and potentially keeping quality pages from getting indexed.

Improper handling of paginated product pages

Another concern with hosting "too many pages" is not handling pagination correctly. Oftentimes, ecommerce sites have product categories containing hundreds or thousands of products that span multiple pages.
Pagination Issues
Like duplicate product pages, excessive paginated results rob link equity from important pages and can hurt your crawl budget.

How to fix it

Some of the ways to address equity dilution or crawl budget issues that can stem from too many product pages include:
  • Rel="next", rel="prev": This markup tells Google to treat ecommerce product listings spanning multiple pages as a logical sequence, thus consolidating link equity (rather than diluting it) across all pages in the series (a sketch of this markup follows the list).
  • Canonicalization: It's effective for consolidating link properties (thus solving equity dilution), but it won't solve potential crawl budget issues, since Googlebot will still crawl all your dupe content.
  • "Noindex, follow": If your goal is to optimize crawl budget and keep duplicates or pagination out of the index, use brute force and block Googlebot via robots "noindex, follow" meta directive.

Monday, October 14, 2013

The Causal Nexus of SEO



Dominos Falling
There are some aspects of online marketing that play a huge role in the bigger picture, but aren't as easy to see. Things like emotion, motivation, awareness, and relationships can be hard to gauge with our usual metrics.
But sometimes the effects of an action aren't evident right away. There are times when we can't associate cause and effect directly. But everything we do in SEO fits into a much bigger chain reaction, and we might not be able to see every piece.
When something doesn't fit our typical measurements it may be easy to write it off entirely. There's actually a word for that: floccinaucinihilipilification.
The exact definition of floccinaucinihilipilification from Dictionary.com is "The estimation of something as valueless." It's actually the longest non-technical word in the English language. Sorry, antidisestablishmentarianism.
Aside from being a semi-useful piece of party trivia, floccinaucinihilipilification is actually a great description of one of the most frustrating aspects of modern SEO. There are just so many things that are easy to dismiss because they are outside of our usual expectations for results.
Search is evolving to a point where we get much less instant gratification. Things take a much less linear path than they did in the past; you can't just walk across the room and turn on the light anymore. You have to use a Rube Goldberg machine to do it.

The Social Part of Social Media

In social media you can measure friends, followers, retweets, circles, referral visits, and sales through unique promotions. There are all sorts of fantastic metrics for judging how well a social campaign is performing. Of course not every one of those translates to visits, or dollars.
If a comment on your wall doesn't result in a sale or if a retweet doesn't improve rankings, then does it matter? Yes.
With social media it's also about the prospect of exposure. It may not be as clearly measurable when someone shares something on Facebook and one of their friends sees it and later searches the brand name. It's not always obvious when retweeting someone's post and getting a "thank you" leads to that person clicking on the Tweeter's site in the SERPs because they recognize the name.
If social media efforts aren't directly impacting your rankings, or the traffic numbers aren't approaching search engine referral proportions, that doesn't mean the campaign isn't working.
A comment on a wall may not mean much on its own. But a comment may lead to a new fan that may lead to a new sharer, who could grow to be an evangelist if the relationship is cultivated.
While direct leads are a possibility from social media, there's more to it than that. It's access to a huge and active audience if you're willing to play to the crowd.

Simple, Single Links

Links are probably one of the hardest places to deal with all of the changes in the last year. Links have been both the salvation and devastation of too many websites.
Bought links, links with keyword anchor text, easy, cheap, unlimited links weren't supposed to work, according to the rules. But they did. So forget the rules; people made money. Except now, best case, they don't work as well, and worst case, they can tank a site.
So now links mean a totally different thing. They aren't as easy to get any more. They don't necessarily go to the pages where products live and links that go to different kinds of content don't always work the same way.
Links with your URL as anchor text probably won't move a site up for its head terms as quickly as a handful of links brandishing keywords used to. So now maybe it's about getting a link from a small community organization instead of 150 directories. But those little links are a much bigger deal now.
It's never going to be the same, but this is where we live now. A link from a person's enthusiast site for a how-to guide may not seem as effective as syndicating an article across 300 sites, but it's real. Things that are authentic may take longer to make themselves felt.

Trust and the Human Factor

Google has shown a continued effort to become scary close to emotional intimacy with the preferences of its users.
Authorship is one indicator of Google's improving efforts to identify individuals as entities. Public signs point toward its increased attempts to incorporate that information into how it evaluates websites. This interest in using real people's association with websites to determine trust should be more than enough to pique our interest in getting on board early.
On the other side, Google also seems to be trying to figure out which sites people trust through their own choices and patterns. That means visitor loyalty isn't just important for repeat sales; the signals it sends can be beneficial for SEO.
Some loyalty is measurable. Getting people to want to return to a site is measurable. We can see when the percentage of repeat users goes up.
We can measure how many people come to a site through subscription based newsletters or email marketing. We can measure when people become regular commenters or forum posters.
But it's hard to measure where those relationships start. Was the first time they came to your site searching for what you sell? Or is it possible it's because they knew you before they needed what you sell?
It isn't always as clear cut as which search word brought you the most visitors, or what was the last click before the sale. Sometimes that sale was months in the making based on a chain reaction that couldn't be tracked.

A More Convoluted Path

Each action that creates a positive connection has value even if it falls outside of our traditional data tracking.
We absolutely have to evaluate numbers, show correlation and prove ROI. That's the job of anyone working in SEO. But trying to optimize within the new system, we've had to get more creative.
It may take time for an initial action to produce a desired end result and there may be 10 steps in between instead of 3. But that doesn't mean it isn't worth it.
So don't immediately floccinaucinihilipilificate an effort in which direct results are a little ambiguous. There may be more at play than is immediately evident.
There's a time to give up on something that isn't working, sure. But make sure you're not comparing more slow-burning efforts to the precedents of the past.
At this point, shortcuts are getting shut down more and more every day, and the long way is about the only option left. So yes, an action might not lead to more rankings, traffic, or sales directly, but that doesn't necessarily mean it didn't work; it may simply be the first domino to fall.

Top Search Result = Poor Ad CTR [Study]

Advertising network Chitika released a study today that showed how ad click-through rates on a website vary when users come to that website from Position 1 in the organic search results versus other positions. Data showed the highest CTR on ads in a website occurred when users found the site from Position 10 in the SERPs.
ctr-by-referring-position
As a follow-up to Chitika's study last summer, which showed how rankings yielded traffic, Chitika said this is a stark contrast in terms of ad performance.
"What is clear from the data set is that although the first position of a Google search result drives the most search traffic, an average visitor coming from that link is the least likely to convert into an ad click," according to Chitika.
Chitika said the reason why Position 10 might be driving the most ad CTR on a site could be due to unsatisfactory results.
"When a user scrolls down and clicks on a link at Position 10, it is more likely that they have not found what they were looking for, increasing the probability of that person clicking on an ad related to their search query," Chitika said.
Chitika said that marketers shouldn't necessarily be vying for 10th position on every keyword, but that in terms of driving ad revenue, it's not a bad place to be overall.
google-results-page-rank-average-traffic-share-chart
"On a popular search term, 2.4 percent of potential visitors still represents a sizable audience, and by being the number 10 result, it's likely a site will see higher ad revenues," Chitika said in its report. "However, for lower volume or specialized search terms, ranking as high as possible will help in attracting the largest audience, since the proverbial 'pie' of users on those terms is already fairly small, along with the potential revenue impact of higher visitor CTRs."
So what's a marketer to do with this data? Cristian Potter, a data solutions engineer at Chitika, said it's important to note that this report examines aggregate traffic trends, and may not apply to groups of sites.
"Hitting the sweet spot requires some analysis of an individual site's traffic, for example, understanding how users are finding the site, and how certain campaigns have impacted actions undertaken by users on the site itself," Potter said. He added that this research can serve as a "as a point of reference in plotting metrics and key performance indicators."
While the data seemed to show an interesting relationship, sites that go after ad revenue have a seemingly delicate balance of providing a great user experience and making money. Not having the most relevant content (Position 10 versus Position 1) and subsequently driving users away through an ad doesn't seem like a great idea, either.
Potter agreed.
"User experience is always a key consideration when it comes to deciding on the number and placement of ad units," Potter said. "This also ties in with expected CPM on each ad unit - it should be worth the site's while to place an ad in a prime position. However, this study was solely behavior focused. The characteristics of the one or more ad units on each site within the sample will have varied considerably."

New AdWords Estimated Total Conversions Tracks Consumer Purchases Across Devices

Starting today and over the next few weeks, Google AdWords will roll out a major reporting update to conversion tracking called Estimated Total Conversions. This feature provides estimates of conversions that take multiple devices to complete and adds this data to the conversion reporting we see today.
Following the launch of enhanced campaigns this year, search advertisers have combined mobile and desktop with the ability to further modify bids by mobile and other targeting factors. One gap in reporting and comprehension of the campaigns' effectiveness has been the limited data on how consumers are navigating and converting via multiple device options.

What is a Cross-Device Conversion?

What is a Cross-Device Conversion
Consumers' constant connectivity has enabled them to browse, shop, and interact with businesses on the go and from multiple devices.
A September 2013 Google study found that more than 90 percent of multi-device consumers move sequentially between several screens, like mobile to desktop, or mobile to tablet, to complete a transaction online. Google found that a high percentage of converters actually jumped from desktop to desktop too, presumably a work desktop to a home desktop computer.

How Estimated Total Conversions Works

Measuring AdWords Conversions in a Multi-Screen World
Google calculates cross-device conversions for a particular advertiser based on how their customers convert when they are logged in. It then uses this as the basis for extrapolating to the complete data set, forming an estimate of what total cross-device conversions might look like. This data is used only in aggregate and is not personally identifiable.
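Google hasn't published the exact math, but the described approach reads like straightforward extrapolation. Here's a back-of-envelope sketch with invented numbers, purely to illustrate the logic rather than Google's actual formula:

```python
# Hypothetical extrapolation: measure the cross-device rate among
# logged-in converters, then apply that rate to all conversions.
# All numbers below are made up for illustration.

logged_in_conversions = 10_000   # conversions from signed-in users
logged_in_cross_device = 1_200   # of those, how many crossed devices
total_conversions = 50_000       # all reported conversions

cross_device_rate = logged_in_cross_device / logged_in_conversions  # 0.12
estimated_cross_device = total_conversions * cross_device_rate

print(f"Estimated cross-device conversions: {estimated_cross_device:,.0f}")
# Estimated cross-device conversions: 6,000
```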

What's Next?

Estimating conversions across devices (estimated cross-device conversions) is only the beginning, and just one conversion type Google intends to measure.
In the future, Google plans to incorporate other conversion types, such as phone calls and store visits, where advertisers are hungry to gain new insights into how their advertising is working.

Link Building 101: Competitor Analysis

Link Building 101 Competitor Analysis
Link building is something anyone can accomplish. There's no great secret, just hard work, creativity, and determination to get links that matter.
When you're looking for some practical link building opportunities that will help you find and acquire quick, yet quality, links, there are five "quick wins" you should explore at the beginning of a link building campaign:
  1. 404 Pages and Link Reclamation
  2. Competitor Analysis
  3. Fresh Web Explorer/Google Alerts
  4. Local Link Building
  5. Past/Current Relationships

Competitor Analysis/Backlink Profile

Competitor analysis is an integral step in any link building campaign. Why? Because running a backlink analysis on a competitor:
  • Teaches you about the industry:
    • Gives you a sense of which sites within the vertical are providing links
  • Helps you understand your competitors, including:
    • Their link profile, and why they're ranking
    • Their strategies used to acquire links
    • Their resources that didn't acquire many links
  • Gives you a list of obtainable links (if they can, why not you?)
Competitor backlink analysis is great – you get the initial research into the industry done, it helps you understand the competition, and it gives you a tidy list of high opportunity links.
So, let's dive into the how of competitor backlink analysis:
  1. Make a list of competitors
    • Direct
    • Indirect
    • Industry influencers
    • Those ranking for industry money keywords
    • Watch fluctuations – who's winning and who's losing
  2. Take those competitors and run their sites through a previously mentioned backlink tool (OSE, Majestic, Ahrefs, CognitiveSEO, etc.)
  3. Backlink Analysis
  4. Download the top 3-4 competitors' backlinks into CSVs. Combine them into a single Excel sheet, removing duplicates, and find obtainable quality links already secured by competitors.
Steps 2 and 3 were previously covered in "Link Building 101: How to Conduct a Backlink Analysis", and step 1 is pretty self-explanatory.
To recap the advice for these steps:
  • Don't phone in the list of competitors. Spend time doing research and investigation, giving yourself a well-thought-out and well-understood list of potential competitors.
  • Information you should be examining in a backlink analysis:
    • Total number of links
    • Number of unique linking domains
    • Anchor Text usage and variance
    • Fresh/incoming links
    • Recently lost links
    • Page Performance (via top pages)
    • Link quality (via manual examination)
  • Additionally, think creatively while looking through competitors' backlinks. Think about:
    • Which resources/pages performed well
    • Which resources/pages performed poorly
    • Commonalities in competitors' link profiles
    • Differences in competitors' link profiles
    • Strategies likely used to acquire links

How to Find Obtainable Quality Links

So, that takes us to Step 4: downloading competitors' links into CSVs, combining them in Excel, and drilling down into the data to find worthwhile links and insights.
Honestly, SEER has done an amazing job of writing a very easy-to-follow guide for Competitor Backlink Analysis in Excel.
To summarize their steps, you:
  • Download CSVs of competitors' backlink portfolios ("Inbound Links" will give you a list of all the pages linking; "Linking Domains" will give you only the domains).
    • Note: if you're unfamiliar with your own (or client's) backlink portfolio, you may wish to include their backlink portfolio in this process for reference.
    • Using OSE, don't forget to filter to the whole domain:
Pages on this root domain export to CSV
  • Open the CSVs and combine (copy and paste) all the data into a single Excel sheet.
  • Filter down to clean URLs, keeping the originals intact.
    • Move Column J (target URL) to Column P (to be the last column)
Move Column
    • Delete Column J (the now empty column)
Delete Empty Column
    • Duplicate the URL and Target URL columns on either side
Duplicate URL Target URL columns
    • Remove http:// and www. from both column A and column P: select the column, press Ctrl+H (the find-and-replace shortcut), type in what you want to find (http:// and www.), and replace it with nothing (by leaving the second line blank).
Remove http and www
    • You might want to rename columns A and P at this point. Call them "bare URL" and "bare target URL", or whatever you desire (in the SEER article they were called "clean").
  • Remove duplicates
Remove Duplicates
    • Make sure it's only for column A (bare URL) and P (bare target URL)
Remove Duplicates URL
Notice the check mark on "My data has headers". This is important to keep your data from being jumbled up. Anytime you're removing duplicates make sure this box is checked.
This will give you a complete list of stripped URLs next to the full URL linking (along with the rest of the important information provided by OSE) and a list of full target URLs next to a complete list of stripped target URLs.
Note: you'll still likely have a lot of duplicate URLs in column A (the linking URLs) at this point. This is because there are multiple links on the same page going to different landing pages – which is potentially important information (it shows a competitor acquired multiple links per page).
If you'd like to delete these multiple-link pages/URLs to reduce data noise, highlight column A and run "Remove Duplicates" again, making sure the "My data has headers" box is checked:
Remove Duplicates Bare URLs
Now, you'll be down to unique URLs (pages, not domains if you've used Inbound Links) linking to competitors. If you're looking for only referring domains, you should start back at step 1 and download a CSV of referring domains, as opposed to all links.
At this point, you're still dealing with a lot of data, so you'll want to filter it further. I recommend filtering by domain authority to see the most authoritative links first.
Filter Domain Authority
This will order your list from highest domain authority to lowest – pretty useful information. Keep in mind, however, that domain authority is thrown off by any subdomains hosted on a popular site – example.wordpress.com, example.blogspot.com, etc.
So, don't take the domain authority as absolute – you'll need to verify.
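If you'd rather script the combine/clean/dedupe/sort workflow than do it in Excel, here's a rough pandas equivalent. The file names and column headers (URL, Target URL, Domain Authority) mirror a typical OSE export but are assumptions, so check them against your actual CSVs.

```python
# The same combine/clean/dedupe workflow scripted in pandas instead of Excel.
import re
import pandas as pd

def bare(url):
    """Strip the protocol and www., like the Ctrl+H step above."""
    return re.sub(r"^https?://(www\.)?", "", str(url))

frames = [pd.read_csv(f) for f in
          ["competitor1.csv", "competitor2.csv", "competitor3.csv"]]
links = pd.concat(frames, ignore_index=True)

links["bare URL"] = links["URL"].map(bare)
links["bare target URL"] = links["Target URL"].map(bare)

# Remove duplicates on the bare columns, then sort by Domain Authority.
links = links.drop_duplicates(subset=["bare URL", "bare target URL"])
links = links.sort_values("Domain Authority", ascending=False)

links.to_csv("combined_backlinks.csv", index=False)
print(links[["bare URL", "Domain Authority"]].head(10))
```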
There are also a few other filters you can use to find interesting data:
  • Page Authority (PA)
  • Anchor Text
  • Number of domains linking (shows best ranking pages - don't get stuck on home pages)
Take time and play around with the data. Look through the top DAs (manually excluding anything artificially inflated), then PAs; check out top performing pages via number of domains linking, and even play around with filtering the anchor text.
This should be the fun part: the analysis. You've filtered the data down to a semi-digestible level, and should start taking advantage of it to find insights and understand your competitors' links.
Remember, any links your competitors have should be considered fair game for you. Once you've determined quality links from domains you haven't secured, look into each link and pursue it appropriately.

More Insights

If you're looking for even deeper (and more advanced) data insights, you can move all this information into pivot tables. Simply select all rows, click over to the Insert tab, and select "PivotTable":
Insert Pivot Table
Once here you have the option to choose which fields you'd like to further examine:
Pivot Table Fields to Add
Playing with this data should reveal potential insights, although we're getting a bit beyond Link Building 101.
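For those who prefer scripting, here's a rough pandas equivalent of the pivot-table step: counting linking pages and averaging Domain Authority per linking domain. The column names are assumed to match the combined sheet sketched earlier, and the domain extraction is a crude heuristic.

```python
# A pandas take on the Excel pivot-table step.
import pandas as pd

links = pd.read_csv("combined_backlinks.csv")
# Rough heuristic: the linking domain is everything before the first slash.
links["linking domain"] = links["bare URL"].str.split("/").str[0]

pivot = pd.pivot_table(
    links,
    index="linking domain",
    values="Domain Authority",
    aggfunc=["count", "mean"],
)
# Domains supplying the most linking pages first.
print(pivot.sort_values(("count", "Domain Authority"),
                        ascending=False).head(20))
```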
Furthermore, if you want to really dive into pivot tables (or Excel in general), I can't recommend Annie Cushing enough. Check out her Moz article "How to Carve Out Marketing Strategies by Mining Your Competitors' Backlinks".