Beware the Panda. According to a tweet from the official @Google
Twitter account this morning, a new data refresh is rolling out today.
This update, according to the notice, should affect only 1.2 percent of English-language queries. No other information is available so far.
This is the first Panda data refresh of 2013. It also marks the third consecutive month of Panda data updates.
The first Panda update rolled out nearly two years ago, in February 2011. Google's stated goal for Panda is to reward "high-quality sites."
While Google has never formally defined what a "high-quality site" is, it did publish its own list of bullet points in a blog post from early 2011. The rationale has always been the same: to surface more high-quality sites in search.
Google Goes Boom on Low-Quality Sites...So They Say
Chances are good that you or someone you know has seen ranking changes today as Google rolled out a new algorithmic update. It follows the recent announcements aimed at "low-quality sites" (which many interpret to mean content farms); less than two weeks ago, Google said it was exploring new methods to detect spam.
"This update is designed to reduce rankings for low-quality sites--sites which are low-value add for users, copy content from other websites or sites that are just not very useful. At the same time, it will provide better rankings for high-quality sites--sites with original content and information such as research, in-depth reports, thoughtful analysis and so on," Google announced last night.
No one can say this one came out of left field. Google launched an algorithm tweak in January to combat spam and scraper sites, though that affected a much smaller number of sites.
This IS a big one. We're talking 11.8 percent of queries affected across the board. Now, the big question is: did they do it right?
From the looks of it, Google is not simply devaluing sites serving duplicated content; it appears to be going after sites with specific types of backlinks, and even spying through Chrome extensions, and this is only within the first 24 hours! More will become clear once site owners see drastic changes in their traffic stats.
As with every major Google update, SEO forums are dedicating threads to this, and they are filling up fast with reactions and reports. Since BackLinkForum.com tends to attract the skilled gray/blackhat crowd, and because this update is only happening in the U.S. (for now), BLF is a great place to see what is really happening down in the trenches.
Two possibilities worth noting from those threads:
- Sites whose backlink profiles consist mostly of profile links could be a target (a quick way to check this against your own link data is sketched after this list).
- Not every content farm was red-flagged. This may be a follow-up to the scraper update that also targets the bigger content farm sites, similar to the recent Blekko update.
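To make the first point concrete, the check an affected site owner would run is simply the share of profile links in their backlink data. Here is a minimal sketch assuming a hypothetical CSV export with a "link_type" column; the file name and column are illustrative, not any particular tool's real format:

```python
import csv
from collections import Counter

def profile_link_share(path: str) -> float:
    """Fraction of exported backlinks whose type is 'profile'."""
    with open(path, newline="") as f:
        types = Counter(row["link_type"].strip().lower()
                        for row in csv.DictReader(f))
    total = sum(types.values())
    return types["profile"] / total if total else 0.0

# Hypothetical export file; replace with your own backlink data.
share = profile_link_share("backlinks.csv")
if share > 0.5:  # "majority of the backlink profile," per the observation above
    print(f"{share:.0%} of links are profile links, the pattern this update may target")
```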
Although coming to a conclusion about the update within 24 hours is extremely risky, I would be willing to bet that this is targeting self-service linking as much as content farms.
However, sites like eHow, Answers.com, and even low-level scraper sites still seem to be saturating the SERPs. That leaves me asking, "Who was penalized then?"
As with any Google algorithm change, some innocent sites are going to be slammed. Some SEOs have reported seeing 40 percent traffic drops to their sites.
This latest update may just be more evidence that Google simply can't distinguish between "good" and "bad" content.
Let us know what you're seeing today -- the good, bad, and disastrous.
Blekko Removes Content Farms From Search Results
In an effort to combat web spam, Blekko will block from its search results 20 of the worst-offending, SERP-clogging content farms, including Demand Media's eHow and Answerbag, TechCrunch reports. The list of barred sites is as follows:
- ehow.com
- experts-exchange.com
- naymz.com
- activehotels.com
- robtex.com
- encyclopedia.com
- fixya.com
- chacha.com
- 123people.com
- download3k.com
- petitionspot.com
- thefreedictionary.com
- networkedblogs.com
- buzzillions.com
- shopwiki.com
- wowxos.com
- answerbag.com
- allexperts.com
- freewebs.com
- copygator.com
Blekko seems to be taking spam seriously. Last month, the newest search engine introduced the spam clock, which claims that 1 million new spam pages are created every hour. As of this morning, the total number of spam pages was at 750 million and counting (though Blekko admits the clock is "illustrative more than scientifically accurate").
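For what it's worth, the clock's numbers hang together: at 1 million new pages an hour, a counter started about a month ago would read right around 750 million. A quick back-of-the-envelope check, using only the figures quoted above:

```python
# Sanity check on Blekko's spam clock figures as quoted above.
pages_per_hour = 1_000_000     # claimed rate of new spam pages
total_pages = 750_000_000      # the counter's reading "as of this morning"

hours = total_pages / pages_per_hour
print(f"{hours:.0f} hours, or about {hours / 24:.1f} days")
# 750 hours is roughly 31 days, consistent with the clock launching "last month".
```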
The reasoning for the spam clock, according to Blekko CEO Rich Skrenta:
"Millions upon millions of pages of junk are being unleashed on the web, a virtual torrent of pages designed solely to generate a few pennies in ad revenue for its creator. I fear that we are approaching a tipping point, where the volume of garbage soars beyond and overwhelms the valuable of what is on the web."So Blekko seems to be doing its small part for cleaning up its own search results.
Meanwhile, Google has also announced an algorithm change to combat spam. But as Mike Grehan notes in his column today, "The Google Spam-Jam," spam "is a problem that Google has had from day one and it's not likely to go away anytime soon" with its current search model.
Google's War on Spam Begins: New Algorithm Live
Google's Matt Cutts today announced the launch of a new algorithm intended to better detect and reduce spam in Google's search results and to lower the rankings of scraper sites and sites with little original content. Google's main target is sites that copy content from other sites and offer little useful, original content of their own. Posting on Hacker News, Cutts wrote:
"The net effect is that searchers are more likely to see the sites that wrote the original content. An example would be that stackoverflow.com will tend to rank higher than sites that just reuse stackoverflow.com's content. Note that the algorithmic change isn't specific to stackoverflow.com though."On his blog, Cutts wrote:
"This was a pretty targeted launch: slightly over 2% of queries change in some way, but less than half a percent of search results change enough that someone might really notice. The net effect is that searchers are more likely to see the sites that wrote the original content rather than a site that scraped or copied the original site's content."
Cutts said the change was approved last Thursday and launched earlier this week. He had announced Google's intention to step up the fight against spam in an Official Google Blog post last Friday.
In response to criticism that Google's results were deteriorating and showing more spam in recent months, Cutts said a newly redesigned document-level classifier will better detect repeated spammy words, such as those found in "junky" automated, self-promoting blog comments. He also said that spam levels today are much lower than they were five years ago.
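Google hasn't published how that document-level classifier works, but the general idea of flagging documents dominated by repeated spammy terms is easy to illustrate. Here is a minimal toy sketch under invented assumptions (the term list, threshold, and scoring are placeholders, not Google's actual signals):

```python
import re
from collections import Counter

# Toy document-level spam check: flag a document when repeated "spammy"
# terms make up a large share of its tokens. Both the term list and the
# threshold are invented for illustration; Google's real signals are unpublished.
SPAM_TERMS = {"cheap", "viagra", "casino", "pills", "loans"}
SPAM_THRESHOLD = 0.05  # flag if more than 5% of tokens are spam terms

def spam_score(document: str) -> float:
    """Fraction of the document's tokens that match known spammy terms."""
    tokens = re.findall(r"[a-z0-9']+", document.lower())
    if not tokens:
        return 0.0
    counts = Counter(tokens)
    return sum(counts[term] for term in SPAM_TERMS) / len(tokens)

# A "junky" automated blog comment repeats the same promotional words:
comment = "great post! buy cheap pills here, cheap pills, casino casino bonus"
print(f"score={spam_score(comment):.2f}, spammy={spam_score(comment) > SPAM_THRESHOLD}")
```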
At WebmasterWorld, there is discussion of big drops in traffic. Are you seeing any changes as a result of this update?