Friday, January 25, 2008

6. Inbound Links (IBL) Repetition

Inbound links, as the term is used here, are links pointing to pages within the same domain. Every website needs them, if only in its main menu, since without them there would be no proper navigation. This is why Google does not ignore these links entirely, even after the Jagger Update. Inbound or navigational links are still picked up and followed by the search engine spider when it indexes your pages. That is the good news. The bad news, at least for spammers, is that Google no longer rates them for PageRank purposes. And if you overuse inbound links by giving them unnaturally varied positions, link names or keyword phrases, or by repeating them too often, they may even be treated as link spamming.

Solution:
One or two duplicate inbound links to the same page are still O.K. Google knows that one could be a text link, e.g. at the bottom of your page, and the other an image link, e.g. within the left-hand main navigation menu. But do not go beyond this limit, to be on the safe side.
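To see how many times each page is actually linked from a given page, here is a minimal sketch using only the Python standard library. It is not Google's algorithm, just a quick self-check; it treats relative URLs as internal links, which is a simplification.

```python
# Count how often each internal URL is linked on a page, so you can spot
# targets linked more than the two times suggested above.
from html.parser import HTMLParser
from collections import Counter

class InternalLinkCounter(HTMLParser):
    """Counts <a href="..."> occurrences per target URL."""
    def __init__(self):
        super().__init__()
        self.counts = Counter()

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            # Treat relative URLs as internal links (a simplification).
            if href and not href.startswith(("http://", "https://")):
                self.counts[href] += 1

page = """
<a href="/home">Home</a>
<a href="/home"><img src="logo.gif" alt="Home"></a>
<a href="/home">Back to homepage</a>
<a href="/contact">Contact</a>
"""

parser = InternalLinkCounter()
parser.feed(page)
# Anything linked more than twice goes over the limit discussed above.
overused = {url: n for url, n in parser.counts.items() if n > 2}
print(overused)
```

Running this flags `/home`, which is linked three times, while the single `/contact` link passes.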

5. Outbound Link (OBL) Relevancy

Or better say outbound link (OBL) irrelevancy! This means linking to irrelevant websites or categories. After the Jagger Update this is simply a waste of time and money.

Solution:
Find relevant categories related to the content and topics of your website and ask for placements - both paid and free, depending on your budget - within these categories. There are hundreds or even thousands of paid and free directories out there waiting for the submission of your website. Always remember to submit to the most appropriate category and avoid submission to FFA link lists, link farms or banner farms (see above).

4. Abusive Use of Reciprocal Link Exchanges (RLE)

In general there is nothing to be said against link exchanges, as long as they are not done for the sole purpose of cheating the search engine spiders into better rankings. An honest exchange of links between two sites, because their webmasters feel that each other's content is a good supplement to their own, is nothing to be ashamed or worried about. Google therefore has obviously decided not to ignore link exchanges entirely for ranking purposes. But their overall value has decreased; since the Jagger Update they are not weighed as heavily as before. Some reciprocal links, or so-called cross link exchanges, have been detected as black hat SEO and are now completely ignored or even rated as link spamming. These are sites known to be created for the sole purpose of exchanging links, as well as many FFAs (Free For All link lists), link farms, link selling or link renting schemes and so on.

Solution:
Keep away from, or at least be very careful about, exchanging links with suspicious FFAs, link farms or link exchange sites. They are more likely to hurt your website's ranking than to benefit it. Stop or scale back your RLE link building strategy. Try to build one-way links through partnership programs, web and niche directories, relevant category listings and relevant industry portal registrations. Writing your own articles containing a link to your website and submitting them to the free article directories mentioned above can also be of great help for link building. Every time an article of yours is picked up and distributed by another webmaster, the link to your site will be included with it. All for free, and without you having to link back. Make sure to write useful, high-quality articles without spelling mistakes, so that they are republished more frequently. Also allow other webmasters to modify your content, as long as they leave your link active, in order to avoid the search engines' duplicate content filters.

Wednesday, January 23, 2008

3. Pirate Sites or Pirate Matter

Pirate sites and pirate matter are sites which mainly consist of content stolen - usually automatically - from other sites, without the permission of the original author or webmaster. A mirror copy of the same domain has also been rated as spam. Sometimes this is a dangerous game even for honest, straightforward webmasters. For example, I run a website with two domains leading to it (one with a hyphen within the domain name, one without). Of course my intention was not to spam, but to have both very similar domain names pointing to the same content, for obvious legal reasons. Well, unfortunately this was enough for Google to rate it as spam, and I had a lot of trouble getting out of that mess, I can tell you!

Solution:
If you cannot manage to keep up with regular fresh content, go to free content and article distribution websites, from which you can grab fresh content at no or low cost. Using content from there is totally legal as long as you leave all links within these free articles intact and active. It is a simple deal with advantages for both the author and the webmaster copying the content offered. Some good sites with free articles and content for legal download are http://www.goarticles.com/ , http://www.az-articles.com/ or http://www.dreamicles.com/ , to mention only a few.
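To get a feel for how a duplicate content filter can work, here is a rough sketch: compare two pages by the overlap of their word "shingles" (short word sequences). This is only an illustration of the general idea, not Google's real filter, and the sample texts are made up.

```python
# Jaccard similarity over word shingles: identical pages score 1.0,
# unrelated pages score near 0.0.
def shingles(text, n=3):
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity(a, b, n=3):
    """Jaccard similarity of the two pages' shingle sets (0.0 .. 1.0)."""
    sa, sb = shingles(a, n), shingles(b, n)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

original = "free articles and content for legal download at no or low cost"
copy = "free articles and content for legal download at no or low cost"
rewrite = "webmasters may modify the content as long as the link stays active"

print(similarity(original, copy))     # identical text scores 1.0
print(similarity(original, rewrite))  # no shared shingles
```

This is why letting webmasters modify your article (while keeping your link) helps: even modest rewording changes the shingles and lowers the similarity score.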

2. Detection of CSS Spam

The bad thing about black hat SEO is that some people always think they are too clever. The good thing about search engine algorithms is that they are updated more or less regularly and are constantly getting better at tracking black hat SEO down. As a result, many of these black hats find themselves or their websites penalized or even banned by the search engines. One of the latest search engine "tricks" was CSS spamming: CSS rules created solely to mislead both search engine bots and human visitors. Playing such games with CSS is no longer worth the effort, at least since the recent Google Jagger Update. The Google robot will now detect, with pretty good certainty, spam CSS leading to invisible text, which it will by all means rate as SPAM.

Solution:
Use CSS and CSS style sheets for better formatting of your website, not to mislead search engine robots. Besides, I still cannot understand why so many webmasters try to hide their text away. Write good, honest content and make it openly visible to your visitors and to the search engine spiders. The result is even better than spamming, because you will not mislead anyone, and both robots and humans will rate your site as more valuable. And all this without the constant danger of getting caught and penalized on any other day.
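The most obvious hidden-text tricks can be spotted mechanically. The toy scanner below looks for a few telltale patterns in inline style attributes; real detection (external stylesheets, text color matching the background, off-screen positioning) is far more involved, and the pattern list here is my own illustrative selection.

```python
# Scan inline style attributes for common hidden-text patterns.
import re

HIDDEN_PATTERNS = [
    r"display\s*:\s*none",
    r"visibility\s*:\s*hidden",
    r"font-size\s*:\s*0",
    r"text-indent\s*:\s*-\d{4,}px",  # text pushed far off-screen
]

def flag_hidden_text(html):
    """Return the style declarations that look like hidden-text tricks."""
    hits = []
    for style in re.findall(r'style\s*=\s*"([^"]*)"', html, re.IGNORECASE):
        for pattern in HIDDEN_PATTERNS:
            if re.search(pattern, style, re.IGNORECASE):
                hits.append(style)
                break
    return hits

page = ('<div style="display:none">keyword keyword keyword</div>'
        '<p style="color: #333">Honest, visible content.</p>')
print(flag_hidden_text(page))
```

Running this flags only the `display:none` block; the honestly styled paragraph passes, which is exactly the point of the advice above.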

1. Detection of Blog Spam

A spam blog has been defined as a blog which is not updated more or less regularly and which has been created for the sole purpose of promoting another website or product.

Solution:
Create blogs and maintain them regularly. Do not always use the same links to your own or other websites (including affiliate links). Try not to use your blogs as spam. Try to use them as a useful addition to your website, to add community value for your visitors. Use your blogs as a guideline or help area rather than filling them with useless content or, even worse, using them as a dull promotional platform only.

Tuesday, January 22, 2008

Jagger, Updates and Solutions! The Google Jagger Update. Improving efforts from Google!

Google's Jagger Update was the most frightening of all Google updates so far for the SEO community. It has also been the update that took the longest time so far, carried out in various steps over an extended period. The search engine algorithm update affected almost all websites, especially those targeting very competitive keywords or keyword phrases. A vast majority of webmasters, website owners and search engine optimizers (SEOs) did not like the Jagger Update very much, because many of them had to completely rethink and change their SEO strategies.

Google today is the major search engine. Personally, I like Google, and not only because of its popularity and search volume. I like Google for its continuing efforts to maintain and improve its search engine quality and the relevance of its search results. Google, after all, is a popular search engine, not a piece of SEO playground. In terms of internet search and research, Google always comes up with new ideas and innovative approaches. The recent updates of Google's search algorithms have only been further steps towards offering even better and more relevant search results to the search engine's users. This is only legitimate.

In the following I will try to provide a list of elements that were implemented in the most recent Jagger Update:

1. Detection of Blog Spam
2. Detection of CSS Spam
3. Pirate Sites or Pirate Matter
4. Abusive Use of Reciprocal Link Exchanges (RLE)
5. Outbound Link (OBL) Relevancy
6. Inbound Links (IBL) Repetition
7. Redirect Domains or Redirect Pages
8. Keyword Stuffing within ALT-Tags
9. Hidden Links and Hidden Text
10. Top Domain PageRank (PR) Weight

Google had a lot of threats to deal with, mostly quite rightfully, but let's hope they do the job right without causing too much collateral damage to innocent webmasters and honest website owners. For example, I found that they are blocking and deleting lots of blogs run on Google-owned blogger.com for spam reasons, but without any way for you to find out or understand why these blogs were judged to be spammy. It seems Blogger is overdoing things here, hitting lots of innocent bloggers as well. But let's go back and take a closer look at the individual points of the recent Google Jagger Update as listed above. I have tried to give a decent description of every issue, and I also try to give you some hints about how to avoid getting trapped or penalized by Google's updated filters.

Monday, January 21, 2008

Top Webmaster Forums

Webmaster Discussion Forums
Web Talk Forums is a revenue sharing webmaster discussion forum.

SEO.com Search Engine Optimization Forums
Discuss the various aspects of SEO at SEO.com's search engine optimization forums.

Digital Point Webmaster Forum
Search Engine Optimization and Marketing forum.

Search Engine Marketing Forum
Search engine marketing forum and community.

Webmaster Lighthouse Webmaster Forums
Webmaster Discussion Forums.

Webmaster Forum at V7N
Webmaster Community Forum.

Friday, January 18, 2008

Did you know that every page of your Web site stands on its own?

Every page should have a unique title, description, and keyword tag. The description tag should describe ONLY that page.

The keyword tag should include keywords for just that page.

Include 5-6 keywords, including the main keyword phrase and synonyms of that keyword phrase.

Don't make the mistake of including every keyword that could possibly describe what your site is about in your keyword tag. Make your keyword meta tag specific for each page.

The keyword tag holds very little importance anyway, but be sure to make it page specific. FOCUS!
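A quick way to audit this across a site is to collect each page's tags and look for values that repeat. The sketch below uses made-up page data to show the idea; plug in your own pages' titles and descriptions.

```python
# Find title/description values that are reused across pages - each page
# should have its own.
pages = {
    "index.html":   {"title": "Widget Shop", "description": "Buy widgets online."},
    "about.html":   {"title": "About Us",    "description": "Who we are and what we do."},
    "contact.html": {"title": "Widget Shop", "description": "Buy widgets online."},
}

def duplicate_tags(pages, tag):
    """Return values of the given tag that appear on more than one page."""
    seen = {}
    for page, tags in pages.items():
        seen.setdefault(tags[tag], []).append(page)
    return {value: names for value, names in seen.items() if len(names) > 1}

print(duplicate_tags(pages, "title"))
print(duplicate_tags(pages, "description"))
```

Here `contact.html` reuses the index page's title and description, so both show up as duplicates to fix.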