Wednesday, January 23, 2008

2. Detection of CSS Spam

The bad thing about black hat SEO is that some people always think they are too clever. The good thing about search engine algorithms is that they are updated more or less regularly and are constantly getting better at catching black hat SEO. As a result, many of these dark players find themselves or their websites penalized or even banned by the search engines. One of the targets of the latest search engine updates has been CSS spam. CSS spam means style sheets created for the sole purpose of misleading both search engine bots and human visitors. Playing such games with CSS is no longer worth the effort, at least since the recent Google Jagger Update. The Google robot will now detect with pretty good certainty any CSS that leads to invisible text, and it will rate that text as spam.

Solution:
Use CSS and style sheets to format your website better, not to mislead search engine robots. Besides, I still cannot understand why so many webmasters try to hide their text. Write good, honest content and make it openly visible to your visitors and to the search engine spiders. The result is even better than spamming, because you mislead no one, and both robots and humans will rate your site as more valuable. And all this without the constant danger of getting caught and penalized some day.
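
To make this concrete, here is a minimal sketch of the kind of hidden-text CSS the updated filters reportedly flag, next to an ordinary, legitimate use of CSS (the keywords, colors and class name are made up for illustration):

<!-- Spam: text hidden from visitors but still readable by robots -->
<div style="display: none;">cheap widgets cheap widgets cheap widgets</div>
<p style="color: #ffffff; background-color: #ffffff;">more stuffed keywords</p>

<!-- Legitimate: CSS used only to format visible content -->
<style>
  .intro { font-size: 1.2em; line-height: 1.5; color: #333333; }
</style>
<p class="intro">Honest content that both visitors and spiders can read.</p>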

1. Detection of Blog Spam

A spam blog has been defined as a blog that is not updated regularly and that has been created for the sole purpose of promoting another website or product.

Solution:
Create blogs and maintain them regularly. Do not always use the same links to your own or other websites (including affiliate links). Try not to use your blogs as spam; use them as a useful addition to your website that adds community value for your visitors. Use your blogs as a guideline or help area rather than filling them with useless content or, even worse, using them as a dull promotional platform only.

Tuesday, January 22, 2008

Jagger, Updates and Solutions! The Google Jagger Update. Improving efforts from Google!

Google's Jagger Update was the most frightening of all Google updates so far for the SEO community. It has also been the update that took the longest time: it was carried out in several steps over a longer period. The search engine algorithm update affected almost all websites, especially those targeting very competitive keywords or keyword phrases. A vast majority of webmasters, website owners and search engine optimizers (SEOs) did not like the Jagger Update very much, because many of them had to completely rethink and change their SEO strategies.

Google today is the major search engine. Personally, I like Google, and not only because of its popularity and search volume. I like Google for its continuing efforts to maintain and improve the quality of its search engine and the relevance of its results. Google, after all, is a popular search engine, not a piece of SEO playground. In terms of internet search and research, Google always comes up with new ideas and innovative approaches. The recent updates of Google's search algorithms are just further steps toward offering even better and more relevant results to the search engine's users. This is only legitimate.

In the following I will try to provide a list of elements that have been implemented with the most recent Jagger Update:

1. Detection of Blog Spam
2. Detection of CSS Spam
3. Pirate Sites or Pirate Matter
4. Abusive Use of Reciprocal Link Exchanges (RLE)
5. Outbound Link (OBL) Relevancy
6. Inbound Links (IBL) Repetition
7. Redirect Domains or Redirect Pages
8. Keyword Stuffing within ALT-Tags
9. Hidden Links and Hidden Text
10. Top Domain PageRank (PR) Weight

Google had a lot of threats to deal with, mostly quite rightfully, but let's hope they do the job right without causing too much collateral damage to innocent webmasters and honest website owners. For example, I found that they are blocking and deleting lots of blogs hosted on Google-owned blogger.com for spam reasons, without any way to find out or understand why these blogs had been judged spammy. It seems Blogger is overdoing things here and hitting lots of innocent bloggers as well. But let's go back and take a closer look at the individual points of the recent Google Jagger Update as listed above. I have tried to give a decent description of every issue, and I also try to give you some hints on how to avoid getting caught or penalized by Google's updated filters.

Monday, January 21, 2008

Top Webmaster Forums

Webmaster Discussion Forums
Web Talk Forums is a revenue sharing webmaster discussion forum.

SEO.com Search Engine Optimization Forums
Discuss the various aspects of SEO at SEO.com's search engine optimization forums.

Digital Point Webmaster Forum
Search Engine Optimization and Marketing forum.

Search Engine Marketing Forum
Search engine marketing forum and community.

Webmaster Lighthouse Webmaster Forums
Webmaster Discussion Forums.

Webmaster Forum at V7N
Webmaster Community Forum.

Friday, January 18, 2008

Did you know that every page of your Web site stands on its own?

Every page should have a unique title, description, and keyword tag. The description tag should describe ONLY that page.

The keyword tag should include keywords for just that page.

Include 5-6 keywords, including the main keyword phrase and synonyms of that keyword phrase.

Don't make the mistake of including every keyword that could possibly describe what your site is about in your keyword tag. Make your keyword meta tag specific for each page.

The keyword tag holds very little importance anyway, but be sure to make it page specific. FOCUS!
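
For example, a page about blue widgets might carry page-specific tags like these in its head section (the title, wording and keywords are purely illustrative):

<head>
  <title>Blue Widgets - Sizes, Prices and Care Tips</title>
  <meta name="description" content="How to choose the right blue widget: available sizes, typical prices and simple care tips.">
  <meta name="keywords" content="blue widgets, buy blue widgets, blue widget sizes, blue widget prices, widget care">
</head>

Another page on the same site would then get its own title, description and keyword list instead of reusing these.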

Friday, December 28, 2007

Robots.txt - Its definition and use

Robots.txt: It is a file created by webmasters that contains a set of instructions for the search engine robots that index web pages. This is called the Robots Exclusion Protocol.

It includes the directives User-Agent and Disallow. Here is the meaning of these words, as they are a very important part of the robots.txt file:

User-Agent: It addresses the search engine robots. It gives permission to a particular robot, or to all robots, to index the pages. Here is the syntax:

User-Agent: * ("*" means all robots have permission to crawl all the files/pages)

Disallow: It denies robots access to directories/files. Here is the syntax:

Disallow: /images/ (denies access to the images directory)

A simple robots.txt file should be:

User-Agent: *
Disallow:

It means we give all robots permission to visit all the directories and files.
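
If, on the other hand, you want all robots to crawl the site except a few directories, the file could look like this (the directory names are only examples):

User-Agent: *
Disallow: /images/
Disallow: /cgi-bin/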

It is very easy to create this file. Open a text editor such as Notepad, insert these instructions and save the file as robots.txt in plain text format. Put it in the top-level directory of your website and upload it. Suppose your website is http://www.abctest.com/ ; then your robots file's path should be http://www.abctest.com/robots.txt

You can check your robots.txt file here: http://tool.motoricerca.info/robots-checker.phtml

If you need help creating robots.txt, use the robots.txt generator tool here: http://www.seochat.com/seo-tools/robots-generator/

Some more useful Search Engines

There are lots of search engines where we can submit our website free of cost. Organic optimization is very helpful for increasing traffic to a website. I am listing a few search engines where you can submit your website. They are not very popular, but they can be useful for increasing business and traffic to the website:

http://www.surfsafely.com
http://www.amidalla.com
http://www.amfibi.com
http://www.splatsearch.com
http://shoula.com
http://www.searchramp.com
http://axxasearch.com
http://www.moffly.com
http://www.nalox.com

How to remove pages from the supplemental results

Steps to remove pages from the supplemental results:

I have already written about the causes of supplemental results; pages in the supplemental index also receive fewer visitors. The whole world knows the Google search engine and prefers Google for search, because it provides quality results to visitors. So at present we should optimize and promote our websites with the Google search engine in mind. Here are a few steps that can help remove a page from the supplemental index:

a) Strictly follow the guidelines of Google.
b) Use the right words to target the page.
c) Avoid query strings on the page if possible.
d) Keep file names short and relevant to the content of the page.
e) Make a site map for the website and link to it from the index page (see the sketch after this list).
f) Remove useless comments or tags, if any, from the source of the page.
g) Write a relevant meta description for the page; use very few commas, not more than 3-4, and avoid keyword stuffing.
h) If your website uses duplicate content, change it.
i) Increase the backlinks to your website and to its inner pages; link exchange is very helpful here.
j) Do not use any affiliate links on the pages of the website.
k) Check for dead or broken links on your pages. Use this URL for it: http://validator.w3.org/checklink/
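
Item e), for example, can be as simple as a plain HTML page that links to every important inner page, with a link to it from the index page. A minimal sketch (the file and page names are only illustrative):

<!-- on the index page -->
<a href="/sitemap.html">Site Map</a>

<!-- sitemap.html: plain links to the inner pages -->
<ul>
  <li><a href="/about.html">About Us</a></li>
  <li><a href="/services.html">Services</a></li>
  <li><a href="/contact.html">Contact</a></li>
</ul>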

Google will take about 1-2 months to remove the pages from the supplemental results, but I am not sure about that; it can also take more time.

Keyword Suggestion Tools

I want to share my experience with keyword analysis. Good keyword selection is also important for increasing traffic. Sometimes it is not easy to find the right keyword for our web pages; it is better to find the relevant keywords actually used by visitors. But don't forget to read the content of your page first, and after reading, use the keyword suggestion tools and enter a word related to the content of your page. After getting the result, use the combination of this word with other relevant keywords. Here are a few useful keyword suggestion tools:

http://freekeywords.wordtracker.com/
http://www.keyworddiscovery.com/
http://adwords.google.com/select/KeywordToolExternal
http://www.digitalpoint.com/
http://inventory.overture.com/