Wednesday, January 30, 2008

Calculating PageRank

Q: What is the highest PageRank you can get?
A:
PageRank ranges from 0 to 10, with 10 being the highest value awarded.

Q: What is PageRank?
A: PageRank is a numerical value which represents the popularity of a website. It is a status assigned to websites by Google ( http://www.google.com/ ).

Q: How is PageRank Used?
A:
PageRank is one of the methods Google uses to determine a page's relevance or importance. It is only one part of the story when it comes to Google listings; the other aspects are discussed elsewhere (and are ever changing), and PageRank is interesting enough to deserve a paper of its own.

How is PageRank calculated?

To calculate the PageRank for a page, all of its inbound links are taken into account. These are both internal links from within the site and links from outside the site.

PR(A) = (1-d) + d(PR(t1)/C(t1) + ... + PR(tn)/C(tn))

That's the equation that calculates a page's PageRank. It's the original formula that was published when PageRank was being developed, and while it is probable that Google uses a variation of it today, they aren't telling us what it is. It doesn't matter much, though, as this equation is good enough. Here, t1 to tn are the pages that link to page A, C(t) is the number of outbound links on page t, and d is a damping factor, usually set to 0.85.
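To make the formula concrete, here is a minimal sketch of applying it iteratively over a tiny link graph. The three-page graph, the starting values and the iteration count are made-up examples of mine; Google's actual implementation is not public.

```python
# Minimal PageRank sketch using the published formula:
#   PR(A) = (1 - d) + d * (PR(t1)/C(t1) + ... + PR(tn)/C(tn))
# The three-page link graph below is a made-up example.

d = 0.85  # damping factor; 0.85 is the commonly cited value

# links[p] = pages that p links OUT to
links = {
    "A": ["B", "C"],
    "B": ["A", "C"],
    "C": ["A"],
}

# C(t): number of outbound links on page t
c = {page: len(targets) for page, targets in links.items()}

# Start every page at PR = 1.0 and iterate until the values settle.
pr = {page: 1.0 for page in links}
for _ in range(50):
    pr = {
        page: (1 - d) + d * sum(pr[t] / c[t] for t in links if page in links[t])
        for page in links
    }

for page, value in sorted(pr.items()):
    print(f"{page}: PR = {value:.3f}")
```

Run enough iterations and the values converge; with this graph, page A ends up with the highest PageRank because both other pages link to it.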

Saturday, January 26, 2008

10. Top Domain PageRank (PR) Weight

After the Jagger Update, Google has started to give more weight to your homepage and less weight to the subpages beneath it. The homepage is the page that shows up by default when you type in your domain name (e.g. http://www.homepage-wizard.com ), served from the root directory of your domain (e.g. index.htm, start.htm or the like).

Solution:
Register your top domain in relevant directories and web portals to receive the maximum number of incoming links to your top domain. Always keep fresh, updated content on your homepage (the main start page, e.g. index.htm) as well. Register sub-pages in relevant directories and try to get one-way inbound links to them too, in order to increase their PageRank.

9. Hidden Links and Hidden Text

Hidden text is text that is made invisible on a website for the sole purpose of misleading search engines into seeing a content-rich page, while the human visitor only sees, for example, images or a Flash animation. The text is made invisible either by using the same page and text colour or by using CSS spam (see below). Why anybody tries to hide text or links from human visitors is completely beyond me. Mostly these are spammy, keyword-stuffed text passages with the sole purpose of achieving a better search engine placement for selected keywords. It is, to put it simply, cheating, and search engines treat it as such. At least since the Google Jagger update, these tricks are detected and tracked down with very high precision. And I think this is a very good thing, because it was never good to give these spammers and cheaters any advantage over honest, content-rich pages that were not using these methods.

Solution:
Just don't do it! Build useful, content-rich, unique and relevant websites which are worth visiting for both human visitors and search engine spiders. Avoid Flash or image entry pages unless you have a very good reason to use them. If you have already used invisible text or links, delete them, or even better, rewrite them into good, honest content and make it visible to your visitors. Put your main keywords into a headline or bulleted list, instead of unnaturally repeating them again and again. The search engines will love your site for this and will most probably rate it higher. Be quick, because once you incur a penalty for your hidden text or hidden links, it is very difficult and takes a long time to get the page back into the search engine result pages!

8. Keyword Stuffing within ALT-Tags

Keyword stuffing is one of the oldest black hat SEO tricks and has been hunted down by search engine algorithm updates for quite some time. As it became more and more ineffective, "clever" webmasters shifted to overusing image ALT tags for keyword stuffing purposes. Well, every time some cheat technique is overused, it is only a question of time before the search engine programmers find a way to track the technique down and eliminate or penalize the sites using it. As with many black hat SEO techniques, the grenade is now very likely to explode in your own hands, which is why I always recommend using only honest, so-called white hat SEO techniques and following the search engine guidelines. In the long run, this will pay off much better.

Solution:
Do not stuff your image or other ALT tags with keywords. You can use some keywords; as long as they are relevant to the image or website content, there is no problem with this. But do not overuse this feature. It is best to simply use image ALT tags for what they are meant to be: an additional, very brief description of your image in one or two short sentences, including two or three RELEVANT keywords. They are meant to help blind people, or people who have disabled the display of images in their web browser, understand the content of the image. If a keyword does not fit, do not use it. Do not create ALT tags that look unnatural or have too many words in them.
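If you want to audit an existing site for stuffed ALT text, a quick script can flag suspicious attributes. Below is a rough sketch using only Python's standard library; the 12-word threshold and the sample markup are assumptions of mine, not a rule from any search engine.

```python
# Rough sketch: flag <img> ALT attributes that look keyword-stuffed.
# The 12-word threshold is an arbitrary assumption for illustration.
from html.parser import HTMLParser

MAX_ALT_WORDS = 12  # assumed limit, tune to taste

class AltChecker(HTMLParser):
    def handle_starttag(self, tag, attrs):
        if tag != "img":
            return
        alt = dict(attrs).get("alt", "")
        words = alt.split()
        if len(words) > MAX_ALT_WORDS:
            print(f"Suspicious ALT ({len(words)} words): {alt!r}")

# Made-up example of a stuffed ALT attribute:
sample = ('<img src="cat.jpg" alt="cheap seo best seo company top seo '
          'expert seo tips seo tricks seo services seo consulting seo">')
AltChecker().feed(sample)
```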

7. Redirect Domains or Redirect Pages

Redirect domains and redirect pages are domains or webpages which are used only to forward visitors somewhere else, be it within the same domain or to another domain. Redirects are very similar to doorway pages, which have for some time been considered a spam technique and are being tracked down by Google. The problem is that redirects are sometimes required, for example when a domain or webpage name has changed. To make it worse, there seems to be no way for Google to detect whether a redirect is legitimate or abusive.


Solution:
Do not use redirects within your domains or webpages unless you have to, for example because you have restructured or updated your website. If you cannot avoid using redirects, it is better to use a 301 (Moved Permanently) redirect instead of other redirect methods. Use the redirect for a limited time only, and remove it after several weeks of use.
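For illustration, here is a minimal sketch of what a permanent (301) redirect looks like, using Python's standard http.server module. The old and new page names are made-up examples; on a production site you would normally configure this in your web server rather than run a script like this.

```python
# Minimal sketch: answer requests for an old page with a
# 301 (Moved Permanently) redirect to its new location.
# Page names are made-up examples.
from http.server import BaseHTTPRequestHandler, HTTPServer

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/old-page.htm":
            self.send_response(301)  # permanent redirect
            self.send_header("Location", "/new-page.htm")
            self.end_headers()
        else:
            self.send_response(404)
            self.end_headers()

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), RedirectHandler).serve_forever()
```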

Friday, January 25, 2008

6. Inbound Links (IBL) Repetition

Inbound links are links that point to pages within the same domain. These links are of course needed on every website (within your main menu, for example), otherwise there would be no proper navigation. This is why Google is not fully neglecting these links, even after the Jagger Update. Inbound or navigational links are still picked up and followed by the search engine spider for indexing your pages. That is the good news. The bad news, at least for spammers, is that Google no longer rates them for PageRank purposes. And if you overuse inbound links by giving them unnaturally different positions, link names or keyword phrases, or by repeating them too often, they may even be treated as link spamming.

Solution:
One or two duplicate inbound links are still OK. Google knows that one could be a text link, e.g. at the bottom of your page, and the other an image link, e.g. within the left-hand main navigation menu of your page. But do not go beyond this limit, to be on the safe side.

5. Outbound Link (OBL) Relevancy

Or better said: outbound link (OBL) irrelevancy! This means linking to irrelevant websites or categories. After the Jagger Update, this is simply a waste of time and money.

Solution:
Find relevant categories related to the content and topics of your website and ask for placements within these categories, both paid and free, depending on your budget. There are hundreds or even thousands of paid and free directories out there waiting for the submission of your website. Always remember to submit to the most appropriate category, and avoid submission to FFA link lists, link farms or banner farms (see below).

4. Abusive Usage of Reciprocal Link Exchanges (RLE)

In general there is nothing to be said against link exchanges, as long as they are not done for the sole purpose of cheating the search engine spiders into a better ranking. An honest exchange of links between two sites, because their webmasters feel each other's content to be a good supplement to their own, is nothing to feel ashamed or worried about. Google has therefore obviously decided not to totally ignore link exchanges for ranking purposes. But their overall value has decreased; after the Jagger Update they are not weighted as heavily as before. Some reciprocal links, or so-called cross link exchanges, have been identified as black hat SEO and are now totally neglected or even rated as link spamming. These are sites known to be created for the sole purpose of exchanging links, as well as many FFAs (Free For All link lists), link farms, link selling or link renting schemes, and so on.

Solution:
Keep away from, or at least be very careful about, exchanging links with suspicious FFAs, link farms or link exchange sites. They are more likely to hurt you than to bring any benefit to your website's ranking. Stop or scale back your RLE link building strategy. Try to build one-way links through partnership programs, web and niche directories, relevant category listings and relevant industry portal registrations. Writing your own articles with a link to your website in them and submitting these articles to free article directories (some are listed below) can also be of great help for link building. Every time your article is picked up and distributed by another webmaster, they will include the link to your site within it. All for free, and without you having to link back. Make sure to write useful, high-quality articles without spelling mistakes, in order to have them republished more frequently. Also allow the other webmasters to modify your content as long as they leave your link active, in order to avoid the duplicate content filters of the search engines.

Wednesday, January 23, 2008

3. Pirate Sites or Pirate Matter

Pirate sites and pirate matter are sites which mainly consist of content stolen, mostly automatically, from other sites without the permission of the original author or webmaster. It could also be a mirror copy of the same domain, which has likewise been rated as spam. Sometimes this is a dangerous game even for honest, straightforward webmasters. For example, I run a website with two domains leading to it (one with a hyphen in the domain name, one without). Of course my intention was not to spam, but to have two very similar domain names point to the same content, for obvious legal reasons. Well, unfortunately this was enough for Google to rate it as spam, and I had a lot of trouble getting out of that mess, I can tell you!

Solution:
If you cannot manage to keep up with regular fresh content, go to free content and article distribution websites, from which you can grab fresh and free content at no or low cost. Using content from there is totally legal, as long as you keep all the links within these free articles intact and active. It is a simple deal with advantages for both the author and the webmaster copying the content offered. Some good sites with free articles and content for legal download are http://www.goarticles.com/ , http://www.az-articles.com/ or http://www.dreamicles.com/ , to mention only a few.

2. Detection of CSS Spam

The bad thing about black hat SEO is that some people always think they are too clever. The good thing about search engine algorithms is that they are updated more or less regularly and are constantly getting better at tracking black hat SEO down. As a result, many of these dark freaks find themselves or their websites penalized or even banned by the search engines. One of the latest such "tricks" has been CSS spamming. CSS spam consists of CSS rules created solely to mislead both search engine bots and human visitors. Playing such games with CSS is no longer worth the effort, at least since the recent Google Jagger Update. The Google robot will now detect, with pretty good certainty, spam CSS that leads to invisible text, which it will by all means rate as SPAM.
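As a toy illustration of the kind of signal a robot might look for, the sketch below flags inline styles whose text colour equals the background colour. This check is a simplification I made up for this article; the real detection methods are far more sophisticated and are not public.

```python
# Toy sketch: flag inline styles where the text colour equals the
# background colour, one crude signal of hidden text. This is a
# made-up simplification; real search engine checks are not public.

def looks_hidden(style: str) -> bool:
    """Return True if the color and background-color values match."""
    props = {}
    for part in style.split(";"):
        if ":" in part:
            key, _, value = part.partition(":")
            props[key.strip().lower()] = value.strip().lower()
    color = props.get("color")
    background = props.get("background-color") or props.get("background")
    return color is not None and color == background

print(looks_hidden("color: #fff; background-color: #fff"))  # True
print(looks_hidden("color: #000; background-color: #fff"))  # False
```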

Solution:
Use CSS and CSS style sheets for better formatting of your website, not to mislead search engine robots. Besides, I still cannot understand why so many webmasters try to hide their text away. Write good, honest content and make it openly visible to your visitors and the search engine spiders. The result is even better than spamming, because you will not mislead anyone, and both robots and humans will rate your site as more valuable. And all this without the constant danger of being caught and penalized any day.

1. Detection of Blog Spam

A spam blog has been defined as a blog which is not updated more or less regularly and which has been created for the sole purpose of promoting another website or product.

Solution:
Create blogs and maintain them regularly. Do not always use the same links to your own or other websites (including affiliate links). Do not use your blogs as spam. Use them as a useful addition to your website, to add community value for your visitors. Use your blogs as a guide or help area, rather than filling them with useless content or, even worse, using them only as a dull promotional platform.

Tuesday, January 22, 2008

Jagger, Updates and Solutions! The Google Jagger Update: improving efforts from Google!

Google's Jagger Update was the most frightening of all Google updates so far for the SEO community. It has also been the update that took the longest time, as it was carried out in various steps over an extended period. The search engine algorithm update affected almost all websites, especially websites with very competitive keywords or keyword phrases. The vast majority of webmasters, website owners and search engine optimizers (SEOs) did not like the Jagger Update very much, because many of them had to completely rethink and change their SEO strategies.

Google today is the major search engine. Personally, I like Google, and not only because of its popularity and search volume. I like Google for its continuing efforts to maintain and improve its search engine quality and the relevance of its search results. Google, after all, is a popular search engine, not an SEO playground. In terms of internet search and research, Google always comes up with new ideas and innovative approaches. The recent updates of Google's search algorithms have only been further steps towards offering even better and more relevant search results to the search engine's users. This is only legitimate.

In the following, I will provide a list of the elements that were implemented within the most recent Jagger Update:

1. Detection of Blog Spam
2. Detection of CSS Spam
3. Pirate Sites or Pirate Matter
4. Abusive Usage of Reciprocal Link Exchanges (RLE)
5. Outbound Link (OBL) Relevancy
6. Inbound Links (IBL) Repetition
7. Redirect Domains or Redirect Pages
8. Keyword Stuffing within ALT-Tags
9. Hidden Links and Hidden Text
10. Top Domain PageRank (PR) Weight

Google had a lot of threats to deal with, mostly quite rightfully, but let's hope they do the job right without causing too much collateral damage to innocent webmasters and honest website owners. For example, I found that they are blocking and deleting lots of blogs run on Google-owned blogger.com for spam reasons, but without any way to find out or understand why these blogs were judged to be spammy. It seems Blogger is overdoing things here, hitting lots of innocent bloggers as well. But let's go back and take a closer look at the individual points of the recent Google Jagger Update as listed above. I have tried to give a decent description of every issue, and I also try to give you some hints on how to avoid getting caught or penalized by Google's updated filters.

Monday, January 21, 2008

Top Webmaster Forums

Webmaster Discussion Forums
Web Talk Forums is a revenue sharing webmaster discussion forum.

SEO.com Search Engine Optimization Forums
Discuss the various aspects of SEO at SEO.com's search engine optimization forums.

Digital Point Webmaster Forum
Search Engine Optimization and Marketing forum.

Search Engine Marketing Forum
Search engine marketing forum and community.

Webmaster Lighthouse Webmaster Forums
Webmaster Discussion Forums.

Webmaster Forum at V7N
Webmaster Community Forum.

Friday, January 18, 2008

Did you know that every page of your Web site stands on its own?

Every page should have a unique title, description, and keyword tag. The description tag should describe ONLY that page.

The keyword tag should include keywords for just that page.

Include 5-6 keywords, including the main keyword phrase and synonyms of that keyword phrase.

Don't make the mistake of including every keyword that could possibly describe what your site is about in your keyword tag. Make your keyword meta tag specific for each page.

The keyword tag holds very little importance anyway, but be sure to make it page-specific. FOCUS!
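To make this concrete, here is a small sketch that keeps the per-page meta data in one place and renders a unique title, description and keyword tag for each page. All page names, titles and keywords below are made-up examples.

```python
# Sketch: keep per-page meta data in one place and render unique
# <title>, description and keyword tags for each page.
# All page names, titles and keywords are made-up examples.

pages = {
    "index.htm": {
        "title": "Homepage Wizard - Build Your Own Website",
        "description": "Step-by-step guides for building your first website.",
        "keywords": ["website builder", "homepage wizard", "site creation"],
    },
    "seo-tips.htm": {
        "title": "SEO Tips - Homepage Wizard",
        "description": "Practical white hat SEO tips for small websites.",
        "keywords": ["seo tips", "white hat seo", "meta tags"],
    },
}

def head_tags(page: str) -> str:
    meta = pages[page]
    keywords = ", ".join(meta["keywords"])
    return (
        f"<title>{meta['title']}</title>\n"
        f"<meta name=\"description\" content=\"{meta['description']}\">\n"
        f"<meta name=\"keywords\" content=\"{keywords}\">"
    )

print(head_tags("index.htm"))
```

Notice that each page gets only the handful of keywords that actually describe that page, as recommended above.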