Friday, December 28, 2007

Robots.txt - Its definition and use

Robots.txt : It is a file, created by webmasters, that gives a set of instructions to the search engine robots that index web pages. This convention is called the Robots Exclusion Protocol.

The file uses two directives, User-Agent and Disallow. Both are very important parts of a robots.txt file, so here is what they mean :

User-Agent : It names the search engine robot (or robots) that the rules which follow apply to. It can address one particular robot or all of them. Here is the syntax :

User-Agent: * ("*" means the rules that follow apply to all robots)

Disallow : It denies robots access to directories/files. Here is the syntax :

Disallow: /images/ (it denies access to the images directory)

A simple robots.txt file that allows everything should be :

User-Agent: *
Disallow:

It means we give permission to all the robots to visit all the directories and files (an empty Disallow blocks nothing).

It is very easy to create this file. Open a text editor such as Notepad, insert these instructions, and save the file as robots.txt in plain text format. Put it in the top-level directory of your site and upload it. Suppose your website is http://www.abctest.com/ ; then your robots file's path should be http://www.abctest.com/robots.txt

You can check your robots.txt file here : http://tool.motoricerca.info/robots-checker.phtml

If you need help creating robots.txt, use the robots.txt generator tool here : http://www.seochat.com/seo-tools/robots-generator/
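For webmasters comfortable with a little scripting, robots.txt rules can also be tested programmatically. This is a minimal sketch using Python's standard urllib.robotparser module against the example rules above; the example.com URLs are hypothetical (a live check would fetch the site's real robots.txt):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content; a live tool would fetch it from
# http://www.example.com/robots.txt instead.
rules = [
    "User-Agent: *",
    "Disallow: /images/",
]

parser = RobotFileParser()
parser.parse(rules)
parser.modified()  # mark the rules as loaded so can_fetch() evaluates them

# The /images/ directory is blocked; everything else is allowed.
print(parser.can_fetch("*", "http://www.example.com/images/logo.gif"))  # False
print(parser.can_fetch("*", "http://www.example.com/index.html"))       # True
```

A robot that honors the Robots Exclusion Protocol performs exactly this kind of check before requesting a page.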

Some more useful Search Engines

There are lots of search engines where we can submit our website free of cost. Organic optimization is very helpful for increasing traffic to a website. Here are the names of a few search engines where you can submit your website. They are not very popular, but they can still be useful for increasing business and traffic :

http://www.surfsafely.com
http://www.amidalla.com
http://www.amfibi.com
http://www.splatsearch.com
http://shoula.com
http://www.searchramp.com
http://axxasearch.com
http://www.moffly.com
http://www.nalox.com

How to remove pages from supplemental result

I have already written about the causes of supplemental results, which also decrease visitors. The whole world knows the Google search engine, and people prefer Google for search because it provides quality results to visitors. So at present we should optimize and promote our websites according to the Google search engine. Here are a few steps that help remove a page from the supplemental index :

a) Strictly follow Google's guidelines.
b) Use the right keywords to target each page.
c) Avoid query strings in page URLs if possible.
d) Keep file names short and relevant to the content of the page.
e) Make a site map for the website and link to it from the index page.
f) Remove useless comments or tags, if any, from the source of the page.
g) Write a relevant meta description for the page; use very few commas (not more than 3-4) and avoid keyword stuffing.
h) If your website uses duplicate content, change it.
i) Increase the backlinks to your website and its inside pages. Link exchange is very helpful here.
j) Do not use any affiliate links on the pages of the website.
k) Check for dead or bad links on your pages. Use this URL for it : http://validator.w3.org/checklink/

Google usually takes about 1-2 months to remove pages from the supplemental results, but I am not sure about that; it can also take more time.
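Step (k) above can be partly automated. As a rough sketch (not any particular validator's method), this Python snippet collects every link on a page with the standard html.parser module; each collected URL could then be requested, and anything that fails flagged as a dead link. The sample page fragment is made up for illustration:

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect the href of every <a> tag so the links can be checked later."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# Hypothetical page fragment for illustration.
page = '<p><a href="http://www.abctest.com/">Home</a> <a href="/contact.html">Contact</a></p>'

collector = LinkCollector()
collector.feed(page)
print(collector.links)  # ['http://www.abctest.com/', '/contact.html']
```

Each URL in the resulting list could then be fetched (for example with urllib.request), and any response with a 404 status reported as a dead link.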

Keyword Suggestion Tools

I want to share my experience with keyword analysis. Good keyword selection is also important for increasing traffic. Sometimes it is not possible to find the right keyword for our web pages by guessing, so it is better to find the relevant keywords actually used by visitors. Don't forget to read the content of your page first; after reading it, open a keyword suggestion tool and enter a word related to that content. From the results, use combinations of that word with other relevant keywords. Here are a few useful keyword suggestion tools :

http://freekeywords.wordtracker.com/
http://www.keyworddiscovery.com/
http://adwords.google.com/select/KeywordToolExternal
http://www.digitalpoint.com/
http://inventory.overture.com/

How to increase page rank of a website?

The algorithm Google uses for PageRank counts the external links to your site from other sites, so it is very important to get links to your website from other sites. For this, you should submit your website to the Dmoz.org and Yahoo.com directories, because these are among the most important directories considered in Google's algorithm.

Next, you can exchange links with other websites to get more external links.
Link Exchange : In this process, two sites link to each other. It plays an important role in improving page rank, but we should remember to look for quality pages. The page where we want our link should have a theme related to our website; otherwise the link is wasted. For example, if we have a site related to gifts and we request link exchanges from sites about careers, that is not relevant. But if a career-related site provides a gift-related category for the link exchange, then that is fine.

Search Engine Optimization

Definition : SEO is the set of tactics used to gain higher rankings for web pages in the search engines. To do so, we use various techniques, which I will explain in detail. First we should know about search engines :

Search Engine : A search engine provides visitors with the facility to search the content of web pages using a key phrase. It stores information about all the pages of a website in a large database. Google uses robots to add page text to its large database, and search engines refresh the stored information from websites periodically.

About Website Submission : It is very important to submit the website to search engines to improve page rank. Google will give a better page rank if your website is approved in the Yahoo directory, so it is also necessary to keep a quality title & description for the website in the search engines. If your submission to the Dmoz directory is approved, the site will also be added to Yahoo, MSN, Google, etc. within a month or two or three, so you should submit your website to Dmoz first. Choose a relevant category for the submission, and make sure the information about your website, i.e. the title & description, is relevant and free of grammatical mistakes. A few other search engines where we can submit a site are khoj.com, Zatka.com, and Atlastraveldirectory.com (for travel-related websites).

Submission in Google, Yahoo, MSN

Without being listed in the search engines, we cannot expect traffic to our website, so it is better to submit the website to the top search engines once the work is finished. Sometimes Google, Yahoo, and MSN crawl a website without any submission: the crawler reads the address somewhere else, for example from inbound links, directory listings, or article sites, and crawls the website from there. If none of those exist, it cannot, so it is better to submit the site manually. We all know about Google, Yahoo, and MSN, but for a fresher it is sometimes very difficult to find the exact submission page. So I want to write here the exact locations where you can submit your website directly to these search engines :

Google - http://www.google.com/addurl.html
MSN - http://search.msn.co.in/docs/submit.aspx?FORM=WSDD2
Yahoo - http://search.yahoo.com/info/submit.html (must have a Yahoo ID to submit a website)

Black Hat SEO & White Hat SEO

Definition : Black Hat SEO

Anything that deceives a search engine is known as Black Hat SEO. It includes link farms, page redirects, links to bad neighborhoods, duplicate content on a site, duplicate websites, keyword stuffing, and doorway pages. A search engine may penalize the offending site or even remove it from its listings. It is not a good practice for generating traffic and increasing business; we should avoid it and use White Hat SEO for promotion.

White Hat SEO :

White Hat SEO follows the guidelines of the search engines and designs web pages for humans to read easily, not for the search engines. It never uses tricks or wrong methods of search engine optimization to get listed at the top.

Organic SEO Vs PPC

Organic SEO :

In organic search engine optimization, the website owner or webmaster spends time and effort to get listed at the top of the major search engines. Natural optimization is much better than a PPC campaign: it costs nothing but time. We follow the guidelines of the search engines and look for good websites to increase the backlinks to our site. We also need fresh content for our website pages, meaning no duplicate or copied content. The long-term marketing strategy is organic search engine optimization, not PPC, so website owners should always focus on natural search rather than PPC.

Advantages of Organic Search Engine Optimization :

1. It does not require the permanent investment that a PPC campaign does.
2. Organic search engine optimization delivers results for a long time.
3. The traffic that comes through organic search results is free, so we can use the money in our other projects instead of investing it in a PPC campaign.

PPC :

PPC is a time-saving process that provides immediate results, which makes it very useful for a new business. Google and the other search engines take some time to crawl a new website, so it can be better to get listed at the top of the sponsored ads in search engines like Google, Yahoo, and MSN. PPC needs good keyword selection and attractive content to convert visitors into business. It is a temporary solution for getting traffic to the website: if you have money, you can run a PPC campaign; otherwise you cannot.

We have lots of options to increase traffic and get a good position in the search engines, including plenty of free article directories and blogs. We can use these options to increase the backlinks to our websites. It takes time, but organic SEO, not PPC, is the lasting solution.

Advantages of PPC Campaign :

1. Instant results for the business.
2. We can stop a PPC campaign at any time.
3. We can use any keyword to list our website.
4. We can send visitors to any landing page.
5. Perfect for time-limited schemes.
6. It works even for poorly designed websites or sites whose ranking in the search engines is low.

Articles Role in Search Engine Optimization

As I have already mentioned, backlinks play a big role these days, but one-way links are difficult to get. Blog services and article directories make it easier: use them to increase the backlinks to your website. There are many directories where you can publish your articles.

Please keep a few points in mind when writing articles for your website :

a) Always check the spelling and grammar. Some article directories provide an online spelling check; otherwise use MS Word or another free spell-checking tool.

b) The article should be simple, with a heading for each section.

c) Do not advertise your product in the article. Instead, write about its quality, advantages, etc.

d) Write articles of between 300 and 700 words; some article directories mention a word limit. The article should not be too short. Try to explain the product or business you are writing about in a meaningful way.

Here is a list of some free article directories. Publish your articles there. I hope you will get more traffic, and it will also help to increase the backlinks of your website :

http://www.articlesisland.com/
http://www.articlesworldonline.com/
http://www.allthebestarticles.com/
http://freearticles101.info/
http://www.thenichearticles.com/
http://www.articlegold.com/
http://www.bigarticle.com/
http://www.goarticles.com/
http://www.easyarticles.com/
http://www.digital-web.com/
http://www.articlecity.com/
http://www.ezinearticles.com/
http://www.articledashboard.com/
http://www.articlebots.com/
http://www.articles-book.com/
http://www.articles4free.com/
http://www.articlesbeyondbetter.com/
http://www.articlesexpress.com/
http://www.article-content-king.com/
http://www.hotlib.com/
http://www.thewhir.com/
http://www.whatsay.com/
http://www.articledestination.com/
http://www.article-treasure.com/
http://www.goarticle.net/

I will add more article directories, and also blog sites, in the future.

Open Directory Project - Dmoz Directory

The DMOZ Open Directory Project (ODP) has more than 590,000 categories managed by human volunteers, or editors, who review the suggested websites. You must read the rules before submission; otherwise, under its strict guidelines, your website will be rejected. Every suggested website is carefully reviewed by a human editor before being listed.

It can take 3-6 months for a website to be reviewed, but this is not exact: some websites are listed in a few weeks, while others take more time.

The ODP plays an important role in search engine ranking because you have the chance to get a link back from the ODP and boost your ranking in Google. Also, Google, AOL, AllTheWeb, HotBot, and other web directories use the ODP data. So the role of the ODP is very important for improving search engine ranking.

Remember a few points; they can improve your chances of getting listed in the ODP :

1) Do not exchange links with porn websites.
2) The site should not redirect to another location.
3) It should not be a mirror site or under construction.
4) Submit the site to the most specific (deepest) category that fits.
5) The description should be relevant to the category. Do not include promotional words.
6) More than one page may be accepted, but the content should be unique and relevant to the category.

Role of Social Bookmarking in Search Engine Optimization

Social bookmarking is another way to increase traffic and even backlinks. It is a good way to share, store, and search internet bookmarks. These are websites where we store our favorite links, and the stored links are accessible to all the other users who have joined the same social bookmarking website. Users can find favorite websites through tags.

It is not possible for automated software to add links to these social bookmarking websites; all the links are added by human users. We can increase traffic to our websites through social bookmarking websites, since other users with similar interests find our websites by searching tags and then visit them. A few of the social bookmarking websites are :

simpy.com
del.icio.us
netvouz.com
givealink.org
stumbleupon.com
digg.com
mybookmarks.com
ma.gnolia.com

Create your profile there and add your favorite websites.

Thursday, December 27, 2007

SEO Tools

These tools are very important in search engine optimization, so I want to add some information about them and about the best sites that provide these tools for free.

Keyword Density Analyzer Tool : If you want to check what percentage of a page a particular keyword accounts for, i.e. how many times it appears relative to the total word count, you can check it with this tool.
Useful Sites :
http://www.keyworddensity.com/
http://www.seochat.com/seo-tools/keyword-density/
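The calculation these tools perform is simple enough to sketch yourself. The function below is a rough illustration (not any particular tool's exact formula): it counts how often a keyword appears relative to the total word count of the text.

```python
import re
from collections import Counter

def keyword_density(text, keyword):
    """Return the percentage of words in `text` that match `keyword`."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    return 100.0 * Counter(words)[keyword.lower()] / len(words)

sample = "SEO tips: good SEO starts with good content, not tricks."
print(keyword_density(sample, "SEO"))  # 20.0 (2 of the 10 words)
```

A real analyzer would first strip the HTML markup from the page and might also count multi-word phrases, but the underlying arithmetic is the same.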

Meta Tag Generator : This tool helps you to generate meta tags and configure them.
Useful Sites :
http://www.seochat.com/seo-tools/meta-tag-generator/
http://www.free-webmaster-tools.com/Meta-Tag-Generator.htm
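What such a generator produces is just a few lines of HTML. Here is a minimal sketch in Python; the site title, description, and keywords are made up for illustration:

```python
from html import escape

def meta_tags(title, description, keywords):
    """Build the basic <head> tags that a meta tag generator emits."""
    return "\n".join([
        f"<title>{escape(title)}</title>",
        f'<meta name="description" content="{escape(description)}">',
        f'<meta name="keywords" content="{escape(", ".join(keywords))}">',
    ])

print(meta_tags(
    "ABC Test - Handmade Gifts",                        # hypothetical site
    "Handmade gift ideas for birthdays and weddings.",
    ["gifts", "handmade gifts", "birthday gifts"],
))
```

The escape() call keeps quotes and angle brackets in the title or description from breaking the generated markup.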

Backlink Checker : This tool gives you details about the websites linking to your site.
Useful Sites : http://www.iwebtool.com/backlink_checker
http://www.build-reciprocal-links.com/backlink_checker_tool.html

Google PageRank Prediction Tool : This tool predicts the future PageRank of your website (PageRank is the ranking measure developed by Google).
Useful Sites : http://www.iwebtool.com/pagerank_prediction
http://www.gopagerank.com/pagerank/future-pagerank-prediction.php

Robots Text Generator : A very useful tool for new webmasters for generating a robots.txt file. You already know the purpose of robots.txt: it is the file that controls which web pages or directories of a website the search engine robots index. Many websites provide tools to generate robots.txt. Here are the suggested tools :
Useful Sites : http://www.1-hit.com/all-in-one/tool-robots.txt-generator.htm
http://www.seochat.com/seo-tools/robots-generator/

Page Size Check Tool : A very important tool for website designers. Heavy pages take time to download at the user's end. This tool calculates the size of a web page, including the size of the text and images (GIF and JPEG files).
Useful Site : http://www.seochat.com/seo-tools/page-size/
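The arithmetic behind such a tool is straightforward. This sketch estimates page weight from the HTML source plus a map of image sizes; in a real checker the image sizes would come from HTTP requests for each file, and the file names and byte counts below are hypothetical:

```python
def page_weight_kb(html, image_sizes):
    """Estimate total page weight in kilobytes: HTML bytes plus each image.

    `image_sizes` maps an image URL to its size in bytes; a real tool
    would obtain these by requesting each file.
    """
    total_bytes = len(html.encode("utf-8")) + sum(image_sizes.values())
    return round(total_bytes / 1024, 2)

html = "<html><body>" + "<p>content</p>" * 50 + "</body></html>"
images = {"/images/header.jpg": 35_000, "/images/logo.gif": 4_096}  # hypothetical sizes
print(page_weight_kb(html, images))
```

Keeping this number small (a common rule of thumb of the era was well under 100 KB) means the page downloads quickly even on slow connections.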

Wednesday, December 26, 2007

SEO Secrets

Less code more content
When a web crawler visits your site, it is there to read your title, description and content. What it doesn't want to do is wade through pages of code in order to find a few lines of content. There are two reasons why this is bad:
1. It looks to the crawler like you are trying to hide something (spammers use code to hide text or redirects). If the crawler doesn't understand something, it will ignore it and, most likely, your site.
2. The web crawler may think you don't have enough content to put any importance into your site and, if it doesn't think your site is important, then it will not list you too high in the returned search results.
What you should be shooting for is at least a 2:1 ratio in favor of content on 75% of your pages. This will let the web crawler know your site has some sort of relevance and also gives the crawler more content to index, which it really likes.
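One way to measure that ratio (a rough sketch, not how any particular crawler actually scores pages) is to strip the markup with Python's html.parser and compare the visible text against the full page source:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Accumulate the visible text of a page, ignoring all markup."""
    def __init__(self):
        super().__init__()
        self.chunks = []

    def handle_data(self, data):
        self.chunks.append(data)

def content_fraction(html):
    """Visible-text characters divided by total source characters."""
    extractor = TextExtractor()
    extractor.feed(html)
    return len("".join(extractor.chunks)) / len(html) if html else 0.0

page = "<html><body><p>Plenty of readable content here.</p></body></html>"
print(round(content_fraction(page), 2))  # 0.49
```

A 2:1 ratio in favor of content corresponds to a fraction of roughly 0.67 or higher, so the sample page above would fall short of the target.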

Frames and Flash
I have addressed this subject before, but I want to go into it again because it is one of those things which a lot of SEOs just don't understand. The major search engines, i.e. Yahoo, Google and MSN, all say the same thing on their webmaster pages, and that is "some web crawlers have difficulty reading the content of these pages". What they don't say is "don't use Flash or framed pages because this is a no-no". The reason I want to make that clear is that a lot of SEOs will try to tell you that this is a bad thing and that all your pages need to be flat static pages with lots of alt text behind your flat images. This is not the case. We have gotten many websites into the top ten and number one spots, some of which use both Flash and framed pages. Some do not even have any indexable text or links on the home page.

So how do we do it? Simple, and it's something you should be doing anyway. You should have an alternative, stripped-down version of your site for visitors who don't have Flash; viewers with Flash installed are directed on to the Flash version. For a framed site, you should likewise have a non-framed version for people whose browsers don't support framed pages.

Now that you know how simple it is, here is the temptation. Because your index page will most likely only be seen by maybe 10% of the people that come to your site, the temptation will be to flood that page with keywords and search text to get yourself bumped up the search, and that will happen. However, when you get caught (which you will), you will be dumped. This page should only contain a stripped-down version of your site and should only contain the same text content as your Flash or framed site.

Keyword overindulgence
When writing the copy for your site, write it for your viewer and not for the web crawler. A lot of people get tempted to use the same word or phrase over and over to try and get themselves placed first in a web search. This is a total rookie move and can also get you bumped for what's called keyword flooding. Here are some examples:
Bad : Bay Area Search Engine Optimizing
Bay Area search engine optimizing company, located in the bay area. Our services include search engine optimizing, SEO and search engine placement. As part of our search engine optimizing service, we guarantee you top placement on all search engines within 30 days or your money back. When you choose our company for your search engine optimizing needs we will........ I think you get the point.
Good : Bay Area Search Engine optimizing
Jigsaw design is located in the Bay Area California. We are committed to serving all your design needs. Also, as part of our service, we also include search engine optimization, web hosting and online store solutions...... bla bla bla
I'm sure you notice the difference. Our web site does show up in the top ten for the search term "bay area web search engine optimizing". However, that was before we wrote this article. So how many times can you include the same search term? Only the search engine companies know that for sure. However, based on our experience with sites we have successfully placed in the top ten, we would say no more than once every two paragraphs or once every 100 words.

Out of style sheets
Style sheets are one of those things that are easy to abuse and can get you in trouble. When used correctly, they can really help you when optimizing a site. Abusing style sheets is simple: all you have to do is make your H1 tag the same as your normal text style to try and make the crawler think that all your text is important. The trouble is, crawlers are not as dumb as they used to be. They figured out that nobody has a header that lasts 5 paragraphs. The proper use of the H1 tag is for your main page header; for example, this page is "SEO Secrets". Sub-headers like "Out of style sheets" should use the H2 tag, and your content should always be normal text. No matter how tempting it is, never hide text using style sheets or any other method, because your site will be bumped. Also, do not make the folder that your style sheet is in hidden or private because, if the web crawler can't read your style sheet, it's not going to trust that everything is legitimate.

Stale content
You may be able to get your site listed in the top ten within 30 - 60 days, but that's no guarantee it will stay there. If your site is not continually being updated with fresh content and growing, then the search engines will simply dump it in favor of newer, fresher sites. You should be adding a new page to your site every time the crawler visits and updating your old content as much as you can. Daily is preferred, but weekly at the very least. This will mean that the crawler will have fresh content to index and new links to explore. Thus, it will never see your site as stale and drop it.

NOT AS IMPORTANT AS A LOT OF SEO'S WOULD HAVE YOU BELIEVE

Page Rank
If you have downloaded Google's toolbar, you will see a bar at the top of each page which gives you a rating from 1 to 10 of how Google ranks your site. This means absolutely nothing when it comes to search engine optimizing. Some SEO companies think that getting a high page rank means you are going to be listed higher in the search. This is not the case. In fact, it couldn't be further from the truth. What this little bar indicates is simply how important Google sees your site, and it makes that judgment based on the quality of the sites that link to you. The single most important thing about your site is not page rank, it is not how many incoming links you have and it is not how long your site has been up. It is your content, plain and simple. If your content matches the search criteria, then you will be listed. The more relevant content you have, the higher your site will be listed. There are a lot of sites listed in the number one spot that don't have any page rank.

Incoming links
Don't believe a word you hear about incoming links, the quality of the link or anything to do with link exchange, because the truth is, it doesn't really matter how many links you have to your site or which sites link to you. What does matter is the way the sites link to you and how often new links show up to your site. If you wake up tomorrow and, overnight, 1 million other sites decided to link to you, that would not affect your position in any way (what it may do is hurt your listing because it will look like spam). Here is how links work for you:
1. If you are a plastic surgeon and another plastic surgery site links to you, that will give you a boost because you are being linked from a site that has some relevance to yours.
2. If your site sells used network equipment and the link to your site uses anchor text like "used network equipment", that will give you a boost because the link text is a search term.
3. If you acquire new links slowly over the course of time, that will give you a boost because it looks like your site is gaining popularity.
4. If people link to internal pages of your site, as well as your home page, that will give you a boost because it looks like you have lots of pages of relevant information.

Thursday, December 20, 2007

Crawler

A program that automatically fetches Web pages. Spiders are used to feed pages to search engines. It's called a spider because it crawls over the Web. Another term for these programs is webcrawler.
Because most Web pages contain links to other pages, a spider can start almost anywhere. As soon as it sees a link to another page, it goes off and fetches it. Large search engines, like AltaVista, have many spiders working in parallel.
Also see How Web Search Engines Work in the Did You Know . . . ? section of Webopedia.

What is SEO

Search engine optimization is the process of increasing the number of visitors to a Web site by ranking high in the search results of a search engine. The higher a Web site ranks in the results of a search, the greater the chance that the site will be visited by a user. It is common practice for Internet users not to click through pages and pages of search results, so where a site ranks in a search is essential for directing more traffic toward the site.
Good search engine optimization (SEO) is a combination of design, user friendliness and content.

Design:
All content on a well-designed website, including links, headers, alternative image text, alternative frame text and body text, is viewable by the user and, more importantly in terms of SEO, by the web crawler.

User Friendliness :
Everything the viewer wants to know on your site should be no more than three clicks away (important information like contact info should always be one click away). In terms of SEO, a web crawler can only see the pages on your site that are directly linked, so a well-optimized site will have text links as well as any graphical links such as Flash or JavaScript.

Content :
As well as the body content of your site, a web crawler will also look at the page title, page description, keywords, and file name. It will make its judgments on how to list your site based on these factors. A well-optimized site will have a page title that briefly describes the content, a description tag that will describe the content in more detail, keyword tags that will pull select keywords and key terms from the content, a file name that reflects the page title and content that is clear and relevant to the user and the web crawler.
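Those head elements are easy to inspect programmatically. Here is a sketch (the sample markup is hypothetical) that pulls out the page title and meta description a crawler would read:

```python
from html.parser import HTMLParser

class HeadReader(HTMLParser):
    """Pull out the page title and meta description that crawlers read."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and (attrs.get("name") or "").lower() == "description":
            self.description = attrs.get("content") or ""

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

# Hypothetical page head for illustration.
page = ('<head><title>Handmade Gifts</title>'
        '<meta name="description" content="Gift ideas for every occasion."></head>')

reader = HeadReader()
reader.feed(page)
print(reader.title)        # Handmade Gifts
print(reader.description)  # Gift ideas for every occasion.
```

Checking your own pages this way shows you exactly the title and description text the search engines have to work with.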

More About Content :
The biggest mistake most people/companies make when optimizing a site is overdoing it, i.e., repeating keywords, duplicating image tags, flooding the backend with descriptive code and trying to fit as many keywords/terms into the meta tags as possible. Not only is this a messy practice, but it can also get your site penalized and perhaps dropped from searches altogether. When building/optimizing a site, the first person you should be thinking of is the user and not the web crawler. Not only will this be better for your SEO efforts, it will also be a better experience for your potential customer.

What is a Web Crawler?
The best way to describe a web crawler is as a blind child with a clear sense of right and wrong. It is blind because it doesn't see all the fancy graphics and creative Flash; it only sees the text and alternative text on your site. It is a child because it has to be guided around your site by clear paths that it understands. The sense of right and wrong comes from its ability to determine when something is not right, e.g., if it comes across some red text on top of a red background, it will see it as hidden text, or if it sees a page redirect to a different site without any reason, it will see it as a gateway page, both of which are no-no's.
So to sum up what good SEO is, the short answer is: design your site for your customers first and web crawlers second, taking into consideration browser compatibility, text-only browsers and blind viewers. The result is a clear, informative experience for your users and a friendly experience for your web crawler.

Monday, December 17, 2007

Image Optimization

A question that comes up often is whether the use of ALT text and keywords in image file names has any effect on rankings. In two words: they do.
For those who don't know what Universal Search is, Google is now blending search results from all of their search properties - web pages, images, books, videos, etc. Where a search in Google used to bring up a page of HTML results, you will now get all of the above. This means everything on your site must be optimized these days, not just your web pages.

1. Context is extremely important. Images can rank based on what surrounds them on the page. Pay attention to keyword text, headings, etc. on the page. Image-only sites generally only work well if it is an extremely well-known brand or product. Otherwise, you need keyword-rich text.

2. URL content text is important, too. The text of your URL is looked at as part of the context surrounding the image. The domain name, directory name and filename of the image (name it with keywords) are taken into consideration as far as relevance.

3. Use captions if possible. Take a tip from newspaper photos and place keyword-rich captions with your images. Make the text good, quality content, not keyword spam.

4. Proper image type is crucial. Make absolutely sure your photographs are .jpg and not .gif. The .jpg format is standard for photographs, whereas the .gif format is normally used for graphic images.

5. Images can affect reputation management. Non-flattering images can really hurt you. Optimizing your images can help push any images you would rather the public not see off the results pages.

6. Create a sitemap feed of images. As with web pages, the search engines can follow a sitemap that you create of the images that you want spidered.

7. Use descriptive ALT text. Search optimizers used to debate whether ALT text had any influence on rankings. That debate is over. Optimize your images using descriptive, keyword-rich ALT text. Don't stuff the ALT attribute; make it short and to the point. The keyword-rich ALT text for your images could be the tie-breaker over who gets the top spot, so always use it.

8. Use the word "picture" or "photo" in your ALT text. Some searchers do use those words when they search for images.

9. Label images with your brand or URL. This will capture some home page traffic for you. Watermark the images or just add your URL somewhere.

10. Use high quality images. Unique images with good contrast tend to be the best.

11. Add photos to Google Maps. Fluff up your Google Maps listing with photos and images. These can influence visitors who find you through local search, etc.

12. Post images to Flickr. Open a Flickr account and put unique photos in your account. Basically, each photo you put up is its own web page with a title, description and tags. You can include a link back to your site and share the photos with other Flickr users and social sites.

13. Use Feedburner. I know this might sound strange since Feedburner is for RSS feeds, but if you handle your RSS feed through this free Google-owned service, you can also have images you post to your Flickr account included in the e-mail updates that are sent to e-mail subscribers of your RSS feed. This is all set up in your Feedburner account admin.
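Point 6 above, the image sitemap, is just a small XML file. Here is a minimal sketch using Google's image sitemap extension; the abctest.com URLs are hypothetical:

```python
def image_sitemap(page_url, image_urls):
    """Build a minimal image sitemap entry (Google's image sitemap extension)."""
    images = "\n".join(
        f"    <image:image><image:loc>{url}</image:loc></image:image>"
        for url in image_urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"\n'
        '        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">\n'
        "  <url>\n"
        f"    <loc>{page_url}</loc>\n"
        f"{images}\n"
        "  </url>\n"
        "</urlset>"
    )

print(image_sitemap("http://www.abctest.com/gallery.html",
                    ["http://www.abctest.com/images/gift1.jpg"]))
```

The finished file is uploaded alongside your regular sitemap and submitted through the search engine's webmaster tools so the listed images get spidered.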

Saturday, December 15, 2007

Off-Page Factors

Links
If Content is King, then Links are Queen. Search engines look at links pointing to your site as verification that you are an important authority site. It’s not just the quantity of links but the quality that counts. You can have thousands of links pointing to you, but if they are all from link farms or spammy sites, they won’t do you any good. Try to get back links from quality sites. If you have good content, a lot of links will come your way naturally, but if you want to speed things up, you’ll need to actively pursue those links. One way is to contact theme related, non-competitive authority sites and request a link. The acid test for a potential link is if there is a natural, logical reason for that site to link to you. If not, then you don’t want the link.

And, you want the links back to your site to use your keyword text in them. This is extremely important. If the keyword you are targeting is “widget” then you want the link back to your widget page to use that text and not “click here” or something like that.

Another way to use your content to get back links is by submitting articles to other sites for publication (A blog and RSS feed are great for this). Just be sure the content includes links to your site.

Submitting to trusted directories is also a good place to start. Most of the best require a fee for a listing, but they can be a great first step in your link building campaign.

There’s no simple, easy one-step way to gain links. It’s really about networking and relationships and your useful content is the key.

Competition
Keep track of your competition by searching for your primary keywords and study what they are doing. Don’t copy them, but you can analyze what they are doing right and you are doing wrong. See who is linking to them and investigate getting your own link. If you are a new site, you’ll be playing catch up for a while, but have faith. That guy in the #1 spot had to start from scratch at some point, too.

Training & Support
If you are on a shoestring budget and don’t have money to hire an SEO, you’ll have to do it all yourself. SEO changes daily and if you think all you have to do is tweak Meta Tags, you’re several years behind and have a LOT of catching up to do. You’ll be learning as you go. You’ll probably want to invest in some SEO training.

You can get ideas, updates and recommendations from my blog and from SEO forums and blogs online. Don't rely on the forums as a solution for all of your website problems, but as a place to go for advice from SEOs and others who are also asking questions.

Analysis & Statistics
Sounds boring, but all of your hard work is worthless if you don’t know how you are doing. Chances are your hosting company will have some sort of web statistics feature where you can check basics such as unique visitors, where your traffic is coming from (referrals), page not found errors, etc. One mistake newbies make is to consider “hits” as the number of visitors they are getting. In actuality, “hits” are useless information. Hits are simply server pulls. As an example, if you have ten images on a page each time the page is loaded each image results in a server pull or “hit.” What you really want to look at is the number of “unique visitors” to your site, not hits, as an indication of your traffic.

If you are an e-commerce site, you’ll also want a way to track conversions, which will require something more than your basic hosting stats. Google offers free web analytics that could be adequate for many site owners, but there are also commercial applications available that offer greater functionality.

Whatever you do, don’t leave the site on autopilot. Check your stats frequently. You’d be surprised at the little things you’ll see that will help you bring in more traffic.

History
There is evidence that the search engines actually look at your domain history in their ranking algorithms (How long the domain has been up, how many years you’ve renewed for, if you’ve changed IP addresses frequently, etc.). The more stable you are the more they consider you a trusted site.

If you’re in it for the long haul, renew your domain for several years at a time (not just annually) and get a dedicated IP address and keep it. The best situation is to have a dedicated web server, but not all of us can afford that. The next best thing is to pay for a dedicated IP address with your host so that you are no longer sharing the hosted IP block. It usually doesn’t cost that much. Not only will the search engines see you as stable, you don’t run the risk of the IP being banned if one of your shared hosting neighbors is naughty.

Don’t bounce from host to host because that screams SPAMMER to the search engines. Find a good hosting company and stay there.

Wednesday, December 12, 2007

On-Page Factors

Title Tag
This one is very important. Among the first things the spiders will crawl on your page is the Title Tag at the very top of your HTML code. This is what you see in the blue bar at the top of your browser when you land on a page. Using unique text in this tag on each page is absolutely essential. I have seen huge sites with thousands of pages all using the same content in the Title Tag of each page, frequently the name of the company as the only text. Not only will you NOT rank for anything but what is in that tag for your entire site (Do you want every page on your site to rank for nothing but your company name? I don’t think so.), but you run the risk of most of your pages winding up in Google’s Supplemental Index, probably never to be seen again. You must have a unique Title Tag related to the unique subject matter of each page throughout your website (10 to 15 words, 80 characters maximum).
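If you want to spot-check a site for the duplicate-Title problem described above, a few lines of Python will do it. The URLs and titles below are invented; in practice you would collect them from a crawl of your own pages:

```python
from collections import Counter

# Hypothetical crawl results: page URL -> contents of its <title> tag.
page_titles = {
    "/index.html": "Acme Widgets - Handmade Blue Widgets",
    "/blue.html":  "Blue Widgets - Acme Widgets",
    "/red.html":   "Acme Widgets",
    "/about.html": "Acme Widgets",  # duplicate: both pages use the bare company name
}

# Count how often each title is reused, then list the offending pages.
counts = Counter(page_titles.values())
duplicates = [url for url, title in page_titles.items() if counts[title] > 1]
print(duplicates)
```

Any URL that shows up in that duplicates list needs its own unique, page-specific Title Tag.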

Internal Navigation
There was a time when the search engine crawlers choked on javascript links and database driven web pages that looked something like http://www.widgets.com/product.php?categoryid=1&productid=10, but they are better at reading them these days. However, you still need to make your links as digestible to the spiders as you can. As much as possible, you should make your links through plain text and CSS (Cascading Style Sheets). Javascript and image map links should be avoided as well as session IDs and variables in dynamic pages. Avoid using frames like the plague! These can all still give spiders a fit. Also, use a sitemap with text links to not only help visitors find what they are looking for, but to direct the spiders to all of your internal pages.

Make Your Site Unique
They say that imitation is the highest form of flattery, but that's a big no-no on the web. Do not copy someone else. Make your site as unique as possible with information that no one else has. In other words, don't steal content off of someone else's site. Not only can that be copyright infringement, but it can put you and the site you copied from in hot water with the search engines for duplicate content (see Duplicate Content below). Creating a buzz about something unique is great link bait. Which leads us to...

Content
Content is King. Content is spider food. The search engines are looking for the foremost authority on a keyword or phrase. Make sure your site has plenty of keyword rich content high on the page that is useful to the visitor as well as digestible to the spiders. Make use of H1, H2 and H3 headlines that contain your keywords. Make sure your prose is natural and easy to read.

Don’t go overboard and make every other word on the page the keyword you want to rank the page for. Stuffing the page with keywords is considered a form of spam.

Focus on search phrases, not single keywords, and put your location in your text (“our Palm Springs showroom” not “our showroom”) to help you get found in local searches.

Having terrific content will not only be great for your visitors and spiders, but it’s wonderful link bait, too (see Links below). A blog is a great way to create fresh, new content (for the spiders and for visitors) and attract inbound links.

Also, use Flash animation and images sparingly. Spiders can read text, but not Flash or pictures. A sure way to kill any chance of ranking well is to create a site that is all Flash or mostly images.

Duplicate Content
Let’s say you have a site that sells a thousand different types of widgets and the pages are all built from the same template with the same text and the only difference is the model of widget on the page. What could happen is that the search engines will not see enough difference in the pages to consider them unique and will rank what it considers the best single page and dump the rest, in the case of Google, into Supplemental Index limbo.

Make sure all of your pages have unique Title Tags, Meta Tags (see below) and text, in this case probably in the form of product description text.

And, if you are writing articles for distribution to the various article sites for mass distribution (a great way to get back links), be sure to publish the article on your own site first and give the spiders a chance to crawl it. That identifies you as the originator of the content. Then push the article out for distribution across the web, making sure you have a link back to your site in the article content.

Code Bloat
Between you, your web designer and web programmer, it’s real easy to wind up with a page that is full of internal code that not only impedes spiders, but causes your pages to load at a snail’s pace. Be very careful with this. Too much code will send both the spiders and the visitors away and can knock the meat of your pages down to the bottom. It’s best to have your spider-friendly content as high in your code as possible, so when you can, place javascript (if you absolutely MUST use it) and CSS in external files that can be called with a single line of code from each page.

For instance, one site I worked with had so much javascript going on that the first 200 lines of code after the Title and Meta Tags were javascript, knocking the rest of the content down and making the page load size huge. I was able to move the javascript into external files, each simply called by a single line of code. This made every page on the site smaller in size and brought the spider-friendly content up higher in the code by 199 lines.

For example, you could put all 100 lines of your CSS on each and every one of your 300 site pages, or you could call your CSS from an external file called style.css with one single line of code on each page, like this: <link rel="stylesheet" type="text/css" href="style.css">. If you don't know what I'm talking about, you'll need to ask your web developer or learn a bit about HTML.

Tweak and Test
Make one change at a time and evaluate. Changing too many things at once can confuse things to the point where you don’t know which change you made did what. For instance, let’s say you changed your content on a page as well as the linking structure and Meta Tags at the same time and the page dropped in the rankings a few days later. How would you know which to point to as the problem?
Try one tweak at a time and give the search engines time to digest it before moving on to the next.

Meta Tags
The only Meta Tag that carries any weight at all as far as SEO goes is the description, and it doesn't have the influence it once had. Still, it's a good idea to make it keyword rich and include what you want to show up in the SERPs (search engine result pages) as your description. Yes, this is what frequently comes up describing your site in the results, so be sure it says what you want it to say.

And, it is believed that having a unique TITLE and Meta description tag on each page will help keep pages out of Google’s Supplemental Index.

The keyword tag has very little influence on rankings anymore, practically none, but it can't hurt to include it. Just don't stuff it with a thousand words. Ten or so should be enough for any page.
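A quick way to sanity-check your meta descriptions before uploading a page is a small script like the one below. The ~155-character cutoff is a common rule of thumb for how much of the description the SERPs tend to display, not an official limit:

```python
def check_description(desc, max_len=155):
    """Flag meta descriptions that are empty or likely to be truncated
    in the search results. max_len is a rule of thumb, not a spec."""
    problems = []
    if not desc.strip():
        problems.append("empty")
    if len(desc) > max_len:
        problems.append("too long (may be truncated)")
    return problems

# Invented examples: a good description, and two problem cases.
print(check_description("Handmade blue widgets, shipped free from our Palm Springs showroom."))
print(check_description(""))
print(check_description("widgets " * 40))
```

An empty list means the description is at least the right shape; what it actually says is still up to you.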

Tuesday, December 11, 2007

keyword phrases

Remember to use supporting words, synonyms, and surrounding text when linking between pages. Allow for at least a 20% topic drift and avoid using the exact same link text (anchor text) to link between all of your pages and Web sites. Avoid building up link reputation that looks unnatural or contrived. If you're a writer, swap up your bio as well. After all, don't you want to be found under a variety of different keyword phrases that are important to your business?

One quick and easy way to see how you are using your keyword phrase "visually" at a glance is to enter your main keyword phrase into Google's toolbar and turn on the "highlighter" option. This highlights every instance of your phrase as it is used on the page. You can glance at it briefly and visually spot wherever you may be over-using any keyword.
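If you don't have the toolbar handy, a crude stand-in is to count the phrase yourself. This sketch (with invented example copy) reports occurrences of a phrase per 100 words; there is no magic target number, it just helps you spot obvious over-use:

```python
import re

def keyword_density(text, phrase):
    """Rough keyword-density check: occurrences of a phrase per 100 words.
    A crude stand-in for eyeballing the highlighted page."""
    words = re.findall(r"[a-z']+", text.lower())
    hits = len(re.findall(re.escape(phrase.lower()), text.lower()))
    return round(100 * hits / max(len(words), 1), 1)

copy = ("Our blue widgets are handmade. Each blue widget ships free. "
        "Read the blue widget guide.")
print(keyword_density(copy, "blue widget"))  # 20.0 -- almost certainly stuffed
```

If a short paragraph scores that high, rewrite it so the prose reads naturally again.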

Monday, December 10, 2007

Stuffing keywords, robots.txt, and article or product reviews

Do NOT stuff keywords into your image Alt tag. Use the Alt tag only for describing what the image is. Keywords in the Alt tag do NOT influence relevancy at all. It is important to use every tag for the purpose it was meant to be used.
Don't stuff keywords in comment tags. Use comment tags in the manner in which the tag was designed to be used -- to place comments about the page itself. Don't put keywords in comment tags at all.

Include a link to your site map on every page of your Web site. Many search engine robots look for your sitemap in order to crawl the entire Web site. If you include a link to your sitemap on each page, you are giving the robots easy access to crawl your entire Web site, regardless of which page they come in on.
Place a robots.txt file on every domain. The first thing a spider looks for when it hits your Web site is a robots.txt file. If it's looking for that file, put it on your site! Robot Manager Pro is an excellent tool for creating robots.txt files as well as monitoring spider traffic. Download a free copy at http://www.websitemanagementtools.com/robot.html.
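You can also test a robots.txt file without any third-party tool using Python's standard urllib.robotparser. The rules and URLs below are examples only:

```python
from urllib.robotparser import RobotFileParser

# Parse a robots.txt body directly (rather than fetching it over HTTP),
# then ask whether a given crawler may fetch a given path.
robots_txt = """\
User-agent: *
Disallow: /images/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

print(rp.can_fetch("Googlebot", "http://www.abctest.com/index.html"))       # True
print(rp.can_fetch("Googlebot", "http://www.abctest.com/images/logo.gif"))  # False
```

This is a handy way to confirm your Disallow rules do what you think before you upload the file.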

Write an article or product review, add it to your site, and submit it to Article Announce. The article is then archived in Yahoo!, and you can expect to see the search engines spider your site more often the more you update your own pages with new content.

Are you prepared for future SEO challenges?

There are roughly three types of responses that we hear from the professional SEO community with respect to how the personalization of search will affect the average business owner's profitability and traffic. In just a minute, I'll let you watch a video of Michael Marshall explaining these 3 responses. But if you are wondering what all of the controversy over the issue of personalization is about, let's touch on that for a moment first.
Most of the major search engines have been gravitating towards providing a much more user-friendly experience by offering enhanced personalization which can be switched on or off. The result of switching it on is that the search engine begins to gather data about the type of information you search for over time, and then it "personalizes" the search results by delivering the most pertinent data based on an individual's search preferences.
The advantage or benefit is geared toward the user. Over time the search engine seems to be able to shuffle through search results and quickly deliver the best information based on the independent searcher's interests. Not only that, but the results are shuffled based on each user's interests even if 3 or 4 people are all searching on the same machine.
What this means is that the aspect of traditional "rankings" or where a Web page shows up in the results will totally depend on who is doing the search. The same search for the exact same keyword phrase, will no longer produce the same exact results to different searchers because results will be shuffled and delivered up according to each searcher's interests. Much of how you performed SEO and traditional rank checking will become obsolete since over time, you can no longer assume every user sees the same results for a specific search term. In fact as personalization becomes more widely adopted, the traditional forms of rank checking will eventually become meaningless. In the future, traditional SEO software tools for rank checking will no longer help you determine how well you are doing or what percentage of your audience is even seeing your page.

What is your reaction to the personalization of search? What are other people saying about the "personalization of search" and its implications for traffic and business?

Saturday, December 8, 2007

Best Content

Who is your "target audience"? When you write your Web site content, who are you focusing it on?

Have the search engines ever purchased anything from you? Forget the search engines! Focus on your customers! When you write content, write valuable content focused on your customers. Focus on one topic (keyword phrase) only. Your keyword-containing tags should include that one topic (keyword phrase) only. Everything on the page should be focused on that one topic. Prove to the search engine that your page is about that topic. The page will then be very relevant for the topic. But it must also be valuable for the customers.

Don't ever create a page just for the search engines. The search engines aren't and probably won't ever be your customers. Keep your true customers in mind, and give THEM true value throughout your site and in your content.

When performing SEO, focus on writing for the human reader first and search engines second. While search engine optimization is important to your visibility, try writing your content first. Most people don't write their best content when trying to optimize AND create content at the same time.
Yet we continually see people trying to do this. We teach our students in class to spend time wisely creating useful content that will be a genuine benefit to their visitors. Once you have written something of genuine value that stands on its own merit, you can go back over the article as a second step and apply some mild optimization. This is much easier than trying to optimize while creating your content at the same time. Following this simple rule of taking it in 2 phases will help you enormously.
For effective search engine marketing, you need more than just top rankings.
Remember to include strong calls to action to compel a response from the reader. What do you want the reader to do once they finish reading your content? You need to clearly build these options into each page and leave nothing to guesswork.
Describe what action you would like them to take and never assume that they will just know what to do, because they won't act unless you spell it out.
While you are increasing the visibility of your newly optimized content, why not also look for ways to increase response from your current traffic too?
One way to increase "business response" with your current traffic is to offer a special promotion. This could be a seasonal promotion, a special offer or a discount on a product featured prominently on your home page. Offer something that is popular at the right price, but don't offer too many choices.
Increasing sales with your current traffic is just as nice as bringing in new traffic. When deciding what you will do, offer something priced for the spontaneous buyer.

Thursday, December 6, 2007

Be sure to include geographical information like your real world address on your
pages. Your information should be included in HTML so the spiders can read it.
A whole new breed of spiders is looking for area codes, zip codes and city
names, to assist searchers in finding stores and services near their location.

Doing so will allow you to come up ahead of the competition when someone searches
for your business... in Orlando or a specific geographical region.

Write customer focused content that appeals to your audience.

Remember that your Web site should be focused on your audience first and
appeal specifically to a niche interest. When writing your Web copy, you
need to dialogue in an appropriate tone and format for the right audience.

Some of the most interesting content will have more appeal if it speaks to
your audience in terms like: you can, you might, you will, yours, your and
you're INSTEAD of we, ours, we're, we will, we can, etc.

This is extremely important when working on sales related copy.

Search for your most important keyword phrase

Search for your most important keyword phrase. Carefully study the top 10 results. Which is the best written title? Which is the most captivating?
Which grabs the reader by the throat and makes him or her want to click on the link? Is it the #1 ranked result? Not necessarily, is it? Is it your listing?
If it's not your listing, and especially if you're having problems with click-throughs for this keyword phrase, how can you change your title to make it more intriguing and designed to pull in traffic? We're talking about click-throughs here. You can often beat the top ranked sites if you have a more appealing and captivating title.
Make a list of title changes, being sure to include your keyword phrase in the title. Set it aside and don't look at it for two days. Then, bring it out again and study the list. Begin culling down the list. Get others' opinions. Try a new title, and see if your click-through rate improves.

Wednesday, December 5, 2007

Are your links visible to a search engine spider?

A lot of content and links displayed on a webpage may not actually
be visible to the search engines, e.g. Flash-based content,
content generated through javascript, content displayed as
images, etc.

Try this tool, which simulates a search engine by displaying the
contents of a webpage exactly how a search engine would see it.

http://www.webconfs.com/search-engine-spider-simulator.php
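For the curious, the idea behind a spider simulator can be sketched in a few lines of Python with the standard html.parser: keep the plain text and the link hrefs, ignore script contents and images. This is a rough approximation for illustration (with an invented page), not what any real crawler actually does:

```python
from html.parser import HTMLParser

class SpiderView(HTMLParser):
    """Very rough sketch of what a text-only crawler 'sees': plain text
    and link hrefs, with script contents and images ignored."""
    def __init__(self):
        super().__init__()
        self.in_script = False
        self.text = []
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "script":
            self.in_script = True
        elif tag == "a":
            self.links.extend(v for k, v in attrs if k == "href")

    def handle_endtag(self, tag):
        if tag == "script":
            self.in_script = False

    def handle_data(self, data):
        # Keep visible text only; skip whitespace and script bodies.
        if not self.in_script and data.strip():
            self.text.append(data.strip())

page = """
<h1>Widgets</h1>
<script>document.write('invisible to many spiders');</script>
<p>Handmade widgets, <a href="/catalog.html">browse the catalog</a>.</p>
<img src="flashy-banner.swf">
"""

sv = SpiderView()
sv.feed(page)
print(sv.text)   # the text a spider can index
print(sv.links)  # the links it can follow
```

Notice that the javascript-generated sentence and the image never show up: if your important content only exists that way, the spiders may never see it.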