Saturday, December 20, 2008
An Incomplete Guide to SEO : Designing your Site
Designing your Site
Ready for the techie stuff? OK, grab your coffee/beer/herbal chai.
First and most important:
You need lots of content, LOTS of it. Before you have even considered site design and such, you should have 100-odd pages of actual content. Yes, there are supposed to be two zeros on the end of that 1… 100, I mean it. A page of content means about 200-500 words.
Of course, no-one does this, I didn't! But if you are serious about getting gobs of traffic, and you do have lots of rich content to publish, just think how far ahead you will be of poor schmucks like me.
As I mentioned before, designing your site for traffic, both human and search engine spiders, is very different from a few years ago. It's now about what is on the page that people can see. No more having a 200-keyword list set to the same color as the background at the bottom of the page.
If you're impatient, here, according to the "SEO guys", are the most important factors in deciding your position in the search engine results pages (SERPs), along with a vague number I came up with to show relative value. These ten factors add up to a whopping 21% of the SERP.
Title Tag – 2.3%
This is what appears in the blue bar at the top of your browser; it comes from the <title> tag in the head of your page.
Joomla! Note!
With SEF enabled, your title will reflect the content of the page. Even better is to install a 3rd-party SEF extension; then you can set the page title to be the title alias of that page. I prefer using the title alias for my page title, so I can have one title on the page and control the one delivered in the <title> tag separately.
Critical note:
You MUST have some sort of SEF enabled. Search engines hate dynamically generated pages, and dynamic pages are the whole point of Joomla! Even if you have just the basic built-in SEF enabled, the benefit of the search engine "seeing" static pages is huge, far outweighing the little bonus gained from having keywords in the title too.
Anchor Text of Links – 2.3%
The phrasing, terms, order and length of a link's anchor text is one of the largest factors taken into account by the major search engines for ranking. Specific anchor text links help a site to rank better for that particular term/phrase at the search engines. In other words, it’s the actual text that represents the link on a web page.
Keyword Use in Document Text – 2.2%
Your keywords must appear in the actual copy of the page. Supposedly, search engines pay more attention to the first and last paragraphs. The way to go about this is to have your keywords firmly in mind as you write your copy. I don't know about you, but I find this really hard. I prefer a different approach.
There is a simple trick here: write your quality content, then use a keyword density tool to find the keyword density. THEN take the top words and add them to the meta keywords tag for that page. This may seem backwards to some; it optimizes a page for what you actually wrote, rather than trying to write a page optimized for certain words. I find I get much better correlation this way and can then tweak my text afterwards.
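If you prefer to run the numbers yourself, here is a minimal, hedged sketch of such a density check in Python. The sample copy and the "skip words shorter than four letters" rule are my own assumptions, not part of any particular tool.

import re
from collections import Counter

def keyword_density(text, top=10):
    """Report the most frequent words and their density (percent of all words)."""
    words = re.findall(r"[a-z']+", text.lower())
    total = len(words)
    # Assumption: ignore very short words so "the", "and", "of" don't dominate.
    counts = Counter(w for w in words if len(w) > 3)
    return [(w, c, 100.0 * c / total) for w, c in counts.most_common(top)]

copy = """Paste your page copy here. The script reports which words dominate,
so you can feed the top ones into the meta keywords tag for that page."""

for word, count, density in keyword_density(copy):
    print(f"{word:15s} {count:3d}  {density:4.1f}%")

Run it on the finished article and compare the top words against the keywords you planned in Part 2, then tweak the copy if they don't line up.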
Sure, if you want to, you can further optimize by having the keywords in header tags, bold text, etc. As a guide, these might contribute less than 1% to the SERP.
Joomla! Note!
Joomla is good and bad here. The good part is it's easy to add keywords to the meta keywords tag for a page. You just go to the meta info when you are editing the content and plop them in. Note that they are added as well as any keywords you have specified in the main global configuration. It's good to have only your most important 2-3 words there and put the rest in the pages.
The bad part is linked to the fact that Joomla is dynamic. The code is not very lean, that is, there is a lot of HTML compared to actual copy text. This in turn (indirectly) reduces your keyword density. One way to address this is to design without tables (I hear vavroom applauding). Using CSS instead of tables means leaner code. It's also possible with CSS to have your page "source ordered". This means that the real content (the middle column to you and me) comes before the side columns and/or navigation in the HTML source.
Accessibility of Document – 2.2%
We are not talking human accessibility here (as in Section 508). Accessibility here means anything on the page that impedes a search engine spider's ability to crawl it. There can be a number of culprits (a quick audit sketch follows the list below):
* Avoid Splash Pages: Flash and heavily graphic introductions prohibit engines from crawling your site.
* Avoid Frames: Never use pages with frames. Frames are too complex for the crawlers and too cumbersome to index.
* Avoid Cookies: Never require cookies for Web site access! Search engine crawlers are unable to enter any cookie-required materials.
* Avoid JavaScript when Possible: Though JavaScript menus are very popular, they prevent crawlers from accessing those links. Most well-indexed Web sites incorporate text-based links primarily because they are search engine friendly. If necessary, JavaScript should be referenced externally.
* Avoid Redirects: Search engines frown upon companies that use numerous Web sites to redirect to a single Web site.
* Avoid Internal Dynamic URLs on the Home page: Though many sites incorporate internal dynamic links, they should not incorporate those links on the home page. Engine crawlers are currently ill-equipped to navigate dynamic links - which often pass numerous parameters using excessive characters.
* Utilize Your Error Pages: Too often companies forget about error pages (such as 404 errors). Error pages should always re-direct "lost" users to valuable, text-based pages. Placing text links to major site pages is an excellent practice. Visit www.cnet.com/error for an example of a well-utilized error page.
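As promised above, here is a minimal sketch of a crawlability audit in Python. It only inspects the raw HTML of a single page for a few of the blockers listed; the URL and the patterns it looks for are my own illustrative assumptions, not an exhaustive check.

import re
import urllib.request

def crawl_audit(url):
    """Flag a few common crawl blockers in a page's raw HTML (rough heuristic only)."""
    html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "replace")
    checks = {
        "frames":        r"<frameset\b|<frame\b|<iframe\b",
        "flash":         r"\.swf\b|application/x-shockwave-flash",
        "js-only links": r'href\s*=\s*["\']javascript:',
        "session IDs":   r"PHPSESSID=|jsessionid=",
    }
    for name, pattern in checks.items():
        found = re.search(pattern, html, re.IGNORECASE)
        print(f"{name:15s}: {'WARNING - found' if found else 'ok'}")

# Hypothetical URL -- point it at your own home page.
crawl_audit("http://www.example.com/")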
Joomla! Note!
Many things to be careful of here. The most important is to go and turn on Search Engine Friendly URLs (SEF). It changes your links and pages from dynamic to static.
The other important factor is JavaScript menus. They are very popular because they look great. As good as they look to people, however, they look equally bad to spiders. Try using CSS to style your menus; you'll be surprised how good they look. You can even have drop-down sub-menus.
Internal Links- 2.1%
Even more important than the holy grail of external links are internal links. Who knew! Easily the most underrated criterion. It's important to make sure you are making good use of anchor text. A well-linked-to document is considered more important than an obscure page.
Tight Site Content Theme – 2.1%
What your website is about is determined through analysis of the content. It's critical that it correlates to your keywords, anchor text, etc.
One strange offshoot of this is that perhaps it's not worth spending much effort trying to build the page rank of the home page. This strange concept is explained in the idea of Search Engine Theme Pyramids.
A related factor is having a good sitemap. Not only is it good spider food, you can also load it with lots of quality anchor text for those internal links, as well as relevancy text (the text that appears near a link). Also important is the invisible Google sitemap, which is an XML file for the Google spider only.
Joomla! Note!
Thumbs up for Joomla! Add-ons such as Docman make it effortless to add globs of content quickly and easily. Remember, it's a Content Management System after all. There are also some add-ons for sitemaps, though I think you have to upload the Google sitemap independently.
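Since the Google sitemap is just a small XML file listing your URLs, you can also generate it yourself and upload it. Below is a minimal, hedged sketch; the URL list and output file name are placeholders for your own site.

# Minimal sketch: build a Google XML sitemap from a hand-made list of URLs.
from xml.sax.saxutils import escape

urls = [
    "http://www.example.com/",
    "http://www.example.com/about.html",
    "http://www.example.com/articles/seo-basics.html",
]

lines = ['<?xml version="1.0" encoding="UTF-8"?>',
         '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
for url in urls:
    lines.append("  <url><loc>%s</loc></url>" % escape(url))
lines.append("</urlset>")

with open("sitemap.xml", "w") as f:
    f.write("\n".join(lines))

Upload the resulting sitemap.xml to your site root and submit it through Google's webmaster tools.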
External Links – 2.0%
These are the links from other sites to you. Note it's much better to have specific pages linked to rather than just your homepage, because of the idea of Search Engine Theme Pyramids. Don't bother with link farms or anything you see advertised for a link. You are much better off finding links from sites that have topics similar to yours (see below).
Theme of Linking Sites – 2.0%
The search engine is trying to figure out what your page is about, so it can decide if it's relevant to a user's search. Links from pages with similar topics add credence to your page. When trying to search out those links you can use something like WebFerret. Or, if you just want a quick method, use the "related:" operator in Google, e.g. type "related:www.yahoo.com" and it will search for sites related to the topic of Yahoo (whatever that is?). Then spend some time emailing webmasters and asking for links. There is software out there that will do this automatically for you.
Popularity of Linking Sites – 1.9%
This means that links from sites that are "important" (i.e. rank highly themselves) are valued more than those from lower-ranked sites. A factor worth considering when searching out links: get the ones from sites with a high page rank first.
Keyword Spamming – 1.9%
Careful, this is a negative factor!! It means having a keyword density in your text or tags so high that the engine decides you are stuffing. Your rank will go from #1 to #10000 in a heartbeat. Want to know the best part? No-one actually knows what percent density triggers it, and it's probably different for different engines! Between you and me, I am not going above 15% on my pages.
For the morbidly curious, I have the other factors (there are too many to post here) on my site at Search Engine Ranking Factors.
Part 3 Summary
* Fortune favors those with rich content
* There are many factors that determine search engine page ranking.
* Rather than tweak minor tags, it's better to leverage Joomla's true power as a fully-fledged Content Management System to gain rank.
* Don’t use flash (ok, I admit I am biased)
* Make sure your pages are under 10k. Not mentioned above, but it just occurred to me.
An Incomplete Guide to SEO : Launching your web site
Launching your web site
So you have designed your web site, and now it's time to put it online. I'll leave the actual process of installation for another time. I will mention hosting, however.
Joomla! Note!
Not every host will meet the needs of a Joomla site. One issue is safe mode. It's a server setting, and it needs to be off for Joomla to work properly. Other issues that often crop up involve ownership of files on the server. I have a few reviews of recommended Joomla hosts here.
Ok, so we have our site up, what next?
Open your doors to the spiders
To start showing up on rankings, your site needs to be indexed. This means a program called a spider comes to your web site and crawls it. Crawling involves looking at the tags, text and following all the links it can find. Make sure your site is easy to crawl:
Every page should be linked to from more than one other page on your site. This is easy to do with Joomla; it happens with the mainmenu and other menus. Also try to keep all pages within two levels of the root (home page). If they are buried, try adding more specific sections to hold that content.
Joomla! Note!
Two Common Joomla Mistakes!
* Flash menus. I showed my bias against Flash in my last article. Spiders struggle to follow Flash. If you really must have Flash navigation, then you need to include some plain old text links somewhere on the page. An easy way to do this is in the footer. Go to /includes/footer.php and add your links there. They will then turn up on every page, easy eh?
* Don't put it online before you have a quality site to put online. It's worse to put a "nothing" site online than no site at all. You want it fleshed out from the start. It's very easy to fall into this trap with Joomla as it's so easy to put a site up, especially with the built-in templates. Better to work offline with MSAS and import the SQL database (note to self: write a guide to working offline).
One last thing, to actually be indexed, the spiders need to know you exist. This happens by submitting your site and linking.
Submitting your site
The first part is real easy. Go and submit your site by hand to all the major engines; here are a few to get you started.
http://www.google.com/addurl/?continue=/addurl
http://search.yahoo.com/info/submit.html
When you do submit, take note of who supplies the search. Alltheweb is powered by Yahoo, for example, so you don't need to submit there.
The second part is much harder. Forget about your submissions for a few months. That's right, submit them and forget about it. Don't even think about using one of those "submit your site to 89768 engines for $20" deals.
Also go submit to a few directories. If you have the right contacts, sacrifice a goat or something and submit to dmoz.org. It's the granddaddy, with a page rank of 9, but almost impossible to get on.
Linking your site
Getting links to your site is perhaps the most important part of SEO and perhaps worth a topic all in itself. Needless to say, the more links from quality sites you can get the better, especially ones on the same topic.
An easy submission is in the community news section of Joomla.org. Hey, it's free, will give you a link and might also trigger a spider to crawl you. If you have a useful site, announce it to the community!
Logging and Tracking
Get a decent tracker that can track inbound referrals (where someone came from). Most hosts have several built in; I use awstats. Whatever you do, don't use a lame graphic counter, it doesn't give you what you want and looks unprofessional. If your host doesn't support referrer logs, then back up and get a new host. You can't run a modern site without full referrer data available 24x7x365 in real time.
For the more compulsive amongst us, you can start watching for spiders from search engines. Make sure those that are crawling the full site can do so easily. If not, double-check your linking system (use standard hrefs) to make sure the spider found its way throughout the site.
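One simple way to watch for spiders is to scan your raw access log for well-known crawler user agents. The sketch below is a hedged example; the log file name, its format (Apache common/combined log) and the list of bot strings are my own assumptions.

from collections import Counter

LOG_FILE = "access.log"                      # hypothetical path to your server log
BOTS = ["googlebot", "slurp", "msnbot"]      # a few well-known crawler user agents

hits = Counter()
with open(LOG_FILE) as log:
    for line in log:
        lower = line.lower()
        for bot in BOTS:
            if bot in lower:
                # In the common/combined log format the requested path is the 7th field.
                parts = line.split()
                path = parts[6] if len(parts) > 6 else "?"
                hits[(bot, path)] += 1

for (bot, path), count in hits.most_common(20):
    print(f"{bot:10s} {count:5d}  {path}")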
Buying Traffic
One underused arm of SEO is simply buying traffic. You might not think of advertising when you think of optimizing your site, but the ultimate goal of all this is traffic, so why not just skip the middleman.
I recommend using Google AdWords. It's a pay-per-click program that has somewhat revolutionized online advertising. Basically you only pay (usually a few cents) when someone actually clicks on your link. Your ad is targeted at certain keywords you choose (remember part 2?), which means it's targeted traffic, the best kind.
I'll probably do a guide at some point, but to get started, you need a Google account. Also, to help you figure out how much to bid, and on what words, I use this tool the most:
http://www.pixelfast.com/overture/
It does the bid and terms at the same time.
Where is all my traffic?
In March 2004, Google implemented a new filter, now referred to as "The Sandbox".
Google's thinking was that a new web site shouldn't be able to get good rankings until it proves itself. Spammers generate millions of new pages daily, along with millions of new links to go with them.
Google withholds high-ranking ability from new sites by de-valuing the new links for 2-4 months. If the domain and backlinks have existed for a certain length of time (4 months?), then you are OK and escape from the sandbox.
This penalty is new-site based. Long-standing sites have no trouble ranking new pages. Over time, the newly generated links are given weight, and eventually the sandbox effect goes away.
Don't get too worked up about instant traffic; it's probably not going to happen anyway because of the sandbox. For the next few months you are better off spending your time writing content, a page every few days.
Part 4 Summary
* Use a Joomla friendly host
* Make sure your site can be spidered
* Submit and forget
* Buying traffic is surprisingly cheap
* You won't get a good SERP position to start
The SEO and Joomla Series:
Part 1: How to earn $1,000s a day with Search Engine Optimization and Joomla
Part 2: Planning your site
Part 3: Designing your Site
Part 4: Launching your site
An Incomplete Guide to SEO : Planning your site
Planning your site
Why do you want traffic?
Before you go anywhere you need to answer this question. You can break it down into:
* What is your web site about?
* Who will visit it?
* What will they gain?
* What will you gain?
Write the answers on a piece of paper.
No, really!
Now that you have thought somewhat about who is going to visit your site, we can talk about the how.
Keywords..
Keywords..
Keywords..
Imagine you are a potential visitor to your site. What keywords will you type in to find it? Take a blank piece of paper. Now, on your piece of paper, write down as many words or phrases as you can that you, as a potential visitor, would search for to find a site like yours in a search engine. Try to write 20 to 30 keywords or phrases on your piece of paper. If you're having trouble coming up with keywords, ask your partner, friends or family members which keywords they would use to find your site. At this point you should have a list of no less than 20 keywords or phrases at your disposal.
You need this tool below:
http://www.pixelfast.com/overture/
I use it almost on a daily basis. It allows you to find out which keywords people are using in their searches, which, as I'm sure you will agree, is very valuable information!
Another useful way of doing this is a beta Google tool:
http://www.google.com/webhp?complete=1&hl=en
Start at the top of your keyword list that you wrote earlier and enter each one into the text box. As you can see, the term suggestion tool returns a list of keywords and how many times they were searched for. As you type each of your keywords into the text box and see the number of searches, write that number down next to your keyword on the page.
You should now have a list of keywords with the number of searches for that keyword from last month on your page. To get the 5 most popular keywords, simply take the 5 keywords with the highest number of searches. Flip your paper over and write them down in order of most to least popular. You should now have your list of 5 popular keywords, maybe something like this:
* Marketing: 1,406
* Advertising: 704
* Web Site Promotion: 442
* Marketing Online: 56
* Branding: 5
These keywords are going to form the basis for all of your site optimization strategies. Keep your keyword list with you as you read through the rest of these articles.
The first way to use these keywords
Engines use your domain name as a factor in the Search Engine Results Page (SERP). Now, there is a lot of debate here; some think that branding for the viewers is more important than having a keyword in the URL (remember the URL of my lead example?). But if you can combine both, then great! Notice my domain is www.compassdesigns.net. This will get me a little boost if someone searches for web design.
Anyway, you can't easily change your domain after you have made your site, so this is why we are thinking about SEO before we have even started on the site design. If we can use a keyword in the domain, go for it.
Part 2 Summary:
* Ask yourself who will visit your site, why and what will you get out of it.
* Research your keywords.
* Domain name; branding or keyword?
An Incomplete Guide to SEO
How to earn $1,000’s a day with Search Engine Optimization and Joomla Joomla.
"How you can profit from the EXACT SAME search engine optimization strategies that I used to charge clients $3,590 a day to implement!"
In this web-based, no-hype guide, I'll reveal my simple step-by-step search engine optimization strategy that I have been using for 2 years on over 350 clients and that anyone can use to get a front page ranking on Google:
Do you want:
More web site traffic? More orders? A better cash-flow? A healthier Internet business? A front-page ranking on Google and all major search engines? More money than you thought humanly possible?
... if so, keep reading. Get ready to become a search engine optimization "insider". What I'm about to teach you are the actual secrets that I have used to get unlimited daily traffic for over 350 of my clients from Google. For the sake of your business, you can't afford not to read my search engine optimization guide!
Taken from: http://www.google-search-engine-optimization.com/
Used without permission as an example of how web marketing consultants get a bad name!
There are millions (literally) of these sites on the web. The real truth about search engine optimization is that there is no "silver bullet" any more. It used to be true that you could stuff a few keywords into some metatags and you would get lots of traffic. Now, search engines are much smarter. Google recently released its patent #20050071741 on "Information Retrieval Based on Historical Data" (that's that little search page to you and me). In the document were over 118 factors that affected a web site's position in the search engine's rankings!
This is the real truth about SEO:
There is no such thing as Search Engine Optimization any more.
The only reality now is having a long term web marketing strategy and a commitment to building a site full of quality information.
Having said that, assuming that your site is one of the ones with the quality content, SEO still has its place.
"On an average day, about 68 million Americans will go online."
"More than half of them, over 38 million people, will use a search engine."
Source: Pew Internet & American Life Project, January 2005.
There are a lot of people out there, and why shouldn’t they come to your site? Especially if all the “other guys” are still just stuffing metatags. Over the next five articles in this series, I will explain some of the things you can do to increase your traffic and visibility, and make specific references to how this is implemented in Joomla. I will be looking at the steps in a roughly chronological order that you might take as you launch a new site. Follow this guide and some time in the next 6 months, you might be getting that traffic you wanted.
Getting the Most Out of Blogging/RSS
Optimizing Your RSS Feed
1. Subscribe to your own feed and claim it on blog engine Technorati
2. Focus your feed with a keyword theme
3. Use keywords in the title tag; keep it under 100 characters
4. Most feed readers display feeds alphabetically, title accordingly
5. Write description tags as if for a directory; keep them under 500 characters (a length-check sketch follows this list)
6. Use full paths on links and unique URLs for each item
7. Provide email updates for the non-techies
8. Offer an HTML version of your feed
9. For branding, add logo and images to your feed
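As mentioned in points 3 and 5, title and description length are worth checking. Here is a minimal, hedged sketch that checks them for a standard RSS 2.0 feed; the feed URL is a placeholder and the 100/500-character limits are simply the guidelines above.

import urllib.request
import xml.etree.ElementTree as ET

FEED_URL = "http://www.example.com/feed.xml"   # placeholder: your own feed

xml_data = urllib.request.urlopen(FEED_URL, timeout=10).read()
root = ET.fromstring(xml_data)

for item in root.iter("item"):
    title = (item.findtext("title") or "").strip()
    desc = (item.findtext("description") or "").strip()
    if len(title) > 100:
        print(f"Title over 100 chars ({len(title)}): {title[:60]}...")
    if len(desc) > 500:
        print(f"Description over 500 chars ({len(desc)}) for item: {title[:60]}")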
Optimizing Your Blog
1. Simplify archiving structure for shorter, cleaner URLs
2. Use CSS to improve usage of H1, H2, and H3
3. Tweak titles for keyword placement
4. Add robots.txt and a favicon
5. Widgets: use them sparingly; test one at a time
6. Blogging platforms WordPress and Movable Type can make the process more user friendly
7. Use a separate domain
Linking
1. Use keywords in anchor text with links; do this a lot
2. Link to topic authorities
3. Give lots of link love in general
4. Delete the word "permalink," replace with real title
5. Create link lists directing to other useful information
6. Cross-link blog and main website
7. Deep link to content on target site
8. Inbound links are valuable
9. Become a "link hub," an authority site, a real resource
Promoting Your Blog
1. Get content or feeds syndicated in other publications
2. Use full-text feeds – "don't be stingy"
3. Increase items in feed from 10 to 20
4. Highlight popular posts and chestnuts on your blog
5. Publish feed as an HTML page (RSS Digest) or as a podcast (Feed2Podcast)
6. Publish headlines from your blog on your website, your MySpace page, etc.
7. Make sure pinging is turned on
8. Use MyYahoo and MyMSN accounts to submit to these search engines
9. Post often if you want to become an authority over time
Source: WebProNews
Saturday, November 22, 2008
Number of pages indexed
This tool will generate a report showing how many pages of your website have been indexed by the search engines so far. It supports all major search engines: Google, Yahoo, MSN, Alltheweb, Hotbot and AltaVista.
Tips to get your website indexed quickly
Want to get more pages of your website indexed by search engines? Here are some useful tips:
- Site structure
Make sure your subpages are easily accessible by search engine bots. Create a sitemap page that links to all pages of your website and place a link to it from the homepage. If you have a large website, break the sitemap up into several parts and keep the total number of links under 100 per page. Link from your homepage to deep pages to get them and the pages around them crawled faster.
- No session IDs
Get rid of session IDs. Bots rarely index pages with session IDs because they think those are different pages (because of the different IDs) with the same content.
- No variables in the URLs
Avoid variables in the URL. They are indexed slowly and often get no PageRank. Use Apache's mod_rewrite feature to convert your dynamic pages to static ones. Redirect all old pages to the new ones with a 301 redirect. Note, however, that if you do this on an established website, search engines will need some time to crawl the new static pages.
- Get links!
Inbound (incoming) links are half of the SEO battle nowadays. Your website needs links from quality, on-topic websites. The higher your PageRank, the faster and deeper your website will be crawled by bots. Get links not only to your home page, but also to the deep pages. Make sure the spiders can find your site from different places.
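Once the old dynamic URLs have been redirected, it is worth confirming that they really answer with a 301 (permanent) status rather than a 302 or a plain 200. Below is a minimal, hedged sketch; the Joomla-style URLs are placeholders and it only handles plain HTTP, not HTTPS.

from urllib.parse import urlparse
import http.client

def check_redirect(url):
    """Return the HTTP status and Location header for a URL, without following redirects."""
    parts = urlparse(url)
    path = parts.path or "/"
    if parts.query:
        path += "?" + parts.query
    conn = http.client.HTTPConnection(parts.netloc, timeout=10)
    conn.request("HEAD", path)
    resp = conn.getresponse()
    conn.close()
    return resp.status, resp.getheader("Location")

# Placeholder dynamic URLs that should now 301 to their static replacements.
old_urls = [
    "http://www.example.com/index.php?option=com_content&id=42",
    "http://www.example.com/index.php?option=com_content&id=43",
]
for url in old_urls:
    status, location = check_redirect(url)
    flag = "OK" if status == 301 else "CHECK"
    print(f"{flag:5s} {status}  {url} -> {location}")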
Keyword Suggestion Tool - Keyword Popularity Tool
This keyword suggestion tool will help you choose the right keywords for your website. You can see which keyword combinations are more popular and also get ideas for more keyword combinations. The keyword popularity results are extracted from two sources - Keyword Discovery and Overture.
Keyword suggestion tips
Choosing the right keywords is critical to the success of a website. Target the wrong keywords and you have lost. There is pure gold hidden in less competitive keyword combinations and variations. Here are some more keyword suggestion tips for your success:
- Start your research with more general terms. Let's say your website is about widgets. Type "widgets" into our keyword suggestion box and study the results. Write down all results that are somehow relevant to your website.
- Dig for more specific keywords. Based on the list you have just created, do a search for every keyword combination (say "red widget", "discount widget" and so on) and write down the new keyword combinations. It makes little sense to search again for those new terms, as you would not get anything new.
- Now think about synonyms that people might search for, like "buy widgets" and "purchase widgets". Do research for them too.
- Now you should have a decent list of relevant keywords. Optimize your pages for them. Create new pages optimized for the missing keyword combinations and link to them from your existing pages.
- If you are serious about keyword research, we suggest signing up for Keyword Discovery. You will be able to search their huge database of terms that people search for. Here are just some of its advantages:
- Search for keywords and get hundreds of keyword combinations
- Search for related terms (for instance, if you enter "golf", you will get "tee time", "tiger woods", etc.!)
- Search for common misspellings
- Check the popularity of all search terms
- Find the balance between the total number of competing web pages for each search engine, and popularity of the search term. Then attack the niches!
- Keyword Research
- Industry Keywords
- Spelling Mistake Research
- Seasonal Search Trends
- Related Keywords
- KEI Analysis
- Keyword Density Analysis
- Domain Researcher Tool
- Wordtracker™ Comparison
All of these tools will bring you a lot of targeted traffic relevant to your website. Trust us - you will get your investment back in no time.
You can take advantage of this information to:
- Optimize the content of your web pages and your meta tags
- Maximize your pay per click campaigns
- Take traffic away from your competitors
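The "balance between competing pages and popularity" mentioned above is often expressed as a Keyword Effectiveness Index (KEI). One common formulation is searches squared divided by the number of competing pages; the sketch below uses made-up competition figures purely for illustration.

# Hedged sketch of a KEI-style ranking: KEI = searches^2 / competing pages.
# A high KEI suggests a popular term with relatively little competition.
keywords = [
    ("marketing",          1406, 250000000),   # (phrase, monthly searches, competing pages)
    ("web site promotion",  442,   1200000),
    ("marketing online",     56,    900000),
]

def kei(searches, competing):
    return (searches ** 2) / competing

for phrase, searches, competing in sorted(keywords, key=lambda k: kei(k[1], k[2]), reverse=True):
    print(f"{phrase:20s} KEI = {kei(searches, competing):.4f}")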
Link popularity tips
Here are some tips to improve your link popularity:
- Submit to the major link directories
Google seems to add weight to websites that are listed in the DMOZ and Yahoo directories. If your site is accepted into DMOZ, it will soon appear in the Google directory and in hundreds of DMOZ clones out there. If you've got the budget, submit your site to the Yahoo directory. If you don't have the budget, try the free submission (don't expect too much though). For non-commercial sites a submission to Zeal.com could be worth the time you need to understand their submission guidelines.
- Free directories
Almost every topic has a directory that accepts free submissions. Submit your site there. Do a search on your favorite search engine for "yourkeyword directory", "yourkeyword links" etc.
- Link exchanges
Link exchanges are not as effective as one-way links to your website and are pretty time consuming, but they are still worth it sometimes. Look around in the Google directory for your keywords. You will find a lot of sites with link pages that do link exchanges with other sites. Simply ask for a link swap. A non-templated personal message can do wonders. A search for "yourkeyword add url", "yourkeyword submit site" etc. will pop up hundreds of websites to exchange links with.
- Link quality
Quality, on-topic links work best. Don't waste your time submitting your site to FFA pages or spamming guestbooks and forums - this is not a long-term solution and those links seem to count for less and less lately.
- The secret to getting a lot of good one-way links for free
Write quality content for your website. Provide some free services that everyone would love. Work really hard and with passion and you will be surprised how many people link to you. Not because of link exchanges or PR, but because they like your website. After all, that is what the web is about - links from some sites to others.
Wednesday, November 19, 2008
7 Steps to Greater Link Popularity and Higher Search Engine Rankings
Link Popularity is one of the most important factors affecting your Search Engine rankings. When your site is popular with other web-sites, the Search Engines love you, but if you are the new kid on the block, they ignore you - totally. It's great to be popular and improving your Link Popularity will also get you higher rankings on the search engines.
Why is Link Popularity important for Higher Rankings?
Search Engines are just software programs that try to analyze the utility and validity of your content with regard to certain keywords. They also try to understand the importance or ranking of your site against all competing sites in their indices.
Just How do Search Engines Learn About the Utility and the Validity of Your Content?
Search Engines just love sites that users love to visit. They love it even more if users spend more time at the site before they browse away. Now how do they know which sites humans love? Obviously with the billions of web pages on the world wide web, the Search Engines cannot manually go to each site and check their content. They let other humans do that work for them. Then they watch as others link to your site. They watch your Link Popularity.
Link popularity simply means how popular are you on the World Wide Web. And your popularity is determined by the number of people linking to your website. The greater the number of external links pointing to your web-site, the greater your ranking. Simple, isn't it?
Here is how to find YOUR link popularity: Go to Google.com and type in "link:" followed by your URL. This returns all the sites that link to you in the Google index, e.g. link:http://www.yourdomain.com.
How Do You Get a Lot of People to Link to You?
Getting people to link to you is easy if you establish your presence. Show that you are a master of your domain. Even if you deal in manure, and there are people out there wanting to know more about it, you have an audience. And if you establish your presence well enough, the links will automatically follow.
Here are a few ways you can increase your inbound links.
1) Articles: Almost anybody can write articles. This is the easiest way to get started on your way to Search Engine manna. It's just a question of putting down your thoughts in a coherent manner. As long as the information you are providing is useful and valid, your articles will be sought after by publishers out there. There are also hundreds of article directories on the net.
To find article directories, Google 'Article Directories'. You get nearly 53 million hits. There are also some awesome free tools available for this.
For those with a morbid fear of writing, you can find 'Ghostwriters' as well as 'Virtual Assistants' on the Internet. Search on Google for those terms and you will find many a link to sites that provide you suitable services. You may also use sites like Elance.com and RentACoder.com for people who will write articles for you.
2) Blogs: Blogs are not as easy to do as Articles are. But they are a very useful tool, nevertheless. When you have a blog, you have an audience. It's like a fan base. They have certain expectations of you. They want you to churn out hit singles or albums. But they want you to give them more. Even though they started out as mere diaries or daily jottings of a few people, currently they are excellent link building, content management and search engine optimization tools.
In fact almost any industry can have a blog portal. As always run a search and you will find something in your industry. Or you can go to blogger.com or wordpress.com to start your own blog free of charge.
3) Free Products: The easiest way to do this is to write a tool and offer it for download. Or maybe even provide a service for free. The greatest example of building a business with free service that comes to mind is hotmail.com. Once your service becomes a must-have, it's only a matter of time before you can start making money from it. Also you can have people point to your site to give their customers access to your tool.
4) Trackbacks: This is the easiest way to get your links out there. What you do is go to other blogs (Authority Sites) in your industry and become a regular contributor. Or comment on the blogs. Along with your signature, most people offer a link to your web-site. And the more links you have pointing back to you, the more popular you get. Also, when you become a major contributor/commenter, people start valuing your opinion more.
5) Syndication: You can offer your articles to other sites for publishing for free or for payment. Your article links back to your site.
6) Reciprocal links: You offer a link to another peer site, in return for them linking back to you. This is a good way to get your target industry users to come to your site. Also you introduce your users to other sites that they may be interested in visiting. They will be thankful to you for pointing them in the right direction. And they come to trust you more.
7) Partner links: Your local Chamber of commerce, Suppliers, Resellers, Affiliates, Sites selling complementary products etc.
Monday, September 29, 2008
Link Building
There are three types of links which will increase link popularity for a website.
Internal links :
Incoming links :
Outgoing links :
Link popularity is defined as the number of links pointing to and from related websites, and it is an extremely important factor for improving a site's relevancy in search engines.
Internal links :
The number of links to and from pages within a site is called internal link popularity. If some pages are buried deep within the site, cross-linking the important related pages helps search engine spiders find and index them quicker.
Incoming links :
Incoming links are of 2 types:
- Links from sites we control
- Links from sites we don't control
Links pointing to a website from other related sites are called incoming link popularity.
To find the link popularity of a website, or to see which sites are linking to our website or a competitor's website, we go to the Google search box and enter "link:" followed by the domain name, without using "www".
Outgoing links :
Outgoing links are links pointing to other related sites from your site. Search engine spiders will crawl your site's outgoing links and determine whether the content of the sites you link to is related to the content of your own site.
How much importance outgoing links add to a site's link popularity rating is still being debated by search engine optimization specialists.
Site maps :
A site map contains links leading to most or all pages of the website. Site maps are hierarchically organized; they are visual models of a web site's content that allow users to find a specific webpage. If a website has many pages it is recommended to have a site map; using the site map, search engine spiders will crawl the links and index the entire website.
Outgoing links
Outgoing links :
Outgoing links are links pointing to other related sites from your site. Search engine spiders will crawl your site's outgoing links and determine whether the content of the sites you link to is related to the content of your own site. How much importance outgoing links add to a site's link popularity rating is still being debated by search engine optimization specialists.
Link keywords :
It is important to name your internal and outgoing links carefully. Since keywords play a major part in determining the relevancy of a Web page, it is essential that they are also included in link text.
Link quality :
The quality of the links is just as important. The types of sites we should concentrate on getting links from include major search engines (Google.com), popular search portals (MSN.com), web directories (Yahoo.com and the Open Directory Project - dmoz.org), highly trafficked sites (eBay.com and Amazon.com), news sites (CNN.com), and sites related to our site's theme.
Link exchanges and farms :
Do not get links from link exchange sites and link farms. Link farms are networks of heavily cross-linked pages on one or more sites, with the sole intention of improving the link popularity of those pages and sites. All of the major search engines consider such links as spam, so stay clear of these types of links.
Incoming links
Incoming links are of 2 types.
- (Links from sites we control)
- (Links from sites we don’t control)
Links pointing to a website from other related sites are called incoming link popularity.
To find the link popularity of a website, or to see which sites are linking to our website or a competitor's website, we go to the Google search box and enter "link:" followed by the domain name, without using "www".
For example:
link:tipsoninterview.com
1. Links from sites we control
It is good to cross-link our own web sites. Select keywords that describe the site we are linking to. The reason for doing this is that some of the major search engines, such as Google, place great importance on the text used within, and close to, links.
2. Links from sites we don't control
There are two ways of finding sites to link to our website. The best way to get other sites to link to our website is to ask them politely, and the best way to find likely candidates is to look at the web sites that link to our competition.
Once we have compiled a list of related sites, we add links to those websites on our own web site. Then we send an email to each web site owner informing them that we have linked to their site and politely ask them for a link back to our site.
Another way of finding sites to link to our web site is to find web sites that accept site submissions. To find such web sites, visit a search engine, such as Google, and search for:
"add url" “keyword"
Include the quotation marks to ensure the search engine returns only pages with the exact search phrase we enter. Also try replacing "add url" with one of the following sets of search phrases:
add site +keyword
add link +keyword
Submit url +keyword
Submit site,
Submit link,
We can also find site submission pages by searching for the actual page name. By replacing the "add url" search phrase with one of the following page names we get the submission pages:
addurl.html, addsite.html, addlink.html.
submiturl.html, submitsite.html, submitlink.html.
Add-url.html, add-site.html, add-link.html,
Submit-url.html, submit-site.html, submit-link.html,
add_url.html, add_site.html, add_link.html,
submit_url.html, submit_site.html, submit_link.html
Keyword Placement - where to place keywords
1.Keywords in title tag :
As per the HTML specification it is not mandatory to write anything in the title tag. If you leave the title tag empty, the title bar of the browser will read "Untitled Document" or similar; for SEO purposes it is important to use the title tag.
The title tag must be short, within 6 to 7 words, and the keyword must be near the beginning. The title tag is one of the most important places for a keyword because what is written inside it shows up in search results as your page title.
Search engines (including Google) mostly display the content of the title tag in their results.
For example:
The title tag of the home page for http://see-seo-tips.blogspot.com can include something like this:
<title>Search Engine Optimization Tips – go through these useful interview tips to be successful in interviews.</title>
or
<title>Search Engine Optimization Tips - Everything You Need to Know regarding interview tips</title>
2.Keywords in URL :
Keywords in URLs help a lot in getting a better position in search engine results pages.
E.g. http://srihithas.com/webdesigning.html,
where "webdesigning" is the keyword phrase for which we are attempting to rank well. But if we don't have the keywords in other parts of the document, we can't rely on having them in the URL alone.
The domain name and the whole URL play an important role in the SEO process. The presumption is that if our web site is about cows, we will have "cows", "cow", or "calf" as part of our domain name. For instance, if our web site is mainly about cow milk, it is much better to name the site "cow-milk.net" than "animal-milk.org",
because in the first case we have two major keywords in the URL, while in the second we have no more than one potential minor keyword.
Don't be greedy when hunting for keyword-rich domain names. From a pure search engine optimization point of view it might look better to have 5 keywords in the URL.
But a URL made of 5 potential keywords would be long and difficult to memorize, so we need to balance keywords in the URL against site usability. It is advisable to use no more than 3 keywords in the URL.
Directory names and file names are also important. Often search engines will give preference to pages that have a keyword in the file name.
For instance http://ourdomain.com/interview-tips.html is not as good as http://tipsoninterview.com/interview-tips.html, but it is certainly better than http://ourdomain.com/interview-guidence.html. The advantage of keywords in file names over keywords in the domain is that file names are easier to change if you decide to move to another niche.
3.Keyword density in document text :
After we have chosen the keywords that describe our web site, the next step is to make the site keyword-rich and achieve a good keyword density for our target keywords. Keyword density is a common measure of how relevant a page is: the higher the keyword density, the more relevant to the search string a page is. The recommended keyword density is 3-7% for the major 2 or 3 keywords and 1-2% for minor keywords. We can check keyword density with the keyword density checker tools that are freely available.
If we resort to keyword stuffing there are severe penalties, including a ban from the search engine, because it is considered an unethical practice that tries to manipulate search results.
4.Keyword stuffing :
A keyword density of 10% or above, artificially inflated in a webpage, is regarded as keyword stuffing, and there is a risk of getting banned from search engines.
5.Keywords in anchor text :
If a keyword appears in the anchor text of a link from another web site, this is regarded as getting a vote not only for your site in general but for that keyword in particular. So it is good to have keyword-rich anchor text in inbound links.
6:Keywords in headings (H1, H2, etc tags) :
Keywords in header tags count a lot, but before placing them ensure that the web page has actual text about that particular keyword. From a readability point of view, headings separate paragraphs into related subtopics. From an SEO point of view it is good to have plenty of headings on a page, especially if they contain keywords, even though it may be pointless to have a heading after every paragraph.
Even though there are no technical length limits for the contents of h1, h2, h3, … tags, we need to be wise with the length of headings; overly long headings are bad for page readability. Another issue to consider is how the heading will be displayed. Heading 1 (h1) means a larger font size, so it is recommended to have fewer than 7-8 words in the heading, otherwise it might spread over 2 or 3 lines, which is not good.
7.Keywords in the beginning of a document :
Placing keywords at the beginning of the document also counts. However, the beginning of the document does not necessarily mean the first paragraph - for instance, if we use tables, the first paragraph of text might be in the second half of the table.
8.Keywords in ALT TAGS :
Search engine spiders don't read images, but they do read their textual descriptions in the ALT attribute, so it is good to include keywords in the ALT text of your images.
9.Keywords in metatags :
Placing keywords in metatags matters less than it once did (Yahoo! and MSN rely on them more than Google does), so if you are optimizing for Yahoo or MSN in particular, fill these tags properly. In any case, filling these tags properly will not do any harm, so it is better to have metatags in the page.
10.Keyword proximity :
Keyword proximity measures how close the keywords are placed in the text. It is best if they are placed immediately one after the other (e.g. “puppy food”), with no other words between them. For instance, if you have “puppy” in the first paragraph and “food” in the third paragraph, this also counts but not as much as having the phrase “puppy food” without any other words in between. Keyword proximity is applicable for keyword phrases that consist of 2 or more words.
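To make the idea concrete, here is a minimal, hedged sketch that measures the smallest number of words separating two keywords in a piece of text (0 means they sit right next to each other). The sample sentence is my own.

import re

def proximity(text, word_a, word_b):
    """Smallest number of words separating word_a and word_b (0 = adjacent), or None if missing."""
    words = re.findall(r"[a-z']+", text.lower())
    pos_a = [i for i, w in enumerate(words) if w == word_a]
    pos_b = [i for i, w in enumerate(words) if w == word_b]
    if not pos_a or not pos_b:
        return None
    return min(abs(a - b) for a in pos_a for b in pos_b) - 1

text = "Choosing the right puppy food keeps a growing puppy healthy and happy."
print(proximity(text, "puppy", "food"))   # 0 -> the words are adjacent, the best proximity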
11.Secondary keyword :
Optimizing for secondary keywords can be a gold mine, because while everybody else is optimizing for the most popular keywords, there will be less competition (and probably more hits) for pages that are optimized for the minor words. For instance, "web designing in Hyderabad" might get a thousand times fewer searches than "web designing", but if you are operating in Hyderabad, you will get fewer but considerably better-targeted visitors.
12.Keyword stemming :
In English we have words that stem from the same root.
Example :- ( dog, dogs, doggy)
(KISS, KISSES, KISSING)
(JOKE, JOKES, JOKEY, JOKER)
If you have “JOKE” on your page, you will get hits for “JOKES” and “JOKER” as well, but for other languages keywords stemming could be an issue because different words that stem from the same root are considered as not related and you might need to optimize for all of them.
13.Synonyms :
Optimizing for synonyms of the target keywords, in addition to the main keywords, is worthwhile for sites in English, because search engines are smart enough to use synonyms as well when ranking sites. For languages other than English, synonyms are generally not taken into account when calculating rankings and relevancy.
14.Keyword Mistypes :
Spelling errors are very frequent when people search for information using a particular keyword, so it pays to know whether your target keywords have popular misspellings or alternative spellings.
Examples for keyword mistypes
(Christmas and Xmas) (Diwali and Dipawali)
Targeting these misspelled keywords might get you some more traffic, but having spelling mistakes on your website does not make a good impression, so it is better not to use misspelled keywords in the visible content; we can, however, try them in the metatags.
15. Keyword Dilution :
When a webpage is optimized for an excessive number of keywords, especially unrelated ones, this hurts the performance of all the keywords, and even the major keywords will be lost or diluted in the text. So don't target too many keywords in one single web page.
16. Keyword phrases :
Keyword phrases consisting of several words can be optimized for in addition to single keywords, e.g. "Keyword Analysis". It is best when the keyword phrases you optimize for are popular ones, so you get a lot of exact matches of the search string, but sometimes it makes more sense to optimize for 2 or 3 separate keywords ("keyword" and "analysis") than for one phrase that might only occasionally get an exact match.
Saturday, September 27, 2008
Keywords Terminology
1.Keyword Density :
Keyword density refers to the ratio or percentage of keywords contained within the total number of indexable words within a web page. It is important for your main keywords to have the correct keyword density to rank well in search engines. The keyword density ratio varies from search engine to search engine; the recommended ratio is 2 to 8 percent. Keyword quality matters even more than keyword quantity: keywords in the page title, the headings and the first paragraphs count for more. The reason is that the URL (and especially the domain name), file names and directory names, the web page title and the headings of the separate sections are more important than ordinary text on the web page. We may have the same keyword density as a competitor's web site, but if we have keywords in the URL, this will boost our ranking incredibly, especially with Yahoo search.
2.Keyword Frequency:
Keyword frequency means the number of times a keyword or keyword phrase appears within a webpage. The more times a keyword or keyword phrase appears within a web page, the more relevance search engines are likely to give the page for a search with those keywords or that key phrase.
3.Keyword Prominence :
Keyword prominence refers to how prominent keywords are within a web page. The recommendation is to place important keywords at or near the start of the web page, the start of a sentence, and the TITLE or META tags.
4.Keyword Proximity:
Keyword proximity refers to the closeness between two or more keywords. It is better to have the keywords placed close together in a sentence.
Keyword proximity examples:
Example 1: How Keyword Density Affects Search Engine Rankings.
Example 2: How Keyword Density Affects Rankings In Search Engine.
In the above example, if someone searched for "search engine rankings", a web page containing the first sentence is more likely to rank higher than one containing the second.
The reason is because the keywords are placed closer together.
Levels of keywords
Keyword levels fall into three types:
1. Low level keywords :
If the keyword competition (the number of competing results in Google) is 4 digits or fewer, it is considered a low-level keyword. For a low-level keyword we can optimize a website within 3 months.
2. Medium level keywords :
If the keyword competition is up to 7 digits, it is considered a medium-level keyword. For medium-level keywords we can optimize our site within 6 to 7 months.
3. High level keywords :
If the keyword competition is 8 digits or more, it is considered a high-level keyword. We can expect to optimize for these high-level keywords in about a year.
Keyword Analysis
1.Keywords for Search engine optimization :
A keyword is simply what we are searching for.
Ex: India temples, job opportunities, web development, Telugu movies, IT solutions, software training.
A keyword should be well defined and meaningful.
Key phrase: a combination of 2 keywords.
Keywords are considered the most important element in search engine optimization, so it is necessary to make sure the right keywords are optimized for your website. Choosing keywords seems easy at first, but when you get into more detail it can be confusing to determine the right ones. With careful keyword research and thinking, however, the problem of selecting the right keywords for a particular website can be solved.
Keywords are what search strings are matched against. When you start optimization, the first thing you need to consider is the keywords that best describe the content of your web site and that are most likely to be used by users to find you.
Choosing the right keywords is the first and most crucial step to success in the SEO process. Before choosing keywords we need to ask:
What the online population is searching for?
What keywords have our competitors chosen?
What are the keywords that describe our website best?
If we fail to choose the right keywords we will waste our own or our client's money and time. After we have made a long and detailed list of all the important keywords that are searched for by tens of thousands of people a day, if there are many competitors for the particular keywords we have chosen, chances are that it will be difficult to overtake them and place our website among the top ten results.
If we are not placed on the first page, the second page or, in the worst case, the third page of the organic search results, we will have very few visitors. It is true that sometimes even web sites beyond the first 50 results get decent traffic from search engines, but you certainly can't count on that.
We should not get discouraged if all the lucrative keywords are already occupied; low-volume search keywords can be just as lucrative as the high-volume ones, and their main advantage is that you will have less competition for them.
Some SEO experts confirm that with low-volume search keywords it is possible, with less effort and a smaller budget, to achieve much better results than by targeting high-volume search keywords.
Before selecting a low-volume or high-volume search keyword we need to estimate how difficult it would be to rank well for that particular keyword.
2.Choosing the Right Keywords to Optimize for search engines :
It is almost impossible to achieve constant top ratings for a one-word search string, as the Web is densely populated with web sites; a more realistic goal is to achieve constant top ratings for two-word or three-word search strings.
We can include one-word strings in our keyword list, but if they are not backed up by longer expressions, it is a waste of time to dream of high ratings.
For example, if we have a website about temples, "temple" is a mandatory keyword, but we will not be successful unless we also optimize for longer phrases like "temples in India", "temple visiting times" and "temple gods".
We need to think broadly when choosing the keywords. When we start optimization, the first thing to consider is the keywords that best describe the content of the web site and that are most likely to be used by users to find the website or the service it provides. We should understand our users well and guess correctly which search strings they are likely to use to search for our website. Synonyms can also be taken into consideration.
For example, on a dog site, "canine" is a synonym and users are sure to use it, so it does not hurt to use it as a keyword in the content of the web pages.
But search engines have algorithms that include synonyms in the keyword match, especially in languages like English, so do not rush to optimize for every synonym.
3.Website Keyword Suggestions Tools :
These tools can help you see how search engines determine the theme of a web site and which keywords fit into that theme. We can try Google's Keyword Tool to get more suggestions about which keywords are hot to use.
Before choosing a particular keyword or key phrase to optimize for Google or any other search engine, we need to look at its relevancy to our web site and the expected monthly number of searches for it.
Sometimes narrow searches are more valuable, because the users who come to our site are those who are really interested in our service or product. Continuing with the temples example, we might discover that the "festivals in India" key phrase also brings us more visitors, because we have a special section on our web site with information and a list of festivals in Andhra Pradesh. This page is not of interest to visitors looking for temples, but attracting this niche can potentially be better than attracting everybody who is interested in temples in general. So when we look at the number of search hits per month, consider the unique hits that fit the theme of the website.
4. Effective Keyword Choice Strategy and Useful Tools:
Search engines determine a website's theme while “crawling” its pages, and this is where keyword power needs to be put to work. One correctly targeted keyword will produce a better overall search engine ranking than keywords placed with no particular plan. The strategic presence of that keyword across the site is a factor in whether the site ranks high on engines like Google, Yahoo and MSN. Many users type broad search criteria into the search engines when they are uncertain exactly what they are looking for, so we should try to establish which keywords are used most in the context of the site's content and theme. Keyword choice depends on the target audience's needs and on the words they will actually use to locate sites: relevant words and parts of phrases that relate to the site content. For example, on a site about mobile phones, a visitor who does not yet know which model or service provider he wants and is simply looking for the best mobile deals currently available might search for “mobile”, “mobile phone”, “phone tariffs” or “cellular”. For a movie information site, keywords could include “movies”, “film”, “horror movies”, “sci-fi”, “action” and “cinema”. This positioning and choice of keywords is crucial for SEO.
5. Looking in Overture to see which keywords related to our website had the most searches recently, and optimizing only for those, is a mistake: the fact that a particular keyword is often searched does not, by itself, make it a worthy target. If the competition for this highly desired keyword is tough, your efforts might be wasted.
Wednesday, September 24, 2008
Common Mistakes in SEO
Search Engine Optimization is the practice of making a website friendly both to search engines and to the people who use them to find businesses. Webmasters commonly make the following mistakes.
Over-optimization can also hurt your website's ranking.
1. Incorrectly Designed Websites:
- Lack of proper navigation
- Using frames just to save the designer's time
- Large images that make pages slow to download. If large images are necessary, consider using thumbnails that open the full image on a separate page. (This also creates more pages and more text for the spiders to crawl.)
- Using high-resolution graphics (try to use low-resolution graphics instead)
2. Absence of Targeted Keywords and Phrases:
Your content absolutely must contain your targeted keywords and phrases. If the content is written properly, you can work in more targeted keywords and appropriate phrases naturally.
The absence of targeted keywords and phrases can break your site: if related keywords do not appear in your body text, your site will not show up in the listings when users type those keywords.
Make sure the keywords you place in the meta keyword tag are logically related to your content.
People who reach your site from a search will leave as soon as they see that the home page is irrelevant or does not match the keyword or phrase they searched for. Use tools such as Wordtracker to find out what people are actually typing into search engines to find goods and services similar to yours, and concentrate on ranking well for those terms.
3. Replica of Content:
Using more than one page with different names but the same content is something search engines treat as a trick, and it hurts ranking. Never copy content from other websites either.
4. Improper Use of Meta Tags, or a Site without Meta Tags:
Meta tags hold a page's keywords and description. They help search engines classify a page quickly and can help improve its ranking, so meta tags should be included on every page of the website.
Improper or missing meta tags can mislead search engines and lead to incorrect listings, and a missing page title will harm the site's ranking.
5. Absence of a Sitemap:
Sitemaps help web crawlers index a website more quickly and efficiently. A sitemap lays out the structure of the entire website on one page, which is very useful for search engine optimization. In general, sitemaps let search engines go directly to a page instead of having to discover it through links.
Google Sitemaps is an easy way to tell Google about all the pages on your site, which pages matter most to you, and when those pages change, so Google can crawl more intelligently and return fresher results.
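As a rough illustration (not part of the original post), here is a minimal Python sketch that writes a basic sitemap.xml for a few hypothetical URLs; the example.com addresses and change frequencies are placeholders, not real pages:

# make_sitemap.py - minimal sketch: write a basic sitemap.xml.
# The URLs and change frequencies below are hypothetical placeholders.
from xml.sax.saxutils import escape

pages = [
    ("https://www.example.com/", "daily"),
    ("https://www.example.com/temples-in-india", "weekly"),
    ("https://www.example.com/festivals", "monthly"),
]

entries = "\n".join(
    "  <url>\n"
    f"    <loc>{escape(url)}</loc>\n"
    f"    <changefreq>{freq}</changefreq>\n"
    "  </url>"
    for url, freq in pages
)

sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    f"{entries}\n"
    "</urlset>\n"
)

with open("sitemap.xml", "w", encoding="utf-8") as f:
    f.write(sitemap)

Once uploaded to the site root, this is the kind of file you would point Google at.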
6. Page Cloaking:
In this technique, webmasters deceive search engines by serving them page content that differs from what the page declares and from what visitors actually see. Spidering robots, recognized by their IP addresses or host names, are redirected to a page specially polished to meet the search engines' requirements but unreadable to a human being. To detect cloakers, spiders often arrive from unfamiliar IP addresses and under fictitious names, users' feedback is collected to check how well the content matches its description, and pages are also reviewed by search engine staff; if a difference is found, the site is penalized.
7. Spamming:
If the keyword density becomes excessive it is regarded as spam, often called keyword stuffing. Some webmasters repeat a keyword or key phrase as many times as possible to make a page look more relevant for it, but overused keywords read unnaturally, both to search engines and to human visitors. Search engines penalize such pages by reducing their ranking, and hardly any user will want to return to a page like that after visiting it once. (A quick way to check your own density is sketched after this section.)
Another spamming method some webmasters use is hiding extra keywords by colouring them to match the background.
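To make the density point concrete, here is a small illustrative Python sketch (not from the original article; the sample text is a made-up placeholder) that counts how often each word appears in a page's visible copy and reports its density:

# keyword_density.py - rough sketch of a keyword density check.
# The sample text is a placeholder; in practice feed in the visible
# copy of one of your own pages.
import re
from collections import Counter

text = """Temples in India attract millions of visitors every year.
Many temples publish visiting times, and some temples also list festivals."""

words = re.findall(r"[a-z']+", text.lower())
total = len(words)
counts = Counter(words)

# Report the most frequent words and their density as a percentage.
for word, count in counts.most_common(10):
    density = 100.0 * count / total
    print(f"{word:12s} {count:3d}  {density:5.1f}%")

If a single keyword is taking up a suspiciously large share of the copy, it is probably time to rewrite rather than repeat.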
8. Link Farms:
In order to artificially inflate link popularity, many site owners join so-called link farms: networks in which everyone links to everyone else, concentrating purely on the quantity of links and disregarding their quality.
Having too many links is worthless; a page that contains links, just links, and nothing but links will not be authoritative. Modern search engines analyse link quality in terms of website relevancy, and a link counts for more if it leads to a site devoted to similar issues. Choose link exchange partners whose business is similar to yours: your partners' sites, or web portals devoted to your line of business, are ideal for link exchange.
9. Don't Use a Robot to Write Content for Your Website:
Never let a machine write the content for your website. There are programs that duplicate existing content and merely make small changes here and there; if Google catches your website employing this technique you are in trouble, so always write your own content.
10. Hiding Keywords:
Font matching means hiding keywords by making the font colour the same as the background colour. Search engines are sophisticated at catching these frauds and will remove any website using such bad SEO techniques. Placing teeny tiny text at the bottom of a page does not help search engine optimization either.
11. Distributing Trojans, Viruses, and Badware:
Don't distribute trojans, viruses or badware. If Google finds your website distributing them, it will be removed from the Google index to protect the public, so always check software before you agree to distribute it, and make sure your servers are secured so that hackers cannot hijack the website to spread malicious software.
12. Doorway Pages:
Doorway pages, also called gateway pages, are optimized by unethical SEO practitioners: a page is ostensibly designed for one key term but is really a gateway meant to lead you to different content.
For instance, “blackberry”, “blueberry” and “strawberry” gateway pages might all be designed to push you toward “fruit punch”.
We should also be careful with affiliate programs, because Google may consider some affiliate pages to be doorway pages.
Saturday, September 20, 2008
Seo Methodology
Search Engine Optimization is the practice of designing a website specifically so that it is friendly to the tools search engines use to analyze websites (called spiders).
A typical SEO methodology has three parts:
- Offline/off page Optimization
- Online/on page Optimization
- Position Monitoring
Offpage Optimization :
- Hosting of Google sitemap
- Website submission to all leading search engines with a global database
- Submission to country-specific search engines with country-related databases
- Submission to general directories
- Submission to product specific directories
- Submission to country specific directories
- Trade lead posting
Onpage Optimization :
- Pre Optimization Report
- Key word research
- Competitors website analysis
- Rewriting robot friendly text
- H1 H2 Tags Optimization
- Title Tag Optimization
- Meta Tag Optimization
- Key words Optimization
- Alt Tag Optimization
- Website structure Optimization
- Body text and content Optimization
- Sitemap for link Optimization
Position Monitoring :
- Monitoring website ranking with different keywords
- Renewal of expired trade leads and posting of new trade leads
- Constant research of updated technology for better positioning
- Research on current popular directories and site submission
- Changing methodology with change in search engine algorithm
Architecture of search engines:
Spider - a browser-like program that downloads web pages.
Crawler – a program that automatically follows all of the links on each web page.
Indexer - a program that analyzes web pages downloaded by the spider and the crawler.
Database – storage for downloaded and processed pages.
Results engine – extracts search results from the database.
Web server – a server that is responsible for interaction between the user and other search engine components.
How do Search Engines Work?
All search engines consist of three main parts:
the spider, the index, and the search algorithm.
The spider (or worm) continuously ‘crawls’ web space, following links that lead either to other websites or to pages within the same site. The spider ‘reads’ the content of every page and passes the data on to the index.
The index is the next stage after crawling. It is the storage area for spidered web pages and is of huge magnitude: Google's index, for example, is said to consist of more than three billion pages.
The search algorithm is the third and most sophisticated part of a search engine system. It is a very complicated mechanism that sorts an immense database within a few seconds and produces the results list; the more relevant the search engine judges a page to be, the nearer to the top of the list it appears. Site owners and webmasters should therefore pay close attention to their site's relevancy to its keywords.
The algorithm is unique to each and every search engine, and is a trade secret kept hidden from the public.
Most modern web search services combine the two approaches (crawler-built indexes and human-edited directory listings) to produce their results.
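For intuition only, here is a toy Python sketch of the spider / index / search-algorithm pipeline described above. It uses a small made-up in-memory “web” instead of real HTTP fetching, so every page, link and word in it is hypothetical:

# toy_search_engine.py - illustrative only: a made-up in-memory "web"
# standing in for real pages, to show the spider -> index -> search flow.
from collections import defaultdict

# Hypothetical pages: url -> (list of outgoing links, page text)
WEB = {
    "/home":      (["/temples", "/festivals"], "temples and festivals in india"),
    "/temples":   (["/home"],                  "temple visiting times and temple gods"),
    "/festivals": ([],                         "festivals in andhra pradesh"),
}

def crawl(start):
    """Spider: follow links from the start page and collect page text."""
    seen, queue, pages = set(), [start], {}
    while queue:
        url = queue.pop()
        if url in seen:
            continue
        seen.add(url)
        links, text = WEB[url]
        pages[url] = text
        queue.extend(links)
    return pages

def build_index(pages):
    """Indexer: map each word to the URLs it appears on."""
    index = defaultdict(set)
    for url, text in pages.items():
        for word in text.split():
            index[word].add(url)
    return index

def search(index, query):
    """Results engine: rank URLs by how many query words they contain."""
    scores = defaultdict(int)
    for word in query.lower().split():
        for url in index.get(word, ()):
            scores[url] += 1
    return sorted(scores, key=scores.get, reverse=True)

pages = crawl("/home")
index = build_index(pages)
print(search(index, "temple festivals"))

Real engines, of course, layer far more sophisticated ranking signals (link analysis, relevancy weighting and so on) on top of this bare skeleton.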
Wednesday, August 6, 2008
Hit
Hit is a somewhat misleading measure of traffic to a web site. One hit is recorded for each file request in a web server’s access log. If a user visits a page with four images, one hit will be recorded for each graphic image file plus another for the page’s HTML file. A better measure of traffic volume is the number of pages/HTML files accessed.
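As a rough illustration of the difference, here is a short Python sketch over a few invented access-log lines (common log format assumed) that counts raw hits versus HTML page views:

# hits_vs_pages.py - sketch: raw hits vs. HTML page views in an access log.
# The log lines below are invented examples in the common log format.
sample_log = [
    '1.2.3.4 - - [01/Aug/2008:10:00:00 +0000] "GET /index.html HTTP/1.1" 200 5120',
    '1.2.3.4 - - [01/Aug/2008:10:00:01 +0000] "GET /logo.gif HTTP/1.1" 200 900',
    '1.2.3.4 - - [01/Aug/2008:10:00:01 +0000] "GET /photo1.jpg HTTP/1.1" 200 4200',
    '5.6.7.8 - - [01/Aug/2008:10:05:00 +0000] "GET /about.html HTTP/1.1" 200 3100',
]

hits = 0
page_views = 0
for line in sample_log:
    hits += 1                              # every requested file is one hit
    path = line.split('"')[1].split()[1]   # the requested URL
    if path.endswith((".html", ".htm", "/")):
        page_views += 1                    # only HTML pages count as page views

print(f"hits: {hits}, page views: {page_views}")

Here the single visit to /index.html generates three hits (the page plus two images) but only one page view, which is why hits overstate traffic.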
HTML
The acronym HTML stands for HyperText Markup Language, the authoring language used to create pages on the World Wide Web. HTML is a set of codes or HTML tags that provide a web browser with directions on how to structure a web page’s information and features.
Hyperlink
Also known as link or HTML link, a hyperlink is an image or portion of text that when clicked on by a user opens another web page or jumps the browser to a different portion of the current page. Inbound Links with keyword-relevant Link Text are an important part of Search Engine Optimization Strategy.
Index
An index is a Search Engine’s database. It contains all of the information that a Crawler has identified, particularly copies of World Wide Web pages. When a user performs a Query, the search engine uses its indexed pages and Algorithm set to provide a ranked list of the most relevant pages. In the case of a Directory, the index consists of titles and summaries of registered sites that have been categorized by the directory’s editors.
Inbound Links
Also known as back link, backward link, or backlinks, inbound links are all of the links on other websites that direct the users who click on them to your site. Inbound links can significantly improve your site’s search rankings, particularly if they contain Anchor Text keywords relevant to your site and are located on sites with high Page Rank.
Monday, August 4, 2008
Google AdSense
Google AdSense is an ad-serving program operated by Google that provides relevant text, image, and video-based advertisements to enrolled site owners. Advertisers register via Google AdWords and pay for ads on a Pay-Per-Click, Cost-Per-Thousand or Cost-Per-Action basis. This revenue is shared with Google AdSense host sites, typically on a PPC basis (which sometimes leads to Click Fraud). Google uses its search Algorithms and Contextual Link Inventory to display the most appropriate ads based on site content, Query relevancy, ad “quality scores,” and other factors.
Google AdWords
Google AdWords is the Keyword Submission program that determines the advertising rates and keywords used in the Google AdSense program. Advertisers bid on the keywords that are relevant to their businesses. Ranked ads then appear as sponsored links on Google Search Engine Results Pages (SERPS) and Google AdSense host sites.
Graphical Search Inventory (GSI)
Graphical Search Inventory is the visual equivalent of Contextual Link Inventory. GSI is non-text-based advertising such as Banner Ads, pop-up ads, browser toolbars, animation, sound, video and other media that is synchronized to relevant Keyword queries.
Gray Hat SEO
Gray hat SEO refers to Search Engine Optimization strategies that fall in between Black Hat SEO and White Hat SEO. Gray hat SEO techniques can be legitimate in some cases and illegitimate in others. Such techniques include Doorway Pages, Gateway Pages, Cloaking and duplicate content.
Hidden Text
Hidden text is a generally obsolete form of Black Hat SEO in which pages are filled with a large amount of text that is the same color as the background, rendering keywords invisible to the human eye but detectable to a search engine Crawler. Multiple Title Tags or HTML comments are alternative hidden text techniques. Hidden text is easily detectable by search engines and will result in Blacklisting or reduced Rank.
Thursday, July 24, 2008
What type of content are you adding to your Web site?
Your Web content needs to be high quality: well-written and engaging. We talk a lot about the richness and value of providing useful content that holds high interest for your audience of visitors.
Tip: Remember the power of using nostalgia where appropriate.
By using nostalgia, you can often connect with your readers to
activate their positive memories and past experiences. After all,
regardless of what your online business is about, the Web is really
all about connecting with others.
Wednesday, July 23, 2008
Search Engine Optimization And Sitemaps Effects
Search Engine Optimization is the process of improving the amount of traffic a website receives naturally from the search engines. A website gets search traffic when it ranks high for its targeted keywords. A ranking is never permanent, because search engines frequently change their algorithms in order to provide the best results, so you need to work on your site consistently to maintain and improve its position.
It can take a good amount of time to see the desired results, since there are already a huge number of websites on the Internet and new ones are launched at regular intervals. So work consistently, without deviating from your target, because you are competing against a very large number of websites.
On-page optimization and off-page optimization are the two forms of Search Engine Optimization, and both need to be considered when optimizing a website. In on-page optimization you have control over the page and modify its internal aspects to optimize it; in off-page optimization you do not have control over the pages that link to your website.
There are a number of factors that are to be considered while optimizing a website so as to improve its ranking. Title, keyword density, unique content, interlinking, anchor text, backlinks, sitemap are some of the key factors that are to be considered while optimizing a website. Each factor has its own importance and it needs to be properly used in order to rank high in the search results.
What's a Sitemap and What Are Its Benefits: Using a sitemap is one of the tricks that is usually underestimated when optimizing a website. A sitemap is, quite literally, a map of the site: a page that displays the different sections of the website and its articles, and shows how those sections are linked together.
A sitemap is very important because it is used to communicate with search engines: an XML sitemap is meant for search engines, whereas an HTML sitemap is meant for human beings. Sitemaps inform search engines about changes to your site, and those changes get indexed faster than they would on a site without one. In addition to faster indexing, a sitemap also helps you find and fix broken internal links. A website without a sitemap can still achieve high rankings, since a sitemap is not a strict requirement, but a regularly updated sitemap helps improve rankings at a better rate than going without one.
If you are wondering how a sitemap is created and where it goes: you can use sitemap generator tools to build one for your website, and once it is ready you upload it to the server. Before uploading it, make sure the sitemap is absolutely correct, as an improper sitemap can cause de-indexing of the website. You can use online tools to check whether the sitemap has been created properly.
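For a quick local check before uploading, a sketch like the one below (plain Python standard library; the file name sitemap.xml is assumed) at least confirms the file is well-formed XML and counts its url entries; it is not a replacement for the online validators mentioned above:

# check_sitemap.py - sketch: confirm sitemap.xml is well-formed XML
# and count its <url> entries before uploading it.
import xml.etree.ElementTree as ET

NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

try:
    tree = ET.parse("sitemap.xml")          # assumed file name
    urls = tree.getroot().findall(f"{NS}url")
    print(f"sitemap.xml parsed OK, {len(urls)} URL entries found")
except ET.ParseError as err:
    print(f"sitemap.xml is not well-formed XML: {err}")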
Also, adding a link to the sitemap from the website's pages improves the rate at which the sitemap is crawled by search engine spiders. You should also add the sitemap to your Google Webmaster account, as this reduces your reliance on external links for improving the indexing rate. In short, a sitemap is a very important aspect that should be given proper attention while optimizing a website.
Saturday, July 19, 2008
Dynamic Content
Dynamic content is web content such as Search Engine Results Pages (SERPS) that are generated or changed based on database information or user activity. Web pages that remain the same for all visitors in every context contain “static content.” Many e-commerce sites create dynamic content based on purchase history and other factors. Search engines have a difficult time indexing dynamic content if the page includes a session ID number, and will typically ignore URLs that contain the variable “?”. Search engines will punish sites that use deceptive or invasive means to create dynamic content.
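As a trivial illustration (the URLs are invented), a check like the following flags URLs that look dynamic because they carry a query string or a session ID, the kind of URL the definition above says crawlers may skip:

# dynamic_url_check.py - sketch: flag URLs that look dynamic.
# The URLs below are invented placeholders.
urls = [
    "https://www.example.com/products.php?id=42&sessionid=ABC123",
    "https://www.example.com/about.html",
    "https://www.example.com/search?q=temples",
]

for url in urls:
    if "?" in url or "sessionid" in url.lower():
        print("dynamic-looking (may be skipped by crawlers):", url)
    else:
        print("static-looking:", url)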
Flash Optimization
Flash is a vector graphics-based animation program developed by Macromedia. Most corporate sites feature Flash movies/animation, yet because search engine Crawlers were designed to index HTML text, sites that favor Flash over text are difficult or even impossible for crawlers to read. Flash Optimization is the process of reworking the Flash movie and surrounding HTML code to be more “crawlable” for Search Engines.
Gateway Page
Also known as a doorway page or jump page, a gateway page is a URL with minimal content designed to rank highly for a specific keyword and redirect visitors to a homepage or designated Landing Page. Some search engines frown on gateway pages as a softer form of Cloaking or Spam. However, gateway pages may be legitimate landing pages designed to measure the success of a promotional campaign, and they are commonly allowed in Paid Listings.
Geographical Targeting
Geographical targeting is the focusing of Search Engine Marketing on states, counties, cities and neighborhoods that are important to a company’s business. One basic aspect of geographical targeting is adding the names of relevant cities or streets to a site’s keywords, i.e. Hyde Street Chicago
Geographic Segmentation
Geographic segmentation is the use of Analytics to categorize a site’s web traffic by the physical locations from which it originated.
Wednesday, July 9, 2008
Crawler
Also known as Spider or Robot, a crawler is a search engine program that “crawls” the web, collecting data, following links, making copies of new and updated sites, and storing URLs in the search engine’s Index. This allows search engines to provide faster and more up-to-date listings.
Delisted
Also known as banned or blacklisted, a delisted site is a URL that has been removed from a search engine’s Index, typically for engaging in Black Hat SEO. Delisted sites are ignored by search engines.
Description Tag
Also known as a meta description tag, a description tag is a short HTML paragraph that provides search engines with a description of a page’s content for search engine Index purposes. The description tag is not displayed on the website itself, and may or may not be displayed in the search engine’s listing for that site. Search engines are now giving less importance to description tags in lieu of actual page content.
Directory
A directory is an Index of websites compiled by people rather than a Crawler. Directories can be general or divided into specific categories and subcategories. A directory’s servers provide relevant lists of registered sites in response to user queries. Directory Registration is thus an important method for building inbound links and improving SEO performance. However, the decision to include a site and its directory rank or categorization is determined by directory editors rather than an Algorithm. Some directories accept free submissions while others require payment for listing. The most popular directories include Yahoo!, The Open Directory Project, and LookSmart.
Doorway Page
Also known as a gateway page or jump page, a doorway page is a URL with minimal content designed to rank highly for a specific keyword and redirect visitors to a homepage or designated Landing Page. Some search engines frown on doorway pages as a softer form of Cloaking or Spam. However, doorway pages may be legitimate landing pages designed to measure the success of a promotional campaign, and they are commonly allowed in Paid Listings.
Friday, July 4, 2008
Cost-Per-Acquisition (CPA)
Cost-per-acquisition (CPA) is a return on investment model in which return is measured by dividing total click/marketing costs by the number of Conversions achieved. Total acquisition costs ÷ number of conversions = CPA. CPA is also used as a synonym for Cost-Per-Action.
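A quick worked example of that formula, with invented spend and conversion figures:

# cpa_example.py - cost-per-acquisition with invented sample figures.
total_click_costs = 500.00   # total marketing/click spend in dollars (hypothetical)
conversions = 25             # number of conversions achieved (hypothetical)

cpa = total_click_costs / conversions
print(f"CPA = ${cpa:.2f} per conversion")   # -> CPA = $20.00 per conversion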
Cost-Per-Action (CPA)
In a cost-per-action advertising revenue system, advertisers are charged a Conversion-based fee, i.e. each time a user buys a product, opens an account, or requests a free trial. CPA is also known as cost-per-acquisition, though the term cost-per-acquisition can be confusing because it also refers to a return on investment model.
Cost-Per-Click (CPC)
Also known as pay-per-click or pay-for-performance, cost-per-click is an advertising revenue system used by search engines and ad networks in which advertising companies pay an agreed amount for each click of their ads. This Click-Through Rate-based payment structure is considered by some advertisers to be more cost-effective than the Cost-Per-Thousand payment structure, but it can at times lead to Click Fraud.
Cost-Per-Thousand (CPM)
Also known as cost-per-impression or CPM for cost-per-mille (mille is the Latin word for thousand), cost-per-thousand is an advertising revenue system used by search engines and ad networks in which advertising companies pay an agreed amount for every 1,000 users who see their ads, regardless of whether a click-through or conversion is achieved. CPM is typically used for Banner Ad sales, while Cost-Per-Click is typically used for text link advertising.
Thursday, July 3, 2008
Click-Through Rate (CTR)
Click-through rate is the percentage of users who click on an advertising link or search engine site listing out of the total number of people who see it, i.e. four click-throughs out of ten views is a 40% CTR.
Contextual Link Inventory (CLI)
Search engines/advertising networks use their contextual link inventory to match keyword-relevant text-link advertising with site content. CLI is generated based on listings of website pages with content that the ad-server deems a relevant keyword match. Ad networks further refine CLI relevancy by monitoring the Click-Through Rate of the displayed ads.
Cloaking
Cloaking is the presentation of alternative pages to a search engine Spider so that it will record different content for a URL than what a human browser would see. Cloaking is typically done to achieve a higher search engine position or to trick users into visiting a site. In such cases cloaking is considered to be Black Hat SEO and the offending URL could be Blacklisted. However, cloaking is sometimes used to deliver personalized content based on a browser’s IP address and/or user-agent HTTP header. Such cloaking should only be practiced with a search engine’s knowledge or it could be construed as black hat cloaking.
Conversion
Conversion is the term used for any significant action a user takes while visiting a site, i.e. making a purchase, requesting information, or registering for an account.
Conversion Analytics
Conversion analytics is a branch of Analytics concerned specifically with conversion-related information from organic and paid search engine traffic, such as the keywords converts used in their queries, the type of conversion that resulted, landing page paths, search engine used, etc.
Conversion Rate
Conversion rate is the next step up from Click-Through Rate. It’s the percentage of all site visitors who “convert” (make a purchase, register, request information, etc.). If three users buy products and one user requests a catalogue out of ten daily visitors, a site’s conversion rate is 40%.
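Putting the two percentage calculations side by side, using the example figures from the definitions above:

# rates_example.py - click-through rate and conversion rate side by side,
# using the example figures from the definitions above.
def rate(events, audience):
    """Return events as a percentage of the audience."""
    return 100.0 * events / audience

# CTR: four click-throughs out of ten views.
print(f"CTR = {rate(4, 10):.0f}%")                   # -> CTR = 40%

# Conversion rate: 3 purchases + 1 catalogue request out of 10 daily visitors.
print(f"conversion rate = {rate(3 + 1, 10):.0f}%")   # -> conversion rate = 40%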