Thursday, July 24, 2008

What type of content are you adding to your Web site?

Your Web content needs to be high-quality content that is
well written and engaging. We talk a lot about the richness and
value of providing useful content that holds high interest for
your audience of visitors.

Tip: Remember the power of using nostalgia where appropriate.

By using nostalgia, you can often connect with your readers by
activating their positive memories and past experiences. After all,
regardless of what your online business is about, the Web is really
all about connecting with others.

Wednesday, July 23, 2008

Search Engine Optimization and the Effects of Sitemaps

Search Engine Optimization is the process of improving the amount of traffic that a website receives naturally from the search engines. A website gets traffic from search engines when it ranks high for its targeted keywords. A ranking in the search results is not permanent, because search engines frequently change their algorithms in order to provide the best search results; you therefore need to work on your site consistently to maintain and improve its rankings.

However, it can take a good amount of time to see the desired results, as there are already a great many websites on the Internet and new ones are launched constantly. So you need to work consistently, without deviating from your target, because you're competing against a large number of websites.

On-page optimization and off-page optimization are the two forms of Search Engine Optimization, and both need to be considered when optimizing a website. In on-page optimization, you have control over the page and modify its internal aspects to optimize it. In off-page optimization, you don't have control over the pages that link to your website.

There are a number of factors to consider when optimizing a website to improve its ranking. The title, keyword density, unique content, interlinking, anchor text, backlinks, and the sitemap are some of the key factors. Each factor has its own importance and needs to be used properly in order to rank high in the search results.

What's a Sitemap and what are its Benefits: Using a sitemap is one of the techniques that is usually underestimated when optimizing a website. If you're wondering what a sitemap is, it is exactly that: a map of the site, a page which displays the different sections of the website, its articles, and how the different sections are linked together.

A sitemap is very important because it is used to communicate with search engines: an XML sitemap is meant for search engines, whereas an HTML sitemap is meant for human visitors. Sitemaps inform search engines about changes to your site, and this helps those changes get indexed faster than they would on a site without a sitemap. In addition to faster indexing, a sitemap also helps you find and fix broken internal links. A website without a sitemap can still achieve high rankings, as a sitemap is not a strict requirement; a regularly updated sitemap simply helps rankings improve at a better rate than going without one.

Now, if you're wondering how a sitemap is created and where it is placed: you can use sitemap generator tools to create one for your website. Once the sitemap is ready, you need to upload it to your server. Before uploading it, make sure the sitemap is well formed, as a malformed sitemap can cause indexing problems for the website. You can use online tools to check whether the sitemap has been properly created.
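
If you'd rather script it than use an online generator, here is a minimal sketch in Python of how a sitemap file can be produced, following the public sitemaps.org XML format. The page URLs and the output file name are hypothetical placeholders.

import xml.etree.ElementTree as ET

# Hypothetical list of pages to include in the sitemap.
pages = [
    "http://www.example.com/",
    "http://www.example.com/articles/",
    "http://www.example.com/contact.html",
]

# Build the <urlset> root element per the sitemaps.org protocol.
urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for page in pages:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = page

# Write sitemap.xml, ready to upload to the server's root directory.
ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)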

Also, adding a link to the sitemap from the website's pages improves the rate at which search engine spiders crawl it. You should also add the sitemap to your Google Webmaster account, as this decreases your reliance on external links for improving the indexing rate. In short, a sitemap is a very important aspect that should be given due consideration when optimizing a website.

Saturday, July 19, 2008

Dynamic Content
Dynamic content is web content, such as search engine results pages (SERPs), that is generated or changed based on database information or user activity. Web pages that remain the same for all visitors in every context contain “static content.” Many e-commerce sites create dynamic content based on purchase history and other factors. Search engines have a difficult time indexing dynamic content if the page includes a session ID number, and will typically ignore URLs that contain the “?” character. Search engines will punish sites that use deceptive or invasive means to create dynamic content.
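
To illustrate the session ID problem, the sketch below uses Python's standard urllib.parse module to strip a session parameter from a URL so that crawlers and visitors all see the same address; the parameter name sessionid is a made-up example.

from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

def strip_session_id(url, param="sessionid"):
    """Remove a session-tracking parameter from a URL's query string."""
    parts = urlparse(url)
    # Keep every query parameter except the session identifier.
    query = [(k, v) for k, v in parse_qsl(parts.query) if k != param]
    return urlunparse(parts._replace(query=urlencode(query)))

print(strip_session_id("http://www.example.com/shop?cat=books&sessionid=abc123"))
# -> http://www.example.com/shop?cat=books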

Flash Optimization
Flash is a vector-graphics-based animation program developed by Macromedia. Many corporate sites feature Flash movies and animation, yet because search engine Crawlers were designed to index HTML text, sites that favor Flash over text are difficult or even impossible for crawlers to read. Flash optimization is the process of reworking the Flash movie and the surrounding HTML code to be more “crawlable” for Search Engines.

Gateway Page
Also known as a doorway page or jump page, a gateway page is a URL with minimal content designed to rank highly for a specific keyword and redirect visitors to a homepage or designated Landing Page. Some search engines frown on gateway pages as a softer form of Cloaking or Spam. However, gateway pages may be legitimate landing pages designed to measure the success of a promotional campaign, and they are commonly allowed in Paid Listings.

Geographical Targeting
Geographical targeting is the focusing of Search Engine Marketing on states, counties, cities and neighborhoods that are important to a company’s business. One basic aspect of geographical targeting is adding the names of relevant cities or streets to a site’s keywords, i.e. “Hyde Street Chicago.”

Geographic Segmentation
Geographic segmentation is the use of Analytics to categorize a site’s web traffic by the physical locations from which it originated.

Wednesday, July 9, 2008

Crawler
Also known as Spider or Robot, a crawler is a search engine program that “crawls” the web, collecting data, following links, making copies of new and updated sites, and storing URLs in the search engine’s Index. This allows search engines to provide faster and more up-to-date listings.

Delisted
Also known as banned or blacklisted, a delisted site is a URL that has been removed from a search engine’s Index, typically for engaging in Black Hat SEO. Delisted sites are ignored by search engines.

Description Tag
Also known as a meta description tag, a description tag is a short HTML paragraph that provides search engines with a description of a page’s content for search engine Index purposes. The description tag is not displayed on the website itself, and may or may not be displayed in the search engine’s listing for that site. Search engines are now giving less importance to description tags in favor of actual page content.
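
As a sketch of where the tag lives in a page's source, the following Python snippet uses the standard html.parser module to pull the description out of a page's HTML; the sample markup is invented for illustration.

from html.parser import HTMLParser

class DescriptionExtractor(HTMLParser):
    """Collects the content of a <meta name="description"> tag."""
    def __init__(self):
        super().__init__()
        self.description = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "description":
            self.description = attrs.get("content")

page = '<head><meta name="description" content="Plain-language SEO tips."></head>'
parser = DescriptionExtractor()
parser.feed(page)
print(parser.description)  # -> Plain-language SEO tips.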

Directory
A directory is an Index of websites compiled by people rather than a Crawler. Directories can be general or divided into specific categories and subcategories. A directory’s servers provide relevant lists of registered sites in response to user queries. Directory Registration is thus an important method for building inbound links and improving SEO performance. However, the decision to include a site and its directory rank or categorization is determined by directory editors rather than an Algorithm. Some directories accept free submissions while others require payment for listing. The most popular directories include Yahoo!, The Open Directory Project, and LookSmart.

Doorway Page
Also known as a gateway page or jump page, a doorway page is a URL with minimal content designed to rank highly for a specific keyword and redirect visitors to a homepage or designated Landing Page. Some search engines frown on doorway pages as a softer form of Cloaking or Spam. However, doorway pages may be legitimate landing pages designed to measure the success of a promotional campaign, and they are commonly allowed in Paid Listings.

Friday, July 4, 2008

Cost-Per-Acquisition (CPA)
Cost-per-acquisition (CPA) is a return on investment model in which return is measured by dividing total click/marketing costs by the number of Conversions achieved. Total acquisition costs ÷ number of conversions = CPA. CPA is also used as a synonym for Cost-Per-Action.

Cost-Per-Action (CPA)
In a cost-per-action advertising revenue system, advertisers are charged a Conversion-based fee, i.e. each time a user buys a product, opens an account, or requests a free trial. CPA is also known as cost-per-acquisition, though the term cost-per-acquisition can be confusing because it also refers to a return on investment model.

Cost-Per-Click (CPC)
Also known as pay-per-click or pay-for-performance, cost-per-click is an advertising revenue system used by search engines and ad networks in which advertising companies pay an agreed amount for each click of their ads. This Click-Through Rate-based payment structure is considered by some advertisers to be more cost-effective than the Cost-Per-Thousand payment structure, but it can at times lead to Click Fraud.

Cost-Per-Thousand (CPM)
Also known as cost-per-impression or CPM for cost-per-mille (mille is the Latin word for thousand), cost-per-thousand is an advertising revenue system used by search engines and ad networks in which advertising companies pay an agreed amount for every 1,000 users who see their ads, regardless of whether a click-through or conversion is achieved. CPM is typically used for Banner Ad sales, while Cost-Per-Click is typically used for text link advertising.
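
All three cost models above come down to simple arithmetic. Here is a minimal sketch in Python; the dollar figures and counts are invented purely for illustration.

def cost_per_acquisition(total_cost, conversions):
    """CPA: total click/marketing costs divided by the number of conversions."""
    return total_cost / conversions

def cpc_spend(clicks, price_per_click):
    """CPC: the advertiser pays an agreed amount for each click."""
    return clicks * price_per_click

def cpm_spend(impressions, price_per_thousand):
    """CPM: the advertiser pays per 1,000 users who see the ad."""
    return impressions / 1000 * price_per_thousand

# Hypothetical campaign: $500 spent for 25 conversions -> $20 CPA.
print(cost_per_acquisition(500, 25))  # 20.0
print(cpc_spend(1000, 0.50))          # 500.0 for a thousand clicks at $0.50
print(cpm_spend(50000, 2.00))         # 100.0 for 50,000 impressions at a $2 CPM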

Thursday, July 3, 2008

Click-Through Rate (CTR)

Click-through rate is the percentage of users who click on an advertising link or search engine site listing out of the total number of people who see it, i.e. four click-throughs out of ten views is a 40% CTR.
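
The calculation is simple division; a one-function sketch in Python using the example above:

def click_through_rate(clicks, views):
    """CTR: click-throughs as a percentage of total views."""
    return 100 * clicks / views

print(click_through_rate(4, 10))  # -> 40.0, i.e. a 40% CTR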

Contextual Link Inventory (CLI)
Search engines/advertising networks use their contextual link inventory to match keyword-relevant text-link advertising with site content. CLI is generated based on listings of website pages with content that the ad-server deems a relevant keyword match. Ad networks further refine CLI relevancy by monitoring the Click-Through Rate of the displayed ads.

Cloaking
Cloaking is the presentation of alternative pages to a search engine Spider so that it will record different content for a URL than what a human browser would see. Cloaking is typically done to achieve a higher search engine position or to trick users into visiting a site. In such cases cloaking is considered to be Black Hat SEO and the offending URL could be Blacklisted. However, cloaking is sometimes used to deliver personalized content based on a browser’s IP address and/or user-agent HTTP header. Such cloaking should only be practiced with a search engine’s knowledge or it could be construed as black hat cloaking.

Conversion
Conversion is the term used for any significant action a user takes while visiting a site, i.e. making a purchase, requesting information, or registering for an account.

Conversion Analytics
Conversion analytics is a branch of Analytics concerned specifically with conversion-related information from organic and paid search engine traffic, such as the keywords converting visitors used in their queries, the type of conversion that resulted, landing page paths, the search engine used, etc.

Conversion Rate
Conversion rate is the next step up from Click-Through Rate. It’s the percentage of all site visitors who “convert” (make a purchase, register, request information, etc.). If three users buy products and one user requests a catalogue out of ten daily visitors, a site’s conversion rate is 40%.
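
Conversion rate is the same percentage calculation taken over all visitors rather than views; a minimal sketch checking the figures above:

def conversion_rate(conversions, visitors):
    """Conversion rate: converting visitors as a percentage of all visitors."""
    return 100 * conversions / visitors

# Three purchases plus one catalogue request out of ten daily visitors.
print(conversion_rate(3 + 1, 10))  # -> 40.0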

Saturday, June 28, 2008

Algorithm
An algorithm is a set of finite, ordered steps for solving a mathematical problem. Each Search Engine uses a proprietary algorithm set to calculate the relevance of its indexed web pages to your particular Query. The result of this process is a list of sites ranked in the order that the search engine deemed most relevant. Search engine algorithms are closely guarded in order to prevent exploitation of algorithmic results. Search algorithms are also changed frequently to incorporate new data and improve relevancy.

Algorithmic Results
Algorithmic results are the ranked listings search engines provide in response to a Query. They are often referred to as Organic Listings in contrast to Paid Listings because their rank is based on relevancy rather than advertising revenue paid to the search engine. However, paid listings do appear alongside algorithmic results in many search engines, provided they are relevant. Improving a website’s unpaid algorithmic results is known as Natural Search Engine Optimization.

Alt Tag/Alt Text
An alt tag (strictly speaking, the alt attribute of an image tag) supplies the HTML text that appears while an image is loading or, in some browsers, when the cursor is positioned over the image. Alt text is useful in Search Engine Optimization because it can include keywords that a search engine looks for in response to a query.

Analytics
Analytics refers to all the technology, programming, and data used in Search Engine Marketing to analyze a website’s performance or the success of an Internet marketing campaign.

Anchor Text
Also known as link text, anchor text is the visible, clickable text between the opening and closing HTML anchor tags. Clicking on anchor text activates a Hyperlink to another web page. Anchor text is very important in Search Engine Optimization because search engine algorithms consider the Hyperlink keywords relevant to the Landing Page.

Backlinks
Also known as back links, backward links, or inbound links, backlinks are all of the links on other websites that direct the users who click on them to your site. Backlinks can significantly improve your site’s search rankings, particularly if they contain Anchor Text keywords relevant to your site and are located on sites with high Page Rank.

Banned
Also known as delisted or blacklisted, a banned site is a URL that has been removed from a search engine’s Index, typically for engaging in Black Hat SEO. Banned sites are ignored by search engines.

Banner Ad
A banner ad is a rectangular graphic advertisement. Banner ads are one of the most common forms of online advertising. Their sizes vary, but most measure 468 pixels wide by 60 pixels high. Clicking on a banner ad will direct you to the advertiser’s website or a designated Landing Page.

Black Hat SEO
Black hat SEO is the term used for unethical or deceptive optimization techniques. This includes Spam, Cloaking, or violating search engine rules in any way. If a search engine discovers a site engaging in black hat SEO it will remove that site from its Index.

Blacklisted
Also known as banned or delisted, a blacklisted site is a URL that has been removed from a search engine’s Index, typically for engaging in Black Hat SEO. Blacklisted sites are ignored by search engines.

Broken Link
Also known as a dead link, a broken link is a link that no longer points to an active destination or Landing Page. Search engines dislike broken links. Keeping all of your site’s links active is an important part of ongoing optimization.

Click Fraud
Click Fraud is the illegal practice of manipulating Cost-Per-Click (CPC) or Pay-Per-Click (PPC) revenue sharing agreements. There are numerous types of click fraud, but in a typical scenario the webmaster of a site that earns money from each click of the advertising links it publishes pays individuals a small fee to click those links. Companies thus pay for advertising to clients who had no intention of buying from them. Some companies have filed class action lawsuits alleging that ad publishers such as Google and Yahoo! have failed to aggressively confront click fraud because they benefit from increased CPC revenue.

Click-Through
Click-through refers to a single instance of a user clicking on an advertising link or site listing and moving to a Landing Page. A higher Click-Through Rate (CTR) is one of the primary goals of Search Engine Optimization.

Thursday, June 26, 2008

SEO Basics

1. Understanding Business.

# What is your business all about?

# How is it going to operate?

# Target audience (Global/Local)

# Opportunity

# Competitor study: competitors operating in the same business.

# Core competence

2. Target Keywords.

# Getting a good list of keywords.

# This is something that will directly reflect your efforts (targeting the wrong keywords will make all your efforts go in vain).

3. WEB Designing.

# Navigation

# Page Size

# Look & Feel

# Inter Linking

# HTML Code Validation

4. Meta Tags & Content.

# Title

# Description

# Keywords

# Content

5. Directories.

Search engines need a starting point. That's why it is important to be listed in human-edited directories. There are 3 major web directories at this moment:

DMOZ - Free listings.

The Yahoo directory - Free listings for non-commercial sites.

Looksmart (Zeal directory) - Free listings for non-commercial sites.

6. Link Building.

When setting up the linking architecture of your site keep in mind that links are of two kinds:

1. Internal links

2. External links

Internal links are the constituent elements of the way your web site is organized. They are the bridges by which one page of your site connects to another and, more importantly, they outline which pages of your site connect to which other pages.

External links, on the other hand, are the community tools by which the online community votes for you by linking to you. Your popularity is a function of how many incoming links come into your website. This in turn is interpreted by search engines as a measure of the value you create online.

To enjoy a good PageRank it is imperative to have proper linking among the various pages within the website, as well as good-quality external links coming into each page.

A factor to keep in mind while linking is that one should focus on quality links and not always quantity. Consider that there is a huge difference between a link to your page from a page with a high PageRank and one from a page with a low PageRank.
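
To see why the source page's rank matters, here is a minimal sketch of the published PageRank formula, PR(A) = (1 - d) + d * (PR(T1)/C(T1) + ... + PR(Tn)/C(Tn)), iterated over a made-up three-page site. The damping factor d = 0.85 is the commonly cited value (note that 1 - d = 0.15, the minimum PageRank mentioned in the glossary below).

# links[page] = the pages that `page` links out to (hypothetical graph).
links = {
    "home": ["articles", "contact"],
    "articles": ["home"],
    "contact": ["home"],
}

d = 0.85                            # damping factor
pr = {page: 1.0 for page in links}  # initial guess

for _ in range(50):                 # iterate until the values settle
    for page in links:
        inbound = (pr[src] / len(links[src])
                   for src in links if page in links[src])
        pr[page] = (1 - d) + d * sum(inbound)

print(pr)  # "home" ends up highest: it receives the most inbound links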

Wednesday, June 25, 2008

Glossary of SEO Terms

These are common knowledge terms and the definitions and/or explanations have been gathered from various places.

GoogleBot - The name of the spider used by Google.

Google Dance - A term used to describe Google's index update, a time when fluctuations in rankings appear. Google's index update occurred on average once per month.

PR or Page Rank - the actual, real PageRank for each page as calculated by Google. This can range from 0.15 into the billions.

HTML - Hypertext Markup Language - the (main) language used to write web pages.

HTTP - Hypertext Transfer Protocol - the (main) protocol used to communicate between web servers and web browsers (clients).

Yahoo Slurp - The name of the spider used by Yahoo.
Scooter - The name of the AltaVista search engine's spider

Title - The text contained between the start and end HTML title tags in the head section of the source code. The title text usually appears at the top of the browser window when viewing the page. The title text forms the link to the URL page in the search engine results, and search engines pay special attention to the title text when indexing any page in your web site.

CPC - Cost per Click
PPC - Pay per Click
CPM - Cost per thousand

Spider Food - a term used for a site map, or another page of a web site, that contains links to all the pages in the site, allowing a search engine's spider to index every page.

Inbound Link - A hypertext link to a particular page from elsewhere, bringing traffic to that page. Inbound links are counted to produce a measure of the page's popularity. Searches for the inbound links to a page can be made on AltaVista, Google and HotBot.

Reciprocating Link - An exchange of links with a web site that offers similar services.

Dynamic Content - Information on web pages which is changed automatically, based on database content or user information. Search engines will currently index dynamic content, although they will not usually index URLs which contain the ? character.

robots.txt - A text file stored in the top level directory of a web site to deny access by robots to certain pages or sub-directories of the site. Only robots which comply with the Robots Exclusion Standard will read and obey the commands in this file. Robots will read this file on each visit, so that pages or areas of sites can be made public or private at any time by changing the content of robots.txt before resubmitting to the search engines.
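
A compliant robot's view of your file can be tested locally: Python's standard library ships a parser for the Robots Exclusion Standard. A minimal sketch, with a hypothetical site URL:

from urllib import robotparser

# Given a robots.txt such as:
#   User-agent: *
#   Disallow: /private/
rp = robotparser.RobotFileParser()
rp.set_url("http://www.example.com/robots.txt")
rp.read()  # fetch and parse the live robots.txt

# Check whether a given user-agent may crawl a given URL.
print(rp.can_fetch("*", "http://www.example.com/private/page.html"))
# -> False under the sample rules above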