Wednesday, September 24, 2008

Common Mistakes in SEO

Search Engine Optimization is the practice of making a website friendly both to search engines and to the people who find businesses through them. Webmasters commonly make the following mistakes.

Over-optimization can also hurt your website's ranking.

1. Incorrectly Designed Websites:

  • Lack of proper navigation
  • Using frames to save web designers' design time
  • Large images that make pages slow to download. If large images are necessary, consider using thumbnails that open the full image in a separate page. (This also creates more pages and more text for the spiders to crawl.)
  • Using high-resolution graphics (try to use low-resolution graphics instead)
2. Poorly Written Content:
Content absolutely must include targeted keywords and phrases. Well-written content lets you work in more targeted keywords and appropriate phrases.

The absence of targeted keywords and phrases can break your site. If related keywords do not appear in your body text, your site will not appear in the listings when a user types keywords related to your site.

Make sure the keywords placed in the meta keyword tag are relevant to your content.

People who reach your site from a search will leave as soon as they see the home page if it is irrelevant or does not match the keyword or phrase they searched for. Use tools such as Wordtracker to find what people are actually typing into search engines to find goods and services similar to yours, and concentrate on ranking well for those terms.

3. Replica of Content:
Publishing more than one page with different names but identical content is treated as a trick by search engines and hurts your ranking. Never copy content from other websites.
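
One simple way to see this problem from a crawler's point of view is to fingerprint each page's text: pages whose normalized text hashes to the same value are exact duplicates. A minimal sketch in Python (the page names and texts are made up for illustration; real detection also catches near-duplicates, which a plain hash does not):

```python
import hashlib

def content_fingerprint(text):
    """Collapse whitespace and case, then hash, so trivially identical pages collide."""
    normalized = " ".join(text.lower().split())
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

# Hypothetical site pages
pages = {
    "about.html": "Welcome to our fruit shop. We sell fresh fruit.",
    "about-us.html": "Welcome to our fruit   shop.  We sell fresh fruit.",
    "contact.html": "Email us at the address below.",
}

seen = {}
for name, text in pages.items():
    fp = content_fingerprint(text)
    if fp in seen:
        print(f"{name} duplicates {seen[fp]}")
    else:
        seen[fp] = name
```

Here `about-us.html` is flagged as a duplicate of `about.html` even though the spacing differs.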

4. Improper Use of Meta Tags, or a Site Without Meta Tags:
Meta tags hold a page's keywords and description. They help search engines index pages quickly and can help a website's ranking, so meta tags should be included on every page of the website.

Improper meta tags, or missing ones, can misguide search engines and lead to improper listing of the website. A missing page title will also harm the site's ranking.
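
A quick audit of these tags can be automated. The sketch below uses Python's standard `html.parser` to pull out the `<title>` and the named meta tags from a page and warn when they are missing (the sample page is invented for illustration):

```python
from html.parser import HTMLParser

class MetaTagAudit(HTMLParser):
    """Collect the <title> text and named <meta> tags from a page."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.meta = {}
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and "name" in attrs:
            self.meta[attrs["name"].lower()] = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

page = """<html><head>
<title>Fresh Fruit Shop</title>
<meta name="description" content="Fresh fruit delivered daily.">
<meta name="keywords" content="fruit, fresh fruit, fruit shop">
</head><body>...</body></html>"""

audit = MetaTagAudit()
audit.feed(page)
if not audit.title:
    print("Missing <title> -- this will harm ranking")
if "description" not in audit.meta:
    print("Missing meta description")
```

Running this over every page of a site catches the "site without meta tags" mistake before a search engine does.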

5. Absence of a Sitemap:
Sitemaps help web crawlers index websites more quickly and efficiently. A sitemap presents the structure of the entire website on one page, which is very useful for search engine optimization. Sitemaps generally take search engines directly to each page instead of making them discover the links.

Google Sitemaps is an easy way to tell Google about all the pages on your site, which pages are most important to you, and when those pages change, for a smarter crawl and fresher search results.
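
Generating such a sitemap is straightforward. This sketch builds a minimal XML sitemap in the sitemaps.org format using only the standard library (the example URLs and dates are placeholders):

```python
from xml.etree import ElementTree as ET

def build_sitemap(urls):
    """Build a minimal XML sitemap (sitemaps.org protocol) from (loc, lastmod) pairs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for loc, lastmod in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod
    return ET.tostring(urlset, encoding="unicode")

xml = build_sitemap([
    ("http://www.example.com/", "2008-09-24"),
    ("http://www.example.com/products.html", "2008-09-20"),
])
print(xml)
```

The resulting file is uploaded to the site root and submitted through Google's webmaster tools.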

7. Page Cloaking:

In this method webmasters deceive search engines by serving page content that differs from the declared description. Spidering robots, recognized by their IP addresses or host names, are redirected to a page specially polished to meet the search engine's requirements but unreadable to a human being. To detect cloakers, spiders often visit from fake IP addresses and under fictitious names. Search engines also collect user feedback on how relevant the content is to its description, and their staff review flagged pages; if a difference is found, the site is penalized.


8. Spamming:

If the keyword density becomes excessive, the page is regarded as spam; this is also called keyword stuffing. Some webmasters repeat a keyword or key phrase as many times as possible to make a page seem more relevant for it, but overuse looks unnatural, both to search engines and to human visitors. Search engines penalize such pages by reducing their rank, and hardly any user will wish to return to such a page after visiting it once.

Another spamming method used by webmasters is hiding multiple keywords with colours.
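
Keyword density is easy to measure yourself before a search engine does. A minimal sketch (the sample text and the ~5% ceiling are illustrative guesses, not a published search engine rule):

```python
import re

def keyword_density(text, keyword):
    """Fraction of the words in the page text that match the keyword."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return hits / len(words)

page_text = ("Buy cheap widgets. Cheap widgets here. "
             "Widgets widgets widgets, the best widgets!")
density = keyword_density(page_text, "widgets")
print(f"keyword density: {density:.0%}")
if density > 0.05:  # assumed rough ceiling for illustration
    print("Density looks excessive -- risks being read as keyword stuffing")
```

Here half the words on the page are the keyword, which would look unnatural to both engines and visitors.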

9. Link Farms:

To artificially increase link popularity, many site owners unite in so-called link farms. These are networks where everyone links to everyone else, concentrating on the quantity of links and disregarding their quality.

Too many links are worthless: a page that contains links, just links, and nothing but links will not be authoritative. Modern search engines analyse link quality in terms of website relevancy, ranking a link highly if it leads to a site devoted to similar issues. Choose link exchange partners whose business is similar to yours; your partners' sites, or web portals devoted to your business issues, are ideal for link exchange.
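
A page that is "just links and nothing else" can be spotted with a crude heuristic: compare the number of links to the amount of surrounding text. The thresholds below are illustrative guesses, not rules any search engine has published:

```python
def looks_like_link_farm(num_links, word_count, max_links=100, min_words_per_link=5):
    """Heuristic: flag pages that are almost nothing but links.

    Thresholds are assumptions for illustration only.
    """
    if num_links == 0:
        return False
    return num_links > max_links or word_count / num_links < min_words_per_link

print(looks_like_link_farm(num_links=250, word_count=300))  # True: far too many links
print(looks_like_link_farm(num_links=12, word_count=900))   # False: links embedded in real content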

10. Don’t Use a Robot to Write Content for Your Website:

Never let a machine write content for your website. Certain programs duplicate existing content while making small changes here and there. If Google catches your website employing this technique you are in trouble, so always write your own content.

11. Hiding Keywords:

Font matching is hiding keywords by making the font color the same as the background color. Search engines are sophisticated at catching this fraud and will remove any website using these bad SEO techniques. Placing teeny-tiny text at the bottom of a page does not help search engine optimization either.
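
The simplest form of font matching is detectable from inline styles alone. This sketch only checks literal `style` attributes where the declared `color` equals the declared `background-color`; real detectors also resolve stylesheets and color-name equivalents:

```python
import re

def has_font_matching(html):
    """Flag elements whose inline style sets color equal to background-color."""
    for style in re.findall(r'style="([^"]*)"', html, re.IGNORECASE):
        props = dict(
            (k.strip().lower(), v.strip().lower())
            for k, _, v in (p.partition(":") for p in style.split(";") if ":" in p)
        )
        if "color" in props and props.get("background-color") == props["color"]:
            return True
    return False

snippet = '<div style="color: #ffffff; background-color: #ffffff">cheap widgets ...</div>'
print(has_font_matching(snippet))  # True: white text on a white background
```

Any page where this fires is exactly the kind of fraud the engines penalize.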

12. Distributing Trojans, Viruses, and Badware:

Don't distribute trojans, viruses, or badware. If Google finds your website distributing them, your website will be removed from the Google index to protect the public from these bad practices. Always check software before you agree to distribute it, and make sure your servers are secured so that hackers can't hijack the website and distribute malicious software.

13. Doorway Pages:

Doorway pages, also called gateway pages, are optimized by unethical SEO professionals. In this technique a web page is designed around one key term but is really a gateway that leads you to different content.

For instance, "blackberry," "blueberry," and "strawberry" gateway pages may all be designed to get you to go to "fruit punch."
Be careful with affiliate programs, because Google may consider some affiliate pages to be doorway pages.


Saturday, September 20, 2008

Seo Methodology

Search Engine Optimization is the practice of designing a website so that it is friendly to the tools (called spiders) that search engines use to analyze websites.


  • Offline/off page Optimization
  • Online/on page Optimization
  • Position Monitoring

Off-page Optimization:
  • Hosting of a Google sitemap
  • Website submission to all leading search engines with global databases
  • Submission to country-specific search engines with country-related databases
  • Submission to general directories
  • Submission to product-specific directories
  • Submission to country-specific directories
  • Trade lead posting


On-page Optimization:
  • Pre-optimization report
  • Keyword research
  • Competitors' website analysis
  • Rewriting robot-friendly text
  • H1/H2 tag Optimization
  • Title tag Optimization
  • Meta tag Optimization
  • Keywords Optimization
  • Alt tag Optimization
  • Website structure Optimization
  • Body text and content Optimization
  • Sitemap for link Optimization
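
Several of the on-page checks above (title tag, H1 tags, alt tags) can be scripted. A minimal audit sketch using only regular expressions; the sample page and the ~65-character title limit are assumptions for illustration:

```python
import re

def onpage_audit(html):
    """Quick on-page checks mirroring the checklist above (illustrative only)."""
    issues = []
    title = re.search(r"<title>(.*?)</title>", html, re.IGNORECASE | re.DOTALL)
    if not title or not title.group(1).strip():
        issues.append("missing or empty <title>")
    elif len(title.group(1)) > 65:  # assumed rough display limit
        issues.append("<title> longer than ~65 characters")
    if not re.search(r"<h1[\s>]", html, re.IGNORECASE):
        issues.append("no <h1> heading")
    for img in re.findall(r"<img[^>]*>", html, re.IGNORECASE):
        if "alt=" not in img.lower():
            issues.append("image without alt text")
    return issues

html = '<html><head><title>Widgets</title></head><body><img src="w.png"></body></html>'
print(onpage_audit(html))
```

Run against the sample page, this reports a missing `<h1>` and an image with no alt text, two common gaps in the checklist above.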

Position Monitoring:
  • Monitoring website ranking for different keywords
  • Renewing expired trade leads and posting new ones
  • Constant research into updated technology for better positioning
  • Research on currently popular directories and site submission
  • Changing methodology with changes in search engine algorithms

Architecture of search engines:

Spider – a browser-like program that downloads web pages.

Crawler – a program that automatically follows all of the links on each web page.

Indexer – a program that analyzes web pages downloaded by the spider and the crawler.

Database – storage for downloaded and processed pages.

Results engine – extracts search results from the database.

Web server – a server that is responsible for interaction between the user and the other search engine components.
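
The way these components fit together can be sketched end to end in a few lines. The toy link graph below stands in for the web; a real spider would fetch pages over HTTP, and a real database would persist the index:

```python
from collections import deque

# Toy link graph standing in for the web (assumed data for illustration).
PAGES = {
    "a.html": {"text": "fresh fruit shop", "links": ["b.html", "c.html"]},
    "b.html": {"text": "fruit delivery prices", "links": ["a.html"]},
    "c.html": {"text": "contact the shop", "links": []},
}

def crawl(start):
    """Spider + crawler: download each page once and follow every link."""
    seen, queue, downloaded = set(), deque([start]), {}
    while queue:
        url = queue.popleft()
        if url in seen or url not in PAGES:
            continue
        seen.add(url)
        downloaded[url] = PAGES[url]["text"]
        queue.extend(PAGES[url]["links"])
    return downloaded

def index(downloaded):
    """Indexer: build a word -> pages mapping (the database)."""
    inverted = {}
    for url, text in downloaded.items():
        for word in text.split():
            inverted.setdefault(word, set()).add(url)
    return inverted

def search(inverted, word):
    """Results engine: look the query up in the database."""
    return sorted(inverted.get(word, set()))

db = index(crawl("a.html"))
print(search(db, "fruit"))  # pages whose text contains "fruit"
```

Querying `"fruit"` returns the two pages that mention it, which is the whole spider → index → results pipeline in miniature.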

How do Search Engines Work?

All search engines consist of three main parts:

  • The Spider (or worm)
  • The Index
  • The Search Algorithm.

The spider (or worm) continuously ‘crawls’ web space, following links that lead either to different websites or within the limits of one website. The spider ‘reads’ each page's content and passes the data to the index.

The index is the next stage of the search engine after crawling. It is a storage area of huge magnitude for spidered web pages; Google's index, for example, is said to consist of more than three billion pages.

The search algorithm is the third and most sophisticated part of a search engine system. It is a very complicated mechanism that sorts an immense database within a few seconds and produces the results list. The more relevant the search engine judges a web page to be, the nearer it appears to the top of the list, so site owners and webmasters should watch their site's relevancy to its keywords.

The algorithm is unique to each search engine, and it is a trade secret kept hidden from the public.

Most modern web searches combine the two systems to produce their results.
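
Although real ranking algorithms are secret, the basic idea of sorting a database by relevancy can be illustrated with a toy scorer that counts query-term occurrences (the pages below are invented; real engines weigh many more signals):

```python
def rank(pages, query):
    """Toy relevance ranking: count query-term occurrences, highest first.

    Real search algorithms are far more complex and are trade secrets.
    """
    terms = query.lower().split()

    def score(text):
        words = text.lower().split()
        return sum(words.count(t) for t in terms)

    return sorted(pages, key=lambda p: score(pages[p]), reverse=True)

pages = {
    "a.html": "fruit punch recipes and more fruit ideas",
    "b.html": "fruit stand locations",
    "c.html": "garden tools",
}
print(rank(pages, "fruit"))  # most relevant page listed first
```

The page that mentions the query twice outranks the one that mentions it once, and the irrelevant page lands last, which is the "nearer the top of the list" behaviour described above.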

Wednesday, August 6, 2008

Hit
Hit is a somewhat misleading measure of traffic to a web site. One hit is recorded for each file request in a web server’s access log. If a user visits a page with four images, one hit will be recorded for each graphic image file plus another for the page’s HTML file. A better measure of traffic volume is the number of pages/HTML files accessed.
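
The hits-versus-pages distinction is easy to see by counting requests in an access log. The log lines below are hypothetical examples in Common Log Format; only the requested path matters for this count:

```python
# Hypothetical access-log lines (Common Log Format, abbreviated).
log_lines = [
    '1.2.3.4 - - [24/Sep/2008] "GET /index.html HTTP/1.1" 200 1043',
    '1.2.3.4 - - [24/Sep/2008] "GET /img/logo.gif HTTP/1.1" 200 2310',
    '1.2.3.4 - - [24/Sep/2008] "GET /img/photo1.jpg HTTP/1.1" 200 8891',
    '1.2.3.4 - - [24/Sep/2008] "GET /about.html HTTP/1.1" 200 977',
]

hits = len(log_lines)  # every file request counts as a hit
page_views = sum(
    1 for line in log_lines
    if line.split('"')[1].split()[1].endswith((".html", "/"))
)
print(hits, page_views)  # 4 hits, but only 2 page views
```

One visitor loading two pages with images produced four hits but only two page views, which is why page views are the better traffic measure.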

HTML
The acronym HTML stands for HyperText Markup Language, the authoring language used to create pages on the World Wide Web. HTML is a set of codes or HTML tags that provide a web browser with directions on how to structure a web page’s information and features.

Hyperlink
Also known as link or HTML link, a hyperlink is an image or portion of text that when clicked on by a user opens another web page or jumps the browser to a different portion of the current page. Inbound Links with keyword-relevant Link Text are an important part of Search Engine Optimization Strategy.

Index
An index is a Search Engine’s database. It contains all of the information that a Crawler has identified, particularly copies of World Wide Web pages. When a user performs a Query, the search engine uses its indexed pages and Algorithm set to provide a ranked list of the most relevant pages. In the case of a Directory, the index consists of titles and summaries of registered sites that have been categorized by the directory’s editors.

Inbound Links
Also known as back link, backward link, or backlinks, inbound links are all of the links on other websites that direct the users who click on them to your site. Inbound links can significantly improve your site’s search rankings, particularly if they contain Anchor Text keywords relevant to your site and are located on sites with high Page Rank.

Monday, August 4, 2008

Google AdSense
Google AdSense is an ad-serving program operated by Google that provides relevant text, image, and video-based advertisements to enrolled site owners. Advertisers register via Google AdWords and pay for ads on a Pay-Per-Click, Cost-Per-Thousand or Cost-Per-Action basis. This revenue is shared with Google AdSense host sites, typically on a PPC basis (which sometimes leads to Click Fraud). Google uses its search Algorithms and Contextual Link Inventory to display the most appropriate ads based on site content, Query relevancy, ad “quality scores,” and other factors.

Google AdWords
Google AdWords is the Keyword Submission program that determines the advertising rates and keywords used in the Google AdSense program. Advertisers bid on the keywords that are relevant to their businesses. Ranked ads then appear as sponsored links on Google Search Engine Results Pages (SERPS) and Google AdSense host sites.

Graphical Search Inventory (GSI)
Graphical Search Inventory is the visual equivalent of Contextual Link Inventory. GSI is non-text-based advertising such as Banner Ads, pop-up ads, browser toolbars, animation, sound, video and other media that is synchronized to relevant Keyword queries.

Gray Hat SEO
Gray hat SEO refers to Search Engine Optimization strategies that fall in between Black Hat SEO and White Hat SEO. Gray hat SEO techniques can be legitimate in some cases and illegitimate in others. Such techniques include Doorway Pages, Gateway Pages, Cloaking and duplicate content.

Hidden Text

Hidden text is a generally obsolete form of Black Hat SEO in which pages are filled with a large amount of text that is the same color as the background, rendering keywords invisible to the human eye but detectable to a search engine Crawler. Multiple Title Tags or HTML comments are alternative hidden text techniques. Hidden text is easily detectable by search engines and will result in Blacklisting or reduced Rank.

Thursday, July 24, 2008

What type of content are you adding to your Web site?

Your Web content needs to be high-quality content that is
well-written and engaging. We talk a lot about the richness and
value of providing useful content that holds high interest for
your audience of visitors.

Tip: Remember the power of using nostalgia where appropriate.

By using nostalgia, you can often connect with your readers to
activate their positive memories and past experiences. After all,
regardless of what your online business is about, the Web is really
all about connecting with others.

Wednesday, July 23, 2008

Search Engine Optimization And Sitemaps Effects

Search Engine Optimization is the process of improving the amount of traffic that a website receives naturally from search engines. A website gets traffic from search engines when it ranks high for its targeted keywords. A ranking in the search results is not permanent, as search engines frequently change their algorithms in order to provide the best results, so you need to work on your site consistently to maintain and improve its rankings.

However, it can take a good amount of time to see the desired results, as there are already a large number of websites on the Internet and new ones are launched at regular intervals. So work consistently, without deviating from your target, because you're competing against a large number of websites.

On-page optimization and off-page optimization are the two forms of Search Engine Optimization, and both must be considered while optimizing a website. In on-page optimization, you have control over the page and modify its internal aspects to optimize it. In off-page optimization, you don't have control over the pages that link to your website.

There are a number of factors that are to be considered while optimizing a website so as to improve its ranking. Title, keyword density, unique content, interlinking, anchor text, backlinks, sitemap are some of the key factors that are to be considered while optimizing a website. Each factor has its own importance and it needs to be properly used in order to rank high in the search results.

What's a Sitemap and what are its Benefits: Using a sitemap is one of the tactics that is usually underestimated while optimizing a website. If you're wondering what a sitemap is: it is the map of the site, a page which displays the different sections of the website and its articles, and how those sections are linked together.

A sitemap is very important because it is used to communicate with search engines: an XML sitemap is meant for search engines, whereas an HTML sitemap is meant for human visitors. Sitemaps inform search engines about changes to your site, which helps those changes get indexed faster than on a site without a sitemap. In addition to faster indexing, a sitemap also helps you find and fix broken internal links. A website without a sitemap can still achieve high rankings, since a sitemap is not a strict requirement, but a regularly updated sitemap helps improve rankings at a better rate than going without one.

Now, if you're wondering how a sitemap is created and where it is placed: you can use sitemap generator tools to generate a sitemap for your website. Once the sitemap is ready, upload it to the server. Before uploading, make sure the sitemap is absolutely correct, as an improper sitemap can cause de-indexing of the website. You can use online tools to check whether the sitemap is properly formed.
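
Basic well-formedness checks can also be done locally before uploading. This sketch uses the standard library's XML parser; it only covers the obvious problems (broken XML, wrong root element, missing `<loc>` entries), not the full sitemaps.org schema:

```python
from xml.etree import ElementTree as ET

def check_sitemap(xml_text):
    """Basic pre-upload sanity checks for a sitemap (illustrative, not exhaustive)."""
    try:
        root = ET.fromstring(xml_text)
    except ET.ParseError as err:
        return [f"not well-formed XML: {err}"]
    ns = "{http://www.sitemaps.org/schemas/sitemap/0.9}"
    problems = []
    if root.tag != ns + "urlset":
        problems.append("root element is not <urlset> in the sitemap namespace")
    for url in root.iter(ns + "url"):
        loc = url.find(ns + "loc")
        if loc is None or not (loc.text or "").startswith("http"):
            problems.append("<url> entry without a valid <loc>")
    return problems

sitemap = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.example.com/</loc></url>
</urlset>"""
print(check_sitemap(sitemap))  # an empty list means no problems found
```

An empty problem list is the green light to upload; anything else is worth fixing first, given that a broken sitemap can hurt indexing.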

Also, adding a link to the sitemap from the website's page helps in improving the rate at which the sitemap is crawled by the search engine spiders. You should also add the sitemap to your Google Webmaster account as this decreases your reliance on external links for improving the indexing rate. So, a sitemap is a very important aspect that should be considered appropriately while optimizing a website.

Saturday, July 19, 2008

Dynamic Content
Dynamic content is web content, such as Search Engine Results Pages (SERPS), that is generated or changed based on database information or user activity. Web pages that remain the same for all visitors in every context contain “static content.” Many e-commerce sites create dynamic content based on purchase history and other factors. Search engines have a difficult time indexing dynamic content if the page includes a session ID number, and will typically ignore URLs that contain the variable delimiter “?”. Search engines will punish sites that use deceptive or invasive means to create dynamic content.
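
The session-ID problem can be checked for in your own URLs. This sketch flags query strings carrying common session parameter names; the name list is an assumption for illustration, not exhaustive:

```python
from urllib.parse import urlparse, parse_qs

def has_session_id(url):
    """Flag URLs whose query string carries a likely session ID parameter."""
    parsed = urlparse(url)
    if not parsed.query:
        return False
    params = parse_qs(parsed.query)
    session_keys = {"sessionid", "sid", "phpsessid", "jsessionid"}  # common names, assumed
    return bool(session_keys & {k.lower() for k in params})

print(has_session_id("http://shop.example.com/item?PHPSESSID=ab12"))  # True
print(has_session_id("http://shop.example.com/about.html"))           # False
```

URLs that trip this check are the ones crawlers are most likely to skip, so serving crawlable, session-free URLs for important pages is worth the effort.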

Flash Optimization
Flash is a vector graphics-based animation program developed by Macromedia. Most corporate sites feature Flash movies/animation, yet because search engine Crawlers were designed to index HTML text, sites that favor Flash over text are difficult or even impossible for crawlers to read. Flash Optimization is the process of reworking the Flash movie and surrounding HTML code to be more “crawlable” for Search Engines.

Gateway Page
Also known as a doorway page or jump page, a gateway page is a URL with minimal content designed to rank highly for a specific keyword and redirect visitors to a homepage or designated Landing Page. Some search engines frown on gateway pages as a softer form of Cloaking or Spam. However, gateway pages may be legitimate landing pages designed to measure the success of a promotional campaign, and they are commonly allowed in Paid Listings.

Geographical Targeting
Geographical targeting is the focusing of Search Engine Marketing on states, counties, cities and neighborhoods that are important to a company’s business. One basic aspect of geographical targeting is adding the names of relevant cities or streets to a site’s keywords, e.g. “Hyde Street Chicago.”

Geographic Segmentation
Geographic segmentation is the use of Analytics to categorize a site’s web traffic by the physical locations from which it originated.