Tuesday, February 26, 2013
Tips to request reconsideration from Google
Friday, February 22, 2013
Reasons Behind Bad SEO Services
- SEO Professional
- SEM Professional
- SMO Professional
- Web Designer
- Programmer
- Quality Professional
- Leader
- Investor
Google Plans for faster search using Ajax
Is your website hacked?
- Website ranks for spammy words
- Pages take longer to load and throw errors even though the hosting is fine
- Browsers show suspicious warning messages
- Web pages have disappeared from search engine results
- Homepage is de-indexed from Google
- The present status of your site
- Whether Google has any problem visiting your site
- Whether your site is distributing malware
- Whether your site is hosting malware
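One quick way to check the malware points above (assuming your domain is example.com, used here as a placeholder) is Google's Safe Browsing diagnostic page, which reports whether Google currently flags a site as suspicious:

    http://www.google.com/safebrowsing/diagnostic?site=example.com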
- Hidden text and links
- Cloaking URLs
- Automated requests to search engines
- Doorway pages
- Duplicate content
- Paid links
- Dofollow links for ads (see the nofollow example after this list)
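On the last two points: links placed for ads or payment can be kept from passing PageRank by adding rel="nofollow". A minimal sketch (the advertiser URL and anchor text are placeholders):

    <!-- Paid/ad link marked nofollow so it passes no PageRank -->
    <a href="http://advertiser.example.com/" rel="nofollow">Sponsor name</a>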
Effective way of Doing Social Bookmarking
- Complete your entire profile with full info so it looks professional
- Search for authoritative profiles, then vote and comment on their links
- Add authoritative profiles to your friends list
- Build a professional relationship with them
- Be among the first to vote and comment on their links
- In return, you will get good comments and votes on your links
- Share different kinds of good links that catch public interest.
- Digg.com
- StumbleUpon.com
- Technorati.com
- Del.icio.us
- Propeller.com
- simpy.com
- Slashdot.org, etc.
Monday, February 4, 2013
Are shortened URLs SEO friendly? … Yes
Google Webmaster Tools - SEO Updates 2012
I would like to highlight and summarize the SEO updates of 2012 that were published on the Google Webmaster Central official blog.
- Google HTTPS search is now faster in modern browsers - Mar 19th 2012
- SEO-friendly pagination algorithm update from Google - Mar 13th 2012
- Better crawl-error reporting in Google Webmaster Tools - Mar 12th 2012
- Filtering free-hosting spam websites from search results - Mar 06th 2012
- User & administrator access are now available in GWT - Mar 05th 2012
- Video markup for better indexing of videos - Feb 21st 2012
- Optimize your site to handle unexpected traffic growth - Feb 09th 2012
- Webmaster Tools Sitemaps interface with a fresh look - Jan 26th 2012
- A few changes to Top Search Queries data in Webmaster Tools - Jan 25th 2012
- Google algorithm update to page layout structure - Jan 19th 2012
- Page titles update in search results - Jan 12th 2012
1. Faster HTTPs Search:
Google made some changes to its HTTPS search. If you are logged in to your Gmail account and search for something on Google, clicking a specific result now takes you to that web page a little faster than with the earlier HTTPS search. The Google team has reduced the latency of HTTPS search in modern browsers such as Chrome.
2. SEO Friendly Pagination:
Most eCommerce and content portals have a pagination issue: pagination creates many duplicate pages, built around product titles in eCommerce portals and article titles in content portals. Google has now updated its algorithm accordingly, and the following are the two recommended ways to fix the pagination issue (see the markup sketch after this list):
- Insert a "View All" page and use rel="canonical" on the paginated sub-pages
- Use rel="prev" and rel="next"
3. Upgraded Crawl Errors Reporting:
Crawl Errors is a feature in Webmaster Tools that reports all crawl errors on your website. The Google Webmaster team has started detecting new kinds of errors and reporting all errors in a better way, categorizing them into two types: "Site Errors" and "URL Errors".
4. Filtering Free hosting spam websites:
We know that "few of the hosting providers will offer free hosting services". Due to free hosting most of the spammers will create low quality content websites dynamically. Google started filtering this kind of websites from search results and also requesting free hosting providers to show patterns of spam to Google team, so that Google team can effectively update their algorithm and filter these spam sites from search results effectively.
5. Now Webmaster tools have both admin & user access:
Earlier, there was no option to give another person user-level access to a Webmaster Tools account; only admin access could be shared, which isn't safe. With the latest update, we can now grant user-level Webmaster Tools access to other users.
6. Schema.org Video Markup:
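Google announced support for schema.org video markup, which helps its crawlers understand and index videos embedded in web pages. A minimal sketch using VideoObject microdata (the names, URLs, and values are placeholders):

    <div itemscope itemtype="http://schema.org/VideoObject">
      <h2 itemprop="name">Product demo</h2>
      <meta itemprop="duration" content="PT1M30S">
      <meta itemprop="thumbnailUrl" content="http://www.example.com/thumb.jpg">
      <meta itemprop="contentUrl" content="http://www.example.com/demo.mp4">
      <span itemprop="description">A short demo of the product.</span>
    </div>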
Better page titles in search results
Page titles are an important part of our search results: they're the first line of each result and they're the actual links our searchers click to reach websites. Our advice to webmasters has always been to write unique, descriptive page titles (and meta descriptions for the snippets) to describe to searchers what the page is about.
We use many signals to decide which title to show to users, primarily the page's own <title> tag if the webmaster specified one.
Other times, alternative titles are displayed for pages that have no title or a non-descriptive title specified by the webmaster in the HTML. For example, a title using simply the word "Home" is not really indicative of what the page is about. Another common issue we see is when a webmaster uses the same title on almost all of a website’s pages, sometimes exactly duplicating it and sometimes using only minor variations. Lastly, we also try to replace unnecessarily long or hard-to-read titles with more concise and descriptive alternatives.
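Following that advice, here is a minimal sketch of a unique, descriptive title and meta description for a hypothetical product page (the store name and wording are placeholders):

    <head>
      <!-- Page-specific, descriptive title instead of a generic "Home" -->
      <title>Blue Widgets - Pricing and Specs | Example Store</title>
      <!-- Meta description, used for the snippet in search results -->
      <meta name="description" content="Compare prices and specifications for blue widgets at Example Store, including shipping details.">
    </head>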
For more information about how you can write better titles and meta descriptions, and to learn more about the signals we use to generate alternative titles, we've recently updated the Help Center article on this topic. Also, we try to notify webmasters when we discover titles that can be improved on their websites through the HTML Suggestions feature in Webmaster Tools; you can find this feature in the Diagnostics section of the menu on the left hand side.
As always, if you have any questions or feedback, please tell us in the Webmaster Help Forum.
Posted by Pierre Far, Webmaster Trends Analyst
Page layout algorithm improvement
In our ongoing effort to help you find more high-quality websites in search results, today we’re launching an algorithmic change that looks at the layout of a webpage and the amount of content you see on the page once you click on a result.
As we’ve mentioned previously, we’ve heard complaints from users that if they click on a result and it’s difficult to find the actual content, they aren’t happy with the experience. Rather than scrolling down the page past a slew of ads, users want to see content right away. So sites that don’t have much content “above-the-fold” can be affected by this change. If you click on a website and the part of the website you see first either doesn’t have a lot of visible content above-the-fold or dedicates a large fraction of the site’s initial screen real estate to ads, that’s not a very good user experience. Such sites may not rank as highly going forward.
We understand that placing ads above-the-fold is quite common for many websites; these ads often perform well and help publishers monetize online content. This algorithmic change does not affect sites that place ads above-the-fold to a normal degree, but affects sites that go much further to load the top of the page with ads to an excessive degree or that make it hard to find the actual original content on the page. This new algorithmic improvement tends to impact sites where there is only a small amount of visible content above-the-fold or relevant content is persistently pushed down by large blocks of ads.
This algorithmic change noticeably affects less than 1% of searches globally. That means that in less than one in 100 searches, a typical user might notice a reordering of results on the search page. If you believe that your website has been affected by the page layout algorithm change, consider how your web pages use the area above-the-fold and whether the content on the page is obscured or otherwise hard for users to discern quickly. You can use our Browser Size tool, among many others, to see how your website would look under different screen resolutions.
If you decide to update your page layout, the page layout algorithm will automatically reflect the changes as we re-crawl and process enough pages from your site to assess the changes. How long that takes will depend on several factors, including the number of pages on your site and how efficiently Googlebot can crawl the content. On a typical website, it can take several weeks for Googlebot to crawl and process enough pages to reflect layout changes on the site.
Overall, our advice for publishers continues to be to focus on delivering the best possible user experience on your websites and not to focus on specific algorithm tweaks. This change is just one of the over 500 improvements we expect to roll out to search this year. As always, please post your feedback and questions in our Webmaster Help forum.
Posted by Matt Cutts, Distinguished Engineer