Tuesday, October 12, 2010

New Search User Interface - Yahoo



Thanks to Google for Auto-Driving Cars.

Google has been working on auto-driving cars for a while now, and it hopes this technology will become available on real cars on the road in the future.

http://googleblog.blogspot.com/2010/10/what-were-driving-at.html


Saturday, October 9, 2010

Google Webmaster Tools update: search query parameter handling.

http://googlewebmastercentral.blogspot.com/2010/10/webmaster-tools-updates-to-search.html

Wednesday, October 6, 2010

Google Getting Another Innovation Ready in the UK.

Google looks to be readying yet another change to its search results.

Referred URL: http://blog.searchenginewatch.com/101006-090600?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+sewblog+%28Search+Engine+Watch+Blog%29&utm_content=FaceBook



Monday, September 13, 2010

SEO Tips for E-commerce

1. Maintain a Uniform and Clean Website Structure.
2. Use a Unique Title Tag.
3. Select an Appropriate Keyword.
4. Create a Relevant Description Per Web Page.
5. Include Breadcrumbs for All Inner Pages.
6. Use a Heading Tag.
7. Avoid Usage of Flash.
8. Optimize Images.
9. Optimize Anchor Text.
10. Create an SEO Friendly URL Structure.
11. Generate an XML Sitemap (see the sketch after this list).
12. Limit the Number of Outbound Links.
13. Submit to Search Directories.
14. Submit to Article Directories and Social Bookmarking Sites.
15. Integrate a Secure Payment Gateway.
16. Track User Behavior with a Good Analytics Tool.
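
Tip 11's XML sitemap is easy to produce with a small script. Below is a minimal sketch using only the Python standard library; the page URLs, change frequencies, and priorities are placeholders for your own catalog, not anything a real store ships with.

```python
# Minimal XML sitemap generator (tip 11). The page list is a placeholder;
# in a real store you would pull URLs from your product catalog.
from xml.sax.saxutils import escape

pages = [
    ("http://www.example.com/", "daily", "1.0"),
    ("http://www.example.com/category/shoes", "weekly", "0.8"),
    ("http://www.example.com/product/blue-sneaker", "weekly", "0.6"),
]

lines = ['<?xml version="1.0" encoding="UTF-8"?>',
         '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
for url, freq, priority in pages:
    lines.append("  <url>")
    lines.append("    <loc>%s</loc>" % escape(url))          # page address
    lines.append("    <changefreq>%s</changefreq>" % freq)   # update frequency
    lines.append("    <priority>%s</priority>" % priority)   # relative importance
    lines.append("  </url>")
lines.append("</urlset>")

with open("sitemap.xml", "w") as f:
    f.write("\n".join(lines))
```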







Wednesday, September 8, 2010

Google's New Innovation: Google Instant Search

Google Instant is a new search enhancement that shows results as you type. We are pushing the limits of our technology and infrastructure to help you get better search results, faster.
http://www.google.co.in/instant/#utm_campaign=launch&utm_medium=van&utm_source=instant



Google Instant features:

  • Instant results
  • Set of predictions
  • Scroll to search
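
Google has not published how Instant computes its predictions, but the "set of predictions" idea is easy to picture: rank previously seen queries by popularity and surface the ones matching the prefix typed so far. Here is a toy sketch under that assumption; the query log is made up and this is not Google's actual implementation.

```python
# Toy illustration of predictions-as-you-type -- NOT Google's actual
# implementation. A made-up query log is ranked by frequency, and the
# most popular queries matching the typed prefix are returned.
from collections import Counter

query_log = ["weather", "weather today", "web design", "webmaster tools",
             "weather today", "web design", "weather today"]
freq = Counter(query_log)

def predict(prefix, k=3):
    """Return up to k logged queries starting with prefix, most frequent first."""
    matches = [q for q in freq if q.startswith(prefix)]
    return sorted(matches, key=lambda q: -freq[q])[:k]

for typed in ["w", "we", "web"]:
    print(typed, "->", predict(typed))
```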

Friday, August 27, 2010

Google Real Time Search


New and free Service from ping.fm

Ping.fm is a free social networking and micro-blogging web service that enables users to post to multiple social networks simultaneously.
You can write an update on Ping.fm once and send it to several social networking websites at the same time. Ping.fm groups services into three categories – status updates, blogs, and micro-blogs – and updates can be sent to each group separately, as in the sketch below.
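
Ping.fm has its own developer API, but the sketch below does not use it; it is a hypothetical illustration of the fan-out idea, with placeholder poster functions standing in for the individual networks.

```python
# Hypothetical sketch of Ping.fm-style fan-out posting: one update is sent
# to every service registered in a group. The poster functions are
# placeholders, not Ping.fm's real API.
def post_to_twitter(text):
    print("twitter:", text)

def post_to_facebook(text):
    print("facebook:", text)

def post_to_blogger(text):
    print("blogger:", text)

GROUPS = {
    "status updates": [post_to_twitter, post_to_facebook],
    "blogs":          [post_to_blogger],
    "micro-blogs":    [post_to_twitter],
}

def ping(group, text):
    """Send one update to every service in the chosen group."""
    for poster in GROUPS[group]:
        poster(text)

ping("status updates", "Posting everywhere at once!")
```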

Thursday, August 26, 2010

Google Ranking Algorithm Changed

Hi All,

Two days ago Google changed its ranking algorithm (it now shows more results from a single domain).

http://googlewebmastercentral.blogspot.com/2010/08/showing-more-results-from-domain.html

Thursday, August 19, 2010

10 Tips for Getting Traffic through SMO.

1. Complete your profile on the major social networking sites.

2. Interact: spend a few minutes each day letting the people in your network know what you're up to.

3. Include a link to your site or blog on your profile page.

4. Ask your followers to “retweet” and repost.

5. Spend time each day growing your network.

6. Link your social site pages together.

7. Use your real name so that you’re easy to find.

8. Post good content.

9. Optimize some of your content.

10. Get the most benefit for your time.

Wednesday, August 18, 2010

7 SEO Tips to Boost Your Website’s Traffic:

1) KEYWORDS
2) METADATA
3) SITE STRUCTURE & TECHNOLOGY
4) INTERNAL LINKS
5) EXTERNAL LINKS
6) CONTENT
7) INTERACTIVE MEDIA

Thursday, May 13, 2010

View a Web Page as 'Googlebot'

Crawler: User Agent

Alexa-1: ia_archiver
Alexa-2: ia_archiver-web.archive.org
AskJeeves-Teoma: Mozilla/2.0 (compatible; Ask Jeeves/Teoma; +http://sp.ask.com/docs/about/tech_crawling.html)
Googlebot-2.1: Googlebot/2.1 (+http://www.google.com/bot.html)
Googlebot-Mozilla-2.1: Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)
Google-AdSense-2.1: Mediapartners-Google/2.1
MSN-1.0: msnbot/1.0 (+http://search.msn.com/msnbot.htm)
Yahoo-Slurp: Mozilla/5.0 (compatible; Yahoo! Slurp; http://help.yahoo.com/help/us/ysearch/slurp)
ZyBorg-1.0: Mozilla/4.0 compatible ZyBorg/1.0 (wn-14.zyborg@looksmart.net; http://www.WISEnutbot.com)
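
To actually fetch a page the way a crawler announces itself, you only need to send one of the user-agent strings above with the request. A minimal sketch using Python's standard library follows; the target URL is a placeholder, and note that some servers verify the real Googlebot by reverse DNS, so this only shows what a site serves to that user-agent header.

```python
# Request a page while identifying as Googlebot, using a user-agent
# string from the table above. The URL is a placeholder.
import urllib.request

url = "http://www.example.com/"  # placeholder target
req = urllib.request.Request(url, headers={
    "User-Agent": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
})
with urllib.request.urlopen(req) as resp:
    html = resp.read().decode("utf-8", errors="replace")

print(html[:500])  # first 500 characters the server returned
```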

Google Filters

I have been doing SEO for some time now and I have witnessed many a strange occurrence in the SERPs. Most of these weird occurrences, I would say, are directly attributable to a Google filter or Google penalty. Inspired by a post over at WebmasterWorld, and since as far as I know there is no current list online of all the potential Google penalties, I have decided to put together an arbitrary list of them. Please note that there is no proof, i.e. no press release from Google stating these exist; rather, these are ideas, theories, and assumptions drawn from SEOs' experiences.

Google Sandbox: The Sandbox filter is usually applied to brand new websites, but has been seen applied to domains that have been around for a while. Since most websites do not make it past a year, Google implemented a filter that prevents a new site from getting decent rankings for competitive keyword terms. Brand new sites can usually still rank for non-competitive keyword terms, though.

How to work around the Sandbox: Google uses a system called trust rank. The idea behind trust rank is that if authority sites link to your new site, then you must be an authority site as well, and since Google trusts these older, more respected sites, it will trust yours too, getting you out of the sandbox right away. That is not an easy thing to do, so if you are not able to get these links, try expanding your content to rank for many less competitive keywords and keyword phrases (long-tail keywords).

Google -30: This Google filter is applied to sites that use spammy SEO tactics. When Google finds you using doorway pages, JavaScript redirects, etc., it will drop your rankings by 30 spots.

How to get around this: If you find yourself a victim of the Google -30 filter, then usually just removing the spam elements from your site will get you back in. You can always fill out a request for re-inclusion if worse comes to worst. Here are some resources for the Google -30: Arelis, Threadwatch, SERoundtable.

Google Bombing: Google Bombing is a filter applied to sites that gain a large number of inbound links with the same anchor text. This raises a red flag to Google, as it is extremely unnatural for an inbound linking structure to use the exact same anchor text throughout.

How to work around this: If your site actually has this filter applied, then most likely you have been banned from the search engines, and a re-inclusion request is probably your best bet. If the filter is not applied, but through your monitoring you see the potential for it, then you might want to go back and request that people change your anchor text, buy some links with varying anchor text, etc. Here are some resources for Google Bombing: Search Engine Watch, NYTimes, Google Blogspot.
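
Before the filter ever becomes a worry, you can audit your own anchor-text spread. Here is a rough sketch with made-up link data; the 60% warning threshold is my assumption, not a number Google has published.

```python
# Rough anchor-text diversity audit with made-up inbound link data.
# The 60% threshold is an assumption, not a published Google figure.
from collections import Counter

inbound_links = [
    ("http://siteA.com/page1", "cheap blue widgets"),
    ("http://siteB.com/post", "cheap blue widgets"),
    ("http://siteC.com/blog", "cheap blue widgets"),
    ("http://siteD.com/", "Example Widget Co"),
]

counts = Counter(anchor for _, anchor in inbound_links)
anchor, n = counts.most_common(1)[0]
share = n / len(inbound_links)
if share > 0.6:
    print("Warning: %.0f%% of inbound links use the anchor %r"
          % (share * 100, anchor))
```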

Google Bowling: This is not really a filter so much as a series of black hat techniques that will get a site banned. Usually people use this term in reference to a competitor, or a page/site they want OUT of the SERPs. Google bowling is usually only effective against sites that are much newer, with lower trust rank. Trying to do this to a large site with high trust rank is going to be virtually impossible.

How to get around this: Google says that there is nothing a competitor can do to drop YOUR rankings. Many SEOs do not believe this, and sites like seoblackhat even sell services for something like this. A re-inclusion request is basically your only option. Here are some resources for Google Bowling: Web Pro News, ThreadWatch and SEroundtable.

Google Duplicate Content Filter: A duplicate content filter is applied to sites that take content that has already been created, cached, and indexed on other sites. News sites are usually exempted from the duplicate content filter by manual review. The pages this is applied to usually do not rank very well in the SERPs. PageRank can be devalued, and if a page does not have inbound links, you could see your results being put into omitted search results and supplemental results.

How to get through this: If you find yourself in this filter, your first step can be trying to remedy the duplicate content. Contact the person stealing your content and ask them to remove it. You can contact the person's web host to see if they will take down their site, and the last resort is trying to contact Google and alert them to what is going on. Keep on top of your content by using Copyscape to check for duplicate content.

Google Supplemental Results: Google's supplemental results take pages on your site that have been indexed and put them into a sub-database in Google. Supplemental results do not rank well; rather, Google uses its supplemental DB to populate its results when it doesn't have enough results to show for a given query. This means pages of your site in Google's supplemental DB will not help you in the SERPs.

How to get through this: It's pretty simple, actually. Just get some inbound links to your pages. Check this post out to find out more about the Google Poo (supplemental results).

Google Domain Name Age Filter: The Google domain name age filter is closely related to trust rank and the sandbox, but it is possible to be out of the sandbox, have trust rank, and still be in this filter. The idea behind it is that older sites and domain names are more likely to rank well for keyword terms than newer sites. If you are in this filter, you will most likely not rank well for competitive terms until your site grows older.

How to work around this: Quality links from authority sites with high trust rank will help you do much better in the SERPs.

Google’s Omitted Results Filter: Pages within your website that are in omitted search results will not show up in a Google search unless a user specifically asks to see all omitted results. Users usually do not even get to the last page to do this, which leaves any omitted page of yours completely out of a Google search result. The reasons this happens are lack of inbound links, duplicate content, duplicate meta titles, duplicate meta descriptions, and poor internal linking.

How to get out of this: To get omitted pages out of this filter, simply alter the meta tags, fix the duplicate content, and get some quality inbound links. A sketch of a duplicate-meta audit follows below.
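
Here is that duplicate-meta audit as a minimal sketch; the page data is hard-coded for illustration, where in practice you would collect it by crawling your own site.

```python
# Audit pages for duplicate meta titles/descriptions, one of the causes
# of omitted results listed above. Page data here is illustrative only.
from collections import defaultdict

pages = {
    "/red-shoes":  ("Shoes | Example Store", "Buy shoes online."),
    "/blue-shoes": ("Shoes | Example Store", "Buy shoes online."),
    "/about":      ("About Us | Example Store", "Who we are."),
}

groups = defaultdict(list)
for url, meta in pages.items():
    groups[meta].append(url)

for (title, description), urls in groups.items():
    if len(urls) > 1:
        print("Duplicate meta %r shared by: %s" % (title, ", ".join(urls)))
```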

Google’s Trust Rank Filter: Like the PageRank algorithm, the trust rank algorithm has many factors that determine a site's trust rank. Some of the known factors are the age of a site, the number of quality authority links pointing to it, how many outbound links it has, the quality of its inbound linking structure, its internal linking structure, and overall SEO best practices in meta and URL structure. All sites go through this filter, and if your trust rank is low, so will be your rankings in the SERPs.

How to work with this: An old site and a new site can both have high trust rank or low trust rank. It is basically determined by the number of quality authority links pointing to the site, how many outbound links it has, the quality of its inbound linking structure, its internal linking structure, and overall SEO best practices in meta and URL structure. Optimize these and you will have quality trust rank.

links.htm Page Filter: This filter penalizes a site's ranking based on the use of a links.htm page. Reciprocal linking is an old technique that is no longer promoted by Google. This filter affects your ranking in the SERPs.

How to work with this filter: Instead of using “links” as your page title and name, try using something like “mynewbuddies” or “coolsites”, as this will help get around this filter. Reciprocal links are an old SEO technique, and Google devalues reciprocal linking structures. Here is someone discussing this at SEOChat.


Reciprocal Link Filter: Google is very open about reciprocal linking and clearly states that its algorithm can detect reciprocal link campaigns. Usually sites that only participate in reciprocal linking will have a hard time ranking in the search engines, but depending on what you are using your site for, a reciprocal link campaign might be exactly what you need. For example, if you are building an AdSense site, you do not want to spend too much time building the site up, and a reciprocal linking campaign will help your site's inbound links grow over time.

How to work with this filter: When it comes to building an inbound linking structure, try to utilize some or all of the 15 types of links from the how-to-get-them post I did a ways back. Here are some resources about this filter: Matt Cutts here and here, Search Engine Guide, and WebmasterWorld.
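
Detecting reciprocal pairs in a link graph is straightforward, which is partly why Google can spot these campaigns. A small sketch over a made-up graph:

```python
# Find reciprocal pairs in a link graph: A <-> B whenever A links to B and
# B links back to A. The graph below is made up for illustration.
links = {
    "siteA.com": {"siteB.com", "siteC.com"},
    "siteB.com": {"siteA.com"},
    "siteC.com": {"siteD.com"},
}

reciprocal = set()
for site, outbound in links.items():
    for target in outbound:
        if site in links.get(target, set()):
            reciprocal.add(frozenset((site, target)))

for pair in reciprocal:
    print("Reciprocal pair:", " <-> ".join(sorted(pair)))
```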


Link Farming Filter: Link farms are sites/pages that have a mass of unrelated links grouped together arbitrarily. Link farms can also hold related links, but most commonly they are unrelated. IP farms and bad link neighborhoods are all part of link farming. Being part of a link farm can get your rankings dropped in Google and possibly get you banned.

How to get around this: Currently the only way to get around this is to NOT participate in link farming.

Co-citation Linking Filter: This popular Google filter watches your inbound link structure. If your automotive site is an outbound link on a site whose other outbound links are related to casinos and porn sites, then Google will think your site is related to porn and casinos too. Poorly constructed co-citation will damage your ranking and make it hard for you to rank well for the terms you are targeting.

How to work with this: When considering a link partner or a paid link, or when monitoring your inbound links, be sure to follow the linking quality guideline page derived from Patrick Gavin over at Text Link Ads.
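
A crude self-check before accepting a link: look at the other outbound domains on the page that would carry your link and flag obviously off-topic neighbors. The term list and domains below are illustrative only.

```python
# Crude "bad neighborhood" check: flag other outbound domains on a
# prospective link page that match off-topic terms. Lists are made up.
BAD_TERMS = ("casino", "porn", "pills")

other_outbound_domains = [
    "grandcasino-online.example",
    "autoblog.example",
    "cheap-pills.example",
]

flagged = [d for d in other_outbound_domains
           if any(term in d for term in BAD_TERMS)]
if flagged:
    print("Risky co-citation neighbors:", ", ".join(flagged))
else:
    print("No obviously off-topic neighbors found.")
```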

Too Many Links at Once Filter: This filter is applied when too many inbound links are acquired by a site too fast. The result can be a ban across all search engines. How these links are obtained, how many there are, and over what period of time they appear are all factors for this filter.

How to get around this: Simply do not participate in black hat linking schemes and link spamming and you should never have a problem with this. Here is some information concerning this filter over at Aaron Wall's SEObook.com.

Too Many Pages at Once Filter: Google is keen on natural site development. Anything that looks “unnatural” is going to be flagged by the search engines, and adding too many pages too fast will raise this flag/filter. Some people believe that 5,000 pages a month is the maximum, but in my opinion this number can fluctuate depending on other factors and filters your site might be going through at any given time. The effect of this filter can be pages being omitted, pages landing in supplemental results, and, in the extreme case, a Google ban.

How to get through this filter: If you have a system that pulls content in, or are using a dynamic content generator, be sure to limit it per week; I would stay under 5,000 pages per month just to be on the safe side. Depending on how large or well known your site is, that limit will adjust.


Broken Link Filter: Broken internal links can prevent pages from being crawled, cached, and indexed. If pages such as your home page do not have a link back to them on all pages, this can count against you in the SERPs and against your overall quality score for things like PR. This is not just bad SEO and bad site design; it is bad for your users and can cause poor traffic and poor SERP rankings.

How to get through this: Make sure you have a quality footer and a sitemap that covers all of your pages in one central hub, and make sure you test your site for broken links (be sure to use full URLs in your linking via source code). A minimal checker sketch follows.
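
This sketch uses only the Python standard library; the URL list is a placeholder for your own site's pages.

```python
# Minimal broken-link check: request each URL and report anything that
# errors or returns a non-200 status. URLs are placeholders.
import urllib.error
import urllib.request

urls = ["http://www.example.com/", "http://www.example.com/missing-page"]

for url in urls:
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            status = resp.getcode()
    except urllib.error.HTTPError as e:
        status = e.code                      # e.g. 404 for a broken link
    except urllib.error.URLError as e:
        status = "error: %s" % e.reason      # DNS failure, timeout, etc.
    print(url, "->", status)
```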

Page Load Time Filter: The page load filter is very simple. If your website takes too long to load, a spider will time out and move past your site or page. This results in the page NEVER being cached and indexed, which ultimately means your site or page will not be present in Google's SERPs.

How to work with this: Make sure your pages are optimized for load time. If you are using Flash or many images, use JavaScript preloading. Limit the file size of your pages as much as possible so the spiders can read the entire document, and be sure to use web 2.0 and CSS best practices.
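
You can get a rough feel for fetch time with a few lines of code. This only times the raw HTML download, not images, CSS, or scripts, so treat it as a lower bound; the URL is a placeholder.

```python
# Time a single page fetch. This measures only the HTML download, so the
# full page load (images, CSS, scripts) will be slower.
import time
import urllib.request

url = "http://www.example.com/"  # placeholder target
start = time.time()
with urllib.request.urlopen(url, timeout=30) as resp:
    body = resp.read()
elapsed = time.time() - start

print("%s: %d bytes in %.2f seconds" % (url, len(body), elapsed))
```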

Over Optimization Filter: Over optimization can cause a Google ban or hardship in rankings. Over optimization could be keyword stuffing, too much keyword density, keyword proximity optimization, meta tag stuffing, etc. Stay away from over optimization.

How to get around this: Don’t over optimize!
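
If you want a number to sanity-check against, a quick density count is enough. Note there is no official “safe” density, so the 5% flag below is purely my assumption.

```python
# Quick keyword-density check. The 5% threshold is an assumption;
# Google publishes no official "safe" density.
import re

text = ("Blue widgets are great. Buy blue widgets today, because blue "
        "widgets beat all other widgets.")
keyword = "widgets"

words = re.findall(r"[a-z']+", text.lower())
density = words.count(keyword) / len(words)
print("Density of %r: %.1f%%" % (keyword, density * 100))
if density > 0.05:
    print("Possible keyword stuffing -- consider rewriting.")
```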

There are some filters I have not mentioned, so I thought I would give a smaller list of others that could be attributed to Google:

Keyword Stuffing Filter
Meta Tag Stuffing Filter
Automated Google Query Filter
IP Class Filter
Google Toolbar Filter
Click-through Filter in SERPs
Traffic Filter
Google -950 Filter