Tuesday, October 12, 2010

New Search User Interface - Yahoo



Thanks to Google for Auto-Driving Cars.

Google has been working on self-driving cars for a while now, and it hopes this technology will become available to real cars on the road in the future.

http://googleblog.blogspot.com/2010/10/what-were-driving-at.html


Saturday, October 9, 2010

Google Webmaster Tools update: search query parameter handling.

http://googlewebmastercentral.blogspot.com/2010/10/webmaster-tools-updates-to-search.html

Wednesday, October 6, 2010

Another Google Innovation Getting Ready in the UK.

Google looks to be readying yet another change to its search results.

Referred URL : http://blog.searchenginewatch.com/101006-090600?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+sewblog+%28Search+Engine+Watch+Blog%29&utm_content=FaceBook



Monday, September 13, 2010

SEO Tips for E-commerce

1. Maintain a Uniform and Clean Website Structure.
2. Use a Unique Title Tag.
3. Select an Appropriate Keyword.
4. Create a Relevant Description Per Web Page.
5. Include Breadcrumbs for All Inner Pages.
6. Use a Heading Tag.
7. Avoid Usage of Flash.
8. Optimize Images.
9. Optimize Anchor Text.
10. Create an SEO Friendly URL Structure.
11. Generate an XML Sitemap.
12. Limit the Number of Outbound Links.
13. Submit to Search Directories.
14. Submit to Article Directories and Social Bookmarking Sites.
15. Integrate a Secure Payment Gateway.
16. Track User Behavior with a Good Analytics Tool.







Wednesday, September 8, 2010

Google New Innovation: Google Instant Search

Google Instant is a new search enhancement that shows results as you type. We are pushing the limits of our technology and infrastructure to help you get better search results, faster.
http://www.google.co.in/instant/#utm_campaign=launch&utm_medium=van&utm_source=instant



Google Instant features:

  • Instant results
  • Set of predictions
  • Scroll to search

Friday, August 27, 2010

Google Real Time Search


New and free Service from ping.fm

Ping.fm is a free social networking and micro-blogging web service that enables users to post to multiple social networks simultaneously.
You can post an update on Ping.fm and send it to different social networking websites at once. Ping.fm groups services into three categories – status updates, blogs, and micro-blogs – and updates can be sent to each group separately.

Thursday, August 26, 2010

Google Ranking Algorithm Changed

Hi All,

Two days ago the Google ranking algorithm changed (showing more results from a domain).

http://googlewebmastercentral.blogspot.com/2010/08/showing-more-results-from-domain.html

Thursday, August 19, 2010

10 Tips for Getting Traffic through SMO.

1. Complete your profile in major Social Networking Sites.

2. Interact: spend just a few minutes each day letting the people in your network know what you're doing.

3. Include a link to your site or blog on your profile page.

4. Ask your followers to “retweet” and repost.

5. Spend time each day growing your network.

6. Link your social site pages together.

7. Use your real name so that you’re easy to find.

8. Post good content.

9. Optimize some of your content.

10. Get the most benefit for your time.

Wednesday, August 18, 2010

7 SEO Tips to Boost Your Website’s Traffic:

1) KEYWORDS
2) METADATA
3) SITE STRUCTURE & TECHNOLOGY
4) INTERNAL LINKS
5) EXTERNAL LINKS
6) CONTENT
7) INTERACTIVE MEDIA

Thursday, May 13, 2010

View a Web Page as 'Googlebot'

Crawler                  User Agent
Alexa-1                  ia_archiver
Alexa-2                  ia_archiver-web.archive.org
AskJeeves-Teoma          Mozilla/2.0 (compatible; Ask Jeeves/Teoma; +http://sp.ask.com/docs/about/tech_crawling.html)
Googlebot-2.1            Googlebot/2.1 (+http://www.google.com/bot.html)
Googlebot-Mozilla-2.1    Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)
Google-AdSense-2.1       Mediapartners-Google/2.1
MSN-1.0                  msnbot/1.0 (+http://search.msn.com/msnbot.htm)
Yahoo-Slurp              Mozilla/5.0 (compatible; Yahoo! Slurp; http://help.yahoo.com/help/us/ysearch/slurp)
ZyBorg-1.0               Mozilla/4.0 compatible ZyBorg/1.0 (wn-14.zyborg@looksmart.net; http://www.WISEnutbot.com)

Google Filters

I have been doing SEO for some time now, and I have witnessed many a strange occurrence regarding SERPs. Most of these weird occurrences, I would say, are directly attributable to a Google filter or Google penalty. So, inspired by a post over at WebmasterWorld, and since as far as I know there is no current list online of all the potential Google penalties, I have decided to put together a list of potential Google penalties. Please note that there is no proof (i.e., no press release from Google stating these exist); rather, these are ideas, theories and assumptions drawn from SEOs' experiences.

Google Sandbox: The Sandbox filter is usually applied to brand new websites, but it has been seen applied to domains that have been around for a while. Since most websites do not make it past a year, Google implemented a filter that prevents a new site from getting decent rankings for competitive keyword terms. Brand new sites can usually still rank for non-competitive keyword terms, though.

How to work around the Sandbox: Google uses a system called trust rank. The idea behind trust rank is that if authority sites link to your new site, then you must be an authority site as well, and since Google trusts these older, more respected sites, it will trust yours too, getting you out of the sandbox right away. That is not an easy thing to do, so if you are not able to get these links, try expanding your content to rank for many more less competitive keywords and keyword phrases (long tail keywords).

Google -30: This Google filter is applied to sites that use spammy SEO tactics. When Google finds you using doorway pages, JavaScript redirects, etc., it will drop your rankings by 30 spots.

How to get around this: If you find yourself a victim of the Google -30 filter, then usually just removing the spam elements on your site will get you back in. You can always fill out a request for re-inclusion if worse comes to worst. Here are some resources for the Google -30: Arelis, Threadwatch, SERoundtable.

Google Bombing: Google Bombing is a filter applied to sites that gain a large number of inbound links with the same anchor text. This raises a red flag with Google, as it is extremely unnatural for an inbound linking structure to have the exact same anchor text everywhere.

How to work around this: If your site actually has this filter applied, then most likely you have been banned from the search engines and a re-inclusion request is probably your best bet. If the filter is not applied, but through your monitoring you see this becoming a risk, then you might want to go back and request that people change your anchor text, buy some links with varying anchor text, etc. Here are some resources for Google Bombing: Search Engine Watch, NYTimes, Google Blogspot.

Google Bowling: This is not really a filter so much as a series of black hat techniques used to get a page or site banned. Usually people use this term in reference to a competitor's page/site they want OUT of the SERPs. Google bowling is usually only effective against sites that are newer, with lower trust rank. Trying to do this to a large site with high trust rank is going to be virtually impossible.

How to get around this: Google says that there is nothing a competitor can do to drop YOUR rankings. Many SEOs do not believe this, and seoblackhat even sells services for something like this. A re-inclusion request is basically your only option. Here are some resources for Google Bowling: Web Pro News, ThreadWatch and SEroundtable.

Google Duplicate Content Filter: A duplicate content filter is applied to sites that take content that has already been created, cached and indexed on other sites. News sites are usually exempted from the duplicate content filter by hand. Pages that have this filter applied usually do not rank very well in the SERPs. PageRank can be devalued, and if a page does not have inbound links you could see your results being put into omitted search results and supplemental results.

How to get through this: If you find yourself in this filter, then your first step can be trying to remedy the duplicate content. Contact the person stealing your content and ask them to remove it. You can contact the person's web host to see if they will take down their site, and the last resort is trying to contact Google and alert them to what is going on. Keep on top of your content by using Copyscape to check for duplicate content.

Google Supplemental Results: Google supplemental results take pages on your site that have been indexed and put them into a sub-database in Google. Supplemental results do not rank well; rather, Google uses its supplemental DB to populate its results when it doesn't have enough results to show for a given query. This means pages of your site in Google's supplemental DB will not help you in the SERPs.

How to get through this: It's pretty simple, actually. Just get some inbound links to your pages. Check this post out to find out more about the Google Poo (supplemental results).

Google Domain Name Age Filter: The Google domain name age filter is closely related to trust rank and the sandbox, but it is possible to be out of the sandbox, have trust rank, and still be in this filter. The idea behind this filter is that older sites and domain names are more likely to rank well for keyword terms than newer sites. If you are in this filter, you will most likely not rank well for competitive terms until your site grows older.

How to work around this: Quality links from authority sites with high trust rank will help you do much better in the SERPs.

Google’s Omitted Results Filter: Pages within your website that are in omitted search results will not show up in a Google search unless a user specifically asks to see all omitted results. Users usually do not even get to the last results page to do this, which keeps any omitted page of yours completely out of Google search results. The reasons this happens are lack of inbound links, duplicate content, duplicate meta titles, duplicate meta descriptions and poor internal linking.

How to get out of this: To get omitted pages out of this filter, simply alter the meta tags, fix the duplicate content and get some quality inbound links.

Google’s Trust Rank Filter: Like the PageRank algorithm, the trust rank algorithm has many factors that determine a site's trust rank. Some of the known factors are the age of a site, the amount of quality authority links pointing to it, how many outbound links it has, the quality of its inbound and internal linking structure, and overall SEO best practices in meta and URL structure. All sites go through this filter, and if your trust rank is low, so will your rankings be in the SERPs.

How to work with this: An old site and a new site can both have high trust rank or low trust rank. It is basically determined by the factors above: the amount of quality authority links pointing to the site, how many outbound links it has, the quality of its inbound and internal linking structure, and overall SEO best practices in meta and URL structure. Optimize these and you will have quality trust rank.

links.htm page filter: This filter penalizes a site's ranking based on the use of a links.html page. Reciprocal linking is an old technique that is no longer promoted by Google. This filter affects your ranking in the SERPs.

How to work with this filter: Instead of using “links” as your page title and name, try using something like “mynewbuddies” or “coolsites”, as this will help get around the filter. Reciprocal links are an old SEO technique, and Google devalues reciprocal linking structures. Here is someone discussing this at SEOChat.


Reciprocal Link Filter: Google is very open about reciprocal linking and clearly states that its algorithm can detect reciprocal link campaigns. Sites that only participate in reciprocal linking will usually have a hard time ranking in the search engines, but depending on what you are using your site for, a reciprocal link campaign might be exactly what you need. For example, if you are building an AdSense site, you may not want to spend too much time building the site up, and a reciprocal linking campaign will help your site's inbound links grow over time.

How to work with this filter: When building an inbound linking structure, try to utilize some or all of the 15 types of links covered in the how-to-get-them post I did a ways back. Here are some resources about this filter: Matt Cutts here and here, Search Engine Guide and WebmasterWorld.


Link Farming Filter: Link farms are sites/pages that have a mass amount of unrelated links grouped together arbitrarily. Link farms can also contain related links, but most commonly they are unrelated. IP farms and bad link neighborhoods are all part of link farming. Being part of a link farm can get your rankings dropped in Google and possibly get you banned.

How to get around this: Currently the only way to get around this is to NOT participate in link farming.

Co-citation Linking Filter: This popular Google filter watches your inbound link structure. If your automotive site is an outbound link on a site whose other outbound links point to casino and porn sites, then Google will think your site is related to porn and casinos. Poorly constructed co-citation will damage your rankings and make it hard for you to rank well for the terms you are targeting.

How to work with this: When considering a link partner or paid link, or when monitoring your inbound links, be sure to follow the linking quality guideline page derived from Patrick Gavin over at Text Link Ads.

Too Many Links at Once Filter: This filter is applied when too many inbound links are acquired by a site too fast. The result can be a ban across all search engines. How these links are obtained, how many there are, and over what period of time they appear are factors for this filter.

How to get around this: Simply do not participate in black hat linking schemes or link spamming and you should never have a problem with this. There is some information concerning this filter over at Aaron Wall's SEObook.com.

Too Many Pages at Once Filter: Google is keen on natural site development. Anything that looks “unnatural” is going to be flagged by the search engines, and adding too many pages too fast will raise this flag. Some people believe that 5,000 pages per month is the maximum, but in my opinion this number can fluctuate depending on other factors and filters your site might be going through at any given time. The effect of this filter can be pages being omitted, pages going into supplemental results and, in the extreme case, a Google ban.

How to get through this filter: If you have a system that pulls content in, or are using a dynamic content generator, be sure to limit its output per week; I would stay under 5,000 pages per month just to be on the safe side. Depending on how large or well known your site is, the limit will be adjusted.


Broken Link Filter: Broken internal links can keep pages from being crawled, cached and indexed. If key pages like your home page are not linked from every page, this can count against you in the SERPs and in your overall quality score for things like PR. This is not just bad SEO and bad site design; it is bad for your users and can cause poor traffic and poor SERP rankings.

How to get through this: Make sure you have a quality footer and a sitemap that covers all of your pages in one central hub, and test your site for broken links (be sure to use full URLs in the links in your source code).

Page Load Time Filter: The page load filter is very simple. If your website takes too long to load, a spider will time out and move past your site or page. This results in the page NEVER being cached and indexed, which ultimately means your site or page will not be present in Google's SERPs.

How to work with this: Make sure your pages are optimized for load time. If you are using Flash or many images, use JavaScript pre-loading. Limit the file size of your pages as much as possible so the spiders can read the entire document, and be sure to use web 2.0 and CSS best practices.

Over-Optimization Filter: Over-optimization can cause a Google ban or hardship in rankings. Over-optimization includes keyword stuffing, too much keyword density and keyword proximity optimization, meta tag stuffing, etc. Stay away from over-optimization.

How to get around this: Don't over-optimize!

There are some filters I have not mentioned, so here is a shorter list of other filters that could be attributed to Google:

Keyword Stuffing Filter:
Meta Tag Stuffing Filter:
Automated Google Query Filter:
IP Class Filter:
Google Toolbar Filter:
Click through Filter in serps:
Traffic Filter:
Google -950 Filter:

Wednesday, July 30, 2008

Top 10 Things to do Before the Spiders come

1: Check the Title Tags.

In our opinion, creating great title tags is one of the most valuable ways to improve your SEO. Make sure you have a relevant, keyword-rich title tag for each page you create. This makes it easier for search engines to match and display your pages.

2: If you have a Flash introduction, be sure there is a link that allows visitors to skip it.

Many websites have a fancy splash page but no way to navigate past it. Google cannot read inside a Flash page, so be sure to include a text link to the main page behind it. This type of intro can add to your site's visual appeal, but don't let it ruin your opportunity to get indexed quickly.

3: Don't forget to check the META Tags.

When there is a lot of navigation code that wouldn't make sense to a human searcher, Google relies on these tags to describe a site. Be sure to set up some valid keywords and a description, just in case.

4: Make sure all your links are in working order.

Check and double-check to make sure there are no broken links on your site. Not only do broken links create errors for a web crawler, they also create problems for your site's users. Nothing is more disappointing to a web surfer than believing they've found just the right info... only to discover the "page is not found." It reduces a website's credibility and leaves users feeling unfulfilled. Make sure your information is plentiful and your site is in perfect working order. These are the keys to web popularity and credibility.

5: Check the ALT Tags.

Few people take the time to put these in order, but ALT tags help spiders understand all of your graphics. Don’t spend too much time on this, but every little bit helps!

6: Check your frames.

If you use frames on your website, it may not be fully indexed. Google recommends that you check out this article by Danny Sullivan called Search Engines and Frames.

7: Question- Do you have dynamically generated pages?

Google claims they will be limiting the number of dynamic webpages they will index. It may help if you include some static content in your pages.

8: Keep your content fresh.

Google likes to index pages more frequently when they are updated on a regular basis. It’s also a great way to keep visitors returning when you offer them fresh new content. Blogs, articles, and new products/special offers are all a great way to keep Google - and customers - coming back for more.

9: The robots.txt

This file allows you to filter out the bots that crawl your website, plus you can restrict certain URLs that do not need to be indexed. This is a tremendous resource if used properly.
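As a minimal sketch (the paths and bot name here are hypothetical, not from this post), a robots.txt file placed at your site root might look like this:

User-agent: *
Disallow: /cgi-bin/
Disallow: /checkout/

User-agent: msnbot
Crawl-delay: 10

The first block asks all crawlers to skip two private directories; the second sets a crawl delay for one specific bot. Note that robots.txt is advisory: well-behaved spiders honor it, but it is not an access control mechanism.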

10: Caches

If you do not want Google to cache a website, simply include this line of code between your head tags:
<META NAME="ROBOTS" CONTENT="NOARCHIVE">

Thursday, July 17, 2008

SEO Interview Questions.

1. Could you briefly describe your SEO career?

2. What is SEO, and how is it going to help you?

3. What do you do in off-page optimization?

4. What do you do in on-page optimization?

5. What types of client websites do you handle?

6. How can a website get indexed in the search engines within 24 hours?

7. If I give you a website, how do you plan the strategy for it?

8. If your client wants to rank in France, what is your strategy?

9. Do you do directory submissions manually or use a tool for them?

10. How do you check your website's rankings?

11. What tools do you use to track your website's performance?

12. What do you analyze when tracking your website?

13. What is competitor analysis?

14. What exactly do you analyze in a competitor?

15. What is social media optimization?

16. Is social media optimization worthwhile compared to normal optimization?

17. What do you do with your site's videos? Do you optimize them or submit them?

18. How do you optimize a video, and where do you submit it?

19. Where do you run PPC: Google AdWords or Yahoo?

20. Which forums do you participate in for SEO updates and info?

21. If none, how do you keep up with SEO news?

22. What is robots.txt?

23. How do you use robots.txt?

24. Are you aware of noindex and nofollow?

25. For what purposes do you use noindex and nofollow?

26. What is the difference between noindex and nofollow?

27. What kinds of directories do you prefer for submission?

28. What is the minimum PageRank of a directory you will submit to?

29. Do you think PageRank is that significant nowadays?

30. What types of sites do you work on, e.g., finance or business?

31. Do you work only on in-house projects, or on client projects too?

32. Can you tell me one keyword for which you got a #1 ranking in Google?

Comments are welcome. If you have some questions to add, please do so in the comments and I'll make the changes.




How to remove content from the Google index?





Tuesday, July 15, 2008

Importance of Tags - Matt Cutts





Friday, July 11, 2008

Tips to Get Better Visibility on Google - Matt Cutts




Tuesday, July 1, 2008

Google's Matt Cutts discusses how to improve your site's search ranking

More and more businesses are turning to the Web to find customers: $5.8 billion was spent on advertising in the first quarter alone, up 18.2% from the prior year, according to the Interactive Advertising Bureau. Google's share of Internet searches continues to rise as well — to a record 61.8% in May, according to measurement service ComScore Media Metrix.

If you haven't "optimized" your site, here's how:

1. Spotlight your search term on the page.

"Think about what people are going to type in to try and find you," Cutts says. He tells of meeting a chiropractor from San Diego who complained that his site couldn't be found easily using Google search. The words "San Diego chiropractor" were listed nowhere on his site. "You have to make sure the keywords are on the page," Cutts says. If you're a San Diego doctor, Des Moines architect or Portland ad agency, best to let people know so immediately, at the top of your page.

2. Fill in your "tags."
The HTML coding used to create websites includes two key tags: title and description. Even if you don't know code, which is used to create pages, software programs such as Adobe's Dreamweaver have tools that let you fill the tags in using plain English (e.g., "San Diego Chiropractor") rather than raw code. Tags are crucial, Cutts says, because what's shown in search results most often are the title and description tags.

If Cutts' chiropractor had properly tagged his Web page, a search would have returned something like this: "San Diego chiropractor. Local doctor serves San Diego community."

There's also a third tag, to add keywords, or search terms, but Cutts says Google doesn't put much weight in its rankings on that one.
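To make this concrete, here is a minimal sketch of what those tags look like in a page's HTML head (the wording is borrowed from the chiropractor example above and is illustrative, not Cutts' exact markup):

<head>
  <!-- Shown as the clickable headline in search results -->
  <title>San Diego Chiropractor - Back and Neck Pain Relief</title>
  <!-- Often shown as the snippet under the headline -->
  <meta name="description" content="Local doctor serves the San Diego community with chiropractic care.">
  <!-- The keywords tag that Cutts says carries little weight -->
  <meta name="keywords" content="san diego chiropractor, back pain">
</head>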

3. Get other sites to "link" back to you.

Google says it looks at more than 100 pieces of data to determine a site's ranking. But links are where it's at, once your search terms are clearly visible on your site and the title and description tags are correctly marked.

In a nutshell: Google ranks sites based on popularity. If authoritative sites link to you, you must be good, and therefore you get to the top of the list. If you can't get top sites such as USATODAY.com or The New York Times to link to you, try your friends. And what if they don't have a site? They probably do. Read on.

4. Create a blog and post often.
Cutts says blogging is a great way to add links and start a conversation with customers and friends. It will cost you only time: Google's Blogger, WordPress and others offer free blogging tools. With a blog, you can link back to your site and offer links to others. It's also a great way to start building content, Cutts says.

5. Register for free tools.
Google's google.com/webmaster offers freebies to help get your site found. You can upload a text-based site map, which shows Google the pages of your site (create it at www.xml-sitemaps.com). Once that's done, you'll be registered with Google, where you can learn vital statistics — including who is linking to your site and how often Google "crawls" your site for updates.

Google's Local Business center (google.com/local/add) is the place for business owners to submit a site so it shows up in local searches, with a map attached. Savvy consumers who use Google for searches know that the first 10 non-advertising results often are from Google Maps, so if you have a business and haven't submitted it, you're losing out on potential customers.

Don't overdo it

When weaving keywords into a main page, Cutts says, some zealous Web publishers will use the term over and over again. That's called "keyword stuffing." It's a big Google no-no that can have your site removed from the index.

"After you've said it two or three times, Google has a pretty good idea — 'OK, this page has something to do with this keyword,' " he says. "Just think about the two or three phrases you want to be known for and weave that in naturally."

For blogging newbies, Cutts knows that writing (for example, posting new material) doesn't always come easily. He suggests finding ideas by visiting social news sites such as Digg and StumbleUpon to see what people are saying about your particular topic.

Aside from that, Cutts says, new material falls into the common-sense category: It's all about your business. "If I'm a plumber in Iowa, I may want to write about some of the strange things that happen to me on the job, or the five most common ways to fix a toilet," he says. "That kind of content can get really popular, and it's a great way to get links." Folks will post your piece on one of the social media sites. And with links comes higher Google rankings.

Finally, Cutts says, there is one big misconception about getting Google visibility that he wants to clear up: In order to be found at the top of Google's rankings, you do not also have to advertise.

"One thing doesn't have to do with the other," he says.


Monday, June 23, 2008

28 SEO Steps to Win Search Engine Rankings

Search engines are built to help searchers. Search engines strive to put the best-fit, highest-standard websites in the top rankings so that searchers get the information they need within a few clicks.

How do Search Engines know a website’s standard and its relevance with searched keywords? What are the steps you should follow for a complete SEO to improve your website’s rankings?

This article covers Search Engine Optimization in two levels - the first is On-page optimization, and the next is Off-Page Optimization. An effective On-page optimization together with good Off-page optimization will improve your Search Engine Rankings.

On-Page Optimization

1. Define your Business & Target Audience
The first step in any business starts with analysis. Determine what kind of services your website is to provide and what kind of audience you want for your website.

2. Don’t Purchase a New Domain

If you already have a good domain name, don't try to purchase a new one, as some search engines consider the age of the website as a ranking factor.

3. Choosing your Domain Name
If you are planning a new website, try to get a domain name with keywords included. If you target regional customers, you can base your domain on the region, for example .uk, .au or .in.

4. Make your Website look Clean & Simple
Now you have a domain name and you know whom you want to target. Website design is the key factor that keeps a visitor on your website for a while, navigating your services. Make sure your website design has a good look & feel, and is clean and simple.

5. Evaluate your Website
If you have an existing website, and now you want to do SEO for it, then evaluate your website:
Navigation structure - think from a visitor’s point of view - can someone navigate and reach the products/services they are looking for? Make your navigation user friendly; no visitor should leave your website due to confusing navigation.
Check your website for W3C compatibility.
Check for any broken links in your website and fix them.
Your website should load fast so that it doesn’t test your visitors’ patience.

6. Observe your Competitors
Find your competitors' websites. Analyze and gain knowledge of their tactics, the keywords they use and the techniques they apply. With the help of this analysis you will learn what is working and what is not, which will help in your SEO process.

7. Research on Keywords

List the keywords your target audience would search for and the ones used by your competitors. Make use of keyword research tools like Overture to learn more about related search terms. Then refine your list and make your final list of target keywords.

8. Structure File name
If you have control over your file names, modify them to include your keywords.

9. Search Engine Friendly Sitemap
A well-structured, search engine friendly sitemap can help search engines index all your pages. With good anchor text on your navigation links you can improve your rankings.

10. Write an Attractive Title
Why is your web page title so important? The web page title is displayed in SERPs, which in turn helps attract your visitors. Web searchers search with a term and look for the titles in the result pages that best fit their search. Hence, make your title attractive, with targeted keywords included, and try to incorporate related keywords too. For example, if you target 'Montessori School', you can write the title as 'Montessori School, the Preschool for your Children' (this way you add two related terms).

11. Meta Description Tag
The description tag is also displayed in the snippets of SERPs. This description helps motivate searchers to visit your website, so write an effective, optimized description tag. Snippets are also taken from the text surrounding the searched keyword, so you may need to edit your content a little to make your description look good.

12. Meta Keyword Tag
Though it is said that search engines like Google don't look at the meta keyword tag, some smaller search engines still follow the conventional way and spider the keywords tag. So there is no harm in creating meta keyword tags. Your meta keywords should be keywords that appear in your body text.

13. Have Robot tags
If you don’t want some of your website's files to be indexed, say an image or a text file, you can write your robots file instructing the search engines not to crawl those pages.
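For instance, as a small hypothetical sketch (the file names are made up for illustration), the robots.txt at your site root could exclude individual files like this:

User-agent: *
Disallow: /images/internal-diagram.jpg
Disallow: /notes/draft.txt

For page-level control you can instead use a robots meta tag in the page's head, e.g. <meta name="robots" content="noindex, nofollow">, which asks engines not to index that page or follow its links.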

14. Alt Tags for images
As the name implies, the alt tag acts as alternate text for an image. This tag is both user friendly and search engine friendly. Search engines cannot read images; instead, they index the alt text given for an image and take it as the image's description. Search engines don't give the alt tag much weight, since many spammers put irrelevant alt text on images to improve their rankings. However, an image link with a proper alt tag (keywords included) will work as a great internal anchor link.
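A minimal sketch (the file name and wording are hypothetical):

<img src="/images/montessori-classroom.jpg" alt="Montessori school classroom in San Diego">

The alt text also doubles as the anchor text when the image is wrapped in a link, which is why a short, descriptive, keyword-bearing phrase works better than a stuffed list of keywords.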

15. H1 & H2 heading format
Though no one is sure whether this helps rankings, try applying the header options, as this practice is good for any web development.
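As a quick illustrative sketch, headings should descend in order, with the page's main keyword in the single H1:

<h1>Montessori School</h1>
  <h2>Our Preschool Programs</h2>
    <h3>Toddler Program</h3>
    <h3>Primary Program</h3>

Using the tags for document structure, rather than styling plain text to look like headings, is what makes them useful to both users and spiders.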

16. Improve Keyword Prominence, Density & Proximity
Keyword prominence increases if your keyword appears at the beginning of the text part of your webpage. Keyword density refers to the ratio (percentage) of keyword occurrences within the total text content of the page; for example, a keyword used 4 times on a 200-word page has a density of 4/200 = 2%. Keyword proximity refers to the closeness between two or more keywords. This is yet another factor without much of a proven record, but it still exists in SEO practice.

17. Make Content look rich
As many SEO experts say, content is king. With optimized, keyword-rich content, and with the other SEO factors working well, a website can rank well in search engines.

18. Keyword Rich Anchor Text
Anchor text, for both internal links and external links, is a powerful element in search engine rankings. You will notice websites that don't even have the keyword on their web pages yet rank well, entirely because of the anchor text of their inbound links.
There is a good article on Link Anchor Text and Search Engine Optimization that explains the importance of anchor text in search engine optimization.
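As a small sketch of the idea (the URL and wording are hypothetical), a keyword-rich inbound link looks like this rather than a bare "click here":

<a href="http://www.example.com/preschool/">Montessori preschool in San Diego</a>

The words inside the anchor tell the engine what the target page is about, which is why varied, descriptive anchor text on inbound links matters so much.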

19. Make Flash files work for you
It's better to avoid including Flash files in your website. But under compelling circumstances, try adding keyword-rich text somewhere in the file, since search engines like Google can index the text part of Flash files.

20. Bring your PDF files on SERPs
A PDF file is a good source of keyword-rich content. Most search engines can read and index PDF files, and a few search engines show PDF files in SERPs too. Make sure your PDF file has a good title and file name.

Have a look at Google's guidelines for webmasters.

Off-Page Optimization

Why do search engines give so much importance to inbound links?

It is easy for any website owner to optimize their website with good content, navigation, titles, etc. But does that mean it's a good website that people would love to check out?

Search Engines try to know and evaluate the standard of a website through many factors and the most significant of them is incoming/inbound links. When a website receives inbound links from related & standard websites, it obviously means that it is worth looking at.

Search Engines learn about your website through inbound links and to be precise it’s through anchor text.

So natural linking with different & targeted keyword rich anchor text works great for any website.

Here are a few ways to increase your inbound links/ backlinks.

21. Search Engines
Submit your website manually to search engines including regional search engines.

22. Directories
There are many directories available online for free submission. Submit your website to the category that best fits your services, most importantly with good anchor text. Try submitting to your own regional directories to improve your local business.

23. Forums
Participate in your industry-related forums & discussion boards. Post your comments and thoughts, and also provide a link to your website (without spamming).

24. Blogs
Having your own official blog helps you post your company's updates, product releases, etc., and also earns inbound links. There are directories for submitting blogs; submit your blog with proper keyword tags, which helps your blog get listed.

25. Articles
Write keyword- and content-rich articles on your own and post them to the article junctions available online. Some article junctions accept articles in HTML format, so you can include your links with good anchor text. Write fresh articles on your topic, post them, and watch your traffic increase.

26. Press Releases
You can post press releases, which work much like article junctions and directories; however, your piece should look like a press release and not like an article.

27. Bookmarks
Social bookmarking serves two purposes: it gets you incoming links and it popularizes your website in your community. Make sure you put keyword-rich tags on your bookmarks.

28. Classifieds
Some websites display free classifieds. You can put your advertisements there with a link to your website.


Wednesday, June 4, 2008

SEO Tips and Strategy

Before you write one line of code:

* Do keyword research to determine what keywords you want to target.

While constructing your website you should do the following:

* Use markup to indicate the content of your site

1) Optimize your title tags on each page to contain 1 - 3 keywords
2) Create unique Meta Tags for each page
3) Use header tags appropriately (H1 > H2 > H3)
4) Use <b> and <i> tags if appropriate

* Optimize your URLs

1) Use Search Engine Friendly URLs
2) Use keywords in your domain (www.keyword1.com)
3) Use keywords in your URL (www.example.com/keyword2/keyword3.html)
4) Use dashes or underscores to separate words in your URLs (keyword2-keyword3.html)

* Optimize your content

1) Use keywords liberally yet appropriately throughout each page
2) Have unique content
3) Have quality content

* Use search engine friendly design

1) Create a human sitemap
2) Do not use inaccessible site navigation (JavaScript or Flash menus)
3) Minimize outbound links
4) Keep your pages under 100K in size

* Design the navigational structure of the site to channel PR to main pages (especially the homepage)

* Create a page that encourages webmasters to link to your site

1) Provide them the relevant HTML to create their link to you (make sure the anchor text contains keywords; see the sample link below)
2) Provide them with any images you may want them to use (although text links are better)
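A minimal sample of such link HTML (the domain and anchor wording are hypothetical):

<a href="http://www.example.com/">Example Widgets - quality widgets on sale</a>

Handing webmasters a ready-made snippet like this makes it more likely the inbound link arrives with the keyword-bearing anchor text you want.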

* Make sure your website is complete before launching it

Immediately after launching your site you should do the following:

* Create Webmaster Accounts

1) Google Webmaster Tools
2) Yahoo! Site Explorer

* Submit your site to all major search engines

1) Google (Use a Google SiteMap)
2) Yahoo (Use the page list option)
3) MSN
4) Ask (Finds your site via incoming links)

* Create an XML sitemap

* Submit your site to all free directories

1) DMOZ (also powers Google Directory)
2) JoeAnt

* Submit your site to relevant directories

1) Find more at ISEDB

* Begin a link building campaign (attempting to get keywords in the link anchor text)

1) Put a link to your website in your forum signatures (hint hint)
2) Reply to relevant blog posts (Don't spam please)

If you will pay to promote your website:

* Submit your site to pay directories

1) Yahoo
2) GoGuides

Finally, as part of an ongoing strategy:

* Continually update your website with quality, unique content
* Continually seek free links preferably from sites in your genre

Do NOT do the following:

* Make an all Flash website (without an HTML alternative)
* Use JavaScript or Flash for navigation
* Spam other websites for incoming links
* Launch your site before it is done
* Use duplicate content

1) Do not point several domains to one site without using a 301 redirect
2) Do not make a site of duplicated content from other websites
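As an illustrative sketch of the 301 redirect mentioned above (the domain names are hypothetical), assuming an Apache server with mod_rewrite, the secondary domain's .htaccess could read:

RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?old-example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

The permanent (301) redirect tells engines that the two domains are one site, so the duplicate content problem never arises.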

* Use markup inappropriately

1) Style header tags to look like regular text
2) Hide content using CSS such as 'display: none' (for the sake of hiding text)

* Use other "black hat" techniques (unless you accept the risk - Banning)

1) Doorway/Landing pages
2) Cloaking
3) Hidden text

Additional Tips:

* Usable and accessible sites tend to be search engine friendly by their very nature
* Be patient! High rankings don't happen overnight
* Don't obsess with any one search engine. They are all worth your attention.

Tuesday, June 3, 2008

Google's -60 penalty

Over the last few weeks, people in online forums have observed some strange changes in Google's results. Rumor has it that there is a new -60 penalty that Google applies to websites in which it has lost trust.

What has happened?

Some webmasters found websites that were listed on position 61 in Google's search results that had Google Sitelinks below their listing.

Normally, Google only displays Google Sitelinks for the first search result.

Many webmasters believe that the website that was listed on position 61 with the Sitelinks was the number 1 result for that keyword but had been penalized by Google.

What does Google say about the -60 penalty?
In a Google Groups discussion about showing Sitelinks for #61 results Google employee John Mu referred to a -60 penalty discussion.

Google hasn't officially confirmed that a -60 penalty exists. However, Google employee John Mu indicated in a discussion about the -60 penalty in the official Google groups that Google penalizes websites if they contain certain spam elements.

Which spam elements trigger the -60 penalty?

It looks like Google applies this penalty to websites that buy links.

Many of the websites that seem to have been penalized had many inbound links from websites that linked to them from every single page (so-called site-wide links). Site-wide links are an indicator of paid links, which Google sees as an unwanted way to artificially inflate search engine rankings.

The head of Google's anti-spam team, Matt Cutts, has often said that websites that buy paid links will be penalized, and it looks as if Google is trying to do the job properly.


Friday, May 9, 2008

A safe way to search: YAHOO

Yahoo has teamed up with McAfee to develop SearchScan, a new safe search service. Here's what you need to know:

* Provides always-on alerts to users for "risky" sites with security concerns including spyware, adware and other malicious software
* Identifies sites that have shown bad email practices such as flooding user in-boxes with spammy emails
* Available for Yahoo! Search users in the US, Canada, UK, France, Italy, Germany, Australia, New Zealand and Spain

"The new SearchScan feature from Yahoo! Search makes searching the Web even safer than ever before. No other search engine today offers this level of warning before visiting sites that can damage or infect a user's PC and cost them valuable time and money," said Vish Makhijani, senior vice president and general manager of Yahoo! Search. "Through this partnership with McAfee, we can offer users a safer search experience and drive more users to make Yahoo! Search their starting point on the Web."


Monday, April 28, 2008

White-hat search engine optimisation

Google: White-hat SEO 'a benefit'

White-hat search engine optimisation (SEO) firms can help companies and should not be considered the same as spammers, it has been claimed.

Thursday, April 17, 2008

Top list of google algorithm updates.

Some of the popular Google updates:

* Florida Update Nov 16th, 2003
* Austin Update Jan 11th, 2004
* Brandy Update Feb 11th 2004
* Google Bourbon Update Part I May, 2005
* Google Bourbon Update Part II
* Google Bourbon Update Part III June 7th, 2005
* Jagger Update Part I (Oct 16th, 2005 to Nov 7th, 2005)
* Jagger Update Part II (Oct 27th, 2005 to Nov 6th, 2005)
* Jagger Update Part III (Oct 4th, 2005 to Nov 18th, 2005)
* Big Daddy Infrastructure change

Wednesday, April 16, 2008

Google Algorithm Update

Yesterday some members of Digital Point Forums were talking about a Google search results update and, checking the results, I would say they were correct. The thing that sticks out the most is that a lot of newish .com sites that are hosted in the UK have had a nice little boost in the Google.co.uk rankings.

Three of the .com sites that I have worked on over the last few months were, up until yesterday, strangely performing much better on Google.com than they were on Google.co.uk (all of these sites are hosted in the UK).

I have two possible theories about why this has happened: either Google.co.uk is very careful about determining which .com sites are actually from the UK and it just takes a long time, or Google has just fixed its algorithm so that all UK-hosted .com sites now get the full Google.co.uk ranking benefit.

It is not just my clients who have been affected; I have noticed many .com sites are now performing much better. Kevin Gibbons' site SEOptimise.com appears to have shot up in the rankings - congratulations Kev.

I have not said anything to my clients yet, just in case their rankings disappear. I will wait until next week, because I have made that mistake before.

Geographical Keywording Your Content to Improve Search Results

What is Geographical Keywording?
Geographical keywording is the process of getting geographic keywords into your web pages and SEO strategy, and is one of your most powerful SEO tools. Many people who are serious about buying something search based on their location because they prefer to do business with a local company for a variety of reasons.

Some people initially don't want to focus on a geographical region because one of the greatest things about the internet is its global reach, and why would you want to limit your web site to your local area? Clearly, nobody wants to limit themselves to a specific geographic region when it comes to showing up in the Google search results. By associating your content and pages with geographical keywords you are not limiting your site but you are just focusing in on your location as a way of capturing the search requests for people in your area. Let's face it, when you launch a site you need to use every tool at your disposal to get an advantage and start getting some traffic on your site.

Once your Google PageRank goes up you may find that you don't need to focus as much on your geographic location anymore, but having geographic keywords in your content is not going to hurt you.

By periodically adding in the name of your city or state to your page titles and content you can sometimes get first page results or even the #1 organic search result on Google. I recently searched Google for "C# Programmer" and got 2,160,000 results. Searching for "C# Programmer Seattle WA" returned 150,000 results. If you are a C# Programmer located in Seattle, WA and you are looking for work in your area, then you can eliminate 93% of all other web pages that matched "C# Programmer" by including Seattle, WA in your content frequently.

There isn't much room for argument there, if you are selling a product or a service and most of your customers are located close by, then you need to get your geographical location into your keywords if you want people in your area to find your site. Even if you don't really want to find local customers, using geographical keywords can help you. If you want to sell products to other areas, just write some web pages that specifically target those geographic locations.

Google PageRank Considerations

This is a perfect time to talk a little bit about Google PageRank and how it affects your organic search performance. When you do a search on Google for the keywords your website is targeting, don't be too surprised if lots of sites show up before yours that aren't really about the topic you searched on. Linguistics can be tricky, and while Google does a pretty good job at it, sometimes it just can't figure out exactly what you are searching for. In these cases, any website that has a better PageRank than your site, and contains the keywords you searched on, might appear before your site in the organic search results. This means Google's algorithm doesn't always understand what people are looking for, and sometimes PageRank has a greater influence on results than the context of the search terms/keywords. If a person searching on Google encloses their search in quotes, the results are often very different, because then Google knows to match that exact phrase.

The problem is that website owners are at the mercy of Google's index algorithm and how potential customers enter their search terms. That is why you need to try to anticipate how people will search, and then determine how well you did. After analyzing your web statistics and logs for the search terms and keywords that brought people to your site, you will have learned more about how people search, and can make any adjustments to your site that are needed.

Monday, January 28, 2008

How to increase pages indexed

There are 10 ways to increase the number of pages indexed. They are:

1) PageRank
2) Links
3) Sitemap
4) Speed
5) Google's crawl caching proxy
6) Verify
7) Content
8) Staggered launch
9) Size matters
10) Know how your site is found, and tell Google

PageRank

It depends a lot on PageRank. The higher your PageRank the more pages that will be indexed. PageRank isn't a blanket number for all your pages. Each page has its own PageRank. A high PageRank gives the Googlebot more of a reason to return. Matt Cutts confirms, too, that a higher PageRank means a deeper crawl.

Links

Give the Googlebot something to follow. Links (especially deep links) from a high PageRank site are golden as the trust is already established.

Internal links can help, too. Link to important pages from your homepage. On content pages link to relevant content on other pages.

Sitemap

A lot of buzz surrounds this one. Some report that a clear, well-structured Sitemap helped get all of their pages indexed. Google's Webmaster guidelines recommend submitting a Sitemap file.

That page has other advice for improving crawlability, like fixing violations and validating robots.txt.

Some recommend having a Sitemap for every category or section of a site.
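For reference, a minimal Sitemap file following the sitemaps.org protocol looks like the sketch below (the URLs are hypothetical):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2008-01-20</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/products/widgets.html</loc>
    <changefreq>monthly</changefreq>
  </url>
</urlset>

Save it at the site root (e.g. http://www.example.com/sitemap.xml) and submit it through Google's Webmaster tools, so the Googlebot has an explicit map of every page you want crawled.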

Speed

A recent O'Reilly report indicated that page load time and the ease with which the Googlebot can crawl a page may affect how many pages are indexed. The logic is that the faster the Googlebot can crawl, the greater number of pages that can be indexed.

This could involve simplifying the structures and/or navigation of the site. The spiders have difficulty with Flash and Ajax. A text version should be added in those instances.

Google's crawl caching proxy

Matt Cutts provides diagrams of how Google's crawl caching proxy works at his blog. This was part of the Big Daddy update to make the engine faster. Any one of three indexes may crawl a site and send the information to a remote server, which is then accessed by the remaining indexes (like the blog index or the AdSense index) instead of those indexes' bots physically visiting your site. They all use the mirror instead.

Verify

Verify the site with Google using the Webmaster tools.

Content

Make sure content is original. If a page is a verbatim copy of another page, the Googlebot may skip it. Update frequently to keep the content fresh; pages with an older timestamp might be viewed as static, outdated, or already indexed.

Staggered launch

Launching a huge number of pages at once could send off spam signals. In one forum, it was suggested that a webmaster launch a maximum of 5,000 pages per week.

Size matters

If you want tens of millions of pages indexed, your site will probably have to be on an Amazon.com or Microsoft.com level.

Know how your site is found, and tell Google

Find the top queries that lead to your site and remember that anchor text helps in links. Use Google's tools to see which of your pages are indexed, and if there are violations of some kind. Specify your preferred domain so Google knows what to index.

Thursday, January 17, 2008

New Google Filter

Is There an Anchor Text Problem?

Aaron Wall put up a post about a new Google filter that causes people with high ranking terms to be bumped down to position #6. There is also a thread at Webmaster World about this phenomenon. This is still reasonably speculative in nature, but there are a lot of people who have seen this.

Aaron offers some really interesting speculation about why this may be occurring. The most interesting theory was the notion that it was an anchor text problem. Here is what Aaron had to say:

I think this issue is likely tied to a stagnant link profile with a too tightly aligned anchor text profile, with the anchor text being overly-optimized when compared against competing sites.

Whether or not this is occurring now, it makes complete sense. It is well within Google's (or any other search engine's) ability to detect an unusually high density of one form of anchor text pointing to a given domain. For example, if your site is called yourdomain.com and you sell widgets, and the anchor text in 48 of your 65 links says “Widgets on Sale”, this is not natural.

Most of the links to your site should be the name of your domain itself (i.e., in this example, “yourdomain”). Such a distribution of anchor text is a flag that the anchor text of your links is being artificially influenced. How is that done? Why, by purchasing links, or by heavy-duty link swapping.

This is potentially another step in Google's stepped-up war against the practice of link buying. I have long maintained that the main advantage link buying has over natural links is that people who buy links get to specify the exact (keyword-rich) anchor text used. Looking for unnatural patterns of anchor text provides a backdoor into detecting people who are purchasing links.

It might be a bit heavy-handed for Google to ban a site based on this type of evidence, but reducing the impact of anchor text on rankings when there is an unnatural distribution in play still helps them meet their goal. After all, even if the unnatural anchor text campaign does not represent the result of a link buying campaign, and all those keyword-laden links are in fact completely natural, it might still provide better relevance for Google to filter in this manner.

Thinking about this further, this might be a simple search quality adjustment for skewed anchor text distribution. If it affects paid links, from Google’s perspective, this might just be a bonus.

Wednesday, January 2, 2008

Google Video Sitemaps

Creating and submitting Video Sitemaps files

About Google Video Sitemaps

Google Video Sitemaps is an extension of the Sitemap protocol that enables you to publish and syndicate online video content and its relevant metadata to Google in order to make it searchable in the Google Video index. You can use a Video Sitemap to add descriptive information – such as a video’s title, description, duration, etc. – that makes it easier for users to find a particular piece of content. When a user finds your video through Google, they will be linked to your hosted environments for the full playback.

When you submit a Video Sitemap to Google, we will make the included video URLs searchable on Google Video. Search results will contain a thumbnail image (provided by you or autogenerated by Google) of your video content, as well as information (such as title) contained in your Video Sitemap. In addition, your video may also appear in other Google search products. During this beta period, we can’t predict or guarantee when or if your videos will be added to our index, but as we refine our product, we expect both coverage and indexing speed to improve.

Google can crawl the following video file types: .mpg, .mpeg, .mp4, .mov, .wmv, .asf, .avi, .ra, .ram, .rm, .flv. All files must be accessible via HTTP. Metafiles that require a download of the source via streaming protocols are not supported at this time.
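As a rough sketch of what such a file can look like (the URLs, file names and tag set here are illustrative; check Google's Video Sitemaps documentation for the authoritative schema):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
  <url>
    <loc>http://www.example.com/videos/grilling-steaks.html</loc>
    <video:video>
      <video:content_loc>http://www.example.com/media/grilling-steaks.flv</video:content_loc>
      <video:thumbnail_loc>http://www.example.com/thumbs/grilling-steaks.jpg</video:thumbnail_loc>
      <video:title>Grilling steaks for summer</video:title>
      <video:description>How to grill the perfect steak in four minutes.</video:description>
      <video:duration>240</video:duration>
    </video:video>
  </url>
</urlset>

Each <url> entry pairs the playback page with the video file and the descriptive metadata the post mentions (title, description, duration), which is what Google surfaces in the Google Video index.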

Friday, December 7, 2007

New Experimental Search in google labs

Google is experimenting with new features and improving the search experience.

Monday, December 3, 2007

Link Baiting strategies in SEO

Of all the SEO strategies that have sprung up in the last few years, none have gained as much repute as link baiting. Link baiting is an exciting prospect and several different variations of this method have propped up recently.

Link baiting can be defined as the process of creating content that is interesting enough to catch people’s attention.

It can be informational content, also known as informational hooks, or news hooks, which provide fresh news each time and then garner citations and links.

‘Evil hooks’, the practice of writing something mean about someone or some product, are also known to gather a lot of attention.

Improving link quality

While people commonly create bait articles, you can also publish a fresh piece of information and then leave it on the internet as bait. Some articles have picked up almost 1,000 links in as little as 6 months.

A news hook is the most effective one so far, as are contradictory hooks, where you contradict someone else's opinion.

Overall, it is an extremely effective way to build your brand online and to create a reputation.

Friday, November 30, 2007

Google races Microsoft into the clouds

People won't store very much data on their computers if Google has its way. The company is working on a new data storage service that lets users store virtually all of their information on the web, according to a report in The Wall Street Journal.

Dubbed "cloud" computing, the approach has both Microsoft and Google racing to roll out applications that allow for heavy amounts of web-based storage and activity.

While Google would not comment specifically on its recent developments, it did confirm that users would soon be able to store a range of files, including video and audio, on Google-owned servers. The move is an extension of free storage already offered with GMail, however, Google said it would likely sell storage space beyond an initial allotment.

Microsoft and Google have been battling for position in the looming cloud war as a way to attract users and use that data for ad serving.

Thursday, June 14, 2007

Search engine optimization

What is Search engine optimization?

SEO, short for search engine optimization, is the process of increasing the number of visitors to a Web site by ranking high in the search results of a search engine. The higher a Web site ranks in the results of a search, the greater the chance that the site will be visited by a user. It is common practice for Internet users not to click through pages and pages of search results, so where a site ranks in a search is essential for directing more traffic toward it.

SEO helps to ensure that a site is accessible to a search engine and improves the chances that the site will be found by the search engine.

How Does SEO Work?

Most websites do not focus on their topic well: keyword lists containing 50 or more phrases per page (eBay as a keyword is common), or a business with four separate offerings whose every page tries to target all four at once and misses. It might also be that the site is targeting a search phrase that no one is searching for. By focusing each page of the site in turn on keywords relevant to the business, and relevant to people searching for that business, SEO ensures each page of your site stays focused and is therefore seen as an authority on its topic.

SEO Tips:

Did you know that, before search engine users visit a specific Web site, they have already made four separate and distinct choices:

Choice number one: They choose to use the Internet to find a specific product or service, to research an issue of interest to them or solve a problem, as much in their professional careers as in their personal lives.

Choice number two: They choose one of the five to six major search engines on the Internet to launch their search, the most popular by far being Google.

Choice number three: They choose a particular keyword or phrase as the basis of their search, trying to narrow it down as best they can.

Choice number four: They discriminate among the first ten search results on the same first page. Then they make their final decision as to which web site (or company) will get their business.

Useful Interview Questions:

1. What are the basic SEO methods?
A. On-page optimisation, such as correct code semantics, with the title and other tags in the correct order and containing keywords; a correct directory and file naming structure with an indexable site; keyword density, keyword proximity, code-to-content ratio and code structure. Then linking (off-page optimisation): various types of links. The main aim is to get high-quality content linking to relevant content on your site. A link within relevant content, deep-linked to relevant content on your site, is best. Links from .edu and .gov domains are seen as authoritative and carry more weight.

2. Which would you do first? On Page SEO, Link Building or Keyword Research
A. Keyword Research and then On and Off Page Optimisation in parallel.

3. What is the Sandbox?
A. The sandbox is a probationary period that all new sites face. It is a filter on competitive/commercial keywords that incubates a site into the index. Sites in the sandbox can rank for less competitive words, but have to wait to get out of the sandbox to rank for competitive keywords. This stops overnight "pump and dump" domains being used by spammers and creates a better quality index. Google used this technology for some time, during which it would not display new websites in its index. The technique was implemented mainly to curb the efforts of new websites that tried to gain search rankings by buying links; it was a way to test the stability of a business/website before it came up in the search results. However, the sandbox is nonexistent now and new sites can easily appear in Google's index.

4. How long does the sand box last?
A. There is no specific length. On average I would say 6 months, but it can be less and it can be more.

5. What effect do Meta Keywords and Descriptions have?
A. Meta keywords have nil effect, despite many contradictory opinions. The meta description also does not help your ranking in any way, but it is sometimes used in the SERPs themselves, so it makes sense to write a well-crafted description that will likely induce a click. Given the above, it is still considered good practice to use meta keyword and description tags. Anyone who says optimising meta keywords will give you a higher ranking clearly does not know what they are talking about.

6. What types of links would you look for
A. Article links, or links from a site that is seen as a resource on a particular topic. Searching the SERPs and industry-related sites for links is a good place to start. High-PR sites are desirable, but in general relevance is the most important thing. I look for static, keyword-targeted links embedded in content. I do not use reciprocal linking, as it is ineffective.

7. How important is content?
Content is very important. It is what a searcher is looking for. It aids in getting links, which aids in getting higher rankings. The more relevant content a site has, the more of a resource it is, and thus the higher it ranks! You must use unique content. Content should be written with users in mind and shouldn't be keyword-stuffed.

8. What SEO results can you show?
Present your portfolio, which should contain web URLs, keywords, and rankings for those keywords in Google, Yahoo and MSN.

9. What do you believe are important things to take into consideration when optimising a site?
Lots of factors. It should start from the phase when a site is being planned: competition, nature of the product/service offered, business objective (sales, lead generation, etc.) and keywords (which will depend on the other factors mentioned).

10. Who do you respect in the industry?
Matt Cutts (most common), Jeremy, Lee Odden, Dan Thies.