Wednesday, July 30, 2008

Top 10 Things to do Before the Spiders come

1: Check the Title Tags.

In our opinion, creating great title tags is one of the most valuable ways to improve your SEO. Make sure you have a relevant, keyword-rich title tag for each page you create. This makes it easy for search engines to match your pages and display them in results.
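As a sketch, a keyword-rich title tag might look like this (the business and keywords are illustrative, borrowing the "San Diego chiropractor" example that appears later on this blog):

```html
<head>
  <!-- Illustrative: a descriptive title built around the page's target keywords -->
  <title>San Diego Chiropractor | Back Pain Relief and Spinal Care</title>
</head>
```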

2: If you have a Flash introduction, be sure there is a link that allows visitors to skip it.

Many websites have a fancy splash page but no way to navigate around it. Google cannot read inside a Flash page, so be sure to include a text link past the intro to your site's main content. This type of intro can add to your site's visual appeal, but don't let it ruin your opportunity to get indexed quickly.

3: Don't forget to check the META Tags.

When a page contains a lot of navigation code that wouldn't make sense to a human searcher, Google relies on these tags to describe the site. Be sure to set up some valid keywords and a description, just in case.
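A minimal sketch of the two tags, with illustrative values:

```html
<!-- Illustrative values; both tags belong inside <head> -->
<meta name="description" content="Local chiropractor serving the San Diego community.">
<meta name="keywords" content="chiropractor, san diego, back pain">
```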

4: Make sure all your links are in working order.

Check and double-check to make sure there are no broken links on your site. Not only do broken links create errors for a web crawler, they also create problems for your site's users. Nothing is more disappointing to a web surfer than believing they've found just the right info... only to discover the "page is not found." It reduces a website's credibility, and it leaves users feeling unfulfilled. Make sure your information is plentiful and your site is in perfect working order. These are the keys to web popularity and credibility.
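Broken-link checking is easy to automate. Here is a minimal Python sketch (my own illustration, not a tool the post mentions) that pulls the links out of a page and tests whether each one responds:

```python
import urllib.request
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(html):
    """Return all link targets found in an HTML string."""
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links

def is_link_working(url, timeout=10):
    """True if the URL responds with a non-error HTTP status."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status < 400
    except Exception:
        return False
```

Run `extract_links` over each page of the site, then `is_link_working` over every URL found; anything that comes back False needs fixing or removing.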

5: Check the ALT Tags.

Few people take the time to put these in order, but ALT tags help spiders understand all of your graphics. Don’t spend too much time on this, but every little bit helps!
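For example (the filename and wording are illustrative):

```html
<!-- Alt text describes the image for spiders and for users who can't see it -->
<img src="images/office-front.jpg" alt="Front entrance of our San Diego office">
```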

6: Check your frames.

If you use frames on your website, it may not be fully indexed. Google recommends that you check out this article by Danny Sullivan called Search Engines and Frames.

7: Do you have dynamically generated pages?

Google claims they will be limiting the number of dynamic webpages they index. It may help to include some static content in your pages.

8: Keep your content fresh.

Google likes to index pages more frequently when they are updated on a regular basis. Fresh content is also a great way to keep visitors returning. Blogs, articles, and new products or special offers are all great ways to keep Google - and customers - coming back for more.

9: The robots.txt

This file allows you to filter out the bots that crawl your website, plus you can restrict certain URLs that do not need to be indexed. This is a tremendous resource if used properly.
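A small illustrative robots.txt (the paths and bot name are hypothetical):

```
# Keep all well-behaved bots out of areas that don't need indexing
User-agent: *
Disallow: /admin/
Disallow: /cgi-bin/

# Block one badly behaved crawler entirely (hypothetical name)
User-agent: BadBot
Disallow: /
```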

10: Caches

If you do not want Google to cache a website, simply include this line of code between your head tags:
<META NAME="ROBOTS" CONTENT="NOARCHIVE">

Thursday, July 17, 2008

SEO Interview Questions.

1. Could you briefly describe your SEO career?

2. What is SEO, and how is it going to help you?

3. What do you do for off-page optimization?

4. What do you do for on-page optimization?

5. What types of client websites have you handled?

6. How would you get a website indexed in the search engines within 24 hours?

7. If I give you a website, how would you plan the strategy for it?

8. If your client wants to rank in France, what is your strategy?

9. Do you do directory submission manually or with a submission tool?

10. How do you check your website's rankings?

11. What tools do you use to track your website's performance?

12. What do you analyze when tracking your website?

13. What is competitor analysis?

14. What exactly do you analyze about your competitors?

15. What is social media optimization?

16. Is social media optimization worthwhile compared to normal optimization?

17. What do you do with your site's videos? Do you optimize them or just submit them?

18. How do you optimize a video, and where do you submit it?

19. Where do you run PPC: Google AdWords or Yahoo?

20. Which forums do you participate in for SEO updates and info?

21. If none, how do you keep up with SEO news?

22. What is robots.txt?

23. How do you use robots.txt?

24. Are you aware of noindex and nofollow?

25. For what purpose do you use noindex and nofollow?

26. What is the difference between noindex and nofollow?

27. What kinds of directories do you prefer for submission?

28. What is the minimum PageRank of a directory you will submit to?

29. Do you think PageRank is that significant nowadays?

30. What types of sites have you worked on, e.g. finance or business?

31. Do you work only on in-house projects, or on client projects too?

32. Could you tell me one keyword for which you achieved the #1 rank in Google?

Comments are welcome. If you have questions to add, please post them in the comments and I'll make the changes.




How to remove content from the Google index?





Tuesday, July 15, 2008

Importance of Tags - Matt Cutts





Friday, July 11, 2008

Tips to Get Better Visibility on Google - Matt Cutts




Tuesday, July 1, 2008

Google's Matt Cutts discusses how to improve your site's search ranking

More and more businesses are turning to the Web to find customers: $5.8 billion was spent on advertising in the first quarter alone, up 18.2% from the prior year, according to the Interactive Advertising Bureau. Google's share of Internet searches continues to rise as well — to a record 61.8% in May, according to measurement service ComScore Media Metrix.

If you haven't "optimized" your site, here's how:

1. Spotlight your search term on the page.

"Think about what people are going to type in to try and find you," Cutts says. He tells of meeting a chiropractor from San Diego who complained that his site couldn't be found easily using Google search. The words "San Diego chiropractor" were listed nowhere on his site. "You have to make sure the keywords are on the page," Cutts says. If you're a San Diego doctor, Des Moines architect or Portland ad agency, best to let people know so immediately, at the top of your page.

2. Fill in your "tags."
When creating websites, Internet coding language includes two key tags: title and description. Even if you don't know the code used to create pages, software programs such as Adobe's Dreamweaver have tools that let you fill the tags in in plain English ("San Diego Chiropractor"). Tags are crucial, Cutts says, because what's shown in search results most often are the title and description tags.

If Cutts' chiropractor had properly tagged his Web page, a search would have returned something like this: "San Diego chiropractor. Local doctor serves San Diego community."

There's also a third tag, to add keywords, or search terms, but Cutts says Google doesn't put much weight in its rankings on that one.

3. Get other sites to "link" back to you.

Google says it looks at more than 100 pieces of data to determine a site's ranking. But links are where it's at, once your search terms are clearly visible on your site and the title and description tags are correctly marked.

In a nutshell: Google ranks sites based on popularity. If authoritative sites link to you, you must be good, and therefore you get to the top of the list. If you can't get top sites such as USATODAY.com or The New York Times to link to you, try your friends. And what if they don't have a site? They probably do. Read on.

4. Create a blog and post often.
Cutts says blogging is a great way to add links and start a conversation with customers and friends. It will cost you only time: Google's Blogger, WordPress and others offer free blogging tools. With a blog, you can link back to your site and offer links to others. It's also a great way to start building content, Cutts says.

5. Register for free tools.
Google's google.com/webmaster offers freebies to help get your site found. You can upload a text-based site map, which shows Google the pages of your site (create it at www.xml-sitemaps.com). Once that's done, you'll be registered with Google, where you can learn vital statistics — including who is linking to your site and how often Google "crawls" your site for updates.
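A text-based site map in the sitemaps.org XML format looks roughly like this (the URLs and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2008-07-01</lastmod>
    <changefreq>weekly</changefreq>
  </url>
  <url>
    <loc>http://www.example.com/about.html</loc>
    <changefreq>monthly</changefreq>
  </url>
</urlset>
```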

Google's Local Business center (google.com/local/add) is the place for business owners to submit a site so it shows up in local searches, with a map attached. Savvy consumers who use Google for searches know that the first 10 non-advertising results often are from Google Maps, so if you have a business and haven't submitted it, you're losing out on potential customers.

Don't overdo it

When weaving keywords into a main page, Cutts says, some zealous Web publishers will use the term over and over again. That's called "keyword stuffing." It's a big Google no-no that can have your site removed from the index.

"After you've said it two or three times, Google has a pretty good idea — 'OK, this page has something to do with this keyword,' " he says. "Just think about the two or three phrases you want to be known for and weave that in naturally."

For blogger newbies, Cutts knows that writing (for example, posting new material) doesn't always come easy. He suggests finding ideas by visiting social news sites such as Digg and StumbleUpon, to see what people are saying about your particular topic.

Aside from that, Cutts says, new material falls into the common-sense category: It's all about your business. "If I'm a plumber in Iowa, I may want to write about some of the strange things that happen to me on the job, or the five most common ways to fix a toilet," he says. "That kind of content can get really popular, and it's a great way to get links." Folks will post your piece on one of the social media sites. And with links comes higher Google rankings.

Finally, Cutts says, there is one big misconception about getting Google visibility that he wants to clear up: In order to be found at the top of Google's rankings, you do not also have to advertise.

"One thing doesn't have to do with the other," he says.


Monday, June 23, 2008

28 SEO Steps to Win Search Engine Rankings

Search engines are built to help searchers. They strive to put the best, most relevant websites in the top rankings so that searchers find the information they need within a few clicks.

How do Search Engines know a website’s standard and its relevance with searched keywords? What are the steps you should follow for a complete SEO to improve your website’s rankings?

This article covers Search Engine Optimization in two levels - the first is On-page optimization, and the next is Off-Page Optimization. An effective On-page optimization together with good Off-page optimization will improve your Search Engine Rankings.

On-Page Optimization

1. Define your Business & Target Audience
The first step in any business starts with analysis. Determine what kind of services your website will provide and what kind of audience you want for it.

2. Don’t Purchase a New Domain

If you already have a good domain name don’t try to purchase a new one, as some search engines look for the age of the website as a ranking factor.

3. Choosing your Domain Name
If you are planning for a new website, try to get a domain name with keywords included. If you target regional customers, you can have your domain based on the region, say for example .uk or .au or .in

4. Make your Website look Clean & Simple
Now you have a domain name and you know whom you want to target. Website design is the key factor that keeps visitors on your site for a while and gets them to explore your services. Make sure your website design has a good look and feel: clean and simple.

5. Evaluate your Website
If you have an existing website, and now you want to do SEO for it, then evaluate your website:
Navigation structure - think from a visitor's point of view: can someone navigate to the product or service they are looking for? Make your navigation user friendly; no visitor should leave your website because of confusing navigation.
Check your website for W3C compatibility.
Check for broken links on your website and fix any you find.
Your website should load fast so that it doesn't test your visitors' patience.

6. Observe your competitor
Find your competitors' websites. Analyze their tactics: the keywords they use and the techniques they apply. This analysis shows you what is working and what is not, which helps your SEO process.

7. Research on Keywords

List the Keywords your target audience would search for and the ones used by your competitors. Make use of keyword research tools like Overture to know more on your related search terms. Now refine your list and make your final list of target keywords.

8. Structure File name
If you have control over your file names - modify file names with your keywords included.

9. Search Engine Friendly Sitemap
A well-structured, search engine friendly sitemap helps search engines index all your pages. Good anchor text in your navigation links can also improve your rankings.

10. Write an Attractive Title
Why is your web page title so important? The title is displayed in SERPs, which helps attract visitors. Web searchers enter a term and look for the titles in the result pages that best fit their search. So make your title attractive, with targeted keywords included. Try to incorporate related keywords too. For example, if you target "Montessori School", you could write the title as "Montessori School, the Preschool for your Children" (this way you add two related terms).

11. Meta Description Tag
The description tag is also displayed in the snippets on SERPs. A good description helps motivate searchers to visit your website, so write an effective, optimized description tag. Snippets may also be taken from the text surrounding the searched keyword, so you may need to edit your content a little to make your description look good.

12. Meta Keyword Tag
Though it is said that search engines like Google don't look at the Meta keyword tag, some smaller search engines still follow the conventional way and spider the keywords tag. So there is no harm in creating Meta keyword tags. Your meta keywords should be keywords that appear in your body text.

13. Have Robot tags
If you don't want some files on your website to be indexed, say an image or a text file, you can write a robots file instructing search engines not to crawl them.

14. Alt Tags for images
As the name implies, this acts as alternate text for an image. The tag is both user friendly and search engine friendly: search engines cannot read images, so they index the alt text given for an image and treat it as the image's description. Search engines don't give alt text much weight, since many spammers put irrelevant alt text on images to try to improve their rankings. However, an image link with a proper alt tag (keywords included) works as a great internal anchor link.

15. H1 & H2 heading format
Though no one is sure whether this helps rankings, try applying the header options, as the practice is good for any web development.

16. Improve Keyword Prominence, Density & Proximity
Keyword prominence increases when your keyword appears at the beginning of the text on your webpage. Keyword density is the ratio (percentage) of keywords to the total text content of the page. Keyword proximity is the closeness between two or more keywords. This is another factor without much of a proven record, but it still exists in SEO practice.
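Keyword density is simple to compute. A Python sketch (my own illustration, not a standard tool) that counts how much of a page's text a keyword phrase accounts for:

```python
import re

def keyword_density(text, keyword):
    """Percentage of the words in `text` accounted for by `keyword`
    (which may be a multi-word phrase)."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    phrase = keyword.lower().split()
    n = len(phrase)
    # Count non-overlapping-by-position occurrences of the phrase
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == phrase)
    return 100.0 * hits * n / len(words)
```

For instance, `keyword_density("San Diego chiropractor serves San Diego", "San Diego")` comes out around 66.7 - far denser than any real page should be; a mention every few sentences is more natural.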

17. Make Content look rich
As many SEO experts say, content is king. With optimized, keyword-rich content, and with the other SEO factors working well, a website can rank well in search engines.

18. Keyword Rich Anchor Text
Anchor text, for both internal and external links, is a powerful element in search engine rankings. You may notice websites that don't even have the keyword on their pages ranking well, purely because of the anchor text of their inbound links.
Here is a good article, Link Anchor Text and Search Engine Optimization, on the importance of anchor text in search engine optimization.

19. Make Flash files work for you
It's better to avoid including Flash files in your website. But under compelling circumstances, try adding keyword-rich text somewhere in the file, since search engines like Google can index the text part of Flash files.

20. Bring your PDF files on SERPs
A PDF file is a good source of keyword-rich content. Most search engines can read and index PDF files, and a few show PDF files in SERPs too. Make sure you have a good title and file name for your PDF file.

Have a look at Google's guidelines for webmasters.

Off-Page Optimization

Why do search engines give so much importance to inbound links?

It is easy for any website owner to optimize their website with good content, navigation, title, etc. But does that mean it’s a good website and people would love to check it out?

Search engines try to evaluate the standard of a website through many factors, and the most significant of them is incoming/inbound links. When a website receives inbound links from related, reputable websites, it obviously means the site is worth looking at.

Search Engines learn about your website through inbound links and to be precise it’s through anchor text.

So natural linking with different & targeted keyword rich anchor text works great for any website.

Here are a few ways to increase your inbound links/ backlinks.

21. Search Engines
Submit your website manually to search engines including regional search engines.

22. Directories
There are many directories available online for free submission. Submit your website into the category that best fits your services, most importantly with good anchor text. Try submitting in your own regional directories to improve your local business.

23. Forums
Participate in your industry's forums and discussion boards. Post your comments and thoughts, and also provide a link to your website (without spamming).

24. Blogs
Having your own official blog helps you post your company's updates, product releases, etc., and also earns inbound links. There are directories for submitting blogs; submit yours with proper keyword tags, which helps your blog get listed.

25. Articles
Write keyword- and content-rich articles on your own and post them in the article junctions available online. Some article junctions accept articles in HTML format, so you can include your links with good anchor text. Write fresh articles on your topic, post them, and watch your traffic increase.

26. Press Releases
You can post press releases, which work much like article junctions and directories; however, the piece should read like a press release and not like an article.

27. Bookmarks
Social bookmarking serves two purposes: it gets you incoming links and it popularizes your website in your community. Make sure you put keyword-rich tags on your bookmarks.

28. Classifieds
Some websites display free classifieds. You can put your advertisements there with a link to your website.


Wednesday, June 4, 2008

SEO Tips and Strategy

Before you write one line of code:

* Do keyword research to determine what keywords you want to target.

While constructing your website you should do the following:

* Use markup to indicate the content of your site

1) Optimize your title tags on each page to contain 1 - 3 keywords
2) Create unique Meta Tags for each page
3) Use header tags appropriately (H1 > H2 > H3)
4) Use <b> and <i> tags if appropriate

* Optimize your URLs

1) Use Search Engine Friendly URLs
2) Use keywords in your domain (www.keyword1.com)
3) Use keywords in your URL (www.example.com/keyword2/keyword3.html)
4) Use dashes or underscores to separate words in your URLs (keyword2-keyword3.html)

* Optimize your content

1) Use keywords liberally yet appropriately throughout each page
2) Have unique content
3) Have quality content

* Use search engine friendly design

1) Create a human sitemap
2) Do not use inaccessible site navigation (JavaScript or Flash menus)
3) Minimize outbound links
4) Keep your pages under 100K in size

* Design the navigational structure of the site to channel PR to main pages (especially the homepage)

* Create a page that encourages webmasters to link to your site

1) Provide them the relevant HTML to create their link to you (make sure the anchor text contains keywords)
2) Provide them with any images you may want them to use (although text links are better)

* Make sure your website is complete before launching it

Immediately after launching your site you should do the following:

* Create Webmaster Accounts

1) Google Webmaster Tools
2) Yahoo! Site Explorer

* Submit your site to all major search engines

1) Google (Use a Google SiteMap)
2) Yahoo (Use the page list option)
3) MSN
4) Ask (Finds your site via incoming links)

* Create an XML sitemap

* Submit your site to all free directories

1) DMOZ (also powers Google Directory)
2) JoeAnt

* Submit your site to relevant directories

1) Find more at ISEDB

* Begin a link building campaign (attempting to get keywords in the link anchor text)

1) Put a link to your website in your forum signatures (hint hint)
2) Reply to relevant blog posts (Don't spam please)

If you will pay to promote your website:

* Submit your site to pay directories

1) Yahoo
2) GoGuides

Finally, as part of an ongoing strategy:

* Continually update your website with quality, unique content
* Continually seek free links preferably from sites in your genre

Do NOT do the following:

* Make an all Flash website (without an HTML alternative)
* Use JavaScript or Flash for navigation
* Spam other websites for incoming links
* Launch your site before it is done
* Use duplicate content

1) Do not point several domains to one site without using a 301 redirect
2) Do not make a site of duplicated content from other websites

* Use markup inappropriately

1) Style header tags to look like regular text
2) Hide content using 'display: none' (for the sake of hiding text)

* Use other "black hat" techniques (unless you accept the risk - Banning)

1) Doorway/Landing pages
2) Cloaking
3) Hidden text
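For the duplicate-content point about pointing several domains at one site, an Apache .htaccess sketch (assumes mod_rewrite is enabled; the domain names are placeholders) that 301-redirects the bare domain to the www version so only one copy gets indexed:

```apache
# Canonicalize example.com to www.example.com with a permanent redirect
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```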

Additional Tips:

* Usable and accessible sites tend to be search engine friendly by their very nature
* Be patient! High rankings don't happen overnight
* Don't obsess with any one search engine. They are all worth your attention.

Tuesday, June 3, 2008

Google's -60 penalty

During the last few weeks, people in online forums observed some strange changes in Google's results. Rumor has it that there is a new -60 penalty that Google applies to websites in which it has lost trust.

What has happened?

Some webmasters found websites that were listed on position 61 in Google's search results that had Google Sitelinks below their listing.

Normally, Google only displays Google Sitelinks for the first search result.

Many webmasters believe that the website that was listed on position 61 with the Sitelinks was the number 1 result for that keyword but had been penalized by Google.

What does Google say about the -60 penalty?
In a Google Groups discussion about showing Sitelinks for #61 results Google employee John Mu referred to a -60 penalty discussion.

Google hasn't officially confirmed that a -60 penalty exists. However, Google employee John Mu indicated in a discussion about the -60 penalty in the official Google groups that Google penalizes websites if they contain certain spam elements.

Which spam elements trigger the -60 penalty?

It looks like Google applies this penalty to websites that buy links.

Many of the websites that seem to have been penalized had many inbound links from websites that linked to them from every single page of their website (so-called site-wide links). Sitewide links are an indicator of paid links, which Google sees as an unwanted way to artificially inflate search engine rankings.

The head of Google's anti-spam team, Matt Cutts, has often said that websites that buy paid links will be penalized, and it looks as if Google is trying to do the job properly.


Friday, May 9, 2008

A safe way to search: YAHOO

Yahoo has teamed up with McAfee to develop SearchScan, a new safe search service. Here's what you need to know:

* Provides always-on alerts to users for "risky" sites with security concerns including spyware, adware and other malicious software
* Identifies sites that have shown bad email practices such as flooding user in-boxes with spammy emails
* Available for Yahoo! Search users in the US, Canada, UK, France, Italy, Germany, Australia, New Zealand and Spain

"The new SearchScan feature from Yahoo! Search makes searching the Web even safer than ever before. No other search engine today offers this level of warning before visiting sites that can damage or infect a user's PC and cost them valuable time and money," said Vish Makhijani, senior vice president and general manager of Yahoo! Search. "Through this partnership with McAfee, we can offer users a safer search experience and drive more users to make Yahoo! Search their starting point on the Web."


Monday, April 28, 2008

White-hat search engine optimisation

Google: White-hat SEO 'a benefit'

White-hat search engine optimisation (SEO) firms can help companies and should not be considered the same as spammers, it has been claimed.

Thursday, April 17, 2008

Top list of Google algorithm updates

Some of the popular Google updates:

* Florida Update Nov 16th, 2003
* Austin Update Jan 11th, 2004
* Brandy Update Feb 11th 2004
* Google Bourbon Update Part I May, 2005
* Google Bourbon Update Part II
* Google Bourbon Update Part III June 7th, 2005
* Jagger Update Part I (Oct 16th, 2005 to Nov 7th, 2005)
* Jagger Update Part II (Oct 27th, 2005 to Nov 6th, 2005)
* Jagger Update Part III (Oct 4th, 2005 to Nov 18th, 2005)
* Big Daddy Infrastructure change

Wednesday, April 16, 2008

Google Algorithm Update

Yesterday some members of Digital Point Forums were talking about a Google search results update and checking the results I would say they were correct. The thing that sticks out the most is that a lot of newish .com sites that are hosted in the UK have had a nice little boost in the Google.co.uk rankings.

Three of the .com sites that I have worked on over the last few months were, up until yesterday, strangely performing much better on Google.com than on Google.co.uk (all of these sites were hosted in the UK).

I have two possible theories about why this has happened: either Google.co.uk is very careful about determining which .com sites are actually from the UK and it just takes a long time, or Google has just fixed their algorithm so that all UK-hosted .com sites now get the full Google.co.uk ranking benefit.

It is not just my clients who have been affected. I have noticed many .com sites now performing much better; Kevin Gibbons' site SEOptimise.com appears to have shot up in the rankings. Congratulations, Kev.

I have not said anything to my clients yet, just in case their rankings disappear. I will wait until next week, because I have made that mistake before.

Geographical Keywording Your Content to Improve Search Results

What is Geographical Keywording?
Geographical keywording is the process of getting geographic keywords into your web pages and SEO strategy, and is one of your most powerful SEO tools. Many people who are serious about buying something search based on their location because they prefer to do business with a local company for a variety of reasons.

Some people initially don't want to focus on a geographical region because one of the greatest things about the internet is its global reach, and why would you want to limit your web site to your local area? Clearly, nobody wants to limit themselves to a specific geographic region when it comes to showing up in the Google search results. By associating your content and pages with geographical keywords you are not limiting your site but you are just focusing in on your location as a way of capturing the search requests for people in your area. Let's face it, when you launch a site you need to use every tool at your disposal to get an advantage and start getting some traffic on your site.

Once your Google PageRank goes up you may find that you don't need to focus as much on your geographic location anymore, but having geographic keywords in your content is not going to hurt you.

By periodically adding in the name of your city or state to your page titles and content you can sometimes get first page results or even the #1 organic search result on Google. I recently searched Google for "C# Programmer" and got 2,160,000 results. Searching for "C# Programmer Seattle WA" returned 150,000 results. If you are a C# Programmer located in Seattle, WA and you are looking for work in your area, then you can eliminate 93% of all other web pages that matched "C# Programmer" by including Seattle, WA in your content frequently.
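The arithmetic behind that 93% figure, using the result counts quoted above:

```python
# Result counts from the two searches described above
broad_results = 2_160_000   # "C# Programmer"
local_results = 150_000     # "C# Programmer Seattle WA"

# Fraction of the broad result set that the geographic terms eliminate
eliminated = 1 - local_results / broad_results
print(f"{eliminated:.0%}")  # prints "93%"
```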

There isn't much room for argument there, if you are selling a product or a service and most of your customers are located close by, then you need to get your geographical location into your keywords if you want people in your area to find your site. Even if you don't really want to find local customers, using geographical keywords can help you. If you want to sell products to other areas, just write some web pages that specifically target those geographic locations.

Google PageRank Considerations

This is a perfect time to talk a little bit about Google PageRank and how it affects your organic search performance. When you do a search on Google for the keywords that your website is targeting, don't be too surprised if there are lots of sites showing before your site that aren't really about the topic that you searched on. Linguistics can be tricky, and while Google does a pretty good job, sometimes it just can't figure out exactly what you are searching for. In these cases, any websites that have a better PageRank than your site, and contain the keywords that you searched on, might appear before your site in the organic search results. What this means is that Google's algorithm doesn't always understand what people are looking for, and sometimes PageRank has a greater influence on results than the context of the search terms / keywords. If a person searching on Google encloses their search in quotes, the results are often very different, because then Google knows to match on that exact phrase.

The problem is that website owners are at the mercy of Google's index algorithm and how potential customers enter their search terms. That is why you need to try to anticipate how people will search, and then determine how well you did. After analyzing your web statistics and logs for the search terms and keywords that brought people to your site, you will have learned more about how people search, and can make any adjustments to your site that are needed.

Monday, January 28, 2008

How to increase pages indexed

Here are 10 ways to increase the number of pages indexed:

1) PageRank
2) Links
3) Sitemap
4) Speed
5) Google's crawl caching proxy
6) Verify
7) Content
8) Staggered launch
9) Size matters
10) Know how your site is found, and tell Google

PageRank

It depends a lot on PageRank. The higher your PageRank the more pages that will be indexed. PageRank isn't a blanket number for all your pages. Each page has its own PageRank. A high PageRank gives the Googlebot more of a reason to return. Matt Cutts confirms, too, that a higher PageRank means a deeper crawl.

Links

Give the Googlebot something to follow. Links (especially deep links) from a high PageRank site are golden as the trust is already established.

Internal links can help, too. Link to important pages from your homepage. On content pages link to relevant content on other pages.

Sitemap

A lot of buzz surrounds this one. Some report that a clear, well-structured Sitemap helped get all of their pages indexed. Google's Webmaster guidelines recommend submitting a Sitemap file.

That page has other advice for improving crawlability, like fixing violations and validating robots.txt.

Some recommend having a Sitemap for every category or section of a site.
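A Sitemap file itself is just a small XML document listing your URLs. As a rough sketch of what one looks like, here is a minimal Python generator following the sitemaps.org protocol; the example URLs are hypothetical.

```python
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Return a minimal Sitemap XML document listing the given page URLs."""
    entries = "\n".join(
        "  <url>\n    <loc>%s</loc>\n  </url>" % escape(u) for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        "%s\n</urlset>" % entries
    )

sitemap = build_sitemap([
    "http://www.example.com/",
    "http://www.example.com/widgets/blue",
])
print(sitemap)
```

The protocol also allows optional per-URL elements such as last-modification date and change frequency; the generated file is then uploaded to your site and submitted through Google's Webmaster tools.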

Speed

A recent O'Reilly report indicated that page load time and the ease with which the Googlebot can crawl a page may affect how many pages are indexed. The logic is that the faster the Googlebot can crawl, the greater number of pages that can be indexed.

This could involve simplifying the structures and/or navigation of the site. The spiders have difficulty with Flash and Ajax. A text version should be added in those instances.

Google's crawl caching proxy

Matt Cutts provides diagrams on his blog of how Google's crawl caching proxy works. It was part of the Big Daddy update, intended to make the engine faster. Any one of three indexes may crawl a site and send the information to a remote server, which the remaining indexes (like the blog index or the AdSense index) then access instead of having their bots physically visit your site. They all use the mirror instead.

Verify

Verify the site with Google using the Webmaster tools.

Content

Make sure content is original. If a page is a verbatim copy of another page, the Googlebot may skip it. Update frequently to keep the content fresh. Pages with an older timestamp might be viewed as static, outdated, or already indexed.

Staggered launch

Launching a huge number of pages at once could send off spam signals. In one forum, it is suggested that a webmaster launch a maximum of 5,000 pages per week.

Size matters

If you want tens of millions of pages indexed, your site will probably have to be on an Amazon.com or Microsoft.com level.

Know how your site is found, and tell Google

Find the top queries that lead to your site and remember that anchor text helps in links. Use Google's tools to see which of your pages are indexed, and if there are violations of some kind. Specify your preferred domain so Google knows what to index.

Thursday, January 17, 2008

New Google Filter

Is There an Anchor Text Problem?

Aaron Wall put up a post about a new Google filter that causes people with high ranking terms to be bumped down to position #6. There is also a thread at Webmaster World about this phenomenon. This is still reasonably speculative in nature, but there are a lot of people who have seen this.

Aaron offers some really interesting speculation about why this may be occurring. The most interesting theory was the notion that it was an anchor text problem. Here is what Aaron had to say:

I think this issue is likely tied to a stagnant link profile with a too tightly aligned anchor text profile, with the anchor text being overly-optimized when compared against competing sites.

Whether or not this is occurring now, it makes complete sense. It is well within Google's (or any other search engine's) ability to detect an unusually high density of one form of anchor text pointing to a given domain. For example, if your site is yourdomain.com, you sell widgets, and the anchor text in 48 of your 65 links says "Widgets on Sale", that is not natural.

Most of the links to your site should use the name of the domain itself (in this example, "yourdomain"). A skewed distribution of anchor text is a flag that the anchor text of your links is being artificially influenced. How is that done? By purchasing links, or by heavy-duty link swapping.
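The kind of check described above is easy to sketch: count how often each anchor phrase appears and see what share the most common one holds. This is an illustrative toy, not Google's actual method, and the 48-of-65 figures come from the example in the text.

```python
from collections import Counter

def anchor_skew(anchor_texts):
    """Return the most common anchor phrase and the fraction of all
    inbound links that carry it. A share near 1.0 for a keyword-rich
    phrase suggests an unnaturally uniform link profile."""
    counts = Counter(anchor_texts)
    top_phrase, top_count = counts.most_common(1)[0]
    return top_phrase, top_count / len(anchor_texts)

# The 48-of-65 example: one keyword-rich phrase dominates the profile.
links = ["widgets on sale"] * 48 + ["yourdomain"] * 12 + ["click here"] * 5
phrase, share = anchor_skew(links)
print(phrase, round(share, 2))  # widgets on sale 0.74
```

A natural link profile would instead show the domain name as the dominant anchor, with keyword phrases scattered thinly across the rest.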

This is potentially another step in Google's stepped-up war against the practice of link buying. I have long maintained that the main advantage link buying has over natural links is that people who buy links get to specify the exact (keyword-rich) anchor text used. Looking for unnatural patterns of anchor text provides a backdoor into detecting people who are purchasing links.

It might be a bit heavy-handed for Google to ban a site based on this type of evidence, but reducing the impact of anchor text on rankings when there is an unnatural distribution in play still helps them meet their goal. After all, even if the unnatural anchor text campaign does not represent the result of a link-buying campaign, and all those keyword-laden links are in fact completely natural, it might still provide better relevance for Google to filter in this manner.

Thinking about this further, this might be a simple search quality adjustment for skewed anchor text distribution. If it affects paid links, from Google’s perspective, this might just be a bonus.

Wednesday, January 2, 2008

Google Video Sitemaps

Creating and submitting Video Sitemaps files

About Google Video Sitemaps

Google Video Sitemaps is an extension of the Sitemap protocol that enables you to publish and syndicate online video content and its relevant metadata to Google in order to make it searchable in the Google Video index. You can use a Video Sitemap to add descriptive information – such as a video’s title, description, duration, etc. – that makes it easier for users to find a particular piece of content. When a user finds your video through Google, they will be linked to your hosted environments for the full playback.

When you submit a Video Sitemap to Google, we will make the included video URLs searchable on Google Video. Search results will contain a thumbnail image (provided by you or autogenerated by Google) of your video content, as well as information (such as title) contained in your Video Sitemap. In addition, your video may also appear in other Google search products. During this beta period, we can’t predict or guarantee when or if your videos will be added to our index, but as we refine our product, we expect both coverage and indexing speed to improve.

Google can crawl the following video file types: .mpg, .mpeg, .mp4, .mov, .wmv, .asf, .avi, .ra, .ram, .rm, .flv. All files must be accessible via HTTP. Metafiles that require a download of the source via streaming protocols are not supported at this time.
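For a sense of what a Video Sitemap entry looks like, here is a minimal sketch based on the sitemap-video extension of the Sitemap protocol. All URLs and values are hypothetical examples; consult Google's documentation for the full set of required and optional tags.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
  <url>
    <!-- The page where the video is hosted -->
    <loc>http://www.example.com/videos/grilling_steaks.html</loc>
    <video:video>
      <!-- Direct link to one of the supported file types, served over HTTP -->
      <video:content_loc>http://www.example.com/video123.flv</video:content_loc>
      <!-- Omit the thumbnail and Google may autogenerate one -->
      <video:thumbnail_loc>http://www.example.com/thumbs/123.jpg</video:thumbnail_loc>
      <video:title>Grilling steaks for summer</video:title>
      <video:description>A short how-to video on grilling.</video:description>
      <video:duration>100</video:duration>
    </video:video>
  </url>
</urlset>
```

The descriptive metadata (title, description, duration) is what makes the video easier for users to find, as described above; the file itself must be reachable over plain HTTP.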