
How to increase pages indexed

There are ten ways to increase the number of pages indexed:

1) PageRank
2) Links
3) Sitemap
4) Speed
5) Google's crawl caching proxy
6) Verify
7) Content
8) Staggered launch
9) Size matters
10) Know how your site is found, and tell Google

PageRank

How many pages get indexed depends a lot on PageRank. The higher your PageRank, the more pages will be indexed. PageRank isn't a blanket number for all your pages; each page has its own. A high PageRank gives the Googlebot more of a reason to return, and Matt Cutts confirms that a higher PageRank means a deeper crawl.

Links

Give the Googlebot something to follow. Links (especially deep links) from a high-PageRank site are golden, as the trust is already established.

Internal links can help, too. Link to important pages from your homepage, and on content pages link to relevant content elsewhere on the site.
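A minimal sketch of what this looks like in markup; the page names and anchor text here are hypothetical examples, not pages from any real site:

```html
<!-- Homepage: link to important pages with descriptive anchor text.
     The URLs and labels below are illustrative placeholders. -->
<a href="/services/seo-audit.html">SEO audit services</a>
<a href="/guides/sitemaps.html">Guide to XML Sitemaps</a>

<!-- On a content page: link to relevant content elsewhere on the site -->
<p>For background, see our
  <a href="/guides/pagerank.html">explanation of PageRank</a>.</p>
```

Descriptive anchor text does double duty: it gives the Googlebot a path to follow and tells it what the target page is about.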

Sitemap

There is a lot of buzz around this one. Some report that a clear, well-structured Sitemap helped get all of their pages indexed. Google's Webmaster guidelines recommend submitting a Sitemap file.

That page has other advice for improving crawlability, like fixing violations and validating robots.txt.

Some recommend having a Sitemap for every category or section of a site.
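For reference, a minimal Sitemap following the sitemaps.org protocol looks like this; the example.com URLs and dates are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2008-06-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/articles/indexing.html</loc>
    <lastmod>2008-05-20</lastmod>
  </url>
</urlset>
```

The per-category suggestion maps onto the protocol's Sitemap index file, which points at one Sitemap per section:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>http://www.example.com/sitemap-blog.xml</loc></sitemap>
  <sitemap><loc>http://www.example.com/sitemap-products.xml</loc></sitemap>
</sitemapindex>
```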

Speed

A recent O'Reilly report indicated that page load time, and the ease with which the Googlebot can crawl a page, may affect how many pages are indexed. The logic: the faster the Googlebot can crawl, the more pages can be indexed.

This could involve simplifying the site's structure and/or navigation. Spiders have difficulty with Flash and Ajax, so add a text version in those instances.
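One concrete way to cut load time is server-side compression and caching of static assets. A hypothetical Apache configuration sketch, assuming mod_deflate and mod_expires are enabled:

```apache
# Compress text responses so pages download faster for visitors
# and for the Googlebot (requires mod_deflate).
AddOutputFilterByType DEFLATE text/html text/css application/javascript

# Let static assets be cached so repeat fetches are cheaper
# (requires mod_expires).
ExpiresActive On
ExpiresByType image/png "access plus 1 month"
ExpiresByType text/css "access plus 1 week"
```

This is a sketch, not a drop-in config; the right content types and cache lifetimes depend on the site.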

Google's crawl caching proxy

Matt Cutts provides diagrams on his blog of how Google's crawl caching proxy works. It was part of the Big Daddy update, designed to make the engine faster. Any one of three indexes may crawl a site and send the information to a remote server; the remaining indexes (like the blog index or the AdSense index) then read from that mirror instead of their bots physically visiting your site.

Verify

Verify the site with Google using the Webmaster tools.
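Verification is done by proving you control the site, typically by adding a meta tag that Webmaster Tools generates for you (an uploaded HTML file is the other option). A sketch of the meta-tag route; the content value below is a placeholder, since the real token comes from your Webmaster Tools account:

```html
<!-- Goes in the <head> of the homepage. "YOUR_VERIFICATION_TOKEN"
     is a placeholder for the token Webmaster Tools generates. -->
<meta name="verify-v1" content="YOUR_VERIFICATION_TOKEN" />
```

Once verified, you get access to crawl errors, index stats, and the other diagnostics mentioned below.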

Content

Make sure content is original: if a page is a verbatim copy of another page, the Googlebot may skip it. Update frequently to keep the content fresh; pages with an older timestamp might be viewed as static, outdated, or already indexed.

Staggered launch

Launching a huge number of pages at once could set off spam signals. One forum suggests that a webmaster launch a maximum of 5,000 pages per week.
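If you have a large backlog of pages, the suggestion amounts to splitting them into weekly batches. A small illustrative sketch (the function name and interface are made up for this example, not a real tool):

```python
def launch_batches(pages, per_week=5000):
    """Split a list of page URLs into weekly launch batches of at most
    per_week pages, per the forum's 5,000-per-week suggestion.

    This is an illustrative helper, not part of any real publishing tool.
    """
    return [pages[i:i + per_week] for i in range(0, len(pages), per_week)]
```

So a backlog of 12,000 pages would go out as two full batches of 5,000 and a final batch of 2,000, spread over three weeks.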

Size matters

If you want tens of millions of pages indexed, your site will probably have to be on an Amazon.com or Microsoft.com level.

Know how your site is found, and tell Google

Find the top queries that lead to your site, and remember that anchor text in links helps. Use Google's tools to see which of your pages are indexed and whether there are violations of some kind. Specify your preferred domain so Google knows which version to index.
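Beyond setting the preferred domain in Webmaster Tools, you can enforce it at the server with a 301 redirect. A hypothetical .htaccess sketch, assuming Apache with mod_rewrite and example.com standing in for your actual domain:

```apache
# Redirect the bare domain to the preferred www version with a
# permanent (301) redirect, so Google indexes one consistent domain.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

The 301 status matters: it tells Google the move is permanent, so link credit consolidates on the preferred version.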

