Posted on | November 12, 2009

A search engine "indexes" websites to add their latest information to its existing database. To do this, search engines use programs called "spiders". These aren't the 8-legged crawling spiders that spin webs; these are search engine spiders, automated programs that crawl over websites, reading the source, indexing the pages, and updating the engine's existing database of content.

When you submit your website to a search engine, the spider comes over, crawls your website, and indexes every page it has access to. The spider visits your website, reads the content of each web page and the site's META tags, and follows the links the site points to. All the information collected by the spider is then returned to the search engine's central repository, its database, where the pages are indexed. The spider will also visit every link displayed on your website and index those sites as well.

Sometimes spiders don't index all the pages of a website in a single pass, so if your website has 500 pages, don't expect all of them to get indexed in one spider visit!
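To make the parsing step concrete, here is a toy sketch of what a spider does with a single page: read the source, record the META tags, and collect the links to follow next. This is only an illustration using Python's standard `html.parser` module on a hypothetical page; real search engine spiders are vastly more sophisticated (they handle robots.txt, duplicate detection, crawl scheduling, and much more).

```python
from html.parser import HTMLParser

class SpiderParser(HTMLParser):
    """Collects the links and META tags a spider would record for one page."""
    def __init__(self):
        super().__init__()
        self.links = []   # hrefs to follow on the next crawl step
        self.meta = {}    # META name -> content, stored in the index

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "a" and "href" in attrs:
            self.links.append(attrs["href"])
        elif tag == "meta" and "name" in attrs:
            self.meta[attrs["name"]] = attrs.get("content", "")

# A made-up page standing in for a fetched website
page = """
<html><head>
<meta name="description" content="Example page">
<meta name="keywords" content="spiders, indexing">
</head><body>
<a href="/about.html">About</a>
<a href="http://example.com/">Example</a>
</body></html>
"""

parser = SpiderParser()
parser.feed(page)
print(parser.links)  # ['/about.html', 'http://example.com/']
print(parser.meta)   # {'description': 'Example page', 'keywords': 'spiders, indexing'}
```

A real crawler would repeat this for every URL in `parser.links`, which is how a spider ends up indexing the sites your pages link to.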