The Sitemaps protocol informs search engines about URLs on a website that are available for crawling. A Sitemap is an XML file that lists a site's URLs and may include additional information about each URL: when it was last updated, how often it changes, and how important it is relative to other URLs on the site. This allows search engines to crawl the site more intelligently. Sitemaps are a URL inclusion protocol and complement robots.txt, a URL exclusion protocol.
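A minimal Sitemap illustrating these elements might look as follows; the `urlset`, `url`, `loc`, `lastmod`, `changefreq`, and `priority` elements are defined by the Sitemaps protocol, while the domain and dates are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page; only <loc> is required -->
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-15</lastmod>       <!-- date of last modification -->
    <changefreq>weekly</changefreq>     <!-- hint: how often the page changes -->
    <priority>1.0</priority>            <!-- relative importance, 0.0 to 1.0 -->
  </url>
  <url>
    <loc>https://www.example.com/about</loc>
    <lastmod>2023-11-02</lastmod>
    <changefreq>yearly</changefreq>
    <priority>0.5</priority>
  </url>
</urlset>
```

The optional fields are hints rather than commands: search engines may use them to schedule crawls but are not obliged to honor them.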
Sitemaps are particularly beneficial when:

- Some areas of the website are not accessible through the browsable interface
- The site uses rich Ajax, Silverlight, or Flash content that is not normally processed by search engines
- The site is very large, so web crawlers may overlook some new or recently updated content
- The site has a large number of pages that are isolated or poorly linked to one another
- The site has few external links