Google calls Sitemaps an experiment in web crawling. In general, you place a sitemap file on your web server, which helps the crawlers identify the pages you have and add them to the search index. And when you update your sitemap (after changing a page or adding new content), it also flags the pages you've changed and hints at the order in which they should be reviewed.
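To make that concrete, here is a minimal Python sketch that writes a sitemap file in the standard sitemaps.org 0.9 format. This is an illustration, not Google's own tooling, and the URLs, dates, and priorities are hypothetical placeholders.

```python
# Minimal sketch: generate a sitemap.xml for a handful of pages.
# The URLs, lastmod dates, and priorities are hypothetical placeholders;
# the <urlset> schema is the standard sitemaps.org 0.9 format.
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

pages = [
    # (URL, last modified, suggested change frequency, relative priority)
    ("https://www.example.com/", "2024-01-15", "daily", "1.0"),
    ("https://www.example.com/products", "2024-01-14", "weekly", "0.8"),
    ("https://www.example.com/about", "2023-11-02", "monthly", "0.3"),
]

urlset = ET.Element("urlset", xmlns=NS)
for loc, lastmod, changefreq, priority in pages:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc
    ET.SubElement(url, "lastmod").text = lastmod        # when the page last changed
    ET.SubElement(url, "changefreq").text = changefreq  # hint for how often to recrawl
    ET.SubElement(url, "priority").text = priority      # importance relative to your other pages

# Write the file crawlers will fetch, typically placed at the site root.
ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```

The lastmod, changefreq, and priority fields are the hints described above: they tell the crawler which pages have changed and which ones to revisit first.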
It's a lot more systematic and efficient than blind crawling, where there is no guarantee that web crawlers will pick up your most important pages. So even though it requires an extra step from web developers, it offers greater assurance that all the hard work you put into your content actually shows up in the searches.
Google Sitemaps was developed partly to resolve a problem encountered by big websites, where web crawlers would skip over some pages and fail to index some of the content. Considering the effect this can have on your search engine optimization efforts, that glitch can have a big impact on your ranking. And for websites that change content regularly (like product sites), you'd want your new stock to actually register, especially when you're promoting a hot, trendy item.
So in short, to get...