Once a website is built and published to the web, it can take some time before a search engine robot reaches it and starts indexing it, and in that window any number of errors can creep in. Third-party software such as SEOfrog or Rabbit SEO is designed to crawl your website the same way a search engine robot would, searching for errors and bugs as it goes.
The point is to pre-empt the search engines: find the errors before they do, so your website is not penalised in the rankings. The third-party crawling software identifies errors and lets developers correct them immediately, before search engines ever notice. Because it emulates the indexing process of a search engine robot, it can predict which errors will trip up crawlers and drag rankings down.
It is important to run such a crawl on your website as soon as it goes live. The sooner errors are identified and fixed, the sooner the site will be properly indexed and ranked when the real crawlers come knocking.
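To make the idea concrete, here is a minimal sketch of what such a crawler does under the hood: starting from the homepage, it follows links breadth-first within the same domain and records the HTTP status of every URL it finds, flagging broken links. This is not the implementation of any particular tool; the function names and the injectable `fetch` callable are illustrative assumptions, chosen so the sketch can be exercised without a live site.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse


class LinkExtractor(HTMLParser):
    """Collects the href value of every <a> tag on a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(start_url, fetch, max_pages=50):
    """Breadth-first crawl of one site; returns {url: status_code}.

    `fetch(url)` is supplied by the caller and must return a
    (status_code, html_text) tuple -- in practice it would wrap an
    HTTP client; in tests it can serve canned pages.
    """
    domain = urlparse(start_url).netloc
    seen, queue, results = set(), [start_url], {}
    while queue and len(seen) < max_pages:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        status, body = fetch(url)
        results[url] = status
        # Don't follow links out of error pages or off the site,
        # just as a search engine robot stays within its crawl scope.
        if status != 200 or urlparse(url).netloc != domain:
            continue
        parser = LinkExtractor()
        parser.feed(body)
        for href in parser.links:
            queue.append(urljoin(url, href))
    return results
```

Any URL in the returned map with a 4xx or 5xx status is exactly the kind of error these tools surface for developers to fix before a search engine robot stumbles over it.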