When a website is submitted to search engines using a sitemap, the search engine's robot (or spider) crawls the website and indexes its pages. However, while crawling, these robots can also run into errors that prevent pages from being indexed.
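A sitemap is simply an XML file listing the URLs you want crawled. As a minimal sketch (the example.com addresses are placeholders, and `build_sitemap` is an illustrative helper, not part of any tool mentioned here), one could generate such a file like this:

```python
# Sketch: generate a minimal XML sitemap for submission to search engines.
# The URLs passed in are placeholders for your own pages.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Return a sitemap XML string listing the given page URLs."""
    ET.register_namespace("", SITEMAP_NS)  # emit the default sitemap namespace
    urlset = ET.Element(f"{{{SITEMAP_NS}}}urlset")
    for page in urls:
        url = ET.SubElement(urlset, f"{{{SITEMAP_NS}}}url")
        loc = ET.SubElement(url, f"{{{SITEMAP_NS}}}loc")
        loc.text = page
    return ET.tostring(urlset, encoding="unicode")

if __name__ == "__main__":
    print(build_sitemap(["https://www.example.com/",
                         "https://www.example.com/about"]))
```

The resulting file is what you upload (or link in robots.txt) so the crawler knows which pages to visit.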
Once webmaster tools have been set up for your site, the diagnostics section will highlight any errors noticed during the crawl. Some of the most common search engine errors are:
Malware: the crawlers might detect malware that has snuck into your website. It is imperative that this be identified and removed as soon as possible, as anything harmful to your users will have a detrimental impact on your search engine ranking.
Crawl errors: if the crawler comes across errors while crawling the website, a report is sent to the diagnostics section of your webmaster tools. Using this information, problems such as inaccessible URLs can be fixed before users hit them; broken links are bad for search result ranking.
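You can spot-check URLs yourself, much as a crawler would, and fix broken links before they show up in a diagnostics report. A small sketch using only the standard library (`check_url` and `is_broken` are illustrative names, not part of any webmaster tool):

```python
# Sketch: spot-check URLs the way a crawler would, so broken links can be
# fixed before they are reported in your webmaster tools.
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

def is_broken(status):
    """Treat 4xx and 5xx HTTP responses as broken links."""
    return status is None or status >= 400

def check_url(url, timeout=10):
    """Return the HTTP status code for url, or None if unreachable."""
    try:
        with urlopen(Request(url, method="HEAD"), timeout=timeout) as resp:
            return resp.status
    except HTTPError as err:
        return err.code          # server answered, but with an error status
    except URLError:
        return None              # DNS failure, refused connection, etc.
```

Run `check_url` over the URLs in your sitemap and anything `is_broken` flags (a 404, a 500, an unreachable host) is a candidate for the same inaccessible-URL report the search engine would generate.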
HTML issues: the search engine will also make the webmaster aware of HTML elements that are not search engine friendly. This can include missing or problematic meta tags and descriptions, or any other markup that prevents the page from being indexed.
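Two of the simplest checks of this kind can be automated: a missing meta description, and a robots meta tag that blocks indexing outright. A hedged sketch with the standard library's HTML parser (`MetaAudit` and `audit` are hypothetical helpers for illustration):

```python
# Sketch: flag two common HTML issues a search engine would report -
# a missing meta description, and an explicit "noindex" robots directive.
from html.parser import HTMLParser

class MetaAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.has_description = False
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attrs = dict(attrs)
        name = (attrs.get("name") or "").lower()
        if name == "description" and attrs.get("content"):
            self.has_description = True
        if name == "robots" and "noindex" in (attrs.get("content") or "").lower():
            self.noindex = True

def audit(html):
    """Return a list of indexing problems found in the given HTML."""
    parser = MetaAudit()
    parser.feed(html)
    problems = []
    if not parser.has_description:
        problems.append("missing meta description")
    if parser.noindex:
        problems.append("robots meta tag blocks indexing")
    return problems
```

Feeding each page's HTML through `audit` gives a quick local preview of the kind of HTML warnings the diagnostics report would surface.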
Identifying and fixing these errors as they are reported by search engines will have a significant impact on your search engine ranking, so they should always be given prompt attention.