A robots.txt file is used to tell search engine robots which pages on your website you want indexed and which pages to disregard when crawling the site. Pages with sensitive content can thus be kept out of search results at the behest of the client.
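As an illustration, a minimal robots.txt might look like the following. The directory names here are hypothetical, chosen only to show the directive syntax:

```
# Applies to all crawlers
User-agent: *

# Hypothetical sensitive areas the client asked to exclude
Disallow: /admin/
Disallow: /client-private/
```

Each `Disallow` line asks compliant crawlers not to fetch URLs under that path; an empty `Disallow:` or no rule at all leaves everything open to crawling.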
When implementing an SEO campaign for a website, it is important to review and revise the robots.txt file once the site has gone live and the campaign has been running for a while. A robots.txt file is always located in the root directory of a website, and when a search engine robot comes across the site, it looks to this file for instructions. New instructions will make the robot aware that the website content has been updated and that there are new files to be indexed.
Secondly, the robots.txt file may be excluding content that needs to be indexed, and a revision of the file can correct the problem. A review will also highlight any other potential issues that can arise from the robots.txt file.
The chief aim of a robots.txt review is to ensure that nothing on the main website is being excluded for any reason other than that the client asked for it.
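A review of this kind can be partly automated. The sketch below uses Python's standard `urllib.robotparser` module to check a list of URLs against a set of rules; the rules and paths are hypothetical examples, not taken from any real site:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content being reviewed
rules = """
User-agent: *
Disallow: /private/
Disallow: /tmp/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Pages the client expects to be crawlable; /private/ is excluded on purpose
for path in ["/", "/products/widget", "/private/accounts"]:
    allowed = parser.can_fetch("*", "https://www.example.com" + path)
    print(path, "->", "allowed" if allowed else "blocked")
```

Running a check like this against the list of pages that should be indexed makes it easy to spot a `Disallow` rule that is accidentally blocking important content.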