One of the primary goals of SEO is to have your pages indexed by search engine robots, known as spiders. These spiders crawl the web looking for new pages, following links to discover content and storing a cached copy of each page they find. A page appears in search results only after it has been indexed, which is why indexing is a crucial aspect of SEO: it is how search engines make sense of and catalogue the information available on the World Wide Web.
It is therefore important to check which of the pages listed in your submitted sitemap have actually been indexed. Submitting a sitemap is a good way to make search engines aware of your website and of the pages waiting to be indexed.
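For readers who have not yet built a sitemap, a minimal one follows the XML format defined by the sitemaps.org protocol. This is a hedged sketch: the domain `www.example.com` and the page URLs are placeholders, not real addresses.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want search engines to discover -->
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/about</loc>
    <lastmod>2024-01-10</lastmod>
  </url>
</urlset>
```

Once saved (commonly as `sitemap.xml` at the site root), the file can be submitted through the search engine's webmaster tools so the crawler knows which pages to visit.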
When using a robots.txt file to exclude certain content from indexing, be extremely careful with the instructions you give the search robots: a single misplaced rule can block pages you want indexed. This will also come up when checking which pages have been indexed and diagnosing why others are not.
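To make the risk concrete, here is a minimal robots.txt sketch using the standard `User-agent`, `Disallow`, and `Sitemap` directives. The paths shown are hypothetical examples, not recommendations for any particular site.

```text
# Applies to all crawlers
User-agent: *
# Keep private areas out of the index (hypothetical paths)
Disallow: /admin/
Disallow: /checkout/
# Point crawlers at the sitemap (hypothetical URL)
Sitemap: https://www.example.com/sitemap.xml
```

Note how easily a mistake can cascade: `Disallow: /` on its own would block the entire site, which is exactly the kind of error that explains why expected pages never appear in the index.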
Using webmaster tools to see which pages have been indexed is a fast and effective way to get an overview of the progress search engine robots are making on your pages. It is perfectly fine for some pages not to be crawled immediately, as long as the most important pages have already been indexed. Once unindexed pages have been identified, the SEO team can work to increase the search engine’s awareness of them, for example through internal linking, until they are indexed.
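Alongside webmaster tools, a quick informal check is the `site:` search operator, which major search engines support for listing pages they have indexed from a given domain. The domain below is a placeholder.

```text
site:example.com
```

If an important page does not appear in the results, that is a prompt to investigate further in webmaster tools; the operator gives a rough picture rather than an exhaustive or guaranteed-complete list.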