What is the robots.txt file?
The robots.txt file is a simple text file that lives in the root directory of your website, so it is always served from the same place, e.g. https://www.example.com/robots.txt. Its purpose is to give search engine crawlers instructions about how to crawl your website.
For example, this robots file tells all search engine crawlers that they can visit any page.
User-agent: *
Allow: /
Conversely, this robots file tells all crawlers to stay out of all pages.
User-agent: *
Disallow: /
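Crawlers that honor the standard parse these rules before fetching anything. Here is a minimal sketch using Python's standard-library urllib.robotparser; the rules and URL are just the example above, not a real site:

from urllib.robotparser import RobotFileParser

# Parse the "block everything" rules from the example above.
rules = """\
User-agent: *
Disallow: /
"""
parser = RobotFileParser()
parser.parse(rules.splitlines())

# A compliant crawler checks each URL before requesting it.
print(parser.can_fetch("AnyBot", "https://www.example.com/page.html"))  # False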
You can, of course, be much more specific. This example tells two specific crawlers to stay out of a particular directory.
User-agent: BadBot
User-agent: Googlebot
Disallow: /private/
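Consecutive User-agent lines form a single group, so the Disallow rule applies to both crawlers while leaving everyone else unrestricted. Continuing the sketch above (the bot names and URL come from the example, not any real policy):

from urllib.robotparser import RobotFileParser

# Two User-agent lines share the Disallow rule that follows them.
rules = """\
User-agent: BadBot
User-agent: Googlebot
Disallow: /private/
"""
parser = RobotFileParser()
parser.parse(rules.splitlines())

print(parser.can_fetch("BadBot", "https://www.example.com/private/a.html"))     # False
print(parser.can_fetch("Googlebot", "https://www.example.com/private/a.html"))  # False
print(parser.can_fetch("OtherBot", "https://www.example.com/private/a.html"))   # True, no rule matches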
Remember that the instructions in your robots.txt file are purely advisory: well-behaved crawlers follow them, but nothing forces a crawler to comply.

Do not use robots.txt to hide sensitive pages or information. The file is publicly readable, so listing a path in a Disallow rule actually advertises its existence.
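To see why, note that anyone can retrieve the file with a single request. A minimal sketch, with a placeholder domain:

from urllib.request import urlopen

# robots.txt is served publicly at a fixed path; the domain below is a placeholder.
with urlopen("https://www.example.com/robots.txt") as response:
    print(response.read().decode("utf-8", errors="replace"))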