What is a robots.txt file?

A robots.txt file restricts access to your site by search engine robots that crawl the web. These bots are automated; before they access pages on a site, they check whether a robots.txt file exists that asks them to stay out of certain pages. All reputable robots honor the directives in a robots.txt file, though some may interpret them differently. Keep in mind that robots.txt is advisory, not an access control mechanism: it cannot enforce its rules, and malicious crawlers may simply ignore it.

You need a robots.txt file only if your site includes content that you don't want search engine crawlers to access. Note that robots.txt blocks crawling, not indexing: a blocked page can still appear in search results if other sites link to it.
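As a minimal sketch, a robots.txt file is a plain-text list of user-agent groups and directives (the paths and the Googlebot rule below are illustrative placeholders, not anything your site requires):

```
# Ask all crawlers to skip a private directory
User-agent: *
Disallow: /private/

# Rules specific to Google's crawler
User-agent: Googlebot
Disallow: /tmp/
```

The file must live at the root of the site (e.g. https://example.com/robots.txt); crawlers do not look for it in subdirectories.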

For a real-world sample, see Wikipedia's own robots.txt file at https://en.wikipedia.org/robots.txt.