A robots.txt file tells search engine robots that crawl the web which parts of your site they may visit; it restricts crawling by compliant bots rather than enforcing access control.
These bots are automated, and before they request pages from a site, well-behaved crawlers check whether a robots.txt file exists that excludes them from certain pages or directories.
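As a rough illustration, a minimal robots.txt placed at the root of a site (the domain, directory name, and sitemap URL below are purely hypothetical) might look like this:

    # Rules apply to all crawlers
    User-agent: *
    # Ask crawlers not to fetch anything under /private/
    Disallow: /private/
    # Optionally point crawlers at the sitemap
    Sitemap: https://www.example.com/sitemap.xml

A crawler that honors the file would fetch https://www.example.com/robots.txt first and skip any URL matched by a Disallow rule for its user agent.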