A robots.txt file is a plain text file that tells web crawlers which parts of a website they should stay away from. For instance, with a robots.txt file you can ask Googlebot not to crawl certain pages of your site. The file consists of a list of directives such as Allow and Disallow, from which well-behaved crawlers determine which URLs they may retrieve and which they may not. Note that robots.txt is advisory: compliant crawlers honor it, but it does not technically block access.
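To make this concrete, here is a minimal sketch using Python's standard `urllib.robotparser` module. The robots.txt contents and the example.com URLs below are hypothetical, chosen only to show how Allow and Disallow directives are interpreted:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: Googlebot may not fetch anything under
# /private/, while all other URLs remain open to every crawler.
robots_txt = """\
User-agent: Googlebot
Disallow: /private/

User-agent: *
Allow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Ask the parser whether a given user agent may fetch a given URL.
print(parser.can_fetch("Googlebot", "https://example.com/private/page.html"))  # False
print(parser.can_fetch("Googlebot", "https://example.com/public/page.html"))   # True
```

In practice, a crawler would load the live file with `parser.set_url("https://example.com/robots.txt")` followed by `parser.read()`; the inline string above just keeps the example self-contained.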