Can anyone please tell me what exactly is a robots.txt file?
A robots.txt file is a plain text file placed at the root of a website that tells web crawlers which parts of the site they should not visit. For instance, with a robots.txt file you can ask Googlebot not to crawl certain pages of your website. The file consists of directives such as User-agent, Disallow, and Allow, which tell crawlers which URLs they may retrieve and which they may not. Note that it is a request, not an enforcement mechanism: well-behaved crawlers honor it, but it does not technically block access.
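As a sketch of what those directives look like in practice, here is a minimal example file (the paths and domain are hypothetical, chosen just for illustration):

```
# Rules for all crawlers
User-agent: *
Disallow: /admin/

# Rules specific to Googlebot
User-agent: Googlebot
Disallow: /private/
Allow: /private/welcome.html

# Optional: point crawlers at your sitemap
Sitemap: https://www.example.com/sitemap.xml
```

The file must be served at the site root (e.g. `https://www.example.com/robots.txt`) for crawlers to find it.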
You only need a robots.txt file if your website contains content that you don't want Googlebot or other search engine bots to crawl. If you want search engines to crawl your entire website, you don't have to create a robots.txt file, not even an empty one. One caveat: robots.txt controls crawling, not indexing. A page blocked by robots.txt can still appear in search results if other sites link to it; to keep a page out of the index entirely, use a noindex directive instead.