Search engines use robots (user agents) to crawl web pages. The robots.txt file is a plain-text file that defines which parts of a domain a robot may crawl. In addition, the robots.txt file can include a link to the XML sitemap.
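As an illustration, a minimal robots.txt could look like the sketch below; the paths and sitemap URL are placeholders, not values from any particular site.

```
# Applies to all crawlers
User-agent: *
# Block the admin area from crawling
Disallow: /admin/
# Explicitly allow a public subdirectory
Allow: /public/
# Optional link to the XML sitemap
Sitemap: https://www.example.com/sitemap.xml
```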
Whenever a search engine crawls a website, the first thing it looks for is a robots.txt file in the domain root. If one is found, the crawler reads the file's directives to learn which directories and files it may crawl and which are blocked from crawling. This file can be created with a robots.txt generator.
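As a rough sketch of how a well-behaved crawler honors those directives, the Python example below uses the standard-library urllib.robotparser; the domain, URLs, and user-agent string are hypothetical.

```python
from urllib.robotparser import RobotFileParser

# Fetch and parse robots.txt from the domain root (example domain)
parser = RobotFileParser()
parser.set_url("https://www.example.com/robots.txt")
parser.read()

# Check whether a hypothetical crawler may fetch specific URLs
user_agent = "ExampleBot"
for url in ("https://www.example.com/public/page.html",
            "https://www.example.com/admin/settings"):
    allowed = parser.can_fetch(user_agent, url)
    print(f"{url} -> {'allowed' if allowed else 'blocked'}")
```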
This free tool allows you to easily create a robots.txt file for your site.
You can generate the text and copy it, or generate and save the file.
Enjoy!
Suggested read: Robots txt Generator And How to Use it for Website Indexing