A robots.txt file tells search engines and other robots which parts of your website they may visit and index. The robots.txt generator creates a robots.txt file that asks well-behaved crawlers to skip pages you consider irrelevant, which helps reduce unwanted crawling and keeps that content out of search results. Note that robots.txt is advisory only: compliant crawlers honor it, but it is not a security mechanism and does not protect private data. The file must be placed in the root directory of your site; crawlers will not look for it anywhere else.
The robots.txt generator is simple to use: every field you need to fill in is explained. Just follow the steps and you will quickly have a robots.txt file ready to upload to your site's root directory.
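As an illustration, a minimal robots.txt produced by such a generator might look like the sketch below; the directory names and sitemap URL are hypothetical examples, not output from any specific tool:

```text
# Rules for all crawlers
User-agent: *
# Hypothetical directories that should not be crawled
Disallow: /admin/
Disallow: /tmp/
# Everything else may be crawled
Allow: /

# Optional: point crawlers at the sitemap (example URL)
Sitemap: https://www.example.com/sitemap.xml
```

Each `User-agent` group applies to the named crawler (`*` means all), and `Disallow`/`Allow` rules are matched against URL paths relative to the site root.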