Robots.txt Control Code Generation Tool
Our robots.txt control code generation tool is FREE to use.
What is a Robots.txt file?
A robots.txt file is a very simple text file placed in your root directory for search engine spiders to pick up as they land on the website.
This robots file tells the search engine robots which areas of the website they can access and which areas they should not access or index.
You should have ONLY one robots.txt file on your website, and it must always be in the root directory.
What is in a Robots.txt file?
Most robots.txt files simply contain the following code:
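The standard allow-all directives look like this (the two lines below are the conventional form of the example described next):

User-agent: *
Disallow: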
This one tells all search engine robots, known as user agents, that they may go anywhere they want; we disallow nothing.
If you want to block all robots, you would use:
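The conventional block-all form adds a single "/" to the Disallow line:

User-agent: *
Disallow: /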
The only difference between the two examples above is the "/" on the Disallow line, which blocks the entire site.
To create your robots.txt file with our tool, click here.