Robots.txt Generator


Default - All Robots are:  
    
Crawl-Delay:
    
Sitemap: (leave blank if you don't have one) 
     
Search Robots: Google
  Google Image
  Google Mobile
  MSN Search
  Yahoo
  Yahoo MM
  Yahoo Blogs
  Ask/Teoma
  GigaBlast
  DMOZ Checker
  Nutch
  Alexa/Wayback
  Baidu
  Naver
  MSN PicSearch
   
Restricted Directories: The path is relative to the root and must include a trailing slash "/"
Now create a 'robots.txt' file in your root directory, then copy the text above and paste it into that file.
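For reference, a file generated from the fields above might look like the following. The sitemap URL and directory names here are placeholders, not values produced by the tool:

```
User-agent: *
Crawl-delay: 10
Disallow: /cgi-bin/
Disallow: /private/

Sitemap: https://www.example.com/sitemap.xml
```

Note that each restricted directory gets its own Disallow line, and the path keeps the trailing slash described above.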


About Robots.txt Generator

This tool allows you to generate a robots.txt file, which you can use to exclude some of your webpages from bots. Once you place the file on your server, compliant robots will skip the pages in your exclusion list, keeping crawl activity focused on the pages you actually want indexed.

What is Robots.txt?

Robots.txt is the first file search engines look for when they crawl your website. Once they find it, they check its list of files and directories. You can generate a robots.txt file using our tool: specify which files or directories should be excluded from search engine results, or which ones shouldn't be crawled by the search engines at all.

Using the tool is very easy. Simply click the Create button and copy the results. You can also build individual directives with this tool, choosing either Allow or Disallow for each path. Please note that the default is "Allow", so you have to change it if you want to disallow anything. You can add or remove directives as needed.
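As an illustration of how Allow and Disallow directives combine (the paths here are hypothetical examples, not tool defaults):

```
User-agent: *
Allow: /blog/
Disallow: /blog/drafts/
```

Most major crawlers apply the most specific matching rule, so in this sketch /blog/drafts/ would stay blocked while the rest of /blog/ remains crawlable.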

