Robots.txt Generator
Default - All Robots are:

Crawl-Delay:

Sitemap: (leave blank if you don't have one)
Search Robots:
  Google
  Google Image
  Google Mobile
  MSN Search
  Yahoo
  Yahoo MM
  Yahoo Blogs
  Ask/Teoma
  GigaBlast
  DMOZ Checker
  Nutch
  Alexa/Wayback
  Baidu
  Naver
  MSN PicSearch
Restricted Directories: Each path is relative to the root and must end with a trailing slash "/".

Now, create a 'robots.txt' file in your site's root directory, copy the generated text above, and paste it into that file.
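For example, filling in the fields above might produce a file like the following (the crawl delay, sitemap URL, and blocked directories are illustrative placeholders, not output of this tool):

```
User-agent: *
Crawl-delay: 10
Disallow: /cgi-bin/
Disallow: /tmp/

User-agent: Baiduspider
Disallow: /

Sitemap: https://www.example.com/sitemap.xml
```

Each `User-agent` block applies to one crawler (`*` matches all), and `Disallow` lines list the directories that crawler should skip.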


About Robots.txt Generator

A robots.txt file serves roughly the opposite purpose of a sitemap: while a sitemap lists the pages you want search engines to index, robots.txt tells crawlers which pages and directories to skip. Having a robots.txt file for each site is therefore recommended. When a web crawler visits a site, it first looks for the robots.txt file at the root level, reads the rules it contains, and uses them to determine which files and directories it may not crawl.
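As a sketch of that lookup step, Python's standard-library `urllib.robotparser` can parse a robots.txt and answer whether a crawler may fetch a given URL (the rules and URLs below are illustrative, not tied to any real site):

```python
from urllib.robotparser import RobotFileParser

# A small robots.txt, supplied inline instead of fetched over HTTP.
rules = """
User-agent: *
Crawl-delay: 10
Disallow: /private/
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# A well-behaved crawler checks the rules before fetching each URL.
print(parser.can_fetch("*", "https://example.com/private/data.html"))  # blocked
print(parser.can_fetch("*", "https://example.com/index.html"))         # allowed
print(parser.crawl_delay("*"))                                         # 10
```

In a real crawler you would call `set_url()` and `read()` to fetch the file from `https://<site>/robots.txt` before checking URLs.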


Under Construction
