Robots.txt Generator


Default - All Robots are:

Crawl-Delay:

Sitemap: (leave blank if you don't have one)
     
Search Robots: Google
  Google Image
  Google Mobile
  MSN Search
  Yahoo
  Yahoo MM
  Yahoo Blogs
  Ask/Teoma
  GigaBlast
  DMOZ Checker
  Nutch
  Alexa/Wayback
  Baidu
  Naver
  MSN PicSearch
   
Restricted Directories: The path is relative to the root and must contain a trailing slash "/".

Now create a 'robots.txt' file in your site's root directory, copy the generated text above, and paste it into that file.
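For reference, each form option above maps to a robots.txt directive. A file built with all robots allowed, a crawl delay, a sitemap, and one restricted directory might look like the sketch below (the domain, the 10-second delay, and the /cgi-bin/ directory are illustrative values, not tool defaults):

    User-agent: *
    Crawl-delay: 10
    Disallow: /cgi-bin/
    Sitemap: https://www.example.com/sitemap.xml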


About Robots.txt Generator

A robots.txt file tells bots which areas of your site they are allowed to crawl. Before a search engine crawls your site, it reads your robots.txt file to get instructions on how to crawl your pages and index them in search results.

A robots.txt file is especially useful when you don't want search engines to index duplicate or broken pages, specific areas of your site, or login pages, and when you want to point crawlers to your XML sitemap. With a robots.txt file you can exclude pages that add no value to your site, so search engines concentrate on crawling your most important pages.
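Each robot listed in the generator can also be given its own rule group. As an illustration, the sketch below keeps the site open to all crawlers except for a login page and a duplicate print version, blocks Google Image (user-agent Googlebot-Image) entirely, and declares an XML sitemap; the /login/ and /print/ paths are hypothetical:

    User-agent: *
    Disallow: /login/
    Disallow: /print/

    User-agent: Googlebot-Image
    Disallow: /

    Sitemap: https://www.example.com/sitemap.xml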

Search engines can crawl only a limited number of pages per day (the crawl budget), so it helps them if you block irrelevant URLs; that way they can crawl your important pages more quickly.
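Internal search results and session or tracking parameters are common examples of such irrelevant URLs. A minimal sketch (the paths are hypothetical, and the * wildcard is honored by major crawlers such as Googlebot and Bingbot):

    User-agent: *
    Disallow: /search/
    Disallow: /*?sessionid=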