Robots.txt Generator


Default - All Robots are:

Crawl-Delay:

Sitemap: (leave blank if you don't have one)

Search Robots:
  Google
  Google Image
  Google Mobile
  MSN Search
  Yahoo
  Yahoo MM
  Yahoo Blogs
  Ask/Teoma
  GigaBlast
  DMOZ Checker
  Nutch
  Alexa/Wayback
  Baidu
  Naver
  MSN PicSearch

Restricted Directories: The path is relative to the root and must contain a trailing slash "/".

Now create a robots.txt file in your site's root directory, then copy the generated text above and paste it into that file.
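
For reference, a generated file might look roughly like the sketch below; the crawl delay value, sitemap URL, and restricted directory are placeholders, not actual output from the tool.

    User-agent: *
    Crawl-delay: 10
    Disallow: /admin/

    Sitemap: https://www.example.com/sitemap.xml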


About Robots.txt Generator


Welcome to SEO CTRL's Robots.txt Generator. This free tool helps webmasters produce a robots.txt file and make their sites friendlier to Googlebot and other search engine crawlers.

The tool is easy to use, and the options on the page let you choose exactly what to include. If search engine bots cannot find a robots.txt file on your site, there is a chance the crawlers will not index all of your pages. Google also works with a crawl budget: a rough limit on how many pages its crawlers will fetch from your site and how much time they will spend there. If Google decides that crawling your site is hurting your users' experience, it will crawl the site more slowly. To keep crawling efficient, your site should have both a sitemap and a robots.txt file: the sitemap points crawlers at the pages you want indexed, and robots.txt keeps them out of sections that do not need crawling, so the crawl budget is spent on the content that matters.
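
Before uploading the file, you can sanity-check how a crawler will read it. The short Python sketch below uses only the standard library's urllib.robotparser; the rules and URLs in it are hypothetical placeholders, not output from the generator.

    import urllib.robotparser

    # Hypothetical rules to check; replace with the text produced by the generator.
    rules = [
        "User-agent: *",
        "Crawl-delay: 10",
        "Disallow: /admin/",
        "",
        "Sitemap: https://www.example.com/sitemap.xml",
    ]

    parser = urllib.robotparser.RobotFileParser()
    parser.parse(rules)

    # A generic crawler must stay out of /admin/ but may fetch other pages.
    print(parser.can_fetch("*", "https://www.example.com/admin/settings"))   # False
    print(parser.can_fetch("*", "https://www.example.com/blog/first-post"))  # True

    # The crawl delay and sitemap entries as the parser sees them
    # (crawl_delay needs Python 3.6+, site_maps needs Python 3.8+).
    print(parser.crawl_delay("*"))  # 10
    print(parser.site_maps())       # ['https://www.example.com/sitemap.xml']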

How to use SEO CTRL’s Robots.txt Generator Tool:

  1. First, choose whether all robots are allowed: tick "Allowed" to let every robot crawl the site, or "Refused" to block them by default.
  2. Choose a crawl delay.
  3. Paste your sitemap URL into the Sitemap box (leave it blank if you don't have one).
  4. Select which of the listed bots may crawl your site.
  5. Finally, list the directories you want to restrict; the sketch after this list shows how these last two choices turn into directives.
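
To connect steps 4 and 5 to the file itself: in robots.txt, rules for a specific bot go in their own User-agent group, and each restricted directory becomes a Disallow line with a trailing slash. The bot names and paths below are illustrative examples, not output from the tool.

    # Per-bot rules (step 4): Googlebot-Image and Baiduspider shown as examples
    User-agent: Googlebot-Image
    Disallow: /photos/

    User-agent: Baiduspider
    Disallow: /

    # Restricted directories (step 5), applied to all remaining robots
    User-agent: *
    Disallow: /cgi-bin/
    Disallow: /admin/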

SEO CTRL’s Robots.txt Generator is a quick and effective way to produce a robots.txt file for your site.