Robots.txt Generator


Default - All Robots are:
Crawl-Delay:
Sitemap: (leave blank if you don't have one)
Search Robots: Google, Google Image, Google Mobile, MSN Search, Yahoo, Yahoo MM, Yahoo Blogs, Ask/Teoma, GigaBlast, DMOZ Checker, Nutch, Alexa/Wayback, Baidu, Naver, MSN PicSearch
Restricted Directories: The path is relative to root and must contain a trailing slash "/"



Now, create a 'robots.txt' file in your site's root directory. Copy the generated text above and paste it into that file.
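For reference, a generated file typically looks like the snippet below. The directory paths and sitemap URL are placeholders; substitute your own values.

```
User-agent: *
Disallow: /cgi-bin/
Disallow: /private/
Crawl-delay: 10
Sitemap: https://example.com/sitemap.xml
```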


About Robots.txt Generator

What is a Robots.txt Generator?

A Robots.txt Generator lets you create a robots.txt file without needing to know the Robots Exclusion Standard. It is a simple text form in which you select which directories and files you want hidden from search engine robots and which you want them to index.

This can be a great time saver for webmasters with a large site who don't want to spend hours writing a robots.txt file from scratch. Instead, you can use one of these free tools to create your robots.txt file in a matter of minutes.

Remember that your robots.txt file tells search engines which pages on your website to crawl and which to skip. The more pages you block from crawling, the less of your site search engines can index and surface to visitors.
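Before uploading, you can check what a robots.txt file actually allows and blocks using Python's standard library. This is a minimal sketch; the rules and URLs below are placeholders for your own.

```python
# Check robots.txt rules with Python's built-in parser.
from urllib.robotparser import RobotFileParser

# Illustrative rules; replace with your generated file's contents.
rules = """\
User-agent: *
Disallow: /cgi-bin/
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# A blocked path is refused, everything else is allowed.
print(parser.can_fetch("*", "https://example.com/private/page.html"))  # False
print(parser.can_fetch("*", "https://example.com/blog/post.html"))     # True
```

The same parser can also load a live file via `set_url(...)` and `read()`, which is handy for verifying the version already on your server.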

How to use the Robots.txt Generator tool online?

  1. Enter the URL of the site you want crawled.
  2. Fill in the robots.txt options (default policy, crawl delay, sitemap, restricted directories).
  3. Click on the Generate button.
  4. Download the generated text, or copy and paste it into a file named "robots.txt", and upload it to your server's root directory.
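The steps above can be sketched in code. This is a hypothetical minimal generator, assuming the same inputs the form collects: a default policy, a crawl delay, a sitemap URL, and a list of restricted directories (each with a trailing slash).

```python
# A minimal robots.txt generator sketch; all parameter names are illustrative.
def generate_robots_txt(allow_all=True, crawl_delay=None,
                        sitemap=None, restricted_dirs=()):
    lines = ["User-agent: *"]
    if not allow_all:
        # Refuse all robots by default.
        lines.append("Disallow: /")
    # One Disallow line per restricted directory.
    for path in restricted_dirs:
        lines.append(f"Disallow: {path}")
    if crawl_delay is not None:
        lines.append(f"Crawl-delay: {crawl_delay}")
    if sitemap:
        lines.append(f"Sitemap: {sitemap}")
    return "\n".join(lines) + "\n"

print(generate_robots_txt(
    crawl_delay=10,
    sitemap="https://example.com/sitemap.xml",
    restricted_dirs=["/cgi-bin/", "/private/"],
))
```

Writing the returned string to a file named "robots.txt" in your document root completes the final step.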