Robots.txt Generator



About Robots.txt Generator


Generate Robots.txt Using a Robot Text Generator

Webmasters today are savvy: they create a robots.txt file that tells search engines and their robots how to crawl the website, which helps with search engine optimization. To produce this file, webmasters use a robots.txt generator (often searched for as a “robot text generator” or “robot txt generator”).

What is robots.txt?

It is a text file containing instructions for web crawlers. Using a custom robots.txt generator, webmasters can create a file tailored to their specific website. The REP, or Robots Exclusion Protocol, is the group of web standards that regulates how robots crawl the web, how they index and access content, and how that content is then served up to users. These standards cover meta robots tags as well as site-wide, subdirectory, page, and link-level instructions for search engines.

In simple words, when you create a robots.txt file, it tells web crawlers (also known as user agents) which parts of the website to crawl and which parts to leave alone.

The format of robots.txt

User-agent: [user-agent name]

Disallow: [URL string not to be crawled]

The robots.txt generator (sometimes written as robot.txt generator) is a tool that automatically creates this file for a particular website. Although the file is small and contains only a few instructions, you can use multiple instruction sets to allow, disallow, or set crawl delays for different crawlers. The robots text generator creates a robots.txt file with all the parameters you want for your website.

Each set of instructions addressed to one or more crawlers is known as a user-agent directive group. A blank line separates each user agent's group, and each “allow” or “disallow” rule applies only to the user agents named in its own group.

For example, discobot, msnbot, and Slurp can each be given a separate set of instructions in the same file.
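A sketch of such a file, with three separate user-agent groups (the rules and paths here are purely illustrative), might look like this:

```
User-agent: discobot
Disallow: /

User-agent: msnbot
Crawl-delay: 10

User-agent: Slurp
Disallow: /private/
```

Each group ends at the blank line, so the crawl delay applies only to msnbot, and Slurp is blocked only from /private/.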

Here are a few more instructions to make you understand.

  1. To block all of the web crawlers from the entire web content, you need the file with the following code:

User-agent: *

Disallow: /

  2. To allow all of the web crawlers to access the entire web content, you need the file with the following code (an empty Disallow rule blocks nothing):

User-agent: *

Disallow:


  3. To block a specific web crawler, e.g., Googlebot, from a specific folder on the website, you need the file with the following code:

User-agent: Googlebot

Disallow: /example-subfolder/


  4. To block a specific web crawler, e.g., Bingbot, from a specific web page on the website, you need the file with the following code:


User-agent: Bingbot

Disallow: /example-subfolder/blocked-page.html

To make sure that the robots.txt file works and helps improve your SEO ranking, you should also use other tools such as a Meta Tag Generator, XML Sitemap Generator, Keyword Density Checker, Backlink Checker, Google Index Checker, etc. These tools will help your website get the content that it needs. The XML Sitemap Generator is a great online tool that you can use to generate the sitemap for your website. Referencing the sitemap in robots.txt is a great addition, because it helps web crawlers access the various parts of your website easily. Generally, the sitemap reference goes in the last section of the robots.txt file, and adding it is easy with an online robots.txt generator.
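For example, a robots.txt file that allows everything and ends with a sitemap reference could look like this (the domain is a placeholder; substitute your own):

```
User-agent: *
Disallow:

Sitemap: https://example.com/sitemap.xml
```

Note that the Sitemap line uses a full absolute URL, not a relative path.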

Syntax of robots.txt

The syntax of a robots.txt file consists of the directives you use. There are five main directives:

  • User-agent: Names the web crawler the rules apply to, such as Googlebot or Bingbot.
  • Disallow: A command that tells the crawler not to crawl a URL or a particular part of the website.
  • Allow: A command that tells a crawler it may access a page or subfolder, even within an otherwise disallowed directory.
  • Crawl-delay: A delay in seconds. It tells the crawler how many seconds to wait between requests. (Note that Googlebot ignores this directive.)
  • Sitemap: Calls out the specific location of your XML sitemap, making it easy for crawlers to discover the pages of your website.
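Putting the five directives together, a single file might look like the sketch below (the crawler name, paths, and domain are illustrative assumptions, not recommendations for any particular site):

```
User-agent: Bingbot
Disallow: /admin/
Allow: /admin/help.html
Crawl-delay: 5

Sitemap: https://example.com/sitemap.xml
```

Here Bingbot is kept out of /admin/ except for one help page, and is asked to wait five seconds between requests.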

Some Important Points

Here are a few more instructions about the robots.txt file.

  • Make sure that your robots.txt file sits in the top-most (root) directory of the website.
  • Always name the file in lowercase letters: “robots.txt”, not “Robots.txt”.
  • Do not use this file to hide private user information; the file itself is publicly accessible.
  • A root domain and each of its subdomains need their own robots.txt files. For example, example.com and blog.example.com (illustrative names) would each serve a separate file from their own root directory.
  • It is always good to add a sitemap reference to the robots file.

Why do you need a robots.txt file?

The robots.txt file is very handy, although of course you should not disallow Googlebot from crawling your entire website.

  • It helps prevent duplicate content from appearing in the search engines.
  • It allows you to keep an entire section of a website private.
  • It allows you to specify the location of your sitemap or sitemaps.
  • It lets you prevent search engines from indexing specific files, such as PDFs or images.
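These rules only work if crawlers read and honor them. Python's standard library ships a parser for this format, and the following sketch (using a made-up rule set and a placeholder domain) shows how a well-behaved crawler decides whether a URL may be fetched:

```python
from urllib.robotparser import RobotFileParser

# An illustrative robots.txt: block /private/ for all crawlers.
rules = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# A well-behaved crawler checks before fetching each URL.
print(parser.can_fetch("*", "https://example.com/private/report.pdf"))  # False
print(parser.can_fetch("*", "https://example.com/blog/post.html"))      # True
```

In a real crawler you would call `parser.set_url(...)` and `parser.read()` to fetch the live file from the site instead of parsing an inline string.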

Creating a robots.txt file

Creating the robots.txt file is very easy: all you need is an online robots.txt generator. There are also dedicated generators for WordPress and Joomla websites. Make sure that you also use a Meta Tag Generator, XML Sitemap Generator, Keyword Density Checker, Backlink Checker, Google Index Checker, etc. All of these together with the robots.txt file will help you earn a good rank on the SERP.

You may additionally use our other tools.