Robots.txt Maker

Now it's easy to tell search engine robots which of your files and folders to allow or disallow for indexing, and more...

Create a Search Engine Friendly robots.txt File Fast and Easy!

Disallow and Allow Robot Access
Major search engines (for example, Google, Bing and Yahoo!) send robots to crawl and index your files and folders.  The Disallow directive blocks robots from the paths you list, and the Allow directive works just the opposite, explicitly permitting them.
You can specify which content may and may not be crawled.
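
For example, a minimal rule set (the /private/ folder and welcome.html page are illustrative) might look like this:

  User-agent: *
  Disallow: /private/
  Allow: /private/welcome.html

This tells every robot to stay out of the /private/ folder while still permitting the single page /private/welcome.html; robots that support Allow treat the more specific rule as the winner.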

Visit Time
You can define the hours of the day or night during which you want your pages to be crawled.
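
Visit-time is a nonstandard extension honored by only a few robots, with times usually given in UTC as hhmm-hhmm.  A sketch:

  User-agent: *
  Visit-time: 0200-0600

This asks compliant robots to visit only between 02:00 and 06:00 UTC, when your server is likely to be least busy.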

Crawl Delay
If a robot is crawling your site too frequently and chewing up your bandwidth, use Crawl Delay.  This will reduce the load on your server and help your webpages open faster.
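
The delay is given in seconds between successive requests.  Support varies: Bing and Yandex have honored Crawl-delay, while Googlebot ignores it.  A sketch:

  User-agent: *
  Crawl-delay: 10

This asks compliant robots to wait ten seconds between page requests.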

Request Rate
You can define how many pages robots may request over a given number of seconds.
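
Request-rate is another nonstandard extension, written as pages/seconds and honored by few robots, so treat it as a hint.  A sketch:

  User-agent: *
  Request-rate: 1/5

This asks compliant robots to fetch no more than one page every five seconds.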

Allow Sitemaps
You can show robots where your sitemap is located for quick and easy indexing.
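
The Sitemap directive takes the absolute URL of your sitemap and stands apart from any User-agent group (example.com is a placeholder):

  Sitemap: https://www.example.com/sitemap.xml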

User Agents List
A user agent is any software that interprets web documents.  User agents include visual browsers (text-only and graphical), non-visual browsers (audio, braille), and search engine robots.

A list of user agents known to index files and folders is included in the program for easy access.  Alternatively, you can enter your own favorite robot.
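
Naming a specific user agent lets you give one robot its own rules.  A sketch (Googlebot is a real robot name; the /drafts/ path is illustrative):

  User-agent: Googlebot
  Disallow: /drafts/

  User-agent: *
  Disallow: /

Each robot follows the most specific group that matches its name, so here Googlebot may crawl everything except /drafts/, while all other robots are excluded entirely.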

Now it's easy to instruct robots which of your files and folders to allow or exclude from indexing!

"FTP Access" is included !
"FTP Access" is included for easy drag & drop or copy/paste robots.txt file Upload.  Once uploaded to the root directory of your website files, it starts to work immediately!
Whenever the next robot comes to your site, it will first read your robots.txt file to see which files it is or is not allowed to index, observe any crawl delay, note any sitemap(s), and then go on to index all files and folders that were not listed in the robots.txt file.
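
Putting the directives together, a complete robots.txt might look like the sketch below (the domain and all paths are illustrative):

  # Rules for all robots
  User-agent: *
  Disallow: /cgi-bin/
  Disallow: /private/
  Allow: /private/welcome.html
  Crawl-delay: 10

  Sitemap: https://www.example.com/sitemap.xml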

If you don't have a valid robots.txt file, robots will read and index every file and folder they can get their hands on!

Valid Robots.txt Files
Robots.txt Maker produces a valid, correctly coded robots.txt file that is stored with your website files.  Its purpose is to give instructions to user agents (also known as "robots", "crawlers" or "spiders") detailing what they should or should not index on your website.

As long as you have some basic knowledge of how to edit and upload your webpage, you'll be glad you have Robots.txt Maker.

Be sure search engine spiders index the files you want with Robots.txt Maker!

  Download the FREE demo version NOW!

$19

Download Demo

What's New

Version 4.4.1

Recommended update:

New Features:

Performance improvements

Version 4.4

Recommended update:

New Features:

File import now restricted to robots.txt files.
Updated User Agent list.
Pattern matching.
Visit Time.
Request Rate.
Updated online resources.
Interface improvements.

Version 4.3.3.1

New Feature:

Crawl Delay

Features of Robots.txt Maker

  Make Search Engine Friendly Robots.txt files.
  Save your robots.txt file to your hard drive.
  Import an existing robots.txt file from disk
   for review and editing.
  Print your robots.txt file for easy reference.
  Select popular robots from a User Agent list
   or enter your own favorite robot.
  Real-time date.
  Easily add notes.
  Visit Time.
  Crawl Delay option to reduce the load on your host server.
  Request rate.
  Pattern matching (see the example after this list).
  Disallow files and folders.
  Allow files and folders.
  Allow XML Sitemap(s).
  Online resources.
  "FTP Access" is included for easy drag & drop Upload
   to your website files.  Alternatively, you can use your
   own favorite FTP program.
  Unlimited Personal and Commercial use.
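
Pattern matching lets one rule cover many files.  Major robots such as Googlebot and Bingbot support the * wildcard and the $ end-of-URL anchor in paths (blocking PDF files here is just an illustration):

  User-agent: *
  Disallow: /*.pdf$

This keeps compliant robots away from every URL ending in .pdf anywhere on the site.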

As long as you have some basic knowledge of how to upload your web files, you can add a robots.txt file to your website in just a few minutes.

Runs On: Windows 8, Windows 7, Vista, XP