The Robots.txt Generator helps you create a properly formatted robots.txt file for your website. A robots.txt file tells search engine crawlers which pages and directories on your site they may crawl and which they should skip.

Every website should have a robots.txt file in its root directory. It is one of the first files search engine bots look for when visiting your site. A well-configured robots.txt can prevent search engines from crawling private areas, duplicate content, or resource-heavy pages that waste your crawl budget.

This tool lets you specify which user agents the rules apply to, which paths to allow, which paths to disallow, and your sitemap URL. The generated output follows the standard robots.txt format (the Robots Exclusion Protocol, RFC 9309) recognized by all major search engines, including Google, Bing, and Yahoo.
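As an illustration, a file combining these directives typically looks like the following (the paths and sitemap URL here are placeholders, not tool defaults):

```
User-agent: *
Disallow: /admin/
Disallow: /tmp/
Allow: /

Sitemap: https://yourdomain.com/sitemap.xml
```

Blank lines separate rule groups, and the Sitemap line is independent of any user agent.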

Everything runs in your browser. Your configuration data is never sent to any server.

Generator

Results

How to Use

  1. Enter the user agent (use * for all bots)
  2. Add paths to allow (one per line)
  3. Add paths to disallow (one per line)
  4. Optionally enter your sitemap URL
  5. Click Calculate to generate your robots.txt
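After generating the file, you can sanity-check it before uploading. A minimal sketch using Python's standard library (the rules and URLs below are illustrative assumptions, not tool output):

```python
# Verify robots.txt rules with Python's built-in parser.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /admin/
Allow: /public/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Disallowed path: matches the Disallow: /admin/ rule.
print(parser.can_fetch("*", "https://example.com/admin/login"))   # False
# Allowed path: no Disallow rule matches it.
print(parser.can_fetch("*", "https://example.com/public/page"))   # True
```

This catches the most common mistake, a Disallow rule that accidentally blocks pages you wanted crawled.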

FAQ

Where do I put robots.txt?

Upload the robots.txt file to the root directory of your website so it is accessible at https://yourdomain.com/robots.txt.

Is my data uploaded?

No. Everything runs in your browser. Your configuration never leaves your device.

Will robots.txt hide my pages from Google?

Robots.txt prevents crawling but does not prevent indexing. If other sites link to a disallowed page, Google may still index it. Use a noindex meta tag to prevent indexing entirely.
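For reference, the noindex directive is a meta tag placed in the page itself, for example:

```html
<!-- Illustrative: place inside the <head> of the page to exclude -->
<meta name="robots" content="noindex">
```

Note that crawlers can only see this tag if they are allowed to fetch the page, so do not also disallow that page in robots.txt.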
