Robots.txt Generator

Generate properly formatted robots.txt files for your website. Control how search engine crawlers access your content by setting allow and disallow rules, crawl delays, and sitemap locations.

User Agent Rules

Default: no disallow paths (allows all crawlers full access)

Sitemap URLs

Host Directive (Optional)

Specifies the preferred domain for search engines (a Yandex-specific directive; most other crawlers ignore it)

Generated robots.txt

User-agent: *
Disallow:
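A fuller configuration using the directives this generator supports might look like the following sketch. The paths, sitemap URL, and domain are placeholders for illustration, not recommended values:

```
User-agent: *
Disallow: /admin/
Allow: /admin/public/
Crawl-delay: 10

Sitemap: https://example.com/sitemap.xml
Host: example.com
```

Blank lines separate rule groups; each group starts with one or more `User-agent` lines followed by its `Allow`/`Disallow` rules. `Sitemap` is independent of any group and may appear anywhere in the file.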

FAQ

What is a robots.txt file?

A robots.txt file is a plain-text file placed in your website's root directory that tells search engine crawlers which pages or sections of your site they may or may not access. It follows the Robots Exclusion Protocol (standardized as RFC 9309).
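You can check how crawlers would interpret a given robots.txt file with Python's standard-library parser. This is a minimal sketch; the rules and URLs are made up for illustration:

```python
from urllib.robotparser import RobotFileParser

# Parse robots.txt rules directly from a list of lines
# (RobotFileParser can also fetch the file from a URL via set_url/read).
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /admin/",
    "Allow: /",
])

# Ask whether a given user agent may fetch a given URL.
print(rp.can_fetch("*", "https://example.com/admin/panel"))  # False
print(rp.can_fetch("*", "https://example.com/about"))        # True
```

`can_fetch` applies the same longest-match semantics crawlers use, so it is a quick way to sanity-check a generated file before deploying it.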

Can robots.txt block pages from search results?

Robots.txt can prevent crawlers from accessing pages, but it does not guarantee pages will not appear in search results; a blocked URL can still be indexed if other sites link to it. For reliable removal, allow crawling and use a noindex meta tag or an X-Robots-Tag HTTP header instead. Note that if robots.txt blocks a page, crawlers never fetch it and so cannot see its noindex directive.
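For reference, the two noindex mechanisms look like this (illustrative snippets):

```
<!-- In the page's HTML <head> -->
<meta name="robots" content="noindex">
```

```
# Sent as an HTTP response header, useful for non-HTML files such as PDFs
X-Robots-Tag: noindex
```

The header form is typically configured in the web server, so it works for resources where you cannot edit the markup.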

Where should I place the robots.txt file?

The robots.txt file must be placed in your website root directory and be accessible at yourdomain.com/robots.txt. Search engines look for it at this exact location.
