1. Choose a Preset
Pick a preset (Standard, WordPress, Laravel) or build from scratch.

2. Add Rules
Add Allow/Disallow rules, set a crawl delay, and add your sitemap URL.

3. Download File
Click Generate, then download the file and upload it to your site root.
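A generated file following the steps above might look like this (the blocked paths and sitemap URL are placeholders; a WordPress-style setup is assumed):

```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Crawl-delay: 10

Sitemap: https://example.com/sitemap.xml
```

Allow rules let you re-open a specific path inside a blocked directory, as with admin-ajax.php here.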
What Is robots.txt?
The robots.txt file tells search engine crawlers which pages they may and may not access on your site. It lives at your domain root (e.g., https://example.com/robots.txt), and compliant crawlers check it before fetching your content.
Common uses: blocking admin pages, preventing duplicate content indexing, controlling crawl rate, blocking bots from private directories, and pointing crawlers to your sitemap.
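To see how a crawler interprets these rules, you can test a generated file with Python's standard-library robots.txt parser (the rules and URLs below are made-up examples):

```python
from urllib.robotparser import RobotFileParser

# Example rules, similar to what the generator produces
rules = """\
User-agent: *
Disallow: /admin/
Crawl-delay: 10
Sitemap: https://example.com/sitemap.xml
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# Blocked directory: not fetchable
print(rp.can_fetch("*", "https://example.com/admin/settings"))  # False
# Everything else: fetchable
print(rp.can_fetch("*", "https://example.com/blog/hello"))      # True
# Crawl-delay is exposed per user agent
print(rp.crawl_delay("*"))                                      # 10
```

This is the same check a well-behaved crawler performs before requesting a URL.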
Is This Tool Free and Private?
Yes. Runs 100% in your browser. No data sent to any server. Works offline.
Frequently Asked Questions
Where do I put the robots.txt file?
Upload it to your website's root directory so it's accessible at https://yourdomain.com/robots.txt. In Laravel, place it in the /public folder.
Does Disallow remove a page from Google?
It prevents crawling but not indexing. If other sites link to a blocked page, Google may still index the URL (without its content). For full removal, use a noindex meta tag and leave the page crawlable; if robots.txt blocks the page, crawlers never see the tag.
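For example, to keep a page out of the index entirely, put this standard tag in the page's head (and make sure robots.txt does not block the page, or crawlers will never read it):

```
<meta name="robots" content="noindex">
```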
Is my data sent to a server?
No. Everything runs in your browser. No data is ever transmitted.