SEO · 3 min read

How robots.txt works and what it should not be used for

A practical guide to robots.txt for technical SEO, with clear limits on what crawler directives can and cannot do.

Robots.txt controls crawling, not secrecy

A robots.txt file tells crawlers which paths they should or should not request. It is useful for avoiding wasteful crawling of low-value areas such as internal search results or faceted navigation.
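A minimal robots.txt that blocks a couple of low-value areas might look like this (the paths and sitemap URL are illustrative, not recommendations for any specific site):

```
# Applies to all crawlers
User-agent: *
Disallow: /search
Disallow: /cart

# Optional: point crawlers at the sitemap
Sitemap: https://example.com/sitemap.xml
```

Rules are matched by path prefix, so `Disallow: /search` covers `/search` and everything beneath it.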

Blocking a URL this way does not make it private, and it does not guarantee the URL stays out of search results: a search engine can still index a blocked URL it discovers through external links, even without ever crawling the page itself.
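To see how a well-behaved crawler interprets these rules, Python's standard library ships a parser for the robots exclusion protocol. A small sketch, using a hypothetical rule set:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules: block a low-value search path for all crawlers.
rules = [
    "User-agent: *",
    "Disallow: /search",
]

parser = RobotFileParser()
parser.parse(rules)

# A blocked path is reported as not fetchable...
print(parser.can_fetch("*", "https://example.com/search?q=test"))  # False

# ...while everything else remains allowed.
print(parser.can_fetch("*", "https://example.com/blog/post"))  # True
```

Note what this does and does not check: `can_fetch` only answers whether a polite crawler should request the URL. Nothing stops a client that ignores the file from fetching the page anyway.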

Use it for guidance, not as a security layer

Robots.txt helps shape crawl behavior, but it should not be treated as protection for sensitive content.

If a page must stay private, use proper access controls instead of relying on crawler directives.
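For comparison, keeping a crawlable page out of the index is a job for a noindex directive, not robots.txt. The two standard forms look like this (the page itself must remain crawlable, or the directive is never seen):

```
<!-- In the page's HTML head -->
<meta name="robots" content="noindex">
```

```
# Or as an HTTP response header, useful for non-HTML files
X-Robots-Tag: noindex
```

And if the content is genuinely sensitive, neither directive is enough: put it behind authentication so that access, not indexing, is what's controlled.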
