
Robots.txt Tester

Test whether a URL is allowed or blocked by robots.txt directives. Supports wildcards and multiple user-agents, and provides detailed matching information.

Instructions

Paste your robots.txt content, enter a URL to test, and select a user-agent. The tool will analyze the directives and tell you whether the URL may be crawled.
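
For example, you might paste a snippet like the following. This is a hypothetical robots.txt, shown only to illustrate the expected format; the paths and user-agent names are made up:

```
User-agent: *
Disallow: /private/
Allow: /private/help.html

User-agent: Googlebot
Disallow: /*.pdf$
```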

Tips

  • Longer (more specific) path rules take precedence over shorter, more general ones
  • When an Allow and a Disallow rule match with equal pattern length, the Allow rule wins (see the sketch below)
  • Wildcards: * matches any sequence of characters, and $ anchors a pattern to the end of the URL
  • Test with different user-agents to see how different crawlers are affected
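
If you're curious how this precedence logic can be implemented, here is a minimal Python sketch of a matcher that follows the rules above. It is illustrative only, not the tool's actual implementation: the `rule_to_regex` and `is_allowed` names are made up for the example, and it assumes the rules have already been parsed into (directive, pattern) pairs for a single user-agent.

```python
import re


def rule_to_regex(pattern: str) -> str:
    """Translate a robots.txt path pattern into a regular expression:
    '*' becomes '.*' (matches any sequence of characters), '$' becomes a
    regex end-of-string anchor, and everything else is matched literally."""
    regex = ""
    for ch in pattern:
        if ch == "*":
            regex += ".*"
        elif ch == "$":
            regex += "$"
        else:
            regex += re.escape(ch)
    return regex


def is_allowed(rules, url_path):
    """rules is a list of (directive, pattern) pairs for one user-agent,
    e.g. ("Disallow", "/private/"). Returns True if crawling is permitted.
    Precedence: the longest matching pattern wins; on a length tie, Allow
    beats Disallow; if no rule matches at all, the URL is allowed."""
    best = None  # (pattern length, directive is Allow)
    for directive, pattern in rules:
        if not pattern:
            continue  # an empty "Disallow:" blocks nothing
        if re.match(rule_to_regex(pattern), url_path):
            candidate = (len(pattern), directive == "Allow")
            if best is None or candidate > best:
                best = candidate
    return True if best is None else best[1]
```

A quick usage example with the same rules as the snippet above:

```python
rules = [
    ("Disallow", "/private/"),
    ("Allow", "/private/help.html"),
]
print(is_allowed(rules, "/private/secret.html"))  # False: only Disallow matches
print(is_allowed(rules, "/private/help.html"))    # True: the longer Allow rule wins
print(is_allowed(rules, "/public/index.html"))    # True: no rule matches
```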