New Robots.txt Syntax Checker: a validator for robots.txt files

Friday, March 09, 2007

Robots.txt Checker

Robots.txt files (often erroneously called robot.txt, in the singular) are created by webmasters to mark (disallow) the files and directories of a website that search engine spiders (and other types of robots) should not access.
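For example, a minimal robots.txt (the directory names here are illustrative, not taken from the post) might block all robots from two directories and add one extra rule just for Googlebot:

```
User-agent: *
Disallow: /private/
Disallow: /tmp/

User-agent: Googlebot
Disallow: /no-google/
```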

This robots.txt checker is a "validator" that analyzes the syntax of a robots.txt file to see whether its format is valid, as established by the Robots Exclusion Standard (please read the documentation and the tutorial to learn the basics), or whether it contains errors.
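The format is simple enough that Python's standard library ships a parser for it. This does not check syntax the way the validator does, but it is a quick way to see how a crawler would interpret a live file (the URL is the same placeholder used in point 1 below):

```python
from urllib.robotparser import RobotFileParser

# Parse a live robots.txt and ask how a crawler would interpret it.
parser = RobotFileParser("http://www.yourdomain.com/robots.txt")
parser.read()

# Would a generic robot be allowed to fetch this path?
print(parser.can_fetch("*", "http://www.yourdomain.com/private/page.html"))
```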

1. Simple usage: How do you check your robots.txt file's format? Just enter the full URL of the robots.txt file you want to analyze (for example: http://www.yourdomain.com/robots.txt) and hit Enter.

2. Powerful: The checker finds syntax errors and "logic" errors, catches mistyped words, and gives you useful optimization tips.

3. Accurate: The validation process takes into account both the Robots Exclusion Standard rules and spider-specific extensions (Google, Inktomi, etc.); the sketch after this list illustrates the kind of check involved.
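To make points 2 and 3 concrete, here is a minimal sketch in Python of the kind of line-by-line check such a validator performs. This is not the tool's actual implementation; the set of extension fields and the diagnostics are assumptions for the example, and the URL is the placeholder from point 1.

```python
import urllib.request

# Fields defined by the Robots Exclusion Standard, plus common
# spider-specific extensions (an assumed set, for illustration).
STANDARD_FIELDS = {"user-agent", "disallow"}
EXTENSION_FIELDS = {"allow", "crawl-delay", "sitemap"}

def check_robots_txt(url):
    """Fetch a robots.txt file and report basic syntax problems."""
    with urllib.request.urlopen(url) as response:
        lines = response.read().decode("utf-8", errors="replace").splitlines()

    problems = []
    seen_user_agent = False
    for number, raw in enumerate(lines, start=1):
        line = raw.split("#", 1)[0].strip()  # drop comments and whitespace
        if not line:
            continue
        if ":" not in line:
            problems.append(f"line {number}: missing 'Field: value' separator")
            continue
        field, value = (part.strip() for part in line.split(":", 1))
        field = field.lower()
        if field == "user-agent":
            seen_user_agent = True
        elif field in STANDARD_FIELDS | EXTENSION_FIELDS:
            # A rule before any User-agent line is a "logic" error
            # (Sitemap is the exception: it stands on its own).
            if field != "sitemap" and not seen_user_agent:
                problems.append(f"line {number}: '{field}' before any User-agent")
        else:
            problems.append(f"line {number}: unknown field '{field}' (typo?)")
    return problems

for problem in check_robots_txt("http://www.yourdomain.com/robots.txt"):
    print(problem)
```

A real checker would go further (empty Disallow values, path patterns, duplicate groups), but the structure is the same: robots.txt is a flat list of "Field: value" records, so validation reduces to recognizing fields and checking their context.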

Source: New Robots.txt Syntax Checker: a validator for robots.txt files
