I have been doing private personal research on WWW robots since September 1995, wrote the robot, database and search engine for www.fi during October and November 1995, and have since then been more or less actively following the discussions.
For more information on robots, see the Web Robots home page.
A Perl script that collects robots.txt files from around the WWW document tree and user directories, and collates them into a single /robots.txt.
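
A minimal sketch of how such a collation script might look. This is not the original script; the paths ($docroot, the /home/*/public_html glob) and the output location are assumptions chosen for illustration, and the per-fragment comment headers are an added convenience.

    #!/usr/bin/perl
    # Sketch: gather per-directory robots.txt fragments and collate them
    # into one top-level /robots.txt. Paths below are assumed, not the
    # original script's configuration.
    use strict;
    use warnings;
    use File::Find;

    my $docroot  = '/var/www/htdocs';                        # assumed WWW document tree
    my @userdirs = grep { -d } glob('/home/*/public_html');  # assumed user directories
    my $output   = "$docroot/robots.txt";                    # the collated /robots.txt

    my @fragments;
    find(sub {
        return unless $_ eq 'robots.txt';
        return if $File::Find::name eq $output;   # skip the collated file itself
        open my $fh, '<', $File::Find::name or return;
        local $/;                                 # slurp the whole fragment
        push @fragments, "# from $File::Find::name\n" . <$fh>;
        close $fh;
    }, $docroot, @userdirs);

    open my $out, '>', $output or die "cannot write $output: $!";
    print {$out} join("\n", @fragments);
    close $out;

Run from cron or by hand; each fragment is copied verbatim, so the individual robots.txt files remain the place where exclusion rules are actually maintained.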