It would also make it hard to block bots that were sucking up bandwidth. If you couldn't tell what a bot was up to, you'd have to assume it was human.
I think I’ll write a crawler, too. Mine will be sixpackbot. It will stumble through websites, sometimes three or four times, in circles. It’ll stop for a nap occasionally and then start up again, stopping to hit the john a few times. I’ll ignore robots.txt files, though, ’cause when sixpackbot is drunk, he goes wherever he damn well pleases.
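For the record, a crawler that stumbles like that is maybe thirty lines of Python. Here's a rough sketch of what sixpackbot might look like; the User-Agent string, the nap timings, and the whole thing are made up, and it pointedly never fetches robots.txt:

```python
import random
import time
import urllib.request
from html.parser import HTMLParser
from urllib.parse import urljoin


class LinkParser(HTMLParser):
    """Collects href values from anchor tags."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def sixpackbot(start_url, steps=20):
    """Stumble from link to link, happily revisiting pages,
    napping at random. Never checks robots.txt -- that's the joke."""
    url = start_url
    for _ in range(steps):
        try:
            req = urllib.request.Request(
                url, headers={"User-Agent": "sixpackbot/0.1"}
            )
            with urllib.request.urlopen(req, timeout=10) as resp:
                html = resp.read().decode("utf-8", errors="replace")
        except Exception:
            break  # passed out; call it a night
        parser = LinkParser()
        parser.feed(html)
        if not parser.links:
            break
        # pick any link at random -- loops and repeat visits are a feature
        url = urljoin(url, random.choice(parser.links))
        if random.random() < 0.2:
            time.sleep(random.uniform(5, 30))  # nap, or a trip to the john


if __name__ == "__main__":
    sixpackbot("https://example.com")
```

The random link choice is what sends it around in circles, and the random sleep is the nap. A polite crawler would do the opposite on both counts: remember where it's been, throttle itself predictably, and read robots.txt first.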