Upstanding crawlers already follow this and more; rogue crawlers won't be influenced by such a code of conduct.

Furthermore, unless a crawler is trying really hard to disguise itself, site owners have the ultimate 'self-help' available to them: just block bad crawlers entirely.

Any formal code of conduct beyond what is already customary practice (like following robots.txt rules) thus strikes me as superfluous.