We do. Much in the same way private property is protected, we need regulation enabling the technical means to keep bad actors off private machines.

This, back in the quaint good ol' days, was sufficiently implemented through the voluntary, neighborly agreement that robots.txt embodies.

Unfortunately, that is no longer enough.
I agree. Robots.txt is a suitable means of preventing crawlers from accidentally DoSing your site, but it doesn't give you any protection over how your content is used by automated services. The current anything-goes approach is just too exploitable.
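The voluntary nature is visible in how a crawler actually consumes robots.txt: nothing is enforced server-side, the client simply chooses to check. A minimal sketch in Python using the standard library's urllib.robotparser (the crawler name and URLs below are placeholders, not anyone's real service):

    from urllib.request import urlopen
    from urllib.robotparser import RobotFileParser

    AGENT = "ExampleCrawler/1.0"                          # hypothetical user agent
    TARGET = "https://example.com/private/report.html"    # hypothetical page

    # A polite crawler downloads robots.txt and honors its rules...
    rp = RobotFileParser()
    rp.set_url("https://example.com/robots.txt")
    rp.read()

    if rp.can_fetch(AGENT, TARGET):
        page = urlopen(TARGET).read()

    # ...but a bad actor just skips the check entirely:
    # page = urlopen(TARGET).read()

Nothing in the protocol prevents the second path, which is exactly why it only works among well-behaved neighbors.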