"When there is error, we blame humans. But it is not humans who are stupid, it's the machines. We should design machines to minimize error. Humans are bad at precision so why demand it?" -- Don Norman, Design of Everyday Things
Immortal Linux would probably get more installs: a package that stops common destructive commands from running without first warning you of the consequences.
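A toy version of that idea as a shell wrapper (purely illustrative, not the package described; a real one would cover far more commands):

  # Hypothetical sketch: shadow rm with a function that asks before running it.
  rm() {
      printf 'About to run: rm %s\nProceed? [y/N] ' "$*"
      read -r ans && [ "$ans" = y ] && command rm "$@"   # "command" bypasses the wrapper
  }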
Lately I've been advising people to use find -delete rather than rm.

Ever tried to clear the current directory with rm? It's surprisingly tricky and dangerous to wipe out dot files without moving up a directory. find has the benefit of saying exactly what it'll do before it does it: run the expression once without -delete to preview the matches, then add it back to actually delete (and it's a bit faster, in my experience).

The thing I'm really missing in find now is a native -chown and -chmod action.
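A sketch of that workflow (assuming GNU or BSD find; -mindepth 1 spares "." itself while still catching dot files):

  find . -mindepth 1           # preview: lists everything that would be removed
  find . -mindepth 1 -delete   # same selection, now actually deleted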
Conversely, I wonder why `rm` hasn't been patched to abort when the `/` path is passed as an argument (with maybe a --no-really-i-mean-it flag for the really rare cases where that's what you want).

EDIT: not sure why this is being downvoted. It's just an idea, albeit possibly a bad one.
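For what it's worth, GNU coreutils rm ships roughly this safeguard: --preserve-root has been the default for years, so a recursive delete of / is refused unless you pass --no-preserve-root (exact message may vary by version):

  $ rm -rf /
  rm: it is dangerous to operate recursively on '/'
  rm: use --no-preserve-root to override this failsafe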
It's interesting how many things sharpen when there is risk on the line. We've already discussed how narrow streets without signs are safer for pedestrians, but it's true for a lot of things. The trick is to make mistakes really painful but not fatal.
A common issue is stuff like:

  cd $TempDIR
  wget http://example.com   # do stuff in temp dir
  rm -R .                   # clean up temp dir

If $TempDIR is not set, the unquoted expansion vanishes and a bare cd takes you to your home directory, which then gets deleted. If $TempDIR does not exist, or the cd fails for any other reason, the script keeps going and deletes whatever directory you were already in.
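A defensive rewrite might look something like this (a sketch; mktemp and the abort-on-failure guards are my additions, not part of the snippet above):

  set -eu                      # abort on unset variables and on any failed command
  TempDIR=$(mktemp -d)         # create a fresh directory that is guaranteed to exist
  cd "$TempDIR"                # with set -e, a failed cd stops the script right here
  wget http://example.com      # do stuff in temp dir
  cd / && rm -rf "$TempDIR"    # remove only the directory we created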