Wait. What?<p>1) Is he implying that launching a perl interpreter is somehow a lighter-weight operation than launching ls and having it stat() half a dozen files? Having ls check whether a file exists won't even touch the disk 99% of the time: if you're working in that directory, those inodes will almost certainly be cached, and a stat() on a cached inode costs little more than the system call itself. Read: cheap. Launching perl, by contrast, will load a few libraries and check a handful of files before it even gets to his one-liner (a rough way to measure this is sketched after the find example below).<p>2) Am I the only one who, in ~10 years of running unix-like operating systems, has <i>never</i> created a filename with a newline character in it? Not even when poking at obscure filesystems.<p>3) Does he think unlink() skips the check for the file's existence?<p>4) If he wanted to overcomplicate things and <i>make sure</i> everything's kosher, why not:<p><pre><code> find . -maxdepth 1 -type f -name '*.c' -delete
</code></pre>
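For anyone who wants to sanity-check the startup-cost claim in 1), here's a rough sketch, nothing scientific, assuming a Linux box with perl, GNU coreutils, and strace on it (exact numbers will vary wildly by system):<p><pre><code> # interpreter startup alone, no script at all:
 time perl -e ''

 # shell glob plus a stat()/lstat() per match; inodes most likely cached:
 time ls *.c >/dev/null

 # syscall summaries (strace -c prints its table to stderr on exit):
 strace -c perl -e ''
 strace -c ls *.c >/dev/null
</code></pre>
The exact counts don't matter; the point is how much housekeeping the interpreter does before it runs a single character of your one-liner.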
Whenever I see anything piped to a perl one-liner, 4 times out of 5 it's some sort of hack that could be done with standard utils.<p>It also grinds my gears when people spend more time making something "more efficient" than the total CPU time that efficiency will ever save them. He spent a minute coming up with this and ten minutes posting about it. Will he really save eleven-odd minutes of execution time over a lifetime of running this "superior" command (which also takes longer to type)? If not, he wasted his time.<p>From his sig: "Smalltalk/Perl/Unix consulting". When you have a large-enough hammer....