I work in HPC and mostly on the shell/console. Without a doubt, the combination of Bash (or another shell) and AWK is truly amazing. Being able to quickly generate statistics, filter out unnecessary information, build pipelines, etc., is unmatched, and 99% of the time it only requires a pipe redirect. Not to sound trendy, but using AWK really is the proverbial "if you know, you know".

One of my favorite use cases is replacing grep with extended regular expressions; there's always a need to search for some strings while excluding others. A basic example with grep: "grep -E 'this|that' file | grep -Ev 'not(this|that)'". With AWK it's a single command: "awk '$0 ~ /(this|that)/ && $0 !~ /not(this|that)/' file". Or, if you're monitoring server load averages (from uptime, or a tool like sar), you can pick which loads to watch based on a threshold: "uptime | awk '$10 ~ /[0-9]{3,}\.[0-9]{1,}/ || $12 ~ /[0-9]{3,}\.[0-9]{1,}/'". This prints the line only if the 1-minute or 15-minute load average is at 100 or above (three or more digits before the decimal point).

Just because it's an "old" language doesn't mean it's obsolete or useless!
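And to put a number on the "quickly generate statistics" point above, here's a minimal sketch (assuming a whitespace-separated log where column 3 holds a response time in milliseconds; the file name and column are just placeholders):

  awk '{ sum += $3; n++ } END { if (n) printf "avg: %.2f ms over %d lines\n", sum/n, n }' access.log

Sum and count as you stream through the file, print the average in the END block; no temp files, no spreadsheet, done before you could have opened anything else.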