>>we find stepping through a program less productive than thinking harder and adding output statements and self-checking code at critical places.<p>This ignores the typical scenario where you're working on a business application in which you've never seen most of the code, only the parts relevant to your own past tasks, and suddenly you're asked to fix a bug or make a change somewhere you didn't even know existed. The original developer is long gone, nothing is documented, and chances are the code hides plenty of coding horrors. A debugger can be really helpful for uncovering how that code works.<p>It also assumes that the only way to use a debugger is to set a breakpoint and then follow every step until the end, which is simply not the case. A debugger lets you set conditional breakpoints, skip whole sections of code, evaluate expressions in the actual context the application had when it stopped, and even change variables while the program is running. It is a great tool for exploring code and behavior.<p>I respect that some people may not like debuggers and don't want to use them, but I find the idea that avoiding them is superior, and that you're a worse programmer if you use them, pretty dumb.
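<p>To make the conditional-breakpoint point concrete, here is a minimal sketch using Python's standard pdb (the thread isn't language-specific; the script, file name, and data are hypothetical). Instead of stepping through every loop iteration, you stop only when the interesting condition actually holds, then inspect or patch state in the live context:

```python
# buggy.py -- hypothetical script with a suspect computation
def total(orders):
    s = 0
    for o in orders:
        s += o["qty"] * o["price"]  # suspect line: what if qty is negative?
    return s

# A possible pdb session (commands shown as comments):
#   $ python -m pdb buggy.py
#   (Pdb) break buggy.py:5, o["qty"] < 0   # conditional breakpoint
#   (Pdb) continue                         # run until the condition holds
#   (Pdb) p o                              # inspect the offending order
#   (Pdb) !o["qty"] = 1                    # patch the variable live
#   (Pdb) continue

if __name__ == "__main__":
    print(total([{"qty": 2, "price": 3.0}, {"qty": 1, "price": 4.0}]))
```

The same idea applies in gdb (`break file:line if cond`) or any IDE debugger: the breakpoint fires only for the case you care about, which is a very different workflow from single-stepping to the end.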