This is a great example of systems thinking. You may see some starving birds outside, so you go feed them. But as people do that, the bird population merely increases up to the Malthusian point, so the larger population is again on the verge of starvation. By feeding one hungry bird you merely replaced it with two hungry birds.<p>Systems thinking is about trying to understand the new stable equilibrium that will result from an intervention, rather than thinking about the local, immediate response to a given action (you feel good for feeding the bird; the bird feels good; what could be wrong with that?).<p>Once you realize that, the whole world changes. All of a sudden, very obvious and simple prescriptions come into doubt.<p>Am I really helping make an OS better by reporting bugs? Maybe not. And not just an OS, but business models in general, economic development, poverty, education, family formation, trade, legal systems -- many questions that once had very simple, moral answers that were "obviously" right are now filled with questions and doubt. Are the long-term results really what I expect?<p>And with this new doubt comes a realization that it's hard to tell what the effects of an intervention will be until you have a pretty deep understanding of the system you are intervening in <i>as well as its future evolution</i> across a wide variety of scenarios. This itself requires a deep understanding of human nature. With all of that in doubt, holders of opposing views stop being people on the "wrong side" and become people with different predictions about core aspects of human nature, which are themselves up for debate.<p>This is why I suspect the post was written by an old timer. Young, enthusiastic engineers just don't think in terms of systems; they tend to think in simple terms of stimulus/response, right/wrong, help/hurt. This isn't due to a lack of intelligence or poor intellectual curiosity.
It's simply that, unlike the localized help/hurt stuff, a deep understanding of the long-term behavior of a system can only come from living in that system, observing it respond to stimulus, and carefully watching how past interventions worked out. Over and over, over many years. Gaining that perspective is measured in decades, not years, of careful observation.<p>And after those decades, you will be painfully aware of how incomplete and faulty your understanding actually is. You become much more cautious, much more skeptical, and you question a lot of obvious assumptions. Hopefully this added perspective doesn't make you more gloomy or cranky, but it certainly seems that way to the enthusiastic new entrant, for whom it's obviously the case that when the public reports more bugs, code quality can only improve, or at worst stay unchanged.
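The Malthusian point in the bird example can be sketched as a toy simulation. All the numbers here (growth rate, food supply) are made up for illustration; the point is only that raising the food supply moves the equilibrium population, not the hunger at the margin:

```python
def equilibrium_population(food_units, seasons=200, birds=10.0):
    """Toy Malthusian model: each season the flock grows while food lasts,
    then starvation trims it back to what the food supply can carry.

    food_units is the carrying capacity: the number of birds the available
    food keeps just at the edge of starvation.
    """
    for _ in range(seasons):
        growth = 0.3 * birds                     # hypothetical 30% growth per season
        birds = min(birds + growth, food_units)  # starvation caps the flock
    return birds

# Feeding the birds raises food_units, but the flock just grows to match,
# so the population always settles right at the starvation boundary:
print(equilibrium_population(food_units=50))   # 50.0 birds, all at the margin
print(equilibrium_population(food_units=100))  # 100.0 birds, all at the margin
```

Doubling the food doubles the number of birds living at the edge of starvation, which is the commenter's point: the local act (one bird fed) and the equilibrium outcome (more hungry birds) differ.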