The longer I work on performance teams, the more I agree with the "performance always matters" point of view[1]. In large binaries (or at least, in the large binaries I've worked on), we don't see a few tight loops taking the bulk of the time. Instead, there's a death by a thousand cuts in which many small inefficiencies add up and have to be clawed back slowly and painfully, often by people with less understanding of the semantics of the code in question than the original authors. Most performance work I see isn't stuff like "write the tight loop in assembly; save 30% on execution time", it's stuff like "reuse the locale object to avoid construction penalties; save 0.4%".<p>For reasons like this I'm skeptical of e.g. Python advocates who say the speed difference doesn't matter since you can always rewrite "the hot code" in C. That works when you're truly using Python as a scripting language; the glue that ties your matrix multiplication routines or whatever together. But when you're going to have a large, flat performance profile, you're better off just writing your program in C++ (or its friends) to begin with.<p>[1] So I suppose you can take this entire comment as "person in specialty thinks everyone else should change to make his life easier". Maybe I just have a warped view of priorities.
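To make the "reuse the locale object" fix concrete, here's a minimal sketch (the format_slow/format_fast helpers and the formatting workload are made up for illustration; the point is just that constructing a std::locale on every iteration is exactly the kind of small, diffuse cost described above):<p><pre><code>#include <locale>
#include <sstream>
#include <string>
#include <vector>

// Before: builds a fresh locale for every value (hypothetical hot path).
std::string format_slow(const std::vector<double>& values) {
    std::string out;
    for (double v : values) {
        std::ostringstream os;
        os.imbue(std::locale(""));      // locale constructed each iteration
        os << v;
        out += os.str();
        out += '\n';
    }
    return out;
}

// After: build the locale once and reuse it across calls.
std::string format_fast(const std::vector<double>& values) {
    static const std::locale user_locale("");   // constructed once
    std::string out;
    for (double v : values) {
        std::ostringstream os;
        os.imbue(user_locale);
        os << v;
        out += os.str();
        out += '\n';
    }
    return out;
}</code></pre>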
<i>“We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil.”</i><p>As I've gotten older, I've developed the following interpretation of this statement:<p>Getting system architecture right is hard, but crucial. When it is right, there is very little code to it, as little as necessary for the job. Rather than sprinting ahead and micro-optimizing your initial architecture, you should go slow, focus on minimal viable units within the system and then, once contact with the real world has occurred, step back and ask if your architecture is going with the problem, aikido style, or fighting it, J2EE style. Then, and only then, consider the perf fixes appropriate to whichever situation you find yourself in.
The author is very welcome to start a company that builds software according to these principles. In some domains (databases, network stacks, middleware), he could probably make a killing. In other domains (say, social web applications), he'd get clobbered.<p>I think this is the problem with appeals to professional ethics. Software is not one profession, and not one industry. It's labeled as such because the software industry changes much more quickly than the job categorization industry does. Nowadays, there's a huge difference between people who write software that helps planes stay in the air vs. those who let you search the web vs. those who write critical storage infrastructure for other companies vs. those who write software that lets you throw sheep at your friends vs. those who let you order from the grocery store with your phone. The best practices and rules of thumb for one industry don't transfer over to another industry. And it's not fair to consumers to make them deal with blindingly fast software that doesn't actually do what they want when they're quite willing to put up with slow, bloated software that does.
I couldn't agree more. It's rather alarming, really, the mindset that most professional developers have on this matter. Making the life of the developer easier is the highest objective, low-level work is derided, ease of use for the dev is readily bought with inefficiency, massive layers of abstraction and bloat are <i>everywhere</i>, and above all performance in the end product is sacrificed for the convenience of the developer.<p>Now, I'm not saying to spend substantial amounts of time and money chasing small micro-optimizations. I'm saying that at times it seems like everybody simply <i>stopped giving a shit about performance</i>, and so we end up with orders of magnitude of crud in our software.
I thought it was a bit silly until I saw that he's talking about C++.<p>The smart-everything, RAII-everything approach that most C++ code uses these days tends to make everything a little bit slow. In other languages, there's usually a small number of places that are slow, and everything else is inconsequential. In C++ you end up spending a lot of time spread among a million different bits of code twiddling smart pointers and RAIIing things that don't really need it.<p>I completely lost it at this paragraph, though:<p>"If you tell yourself, <i>it’s only a malloc, it’s nothing</i>, and you do this often enough, you will end up with 25000 temporary allocs for a single keystroke. They may only take 50ns each, but I type 529 characters per minute."<p>That works out to 1.1% CPU utilization when typing at full bore. (50ns * 25000 * 529/minute = 0.011.) In the abstract, that's high. But in a practical sense, it's completely irrelevant.<p>Shaving CPU cycles off the keystroke handler of an app that only uses 1% CPU in the hands of a fast typist is a massive waste of the programmer's time. This paragraph is followed by:<p>"When these sorts of things are pointed out, people ought to respond by fixing the problem, so that they can deliver a better product. Instead, they typically start arguing with you about why it’s not that big a deal."<p>Well yes, because it's <i>not</i> that big of a deal. There are better places to spend your time. A product that uses 0.01% CPU while typing at full bore is not noticeably better than one that uses 1% CPU. You won't "deliver a better product" by addressing this, you'll waste a bunch of time you could have spent building a product that's actually better.
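For anyone who wants to sanity-check that back-of-the-envelope figure, here's the same arithmetic spelled out (a trivial sketch; the 50 ns, 25000 allocs, and 529 cpm numbers are just the ones quoted above):<p><pre><code>#include <cstdio>

int main() {
    // Figures quoted from the article above.
    const double ns_per_alloc   = 50.0;      // cost of one temporary alloc
    const double allocs_per_key = 25000.0;   // temporary allocs per keystroke
    const double keys_per_min   = 529.0;     // typing speed

    const double sec_per_key  = ns_per_alloc * allocs_per_key * 1e-9;  // 1.25 ms
    const double keys_per_sec = keys_per_min / 60.0;                   // ~8.8
    const double cpu_fraction = sec_per_key * keys_per_sec;            // ~0.011

    std::printf("CPU spent in temporary allocs while typing flat out: %.1f%%\n",
                cpu_fraction * 100.0);   // prints 1.1%
}</code></pre>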
I'm glad this article exists, mainly because calling any and all optimization work a "micro-optimization" has become highly fashionable among the lazy. For every 1 developer who actually understands what Knuth was saying, there are 5 more who think that any concern for performance during implementation is "premature", and will take any opportunity to criticize others, loudly, so that they appear intelligent.<p>I recently had someone chastise me for micro-optimizing, before even seeing the code, understanding the use case, or knowing that my load tests had already established this as a bottleneck. I'd barely said more than "I'm looking for a more efficient implementation of this" before being shamed as a micro-optimizer. It's out of control.
They keep telling you:<p>> "We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil."<p>But they always omit the following part of the quote:<p>> Yet we should not pass up our opportunities in that critical 3%. A good programmer will not be lulled into complacency by such reasoning, he will be wise to look carefully at the critical code, but only after that code has been identified.
> There is a whole class of developer out there who rejects any optimization beyond occasional attention to asymptotic complexity. They believe that their responsibility ends once they’ve picked the theoretical best algorithm, and that at this point things are as good as they’re likely to get.<p>And there's also a whole class of developers who don't seem to care about asymptotic complexity at all. Take for example the React crowd: in React, in its rawest form, every little update requires O(N) time in the size of the tree to update the display, even though each update is usually fast in practice because of the diffing algorithm. However, that O(N) complexity puts a strong limit on the maximum size of the DOM tree, and it is imprudent to just ignore it.
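To illustrate the shape of that cost, here's a toy sketch in C++ (not React's actual reconciler, just an illustration of why a full-tree diff is O(N) in the number of nodes, no matter how small the change):<p><pre><code>#include <algorithm>
#include <cstddef>
#include <memory>
#include <string>
#include <vector>

// Toy virtual-DOM node -- an illustration, not React's real data structure.
struct Node {
    std::string tag;
    std::string text;
    std::vector<std::unique_ptr<Node>> children;
};

// A full-tree diff has to visit every node of both trees, so even a
// one-character text change still costs O(N) in the size of the tree.
std::size_t count_changes(const Node& prev, const Node& next) {
    std::size_t changes = (prev.tag != next.tag || prev.text != next.text) ? 1 : 0;
    const std::size_t n = std::min(prev.children.size(), next.children.size());
    for (std::size_t i = 0; i < n; ++i)
        changes += count_changes(*prev.children[i], *next.children[i]);
    // Added or removed children would also have to be walked; omitted for brevity.
    return changes;
}</code></pre>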
A couple of quotes come to mind related to abstraction.<p><pre><code> Premature optimization is like a fart: premature abstraction is like taking a dump on another developer's desk.
</code></pre>
A somewhat less blunt version of that:<p><pre><code> Premature optimization, that's like a sneeze. Premature abstraction is like Ebola; it makes my eyes bleed.</code></pre>
One example is Skype on Windows. It now takes several minutes to load when it used to take a few seconds. I also find mobile hard to use for everything because there are 10-second delays between every trivial task you want to do, and it adds up.
Knuth:<p>> The conventional wisdom shared by many of today’s software engineers calls for ignoring efficiency in the small; but I believe this is simply an overreaction to the abuses they see being practiced by penny-wise-and-pound-foolish programmers, who can’t debug or maintain their “optimized” programs.<p>I think ignoring efficiency is bad, but not prematurely optimising is very different from ignoring efficiency. The author seems to be misquoting Knuth to a certain extent by equating them.<p>Give me the fast-enough, maintainable code with no performance issues, no repetition, and those intermediary variables, please.
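On those intermediary variables: in a compiled language they're normally free. A hedged sketch (hypothetical function; a decent optimizer will typically fold the named temporaries away, which is easy to confirm on a compiler explorer):<p><pre><code>// Hypothetical example: the named intermediates exist only for the reader.
double total_price(double unit_price, double quantity, double tax_rate) {
    const double subtotal = unit_price * quantity;
    const double tax      = subtotal * tax_rate;
    return subtotal + tax;
}

// The "clever" single-expression version. At -O2 both functions typically
// compile to identical machine code: the temporaries never hit memory.
double total_price_terse(double unit_price, double quantity, double tax_rate) {
    return unit_price * quantity + unit_price * quantity * tax_rate;
}</code></pre>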
"premature optimization" is statement setting itself up for negativity. Of course "premature" optimization is bad - so is basically anything else that happens prematurely.<p>That's not to deny thoughtful optimization when logical, even before the program runs. I've often found Knuth's quote to be interpreted as "Performance doesn't matter - just get it to run".<p>Which as Knuth points out, for "one shot" programs is probably fine, but for anything in the longer term is damaging.