Physician, heal thyself.

It always seemed evident to me that Knuth's exhortation was about making sure that the thing you were optimizing was in fact a major contributor to run time -- and that your change was an improvement.

Telling people to avoid, for example, bounds checking because it might turn out to be a cycle soak later sounds like a good way to make your software worse, not better, in the hope that a few instructions saved will make the difference. I once worked on a code base with three different half-right hand-hacked versions of date formatting code. I replaced them with strftime(). Certainly the call was slower, but I was provably better off optimizing the timer routine that ran 50 times a second than worrying about hand-formatting dates into strings.
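A minimal sketch of that trade-off (Python's strftime standing in for the C call; the hand-rolled variant is a guess at the shape of the problem, not the actual code):

    import time

    # Roughly what one of the three half-right hand-hacked variants
    # looked like (illustrative -- note the missing zero padding):
    def format_date_by_hand(t):
        return str(t.tm_year) + "-" + str(t.tm_mon) + "-" + str(t.tm_mday)

    # The replacement: slower per call, but correct and single-sourced.
    def format_date(t):
        return time.strftime("%Y-%m-%d", t)

    print(format_date(time.localtime()))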
I had a professor who used to say, "Make it work and then make it work fast." The point being that you need to both understand and solve the problem before you can figure out how to make the solution faster. It is the reason the concept of a prototype exists in every engineering discipline.

As an additional perspective, compare the solution of an engineer with 2 years of experience to that of an engineer with 5 years of experience. If the solutions are drastically different, then their interpretations of a rule such as "avoiding premature optimization" will be drastically different as well.

Like any overly simplified statement, it is actually highly subjective. The author of the article even calls out the specific context in which his interpretation of Donald Knuth's rules is being applied -- "I’ve listed lots of relatively low-level things up there, but that’s just because it’s the level I work at." -- and as such that interpretation doesn't necessarily apply in other contexts.

Premature optimization is a problem if you're approaching it from a place of ignorance. If you're doing it mindfully, based on experience and domain knowledge, then it starts to make sense. But even under these conditions your best intentions can be wrong. I've been in plenty of situations where I thought I'd identified a code bottleneck, only to have a far easier, cheaper, and better solution completely unrelated to the code come to light.
Today I had to debug code with some database calls within 5 levels (at least) of for-each loops.

I stopped measuring at ~40,000 database round-trips.

"Engineering time costs more than CPU time" was the attitude, and for the original problem in its original specification, the solution was clearly OK.

But here we are, now needing to work out the original specifications, work out the current implementation (in case they differ anywhere), and work out whether it's worth rewriting it top-down or just fixing the worst of the loops.

And I'm not blaming whoever wrote this originally; it must have done its job to make it into the code base. But it really sucks to have to unpick it, because the assumption that "database calls are free" unravels in a really messy way.
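For anyone who hasn't hit this: a sketch of the shape of the problem, with a hypothetical schema and Python/sqlite3 purely for illustration.

    import sqlite3

    conn = sqlite3.connect("app.db")  # hypothetical database

    # The anti-pattern: a query per iteration, multiplied at every loop level.
    def order_totals_slow(customer_ids):
        totals = {}
        for cid in customer_ids:  # level 1
            orders = conn.execute(
                "SELECT id FROM orders WHERE customer_id = ?",
                (cid,)).fetchall()
            for (oid,) in orders:  # level 2: one round-trip per order
                total, = conn.execute(
                    "SELECT COALESCE(SUM(price), 0) FROM line_items"
                    " WHERE order_id = ?", (oid,)).fetchone()
                totals[oid] = total
        return totals

    # One round-trip: let the database do the joining and grouping.
    def order_totals_fast(customer_ids):
        marks = ",".join("?" * len(customer_ids))
        rows = conn.execute(
            "SELECT o.id, COALESCE(SUM(li.price), 0)"
            " FROM orders o JOIN line_items li ON li.order_id = o.id"
            " WHERE o.customer_id IN (" + marks + ") GROUP BY o.id",
            list(customer_ids)).fetchall()
        return dict(rows)

With two loop levels the multiplication is already painful; with five, you get to 40,000 round-trips without anyone ever writing an obviously slow line.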
"The website is temporarily unable to service your request as it exceeded resource limit. Please try again later."<p>Cache:
https://webcache.googleusercontent.com/search?q=cache:zjuUa-xmVy0J:www.joshbarczak.com/blog/%3Fp%3D580+&cd=1&hl=en&ct=clnk
All too often, "premature optimization" is brought up as the antidote to thinking carefully about what you're implementing before actually opening up the IDE and madly coding tests.

Thinking is hard, and takes time, and we want to get the feature out now, immediately, and worry about performance later. If at all.

And, sad to say (for an engineer), it's not clear that from a "business" perspective this is wrong. Hard to argue when accumulating features seems to matter more than avoiding crappy software. We have a lot more software these days, to run all of these bright new pieces of hardware, and perhaps because I'm an old-timer, the general quality seems to have degraded significantly. But the novelty of the stuff certainly has exploded, and I'm continually delighted by the twists and features that folks are coming up with, while being saddened by crashes, slowdowns, the need for restarting, etc.
For some reason, the vast majority of developers take 'premature' to be a synonym for 'any'.

If CS majors spent a fraction of the time learning how to optimize the way EE/CEs do, we'd need a lot less magic from the latter.
Yes, yes it is. Simply asserting the opposite doesn't suddenly invalidate an entire industry's decades' worth of experience.

One good point from the essay, though: Knuth's example of a 12% speed increase for low effort is definitely worth pursuing. I agree.

A better way of putting it:

Considering low-effort, small performance improvements that don't affect other factors such as code readability or system maintainability is not "premature optimization".

If you are considering performance "in the small" and it affects maintainability, you are indeed prematurely optimizing.
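A concrete example of the first kind (an illustrative Python sketch, not from the essay):

    # Before: membership test against a list is a linear scan per lookup.
    def find_missing_slow(wanted, present):
        return [w for w in wanted if w not in present]  # present is a list

    # After: a one-line change, same readability, O(1) lookups.
    def find_missing(wanted, present):
        present_set = set(present)
        return [w for w in wanted if w not in present_set]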
The problem I have found is not early optimization, but knowing what to optimize early and what can wait. Everything in a system can be optimized, but there obviously isn't the time to do this.

I work in Android, so the optimizations I look at first are bitmap loading, backgrounding tasks, and sqlite queries. Usually, this is where 90% of the performance benefit comes from.
He is right.

He is right about premature optimizations and also right about craftsmanship.

Many performance problems stem from bad overall design decisions, bad abstractions, and bad data structures (summarized: craftsmanship). I always marvel at how fast today's processors have become and how slow they feel under today's software.

One of my first computers had 64k of memory and a processor that was so incredibly slow compared to today's processors that it is unbelievable. And still, so much good software was made with it. In today's programs, the resources of that computer would not suffice for the idle loop.

Also, the first Unix computer that I worked on had only 8MB of memory, and it ran X11, LaTeX, and plenty of other stuff!

It has been a long way down the road of abstractions, such that today's computers can hardly get by with 1GB of memory and a 1GHz processor (2 cores, please, please!) for basic usage.

The real art in computer science is to know where to optimize -- to use your time most effectively.
Hmm, he complains about the awesome bar being coded inefficiently, but I'm on a several-year-old laptop and the awesome bar is pretty much instant at showing results when I type a key. So his evidence is not compelling in my experience. Reading his links, it sounds like his memory error checking tool is what causes the slowdown, not the usual Chrome code. Kind of bizarre that he is pointing fingers at others when his stuff is the problem that needs to be worked on.

His whole argument boils down to:
> Considering performance in the small is not “premature optimization”, it is simply good engineering, and good craftsmanship

But that's the same reason German industry tends to compete so badly right now: they produce a lot of hand-trimmed and hand-finished components that really could have been better designed for automation, requiring less custom craftsmanship. His argument seems to boil down to aesthetics.
I'm very happy to see this article and more people taking this mindset.

Many people don't even know the context of the original quote. And many times I've seen discussions about improving a piece of code shot down by a single incantation of this Knuth quote. It's sad.
Considering performance is a basic tenet of software architecture and engineering. Obsessing over it needlessly is the "premature" part. Considering the business case is the most important thing to remember.
Brilliant blog. Thank you for writing this. Optimization is important! You don't have to tune everything to the fastest possible speed. But remember, in web programming you may be writing a function that is called 10 or 100 or 500 times per request. That little optimization will add up quickly.

Be smart when developing, and know where the easy optimizations and common pitfalls are in your language and toolset. Optimize where you can without sacrificing maintainability.
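For example (a Python sketch; the helper and its cost are made up):

    from functools import lru_cache

    # A helper that might run 100 or 500 times per request. Memoizing it
    # is a one-line change with no readability cost.
    @lru_cache(maxsize=None)
    def currency_symbol(locale):
        # Stand-in for an expensive lookup (config parse, locale DB, ...).
        table = {"en_US": "$", "en_GB": "£", "de_DE": "€"}
        return table.get(locale, "$")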
"As a developer, I am tired of my IDE slowing to a crawl when I try to compile multiple projects at a time. I am tired of being unable to trust the default behavior of the standard containers. I am tired of my debug builds being unusably slow by default."<p>Don't forget web browsers. Web browsers are horrible.
Nobody's arguing that speed isn't important. What's more important is to get things right.

And no, incorporating speed into the specification doesn't make any difference. Suppose the spec says, "page must load in 200ms." Fine; if you don't care what correct page loading means, you're perfectly served with a blank one.

What's at the root of such intellectual capitulation? Complacency? Absence of skills? "Correctness is hard, let's just randomly perturb settings instead while fiddling with a stopwatch. Correctness is hard, let's just conflate motion with progress."

Whence the shabby treatment of correctness, like porn: "I'll know it when I see it."
Your small is another man's big. So "don't prematurely optimize" could be said another way: "a different point of view is like losing 80 IQ points". Or you could say it positively, like Kay does (http://en.wikiquote.org/wiki/Alan_Kay), but I think he's only positive because he drinks once in a while.