This is a list of five cognitive biases that one could go research a lot deeper. On its own, the post doesn't offer much actionable advice on how to use or avoid these day to day. They even conflict in nonobvious ways. For example, I use the arrow keys in VIM rather than learning more efficient navigation. You could call that hyperbolic discounting, but I consider it avoiding premature optimization (I don't use VIM often).<p>Getting caught up in cognitive biases is itself an example of a problem that interferes with our productivity. It doesn't strike me as a useful view of the big picture, other than the realization that perhaps software engineering is all about balancing these sorts of conflicting factors. E.g., know when to experiment quick and dirty, and know when to go deep in refactoring.
Hyperbolic discounting sounds really weak. So what if I use the arrow keys in my editor rather than learn some obscure magic to get where I want a bit quicker? Likewise, tests are always in that nice-to-have area for code that is going to last more than a few days, but adding them up front while you are still exploring the design and solution space is just dumb (up there with premature abstraction).
Hmm,<p>I would like to add a couple more - I'd be interested to know if either of these has a name:<p>* Assuming that complexity is additive.<p>* Assuming that the complexity of an extension to a piece of software is proportional to the difficulty of imagining that extension.
All of these biases certainly exist. What I want to see is some evidence of how often they actually occur, how much damage each one does, and what ways exist to mitigate them.<p>I would agree it helps to become aware of them, but I honestly don't know how to mitigate any of them. It seems like all of them involve predicting the future. I don't know what payoff I'm going to get if I learn vim's wacky and hidden keyboard commands, and the payoff isn't likely to be large if I avoid using vim whenever possible.<p>Hyperbolic discounting seems like a generally good bias to have - unless I have a very specific bet in mind, or the big payoff is an obvious benefit or a requirement of some kind, choosing quick wins is probably the right choice more often than not.<p>Premature optimization seems like almost the exact opposite of hyperbolic discounting, and avoiding premature optimization feels like the right decision the majority of the time.<p>But I never know for sure which one was actually the right choice until long after the decision has been made. I've certainly optimized things that didn't need to be, and I've certainly gone for quick wins and regretted not planning for a bigger payoff.
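For what it's worth, "hyperbolic discounting" has a standard textbook form: a reward A delayed by D is perceived as worth A / (1 + kD), where k measures impatience. A minimal sketch (the amounts, delay and k here are made-up illustration values, not from the post):<p><pre><code>    # Standard hyperbolic discount function: perceived value of a reward
    # `amount` arriving after `delay`, with impatience parameter k.
    def hyperbolic_value(amount, delay, k=1.0):
        return amount / (1 + k * delay)

    # A small win now vs. a big payoff later (arbitrary units):
    quick_win = hyperbolic_value(10, delay=0)     # 10.0
    big_payoff = hyperbolic_value(100, delay=30)  # 100/31, roughly 3.2

    # Under steep discounting the quick win "feels" larger than the
    # objectively bigger payoff - which is exactly the bias in question.
    assert quick_win > big_payoff
</code></pre>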
IKEA Effect:
This is a good one, however I think it's also a bit oversimplified...<p>>If you’ve ever worked for a company that used a dumb internal tool rather than a better out-of-the-box solution, you know what I’m talking about.<p>This is probably not what the author is talking about, but there is a valid case for "inferior" in-house tools or libraries that this overlooks. In some cases a suitable off-the-shelf component exists, but it is an over-generalised, over-complicated black box with the associated disadvantages. If this component is an important piece of your product or business, and you only need a fraction of the functionality it provides, then it can be preferable to create an in-house version to gain simplicity, focus, insight and control.<p>I'm sure many people have experienced the opposite of "dumb internal tools" in the form of frustratingly buggy and unpredictable off-the-shelf components; when these problems are too broad to reconcile with upstream contributions, it's sometimes worth creating a less-capable "inferior" internal version more suited to your use case.
> <i>Have you ever found yourself using the arrow keys in Vim?</i><p>The real problem is that you are forced to use Vim.<p>For a long time it was the only decent code editor for the terminal. Vi is still the default editor on many distros.
The brain can make real-time decisions because it does not evaluate reality from scratch at every moment, but rather runs everything through a learned model of the world. That model is built from incomplete information, and will always be biased.
"IKEA effect" seems to be a different name for the "NIH syndrome", which I prefer to call "ownership preference" anyway, mostly because it doesn't use a brand name or an acronym.
I'd like to add one which I might call "excessive machine sympathy." This is where the programmer shies away from using a certain technique because it's too difficult, not quite grasping that it's difficult for the <i>computer</i> but not for the programmer. I often see newer programmers acting like:<p><pre><code>    f1()
    for element in array {
        f2()
    }
</code></pre>
Is easier to code than:<p><pre><code>    for element in array {
        f1()
        f2()
    }</code></pre>