I feel I made the transition from junior to mid-level developer when I could come into new code and judge it instantly, pointing out what was wrong, how it could be improved, and how it should've been built. I feel I went from mid to senior when I would enter that same situation and try to understand the why behind the code (even though I probably still thought it was poorly architected) before passing judgement or trying to "fix" it with a refactor.
I work in a lot of old systems, and this is something I have to remind myself of constantly: the people who were working on these systems before me were not necessarily dumber than me. If they made these choices, they may have had good reasons that are not immediately clear to me, and if I change things, there may be far-flung consequences I am simply not aware of.
How do you know how much you don't know?

I'd suggest there's a difference between "I don't understand how this is supposed to work" and "I do understand how this is supposed to work, and I don't understand why it isn't working."

The seatbelt story seems to be Type 1. But what if a lot of "stupid" design decisions are actually Type 2?

And the reasons may or may not be good, falling somewhere on a scale from real budget and/or time constraints, through lack of insight, indifference, and penny-pinching economics, to passive-aggressive user hostility.

When Apple removed MagSafe, no doubt there were perfectly good internal justifications for that decision. But ultimately the people outside the company who said it was a poor move from a user POV turned out to be right.
> An overwhelming feeling of stupidity hit me almost as fast. Nine-year-old me was not in fact smarter than Toyota engineers.

It’s unfortunate how many people never come to this realization.
The seatbelt is an excellent example since there are multiple mechanisms with varying observability contributing to its effectiveness. Even after discovering the dynamic locking behaviour, a naive yet inquisitive passenger remains blind to the pretensioning feature that is observable only during a collision.
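To make that observability gap concrete, here is a toy model. It is entirely my own sketch: the class, threshold, and behaviour are invented for illustration, not taken from any real seatbelt controller. The point is that any curious passenger can trigger the inertia lock by experiment, while the pretensioner only ever fires on a crash signal.

    # Toy model of the two seatbelt mechanisms. All names and numbers
    # here are hypothetical, purely to illustrate the observability gap.

    class Seatbelt:
        LOCK_THRESHOLD_G = 1.5  # invented threshold for the inertia reel

        def pull(self, acceleration_g: float) -> str:
            # Discoverable by experiment: yank the belt hard and it locks.
            if acceleration_g > self.LOCK_THRESHOLD_G:
                return "locked"
            return "extends freely"

        def on_collision(self) -> str:
            # Observable only in a crash: the pretensioner fires once,
            # pulling slack out of the belt before the body loads it.
            return "pretensioner fired, slack removed"

    belt = Seatbelt()
    print(belt.pull(0.3))       # extends freely -- the everyday case
    print(belt.pull(2.0))       # locked -- the nine-year-old's discovery
    print(belt.on_collision())  # the state you hope never to observe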
Kinda related. When I was 11 or 12, I remember waiting in my Dad's car with my younger brother while he went into a store for like 20 minutes. We started playing with the steering wheel, and then suddenly I couldn't turn it anymore. It was locked. We were shit-scared that we'd broken something: how would Dad drive the car now? I remember staying quiet when my Dad got back to the car (I had warned my younger brother to deny that we did anything), and the moment he put the key in the ignition, boom. I facepalmed in my mind and learned how steering wheel locking works :)
One particularly prominent special case of this is when one engineer makes a snap judgment that another engineer's work is "overcomplicated". Often this means the person making the judgment hasn't thought deeply enough about the problem to see all the complexities in it that justify the complexities in the solution.

I feel like, at this point in my career, and in the specific context where I work, this has become the bottleneck to my career progress.

The problem is that it works as a self-fulfilling prophecy, built on the fallacies of social proof and fundamental attribution error. People reach for the snap judgment "this is overcomplicated" because they expect me to be the kind of person who overcomplicates things. And they feel justified in that judgment because other people also tend to make it about me. This means they budget even less time for trying to understand the complexities in the problems I solve before reaching for the easy judgment of "this is overcomplicated".

So: the guy who always overcomplicates things, needlessly draining the company's cognitive resources and generally being a nuisance, ends up looking much the same from the outside as the guy who *could* be the company's strongest engineering asset. He *could* be the guy who puts the company ahead of the competition, always understanding the engineering problems more deeply and solving them more thoroughly than others, in a way the competition can't easily replicate... if only the people tasked with passing judgment could overcome biased and lazy decision-making.
The reverse is also quite true.
You can encounter a thing that just seems too slow/fiddly/buggy/etc., but it's in an area you are unfamiliar with, so you pass it off as just your misunderstanding of the thing. When you investigate further, you find there are ways people use it that avoid the problems you're having, or that pieces of the thing make a certain amount of sense the way they are.
Then as you learn more you begin to understand the underlying reason the thing is the way it is, and you realise that it's not you. Poor decisions were made in the design of the thing initially (or at least decisions which once valued the right things, but no longer do), and they have become baked into the DNA of the thing, so it can't be fixed without major changes that nobody wants to make.
It's at that point that you realize the emperor has no clothes, and certainly didn't deserve the benefit of the doubt.

I find this is particularly the case when the design flaw is especially bad: the developers end up adding complexity to work around the flaw, which has the effect of hiding it, making it a lot less obvious to the novice.
This reminds me of a quote from John Carmack that gnaws at the back of my brain during day-to-day development: "A large fraction of the flaws in software development are due to programmers not fully understanding all the possible states their code may execute in." It seems relevant to most modern inventions, too.
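A minimal illustration of that failure mode (a hypothetical example of my own, not Carmack's): the author reasoned about the inputs they expected, not every state the code can actually reach.

    # Hypothetical "unconsidered state" bug: the author pictured
    # percent in [0, 100] and never considered anything outside it.

    def apply_discount(price: float, percent: float) -> float:
        return price * (1 - percent / 100)

    print(apply_discount(10.0, 20))   # 8.0  -- the state the author imagined
    print(apply_discount(10.0, 150))  # -5.0 -- a negative price nobody considered
    print(apply_discount(10.0, -20))  # 12.0 -- silently increases the price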
The other side of this coin is that some ignorance can be beneficial to break out of local minima.

> How do you overcome schlep blindness? Frankly, the most valuable antidote to schlep blindness is probably ignorance. Most successful founders would probably say that if they'd known when they were starting their company about the obstacles they'd have to overcome, they might never have started it. Maybe that's one reason the most successful startups of all so often have young founders.

http://paulgraham.com/schlep.html
I think the harder part is determining how much you actually know about something, so you know how much you can judge.

My default nowadays is that I never know enough, even in the fields I'm specialized in. Part of it, I think, is just a limit on how much information I can accurately recall or pull up in my brain at any given time.
Sometimes the engineers are "too" clever. In my car, the seatbelts have an additional mechanism which causes them to lock if the car is braking (triggered by a certain deceleration, perhaps, or by actual detection of the brake; I haven't determined which). This seems very sensible until you live near an intersection that is not at right angles, where the driver has to lean forward on arrival to look around the A pillar for oncoming traffic.

What's interesting is that I now know something that the engineers designing the system should have known but apparently didn't, or knew and didn't care about. So sometimes (or much of the time?) users *do* know something the engineers don't.
It's always terrified me how one experience or one anecdote can completely change a person's view on a topic and make them blind to the overwhelming evidence they've previously experienced and accepted (and ignore future evidence).

A single rude salesperson and they won't set foot in the entire chain; one bad cop or doctor and the whole police force or medical establishment is corrupt. Listen to a politician giving a speech and people become lifelong supporters, even when the policies change radically.

There is something deeply embedded in the human mind that makes us very susceptible to certain stories, against all evidence, if they fit our preconceived perceptions or coincide with something in our memory.
> My willingness to judge something should be proportional to how much I know about it.

I'd also add:
...it should be proportional to my need to pass judgement, or the benefit I get from doing so.

Postpone judgement and you observe more. It's counterproductive to stamp everything in life "good/bad", "dumb/smart", "friend/foe" as soon as possible.
In the spirit of humility that this article promotes, can anyone explain to me why so many push doors have handles that stick out, plus a sign that says "push", instead of having no handle and, say, a protruding flat push surface that requires no sign and intuitively indicates "push", even for the sight-impaired?
My experience adds a bit of expansion to this 'mental rule'.

I judge something 'proportional to how much I know about it' *and* in the context of when it was developed.

Designers take things outside of the design into consideration. The design may remain alive and in use (though not necessarily for the originally intended use) long after that original, outside environment is gone.
When I was a kid I decided that test pilots must be idiots.

My logic:

Being a test pilot is dangerous. Only idiots do dangerous things. Therefore test pilots must be idiots.

Later (embarrassingly later) I learned that many test pilots were also engineers. This made me reconsider my opinion. I learned to be very careful when judging intelligence, and also the limits of inference.
Then don't judge anything as 'stupidly designed'. What does that mean? Just say "I think the design flaw is XYZ".

If I find a stupidly designed product, it is often because it was the cheap option. Spending more saves money in the long run. Water bottles spring to mind: most of them leak or get damaged by dish-washing sooner or later. Spending $20 on a water bottle is cheaper than spending $5 ten times. Although a high price isn't a guarantee of quality either!
For those who just want to see the rule:

> My willingness to judge something should be proportional to how much I know about it.

I think it’s a sound rule. Going to try to remind myself of this more.
If you are a 9-year-old developer, this makes a lot of sense. Most people in a professional situation are closer in skill level, and this kind of disparity indicates a lack of discipline in hiring. I see it all the time, so it is not unusual.
I drew a similar conclusion about people's observations in general: the faster the conclusion, the less knowledge about the topic. This is similar to Kahneman's System 1 (intuition).
I first tried out the WWW when it had fewer than 2000 pages, and didn't see the point. It's my constant reminder that if something initially looks useless to me, it's probably me.
> My willingness to judge something should be proportional to how much I know about it.

When everyday folk start doubting the safety and/or effectiveness of mRNA vaccines, this is what comes to my mind.
Obligatory: be careful you’re not standing atop Mount Stupid.

https://www.smbc-comics.com/?id=2475
I see your Chesterton's fence and raise you Gell-Mann amnesia. Yes, the phenomenon in the article is real, but the opposite is also true. Many times I have seen something and thought it was silly, wrong, or could be done better, but reserved judgement because I didn't know much about the subject. And then, in the end, I was right.

Be it some technical choice at work that had obvious flaws. Or when we were renovating our house and, as a layperson, I noticed a serious problem the experts didn't see. Or when I was reading about poststructuralism or critical theory at university: I had a feeling it was just a lot of word games around a couple of important ideas. Then I put in the work, read the books, went to the courses, and yup, that was basically true.

Looking at it from the other side, as an expert on some topics, I know there are a lot of things we do that are not justified by the "subject matter"; we just do them because we have always done them, or because a pointy-haired boss decided so. Or we have operational blindness and can't notice the flaws anymore.
It's like how most things are the way they are for a reason. That's especially to be expected within rigorous domains such as engineering, math, physics, and programming. It is not to be expected in the safe spaces of BS vendors, in subjects like politics, economics, and social science.