The prerequisite for "mental liquidity" is articulated by Aristotle: "It is the mark of an educated mind to be able to entertain a thought without accepting it." Entertaining the thought gives you the chance to try out a new belief network. If you find your belief network would be strengthened by its inclusion, you adopt it. Otherwise, you reject it. In this way, one's interconnected set of beliefs grows monotonically stronger. And this is right and good.<p>EDIT: got downvoted! I would love love love to know why! Not offended, just curious.
One of the biggest beliefs I keep struggling with is the need to be perfect. I've been jamming away for many weekends on a side project that was, for all practical purposes, done. I just kept adding tiny tweaks left and right until, just now, I finally launched it (<a href="https://amee.la" rel="nofollow noreferrer">https://amee.la</a>).<p>Nothing groundbreaking, and in the end nothing that needed so much perfectionism.<p>The belief that something has to be perfect is one of the strongest I see among founders here on HN and elsewhere. It's almost always bad. I have zero examples where it ended up being good. Yet, even though the facts are clear, it's extremely hard to overcome.
I just finished the Einstein biography by Walter Isaacson and found Einstein's stubbornness quite entertaining. He knew about this trait, accepted it as an effect of ageing, and even joked about it. He simply disliked some facts about quantum mechanics and allowed himself to pursue a rather fruitless endeavour for many years. He knew that this kind of stubbornness would kill the career of a younger scientist, but he could afford it. In that sense, too, he contributed to science.
It’s easy to forget how difficult learning is, for us as individuals and as flocks in formation. Pick any topic and it likely took you years to learn well. So simply switching out beliefs embedded in that topic requires overwriting years of patterns and synapses in sync.<p>This is where Kuhn is so helpful: he shows that even scientists have immense difficulty, if not vigorous myopia, when stuck with wrong beliefs. Paradigm shifts by funeral come more easily over decades than getting scientists to evolve their models.
> It might sound crazy, but I think a good rule of thumb is that your strongest convictions have the highest chance of being wrong or incomplete, if only because they are the hardest beliefs to challenge, update, and abandon when necessary.<p>I strongly disagree with this, unless we are only talking about beliefs that are about facts of the universe.<p>For example, my strongest belief is that all people have an equal right to exist and pursue their own purpose... this is not a belief about the facts of the universe, but about my own morality. I don't think it has a chance of being 'wrong'.
The best way to test your "mental liquidity" is to think about some hypotheses that are outside the "Overton window" or even outright taboo.<p>"What if ***** were true? Surely it can't be true. If it were, that would be terrible."<p>That's motivated reasoning. Remember that the truth of any hypothesis is not influenced by how much you want it to be true, or false. Some hypotheses are deeply uncomfortable, but you should nonetheless strive to believe the truth. Or rather, what is best supported by the evidence. Even if it hurts.
Take a die with six to twenty sides and assign a belief system/worldview to each number.
Roll the die twice: first for the belief system/worldview, then for the number of months you live by it.
Of course, you can vary the parameters according to your taste and courage.
But it is important to persevere, so you'd better start small.
I call it Rhinehartian chaotic paradigm shift.
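For the procedurally minded, the ritual above can be sketched in a few lines of Python. The worldview list here is purely illustrative; assign your own to the faces of your die:

```python
import random

# Hypothetical worldviews, one per die face (illustrative only; use your own).
WORLDVIEWS = ["stoicism", "utilitarianism", "existentialism",
              "taoism", "pragmatism", "absurdism"]

def rhinehartian_shift(max_months=6, rng=random):
    """Roll once for a worldview, once for how many months to live by it."""
    belief = rng.choice(WORLDVIEWS)      # first roll: which worldview
    months = rng.randint(1, max_months)  # second roll: duration (start small)
    return belief, months

belief, months = rhinehartian_shift()
print(f"Live by {belief} for {months} month(s).")
```

A d6 and a wall calendar work just as well, of course.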
Dice Man goes chaos magic.
I think this article is a little too overzealous in trying to simplify a topic like beliefs and ideas.<p>A lot of it also sounds like common sense to me; the people capable of grasping this:<p>> Be careful what beliefs you let become part of your identity.<p>are quite capable of adjusting themselves.<p>Everything else falls into either ego or people being self-(un)aware, and for the latter you can only change "their" belief system if they themselves are willing to change.
> <i>Be careful what beliefs you let become part of your identity.</i><p>“I have a tight enough knowledge and grasp of my beliefs to intentionally control my sense of identity” is a fascinating belief to turn into an identity.
One approach to preserving mental fluidity is to not get emotionally attached to ideas. This was expressed by Richard Feynman in his 1979 lectures on quantum electrodynamics, available here:<p><a href="http://www.vega.org.uk/video/subseries/8" rel="nofollow noreferrer">http://www.vega.org.uk/video/subseries/8</a><p>> Q: "Do you like the idea that our picture of the world has to be based on a calculation which involves probability?"<p>> A: "...if I get right down to it, I don't say I like it and I don't say I don't like it. I got very highly trained over the years to be a scientist and there's a certain way you have to look at things. When I give a talk I simplify a little bit, I cheat a little bit to make it sound like I don't like it. What I mean is it's peculiar. But I never think, this is what I like and this is what I don't like, I think this is what it is and this is what it isn't. And whether I like it or I don't like it is really irrelevant and believe it or not I have extracted it out of my mind. I do not even ask myself whether I like it or I don't like it because it's a complete irrelevance."<p>I think that's critical, because if you become emotionally involved with promoting an abstract idea, it becomes part of your personal identity or self-image, and then changing your mind about it in the face of new evidence becomes very difficult if not impossible.<p>In another lecture, Feynman also said something about not telling Nature how it should behave, as that would be an act of hubris or words to that effect, you just have to accept what the evidence points to, like it or not.<p>(Changing your mind about what's morally acceptable, socially taboo, aesthetically pleasing etc. is an entirely different subject, science can't really help much with such questions.)
> A question I love to ask people is, “What have you changed your mind about in the last decade?” I use “decade” because it pushes you into thinking about big things, not who you think will win the Super Bowl.<p>This is a great question. And "decade" is a good time frame not only because of size but because it's a long enough time frame there's a better chance people will have good answers.<p>The Dee Hock quotes (“A belief is not dangerous until it turns absolute” and “We are built with an almost infinite capacity to believe things because the beliefs are advantageous for us to hold, rather than because they are even remotely related to the truth”) are great too.
> Changing your mind is hard because it’s easier to fool yourself into believing a falsehood than admit a mistake.<p>Different angle: it's not simply "fooling" oneself; it's that ideas are, one way or another, built on top of an ideological foundation.<p>Einstein rejecting quantum theory on the basis that the universe shouldn't have a random component was also a rejection of having to re-examine all philosophy since Descartes and Newton, which aligned so well with society's viewpoint at the time: a deterministic, cause-and-effect universe, where things have logical explanations and where hard work is rewarded.
This article matches my own life experience. Rather than "what have you changed your mind about in the past decade," I use "in your whole life." Speaking personally, there are only two big things I've changed my mind about. I'm working on a third... I wish the article had included something in the vein expressed by Charlie Munger, which is a 'how-to' for intellectual integrity:<p>"I never allow myself to have an opinion on anything that I don't know the other side's argument better than they do."
Mental Liquidity is another way of thinking about "Psychological Flexibility," which is the subject of a huge amount of clinical research. There's an entire therapeutic framework called Acceptance and Commitment Therapy (ACT) which came out of this research.<p>Check out this article [0] for a description of ACT from a founder's perspective.<p>[0] <a href="https://every.to/no-small-plans/how-to-do-hard-things" rel="nofollow noreferrer">https://every.to/no-small-plans/how-to-do-hard-things</a>
> Most fields have lots of rules, theories, ideas, and hunches. But laws – things that are unimpeachable and cannot ever change – are extremely rare.<p>This sounds like a rehash of Popperian epistemology. We should look forward to disproving existing theories (finding new problems), because it leads to new, better theories.
Brings to mind Robert Pirsig's 'value rigidity' concept: 'an inability to revalue what one sees because of commitment to previous values.' I don't remember if there was a term for the opposite, but 'flexibility' seems to be right.
I like this nice little text. Einstein is a perfect example. I think we should be very forgiving here for two reasons: first, Einstein was one of the people who established quantum mechanics. He also got the Nobel Prize for his work on the photoelectric effect. Second, even the brightest minds have only a narrow time frame before mental ability starts to decline. So we cannot expect one brain to dig deep into general relativity and, at the same time, into something completely different like QM.
Surprisingly, Einstein even contributed to QM in old age by trying to poke holes in the theory; some of those supposed holes (e.g., "spooky action at a distance") later proved to be real.
Of course we must reference Isaiah Berlin's "The Hedgehog and the Fox" here.<p>A great essay in this area is Venkatesh Rao's "The Cactus and the Weasel": <a href="https://www.ribbonfarm.com/2014/02/20/the-cactus-and-the-weasel/" rel="nofollow noreferrer">https://www.ribbonfarm.com/2014/02/20/the-cactus-and-the-wea...</a>
It seems like this is a term for the ability to avoid sunk cost fallacy ( <a href="https://www.scribbr.com/fallacies/sunk-cost-fallacy/" rel="nofollow noreferrer">https://www.scribbr.com/fallacies/sunk-cost-fallacy/</a> )<p>The link contains a number of reasons why people get trapped in sunk cost fallacy.
Perhaps unexpectedly, I find that thoughtful engagement with religion (Judaism in my case) has helped me become much more liquid on other topics.<p>When you accept on faith a handful of principles that deal with an unknowable domain, it becomes much easier to be less attached to the other stuff.
> Albert Einstein hated the idea of quantum physics.<p>Einstein came up with most of what physicists now recognize as the essential features of quantum physics. He was not anti-quantum; he just believed randomness could not be a fundamental feature of nature.
In my experience, this attribute is an absolutely critical part of successfully building culture at an early-stage startup, and you have to be ruthless about culling those who are not willing to give it a try, never mind master it.
I have had to have an open mind.<p>Long story. Lots of tears. Get your hanky.<p>It's served me well, in my technical work.<p>I now do a lot of stuff that I used to scoff at.
I think about these questions very often, but I don't feel like going on a long rant about it from a philosophical perspective. I will instead give an anecdote:<p>I think from my teens to my early 20s my political stance changed dramatically, and at any one point in time I would think that whatever I held to be true I would continue to in the future. But what always changed my belief system was not encountering some new piece of information that changed my idea or made me "update my priors" (in the crude Bayesian system, a most despicable philosophy of our era). It was always something that radically changed <i>how</i> it was that I understood the world around me, something that made my way of thinking about things shift so dramatically that I had to abandon my old ideas. I think everyone should read Marx, Nietzsche, and Freud for that reason, even if you think they are heinous and evil, because they radically question the logic and order of society and knowledge, and their writings are deeply disturbing to many for that reason.<p>What changes people's perspectives is generally what people want to avoid (to the author's point). And the more you want to avoid something or "prove it wrong," oftentimes the more it changes the way you think about the world.