Pardon the cult-like musings ahead of time, but it's par for the course.<p>My initial thought was that rationalism is obviously egoic and self-obsessed: loving and trusting one's own thoughts. Set theory should tell you that you can't make a mental image of yourself to act by that is more encompassing than the totality of what you are. You can't build a mental model inside your ego that will work better than your natural instinct for all interaction with reality. Trust sheer emotion to say that rationalizing any loss of life means an upside-down philosophy, a castle in the sky. This cult, with its "functional decision theory" ["…that the normative principle for action is to treat one's decision as the output of a fixed mathematical function"],
makes actions a sort of cold choice without emotion. Like people using religion in war to stay cool while killing: a misuse of a neutral idea such as a mathematical function.<p>But it can't be that easy to handwave away. When Aum Shinrikyo was mentioned down below, I changed my mind; there are no easy answers. A sick leader can justify anything, and you can judge any tree by its fruits. From the Doctrine section of that Wikipedia article: "Their teachings claimed a nuclear apocalypse was predicted to occur soon" (parallel to AI now in rationalism), and "Furthermore, Lifton believes, Asahara 'interpreted the Tibetan Buddhist concept of phowa in order to claim that by killing someone contrary to the group's aims, they were preventing them from accumulating bad karma and thus saving them'" (parallel to rational behavior guidance gone wrong; these data scientists just lost touch. Norm Macdonald would say they're real jerks, pardon the humor).<p>Just the other day I listened to Eckhart Tolle's podcast where he talked about doomsday fear. At the bottom of the transcript it says: ["There's also an unconscious desire in many human beings for what we could call the collapse of the human-made world.<p>Because most humans experience the world as a burden. They have to live in this world, but the world is problematic. The world has a heaviness to it.<p>You have your job, you have the taxes, you have money, and the world is complex and heavy. And there's an unconscious longing for people in what we might bluntly call the end of the world. But ultimately, what they are longing for is, yes, it is a kind of liberation, but the ultimate liberation that they are really longing for is the liberation from themselves, the liberation from what they experience as their problematic, heavy sense of self that's inseparable from the so-called outer world.<p>And so there's a longing in every human for that. But that's not going to happen yet."]
Eckhart Tolle: Essential Teachings, "Special: Challenging Times Can Awaken Us," 30 Jan 2025.<p>An obvious parallel to AI doomsaying can be drawn.<p>When we were children we experienced unfiltered reality, without formulas to replace our decisions. Even then we could be wrong, stupid, or convinced to do stupid shit by a charismatic playground bully. But when we were wrong, it resulted in falling and scraping a knee or whatever. There are no reality checks in internet culture bubbles.<p>This is sick people huddling together under a sick, charismatic, warlord-ish leader whose castle in the sky is so self-coherent that it makes others want to systemize it, aided by the brainwashing methods. ["Zizians believe normal ideas about morality are mostly obfuscated nonsense. They think real morality is simple and has already been put to paper by Jeremy Bentham with his utilitarian ideas. Bentham famously predicted an expanding scope of moral concern. He says if humanity is honest with itself it will eventually adopt uncompromising veganism. Zizians think arguments which don't acknowledge this are not about morality at all, they're about local struggles for power delaying the removal of an unjust status quo."] <i>Insert Watchmen pic of grandiose narc Adrian Veidt asking Dr. Manhattan if utilitarian mass killing was the right choice</i><p>And then the sleep-deprivation indoctrination method dulls their rationality even further, so they can all become ego clones of the cult leader.<p>And that other link in this thread mentioned other groups of rationalists debugging demons sent by adversary groups and other psychotic stuff. Is it the chicken or the egg: do those people gather in a place where people loop with their minds, or is it the mind-looping that sends them into a downward spiral? Maybe we should calculate the Bayesian.