<p><pre><code> While I could go into a long story here about the relative merits of the
two designs, suffice it to say that among the people who actually design
operating systems, the debate is essentially over. Microkernels have won.
</code></pre>
The developers of BSD UNIX, SunOS, and many others would disagree. The then-upcoming Windows NT was also a hybrid kernel design: while it has an executive built around a "microkernel", all of the traditional kernel machinery outside that "microkernel" runs in kernel mode too, so it is really a monolithic kernel with module loading.<p>Likewise NeXTSTEP, whose Mach-based kernel eventually evolved into the XNU kernel of macOS, ran the BSD layer in kernel mode alongside Mach rather than as a user-space server. The reality is that Mach 3.0 run as a true microkernel was simply too slow, much as NT would have been had Microsoft made it into an actual microkernel.<p>In the present day, the only place where microkernels are common is embedded applications, and even there many systems run no operating system at all, or run more traditional operating systems (e.g. NuttX).
This is the thread I read in high school that made me fall in love with software architecture, primarily because Tanenbaum’s position was so obviously correct, yet it was also clear to everyone that Linux was going to steamroll the field, even at that early stage.<p>I still hand this out to younger software engineers so they can understand the true principle of architecture. A printout of it sits next to my book on how that great new operating system and SDK from Taligent was meant to be coded.
I've heard of this debate but hadn't heard it argued from a FOSS-adoption perspective. From Wikipedia on Minix [0]:<p>> MINIX was initially proprietary source-available, but was relicensed under the BSD 3-Clause to become free and open-source in 2000.<p>That is a full eight years after this post.<p>Also from Wikipedia on Linux becoming FOSS [1]:<p>> He [Linus Torvalds] first announced this decision in the release notes of version 0.12. In the middle of December 1992 he published version 0.99 using the GNU GPL.<p>So this post sits essentially right at the crossroads of Linux going from a custom license to FOSS, while MINIX would remain proprietary for another eight years, presumably long after it had lost to Linux.<p>I do wonder how much of an effect, subtle or otherwise, the licensing had in helping or hindering the adoption of either.<p>[0] <a href="https://en.wikipedia.org/wiki/Minix" rel="nofollow">https://en.wikipedia.org/wiki/Minix</a><p>[1] <a href="https://en.wikipedia.org/wiki/History_of_Linux" rel="nofollow">https://en.wikipedia.org/wiki/History_of_Linux</a>
Academically, Linux is obsolete. You couldn't publish a paper on most of it; it wouldn't be original. Economically, commercially and socially, it isn't.<p>Toasters are also obsolete, academically. You couldn't publish a paper about toasters, yet millions of people put bread into toasters every morning. Toasters are not obsolete commercially, economically or socially. The average kid born today will know what a toaster is by the time they are two, even if they don't have one at home.
> Writing a new OS only for the
386 in 1991 gets you your second 'F' for this term. But if you do real well
on the final exam, you can still pass the course.<p>what a way to argue...
There's an element of "Worse is Better" in this debate, as in many real-world systems debates. The original worse-is-better essay even predates the Linux vs Minix debate:<p><a href="https://dreamsongs.com/RiseOfWorseIsBetter.html" rel="nofollow">https://dreamsongs.com/RiseOfWorseIsBetter.html</a><p>Gabriel was right in 1989, and he's right today, though sometimes the deciding factor is performance (e.g. vs security) rather than implementation simplicity.
Ironically, it actually is, from a 2025 perspective.<p>Not only do microservices and Kubernetes all over the place diminish whatever gains Linux could offer as a monolithic kernel; the current trend of cloud-based, OS-agnostic language runtimes in serverless (hate the naming) deployments also makes whatever sits between the type-2 hypervisor and the language runtime irrelevant.<p>So while Linux-based distributions might have taken over the server room as UNIX replacements, that only matters for those still doing full-VM deployments in the style of AWS EC2 instances.<p>Also one of the few times I agree with Rob Pike,<p>> We really are using a 1970s era operating system well past its sell-by date. We get a lot done, and we have fun, but let's face it, the fundamental design of Unix is older than many of the readers of Slashdot, while lots of different, great ideas about computing and networks have been developed in the last 30 years. Using Unix is the computing equivalent of listening only to music by David Cassidy.<p>> At the risk of contradicting my last answer a little, let me ask you back: Does the kernel matter any more? I don't think it does. They're all the same at some level. I don't care nearly as much as I used to about what the kernel does; it's so easy to emulate your way back to a familiar state.<p>-- 2004 interview on Slashdot, <a href="https://m.slashdot.org/story/50858" rel="nofollow">https://m.slashdot.org/story/50858</a>
It’s always heralded as a great CS debate, but Tanenbaum’s position seems so obviously silly to me.<p>Tanenbaum: Microkernels are superior to monolithic kernels.<p>Torvalds: I agree, so go ahead and write a production microkernel…
Here's the debate in a single compressed text file.<p><a href="https://www.ibiblio.org/pub/historic-linux/ftp-archives/sunsite.unc.edu/Sep-29-1996/docs/misc/linux_is_obsolete.txt.z" rel="nofollow">https://www.ibiblio.org/pub/historic-linux/ftp-archives/suns...</a>
The realization that in 2058 some people will be reading comments from 2025 Hacker News threads and will feel amused at all the things we were so confidently wrong about.<p>;)
The Linus response is pretty great -> <a href="https://groups.google.com/g/comp.os.minix/c/wlhw16QWltI/m/P8isWhZ8PJ8J" rel="nofollow">https://groups.google.com/g/comp.os.minix/c/wlhw16QWltI/m/P8...</a>
Linux <i>is</i> obsolete. The main thing it has going for it is that it isn't actively hostile to its users like the alternatives. It's also somewhat hackable and open, for those technically inclined enough. And unlike its alternatives it's (slowly but surely) on a positive trajectory... and that's not something anyone says about Windows or Mac.<p>> How I hated UNIX back in the seventies - that devilish accumulator of data trash, obscurer of function, enemy of the user! If anyone had told me back then that getting back to embarrassingly primitive UNIX would be the great hope and investment obsession of the year 2000, merely because its name was changed to LINUX and its source code was opened up again, I never would have had the stomach or the heart to continue in computer science.<p>> Why can’t anyone younger dump our old ideas for something original? I long to be shocked and made obsolete by new generations of digital culture, but instead I am being tortured by repetition and boredom. For example: the pinnacle of achievement of the open software movement has been the creation of Linux, a derivative of UNIX, an old operating system from the 1970s. It’s still strange that generations of young, energetic, idealistic people would perceive such intense value in creating them. Let’s suppose that back in the 1980s I had said, “In a quarter century, when the digital revolution has made great progress and computer chips are millions of times faster than they are now, humanity will finally win the prize of being able to write a new version of UNIX!” It would have sounded utterly pathetic.<p>- Jaron Lanier
> Be thankful you are not my student. You would not get a high grade for such a design :-)<p>Further proof that computer "science" is a nonsense discipline. ;-)<p>The World Wide Web was invented at CERN, a particle physics laboratory, by someone with a BA in physics, who later got the Turing Award, which computer scientists claim is somehow equivalent to a Nobel Prize.<p>Prof. Tanenbaum (whose degrees are also in physics) wasn't entirely off base though: Linux repeated Unix's mistakes and compromises (many of which were no longer necessary in 1992, let alone in 2001 when macOS recycled NeXT's version of Unix), and we are still suffering from them decades later.
I don’t think Tanenbaum’s distinction between micro-kernel and monolith is useful or important. He has the monolith as a single binary running as a single process, and the micro-kernel as multiple binaries/processes.<p>But either way these both boil down to bytes loaded in memory, being executed by the CPU. The significant thing about a microkernel is that the operating system is organized into functional parts that are separate and only talk to each other via specific, well-defined channels/interfaces.<p>A microkernel uses processes and messages for this, but that’s hardly the only way to do it; it can certainly be done with a bunch of units that happen to be packaged into the same file and process: C header files to define the interfaces, the C ABI to structure the channels, .c files for the separate pieces.<p>Of course you could do that wrong, but you could also do it right (and, of course, the same is true of processes and messages).<p>A process, btw, is an abstraction implemented by the OS, so microkernel or not, the OS is setting the rules it plays by (subject to what the CPU provides/allows).
I have no idea how they think IPC can be as quick as an in-process call. I do it pretty quickly with memory mapping (shared memory between data providers and consumers), but even after 30 years it still carries at least an order of magnitude more overhead than a concurrent in-process queue.<p>Tanenbaum must have felt threatened by the growing Linux community to throw flamebait like this.
> As a result of my occupation, I think I know a bit about where operating systems are going in the next decade or so<p>I’m not sure the former necessarily qualifies you for the latter… there always seems to be a lot of arrogance in these circles.
I don’t think enough credit is given to the role the GPL played in making Linux successful. The more liberal BSD-style licenses resulted in every hardware maker selling its own slightly incompatible fork of UNIX, whereas the GPL forced everyone to unite behind a single code base, which is what you want for an operating system.
Like most nerds, your blind spot is an inability to be pragmatic. In the real world, the "technically better solution" does not trump the thing that is widely adopted on merit and mature enough to have been stable and reliable for decades, a no-brainer standard that comes with almost no risk.
A comment in the group caught my attention:<p>> There are really no other alternatives other than Linux for people like
me who want a "free" OS.<p>Wait a minute. What about FreeBSD?<p>[Update: Never mind. I realized later this thread was written about a year before FreeBSD was first released.]
That hasn’t aged well because the microkernels of the day like Mach failed to keep their promises. There are newer ones like the L4 family that were designed specifically for performance, but they have not been deployed as a base for a full-featured OS like Mach was for macOS or OSF/1, where IPC was too slow and the OS server was glommed to the microkernel, making it an even ungainlier monolith. Just another illustration of academic theory vs industrial practices.
The first GNU/Linux I installed was BLAG on a ThinkPad 770, replacing Windows 98 SE. For a week I tried to get sound working, trying every recipe I found and downloading forum threads on the topic in bulk to research the issue. But not even crickets. One night I woke up for some reason and looked through the saved forum posts, and in one of them someone had posted a long, cryptic command that had worked for them. I typed it in, hit Enter, no message, the prompt just returned, and the sound worked. <i>That kind of cured me.</i>
Great reminder that there's more to adoption than just theory on paper; the practicalities, communities and a little bit of inexplicable magic are how new tech really takes off.
I still think what I wrote about this last year in a talk I gave on Linux is pretty good. I did about 5 months of research full-time on it. You can read the whole thing here: <a href="https://siliconfolklore.com/scale/" rel="nofollow">https://siliconfolklore.com/scale/</a><p>"
Things like paradigmatic ways of doing open source software development took 20 years to dominate because the longevity and applicability of the more abstract solutions is on the same time frame as their implementations. But within that exists lower-level Maslovian motivations.<p>And keeping things there makes them more actionable. Let’s say your network card isn’t sending out packets. We can say this bug is known, agreed upon, and demonstrable. So although it may not be easy, the labor path is traversable.<p>A new network card comes out, you need it to work on Linux. That’s a need. You can demonstrate and come to an agreement on what that would look like.<p>Pretend you want that network card to do something it wasn’t designed to do. That’s harder to demonstrate and agree upon.<p>To get that actionable you need to pull the desire into the lower curve so that a development cycle can encompass it.<p>VVV here's where it comes in VVV<p>It’s worth noting the Tanenbaum-Torvalds debate from 1992 to illustrate this. Tanenbaum chastised Torvalds' approach because it wasn’t a microkernel and Linux was exclusive to the 386. Really Linus was in these lower curves and Tanenbaum was trying to pull it up to the higher curves where things move far slower. That’s where the hot research always is - people trying to make these higher level concepts more real.<p>GNU/Hurd is a microkernel approach. 
Stallman claimed in the early 2000s that’s why it was taking so long and wasn’t very stable.<p>The higher level curves are unlikely to succeed except as superstructures of the lower level functions, in the same way that our asymptotic approach to platonic ideals happens on the back of incrementally more appropriate implementations, which is why you can snake a line from 1950s IBM SHARE to GitHub.<p>Through that process of clarifying the aspirations, they get moved to the concrete as they become material needs and bug problems.<p>The clarity of the present stands on both the triumph and wreckage of the past. For example, the Mach 3 micro-kernel led to Pink, NeXTSTEP, Workplace OS, Taligent, and eventually XNU, which is part monolithic and is now the basis for macOS. To get there that curve burned over a decade through multiple companies and billions of dollars. Also the OSF group I mentioned before had a Mach-BSD hybrid named OSF/1. Apple was going to use it in an alliance with IBM but that got canceled. It went on to become Tru64, whose last major release was in 2000, 24 years ago, to add IPv6 support.<p>How’s that transition going?"
For those interested in where Tanenbaum ended up, he co-authors the electoral-vote.com website these days. I used to be a pretty regular reader until Trump won.
> My real job is a professor and researcher in the area of operating systems.<p>> As a result of my occupation, I think I know a bit about where operating systems are going in the next decade or so.<p>The gap between industry and academia must have been less well recognized at this stage. I think of PL researchers today, most of whom would not confidently assert they know the way programming languages will go—they'd instead confine themselves to asserting that they know where PLs <i>ought</i> to go, while acknowledging that the industry doesn't tend to care at all what PL researchers think a PL should look like.<p>One thing I'm curious about is <i>why</i> the industry-academia gap is so large. Is this true in other disciplines? I'd expect some baseline level of ivory-tower effect in any industry, but I'd also expect there to be a significant number of people who actually do cross the gap and make an effort to study the way things actually work rather than just the way they theoretically <i>ought</i> to work.<p>Where are the OS researchers who research why Linux won? Where are the PL researchers who study what makes a great industry language?