This is the money quote:<p>> The second dimension to the problem is that no two Linux distributions agreed on which core components the system should use.<p>Linux on the desktop suffered from a lack of coherent, strategic vision, consistency and <i>philosophy</i>. Every engineer I know likes to do things a particular way. They also have a distorted view of the level of customization that people want and need.<p>I like OSX. Out of the box it's fine. That's what I want. I don't want to dick around with window managers or the like. Some do and that's fine, but almost no one really does.<p>Whereas Windows and OSX can (and do) dictate a top-down vision for the desktop experience, Linux can't do this. Or maybe there's been no one with the drive, conviction and gravitas to pull it off? Who knows? Whatever the case, this really matters for a desktop experience.<p>I have two monitors on my Linux desktop. A month ago full-screen video stopped working. Or I guess I should say it now appears centered across the two screens, so it's unusable. I have no idea why. It could be an update gone awry. It could be corp-specific modifications. It could be anything. But the point is: <i>I don't care what the problem is, I just want it to work</i>. In this regard, both Windows and OSX just work. In many others too.<p>I can't describe to you how much torture it always seems to be to get anything desktop-related to work on Linux. I loathe it with a passion. I've long since given up any idea that Linux will ever get anywhere on the desktop. It won't. That takes a top-down approach, the kind of thing that anarchies can't deliver.
I think some of this is perceptive. It's true that the attempt by both Canonical (Unity) and Red Hat (Gnome 3) to sort-of-incompatibly break away from the <i>so close to standard that it hurts to type this</i> Gnome 2 environment did a lot more harm than good, at least as far as platform adoption goes.<p>And clearly OS X is an extremely polished Unix and is going to appeal to the more UI-focused of the hacker set. And Miguel is definitely among the most UI-focused of the hacker set. He's also an incorrigible "platform fan". Much of his early work was chasing Microsoft products and technologies, of course; now he's an iPhone nut apparently, and that doesn't really surprise me.<p>But at the same time the Linux desktop was never really in the game. I use it (Gnome 3 currently) and prefer it. Lots of others do. For many, it really does just work better. But in a world where super-polished products are the norm, a hacker-focused suite of software isn't ever going to amount to more than a curiosity. (And again, I say this as someone who will likely <i>never</i> work in a Windows or OS X desktop.)<p>So in that light, I think the idea that the Linux desktop got "killed" is sort of missing the point. It's no more moribund now than it was before. It's more fractured in a sense, as the "Gnome" side of the previous desktop war has split into 3+ camps (Unity, Gnome 3 and Gnome2/Xfce, though there are other splinter camps like Mint/Cinnamon too). But it's here and it works, and it's not going anywhere. Try it!
I've used Linux for years and have never had these problems. I think the issue is that getting everything working requires a deep understanding of each component and the system as a whole. If you just follow advice on forums, you will make things worse, because you're doing things you don't understand to a system that you don't understand. That's not going to lead to success. You need to be able to think critically about what's wrong and what needs to change, and then execute those changes. No, that's probably not worth doing if you already like Windows or OS X. If you don't, though...<p>(And, there are of course Linux-based systems that were built by someone controlling the whole experience, and those work really well. Android and ChromeOS come to mind, though those aren't really <i>desktops</i> per se.)<p>The other day, someone here was complaining about udev. It has ruined Linux forever, or something. I have a different experience: udev has made my life very easy. I have a rule for each device I care about, and that device is automatically made available at a fixed location when it is plugged in. For example, I have a rule that detects a microcontroller that is waiting to be programmed with avrdude in avr109 mode and symlinks the raw device (/dev/ttyUSB<whatever>) to /dev/avr109. I then have a script that waits for inotify to detect the symlink, and then calls avrdude to program the microcontroller. A few lines of shell scripting (actually, it's in my Makefile), and I can just plug in a microcontroller, press the programming button on it, and everything just works. No screwing around with figuring out which device address it's assigned to. How do you do <i>that</i> in Windows?
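For the curious, the pieces fit together roughly as below. This is a minimal sketch rather than the parent's actual setup: the vendor/product IDs, the MCU part number and the firmware filename are all placeholders you would fill in from lsusb and your own project.

    #!/bin/sh
    # Sketch of the udev + inotify + avrdude workflow described above.
    # Replace xxxx/yyyy with the IDs reported by `lsusb` for your bootloader.

    # 1. udev rule: when the avr109 bootloader's serial port shows up,
    #    create a stable symlink at /dev/avr109.
    sudo tee /etc/udev/rules.d/90-avr109.rules >/dev/null <<'EOF'
    SUBSYSTEM=="tty", ATTRS{idVendor}=="xxxx", ATTRS{idProduct}=="yyyy", SYMLINK+="avr109"
    EOF
    sudo udevadm control --reload-rules

    # 2. Wait (via inotify) for the symlink to appear, then flash the part.
    until [ -e /dev/avr109 ]; do
        inotifywait -qq -e create /dev        # from the inotify-tools package
    done
    avrdude -c avr109 -p atmega32u4 -P /dev/avr109 -U flash:w:firmware.hex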
JWZ identified the issue Miguel discusses in this post ten years ago; he even gave it a name: CADT<p><a href="http://www.jwz.org/doc/cadt.html" rel="nofollow">http://www.jwz.org/doc/cadt.html</a><p>Also, part of what killed the Linux desktop was Miguel and his total lack of understanding of the Unix philosophy, which drove him to create abominations like Bonobo. D-Bus is not much better either.<p>That he fell in love with an iPhone goes to show he didn't fully appreciate the value of open source either.<p>We were just yesterday commenting with some friends in #cat-v about how Evolution is one of the worst pieces of software ever created, and Evolution is supposedly considered by Miguel and co to be the epitome of the Linux desktop.
Speaking as someone running a Linux desktop (and writing this on one), there's not much to say other than I agree. I run Linux because work gave me a PC and there's no way I can write software on Windows. Of course we all have servers managed off in the corporate cloud somewhere that run ssh/vnc etc., but there's no way I wanted to install PuTTY again or miss out on the Unix commands that make (work) life more enjoyable, so I installed Linux. I write server software, and client software sometimes, but browsers make the operating system pretty much moot. There's more variation between browsers than between operating systems - mobile aside. And when I need to try something on Wintel I spin up a cloud instance and use vnc.<p>When I'm not on Linux I run OSX everywhere else (and iOS), because it's Unix-like (it is) and because it works so well. I am sure Windows 7 and 8 are great, but I doubt they have gotten rid of C: or \ as the path delimiter or any of the other nonsense that DOS introduced (copied from PIP) back in the dark ages. Why should they? MSFT still runs DOS apps, so they aren't going to change, and choosing between OSX and Linux on a non-work desktop is a no-brainer: Netflix, Photoshop, etc., etc...
I think the article does a great job of explaining the problem, but doesn't explore the ramifications far enough.<p>Let me give an example: a few months ago, a new version of Skype was announced for Linux. I was excited, since I used Skype 2 for Linux but then it stopped working for me and I couldn't be bothered to fix it. But if you go to the Skype for Linux download page, you will find a few downloads for specific distros, then some tar files which are, statistically speaking, <i>guaranteed</i> not to work.<p>Long story short(er), I still don't have Skype working on my desktop, because my distro isn't in the list, I can't get one of the other distro packages to work on my system, and of course none of the statically-linked binaries work.<p>(I could almost certainly get it to work if I was willing to install 32-bit binary support. But it's 2012. If your app requires me to install 32-bit binary support, I don't need your app <i>that</i> badly.)<p>Steam for Linux, recently announced by Valve, will run into the same problem. I suspect it will actually be Steam for Ubuntu and Debian, possibly with a version for Fedora, assuming you have the proper libraries installed and are using the right sound daemon and graphical environment.<p>But if big-name software comes out for Linux, hopefully distros will get in line. Do you want to be that distro which can't run Steam? Doesn't really matter if you think that OSSv4 is superior to ALSA and PulseAudio...if Steam requires the latter, you will toe the freaking line, or disappear into obsolescence.
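(For what it's worth, "installing 32-bit binary support" on a Debian-family distro of that era generally boiled down to something like the following. This is a sketch from memory rather than a recommendation, and the exact package names vary by release:)

    # Enable the i386 architecture and pull in 32-bit compatibility libraries
    # (multiarch landed around Ubuntu 12.04 / dpkg 1.16.2).
    sudo dpkg --add-architecture i386
    sudo apt-get update
    sudo apt-get install ia32-libs   # or just the specific :i386 libraries the app needs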
I'd agree with a whole lot of what's said here, but also add:<p>One of the big thrusts of the Linux desktop wasn't simply dominance itself, but for it to simply <i>not matter</i> what you were using on the desktop. The Linux desktop fought to produce the first cracks in Windows hegemony a decade ago, but the final push came from the rebirth of Apple and the rise of the smartphone.<p>Today people happily do their normal productive or recreational tasks from a variety of computing environments: Windows, GNOME, Unity, KDE, OS X, iOS, Android, et al. Probably the majority of (Western) web users use at least one non-Windows internet device.<p>During the golden age of the Linux desktop everything seemed predicated on reaching exactly this point -- that you wouldn't <i>need</i> Windows, and then, by virtue of competing on a more level playing field, the Linux desktop would ascend.<p>But the Linux desktops didn't "skate to where the puck is going" -- or their attempts at such missed the mark. By the time we reached the post-Windows-dominance era, the Linux desktops weren't positioned to take advantage of the new playing field dynamics. The rest of the industry isn't even all that concerned with the desktop wars anymore. It stopped mattering very much -- and ironically, that came back to bite the very projects that got the ball rolling in the first place.
I never understood this. Why would market share of Linux on the desktop matter? I've always viewed Linux on the desktop as something for power users and developers, and thousands of said power users and developers are continually developing and maintaining multiple distros and thousands of applications. It's not like it's a stale and abandoned paradigm that's left to die.
> (b) incompatibility across Linux distributions.<p>This is completely missing the point - a statically compiled end-user binary should be compatible across <i>all</i> distributions of Linux, using the same version of the kernel or <i>any</i> newer version.<p>The only caveats to that are (a) hardware and (b) poorly-packaged software.<p>(A) is the fault of hardware manufacturers and is increasingly not an issue these days anyway; driver issues are becoming increasingly rare.<p>(B) is easy to solve for any open-source software, as it is the responsibility of the community for that distribution to provide the appropriate packaging. They <i>prefer to do it themselves</i>. And they're good at it - it gets done!<p>If you want to ship a closed-source binary on Linux, just make sure you don't dynamically link it against any libraries that you don't also ship with the binary. Problem solved.<p>Honestly, I can't remember <i>one single instance ever</i> where I have run into end-user software that will run on one distribution of Linux and not another, as long as that principle was followed.
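To make the "don't dynamically link against anything you don't ship" point concrete, here is a rough sketch of the two options. The file names and library name are made up for illustration, and fully static linking against glibc has its own caveats (musl or similar is often used for truly static builds), so treat this as a sketch, not a recipe.

    # Option 1: a fully static binary -- nothing to resolve at load time.
    gcc -static -o myapp main.c
    ldd ./myapp                 # should report "not a dynamic executable"

    # Option 2: dynamic linking, but ship the .so files next to the binary and
    # point the loader at them with a relative rpath.
    gcc -o myapp main.c -L./libs -lfoo -Wl,-rpath,'$ORIGIN/libs'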
Software compatibility in OS X?<p>A lot of applications break on newer versions of Mac OS X. That's why there are websites like <a href="http://roaringapps.com/apps:table" rel="nofollow">http://roaringapps.com/apps:table</a><p>Also, there are a lot of "transitions" that Apple loves doing: PowerPC -> Intel. Java -> Objective-C. Carbon -> Cocoa. 32-bit -> 64-bit. Access everything -> Sandbox.<p>See also the Cocoa docs: "method X introduced in 10.5. Deprecated in 10.6".<p>I have a few devices that don't work in 10.8.<p>Basically, what I'm saying is that OS X is a <i>bad</i> example of backward compatibility. Windows is much better at this. Open source software is much better at this.
What killed it is that it didn't have a huge, multi-billion-dollar company betting on it (on the desktop) the way Microsoft and Apple did; even Apple with its billions is still at around 5% market share worldwide, so having 1% is still a great accomplishment when you consider that it had no support from huge corporations.<p>Now take the mobile world for example: Linux on mobile had been around for a decade, but it never really took off until a huge company like Google decided to throw its billions of dollars and its great ingenuity at the task. Getting an OS to be popular is just incredibly difficult, and it needs way more than just good driver support and/or good software. It needs marketing, talking to manufacturers, dedicated and well-paid devs, designers, UI and UX professionals, sales, R&D and so on and so forth.<p>Focusing on the technicalities of drivers and APIs is typical of us devs, but it has little to do with why Linux didn't take off on the desktop. Sure, Linux failed because it couldn't do some or all of that well, but why couldn't it? Because it didn't have a huge and focused company pushing for it. How many popular desktop OSes are there? Only two. I think that's enough to show that it's incredibly hard to get into that market and that only a huge company can make it. Also, let's not forget that Windows was good enough and there was not much Linux could do to attract users; in fact this is still true and probably why even OS X is still at 5%: Windows is good enough and it's the de facto standard used by 90%+. Having the best UI and UX in the world like OS X doesn't help that much either.
I'm typing this on my work laptop running a Linux desktop (Ubuntu, FWIW). Our engineering servers at work run Linux and, as a convenience, have the desktop installed. As many of my co-workers run Linux desktops as OS X desktops (and the engineers running OS X or Windows have VMs running Linux... desktops).<p>When I go home, I'll be using my personal laptop running Linux. My wife and kids run a netbook with a Linux desktop.<p>The Linux desktop may be dead to Miguel, but it works just fine for me, a lot of other people in my life, and a lot of people in the world.<p><shrug>
Yeah, right, because OSX cares so much about backwards compatibility. They care so much that they actively go out and intentionally break APIs, like say when CGDisplayBaseAddress() stopped working in Lion, breaking fullscreen in every single SDL-based game (and by "breaking", I mean the game will actually crash when attempting to enter fullscreen.)
Arguing about the niceties of the UI is all well and good, but the actual problem is far more fundamental.<p>What killed the Linux desktop? Drivers. Mostly graphics drivers, but some others as well. Who cares if the UI isn't ideal if the damn thing can't sleep and wake up properly, or if it spazzes out every time I plug in an external monitor?
I used to be really into the whole free software thing, but have mellowed with age.<p>However, no way in hell anyone will get me to switch to Mac OS. I am simply too enamored with having an environment that I can hack on if it strikes my fancy, as well as an environment that I can customize how I want it. Despite all its flaws, it still does focus follows mouse pretty well, and not having that would drive me batty.<p>Also, Apple is an 800 pound gorilla that has always been about Being In Control. The Samsung lawsuit wasn't anything new:<p><a href="http://en.wikipedia.org/wiki/Apple_Computer,_Inc._v._Microsoft_Corporation" rel="nofollow">http://en.wikipedia.org/wiki/Apple_Computer,_Inc._v._Microso...</a><p>I just don't want to be part of that kind of walled garden.
I'm writing this from my laptop which is running Ubuntu as its desktop.<p>I don't really see how the Linux desktop is dead. I've been running the same OS on this same laptop since 2007. The only upgrade I've added is an SSD and an extra gig of memory. It's still pretty speedy and I've never had any problems.<p>I use a Macbook Pro with OS X at work because that's just what I was issued by default. I hate it. I hate the over-reliance on the mouse, on gestures, the abundant and tedious animations; I hate the crappy ecosystem of package repositories and how most of the packages are broken or completely mess with the system; I hate never being able to find where any of the configuration files are or where something is installed; I hate the plethora of ways you can start and stop services; the confusing GUI; the masochistic meta-key layout; the awful full-screen support; and the complete lack of customization options.<p>I've had much better experiences with the Linux desktop for 95% of the things I do.<p>Now before some OS X fan-person decides to point out how woefully misguided and ignorant I am, my point is that there are different folks out there who want different things from their desktop experience. Apple gets to decide top-down what that experience is, all the way down to the hardware. I prefer a little more flexibility. I like being able to swap out my own battery or add a new memory module when I need one. I like being able to switch from a GUI desktop to a tiled window manager. Some folks don't -- there are Linux distros that hide as much of that as possible. Either way there are plenty of options and I think that's a good thing. Competition breeds innovation and even though I don't particularly like Unity I am glad to see people trying new things.<p>The Linux desktop isn't dead. It may just smell funny. You may switch to OS X and wonder why anyone could possibly want anything else. I just gave you a bunch of answers.
There's room for many approaches, of course. While the perfectionism (or is it lack of pragmatism?) of Linux and its developers may well have held back its wider adoption on the desktop, there's a lot to be said for its development community's single-minded pursuit of quality and correctness.<p>As well as Linux's presence in the data centre, witness the success of 'embedded' Linux: many TVs, routers, set top boxes and other bits of sealed-box electronics all run on it. It's broad in its scope because of the large team of divergent interests working on it, and it's able to support those systems because it's been well made as a direct result of that team's philosophy. Is it really so bad that the average Facebooker doesn't want to use it?<p>It really is very, very hard indeed to be all things to all men and no single system around today can make that claim. Linux has its place in the world of computing, just like Android, Windows, OSX and everything else.
There never was a "Linux desktop". Linux is a kernel. GNU is a set of utilities. And X11 is a mess.<p>Did you know that X11 is why we have shared libs (the UNIX version of "dll hell")? If not for having to run X11, shared libs really would not have been needed.<p>There are many window managers. Maybe too many. Too much choice for a noob. That selection, or the pre-selections Linux distribution people make, does not equate to "the" Linux Desktop. It equates to someone else's configurations and choice of applications. It equates to having to fiddle with X11, whether you are just configuring it or developing programs to run in it. And that has always been extremely frustrating for too many people - constant tweaking; it never ends. This is like a brick wall to people who might want to try Linux, coming from Windows. You are inheriting a system that's been configured to someone else's preferences. (Same is true with Apple, but they have a knack for making things easy.)<p>I skipped Linux altogether and went from using Windows to using BSD. I've also been a Mac user. And BSD is way better than OSX, or any of the previous MacOS's, for doing most everyday things: email, internet and secure web (ramdisk). Moreover it's flexible - you can shape it into what you want - without this being an overwhelming task of undoing someone else's settings.<p>If you want a citation for the shared libs thing I will track it down, but honestly anyone can do it on their own. The historical research will do you good. Educate yourself.
An interesting observation is that tablets are becoming the new desktop and in that space linux, through android, is becoming a dominant player. In a way, the linux desktop is finally here and it's winning against both Microsoft and Apple put together.<p>All of the article's criticism of mainstream workstation distributions is accurate, of course. But it's important to note that those represent nowhere near the sum total of the linux user experience these days.
> In my opinion, the problem with Linux on the Desktop is rooted in the developer culture that was created around it.<p>This developer culture DEFINES Linux. A fruit is either an apple or an orange. I couldn't have an OS with wonderful package management, developer tools, endless configurability AND a desktop Miguel de Icaza dreams of.
This flame ignites periodically, and I'm always left wondering: when exactly did the Linux desktop die? Some have noted similar aspects already, but here's my 2 cents:<p>I'm on Linux now (GNU/Linux, maybe lump BSD in there too, I'm using "Linux"). I know plenty of users on Linux. I know plenty of users of Windows and OS X who run virtual Linux Desktop distributions for testing/development/security. I'm sure some of HN are running Linux.<p>Does Linux have the potential to enter the market as a third core option for desktop usage - not really. But why does it matter?<p>The problem with Linux is that there are too many choices. People who like technical choices and options trend toward Linux (needs citation).<p>John Q. ComputerUser isn't going to use Linux unless his geeky son or nephew installs it for him AND provides support. He can't get support anywhere else - because there are too many possibilities for it to be fiscally effective.<p>If/When something gets confusing or broken on Windows/OS X, you call JoeBob's SuperDuperPuter, and say it's broken. JoeBob asks, "What Windows version?" While he might need to poke and pry a bit to get the user to tell him he's running Millennium edition, once he gets that data, it's a pretty straightforward troubleshooting effort and fix.<p>If you call some mythical Computer Service group that actually supports Linux, and say your machine is broken, they would need to know a LOT more about your system just to figure out what they need to do to start.<p>Distribution? Parent Distribution? Shell? Window Manager? Hardware? ...<p>I find generic computer service companies to be extremely expensive. To be able to provide even basic service for Linux in general, your techs need to be very familiar with more operating systems (emerge, apt, yum, zypper, pacman), and more core applications. Each service effort inherently takes longer. These factors pile up and everything becomes necessarily more expensive. It's downright impractical to support Linux generically. The support costs for one or two issues on Linux would far outweigh the cost of an upfront OS license and cheaper support for the end user.<p>Linux has (and will likely continue to have) a comfortable hold on the technically-capable DIY market. It may not be on track to step beyond that market in the desktop arena - but that certainly doesn't indicate it's time for a toe tag.
>And you can still run your old OSX apps on Mountain Lion.<p>Having been a small-scale Mac developer for many years, that really made me chuckle. Not since OS X 10.2 did Apple release a major upgrade that didn't break my apps and make me struggle to push an update out as quickly as possible to fix all the things that Apple broke. Apple has heard of deprecation, but they don't seem to really grok the concept.<p>If I had been developing for Linux, I could have simply tested on pre-release versions of the distros I wanted to support and would have been ready when the new versions were released. On OS X I would have had to have paid a prohibitive fee for that privilege.<p>In any case, this article made me happy. You see, for so many years, I used a Mac, and everybody said "Apple is on its last legs; the Mac will be dead in a few years". Apple had to scramble to compete, and that drove them to provide such a good product. But I knew that situation might not last forever, and I was right. After seeing the turn that Apple had taken over the last few years, I switched to an Ubuntu laptop six months ago.<p>It's refreshing, once again, to be using an OS that people are calling "dead".
Linux is too hard to configure; if the distro gets it right out of the box it's fine, but not otherwise. I started with Windows 3.1 in 1995, mostly used Slackware, and some Windows 95, from 1996 to 2000. Slackware and Windows 98 from 2000 to 2004. But from the time I got on the Internet in 2004 to the present I have mostly used Windows (98, XP, and Vista) because I have not managed to get any version of Linux that I have tried to connect through a dial-up modem. I have to admit I have only tried sporadically, since Windows just works, and my efforts to get some Linux distro to work have been so frustrating. (Note that though a frequent user, I am not a programmer or professional sys-admin.)<p>ADDED: jrockaway's comment, added while I was writing this, hits it just right: "I think the issue is that getting everything working requires a deep understanding of each component and the system as a whole." Which is what makes it so frustrating, even to very intelligent people who have other interests than computers in and of themselves.
Please ignore Icaza.
As for the famous death of Linux on the desktop let me tell you something: IT NEVER HAPPENED.
What are those people smoking?<p>I've been using Linux for the last decade and every year it gets better, more polished, more integrated, featuring a better design; I hear more & more people talking about it and using it. Linux is more alive than ever on the desktop!<p>Depending on your needs, Linux can make an exceptional desktop. Yes, true, it is not for _everyone_, but then again neither are Windows or Mac OS X.
Nothing "killed the Linux desktop"; it still thrives for those that want it, and it's steadily improving. It never came to dominate the market, and one can argue about the reasons it never displaced Macintosh (still less Windows). It <i>probably</i> has a lot to do with lack of a single, unified vision, and the market fragmentation caused by the different distros, and the lack of market pressure to ship, as it relies on volunteer labor, but I'm not going to presume.<p>Personally, I've been primarily a Mac user since the Mississippian superperiod, but I used an X-11 Windows(™) environment (on top of FreeBSD) for years at work. I don't miss it, even one iota, but I know plenty of smart people who prefer that sort of thing. <i>De gustibus non disputandum est</i> and all that.
When the iPhone 3GS came out, battery life tanked on my 3G. They fixed that, after some period, only to break the reporting in the firmware. Now the device thinks it's dead after a few hours. Replace the battery, same lifetime. At this point, I'll never expect an Apple device to last longer than a year and therefore will not buy one.<p>Additionally, OSX is no Linux replacement. The shell environment is completely different except for cd, rm, and ls.
Not this post again. Those who thought Linux could compete with heavily subsidized Windows on laptops, or with OSX and Apple's flashy interfaces, are dreamers.<p>Linux has been for those that like to get dirty, and it is doing that job quite well. Canonical came a bit late to the party and wasn't large enough to matter. RHEL just went after the servers. To make a fair comparison, Linux should have had a big player backing it strongly on desktops/laptops 10-15 years ago (like Google is doing now with Android). HP and IBM made their half-assed attempts, but they were never really behind it completely.
I love Linux, and as a developer, use it as my main OS (Ubuntu). It is so easy to develop on, and its package management is superb. I don't use the desktop per se that much, and am usually command-line driven.<p>I have a Mac, and use it for some things, at times. It's nice, for sure, but I love the openness of Linux, even though, of course, there can be many very painful hardware issues (video, sound, etc), all of which I have experienced at one time or another.<p>I am wondering - I hear Google is working on an "Android desktop". Would that perhaps change things regarding the "Linux desktop" a bit?
Back in the day, before setting up Linux was a breeze, I got tired of mucking around with configuration and such just to get a usable Unixy desktop and environment. So the day OS X Jaguar was released I purchased a Mac.<p>Now if I need to fire up Linux for a project (usually for a microcontroller or similar hardware that needs C), a virtual machine or appliance that I can launch from Windows 7 does the job. This is also how I keep Windows 8 contained, safely in a virtualized box, so that I don't have to deal with it unless I need to... ;)
The Linux desktop never got killed! It was never really alive in the first place. As long as Linux is not sold on computers, it will never spread. Maybe things will change in the future; Canonical is doing an insanely great job bringing Linux to the masses. But personally I think Linux will take off in new markets (China, Brazil, etc), not in already established ones.<p>By the way, in my opinion only a small fraction is buying Macs because of OS X; it's the hardware. The design and usability of Ubuntu are a lot better than OS X's at the moment.
First of all, the Linux desktop is not dead.<p>As I wrote on my blog recently:<p>"In the [past three years], Linux has grown — albeit slowly — in desktop usage. After nearly 2 years of no growth (2008-2010, lingering around 1% of market), in 2011 Linux saw a significant uptick in desktop adoption (+64% from May 2011 to January 2012). However, Linux’s desktop share is still about 1/5 of the share of Apple OS X and 1/50 the share of Microsoft Windows. This despite the fact that Linux continues to dominate Microsoft in the server market."<p>It may be in third place in a desktop market with primarily three OSes, but usage has never been higher.<p>As I discussed in this article, most of the original reasons that stopped Windows / Mac users from using Linux years ago are no longer valid. However, the irony is that it's easier than ever to get by with a Free Software desktop, but harder than ever to avoid proprietary software and lock-in, thanks to the rise of SaaS and the personal data cloud.<p>I agree with de Icaza that the "Open Web" is more important these days than a Free Desktop. But the linked Wired article's conception of the Open Web refers to things like HTML5, JavaScript and CSS. These aren't the problem. They are an open delivery mechanism, yes, but usually for proprietary software.<p>Modern SaaS applications accessible through the web browser using open web standards are the modern equivalent of an open source Perl script wrapping calls to a closed-source, statically-compiled binary.<p>You can read more about my thoughts on this in "Cloud GNU: where are you?" <a href="http://www.pixelmonkey.org/2012/08/18/cloud-gnu" rel="nofollow">http://www.pixelmonkey.org/2012/08/18/cloud-gnu</a>
In a Q&A round at Aalto University in Finland, Linus addressed the question of why Linux never took off on the desktop: the lack of being a pre-installed OS.
He mentions that without a preinstalled operating system there's no way to gain significant market share in the desktop segment.<p>The whole talk is well worth watching: <a href="http://youtu.be/MShbP3OpASA?t=23m45s" rel="nofollow">http://youtu.be/MShbP3OpASA?t=23m45s</a>
That's just bullshit. The Linux desktop maybe isn't broadly accepted or mainstream, but I don't see the problem in that - after all, Linux remains a system for power users, even if some distros want to change that. And there really is no better desktop environment for those people than the Linux desktop. Windows is shit incarnate, so let's not even begin to talk about it. What remains? Mac OS. Sure, it has a more accessible GUI, but not a more efficient one.
I can't think of anything more elegant than a tiling WM, be it awesome, wmii or xmonad. Everything based on moving a cursor just feels awkward in comparison to the simplicity of ~5-10 keyboard shortcuts. And tiling also means that I always have everything in front of me. Fumbling around to find some window is HORROR.<p>I think the Linux desktop simply has more options for experienced users. I simply see no way I could be more productive with a GUI designed to cater to lusers.
It's still very much alive if you don't give a shit about normal UX conventions or popularity, and there are hundreds of thousands of excellent 3rd party applications that run perfectly.<p>It's getting really irritating when someone who's jumped ship to OSX declares it "dead" because they have a shiny iDevice and an expensive laptop.
Largely the same issues that killed UNIX as a viable desktop alternative are killing Linux as a viable desktop alternative: fragmentation and lack of consistency across different distributions.<p>This is compounded by most distributions having a lack of centralized vision for how everything fits together. They are merely collections of individual parts rather than collections of parts designed to work well together, and they lack polish as a result. While a lack of centralized vision was fine for SunOS circa 1992, it simply doesn't cut the mustard in 2012.<p>Ubuntu seems to be trying to push such a centralized vision with Unity, but I fear they lack the clinical editorial willpower to make the hard decisions required to see it through to its ultimate conclusion.
Comparing Linux to OSX makes no sense to me. Linux has always been missing features when compared to alternatives; the GNU system was actually written to emulate the alternative.<p>But the GNU/Linux project had a very different objective: fighting for freedom. If freedom is still the driving force, then we should encourage the enthusiasts and get back to work on improving Linux, as has been done for the past years. By doing so, Linux has already reached excellence in some fields.<p>If you're just competing on features, you'll be missing some great benefits and enjoyment. And to be honest, in terms of features OSX isn't that good either, as Windows is still used by the majority for one reason or another.
Miguel's affection for his iPhone is a bit unconsidered and superficial.<p>But anyway, a more interesting question could be: what does it take to bring an ex-Linux user and now happy OSX user back to Linux?<p>I used Windows for 3 years, then Linux for 2 years. During that time I did a lot of installations (mostly Ubuntu and Debian) on a lot of different devices. During this time, while fighting with drivers, minor display problems, and spoiled Windows users, I lost my faith in Linux as a desktop OS and switched to OSX.<p>I can just speak for myself, but these few points would bring me back to Linux in no time.<p>Presenting Distribution "Utopia":<p>1. No X11-based display stack; it is replaced with something conceptually simpler (like Cocoa).<p>2. (Multiple) monitor recognition 100% accurate. (Probably connected to Pt. 1)<p>3. The audio setup is not much worse than the one in OSX.<p>4. Throwing Gnome and everything that is based on GLib out. It's 2012; there are alternatives to faking OO in C. Qt isn't allowed either.<p>5. Throwing APT out. No more dependency management for a desktop OS, please. Then kill Perl as a requirement for running an OS.<p>Ahhhhh, I feel better now :-).
This is the opposite of what Miguel demanded; he cares about backward compatibility.<p>When I think about it, "Utopia" would be similar to Android: no fear of throwing old stuff out.<p>Android as a foundation for a new desktop Linux?
1. If this is true, and it seems right to me, maybe some of the massive effort put into designing new GUIs for Gnome/KDE/etc should be put into hacking the look and feel of the OS X desktop?<p>Unsanity ShapeShifter hasn't worked since OS 10.4<p>and I know about<p><a href="http://magnifique.en.softonic.com/mac" rel="nofollow">http://magnifique.en.softonic.com/mac</a> - 10.5 only<p><a href="http://www.marsthemes.com/crystalclear/" rel="nofollow">http://www.marsthemes.com/crystalclear/</a> 10.7 support claimed, but it's not very radical. I'd love xfce's window look controls or a Stardock WindowBlinds.<p>I know Apple doesn't want anybody to do this. I know they will deliberately introduce changes that break hacks. But as I said, how can it be more effort than Linux?<p>-----<p>2. To try to prevent OSx86 hacks, DSMOS.kext uses crypto to keep the system from running essential UI elements like Finder, SystemUIServer, etc. Can't we build our own versions of those parts?
<a href="http://en.wikipedia.org/wiki/Apple%E2%80%93Intel_architecture#Dont_Steal_Mac_OS_X.kext" rel="nofollow">http://en.wikipedia.org/wiki/Apple%E2%80%93Intel_architectur...</a><p>-----<p>3. Is this true?:<p>Linux desktop - dying, dead<p>Windows 8 - trying so hard to copy OSX/iPad/Springboard/Launchpad that everybody is gonna hate its TWO UI's! (dying?)<p>Mac - winning, won (by default?)
This article hits so many sore spots right on the pustulent scar tissue.<p>I had run a Linux desktop (a Debian build, mostly with KDE) for a while and kept getting hammered with random stuff breaking for random, and often poorly considered, reasons. I gave up and went back to running a Windows desktop with an X server to pull up windows on my Linux box.<p>Then I went to work for Google, and they did a really good job of running Ubuntu as an engineering desktop (calling their distro Goobuntu of course), and I thought, "Wow, this has come quite a ways; perhaps Linux has matured to the point where there is at least one way to run it reliably." And so I installed Ubuntu on my desktop and tried that for a while.<p>For "using" it, it was for the most part OK, as long as once every few days I did an apt-get update/upgrade cycle. For <i>developing</i>, it was a real challenge. Pull in the latest gstreamer? Blam blam blam, things fall over dead; update their packages (sometimes pulling git repos and rebuilding from source) to get back to a working state, and now apt-get update/upgrade falls over the next time because you've got a package conflict. It is enough to drive you insane.
Proud Ubuntu user here. Ubuntu 12.04 is not bad at all. It supports the fancy font he used on his blog. Flash is working. WebGL is working. LibreOffice opens Word docs when I need it to. Audio is working.<p>I have Windows 7 on the other partition, mainly to play games.<p>There was a minor issue with Ubuntu trying to melt the CPU in my laptop the other day, but it's not so bad since I upgraded, and I found this powertop thing that also helps.
I guess if you don't mind that OSX uses 5% active CPU just for flashing bubbly buttons, that's alright.<p>I like OSX; I think it does have a good ecosystem for GUI apps, but at pretty much everything else it fails. It's a performance nightmare, and the filesystem makes me want to punch a kid in the face every time it kills the CPU (yes, sorry, I also don't think you should be doing OpenGL in JavaScript, but hey).<p>Now, with all of the above mentioned, I do wish there was a better ecosystem for app development. I mean something like Xcode 3, not 4. Yes we have Qt, yes we have Glade, but building an app with an interface designer, bindings, and MVC concepts just helps a lot.<p>You can do most of it with Vala, granted; it's just documented worse and not as "round", there are no standard concepts to follow, etc. And yes, I do like my Linux customizability, but we have stuff like the CERT best practices for secure C coding. Why can we not get something like that for Linux GUI programming?<p>PS: GNOME 3 can go right back where it came from.
In like 2006 I switched from Windows XP to Linux. This was before Ubuntu was what it is today. I learned using Slackware and eventually switched to Gentoo. It was cool and gave me nerd cred when I went to college.<p>I switched to OSX for exactly the reasons the author mentioned. The fact that I have an awesome UI + ability to use the shell all day is a huge win for me.
For what it's worth, GNU/Linux never really was about some desktop conquest, so this whole discussion "What killed the Linux desktop" is quite absurd.<p>That aside, what we have here is a thread apparently devoted to shitting on the work of people who built something for fun and gave it away for free.<p>Good job folks!
I would have to say Apple killed the Linux desktop. As many others have noted here, OSX has improved to the point where many Unix admins run OSX, and it runs the tools they need for their work. Also, Mac hardware is better than PC hardware, so you buy a MacBook with OSX and you are happy.
OSX didn't kill the Linux desktop, Office and Photoshop did. Just as it killed the *BSD desktops. Lack of high-end applications that were compatible with what the business world was using doomed anything that didn't have at least a tacit blessing from Adobe and Microsoft.
"As for myself, I had fallen in love with the iPhone, so using a Mac on a day-to-day basis was a must."<p>What? How? I've got an iPhone and have never felt like having a Mac was a must. Am I missing some major parts of the system that don't work if you don't have a Mac?
Does anyone else feel that Linux and the desktop are at odds with each other? Don't we like small components bound together using pipes? Desktop apps are the reverse: big black boxes that barely communicate with anything (I'll admit I don't know D-Bus).
><i>"is not a sexy problem."</i><p>This pretty much describes the root cause of nearly all the impediments to the adoption of FOSS in general and GNU/Linux in particular by the general public. It touches everything from backwards compatibility to documentation.
It's strange for me to read something like this, since I have recently switched back to Linux and I've never been happier with it. I've been a sometime user of it since around 1997, but it could never survive long as my primary OS. I bought my Frankenputer parts from Newegg without checking on hardware compatibility for any of it, and it all worked great. I had only one problem, which was wake-from-USB-keyboard, and I googled it down pretty quick (it was a new issue with Ubuntu Precise, it seems).<p>I like OS X too and had a Powerbook for years, but all other things being equal I'd prefer to develop and deploy on the same OS, and Linux is just fine for development so far.
Way too true. I ended up going back to Windows, because the audio would frequently (3-4 times an hour) stop working on my laptop until I restarted pulseaudio. And that's on Ubuntu certified hardware...<p>Not to mention the problems we had with our streaming servers and ffmpeg. It turns out that there was a big flame war on libav vs ffmpeg, and <i>someone from the libav camp managed to get the ffmpeg package marked as deprecated (it's not) and redirected to the libav package</i> on Ubuntu's apt repo. So we're stuck either compiling from source or running our own repo. Seriously? (fwiw, the rationale is that libav pushes new versions more frequently)
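(In case anyone lands here with the same symptom: restarting PulseAudio doesn't need a reboot or even a logout. A minimal sketch, assuming the stock Ubuntu setup of the time where the per-user daemon autospawns:)

    pulseaudio -k        # kill the current per-user PulseAudio daemon
    pulseaudio --start   # start it again explicitly if autospawn doesn't kick in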
Ingo Molnar has some interesting thoughts on this subject: <a href="https://plus.google.com/109922199462633401279/posts/HgdeFDfRzNe" rel="nofollow">https://plus.google.com/109922199462633401279/posts/HgdeFDfR...</a>
I more or less disagree. My main frustration with Linux For Personal Use is that I can't buy a piece of hardware that I know won't regress with new versions of a distribution for three-plus years, or get any service if it does. My reference for the importance of this is a perfectly usable 2008 refurbished MacBook. I upgraded the RAM once recently for a bit more snap, but otherwise have no complaints over the three or so Macintosh releases since then.<p>Could the UIs and third-party application situation be better? Of course. But consider all the garden-variety crash bugs, power management bugs, lockup bugs, video driver misbehavior, hit-and-miss peripheral support, and in general just the analysis paralysis about what hardware I should buy - and even then there is a less-certain future with regard to regressions.<p>Even given Windows's monopoly power in the commodity desktop and laptop markets, its reputation for dealing with sleep and drivers is only so-so compared to Apple hardware and software. If Windows's monopoly power -- which buys you full attention from hardware manufacturers and their driver divisions -- only gives you mediocre results, what are the odds that a bunch of kernel hackers who receive almost no continual consideration from hardware vendors can do better? To me, it looks like there is absolutely no chance of it becoming stable over time. I have completely given up on Linux laptops for this reason: by using desktops with Linux only, I have cut out a lot of the problems, but not all of them. It's a kind of mediocre that I can bear.<p>I want someone to sell me a Linux distribution on a laptop that simply will not break in its kernel-oriented features over five years of upgrades. I want that distribution to stop-ship if a new version introduces a power management bug on an old laptop, and to do whatever it takes to work around some lousy hardware bug or whatever. I want them to do whatever it takes to work with Skype (such as statically linking whatever libraries, etc.) and to test Google Hangouts to make sure the webcam and microphone work. And if they don't work, they absolutely cannot ship. Until that day, I use Linux -- and I do mean the kernel in most of these cases -- as my personal operating system most of the time in spite of these problems, because of my professional and philosophical needs, and not out of preference in any other dimension.
The first comment explodes this piece:<p>"I mean, look at OS X itself. Sure it's doing fine, but powered by iPhone and iPad, not by people wanting a new desktop. And it still has minority marketshare despite being from one of the most profitable companies on earth and despite Microsoft's repeated weird Windows-rethinks."<p>Basically, path-dependent lock-in means we're lucky not to be using x86-based wPhones that don't even have web browsers. The Linux and open web communities have achieved amazing things, enabling Apple's comeback along the way.
Could it be that the main issue is the lack of leadership? We don't have many linux kernels yet we have dozens of incompatible desktop configurations and the list keeps growing. I think if there was a clear winner in the desktop wars, desktop apps would be of much higher quality.<p>And also the horrible aping of other environments and stupid UI eyecandy. Given that the majority of linux users and developers are technical, that's surprising.
You really have to wonder if the advent of Windows 8 and disgruntlement with it from Valve, Blizzard, etc. might have repercussions on this whole situation.
Linux isn't dead on the desktop, because it never was a product in the first place.<p>The first attempts were Mandrake and Conectiva. Canonical has been doing a good job lately; the problem is that the platform is now beyond hope on the desktop, because it simply doesn't gather traction from 3rd-party developers - the most important thing for a desktop OS. You're pretty much limited to the FOSS utilities that exist in the repositories.
Seriously... Is this discussion still relevant?<p>Anyway, my bet on what "killed" the Linux desktop would be the Windows OEM licensing terms. Nothing really killed it because it was always a very specialized product.<p>Do we always have to see a problem when someone doesn't make the same choices we do?
"Miguel de Icaza — once a central figure in the development of the Linux desktop environment GNOME — says the open web is now a greater concern than free software."<p>I was kind of hoping those two things would each help drive the other forward.
OSX got the nice touchpad, Windows has awesome game libs, and Linux comes with shitloads of developer goodness. But yeah, now OSX has Homebrew so it's almost like a better Linux, but it still forces you to buy overpriced hardware.
In a comment Miguel says,<p><i>Because the developers have moved on to greener pastures.</i><p>Of course, it all boils down to <i>green</i> at the end of the day.
I think he's right, but I think he's missing a key point.<p>Design. Design - or rather the lack of it - is what killed the Linux desktop. It never had it. OS X has it. Even Windows, crappy as it may be, has it.<p>Before I go on, let me say that design is NOT "making it look pretty". In fact, thinking that this is what design is, is what leads many Linux advocates to reject the needs of design.<p>Apple's work looks pretty -- <i>because</i> it is designed to function well.<p>Design is about usability and understanding the user and making an interface for the user that works well according to the user's understanding, perspective and needs.<p>Design is an engineering discipline.<p>Seriously.<p>The Linux community hasn't had that, and I've seen many of them reject it. In fact, you can see it in the rejection of Apple's patents. The reason they think Apple's patents are not original is that they reject the idea that any engineering went into them. But that's just one example. You see it all the time in lots of contexts. Look at the UIs of Linux... they didn't design one, they just copied Windows.<p>Literal copying is about as far from design as you can get.<p>Sure, over the years, designers have taken cracks at bringing design to Linux, including the work of Ubuntu, but it is rejected by the community.<p>Rejection of design is a cultural trait of the Linux community. They reject it as a discipline and don't even see that it exists (broadly speaking, of course).<p>But as users, they have been influenced by it, and many of them have switched to OS X because it is the best-designed operating system.<p>And then they write long blog posts about how it's wrong that OS X does things a certain way... based on their lack of the design perspective that would let them see why things should work that way.<p>It's ironic.<p>But it's fine - if you want to run a Linux desktop and don't value or care about design, more power to you. I won't ever fault someone for making that decision. We should all use the systems that we prefer.<p>But a culture that doesn't value design, and can't even see it as an engineering discipline, is going to have a great deal of trouble making something usable by the mainstream.
Apple builds fancy gadgets and gathers a fan-boy population, and eventually starts selling more. This really doesn't say anything about Linux desktop.<p>This whole thing about backward compatibility and the discussion that surrounds it is just vague. Here's a practical "true story" for you: I'm using GNU/Linux for more than 10 years now, and it is still alive.<p>Never had any vague binary compatibility problems either, because I'm not strangely expecting to use an ancient binary version of Gimp on my current system. That's because FOSS is source oriented, not binary.
I'm not suddenly trying to use a 15-year-old graphics card whose driver is no longer in the kernel either, because I don't use a 15-year-old graphics card.
I'd say it was the lack of a standardized install convention for "guest" (non-nix) software.<p>What to do about it? Couple golang's preference for large statically-linked binaries with a one-folder, one-executable install convention and Linux may become more inviting for non-nix apps.<p>For example, imagine "/outside/myapp/myapp" is a large, unix-unfriendly, statically-compiled binary placed in its folder by an OS-provided install utility. "Myapp" was probably developed for Mac or Windows and by design does not give a damn about /etc, /lib, /var, etc. Such apps should just be allowed to crap their configuration files into the directory in which they have been placed ("/outside/myapp"). If one no longer needs the app, the folder is deleted along with everything else the app created while it was being used. Tidy. Behind the scenes such an app would be compiled to call the standard Linux APIs, yet it would probably avoid any dynamic dependencies. Disk space is cheap. Just bundle it all together and throw it somewhere where it can run in peace.<p>Amiga's icon files are another approach. Rather than a large, monolithic registry tracking everything in the system, executables exist in tandem with an "icon" (.info) file. This file is generated by the OS and tracks the executable's location and other settings in the Workbench (desktop). A modern reincarnation could potentially track anything. Instead of accumulating registry filth with every uninstall, one can simply remove an executable and its associated .info file. Instead of adhering to the hier(7) convention, the app plays nicely in its own folder with its own registry. By using an ".info" file, portable non-nix installs could reside anywhere, and not just in a prefabbed "/outside" folder.<p>The smartphone penchant for portable installation should come to nix, particularly with non-unix software. It should be encouraged, and that's coming from an OpenBSD user. Unix needs a playground for non-unix apps.
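A rough picture of what that convention might look like in practice - the path, the app name and the layout here are all made up for illustration, not an existing standard:

    # Hypothetical "/outside" convention: one folder per guest app, one statically
    # linked executable, all state kept inside the app's own folder.
    mkdir -p /outside/myapp
    cp myapp /outside/myapp/             # the big static binary, no shared-library deps
    # The app writes its config and data under /outside/myapp/ rather than
    # ~/.config, /etc or /var, so removal is complete by construction:
    rm -rf /outside/myapp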