I've been using Debian as my main desktop (a laptop in this case) for the last 8 years... I'm sure I'm not the only one.

I have suspend/hibernate, a great GNOME experience, no issues running presentations or using public printers, and a really stable and very fast experience, and I did not configure any aspect of the system to make it work: I just installed the distribution and it works out of the box.

Well, I am very happy with the Linux desktop. Just presenting a counter-example :)
> Computing-wise that three week vacation turned out to be very relaxing. Machine would suspend and resume without problem, Wi-Fi just worked, audio did not stop working, I spend three weeks without having to recompile the kernel to adjust this or that, nor fighting the video drivers, or deal with the bizarre and random speed degradation that my ThinkPad suffered<p>For three weeks every system works. After six months every system breaks. Even Windows 10 and Mac OSX developed various strange bugs (or annoying idiosyncrasies) after few months of my use.
Most of the points there apply to power users only. If you just need a basic internet station (no gaming, no professional video/audio editing, etc.), then Linux works almost perfectly.
"Android contains the only Linux component - the kernel"<p>Well, only the kernel IS actually Linux. A Linux-kernel based *nix OS in itself, which is usually wrongly assumed as Linux, is not in fact Linux (or not quite). In this regard Android is just like a lot of other Linux-kernel based distributions, no mater that it runs on an older/modified kernel.
ChromeOS uses the Linux kernel and works rather well. There is a paragraph dismissing Android as a desktop OS, but ChromeOS (including the Chromebox) is very close to a classic desktop experience, especially if you consider typical consumer use cases.
Based on the number of issues I've had with either OS, Windows 10 is much less ready for the desktop than Ubuntu.

In fact, using Windows 8.1 or 10 I can't get 1080p video over HDMI without screen tearing.

With Ubuntu, I can.
I'm not sure it's even "ready for the server".

I have a server without a monitor that I'd like to VNC into. Unfortunately, X doesn't start without a physical monitor plugged in, because it uses it for autodetection. It's such a silly problem that you can even buy fake "monitor" dongles to plug in to trick X into thinking there is one.

I don't feel like spending money on a software problem, so I followed the standard workaround: create a static xorg.conf file. This is very suboptimal, because what happens if I later plug in a real monitor?

OK, so now X starts. Sort of. Actually lightdm starts, and I still can't get x11vnc to connect to the `:0` display. Apparently there is a workaround involving the MIT magic cookie, but at this point I've given up.

In fact, does X even support monitor hot-plugging?

Oh, and also: if the Wi-Fi connection goes down and X isn't running, it doesn't automatically reconnect! Wtf? I assume this is because the thing that does the reconnecting is a GUI tool of some sort.

Madness.
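For anyone hitting the same wall, the static xorg.conf workaround usually looks something like the sketch below. This assumes the xserver-xorg-video-dummy driver package is installed, and the modeline values are only illustrative (generate your own with cvt for your resolution):

    # /etc/X11/xorg.conf -- fake a monitor with the "dummy" driver
    # (assumes the xserver-xorg-video-dummy package is installed)
    Section "Device"
        Identifier "DummyDevice"
        Driver     "dummy"
        VideoRam   256000
    EndSection

    Section "Monitor"
        Identifier  "DummyMonitor"
        HorizSync   28.0-80.0
        VertRefresh 48.0-75.0
        # Illustrative 1920x1080 modeline; `cvt 1920 1080` prints one for you
        Modeline "1920x1080" 173.00 1920 2048 2248 2576 1080 1083 1088 1120
    EndSection

    Section "Screen"
        Identifier   "DummyScreen"
        Device       "DummyDevice"
        Monitor      "DummyMonitor"
        DefaultDepth 24
        SubSection "Display"
            Depth 24
            Modes "1920x1080"
        EndSubSection
    EndSection

And the MIT-magic-cookie part is usually just x11vnc needing the display manager's X authority file, something like:

    # point x11vnc at lightdm's auth file (the path can vary by distro/version)
    sudo x11vnc -display :0 -auth /var/run/lightdm/root/:0
    # or let x11vnc find it itself:
    sudo x11vnc -display :0 -auth guess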
dang, the title is editorialized (and in bad taste).

It was discussed a few days ago with lots of comments: https://news.ycombinator.com/item?id=10812214
I've been running Linux Mint exclusively on my private desktop for 3 years, and Ubuntu for one year on my laptop (HP EliteBook 8530p). I use Windows 7 at work. The only thing I miss on Linux is MS Excel.