I used Linux as my main desktop OS for about 3 years, from around 2010 to around 2013 (so some things may have changed in the meantime).

The article says that you pay a "VM tax" by running a Linux VM in Windows or by using WSL, but by using Linux directly you end up "paying" in other areas, for example less-than-perfect drivers (which lead to problems such as worse battery life, worse graphics performance, etc.). In an ideal world, laptop and GPU makers would put as much effort into Linux drivers as they put into Windows drivers, but unfortunately this is not the case.

Another point is that, even if you are running Linux on your dev machine, it is often challenging to keep a local environment that closely matches production. For example, where I work we have Ubuntu 16.04, Ubuntu 18.04 and CentOS (with various combinations of installed packages), so even if I were using a Linux laptop I would still need containers to get all the different environments right (see the rough sketch at the end of this comment). The only case where you could skip containers entirely is if all of your servers ran exactly the same environment.

Finally, there is the issue of some software not being available for Linux. 95% of the time you can find a Linux equivalent that works for you, but in my experience there were still rare cases where I had to resort to Wine or a Windows VM just to run some specific tool I needed to do my job.

PS My job, at the time I was using Linux as my main desktop OS and during the couple of years after I switched back to Windows, was PHP, Python and Java web apps deployed on Linux servers, so my comments above should be read in that context. Maybe in other fields of software development there are different factors to take into account.

PPS I have also tried macOS: on the one hand it's great that Mac is UNIX, but on the other hand it's not Linux, so if you want to closely replicate your production Linux environment you will still need VMs or containers.
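
To make the container point concrete, here is a rough sketch of what I mean, assuming Docker and the stock distro images (the image tags, mount paths and package names are illustrative, not my actual setup):

    # one throwaway shell per production flavour, with the project mounted in
    docker run --rm -it -v "$PWD":/app -w /app ubuntu:16.04 bash
    docker run --rm -it -v "$PWD":/app -w /app ubuntu:18.04 bash
    docker run --rm -it -v "$PWD":/app -w /app centos:7 bash

    # or bake one image per target so the installed packages match too
    # (example Dockerfile for the Ubuntu 16.04 target)
    FROM ubuntu:16.04
    RUN apt-get update && apt-get install -y php7.0-cli python3 openjdk-8-jdk
    WORKDIR /app
    COPY . /app

You still pay some overhead for the container runtime, but it is the same workflow whether the host is Linux, Windows or Mac, which was my point about containers being hard to avoid anyway.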