I've always considered the Debian model of support (freeze the world and try to support everything) to be wrong and swimming against the tide. I believe that the only ones capable of supporting a package of a given complexity are the authors themselves, and the distros should just handle packaging (and minimal patching on top if necessary). If the author drops support, then either be <i>extremely</i> confident you have a team strong enough to fork and maintain the package...or just don't ship it.<p>In my humble opinion, the FreeBSD ports model is better in that regard. That's also why I try to use pkgsrc for various packages when maintaining systems running LTS Linux distros.
This is a real problem. It is a total joke to suggest that people move important systems to Arch Linux though. Basically, as an administrator you need to be aware of security advisories against the OS you run. This is already true because you need to know when to upgrade packages. If you are running Debian you should sign up for debian-security-announce:<p><a href="https://lists.debian.org/debian-security-announce/" rel="nofollow">https://lists.debian.org/debian-security-announce/</a><p>There are equivalent mailing lists for CentOS. It would probably be helpful to have better tooling to warn about or mark packages with known vulnerabilities (a rough sketch of one existing option is below).<p>I'm not an expert on Debian's internal process, but sometimes it seems that Debian adds packages to their distribution without a clear plan for how to fix security issues in them. Sometimes upstream maintainers seem hellbent on making it impossible to offer long term support for software. Elasticsearch is a case in point:<p><a href="https://www.debian.org/security/2015/dsa-3389" rel="nofollow">https://www.debian.org/security/2015/dsa-3389</a>
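For Debian there is at least debsecan, which checks the installed package list against the security tracker. A rough sketch (flags quoted from memory, so treat the exact options as an assumption and check the man page):

    # install the analyzer, then list installed packages with open CVEs
    apt-get install debsecan
    # restrict the report to issues that already have a fixed package available
    debsecan --suite jessie --only-fixed --format detail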
> But can you really be sure that people that do this stuff as an hobby can deliver this in the quality that you expect and require? Let’s be honest here, probably not.<p>This must be the most short-sighted description of Debian developers I have ever seen, because 1) most do packaging as part of their paid job and 2) apparently millions of people in the world believe that, yes, these people do a fine job at maintaining those packages. Not to mention those who develop the software they package or those who do academic research about software packaging and dependency resolution. But insulting Debian has become commonplace nowadays.
Well, you have to choose something. You can't have security, automated updates to the latest version, and stability all in the same place. You're either in the Debian/RHEL world with old kernels, old libraries and old userspace tools which miss a lot of fresh features, or in the npm/pip/curl|bash world, where you have the latest version of everything all the time.<p>In the former case you can do `apt-get upgrade`/`yum update` and be almost sure that everything will continue working, but - no - you can't have PHP 7.<p>In the latter case you either use npm shrinkwrap-like tools to install the exact same version of everything every single time, or play Russian roulette with new dependency versions. And - just in case you didn't notice - when you pin some package to a specific version you no longer receive security upgrades for it. And let's be honest - you have a lot of those "^1.0.1", "~0.10.29", "^0.3.1" things in your package.json/Berksfile/... And for almost any package "^0.3.1" is the same as "0.3.1", because the next version will obviously be "1.0.0" and 0.3.X won't be receiving any more updates. (A sketch of that trade-off is below.)<p>It's obvious that no single distribution will be able to package the insanely large number of packages from all the different sources, let alone backport patches. So you either limit yourself to only the stuff available in your distribution, or you're on your own with updates (including security ones).<p>As for packages updating themselves, sometimes it's a good thing, sometimes it isn't. I bet a WordPress installation which can't overwrite itself (because it is owned by root), and doesn't allow executing user-uploaded .php files, will be much more secure than one which has full access to itself.<p>P.S. No amount of tooling can solve this problem. If you're using version X of package A, then find out that there is a security vulnerability in version X which is fixed in version Y, and version Y is not fully compatible with version X (changed an API, config file, anything else in a backwards-incompatible way), you're semi-screwed. You will have to handle that situation manually.
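The two halves of that trade-off, sketched with npm (both commands exist; the exact output obviously depends on your project):

    # freeze the exact dependency tree so every install is reproducible
    npm shrinkwrap
    # ...after which keeping up with (security) updates is on you:
    # npm outdated lists pinned/locked packages that have newer releases
    npm outdated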
This seems to be less a complaint about packages and more a debate over rolling vs. non-rolling distros.<p>The author seems to take the opinion that rolling is always better for security because he selected a few "web" packages and found vulnerabilities in the shipping versions.<p>Of course, the people running mission critical enterprise applications probably are not running phpmyadmin on that server, so they couldn't care less if the repo for their stable version of Debian contains that older software.<p>I agree it is a problem, but the solution to that problem cannot simply be rolling all the time.<p>I use Arch on my desktop system, but I would never run my employer's mission critical database on Arch... I need stability and security, not simply security.<p>Ultimately I think we need a better way to define the core OS vs. "user applications".<p>Things like glibc, the kernel, etc. are clearly core OS; things like phpmyadmin are user applications. Where it gets grey is databases, webservers, etc. Do you label them a core product or a user app? If I were running a mission critical application I would not want my MariaDB system just upgraded to the latest version with new features and possibly breaking (or removing) older features that my app may still need.<p>Enterprises move slowly. I still have enterprise applications that require Java 6; I have some things that still run software that only works on Windows 2000. This idea that rolling to the latest version of software all the time is a workable plan highlights the author's ignorance of how enterprises actually work.
> Did you know that for example Wordpress comes with automatic updates<p>I grant that Debian is not great at webapps. OTOH, giving an application write access to itself is inherently risky. It's too bad shared webhosting and its limitations have so warped the PHP application community.<p>An alternative model: the web app has r-x on the app files, and an app-specific admin user has rwx to run the check-and-update script on a regular basis.
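A minimal sketch of that layout (the user name, path, and update script are placeholders, and the actual update command depends on the app):

    # app files owned by a dedicated admin user; the web server user
    # (www-data) only gets read/execute, so the app cannot rewrite itself
    chown -R appadmin:www-data /var/www/app
    chmod -R u=rwX,g=rX,o= /var/www/app
    # the admin user, not the web server, runs the check-and-update script
    # on a schedule, e.g. from appadmin's crontab:
    #   0 4 * * * /usr/local/bin/app-update-check.sh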
I think that labeling point-release distribution package managers as insecure because they are sub-optimal for a certain use case (updating web applications) seems a tad excessive.<p>Having software go through a decent distribution's packaging process not only provides stability guarantees, but also off-loads many tasks that you would otherwise have to perform on a per-project/per-developer basis.<p>I'm perfectly satisfied with the trade-offs.
The author makes some important points, but there is a cruel irony: He's a main developer of owncloud, which in terms of security updates is a huge problem.<p>Owncloud has its own update mechanism, which unfortunately usually doesn't work in the real world (it breaks if you have any reasonable timeout for your PHP applications, which every normal webhoster has). There are likely countless owncloud installations with known security issues that their users tried to update, but couldn't. (The alternative is a manual update on the command line, but given the target audience of owncloud it's safe to assume that many of its users aren't capable of doing that.)
It is true that the stable branch has dated version numbers, probably most of the time for a good reason (e.g. long term support).<p>On the other hand, I adhere to Slackware's vanilla philosophy, and with SlackBuilds you can always have fresh and up-to-date software.<p>Well, if you do this you should have a dev, test, prod chain for your servers, otherwise you update at your own risk.<p>For my last upgrade I went from Ubuntu12/Node0.10 to Ubuntu14/Node4, but nginx also changed, and even logrotate going from 3.7.8 to 3.8.7 introduced a few modifications that broke my configuration files.<p>Upgrading is not that easy.<p>I would like to share a project of mine that brings the vanilla philosophy to every distro. You have a script to build your software and install it locally, so for example you have version z.y.x installed on your system and you want to install z.y.(x+1) released yesterday.<p>Normally you download the tarball, bla bla bla and launch make install; most of the time you follow really similar steps, so you can put them in a script and launch<p>.software_install Foo<p>if the version is the same as in the README.md, or even<p>.software_install Foo 1.2.3<p>to install a specific version. It is really easy to add new software to the list. You can also package your software to avoid compile time on other hosts (test and prod). Give it a try, I think it can be useful to many system administrators and developers:<p><a href="http://g14n.info/dotsoftware" rel="nofollow">http://g14n.info/dotsoftware</a>
Yup, the guy still doesn't have a clue what he's talking about.<p><a href="https://news.ycombinator.com/item?id=11095783" rel="nofollow">https://news.ycombinator.com/item?id=11095783</a>
I think Debian's long release cycles don't make much sense in this day and age. To me, a rolling release model makes much more sense, especially in this world where security updates are being done constantly, and are generally focused on the more modern branches.<p>For enterprises where software is part of the core business, keeping up with updates on a regular basis is probably better than doing large upgrades every once in a while (for one thing, tracking down regressions is a lot easier when you don't have to search the entire haystack).
The conclusion I take from this is that distros need to be a lot more selective in what they package. If packagers can't reliably backport security fixes for the several years that a distro release is supported, they shouldn't create that expectation by putting the package in.
A bit meta, but articles that use the passive voice, such as "Blah considered insecure", annoy me.
Considered insecure by whom? The author? Then why not just say "Distribution packages <i>are</i> insecure"? Sure, the current title makes it sound like there is some consensus here, but that does not actually seem to be the case.
Well, I can agree with the concerns. But what are the alternatives? What GNU/Linux distributions do (and Debian in particular) looks to me like the least of all possible evils.
I thought about this issue some time ago:<p><a href="http://security.stackexchange.com/questions/109026/security-updates-foss-upstream-policies-how-are-they-chosen" rel="nofollow">http://security.stackexchange.com/questions/109026/security-...</a><p>I agree that there's a distinction to be made between core/base and user-applications/ports (as mentioned elsewhere in this thread)...<p>Ultimately it's all software, and the distinction is fuzzy: e.g. the kernel won't easily break backwards compatibility, while databases, interpreters, etc. will... but it's not something you can easily measure without being vigilant about every change.<p>I think that an important distinction (at least for Ubuntu) is between the main and universe repositories: I'd expect these problems to happen in universe and to be mostly absent in main (a quick way to check which component a package lives in is sketched below).<p>From this point of view, a good choice would be to rely completely on main, and to weigh pros and cons when deciding whether to use the repositories to manage the user applications/libraries/dependencies for your actual service. (I'd probably define all of them with Nix, but that's not a panacea.)<p>The problem is that not even main is guaranteed to keep up with all the security updates: in some cases updates aren't prepared and shipped because the default configuration is not vulnerable (but obviously a sysadmin could change that) and it's not worth the effort... or, just like in the Python 2.7.9 case: a security update can most often be applied as a standalone patch, but if the changes are overarching and not easily distilled into a patch, the update will be too expensive/risky and won't be done.
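If you go that route, it's easy to check which component a given package comes from before relying on it (rough sketch; the exact output format varies by release):

    # the component (main vs universe) shows up in the repository line
    apt-cache policy phpmyadmin
    #   500 http://archive.ubuntu.com/ubuntu/ trusty/universe amd64 Packages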
Oh yes, this is a big problem. I still run into things like installing the ownCloud client on Ubuntu and then wondering why it doesn't work (it's years old). Or recently I found out that python3-pandas on the Raspberry Pi is version 14.something, which has annoyances that have long since been solved. If you install Drush (Drupal Shell) from Ubuntu 14.04, you get version 5.10! (They are at 8.1.)
Arch Linux is already much better, but it breaks things occasionally and it was kicked off of Digital Ocean, sadly.<p>I really hope Ubuntu Snappy packages, which are essentially the same as the Arch User Repository but more secure, if I understand correctly, will solve this mess.
The fact that we haven't solved this yet is puzzling. It seems that we have actually gone backwards in some ways over the last decade, with languages implementing their own broken packaging and distribution systems that compete against each other. Not even to mention the horrible practice of piping random <i>sh1t</i> from the web with curl/wget into a shell.<p>It would be more or less trivial to implement a wget wrapper that downloads the .sig and validates it. Hardly anyone these days understands or cares, which seems to be the bigger issue (not a technical problem but a people problem, as <i>Gerald M. Weinberg</i> would say).
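Such a wrapper really is just a few lines. A sketch, assuming the project publishes a detached .sig next to the file and that you already have the signing key in your keyring:

    #!/bin/sh
    # fetch a file plus its detached signature and verify it before use
    set -e
    url="$1"
    wget -O download "$url"
    wget -O download.sig "$url.sig"
    gpg --verify download.sig download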
Regarding the issue of verifiable builds, everyone should read this:<p><a href="http://0pointer.net/blog/projects/stateless.html" rel="nofollow">http://0pointer.net/blog/projects/stateless.html</a>
The job of the distributions would be much easier if more software projects would a) consistently use semantic versioning (MAJOR.MINOR.PATCH, see <a href="http://semver.org" rel="nofollow">http://semver.org</a> for details) and b) explicitly and officially announce when they no longer support a given MAJOR.MINOR branch.<p>The latter should be a signal for the distribution to upgrade to a newer and supported upstream version instead of (halfheartedly) trying to support the software themselves.
Nah, that's why packaging <i>web apps</i> with a stable Linux distribution is a notoriously bad idea and should be discouraged.<p>Debian (and other distros) do a perfectly fine job updating the actual operating system.
This. I agree with this.<p>Freezing the world does not work, particularly with certain packages. For example, I still see lots of usage of old versions of OpenSSL (0.9.8) and OpenSSH.<p>Even just making sure to cover these two packages and the related dependencies/affected applications would go a long way in covering attack vectors (though obviously not comprehensively).