uv's latest release was discussed yesterday: https://news.ycombinator.com/item?id=41302475

The linked post is the Rye author's take on that.
For those interested in uv (instead of pip): uv massively sped up the release process for Home Assistant. The time needed to make a release went down from ~2.5 hours to ~20 minutes. See https://developers.home-assistant.io/blog/2024/04/03/build-images-with-uv/ for details. I'm just a HA user btw.
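For anyone curious what that swap looks like in practice, it's roughly a drop-in replacement in the build step (a minimal sketch, not HA's actual build files):

    # before: plain pip
    pip install -r requirements.txt

    # after: uv as a pip replacement (--system skips the venv, e.g. inside a container)
    pip install uv
    uv pip install --system -r requirements.txt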
I know Python packaging has its issues, but so far I personally have gotten pretty far with plain pip. The biggest shift for me was switching from the original virtualenv to the built-in venv module. On the other hand, if I wanted to be really serious about dependency management, I'd steal a page from FAANG, build a monorepo, and avoid all this hassle with package managers.
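For reference, the plain workflow I mean is just this (a sketch, assuming a requirements.txt and the stdlib venv module):

    # create and activate a virtual environment, then install dependencies
    python -m venv .venv
    source .venv/bin/activate
    pip install -r requirements.txt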
At first I was excited to see that a new tool would solve the Python "packaging" problem. But upon further reading, I realized that this was about _package management_, not so much about packaging a Python application that I've built.

Personally I haven't had many problems with package management in Python. While the ecosystem has some shortcomings (no namespaces!), pip generally works just fine for me.

What really annoys me about Python is that I cannot easily wrap my application in an executable and ship it somewhere. More often than not, I see git clones and virtualenv creation being done in production, often requiring more connectivity than needed on the target server, and dev dependencies being present on the OS. All in all, that's a horrible idea from a security viewpoint. Until that problem is fixed, I'll prefer different languages for anything that requires some sort of end user/production deployment.
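To make that concrete, the production pattern I keep seeing is roughly this (hypothetical repo URL, not a recommendation):

    # deploy-by-clone: needs git, index connectivity and build tooling on the prod box
    git clone https://example.com/org/app.git
    cd app
    python -m venv .venv
    .venv/bin/pip install -r requirements.txt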
After the whole npm VC rugpull + Microsoft acquisition, and OpenAI showing legal non-profit status is toothless marketing to VC-path-entangled leaders, I'm reluctant to cede critical path language infra to these kinds of organizations. Individual contributors to these are individually great (and often exceptional!), but financial alignment at the organizational level is corrupted out of the gate. Fast forward 1-4 years, and the organization is what matters. "Die a hero or live long enough to become the villain."

So fast lint, type checking, code scans, PR assistants, yes, we can swap these whenever. But install flow & package repo, no.

That is unfortunate given the state of pip and conda... But here we are.
There’s one problem left with those tools: authority. They’re not PyPA-endorsed, and that’s what makes them different from cargo.

At the same time, the PyPA wasn’t able to provide a comprehensive solution over the years, and Python packaging and development tools multiplied - just 3-4 years ago Poetry and Pipenv seemed to solve Python packaging problems in a way that pip+virtualenv couldn’t.

We need the PyPA to now jump on the astral.sh ship - but will they do that without a certain amount of control?
Armin advocates for 'uv' to dominate the space, but acknowledges it could be rug-pulled due to its VC backing. His solution to this potential issue is that it's "very forkable." But doesn't forking inherently lead to further fragmentation, the very problem he wants to solve?

Any tool hoping to dominate the Python packaging landscape must be community-driven and community-controlled, IMO.
I was looking this morning at migrating our software from Poetry to uv at my company, due to Poetry's slowness. So far I've been reading a lot of docs and not getting a lot of things done. I did the previous migration to Poetry as well, which was vastly simpler. So far it seems that Poetry tried to make a simple package manager that works like any other, while uv keeps quite a bit of the Python packaging insanity around.
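For anyone attempting the same move, the day-to-day commands at least map over fairly directly (a rough sketch, assuming uv 0.3+; the pyproject.toml itself also has to be rewritten from [tool.poetry] to standard [project] metadata):

    poetry install        # -> uv sync
    poetry add requests   # -> uv add requests
    poetry run pytest     # -> uv run pytest
    poetry lock           # -> uv lock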
I wouldn't blame people if they sat this round out and waited for 2026's iteration of "Python Package Managers: we've really solved it this time!"

(still a happy Nix user)
I really like this framing - lots of incremental work by lots of people over time got us to the point where ~a few people at one company can radically improve the situation with a medium amount of work.
The churn is interesting. In 2019, I made a Python version manager and dependency manager written in Rust. I gave up after it seemed like no one wanted to use it. Everyone not satisfied with pip was on Poetry or Pipenv; I made the one I did because they both had user-interface problems of the sort I would run into immediately. (I believe Poetry would default to Python 2 and not give you a choice by default, or something to that effect.) Now there is a new batch.

The biggest challenge was dealing with older packages that used non-standard packaging and ran arbitrary code; generally ones that didn't have wheels.

From the article:

> As of the most recent release, uv also gained a lot of functionality that previously required Rye such as manipulating pyproject.toml files, workspace support, local package references and script installation. It now also can manage Python installations for you so it's getting much closer.

These are all things that dead project I wrote could do.
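For the curious, the quoted features roughly correspond to these uv commands (as I understand the 0.3 release; workspaces and local path references are configured in pyproject.toml rather than on the command line):

    uv add requests          # edits pyproject.toml for you
    uv python install 3.12   # manages Python installations
    uv tool install ruff     # installs a tool into its own isolated environment
    uv sync                  # resolves and installs the project and its workspace members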
TY to the Astral team for making my quality of life so much better, and to Armin for being brave enough to pass the torch.

Strong +1 on a "one tool wins" approach - I am so tired of burning time on local dev setup, everything from managing a monorepo (many packages that import into each other) to virtual environments and PYTHONPATH (I’ve been at it for like 8 years now and I still can’t grok how to “just avoid” issues like those across all pkg managers, woof!)

I am really excited to see what’s next. Especially looking for a mypy replacement, and perhaps something that makes compiling Python feel like a “native” thing we can do.
Those people seem to have a passion for developing package managers (instead of just seeing them as tools that need to do the job), and as long as that is the case, I don't see how we wouldn't end up with one new package manager every year.
With declarative tooling - unlike setup.py - we've lost the ability to install to locations not managed by Python. Most of my projects used to have a python-code config file that I could include both in my code and in setup.py. Last I checked, none of that is possible any more. Want to install a system binary? Ship an RPM or a flatpak.

I don't understand how we could lose so much flexibility and yet gain so little in return.

P.S. I've only ever encountered minor dependency issues in my admittedly small projects using just pip and venv.
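For anyone who never used it, the pattern being described looked roughly like this (a sketch with hypothetical names; data_files with absolute paths was always a bit fragile, but it worked for system-level installs):

    # setup.py: imports the project's own config module and installs a file to a system path
    from setuptools import setup

    import myapp.config  # hypothetical module shared between the app and its packaging

    setup(
        name="myapp",
        version=myapp.config.VERSION,
        packages=["myapp"],
        data_files=[("/usr/share/myapp", ["conf/myapp.conf"])],
    )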
I'm kind of interested in this space -- can anyone point me at an article that goes over why this is harder for Python than it seems to be for, e.g., Ruby? Is there something inherent about the way code is imported in Python that makes it less tractable? Or is it just that the Python world has never quite all come together on something that works well enough?

(Note that I can certainly complain about how `bundler` works in Ruby, but these discussions in Python-land seem to go way beyond my quibbles with the Ruby ecosystem)
Just a reminder of the article that should be mandatory reading for anyone interested in Python packaging tools: https://news.ycombinator.com/item?id=40045318
Is there any chance that computer scientists will analyse the software distribution situation for several language ecosystems and finally find a general solution, so that we can stop wasting so much time on these things?

It feels like we've been driving cars for 50 years and still haven't figured out a way to distribute gas.

Is there any research going on? The situation is totally crazy, especially for Python.

I would like to see this done by top scientists. I would love to never have to spend any time again on the newest packaging tool.

What is the core of this problem, and why is it not solved?