While `uv` works amazingly well, I think a lot of people don't realize that installing packages through conda (or, let's say, the conda-forge ecosystem) has technical advantages over wheels/PyPI.<p>When you install the numpy wheel through `uv`, you are likely installing a pre-compiled binary that bundles openblas inside it. When you install numpy through conda-forge, it dynamically links against a dummy blas package that can be substituted for mkl, openblas, accelerate, whatever you prefer on your system. Being able to rely on a separate package is a much better solution than having to bundle every dependency.<p>Then let's say you install scipy. Scipy also has to bundle openblas in its wheel, and now you have two copies of openblas sitting around. They don't conflict, but it quickly becomes an odd thing to have to do.
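<p>To make that concrete: on conda-forge the backend is chosen via the build string of the libblas metapackage, so swapping it is a single install command (this is the documented conda-forge mechanism, though the selector syntax may evolve):<p><pre><code> conda install "libblas=*=*mkl"         # relink the whole env against MKL
 conda install "libblas=*=*accelerate"  # or Apple's Accelerate on macOS
</code></pre>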
The problem conda solved that nothing had solved before was installing binary dependencies on MS Windows.<p>Before conda, getting a usable scipy install up and running on MS Windows was a harrowing experience. And having two independent installations was basically impossible.
The real hard work that went into conda was reverse engineering all the nooks and crannies of the DLL loading heuristics, to allow it to ensure that you loaded what you intended.<p>If you are working on macOS and deploying to some *nix in the cloud, you are unlikely to find any value in this. But in ten years as lead on a large tool that was deployed to personal (Windows) laptops in a corporate environment, I did not find anything that beat conda.
As someone with admittedly no formal CS education, I've been using conda for all of my grad school and never managed to break it.<p>I create a virtual environment for every project. I install almost all packages with pip, except for binaries or CUDA-related things, which come from conda. I always export the conda YAML file, and it has let me reproduce the code/environment, including the Python version. I've seen a lot of posts over time praising poetry and other tools and complaining about conda, but I could never relate to any of them.<p>Am I doing something wrong? Or something right?
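<p>(For the curious, the workflow is roughly this; the env and package names are just examples:)<p><pre><code> conda create -n thesis python=3.11          # one env per project
 conda activate thesis
 conda install -c conda-forge cudatoolkit    # binaries/CUDA from conda
 pip install -r requirements.txt             # everything else via pip
 conda env export > environment.yml          # snapshot incl. Python version
 conda env create -f environment.yml         # recreate it elsewhere
</code></pre>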
Can somebody please ELI5 why it is so unanimously accepted that Python's package management is terrible? For personal projects, venv + requirements.txt has never caused problems for me. For work projects we use poetry, on the assumption that we would need something better, but I remain unconvinced (nothing was actually going wrong when that decision was made).
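<p>For reference, the entire workflow I'm talking about:<p><pre><code> python -m venv .venv              # per-project environment
 source .venv/bin/activate         # Windows: .venv\Scripts\activate
 pip install -r requirements.txt
 pip freeze > requirements.txt     # pin the currently installed set
</code></pre>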
Really the issue is Python itself: it shouldn't treat its installs and packages as something linked and intertwined with the base operating system.<p>People like to complain about node packages, but I've never seen people have the trouble with them that they have with Python.
I hope you read it while it was available, because the domain has expired<p><pre><code> Domain Name: pyherald.com
Registry Domain ID: 2663190918_DOMAIN_COM-VRSN
Registrar WHOIS Server: whois.namesilo.com
Registrar URL: https://www.namesilo.com/
Updated Date: 2024-12-21T07:00:00Z
Creation Date: 2021-12-21T07:00:00Z
Registrar Registration Expiration Date: 2024-12-21T07:00:00Z
</code></pre>
<a href="https://web.archive.org/web/20241220211119/https://pyherald.com/articles/16_12_2024/" rel="nofollow">https://web.archive.org/web/20241220211119/https://pyherald....</a> is the most recent snapshot.
Conda used to be a lifesaver years and years ago, when compiled extensions were hard to install because you had to compile them yourself.<p>Nowadays, thanks to wheels being numerous and robust, the appeal of anaconda is disappearing for most users, except for some exotic mixes.<p>conda itself now causes more trouble than it solves: it's slow, and it lives in its own incompatible world.<p>But anaconda solves a different problem now that nobody else solves, and that's managing Python for big corporations. This is worth a lot of money to big structures that need to control package origins, permissions, updates, and so on, at scale.<p>So it thrives there.
Nothing in the "article" seems to support the title. A lot of it is just about Python packaging in general, or about problems when mixing conda- and pip-installed packages.<p>In my experience conda is enormously superior to the standard Python packaging tools.
I think Pixi mostly solves the main issues of conda by forcing users to have project-specific environments. It also solves environments incredibly fast, so it’s really quick to create new projects/environments. <a href="https://pixi.sh/" rel="nofollow">https://pixi.sh/</a>
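<p>For anyone who hasn't tried it, a session looks roughly like this (project and package names are placeholders):<p><pre><code> pixi init myproject          # writes a per-project manifest
 cd myproject
 pixi add python=3.12 numpy   # solves and records the environment
 pixi run python main.py     # executes inside that project's env
</code></pre>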
Conda is the only package manager I've used on Ubuntu that intermittently and inexplicably gets stuck when installing or uninstalling. It will sometimes resolve itself if left alone for hours, but often won't.<p>I avoid it as much as possible.
I feel like a major selling point of Nix is "solving the Python dependency-hell problem" (as well as that of pretty much every other stack).<p>I've seen so many issues with different Python venvs from different project directories stepping on each other's dependencies somehow (probably because some are global) that it's a win that I can now just stick a basic, barely-modified-per-project flake.nix in each one and be guaranteed to have the <i>entirety</i> of the same dependencies available when I run it 6 months later.
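<p>The per-project file I mean is a small flake along these lines; the nixpkgs pin, platform, and package list here are illustrative, not a canonical template:<p><pre><code> {
   description = "Pinned Python dev shell";
   inputs.nixpkgs.url = "github:NixOS/nixpkgs/nixos-24.05";
   outputs = { self, nixpkgs }:
     let pkgs = nixpkgs.legacyPackages.x86_64-linux; in {
       # `nix develop` enters a shell whose Python and libraries come
       # from the revision pinned in flake.lock, even 6 months later.
       devShells.x86_64-linux.default = pkgs.mkShell {
         packages = [ (pkgs.python3.withPackages (ps: [ ps.numpy ps.pandas ])) ];
       };
     };
 }
</code></pre>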
This seems to be an aggregation of some posts on python-list. Basically, extra-random opinions.<p>I'll offer mine: I won't say that Python packaging is generally excellent, but it's gotten much better over the years. pyproject.toml is a godsend, the venv module is built into Python, and pip will by default no longer install packages outside of a venv. Dependency groups are being added, meaning that requirements.txt files can also be specified in pyproject.toml. Documentation is pretty good, especially if you avoid blog posts from 5+ years ago.
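<p>E.g., with PEP 735 dependency groups, requirements.txt plus requirements-dev.txt collapses into one file; recent pip versions can install a group with `pip install --group dev`. A sketch (project and package names are placeholders):<p><pre><code> [project]
 name = "myapp"
 version = "0.1.0"
 dependencies = ["requests>=2.31"]   # runtime deps, was requirements.txt

 [dependency-groups]                 # PEP 735
 dev = ["pytest", "ruff"]            # was requirements-dev.txt
</code></pre>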
I tried Conda a number of times over the years and regretted it every time.<p>These days, when I absolutely <i>have</i> to use it because some obscure piece of software can't run without Conda, I install it in a VM so that:<p><pre><code> - I protect my working system from the damage of installing Conda on it
- I can throw the whole garbage fire away without long-term brain damage to my system once I'm done</code></pre>
All thoughts and opinions about conda aside, it’s the only sane way (on several platforms) to install the gdal binaries + the gdal Python bindings.<p>I don’t mind conda. It has a lot of caveats and weird quirks.
People here focus on Python, but to me, a bioinformatician, conda is much more: it provides 99.99% of the tools I need, like bwa, samtools, rsem, salmon, fastqc, R, and many, many obscure tools.
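<p>For anyone outside the field: these all come from the bioconda channel, e.g.:<p><pre><code> # bioconda packages are built against conda-forge, so list both channels
 conda install -c conda-forge -c bioconda bwa samtools salmon fastqc
</code></pre>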
Besides the horrendous formatting, some of the claims in this article seem incorrect or irrelevant. Like, is this even possible?<p>> A single Anaconda distribution may have multiple NumPy versions installed at the same time, although only one will be available to the Python process (note that this means that sub-processes created in this Python process won’t necessarily have the same version of NumPy!).<p>I’m pretty sure it isn’t, though maybe there is some insane way to get subprocesses to do this. Besides that, under the author’s definition, different Python virtualenvs also install multiple copies of libraries in the same way conda does.<p>The comments about Jupyter also seem very confused; it’s hard to make heads or tails of exactly what the author is saying. There might be some misunderstanding of how Jupyter kernels select environments.<p>> Final warning: no matter how ridiculous this is: the current directory in Python is added to the module lookup path, and it precedes every other lookup location. If, accidentally, you placed a numpy.py in the current directory of your Python process – that is going to be the numpy module you import.<p>This has nothing to do with conda.
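<p>That last point is easy to reproduce, though: it's plain Python path behavior, not a conda one. A minimal demo, run in an empty directory:<p><pre><code> echo 'raise ImportError("this is ./numpy.py, not the real numpy")' > numpy.py
 python -c "import numpy"   # imports the shadowing file from the cwd
</code></pre>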
I think Python had a pretty good idea in standardizing a packaging protocol and then allowing competing implementations, but I would have preferred a single "blessed" solution. More than one package management option in an ecosystem always adds some kind of "can't get there from here" friction and an additional maintenance burden on package maintainers.<p>poetry has been working well enough for me as of late, but it'd be nice if I didn't have to pick.
I honestly have no idea why anyone still uses Conda; it's a right pain in the ass. Python package management in general is a nightmare, but whenever I run into a project that uses Conda, I immediately disregard it and use uv / pyenv instead.
Conda: a package-manager disaster that started requiring a paid license for companies with over 200 employees. It worked 5 years ago; we can no longer legally use it.
The Five Demons of Python Packaging That Fuel Our Persistent Nightmare:
<a href="https://youtu.be/qA7NVwmx3gw?si=QbchrYvCEp8aazvL" rel="nofollow">https://youtu.be/qA7NVwmx3gw?si=QbchrYvCEp8aazvL</a>
I strongly suspect that there is about to be a spike in Python packaging discussion over and above the high ambient baseline.<p>uv is here to kick ass and chew bubblegum. And it’s all out of gum.
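<p>And to be fair, the gum-chewing is earned: a whole project lifecycle is about three commands (project name is arbitrary):<p><pre><code> uv init demo && cd demo   # scaffold a pyproject-based project
 uv add requests           # resolve, lock, and install into .venv
 uv run python main.py     # run inside the project environment
</code></pre>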
conda was for scientific Python, but it had to solve everything below Python to make that work. There was no generic binary packaging solution for the layer below Python that covered multiple architectures and operating systems.
> The traditional setup.py install command may install multiple versions of the same package into the same directory<p>Wait, what? In what situation would that ever happen? Especially given that package directories are not versioned, setuptools should never produce two different versions in any way.
It's rare to see something as systematically broken as the Python package/dependency ecosystem.<p>What I don't understand is what makes this so difficult to solve in Python. Many other platforms solved it a long time ago: Maven 2.0 was released almost 20 years ago, and while it wasn't and isn't by any means perfect, its fundamentals were decent even back then.<p>One thing that I think messed this up from the beginning was applying the Unix philosophy, with many individual tools, as opposed to building one cohesive system: requirements.txt, setuptools, pip, pipx, pipenv, venv... were always woefully inadequate individually, but produced a myriad of possible combinations to support. Simplicity seems to have been the main motivation for such a design, but these certainly seem like examples of being too simplistic for the job.<p>I recently tried to run a Python app (after a couple of years' break from Python) which used conda, and I got lost quickly. The project README described using conda, mamba, anaconda, conda-forge, miniforge, miniconda... In the end, nothing I tried worked.