I'm surprised at the number of people here complaining about venvs in Python. There are lots of warts when it comes to package management in Python, but the built-in venv support has been rock solid in Python 3 for a long time now.<p>Ironically, most of the complaints here are from people using a bunch of tooling in lieu of vanilla Python venvs, and then hitting issues associated with those tools.<p>We've been using vanilla Python venvs across our company and in all our CI/CD pipelines for many years now, and have had zero issues on the venv side of things. And this is while using libraries like numpy, scipy, torch/torchvision, etc.
I personally hate Conda with a fiery passion - it does so much weird magic and ends up breaking things in non-obvious ways. Python works best when you keep it really simple. Just a python -m venv per project and a requirements.txt, and you will basically never have issues.
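To make that concrete, the whole workflow fits in a few commands (a minimal sketch, not specific to any one setup):<p><pre><code> python -m venv .venv              # one venv per project
. .venv/bin/activate              # activate it in the current shell
pip install -r requirements.txt   # install the pinned dependencies</code></pre>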
I would highly recommend Poetry for Python package management. It basically wraps around pip and venvs, offering a lot of convenience features (managing packages, building distributions, etc.). It also works pretty nicely with Tox.<p>I would recommend using the virtualenvs.in-project setting so Poetry creates the venv in the project folder and not in some temporary user folder.
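For reference, the setting is one command away, and the day-to-day flow is similarly short (a sketch; the package name is just an example):<p><pre><code> poetry config virtualenvs.in-project true   # create .venv inside the project
poetry add requests                          # add a dependency
poetry install                               # create the venv and install everything
poetry build                                 # build sdist/wheel distributions</code></pre>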
Answer: they don’t<p>(Seriously, I’ve gotten so fed up with Python package management that I just use CondaPkg.jl, which uses Julia’s package manager to take care of Python packages. It is just so much cleaner and easier to use than anything in Python.)
My personal approach is:<p>- use miniconda ONLY to create a folder structure to store packages and to specify a version of Python (3.10, for example)<p>- use jazzband/pip-tools' "pip-compile" to create a frozen/pinned manifest of all my dependencies<p>- use pip install to actually install libraries (keeping things stock standard here)<p>- wrap all the above in a Makefile so I am spared remembering all the esoteric commands I need to pull this together<p>in practice, this means once I have a project together I am:<p>- activating a conda environment<p>- occasionally running 'make update' to invoke pip-compile (adding new libraries or upgrading), and<p>- otherwise running 'make install' to install a known working dependency list.
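A minimal sketch of what that Makefile might look like (target names mirror the ones above; the file names are assumptions):<p><pre><code> # assumes the conda environment is already activated
update:
	pip-compile -U -o requirements.txt requirements.in

install:
	pip install -r requirements.txt</code></pre>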
All other languages: use whatever packages you like. You’ll be fine.<p>Python: we’re going to force all packages from all projects and repos to be installed in a shared global environment, but since nobody <i>actually</i> wants that we will allow you to circumvent it by creating “virtual” environments you have to maintain and deal with instead. Also remember to activate it before starting your editor or else lulz. And don’t use the same editor instance for multiple projects. Are you crazy???<p>Also: Python “just works”, unlike all those other silly languages.<p>Somebody IMO needs to get off their high horse. I can’t believe Python users are defending this nonsense for real. This must be a severe case of Stockholm syndrome.
It feels like this is one of the reasons experienced devs are ditching Python for production systems, besides horrendous performance and lousy semantics. The cost of setting up and maintaining the environment, and of onboarding people, is just not worth it.
These days I'm just throwing each project into a fresh LXC on a server.<p>All these languages have their own approach, and each then also has user vs. global installs and multiple versions... it's just not worth figuring out.
Virtual environments are easy to create and manage. Create one with the built-in <i>venv</i> module:<p><pre><code> python3.10 -m venv ./venv # or your favorite version
. ./venv/bin/activate
pip install pip-tools
</code></pre>
Manage dependencies using <i>pip-compile</i> from <i>pip-tools</i>. Store direct dependencies in "requirements.in", and "freeze" all dependencies in "requirements.txt" for deployment:<p><pre><code> . ./venv/bin/activate
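# -U upgrades all pins to their latest allowed versions; -o sets the output file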
pip-compile -U -o ./requirements.txt ./requirements.in
pip install -r ./requirements.txt</code></pre>
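The "requirements.in" file holds only your direct dependencies, loosely pinned; the generated "requirements.txt" pins everything, transitives included. For example (package names are illustrative):<p><pre><code> # requirements.in -- direct dependencies only
requests
numpy>=1.24</code></pre>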
> One point I would like to make is how virtual environments are designed to be disposable and not relocatable.<p>Is the author saying that relocating them will actually break things, or that it's just as easy to recreate them in a different location? Because I've moved my venv directories and everything still seemed to work OK. Did I just get lucky?
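(For what it's worth, the console scripts a venv installs hardcode the venv's absolute path in their shebangs, which is one reason moving a venv <i>can</i> break things even when plain imports keep working. Easy to check; the path below is illustrative:)<p><pre><code> $ head -1 ./venv/bin/pip
#!/home/user/project/venv/bin/python3</code></pre>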
How much of this is caused by a combination of "odd" decisions about what gets installed by Python 3 developers, "odd" decisions about what a "package" is by package makers, and what I think I want to call "fanaticism" by Debian apt around things?<p>FreeBSD ports are significantly closer to "what the repo has, localized", where it feels like Linux apt/yum/flat is "what we think is the most convenient thing to bodge up from the base repo, but with our special sauce because <reasons>"
That's insightful.<p>It seems that a virtual environment created by Poetry looks very similar, except that it doesn't contain an `include` directory.
It contains:<p>* `bin` directory<p>* `lib/<python-version>/site-packages/` directory<p>* `pyvenv.cfg`
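For comparison, a vanilla venv on Unix typically looks like this (some platforms add a lib64 symlink):<p><pre><code> $ python3 -m venv .venv && ls .venv
bin  include  lib  pyvenv.cfg</code></pre>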
I'm beginning to feel like every single comment in every thread related to python package management is just this:<p>"Package management in python is so easy, just use [insert tool or workflow that's different to literally every other comment in the thread]."
Been really enjoying trying out pdm in PEP 582 mode. I've found it just behaves when used across multiple devs who aren't necessarily that used to working with Python.
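For anyone curious, the flow is roughly this (a sketch; with PEP 582 mode enabled, packages land in a local __pypackages__ directory instead of a venv):<p><pre><code> pdm init          # set up pyproject.toml interactively
pdm add requests  # installs into __pypackages__/<python-version>/lib</code></pre>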
The "global" vs. "directory" dichotomy seems... off. Haven't PYTHONHOME and PYTHONPATH been supported since approximately forever?
This writeup needs work.<p>> So while you could install everything into the same directory as your own code (which you did, and thus didn't use src directory layouts for simplicity), there wasn't a way to install different wheels for each Python interpreter you had on your machine so you could have multiple environments per project (I'm glossing over the fact that back in my the day you also didn't have wheels or editable installs).<p>This is a single run-on sentence. Someone reading this probably doesn't know what "wheels" means. If you are going to discount it anyway, why bring it up?<p>> Enter virtual environments. Suddenly you had a way to install projects as a group that was tied to a specific Python interpreter<p>I thought we were talking about dependencies? So is it just the interpreter, or both, or is there a typo?<p>> conda environments<p>I have no idea what those are. Do I care? Since the author is making a subtle distinction, reading about them might get me confused, so I've encountered another thing to skip over.<p>> As a running example, I'm going to assume you ran the command py -m venv --without-pip .venv in some directory on a Unix-based OS (you can substitute py with whatever Python interpreter you want<p>Wat? I don't know what venvs are. Can you maybe expand without throwing multi-arg commands at me? Maybe add this as a reference note rather than inlining it into the information. Another thing to skip over.<p>> For simplicity I'm going to focus on the Unix case and not cover Windows in depth.<p>Don't cover Windows at all. Make a promise to maintain a separate doc in the future and get this one right first.<p>> (i.e. within .venv):<p>This is where you start. A virtual environment is a directory with a purpose, which is baked into the ecosystem. Lay out the purpose. Map the structure to those purposes. Dive into exceptional cases. Talk about how to create it and use it in a project. Talk about integrations and how these help speed up development.<p>I also skipped the plug for the microvenv project at the end, with a reference to VSCode.