I don't use venv or other such tools (I use Docker for this).
But here are some points I found interesting when comparing vanilla pip to npm (the tools listed in the article fix them):<p>1. You have to freeze packages manually (instead of an automatic package-lock.json)<p>2. Each time you install/remove a package, its dependencies are not removed from the freeze output. You have to prune them manually. (interesting link: <a href="https://github.com/jazzband/pip-tools" rel="nofollow">https://github.com/jazzband/pip-tools</a>)<p>3. The freeze output is a flat list (npm can restore the tree structure)
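A minimal sketch of point 1, the manual snapshot step (the install line is illustrative and commented out, since it needs the network):

```shell
# Vanilla pip has no automatic lock file: you snapshot installed versions by hand.
# python3 -m pip install requests            # illustrative install step
python3 -m pip freeze > requirements.txt     # flat list of pinned packages, no tree
# Later, to reproduce the environment:
# python3 -m pip install -r requirements.txt
```

For point 2, pip-tools (linked above) automates the pruning: `pip-compile` regenerates the pinned file from a top-level list, so removed packages and their dependencies drop out.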
Throwing this out there for criticism... I use `python3 -m pip install -t .pip -r requirements.txt` and add .pip to my PYTHONPATH. That works for me without having to use any of the Python virtual env tooling; basically I'm trying to get something more npm-like.<p>I don't work on any significant Python code bases, so I expect it has limitations compared to the virtual env options, like developing with specific (or multiple) Python versions.
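To illustrate the mechanism behind this approach, here is a sketch using a stand-in module (`mypkg` is hypothetical) in place of a real pip-installed package:

```shell
mkdir -p .pip
# Stand-in for a package that `pip install -t .pip` would place here
printf 'greeting = "hello"\n' > .pip/mypkg.py
# Python picks up modules from the directory via PYTHONPATH, no venv needed
PYTHONPATH=.pip python3 -c 'import mypkg; print(mypkg.greeting)'   # prints "hello"
```

The trade-off is that, unlike a venv, this only redirects module lookup; console scripts and the interpreter version are still whatever the system provides.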
I love articles like this that take a tool I use often but have never had enough motivation to figure out exactly what it does.<p>I learn best how something works when I try to re-create it, but there isn't enough time to do that for everything. Explanations like this simulate that process for me.
Regarding the big list at the beginning of the article, which may seem daunting, IMO you just need venv, and I'd also add poetry. pyenv and tox are useful if you need to support multiple Python versions.<p>- pyenv is used to manage which Python versions you have installed<p>- venv comes with Python and is used to actually create virtualenvs<p>- `poetry install` will create a virtualenv for you, install your packages into it, and create a lock file (assuming your project is configured to use poetry)<p>- tox is used to run your test suite against multiple Python versions (but you don't use it to directly manage virtualenvs)
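Of the four tools listed, only venv ships with Python itself; a minimal sketch of using it on its own (the others are separate installs):

```shell
# venv is in the standard library; no third-party tools required
python3 -m venv .venv
. .venv/bin/activate
# inside the venv, sys.prefix points at the .venv directory
python -c 'import sys; print(sys.prefix.endswith(".venv"))'   # prints "True"
deactivate
```

pyenv, poetry, and tox then layer on top of this: pyenv picks the interpreter that creates the venv, poetry creates and populates one for you, and tox creates a throwaway one per test environment.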
I found it quite interesting that your "what's the point" section only has one point in it: avoiding conflicting dependencies.<p>I found it interesting because I am generally in the distro-package camp rather than the venv camp, and I don't see any other point myself. As for conflicting dependencies, I strive to solve that with bare-bones VMs running individual "services" (or containers, if security is not a concern).
I really find it a bit sad, the lengths Python devs will go to just to compensate for the entrenched core deficiencies in their platform, without actually uprooting said deficiencies once and for all.<p>What's stopping the wider community from finally adopting some sort of namespacing / switches / gemsets / per-project environments? And I mean automatic: you `cd` into the directory and it's handled for you, similar to the functionality of the `direnv` and `asdf` generic tools, and Elixir mix's / Rust cargo's / Ruby RVM's ways of isolating apps and their dependencies.<p>Why is Python lagging behind so many other ecosystems? Why is it so persistent in not fixing this? It's obvious the language isn't going anywhere and a lot of people are using it. Why not invest in making it as ergonomic as possible?<p>And don't give me the "backwards compatibility" thing now. Everyone I know who uses Python also uses several such tools on top of the vanilla experience -- so I'd argue the vanilla experience has been mostly a theoretical construct for years now and can almost safely be assumed not to exist.<p><i>(And I get sadder engaging with HN these days. If you don't think I am right, engage in an argument. This downvote-and-navigate-away practice that's been creeping from Reddit into HN isn't doing this place any favours, and over time it erodes the community aspect for me.)</i>