> things get tricky when it comes to nested venvs<p>Never had such a requirement.<p>Environments, like the interpreter itself, seem like a singleton concept to me.<p>I have used a Makefile that sources different sets of Bash environment variables, kept in files under etc/ within the venv, to switch between, say, a Flask and a Gunicorn startup.
Could someone ELI7 why Python is struggling so hard with dependencies and dependency management? In the Java world this is a solved problem: it just works with Maven, which is working so damn well that it last got updated in November 2019, and the biggest fight going is whether you should use Gradle as the client instead of Maven to access the same dependency ecosystem.<p>I had to set up Python projects for some machine learning classes in college and it was a complete mess.
This was a good article and looks like an interesting project.<p>Regardless of whether or not you choose to _deploy_ with Docker, _developing_ in Docker containers using the VS Code Remote extensions really solved all of Python's (and JavaScript's) annoying packaging problems for me, with the bonus that you also get to specify any additional (non-Python) dependencies right there in the repo Dockerfile and have the development environment reproducible across machines and between developers. YMMV, of course, but I find this setup an order of magnitude less finicky than the alternatives and can't imagine going back.
Thanks for taking a look and leaving so many reactions. This project was originally built to experiment with some new PEPs on Python packaging, and it turned out to work well.<p>1. How does PEP 582 differ from virtualenv? Isn't it just another "virtualenv"?
The biggest difference, as you can see, is that __pypackages__ doesn't carry an interpreter with it. This way the packages remain available even if the base interpreter gets removed.
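For illustration, here is a rough sketch of the lookup idea behind PEP 582 (a simplified picture, not PDM's actual implementation): packages are resolved from a local __pypackages__/&lt;major&gt;.&lt;minor&gt;/lib directory next to the project, so no per-project copy of the interpreter is needed.

    # Simplified illustration of the PEP 582 idea, not PDM's actual code:
    # prefer packages from ./__pypackages__/<major>.<minor>/lib when present.
    import os
    import sys

    lib_dir = os.path.join(
        os.getcwd(),
        "__pypackages__",
        f"{sys.version_info.major}.{sys.version_info.minor}",
        "lib",
    )
    if os.path.isdir(lib_dir):
        sys.path.insert(0, lib_dir)

Because that directory only holds installed packages, it survives the removal or upgrade of the interpreter that created it.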
2. I prefer Poetry/virtualenvwrapper/Pipenv
That's fine. If you feel comfortable with the solution you are currently using, stick with it. I am not arguing that PDM is a better tool than those; each of them does a great job solving its own problems. PDM, IMO, does well in the following respects:<p>- Plugin system: people can easily extend its functionality with plugins
- Windows support: PDM ships PowerShell completion out of the box, and the other CLI features work as well on Windows as they do on *nix.
> One day a minor release of Python is out<p>It is decidedly <i>not</i> true that I want to update my venv for every minor version bump!<p>I deploy to a cloud service w/ a specific version; my package manager is slow to update; I develop collaboratively using a shared container w/ a fixed version.<p>Updating shared resources with every new release is not always realistic, which means I <i>do</i> need (or at least <i>want</i>) to use a virtualenv.
My first attempt, and I get "AttributeError: module 'toml.decoder' has no attribute 'TomlPreserveCommentDecoder'"<p>So be careful: it's far from stable.<p>I reported it (<a href="https://github.com/frostming/pdm/issues/247" rel="nofollow">https://github.com/frostming/pdm/issues/247</a>) because the project is great and I want it to succeed, but I won't put it in prod any time soon. I prefer my package manager to be stable rather than fancy.<p>It's also why I don't use Poetry and the like in training: there is always some bug that students will hit later, because there are so many config combinations they can run into and recent tools have not been as battle-tested as we think. The enterprise world is a wild beast.
I recommend virtualenv and virtualenvwrapper. I usually set up a new venv like this:<p>$ mkvirtualenv -a $(pwd) new_venv<p>$ pip install -r requirements.txt<p>Later, when you want to activate this env and cd back to the directory where you created it, you can simply do:<p>$ workon new_venv<p>That's all you need to know. It just works.
Happy to see PEP 582 gaining traction, the last attempt I saw in this space was <a href="https://github.com/cs01/pythonloc" rel="nofollow">https://github.com/cs01/pythonloc</a>.
Thanks for writing this. I've always felt that virtualenv was a bit of a placebo. I still use it from time to time, though, because others expect it. In fact I used it this week on macOS, and was annoyed to find that I can't install ipdb to my user packages and still see it in a virtualenv. It seems it's either global or local. (I think --system-site-packages defaulting to off has it backwards, but it is probably reassuring if you don't know how the module finder works.)<p>Regarding nested virtualenvs, last time I checked there was no such thing. When nested, they always linked back to the system-level Python, and the cloned Python had loads of dependencies on the global system. I'm not aware of a valid use case for nesting virtualenvs.<p>There is one handy feature of virtualenv that I'm not sure __pypackages__ solves though: console_scripts and entry_points. setuptools will automatically create a console script under /usr/local/bin or similar which hooks in your module under site-packages. virtualenv includes a ./bin directory and adds it to your path so that these console scripts still work locally. Any thoughts on this?
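For context, a minimal sketch of the console_scripts mechanism being discussed; the package and function names (mytool, mytool.cli:main) are made up for illustration:

    # setup.py sketch: setuptools generates a `mytool` launcher script in the
    # environment's bin/ (or Scripts\ on Windows) that calls mytool.cli:main.
    from setuptools import setup

    setup(
        name="mytool",
        version="0.1.0",
        packages=["mytool"],
        entry_points={
            "console_scripts": [
                "mytool = mytool.cli:main",
            ],
        },
    )

The open question in the comment is where such generated scripts would live, and how they would get onto PATH, under a __pypackages__ layout.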
Man, I love the ease of this lib. I've been running between Docker, Poetry, venvs, and Pipenv and they all just SUCK.<p>Is there a best workflow for using this inside a container?
Curious, what about environment variables? I like the ability to put my environment variables in my activate script. It seems like, if you did things the way the author does, you would need all of your env variables to be global, which could be hazardous.
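One possible workaround, sketched below under the assumption of a per-project KEY=VALUE file (the ".env" name and format are made up here, not something PDM or PEP 582 prescribes): load project-local variables at startup instead of exporting them globally or via an activate script.

    # Sketch: read a project-local ".env"-style file (hypothetical) and merge it
    # into os.environ without touching the global shell environment.
    import os

    def load_env(path=".env"):
        if not os.path.exists(path):
            return
        with open(path) as f:
            for line in f:
                line = line.strip()
                if not line or line.startswith("#") or "=" not in line:
                    continue
                key, _, value = line.partition("=")
                os.environ.setdefault(key.strip(), value.strip())

    load_env()  # call early, before the rest of the app reads os.environ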
Props to the author of this tool. Based on what I read from the blog, it's already a step above Poetry, which a lot of people rave about recently but which is still not good enough to stop using pip. The question is: are we going to be nuking the __pypackages__ folder like we nuke node_modules every now and then? Other than that, I will become an early adopter of the tool. Currently using miniconda for my virtualenvs.
This looks interesting, but personally I don't have any problems with virtualenvs. I use pipx to install "global" tools and virtualenvwrapper to make project envs. It all works fine really.
Regarding the installation of self-contained tools, this is a problem already solved by pipx, which is very simple to use. I wish such a system were included as part of the official Python installers.
In fact, this guy proposes to have Python work like npm...<p>That looks like a ridiculous idea to me, as node/npm dependency management with node_modules is completely messy and inefficient.