With the move to pyproject.toml in 2022[0], Poetry has become our go-to method.<p>With the lock file being the default, we don't worry about different installs in different envs.<p>Having come from the Rails world, where Bundler solved this problem a decade ago, I was surprised it was still such a mess in Python until so recently.<p>At the core, what makes Poetry and Bundler so predictable is (1) a lock file and (2) the ability to install different versions in different locations and reference the exact version you need to load. Each alone isn't enough.<p>npm had the same problem pip suffers from: you may have a version installed that differs from what requirements.txt, package.json, or even the lockfile says, but because it exists inside the install location, it gets loaded. It wasn't until Yarn 2 that node_modules finally got moved out of the way, so side-by-side versions stopped being awkward.<p>[EDIT]<p>If you're not using Poetry + Docker for deployment yet, I 100% recommend it as the "boring" method.<p><pre><code> RUN curl -sSL https://install.python-poetry.org | python -
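# the container is already isolated, so skip Poetry's per-project virtualenvs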
RUN /root/.local/bin/poetry config virtualenvs.create false
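# copy only the manifest and lock first so the dependency layer caches across code changes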
COPY poetry.lock pyproject.toml ./
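# expose the deploy key as a BuildKit secret so it never persists in an image layer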
RUN --mount=type=secret,id=gh_priv_key,target=/root/.ssh/id_rsa \
/root/.local/bin/poetry install --without dev --no-root
</code></pre>
[0] <a href="https://packaging.python.org/en/latest/tutorials/packaging-projects/" rel="nofollow">https://packaging.python.org/en/latest/tutorials/packaging-p...</a>
The Python core team endorses pip, but pip solves something like 80% of the problem - we need them to endorse the other 20% of the solution.<p>I've been using pip-tools for some time; I think it solves the other 20% of the problem for me in a simple way that I like. Poetry et al. seem to be trying to do too much - ymmv.<p>The iterations on packaging that never quite get it right are, I think, frustrating to a community whose core likes to advertise a "Zen of Python," one-obvious-way-to-do-it mantra, yet can never get 100% of the packaging problem sorted out cleanly, in spite of several other language communities seeming to have figured it out.
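For the unfamiliar, the core pip-tools loop is small; a minimal sketch (the file names are just the conventional ones):<p><pre><code>  # requirements.in -- direct deps only, loosely constrained
  django>=4.2
  requests

  $ pip-compile requirements.in   # resolves everything, writes fully pinned requirements.txt
  $ pip-sync requirements.txt     # makes the active env match the pins exactly
</code></pre>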
For me it feels like people have always been over-engineering dependency management, when the practical issues from not doing it are pretty much non-existent.<p>My approach is to just use Docker, no virtualenvs. I get that you might run into the multiple-interpreters issue in theory, but across multiple projects in the past 5 years I haven't seen it. Also, this might no longer be true, but avoid Alpine. If you're deploying Django there is no reason to optimize image size, and Alpine is missing a lot of things (e.g. at least a couple of years ago, wheels were not supported, leading to very slow build times).<p>I only do a single requirements.txt. Anything which makes your prod and local environments differ is a bad thing in my opinion. Fine, black might make my image a couple of MBs larger, but why would it matter? On the other hand, attempting to figure out why something which works on my machine does not work on prod is always a nightmare.<p>Setting requirements as a range in requirements.txt allows me to automatically get bugfixes without spending time on it (e.g. django>=4.2.3,<4.2.99
django-ninja>=1.0.1,<1.0.99). Again, I might have run into 1-2 issues from this over the past couple of years, and I've saved a lot of time.<p>Getting a project running locally should not take more than 1 minute (a couple of .env vars + docker-compose up -d should be enough).<p>The biggest practical issue in dependency management in Python is dependencies not pinning their own dependencies correctly.
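(Side note: pip's compatible-release operator expresses the same bugfix-only intent without the .99 trick; an equivalent sketch:)<p><pre><code>  # allow patch releases, block minor/major bumps
  django~=4.2.3        # same as >=4.2.3,<4.3
  django-ninja~=1.0.1  # same as >=1.0.1,<1.1
</code></pre>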
Here's my "boring" workflow:<p>1. Start project (mkdir, git init),<p>2. Make virtualenv using virtualenvwrapper,<p>3. Write project.toml file for setuptools,<p>4. pip install -e .<p>5. To add deps, add them to pyproject.toml and repeat step 4. <i>Do not pip install deps directly</i>. Do not pin deps to any particular version, but if you have to you can add constraints like >=5 (I need a feature introduced in v5).<p>6. If you are writing a package to be pip installed by others then you're done. Read setuptools docs for how to build etc.<p>7. If you also want to build an <i>environment</i> to run your code (e.g. docker image for deployment or serverless deployment etc) use pip-tools to pin your dependencies. (This is the only reason you need requirements.txt).<p>8. For test dependencies (e.g. pytest) or dev dependencies (e.g. test server) leverage optional dependencies in the pyproject.toml file. This plays very nicely with tools like tox, which you should use. Use pre-commit for linting etc.
I've landed on Poetry after having tried many different options over the years. Being able to specify my relatively open dependencies (e.g. "django==4.0.*") while having the exact version of every subdependency locked has proved to be very reliable and reproducible. Docker multi-stage builds allow me to ship a production container without Poetry installed.
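A rough sketch of that multi-stage shape (not their exact file; assumes the official python image and the poetry-plugin-export plugin):<p><pre><code>  # build stage: Poetry exists only here
  FROM python:3.12-slim AS builder
  RUN pip install poetry poetry-plugin-export
  COPY pyproject.toml poetry.lock ./
  RUN poetry export -f requirements.txt -o requirements.txt
  RUN pip install --prefix=/install -r requirements.txt

  # runtime stage: just the installed packages, no Poetry
  FROM python:3.12-slim
  COPY --from=builder /install /usr/local
  COPY . /app
  WORKDIR /app
</code></pre>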
I’m happy others are writing on this subject! I appreciate your enthusiasm for trying to do the most “basic” things in Python. While I personally enjoy a bit more management with tools like Poetry, I believe all Python programmers should know how pip and setuptools work before trying their supersets.<p>To add to this discussion, I recently wrote a less wordy guide on macOS Python setup: <a href="https://steins.studio/technical/01-python-setup" rel="nofollow">https://steins.studio/technical/01-python-setup</a>
I'm personally afraid that the python community has bifurcated so far that a single 'correct' solution is doomed to fail. The needs of people writing and deploying HTTP/REST servers are just so different from those of people writing PyTorch models and numerical simulations that no tool will ever satisfy both camps. The worst part is that many people developing these tools don't seem to realise this and blindly claim that their tool is The Tool! without having any deep insight into the needs of the other camp.
The biggest problem with Python's dependency management in 2024 is that it still feels like an afterthought. It is just not straightforward and it is not <i>pythonic</i>. I would go as far as saying that dependency management in Python is likely more complicated than anything you would normally encounter within the language itself.<p>As of 2024, Poetry is the best solution we have, but even it reaches its limits at times. I work in a position where I develop with Poetry and have to deploy without it (using venv), and I do not wish the journey of learning how to do that on anybody.
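For others stuck on the same split, the shortest path I know of looks roughly like this (depending on your Poetry version you may need to install poetry-plugin-export separately):<p><pre><code>  # dev machine: flatten the lockfile into pip's format
  poetry export -f requirements.txt -o requirements.txt

  # deploy target: plain venv + pip, no Poetry anywhere
  python -m venv /opt/app/venv
  /opt/app/venv/bin/pip install -r requirements.txt
</code></pre>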
Has anyone figured out how to “cross-compile” Python?<p>By this I mean creating an app bundle that contains the dependencies but for another platform than the one we are bundling on.
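For what it's worth, the closest I've seen is pip download's target-platform flags; it only works when every dependency ships a matching wheel. A sketch:<p><pre><code>  # fetch Linux x86-64 / CPython 3.12 wheels from, say, a macOS host
  pip download -r requirements.txt -d ./bundle \
      --platform manylinux2014_x86_64 \
      --python-version 3.12 \
      --implementation cp \
      --only-binary=:all:
</code></pre>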
The biggest problem in Python dep management is pip itself. When upgrading pip breaks tools that wrap or integrate with it, it's a bad time.<p>I've had to pin pip itself a few times because a resolution that used to work started failing, and sometimes there are breaking API changes at the module level.<p>Oh, and also because setup.py files in packages are somehow tied to pip's APIs.<p>It's a weak foundation to build on.
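Pinning the tooling itself is cheap insurance (the version numbers here are only illustrative):<p><pre><code>  # treat pip like any other dependency
  python -m pip install "pip==24.0" "setuptools==69.5.*"
</code></pre>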
Coming from Java/Maven, I was amazed to see the obscure mess in the Python world regarding dependency management. After trying all the available tools I finally settled on pdm. I also found it more intuitive than Poetry.
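For anyone who hasn't tried it, the basic pdm loop is pleasantly small:<p><pre><code>  pdm init          # scaffolds pyproject.toml interactively
  pdm add requests  # records the dep and updates pdm.lock
  pdm install       # reproduces the environment from the lockfile
</code></pre>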
The author mentions using pip freeze or pip-tools' pip-compile as a solution for the indirect dependencies, which depend on the Python environment, i.e. the platform and Python version.<p>But from what I understand, for cross-environment use the requirements.txt file needs to be generated on the environment it is going to be run on. Copying in a requirements file that was compiled locally might not resolve correctly when installing the packages in the container.
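One workaround (a sketch, assuming you deploy on a stock python image): run the compile step inside the same base image, so the pins are resolved for the target environment:<p><pre><code>  docker run --rm -v "$PWD":/app -w /app python:3.12-slim \
      sh -c "pip install pip-tools && pip-compile requirements.in"
</code></pre>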