I recently watched a talk by the author of uv that was surprisingly fascinating [1]. He goes into a few of the more notable hacks they had to come up with to make it as fast as it is. The most interesting thing for me was that package resolution in Python, given the constraints defined (e.g. in requirements.txt), maps to a boolean satisfiability problem, which is NP-complete. So uv uses a custom SAT solver to do this. I totally under-appreciated how much goes into this software, and I'm bummed I have to use Poetry at work after having watched this talk.

[1] https://www.youtube.com/watch?v=gSKTfG1GXYQ

edit: NP-complete, not NP-hard
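To make the SAT framing concrete, here is a minimal sketch of the idea, not uv's actual solver (which is far more sophisticated): treat each (package, version) pair as a boolean choice, turn "pick one version of each required package" and "a chosen version implies an allowed version of each of its dependencies" into constraints, and search for a satisfying assignment. The toy package universe below is made up purely for illustration.

    from itertools import product

    # Toy universe (made up for illustration): each version of "app" depends
    # on a range of "lib" versions.
    versions = {"app": ["1.0", "2.0"], "lib": ["1.0", "1.5", "2.0"]}
    deps = {
        ("app", "1.0"): {"lib": ["1.0", "1.5"]},  # app 1.0 needs lib < 2.0
        ("app", "2.0"): {"lib": ["2.0"]},         # app 2.0 needs lib >= 2.0
    }
    required = ["app", "lib"]

    def satisfies(assignment):
        """Check every dependency clause against a package -> version choice."""
        for (pkg, ver), constraints in deps.items():
            if assignment.get(pkg) == ver:
                for dep_pkg, allowed in constraints.items():
                    if assignment.get(dep_pkg) not in allowed:
                        return False
        return True

    # Brute force over every possible assignment -- this is the exponential
    # search that a real resolver has to tame with SAT/CDCL-style techniques.
    solutions = [
        dict(zip(required, choice))
        for choice in product(*(versions[p] for p in required))
        if satisfies(dict(zip(required, choice)))
    ]
    print(solutions)  # e.g. [{'app': '1.0', 'lib': '1.0'}, ...]

A real resolver additionally has to prefer the newest versions that still satisfy everything, and to produce a readable explanation when nothing does.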
Kind of an aside, as this doc is about the complexities of installing particular PyTorch versions, but I will say that uv is *way* faster at installing PyTorch than pip.

We run internal benchmarks of our custom container image builder, and in the 'install torch' benchmark the p50 time saved when using `uv` is 25 seconds! (71.4s vs. 43.74s)

---

Aside 2: Seems there's a missing "involves" in this sentence: "As such, installing PyTorch typically often configuring a project to use the PyTorch index."
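For anyone who wants to reproduce a rough version of that comparison locally, here's a quick, unscientific sketch: the temp paths and the CPU-only index URL are just example choices, it assumes `python3`, `pip`, and `uv` are on PATH, and the numbers will be dominated by network speed and cache state.

    import os
    import subprocess
    import tempfile
    import time

    INDEX = "https://download.pytorch.org/whl/cpu"  # CPU-only wheels (smaller download)

    def timed(cmd):
        """Run a command and return how long it took in seconds."""
        start = time.perf_counter()
        subprocess.run(cmd, check=True)
        return time.perf_counter() - start

    with tempfile.TemporaryDirectory() as tmp:
        pip_venv = os.path.join(tmp, "venv-pip")
        uv_venv = os.path.join(tmp, "venv-uv")
        subprocess.run(["python3", "-m", "venv", pip_venv], check=True)
        subprocess.run(["python3", "-m", "venv", uv_venv], check=True)

        pip_time = timed([os.path.join(pip_venv, "bin", "pip"),
                          "install", "torch", "--index-url", INDEX])
        uv_time = timed(["uv", "pip", "install",
                         "--python", os.path.join(uv_venv, "bin", "python"),
                         "torch", "--index-url", INDEX])

        print(f"pip: {pip_time:.1f}s   uv: {uv_time:.1f}s")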
uv significantly speeds up my PyTorch-in-Docker builds:

    # Set up the virtual env (assumes ARG TORCH_VERSION is declared earlier in the Dockerfile)
    ENV VIRTUAL_ENV=/app/.venv
    ENV PATH="$VIRTUAL_ENV/bin:$PATH"
    RUN python3 -m venv $VIRTUAL_ENV
    # No separate "activate" step is needed: each RUN starts a fresh shell, and
    # putting the venv's bin directory first on PATH already selects it.

    # Install using uv
    RUN pip install uv
    RUN uv pip install torch==${TORCH_VERSION} --index-url https://download.pytorch.org/whl/cpu
The --index-url flag makes it really convenient.
So uv caused a bit of an issue for me when installing PyTorch over the weekend.

When installed with brew on my MacBook, uv currently has Python 3.13 as a dependency, which is fine. But PyTorch does not currently have a stable wheel that's compatible with Python 3.13! This resulted in very confusing errors. (The solution was to point to the nightly index.)

That's technically PyTorch's fault, but it's indicative of why a specific page on installing PyTorch is necessary, and it's good to know the documentation specifically calls it out.
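A quick way to see what's going on in cases like this is to list the wheel tags your interpreter will accept; if none of them match the wheels a project has published (as with stable torch on 3.13 at the time), the install fails with unhelpful resolution errors. A small sketch, assuming the third-party packaging module is installed:

    import sys
    from packaging.tags import sys_tags  # pip install packaging

    print("Python:", sys.version.split()[0])
    # An installer only considers wheels whose tags appear in this list,
    # e.g. cp313-cp313-macosx_11_0_arm64 on an Apple Silicon Mac running 3.13.
    for tag in list(sys_tags())[:5]:
        print(tag)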
I was trying to figure out how to set up a pyproject with uv that could support CUDA, ROCm, and other device types this morning, and the next thing I knew there was a new release adding pretty much exactly what I needed.

The pace of development on uv is really impressive.
I've read that Torch was dropping their Conda support, but won't everybody just move to Mamba, which is a drop-in replacement for Conda?

Conda (and Mamba) lets you avoid duplicating packages on disk between environments (not just the downloaded archives, but the resulting expanded files too).

How does uv compare in this regard?
In a nutshell, what do I gain from switching to uv from my current workflow, which is:
1) create a venv (`python3.xx -m venv venv`)
2) install packages from a requirements.txt into that venv?

One limitation I know of is the inability to detect stale packages.

Apart from "blazing fast", which I'm not convinced really matters to me as I rarely touch the dependencies, what are the main reasons why uv is gaining traction?