Today I spent at least an hour fighting with Python packaging. The more I think about it, the more I feel that self-contained static binaries are the way to go. Trying to load source files from all over the filesystem at runtime is hell. Or at least it's hell to debug when it goes wrong.<p>I would love to see a move towards "static" binaries that package everything together into a single, self-contained unit.
One thing I found when converting our Python application packaging from RPM to wheels is that wheels don't properly handle the data_files parameter in the setup() call: files are placed under the Python library directory instead of at the absolute paths specified. This means that sample configuration files and init scripts end up in the wrong place on the filesystem. To get around this, we had to upload the source distribution to our devpi instance and run pip install with the --no-binary option, which then places those files in the correct directories.<p>The other issue is that there's no equivalent of the %config(noreplace) RPM spec directive to prevent a config file from being overwritten if it already exists on the filesystem.<p>So, for libraries, wheels are a good cross-platform packaging solution, but not so much for applications that require configuration files and init scripts.
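Roughly what our setup.py looked like — names and paths here are made up for illustration, but the shape is the same:

```python
# Sketch of a setup.py with absolute data_files paths (package name and paths
# are hypothetical). Installed from the sdist/RPM, the files land at the
# absolute paths given; installed from a wheel, they get relocated under the
# Python library directory instead.
from setuptools import setup

setup(
    name="ourapp",
    version="1.0",
    packages=["ourapp"],
    data_files=[
        ("/etc/ourapp", ["conf/ourapp.conf.sample"]),
        ("/etc/init.d", ["init/ourapp"]),
    ],
)

# Workaround: push the sdist to devpi and force pip to build from source, e.g.
#   pip install --no-binary ourapp ourapp
```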
I don’t know a lot about Python tooling, but in general my experiences with pip have been pleasant, so I appreciate all the work the maintainers have put in to keep it that way.
At one point in time I created a Python package to highlight this benefit of wheels: "Avoids arbitrary code execution for installation. (Avoids setup.py)" - <a href="https://github.com/mschwager/0wned" rel="nofollow">https://github.com/mschwager/0wned</a><p>Of course Python imports can have side-effects, so you can achieve the same results with 'import malicious_package', but the installation avenue was surprising to me at the time so I created a simple demo. Also consider that 'import malicious_package' is typically not run as root whereas 'pip install' is often run with 'sudo'.
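For illustration, a deliberately harmless sketch of the kind of thing a setup.py can do at install time (the marker file path is just an example):

```python
# setup.py is plain Python, so anything at module level runs when pip builds
# or installs the sdist. If pip was invoked with sudo, this runs as root.
import getpass
from setuptools import setup

# Arbitrary code executed at install time, before setup() is even called.
with open("/tmp/installed-by.txt", "w") as f:
    f.write(f"setup.py ran as: {getpass.getuser()}\n")

setup(name="not_actually_malicious", version="0.1")
```

Installing from a wheel skips this step entirely, because a wheel is just unpacked into place.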
So far it's just a bit of smoke on the horizon, but I'm noticing some packages abandoning 'pip' installs entirely in favor of 'conda'. It's a bit early to tell if this trend will take off, but it does seem plausible.
For anyone building Python wheels that need to compile C/C++/Fortran, this may be useful: <a href="https://scikit-build.org" rel="nofollow">https://scikit-build.org</a><p>(Disclaimer: I am one of the maintainers.)
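A minimal sketch of what using it looks like — the project name and layout here are just placeholders, and it assumes a CMakeLists.txt next to setup.py that builds the extension module:

```python
# setup.py — minimal scikit-build sketch; scikit-build drives CMake to build
# the compiled parts and packages the result into a wheel.
from skbuild import setup  # drop-in replacement for setuptools.setup

setup(
    name="hello",
    version="0.1.0",
    packages=["hello"],
    cmake_install_dir="hello",  # CMake install() targets end up inside the package
)
```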
That doesn't seem like particularly fast adoption. I remember seeing the first wheels while working at a job I quit in early 2010, so it must be over 10 years.<p>Edit: A web search points to 2012, so maybe it's "only" 8 years?<p>Edit 2: Pip came out in 2008, so something did change a bit before 2010, as I remembered. But what did it install, if not wheels?
This seems like a really effective way to set up a page for shaming "laggards". It would be interesting to track over time how many GitHub issues are just links to this page.
Since pip is used to install wheels, it would probably be best to have a new, separate third-party meta tool to install package managers themselves, to avoid the confusion. Preferably this should have its own additional PEP and integrate PyPI and PyPy along with other packages that could make life simpler for the (hopefully now happier) end user.
I have always wondered why PyPI doesn't generate .whl files for pure-Python sdists,<p>and why companies like Travis / GitHub aren't more active in language-level packaging work.<p>GitHub gives away so much free Docker time -- faster installation would save them money directly.
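For a pure-Python package the wheel build is basically a one-liner locally, which is what makes me wonder. A rough sketch (the sdist filename is hypothetical):

```python
# Turn an sdist into a wheel with pip itself; this is all a build service
# would have to run for a pure-Python package.
import subprocess
import sys

subprocess.run(
    [sys.executable, "-m", "pip", "wheel", "--no-deps",
     "example_pkg-1.0.tar.gz", "-w", "dist/"],
    check=True,
)
```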
Could someone give a beginner's tl;dr of wheels vs. eggs?<p>I use Python extensively but the "Advantages of wheels" section on this site is way over my head.<p>edit: Thanks everyone :)