Are there any efforts akin to Deno for Python? A "burn all the packaging down and start over" path?<p>It's so thoroughly broken: every day a dev on some team gets their poetry env entangled with some system-installed Python, or numpy suddenly decides all the CI builds will now compile it from scratch on every build, or... Today it was segfaults on poetry version X on the M1 Macs, which went away in version Y, but of course version Y broke pandas for the Windows devs..
Having encountered poetry recently for the first time, it was "simply" hell. I just wanted to use a single-file python project, <a href="https://github.com/rumpelsepp/oscclip">https://github.com/rumpelsepp/oscclip</a><p>I spent about three hours trying to figure out how to set up python keyrings, just to let me get started using poetry. On a system I was ssh'ed into. gnome-keyring-daemon was up. I spent a while adding random PAM rules suggested by the ArchWiki to inject more gnome-keyring stuff into my envs. Random gnome-keyring-unlock scripts, which quickly start talking about Clevis and TPM and FIDO two-factor. Wading through hundreds of responses about Seahorse, a GUI tool unsuitable for ssh. Many long miserable sad stories.<p>In the end I stumbled upon someone who suggested just nulling out & turning off the keyring with some config to make it use a null provider. After this the poetry project just worked.<p>The tiny handful of deps this project has were already installed on my system, but poetry was also the task runner, instrumental for the usage of this single-file script.<p>There's been so many years of churn in the python world of tools. A fractal nesting doll of virtualenv, mkvirtualenv, & various offshoots. I hope some day there is a mature, reasonable option folks generally find agreeable. Poetry eventually worked for me, but what a miserable gauntlet I had to walk, and the cries of so many who'd walked the path & utterly failed echoed out at me at every step.
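For anyone who hits the same wall, the workaround was roughly this - treat it as a sketch, since the exact setting names depend on your Poetry and keyring versions:

    # Tell the keyring library to use its null backend, so Poetry stops
    # trying to talk to gnome-keyring over D-Bus on a headless ssh session
    export PYTHON_KEYRING_BACKEND=keyring.backends.null.Keyring

    # Newer Poetry versions also have a config switch for the same thing
    poetry config keyring.enabled false

    # After that, the normal workflow finally works
    poetry install
    poetry run <entry-point>   # whatever script name the project defines in pyproject.toml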
In general, don't tell people to "simply" or "just" use anything unless you're willing to provide the precise config that they need or otherwise hand-hold them through the starting phase.<p>Nothing in computing is "simply".
I've heard that venvs are very problematic, but honestly, I've never had a problem, and I use them daily. I understand they may not be enough in some cases... but those don't concern me.<p>I would recommend 'python -m venv' and that's all.
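For the record, the whole workflow I mean is just this, standard library only:

    # Create an isolated environment inside the project
    python -m venv .venv

    # Activate it (bash/zsh shown; fish and Windows have their own activate scripts)
    source .venv/bin/activate

    # Install dependencies into the venv, not the system Python
    pip install -r requirements.txt

    # Later: leave the environment
    deactivate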
Python as a language I find pretty nice. What I don't find nice is the environment and packaging story, compared to something like Rust.<p>"There should be one-- and preferably only one --obvious way to do it.", unless it is how to set up your environment.<p>I only use Python every few months, and it is always a struggle.<p>In comparison, "cargo build" works 98% of the time right after "git checkout".
100% agree. I use a programming language to get stuff done, and if the day ever comes that someone wants me to show them how I do what I do, I don't want to start that conversation with a <i>sigh</i> "well...", I want to start it with "OK cool...". All these Python "tools on top of tools" make me sad.<p>Personally I like Go. If someone wants to build my stuff, I just say: go here <a href="http://go.dev/dl" rel="nofollow">http://go.dev/dl</a> and download Go, then cd to where the code is, and enter "go build". That's it. All languages should be that easy.
If you're just using python as a local scripting language, and not pushing production code, the other option is to simply not bother with any of this.<p>When there's a new python version I'm interested in, I install it via Homebrew and update my zshrc to clobber everything else via $PATH. All my scripts and tools are broken? Just reinstall the packages globally. Whatever.<p>Since the big 3.x transition, it's pretty rare for compatibility to break (IME), and if something does, I can just try running prior python3.x binaries until I find the last version that worked.<p>It's hideous, but honestly the least stressful way I've found to date.
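Concretely, the "clobber" is a single line in ~/.zshrc; the path shown assumes Homebrew's python@ formula layout, so adjust the version to whatever you actually installed:

    # ~/.zshrc -- put the Homebrew python of the moment first on PATH
    # libexec/bin is where the python@ formulas keep unversioned python/pip symlinks
    export PATH="$(brew --prefix python@3.12)/libexec/bin:$PATH"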
Python packaging is sooo fun it has given me permanent brain damage and PTSD. Now I have a docker devbox with all my language toolchains and fun unix tools installed. I could install 3 different versions of cuda, 8 different pyenv pythons all sharing parts (but not all) of each others modules with torch compiled for a 4th different version of cuda that is NOT installed, then replace the core system python with a duck. pipx has somehow installed a version of borg backup that depends on a secret hidden 9th python. Then I will simply `docker rm` and `docker compose up -d` and I'm back. Yesterday I ran a random academic paper ML model in python in 5 minutes on my docker machine. HAHAHAHAHAÀÄÄÄÄÄ i am invincible!!!!!!
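In more sober terms, the devbox is nothing clever, just a long-running container I can nuke and recreate; something like this, where the image and mounts are obviously placeholders:

    # docker-compose.yml for a throwaway devbox
    services:
      devbox:
        image: ubuntu:22.04          # or your own Dockerfile with toolchains baked in
        command: sleep infinity      # keep the container alive so you can exec into it
        volumes:
          - ./work:/work             # code lives on the host, survives `docker rm`
        working_dir: /work

    # docker compose up -d && docker compose exec devbox bash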
I often find the reason for all this hell is, ironically, an effort to help people who don't know "how computers work" by offering "useful automations". And then these automations clash and fail because of the complexity involved.<p>If you "simply" (yes yes I know) download the python version you want directly, compile/install it into a local folder, and use that with venv to manage individual project dependencies, all problems go away.
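Spelled out, that's roughly the following; the version number is just an example, and you need the usual build deps (libssl-dev, zlib1g-dev, etc.) installed first:

    # Build a private Python into ~/opt, never touching the system one
    curl -O https://www.python.org/ftp/python/3.12.3/Python-3.12.3.tgz
    tar xzf Python-3.12.3.tgz && cd Python-3.12.3
    ./configure --prefix="$HOME/opt/python-3.12"
    make -j"$(nproc)" && make install

    # Per-project environments then come from that private interpreter
    ~/opt/python-3.12/bin/python3 -m venv ~/projects/myproj/.venv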
Python runtime deployment is a major pain point for us (CS department at a university).<p>On the most tightly managed lab machines, which are all in lockstep on a fixed configuration (latest Ubuntu LTS with an updated image pushed annually), we can provide a consistent Python setup (e.g. Python 3.10 and a fixed set of C-based modules like psycopg2). However, our staff and PhD desktops and laptops are more diverse - with the OS often only being upgraded when the distro is going out of support, they could be running n, n-1 or n-2. That, most likely, means three different Python versions.<p>We could use pyenv to let people install their own preferred version, but installing with pyenv requires building from source (slow on some of our oldest machines). That also means installing the Python build deps, which is fine for our departmental machines but not possible on the HPC cluster (operated by a different business unit) or the Physics shared servers. It's also less than ideal for the servers where students deploy their projects, where we want to minimise the packages installed (like the build-essential meta-package).<p>It's also a massive stumbling block for less experienced students with their own laptops, which could be running any distro of any age. Many CS101 or engineering/business/humanities students taking a programming class, who have never programmed before, would really struggle.<p>So classes might tend towards teaching lowest-common-denominator Python (i.e. the oldest conceivable version a student might have installed on their machine).<p>Sure, we have in-person and remote lab machines students can use - but it's not always convenient (especially for the data science / ML students running Jupyter notebooks with their own GPU).<p>There are workarounds, but they all have serious downsides.<p>Compared with Node.js and Go, where users can just download the appropriate package and unzip/untar the runtime or compiler version of their choice, deploying the Python runtime has enormous friction (especially for less experienced users). That download-and-unpack model also has the bonus of simplifying deployments elsewhere in our infrastructure (CI/CD, containers, etc.).<p>And while we all complain about node_modules, Python venvs not being trivially relocatable is another huge frustration.<p>We've used Anaconda, but that comes with its own issues (most recently, finding that it ships with its own gio/gvfs binaries and libraries which fail to mount our CIFS DFS shares - causing confusion for users running `gio mount` from within a conda environment).
Unfortunately or not, in some fields conda is the only sane choice because it can manage non-Python binary dependencies that Python packages may depend on. Some of those dependencies may be huge C libraries that are a pain to build, like HDF5, so if you're not using conda you'll be relying on your OS's package manager to serve your particular venv's needs - we all know what usually happens next.
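As an illustration, an environment spec like this pulls a prebuilt HDF5 alongside the Python bindings, with no compiler or OS packages needed; package names are from conda-forge, adjust versions to taste:

    # environment.yml
    name: hdf5-project
    channels:
      - conda-forge
    dependencies:
      - python=3.11
      - hdf5      # the C library itself, prebuilt
      - h5py      # Python bindings, linked against the conda-provided HDF5
      - numpy

    # conda env create -f environment.yml && conda activate hdf5-project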
I think I mostly wrapped my head around pyenv and used Anaconda the other day. It was quite the pain to set up, and then it seemingly mangled my fish and bash configs, causing a noticeable delay on every startup. Not something I was hoping for just for hacking around on some AI project.<p>Disclaimer: I was using Fedora, which ships Python 3.11, using Fish, which is clearly non-standard, and I'm a sysadmin, not a Python dev.
The problem is, dependency management isn't a solved problem. I was nodding my head in agreement at the part where the author mentions that pyenv compiles Python from source when it installs. If you try to figure out the reason behind this, it becomes obvious that the only one that makes any sense at all is that distributing binaries, in a secure way, is hard for open-source projects to manage. After all, bandwidth costs money, someone has to pay for the build server, someone has to pay for the data transfer bandwidth. And, do you trust that person to not be a rogue actor? It's easier to pull down the source, on the end user's machine, cross your fingers for good luck, and compile it.<p>We also haven't figured out how to incentivize maintaining backward-compatibility. 99.9% of the time when some library updates, and stops working with language version X, it's using some hot new feature of the language. Usually just because the library author thought it would be cool or more elegant. The entire software world needs to update now, because someone left-padded us for elegance.
These comments in this thread were a bit surprising to me. People really like to make things hard on themselves by not doing a little due diligence. Yes, I still say to simply use conda. It spells out exactly what is getting installed in the environment, and uses a separate python installation for each env rather than the system one. If you don't trust it, just type `which python`. I never get these headaches people seem to have, since conda is easy and well documented and supported.
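The whole sanity check is just this (the env name is a placeholder, and the printed path varies with where conda is installed):

    conda create -n myproj python=3.11   # fresh env with its own interpreter
    conda activate myproj
    which python                         # should point into .../envs/myproj/bin, not /usr/bin
    conda list                           # spells out exactly what is installed in the env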
IMO, to fix these issues the first thing to do is to write an alternative to Python's `import`, more like a function call that works similarly to Node.js' `require()`. That should start life in userland and only later become part of the language.<p>How to transition a bazillion packages, though, I do not know.
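Something along these lines already half-exists in the stdlib via importlib. A toy sketch of a path-based `require()` (a hypothetical helper, not a real API) could look like:

    import importlib.util
    import sys
    from pathlib import Path

    def require(path: str):
        """Load a module directly from a file path, roughly like Node's require()."""
        resolved = Path(path).resolve()
        name = resolved.stem
        spec = importlib.util.spec_from_file_location(name, resolved)
        module = importlib.util.module_from_spec(spec)
        sys.modules[name] = module          # optional: make it visible to later imports
        spec.loader.exec_module(module)
        return module

    # utils = require("./vendored/utils.py")
    # utils.do_thing()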
The variety of solutions sure isn't helping.<p>I'm just sticking all dev projects into a separate LXC and calling it a day. I don't want to deal with all the separation models the various languages and package managers have cooked up.
Aside: I usually use direnv to activate the venv (or poetry) when entering a directory<p><a href="https://gist.github.com/tom-pollak/8326cb9b9989e0326f0d2e19fba6aeb0" rel="nofollow">https://gist.github.com/tom-pollak/8326cb9b9989e0326f0d2e19f...</a>
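The .envrc is basically one line either way: direnv ships a `layout python` helper, or you can source the venv's activate script yourself (then run `direnv allow` once):

    # .envrc -- direnv runs this whenever you cd into the directory
    layout python3            # creates/activates a venv managed under .direnv/
    # or, for an existing venv:
    # source .venv/bin/activate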
> You should really use docker<p>>> I think you missed the point.<p>Maybe I did, but I've been using Docker as version management for pretty much every technology I employ for five or six years. Prior to that I sparsely used things like rbenv and virtualenv, and I actually thought they were super dangerous and unreliable. Maybe it's gotten better in recent years, and certainly people who write python and ruby every day are going to know more about this than I do.<p>I don't install <i>anything</i> on my computer if I can just use Docker for it. OK, I do have go:latest, but I use docker images for various projects that might be on any version of Go from 1.8 to 1.20. Your website still runs on PHP 5.3? I can help you (I won't, but I could totally run it locally!).<p>Reasons I like docker better:<p>1. Any scripts or configs can explicitly refer to the version number. No guessing or assuming.<p>2. Our whole team uses the same version.<p>3. Only one dependency: docker.<p>Granted, I'm more of a sysadmin than a developer, and I'm sure that bias applies.
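Point 1 in practice just means the interpreter version lives in the command or compose file instead of in someone's head; the images and paths here are illustrative:

    # Run a script against an exact interpreter version, nothing installed locally
    docker run --rm -v "$PWD":/app -w /app python:3.11-slim python script.py

    # Same idea for an old PHP site (pick whatever ancient tag still exists)
    docker run --rm -v "$PWD":/var/www/html -p 8080:80 php:5.6-apache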
Perhaps people should accept that these are developer tools (like git), not end-user distribution tools, and just use regular distribution packages (rpm/deb) for that.
I dealt with this years ago by only using Nix to do Python dev. Worked great for the entire ML stack. Had to do a bunch of packaging for nixpkgs early on, though.
In my personal experience, I'll take Python's packaging hell over NuGet or npm any day.<p>And it's often less about the package manager and more about the ecosystem: Can you find what you need? Is what you need stable enough? Does it break every few months with new versions of the runtime, either because of actual incompatibility (npm/node) or versioning shenanigans (NuGet)?