Does anyone else think this reflects badly on Python? The fact that the author has to use a bunch of different tools to manage Python versions/projects is intimidating.<p>I don't say this out of negativity for the sake of negativity. Earlier today, I was trying to resurrect an old Python project that was using pipenv.<p>"pipenv install" gave me an error about accepting 1 argument when 3 were provided.<p>Then I switched to Poetry. Poetry kept detecting my Python 2 installation, but not my Python 3 installation. It seemed like I had to use pyenv, which I didn't want to do, since that's yet another tool to install and set up on different machines.<p>I gave up and started rewriting the project (a web scraper) in Node.js with Puppeteer.
I have never understood the need for all the different tools surrounding Python packaging, development environments, or things like Pipenv. For years, I have used Virtualenv and a script to create a virtual environment in my project folder. It's as simple as a node_modules folder; the confusion around it is puzzling to me.<p>Nowadays, using setuptools to create packages is really easy too; there's a great tutorial on the main Python site. It's not as easy as Node.js, sure, but there are tools like Cookiecutter to remove the boilerplate from new packages.<p>requirements.txt files aren't very elegant, but they work well enough.<p>And with Docker, all this is even easier. The Python + Docker story is really nice.<p>Honestly, I just love these basic tools and how they let me do my job without worrying about whether they're the latest and greatest. My Python setup has been stable for years and I am so productive with it.
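For what it's worth, the wrapper script can be as small as this sketch (the `.venv` name and the use of the built-in venv module rather than the Virtualenv package are my assumptions, not the parent's exact setup):

```shell
# Create the project-local environment on first use, then activate it.
# Each project gets its own package folder, much like node_modules.
if [ ! -d .venv ]; then
    python3 -m venv .venv
fi
. .venv/bin/activate
# From here on, pip installs into .venv, not the system Python.
pip install --upgrade pip >/dev/null
```

Deleting `.venv` and re-running the script rebuilds the environment from scratch.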
Eeeehhh I think I will be downvoted to hell and back for this but after I read the article I had the feeling of "why are you making this feel more complex than it needs to be?"<p>I mean compared to Java and C# I have a MUCH EASIER time setting up my development environment. Installing Python, if I am on a Windows box I mean, is enough to satisfy a lot of the requirements. I then clone the project's repo and<p>python -m venv venv<p>source venv/bin/activate<p>pip install -r requirements.txt<p>is enough to get me started coding.
If you're going to be using pyenv + poetry you should be aware of #571 that causes issues with activating the virtualenv<p><a href="https://github.com/sdispater/poetry/issues/571" rel="nofollow">https://github.com/sdispater/poetry/issues/571</a><p>the OP himself has a fix for this in his own dotfiles repo:<p><a href="https://github.com/jacobian/dotfiles/commit/e7889c5954daacfe0988fc05ff9e8e87eb1241b7" rel="nofollow">https://github.com/jacobian/dotfiles/commit/e7889c5954daacfe...</a>
> Although Docker meets all these requirements, I don't really like using it. I find it slow, frustrating, and overkill for my purposes.<p>How so? I've been using Docker for development for years now and haven't experienced this, except for some slowness with Docker Compose after upgrading to macOS Catalina (which turned out to be a bug with PyInstaller, not Docker or Docker Compose). This is on a Mac, btw; I hear that Docker on Linux is blazing fast.<p>I personally would absolutely leverage Docker for the setup being described here: multiple versions with lots of environmental differences between them. That's what Docker was made for!
The Dockerfile that's provided looks like it would be very slow to build. I always try to write Dockerfiles that install dependencies first and then add my Python package (usually just copying in the code and setting PYTHONPATH) to take full advantage of the Docker build cache. When you have lots of services, it really reduces the time it takes to iterate with `docker-compose up -d --build`-like setups.
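For illustration, a cache-friendly layout might look like this sketch (the file names, base image, and `myservice` module are my assumptions, not from the article):

```dockerfile
FROM python:3.8-slim
WORKDIR /app

# Copy only the dependency list first: the expensive pip install
# layer is then reused by the build cache until requirements.txt changes.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Source changes only invalidate the layers from here down.
COPY . .
ENV PYTHONPATH=/app
CMD ["python", "-m", "myservice"]
```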
In addition to the popular conda, it's worth checking out WinPython for scientific use. Each WinPython installation is an isolated environment that resides in a folder. To move an installation to another computer, just copy the folder. To completely remove it from your system, delete the folder.<p>I find it useful to keep a WinPython installation on a flash drive in my pocket. I can plug it into somebody's computer and run my own stuff, without worrying that I'm going to bollix up their system.
> On Linux, the system Python is used by the OS itself, so if you hose your Python you can hose your system.<p>I have never managed to hose the OS Python on Linux, by sticking to a very simple rule: DON'T BE ROOT. Don't work as root, don't run `sudo`.<p>On Linux, I use the system Python + virtualenv. Good enough.<p>When I need a different Python version, I use Docker (or podman, which is an awesome Docker replacement in the context of development environments) + virtualenv in the container. (Why virtualenv in the container? Because I use them outside the container as well, and IMHO it can't hurt to be consistent.)
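A sketch of that container setup (the version and paths are mine, purely illustrative):

```dockerfile
# Pick the Python version via the base image instead of touching
# the host's system interpreter.
FROM python:3.7-slim

# Same virtualenv habit as outside the container, for consistency.
RUN python -m venv /opt/venv
# Putting the venv first on PATH has the same effect as "activate".
ENV PATH="/opt/venv/bin:$PATH"
RUN pip install --upgrade pip
```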
I love Python syntax, but I still haven't found a sufficiently popular way to deploy my code with the same set of settings as my dev box (other than literally shipping a VM).
So setting up a dev env is one problem, but deploying it so that the prod env is the same and works the same is another.
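One low-tech way to narrow that gap, assuming plain pip + venv (the lock-file name is my own convention): pin the exact versions that work on the dev box and install exactly those in prod.

```shell
# On the dev box: capture the exact, known-good dependency versions.
python3 -m venv .venv
. .venv/bin/activate
pip install --upgrade pip >/dev/null
pip freeze > requirements.lock.txt

# On the prod box, the same file reproduces the same package set:
#   pip install -r requirements.lock.txt
```

This only pins Python packages, of course; OS-level differences still need something like Docker or a VM.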
<p><pre><code> python -m venv venv
source venv/bin/activate
pip install -U pip
pip install whatever
 # <do your stuff here>
deactivate
</code></pre>
No need for any third-party tools; venv is built in, and the above steps have always worked for me out of the box.
This article is great; those are viable solutions for sure. One of the alternatives is conda: it's common among data scientists, but many of its features (isolation between environments, the ability to keep a private repository off the internet) meet enterprise needs.
I would generally reach for conda instead of this, but they seem quite comparable in aggregate.<p>And, given that I've been trying NixOS lately and have had loads of trouble failing to get Conda to work, I will definitely give this setup a try.<p>(I haven't quite embraced the nix-shell-everything solution. It still has trouble with some things. My current workaround is a Dockerfile and a requirements.txt file, which does work...)
I like Python as a language, but when I see how clean the tooling of other similar languages is, for example Ruby, compared to the clusterf<i></i>k of the Python ecosystem, it just makes me want to close the terminal. I'm always wondering how it became the #1 language on Stack Overflow.
There are two things that I find a bit elusive with Python:<p>1. Highlight to run
2. Remoting into a kernel<p>Both features are somewhat related. I want to be able to fire up a Python kernel on a remote server. I want to be able to connect to it easily (not having to SSH tunnel over 6 different ports). I want to connect my IDE to it and easily send commands and view data objects remotely. Spyder does all this, but it's not great: you have to run a custom kernel to be able to view variables locally.<p>Finally, I want to be able to connect to a Nameko or Flask instance as I would any remote kernel and hot-swap code as needed.
So far using docker and setup.py files is working for me, I've never felt they were particularly slow, so I'll keep using them.<p>I gotta give poetry a try, though.
My sole use of Python is writing plugins (mostly single-user: me) for Sublime Text.<p>It feels pretty comfy to effectively be on an island and far away from the hustle and bustle of the industrial Python tooling.
Related from 2018: <a href="https://news.ycombinator.com/item?id=16439270" rel="nofollow">https://news.ycombinator.com/item?id=16439270</a>
I've moved to ASDF and haven't really looked back. It's working well with low fuss, and supporting far more than just python on my machine.
> Governance: the lead of Pipenv was someone with a history of not treating his collaborators well. That gave me some serious concerns about the future of the project, and of my ability to get bugs fixed.<p>Doesn't seem fair. You're not abandoning requests, are you?