What's wrong with pipenv? I am genuinely curious.<p>On local:<p><pre><code> mkdir my_project_directory
cd my_project_directory
export PIPENV_VENV_IN_PROJECT=1 (To make the virtual environment folder deterministic (.venv/); otherwise you get a hash-based directory (my_project_directory-some-hash-value), which might not be suitable for automated deployments in applications like Docker. I don't know why this is not the default.)
pipenv --python 3.6 (or any particular version number)
pipenv install numpy scipy pandas matplotlib requests
pipenv graph (Gives me a dependency graph)
git add .
git commit -a -S -m "init"
git push
</code></pre>
On remote:<p><pre><code> git clone url/my_project_directory
cd my_project_directory
export PIPENV_VENV_IN_PROJECT=1
pipenv install
pipenv shell
pipenv graph
</code></pre>
Is this workflow not enough? I have recently started using pipenv after a lot of struggle. The only issue I have is that PyCharm doesn't support native pipenv initialisation; I always end up creating the environment manually and then importing the project. PyCharm does detect the environment, though.
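<p>For reference, what those commands produce and what gets committed is a Pipfile (plus a Pipfile.lock with hashes). A minimal sketch of what the Pipfile might look like -- exact contents depend on your pipenv version, and the specifiers here are only illustrative:<p><pre><code> [[source]]
url = "https://pypi.org/simple"
verify_ssl = true
name = "pypi"

[packages]
numpy = "*"
scipy = "*"
pandas = "*"
matplotlib = "*"
requests = "*"

[requires]
python_version = "3.6"
</code></pre>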
I feel that all of these language-specific solutions still only solve half the problem. Your code depends on a lot more than _just_ the Python libraries, and often this is exactly what makes projects break on different systems.<p>Let me make another suggestion: nixpkgs [0]. It helps you define exactly that: a fixed set of dependencies, pinned not just to published version numbers but to the actual source code _and_ all of its dependencies.<p>[0] - <a href="https://nixos.org/nixpkgs/" rel="nofollow">https://nixos.org/nixpkgs/</a>
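<p>To give a flavour of what that looks like in practice, a rough sketch (the python36Packages attribute names vary between nixpkgs revisions, and REVISION is a placeholder for whatever commit you choose to pin):<p><pre><code> # ad-hoc environment from whatever nixpkgs channel you have
nix-shell -p python36 python36Packages.numpy python36Packages.requests

# the same environment, pinned to an exact nixpkgs revision for reproducibility
nix-shell -p python36 python36Packages.numpy python36Packages.requests \
  -I nixpkgs=https://github.com/NixOS/nixpkgs/archive/REVISION.tar.gz
</code></pre>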
Here we go again. The source of the problems with toy package managers (and I include all language package managers here) is not just the package managers themselves; it's the "version soup" philosophy they present to the user. Not daring to risk displeasing the user, they will take orders akin to "I'd like version 1.2.3 of package a, version 31.4.1q of package b, version 0.271 of package c, version 141 of package d...", barely giving a thought to inter-version dependencies of the result.<p>Unfortunately, software <i>does not work this way</i>. You cannot just ask for an arbitrary combination of versions and rely on it to work. Conflicts and diamond dependencies lurk everywhere.<p><i>Sensible</i> package systems (see specifically Nix & nixpkgs) have realized this and follow a "distribution" model where they periodically settle upon a collection of versions of packages which <i>generally</i> are known to work pretty well together (nixpkgs in particular tries to ensure packages' test suites pass in any environment they're going to be installed in). A responsible package distribution will also take it upon itself to maintain these versions with (often backported) security fixes, so that it's no worry sticking with a selection of versions for ~6 months.<p>However, I can't say I'm particularly surprised that these systems tend to lose out in popularity to the seductively "easy" systems that try to promise the user the moon.
Using a local virtual environment and then building a Docker image removes most of the headaches. I also bundle a Makefile with simple targets. See this as an example: <a href="https://github.com/zedr/cffi_test/blob/master/Makefile" rel="nofollow">https://github.com/zedr/cffi_test/blob/master/Makefile</a>
New projects are created from a template using Cookiecutter.<p>It isn't really so bad in 2018, but I do have a lot of scars from the old days, most of them caused by zc.buildout.<p>The secret is using, as the article mentions, a custom virtual env for each instance of the project. I never found the need for stateful tooling like Virtualenvwrapper.
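<p>Roughly the kind of steps such a Makefile would wrap, with placeholder names:<p><pre><code> python3 -m venv .venv                        # one fresh environment per project instance
./.venv/bin/pip install -r requirements.txt  # same pinned deps locally and in the image
docker build -t my_project .                 # bake those pins into an image
docker run --rm my_project                   # run against the frozen environment
</code></pre>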
"Pipfile looks promising for managing package dependencies, but is under active development. We may adopt this as an alternative if/when it reaches maturity, but for the time being we use requirements.txt."<p>If I where given the choice between community supported/in development Pipfile/pipenv or the 3rd party supported yet-another-package-manager lore to get those best practices my money would be on Pipfile/pipenv. I've been using it for many project now and besides some minor annoyances (eg: the maintainer's love for color output that is not form follow function) it has been a great tool.
Never had a problem with dependencies in Python. Just keep it simple.<p>When starting a new project:<p><pre><code> virtualenv venv -p *path-to-python-version-you-want*
./venv/bin/pip install *name-of-package*
</code></pre>
When running that project:<p><pre><code> ./venv/bin/python *name-of-python-file*
</code></pre>
Many people don't realize that venv/bin/ contains all the relevant binaries with the right library paths out of the box.
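<p>A quick way to see it, without activating anything:<p><pre><code> ./venv/bin/python -c "import sys; print(sys.prefix)"   # prints a path inside ./venv
./venv/bin/pip list                                    # shows only this project's packages
</code></pre>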
I'm not sure why scientists don't use VMs and simply save the virtual disk files. That would at the very least allow them to verify the settings at a later date. Fresh-install reproducibility doesn't seem necessary to verify experimental findings, as long as the original VM is available to boot up.
1. Build Docker image out of requirements.txt<p>2. Develop application<p>3. Repeat 1-2 until ready to deploy<p>4. Run Docker image in production with same dependencies as development<p>5. ??<p>6. Profit!<p>As long as you don't rebuild in between steps 3-4, you'll have the same set of dependencies down to the exact patch level.
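<p>A minimal sketch of the image from step 1, assuming a pinned requirements.txt and an entry point called main.py (both placeholders):<p><pre><code> FROM python:3.6-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
CMD ["python", "main.py"]
</code></pre><p>Copying and installing requirements.txt before the rest of the source keeps the dependency layer cached, so it only gets rebuilt when the pins actually change.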
genuine question - is nobody using anaconda/conda in production? I have found the binary install experience in conda far more pleasant than in anything else.<p>Going forward, the trend is going to be pipenv+manylinux (<a href="https://github.com/pypa/manylinux" rel="nofollow">https://github.com/pypa/manylinux</a>), but conda is super pleasant today
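<p>For comparison, the conda version of the freeze/restore workflow is roughly this (the package list is just an example):<p><pre><code> conda create -n my_project python=3.6 numpy pandas   # binary packages, no compiler needed
conda env export > environment.yml                    # record exact versions and builds
conda env create -f environment.yml                   # recreate the environment elsewhere
</code></pre>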
Interesting reading, I share some of the points in the post, however, one more dependency manager?<p>Mostly I've used plain `python -m venv venv` and it always worked well. A downside - you need to add a few bash scripts to automate typical workflow for your teammates.<p>Pipenv sounds great but there are some pitfalls as well.
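<p>The bash scripts I mean are tiny, something like this (names are just an example):<p><pre><code> #!/usr/bin/env bash
set -e
test -d venv || python3 -m venv venv         # create the environment once
./venv/bin/pip install -r requirements.txt   # sync it with the pinned requirements
echo "Now run: source venv/bin/activate"
</code></pre>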
I've been going through this post recently and got a bit upset about Pipenv:
<a href="https://chriswarrick.com/blog/2018/07/17/pipenv-promises-a-lot-delivers-very-little/" rel="nofollow">https://chriswarrick.com/blog/2018/07/17/pipenv-promises-a-l...</a><p>Another point is that it does not work well with PyCharm and does not allow to put all dependencies into the project folder as I used to do with venv. (just like to keep everything in one folder to clean up it easily)<p>Are there any better practices to make life easier?
I bitch a lot about npm, but then I remember that time when python's package distribution drove me to learn a new language. I can't help but notice that TFA and all the comments here are only talking about one end of this: managing your dev environment. Is there a similar work explaining how to distribute python packages in a straightforward manner? Is that article compatible with this one?
The author's justifications for using this home-grown tool over miniconda are weak at best, if not plain incorrect.<p>Conda really is the tool he wants; he just seems not to understand that.
Version pinning is technical debt and a fool's errand. New versions will always come out and your new development is confined to what once worked. You need to keep testing with current versions to see what will break when you upgrade and fix it as soon as possible so as to minimize the odds of a big breaking change.<p>It may keep your environment stable for some time, but that stability is an illusion because the whole world moves on. You may be able to still keep your Python 2.2 applications running on Centos 3 forever, but you shouldn't want to do it.
One thing that comes to mind: when I was starting to use Python, I was eager to mock Java people and their absurd approach (write everything in Java, specify a full classpath for every dependency, etc.). I pointed out how much easier and quicker it was to program in Python than in Java.<p>I did not appreciate that a linear, well-defined (by the language) approach to dependencies, with a clear boundary between the system libraries (java, javax) and user libraries, actually gives A LOT of value, even though it's more cumbersome to use.
Why would you do this?
Redirect chain:<p><pre><code> https://tech.instacart.com/freezing-pythons-dependency-hell-in-2018-f1076d625241
https://medium.com/m/global-identity?redirectUrl=https%3A%2F%2Ftech.instacart.com%2Ffreezing-pythons-dependency-hell-in-2018-f1076d625241
https://tech.instacart.com/freezing-pythons-dependency-hell-in-2018-f1076d625241?gi=85c0588ca374</code></pre>
I ran into a migraine last week: cleaning up requirements.txt<p>How do you determine which requirements are no longer needed when you remove one from your code? In node, your package.json lists only packages YOU installed, so removing them cleans up their dependencies. But in Python, installing one package with pip and then freezing might add a dozen entries to requirements.txt, none of them indicating that they're dependencies of other packages.
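<p>One approach that helps with exactly this is pip-tools: keep only your direct dependencies in a requirements.in, and let pip-compile generate the fully pinned requirements.txt, which annotates each transitive entry with the package that pulled it in:<p><pre><code> pip install pip-tools
echo "requests" > requirements.in    # only the packages YOU depend on go in here
pip-compile requirements.in          # writes requirements.txt with pins and "# via" comments
pip-sync                             # makes the environment match exactly, removing leftovers
</code></pre>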
Naive question: Why does this url 302 redirect to medium.com and then medium.com forwards back to the same original url?<p>Is there some commercial advantage?<p>Why not just post the medium url?<p><a href="https://medium.com/p/f1076d625241" rel="nofollow">https://medium.com/p/f1076d625241</a><p>This 302 redirects to tech.instacart.com
Anybody played with the brand new XAR from Facebook?<p><a href="https://code.fb.com/data-infrastructure/xars-a-more-efficient-open-source-system-for-self-contained-executables/" rel="nofollow">https://code.fb.com/data-infrastructure/xars-a-more-efficien...</a>
Since we're sharing XKCD cartoons, here's one that comes to mind: <a href="https://xkcd.com/927/" rel="nofollow">https://xkcd.com/927/</a><p>So, not to disappoint, here's another contestant: Poetry [0]<p>That said, in my experience it works best if you don't force any particular workflow on your developers, but maintain a solid and repeatable process for testing and deployment. People have different mental models of their development environments -- I personally use virtualfish (or virtualenvwrapper if I'm on Bash), while a colleague works with `python -m venv`; and we have played with pipenv, pyenv, anaconda and poetry in various cases.<p>As long as your requirements are clearly defined -- requirements.txt works perfectly well for applications, and setup.py for libraries [1] -- any method should be good enough to build a development environment. On the other hand, your integration, testing and deployment process should be universal, fully automated if possible, and of course independent of any developer's environment.<p>[0] <a href="https://github.com/sdispater/poetry" rel="nofollow">https://github.com/sdispater/poetry</a><p>[1] <a href="https://caremad.io/posts/2013/07/setup-vs-requirement/" rel="nofollow">https://caremad.io/posts/2013/07/setup-vs-requirement/</a>
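<p>In shell terms, the application-vs-library split in [1] boils down to roughly this:<p><pre><code> # library: abstract, loosely pinned dependencies declared in setup.py
pip install -e .

# application: concrete, fully pinned dependencies in requirements.txt
pip install -r requirements.txt
</code></pre>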
<i>Use a fresh virtualenv for each project</i><p>As a form of version pinning, this locks in old versions and creates technical debt. A few years downstream, you're locked into library modules no longer supported and years behind in bug fixes.
We recently went through this process at our company and chose pipenv as the dependency-management tool. As mentioned in the article, pipenv is under active development, but it takes care of many things we previously had custom scripts for, such as requirement hashes, a built-in dependency graph, automatic retries of failed dependency installs, automatic re-ordering of dependency installations, etc. It also has a few quirks: we had to pick a version that had most commands working, and pipenv install is painfully slow and didn't seem to have a caching strategy for already-built virtualenvs.
Doesn't using requirements.txt fail to account for (I forget the official name) transitive dependencies? The packages in your requirements.txt may themselves have dependencies whose version numbers change over time.<p>This seems like something pip freeze could handle but doesn't.
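<p>To make the distinction concrete, taking requests as an example package: a hand-maintained requirements.txt only lists your direct packages, while pip freeze pins everything currently installed, transitive dependencies included -- what it can't tell you is which entries are yours and which came along for the ride:<p><pre><code> pip install requests
cat requirements.txt            # hand-written: lists only "requests", its dependencies float
pip freeze                      # lists everything installed, pinned: requests plus
                                # its transitive deps (certifi, chardet, idna, urllib3)
pip freeze > requirements.txt   # pins the whole tree, transitive deps included
</code></pre>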
I started using pipenv and it seems everything just works fine, except that I can't really install wxPython with pipenv, but I can live with that.
Ruby practices based around Bundler aren't perfect, but they did solve _this_ level of problem ~7 years ago.<p>It remains a mystery to me why Python seems to have won the popularity battle against Ruby. They are very similar languages, but in all the ways they differ, Ruby seems superior to me.
Dependency hell in Python?
The only annoying part would be missing some library to build certain packages, like lxml, etc.<p>That's all.<p>We Python developers are fortunate to have amazing tools such as pip, virtualenv, etc.
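<p>For the lxml case specifically, on a Debian/Ubuntu box it is usually just the build headers that are missing (package names differ on other distros):<p><pre><code> sudo apt-get install build-essential python3-dev libxml2-dev libxslt1-dev
pip install lxml
</code></pre>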
So... current tools miss some functionality. Let's invent a new one. Reminds me of another xkcd: <a href="https://xkcd.com/927/" rel="nofollow">https://xkcd.com/927/</a>