This is great, but sometimes I think that Python needs a new package manager built from scratch instead of more tools trying to mix and match a bunch of flawed tools together in a way that's palatable to most of us. Python packaging sucks, the whole lot of it. Maybe I'm just spoiled by Rust and Elixir, but setuptools, distutils, pip, ez_install, all of it is really subpar. But of course everything uses PyPI and pip now, so it's not like any of it can actually be replaced. The state of package management in Python makes me sad. I wish there were a good solution, but I just don't see it.<p>Edit: I don't mean to disparage projects like this and Pipfile. Both are great efforts to bring the packaging interface in line with what's available in other languages, and might be the only way up and out of the current state of affairs.
> I wrote a new tool this weekend, called pipenv.<p>> It harnesses Pipfile, pip, and virtualenv into one single toolchain. It features very pretty terminal colors.<p>For a weekend project, this has some very nice things.<p>It removes the need for me to run my own project that does basically the same things... in a more or less worse way.<p>Everything I've come to expect from Reitz, and hopefully it'll gain some decent ground like the author's other projects.
For people who want to do it right without using an additional tool, read this: "setup.py vs. requirements.txt" by Donald Stufft <a href="https://caremad.io/posts/2013/07/setup-vs-requirement/" rel="nofollow">https://caremad.io/posts/2013/07/setup-vs-requirement/</a>
Hey everyone. I'm Kale, currently the lead developer on the conda project. It's been mentioned a few times in this thread, and I just want to make sure that any questions about it are answered accurately. Feel free to ask me anything about conda. Thanks!
Neat. Now for questions and comments.<p>Often people have a requirements.live.txt, or other requirements files that vary by environment. Is that handled somehow? Can we use different files or sections? [ED: yes, different sections]<p>Still wondering to myself if this is worth the fragmentation for most people using requirements.txt. Perhaps the different sections could include a "-r requirements.txt", like how requirements.dev.txt can have "-r requirements.txt". [ED: the Pipfile idea seems to have quite a few people behind it, and pip will support it eventually. It seems worth it to standardise these things. requirements.txt is a less jargony name than Pipfile though, and has a Windows/GUI-friendly extension.]<p>Other tools can set up an environment, download stuff, and run the script. Will pipenv --shell somescript.py do what I want (run the script with the requirements it needs)? ((I guess I could just try it.)) [ED: doesn't seem so]<p>Why Pipfile with caps? Seems sort of odd for a modern Python thing. It looks like a .ini file? [ED: the standard is still in development it seems. TOML syntax.]<p>With a setup.py set up, all you need to do is `pip install -e .` to download all the required packages. Or `pip install somepackage`. Lots of people make the setup.py file read requirements.txt. Do you have some command for handling this integration, or does it need to be done manually? [ED: seems this hasn't been considered / out of scope.]<p>Is there a PEP? [ED: too early it seems.]
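For reference, a rough sketch of what those separate sections might look like in a Pipfile, going by the in-development TOML-based spec (exact field names may differ, and the package names are just placeholders):<p><pre><code>[[source]]
url = "https://pypi.python.org/simple"
verify_ssl = true

[packages]
requests = "*"

[dev-packages]
pytest = "*"</code></pre>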
I'm surprised that no one has mentioned pip-tools: <a href="https://github.com/nvie/pip-tools" rel="nofollow">https://github.com/nvie/pip-tools</a><p>It's a very similar set of tools. I use pip-compile, which allows me to put all of my dependencies into a `requirements.in` file and then "compile" them to a `requirements.txt` file as a lockfile (so that it is compatible with pip as it currently exists).<p>This looks great, though; I'm excited to check it out!
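A minimal sketch of that workflow, assuming pip-tools is installed (the package name is just a placeholder; pip-compile pins the full transitive dependency set into requirements.txt):<p><pre><code>$ cat requirements.in
requests
$ pip-compile requirements.in     # writes pinned versions to requirements.txt
$ pip-sync                        # installs exactly what requirements.txt pins</code></pre>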
I think an app should not expose end users to its dependencies. That leaves the end user with a lot of pain figuring out dependency versions, and god forbid you need to compile some dep: then you need a build environment and its dependencies, any of which can fail along the chain, leaving a very unpleasant and even hostile end-user experience.<p>Ruby and Node apps are particularly guilty of this, pulling in sometimes hundreds of packages, some of which need compilation. Compare that to a Go binary, which is download-and-use. These things can get very complicated very fast even for developers or systems folks, let alone end users who may not be intimately familiar with that specific ecosystem.
Hey other Reitz fans. Make sure to check out his newish podcast series: <a href="https://www.kennethreitz.org/import-this/" rel="nofollow">https://www.kennethreitz.org/import-this/</a>
Seems to be a Python-specific nix-shell-like tool?<p>With Nix[OS] you just run `nix-shell -p python[2,3] python[2,3]Packages.numpy ...` to get an environment with the required packages.<p>Of course this requires that the Python library is packaged in nix, but in my experience the coverage is quite good, and it's not very hard to write packages once you get the hang of it.<p>It is also possible (but currently a bit clumsy in some ways) to set up named and persistent environments.
See also: <a href="https://github.com/pypa/pipfile" rel="nofollow">https://github.com/pypa/pipfile</a><p>I'm glad to see Python getting the same attention as other modern package managers. This is all great work!
I will definitely be trying this out. Python version and package management is a dumpster fire that wastes gobs of my time on the regular. I'll try anything that promises to end the pain.
Finally someone does it!
I was using:
    pip install -t .pip
in my code, avoiding virtualenv completely, but that was not enough and was incomplete.<p>Since this is not cross-platform, and it would be nice to switch between Linux/Windows while coding to maintain platform compatibility, can the virtualenv envs be created with an OS platform & subsystem prefix?
for example, having multiple envs at once:<p><pre><code> - env/posix/bin/activate
- env/nt/Scripts/activate.bat</code></pre>
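Pending something built in, one way to approximate that layout today is to point plain virtualenv at platform-specific directories (the env/posix and env\nt paths just mirror the example above):<p><pre><code># on Linux
virtualenv env/posix
. env/posix/bin/activate

# on Windows
virtualenv env\nt
env\nt\Scripts\activate.bat</code></pre>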
I always wonder if this could be done once and for all languages, instead of Ruby making Bundler, Haskell making Cabal sandboxes or Stack, Perl making perlbrew, etc. Is this where Nix is going?
Normally I use virtualenvwrapper, which keeps all the virtualenvs you create with it in a single directory. Before that, I always created my projects' venvs inside my project hierarchy.<p>I had a dilemma about it. But after all, you cannot move your venv directory unless you use the `--relocatable` option. So, does anyone have a strong argument for creating venvs inside your project directory?
I was really not a fan of the last "made by Reitz" project, Maya. But this I can really get along with.<p>The whole thing makes it way easier for a beginner to get started. No more activate. No more wondering about virtualenv. Automatic lock files are great, since no project I know of uses them because they are not well understood.<p>It's like node_modules (easy and obvious), but cleaner (no implicit magic).<p>Like.
LinkedIn has a similar open source project that is much more mature. It builds on Gradle features to manage complex dependencies and build Python artifacts. If you include this LinkedIn Gradle Plugin [1] you can automatically run tests in a virtual env and source a file to enter the project's virtual env.<p>PyGradle [2]: "The PyGradle build system is a set of Gradle plugins that can be used to build Python artifacts"<p>[1] <a href="https://github.com/linkedin/pygradle/blob/01d079e2b53bf9933aa786af1ec7dabcf964c669/docs/python.md" rel="nofollow">https://github.com/linkedin/pygradle/blob/01d079e2b53bf9933a...</a><p>[2] <a href="https://github.com/linkedin/pygradle" rel="nofollow">https://github.com/linkedin/pygradle</a>
<i>> --three / --two Use Python 3/2 when creating virtualenv.</i><p>I use Python 2.7, 3.4, and 3.5 on various projects. Is there a way to choose between 3.4 and 3.5 using Pipenv? I'm using something like this with virtualenv:<p><pre><code> $ virtualenv -p `which python3.5` .venv</code></pre>
I haven't really used requirements.txt because I found that I could install 'extra' and 'test' specific content based on args to setup() in my setup.py. It seems more like the Right Thing than requirements.txt, from what I can tell.<p>At first glance, this doesn't seem to offer anything beyond what I already see from setup(). What am I missing?<p>It's unfortunate that CPython gave us distutils and took a very long time to converge on a built-in successor (setuptools?) that gives the right composability.
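For anyone who hasn't used that approach, a minimal sketch of declaring extras via setup() (the project, package, and extra names here are placeholders):<p><pre><code>from setuptools import setup, find_packages

setup(
    name="myapp",                   # placeholder project name
    version="0.1.0",
    packages=find_packages(),
    install_requires=["requests"],  # core runtime dependencies
    extras_require={
        "test": ["pytest"],         # installed with: pip install -e ".[test]"
        "docs": ["sphinx"],
    },
)</code></pre>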
This is very interesting! I had the exact same question about how to do this in Python just a little while ago! <a href="http://stackoverflow.com/questions/41427500/creating-a-virtualenv-with-preinstalled-packages-as-in-requirements-txt" rel="nofollow">http://stackoverflow.com/questions/41427500/creating-a-virtu...</a><p>Glad that someone thought about the same thing and made a tool to solve it!
> Otherwise, whatever $ which python will be the default.<p>This is a bit strange, because the python binary is always supposed to be Python 2. The Python 3 binary is supposed to be named python3. Some distributions don't follow this, but they're the weird, non-conformant ones; it's not a behaviour that should really be relied on.