
pipenv update --dry-run not detecting some out of date packages #1073

Closed
tony opened this issue Nov 16, 2017 · 21 comments
@tony

tony commented Nov 16, 2017

First, thank you for the great software @kennethreitz 😄

I have an out-of-date pytest (it was at 3.2.3, and --dry-run wouldn't report 3.2.5)

pipenv, version 8.3.2
Python 3.6.3

diff --git a/Pipfile b/Pipfile
index a7761214..f9f35a30 100644
--- a/Pipfile
+++ b/Pipfile
@@ -15,9 +15,10 @@ collectfast = "==0.5.2"
 python-memcached = "==1.58"
 pytest-django = "==3.1.2"
 pytest-factoryboy = "==1.3.1"
-pytest = "==3.2.3"
+pytest = "==3.2.5"

I had to update manually via pipenv install --dev pytest==3.2.5

I had to do that by hand because another project of mine notified me of it. I've been running pipenv update --dry-run --dev on a near-daily basis, and it reports that none of my main or dev packages need an update.

For instance: I have sqlalchemy 1.1.14 in my Pipfile, but I've never seen any mention of 1.1.15 being available.

Prior to my conversion to pipenv, I used https://github.com/sesh/piprot to detect package updates. Unfortunately, I haven't yet been able to replace it with pipenv's way of detecting outdated packages. (Unless I'm getting something wrong, which also happens.)

Related to the request for an outdated command, #469.

I don't have enough information to substantiate or isolate this at this point. Is anyone else experiencing issues with the correctness of pipenv update --dry-run and pipenv update --dry-run --dev?

@techalchemy
Member

@tony you're going to have to give us the pipfile where it was broken... it's possible there was a dependency issue

@tony
Author

tony commented Nov 20, 2017

@techalchemy eh, it's for a production project obviously linked to me that isn't open source, so I can't do that at this time. Probably the only time ever I'll have a legitimate reason not to give more debug info 😆

if someone has a pipfile they're willing to give and can recreate this, feel free to re-open

@tony tony closed this as completed Nov 20, 2017
@techalchemy
Member

@tony if you can provide it privately or something, that may help also, but it might be related to how pipenv handles version pinning. pipenv graph may be informative

@tony
Author

tony commented Nov 21, 2017

@techalchemy Sent

@techalchemy
Member

@tony at a glance the issue is just because these things are pinned in your Pipfile. It won't ever suggest updates for packages that are pinned (i.e. you have a specific version specified with the == operator). Your lockfile is supposed to take over the task of 'pinning' subdependencies, and only top-level dependencies should appear in your Pipfile.

You also have a few things that appear more than once -- I'm pretty sure this will break things, but it hasn't been tested much as far as I know. Anyway, for things like pytest (which is already depended on by some of the other things in your Pipfile), you can either unpin it:

pytest = "*"

or leave it out entirely, since it will be included regardless due to being a requirement of your other dependencies. I'd recommend the second approach, personally.

For example, I unpinned your sqlalchemy dependency in your Pipfile and then tried pipenv update --dry-run:

 %  /t/test  pipenv update --dry-run
Checking dependencies…
sqlalchemy==1.1.15 is available (1.1.14 installed)!

Does that make sense?

@tony
Author

tony commented Nov 22, 2017

@techalchemy

First, forgive me if I don't fully understand a concept when I write below.

Your lockfile is supposed to take over the task of 'pinning' subdependencies, and only top-level dependencies should appear in your Pipfile.

That's fine. It helps to know that. I understand that the best practice would be not to pin a dependency. That makes sense.

I'd recommend the second approach, personally. For example, I unpinned your sqlalchemy dependency in your Pipfile and then tried pipenv update --dry-run...

Here is where I'm confused. Does this mean that Pipenv won't prompt me to update sqlalchemy to 1.1.15 if my Pipfile is pinned to 1.1.14? That's what I mean. I (and I believe others) definitely want to be able to see available updates for stuff in my Pipfile.

I feel that's a completely legitimate request from a user perspective. Maybe it's based off intuition I've gained from other package managers, like npm, yarn, and maybe even pip + piprot.

With package.json in npm, it's always advisable (to my knowledge) to pin, even though there is a lockfile; the lockfile is for subdependencies. Think of package-lock.json and yarn.lock. Giving suggestions to update stuff in a lock file (which I think is what you're saying, correct me if I'm wrong) is neat, but not nearly as valuable as broadly checking for the latest PyPI version of packages in the Pipfile, especially if pinned.

So if it's true Pipenv won't

  • recommend / output PyPI updates for pinned packages in the Pipfile
  • as a second, but not as important, thing, perhaps suggest pinning a PyPI version for top-level packages that are unpinned (e.g. pytest = "*")

then I think my issue differs from #469, because I'm more interested in updates to top-level packages in the Pipfile.

edit: made some changes to this

@tony
Author

tony commented Nov 22, 2017

I think I differ from #469 because that was about lockfile updates, which is cool. I'm more interested in seeing PyPI version comparisons against pinned (and possibly unpinned, plus what's currently installed in the virtualenv) versions inside of the Pipfile.

@techalchemy
Member

@tony your packages are pinned in your Pipfile.lock, you should only pin them in your Pipfile if you never want them to be updated except by manually changing the version. By default the Pipfile.lock is the one that is used to install packages, unless it is not in sync with your Pipfile. If you pass --deploy pipenv will not allow installation using anything besides the Pipfile.lock. If it is out of sync with your Pipfile and you pass --deploy, pipenv will exit and issue an error informing you about this.
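As a minimal sketch of what that looks like on the command line (no project-specific paths; the flag semantics are as described above):

$ pipenv lock              # regenerate Pipfile.lock from the Pipfile
$ pipenv install --deploy  # install strictly from Pipfile.lock; exits with
                           # an error instead of re-locking if the lockfile
                           # is out of sync with the Pipfile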

Here is where I'm confused. Does this mean that Pipenv won't prompt me to update sqlalchemy to 1.1.15 if my Pipfile is pinned to 1.1.14? That's what I mean. I (and I believe others) definitely want to be able to see available updates for stuff in my Pipfile.

So this is a bit nuanced. Your Pipfile doesn't have to be pinned, because your Pipfile.lock is already pinned and is the first place pipenv will try to install from. Even with an unpinned Pipfile everything is still pinned as long as you ship your Pipfile.lock, until you update whatever you like. If you don't ship your lockfile, however, installation will be unconstrained.

Does this mean that Pipenv won't prompt me to update sqlalchemy to 1.1.15 if my Pipfile is pinned to 1.1.14?

Essentially, if your Pipfile has version 1.1.14 pinned, that's the only version pipenv will ever attempt to ask for from PyPI. It won't ever tell you there is an update available for it, since you've instructed it to only ever install version 1.1.14 which is now the version it will always be using for dependency resolution (which is how possible upgrades are determined). What you actually want is to install your working environment and then unpin things in your Pipfile and let the pipenv resolver tell you what can be upgraded without causing dependency problems. Otherwise pipenv has to assume you really mean that you only ever want it to use version 1.1.14, because it does a lot more than just checking for possible upgrades -- it has to manage your whole environment, so it has to believe what you tell it.
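To make the two cases concrete, here is an illustrative Pipfile fragment (version numbers taken from this thread; only one of the two sqlalchemy lines would appear in a real Pipfile):

[packages]
# strict pin: the resolver may only ever choose 1.1.14, so
# `pipenv update --dry-run` has nothing to report for it
sqlalchemy = "==1.1.14"
# unpinned alternative: the resolver may consider newer releases,
# so the dry run can report that 1.1.15 is available
#sqlalchemy = "*"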

@tony
Author

tony commented Nov 23, 2017

It could be Pipfile's way of doing it, but am I correct in saying this differs from the behavior of yarn and npm?

How well-settled is the current behavior compared to the behavior I suggested, where Pipfile packages (pinned or unpinned) could be checked against PyPI? Since the Pipfile is where the manual pinning happens, it's pretty relevant when using it.

@techalchemy
Member

techalchemy commented Nov 23, 2017

@tony if you manually pin a package to a specific version, it really doesn't matter what versions of it are available because you have instructed pipenv that you only ever want version 1.1.14, for instance. If you really mean that you want version 1.1.14 or greater, you should use >=1.1.14 as your pin. Then pipenv will let you know if it can be updated.

These are checked against PyPI. The difference is that pipenv is trusting you to provide accurate pins. If you strictly pin a version, pipenv expects you to be doing that intentionally, and to only ever want that specific version. If it's not allowed to install anything but that version, it will never tell you it can be updated, because you've indicated that it can't.
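Applied to the pytest example from the top of this thread, that advice would read (an illustrative Pipfile fragment, not from the original discussion):

[dev-packages]
# floor pin: 3.2.3 or anything newer is acceptable, so the dry run
# is free to report that 3.2.5 is available
pytest = ">=3.2.3"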

It could be Pipfile's way of doing it, but am I correct in saying this differs from the behavior of yarn and npm?

Kenneth has mentioned this in a few other issues, but the design decisions behind pipenv were not constrained by the practices other package managers follow. While things that work may be incorporated, each decision is made based on what makes sense, not what other tools are doing. I'm sure if new practices make sense, pipenv would adapt, but I am only providing context (which is to say that you are correct)

@tony
Author

tony commented Nov 23, 2017

Some underlying context from my perspective:

  • First, I'm not trying to sway design decisions. I'm simply stating my POV as someone who has like 30+ projects, has to pin/sometimes doesn't, and doesn't want to manually check packages by hand.
  • When I moved to Pipfile, I lost access to piprot and pyup.io.
  • I'm also happy with Pipenv, and despite this thread so far (and what I write below), I don't believe I'm challenging Pipenv's philosophy.

if you manually pin a package to a specific version, it really doesn't matter what versions of it are available because you have instructed pipenv that you only ever want version 1.1.14, for instance. If you really mean that you want version 1.1.14 or greater, you should use >=1.1.14 as your pin. Then pipenv will let you know if it can be updated.

Here's a scenario: 1.1.14 has a security bug; Pipenv won't let the user know PyPI has an update? I want to "opt in" to updates from packages; that's why I do == and not >=. Even knowing there is a Pipfile.lock, I still use the Pipfile as a high-level lock of sorts, too.

Pinning a version, not pinning, etc. shouldn't prevent you from knowing a package is outdated. What if it was a security release? What other mechanism should people rely on?

These are checked against PyPI.

How much work would it be to check the Pipfile packages against PyPI and list the latest versions on PyPI?

The difference is that pipenv is trusting you to provide accurate pins.

Here is why I have an issue with that (I'll speak more on it below): of course I would have a valid pin. But I maintain... like, what, 30+ projects? Each with an average of 10 packages.

Kenneth has mentioned this in a few other issues, but the design decisions behind pipenv were not constrained by the practices other package managers follow. While things that work may be incorporated, each decision is made based on what makes sense, not what other tools are doing

It's just an analogy; they're not necessarily outsiders trying to impose arbitrary things or cargo-cult stuff.

each decision is made based on what makes sense,

I feel this is common sense. Not as a Pipenv maintainer, but as someone who has quite a few open source and private projects running on Pipenv + Pipfile. And I'm a pretty proud user. I'm not trying to sway decided philosophical underpinnings.

The rationale I've heard so far for what I see as a very practical need seems to come down to internal design philosophies I'm not even challenging. Heh.

To Pipenv, Pipfile.lock may be the source of truth. To humans, Pipfile is the source of truth. And that's where I believe most people are going to be pinning/not pinning, and what they want to see package updates for.

In the current situation, I feel users just have to go check PyPI for each of their packages if they pinned a package in the Pipfile. They sort of have to use "telepathy" to figure it out?

I think Pipfile.lock is fine and has its use cases for checking outdated stuff, but that's only one side of the coin: pinned versions in the Pipfile need a mechanism to be checked against the package versions on PyPI.

Look, here's the big picture: I have like 30-plus packages in my Pipfile. Am I supposed to just go to PyPI every day and check whether each is out of date? Write my own script?

@tony
Author

tony commented Nov 23, 2017

In case you're using email, FYI I edited the above issue a few times.

@techalchemy
Member

@tony I think your points are super valid, and the input is really interesting. It's really helpful to have feedback like this, and I understand what you're saying about the Pipfile being the user's source of truth. Whether you are trying to explicitly influence design decisions or not isn't that important, since feedback like this usually does tend to have that impact, especially if it happens frequently. I understand basically what you're asking for, and the technical issue isn't really the question (obviously it's not that hard), but more on that in a second.

Here's a scenario: 1.1.14 has a security bug; Pipenv won't let the user know PyPI has an update? I want to "opt in" to updates from packages; that's why I do == and not >=. Even knowing there is a Pipfile.lock, I still use the Pipfile as a high-level lock of sorts, too.

I can see doing this, but Pipenv is really not designed to work that way. Pipfiles are the last source of truth and are used only if there is no lockfile to base off of, which is why, once you've converted a requirements file and created a lockfile, it's recommended that you unpin things that you don't actually need explicitly pinned (and in fact that you remove them from your Pipfile). But if you're open to checking for non-API-breaking security updates, for example, you might want to say ">=1.1.14,<2.0" or ">=1.1.14,<1.2". Since this is only your Pipfile, it's not actually an installation instruction unless you don't have a lockfile; it's only a pin instruction for updates.
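As a Pipfile fragment, that suggestion would look something like this (a sketch reusing the sqlalchemy example from earlier in the thread):

[packages]
# floor with a ceiling: picks up 1.1.x bugfix/security releases,
# refuses 1.2+ releases that might break the API
sqlalchemy = ">=1.1.14,<1.2"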

Pinning a version, not pinning, etc. shouldn't prevent you from knowing a package is outdated. What if it was a security release? What other mechanism should people rely on?

This is essentially why lockfiles exist. They create strict pins to specific versions, while letting you flexibly unpin your Pipfile & remove non-top-level dependencies, which allows pipenv (via pip-tools) to resolve the dependency graph if you want to update something. This way you can see which updates are available and could be installed without messing up your environment, while still maintaining your current pins and lockfile until you decide to change them.

I have no idea if this makes any sense for what you're doing, since I'm unsure about whether I understand the use case for maintaining strict pins in your Pipfile, so all of this is said at the risk of talking past or around the problem you're experiencing... But essentially if you just remove the pins in your Pipfiles, then pipenv update --dry-run should do exactly what you want, right? And as long as you pass around your Pipfile.lock, you will always install your pinned dependencies. That's how it seems to me, so feel free to let me know what I'm missing!

I think Pipfile.lock is fine and has its use cases for checking outdated stuff, but that's only one side of the coin: pinned versions in the Pipfile need a mechanism to be checked against the package versions on PyPI.

Look, here's the big picture: I have like 30-plus packages in my Pipfile. Am I supposed to just go to PyPI every day and check whether each is out of date? Write my own script?

So correct me if I'm wrong, but it seems like what you're asking for is an equivalent to pipenv update --dry-run which acts as though packages in the Pipfile are unpinned? That kind of API change would definitely need @kennethreitz to sign off on it, but I think he'll just want you to unpin your Pipfile

@tony
Author

tony commented Nov 24, 2017

before @kennethreitz signs off, I may want to narrow the request down to specifically what I'm asking for, perhaps into separate issues, because at this point this issue has changed scope.

Pipfiles are the last source of truth and are used only if there is no lockfile to base off of, which is why, once you've converted a requirements file and created a lockfile, it's recommended that you unpin things that you don't actually need explicitly pinned (and in fact that you remove them from your Pipfile).

So let's take an open source project like https://github.com/tony/tmuxp. I benefit from using Pipenv in multiple ways:

  • it helps with virtualenvs; pipenv replaced an older bootstrap script
  • it also handles my packages (which also made the bootstrap script I had redundant, yay)
  • it works only through the Pipfile, using --skip-lock, an official Pipenv feature, which speeds up installs significantly
  • it integrates in an eat-your-own-dogfood way, since the project's own .tmuxp.yaml config uses Pipenv; so far it works across systems

I actually use tmuxp + pipenv as a combo across all my open source projects, and on private projects like https://devel.tech. Also, devel.tech + hskflashcards.com are both using pipenv in a production capacity. So my investment in it in my day-to-day workflow is pretty significant.

However, in order for me to completely remove requirements.txt files, I need 3 things:

  1. ability to see Pipfile package updates against PyPI regardless of pin state/lockfile state (pyup.io/piprot style)
  2. setup.py integration (I think this could do it https://github.com/kennethreitz/pipenv/issues/209#issuecomment-278185133)
  3. performance when loading the environment through pipenv + tmuxp: I got this by adding --skip-lock (see the sketch after this list)
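For reference, the lock-skipping install from item 3 looks roughly like this (a sketch using the flags as they existed in pipenv 8.x):

$ pipenv install --dev --skip-lock  # install straight from the Pipfile,
                                    # skipping Pipfile.lock resolution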

So correct me if I'm wrong, but it seems like what you're asking for is an equivalent to pipenv update --dry-run which acts as though packages in the Pipfile are unpinned?

If it would output the latest PyPI version for outdated pinned/unpinned packages in the Pipfile, yes.

Course of action: I think it'd be a good idea to make a fresh issue with the request restated. Think that's OK?

@techalchemy
Member

it works only through the Pipfile, using --skip-lock, an official Pipenv feature, which speeds up installs significantly

This is a pipenv feature but it really defeats the whole point of using pipenv. It seems like the real core of the issue is that installing from the lockfile is too slow for what you want to do with it. If it was as fast or nearly as fast as --skip-lock, would you use the lockfile?

Course of action: I think it'd be a good idea to make a fresh issue with the request restated. Think that's OK?

Sure, I can't promise any action but this issue seems to have gone far enough astray

@tony
Author

tony commented Nov 24, 2017

This is a pipenv feature but it really defeats the whole point of using pipenv.

Heh, I don't even use Pipenv for dependency resolution. Pipfile is great. The lockfile is only a nice-to-have for me.

Same with many others who use Pipenv, I would assume? I'm surprised I'm the first to mention it.

The primary thing it does for me is project management. It handles virtual environments and installing packages in a streamlined fashion.

This is a pain to do right across systems, as I don't know if a teammate/contributor is using virtualenv, pyenv-virtualenv, virtualenvwrapper, etc. If I just tell them to install Pipenv as a development requirement, I save loads of time documenting mundane processes and trying to make scripts work across different virtualenv methods.

It helped me replace a bootstrap script I was using across all my open and closed source projects.

Pipfile + Pipenv also handles the python version.

Sure, I can't promise any action but this issue seems to have gone far enough astray

Tomorrow morning I'll write the issue.

@tony
Author

tony commented Nov 24, 2017

Happy Thanksgiving @techalchemy

@techalchemy
Member

@tony fair enough :) happy thanksgiving to you too! Kinda neat to see how it's being used, to be honest, but it does make sense. I wonder if a solution that could work for you but also for pipenv would be a way to install from the lockfile (aka the pinned versions) but without checking hashes or anything...

@tony
Author

tony commented Nov 26, 2017

Sorry for the delay, I said "tomorrow morning" a few days ago. I'm a bit busy now to give it clear focus.

But I also have a workaround/idea I will try: add back requirements.txt files to get "real" (in my eyes) package update suggestions working via piprot/pyup.io, then create a Makefile task to generate a fresh Pipfile from the requirements files (a new Pipfile, in case packages are removed from one of the requirements files).
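A rough sketch of that Makefile task (the filenames, target names, and the requirements-dev.txt file are hypothetical; assumes piprot and pipenv are installed):

# check for updates the piprot way, then rebuild the Pipfile from
# scratch out of the requirements files
outdated:
	piprot requirements.txt requirements-dev.txt

pipfile:
	rm -f Pipfile
	pipenv install -r requirements.txt
	pipenv install --dev -r requirements-dev.txt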

@tony
Author

tony commented Nov 26, 2017

(In case you view by email, I did some edits to the above post)

@jayfk

jayfk commented Jan 22, 2018

It would be super helpful if one of the devs could write some kind of authoritative user guide on how pipenv is supposed to be used, maybe even for different use cases.

I did a quick code search on GitHub and every project I've found is pretty much using pipenv in a different way. There are all kinds of different combinations of pinned/unpinned Pipfiles and committed/ignored Pipfile.locks.
