pipenv update --dry-run not detecting some out of date packages #1073
@tony you're going to have to give us the pipfile where it was broken... it's possible there was a dependency issue
@techalchemy eh it's for a production project obviously linked to me that isn't open source. So I can't do that, at this time. Probably the only time ever I'll have a legitimate reason not to give more debug info 😆 if someone has a pipfile they're willing to give and can recreate this, feel free to re-open
@tony if you can provide it privately or something, that may help also, but it might be related to how pipenv handles version pinning.
@techalchemy Sent
@tony at a glance the issue is just because these things are pinned in your pipfile. It won't ever suggest updates to you for packages that are pinned (i.e. you have a specific version specified with the `==` operator). You also have a few things that appear more than once -- I'm pretty sure this will break things, but it hasn't been tested much as far as I know. Anyway, for things like pytest (which is already depended on by some of the other things in your pipfile), you can either unpin it:
or leave it out entirely, since it will be included regardless due to being a requirement of your other dependencies. I'd recommend the second approach, personally. For example, I unpinned your sqlalchemy dependency in your pipfile and then tried:

```
% /t/test pipenv update --dry-run
Checking dependencies…
sqlalchemy==1.1.15 is available (1.1.14 installed)!
```

Does that make sense?
First, forgive me if I don't fully understand a concept when I write below.
That's fine. It helps to know that. I understand that best practice would be not to pin a dependency. That makes sense.
Here is where I'm confused. Does this mean that Pipenv won't prompt me to update sqlalchemy to 1.1.15 if my Pipfile is pinned to 1.1.14? I (and I believe others) definitely want to be able to see available updates for stuff in my Pipfile. I feel that's a completely legitimate request from a user perspective.

Maybe it's based off intuition I've gained from other package managers, like npm, yarn, and maybe even pip + piprot. In package.json in npm, it's always advisable (to my knowledge) to pin; even though there is a lock file, that's for subdependencies. Think about package-lock.json and yarn.lock. Giving suggestions to update stuff in a lock file (which I think is what you're saying, correct me if I'm wrong) is neat, but not nearly as valuable as broadly checking for the latest PyPI version of packages in Pipfile, especially if pinned.

So if it's true that Pipenv won't, then I think my issue differs from #469, because I'm more interested in updates to top-level packages in Pipfile. (edit: made some changes to this)
I think my issue differs from #469 because that was about lockfile updates, which is cool. I'm more interested in seeing PyPI version comparisons to pinned (and possibly unpinned + what's currently installed in the virtualenv) versions inside of Pipfile.
@tony your packages are pinned in your Pipfile.

So this is a bit nuanced. Your

Essentially, if your
It could be Pipfile's way of doing it, but am I incorrect/correct in saying this differs from the behavior of yarn and npm? How well decided upon is the current behavior compared to the behavior I suggested, where Pipfile packages (pinned or unpinned) could be checked against PyPI? Since that's where the manual pinning happens, it's pretty relevant when using it.
@tony if you manually pin a package to a specific version, it really doesn't matter what versions of it are available, because you have instructed pipenv that you only ever want version 1.1.14, for instance. If you really mean that you want version 1.1.14 or greater, you should use `>=` (e.g. `sqlalchemy>=1.1.14`). These are checked against PyPI. The difference is that pipenv is trusting you to provide accurate pins. If you strictly pin a version, pipenv expects you to be doing that intentionally, and to only ever want that specific version. If it's not allowed to install anything but that version, it will never tell you it can be updated, because you've indicated that it can't.
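To make the distinction concrete, here is a sketch of what those specifier styles look like in a Pipfile (package names and versions here are illustrative, not taken from the reporter's actual project):

```toml
[packages]
# Strict pin: pipenv treats this as "only ever 1.1.14", so no update hints
sqlalchemy = "==1.1.14"
# Lower bound: anything at or above 2.18 is acceptable, so updates can surface
requests = ">=2.18"
# Unpinned: the resolver is free to pick the latest compatible release
flask = "*"

[dev-packages]
pytest = "*"
```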
Kenneth has mentioned this in a few other issues, but the design decisions behind pipenv were not constrained by the practices other package managers follow. While things that work may be incorporated, each decision is made based on what makes sense, not what other tools are doing. I'm sure if new practices make sense, pipenv would adapt, but I am only providing context (which is to say that you are correct) |
Some underlying context from my perspective:
Here's a scenario: 1.1.14 has a security bug; Pipenv won't let them know PyPI has an update? I want to "opt in" to updates from packages; that's why I run `pipenv update --dry-run`. Pinning a version, not pinning, etc. shouldn't prevent you from knowing a package is outdated. What if it was a security release? What other mechanism should people rely on?
How much work would it be to check the Pipfile packages against PyPI and list the latest versions on PyPI?
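For a rough sense of the effort involved, here is a minimal sketch of such a check against PyPI's JSON API. This is not part of pipenv; the function names and the hard-coded package dict are made up for illustration, and the version comparison is a naive string check rather than real version parsing:

```python
import json
import urllib.request


def pinned_version(spec):
    """Return the version from a strict Pipfile pin like '==1.1.14', else None."""
    spec = spec.strip().strip('"')
    return spec[2:] if spec.startswith("==") else None


def latest_on_pypi(name):
    """Ask PyPI's JSON API for the newest released version of a package."""
    with urllib.request.urlopen(f"https://pypi.org/pypi/{name}/json") as resp:
        return json.load(resp)["info"]["version"]


def report_outdated(packages):
    """Print packages whose strict pin differs from the latest PyPI release."""
    for name, spec in packages.items():
        pin = pinned_version(spec)
        if pin is None:
            continue  # unpinned entries are the case --dry-run already handles
        latest = latest_on_pypi(name)
        if pin != latest:  # naive comparison; a real tool would parse versions
            print(f"{name}: pinned at {pin}, but {latest} is on PyPI")


# Usage (hits the network):
# report_outdated({"sqlalchemy": "==1.1.14", "pytest": "*"})
```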
Here is why I have an issue with that (I'll speak more on it below): of course I would have a valid pin. But I maintain, what, 30+ projects? Each with an average of 10 packages.
It's just an analogy; they're not necessarily outsiders trying to impose arbitrary things or cargo-cult stuff.
I feel this is common sense. Not as a Pipenv maintainer, but as someone who has quite a few open source and private projects running on Pipenv + Pipfile. And I'm a pretty proud user. I'm not trying to sway philosophical, settled underpinnings. The rationale I've heard so far, for what I see as a very practical need, seems to be internal design philosophies I'm not even challenging. Heh.

To Pipenv, Pipfile.lock may be the source of truth. To humans, Pipfile is the source of truth. And that's where I believe most people are going to be pinning/not pinning, and want to see package updates. In the current situation, I feel users just have to go check PyPI for each package across all their projects if they pinned a package in Pipfile. They sort of have to use "telepathy" to figure it out? I think Pipfile.lock is fine and has its use cases for checking outdated stuff, but that's only one side of the coin: pinned versions in Pipfile need a mechanism to be compared against PyPI.

Look, here's the big picture: I have like 30-plus packages in my Pipfile. Am I supposed to just go to PyPI every day and check when each is out of date? Write my own script?
In case you're using email, FYI I edited the above issue a few times.
@tony I think your points are super valid, and the input is really interesting. It's really helpful to have feedback like this, and I understand what you're saying about the Pipfile being the user's source of truth. Whether you are trying to explicitly influence design decisions or not isn't that important, since feedback like this usually does tend to have that impact, especially if it happens frequently. I understand basically what you're asking for, and the technical issue isn't really the question (obviously it's not that hard), but more on that in a second.
I can see doing this, but Pipenv is really not designed to work that way. Pipfiles are the last source of truth and are used only if there is no lockfile to base off of, which is why, once you've converted a requirements file and created a lockfile, it's recommended that you unpin things that you don't actually need explicitly pinned (and in fact that you remove them from your Pipfile). But if you're open to checking for non-API-breaking security updates, for example, you might want to say something like `sqlalchemy~=1.1.14`.
This is essentially why lockfiles exist. They create strict pins to specific versions, while allowing you to flexibly unpin your Pipfile and remove non-top-level dependencies, which allows pipenv (via pip-tools) to resolve the dependency graph if you want to update something. This way you can see which updates are available and can be installed without messing up your environment, but still maintain your current pins and lockfile until you decide you want to update. I have no idea if this makes any sense for what you're doing, since I'm unsure whether I understand the use case for maintaining strict pins in your Pipfile, so all of this is said at the risk of talking past or around the problem you're experiencing... But essentially, if you just remove the pins in your Pipfiles, then `pipenv update --dry-run` will show you what's available.
So correct me if I'm wrong, but it seems like what you're asking for is an equivalent to
Before @kennethreitz signs off, I may want to siphon down the request to specifically what I'm asking for, perhaps into separate issues, because at this point this issue has changed scope.
So let's take an open source project like https://github.com/tony/tmuxp. I benefit from using Pipenv in multiple ways:
I actually use tmuxp + pipenv as a combo across all my open source projects, and on private projects like https://devel.tech. Also devel.tech + hskflashcards.com both are using pipenv in a production capacity. So my investment in it in my day-to-day workflow is pretty significant. However, in order for me to completely remove requirements.txt files, I need 3 things:
If that would have outdated pinned/unpinned packages in the Pipfile output the latest PyPI version of packages, yes. Course of action: I think it'd be a good idea to make a fresh issue with the request restated; do you think that is OK?
This is a pipenv feature but it really defeats the whole point of using pipenv. It seems like the real core of the issue is that installing from the lockfile is too slow for what you want to do with it. If it was as fast or nearly as fast as
Sure. I can't promise any action, but this issue seems to have gone far enough astray.
Heh, I don't even use Pipenv for dependency resolution. Pipfile is great. The lockfile is only a nice-to-have for me. Same with many others who use Pipenv, I would assume? I'm surprised I'm the first to mention it. The primary thing it does for me is project management. It handles virtual environments and installing packages in a streamlined fashion. This is a pain to do right across systems, as I don't know if a teammate/contributor is using virtualenv, pyenv-virtualenv, virtualenvwrapper, etc. If I just tell them to install Pipenv as a development requirement, I save loads of time documenting mundane processes and trying to make scripts work across different virtualenv methods. It helped me replace a bootstrap script I was using across all my open and closed source projects. Pipfile + Pipenv also handles the python version.
Tomorrow morning I'll write the issue.
Happy Thanksgiving @techalchemy
@tony fair enough :) happy thanksgiving to you too! Kinda neat to see how it's being used, to be honest, but it does make sense. I wonder if a solution that could work for you but also work for pipenv would be if there was a way to install from the lockfile (aka the pinned versions) but without checking hashes or anything...
Sorry for the delay; I said "tomorrow morning" a few days ago. I'm a bit busy now to give it clear focus. But I also have a workaround / idea I will try: add requirements.txt files back to get "real" (in my eyes) package update suggestions working via piprot/pyup.io, then create a Makefile task to generate a fresh Pipfile from the requirements files (a new Pipfile, in case packages are removed from one of the requirements files).
(In case you view by email, I did some edits to the above post)
It would be super helpful if one of the devs could write some kind of authoritative user guide on how pipenv is supposed to be used, maybe even for different use cases. I did a quick code search on GitHub, and every project I've found is pretty much using pipenv in a different way. There are all kinds of different combinations of pinned/unpinned dependencies.
First, thank you for the great software @kennethreitz 😄
I have an out of date pytest (was at 3.2.3, and `--dry-run` wouldn't report 3.2.5).
pipenv, version 8.3.2
Python 3.6.3
I had to update manually via:

```
pipenv install --dev pytest==3.2.5
```
I had to do that by hand because I saw another project I have notified me of it. I've been running

```
pipenv update --dry-run --dev
```

on a near-daily basis, and it seems none of my main or dev packages need an update. For instance: I have sqlalchemy 1.1.14 in my Pipfile, but I've never seen any mention of 1.1.15 being available.
Prior to my conversion to pipenv, I used https://github.com/sesh/piprot to detect package updates. Unfortunately, I haven't yet been able to replace it with pipenv's way of detecting outdated packages. (Unless I'm getting something wrong, which also happens.)
Related to: request for outdated command #469
I don't have enough information to substantiate or isolate at this point. Is anyone else experiencing issues with the correctness of `pipenv update --dry-run` and `pipenv update --dry-run --dev`?