Start pinning doc and test requirements? #1566
My (often unpopular) opinion on pinning is that one should only pin if there's a known or expected reason to pin. You've given one.
That use case doesn't seem a very likely or strong one (one could manually install older, compatible versions, or simply go back to Travis or RTD and find out what versions were used at that time). In my experience, pinning versions is a much higher maintenance burden than addressing emergent concerns as they arise. But if you (or anyone) are willing to absorb that burden (or better, mitigate it), then I'm +0.
In general I agree with this, and I strongly agree with it for libraries, but there are other reasons to pin the test dependencies as well. For example, the first person making a PR after a test dependency breaks something in our build will start getting broken CI, whereas if we have all our test dependencies pinned, the CI breaks on the PR that upgrades the dependency. No need to write a separate PR to fix the build, then rebase all the old PRs against it just to get CI working again.
I recently revisited an old project that didn't pin its versions, and the tests were failing. It was very hard to figure out which versions were in use when the tests last passed (though admittedly I did not have CI set up for that project). I still don't like the idea of relying on RTD and Travis to store this information.
I can see why this would be the case, which is why I only advocate this if there's an out-of-the-box solution that provides a bot that will do the upgrades for us, but I think there's a bot that will do that. At that point, we'll probably end up merging a lot more PRs, but we can just merge all the "upgrade dependencies" PRs where the CI passes. Ideally we'd be doing the same thing with our vendored dependencies, actually, so that we keep the vendored dependencies relatively up-to-date.
Just a note: you don't need to rebase, just close and re-open the PR.
My $0.02: test env/CI/docs dependencies most definitely should be pinned. However, it's good practice to keep them up-to-date. Yet it's no job for a human; we've got robots to do that dirty work. These two I saw other people using and have used myself:
They just send PRs against master with bumped dependencies, which triggers CI. Once CI is green, a human can just hit merge. That's it! Well, if that's too boring, we can also automate merging on green CI...
Assuming that you're not sourcing the
The If you don't mind, I'd try to create a PR with pinned versions.
This is not quite what we're talking about here. In general we want two "requirements" files: one that is abstract and follows the setup.py rules, and one that is a concrete "lock file" pinning the test and documentation requirements and their transitive dependencies. In any case, pinning the requirements is not the difficult part of this ticket; the harder part is setting up some automated mechanism for updating the requirements files, since we want both reproducible test and doc environments and reasonably recent versions of our dependencies.
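One common way to realize the abstract/concrete split described above is pip-tools' `pip-compile`, which reads an abstract `.in` file and emits a fully pinned lock file. A minimal sketch (file names and version numbers are illustrative, not taken from this repo):

```text
# docs/requirements.in -- abstract, hand-maintained
sphinx >= 1.8

# Regenerate the lock file with:  pip-compile docs/requirements.in
# docs/requirements.txt -- concrete, generated, committed to the repo
sphinx==2.1.2             # via -r docs/requirements.in
docutils==0.14            # via sphinx (transitive deps get pinned too)
```

CI then installs from the generated file (`pip install -r docs/requirements.txt`), so the environment is reproducible, while humans only ever edit the `.in` file.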
I thought it was more about the bug that we stumbled upon yesterday when trying to fix #1761 -- where running tests in pristine
I still believe that pinning everything in
@pganssle may I suggest having |
@webknjaz Yes, I always forget about
The way you have implemented it actually destroys information, because some of these things have minimum or maximum dependency requirements for a reason, and others are pinned to a specific value for a reason. Pinning all of them doesn't allow us to know which ones are safe to update and which ones aren't. I think the way forward is to go with Sviatoslav's suggestion of a
I don't think I agree, especially not with the
But in the end it's your decision, of course. I'd just like to point out that you have some responsibility here. Packaging in Python (despite all the awesome effort of you guys) is still very messy. People look at the specs and then maybe at some example code from the Packaging Authority, only to find out that even they are not following the specs.
@pganssle FTR I usually include a constraints reference in requirements files so that it's automatically enforced:
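A constraints reference of the kind mentioned above might look like this (a sketch; file names and pins are illustrative):

```text
# requirements/tests.txt
-c constraints.txt        # pip applies these pins to everything resolved below
pytest
pytest-cov

# requirements/constraints.txt
pytest==5.0.1
coverage==4.5.3           # transitive dependencies can be pinned here too
```

Note that `-c` paths are resolved relative to the file that contains them, and packages listed only in the constraints file are not installed unless something actually requires them.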
As for updating things,
If that doesn't fit the needs, this project may need its own robot, and I may be willing to collaborate on that. But we'll need to come up with clear requirements first.
If we do not freeze all dependencies (direct and transitive), we have the same problems as if we froze nothing. I'm not sure what problems a true lock file solves that
What you linked to is not a spec; it just explains the differences between the kind of things people use
In
The
FTR dependabot is now owned by GitHub, so it's most likely the way to go, but we'd need to convince them to improve Python support... Oh, it looks like they are integrating it directly: https://github.blog/2019-05-23-introducing-new-ways-to-keep-your-code-secure/.
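For reference, the GitHub-native Dependabot mentioned above is configured with a `.github/dependabot.yml` file; a minimal sketch for a pip project (directory and schedule are illustrative):

```yaml
# .github/dependabot.yml
version: 2
updates:
  - package-ecosystem: "pip"   # watches requirements/constraints files
    directory: "/"             # where those files live
    schedule:
      interval: "weekly"       # how often to open bump PRs
```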
So I was thinking about and discussing this a bit at the EuroPython sprint; I think we'll probably want to go with something custom, at least for a little while. One problem we might have is that our constraints will be different for each version of Python we test against (e.g. we may be using
Ideally we would also do some simple collapsing of requirements just to prevent these files from ballooning (e.g. merge together all the constraints that are the same for all versions, or for all versions
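One way to wire up a separate constraints file per test environment is pip's `PIP_CONSTRAINT` environment variable, set from tox; a sketch assuming a hypothetical `constraints/` directory with one lock file per tox env:

```ini
# tox.ini (fragment)
[testenv]
# {envname} expands to py36, py37, etc., so each env gets its own pins
setenv =
    PIP_CONSTRAINT = {toxinidir}/constraints/{envname}.txt
deps =
    -r requirements/tests.in
commands =
    pytest {posargs}
```

This keeps the abstract requirements shared across environments while the per-Python pins live in the version-specific constraints files.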
@pganssle it looks like dependabot now uses pip-tools. Maybe it's worth trying it out. The maintainers seem to be open to improvements too...
@webknjaz From what I've seen,
A few updates here. I've been experimenting with an approach that seems to be close to what you were suggesting (as I understand now). Basically, it's a tox integration with a pip wrapper that automatically adds TBC.
Yeah, but that's automatable now + GH is now alpha-testing a Zuul-like merge queue + we could specify monthly or quarterly updates if needed. The main showstopper right now is managing multiple constraints files, which, when I get it right, will open new doors for us.
When filing issue #1565, I realized that if we don't pin exact versions of the documentation requirements, some future person who wants to build old versions of the docs may start running into errors. The same goes for people trying to run old versions of the tests.
I'm thinking maybe we should start pinning all development requirements, and set up a bot to monitor them for changes and automatically bump them. That will make it much easier for people in the future to reproduce old behavior of the repo.