Workflow for layered requirements (e.g. prod<-test<-dev requirements)? #398
Yes, I totally think that's a good strategy. I've just published my blog post describing this workflow.
Hi @jamescooke, I just saw your post, it looks great! One suggestion I could make here is that you can include the shared .in file (so not the .txt file!) from within your .in files. That way, pip-compile has just a tiny little bit more information to compile the per-env output files.
To answer the original question, you can reference the shared .in file from within your dev .in file, and then run pip-compile on the dev file. And then it all just works™.
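A minimal sketch of what that layout might look like (file names and contents are my own illustration, not taken from the thread):

```
# requirements.in — shared, production dependencies
django

# dev-requirements.in — includes the shared .in file, not the compiled .txt
-r requirements.in
django-debug-toolbar
```

Each file would then be compiled with its own pip-compile invocation, producing requirements.txt and dev-requirements.txt respectively.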
Hi @nvie - thanks for the kind words about the blog post 😊. The reason that I recommended including the compiled .txt file rather than the .in file is to keep the exact pins in sync across environments.

As an example, let's say that a project wants any Django that's version 1.8, so base.in asks for the 1.8 series. When we compile that file in October it picks the latest 1.8 release available at that time, and that pin lands in base.txt. Now, in November a new version of Django is released in the 1.8 series. Using `-r base.in` in dev.in and recompiling only the dev file, you'll see that Django has now been bumped in the dev requirements only, to the November release, while production is still pinned to the October one.

It's for this reason that I have been including the compiled base.txt instead. So with `-r base.txt` in dev.in, when the dev file is compiled we now get the same Django pin that production uses. This maintains the exact Django version across both environments.

I'd be really happy to see the just works™ version of the kind of pinning that I'm talking about - I'm sure I'm getting the wrong end of the stick somewhere. One alternative I could see is to pin the Django version in an .in file by hand, but that gives up some of the benefit of letting pip-compile do the pinning.

Sorry for the long essay 😞
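The divergence being described can be sketched like this (file contents and version numbers are illustrative, not taken from the thread):

```
# base.in
django>=1.8,<1.9

# base.txt, compiled in October
django==1.8.5

# dev.in, including the .in file
-r base.in
django-debug-toolbar

# dev.txt, recompiled in November after a 1.8.6 release
django==1.8.6              # now diverges from base.txt's 1.8.5
django-debug-toolbar==1.6

# Including base.txt instead would keep both environments on 1.8.5
# until base.txt itself is recompiled.
```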
Thanks for the extensive explanation, @jamescooke! You're absolutely correct. This is why I normally also advise to always recompile both (all?) .in files at once, never one and not the other, so the pinned versions remain in sync. But yeah, I agree that this takes discipline, and if you forget about it, the tooling won't protect you against these diverging pins; your example illustrates that perfectly. Not sure how we can build support for this workflow into pip-compile natively. Thanks for shining a little light on this subject.
Sorry if I missed something, but what's the issue with just using `-r requirements.txt` in the dev .in file?
I'll just chime in with 100% voting for the `-r requirements.txt` approach. Only real downside is that sometimes the dev compile fails after the base file has moved on, and you have to recompile both. Maybe a better end-result would be to be able to compile all .in files in a single run, so the resolver sees every constraint at once.
@Groxx Does that happen after you upgrade (recompile) the base requirements but not the dev ones? For (imaginary) example, the base file pins a package to a new version that the stale dev pins can't satisfy?
Yep, exactly. Though maybe more accurately (with fake versions): the freshly compiled base file pins a newer release than the stale dev lock file allows. It's all made worse when the cause is in a transitive dependency rather than something you pinned yourself.
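The stripped example might have looked something like this (package names and versions are my own invention):

```
# requirements.txt, freshly recompiled
somelib==2.0.0

# dev.txt, compiled a month ago, built from an .in with -r requirements.txt
somelib==1.4.0             # via some-dev-tool

# Recompiling dev.in now either fails to resolve or bumps somelib,
# depending on which file was compiled last.
```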
Even if it doesn't take long, it does waste time; and I believe it's one of the reasons why we have this project. I came from a Ruby background, where Bundler did the right thing: even if you ask it to only upgrade a single package, it re-constructs the entire dependency graph, including development ones. Similarly, when upgrading, I believe pip-compile should re-resolve the whole graph rather than one file at a time.
I'm not sure whether current Pip works with this workflow, that is, whether this workflow is a feasible solution for pip-tools to adopt.
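The point about combining requirement sets from several files can be illustrated with the `packaging` library (my own example; the thread's original illustration was lost, and the version ranges here are made up):

```python
from packaging.specifiers import SpecifierSet

# What the base requirements ask for vs. what a dev tool asks for.
base = SpecifierSet(">=1.8,<1.9")
dev_tool = SpecifierSet(">=1.8")

# A whole-graph resolver must pick a version satisfying the
# intersection of every file's specifiers at once.
combined = base & dev_tool

print("1.8.5" in combined)  # True: allowed by both sets
print("1.9.0" in combined)  # False: allowed by dev_tool, rejected by base
```

Resolving each file separately is what lets the two lock files drift apart: each compile only ever sees one of these specifier sets.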
@Groxx thanks for the example. @FranklinYu - The Bundler strategy might also work. Thanks for illustrating the unions of package requirements 👍
@jamescooke yeah, it does happen. The alternative though, if you include the .in file instead, is that dev and prod can end up pinned to different versions with no error or warning at all.
The mismatch between dev and prod and the lack of any error or warning are largely what pip-tools helps eliminate, so to me this is an entirely unacceptable result. I much prefer to have the second step fail, which reveals that there is a problem, rather than rely on my eyes to catch the disparity between the two.
This bit me today. Also coming from a Ruby/Bundler background, I like that all dependencies are in the same lock file, but I don't want to install dev dependencies in production. However this seems incompatible with how pip currently operates. That is, one requirements.txt to rule them all, but separate common/dev dependencies.

I had hoped that having dev.in -> dev.txt would solve this but as others have noted you get conflicts. And while I could have a -r somewhere it still would produce two lock files, which sooner or later will diverge.

So my question is if it would be possible to teach pip-compile to just write the dependencies for one input file, while accepting the pinned ones in another. Perhaps an example would clarify this:

```
# requirements.in
django

# requirements.txt
django==1.9

# dev.in
-r requirements.in
django-debug-toolbar

# dev.txt
-r requirements.txt
django-debug-toolbar==1.6
# note, no direct django dependency here, but still respect the 1.9 bound.
```

Here I've overloaded -r to point to the other file. Thoughts?
How about considering eliminating the need for multiple files by supporting sections in the input file, the way Bundler's Gemfile does with groups:

```ruby
# Install in all environments
gem 'rails'
gem 'mysql'

# Install only in test
group 'test' do
  gem 'rspec'
end

# Install only in development
group 'development' do
  gem 'web-console'
end
```
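Purely as an illustration of this proposal (no such section syntax exists in pip-tools or pip; the bracketed group names are hypothetical), a sectioned input file might look like:

```
# requirements.in (hypothetical syntax)
django

[test]
pytest

[dev]
django-debug-toolbar
```

pip-compile could then emit one lock file per section, each layered on the shared base pins, which would sidestep the multiple-files synchronisation problem entirely.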
@maxnordlund This blog post answers your question, I believe: http://jamescooke.info/a-successful-pip-tools-workflow-for-managing-python-package-requirements.html
That would make it incompatible with vanilla pip.
I've read that, it's linked in the first comment. In the end I wrote a small python script to generate the two files, with dev having an extra include of the base lock file.

The sad part here is that you need another layer to get it right, either a script or make, instead of it being included in the box. The only thing I think might be good enough without changing the format too much is the suggestion above: that a `-r requirements.txt` line in an .in file would make pip-compile respect those pins without re-emitting them.
I think this would bring value, and keep existing processing. The only change would be in the file collection phase before invoking the resolver. I believe this is not too hard to implement; I am open to a PR for this functionality (with relevant tests).
@jamescooke thanks for posting that article (though it was a while ago). I made one slight modification to the Makefile it describes, which corrects an annoyance I ran into. I know this isn't really the place to discuss your Makefile, but I've grown tired of editing these files by hand in every project.
Hi @dfee, thanks for sharing this suggestion 👍 I've not been able to get this working on my machine, so I won't update my article just yet. The post is on GitHub here https://github.com/jamescooke/blog/blob/master/content/1611-pip-tools-workflow.rst - feel free to open an issue / PR to discuss.
Hi guys, wanted to jump in with an observation that for me is really important: whatever solution is chosen, the pins in the compiled dev file need to stay consistent with the pins in the compiled base file, including for indirect dependencies. Anyway, that's all from me. Whoever fixes this, please take this into consideration if you can.
I followed @jamescooke's flow and recently ended up in a state where I had to add a constraint to my base file purely to keep the dev compile working. Try compiling the two files after the base has moved on and the dev compile fails. I've hacked around this for now by explicitly stating the pin in both .in files, which obviously duplicates information.
Is there a possibility of having pip-compile accept several input files and produce layered outputs in one run? For example, one might pass both .in files to a single invocation, which would generate the corresponding .txt files together from one resolution. (It could be sensible to even have this such that the dev output only lists packages not already pinned by the base output.)
Does anyone have any existing work related to this?
I presume it does not. Is this solution from #398 (comment) not suitable for you?
This describes a workflow: #532
@atugushev The workflow is fine. My issue is that on requiring another requirements.txt (e.g. `-r base.txt`) I lose the comments on where the pins originated from: entries in the compiled dev file are annotated as coming via `-r base.txt` rather than via the package that actually requires them.
The canonical way is to use `-r requirements.in` from inside the dev .in file. By including the .in file rather than the compiled .txt, pip-compile sees the original specifiers and can resolve everything together. IMO this workflow with nested .in includes is good enough for most projects.
That works for me. Is there any downside to requiring .in files other than the extra time to compile all required .in files?
What I would love to have is a simplified command that does this stuff in one go.

```bash
#!/usr/bin/env bash
pip install pip-tools
# Compile each input file to its own lock file.
for f in requirements/*.in
do
    pip-compile "$f" -v
done
# Also compile everything together into a single combined lock file.
pip-compile requirements/*.in -v -o requirements/requirements.txt
pip-sync requirements/requirements.txt
```
How about using constraint files (pip's `-c` option)? It could be used in addition to `-r` includes: the dev input constrains itself to the compiled base output without re-listing its packages. But I agree that some standard tooling to support multiple non-contradicting sets of requirements would be ideal here (multiple requirements sections in setup.cfg is one example of similar structures). Resolver could mimic the logic with constraint files above when generating multiple sets of pinned requirements...
Also, when using `-c` the constrained packages are not added to your compiled output unless something actually requires them. I think it's the most correct and flexible way available right now. Not sure if it can cause any issues, but so far I haven't stumbled upon any, and I suspect it behaves exactly as pip's constraint files are meant to.
I've tried this approach and must say that it works like a charm! Would be nice to have it documented somewhere.
@IvanAnishchuk Thanks for the tip on constraint files. Answer = "Use `-c`".

@atugushev When you say "Would be nice to have it documented somewhere", does that mean cramming this into the README? If so, I can give it a go and open a PR. It would be nice to close off this Issue before it gets to be 3 years old.
Yes, it does! Please go for it, I'd be happy to merge it. |
Hey folks, finally this issue's closed! Huge thanks to @IvanAnishchuk for the idea and @jamescooke for the docs! 🎉
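For readers landing here, the layered setup the thread converged on can be sketched roughly like this (file names are my own illustration; see the pip-tools README for the documented version):

```
# requirements.in
django

# dev-requirements.in
-c requirements.txt    # constrain dev resolution to the compiled base pins
pytest
```

Compile requirements.in first so that requirements.txt exists when dev-requirements.in is compiled; packages from the constraint file only appear in the dev output if something there actually needs them, so the two lock files can't silently diverge on shared pins.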
Say I have requirements.in, and also requirements-dev.in. How can I run pip-compile on requirements-dev.in, where it will also take into account the requirements in requirements.in when figuring out which versions to use?

For now I have an ad-hoc script that compiles requirements.in first, and requirements-dev.in has `-r requirements.txt` as its first line. Is this an okay workflow? I'm worried that in the future if I add a dependency it will try and update a bunch of stuff I don't want it to update, but I haven't actually used this tool long enough to determine whether that's truly a problem. Wondering if anyone else has used pip-tools in this fashion and has any advice?