installing on alpine is slooow, if we can't have pre-compiled, could we have parallel builds instead? #261
Perhaps phase one would be a new package which wrapped around pip and installed packages in parallel? It would "simply" need to:
Apart from being ugly, is there anything that would block this from working?
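A rough sketch of what such a wrapper might look like (hypothetical code, not an existing package or a pip API; the function names, worker count, and package list are made up for illustration). It builds wheels concurrently with `pip wheel`, which does not touch the install directory, and then installs the pre-built wheels with a single serial `pip install`:

```python
# Hypothetical wrapper around pip: build wheels in parallel, install serially.
import subprocess
import sys
import tempfile
from concurrent.futures import ThreadPoolExecutor

def build_wheel(requirement, wheel_dir):
    # "pip wheel" only compiles and caches a wheel; it does not write to
    # site-packages, so concurrent invocations do not race on the install dir.
    subprocess.run(
        [sys.executable, "-m", "pip", "wheel",
         "--wheel-dir", wheel_dir, requirement],
        check=True,
    )

def install_in_parallel(requirements, max_workers=4):
    with tempfile.TemporaryDirectory() as wheel_dir:
        # Each task shells out to pip, so the slow C builds run in separate
        # processes even though this pool uses threads.
        with ThreadPoolExecutor(max_workers=max_workers) as pool:
            list(pool.map(lambda r: build_wheel(r, wheel_dir), requirements))
        # Installing the already-built wheels is fast and can stay serial.
        subprocess.run(
            [sys.executable, "-m", "pip", "install",
             "--no-index", "--find-links", wheel_dir, *requirements],
            check=True,
        )

if __name__ == "__main__":
    install_in_parallel(["uvloop", "asyncpg", "cryptography",
                         "pycares", "aiohttp"])
```

A real implementation would also have to deal with overlapping dependencies and contention on pip's shared cache, which is part of why this is less trivial than it looks.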
You should just prepare a CI pipeline building a Docker image and then use that image in other builds.
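A minimal sketch of that approach, assuming a typical Alpine-based setup (the base image tag, apk packages, and registry path are illustrative, not taken from this thread): a dependencies image is built and pushed by CI, and application builds start from it so the slow source builds happen only once.

```dockerfile
# Dependencies image, built and pushed by a CI job whenever requirements.txt changes.
FROM python:3.7-alpine
# Build tools and headers needed to compile the C extensions from source.
RUN apk add --no-cache build-base libffi-dev openssl-dev
COPY requirements.txt /requirements.txt
# The slow compile step happens here, once, instead of in every application build.
RUN pip install -r /requirements.txt

# Application Dockerfiles then start from the published image, e.g.:
#   FROM registry.example.com/myteam/python-deps:latest
#   COPY . /app
#   RUN pip install -e /app
```

The trade-off, as noted below, is that this image has to be rebuilt and re-published whenever any pinned dependency changes.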
Arguments surely should not be passed via the command line.
For parallel downloading, aria2c can be used.
That's a very poor solution: it involves updating the image on every release of every package. Some of that can be done automatically with pyup etc., but it's still another repo, CI setup, pyup, and image release - a lot more faff.
Well, it depends how good pip's Python API is, but this was just to make a case, hence why I caveated it with "or something marginally less ugly".
As I explained, download is not the bottleneck: it takes <10 seconds, while the build takes minutes.
Yes. The good thing is it doesn't need to happen too often. I usually do the following:
You can have both! It's just that someone has to do the work of figuring out how to implement them and then actually implementing them - like any other functionality in the volunteer-run PyPA projects. Both of those are non-trivial tasks, which is part of why they haven't been "solved" yet. Due to how pip / manylinux are positioned in the ecosystem, the solution to these problems has to be general and not affect existing workflows. If you want to champion this effort, you are welcome to! :D
@pradyunsg, great. My questions were:
@KOLANICH, thanks for your response - I think the length of your description of a workaround demonstrates how much simpler parallel builds in pip would be.
Definitely.
I don't know. I've not looked into this lately but I'm pretty sure you can find some thoughts on the pip issue. Since this issue is scoped to just pip for now, I'm moving this conversation to pypa/pip#825.
Ubuntu and Alpine
any, 3.7
any, latest, 19.x
ref:
I spend a lot of my life waiting for packages to build when building images based on Alpine; it's not download that's the problem but build time.
If you have an image that includes uvloop, asyncpg, cryptography, pycares, aiohttp etc. you can easily be waiting 10 minutes for that build stage alone.
Could pip do those builds concurrently across multiple processes to save time?
On a modern machine this could speed up installs by 10x.
For me this would be an elegant workaround until the time when (if ever) musl binaries are available.
Is there anything fundamentally stopping this from happening (e.g. race conditions on the install directory, etc.)?