pip build command #6041
Related discussion: pypa/packaging-problems#219
Some things that need to be considered, transferred from that discussion:
Overall, I think pip needs a review of all of its download/build functionality, with the intention of rationalising it into a better organised set of subcommands.
I've never been convinced that building the sdist was analogous to building the binary distribution or building to install ("setup.py install"). It would make sense for one command to be able to create more than one kind of binary.
Do you reckon the following is acceptable as an alternative to the current [...]?
$ pip download -r requirements.txt
$ pip build *.tar.gz *.zip
I hadn't really thought through how it would get done, but yes, that seems simple enough to be an acceptable approach. Worth documenting as part of the transition, though - it's not immediately obvious that the average user would think of this. (More accurately, I didn't think of it, and I'd prefer to believe against all the odds that this means it's not obvious, rather than just that I'm dumber than average 😉)
I think the more interesting bit is that we now have 2 kinds of dependencies - build time and run time. This means even a [...]. I have a draft structure in my head -- I'll post it when I find a bit of time to flesh it out.
FTR - I've forgotten what I had in mind, but I'll be working on pip's build logic in the coming months, so I might pick this up.
I would like to suggest that this is probably a bad idea, at least if the motivation is to create a recommended build tool to replace [...]. One reason for this is that [...]. I'll note that while it seems like [...]. Another reason to avoid adding this to [...]. The one time where I think it would make sense for [...].
It would make sense to me if twine grew or used a PEP 517 builder to produce wheels. In some simple projects where I use flit, the same tool builds and uploads dists.
These arguments make sense to me. However, it's worth noting that (at least in my mind) the intention very definitely is to move to a "source → sdist → wheel → install" process. I'd rather we did that by delegating all of the build processes to a separate library that could be shared among different tools. The [...]
I'm on board for doing a [...]. Your comments about the [...]
Yeah, I think we should probably have something that is basically a thin-ish wrapper around [...].
Yeah, the biggest complication of the builder tool, I think, will be that we'll either have to expose the implementation detail that it uses [...]. I would like to see a future where the tool for doing package installations is also configurable, so maybe it's best to expose it as a configurable implementation detail, with either a plugin system or some guarantees about how the environment setup utility will be called.
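For illustration, here is a rough, in-process sketch of the hook protocol such a thin wrapper would drive. Real frontends import the backend named by the build-backend key in pyproject.toml and invoke its hooks in a subprocess inside an isolated environment; the stub backend below is entirely hypothetical and only shows the calling convention:

```python
import importlib
import os
import sys
import tempfile
import textwrap

# Stub backend source. A real PEP 517 backend (e.g. setuptools.build_meta)
# would build an actual wheel; this stub just creates an empty file so the
# hook signature and return value can be demonstrated.
backend_src = textwrap.dedent("""
    import os

    def build_wheel(wheel_directory, config_settings=None,
                    metadata_directory=None):
        name = "demo-1.0-py3-none-any.whl"
        open(os.path.join(wheel_directory, name), "wb").close()
        return name
""")

workdir = tempfile.mkdtemp()
with open(os.path.join(workdir, "stub_backend.py"), "w") as f:
    f.write(backend_src)

# A frontend resolves the backend from the build-backend string in
# pyproject.toml; "stub_backend" stands in for that value here.
sys.path.insert(0, workdir)
backend = importlib.import_module("stub_backend")

# Per PEP 517, build_wheel returns the basename of the wheel it wrote.
wheel_name = backend.build_wheel(workdir)
print(wheel_name)  # demo-1.0-py3-none-any.whl
```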
@pradyunsg and @merwok's comments illustrate exactly why I think a third tool makes the most sense. If we're worried about tool proliferation, we may want to create a wrapper tool ([...]).
This is where I've ended up multiple times when thinking about this stuff, and it makes a lot of sense to me. I know @ncoghlan probably also has thoughts here along similar lines.
Already a thing: https://pypi.org/project/ppm/. I have a name in mind if I end up developing this tool, but I'd rather not state it publicly; it may get squatted by someone else on PyPI before me. I've had 2-3 false starts at writing such a tool.
I think worrying about another tool is a tiny bit premature, TBH. Quite frankly, I think every time we introduce yet another project/tool, people end up more confused. Libraries are great; splitting complex topics into multiple libraries is perfectly fine to me, because I don't think most end users are going to interact with the libraries directly, only advanced users who are going to understand the differences better.

This is probably a larger question better suited for Discourse, but honestly I think we need to take a holistic view of what our ideal tool looks like. I would try to put the current set of tools out of mind for now, and try to design an experience up front. It can be useful to experiment with a brand new project (or projects) for this, but ultimately the goal shouldn't necessarily be to produce yet another tool, but to come up with an outline of what we think the destination should be.

Once we have a destination in mind, we'll be far better off deciding whether we should adapt the current tools to provide that, or whether we should look at creating new tools. We can also decide then where it makes sense to compromise on our ideal, in order to fit into the real world of what the sheer weight of existing code demands. Personally, I think there are a few major questions we need to ask ourselves:
All that being said, I think trying to follow the "unix philosophy" is a mistake, and actually a pretty poor UX. Yeah, a lot of nerds grok it because we've suffered enough collective brain damage from being forced to use it over time, and it works better for the typical unix tools because they generally come preinstalled. I think it would just add additional complexity to an already confusing landscape of tools for our end users. Ideally we figure out a rough idea of our ideal "goal" and figure out how to turn our current tools into that; failing that, we can at least better articulate why we needed yet another tool.
I'll start a Discourse topic. :)
I think we should still do this, using pypa/build or our existing code. We already host the logic for this, and I still feel that exposing this to users more directly would be a genuine usability improvement for them. All frontends that are attempting to encapsulate the user workflow are providing a build command on their CLI as well, for usability reasons.
As a user, I totally agree. Especially since direct usage of setup.py got discouraged, it'd be an advantage if pip could create a source archive.
PRs are welcome, of course. Otherwise, it'll be a matter of when the pip maintainers get around to this, which might be some time (we're all pretty busy).
Ended up here after investigating warnings from setup.py, followed several threads to other discussions, and eventually found that this seems to have been addressed in a new(ish) [...]
Phase 2 from #5407
Some notes in #5407 (comment)
pip now has PEP 517 support and uses build isolation by default on packages with pyproject.toml files -- which means the way pip builds is difficult to reproduce externally in a trivial manner.
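Roughly, the isolation step means each PEP 517 build gets its own private environment, into which pip installs the [build-system].requires entries before invoking the backend there. A minimal sketch of just the environment-creation part, using the stdlib venv module (pip's real logic lives in its internals and does considerably more):

```python
import os
import tempfile
import venv

# pip's build isolation, very roughly: a throwaway environment per build,
# so the backend never sees the user's site-packages. pip's actual
# implementation (in pip._internal.build_env) also installs the
# [build-system].requires entries into this environment.
build_env = tempfile.mkdtemp(prefix="pip-build-env-")
venv.create(build_env, with_pip=False)

bin_dir = "Scripts" if os.name == "nt" else "bin"
print(os.path.isdir(os.path.join(build_env, bin_dir)))  # True
```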
The pip build command would use the same code paths as the PEP 517 support, with the sole difference being that it does not install the packages. It's essentially the pip wheel in a PEP 517 world.