Fix run exports to restrict the minor version #1409
Conversation
Hi! This is the friendly automated conda-forge-linting service. I just wanted to let you know that I linted all conda-recipes in your PR.
@conda-forge-admin, please rerender
…conda-forge-pinning 2024.05.15.19.07.12
Thanks Vyas! 🙏
Have a few questions below. Admittedly these may be things other maintainers have thoughts on as well.

Also, whatever we decide, we will likely want a repodata patch as well. We can wait to do that until we have finished this PR, though.
- {{ pin_subpackage("libarrow-gandiva", max_pin="x") }} | ||
- {{ pin_subpackage("libarrow-substrait", max_pin="x") }} | ||
- {{ pin_subpackage("libparquet", max_pin="x") }} | ||
- {{ pin_subpackage("libarrow", max_pin="x.x") }} |
Am noticing in pyarrow, we are doing this:

- libarrow {{ version }}.*=*{{ build_ext }}

Do we want a similarly tight pin here or should pyarrow relax its pin?

Recognize more is happening in the pyarrow case. Am only focusing on how tight we pin (and whether we include the package hash).
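For readers less familiar with conda match specs, here is a hedged reading of that pin; the version and build_ext values below are assumed for illustration:

# With version=16.1.0 and build_ext=cpu (both hypothetical), the line
#   - libarrow {{ version }}.*=*{{ build_ext }}
# renders to the match spec
#   libarrow 16.1.0.*=*cpu
# i.e. it pins the exact major.minor.patch version and globs the build
# string, which is tighter than either max_pin="x" or max_pin="x.x".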
> Am noticing in pyarrow, we are doing this

This is necessary because pyarrow is built on a different feedstock, and so cannot use pin_subpackage.

> Do we want a similarly tight pin here or should pyarrow relax its pin?

Neither.
- {{ pin_subpackage("libarrow-acero", max_pin="x.x") }} | ||
- {{ pin_subpackage("libarrow-dataset", max_pin="x.x") }} | ||
- {{ pin_subpackage("libarrow-flight", max_pin="x.x") }} | ||
- {{ pin_subpackage("libarrow-flight-sql", max_pin="x.x") }} | ||
- {{ pin_subpackage("libarrow-gandiva", max_pin="x.x") }} | ||
- {{ pin_subpackage("libarrow-substrait", max_pin="x.x") }} | ||
- {{ pin_subpackage("libparquet", max_pin="x.x") }} |
For these, is a sufficiently tight libarrow pin sufficient? If so, could we drop max_pin altogether?

Or do we want all of these equivalently tight? In which case maybe a Jinja for-loop would be more DRY and easier to maintain.

Also this comes up below.
> In which case maybe a Jinja for-loop would be more DRY and easier to maintain
Let's not do that. It creates more comprehension overhead than it reduces, especially if the list of elements is not next to the loop anymore. For artefacts testing that's less of an issue, but for run-exports we really need to be crystal-clear, because errors have far more wide-reaching consequences.
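For reference, the kind of loop under discussion would look roughly like this (a sketch only, list abridged):

{% for lib in ["libarrow-acero", "libarrow-dataset", "libarrow-flight",
               "libarrow-flight-sql", "libarrow-gandiva",
               "libarrow-substrait", "libparquet"] %}
- {{ pin_subpackage(lib, max_pin="x.x") }}
{% endfor %}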
@raulcd, this is exactly what I meant when asking about arrow's first minor version bump in #1407. 😅 Given that this was explicitly loosened in #1096, perhaps the ABI should be the same, and the libraries only versioned by the major versions? CC @kou @jorisvandenbossche
@vyasr, could you send a PR to mark the just-released …
Yup, will do.
Opened conda-forge/admin-requests#996. Let me know if that looks right (there are a lot of arrow libraries; I went ahead and marked all of them).
@jorisvandenbossche @kou @raulcd
I have no idea why the original implementation did it this way. I wasn't aware the ABI would change between minor versions. Nonetheless, for 16.1.0 we already have a change of ABI, so the only solution for conda is to pin the minor version. Otherwise we would require a new Arrow release and for conda to skip 16.1.0, right?
Please see #1096 by your colleagues.
Do we know that the ABI actually changed? Or is it just the default way the SOVERSION is constructed (taking into account the minor version) that makes it seem like the ABI has changed?
This is all very unclear and undocumented, a situation we should definitely improve on the Arrow side. See also apache/arrow#41659 and apache/arrow#41679.

My understanding: …
That's fair enough, though there's a cost to this safety: incompatibility of a lot of packages with each other, as well as the need to migrate all arrow-dependent packages every time. I'm OK with migrating per minor version if that's the reality, but I'd really ask if you can make it work to not break ABI on patch releases.
Yep, that's fair. Migrating per minor version is what conda-forge should definitely do at the moment, because as long as we do not re-discuss and decide on this on the Arrow side, the official policy and what we do in practice is bumping the SO version for minor versions. And we indeed should do a better job of ensuring we do not break the ABI in patch releases (I also assume it hasn't happened much; I only remember this one report).

Anyway, I think starting with moving the pin to minor versions sounds good, and I opened the following issues to follow up on the Arrow side:
FWIW, from the RAPIDS end, libcudf has not encountered ABI-breaking issues with patch releases of arrow that I can recall. We have historically used a runtime pinning that allows both the minor and patch version to float, and it hasn't caused us any issues. The minor version of course hasn't mattered since arrow hasn't had a minor release before now, but at least that indicates that empirically patch versions haven't broken the ABI.

Sure, I can put together a repodata patch. I won't get to it until next week, but I should be able to get it done early next week.
OK thanks! In this case, I want to move forward with a bunch of migrations, so I ended up doing it myself. 🙃 If you're interested in how it's done (for next time 😉), see here: conda-forge/conda-forge-repodata-patches-feedstock#731
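Conceptually, such a repodata patch rewrites the depends entries of already-published packages in the channel index; a before/after sketch with illustrative bounds (see the linked PR for the real patch):

# before (too loose: permits the ABI-incompatible 16.1):
#   depends:
#     - libarrow >=16.0.0,<17.0a0
# after (restricted to the 16.0 ABI):
#   depends:
#     - libarrow >=16.0.0,<16.1.0a0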
Thanks for the PR!
@h-vetinari thanks! I'm familiar with repodata patches, but I definitely would not have done it so fast. @conda-forge/arrow-cpp we'll need to backport this to get new builds of 16.0 up that have the correct run exports. The fix here ensures that 16.1 builds are fine, but any packages on other channels that are building against 16.0 (which is still the conda-forge pinning) will still inherit the wrong run exports.
Created an issue with this request in #1418 |
Checklist

- Reset the build number to 0 (if the version changed)
- Re-rendered with the latest conda-smithy (Use the phrase @conda-forge-admin, please rerender in a comment in this PR for automated rerendering)

The current run exports for this library are too loose. Minor releases within the same major family are not ABI-compatible with one another. Arrow has never released a minor version before, so we've never observed this issue, but with the recent 16.1 release we see that the libraries packaged by this recipe all contain SOVERSIONs based on both the major and minor versions of the library, so packages built against one minor version cannot use a different one at runtime.
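Condensed, the pattern this PR moves to looks roughly like the following sketch (structure abridged; one output shown):

outputs:
  - name: libarrow
    build:
      run_exports:
        - {{ pin_subpackage("libarrow", max_pin="x.x") }}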