Create a new 16.0 build with updated run exports #1418
Comments
The pinning moved to 16.1 now, as the migration from 16.0 -> 16.1 was very easy/quick. That should solve your problem?
I'm not 100% sure what you mean - packages not leveraging the pinning should be using 16.1 already?
Yup, that's good enough for us to be able to fix our problem. We'll upgrade to 16.1
To elaborate, the problematic case is when a non-conda-forge package has pinned its build against arrow 16.0 and therefore inherits the incorrect run exports. We can upgrade to 16.1 to avoid this problem, but it could still impact anyone else who is building non-cf packages and has pinned their builds to 16.0.
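To make the failure mode concrete, here is a minimal sketch of such a setup (the pin key and value are illustrative, not RAPIDS' actual configuration): a feedstock outside conda-forge carries its own local pin, so it never picks up the global pinning migration and keeps building against 16.0:

```yaml
# conda_build_config.yaml in a hypothetical non-conda-forge feedstock.
# This local pin overrides conda-forge's global pinning, so builds keep
# resolving the 16.0 packages and inherit their too-loose run exports.
libarrow:
  - "16.0"
```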
I understand that the run-exports on 16.0 are currently incorrect. My question was why someone not using the global pinning would be affected, as they would then presumably be using 16.1 already. I agree that it's something that should be fixed in principle, but I'm ambivalent about creating a branch just for that.
The example I gave was how RAPIDS was previously handling its pinning and it's why we weren't using 16.1 yet. However, that was before you merged conda-forge/conda-forge-pinning-feedstock#5930. Our goal is usually to stay aligned with conda-forge's pinnings to make our packages install smoothly with conda-forge packages. Now that cf has been updated, I am also ambivalent about whether we update the 16.0 builds or just close this issue as a wontfix. I don't see any reason to jump through hoops to support the 16.0 use case given that cf has moved its pinning. IOW feel free to close this if you don't think the work of the branch is worthwhile.
Potentially we could...
It would be nicer if we could just repodata patch the old packages, though.
I don't think a branch is worthwhile because there are several more 16.0 builds with the wrong run-export, and all that's necessary to pull them in is to use an old pin for something that was migrated recently. So it's a corner case of a corner case IMO (the ABI wasn't broken AFAICT, just the SOVERSION changed), and it'll solve itself for anyone using up-to-date packages. That said, if you feel strongly about this, feel free to go ahead with your plan (just ensure that the back-merge doesn't change anything compared to …).
Are the run-exports actually present in the repodata itself? I thought (without any justification though) that they're just present on artefact-level?
I believe that you're right and they're only at the artifact level, and not in the repodata. Sounds like we're leaning towards treating this as a wontfix, which as I said is fine with me.
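For reference, a minimal sketch of where that metadata actually lives (the exact spec string below is an assumption): conda-build stores run exports in info/run_exports.json inside the package archive, which is exactly why a repodata patch cannot reach them:

```yaml
# Illustrative contents of info/run_exports.json inside a libarrow 16.0
# artifact (the spec string is a guess). This file ships in the archive
# itself, not in the channel's repodata.json, so repodata patches cannot
# change it.
{
  "weak": ["libarrow >=16.0.0,<17.0a0"]
}
```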
We've discussed adding run exports to the repodata before.
Sounds like that would be worth an issue somewhere - could you raise one?
Better late than never. Here is an issue: conda/ceps#87
Solution to issue cannot be found in the documentation.
Issue
In #1409 we fixed the run exports for the arrow C++ packages because we realized, with the release of arrow 16.1 (the first minor release of arrow), that the run exports were too loose. The arrow library is SOVERSIONed according to both the major and minor version, so the 16.0 and 16.1 packages are not compatible (you can't build with 16.0 and run with 16.1). However, that fix only landed in the newly created 16.1 packages. The conda-forge pinning for arrow is still set at 16.0, so any packages built against arrow on conda-forge will still be fine. However, other packages built against the conda-forge arrow package without the benefit of these pinnings remain exposed to the problematic run exports on the 16.0 packages.
We need to publish new arrow 16.0 packages with the correct run exports to fix this use case. We can use a new branch for that as documented here.
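As a rough sketch of what the corrected run export amounts to (an illustrative recipe excerpt, not the actual arrow-cpp-feedstock meta.yaml): because the SOVERSION tracks major.minor, the max_pin has to cover both version components:

```yaml
# Illustrative meta.yaml excerpt. libarrow's SOVERSION follows
# major.minor, so a package built against 16.0 must not be allowed
# to run against 16.1.
build:
  run_exports:
    # too loose: yields >=16.0.0,<17.0a0, so 16.1 satisfies it at runtime
    # - {{ pin_subpackage('libarrow', max_pin='x') }}
    # correct: yields >=16.0.0,<16.1.0a0
    - {{ pin_subpackage('libarrow', max_pin='x.x') }}
```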
Installed packages
Environment info