Conversation
Hi! This is the friendly automated conda-forge-linting service. I just wanted to let you know that I linted all conda-recipes in your PR.
Force-pushed from d79ead9 to 6f3dc2a
Ping @conda-forge/hdk
I believe we'd better keep Ilya in the contributor list; he is quite active in the HDK project: https://github.com/ienkovich
Sorry for the delay; we want to issue the release first (today/tomorrow) and then review this PR. Thanks!
As I wrote in 56f4de3:
He's most welcome to be a maintainer, but then accept the invitation please.
Ideally a new release (from the POV of the feedstock) comes on top of this PR, but now that I've heard back, I'm happy to wait a bit (and rebase this PR later if unavoidable).
Some more thoughts regarding this PR: it seems testing takes considerably more time, so it may make sense to limit the number of configurations to the most popular ones.
Hi @h-vetinari - Thanks for your patience while we got our release out. It looks like this PR needs a rebase. We do have some concerns about the CI explosion, but I agree with your point that having multiple compatible versions tested makes for easier integration with other packages in the community. That being said, we dropped support for Arrow 10 and our minimum supported Arrow version is 11, so if you could make that change we would appreciate it. That will also reduce the CI explosion somewhat.
It's a fair point. I dropped the CUDA 11.8 migration, as that's only really necessary for expanded architecture support. Otherwise, the CUDA 11.2 builds should be compatible with all other 11.x drivers.
Great 👍
Force-pushed from 6f17c5b to e5aabb9
The transition to the CUDA 12 SDK includes new packages for all CUDA libraries and build tools. Notably, the cudatoolkit package no longer exists, and packages should depend directly on the specific CUDA libraries (libcublas, libcusolver, etc.) as needed. For an in-depth overview of the changes and to report problems, [see this issue](conda-forge/conda-forge.github.io#1963). Please feel free to raise any issues encountered there. Thank you! 🙏
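As a hedged illustration of the dependency style the migration describes (the library names are real conda-forge CUDA 12 packages, but which libraries a given recipe actually needs, and the exact placement shown here, are assumptions for this sketch, not this feedstock's recipe):

```yaml
# Sketch only: under CUDA 12 there is no cudatoolkit metapackage, so a
# recipe depends on the individual CUDA libraries it links against.
requirements:
  build:
    - {{ compiler("cuda") }}
  host:
    - cuda-version {{ cuda_compiler_version }}
    - libcublas-dev
    - libcusolver-dev
  run:
    - libcublas
    - libcusolver
```

The `-dev` packages provide headers and link libraries at build time, while the plain library packages satisfy the runtime dependency.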
CUDA 11.0 is not supported anymore; now building for 11.2 & 12.0 on supported platforms; no other skip necessary.
…nda-forge-pinning 2023.09.29.05.06.46
Force-pushed from e5aabb9 to 66f6f8d
recipe/meta.yaml (outdated)
```yaml
    - zlib
  run:
    - arrow-cpp-proc {{ arrow_proc_version }} {{ build_ext }}
    - pyarrow ={{ arrow_version }}=*{{ build_ext }}
    - pyarrow =*=*{{ build_ext }}
    - python
    - openjdk 20.*
```
Sidenote: out of curiosity, do you really need to depend on the newest java version? This is going to make hdk impossible to co-install with a relatively large set of java-related packages. Not critical, but curious (on the openjdk feedstock we mainly focus on the LTS versions, but there's been some interest lately also in the intermediate ones).
@h-vetinari, one of our users wants a specific security scan to be green, and this usually implies the latest java version.
Well, who's keeping them from depending on the latest java version on their side? That shouldn't be reflected in the recipe. Besides, java 11 & 17 still see regular releases, and will receive the same amount of security fixes.
Let's remove this. If hdk just needs some openjdk, then any consumer can add a constraint on openjdk 20 or whatever they want.
```diff
-    - openjdk 20.*
+    - openjdk
```
or, if there's a lower bound:
```diff
-    - openjdk 20.*
+    - openjdk >=11
```
cc @vlad-penkin
I will merge this on the weekend unless there are any further comments until then.
@h-vetinari, a few comments on the openjdk pinning:
- pyhdk targets the Python ecosystem; there is no interop with the Java ecosystem. openjdk is only needed to run Calcite internally.
- Java ecosystem use cases in conda-forge are quite limited, and primarily around bazel, which is needed for builds.
- Older openjdk versions in conda-forge tend to have more vulnerabilities; even though 11 and 17 are LTS, it's also about the cadence of updates of the upstream binary builds and the subsequent feedstock repack updates.
- We also want to ensure that pyhdk works with all openjdk versions in range; previously we had issues with JNI in older versions.
> pyhdk targets python ecosystem, no interop with java ecosystem.
That's fair, though you don't know what packages users will combine in their environments; if anyone needs another package that depends on an LTS openjdk, this pin would prevent them from installing that package side-by-side with hdk, which is not great.
> older openjdk versions in conda-forge tends to have more vulnerabilities, even though 11 and 17 are LTS it's also about the cadence of the updates of the upstream binary builds and subsequent feedstock with repack updates
I honestly doubt that LTS versions have more vulnerabilities, or for longer. Upstream releases builds for all supported versions pretty much concurrently every ~2 months, and the rebuilders (like Temurin) that we rely on publish their versions almost immediately afterwards. I'm also keeping an eye on this on the openjdk feedstock, and currently everything is on its newest supported version.
If you notice an issue, best would be to open an issue (or better: a PR) to the openjdk feedstock.
> we also want to ensure that pyhdk works with all openjdk versions in range, previously we had issues with the jni in older versions.
Sounds great! 👍
See warning from the link check:
```
WARNING (pyhdk,bin/SqliteConnector.dll): Needed DSO Library/bin/sqlite3.dll found in ['libsqlite']
WARNING (pyhdk,bin/SqliteConnector.dll): .. but ['libsqlite'] not in reqs/run, (i.e. it is overlinking) (likely) or a missing dependency (less likely)
```
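A minimal sketch of the usual remedy for this kind of overlinking warning: declare the package that provides the needed DSO as an explicit run requirement (whether a lower bound on libsqlite is wanted is left open in this thread, so none is shown):

```yaml
# Hypothetical meta.yaml fragment: SqliteConnector.dll needs sqlite3.dll,
# which is provided by libsqlite, so list it explicitly at run time.
requirements:
  run:
    - libsqlite
```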
this anticipates the upcoming boost 1.82 migrator (for windows at least); the run-dep is now taken care of by a run-export from libboost-devel.
…nda-forge-pinning 2023.09.29.05.06.46
Force-pushed from 1560f23 to 57dd789
The code looks good and clean to me; I especially like the .bat file changes.
…nda-forge-pinning 2023.09.29.18.08.25
@conda-forge/hdk, I'd like to merge this soon. Please let me know what lower bound of openjdk is actually necessary for correct functioning of the package, and if you explicitly want to carry a lower bound for libsqlite (this would only be for extra clarity; the build already pulls in the newest version by default anyway).
Java version pinning looks fine for now, if needed we can tweak it later. Otherwise I think this is good to go. Thanks for all your work on this and my apologies for the initial delays on our side.
Catch up on all outstanding migrations.
Closes #9
Closes #34
Closes #35
Closes #44
Also build for each supported arrow version, rather than just one. That way, hdk stays compatible with the rest of the ecosystem and up-to-date with the newest versions (modulo the migration PRs, of course). We could reduce the explosion of the CI matrix a bit if we decided to only support the two newest arrow versions here, for example...
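For context, a per-version build matrix like the one described is typically driven by the feedstock's conda_build_config.yaml; a hedged sketch follows (the key name and version list are illustrative here, since the real values come from conda-forge-pinning):

```yaml
# Hypothetical conda_build_config.yaml fragment: one build per listed
# arrow version; shortening the list shrinks the CI matrix accordingly.
libarrow:
  - "11"
  - "12"
  - "13"
```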