Re-enable cucim and xgboost in CUDA 12 rapids builds. #669
@@ -6,20 +6,21 @@ set -euo pipefail
source rapids-env-update

CONDA_CONFIG_FILE="conda/recipes/versions.yaml"
export CONDA_OVERRIDE_CUDA="${RAPIDS_CUDA_VERSION}"

rapids-print-env

rapids-logger "Build rapids-xgboost"

rapids-mamba-retry mambabuild \
    --no-test \
    --use-local \
    --variant-config-files "${CONDA_CONFIG_FILE}" \
    conda/recipes/rapids-xgboost

rapids-logger "Build rapids"

rapids-mamba-retry mambabuild \
    --no-test \
    --use-local \
    --variant-config-files "${CONDA_CONFIG_FILE}" \
    conda/recipes/rapids

Review comments on the mambabuild invocations (truncated):
- "just want to mention that for other repos, we use …"
- "It looks like …"
- "Yeah …"
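For context, the wrapped build calls above amount to ordinary conda-build invocations. The sketch below is a rough, unwrapped equivalent, assuming `rapids-mamba-retry` only adds retry behavior and that the `conda mambabuild` subcommand (from boa) is available; the flags come from the script itself, while the CUDA version shown is illustrative.

```sh
# Rough unwrapped equivalent of the build step above (a sketch, not the CI tooling).
export RAPIDS_CUDA_VERSION="12.0.1"                  # illustrative value
export CONDA_OVERRIDE_CUDA="${RAPIDS_CUDA_VERSION}"  # solver treats __cuda as present at this version
CONDA_CONFIG_FILE="conda/recipes/versions.yaml"

conda mambabuild \
    --no-test \
    --use-local \
    --variant-config-files "${CONDA_CONFIG_FILE}" \
    conda/recipes/rapids-xgboost
```

Exporting `CONDA_OVERRIDE_CUDA` matters on build machines without a CUDA driver: it overrides the `__cuda` virtual package version that conda would otherwise try to detect from the driver.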
@@ -45,20 +45,12 @@ requirements:
     - cudf ={{ major_minor_version }}.*
     - cugraph ={{ major_minor_version }}.*
     - cuml ={{ major_minor_version }}.*
-    {% if cuda_major == "11" %}
-    # Temporarily disabled on CUDA 12 until
-    # https://github.com/rapidsai/cucim/issues/513 is complete
     - cucim ={{ major_minor_version }}.*
-    {% endif %}
     - cuspatial ={{ major_minor_version }}.*
     - custreamz ={{ major_minor_version }}.*
     - cuxfilter ={{ major_minor_version }}.*
     - dask-cuda ={{ major_minor_version }}.*
-    {% if cuda_major == "11" %}
-    # Temporarily disabled on CUDA 12 until
-    # https://github.com/rapidsai/xgboost-feedstock/issues/4 is complete
     - rapids-xgboost ={{ major_minor_version }}.*
-    {% endif %}
     - rmm ={{ major_minor_version }}.*
     - pylibcugraph ={{ major_minor_version }}.*
     - libcugraph_etl ={{ major_minor_version }}.*

@@ -69,23 +61,11 @@ requirements:
     - conda-forge::ucx-proc=*=gpu
     - conda-forge::ucx {{ ucx_version }}

-test:  # [linux64]
-  imports:  # [linux64]
-    - cucim  # [linux64]
-    - cudf  # [linux64]
-    - cudf_kafka  # [linux64]
-    - cugraph  # [linux64]
-    - cuml  # [linux64]
-    {% if cuda_major == "11" %}
-    - cusignal  # [linux64]
-    {% endif %}
-    - cuspatial  # [linux64]
-    - custreamz  # [linux64]
-    - cuxfilter  # [linux64]
-    - dask_cuda  # [linux64]
-    - dask_cudf  # [linux64]
-    - pylibcugraph  # [linux64]
-    - rmm  # [linux64]
Review thread on the removed import tests (old lines 73-88):
- "We could run these through … This would let us test for their existence without needing to …"
- "I don't think this is necessary for this PR, maybe file an issue or PR with this proposal later on. I feel comfortable with the current level of testing, which is higher than what we had before."
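The reviewer's exact suggestion above is cut off, but one way to check that each package exists without importing it (and therefore without needing a GPU at test time) would look roughly like the sketch below. This is an illustrative assumption about the idea, not text from the PR; the module list is arbitrary.

```sh
# Hypothetical sketch: verify packages are present without importing them.
# importlib.util.find_spec locates a top-level module without executing it,
# so no CUDA context is created during the check.
for mod in cucim cudf cugraph cuml rmm; do
    python -c "import importlib.util, sys; sys.exit(importlib.util.find_spec('${mod}') is None)" \
        || { echo "missing: ${mod}"; exit 1; }
done
```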
+test:
+  requires:
+    - cuda-version ={{ cuda_version }}
+  commands:
+    - exit 0

 about:
   home: https://rapids.ai/
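The new `test:` section asserts only that the freshly built package resolves together with a pinned `cuda-version`; with `exit 0` as the sole command, the environment solve is effectively the test. A rough manual equivalent, as a sketch with illustrative names and versions, would be:

```sh
# Sketch: approximate the check the new test section performs, i.e. that the
# metapackage plus a pinned cuda-version solves into one environment.
# --dry-run stops after the solve; --use-local picks up locally built packages.
CONDA_OVERRIDE_CUDA="12.0" conda create --dry-run --use-local \
    --name rapids-smoke-test \
    rapids cuda-version=12.0
```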
Review thread on the `__cuda` run requirement:
- "We should probably consider the implications here. Today, I think rapids is installable even on CPU-only machines. The new rapids-xgboost package design requires `__cuda` to install. This is important to support for cases like HPC systems with CPU login nodes and GPU worker nodes that use the same environment."
- "Yeah, they can also install by setting `CONDA_OVERRIDE_CUDA` to some value. In any event, this is coming from `libxgboost`. So we could move this just to the `rapids-xgboost` if we prefer."
- "What is the relevant new change? Is it in libxgboost or in something about how rapids-xgboost is packaged?"
- "Okay. Well, maybe this requirement already existed. I'm not sure. The question is whether `__cuda` should be a hard requirement for installation, which is coming from xgboost-related packages. I'm not sure if it was that way for the old xgboost packages we shipped in 23.06 or not. Regardless, it feels funny that no other RAPIDS package has this requirement besides xgboost."
- "Dropping this in PR #673, which pulls in the new `xgboost` packages."
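To illustrate the workaround mentioned above: on a CPU-only machine (for example an HPC login node) the `__cuda` virtual package is absent, so a dependency that requires it will not solve unless the version is overridden. The sketch below is an assumed, illustrative install command; the channels and versions are not taken from this PR.

```sh
# Sketch: create a RAPIDS environment on a machine with no GPU/driver.
# CONDA_OVERRIDE_CUDA makes conda report __cuda at the given version,
# so run requirements on __cuda can still be satisfied by the solver.
export CONDA_OVERRIDE_CUDA="12.0"   # illustrative CUDA version
conda create --name rapids-cpu-login \
    --channel rapidsai --channel conda-forge --channel nvidia \
    rapids=23.08 cuda-version=12.0  # versions are illustrative
```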