Updates to build and test nx-cugraph wheel as part of CI and nightly workflows #3852

Merged

Changes from all commits (22 commits)
985e921
Updates GHA yaml files to build and test a nx-cugraph wheel, removes …
rlratzel Sep 8, 2023
9ba4595
Changed rapids-mamba-retry to rapids-conda-retry to work around break…
rlratzel Sep 8, 2023
d63eb6e
Revert "Changed rapids-mamba-retry to rapids-conda-retry to work arou…
rlratzel Sep 8, 2023
89e9dc9
Replaces rapids-mamba-retry mambabuild with rapids-conda-retry mambab…
rlratzel Sep 8, 2023
f3ffacc
Merge remote-tracking branch 'upstream/branch-23.10' into branch-23.1…
rlratzel Sep 8, 2023
64a3d8c
chmod +x to new nx-cugraph wheel scripts.
rlratzel Sep 8, 2023
b3c595f
Merge remote-tracking branch 'upstream/branch-23.10' into branch-23.1…
rlratzel Sep 8, 2023
e473c5b
Adds wheel smoke test script for nx-cugraph.
rlratzel Sep 8, 2023
e67a3ee
Adds comments to smoke test script.
rlratzel Sep 8, 2023
de14d8d
Cleanup.
rlratzel Sep 8, 2023
9412d20
Merge remote-tracking branch 'upstream/branch-23.10' into branch-23.1…
rlratzel Sep 22, 2023
52ffb67
Using nxcg import convention.
rlratzel Sep 22, 2023
f79f231
Skips running auditwheel on nx-cugraph wheel.
rlratzel Sep 22, 2023
0fbddeb
Modifies wheel upload based on if auditwheel needs to be run or not.
rlratzel Sep 22, 2023
a1d079d
Merge remote-tracking branch 'upstream/branch-23.10' into branch-23.1…
rlratzel Sep 22, 2023
b3faa5d
Changes conditional logic for clarity.
rlratzel Sep 22, 2023
0e9c754
Changes package name passed to test script to match expectations of t…
rlratzel Sep 22, 2023
e084d5d
Merge remote-tracking branch 'upstream/branch-23.10' into branch-23.1…
rlratzel Sep 22, 2023
a662ec7
Updates to use python package names where needed (- to _).
rlratzel Sep 25, 2023
8d7546e
Merge remote-tracking branch 'upstream/branch-23.10' into branch-23.1…
rlratzel Sep 26, 2023
5adefe8
Removes FIXME that was addressed in a prior commit.
rlratzel Sep 26, 2023
9f81547
Revert update of copyright date since no changes were made.
rlratzel Sep 26, 2023
20 changes: 20 additions & 0 deletions .github/workflows/build.yaml
@@ -112,3 +112,23 @@ jobs:
sha: ${{ inputs.sha }}
date: ${{ inputs.date }}
package-name: cugraph
wheel-build-nx-cugraph:
needs: wheel-publish-pylibcugraph
secrets: inherit
uses: rapidsai/shared-action-workflows/.github/workflows/[email protected]
with:
build_type: ${{ inputs.build_type || 'branch' }}
branch: ${{ inputs.branch }}
sha: ${{ inputs.sha }}
date: ${{ inputs.date }}
script: ci/build_wheel_nx-cugraph.sh
wheel-publish-nx-cugraph:
needs: wheel-build-nx-cugraph
secrets: inherit
uses: rapidsai/shared-action-workflows/.github/workflows/[email protected]
with:
build_type: ${{ inputs.build_type || 'branch' }}
branch: ${{ inputs.branch }}
sha: ${{ inputs.sha }}
date: ${{ inputs.date }}
package-name: nx-cugraph
16 changes: 16 additions & 0 deletions .github/workflows/pr.yaml
@@ -23,6 +23,8 @@ jobs:
- wheel-tests-pylibcugraph
- wheel-build-cugraph
- wheel-tests-cugraph
- wheel-build-nx-cugraph
- wheel-tests-nx-cugraph
secrets: inherit
uses: rapidsai/shared-action-workflows/.github/workflows/[email protected]
checks:
@@ -109,3 +111,17 @@ jobs:
with:
build_type: pull-request
script: ci/test_wheel_cugraph.sh
wheel-build-nx-cugraph:
needs: wheel-tests-pylibcugraph
secrets: inherit
uses: rapidsai/shared-action-workflows/.github/workflows/[email protected]
with:
build_type: pull-request
script: ci/build_wheel_nx-cugraph.sh
wheel-tests-nx-cugraph:
needs: wheel-build-nx-cugraph
secrets: inherit
uses: rapidsai/shared-action-workflows/.github/workflows/[email protected]
with:
build_type: pull-request
script: ci/test_wheel_nx-cugraph.sh
9 changes: 9 additions & 0 deletions .github/workflows/test.yaml
@@ -48,3 +48,12 @@ jobs:
date: ${{ inputs.date }}
sha: ${{ inputs.sha }}
script: ci/test_wheel_cugraph.sh
wheel-tests-nx-cugraph:
secrets: inherit
uses: rapidsai/shared-action-workflows/.github/workflows/[email protected]
with:
build_type: nightly
branch: ${{ inputs.branch }}
date: ${{ inputs.date }}
sha: ${{ inputs.sha }}
script: ci/test_wheel_nx-cugraph.sh
12 changes: 8 additions & 4 deletions ci/build_wheel.sh
@@ -49,7 +49,11 @@ cd "${package_dir}"

python -m pip wheel . -w dist -vvv --no-deps --disable-pip-version-check

mkdir -p final_dist
python -m auditwheel repair -w final_dist dist/*

RAPIDS_PY_WHEEL_NAME="${package_name}_${RAPIDS_PY_CUDA_SUFFIX}" rapids-upload-wheels-to-s3 final_dist
# pure-python packages should not have auditwheel run on them.
if [[ ${package_name} == "nx-cugraph" ]]; then
RAPIDS_PY_WHEEL_NAME="${package_name}_${RAPIDS_PY_CUDA_SUFFIX}" rapids-upload-wheels-to-s3 dist
else
mkdir -p final_dist
python -m auditwheel repair -w final_dist dist/*
RAPIDS_PY_WHEEL_NAME="${package_name}_${RAPIDS_PY_CUDA_SUFFIX}" rapids-upload-wheels-to-s3 final_dist
fi
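
The conditional above keys off the package name. As an illustration only (not part of this PR), a pure-python wheel can generally also be recognized by the "none-any" tag in its filename, so a more general check might look like the sketch below; the directory layout mirrors the script above, but everything else is an assumption.

#!/bin/bash
# Hypothetical sketch (not from the PR): skip auditwheel for pure-python wheels
# by inspecting the wheel filename tag rather than hard-coding the package name.
set -euo pipefail

mkdir -p final_dist
for whl in dist/*.whl; do
    if [[ "${whl}" == *-none-any.whl ]]; then
        # Pure-python wheel: no compiled extensions, nothing for auditwheel to repair.
        cp "${whl}" final_dist/
    else
        # Binary wheel: bundle shared libraries and retag for manylinux.
        python -m auditwheel repair -w final_dist "${whl}"
    fi
done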
6 changes: 6 additions & 0 deletions ci/build_wheel_nx-cugraph.sh
@@ -0,0 +1,6 @@
#!/bin/bash
# Copyright (c) 2023, NVIDIA CORPORATION.

set -euo pipefail

./ci/build_wheel.sh nx-cugraph python/nx-cugraph
9 changes: 6 additions & 3 deletions ci/test_wheel.sh
@@ -6,17 +6,20 @@ set -eoxu pipefail
package_name=$1
package_dir=$2

python_package_name=$(echo ${package_name}|sed 's/-/_/g')

mkdir -p ./dist
RAPIDS_PY_CUDA_SUFFIX="$(rapids-wheel-ctk-name-gen ${RAPIDS_CUDA_VERSION})"

# echo to expand wildcard before adding `[extra]` requires for pip
Comment from rlratzel (author): There was a case where the wildcard did not match the .whl files on disk and 'echo' simply echoed the wildcard pattern, which made pip generate a misleading error message. Using 'ls' causes the script to error out with a clear message about the missing or mis-named files.

Comment from reviewer: Odd, but I'm fine with the change if it works.

Comment from reviewer: I agree with this change; I see the problem with shell wildcard expansion.

(A minimal sketch of the echo-vs-ls behavior appears after this diff.)

# use 'ls' to expand wildcard before adding `[extra]` requires for pip
RAPIDS_PY_WHEEL_NAME="${package_name}_${RAPIDS_PY_CUDA_SUFFIX}" rapids-download-wheels-from-s3 ./dist
python -m pip install $(echo ./dist/${package_name}*.whl)[test]
# pip creates wheels using python package names
python -m pip install $(ls ./dist/${python_package_name}*.whl)[test]

# Run smoke tests for aarch64 pull requests
arch=$(uname -m)
if [[ "${arch}" == "aarch64" && ${RAPIDS_BUILD_TYPE} == "pull-request" ]]; then
python ./ci/wheel_smoke_test_${package_name}.py
else
RAPIDS_DATASET_ROOT_DIR=`pwd`/datasets python -m pytest ./python/${package_name}/${package_name}/tests
RAPIDS_DATASET_ROOT_DIR=`pwd`/datasets python -m pytest ./python/${package_name}/${python_package_name}/tests
fi
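
A minimal sketch of the echo-vs-ls behavior discussed in the review comments above (illustration only, not part of the PR): when no wheel matches the glob, echo passes the literal pattern through unchanged, while ls exits non-zero, so a script running under set -e stops with a clear error.

#!/bin/bash
# Illustration only: an unmatched glob with echo vs. ls.
set -e
mkdir -p ./dist                       # empty dist dir, no wheels present

echo ./dist/nx_cugraph*.whl           # prints the literal pattern "./dist/nx_cugraph*.whl"
ls ./dist/nx_cugraph*.whl             # fails with "No such file or directory"; set -e aborts the script here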
8 changes: 0 additions & 8 deletions ci/test_wheel_cugraph.sh
@@ -11,12 +11,4 @@ python -m pip install --no-deps ./local-pylibcugraph-dep/pylibcugraph*.whl
# Always install latest dask for testing
python -m pip install git+https://github.com/dask/dask.git@main git+https://github.com/dask/distributed.git@main

# Only download test data for x86
arch=$(uname -m)
if [[ "${arch}" == "x86_64" ]]; then
pushd ./datasets
bash ./get_test_data.sh
popd
fi
Comment from reviewer: I guess this change is necessary for nightlies? Were nightly tests for ARM not running the tests that required datasets?

Comment from reviewer: We shouldn't remove this, but we might need to modify it so it is only skipped in ARM PR jobs. In ARM PR jobs we can only run "smoke tests" due to limited resource capacity:

    # Run smoke tests for aarch64 pull requests

Downloading the datasets is time-intensive and should not be performed for "smoke tests" in ARM PR jobs because it wastes a lot of time on the ARM GPU node.

Comment from reviewer: I'm terribly sorry, I read this backwards. This removed downloading datasets for ALL jobs (all arches, nightly/PR). Are we not dependent on datasets for tests to pass anymore?

Comment from rlratzel (author): The get_test_data.sh script downloads supplemental datasets that are too big to commit to our GitHub repo. Those are currently only used by C++ tests. The Python tests use the smaller .csv datasets committed to the repo here, so it's safe to skip downloading when only testing Python code.

I only removed it here since it was in the proximity of the changes for the wheel builds, but I think an audit of which runs require downloading the supplemental datasets would be good for a separate PR.

Prior to the recent commits that triggered the current CI run, we had everything passing with the download step removed.

Comment from reviewer: Thanks @rlratzel, I'll approve now.

./ci/test_wheel.sh cugraph python/cugraph
6 changes: 6 additions & 0 deletions ci/test_wheel_nx-cugraph.sh
@@ -0,0 +1,6 @@
#!/bin/bash
# Copyright (c) 2023, NVIDIA CORPORATION.

set -eoxu pipefail

./ci/test_wheel.sh nx-cugraph python/nx-cugraph
38 changes: 38 additions & 0 deletions ci/wheel_smoke_test_nx-cugraph.py
@@ -0,0 +1,38 @@
# Copyright (c) 2023, NVIDIA CORPORATION.
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

import math

import networkx as nx
import nx_cugraph as nxcg


if __name__ == "__main__":
G = nx.Graph()
G.add_edges_from([(0, 1), (1, 2), (2, 3)])

nx_result = nx.betweenness_centrality(G)
# nx_cugraph is intended to be called via the NetworkX dispatcher, like
# this:
# nxcu_result = nx.betweenness_centrality(G, backend="cugraph")
#
# but here it is being called directly since the NetworkX version that
# supports the "backend" kwarg may not be available in the testing env.
nxcu_result = nxcg.betweenness_centrality(G)

nx_nodes, nxcu_nodes = nx_result.keys(), nxcu_result.keys()
assert nxcu_nodes == nx_nodes
for node_id in nx_nodes:
nx_bc, nxcu_bc = nx_result[node_id], nxcu_result[node_id]
assert math.isclose(nx_bc, nxcu_bc, rel_tol=1e-6), \
f"bc for {node_id=} exceeds tolerance: {nx_bc=}, {nxcu_bc=}"