
Skip certain cugraph-pyg tests when torch_sparse is not available #3962

Merged
merged 2 commits into from
Oct 31, 2023

Conversation

@tingyu66 (Member) commented Oct 30, 2023

The CSC (CSR) code path in cugraph-pyg requires the torch_sparse package. However, torch_sparse does not appear to work out of the box on Rocky Linux 8. This PR skips the affected tests to fix the resulting CI failures.

Closes https://github.com/rapidsai/graph_dl/issues/343

@tingyu66 tingyu66 requested a review from a team as a code owner October 30, 2023 18:59
@tingyu66 tingyu66 added non-breaking Non-breaking change improvement Improvement / enhancement to an existing function labels Oct 30, 2023
@@ -201,6 +201,10 @@ def test_cugraph_loader_from_disk_subset():

@pytest.mark.skipif(isinstance(torch, MissingModule), reason="torch not available")
def test_cugraph_loader_from_disk_subset_csr():
# TODO: The CSC/CSR code path in cugraph_sampler.py requires torch_sparse
# as dependency. Remove this check when torch_sparse becomes a dependency.
Member:

torch-sparse will never be a dependency of cuGraph-PyG. CSC mode is an optional feature, and torch-sparse is only a test dependency of cuGraph-PyG. Eventually, I think PyG is going to drop torch-sparse altogether anyway.

tingyu66 (Member Author):

data.put_edge_index(
(row_dict[key], col_dict[key]),
edge_type=key,
layout="csc",
is_sorted=True,
)

IIRC, data.put_edge_index() uses SparseTensor internally from torch_sparse for CSC/CSR types, so I don't think it's just a test dependency.

Member:

Yes, but that is optional. PyG users are not required to use put_edge_index or get_edge_index, and we don't want to force torch-sparse on them unless they are using CSC mode, which uses those functions. In dependencies.yaml, we specify torch-sparse as a test-only dependency.

@@ -201,6 +201,10 @@ def test_cugraph_loader_from_disk_subset():

@pytest.mark.skipif(isinstance(torch, MissingModule), reason="torch not available")
def test_cugraph_loader_from_disk_subset_csr():
# TODO: The CSC/CSR code path in cugraph_sampler.py requires torch_sparse
# as dependency. Remove this check when torch_sparse becomes a dependency.
pytest.importorskip("torch_sparse", reason="PyTorch Sparse not available")
Member:

importorskip is not the RAPIDS-preferred way to handle skipping. This should be changed to using import_optional and pytest.mark.skipif like we do for torch.
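The suggested pattern, as a sketch: `import_optional` and `MissingModule` come from cuGraph's utilities; they are reimplemented minimally below (an assumption about their behavior, not the actual cuGraph source) so the example is self-contained.

```python
import importlib

import pytest


class MissingModule:
    # Sentinel standing in for a module that could not be imported;
    # mirrors the role of cuGraph's MissingModule (minimal stand-in).
    def __init__(self, name):
        self.name = name


def import_optional(name):
    # Minimal stand-in for cuGraph's import_optional: return the module
    # if it imports cleanly, otherwise return a MissingModule sentinel.
    try:
        return importlib.import_module(name)
    except ImportError:
        return MissingModule(name)


torch_sparse = import_optional("torch_sparse")


@pytest.mark.skipif(
    isinstance(torch_sparse, MissingModule), reason="torch_sparse not available"
)
def test_cugraph_loader_from_disk_subset_csr():
    ...  # test body runs only when torch_sparse imported successfully
```

Unlike `pytest.importorskip`, which skips at the point of the call inside the test body, the `skipif` marker makes the skip condition visible in the test signature and keeps it consistent with how `torch` is already handled in this file.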

tingyu66 (Member Author):

They have been corrected. PTAL.

@@ -334,6 +338,8 @@ def test_cugraph_loader_e2e_coo():
@pytest.mark.skipif(isinstance(torch, MissingModule), reason="torch not available")
@pytest.mark.parametrize("framework", ["pyg", "cugraph-ops"])
def test_cugraph_loader_e2e_csc(framework):
pytest.importorskip("torch_sparse", reason="PyTorch Sparse not available")
Member:

Same here, we should be using import_optional and skipif.

@tingyu66 tingyu66 changed the title Skip certain cugraph-pyg tests when torch_scatter is not available Skip certain cugraph-pyg tests when torch_sparse is not available Oct 31, 2023
@alexbarghi-nv (Member) left a comment:

👍

@alexbarghi-nv (Member):

/merge

@rapids-bot rapids-bot bot merged commit 24845ca into rapidsai:branch-23.12 Oct 31, 2023
70 checks passed