BUG: spectral clustering always crashes with > 100 eigenvectors #710
Comments
@joelb92 We currently limit the number of eigenvectors since there is limited benefit to using more than the first 30. Is there an analytical problem you are trying to solve that depends on the 100th eigenvector?
@joelb92 I'd recommend trying a smaller number of eigenpairs, based on the experiment in Fig. 2a of https://www.sciencedirect.com/science/article/pii/S1877050917307913
Marking as closed.
Describe the bug
I am attempting to partition a large sparse graph using cuGraph's new spectralModularityMaximizationClustering method (this bug can also be reproduced with the balanced-cut version).
The graph I am operating on is roughly 100k × 100k, with 444429 edges (quite sparse). No matter what I set the partition count to (5000 in my case), I cannot set the num_eigen_vects parameter above 100 without a crash:
To Reproduce
Steps to reproduce the behavior:
Graph building:
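The original snippet was not captured in this page. As a stand-in, here is a minimal sketch of building a comparable random sparse edge list with NumPy; the vertex/edge counts match the report, but everything else (seed, dtype, self-loop filtering) is an assumption:

```python
import numpy as np

# Hypothetical stand-in for the uncaptured graph-building code:
# a random sparse undirected graph of comparable size
# (~100k vertices, ~444k edges, as described in the report).
rng = np.random.default_rng(42)
num_vertices = 100_000
num_edges = 444_429

src = rng.integers(0, num_vertices, size=num_edges, dtype=np.int32)
dst = rng.integers(0, num_vertices, size=num_edges, dtype=np.int32)

# Drop self-loops so the edge list describes a simple graph.
mask = src != dst
src, dst = src[mask], dst[mask]
```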
Converting to cugraph:
Performing spectral clustering:
Error:
Expected behavior
With a matrix of this size, I would expect partitioning to allow more than 100 eigenvectors.
Desktop (please complete the following information):
conda install -y -c nvidia -c rapidsai -c numba -c conda-forge -c defaults cugraph cudatoolkit=10.0
Can anyone else reproduce this issue with a large random graph?
Joel