Problem running SAVER on large dataset #32

Open
RomiGoldner opened this issue Aug 18, 2021 · 4 comments

Comments

@RomiGoldner

Hello,
I have a large dataset with 7,685 cells and 18,000 genes that I want to use SAVER on (for imputation).
I tried running it with 12, 6, 4, and 2 cores (setting the ncores argument), and it doesn't seem to work.
I also tried splitting the genes into subgroups and combining the results with the SAVER combine function, but that didn't work either.

Is there a way you can assist me? Or have any recommendations?
Thanks in advance!

@mohuangx
Owner

Hi,

Can you provide the function call and the error message that you get?

Thanks!

@RomiGoldner
Author

The function is the general "saver" function.
It starts the computation and, after computing the lambda coefficients, RStudio crashes while predicting the "remaining genes". There is no specific error message.
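Roughly, the call looks like this (`counts` is just an illustrative name for my 18,000-gene by 7,685-cell count matrix):

```r
library(SAVER)

# 'counts' stands in for my gene-by-cell count matrix (18,000 x 7,685).
# I tried ncores = 12, 6, 4, and 2; the run always crashes at the
# "predicting remaining genes" stage.
saver.out <- saver(counts, ncores = 4)
```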

@mohuangx
Owner

And the same thing happened when you tried to split the genes into subgroups?

It sounds to me like a memory issue. Can you try running the example code in the saver help file to see if that works? Or maybe try running it on a subset of 1000 genes and 100 cells.
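Something along these lines should work as a quick test (a minimal sketch; swap in the name of your own count matrix):

```r
library(SAVER)

# Quick sanity check on a small slice of the data (1,000 genes and
# 100 cells, as suggested above) to see whether the crash is
# memory-related. 'counts' is assumed to be a gene-by-cell count matrix.
small <- counts[1:1000, 1:100]
fit.small <- saver(small, ncores = 1)
dim(fit.small$estimate)  # 1000 x 100 if the run completes
```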

@RomiGoldner
Author

Yes, it happened when I tried to split it into subgroups of about 2500 genes.
I will try smaller groups and will update. Thank you!
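For reference, the splitting looked roughly like this (object names are illustrative, and I'm assuming combine.saver is the right way to merge the pieces):

```r
library(SAVER)

# Split the gene indices into subgroups of about 2,500 genes each.
# 'counts' stands in for my full gene-by-cell count matrix.
chunks <- split(seq_len(nrow(counts)),
                ceiling(seq_len(nrow(counts)) / 2500))

# Run SAVER on each subgroup, returning only the predicted genes.
saver.list <- lapply(chunks, function(idx) {
  saver(counts, pred.genes = idx, pred.genes.only = TRUE, ncores = 2)
})

# Merge the per-subgroup results back into one object.
saver.all <- combine.saver(saver.list)
```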
