24GB graphics card memory is not enough #5
Comments
Please make sure that "use_checkpoint" is set to "True".
Thank you very much for your answer. I kept the network's parameter settings, and 'use_checkpoint' defaults to True. Is there any other setting I should change? (Registration-CorrMLP/CorrMLP/networks.py, Line 22 in da5ce37)
This might be because the checkpointing function saves more GPU memory on a 4090 than on a 3090.
Ok, I'll try. Thank you
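The "use_checkpoint" flag discussed above is the usual way PyTorch models expose gradient checkpointing, which recomputes activations in the backward pass instead of storing them. A minimal sketch of how such a flag typically works, assuming `torch.utils.checkpoint` (the module and flag names here are illustrative, not the actual CorrMLP code):

```python
import torch
import torch.nn as nn
from torch.utils.checkpoint import checkpoint


class Block(nn.Module):
    """A small residual MLP block standing in for one stage of the network."""

    def __init__(self, dim: int):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim, dim), nn.GELU(), nn.Linear(dim, dim))

    def forward(self, x):
        return self.net(x) + x


class Net(nn.Module):
    def __init__(self, dim: int = 64, depth: int = 4, use_checkpoint: bool = True):
        super().__init__()
        self.use_checkpoint = use_checkpoint
        self.blocks = nn.ModuleList(Block(dim) for _ in range(depth))

    def forward(self, x):
        for blk in self.blocks:
            if self.use_checkpoint and self.training:
                # Activations inside blk are not kept; they are recomputed
                # during backward, trading extra compute for less GPU memory.
                x = checkpoint(blk, x, use_reentrant=False)
            else:
                x = blk(x)
        return x
```

With `use_checkpoint=True` the forward pass stores only the block inputs, so peak memory grows with depth much more slowly; the cost is roughly one extra forward pass per block during backpropagation.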
Hello, have you solved this problem? I encountered the same problem when training, and I used a 4090.
Hello, I tried to use a 3090 to run your network and found that the video memory was insufficient. The image size is the same. Is there any way to reduce the video memory usage? I see that your server is a 4090, which also has 24GB. If you have time, please reply.
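Beyond checkpointing, a common general option for cutting activation memory (not something this thread confirms for CorrMLP) is mixed-precision training with `torch.autocast` and a gradient scaler. A hedged sketch of a single training step; the function name and arguments are illustrative:

```python
import torch


def train_step(model, optimizer, loss_fn, inputs, targets, scaler,
               device_type="cuda", dtype=torch.float16):
    """One mixed-precision training step; most activations are kept in
    half precision, roughly halving their memory footprint."""
    optimizer.zero_grad(set_to_none=True)
    with torch.autocast(device_type=device_type, dtype=dtype):
        loss = loss_fn(model(inputs), targets)
    scaler.scale(loss).backward()   # scaling guards against fp16 gradient underflow
    scaler.step(optimizer)
    scaler.update()
    return loss.item()
```

Whether this fits a registration network depends on the numerics of its loss terms; if it still does not fit in 24GB, reducing the input/patch size or batch size remains the fallback.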