Regarding GPU memory #23
Comments
The authors seem unresponsive, but I have the same problem on a 3090, which has the same amount of VRAM. Please let me know if you manage to resolve this.
Are you able to run it now?
CUDA OOM errors come up frequently; even an L40 with 48 GB of VRAM is not enough for inference.
@ALisstry That's frustrating. I just wanted to restore some old videos and didn't expect this. Is there any setup that can run on a 12 GB GPU?
Yeah, I often encounter CUDA OOM even on a 48 GB A6000.
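For anyone hitting the same wall, here is a minimal, generic PyTorch sketch of the usual memory-saving measures: fp16 weights, `torch.inference_mode()`, processing only a few frames per forward pass, and moving outputs back to CPU between chunks. The toy model, tensor shapes, and variable names are placeholders for illustration, not this repository's actual code or API.

```python
import torch
import torch.nn as nn

# Placeholder stand-ins (NOT this project's model): a tiny conv net and a
# random dummy clip of shape (T, C, H, W), just to make the sketch runnable.
model = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1),
    nn.ReLU(),
    nn.Conv2d(16, 3, 3, padding=1),
)
frames = torch.rand(32, 3, 720, 1280)

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model = model.to(device).eval()
if device.type == "cuda":
    model = model.half()                 # fp16 weights roughly halve VRAM usage

restored = []
with torch.inference_mode():             # no autograd buffers are kept around
    for chunk in torch.split(frames, 4): # only a few frames per forward pass
        chunk = chunk.to(device)
        if device.type == "cuda":
            chunk = chunk.half()         # fp16 activations to match the weights
        out = model(chunk)
        restored.append(out.float().cpu())  # keep results in system RAM, not VRAM
        del chunk, out
        if device.type == "cuda":
            torch.cuda.empty_cache()     # release cached blocks between chunks

result = torch.cat(restored, dim=0)
print(result.shape)
```

Whether this is enough for a 12 GB or 24 GB card depends on the model's actual architecture and tile sizes, so treat it only as a starting point.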
Hi authors,
Thanks for your great work! May I ask which GPU you used for inference? Running inference on the video you provided with a 4090 (24 GB) hits a CUDA OOM error. I'm not sure whether this is a bug or whether your model simply requires a GPU with more memory.
Thanks.