Inference on 4K images #14
Comments
Please set … Running inference on such huge images will take a large amount of GPU memory. I'm not sure how much memory would be enough.
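To get a feel for why 4K is so demanding, here is a rough back-of-the-envelope sketch of how activation memory scales with input resolution. The channel count, number of retained feature maps, and bytes per element below are hypothetical placeholders, not values from CFBI; the point is only the ratio, which depends purely on pixel count.

```python
# Hypothetical estimate of activation memory vs. resolution.
# channels / num_maps / bytes_per_el are illustrative assumptions,
# not measured values from the CFBI code.
def activation_mib(h, w, channels=64, num_maps=10, bytes_per_el=4):
    """Memory for num_maps feature maps at full resolution, in MiB."""
    return h * w * channels * num_maps * bytes_per_el / 2**20

base = activation_mib(480, 854)    # ~480p (DAVIS)
uhd = activation_mib(2160, 3840)   # 4K UHD
print(f"480p: {base:.0f} MiB, 4K: {uhd:.0f} MiB, ratio: {uhd / base:.1f}x")
```

Whatever the true per-map cost, a 3840 × 2160 frame has roughly 20× the pixels of an 854 × 480 one, so activation memory grows by about the same factor, which is why an 11 GB card that handles 480p runs out at 4K.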
Thanks for the reply. I get this error message: …
To reduce the memory usage, you can refer to this issue. But I suppose 11 GB of memory is not enough for 4K images…
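One generic way to bound memory at high resolution (not something the linked issue or this repo necessarily implements) is tiled inference: split the 4K frame into overlapping crops that fit in GPU memory, process each crop, and stitch the results. The tile size and overlap below are illustrative assumptions; note that tiling can hurt a video-object-segmentation model that relies on global context, so treat this as a sketch, not a drop-in fix.

```python
# Hypothetical sketch: generate overlapping crop windows covering a frame.
# tile / overlap values are illustrative, not tuned for CFBI.
def tile_windows(height, width, tile=960, overlap=96):
    """Yield (top, left, bottom, right) crops that cover the frame."""
    stride = tile - overlap
    tops = list(range(0, max(height - tile, 0) + 1, stride))
    lefts = list(range(0, max(width - tile, 0) + 1, stride))
    # Ensure the final tiles reach the bottom/right edges exactly.
    if tops[-1] + tile < height:
        tops.append(height - tile)
    if lefts[-1] + tile < width:
        lefts.append(width - tile)
    for t in tops:
        for l in lefts:
            yield (t, l, min(t + tile, height), min(l + tile, width))

windows = list(tile_windows(2160, 3840))  # crops for one 4K frame
```

Each crop can then be fed through the model independently, with the overlapping regions blended (e.g. averaged) when stitching the per-tile masks back together.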
You can get a K80 / 32 GB of VRAM on EC2 spot pricing for $0.27.
Thanks @z-x-yang and @johndpope for your help!
For 480p, is it possible by any means to run inference on CPU? Below is the lowest memory usage on MobileNet: MobileNetV2-Fast-CFBI-DAVIS - 3053 MiB
I've not tried to run inference on CPU. Maybe you can try it by removing all the …
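The usual way to force CPU inference in PyTorch is to send the model and inputs to the CPU device instead of calling `.cuda()`. A minimal sketch, using a stand-in `Conv2d` layer rather than the actual CFBI model (whose loading code is not shown in this thread):

```python
import torch

# Hypothetical sketch of CPU-only inference: replace .cuda() calls
# with .to(device) where device is the CPU.
device = torch.device("cpu")

# Stand-in for the real segmentation model; CFBI's own loading code
# is not reproduced here.
model = torch.nn.Conv2d(3, 8, kernel_size=3, padding=1).to(device).eval()

frame = torch.randn(1, 3, 480, 854, device=device)  # one 480p frame
with torch.no_grad():  # inference only: skip gradient buffers to save memory
    out = model(frame)
print(tuple(out.shape))
```

CPU inference avoids the VRAM limit entirely (it uses system RAM instead), at the cost of much longer per-frame runtimes, which matches the timing numbers reported below.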
Thanks, I tried with CPU; below are the values I got: …
How can I run inference on 4K (3840 × 2160) images? Presently, DAVIS and YouTube-VOS use 480p resolution.