First call to sess.run() at inference time is slow #9
The quality of the grasps is also much worse than expected. I have tried recompiling the pointnet tf ops using this script.
Regarding inference: on the desktops I have tried, the first inference may take 2-3 seconds, but not 1162 seconds... I am not sure why it takes so much longer on your machine. Regarding the problem in inference: something is terribly wrong here. I assume you already checked
@thomasweng15 let me know if setting up CUDA 11.1 fixes the issue for you.
Hi, have you encountered an issue where the first call to sess.run() in contact_grasp_estimator.py is slow? I am running the inference example in the readme, and when I time sess.run() the first call takes much longer than subsequent calls:
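A simple way to make the warm-up cost visible is to time each call separately rather than averaging over a loop. The sketch below is a minimal illustration; `run_inference` is a hypothetical stand-in for the actual `sess.run()` call in contact_grasp_estimator.py, which on a real session may pay one-time costs such as graph optimization or CUDA kernel JIT compilation on the first invocation:

```python
import time

def time_calls(fn, n_calls=5):
    """Time each call to fn individually so the first-call cost stands out."""
    timings = []
    for _ in range(n_calls):
        start = time.perf_counter()
        fn()
        timings.append(time.perf_counter() - start)
    return timings

# Hypothetical stand-in for sess.run(...): simulates a one-time setup
# cost on the first call, like the warm-up observed in the issue.
_warmed_up = False
def run_inference():
    global _warmed_up
    if not _warmed_up:
        time.sleep(0.05)  # simulated one-time initialization cost
        _warmed_up = True

timings = time_calls(run_inference)
print(f"first call: {timings[0]:.3f}s, later calls: {min(timings[1:]):.3f}s")
```

Comparing `timings[0]` against the later entries separates one-time initialization from steady-state inference latency.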
I found this thread on what seems to be a similar issue, but the simple resolutions have not worked, and I have not tried compiling tensorflow from source yet. I am running on an RTX 3090 with CUDA 11.1 and tensorflow-gpu==2.2. Have you encountered this issue before? Thanks for your help.
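One likely explanation (an assumption, not confirmed in this thread): the prebuilt tensorflow-gpu 2.2 wheels target CUDA 10.1, which ships no precompiled kernels for Ampere GPUs such as the 3090 (compute capability sm_86), so the first kernel launch triggers PTX-to-SASS JIT compilation of every kernel, which can take many minutes. CUDA's documented JIT cache can at least persist that work across runs, so only the very first run pays the full cost:

```shell
# Enlarge CUDA's JIT compilation cache so PTX->SASS results are kept
# across runs instead of being evicted (the default cache is small).
export CUDA_CACHE_MAXSIZE=4294967296             # 4 GiB cache limit
export CUDA_CACHE_PATH="$HOME/.nv/ComputeCache"  # default location, made explicit
echo "cache dir: $CUDA_CACHE_PATH"
```

The more direct fix, as suggested above, is a TensorFlow build matched to CUDA 11.1, which includes native sm_86 kernels and avoids the JIT step entirely.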