I try to run the GPU model like this:

python get_face_boxes_gluoncv.py --gpus 0

and I get this:

[19:46:58] src/operator/nn/./cudnn/./cudnn_algoreg-inl.h:97: Running performance tests to find the best convolution algorithm, this can take a while... (set the environment variable MXNET_CUDNN_AUTOTUNE_DEFAULT to 0 to disable)
Inference time:16.246319ms

My GPU is an RTX 2060. When I run with the default CPU (Intel i5) instead, I get almost the same inference time.

So why is there no difference between them?
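The actual get_face_boxes_gluoncv.py script is not shown in the issue, so the following is only a minimal sketch (the model name, input shape, and run count are assumptions) of how CPU vs GPU inference time is typically compared with MXNet/GluonCV. It adds a warm-up pass and mx.nd.waitall() synchronization, because MXNet executes asynchronously: a timer that stops before the queued work finishes, or that includes the one-time cuDNN autotuning shown in the log above, can make GPU and CPU timings look misleadingly similar.

```python
import time
import mxnet as mx
from gluoncv import model_zoo

def time_inference(ctx, runs=50):
    # Placeholder model; the actual script presumably loads a face-detection network.
    net = model_zoo.get_model('ssd_512_mobilenet1.0_voc', pretrained=True, ctx=ctx)
    net.hybridize()
    x = mx.nd.random.uniform(shape=(1, 3, 512, 512), ctx=ctx)

    # Warm-up: the first forward pass includes cuDNN autotuning and graph
    # initialization, which should not be counted in the measurement.
    net(x)
    mx.nd.waitall()

    start = time.time()
    for _ in range(runs):
        net(x)
    mx.nd.waitall()  # block until all asynchronously queued work has finished
    return (time.time() - start) / runs * 1000.0  # average ms per image

print('CPU: %.2f ms' % time_inference(mx.cpu()))
if mx.context.num_gpus() > 0:
    print('GPU: %.2f ms' % time_inference(mx.gpu(0)))
```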