No I did not. If you want to use it, you might need to correct the values beforehand. The raw reflectance needs to be calibrated to actually mean something.
Best,
Hugues
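For reference, here is a minimal sketch of such a correction, assuming the reflectance is stored as a raw per-point intensity column. The percentile clipping and [0, 1] rescaling are illustrative choices, not the calibration actually used for NPM3D:

```python
import numpy as np

def normalize_reflectance(reflectance, low_pct=1.0, high_pct=99.0):
    """Clip raw reflectance to a percentile window, then rescale to [0, 1].

    The percentile window and linear rescaling are illustrative assumptions,
    not the actual NPM3D calibration.
    """
    lo, hi = np.percentile(reflectance, [low_pct, high_pct])
    clipped = np.clip(reflectance, lo, hi)
    return (clipped - lo) / max(hi - lo, 1e-8)

# Usage, assuming column 3 of the point array holds raw reflectance:
# feats = normalize_reflectance(cloud[:, 3])
```

The clipping step matters because raw LiDAR intensities often contain extreme outliers that would otherwise dominate a plain min-max rescaling.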
Thank you so much Hugues. I tried training on the NPM3D dataset as you suggested (#15 (comment)) under CUDA 9.2, TensorFlow 1.12.3, and a GeForce GTX 1080 Ti. The loss still became NaN after several epochs, so it does not seem to be an internal bug of CUDA 10. The bug happens randomly, sometimes only after 500 epochs. Another strange thing is that it never happens when I train on the Semantic3D dataset.
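In case it helps with debugging: a minimal sketch (using the TF 1.x API mentioned above) of fetching the loss every step and halting on the first non-finite value, so the failing step can be inspected. The toy linear model here is purely a stand-in for the real training graph:

```python
import numpy as np
import tensorflow as tf  # TF 1.x API, as used in this thread

# Stand-in graph: a tiny linear regression. Replace with the real
# model's train op and loss tensor.
x = tf.placeholder(tf.float32, [None, 1])
y = tf.placeholder(tf.float32, [None, 1])
w = tf.Variable(tf.zeros([1, 1]))
loss = tf.reduce_mean(tf.square(tf.matmul(x, w) - y))
train_op = tf.train.GradientDescentOptimizer(0.1).minimize(loss)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for step in range(1000):
        xs = np.random.rand(32, 1).astype(np.float32)
        ys = 2.0 * xs
        # Fetch the loss alongside the train op and stop on the first
        # NaN/Inf so the offending step can be examined.
        _, loss_val = sess.run([train_op, loss], feed_dict={x: xs, y: ys})
        if not np.isfinite(loss_val):
            print('Non-finite loss at step %d' % step)
            break
```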
Hi Hugues, did you use the reflectance feature when training on the NPM3D dataset? Thank you.