I am pretty sure this is caused by floating-point precision, although I didn't run your case. The issue arises in background areas, where you end up dividing one very small value by another. This would not happen in plain NCC unless the entire image is constant.
As a workaround, you can try:
1. Using `double` instead of `float` in the LNCC (this may fix the problem completely).
2. Clamping the value to [0, 1] in the similarity measure. For example, add `torch.clamp(lncc, min=0.0, max=1.0)` before `1 - lncc.mean()`.
Of course, if anyone has a better solution, feel free to share it.
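A minimal NumPy sketch of the failure mode and both workarounds (this is illustrative only, not the repository's actual LNCC implementation; the window values and helper names are made up). In `float32`, the squared cross-covariance and the variance product of a near-zero background window can both underflow to zero, so the per-window ratio becomes `0/0 = nan` and poisons the mean; computing in `float64` avoids the underflow, and clamping sanitizes whatever slips through:

```python
import numpy as np

# Hypothetical per-window statistics from a background (near-zero) region.
cross = np.float32(1e-23)  # cross-covariance of the two windows
var_i = np.float32(1e-23)  # variance of window in image I
var_j = np.float32(1e-23)  # variance of window in image J

with np.errstate(all="ignore"):
    # float32: 1e-23 squared underflows to 0, giving 0/0 = nan.
    bad = (cross * cross) / (var_i * var_j)

# Workaround 1: accumulate in float64, where the products do not underflow.
good = np.float64(cross) ** 2 / (np.float64(var_i) * np.float64(var_j))

# Workaround 2: sanitize and clamp the per-window map before the loss,
# mirroring the torch.clamp suggestion above.
lncc_map = np.array([bad, 0.98, 1.0], dtype=np.float32)
lncc_map = np.nan_to_num(lncc_map, nan=0.0, posinf=1.0)
lncc_map = np.clip(lncc_map, 0.0, 1.0)
loss = 1.0 - lncc_map.mean()
```

With either fix the loss stays in [0, 1] instead of becoming `nan` or negative.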
The LNCC output falls outside the range [0, 1] when the images contain large areas of zero values.
Steps to reproduce:
The log shows the following:
0-Tot: E=099.4140 | simE=099.4140 | regE=000.0000 | optParE=000.0000 | relF= n/a |
0-Img: E=099.4140 | simE=099.4140 | regE=000.0000 |