Could the evaluation metric unjustly reward over-prediction of trees?
Here is an example where the prediction contains 38 tree pixels against a ground truth of 12. The metric reports 100% precision (12 TP and 0 FP), even though the discrepancy is not a coregistration error but a commission error, and that commission error is not picked up by the modified evaluation metric.
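For illustration, here is a minimal sketch (not the repository's code) of how a coregistration-tolerant metric of this general shape can report zero false positives on an over-predicted mask. The 3×3 dilation window and the mask shapes are assumptions chosen to reproduce the behavior described above:

```python
import numpy as np
from scipy.ndimage import maximum_filter

# Hypothetical binary masks: ground truth has 12 tree pixels,
# the prediction covers a larger surrounding block (over-prediction).
true = np.zeros((10, 10), dtype=np.uint8)
pred = np.zeros((10, 10), dtype=np.uint8)
true[4:7, 4:8] = 1   # 12 ground-truth tree pixels
pred[3:8, 3:9] = 1   # 30 predicted tree pixels

# Tolerate a 1-pixel coregistration shift by dilating the ground truth,
# then count a predicted pixel as FP only if it falls outside that window.
true_dilated = maximum_filter(true, size=3)
tp = int(((pred == 1) & (true == 1)).sum())          # 12
fp = int(((pred == 1) & (true_dilated == 0)).sum())  # 0 -- every extra pixel
                                                     # sits inside the band
precision = tp / (tp + fp)
print(tp, fp, precision)  # 12 0 1.0: 18 commission pixels go unpenalized
```

Under this kind of tolerance, any over-prediction that stays adjacent to real trees is indistinguishable from a perfect prediction, which matches the 100%-precision example above.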
Describe the bug
Hello! Please forgive me if I am wrong!
In the evaluation metrics cell (In [33]) of 4-model.ipynb, `max_y` should reference `true.shape[1]` rather than `true.shape[0]`:
https://github.com/wri/sentinel-tree-cover/blob/master/notebooks/4-model.ipynb?short_path=af94e1f#L1269
https://github.com/wri/sentinel-tree-cover/blob/master/notebooks/4-model.ipynb?short_path=af94e1f#L1292
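A minimal sketch of the suspected axis mix-up, with a simplified stand-in loop for the notebook's metric code and a hypothetical non-square tile:

```python
import numpy as np

# shape = (rows, cols); a non-square tile makes the two axes distinguishable.
true = np.zeros((14, 10), dtype=np.uint8)

max_x = true.shape[0]  # 14 -- rows, iterated as axis 0
max_y = true.shape[1]  # 10 -- columns (the proposed fix);
                       # true.shape[0] would give 14 here

count = 0
for x in range(max_x):
    for y in range(max_y):
        count += 1         # true[x, y] stays in bounds with the fix

print(count == true.size)  # True; with max_y = true.shape[0], true[x, y]
                           # would raise IndexError once y reached column 10
```

On square tiles `true.shape[0] == true.shape[1]`, so the mix-up would be invisible; it only surfaces when the evaluated arrays are non-square.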