Improve extreme batch visualization callbacks #1488
Conversation
BloodAxe
commented
Sep 25, 2023
- Allow user to control whether to run on train and/or validation loaders
- Allow to control maximum number of logged images
- Store extreme batch on CPU
- OD visualization of batches changed to render side-by-side (pred vs. GT) images per sample. This mainly makes comparison easier in W&B: the old implementation showed the whole plot of the GT batch first, then the prediction batch below it, so you had to scroll up and down to compare the prediction for a given sample, which was not convenient.
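The "store extreme batch on CPU" and "limit the number of logged images" ideas above can be sketched roughly as follows. This is a hedged illustration only: `ExtremeBatchTracker`, `to_side_by_side`, and `max_images` are hypothetical names for this sketch, not the actual super-gradients callback API.

```python
import numpy as np
import torch


def to_side_by_side(pred_image: np.ndarray, gt_image: np.ndarray) -> np.ndarray:
    """Stack prediction and GT renderings horizontally so each sample can be
    inspected in a single image (e.g. in a W&B media panel)."""
    return np.concatenate([pred_image, gt_image], axis=1)


class ExtremeBatchTracker:
    """Minimal sketch of tracking the 'extreme' batch: keep only the batch
    with the best metric value seen so far, truncated to max_images samples,
    detached and moved to CPU so it does not pin GPU memory."""

    def __init__(self, max_images: int = 8, maximize: bool = True):
        self.max_images = max_images
        self.maximize = maximize
        self.best_value = None
        self.best_batch = None

    def update(self, value: float, batch: torch.Tensor) -> None:
        is_better = (
            self.best_value is None
            or (self.maximize and value > self.best_value)
            or (not self.maximize and value < self.best_value)
        )
        if is_better:
            self.best_value = value
            # Cap the number of stored samples and move them off the GPU.
            self.best_batch = batch[: self.max_images].detach().cpu()
```

Usage would look like calling `update(metric_value, images)` once per batch during an epoch, then rendering `best_batch` at epoch end.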
LGTM, just one small doc note. But I think it would be nice to have @shaydeci take a look; since he wrote the original code, he might have deeper insights.
Not quite sure why we moved to channels last...
Did we check that everything works well with W&B and the default SG logger, and that the pictures make sense?
Also, it would be great if you could add additional tests covering the train-loader edge cases in our tests.unit_tests.extreme_batch_cb_test.ExtremeBatchSanityTest test.
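For context on the channels-last question: PyTorch image tensors are conventionally CHW, while most logging/plotting backends (W&B images, matplotlib) expect HWC. A plain `permute` does the conversion; this is a generic illustration, not the PR's actual code.

```python
import torch

# Channels-first (CHW) image tensor, as typically produced in PyTorch.
chw = torch.rand(3, 480, 640)

# Convert to channels-last (HWC) for visualization backends.
# .contiguous() materializes the new memory layout after the permute.
hwc = chw.permute(1, 2, 0).contiguous()
assert hwc.shape == (480, 640, 3)
```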
…d return type and layout (channels last) of the image tensor for visualization
LGTM