Docment vision.augment Affine Tfms to the end #92
Conversation
Check out this pull request on ReviewNB: see visual diffs & provide feedback on Jupyter Notebooks. Powered by ReviewNB.
I accidentally made this PR based on PR #89, which hasn't been finished yet, so it turned into a very big PR. Hope we can merge it step by step.
fastai/vision/augment.py
Outdated
@@ -362,13 +362,25 @@ def _grid_sample(x, coords, mode='bilinear', padding_mode='reflection', align_co
     return F.grid_sample(x, coords, mode=mode, padding_mode=padding_mode, align_corners=align_corners)

 # Cell
-def affine_grid(theta, size, align_corners=None):
+def affine_grid(
+    theta:Tensor, # Transformation `Tensor`
Suggested change:
-    theta:Tensor, # Transformation `Tensor`
+    theta:Tensor, # A batch of Affine transformation matrices
This is the same as `mat` in `affine_coord`.
Thanks. I fixed it in the new commit.
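(For context on the suggestion above: these inline `#` comments are docments, which fastcore parses into per-parameter documentation, so the wording of each comment is what ends up in the rendered docs. Below is a minimal, illustrative sketch rather than the PR's exact code; the `size` and `align_corners` comments are my own placeholders.)

from torch import Tensor
from fastcore.docments import docments

def affine_grid(
    theta:Tensor,             # A batch of Affine transformation matrices
    size:tuple,               # Output grid size (illustrative comment)
    align_corners:bool=None,  # Passed through to the grid-sampling call (illustrative comment)
):
    ...

# `docments` maps each parameter name to its inline comment; doc tools such as
# `show_doc` surface these strings next to the signature.
print(docments(affine_grid))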
fastai/vision/augment.py
Outdated
-    def __init__(self, aff_fs=None, coord_fs=None, size=None, mode='bilinear', pad_mode=PadMode.Reflection,
-                 mode_mask='nearest', align_corners=None, **kwargs):
+    def __init__(self,
+        aff_fs:(callable,[callable])=None, # Affine transformations function for a batch
Suggested change:
-    aff_fs:(callable,[callable])=None, # Affine transformations function for a batch
+    aff_fs:(callable,list)=None, # Affine transformations function for a batch
Just `list` for ones like these instead of `[callable]`.
Done. I still have a type like this in this PR, for example `draw:(int, [int], callable)`.
Do you think I need to change `[int]` to `list`? I believe the user could be confused about whether to pass a list of callables or a list of ints.
Thanks
@dienhoa yes, because `[int]` will cause errors for users who use IDEs.
I changed every `[type]` to `list` in the new commit.
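(A short aside on why `[int]`/`[callable]` annotations are problematic, sketched below with the hypothetical `draw` parameter from the comment above; the docment texts are illustrative, not the library's.)

# Runtime Python accepts any expression as an annotation, so a list literal
# "works", but IDEs and static type checkers flag it because `[int]` is a
# list object, not a type:
def flip_item(draw:(int, [int], callable)=None):  # flagged by IDEs/type checkers
    ...

# The convention adopted in this PR: keep the hint a real type and put the
# "list of ints" detail into the docment comment instead.
def flip_item_docmented(
    draw:(int, list, callable)=None,  # Index, list of indices, or callable selecting the flip (illustrative)
):
    ...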
Thanks for the PR.
I left a couple of comments and suggestions that apply to many of the doc strings. Please review and apply them to the rest of the doc strings as needed.
Thanks @warner-benjamin. I pushed a new commit with your suggestions applied.
@dienhoa Did you do anything that might cause no_random to make this notebook non-deterministic for the no_random-wrapped pieces? When I run the code I get the same results as the notebook, so I'm just wondering. I think that could be a bug in
I'm not sure, but do we need to restart and run all the notebooks after experimenting? I usually play around with the notebook and, once finished, I run
@dienhoa no_random should always return the same result, even if you run the cell multiple times, which is why I'm curious.
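(For readers following along: a minimal sketch of the behavior under discussion, assuming fastai's `no_random` context manager from `fastai.torch_core`, which seeds `random`/NumPy/PyTorch inside the block and restores the previous RNG state on exit; a cell wrapped in it should therefore produce identical outputs on every run.)

import torch
from fastai.torch_core import no_random

with no_random(seed=42):   # seeds the RNGs, restores previous state afterwards
    a = torch.rand(3)
with no_random(seed=42):
    b = torch.rand(3)
assert torch.equal(a, b)   # the wrapped code is deterministic across runs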
In general, we shouldn't be checking in example changes if the code hasn't changed. If you could revert those, the rest looks good to me.
Okay, just apply my `list` / `[callable]` suggestion, see if you can get the old images back, and we can commit this one.
fastai/vision/augment.py
Outdated
+    max_warp:float=0.2, # Maximum value of changing warp per
+    p_affine:float=0.75, # Probability of applying affine transformation
+    p_lighting:float=0.75, # Probability of changing brightness and contrast
+    xtra_tfms:[callable]=None, # Custom Transformations
Suggested change:
-    xtra_tfms:[callable]=None, # Custom Transformations
+    xtra_tfms:list=None, # Custom Transformations
Make this change, and I think we are mostly good from my perspective.
"List of" strikes me as unnecessary since the type hint is list
Hmm, I agree: just "Custom Transformations", as it was originally, is good. The type just needs to be `list`. I'll edit the suggestion.
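(As a usage note, my own illustrative snippet rather than anything from the PR: `xtra_tfms` is passed as a plain list of extra transform callables, which is why the bare `list` hint plus the short "Custom Transformations" docment reads fine in practice.)

from fastai.vision.all import aug_transforms, RandomErasing

tfms = aug_transforms(
    p_affine=0.75,
    p_lighting=0.75,
    xtra_tfms=[RandomErasing(p=0.5)],  # extra transforms appended to the standard set
)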
I cannot make no_random reproduce the same results as on master. I reverted the outputs in my last commit to the outputs of master using VSCode. There are some images where, visually, I found no differences but
LGTM
Docment for issue #29, from Affine Tfms to the end.