
Total Variation Loss #978

Merged
merged 55 commits into Lightning-AI:master on Oct 5, 2022
Conversation

@ragavvenkatesan (Contributor) commented Apr 22, 2022

What does this PR do?

This PR adds Total Variation Loss.
Fixes #972
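
For context: total variation measures the aggregate difference between neighboring pixel values, so minimizing it encourages spatially smooth images. The anisotropic form typically used as a loss is

$$\mathrm{TV}(y) = \sum_{i,j} \left( |y_{i+1,j} - y_{i,j}| + |y_{i,j+1} - y_{i,j}| \right)$$

A minimal sketch of that sum in plain PyTorch, assuming a `(N, C, H, W)` image tensor; this illustrates the quantity being added, not necessarily the PR's exact implementation:

```python
import torch

def total_variation_sketch(img: torch.Tensor) -> torch.Tensor:
    """Anisotropic total variation, summed over a (N, C, H, W) batch."""
    # Absolute differences between vertically adjacent pixels (along H)
    diff_h = (img[..., 1:, :] - img[..., :-1, :]).abs().sum()
    # Absolute differences between horizontally adjacent pixels (along W)
    diff_w = (img[..., :, 1:] - img[..., :, :-1]).abs().sum()
    return diff_h + diff_w
```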

Before submitting

  • Was this discussed/approved via a GitHub issue? (not needed for typos and docs improvements)
  • Did you read the contributor guideline, Pull Request section?
  • Did you make sure to update the docs?
  • Did you write any new necessary tests?

PR review

Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in GitHub issues, there's a high chance it will not be merged.

Did you have fun?

Make sure you had fun coding 🙃

Initialize the PR.
codecov bot commented Apr 22, 2022

Codecov Report

Merging #978 (49c32b3) into master (4aeb6cb) will decrease coverage by 48%.
The diff coverage is 98%.

Additional details and impacted files
@@           Coverage Diff            @@
##           master    #978     +/-   ##
========================================
- Coverage      86%     38%    -48%     
========================================
  Files         193     195      +2     
  Lines       11404   11461     +57     
========================================
- Hits         9778    4303   -5475     
- Misses       1626    7158   +5532     

@Borda marked this pull request as draft on April 22, 2022 03:44
@Borda (Member) commented May 5, 2022

@ragavvenkatesan how is it going here? 🐰
We are considering a new feature release next week; do you think we can finish this addition by then so it will be included? 🙏

@ragavvenkatesan (Contributor, Author) commented
So, I was waiting on you folks for some feedback on the initial code. I am working on tests this weekend; I can prioritize it for the next release since you brought it up.

@ragavvenkatesan (Contributor, Author) commented May 10, 2022

So, @Borda, I think the metric is implemented. I am working on unit tests (taking suggestions for tests now). I have tested this on actual model training and it seems to be working as expected. Let me know how this looks.
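
A hedged sketch of the kind of check described above, assuming the module-style metric is importable from the `torchmetrics.image` path added by this PR (the final export location and defaults may differ):

```python
import torch
from torchmetrics.image import TotalVariation  # assumed import path for the metric added here

tv = TotalVariation()             # assumed default: sum reduction over the batch
preds = torch.rand(4, 3, 32, 32)  # hypothetical batch of (N, C, H, W) images
score = tv(preds)                 # scalar total variation for the batch
print(score)
```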

@SkafteNicki (Member) left a comment

some initial comments

Resolved review threads (outdated) on:
  • torchmetrics/functional/image/total_variation.py (4 threads)
  • torchmetrics/image/total_variation.py (2 threads)
@ragavvenkatesan (Contributor, Author) commented
Addressed all comments, @SkafteNicki. Anything on tests?

@Borda (Member) commented Sep 15, 2022

@ragavvenkatesan, could you please check and resolve the conflicts? 🦦

mergify bot removed the "has conflicts" label on Sep 29, 2022
@SkafteNicki enabled auto-merge (squash) on September 29, 2022 15:14
@Borda (Member) commented Oct 5, 2022

It seems to be failing for all the latest tests; in fact, it hangs and dies...

mergify bot added the "ready" label on Oct 5, 2022
@SkafteNicki merged commit eecfa7d into Lightning-AI:master on Oct 5, 2022