
Metric sweeping #544

Merged
merged 8 commits into from Sep 27, 2021

Conversation

SkafteNicki
Member

@SkafteNicki SkafteNicki commented Sep 24, 2021

Before submitting

  • Was this discussed/approved via a Github issue? (no need for typos and docs improvements)
  • Did you read the contributor guideline, Pull Request section?
  • Did you make sure to update the docs?
  • Did you write any new necessary tests?

What does this PR do?

Fixes #148
Introduces the higher_is_better static property, which indicates whether a metric is optimal when it is maximized (True), minimized (False), or whether this is undefined (None).
Still missing: the property on some metrics where I am not sure what the value should be.
cc: @maximsch2
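
For illustration, a minimal sketch (not part of the PR) of how a sweeping tool could consume the flag; torchmetrics.Accuracy and torchmetrics.MeanSquaredError are existing metric classes, while sweep_direction is a hypothetical helper:

    import torchmetrics


    def sweep_direction(metric_cls) -> str:
        # Map the class-level higher_is_better flag to a sweep direction.
        flag = metric_cls.higher_is_better
        if flag is True:
            return "maximize"
        if flag is False:
            return "minimize"
        return "undefined"  # None: no well-defined direction


    print(sweep_direction(torchmetrics.Accuracy))          # expected: maximize
    print(sweep_direction(torchmetrics.MeanSquaredError))  # expected: minimize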

PR review

Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in Github issues there's a high chance it will not be merged.

Did you have fun?

Make sure you had fun coding 🙃

@codecov

codecov bot commented Sep 24, 2021

Codecov Report

Merging #544 (ad1c2cc) into master (ac52dd7) will decrease coverage by 0%.
The diff coverage is 100%.

@@          Coverage Diff          @@
##           master   #544   +/-   ##
=====================================
- Coverage      95%    95%   -0%     
=====================================
  Files         130    130           
  Lines        4618   4632   +14     
=====================================
+ Hits         4400   4411   +11     
- Misses        218    221    +3     

@Borda
Member

Borda commented Sep 24, 2021

I think it should be a property so that it is not possible to change it, the same as is_differentiable...
also, do we have another name candidate? this one sounds a bit hacky to me...
@PyTorchLightning/core-metrics @justusschock @maximsch2

@SkafteNicki
Member Author

SkafteNicki commented Sep 24, 2021

Fine with this being a property instead. Only reason I wanted this to be a class attribute was to give the user the possibility of actually getting this value even before the class was initialized.
Name was suggested by @maximsch2
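
To illustrate the distinction being discussed: a plain class attribute can be read without constructing the metric, whereas an instance property only yields the value on an instance (a standalone sketch, not the torchmetrics implementation):

    class ClassAttrMetric:
        # Plain class attribute: readable straight off the class.
        higher_is_better = True


    class PropertyMetric:
        @property
        def higher_is_better(self) -> bool:
            # Property: only resolves to a value on an instance.
            return True


    print(ClassAttrMetric.higher_is_better)   # True, no instance needed
    print(PropertyMetric().higher_is_better)  # True, but requires instantiation
    print(PropertyMetric.higher_is_better)    # the property object itself, not the value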

@SkafteNicki
Member Author

@lucadiliello what is the correct value for all our retrieval metrics (you are the expert)?
Is a model optimal when they are maximized or minimized or is it not well defined?

@lucadiliello
Contributor

@lucadiliello what is the correct value for all our retrieval metrics (you are the expert)?
Is a model optimal when they are maximized or minimized or is it not well defined?

Hi @SkafteNicki , all retrieval metrics should have higher_is_better=True apart from FallOut, which should be minimised.
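
Spelled out as class-level defaults, the suggestion above would look roughly like this (a sketch based on the comment; the retrieval class names are the existing torchmetrics ones):

    # Suggested higher_is_better defaults for the retrieval metrics (sketch).
    RETRIEVAL_HIGHER_IS_BETTER = {
        "RetrievalMAP": True,
        "RetrievalMRR": True,
        "RetrievalPrecision": True,
        "RetrievalRecall": True,
        "RetrievalNormalizedDCG": True,
        "RetrievalFallOut": False,  # fall-out is the exception: lower is better
    }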

@Borda
Member

Borda commented Sep 24, 2021

Only reason I wanted this to be a class attribute was to give the user the possibility of actually getting this value even before the class was initialized.

that is a good point, but in the same light, why would is_differentiable need to be initialized first?

we can ban certain attributes from being overwritten, for example:

    def __setattr__(self, name, value):
        if name in ("higher_is_better", "is_differentiable"):
            raise RuntimeError(f"Can't change const `{name}`.")
        self.__dict__[name] = value

@SkafteNicki
Member Author

Only reason I wanted this to be a class attribute was to give the user the possibility of actually getting this value even before the class was initialized.

that is a good point, but in the same light, why would is_differentiable need to be initialized first?

we can ban certain attributes from being overwritten, for example:

    def __setattr__(self, name, value):
        if name in ("higher_is_better", "is_differentiable"):
            raise RuntimeError(f"Can't change const `{name}`.")
        self.__dict__[name] = value

You are completely right. I think we should also change is_differentiable to be this way and it looks like a good solution to me :]
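
A short usage sketch of the guard proposed above, assuming it sits on the metric base class (GuardedMetric is a stand-in name):

    class GuardedMetric:
        higher_is_better = True
        is_differentiable = False

        def __setattr__(self, name, value):
            if name in ("higher_is_better", "is_differentiable"):
                raise RuntimeError(f"Can't change const `{name}`.")
            self.__dict__[name] = value


    m = GuardedMetric()
    m.threshold = 0.5                  # normal attributes still work
    try:
        m.higher_is_better = False     # attempting to override the constant
    except RuntimeError as err:
        print(err)                     # -> Can't change const `higher_is_better`.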

@SkafteNicki
Member Author

SkafteNicki commented Sep 25, 2021

Anyone got a better name? Off the top of my head:

  • 🎉 higher_is_better: options True, False, None
  • 👍 maximize (or minimize): options True, False, None
  • ❤️ direction or optim_direction: options maximize, minimize, None
  • ...

cc: @PyTorchLightning/core-metrics @PyTorchLightning/core-contributors

@mergify mergify bot added the ready label Sep 25, 2021
@Borda Borda changed the title Metric sweeping Metric sweeping [wip] Sep 25, 2021
@Borda Borda changed the title Metric sweeping [wip] Metric sweeping Sep 25, 2021
@Borda Borda enabled auto-merge (squash) September 25, 2021 19:50
@SkafteNicki
Member Author

@Borda please hold off on merging, I'm going to add the property to some more metrics.

@Borda Borda mentioned this pull request Sep 25, 2021
@SkafteNicki SkafteNicki enabled auto-merge (squash) September 26, 2021 08:06
Contributor

@tchaton tchaton left a comment

LGTM !

@Borda
Member

Borda commented Sep 27, 2021

@SkafteNicki mind checking the failing tests on GPU?

@mergify mergify bot removed the has conflicts label Sep 27, 2021
@SkafteNicki SkafteNicki merged commit 1e29035 into master Sep 27, 2021
@SkafteNicki SkafteNicki deleted the sweeping branch September 27, 2021 13:13
Successfully merging this pull request may close these issues:

Metrics support for sweeping