
Spearman correlation coefficient #158

Merged
merged 16 commits into master from the spearman branch on Apr 13, 2021

Conversation

SkafteNicki (Member) commented Apr 5, 2021

Before submitting

  • Was this discussed/approved via a GitHub issue? (not needed for typos and docs improvements)
  • Did you read the contributor guideline, Pull Request section?
  • Did you make sure to update the docs?
  • Did you write any new necessary tests?

What does this PR do?

Adds the Spearman correlation coefficient metric.
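
For context, Spearman's rank correlation is the Pearson correlation computed on the rank-transformed predictions and targets. The snippet below is a minimal sketch of that idea in plain PyTorch, not the implementation added by this PR; the _rank helper is hypothetical and skips the tie-averaging that a full implementation needs.

import torch


def _rank(x: torch.Tensor) -> torch.Tensor:
    # Simplified rank transform: each element's position in the sorted order.
    # A complete Spearman implementation would also average ranks over ties.
    ranks = torch.empty_like(x)
    idx = x.argsort()
    ranks[idx] = torch.arange(len(x), dtype=x.dtype, device=x.device)
    return ranks


def spearman_sketch(preds: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
    # Spearman correlation = Pearson correlation of the ranks.
    rp = _rank(preds)
    rt = _rank(target)
    rp = rp - rp.mean()
    rt = rt - rt.mean()
    return (rp * rt).sum() / (rp.norm() * rt.norm())


preds = torch.tensor([2.5, 0.0, 2.0, 8.0])
target = torch.tensor([3.0, -0.5, 2.0, 7.0])
print(spearman_sketch(preds, target))  # tensor(1.) here, since the two orderings agree exactly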

PR review

Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in GitHub issues, there's a high chance it will not be merged.

Did you have fun?

Make sure you had fun coding 🙃

SkafteNicki added the enhancement (New feature or request) label on Apr 5, 2021
pep8speaks commented Apr 5, 2021

Hello @SkafteNicki! Thanks for updating this PR.

There are currently no PEP 8 issues detected in this Pull Request. Cheers! 🍻

Comment last updated at 2021-04-13 17:29:43 UTC

SkafteNicki changed the title from "[WIP] [New metric] Spearman correlation coefficient" to "Spearman correlation coefficient" on Apr 13, 2021
SkafteNicki marked this pull request as ready for review on April 13, 2021, 12:09
codecov bot commented Apr 13, 2021

Codecov Report

Merging #158 (5588daf) into master (b098efd) will decrease coverage by 16.62%.
The diff coverage is 93.93%.


@@             Coverage Diff             @@
##           master     #158       +/-   ##
===========================================
- Coverage   96.03%   79.41%   -16.62%     
===========================================
  Files         172       88       -84     
  Lines        5240     2681     -2559     
===========================================
- Hits         5032     2129     -2903     
- Misses        208      552      +344     
Flag         Coverage Δ
Linux        79.41% <93.93%> (+0.36%) ⬆️
Windows      79.41% <93.93%> (+0.36%) ⬆️
cpu          79.41% <93.93%> (-16.62%) ⬇️
gpu          ?
macOS        79.41% <93.93%> (-16.62%) ⬇️
pytest       79.41% <93.93%> (-16.62%) ⬇️
python3.6    ?
python3.8    ?
python3.9    ?
torch1.3.1   ?
torch1.4.0   ?
torch1.8.1   ?
Flags with carried forward coverage won't be shown.

Impacted Files                                      Coverage Δ
torchmetrics/__init__.py                            100.00% <ø> (ø)
torchmetrics/functional/regression/spearman.py      90.69% <90.69%> (ø)
torchmetrics/functional/__init__.py                 100.00% <100.00%> (ø)
torchmetrics/functional/regression/__init__.py      100.00% <100.00%> (ø)
torchmetrics/regression/__init__.py                 100.00% <100.00%> (ø)
torchmetrics/regression/spearman.py                 100.00% <100.00%> (ø)
torchmetrics/utilities/distributed.py               22.85% <0.00%> (-74.29%) ⬇️
torchmetrics/classification/auc.py                  47.61% <0.00%> (-52.39%) ⬇️
torchmetrics/functional/classification/auroc.py     46.15% <0.00%> (-40.01%) ⬇️
torchmetrics/metric.py                              55.38% <0.00%> (-39.70%) ⬇️
... and 130 more

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Last update b098efd...5588daf.
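
For readers skimming the report above: the new files torchmetrics/regression/spearman.py and torchmetrics/functional/regression/spearman.py indicate both a module metric and a functional interface. The usage sketch below is illustrative only; the class name SpearmanCorrcoef and its import path are assumptions inferred from the file layout, so check the released torchmetrics docs for the authoritative API.

import torch
# Assumed import path and class name, inferred from the file layout above;
# consult the torchmetrics documentation for the exact public names.
from torchmetrics.regression import SpearmanCorrcoef

preds = torch.tensor([2.5, 0.0, 2.0, 8.0])
target = torch.tensor([3.0, -0.5, 2.0, 7.0])

metric = SpearmanCorrcoef()
metric.update(preds, target)
print(metric.compute())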

torchmetrics/regression/spearman.py: resolved review thread
torchmetrics/regression/spearman.py: resolved review thread (outdated)
Borda enabled auto-merge (squash) on April 13, 2021, 17:30
Borda merged commit f64a06b into master on Apr 13, 2021
Borda deleted the spearman branch on April 13, 2021, 17:47
Borda added a commit to alanhdu/metrics that referenced this pull request Apr 14, 2021
* ranking

* init files

* update

* nearly working

* fix tests

* pep8

* add docs

* fix doctests

* fix docs

* pep8

* isort

* ghlog

* Apply suggestions from code review

Co-authored-by: Jirka Borovec <[email protected]>
Borda added a commit that referenced this pull request Apr 14, 2021
* Add AverageMeter

* Fix type annotation to accommodate Python 3.6 bug

* Add tests

* Update changelog

* Add AverageMeter to docs

* fixup! Add AverageMeter to docs

* Code review comments

* Add tests for scalar case

* Fix behavior on PyTorch <1.8

* fixup! Add tests for scalar case

* fixup! fixup! Add tests for scalar case

* Update CHANGELOG.md

* Add Pearson correlation coefficient (#157)

* init files

* rest

* pep8

* changelog

* clamp

* suggestions

* rename

* format

* _sk_pearsonr

* inline

* fix sync

* fix tests

* fix docs

* Apply suggestions from code review

* Update torchmetrics/functional/regression/pearson.py

* atol

* update

* pep8

* pep8

* chlog

* .

Co-authored-by: Jirka Borovec <[email protected]>
Co-authored-by: Jirka Borovec <[email protected]>

* Spearman correlation coefficient (#158)

* ranking

* init files

* update

* nearly working

* fix tests

* pep8

* add docs

* fix doctests

* fix docs

* pep8

* isort

* ghlog

* Apply suggestions from code review

Co-authored-by: Jirka Borovec <[email protected]>

* Added changes for Test Differentiability [1/n] (#154)

* added test changes

* fix style error

* fixed typo

* added changes for requires_grad

* metrics differentiability testing generalization

* Update tests/classification/test_accuracy.py

Co-authored-by: Nicki Skafte <[email protected]>

* fix tests

* pep8

* changelog

* fix docs

* fix tests

* pep8

* Apply suggestions from code review

Co-authored-by: Nicki Skafte <[email protected]>
Co-authored-by: Jirka Borovec <[email protected]>

* Binned PR-related metrics (#128)

* WIP: Binned PR-related metrics

* attempt to fix types

* switch to linspace to make old pytorch happy

* make flake happy

* clean up

* Add more testing, move test input generation to the appropriate place

* bugfixes and more stable and thorough tests

* flake8

* Reuse python zip-based implementation as it can't be reproduced with torch.where/max

* address comments

* isort

* Add docs and doctests, make APIs same as non-binned versions

* pep8

* isort

* doctests likes longer title underlines :O

* use numpy's nan_to_num

* add atol to bleu tests to make them more stable

* atol=1e-2 for bleu

* add more docs

* pep8

* remove nlp test hack

* address comments

* pep8

* abc

* flake8

* remove typecheck

* chlog

Co-authored-by: Jirka Borovec <[email protected]>
Co-authored-by: Nicki Skafte <[email protected]>
Co-authored-by: Jirka Borovec <[email protected]>

* version + about (#170)

* version + about

* flake8

* try

* .

* fix doc

* overload sig

* fix

* Different import style

Co-authored-by: Nicki Skafte <[email protected]>
Co-authored-by: Jirka Borovec <[email protected]>
Co-authored-by: Jirka Borovec <[email protected]>
Co-authored-by: Bhadresh Savani <[email protected]>
Co-authored-by: Maxim Grechkin <[email protected]>
Labels: enhancement (New feature or request), ready
4 participants