Commit

merge
hookSSi committed Feb 23, 2022
2 parents 2f144cc + 112b1ea commit a1116cb
Showing 9 changed files with 35 additions and 30 deletions.
15 changes: 3 additions & 12 deletions .github/ISSUE_TEMPLATE/bug_report.md
@@ -12,12 +12,7 @@ assignees: ''

### To Reproduce

Steps to reproduce the behavior:

1. Go to '...'
1. Run '....'
1. Scroll down to '....'
1. See error
Steps to reproduce the behavior...

<!-- If you have a code sample, error messages, stack traces, please provide it here as well -->

@@ -32,13 +32,9 @@ Minimal means having the shortest code but still preserving the bug. -->

### Environment

- PyTorch Version (e.g., 1.0):
- OS (e.g., Linux):
- How you installed PyTorch (`conda`, `pip`, source):
- Build command you used (if compiling from source):
- Python version:
- CUDA/cuDNN version:
- GPU models and configuration:
- Python & PyTorch Version (e.g., 1.0):
- How you installed PyTorch (`conda`, `pip`, build command if you used source):
- Any other relevant information:

### Additional context
6 changes: 3 additions & 3 deletions .github/PULL_REQUEST_TEMPLATE.md
@@ -4,10 +4,10 @@ Fixes #\<issue_number>

## Before submitting

- [ ] Was this discussed/approved via a Github issue? (no need for typos and docs improvements)
- [ ] Was this **discussed/approved** via a Github issue? (no need for typos and docs improvements)
- [ ] Did you read the [contributor guideline](https://github.com/PyTorchLightning/metrics/blob/master/.github/CONTRIBUTING.md), Pull Request section?
- [ ] Did you make sure to update the docs?
- [ ] Did you write any new necessary tests?
- [ ] Did you make sure to **update the docs**?
- [ ] Did you write any new **necessary tests**?

## PR review

2 changes: 2 additions & 0 deletions CHANGELOG.md
@@ -28,6 +28,8 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0

### Changed

- Made `num_classes` in `jaccard_index` a required argument ([#853](https://github.com/PyTorchLightning/metrics/pull/853))


### Deprecated

12 changes: 12 additions & 0 deletions tests/classification/test_jaccard.py
@@ -155,6 +155,7 @@ def test_jaccard(half_ones, reduction, ignore_index, expected):
jaccard_val = jaccard_index(
preds=preds,
target=target,
num_classes=3,
ignore_index=ignore_index,
reduction=reduction,
)
@@ -233,3 +234,14 @@ def test_jaccard_ignore_index(pred, target, ignore_index, num_classes, reduction
reduction=reduction,
)
assert torch.allclose(jaccard_val, tensor(expected).to(jaccard_val))


def test_warning_on_difference_in_number_of_classes():
"""Test that a warning is raised if the detected number of classes differs from the specified number of
classes."""
preds = torch.randint(3, (10,))
target = torch.randint(3, (10,))
with pytest.warns(
RuntimeWarning,
):
jaccard_index(preds, target, num_classes=4)
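The new test relies on `jaccard_index` emitting a `RuntimeWarning` when the labels in the data disagree with `num_classes`. A minimal stdlib-only sketch of that kind of check (`check_num_classes` is a hypothetical helper for illustration, not torchmetrics' implementation):

```python
import warnings

def check_num_classes(preds, target, num_classes):
    """Warn if the number of distinct labels differs from `num_classes`."""
    detected = len(set(preds) | set(target))
    if detected != num_classes:
        warnings.warn(
            f"Detected {detected} classes but `num_classes` was set to {num_classes}.",
            RuntimeWarning,
        )

# Mirrors the test above: three distinct labels, but num_classes=4.
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    check_num_classes([0, 1, 2], [0, 1, 2], num_classes=4)
assert len(caught) == 1 and issubclass(caught[0].category, RuntimeWarning)
```

The `warnings.catch_warnings(record=True)` context plays the same role here that `pytest.warns(RuntimeWarning)` plays in the test above.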
3 changes: 1 addition & 2 deletions tests/text/test_rouge.py
@@ -208,5 +208,4 @@ def test_rouge_metric_normalizer_tokenizer(pl_rouge_metric_key):
)
metrics_score = Scorer.compute()

threshold = 1e-08
np.isclose(metrics_score[rouge_level + "_" + metric], original_score, atol=threshold, equal_nan=True)
np.isclose(metrics_score[rouge_level + "_" + metric], original_score, atol=1e-8, equal_nan=True)
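The inlined assertion compares scores with an absolute tolerance of `1e-8`. On scalars, the absolute-tolerance part of that comparison can be sketched with the standard library alone (`math.isclose` stands in for `np.isclose` here; note `np.isclose` additionally applies a relative tolerance by default):

```python
import math

score, reference = 0.123456789, 0.123456790  # differ by 1e-9

# Disable the relative term to keep only the absolute tolerance,
# analogous to the atol=1e-8 check above.
assert math.isclose(score, reference, rel_tol=0.0, abs_tol=1e-8)
assert not math.isclose(score, score + 1e-6, rel_tol=0.0, abs_tol=1e-8)
```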
11 changes: 4 additions & 7 deletions tm_examples/rouge_score-own_normalizer_and_tokenizer.py
@@ -37,8 +37,7 @@ def __call__(self, text: str) -> str:
should obey the input/output arguments structure described below.
Args:
text:
Input text. `str`
text: Input text.
Return:
Normalized python string object
@@ -51,19 +50,17 @@ def __call__(self, text: str) -> str:
class UserTokenizer:
"""The `UserTokenizer` class is required to tokenize a non-alphabet language text input.
The user's defined tokenizer is expected to return `Sequence[str]` that are fed into the rouge score.
The user's defined tokenizer is expected to return ``Sequence[str]`` that are fed into the rouge score.
"""

def __init__(self) -> None:
self.pattern = r"\s+"
pattern = r"\s+"

def __call__(self, text: str) -> Sequence[str]:
"""The `__call__` method must be defined for this class. To ensure the functionality, the `__call__` method
should obey the input/output arguments structure described below.
Args:
text:
Input text. `str`
text: Input text.
Return:
Tokenized sentence
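The two classes in this example file only need to satisfy the documented `__call__` signatures. A self-contained sketch of the same protocol (`SimpleNormalizer` and `SimpleTokenizer` are illustrative names, not part of torchmetrics):

```python
import re
from typing import Sequence


class SimpleNormalizer:
    """Normalize text: lowercase it and drop non-word, non-space characters."""

    def __call__(self, text: str) -> str:
        return re.sub(r"[^\w\s]", "", text.lower())


class SimpleTokenizer:
    """Split normalized text on whitespace, mirroring the `\\s+` pattern above."""

    pattern = r"\s+"

    def __call__(self, text: str) -> Sequence[str]:
        return re.split(self.pattern, text.strip())


normalizer, tokenizer = SimpleNormalizer(), SimpleTokenizer()
tokens = tokenizer(normalizer("Hello, ROUGE world!"))
print(tokens)  # ['hello', 'rouge', 'world']
```

The example script then hands callables of this shape to the rouge score, which feeds each text through the normalizer before tokenizing it.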
4 changes: 3 additions & 1 deletion torchmetrics/audio/pesq.py
@@ -29,7 +29,9 @@ class PerceptualEvaluationSpeechQuality(Metric):
to perform the metric calculation.
.. note:: using this metric requires that you have ``pesq`` installed. Either install as ``pip install
torchmetrics[audio]`` or ``pip install pesq``
torchmetrics[audio]`` or ``pip install pesq``. Note that ``pesq`` will compile with your currently
installed version of numpy, meaning that if you upgrade numpy at some point in the future you will
most likely have to reinstall ``pesq``.
Forward accepts
4 changes: 3 additions & 1 deletion torchmetrics/functional/audio/pesq.py
@@ -36,7 +36,9 @@ def perceptual_evaluation_speech_quality(
to perform the metric calculation.
.. note:: using this metric requires that you have ``pesq`` installed. Either install as ``pip install
torchmetrics[audio]`` or ``pip install pesq``
torchmetrics[audio]`` or ``pip install pesq``. Note that ``pesq`` will compile with your currently
installed version of numpy, meaning that if you upgrade numpy at some point in the future you will
most likely have to reinstall ``pesq``.
Args:
preds:
8 changes: 4 additions & 4 deletions torchmetrics/functional/classification/jaccard.py
@@ -69,10 +69,10 @@ def _jaccard_from_confmat(
def jaccard_index(
preds: Tensor,
target: Tensor,
num_classes: int,
ignore_index: Optional[int] = None,
absent_score: float = 0.0,
threshold: float = 0.5,
num_classes: Optional[int] = None,
reduction: str = "elementwise_mean",
) -> Tensor:
r"""
@@ -95,6 +95,7 @@ Args:
Args:
preds: tensor containing predictions from model (probabilities, or labels) with shape ``[N, d1, d2, ...]``
target: tensor containing ground truth labels with shape ``[N, d1, d2, ...]``
num_classes: Specify the number of classes
ignore_index: optional int specifying a target class to ignore. If given,
this class index does not contribute to the returned score, regardless
of reduction method. Has no effect if given an int that is not in the
@@ -107,8 +108,7 @@
assigned the `absent_score`.
threshold:
Threshold value for binary or multi-label probabilities.
num_classes:
Optionally specify the number of classes
reduction: a method to reduce metric score over labels.
- ``'elementwise_mean'``: takes the mean (default)
@@ -124,7 +124,7 @@
>>> target = torch.randint(0, 2, (10, 25, 25))
>>> pred = torch.tensor(target)
>>> pred[2:5, 7:13, 9:15] = 1 - pred[2:5, 7:13, 9:15]
>>> jaccard_index(pred, target)
>>> jaccard_index(pred, target, num_classes=2)
tensor(0.9660)
"""
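For intuition, the per-class score the docstring describes (intersection over union, with `absent_score` for classes missing from both tensors) can be sketched in plain Python. `jaccard_per_class` below is a hypothetical illustration over flat label sequences; the real implementation computes this from a confusion matrix on tensors:

```python
def jaccard_per_class(preds, target, num_classes, absent_score=0.0):
    """Per-class Jaccard index (IoU) over flat sequences of integer labels."""
    scores = []
    for c in range(num_classes):
        pred_c = {i for i, p in enumerate(preds) if p == c}
        true_c = {i for i, t in enumerate(target) if t == c}
        union = pred_c | true_c
        if not union:
            # Class absent from both preds and target -> absent_score.
            scores.append(absent_score)
        else:
            scores.append(len(pred_c & true_c) / len(union))
    return scores


print(jaccard_per_class([0, 0, 1, 1], [0, 1, 1, 1], num_classes=2))
# [0.5, 0.6666666666666666]
```

With `reduction="elementwise_mean"` the final score would be the mean of this per-class list (subject to `ignore_index`).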

