
using torch<1.12.0 fails cross_entropy tests #2665

Closed
jvesely opened this issue May 8, 2023 · 0 comments · Fixed by #2667
Labels
bug Should work but doesn't

Comments


jvesely commented May 8, 2023

The minimum torch version for PNL is set to 1.8.0, but versions < 1.12.0 fail tests:

FAILED tests/composition/test_autodiffcomposition.py::TestMiscTrainingFunctionality::test_various_loss_specs[ExecutionMode.PyTorch-Loss.CROSS_ENTROPY] - IndexError: Dimension out of range (expected to be in range of [-1, 0], but got 1)
FAILED tests/composition/test_autodiffcomposition.py::TestBatching::test_cross_entropy_loss - IndexError: Dimension out of range (expected to be in range of [-1, 0], but got 1)
@jvesely jvesely added the bug Should work but doesn't label May 8, 2023
jvesely added a commit to jvesely/PsyNeuLink that referenced this issue May 8, 2023
d6d95f3 ("Nback (PrincetonUniversity#2617)") changed the
cross entropy loss function to use the "one hot" format for targets
instead of the previously used class index.
This new format requires torch >= 1.12.0 [0]

Fixes: d6d95f3 ("Nback (PrincetonUniversity#2617)")
Closes: PrincetonUniversity#2665

[0] https://github.com/pytorch/pytorch/releases/tag/v1.12.0

Signed-off-by: Jan Vesely <[email protected]>
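On torch < 1.12.0, `cross_entropy` only accepts class-index targets, so a "one hot" target can be converted back to an index before the call. A minimal pure-Python sketch of that conversion (the helper name `one_hot_to_index` is hypothetical, not from PsyNeuLink):

```python
def one_hot_to_index(one_hot_target):
    """Convert a one-hot target vector (e.g. [0.0, 1.0, 0.0]) to the
    class index expected by cross_entropy on torch < 1.12.0.

    Ties resolve to the first maximal index."""
    return max(range(len(one_hot_target)), key=lambda i: one_hot_target[i])

print(one_hot_to_index([0.0, 1.0, 0.0]))  # -> 1
print(one_hot_to_index([1.0, 0.0, 0.0]))  # -> 0
```

In practice the same conversion would be done with `torch.argmax` on the target tensor; the sketch above only illustrates the shape of the fallback.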
jvesely added a commit to jvesely/PsyNeuLink that referenced this issue May 9, 2023
…ch<=1.11.x

1.12.0+ can use the CrossEntropyLoss instance directly.
d6d95f3 ("Nback (PrincetonUniversity#2617)") changed the
format of the cross entropy target to one that requires torch >= 1.12.

Fixes: d6d95f3 ("Nback (PrincetonUniversity#2617)")
Closes: PrincetonUniversity#2665

[0] https://github.com/pytorch/pytorch/releases/tag/v1.12.0

Signed-off-by: Jan Vesely <[email protected]>
jvesely added a commit to jvesely/PsyNeuLink that referenced this issue May 12, 2023
…torch<=1.11.x

d6d95f3 ("Nback (PrincetonUniversity#2617)") changed the
format of the cross entropy target to one that requires torch >= 1.12.
torch 1.12.0+ includes an input handling path that accepts inputs
without a batch dimension, so the loss instance can be used directly. [0,1,2]

Fixes: d6d95f3 ("Nback (PrincetonUniversity#2617)")
Closes: PrincetonUniversity#2665

[0] https://github.com/pytorch/pytorch/releases/tag/v1.12.0
[1] pytorch/pytorch#77653
[2] pytorch/pytorch@8881d7a

Signed-off-by: Jan Vesely <[email protected]>
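The version-gated approach described in the commit above can be sketched as a plain version comparison: use the `CrossEntropyLoss` instance directly on torch >= 1.12.0, and fall back to index targets otherwise. The function names below are hypothetical illustrations, not PsyNeuLink's actual gating code:

```python
def parse_version(version_string):
    """Parse a version string like '1.11.0+cu113' into a comparable
    tuple of integers, ignoring any local/build suffix."""
    public = version_string.split("+")[0]
    return tuple(int(part) for part in public.split(".") if part.isdigit())

def torch_supports_probability_targets(torch_version):
    """torch >= 1.12.0 accepts probability ('one hot') targets and
    unbatched inputs in cross_entropy; older versions need class indices."""
    return parse_version(torch_version) >= (1, 12, 0)

print(torch_supports_probability_targets("1.11.0"))       # -> False
print(torch_supports_probability_targets("1.12.0+cu113")) # -> True
```

A real implementation would compare `torch.__version__` (e.g. via `packaging.version`) rather than hand-parsing, but the gating logic is the same.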
@jvesely jvesely closed this as completed May 29, 2023