Nback #2540
Conversation
…Link into devel • Conflicts: Scripts/Models (Under Development)/N-back.py
- add save and load methods (from Samyak) - test_autodiffcomposition.py: add test_autodiff_saveload, but commented out for now, as it may be causing hanging on PR
…Link into feat/autodiff_various • Conflicts: Scripts/Models (Under Development)/N-back.py, tests/mdf/model_basic.yml
- pytorch_function_creator: add SoftMax • transferfunctions.py: - disable changes to ReLU.derivative for now
- iscompatible: attempt to replace try and except, commented out for now
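The pytorch_function_creator change above maps PsyNeuLink functions (such as SoftMax) to PyTorch callables. The sketch below shows what such a dispatch can look like; the function name reuse, parameter handling, and the gain-scaled torch.softmax call are illustrative assumptions, not the actual PsyNeuLink internals.

```python
# Illustrative sketch of a PsyNeuLink-function -> PyTorch-callable mapping;
# names and parameter handling are assumptions, not PsyNeuLink source.
import torch

def pytorch_function_creator(pnl_function_name, **params):
    """Return a torch-compatible callable for a named function."""
    if pnl_function_name == "SoftMax":
        gain = params.get("gain", 1.0)
        # softmax over the last dimension, with an optional gain factor
        return lambda x: torch.softmax(gain * x, dim=-1)
    if pnl_function_name == "ReLU":
        return torch.relu
    if pnl_function_name == "Linear":
        slope = params.get("slope", 1.0)
        intercept = params.get("intercept", 0.0)
        return lambda x: slope * x + intercept
    raise ValueError(f"No PyTorch equivalent registered for {pnl_function_name}")
```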
…Link into devel • Conflicts: Scripts/Models (Under Development)/N-back.py
…Link into feat/autodiff_various
- save and load: augment file and directory handling - exclude processing of any ModulatoryProjections
save(): add projection.matrix.base = matrix (fixes test_autodiff_saveload)
- save: return path • test_autodiffcomposition.py: - test_autodiff_saveload: modify to use current working directory rather than tmp
- save() and load(): ignore CIM, learning, and other modulation-related projections
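The save/load commits above describe an AutodiffComposition.save() that returns the path it wrote and a load() that restores projection matrices (assigning through projection.matrix.base) while skipping CIM, learning, and other modulatory projections. The sketch below is a minimal usage example under those assumptions; the mechanism sizes, the directory argument, and the pathway construction are illustrative, not taken from the tests.

```python
import psyneulink as pnl

# Build a small feedforward AutodiffComposition (sizes are illustrative).
input_mech = pnl.TransferMechanism(name="input", size=4)
hidden_mech = pnl.TransferMechanism(name="hidden", size=3, function=pnl.Logistic)
output_mech = pnl.TransferMechanism(name="output", size=2, function=pnl.Logistic)

net = pnl.AutodiffComposition(name="saveload_demo")
net.add_linear_processing_pathway([input_mech, hidden_mech, output_mech])

# ... train with net.learn(...) ...

# Per the commits, save() returns the path of the file it wrote
# (a directory argument is assumed here) ...
path = net.save(directory=".")

# ... and load() restores the saved projection matrices, assigning each
# through projection.matrix.base and ignoring CIM/modulatory projections.
net.load(path)
```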
…Link into devel • Conflicts: .github/actions/install-pnl/action.yml, .github/actions/on-branch/action.yml, .github/workflows/pnl-ci-docs.yml, .github/workflows/pnl-ci.yml, .github/workflows/test-release.yml, Scripts/Models (Under Development)/N-back.py
- test_xor_training_identicalness_standard_composition_vs_PyTorch_vs_LLVM: replaces test_xor_training_identicalness_standard_composition_vs_Autodiff
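For reference, the kind of plain-PyTorch XOR training run that the renamed identicalness test compares against looks roughly like the sketch below; the layer sizes, learning rate, and epoch count are placeholders rather than the values used in the test.

```python
import torch

# XOR inputs and targets
x = torch.tensor([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = torch.tensor([[0.], [1.], [1.], [0.]])

# Two-layer sigmoid network (sizes are placeholders)
model = torch.nn.Sequential(
    torch.nn.Linear(2, 10, bias=False),
    torch.nn.Sigmoid(),
    torch.nn.Linear(10, 1, bias=False),
    torch.nn.Sigmoid(),
)
optimizer = torch.optim.SGD(model.parameters(), lr=10.0)
loss_fn = torch.nn.MSELoss()

for _ in range(1000):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()

print(model(x).detach())  # should approach [[0], [1], [1], [0]]
```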
- BackPropagation: - fix bug in which derivative for default loss (MSE) was computed using L0 - add explicit specification for L0 loss • composition.py: - _create_terminal_backprop_learning_components: - add explicit assignment of output_port[SUM] for L0 loss • test_learning.py: - test_multilayer: - fix bug in which SSE was assigned as loss, but output_port[MSE] was used for objective_mechanism - replace with explicit L0 loss and output_port[SUM] for objective_mechanism
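On the loss fix above: reading the commit, L0 denotes a plain sum-of-errors loss (reported through the ComparatorMechanism's SUM output_port), while the default MSE is the mean squared error, so their derivatives with respect to the output differ. The NumPy sketch below illustrates that difference from the definitions alone; it is not PsyNeuLink code.

```python
# Derivatives implied by the two loss definitions (not PsyNeuLink source).
import numpy as np

def l0_loss(target, output):
    return np.sum(target - output)          # matches the SUM output_port

def l0_derivative(target, output):
    return -np.ones_like(output)            # d/d_output of sum(target - output)

def mse_loss(target, output):
    return np.mean((target - output) ** 2)  # matches the MSE output_port

def mse_derivative(target, output):
    n = output.size
    return -2.0 * (target - output) / n     # scales with the error itself

target = np.array([1.0, 0.0])
output = np.array([0.25, 0.75])
print(l0_derivative(target, output))   # [-1. -1.]
print(mse_derivative(target, output))  # [-0.75  0.75]
```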
This PR causes the following changes to the html docs (ubuntu-latest-3.7-x64); see CI logs for the full diff.
CodeQL found more than 10 potential problems in the proposed changes. Check the Files changed tab for more details.
This pull request introduces 18 alerts and fixes 2 when merging 98e6525 into 89d2561 (view on LGTM.com).
Heads-up: LGTM.com's PR analysis will be disabled on the 5th of December, and LGTM.com will be shut down completely on the 16th of December 2022. It looks like GitHub code scanning with CodeQL is already set up for this repo, so no further action is needed. For more information, please check out our post on the GitHub blog.
- clean up test_xor_training_identicalness_standard_composition_vs_PyTorch_and_LLVM
This PR causes the following changes to the html docs (ubuntu-latest-3.7-x64); see CI logs for the full diff.
This pull request introduces 11 alerts and fixes 6 when merging a46be70 into 89d2561 (view on LGTM.com).
- clean up test_xor_training_identicalness_standard_composition_vs_PyTorch_and_LLVM
This PR causes the following changes to the html docs (ubuntu-latest-3.7-x64); see CI logs for the full diff.
This pull request introduces 10 alerts and fixes 6 when merging 787e8a4 into 89d2561 (view on LGTM.com).
NOTE: This PR has many changes to learning, as well as the nback script.
• nback.py: to match and validate against the version in Beukers et al. (2022); see the task sketch after this list.
• transferfunctions.py:
• learningfunctions.py
• combinationfunctions.py:
• pytorchmodelcreator.py:
• compiledloss.py:
• llvm/init.py:
• composition.py
• autodiffcomposition.py and keywords.py
• ComparatorMechanism:
• keywords.py:
• test_learning.py:
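For context on nback.py: the n-back paradigm it models (as in Beukers et al., 2022) presents a stimulus sequence and asks, on each trial, whether the current stimulus matches the one shown n trials earlier. The generator below is a generic illustration of that trial structure, not code from nback.py or the paper; the stimulus set and match probability are arbitrary.

```python
# Generic n-back trial generator: each trial records whether the current
# stimulus matches the one presented n trials earlier. Not nback.py code.
import random

def generate_nback_trials(n=2, num_trials=20, stimuli="ABCDEF", match_prob=0.3):
    """Return a list of (stimulus, is_target) pairs for one n-back block."""
    sequence, trials = [], []
    for t in range(num_trials):
        if t >= n and random.random() < match_prob:
            stim = sequence[t - n]          # force an n-back match
        else:
            stim = random.choice(stimuli)   # may still match by chance
        is_target = t >= n and stim == sequence[t - n]
        sequence.append(stim)
        trials.append((stim, is_target))
    return trials

for stim, is_target in generate_nback_trials():
    print(stim, "target" if is_target else "non-target")
```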