Nback (#2540)
* Add autodiff save/load functionality
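The save/load functionality added here persists a composition's learned weight matrices to disk. A minimal sketch of the underlying idea, using hypothetical helper names (the actual PsyNeuLink API is the `save()`/`load()` methods on `AutodiffComposition`):

```python
import numpy as np
from pathlib import Path

def save_weights(matrices: dict, path) -> Path:
    """Save a mapping of projection names to weight matrices in one .npz file."""
    path = Path(path)
    np.savez(path, **matrices)
    return path

def load_weights(path) -> dict:
    """Reload the saved matrices as plain numpy arrays, keyed by name."""
    with np.load(Path(path)) as data:
        return {name: data[name] for name in data.files}
```

Keying each matrix by its projection's name is what lets the weights be matched back up with the composition on load.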

* -

* Update autodiffcomposition.py

* Update test_autodiffcomposition.py

* Merge branch 'devel' of https://github.com/PrincetonUniversity/PsyNeuLink into devel

* • Autodiff:
  - add save and load methods (from Samyak)
  - test_autodiffcomposition.py:
    add test_autodiff_saveload, but commented out for now, as it may be causing hanging on PR

* -

* -

* • pytorchcomponents.py:
  - pytorch_function_creator: add SoftMax

• transferfunctions.py:
  - disable changes to ReLU.derivative for now

* • utilities.py:
  - iscompatible:
    attempt to replace try and except, commented out for now

* -

* -

* • autodiffcomposition.py:
  - save and load: augment file and directory handling
  - exclude processing of any ModulatoryProjections

* -

* -

* -

* • autodiffcomposition.py
  save(): add projection.matrix.base = matrix
           (fixes test_autodiff_saveload)

* -

* • autodiffcomposition.py:
  - save: return path
• test_autodiffcomposition.py:
  - test_autodiff_saveload: modify to use current working directory rather than tmp

* • autodiffcomposition.py:
  - save() and load(): ignore CIM, learning, and other modulation-related projections

* • autodiffcomposition.py:
  - load(): change test for path (failing on Windows) from PosixPath to Path

* • autodiffcomposition.py:
  - add _runtime_learning_rate attribute
  - _build_pytorch_representation():
      use _runtime_learning_rate attribute for optimizer if provided in call to learn
      else use learning_rate specified at construction
• compositionrunner.py:
  - assign learning_rate to _runtime_learning_rate attribute if specified in call to learn
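The override rule described in this entry can be sketched as follows (hypothetical helper name; the real mechanism is the `_runtime_learning_rate` attribute consulted by `_build_pytorch_representation()`):

```python
def effective_learning_rate(runtime_lr, construction_lr):
    """A learning rate passed in the call to learn() takes precedence;
    otherwise fall back to the rate given at construction."""
    return construction_lr if runtime_lr is None else runtime_lr
```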

* -

* [skip ci]

* [skip ci]

* [skip ci]
• autodiffcomposition.py:
  load():  add testing for match of matrix shape
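A guard like the one described can be sketched as (hypothetical function, not PsyNeuLink's exact code):

```python
import numpy as np

def check_matrix_shape(name, current, loaded):
    """Refuse to load a weight matrix whose shape differs from the
    matrix currently assigned to the projection."""
    if np.shape(current) != np.shape(loaded):
        raise ValueError(
            f"Shape mismatch for '{name}': expected {np.shape(current)}, "
            f"got {np.shape(loaded)}"
        )
```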

* [skip ci]
• N-back:
  - reset em after each run
  - save and load weights
  - torch epochs = batch size (number training stimuli) * num_epochs

* [skip ci]

* [skip ci]

* Feat/add pathway default matrix (#2518)

* • compositioninterfacemechanism.py:
  - _get_source_node_for_input_CIM:
        restore (modeled on _get_source_of_modulation_for_parameter_CIM) but NEEDS TESTS
  - _get_source_of_modulation_for_parameter_CIM: clean up comments, NEEDS TESTS

* -

* -

* -

* -

* -

* -

* • Nback
  - EM uses ContentAddressableMemory (instead of DictionaryMemory)
  - Implements FFN for comparison of current and retrieved stimulus and context

• Project:
  replace all instances of "RETREIVE" with "RETRIEVE"

* • objectivefunctions.py
  - add cosine_similarity (needs compiled version)
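Cosine similarity is the standard measure being added here; a self-contained sketch of the interpreted (non-compiled) computation:

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine of the angle between vectors a and b: 1 for parallel,
    0 for orthogonal, -1 for opposite directions."""
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))
```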

* • Project: make COSINE_SIMILARITY a synonym of COSINE
• nback_CAM_FFN:
  - refactor to implement FFN and task input
  - assign termination condition for execution that is dependent on control
  - ContentAddressableMemory: selection_function=SoftMax(output=MAX_INDICATOR,
                                                            gain=SOFT_MAX_TEMP)
• DriftOnASphereIntegrator:
  - add dimension as dependency for initializer parameter

* -

* -

* -

* -

* -

* -

* -

* -

* -

* -

* -

* -

* -

* -

* • test_integrator.py:
  Added identicalness test for DriftOnASphereIntegrator against nback-paper implementation.

* -

* -

* Parameters: allow _validate_ methods to reference other parameters (#2512)

* • Scripts:
  - Updated N-back to use objective_mechanism, with commented out code for version that doesn't use it once bug is fixed
  - Deleted N-back_WITH_OBJECTIVE_MECH.py

* • N-back.py:
  - added stimulus generation per nback-paper protocol

* - N-back.py
  tstep(s) -> trial(s)

* -

* -

* • N-back.py
  - comp -> nback_model
  - implement stim_set() method

* -

* • N-back.py:
  - added training set generation

* -

* -

* • N-back.py
  - modularized script

* -

* -

* -

* -

* • showgraph.py:
  - _assign_processing_components(): fix bug in which nested graphs not highlighted in animation.

* • showgraph.py * composition.py
  - add further description of animation, including note that animation of nested Compositions is limited.

* • showgraph.py * composition.py
  - add animation to N-back doc

* • autodiffcomposition.py
  - __init__(): move pathways arg to beginning, to capture positional assignment (i.e. w/o kw)

* -

* • N-back.py
  - ffn: implement as autodiff; still needs small random initial weight assignment

* • pathway.py
  - implement default_projection attribute

* • utilities.py:
  random_matrix:  refactored to allow negative values and use keyword ZERO_CENTER

* • projection.py
  RandomMatrix: added class that can be used to pass a function as matrix spec
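The general pattern of a callable matrix spec is that the projection calls it with the sender and receiver dimensions and uses whatever array comes back. A sketch of that pattern (parameter names here are illustrative, not PsyNeuLink's exact signature):

```python
import numpy as np

class RandomMatrix:
    """Callable matrix spec: when called with the sender and receiver
    sizes, returns a freshly sampled weight matrix."""
    def __init__(self, center=0.0, range=1.0, seed=None):
        self.center = center
        self.range = range
        self._rng = np.random.default_rng(seed)

    def __call__(self, sender_size, receiver_size):
        # Uniform samples in [center - range, center + range)
        return self.center + self.range * (
            2 * self._rng.random((sender_size, receiver_size)) - 1
        )
```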

* • utilities.py
  - RandomMatrix moved here from projection.py

• function.py
  - get_matrix():  added support for RandomMatrix spec

* • port.py
  - _parse_port_spec(): added support for RandomMatrix

* • utilities.py
  - is_matrix(): modified to support random_matrix and RandomMatrix

* • composition.py
  - add_linear_processing_pathway: add support for default_matrix argument
     (replaces default for MappingProjection for any otherwise unspecified projections)
     though still not used.

* -

* - RandomMatrix: moved from Utilities to Function

* -

* [skip ci]

* [skip ci]

* [skip ci]
• N-back.py
  - clean up script

* [skip ci]
• N-back.py
  - further script clean-up

* [skip ci]

* [skip ci]

* [skip ci]

* [skip ci]
• BeukersNBackModel.rst:
  - Overview written
  - Needs other sections completed

* [skip ci]

* [skip ci]

* [skip ci]

* [skip ci]

* [skip ci]

* [skip ci]

* [skip ci]

* [skip ci]
• N-back.py:
  - replace functions of TransferMechanisms with ReLU
  - replace function of Decision Mechanisms with SoftMax
  - more doc cleanup

* [skip ci]

* -

* -

* [skip ci]

* [skip ci]
• composition.py:
  implement default_projection_matrix in add_XXX_pathway() methods

* [skip ci]
• test_composition.py:
  - add test_pathway_tuple_specs()

* -

* -

* [skip ci]

* [skip ci]

* [skip ci]

* -

Co-authored-by: jdcpni <pniintel55>
Co-authored-by: Katherine Mantel <[email protected]>

* Feat/add pathway default matrix (#2519)

* • compositioninterfacemechanism.py:
  - _get_source_node_for_input_CIM:
        restore (modeled on _get_source_of_modulation_for_parameter_CIM) but NEEDS TESTS
  - _get_source_of_modulation_for_parameter_CIM: clean up comments, NEEDS TESTS

* -

* -

* -

* -

* -

* -

* • Nback
  - EM uses ContentAddressableMemory (instead of DictionaryMemory)
  - Implements FFN for comparison of current and retrieved stimulus and context

• Project:
  replace all instances of "RETREIVE" with "RETRIEVE"

* • objectivefunctions.py
  - add cosine_similarity (needs compiled version)

* • Project: make COSINE_SIMILARITY a synonym of COSINE
• nback_CAM_FFN:
  - refactor to implement FFN and task input
  - assign termination condition for execution that is dependent on control
  - ContentAddressableMemory: selection_function=SoftMax(output=MAX_INDICATOR,
                                                            gain=SOFT_MAX_TEMP)
• DriftOnASphereIntegrator:
  - add dimension as dependency for initializer parameter

* -

* -

* -

* -

* -

* -

* -

* -

* -

* -

* -

* -

* -

* -

* • test_integrator.py:
  Added identicalness test for DriftOnASphereIntegrator against nback-paper implementation.

* -

* -

* Parameters: allow _validate_ methods to reference other parameters (#2512)

* • Scripts:
  - Updated N-back to use objective_mechanism, with commented out code for version that doesn't use it once bug is fixed
  - Deleted N-back_WITH_OBJECTIVE_MECH.py

* • N-back.py:
  - added stimulus generation per nback-paper protocol

* - N-back.py
  tstep(s) -> trial(s)

* -

* -

* • N-back.py
  - comp -> nback_model
  - implement stim_set() method

* -

* • N-back.py:
  - added training set generation

* -

* -

* • N-back.py
  - modularized script

* -

* -

* -

* -

* • showgraph.py:
  - _assign_processing_components(): fix bug in which nested graphs not highlighted in animation.

* • showgraph.py * composition.py
  - add further description of animation, including note that animation of nested Compositions is limited.

* • showgraph.py * composition.py
  - add animation to N-back doc

* • autodiffcomposition.py
  - __init__(): move pathways arg to beginning, to capture positional assignment (i.e. w/o kw)

* -

* • N-back.py
  - ffn: implement as autodiff; still needs small random initial weight assignment

* • pathway.py
  - implement default_projection attribute

* • utilities.py:
  random_matrix:  refactored to allow negative values and use keyword ZERO_CENTER

* • projection.py
  RandomMatrix: added class that can be used to pass a function as matrix spec

* • utilities.py
  - RandomMatrix moved here from projection.py

• function.py
  - get_matrix():  added support for RandomMatrix spec

* • port.py
  - _parse_port_spec(): added support for RandomMatrix

* • utilities.py
  - is_matrix(): modified to support random_matrix and RandomMatrix

* • composition.py
  - add_linear_processing_pathway: add support for default_matrix argument
     (replaces default for MappingProjection for any otherwise unspecified projections)
     though still not used.

* -

* - RandomMatrix: moved from Utilities to Function

* -

* [skip ci]

* [skip ci]

* [skip ci]
• N-back.py
  - clean up script

* [skip ci]
• N-back.py
  - further script clean-up

* [skip ci]

* [skip ci]

* [skip ci]

* [skip ci]
• BeukersNBackModel.rst:
  - Overview written
  - Needs other sections completed

* [skip ci]

* [skip ci]

* [skip ci]

* [skip ci]

* [skip ci]

* [skip ci]

* [skip ci]

* [skip ci]
• N-back.py:
  - replace functions of TransferMechanisms with ReLU
  - replace function of Decision Mechanisms with SoftMax
  - more doc cleanup

* [skip ci]

* -

* -

* [skip ci]

* [skip ci]
• composition.py:
  implement default_projection_matrix in add_XXX_pathway() methods

* [skip ci]
• test_composition.py:
  - add test_pathway_tuple_specs()

* -

* -

* [skip ci]

* [skip ci]

* [skip ci]

* [skip ci]

* [skip ci]
• composition.py:
  - add_linear_processing_pathway: fixed bug when Reinforcement or TDLearning are specified

• test_composition.py:
  - test_pathway_tuple_specs:  add tests for Reinforcement and TDLearning

Co-authored-by: jdcpni <pniintel55>
Co-authored-by: Katherine Mantel <[email protected]>

* autodiff: Use most recent context while save/load

* tests/autodiff: Use portable path join

* autodiff: Add assertions for save/load

* [skip ci]

* [skip ci]

* [skip ci]

* [skip ci]

* [skip ci]

* [skip ci]

* [skip ci]

* • autodiffcomposition, test_autodiff_saveload:
  - merged from feat/autodiff_save

* -

* -

* -

* • autodiffcomposition.py
  - fix path assignment bug

* -

* • N-back mods

* • N-back: reimplementing get_run_inputs

* -

* -

* -

* -

* • N-back.py
  - refactoring of generate_stim_seq:
    continuous presentation of stimuli
    balancing of trial_types (with fill-in)
    return trial_type_seq

* [skip ci]

* Merge branch 'devel' of https://github.com/PrincetonUniversity/PsyNeuLink into devel

 Conflicts:
	psyneulink/library/compositions/autodiffcomposition.py

* [skip ci]

* [skip ci]
• N-back.py
  - docstring mods

* [skip ci]

* [skip ci]
• N-back.py:
  add Kane stimuli (2back)

* [skip ci]

* [skip ci]

* [skip ci]

* • N-back.py
  - add analyze_results()

* [skip ci]
• N-back.py
  - add analyze_results()

* [skip ci]
• N-back.py:
  - analyze_results: fully implemented

* [skip ci]

* [skip ci]

* [skip ci]

* [skip ci]

* [skip ci]

* [skip ci]

* [skip ci]
• composition.py, pytorchmodelcreator.py
  - modify loss_spec to use keywords for loss

• Nback.py:
  - Autodiff loss_spec = CROSS_ENTROPY

* [skip ci]
• pytorchmodelcreator.py:
  - _gen_llvm_training_function_body():
    - add support for loss_type = CROSS_ENTROPY

• compiledloss.py:
  - _gen_loss_function:  add support for CROSS_ENTROPYLoss
     - needs to be debugged
     - differential still needs to be implemented
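For reference, the loss being compiled here is the standard cross-entropy between a target distribution and the network's output; a plain-Python sketch (not the LLVM code):

```python
import numpy as np

def cross_entropy_loss(output, target, eps=1e-12):
    """-sum(target * log(output)); eps avoids log(0)."""
    output = np.clip(np.asarray(output, dtype=float), eps, 1.0)
    return float(-np.sum(np.asarray(target, dtype=float) * np.log(output)))
```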

* [skip ci]

* [skip ci]

* [skip ci]
• composition.py:
  - _create_terminal_backprop_learning_components:
    - support loss_function = CROSS_ENTROPY

• combinationfunctions.py:
  - LinearCombination: add CROSS_ENTROPY as operation

* [skip ci]

* [skip ci]

* [skip ci]

* [skip ci]
• transferfunctions.py:
  - ReLU: modified derivative to infer input from output if provided (needed for BackPropagation)
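The trick being used is that ReLU's output has the same sign as its input (for positive gain), so the forward output alone determines which branch of the derivative applies. A sketch with assumed parameter names:

```python
def relu_derivative(input=None, output=None, gain=1.0, leak=0.0):
    """Derivative of (leaky) ReLU, computed from the input or, when only
    the forward output is available during backpropagation, from that."""
    x = input if input is not None else output
    # For gain > 0, output > 0 exactly when input > 0, so the sign of
    # either quantity selects the correct branch.
    return gain if x > 0 else gain * leak
```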

* [skip ci]

* [skip ci]

* [skip ci]

* [skip ci]

* -

* • transferfunctions.py:
  - SoftMax.derivative:  fixes
    NOTE: LLVM needs to be modified accordingly

• test_transfer.py:
  - test_transfer_derivative: modify tests to match changes to SoftMax
    NOTE:  LLVM tests don't pass
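As background for the SoftMax derivative fixes: the Jacobian of softmax can be written entirely in terms of its own output, J[i, j] = s[i] * (delta_ij - s[j]). A minimal sketch:

```python
import numpy as np

def softmax(x):
    e = np.exp(np.asarray(x, dtype=float) - np.max(x))  # shift for stability
    return e / e.sum()

def softmax_jacobian(s):
    """Jacobian of softmax expressed via its output s."""
    s = np.asarray(s, dtype=float)
    return np.diag(s) - np.outer(s, s)
```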

* [skip ci]

* [skip ci]

* [skip ci]
• composition.py
  - docstring mods re: Autodiff

* [skip ci]

* Merge branch 'nback' of https://github.com/PrincetonUniversity/PsyNeuLink into nback
 Conflicts:
	Scripts/Models (Under Development)/Nback/nback.py

• composition.py:
  - run(): try adding Report context for LLVM execution

* [skip ci]
• composition.py
  - add Report for compiled mode

• compiledloss.py:
  - CROSS_ENTROPYLoss:
    _gen_loss_function():
       fixed bug, now runs
    _gen_inject_loss_differential():
      dummy copied from MSELoss -- NEEDS TO BE FIXED

• transferfunctions.py:
  - ReLU: added compiled support for derivative using output

• test_transfer.py:
  - test_transfer_derivative_out:
     test derivatives with output instead of input as arg

* Merge branch 'nback' of https://github.com/PrincetonUniversity/PsyNeuLink into nback
 Conflicts:
	Scripts/Models (Under Development)/Nback/nback.py

• composition.py:
  - run(): try adding Report context for LLVM execution

* [skip ci]

* [skip ci]

* [skip ci]

* [skip ci]
• Merge branch 'nback' of https://github.com/PrincetonUniversity/PsyNeuLink into nback

• composition.py:
  - docstring mods regarding Autodiff learning

* [skip ci]

* • composition.py
  - more docstrings re: Autodiff

* [skip ci]
• composition.py
  - table for learning execution modes

* [skip ci]
• llvm/__init__.py
  - ExecuteMode: add PyTorch as synonym for Python

• autodiffcomposition.py
  - docstring refactoring
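Making one enum name a synonym for another, as described for ExecutionMode, is done in Python by giving two names the same value; the second becomes an alias of the first rather than a distinct member. A sketch (member values are illustrative):

```python
from enum import Enum

class ExecutionMode(Enum):
    Python = 1
    PyTorch = 1   # alias: same value, so PyTorch is a synonym, not a new member
    LLVM = 2
```

A consequence of aliasing (noted later in this log) is that the two names are indistinguishable at runtime: `ExecutionMode.PyTorch is ExecutionMode.Python`.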

* [skip ci]

* [skip ci]
• composition.py, autodiffcomposition.py
  - docstring mods

* [skip ci]
• composition.py, autodiffcomposition.py
  - docstring mods

* [skip ci]
• composition.py, autodiffcomposition.py
  - docstring mods

* [skip ci]

* [skip ci]
• test_transfer.py:
  - get rid of duplicative test

* Merge branch 'devel' of https://github.com/PrincetonUniversity/PsyNeuLink into devel

 Conflicts:
	.github/actions/install-pnl/action.yml
	.github/actions/on-branch/action.yml
	.github/workflows/pnl-ci-docs.yml
	.github/workflows/pnl-ci.yml
	.github/workflows/test-release.yml
	Scripts/Models (Under Development)/N-back.py

* • test_learning.py:
  - test_xor_training_identicalness_standard_composition_vs_PyTorch_vs_LLVM:
      replaces test_xor_training_identicalness_standard_composition_vs_Autodiff

* • learningfunctions.py
  - BackPropagation:
    - fix bug in which derivative for default loss (MSE) was computed using L0
    - add explicit specification for L0 loss

• composition.py:
  - _create_terminal_backprop_learning_components:
    - add explicit assignment of output_port[SUM] for L0 loss

• test_learning.py:
  - test_multilayer:
    - fix bug in which SSE was assigned as loss, but output_port[MSE] was used for objective_mechanism
    - replace with explicit L0 loss and output_port[SUM] for objective_mechanism
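For reference, the three losses being distinguished here, assuming "L0" denotes the simple linear error sum (an assumption about the naming above):

```python
import numpy as np

def loss_values(output, target):
    """Compute the linear, sum-squared, and mean-squared error losses."""
    err = np.asarray(target, dtype=float) - np.asarray(output, dtype=float)
    return {
        "L0":  float(np.sum(err)),       # simple linear error sum (assumed)
        "SSE": float(np.sum(err ** 2)),  # sum of squared errors
        "MSE": float(np.mean(err ** 2)), # mean squared error
    }
```

The bug fixed above was a mismatch between the loss used for the derivative and the one reported by the objective mechanism, which these definitions make concrete: SSE and MSE differ only by a constant factor, but their derivatives differ from L0's.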

* [skip ci]
• learningfunctions.py
  - _create_non_terminal_backprop_learning_components:
    - fixed bug in which loss function for hidden layers was set to MSE rather than simple L0

* [skip ci]
• All tests pass

* [skip ci]
• test_learning.py:
  - test_multilayer_truth():  parameterize test with expected results

* [skip ci]
• test_learning.py:
  - test_multilayer_truth():  test for L0, SSE and MSE

* [skip ci]
• All tests pass

* [skip ci]

* [skip ci]
• keywords.py
  - add Loss enum

• llvm.rst
  - add ExecutionMode

• Project
  - replace MSE, SSE, L0, L1, CROSS_ENTROPY, KL_DIV, NLL and POISSON_NLL with Loss enum members
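A sketch of consolidating the separate loss keywords into one enum, as described (member order and use of `auto()` are assumptions):

```python
from enum import Enum, auto

class Loss(Enum):
    """Single enum replacing the former per-loss string keywords."""
    L0 = auto()
    L1 = auto()
    SSE = auto()
    MSE = auto()
    CROSS_ENTROPY = auto()
    KL_DIV = auto()
    NLL = auto()
    POISSON_NLL = auto()
```

An enum gives exhaustive membership checks and typo safety that bare string keywords lack.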

* [skip ci]
• composition.py:
  - run(): add warning for use of PyTorch with Composition

* [skip ci]

* [skip ci]
• composition.py:
  - run(): commented out warning, as can't distinguish ExecutionMode.PyTorch from ExecutionMode.Python

* -

* • test_learning.py
  - clean up test_xor_training_identicalness_standard_composition_vs_PyTorch_and_LLVM

* • test_learning.py
  - clean up test_xor_training_identicalness_standard_composition_vs_PyTorch_and_LLVM

Co-authored-by: SamKG <[email protected]>
Co-authored-by: Katherine Mantel <[email protected]>
Co-authored-by: jdcpni <pniintel55>
3 people authored Nov 19, 2022
1 parent 89d2561 commit fc13a63
Showing 56 changed files with 2,487 additions and 2,075 deletions.
231 changes: 0 additions & 231 deletions Scripts/Models (Under Development)/N-back/N-back MODULARIZED.py

This file was deleted.
