Feat/add pathway default matrix (#2518)
* compositioninterfacemechanism.py:
  - _get_source_node_for_input_CIM: restore (modeled on _get_source_of_modulation_for_parameter_CIM), but NEEDS TESTS
  - _get_source_of_modulation_for_parameter_CIM: clean up comments; NEEDS TESTS
* N-back:
  - EM uses ContentAddressableMemory (instead of DictionaryMemory)
  - implements FFN for comparison of current and retrieved stimulus and context
* Project: replace all instances of "RETREIVE" with "RETRIEVE"
* objectivefunctions.py: add cosine_similarity (needs compiled version)
* Project: make COSINE_SIMILARITY a synonym of COSINE
* nback_CAM_FFN:
  - refactor to implement FFN and task input
  - assign termination condition for execution that is dependent on control
  - ContentAddressableMemory: selection_function=SoftMax(output=MAX_INDICATOR, gain=SOFT_MAX_TEMP)
* DriftOnASphereIntegrator: add dimension as dependency for initializer parameter
* test_integrator.py: added identicalness test for DriftOnASphereIntegrator against nback-paper implementation
* Parameters: allow _validate_ methods to reference other parameters (#2512)
* Scripts:
  - updated N-back to use objective_mechanism, with commented-out code for the version that doesn't use it once bug is fixed
  - deleted N-back_WITH_OBJECTIVE_MECH.py
* N-back.py:
  - added stimulus generation per nback-paper protocol
  - tstep(s) -> trial(s)
  - comp -> nback_model; implement stim_set() method
  - added training set generation
  - modularized script
* showgraph.py:
  - _assign_processing_components(): fix bug in which nested graphs were not highlighted in animation
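For orientation, the cosine_similarity objective function and the SoftMax(output=MAX_INDICATOR, gain=SOFT_MAX_TEMP) selection function referenced above can be sketched in plain NumPy. This is an illustrative sketch only; PsyNeuLink's actual implementations have different signatures and handle more edge cases:

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine of the angle between vectors a and b (illustrative sketch)."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def softmax_max_indicator(x, gain=1.0):
    """SoftMax with a gain (temperature) parameter that returns a one-hot
    vector at the argmax -- a sketch of SoftMax(output=MAX_INDICATOR, gain=...)."""
    x = np.asarray(x, dtype=float)
    e = np.exp(gain * (x - x.max()))   # subtract max for numerical stability
    probs = e / e.sum()
    return (probs == probs.max()).astype(float)
```

A higher gain sharpens the distribution before the max is taken; with MAX_INDICATOR-style output, only the winning entry survives, which is what makes it usable as a memory-selection function.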
* showgraph.py / composition.py:
  - add further description of animation, including note that animation of nested Compositions is limited
  - add animation to N-back doc
* autodiffcomposition.py:
  - __init__(): move pathways arg to beginning, to capture positional assignment (i.e., w/o kw)
* N-back.py:
  - ffn: implement as autodiff; still needs small random initial weight assignment
* pathway.py: implement default_projection attribute
* utilities.py: random_matrix: refactored to allow negative values and use keyword ZERO_CENTER
* projection.py: RandomMatrix: added class that can be used to pass a function as a matrix spec
* utilities.py / function.py: RandomMatrix moved from projection.py to utilities.py; get_matrix(): added support for RandomMatrix spec
* port.py: _parse_port_spec(): added support for RandomMatrix
* utilities.py: is_matrix(): modified to support random_matrix and RandomMatrix
* composition.py: add_linear_processing_pathway(): add support for default_matrix argument (replaces default for MappingProjection for any otherwise unspecified projections), though still not used
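To make the random_matrix / ZERO_CENTER change above concrete, here is a minimal sketch of what a zero-centered random weight matrix might compute. The function name, parameters, and keyword handling are hypothetical stand-ins; the actual utilities.py implementation may differ:

```python
import numpy as np

def random_matrix(num_rows, num_cols, offset=0.0, scale=1.0,
                  zero_center=True, rng=None):
    """Random connection-weight matrix (illustrative sketch).
    With zero_center=True, entries are drawn uniformly from [-scale, scale]
    (then shifted by offset); otherwise from [0, scale]."""
    rng = np.random.default_rng() if rng is None else rng
    if zero_center:
        # map uniform [0, 1) onto [-1, 1) before scaling
        return offset + scale * (2 * rng.random((num_rows, num_cols)) - 1)
    return offset + scale * rng.random((num_rows, num_cols))
```

Zero-centered initial weights (small positive and negative values) are the usual choice for feedforward networks such as the ffn here, which is why allowing negative values matters.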
* RandomMatrix: moved from Utilities to Function
* N-back.py: clean up script; further script clean-up
* BeukersNBackModel.rst: Overview written; needs other sections completed
* N-back.py:
  - replace functions of TransferMechanisms with ReLU
  - replace function of Decision Mechanism with SoftMax
  - more doc cleanup
* composition.py: implement default_projection_matrix in add_XXX_pathway() methods
* test_composition.py: add test_pathway_tuple_specs()

Co-authored-by: jdcpni <pniintel55>
Co-authored-by: Katherine Mantel <[email protected]>
Showing 19 changed files with 1,051 additions and 253 deletions.
N-Back Model (Beukers et al., 2022)
===================================
`"When Working Memory is Just Working, Not Memory" <https://psyarxiv.com/jtw5p>`_

Overview
--------
This implements a model of the `N-back task <https://en.wikipedia.org/wiki/N-back#Neurobiology_of_n-back_task>`_
described in `Beukers et al. (2022) <https://psyarxiv.com/jtw5p>`_. The model uses a simple implementation of episodic
memory (EM, as a form of content-retrieval memory) to store previous stimuli along with the temporal context in which
they occurred, and a feedforward neural network (FFN) to evaluate whether the current stimulus is a match to the n'th
preceding stimulus (the nback level) retrieved from episodic memory. The temporal context is provided by a randomly
drifting high-dimensional vector that maintains a constant norm (i.e., drifts on a sphere). The FFN is
trained, given an n-back level of *n*, to identify when the current stimulus matches one stored in EM
with a temporal context vector that differs by an amount corresponding to *n* time steps of drift. During n-back
performance, the model encodes the current stimulus and temporal context, retrieves an item from EM that matches the
current stimulus, weighted by the similarity of its temporal context vector (i.e., most recent), and then uses the
FFN to evaluate whether it is an n-back match. The model responds "match" if the FFN detects a match; otherwise, it
either responds "non-match" or, with a fixed probability (hazard rate), uses the current stimulus and temporal
context to retrieve another sample from EM and repeats the evaluation.
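The drifting constant-norm context described above can be sketched as follows. This is a minimal NumPy illustration of drift that is re-projected onto the unit sphere after each step, not PsyNeuLink's actual DriftOnASphereIntegrator (which parameterizes the drift in angular coordinates):

```python
import numpy as np

def drift_on_sphere(context, noise_sd=0.1, rng=None):
    """One step of random drift for a temporal context vector,
    renormalized so its norm stays constant (illustrative sketch)."""
    rng = np.random.default_rng() if rng is None else rng
    drifted = context + rng.normal(0.0, noise_sd, size=context.shape)
    return drifted / np.linalg.norm(drifted)

# Example: a high-dimensional context vector drifting across trials
rng = np.random.default_rng(0)
context = rng.normal(size=64)
context /= np.linalg.norm(context)
for _ in range(5):
    context = drift_on_sphere(context, rng=rng)
```

Because the norm is constant, the distance between two context vectors reflects only how many drift steps separate them, which is what lets the FFN be trained to recognize a gap of exactly *n* steps.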

This model is an example of proposed interactions between working memory (e.g., in neocortex) and episodic memory
(e.g., in hippocampus and/or cerebellum) in the performance of tasks demanding sequential processing and control,
along the lines of models emerging from machine learning that augment the use of recurrent neural networks (e.g., long
short-term memory mechanisms; LSTMs) for active memory and control with an external memory capable of rapid storage
and content-based retrieval, such as the Neural Turing Machine (NTM;
`Graves et al., 2014 <https://arxiv.org/abs/1410.5401>`_), Episodic Planning Networks (EPN;
`Ritter et al., 2020 <https://arxiv.org/abs/2006.03662>`_), and Emergent Symbols through Binding Networks (ESBN;
`Webb et al., 2021 <https://arxiv.org/abs/2012.14601>`_).

The script defines three functions to construct, train, and run the model, respectively:

* construct_model(args):
  takes as arguments parameters used to construct the model; for convenience, defaults are defined toward the top
  of the script (see "Construction parameters").
..
* train_network(args):
  takes as arguments the feedforward neural network Composition (FFN_COMPOSITION) and the number of epochs to train.
  Note: learning_rate is set at construction (and can be specified using LEARNING_RATE under "Training parameters").
..
* run_model():
  takes as arguments the drift rate of the temporal context vector to be applied on each trial,
  and the number of trials to execute, as well as reporting and animation specifications
  (see "Execution parameters").
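The per-trial retrieve-and-evaluate cycle that run_model performs, as described in the Overview, can be sketched as follows. This is a hypothetical simplification; the function and argument names are illustrative, not the script's actual API:

```python
import random

def nback_trial(retrieve_sample, is_nback_match, hazard_rate, max_samples=10):
    """One n-back decision (illustrative sketch): retrieve an item from EM
    and respond 'match' if the FFN judges it an n-back match; otherwise
    respond 'non-match', except that with probability hazard_rate another
    sample is retrieved and re-evaluated."""
    for _ in range(max_samples):
        sample = retrieve_sample()          # EM retrieval, stand-in callable
        if is_nback_match(sample):          # FFN match judgment, stand-in
            return "match"
        if random.random() >= hazard_rate:  # stop resampling
            return "non-match"
    return "non-match"
```

The hazard rate controls how persistently the model keeps resampling episodic memory before committing to a "non-match" response, which is one of the parameters fit to the human data noted below.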

The default parameters are ones that have been fit to empirical data concerning human performance
(taken from `Kane et al., 2007 <https://psycnet.apa.org/record/2007-06096-010?doi=1>`_).

The Model
---------

The model is composed of two `Compositions <Composition>`: an outer one that contains the full model (nback_model),
and an `AutodiffComposition` (ffn), nested within nback_model (see red box in Figure), that implements the
feedforward neural network (ffn).

nback_model
~~~~~~~~~~~

This contains three input Mechanisms (
Both of these are constructed in the construct_model function.
The ffn Composition is trained use

.. _nback_Fig:

.. figure:: _static/N-Back_Model_movie.gif
   :align: left
   :alt: N-Back Model Animation

Training
--------

Execution
---------

Script: :download:`N-back.py <../../Scripts/Models (Under Development)/Beukers_N-Back_2022.py>`

.. Script: :download:`N-back.py <../../psyneulink/library/models/Beukers -Back.py>`