Skip to content
New issue

Have a question about this project? Sign up for a free GitHub account to open an issue and contact its maintainers and the community.

By clicking “Sign up for GitHub”, you agree to our terms of service and privacy statement. We’ll occasionally send you account related emails.

Already on GitHub? Sign in to your account

Nback #2617

Merged
merged 165 commits into from
Apr 2, 2023
Merged

Nback #2617

merged 165 commits into from
Apr 2, 2023

Conversation

jdcpni
Copy link
Collaborator

@jdcpni jdcpni commented Apr 1, 2023

• Project

  • get rid of keywords for MSE, SSE, L0, L1, etc. and rename as Loss.<*>.name
  • NOTE: keyword for CROSS_ENTROPY ('cross-entropy') persists,
    as it is used for DistanceMetrics and LinearCombination.operation

• composition.py, learningfunctions.py

  • loss_function -> loss_spec (for consistency with autodiff)

• outputport.py

  • StandardOutputPorts: fix bug in get_port_dict in which name searched for with "is" rather than "=="

• comparatormechanism.py

  • rename standard_output_port['sum'] as standard_output_port['SUM']
  • NOTE: names of MSE and SSE output_ports use Loss.<*>.name, where as SUM uses SUM.upper() (instead of L0)

  - add "results" dir to path for saving files

• composition.py
  - change CROSS_ENTROPY assignment of LinearCombination operation to 'cross-entropy'
  - get rid of MSE, SSE, etc.
  - rename as Loss.<*>.name
  - StandardOutputPorts: fix bug in get_port_dict in which name searched for with "is" rather than "=="
  - loss_function -> loss_spec (to be consistent with autodiffcomposition.py
• transferfunctions.py:
  - BinomialDistort()
  - Dropout()
  BOTH NEED LLVM IMPLEMENTATION
• utilities.py
  - iscompatible():
    add warnings.simplefilter(action='ignore', category=np.VisibleDeprecationWarning)
    to avoide warnings about ragged arrays
  - docstring mods re: learning

ª report.py
  - remove ReportLearning
  - docstring mods re: learning

ª report.py
  - remove ReportLearning
…Link into feat/binomial_distort

� Conflicts:
�	psyneulink/core/globals/utilities.py
• transferfunctions.py
  BinomialDistort and Dropout: add error messages for LLVM and derivatives
• nback.py, BeukersNBackModel.rst
  - consolidate docstrings
jdcpni added 6 commits April 1, 2023 10:25
  - test_ContentAddressableMemory_simple_distances:
    - add test for empty list in call to c.distances_by_field
    - add tests for [] in field_weights
• Project
  - get rid of keywords for MSE, SSE, L0, L1, etc. and rename as Loss.<*>.name
  - NOTE: keyword for CROSS_ENTROPY ('cross-entropy') persists, as it is used for DistanceMetrics and LinearCombination.operation

• composition.py, learningfunctions.py
  loss_function -> loss_spec (for consistency with autodiff)

• outputport.py
  - StandardOutputPorts: fix bug in get_port_dict in which name searched for with "is" rather than "=="

• comparatormechanism.py
  - rename standard_output_port['sum'] as standard_output_port['SUM']
  - NOTE: names of MSE and SSE output_ports use Loss.<*>.name, where as SUM uses SUM.upper() (instead of L0)
• Project
  - get rid of keywords for MSE, SSE, L0, L1, etc. and rename as Loss.<*>.name
  - NOTE: keyword for CROSS_ENTROPY ('cross-entropy') persists, as it is used for DistanceMetrics and LinearCombination.operation

• composition.py, learningfunctions.py
  loss_function -> loss_spec (for consistency with autodiff)

• outputport.py
  - StandardOutputPorts: fix bug in get_port_dict in which name searched for with "is" rather than "=="

• comparatormechanism.py
  - rename standard_output_port['sum'] as standard_output_port['SUM']
  - NOTE: names of MSE and SSE output_ports use Loss.<*>.name, where as SUM uses SUM.upper() (instead of L0)
Integrated changes from fix/contentaddressablememory_empty_entry
@github-actions
Copy link

github-actions bot commented Apr 1, 2023

This PR causes the following changes to the html docs (ubuntu-latest-3.7-x64):

diff -r docs-base/AutodiffComposition.html docs-head/AutodiffComposition.html
293,294c293,294
< This <a class="reference internal" href="Composition.html#composition-compilation-table"><span class="std std-ref">table</span></a> provides a summary and comparison of these different modes of execution,
< that are described in greater detail below.</p>
---
> These are each described in greater detail below, and summarized in this <a class="reference internal" href="Composition.html#composition-compilation-table"><span class="std std-ref">table</span></a>
> which provides a comparison of the different modes of execution for an AutodiffComposition and standard <a class="reference internal" href="Composition.html"><span class="doc">Composition</span></a>.</p>
325c325
< executing using the <em>Python</em> interpreter (and not PyTorch);  this is so that any modulation can take effect
---
> executed using the <em>Python</em> interpreter (and not PyTorch);  this is so that any modulation can take effect
327a328,341
> <div class="admonition warning">
> <p class="admonition-title">Warning</p>
> <ul class="simple">
> <li><p>Specifying <a class="reference internal" href="LLVM.html#psyneulink.core.llvm.__init__.ExecutionMode.LLVM" title="psyneulink.core.llvm.__init__.ExecutionMode.LLVM"><code class="xref any py py-attr docutils literal notranslate"><span class="pre">ExecutionMode.LLVM</span></code></a> or <a class="reference internal" href="LLVM.html#psyneulink.core.llvm.__init__.ExecutionMode.PyTorch" title="psyneulink.core.llvm.__init__.ExecutionMode.PyTorch"><code class="xref any py py-attr docutils literal notranslate"><span class="pre">ExecutionMode.PyTorch</span></code></a> in the learn() method of a standard
> <a class="reference internal" href="Composition.html"><span class="doc">Composition</span></a> causes an error.</p></li>
> </ul>
> </div>
> <div class="admonition note">
> <p class="admonition-title">Note</p>
> <ul class="simple">
> <li><p>Specifying <a class="reference internal" href="LLVM.html#psyneulink.core.llvm.__init__.ExecutionMode.Python" title="psyneulink.core.llvm.__init__.ExecutionMode.Python"><code class="xref any py py-attr docutils literal notranslate"><span class="pre">ExecutionMode.Python</span></code></a> in the learn() method of a <a class="reference internal" href="#"><span class="doc">AutodiffComposition</span></a> is treated as a
> synonym of <a class="reference internal" href="LLVM.html#psyneulink.core.llvm.__init__.ExecutionMode.PyTorch" title="psyneulink.core.llvm.__init__.ExecutionMode.PyTorch"><code class="xref any py py-attr docutils literal notranslate"><span class="pre">ExecutionMode.PyTorch</span></code></a> (see table).</p></li>
> </ul>
> </div>
415a430,452
> <dt class="sig sig-object py" id="psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.optimizer">
> <span class="sig-name descname"><span class="pre">optimizer</span></span><a class="headerlink" href="#psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.optimizer" title="Permalink to this definition">¶</a></dt>
> <dd><p>the optimizer used for training. Depends on the <strong>optimizer_type</strong>, <strong>learning_rate</strong>, and <strong>weight_decay</strong>
> arguments from initialization.</p>
> <dl class="field-list simple">
> <dt class="field-odd">Type</dt>
> <dd class="field-odd"><p>PyTorch optimizer function</p>
> </dd>
> </dl>
> </dd></dl>
> 
> <dl class="py attribute">
> <dt class="sig sig-object py" id="psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.loss">
> <span class="sig-name descname"><span class="pre">loss</span></span><a class="headerlink" href="#psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.loss" title="Permalink to this definition">¶</a></dt>
> <dd><p>the loss function used for training. Depends on the <strong>loss_spec</strong> argument from initialization.</p>
> <dl class="field-list simple">
> <dt class="field-odd">Type</dt>
> <dd class="field-odd"><p>PyTorch loss function</p>
> </dd>
> </dl>
> </dd></dl>
> 
> <dl class="py attribute">
418c455
< <dd><p>tracks the average for each weight update (i.e. each minibatch)</p>
---
> <dd><p>tracks the average loss after each weight update (i.e. each minibatch) during learning.</p>
427,430c464,466
< <dt class="sig sig-object py" id="psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.optimizer">
< <span class="sig-name descname"><span class="pre">optimizer</span></span><a class="headerlink" href="#psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.optimizer" title="Permalink to this definition">¶</a></dt>
< <dd><p>the optimizer used for training. Depends on the <strong>optimizer_type</strong>, <strong>learning_rate</strong>, and <strong>weight_decay</strong>
< arguments from initialization.</p>
---
> <dt class="sig sig-object py" id="psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.last_saved_weights">
> <span class="sig-name descname"><span class="pre">last_saved_weights</span></span><a class="headerlink" href="#psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.last_saved_weights" title="Permalink to this definition">¶</a></dt>
> <dd><p>path for file to which weights were last saved.</p>
433c469
< <dd class="field-odd"><p>PyTorch optimizer function</p>
---
> <dd class="field-odd"><p>path</p>
439,441c475,477
< <dt class="sig sig-object py" id="psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.loss">
< <span class="sig-name descname"><span class="pre">loss</span></span><a class="headerlink" href="#psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.loss" title="Permalink to this definition">¶</a></dt>
< <dd><p>the loss function used for training. Depends on the <strong>loss_spec</strong> argument from initialization.</p>
---
> <dt class="sig sig-object py" id="psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.last_loaded_weights">
> <span class="sig-name descname"><span class="pre">last_loaded_weights</span></span><a class="headerlink" href="#psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.last_loaded_weights" title="Permalink to this definition">¶</a></dt>
> <dd><p>path for file from which weights were last loaded.</p>
444c480
< <dd class="field-odd"><p>PyTorch loss function</p>
---
> <dd class="field-odd"><p>path</p>
457c493
< <dd><p>Updates parameters based on trials run since last update.</p>
---
> <dd><p>Updates parameters (weights) based on trials run since last update.</p>
636c672
< <dd><p>Loads all weights matrices for all MappingProjections in the AutodiffComposition from file
---
> <dd><p>Loads all weight matrices for all MappingProjections in the AutodiffComposition from file
diff -r docs-base/BasicsAndPrimer.html docs-head/BasicsAndPrimer.html
259c259
< <span id="basicsandprimer-grandview-figure"></span><img alt="_static/BasicsAndPrimer_GrandView_fig.svg" src="_static/BasicsAndPrimer_GrandView_fig.svg" /><figcaption>
---
> <span id="basicsandprimer-grandview-figure"></span><img alt="_images/BasicsAndPrimer_GrandView_fig.svg" src="_images/BasicsAndPrimer_GrandView_fig.svg" /><figcaption>
269,270c269,270
< <code class="xref any docutils literal notranslate"><span class="pre">tutorial</span></code> provides additional introductory material for those who are newer to computational modeling, as well as a
< more detailed and comprehensive introduction to the use of PsyNeuLink.</p>
---
> <a class="reference internal" href="index_logo_with_text.html#tutorial"><span class="std std-ref">tutorial</span></a> provides additional introductory material for those who are newer to computational modeling,
> as well as a more detailed and comprehensive introduction to the use of PsyNeuLink.</p>
diff -r docs-base/BeukersNBackModel.html docs-head/BeukersNBackModel.html
13c13
<   <title>N-Back Model (Beukers et al., 2022) &mdash; PsyNeuLink 0.0.0.0 documentation</title>
---
>   <title>Nback Model &mdash; PsyNeuLink 0.0.0.0 documentation</title>
35c35,37
<     <link rel="search" title="Search" href="search.html" /> 
---
>     <link rel="search" title="Search" href="search.html" />
>     <link rel="next" title="Contributors Guide" href="ContributorsGuide.html" />
>     <link rel="prev" title="LVOC Model of Stroop XOR Task (Bustamante et al. 2017)" href="BustamanteStroopXORLVOCModel.html" /> 
129c131
<               <ul>
---
>               <ul class="current">
134c136
< <li class="toctree-l1"><a class="reference internal" href="Library.html">Library</a></li>
---
> <li class="toctree-l1 current"><a class="reference internal" href="Library.html">Library</a></li>
177c179,183
<       <li>N-Back Model (Beukers et al., 2022)</li>
---
>           <li><a href="Library.html">Library</a> &gt;</li>
>         
>           <li><a href="Models.html">Models</a> &gt;</li>
>         
>       <li>Nback Model</li>
200,268c206,213
<   <section id="n-back-model-beukers-et-al-2022">
< <h1>N-Back Model (Beukers et al., 2022)<a class="headerlink" href="#n-back-model-beukers-et-al-2022" title="Permalink to this headline">¶</a></h1>
< <p><a class="reference external" href="https://psyarxiv.com/jtw5p">“When Working Memory is Just Working, Not Memory”</a></p>
< <section id="overview">
< <h2>Overview<a class="headerlink" href="#overview" title="Permalink to this headline">¶</a></h2>
< <p>This implements a model of the <a class="reference external" href="https://en.wikipedia.org/wiki/N-back#Neurobiology_of_n-back_task">N-back task</a>
< described in <a class="reference external" href="https://psyarxiv.com/jtw5p">Beukers et al. (2022)</a>.  The model uses a simple implementation of episodic
< memory (EM, as a form of content-retrieval memory) to store previous stimuli along with the temporal context in which
< they occured, and a feedforward neural network (FFN)to evaluate whether the current stimulus is a match to the n’th
< preceding stimulus (nback-level)retrieved from episodic memory.  The temporal context is provided by a randomly
< drifting high dimensional vector that maintains a constant norm (i.e., drifts on a sphere).  The FFN is
< trained, given an n-back level of <em>n</em>, to identify when the current stimulus matches one stored in EM
< with a temporal context vector that differs by an amount corresponding to <em>n</em> time steps of drift.  During n-back
< performance, the model encodes the current stimulus and temporal context, retrieves an item from EM that matches the
< current stimulus, weighted by the similarity of its temporal context vector (i.e., most recent), and then uses the
< FFN to evaluate whether it is an n-back match.  The model responds “match” if the FFN detects a match; otherwise, it
< either responds “non-match” or, with a fixed probability (hazard rate), it uses the current stimulus and temporal
< context to retrieve another sample from EM and repeat the evaluation.</p>
< <p>This model is an example of proposed interactions between working memory (e.g., in neocortex) and episodic memory
< e.g., in hippocampus and/or cerebellum) in the performance of tasks demanding of sequential processing and control,
< and along the lines of models emerging machine learning that augment the use of recurrent neural networks (e.g., long
< short-term memory mechanisms; LSTMs) for active memory and control with an external memory capable of rapid storage
< and content-based retrieval, such as the Neural Turing Machine (NTN;
< <a class="reference external" href="https://arxiv.org/abs/1410.5401">Graves et al., 2016</a>), Episodic Planning Networks (EPN;
< <a class="reference external" href="https://arxiv.org/abs/2006.03662">Ritter et al., 2020</a>), and Emergent Symbols through Binding Networks (ESBN;
< <a class="reference external" href="https://arxiv.org/abs/2012.14601">Webb et al., 2021</a>).</p>
< <p>The script respectively, to construct, train and run the model:</p>
< <ul class="simple">
< <li><p>construct_model(args):
< takes as arguments parameters used to construct the model; for convenience, defaults are defined toward the top
< of the script (see “Construction parameters”).</p></li>
< </ul>
< <ul class="simple">
< <li><p>train_network(args)
< takes as arguments the feedforward neural network Composition (FFN_COMPOSITION) and number of epochs to train.
< Note: learning_rate is set at construction (which can be specified using LEARNING_RATE under “Training parameters”).</p></li>
< </ul>
< <ul class="simple">
< <li><p>run_model()
< takes as arguments the drift rate in the temporal context vector to be applied on each trial,
< and the number of trials to execute, as well as reporting and animation specifications
< (see “Execution parameters”).</p></li>
< </ul>
< <p>The default parameters are ones that have been fit to empirical data concerning human performance
< (taken from <a class="reference external" href="https://psycnet.apa.org/record/2007-06096-010?doi=1">Kane et al., 2007</a>).</p>
< </section>
< <section id="the-model">
< <h2>The Model<a class="headerlink" href="#the-model" title="Permalink to this headline">¶</a></h2>
< <p>The models is composed of two <a class="reference internal" href="Composition.html"><span class="doc">Compositions</span></a>: an outer one that contains the full model (nback_model),
< and an <a class="reference internal" href="AutodiffComposition.html"><span class="doc">AutodiffComposition</span></a> (ffn), nested within nback_model (see red box in Figure), that implements the
< feedforward neural network (ffn).</p>
< <section id="nback-model">
< <h3>nback_model<a class="headerlink" href="#nback-model" title="Permalink to this headline">¶</a></h3>
< <p>This contains three input Mechanisms (</p>
< <p>Both of these are constructed in the construct_model function.
< The ffn Composition is trained use</p>
< <figure class="align-left" id="nback-fig">
< <img alt="N-Back Model Animation" src="_images/N-Back_Model_movie.gif" />
< </figure>
< </section>
< </section>
< <section id="training">
< <h2>Training<a class="headerlink" href="#training" title="Permalink to this headline">¶</a></h2>
< </section>
< <section id="execution">
< <h2>Execution<a class="headerlink" href="#execution" title="Permalink to this headline">¶</a></h2>
< <p>Script: <code class="xref download docutils literal notranslate"><span class="pre">N-back.py</span></code>
< .. Script: <code class="xref download docutils literal notranslate"><span class="pre">N-back.py</span></code></p>
< </section>
---
>   <section id="nback-model">
> <h1>Nback Model<a class="headerlink" href="#nback-model" title="Permalink to this headline">¶</a></h1>
> <p><strong>“When Working Memory is Just Working, Not Memory”</strong> (<a class="reference external" href="https://psyarxiv.com/jtw5p">Beukers et al., 2022</a>)</p>
> <p>Script: <code class="xref download docutils literal notranslate"><span class="pre">nback.py</span></code></p>
> <p>Example of Jupyter Notebook: <a class="reference internal" href="BeukersNBackModel_NB.html"><span class="doc">Nback Model Notebook</span></a></p>
> <p>(Instructions for running the actual notebook can be found <a class="reference internal" href="index_logo_with_text.html#tutorial"><span class="std std-ref">here</span></a>, replacing <code class="docutils literal notranslate"><span class="pre">nback_nb</span></code> for <code class="docutils literal notranslate"><span class="pre">tutorial</span></code>)</p>
> <div class="toctree-wrapper compound">
> </div>
276a222,230
>     <div class="rst-footer-buttons" role="navigation" aria-label="footer navigation">
>       
>         <a href="ContributorsGuide.html" class="btn btn-neutral float-right" title="Contributors Guide" accesskey="n" rel="next">Next <img src="_static/images/chevron-right-orange.svg" class="next-page"></a>
>       
>       
>         <a href="BustamanteStroopXORLVOCModel.html" class="btn btn-neutral" title="LVOC Model of Stroop XOR Task (Bustamante et al. 2017)" accesskey="p" rel="prev"><img src="_static/images/chevron-right-orange.svg" class="previous-page"> Previous</a>
>       
>     </div>
>   
305,314c259
< <li><a class="reference internal" href="#">N-Back Model (Beukers et al., 2022)</a><ul>
< <li><a class="reference internal" href="#overview">Overview</a></li>
< <li><a cla
...

See CI logs for the full diff.

Copy link

@github-advanced-security github-advanced-security bot left a comment

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

CodeQL found more than 10 potential problems in the proposed changes. Check the Files changed tab for more details.

@github-actions
Copy link

github-actions bot commented Apr 1, 2023

This PR causes the following changes to the html docs (ubuntu-latest-3.7-x64):

diff -r docs-base/AutodiffComposition.html docs-head/AutodiffComposition.html
293,294c293,294
< This <a class="reference internal" href="Composition.html#composition-compilation-table"><span class="std std-ref">table</span></a> provides a summary and comparison of these different modes of execution,
< that are described in greater detail below.</p>
---
> These are each described in greater detail below, and summarized in this <a class="reference internal" href="Composition.html#composition-compilation-table"><span class="std std-ref">table</span></a>
> which provides a comparison of the different modes of execution for an AutodiffComposition and standard <a class="reference internal" href="Composition.html"><span class="doc">Composition</span></a>.</p>
325c325
< executing using the <em>Python</em> interpreter (and not PyTorch);  this is so that any modulation can take effect
---
> executed using the <em>Python</em> interpreter (and not PyTorch);  this is so that any modulation can take effect
327a328,341
> <div class="admonition warning">
> <p class="admonition-title">Warning</p>
> <ul class="simple">
> <li><p>Specifying <a class="reference internal" href="LLVM.html#psyneulink.core.llvm.__init__.ExecutionMode.LLVM" title="psyneulink.core.llvm.__init__.ExecutionMode.LLVM"><code class="xref any py py-attr docutils literal notranslate"><span class="pre">ExecutionMode.LLVM</span></code></a> or <a class="reference internal" href="LLVM.html#psyneulink.core.llvm.__init__.ExecutionMode.PyTorch" title="psyneulink.core.llvm.__init__.ExecutionMode.PyTorch"><code class="xref any py py-attr docutils literal notranslate"><span class="pre">ExecutionMode.PyTorch</span></code></a> in the learn() method of a standard
> <a class="reference internal" href="Composition.html"><span class="doc">Composition</span></a> causes an error.</p></li>
> </ul>
> </div>
> <div class="admonition note">
> <p class="admonition-title">Note</p>
> <ul class="simple">
> <li><p>Specifying <a class="reference internal" href="LLVM.html#psyneulink.core.llvm.__init__.ExecutionMode.Python" title="psyneulink.core.llvm.__init__.ExecutionMode.Python"><code class="xref any py py-attr docutils literal notranslate"><span class="pre">ExecutionMode.Python</span></code></a> in the learn() method of a <a class="reference internal" href="#"><span class="doc">AutodiffComposition</span></a> is treated as a
> synonym of <a class="reference internal" href="LLVM.html#psyneulink.core.llvm.__init__.ExecutionMode.PyTorch" title="psyneulink.core.llvm.__init__.ExecutionMode.PyTorch"><code class="xref any py py-attr docutils literal notranslate"><span class="pre">ExecutionMode.PyTorch</span></code></a> (see table).</p></li>
> </ul>
> </div>
415a430,452
> <dt class="sig sig-object py" id="psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.optimizer">
> <span class="sig-name descname"><span class="pre">optimizer</span></span><a class="headerlink" href="#psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.optimizer" title="Permalink to this definition">¶</a></dt>
> <dd><p>the optimizer used for training. Depends on the <strong>optimizer_type</strong>, <strong>learning_rate</strong>, and <strong>weight_decay</strong>
> arguments from initialization.</p>
> <dl class="field-list simple">
> <dt class="field-odd">Type</dt>
> <dd class="field-odd"><p>PyTorch optimizer function</p>
> </dd>
> </dl>
> </dd></dl>
> 
> <dl class="py attribute">
> <dt class="sig sig-object py" id="psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.loss">
> <span class="sig-name descname"><span class="pre">loss</span></span><a class="headerlink" href="#psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.loss" title="Permalink to this definition">¶</a></dt>
> <dd><p>the loss function used for training. Depends on the <strong>loss_spec</strong> argument from initialization.</p>
> <dl class="field-list simple">
> <dt class="field-odd">Type</dt>
> <dd class="field-odd"><p>PyTorch loss function</p>
> </dd>
> </dl>
> </dd></dl>
> 
> <dl class="py attribute">
418c455
< <dd><p>tracks the average for each weight update (i.e. each minibatch)</p>
---
> <dd><p>tracks the average loss after each weight update (i.e. each minibatch) during learning.</p>
427,430c464,466
< <dt class="sig sig-object py" id="psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.optimizer">
< <span class="sig-name descname"><span class="pre">optimizer</span></span><a class="headerlink" href="#psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.optimizer" title="Permalink to this definition">¶</a></dt>
< <dd><p>the optimizer used for training. Depends on the <strong>optimizer_type</strong>, <strong>learning_rate</strong>, and <strong>weight_decay</strong>
< arguments from initialization.</p>
---
> <dt class="sig sig-object py" id="psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.last_saved_weights">
> <span class="sig-name descname"><span class="pre">last_saved_weights</span></span><a class="headerlink" href="#psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.last_saved_weights" title="Permalink to this definition">¶</a></dt>
> <dd><p>path for file to which weights were last saved.</p>
433c469
< <dd class="field-odd"><p>PyTorch optimizer function</p>
---
> <dd class="field-odd"><p>path</p>
439,441c475,477
< <dt class="sig sig-object py" id="psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.loss">
< <span class="sig-name descname"><span class="pre">loss</span></span><a class="headerlink" href="#psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.loss" title="Permalink to this definition">¶</a></dt>
< <dd><p>the loss function used for training. Depends on the <strong>loss_spec</strong> argument from initialization.</p>
---
> <dt class="sig sig-object py" id="psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.last_loaded_weights">
> <span class="sig-name descname"><span class="pre">last_loaded_weights</span></span><a class="headerlink" href="#psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.last_loaded_weights" title="Permalink to this definition">¶</a></dt>
> <dd><p>path for file from which weights were last loaded.</p>
444c480
< <dd class="field-odd"><p>PyTorch loss function</p>
---
> <dd class="field-odd"><p>path</p>
457c493
< <dd><p>Updates parameters based on trials run since last update.</p>
---
> <dd><p>Updates parameters (weights) based on trials run since last update.</p>
636c672
< <dd><p>Loads all weights matrices for all MappingProjections in the AutodiffComposition from file
---
> <dd><p>Loads all weight matrices for all MappingProjections in the AutodiffComposition from file
diff -r docs-base/BasicsAndPrimer.html docs-head/BasicsAndPrimer.html
259c259
< <span id="basicsandprimer-grandview-figure"></span><img alt="_static/BasicsAndPrimer_GrandView_fig.svg" src="_static/BasicsAndPrimer_GrandView_fig.svg" /><figcaption>
---
> <span id="basicsandprimer-grandview-figure"></span><img alt="_images/BasicsAndPrimer_GrandView_fig.svg" src="_images/BasicsAndPrimer_GrandView_fig.svg" /><figcaption>
269,270c269,270
< <code class="xref any docutils literal notranslate"><span class="pre">tutorial</span></code> provides additional introductory material for those who are newer to computational modeling, as well as a
< more detailed and comprehensive introduction to the use of PsyNeuLink.</p>
---
> <a class="reference internal" href="index_logo_with_text.html#tutorial"><span class="std std-ref">tutorial</span></a> provides additional introductory material for those who are newer to computational modeling,
> as well as a more detailed and comprehensive introduction to the use of PsyNeuLink.</p>
diff -r docs-base/BeukersNBackModel.html docs-head/BeukersNBackModel.html
13c13
<   <title>N-Back Model (Beukers et al., 2022) &mdash; PsyNeuLink 0.0.0.0 documentation</title>
---
>   <title>Nback Model &mdash; PsyNeuLink 0.0.0.0 documentation</title>
35c35,37
<     <link rel="search" title="Search" href="search.html" /> 
---
>     <link rel="search" title="Search" href="search.html" />
>     <link rel="next" title="Contributors Guide" href="ContributorsGuide.html" />
>     <link rel="prev" title="LVOC Model of Stroop XOR Task (Bustamante et al. 2017)" href="BustamanteStroopXORLVOCModel.html" /> 
129c131
<               <ul>
---
>               <ul class="current">
134c136
< <li class="toctree-l1"><a class="reference internal" href="Library.html">Library</a></li>
---
> <li class="toctree-l1 current"><a class="reference internal" href="Library.html">Library</a></li>
177c179,183
<       <li>N-Back Model (Beukers et al., 2022)</li>
---
>           <li><a href="Library.html">Library</a> &gt;</li>
>         
>           <li><a href="Models.html">Models</a> &gt;</li>
>         
>       <li>Nback Model</li>
200,268c206,213
<   <section id="n-back-model-beukers-et-al-2022">
< <h1>N-Back Model (Beukers et al., 2022)<a class="headerlink" href="#n-back-model-beukers-et-al-2022" title="Permalink to this headline">¶</a></h1>
< <p><a class="reference external" href="https://psyarxiv.com/jtw5p">“When Working Memory is Just Working, Not Memory”</a></p>
< <section id="overview">
< <h2>Overview<a class="headerlink" href="#overview" title="Permalink to this headline">¶</a></h2>
< <p>This implements a model of the <a class="reference external" href="https://en.wikipedia.org/wiki/N-back#Neurobiology_of_n-back_task">N-back task</a>
< described in <a class="reference external" href="https://psyarxiv.com/jtw5p">Beukers et al. (2022)</a>.  The model uses a simple implementation of episodic
< memory (EM, as a form of content-retrieval memory) to store previous stimuli along with the temporal context in which
< they occured, and a feedforward neural network (FFN)to evaluate whether the current stimulus is a match to the n’th
< preceding stimulus (nback-level)retrieved from episodic memory.  The temporal context is provided by a randomly
< drifting high dimensional vector that maintains a constant norm (i.e., drifts on a sphere).  The FFN is
< trained, given an n-back level of <em>n</em>, to identify when the current stimulus matches one stored in EM
< with a temporal context vector that differs by an amount corresponding to <em>n</em> time steps of drift.  During n-back
< performance, the model encodes the current stimulus and temporal context, retrieves an item from EM that matches the
< current stimulus, weighted by the similarity of its temporal context vector (i.e., most recent), and then uses the
< FFN to evaluate whether it is an n-back match.  The model responds “match” if the FFN detects a match; otherwise, it
< either responds “non-match” or, with a fixed probability (hazard rate), it uses the current stimulus and temporal
< context to retrieve another sample from EM and repeat the evaluation.</p>
< <p>This model is an example of proposed interactions between working memory (e.g., in neocortex) and episodic memory
< e.g., in hippocampus and/or cerebellum) in the performance of tasks demanding of sequential processing and control,
< and along the lines of models emerging machine learning that augment the use of recurrent neural networks (e.g., long
< short-term memory mechanisms; LSTMs) for active memory and control with an external memory capable of rapid storage
< and content-based retrieval, such as the Neural Turing Machine (NTN;
< <a class="reference external" href="https://arxiv.org/abs/1410.5401">Graves et al., 2016</a>), Episodic Planning Networks (EPN;
< <a class="reference external" href="https://arxiv.org/abs/2006.03662">Ritter et al., 2020</a>), and Emergent Symbols through Binding Networks (ESBN;
< <a class="reference external" href="https://arxiv.org/abs/2012.14601">Webb et al., 2021</a>).</p>
< <p>The script respectively, to construct, train and run the model:</p>
< <ul class="simple">
< <li><p>construct_model(args):
< takes as arguments parameters used to construct the model; for convenience, defaults are defined toward the top
< of the script (see “Construction parameters”).</p></li>
< </ul>
< <ul class="simple">
< <li><p>train_network(args)
< takes as arguments the feedforward neural network Composition (FFN_COMPOSITION) and number of epochs to train.
< Note: learning_rate is set at construction (which can be specified using LEARNING_RATE under “Training parameters”).</p></li>
< </ul>
< <ul class="simple">
< <li><p>run_model()
< takes as arguments the drift rate in the temporal context vector to be applied on each trial,
< and the number of trials to execute, as well as reporting and animation specifications
< (see “Execution parameters”).</p></li>
< </ul>
< <p>The default parameters are ones that have been fit to empirical data concerning human performance
< (taken from <a class="reference external" href="https://psycnet.apa.org/record/2007-06096-010?doi=1">Kane et al., 2007</a>).</p>
< </section>
< <section id="the-model">
< <h2>The Model<a class="headerlink" href="#the-model" title="Permalink to this headline">¶</a></h2>
< <p>The models is composed of two <a class="reference internal" href="Composition.html"><span class="doc">Compositions</span></a>: an outer one that contains the full model (nback_model),
< and an <a class="reference internal" href="AutodiffComposition.html"><span class="doc">AutodiffComposition</span></a> (ffn), nested within nback_model (see red box in Figure), that implements the
< feedforward neural network (ffn).</p>
< <section id="nback-model">
< <h3>nback_model<a class="headerlink" href="#nback-model" title="Permalink to this headline">¶</a></h3>
< <p>This contains three input Mechanisms (</p>
< <p>Both of these are constructed in the construct_model function.
< The ffn Composition is trained use</p>
< <figure class="align-left" id="nback-fig">
< <img alt="N-Back Model Animation" src="_images/N-Back_Model_movie.gif" />
< </figure>
< </section>
< </section>
< <section id="training">
< <h2>Training<a class="headerlink" href="#training" title="Permalink to this headline">¶</a></h2>
< </section>
< <section id="execution">
< <h2>Execution<a class="headerlink" href="#execution" title="Permalink to this headline">¶</a></h2>
< <p>Script: <code class="xref download docutils literal notranslate"><span class="pre">N-back.py</span></code>
< .. Script: <code class="xref download docutils literal notranslate"><span class="pre">N-back.py</span></code></p>
< </section>
---
>   <section id="nback-model">
> <h1>Nback Model<a class="headerlink" href="#nback-model" title="Permalink to this headline">¶</a></h1>
> <p><strong>“When Working Memory is Just Working, Not Memory”</strong> (<a class="reference external" href="https://psyarxiv.com/jtw5p">Beukers et al., 2022</a>)</p>
> <p>Script: <a class="reference download internal" download="" href="_downloads/3ca6d8ab14ce9c785a6b5c1b1d60a955/nback.py"><code class="xref download docutils literal notranslate"><span class="pre">nback.py</span></code></a></p>
> <p>Example of Jupyter Notebook: <a class="reference internal" href="BeukersNBackModel_NB.html"><span class="doc">Nback Model Notebook</span></a></p>
> <p>(Instructions for running the actual notebook can be found <a class="reference internal" href="index_logo_with_text.html#tutorial"><span class="std std-ref">here</span></a>, replacing <code class="docutils literal notranslate"><span class="pre">nback_nb</span></code> for <code class="docutils literal notranslate"><span class="pre">tutorial</span></code>)</p>
> <div class="toctree-wrapper compound">
> </div>
276a222,230
>     <div class="rst-footer-buttons" role="navigation" aria-label="footer navigation">
>       
>         <a href="ContributorsGuide.html" class="btn btn-neutral float-right" title="Contributors Guide" accesskey="n" rel="next">Next <img src="_static/images/chevron-right-orange.svg" class="next-page"></a>
>       
>       
>         <a href="BustamanteStroopXORLVOCModel.html" class="btn btn-neutral" title="LVOC Model of Stroop XOR Task (Bustamante et al. 2017)" accesskey="p" rel="prev"><img src="_static/images/chevron-right-orange.svg" class="previous-page"> Previous</a>
>       
>     </div>
>   
305,314c259
< <li><a class="reference internal" href="#">N-Back Model
...

See CI logs for the full diff.

@github-actions
Copy link

github-actions bot commented Apr 1, 2023

This PR causes the following changes to the html docs (ubuntu-latest-3.7-x64):

diff -r docs-base/AutodiffComposition.html docs-head/AutodiffComposition.html
293,294c293,294
< This <a class="reference internal" href="Composition.html#composition-compilation-table"><span class="std std-ref">table</span></a> provides a summary and comparison of these different modes of execution,
< that are described in greater detail below.</p>
---
> These are each described in greater detail below, and summarized in this <a class="reference internal" href="Composition.html#composition-compilation-table"><span class="std std-ref">table</span></a>
> which provides a comparison of the different modes of execution for an AutodiffComposition and standard <a class="reference internal" href="Composition.html"><span class="doc">Composition</span></a>.</p>
325c325
< executing using the <em>Python</em> interpreter (and not PyTorch);  this is so that any modulation can take effect
---
> executed using the <em>Python</em> interpreter (and not PyTorch);  this is so that any modulation can take effect
327a328,341
> <div class="admonition warning">
> <p class="admonition-title">Warning</p>
> <ul class="simple">
> <li><p>Specifying <a class="reference internal" href="LLVM.html#psyneulink.core.llvm.__init__.ExecutionMode.LLVM" title="psyneulink.core.llvm.__init__.ExecutionMode.LLVM"><code class="xref any py py-attr docutils literal notranslate"><span class="pre">ExecutionMode.LLVM</span></code></a> or <a class="reference internal" href="LLVM.html#psyneulink.core.llvm.__init__.ExecutionMode.PyTorch" title="psyneulink.core.llvm.__init__.ExecutionMode.PyTorch"><code class="xref any py py-attr docutils literal notranslate"><span class="pre">ExecutionMode.PyTorch</span></code></a> in the learn() method of a standard
> <a class="reference internal" href="Composition.html"><span class="doc">Composition</span></a> causes an error.</p></li>
> </ul>
> </div>
> <div class="admonition note">
> <p class="admonition-title">Note</p>
> <ul class="simple">
> <li><p>Specifying <a class="reference internal" href="LLVM.html#psyneulink.core.llvm.__init__.ExecutionMode.Python" title="psyneulink.core.llvm.__init__.ExecutionMode.Python"><code class="xref any py py-attr docutils literal notranslate"><span class="pre">ExecutionMode.Python</span></code></a> in the learn() method of a <a class="reference internal" href="#"><span class="doc">AutodiffComposition</span></a> is treated as a
> synonym of <a class="reference internal" href="LLVM.html#psyneulink.core.llvm.__init__.ExecutionMode.PyTorch" title="psyneulink.core.llvm.__init__.ExecutionMode.PyTorch"><code class="xref any py py-attr docutils literal notranslate"><span class="pre">ExecutionMode.PyTorch</span></code></a> (see table).</p></li>
> </ul>
> </div>
415a430,452
> <dt class="sig sig-object py" id="psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.optimizer">
> <span class="sig-name descname"><span class="pre">optimizer</span></span><a class="headerlink" href="#psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.optimizer" title="Permalink to this definition">¶</a></dt>
> <dd><p>the optimizer used for training. Depends on the <strong>optimizer_type</strong>, <strong>learning_rate</strong>, and <strong>weight_decay</strong>
> arguments from initialization.</p>
> <dl class="field-list simple">
> <dt class="field-odd">Type</dt>
> <dd class="field-odd"><p>PyTorch optimizer function</p>
> </dd>
> </dl>
> </dd></dl>
> 
> <dl class="py attribute">
> <dt class="sig sig-object py" id="psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.loss">
> <span class="sig-name descname"><span class="pre">loss</span></span><a class="headerlink" href="#psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.loss" title="Permalink to this definition">¶</a></dt>
> <dd><p>the loss function used for training. Depends on the <strong>loss_spec</strong> argument from initialization.</p>
> <dl class="field-list simple">
> <dt class="field-odd">Type</dt>
> <dd class="field-odd"><p>PyTorch loss function</p>
> </dd>
> </dl>
> </dd></dl>
> 
> <dl class="py attribute">
418c455
< <dd><p>tracks the average for each weight update (i.e. each minibatch)</p>
---
> <dd><p>tracks the average loss after each weight update (i.e. each minibatch) during learning.</p>
427,430c464,466
< <dt class="sig sig-object py" id="psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.optimizer">
< <span class="sig-name descname"><span class="pre">optimizer</span></span><a class="headerlink" href="#psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.optimizer" title="Permalink to this definition">¶</a></dt>
< <dd><p>the optimizer used for training. Depends on the <strong>optimizer_type</strong>, <strong>learning_rate</strong>, and <strong>weight_decay</strong>
< arguments from initialization.</p>
---
> <dt class="sig sig-object py" id="psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.last_saved_weights">
> <span class="sig-name descname"><span class="pre">last_saved_weights</span></span><a class="headerlink" href="#psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.last_saved_weights" title="Permalink to this definition">¶</a></dt>
> <dd><p>path for file to which weights were last saved.</p>
433c469
< <dd class="field-odd"><p>PyTorch optimizer function</p>
---
> <dd class="field-odd"><p>path</p>
439,441c475,477
< <dt class="sig sig-object py" id="psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.loss">
< <span class="sig-name descname"><span class="pre">loss</span></span><a class="headerlink" href="#psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.loss" title="Permalink to this definition">¶</a></dt>
< <dd><p>the loss function used for training. Depends on the <strong>loss_spec</strong> argument from initialization.</p>
---
> <dt class="sig sig-object py" id="psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.last_loaded_weights">
> <span class="sig-name descname"><span class="pre">last_loaded_weights</span></span><a class="headerlink" href="#psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.last_loaded_weights" title="Permalink to this definition">¶</a></dt>
> <dd><p>path for file from which weights were last loaded.</p>
444c480
< <dd class="field-odd"><p>PyTorch loss function</p>
---
> <dd class="field-odd"><p>path</p>
457c493
< <dd><p>Updates parameters based on trials run since last update.</p>
---
> <dd><p>Updates parameters (weights) based on trials run since last update.</p>
636c672
< <dd><p>Loads all weights matrices for all MappingProjections in the AutodiffComposition from file
---
> <dd><p>Loads all weight matrices for all MappingProjections in the AutodiffComposition from file
diff -r docs-base/BasicsAndPrimer.html docs-head/BasicsAndPrimer.html
259c259
< <span id="basicsandprimer-grandview-figure"></span><img alt="_static/BasicsAndPrimer_GrandView_fig.svg" src="_static/BasicsAndPrimer_GrandView_fig.svg" /><figcaption>
---
> <span id="basicsandprimer-grandview-figure"></span><img alt="_images/BasicsAndPrimer_GrandView_fig.svg" src="_images/BasicsAndPrimer_GrandView_fig.svg" /><figcaption>
269,270c269,270
< <code class="xref any docutils literal notranslate"><span class="pre">tutorial</span></code> provides additional introductory material for those who are newer to computational modeling, as well as a
< more detailed and comprehensive introduction to the use of PsyNeuLink.</p>
---
> <a class="reference internal" href="index_logo_with_text.html#tutorial"><span class="std std-ref">tutorial</span></a> provides additional introductory material for those who are newer to computational modeling,
> as well as a more detailed and comprehensive introduction to the use of PsyNeuLink.</p>
diff -r docs-base/BeukersNBackModel.html docs-head/BeukersNBackModel.html
13c13
<   <title>N-Back Model (Beukers et al., 2022) &mdash; PsyNeuLink 0.0.0.0 documentation</title>
---
>   <title>Nback Model &mdash; PsyNeuLink 0.0.0.0 documentation</title>
35c35,37
<     <link rel="search" title="Search" href="search.html" /> 
---
>     <link rel="search" title="Search" href="search.html" />
>     <link rel="next" title="Contributors Guide" href="ContributorsGuide.html" />
>     <link rel="prev" title="LVOC Model of Stroop XOR Task (Bustamante et al. 2017)" href="BustamanteStroopXORLVOCModel.html" /> 
129c131
<               <ul>
---
>               <ul class="current">
134c136
< <li class="toctree-l1"><a class="reference internal" href="Library.html">Library</a></li>
---
> <li class="toctree-l1 current"><a class="reference internal" href="Library.html">Library</a></li>
177c179,183
<       <li>N-Back Model (Beukers et al., 2022)</li>
---
>           <li><a href="Library.html">Library</a> &gt;</li>
>         
>           <li><a href="Models.html">Models</a> &gt;</li>
>         
>       <li>Nback Model</li>
200,268c206,213
<   <section id="n-back-model-beukers-et-al-2022">
< <h1>N-Back Model (Beukers et al., 2022)<a class="headerlink" href="#n-back-model-beukers-et-al-2022" title="Permalink to this headline">¶</a></h1>
< <p><a class="reference external" href="https://psyarxiv.com/jtw5p">“When Working Memory is Just Working, Not Memory”</a></p>
< <section id="overview">
< <h2>Overview<a class="headerlink" href="#overview" title="Permalink to this headline">¶</a></h2>
< <p>This implements a model of the <a class="reference external" href="https://en.wikipedia.org/wiki/N-back#Neurobiology_of_n-back_task">N-back task</a>
< described in <a class="reference external" href="https://psyarxiv.com/jtw5p">Beukers et al. (2022)</a>.  The model uses a simple implementation of episodic
< memory (EM, as a form of content-retrieval memory) to store previous stimuli along with the temporal context in which
< they occured, and a feedforward neural network (FFN)to evaluate whether the current stimulus is a match to the n’th
< preceding stimulus (nback-level)retrieved from episodic memory.  The temporal context is provided by a randomly
< drifting high dimensional vector that maintains a constant norm (i.e., drifts on a sphere).  The FFN is
< trained, given an n-back level of <em>n</em>, to identify when the current stimulus matches one stored in EM
< with a temporal context vector that differs by an amount corresponding to <em>n</em> time steps of drift.  During n-back
< performance, the model encodes the current stimulus and temporal context, retrieves an item from EM that matches the
< current stimulus, weighted by the similarity of its temporal context vector (i.e., most recent), and then uses the
< FFN to evaluate whether it is an n-back match.  The model responds “match” if the FFN detects a match; otherwise, it
< either responds “non-match” or, with a fixed probability (hazard rate), it uses the current stimulus and temporal
< context to retrieve another sample from EM and repeat the evaluation.</p>
< <p>This model is an example of proposed interactions between working memory (e.g., in neocortex) and episodic memory
< e.g., in hippocampus and/or cerebellum) in the performance of tasks demanding of sequential processing and control,
< and along the lines of models emerging machine learning that augment the use of recurrent neural networks (e.g., long
< short-term memory mechanisms; LSTMs) for active memory and control with an external memory capable of rapid storage
< and content-based retrieval, such as the Neural Turing Machine (NTN;
< <a class="reference external" href="https://arxiv.org/abs/1410.5401">Graves et al., 2016</a>), Episodic Planning Networks (EPN;
< <a class="reference external" href="https://arxiv.org/abs/2006.03662">Ritter et al., 2020</a>), and Emergent Symbols through Binding Networks (ESBN;
< <a class="reference external" href="https://arxiv.org/abs/2012.14601">Webb et al., 2021</a>).</p>
< <p>The script respectively, to construct, train and run the model:</p>
< <ul class="simple">
< <li><p>construct_model(args):
< takes as arguments parameters used to construct the model; for convenience, defaults are defined toward the top
< of the script (see “Construction parameters”).</p></li>
< </ul>
< <ul class="simple">
< <li><p>train_network(args)
< takes as arguments the feedforward neural network Composition (FFN_COMPOSITION) and number of epochs to train.
< Note: learning_rate is set at construction (which can be specified using LEARNING_RATE under “Training parameters”).</p></li>
< </ul>
< <ul class="simple">
< <li><p>run_model()
< takes as arguments the drift rate in the temporal context vector to be applied on each trial,
< and the number of trials to execute, as well as reporting and animation specifications
< (see “Execution parameters”).</p></li>
< </ul>
< <p>The default parameters are ones that have been fit to empirical data concerning human performance
< (taken from <a class="reference external" href="https://psycnet.apa.org/record/2007-06096-010?doi=1">Kane et al., 2007</a>).</p>
< </section>
< <section id="the-model">
< <h2>The Model<a class="headerlink" href="#the-model" title="Permalink to this headline">¶</a></h2>
< <p>The models is composed of two <a class="reference internal" href="Composition.html"><span class="doc">Compositions</span></a>: an outer one that contains the full model (nback_model),
< and an <a class="reference internal" href="AutodiffComposition.html"><span class="doc">AutodiffComposition</span></a> (ffn), nested within nback_model (see red box in Figure), that implements the
< feedforward neural network (ffn).</p>
< <section id="nback-model">
< <h3>nback_model<a class="headerlink" href="#nback-model" title="Permalink to this headline">¶</a></h3>
< <p>This contains three input Mechanisms (</p>
< <p>Both of these are constructed in the construct_model function.
< The ffn Composition is trained use</p>
< <figure class="align-left" id="nback-fig">
< <img alt="N-Back Model Animation" src="_images/N-Back_Model_movie.gif" />
< </figure>
< </section>
< </section>
< <section id="training">
< <h2>Training<a class="headerlink" href="#training" title="Permalink to this headline">¶</a></h2>
< </section>
< <section id="execution">
< <h2>Execution<a class="headerlink" href="#execution" title="Permalink to this headline">¶</a></h2>
< <p>Script: <code class="xref download docutils literal notranslate"><span class="pre">N-back.py</span></code>
< .. Script: <code class="xref download docutils literal notranslate"><span class="pre">N-back.py</span></code></p>
< </section>
---
>   <section id="nback-model">
> <h1>Nback Model<a class="headerlink" href="#nback-model" title="Permalink to this headline">¶</a></h1>
> <p><strong>“When Working Memory is Just Working, Not Memory”</strong> (<a class="reference external" href="https://psyarxiv.com/jtw5p">Beukers et al., 2022</a>)</p>
> <p>Script: <a class="reference download internal" download="" href="_downloads/3ca6d8ab14ce9c785a6b5c1b1d60a955/nback.py"><code class="xref download docutils literal notranslate"><span class="pre">nback.py</span></code></a></p>
> <p>Example of Jupyter Notebook: <a class="reference internal" href="BeukersNBackModel_NB.html"><span class="doc">Nback Model Notebook</span></a></p>
> <p>(Instructions for running the actual notebook can be found <a class="reference internal" href="index_logo_with_text.html#tutorial"><span class="std std-ref">here</span></a>, replacing <code class="docutils literal notranslate"><span class="pre">nback_nb</span></code> for <code class="docutils literal notranslate"><span class="pre">tutorial</span></code>)</p>
> <div class="toctree-wrapper compound">
> </div>
276a222,230
>     <div class="rst-footer-buttons" role="navigation" aria-label="footer navigation">
>       
>         <a href="ContributorsGuide.html" class="btn btn-neutral float-right" title="Contributors Guide" accesskey="n" rel="next">Next <img src="_static/images/chevron-right-orange.svg" class="next-page"></a>
>       
>       
>         <a href="BustamanteStroopXORLVOCModel.html" class="btn btn-neutral" title="LVOC Model of Stroop XOR Task (Bustamante et al. 2017)" accesskey="p" rel="prev"><img src="_static/images/chevron-right-orange.svg" class="previous-page"> Previous</a>
>       
>     </div>
>   
305,314c259
< <li><a class="reference internal" href="#">N-Back Model
...

See CI logs for the full diff.

@github-actions
Copy link

github-actions bot commented Apr 2, 2023

This PR causes the following changes to the html docs (ubuntu-latest-3.7-x64):

diff -r docs-base/AutodiffComposition.html docs-head/AutodiffComposition.html
293,294c293,294
< This <a class="reference internal" href="Composition.html#composition-compilation-table"><span class="std std-ref">table</span></a> provides a summary and comparison of these different modes of execution,
< that are described in greater detail below.</p>
---
> These are each described in greater detail below, and summarized in this <a class="reference internal" href="Composition.html#composition-compilation-table"><span class="std std-ref">table</span></a>
> which provides a comparison of the different modes of execution for an AutodiffComposition and standard <a class="reference internal" href="Composition.html"><span class="doc">Composition</span></a>.</p>
325c325
< executing using the <em>Python</em> interpreter (and not PyTorch);  this is so that any modulation can take effect
---
> executed using the <em>Python</em> interpreter (and not PyTorch);  this is so that any modulation can take effect
327a328,341
> <div class="admonition warning">
> <p class="admonition-title">Warning</p>
> <ul class="simple">
> <li><p>Specifying <a class="reference internal" href="LLVM.html#psyneulink.core.llvm.__init__.ExecutionMode.LLVM" title="psyneulink.core.llvm.__init__.ExecutionMode.LLVM"><code class="xref any py py-attr docutils literal notranslate"><span class="pre">ExecutionMode.LLVM</span></code></a> or <a class="reference internal" href="LLVM.html#psyneulink.core.llvm.__init__.ExecutionMode.PyTorch" title="psyneulink.core.llvm.__init__.ExecutionMode.PyTorch"><code class="xref any py py-attr docutils literal notranslate"><span class="pre">ExecutionMode.PyTorch</span></code></a> in the learn() method of a standard
> <a class="reference internal" href="Composition.html"><span class="doc">Composition</span></a> causes an error.</p></li>
> </ul>
> </div>
> <div class="admonition note">
> <p class="admonition-title">Note</p>
> <ul class="simple">
> <li><p>Specifying <a class="reference internal" href="LLVM.html#psyneulink.core.llvm.__init__.ExecutionMode.Python" title="psyneulink.core.llvm.__init__.ExecutionMode.Python"><code class="xref any py py-attr docutils literal notranslate"><span class="pre">ExecutionMode.Python</span></code></a> in the learn() method of a <a class="reference internal" href="#"><span class="doc">AutodiffComposition</span></a> is treated as a
> synonym of <a class="reference internal" href="LLVM.html#psyneulink.core.llvm.__init__.ExecutionMode.PyTorch" title="psyneulink.core.llvm.__init__.ExecutionMode.PyTorch"><code class="xref any py py-attr docutils literal notranslate"><span class="pre">ExecutionMode.PyTorch</span></code></a> (see table).</p></li>
> </ul>
> </div>
415a430,452
> <dt class="sig sig-object py" id="psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.optimizer">
> <span class="sig-name descname"><span class="pre">optimizer</span></span><a class="headerlink" href="#psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.optimizer" title="Permalink to this definition">¶</a></dt>
> <dd><p>the optimizer used for training. Depends on the <strong>optimizer_type</strong>, <strong>learning_rate</strong>, and <strong>weight_decay</strong>
> arguments from initialization.</p>
> <dl class="field-list simple">
> <dt class="field-odd">Type</dt>
> <dd class="field-odd"><p>PyTorch optimizer function</p>
> </dd>
> </dl>
> </dd></dl>
> 
> <dl class="py attribute">
> <dt class="sig sig-object py" id="psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.loss">
> <span class="sig-name descname"><span class="pre">loss</span></span><a class="headerlink" href="#psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.loss" title="Permalink to this definition">¶</a></dt>
> <dd><p>the loss function used for training. Depends on the <strong>loss_spec</strong> argument from initialization.</p>
> <dl class="field-list simple">
> <dt class="field-odd">Type</dt>
> <dd class="field-odd"><p>PyTorch loss function</p>
> </dd>
> </dl>
> </dd></dl>
> 
> <dl class="py attribute">
418c455
< <dd><p>tracks the average for each weight update (i.e. each minibatch)</p>
---
> <dd><p>tracks the average loss after each weight update (i.e. each minibatch) during learning.</p>
427,430c464,466
< <dt class="sig sig-object py" id="psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.optimizer">
< <span class="sig-name descname"><span class="pre">optimizer</span></span><a class="headerlink" href="#psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.optimizer" title="Permalink to this definition">¶</a></dt>
< <dd><p>the optimizer used for training. Depends on the <strong>optimizer_type</strong>, <strong>learning_rate</strong>, and <strong>weight_decay</strong>
< arguments from initialization.</p>
---
> <dt class="sig sig-object py" id="psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.last_saved_weights">
> <span class="sig-name descname"><span class="pre">last_saved_weights</span></span><a class="headerlink" href="#psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.last_saved_weights" title="Permalink to this definition">¶</a></dt>
> <dd><p>path for file to which weights were last saved.</p>
433c469
< <dd class="field-odd"><p>PyTorch optimizer function</p>
---
> <dd class="field-odd"><p>path</p>
439,441c475,477
< <dt class="sig sig-object py" id="psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.loss">
< <span class="sig-name descname"><span class="pre">loss</span></span><a class="headerlink" href="#psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.loss" title="Permalink to this definition">¶</a></dt>
< <dd><p>the loss function used for training. Depends on the <strong>loss_spec</strong> argument from initialization.</p>
---
> <dt class="sig sig-object py" id="psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.last_loaded_weights">
> <span class="sig-name descname"><span class="pre">last_loaded_weights</span></span><a class="headerlink" href="#psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.last_loaded_weights" title="Permalink to this definition">¶</a></dt>
> <dd><p>path for file from which weights were last loaded.</p>
444c480
< <dd class="field-odd"><p>PyTorch loss function</p>
---
> <dd class="field-odd"><p>path</p>
457c493
< <dd><p>Updates parameters based on trials run since last update.</p>
---
> <dd><p>Updates parameters (weights) based on trials run since last update.</p>
636c672
< <dd><p>Loads all weights matrices for all MappingProjections in the AutodiffComposition from file
---
> <dd><p>Loads all weight matrices for all MappingProjections in the AutodiffComposition from file
diff -r docs-base/BasicsAndPrimer.html docs-head/BasicsAndPrimer.html
259c259
< <span id="basicsandprimer-grandview-figure"></span><img alt="_static/BasicsAndPrimer_GrandView_fig.svg" src="_static/BasicsAndPrimer_GrandView_fig.svg" /><figcaption>
---
> <span id="basicsandprimer-grandview-figure"></span><img alt="_images/BasicsAndPrimer_GrandView_fig.svg" src="_images/BasicsAndPrimer_GrandView_fig.svg" /><figcaption>
269,270c269,270
< <code class="xref any docutils literal notranslate"><span class="pre">tutorial</span></code> provides additional introductory material for those who are newer to computational modeling, as well as a
< more detailed and comprehensive introduction to the use of PsyNeuLink.</p>
---
> <a class="reference internal" href="index_logo_with_text.html#tutorial"><span class="std std-ref">tutorial</span></a> provides additional introductory material for those who are newer to computational modeling,
> as well as a more detailed and comprehensive introduction to the use of PsyNeuLink.</p>
diff -r docs-base/BeukersNBackModel.html docs-head/BeukersNBackModel.html
13c13
<   <title>N-Back Model (Beukers et al., 2022) &mdash; PsyNeuLink 0.0.0.0 documentation</title>
---
>   <title>Nback Model &mdash; PsyNeuLink 0.0.0.0 documentation</title>
35c35,37
<     <link rel="search" title="Search" href="search.html" /> 
---
>     <link rel="search" title="Search" href="search.html" />
>     <link rel="next" title="Contributors Guide" href="ContributorsGuide.html" />
>     <link rel="prev" title="LVOC Model of Stroop XOR Task (Bustamante et al. 2017)" href="BustamanteStroopXORLVOCModel.html" /> 
129c131
<               <ul>
---
>               <ul class="current">
134c136
< <li class="toctree-l1"><a class="reference internal" href="Library.html">Library</a></li>
---
> <li class="toctree-l1 current"><a class="reference internal" href="Library.html">Library</a></li>
177c179,183
<       <li>N-Back Model (Beukers et al., 2022)</li>
---
>           <li><a href="Library.html">Library</a> &gt;</li>
>         
>           <li><a href="Models.html">Models</a> &gt;</li>
>         
>       <li>Nback Model</li>
200,268c206,213
<   <section id="n-back-model-beukers-et-al-2022">
< <h1>N-Back Model (Beukers et al., 2022)<a class="headerlink" href="#n-back-model-beukers-et-al-2022" title="Permalink to this headline">¶</a></h1>
< <p><a class="reference external" href="https://psyarxiv.com/jtw5p">“When Working Memory is Just Working, Not Memory”</a></p>
< <section id="overview">
< <h2>Overview<a class="headerlink" href="#overview" title="Permalink to this headline">¶</a></h2>
< <p>This implements a model of the <a class="reference external" href="https://en.wikipedia.org/wiki/N-back#Neurobiology_of_n-back_task">N-back task</a>
< described in <a class="reference external" href="https://psyarxiv.com/jtw5p">Beukers et al. (2022)</a>.  The model uses a simple implementation of episodic
< memory (EM, as a form of content-retrieval memory) to store previous stimuli along with the temporal context in which
< they occured, and a feedforward neural network (FFN)to evaluate whether the current stimulus is a match to the n’th
< preceding stimulus (nback-level)retrieved from episodic memory.  The temporal context is provided by a randomly
< drifting high dimensional vector that maintains a constant norm (i.e., drifts on a sphere).  The FFN is
< trained, given an n-back level of <em>n</em>, to identify when the current stimulus matches one stored in EM
< with a temporal context vector that differs by an amount corresponding to <em>n</em> time steps of drift.  During n-back
< performance, the model encodes the current stimulus and temporal context, retrieves an item from EM that matches the
< current stimulus, weighted by the similarity of its temporal context vector (i.e., most recent), and then uses the
< FFN to evaluate whether it is an n-back match.  The model responds “match” if the FFN detects a match; otherwise, it
< either responds “non-match” or, with a fixed probability (hazard rate), it uses the current stimulus and temporal
< context to retrieve another sample from EM and repeat the evaluation.</p>
< <p>This model is an example of proposed interactions between working memory (e.g., in neocortex) and episodic memory
< e.g., in hippocampus and/or cerebellum) in the performance of tasks demanding of sequential processing and control,
< and along the lines of models emerging machine learning that augment the use of recurrent neural networks (e.g., long
< short-term memory mechanisms; LSTMs) for active memory and control with an external memory capable of rapid storage
< and content-based retrieval, such as the Neural Turing Machine (NTN;
< <a class="reference external" href="https://arxiv.org/abs/1410.5401">Graves et al., 2016</a>), Episodic Planning Networks (EPN;
< <a class="reference external" href="https://arxiv.org/abs/2006.03662">Ritter et al., 2020</a>), and Emergent Symbols through Binding Networks (ESBN;
< <a class="reference external" href="https://arxiv.org/abs/2012.14601">Webb et al., 2021</a>).</p>
< <p>The script respectively, to construct, train and run the model:</p>
< <ul class="simple">
< <li><p>construct_model(args):
< takes as arguments parameters used to construct the model; for convenience, defaults are defined toward the top
< of the script (see “Construction parameters”).</p></li>
< </ul>
< <ul class="simple">
< <li><p>train_network(args)
< takes as arguments the feedforward neural network Composition (FFN_COMPOSITION) and number of epochs to train.
< Note: learning_rate is set at construction (which can be specified using LEARNING_RATE under “Training parameters”).</p></li>
< </ul>
< <ul class="simple">
< <li><p>run_model()
< takes as arguments the drift rate in the temporal context vector to be applied on each trial,
< and the number of trials to execute, as well as reporting and animation specifications
< (see “Execution parameters”).</p></li>
< </ul>
< <p>The default parameters are ones that have been fit to empirical data concerning human performance
< (taken from <a class="reference external" href="https://psycnet.apa.org/record/2007-06096-010?doi=1">Kane et al., 2007</a>).</p>
< </section>
< <section id="the-model">
< <h2>The Model<a class="headerlink" href="#the-model" title="Permalink to this headline">¶</a></h2>
< <p>The models is composed of two <a class="reference internal" href="Composition.html"><span class="doc">Compositions</span></a>: an outer one that contains the full model (nback_model),
< and an <a class="reference internal" href="AutodiffComposition.html"><span class="doc">AutodiffComposition</span></a> (ffn), nested within nback_model (see red box in Figure), that implements the
< feedforward neural network (ffn).</p>
< <section id="nback-model">
< <h3>nback_model<a class="headerlink" href="#nback-model" title="Permalink to this headline">¶</a></h3>
< <p>This contains three input Mechanisms (</p>
< <p>Both of these are constructed in the construct_model function.
< The ffn Composition is trained use</p>
< <figure class="align-left" id="nback-fig">
< <img alt="N-Back Model Animation" src="_images/N-Back_Model_movie.gif" />
< </figure>
< </section>
< </section>
< <section id="training">
< <h2>Training<a class="headerlink" href="#training" title="Permalink to this headline">¶</a></h2>
< </section>
< <section id="execution">
< <h2>Execution<a class="headerlink" href="#execution" title="Permalink to this headline">¶</a></h2>
< <p>Script: <code class="xref download docutils literal notranslate"><span class="pre">N-back.py</span></code>
< .. Script: <code class="xref download docutils literal notranslate"><span class="pre">N-back.py</span></code></p>
< </section>
---
>   <section id="nback-model">
> <h1>Nback Model<a class="headerlink" href="#nback-model" title="Permalink to this headline">¶</a></h1>
> <p><strong>“When Working Memory is Just Working, Not Memory”</strong> (<a class="reference external" href="https://psyarxiv.com/jtw5p">Beukers et al., 2022</a>)</p>
> <p>Script: <a class="reference download internal" download="" href="_downloads/3ca6d8ab14ce9c785a6b5c1b1d60a955/nback.py"><code class="xref download docutils literal notranslate"><span class="pre">nback.py</span></code></a></p>
> <p>Example of Jupyter Notebook: <a class="reference internal" href="BeukersNBackModel_NB.html"><span class="doc">Nback Model Notebook</span></a></p>
> <p>(Instructions for running the actual notebook can be found <a class="reference internal" href="index_logo_with_text.html#tutorial"><span class="std std-ref">here</span></a>, replacing <code class="docutils literal notranslate"><span class="pre">nback_nb</span></code> for <code class="docutils literal notranslate"><span class="pre">tutorial</span></code>)</p>
> <div class="toctree-wrapper compound">
> </div>
276a222,230
>     <div class="rst-footer-buttons" role="navigation" aria-label="footer navigation">
>       
>         <a href="ContributorsGuide.html" class="btn btn-neutral float-right" title="Contributors Guide" accesskey="n" rel="next">Next <img src="_static/images/chevron-right-orange.svg" class="next-page"></a>
>       
>       
>         <a href="BustamanteStroopXORLVOCModel.html" class="btn btn-neutral" title="LVOC Model of Stroop XOR Task (Bustamante et al. 2017)" accesskey="p" rel="prev"><img src="_static/images/chevron-right-orange.svg" class="previous-page"> Previous</a>
>       
>     </div>
>   
305,314c259
< <li><a class="reference internal" href="#">N-Back Model
...

See CI logs for the full diff.

@github-actions
Copy link

github-actions bot commented Apr 2, 2023

This PR causes the following changes to the html docs (ubuntu-latest-3.7-x64):

diff -r docs-base/AutodiffComposition.html docs-head/AutodiffComposition.html
293,294c293,294
< This <a class="reference internal" href="Composition.html#composition-compilation-table"><span class="std std-ref">table</span></a> provides a summary and comparison of these different modes of execution,
< that are described in greater detail below.</p>
---
> These are each described in greater detail below, and summarized in this <a class="reference internal" href="Composition.html#composition-compilation-table"><span class="std std-ref">table</span></a>
> which provides a comparison of the different modes of execution for an AutodiffComposition and standard <a class="reference internal" href="Composition.html"><span class="doc">Composition</span></a>.</p>
325c325
< executing using the <em>Python</em> interpreter (and not PyTorch);  this is so that any modulation can take effect
---
> executed using the <em>Python</em> interpreter (and not PyTorch);  this is so that any modulation can take effect
327a328,341
> <div class="admonition warning">
> <p class="admonition-title">Warning</p>
> <ul class="simple">
> <li><p>Specifying <a class="reference internal" href="LLVM.html#psyneulink.core.llvm.__init__.ExecutionMode.LLVM" title="psyneulink.core.llvm.__init__.ExecutionMode.LLVM"><code class="xref any py py-attr docutils literal notranslate"><span class="pre">ExecutionMode.LLVM</span></code></a> or <a class="reference internal" href="LLVM.html#psyneulink.core.llvm.__init__.ExecutionMode.PyTorch" title="psyneulink.core.llvm.__init__.ExecutionMode.PyTorch"><code class="xref any py py-attr docutils literal notranslate"><span class="pre">ExecutionMode.PyTorch</span></code></a> in the learn() method of a standard
> <a class="reference internal" href="Composition.html"><span class="doc">Composition</span></a> causes an error.</p></li>
> </ul>
> </div>
> <div class="admonition note">
> <p class="admonition-title">Note</p>
> <ul class="simple">
> <li><p>Specifying <a class="reference internal" href="LLVM.html#psyneulink.core.llvm.__init__.ExecutionMode.Python" title="psyneulink.core.llvm.__init__.ExecutionMode.Python"><code class="xref any py py-attr docutils literal notranslate"><span class="pre">ExecutionMode.Python</span></code></a> in the learn() method of a <a class="reference internal" href="#"><span class="doc">AutodiffComposition</span></a> is treated as a
> synonym of <a class="reference internal" href="LLVM.html#psyneulink.core.llvm.__init__.ExecutionMode.PyTorch" title="psyneulink.core.llvm.__init__.ExecutionMode.PyTorch"><code class="xref any py py-attr docutils literal notranslate"><span class="pre">ExecutionMode.PyTorch</span></code></a> (see table).</p></li>
> </ul>
> </div>
415a430,452
> <dt class="sig sig-object py" id="psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.optimizer">
> <span class="sig-name descname"><span class="pre">optimizer</span></span><a class="headerlink" href="#psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.optimizer" title="Permalink to this definition">¶</a></dt>
> <dd><p>the optimizer used for training. Depends on the <strong>optimizer_type</strong>, <strong>learning_rate</strong>, and <strong>weight_decay</strong>
> arguments from initialization.</p>
> <dl class="field-list simple">
> <dt class="field-odd">Type</dt>
> <dd class="field-odd"><p>PyTorch optimizer function</p>
> </dd>
> </dl>
> </dd></dl>
> 
> <dl class="py attribute">
> <dt class="sig sig-object py" id="psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.loss">
> <span class="sig-name descname"><span class="pre">loss</span></span><a class="headerlink" href="#psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.loss" title="Permalink to this definition">¶</a></dt>
> <dd><p>the loss function used for training. Depends on the <strong>loss_spec</strong> argument from initialization.</p>
> <dl class="field-list simple">
> <dt class="field-odd">Type</dt>
> <dd class="field-odd"><p>PyTorch loss function</p>
> </dd>
> </dl>
> </dd></dl>
> 
> <dl class="py attribute">
418c455
< <dd><p>tracks the average for each weight update (i.e. each minibatch)</p>
---
> <dd><p>tracks the average loss after each weight update (i.e. each minibatch) during learning.</p>
427,430c464,466
< <dt class="sig sig-object py" id="psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.optimizer">
< <span class="sig-name descname"><span class="pre">optimizer</span></span><a class="headerlink" href="#psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.optimizer" title="Permalink to this definition">¶</a></dt>
< <dd><p>the optimizer used for training. Depends on the <strong>optimizer_type</strong>, <strong>learning_rate</strong>, and <strong>weight_decay</strong>
< arguments from initialization.</p>
---
> <dt class="sig sig-object py" id="psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.last_saved_weights">
> <span class="sig-name descname"><span class="pre">last_saved_weights</span></span><a class="headerlink" href="#psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.last_saved_weights" title="Permalink to this definition">¶</a></dt>
> <dd><p>path for file to which weights were last saved.</p>
433c469
< <dd class="field-odd"><p>PyTorch optimizer function</p>
---
> <dd class="field-odd"><p>path</p>
439,441c475,477
< <dt class="sig sig-object py" id="psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.loss">
< <span class="sig-name descname"><span class="pre">loss</span></span><a class="headerlink" href="#psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.loss" title="Permalink to this definition">¶</a></dt>
< <dd><p>the loss function used for training. Depends on the <strong>loss_spec</strong> argument from initialization.</p>
---
> <dt class="sig sig-object py" id="psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.last_loaded_weights">
> <span class="sig-name descname"><span class="pre">last_loaded_weights</span></span><a class="headerlink" href="#psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.last_loaded_weights" title="Permalink to this definition">¶</a></dt>
> <dd><p>path for file from which weights were last loaded.</p>
444c480
< <dd class="field-odd"><p>PyTorch loss function</p>
---
> <dd class="field-odd"><p>path</p>
457c493
< <dd><p>Updates parameters based on trials run since last update.</p>
---
> <dd><p>Updates parameters (weights) based on trials run since last update.</p>
636c672
< <dd><p>Loads all weights matrices for all MappingProjections in the AutodiffComposition from file
---
> <dd><p>Loads all weight matrices for all MappingProjections in the AutodiffComposition from file
diff -r docs-base/BasicsAndPrimer.html docs-head/BasicsAndPrimer.html
259c259
< <span id="basicsandprimer-grandview-figure"></span><img alt="_static/BasicsAndPrimer_GrandView_fig.svg" src="_static/BasicsAndPrimer_GrandView_fig.svg" /><figcaption>
---
> <span id="basicsandprimer-grandview-figure"></span><img alt="_images/BasicsAndPrimer_GrandView_fig.svg" src="_images/BasicsAndPrimer_GrandView_fig.svg" /><figcaption>
269,270c269,270
< <code class="xref any docutils literal notranslate"><span class="pre">tutorial</span></code> provides additional introductory material for those who are newer to computational modeling, as well as a
< more detailed and comprehensive introduction to the use of PsyNeuLink.</p>
---
> <a class="reference internal" href="index_logo_with_text.html#tutorial"><span class="std std-ref">tutorial</span></a> provides additional introductory material for those who are newer to computational modeling,
> as well as a more detailed and comprehensive introduction to the use of PsyNeuLink.</p>
diff -r docs-base/BeukersNBackModel.html docs-head/BeukersNBackModel.html
13c13
<   <title>N-Back Model (Beukers et al., 2022) &mdash; PsyNeuLink 0.0.0.0 documentation</title>
---
>   <title>Nback Model &mdash; PsyNeuLink 0.0.0.0 documentation</title>
35c35,37
<     <link rel="search" title="Search" href="search.html" /> 
---
>     <link rel="search" title="Search" href="search.html" />
>     <link rel="next" title="Contributors Guide" href="ContributorsGuide.html" />
>     <link rel="prev" title="LVOC Model of Stroop XOR Task (Bustamante et al. 2017)" href="BustamanteStroopXORLVOCModel.html" /> 
129c131
<               <ul>
---
>               <ul class="current">
134c136
< <li class="toctree-l1"><a class="reference internal" href="Library.html">Library</a></li>
---
> <li class="toctree-l1 current"><a class="reference internal" href="Library.html">Library</a></li>
177c179,183
<       <li>N-Back Model (Beukers et al., 2022)</li>
---
>           <li><a href="Library.html">Library</a> &gt;</li>
>         
>           <li><a href="Models.html">Models</a> &gt;</li>
>         
>       <li>Nback Model</li>
200,268c206,213
<   <section id="n-back-model-beukers-et-al-2022">
< <h1>N-Back Model (Beukers et al., 2022)<a class="headerlink" href="#n-back-model-beukers-et-al-2022" title="Permalink to this headline">¶</a></h1>
< <p><a class="reference external" href="https://psyarxiv.com/jtw5p">“When Working Memory is Just Working, Not Memory”</a></p>
< <section id="overview">
< <h2>Overview<a class="headerlink" href="#overview" title="Permalink to this headline">¶</a></h2>
< <p>This implements a model of the <a class="reference external" href="https://en.wikipedia.org/wiki/N-back#Neurobiology_of_n-back_task">N-back task</a>
< described in <a class="reference external" href="https://psyarxiv.com/jtw5p">Beukers et al. (2022)</a>.  The model uses a simple implementation of episodic
< memory (EM, as a form of content-retrieval memory) to store previous stimuli along with the temporal context in which
< they occured, and a feedforward neural network (FFN)to evaluate whether the current stimulus is a match to the n’th
< preceding stimulus (nback-level)retrieved from episodic memory.  The temporal context is provided by a randomly
< drifting high dimensional vector that maintains a constant norm (i.e., drifts on a sphere).  The FFN is
< trained, given an n-back level of <em>n</em>, to identify when the current stimulus matches one stored in EM
< with a temporal context vector that differs by an amount corresponding to <em>n</em> time steps of drift.  During n-back
< performance, the model encodes the current stimulus and temporal context, retrieves an item from EM that matches the
< current stimulus, weighted by the similarity of its temporal context vector (i.e., most recent), and then uses the
< FFN to evaluate whether it is an n-back match.  The model responds “match” if the FFN detects a match; otherwise, it
< either responds “non-match” or, with a fixed probability (hazard rate), it uses the current stimulus and temporal
< context to retrieve another sample from EM and repeat the evaluation.</p>
< <p>This model is an example of proposed interactions between working memory (e.g., in neocortex) and episodic memory
< e.g., in hippocampus and/or cerebellum) in the performance of tasks demanding of sequential processing and control,
< and along the lines of models emerging machine learning that augment the use of recurrent neural networks (e.g., long
< short-term memory mechanisms; LSTMs) for active memory and control with an external memory capable of rapid storage
< and content-based retrieval, such as the Neural Turing Machine (NTN;
< <a class="reference external" href="https://arxiv.org/abs/1410.5401">Graves et al., 2016</a>), Episodic Planning Networks (EPN;
< <a class="reference external" href="https://arxiv.org/abs/2006.03662">Ritter et al., 2020</a>), and Emergent Symbols through Binding Networks (ESBN;
< <a class="reference external" href="https://arxiv.org/abs/2012.14601">Webb et al., 2021</a>).</p>
< <p>The script respectively, to construct, train and run the model:</p>
< <ul class="simple">
< <li><p>construct_model(args):
< takes as arguments parameters used to construct the model; for convenience, defaults are defined toward the top
< of the script (see “Construction parameters”).</p></li>
< </ul>
< <ul class="simple">
< <li><p>train_network(args)
< takes as arguments the feedforward neural network Composition (FFN_COMPOSITION) and number of epochs to train.
< Note: learning_rate is set at construction (which can be specified using LEARNING_RATE under “Training parameters”).</p></li>
< </ul>
< <ul class="simple">
< <li><p>run_model()
< takes as arguments the drift rate in the temporal context vector to be applied on each trial,
< and the number of trials to execute, as well as reporting and animation specifications
< (see “Execution parameters”).</p></li>
< </ul>
< <p>The default parameters are ones that have been fit to empirical data concerning human performance
< (taken from <a class="reference external" href="https://psycnet.apa.org/record/2007-06096-010?doi=1">Kane et al., 2007</a>).</p>
< </section>
< <section id="the-model">
< <h2>The Model<a class="headerlink" href="#the-model" title="Permalink to this headline">¶</a></h2>
< <p>The models is composed of two <a class="reference internal" href="Composition.html"><span class="doc">Compositions</span></a>: an outer one that contains the full model (nback_model),
< and an <a class="reference internal" href="AutodiffComposition.html"><span class="doc">AutodiffComposition</span></a> (ffn), nested within nback_model (see red box in Figure), that implements the
< feedforward neural network (ffn).</p>
< <section id="nback-model">
< <h3>nback_model<a class="headerlink" href="#nback-model" title="Permalink to this headline">¶</a></h3>
< <p>This contains three input Mechanisms (</p>
< <p>Both of these are constructed in the construct_model function.
< The ffn Composition is trained use</p>
< <figure class="align-left" id="nback-fig">
< <img alt="N-Back Model Animation" src="_images/N-Back_Model_movie.gif" />
< </figure>
< </section>
< </section>
< <section id="training">
< <h2>Training<a class="headerlink" href="#training" title="Permalink to this headline">¶</a></h2>
< </section>
< <section id="execution">
< <h2>Execution<a class="headerlink" href="#execution" title="Permalink to this headline">¶</a></h2>
< <p>Script: <code class="xref download docutils literal notranslate"><span class="pre">N-back.py</span></code>
< .. Script: <code class="xref download docutils literal notranslate"><span class="pre">N-back.py</span></code></p>
< </section>
---
>   <section id="nback-model">
> <h1>Nback Model<a class="headerlink" href="#nback-model" title="Permalink to this headline">¶</a></h1>
> <p><strong>“When Working Memory is Just Working, Not Memory”</strong> (<a class="reference external" href="https://psyarxiv.com/jtw5p">Beukers et al., 2022</a>)</p>
> <p>Script: <a class="reference download internal" download="" href="_downloads/3ca6d8ab14ce9c785a6b5c1b1d60a955/nback.py"><code class="xref download docutils literal notranslate"><span class="pre">nback.py</span></code></a></p>
> <p>Example of Jupyter Notebook: <a class="reference internal" href="BeukersNBackModel_NB.html"><span class="doc">Nback Model Notebook</span></a></p>
> <p>(Instructions for running the actual notebook can be found <a class="reference internal" href="index_logo_with_text.html#tutorial"><span class="std std-ref">here</span></a>, replacing <code class="docutils literal notranslate"><span class="pre">nback_nb</span></code> for <code class="docutils literal notranslate"><span class="pre">tutorial</span></code>)</p>
> <div class="toctree-wrapper compound">
> </div>
276a222,230
>     <div class="rst-footer-buttons" role="navigation" aria-label="footer navigation">
>       
>         <a href="ContributorsGuide.html" class="btn btn-neutral float-right" title="Contributors Guide" accesskey="n" rel="next">Next <img src="_static/images/chevron-right-orange.svg" class="next-page"></a>
>       
>       
>         <a href="BustamanteStroopXORLVOCModel.html" class="btn btn-neutral" title="LVOC Model of Stroop XOR Task (Bustamante et al. 2017)" accesskey="p" rel="prev"><img src="_static/images/chevron-right-orange.svg" class="previous-page"> Previous</a>
>       
>     </div>
>   
305,314c259
< <li><a class="reference internal" href="#">N-Back Model
...

See CI logs for the full diff.

@github-actions
Copy link

github-actions bot commented Apr 2, 2023

This PR causes the following changes to the html docs (ubuntu-latest-3.7-x64):

diff -r docs-base/AutodiffComposition.html docs-head/AutodiffComposition.html
293,294c293,294
< This <a class="reference internal" href="Composition.html#composition-compilation-table"><span class="std std-ref">table</span></a> provides a summary and comparison of these different modes of execution,
< that are described in greater detail below.</p>
---
> These are each described in greater detail below, and summarized in this <a class="reference internal" href="Composition.html#composition-compilation-table"><span class="std std-ref">table</span></a>
> which provides a comparison of the different modes of execution for an AutodiffComposition and standard <a class="reference internal" href="Composition.html"><span class="doc">Composition</span></a>.</p>
325c325
< executing using the <em>Python</em> interpreter (and not PyTorch);  this is so that any modulation can take effect
---
> executed using the <em>Python</em> interpreter (and not PyTorch);  this is so that any modulation can take effect
327a328,341
> <div class="admonition warning">
> <p class="admonition-title">Warning</p>
> <ul class="simple">
> <li><p>Specifying <a class="reference internal" href="LLVM.html#psyneulink.core.llvm.__init__.ExecutionMode.LLVM" title="psyneulink.core.llvm.__init__.ExecutionMode.LLVM"><code class="xref any py py-attr docutils literal notranslate"><span class="pre">ExecutionMode.LLVM</span></code></a> or <a class="reference internal" href="LLVM.html#psyneulink.core.llvm.__init__.ExecutionMode.PyTorch" title="psyneulink.core.llvm.__init__.ExecutionMode.PyTorch"><code class="xref any py py-attr docutils literal notranslate"><span class="pre">ExecutionMode.PyTorch</span></code></a> in the learn() method of a standard
> <a class="reference internal" href="Composition.html"><span class="doc">Composition</span></a> causes an error.</p></li>
> </ul>
> </div>
> <div class="admonition note">
> <p class="admonition-title">Note</p>
> <ul class="simple">
> <li><p>Specifying <a class="reference internal" href="LLVM.html#psyneulink.core.llvm.__init__.ExecutionMode.Python" title="psyneulink.core.llvm.__init__.ExecutionMode.Python"><code class="xref any py py-attr docutils literal notranslate"><span class="pre">ExecutionMode.Python</span></code></a> in the learn() method of a <a class="reference internal" href="#"><span class="doc">AutodiffComposition</span></a> is treated as a
> synonym of <a class="reference internal" href="LLVM.html#psyneulink.core.llvm.__init__.ExecutionMode.PyTorch" title="psyneulink.core.llvm.__init__.ExecutionMode.PyTorch"><code class="xref any py py-attr docutils literal notranslate"><span class="pre">ExecutionMode.PyTorch</span></code></a> (see table).</p></li>
> </ul>
> </div>
415a430,452
> <dt class="sig sig-object py" id="psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.optimizer">
> <span class="sig-name descname"><span class="pre">optimizer</span></span><a class="headerlink" href="#psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.optimizer" title="Permalink to this definition">¶</a></dt>
> <dd><p>the optimizer used for training. Depends on the <strong>optimizer_type</strong>, <strong>learning_rate</strong>, and <strong>weight_decay</strong>
> arguments from initialization.</p>
> <dl class="field-list simple">
> <dt class="field-odd">Type</dt>
> <dd class="field-odd"><p>PyTorch optimizer function</p>
> </dd>
> </dl>
> </dd></dl>
> 
> <dl class="py attribute">
> <dt class="sig sig-object py" id="psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.loss">
> <span class="sig-name descname"><span class="pre">loss</span></span><a class="headerlink" href="#psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.loss" title="Permalink to this definition">¶</a></dt>
> <dd><p>the loss function used for training. Depends on the <strong>loss_spec</strong> argument from initialization.</p>
> <dl class="field-list simple">
> <dt class="field-odd">Type</dt>
> <dd class="field-odd"><p>PyTorch loss function</p>
> </dd>
> </dl>
> </dd></dl>
> 
> <dl class="py attribute">
418c455
< <dd><p>tracks the average for each weight update (i.e. each minibatch)</p>
---
> <dd><p>tracks the average loss after each weight update (i.e. each minibatch) during learning.</p>
427,430c464,466
< <dt class="sig sig-object py" id="psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.optimizer">
< <span class="sig-name descname"><span class="pre">optimizer</span></span><a class="headerlink" href="#psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.optimizer" title="Permalink to this definition">¶</a></dt>
< <dd><p>the optimizer used for training. Depends on the <strong>optimizer_type</strong>, <strong>learning_rate</strong>, and <strong>weight_decay</strong>
< arguments from initialization.</p>
---
> <dt class="sig sig-object py" id="psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.last_saved_weights">
> <span class="sig-name descname"><span class="pre">last_saved_weights</span></span><a class="headerlink" href="#psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.last_saved_weights" title="Permalink to this definition">¶</a></dt>
> <dd><p>path for file to which weights were last saved.</p>
433c469
< <dd class="field-odd"><p>PyTorch optimizer function</p>
---
> <dd class="field-odd"><p>path</p>
439,441c475,477
< <dt class="sig sig-object py" id="psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.loss">
< <span class="sig-name descname"><span class="pre">loss</span></span><a class="headerlink" href="#psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.loss" title="Permalink to this definition">¶</a></dt>
< <dd><p>the loss function used for training. Depends on the <strong>loss_spec</strong> argument from initialization.</p>
---
> <dt class="sig sig-object py" id="psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.last_loaded_weights">
> <span class="sig-name descname"><span class="pre">last_loaded_weights</span></span><a class="headerlink" href="#psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.last_loaded_weights" title="Permalink to this definition">¶</a></dt>
> <dd><p>path for file from which weights were last loaded.</p>
444c480
< <dd class="field-odd"><p>PyTorch loss function</p>
---
> <dd class="field-odd"><p>path</p>
457c493
< <dd><p>Updates parameters based on trials run since last update.</p>
---
> <dd><p>Updates parameters (weights) based on trials run since last update.</p>
636c672
< <dd><p>Loads all weights matrices for all MappingProjections in the AutodiffComposition from file
---
> <dd><p>Loads all weight matrices for all MappingProjections in the AutodiffComposition from file
diff -r docs-base/BasicsAndPrimer.html docs-head/BasicsAndPrimer.html
259c259
< <span id="basicsandprimer-grandview-figure"></span><img alt="_static/BasicsAndPrimer_GrandView_fig.svg" src="_static/BasicsAndPrimer_GrandView_fig.svg" /><figcaption>
---
> <span id="basicsandprimer-grandview-figure"></span><img alt="_images/BasicsAndPrimer_GrandView_fig.svg" src="_images/BasicsAndPrimer_GrandView_fig.svg" /><figcaption>
269,270c269,270
< <code class="xref any docutils literal notranslate"><span class="pre">tutorial</span></code> provides additional introductory material for those who are newer to computational modeling, as well as a
< more detailed and comprehensive introduction to the use of PsyNeuLink.</p>
---
> <a class="reference internal" href="index_logo_with_text.html#tutorial"><span class="std std-ref">tutorial</span></a> provides additional introductory material for those who are newer to computational modeling,
> as well as a more detailed and comprehensive introduction to the use of PsyNeuLink.</p>
diff -r docs-base/BeukersNBackModel.html docs-head/BeukersNBackModel.html
13c13
<   <title>N-Back Model (Beukers et al., 2022) &mdash; PsyNeuLink 0.0.0.0 documentation</title>
---
>   <title>Nback Model &mdash; PsyNeuLink 0.0.0.0 documentation</title>
35c35,37
<     <link rel="search" title="Search" href="search.html" /> 
---
>     <link rel="search" title="Search" href="search.html" />
>     <link rel="next" title="Contributors Guide" href="ContributorsGuide.html" />
>     <link rel="prev" title="LVOC Model of Stroop XOR Task (Bustamante et al. 2017)" href="BustamanteStroopXORLVOCModel.html" /> 
129c131
<               <ul>
---
>               <ul class="current">
134c136
< <li class="toctree-l1"><a class="reference internal" href="Library.html">Library</a></li>
---
> <li class="toctree-l1 current"><a class="reference internal" href="Library.html">Library</a></li>
177c179,183
<       <li>N-Back Model (Beukers et al., 2022)</li>
---
>           <li><a href="Library.html">Library</a> &gt;</li>
>         
>           <li><a href="Models.html">Models</a> &gt;</li>
>         
>       <li>Nback Model</li>
200,268c206,213
<   <section id="n-back-model-beukers-et-al-2022">
< <h1>N-Back Model (Beukers et al., 2022)<a class="headerlink" href="#n-back-model-beukers-et-al-2022" title="Permalink to this headline">¶</a></h1>
< <p><a class="reference external" href="https://psyarxiv.com/jtw5p">“When Working Memory is Just Working, Not Memory”</a></p>
< <section id="overview">
< <h2>Overview<a class="headerlink" href="#overview" title="Permalink to this headline">¶</a></h2>
< <p>This implements a model of the <a class="reference external" href="https://en.wikipedia.org/wiki/N-back#Neurobiology_of_n-back_task">N-back task</a>
< described in <a class="reference external" href="https://psyarxiv.com/jtw5p">Beukers et al. (2022)</a>.  The model uses a simple implementation of episodic
< memory (EM, as a form of content-retrieval memory) to store previous stimuli along with the temporal context in which
< they occured, and a feedforward neural network (FFN)to evaluate whether the current stimulus is a match to the n’th
< preceding stimulus (nback-level)retrieved from episodic memory.  The temporal context is provided by a randomly
< drifting high dimensional vector that maintains a constant norm (i.e., drifts on a sphere).  The FFN is
< trained, given an n-back level of <em>n</em>, to identify when the current stimulus matches one stored in EM
< with a temporal context vector that differs by an amount corresponding to <em>n</em> time steps of drift.  During n-back
< performance, the model encodes the current stimulus and temporal context, retrieves an item from EM that matches the
< current stimulus, weighted by the similarity of its temporal context vector (i.e., most recent), and then uses the
< FFN to evaluate whether it is an n-back match.  The model responds “match” if the FFN detects a match; otherwise, it
< either responds “non-match” or, with a fixed probability (hazard rate), it uses the current stimulus and temporal
< context to retrieve another sample from EM and repeat the evaluation.</p>
< <p>This model is an example of proposed interactions between working memory (e.g., in neocortex) and episodic memory
< e.g., in hippocampus and/or cerebellum) in the performance of tasks demanding of sequential processing and control,
< and along the lines of models emerging machine learning that augment the use of recurrent neural networks (e.g., long
< short-term memory mechanisms; LSTMs) for active memory and control with an external memory capable of rapid storage
< and content-based retrieval, such as the Neural Turing Machine (NTN;
< <a class="reference external" href="https://arxiv.org/abs/1410.5401">Graves et al., 2016</a>), Episodic Planning Networks (EPN;
< <a class="reference external" href="https://arxiv.org/abs/2006.03662">Ritter et al., 2020</a>), and Emergent Symbols through Binding Networks (ESBN;
< <a class="reference external" href="https://arxiv.org/abs/2012.14601">Webb et al., 2021</a>).</p>
< <p>The script respectively, to construct, train and run the model:</p>
< <ul class="simple">
< <li><p>construct_model(args):
< takes as arguments parameters used to construct the model; for convenience, defaults are defined toward the top
< of the script (see “Construction parameters”).</p></li>
< </ul>
< <ul class="simple">
< <li><p>train_network(args)
< takes as arguments the feedforward neural network Composition (FFN_COMPOSITION) and number of epochs to train.
< Note: learning_rate is set at construction (which can be specified using LEARNING_RATE under “Training parameters”).</p></li>
< </ul>
< <ul class="simple">
< <li><p>run_model()
< takes as arguments the drift rate in the temporal context vector to be applied on each trial,
< and the number of trials to execute, as well as reporting and animation specifications
< (see “Execution parameters”).</p></li>
< </ul>
< <p>The default parameters are ones that have been fit to empirical data concerning human performance
< (taken from <a class="reference external" href="https://psycnet.apa.org/record/2007-06096-010?doi=1">Kane et al., 2007</a>).</p>
< </section>
< <section id="the-model">
< <h2>The Model<a class="headerlink" href="#the-model" title="Permalink to this headline">¶</a></h2>
< <p>The models is composed of two <a class="reference internal" href="Composition.html"><span class="doc">Compositions</span></a>: an outer one that contains the full model (nback_model),
< and an <a class="reference internal" href="AutodiffComposition.html"><span class="doc">AutodiffComposition</span></a> (ffn), nested within nback_model (see red box in Figure), that implements the
< feedforward neural network (ffn).</p>
< <section id="nback-model">
< <h3>nback_model<a class="headerlink" href="#nback-model" title="Permalink to this headline">¶</a></h3>
< <p>This contains three input Mechanisms (</p>
< <p>Both of these are constructed in the construct_model function.
< The ffn Composition is trained use</p>
< <figure class="align-left" id="nback-fig">
< <img alt="N-Back Model Animation" src="_images/N-Back_Model_movie.gif" />
< </figure>
< </section>
< </section>
< <section id="training">
< <h2>Training<a class="headerlink" href="#training" title="Permalink to this headline">¶</a></h2>
< </section>
< <section id="execution">
< <h2>Execution<a class="headerlink" href="#execution" title="Permalink to this headline">¶</a></h2>
< <p>Script: <code class="xref download docutils literal notranslate"><span class="pre">N-back.py</span></code>
< .. Script: <code class="xref download docutils literal notranslate"><span class="pre">N-back.py</span></code></p>
< </section>
---
>   <section id="nback-model">
> <h1>Nback Model<a class="headerlink" href="#nback-model" title="Permalink to this headline">¶</a></h1>
> <p><strong>“When Working Memory is Just Working, Not Memory”</strong> (<a class="reference external" href="https://psyarxiv.com/jtw5p">Beukers et al., 2022</a>)</p>
> <p>Script: <a class="reference download internal" download="" href="_downloads/3ca6d8ab14ce9c785a6b5c1b1d60a955/nback.py"><code class="xref download docutils literal notranslate"><span class="pre">nback.py</span></code></a></p>
> <p>Example of Jupyter Notebook: <a class="reference internal" href="BeukersNBackModel_NB.html"><span class="doc">Nback Model Notebook</span></a></p>
> <p>(Instructions for running the actual notebook can be found <a class="reference internal" href="index_logo_with_text.html#tutorial"><span class="std std-ref">here</span></a>, replacing <code class="docutils literal notranslate"><span class="pre">nback_nb</span></code> for <code class="docutils literal notranslate"><span class="pre">tutorial</span></code>)</p>
> <div class="toctree-wrapper compound">
> </div>
276a222,230
>     <div class="rst-footer-buttons" role="navigation" aria-label="footer navigation">
>       
>         <a href="ContributorsGuide.html" class="btn btn-neutral float-right" title="Contributors Guide" accesskey="n" rel="next">Next <img src="_static/images/chevron-right-orange.svg" class="next-page"></a>
>       
>       
>         <a href="BustamanteStroopXORLVOCModel.html" class="btn btn-neutral" title="LVOC Model of Stroop XOR Task (Bustamante et al. 2017)" accesskey="p" rel="prev"><img src="_static/images/chevron-right-orange.svg" class="previous-page"> Previous</a>
>       
>     </div>
>   
305,314c259
< <li><a class="reference internal" href="#">N-Back Model
...

See CI logs for the full diff.

@github-actions
Copy link

github-actions bot commented Apr 2, 2023

This PR causes the following changes to the html docs (ubuntu-latest-3.7-x64):

diff -r docs-base/AutodiffComposition.html docs-head/AutodiffComposition.html
293,294c293,294
< This <a class="reference internal" href="Composition.html#composition-compilation-table"><span class="std std-ref">table</span></a> provides a summary and comparison of these different modes of execution,
< that are described in greater detail below.</p>
---
> These are each described in greater detail below, and summarized in this <a class="reference internal" href="Composition.html#composition-compilation-table"><span class="std std-ref">table</span></a>
> which provides a comparison of the different modes of execution for an AutodiffComposition and standard <a class="reference internal" href="Composition.html"><span class="doc">Composition</span></a>.</p>
325c325
< executing using the <em>Python</em> interpreter (and not PyTorch);  this is so that any modulation can take effect
---
> executed using the <em>Python</em> interpreter (and not PyTorch);  this is so that any modulation can take effect
327a328,341
> <div class="admonition warning">
> <p class="admonition-title">Warning</p>
> <ul class="simple">
> <li><p>Specifying <a class="reference internal" href="LLVM.html#psyneulink.core.llvm.__init__.ExecutionMode.LLVM" title="psyneulink.core.llvm.__init__.ExecutionMode.LLVM"><code class="xref any py py-attr docutils literal notranslate"><span class="pre">ExecutionMode.LLVM</span></code></a> or <a class="reference internal" href="LLVM.html#psyneulink.core.llvm.__init__.ExecutionMode.PyTorch" title="psyneulink.core.llvm.__init__.ExecutionMode.PyTorch"><code class="xref any py py-attr docutils literal notranslate"><span class="pre">ExecutionMode.PyTorch</span></code></a> in the learn() method of a standard
> <a class="reference internal" href="Composition.html"><span class="doc">Composition</span></a> causes an error.</p></li>
> </ul>
> </div>
> <div class="admonition note">
> <p class="admonition-title">Note</p>
> <ul class="simple">
> <li><p>Specifying <a class="reference internal" href="LLVM.html#psyneulink.core.llvm.__init__.ExecutionMode.Python" title="psyneulink.core.llvm.__init__.ExecutionMode.Python"><code class="xref any py py-attr docutils literal notranslate"><span class="pre">ExecutionMode.Python</span></code></a> in the learn() method of a <a class="reference internal" href="#"><span class="doc">AutodiffComposition</span></a> is treated as a
> synonym of <a class="reference internal" href="LLVM.html#psyneulink.core.llvm.__init__.ExecutionMode.PyTorch" title="psyneulink.core.llvm.__init__.ExecutionMode.PyTorch"><code class="xref any py py-attr docutils literal notranslate"><span class="pre">ExecutionMode.PyTorch</span></code></a> (see table).</p></li>
> </ul>
> </div>
415a430,452
> <dt class="sig sig-object py" id="psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.optimizer">
> <span class="sig-name descname"><span class="pre">optimizer</span></span><a class="headerlink" href="#psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.optimizer" title="Permalink to this definition">¶</a></dt>
> <dd><p>the optimizer used for training. Depends on the <strong>optimizer_type</strong>, <strong>learning_rate</strong>, and <strong>weight_decay</strong>
> arguments from initialization.</p>
> <dl class="field-list simple">
> <dt class="field-odd">Type</dt>
> <dd class="field-odd"><p>PyTorch optimizer function</p>
> </dd>
> </dl>
> </dd></dl>
> 
> <dl class="py attribute">
> <dt class="sig sig-object py" id="psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.loss">
> <span class="sig-name descname"><span class="pre">loss</span></span><a class="headerlink" href="#psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.loss" title="Permalink to this definition">¶</a></dt>
> <dd><p>the loss function used for training. Depends on the <strong>loss_spec</strong> argument from initialization.</p>
> <dl class="field-list simple">
> <dt class="field-odd">Type</dt>
> <dd class="field-odd"><p>PyTorch loss function</p>
> </dd>
> </dl>
> </dd></dl>
> 
> <dl class="py attribute">
418c455
< <dd><p>tracks the average for each weight update (i.e. each minibatch)</p>
---
> <dd><p>tracks the average loss after each weight update (i.e. each minibatch) during learning.</p>
427,430c464,466
< <dt class="sig sig-object py" id="psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.optimizer">
< <span class="sig-name descname"><span class="pre">optimizer</span></span><a class="headerlink" href="#psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.optimizer" title="Permalink to this definition">¶</a></dt>
< <dd><p>the optimizer used for training. Depends on the <strong>optimizer_type</strong>, <strong>learning_rate</strong>, and <strong>weight_decay</strong>
< arguments from initialization.</p>
---
> <dt class="sig sig-object py" id="psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.last_saved_weights">
> <span class="sig-name descname"><span class="pre">last_saved_weights</span></span><a class="headerlink" href="#psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.last_saved_weights" title="Permalink to this definition">¶</a></dt>
> <dd><p>path for file to which weights were last saved.</p>
433c469
< <dd class="field-odd"><p>PyTorch optimizer function</p>
---
> <dd class="field-odd"><p>path</p>
439,441c475,477
< <dt class="sig sig-object py" id="psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.loss">
< <span class="sig-name descname"><span class="pre">loss</span></span><a class="headerlink" href="#psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.loss" title="Permalink to this definition">¶</a></dt>
< <dd><p>the loss function used for training. Depends on the <strong>loss_spec</strong> argument from initialization.</p>
---
> <dt class="sig sig-object py" id="psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.last_loaded_weights">
> <span class="sig-name descname"><span class="pre">last_loaded_weights</span></span><a class="headerlink" href="#psyneulink.library.compositions.autodiffcomposition.AutodiffComposition.last_loaded_weights" title="Permalink to this definition">¶</a></dt>
> <dd><p>path for file from which weights were last loaded.</p>
444c480
< <dd class="field-odd"><p>PyTorch loss function</p>
---
> <dd class="field-odd"><p>path</p>
457c493
< <dd><p>Updates parameters based on trials run since last update.</p>
---
> <dd><p>Updates parameters (weights) based on trials run since last update.</p>
636c672
< <dd><p>Loads all weights matrices for all MappingProjections in the AutodiffComposition from file
---
> <dd><p>Loads all weight matrices for all MappingProjections in the AutodiffComposition from file
diff -r docs-base/BasicsAndPrimer.html docs-head/BasicsAndPrimer.html
259c259
< <span id="basicsandprimer-grandview-figure"></span><img alt="_static/BasicsAndPrimer_GrandView_fig.svg" src="_static/BasicsAndPrimer_GrandView_fig.svg" /><figcaption>
---
> <span id="basicsandprimer-grandview-figure"></span><img alt="_images/BasicsAndPrimer_GrandView_fig.svg" src="_images/BasicsAndPrimer_GrandView_fig.svg" /><figcaption>
269,270c269,270
< <code class="xref any docutils literal notranslate"><span class="pre">tutorial</span></code> provides additional introductory material for those who are newer to computational modeling, as well as a
< more detailed and comprehensive introduction to the use of PsyNeuLink.</p>
---
> <a class="reference internal" href="index_logo_with_text.html#tutorial"><span class="std std-ref">tutorial</span></a> provides additional introductory material for those who are newer to computational modeling,
> as well as a more detailed and comprehensive introduction to the use of PsyNeuLink.</p>
diff -r docs-base/BeukersNBackModel.html docs-head/BeukersNBackModel.html
13c13
<   <title>N-Back Model (Beukers et al., 2022) &mdash; PsyNeuLink 0.0.0.0 documentation</title>
---
>   <title>Nback Model &mdash; PsyNeuLink 0.0.0.0 documentation</title>
35c35,37
<     <link rel="search" title="Search" href="search.html" /> 
---
>     <link rel="search" title="Search" href="search.html" />
>     <link rel="next" title="Contributors Guide" href="ContributorsGuide.html" />
>     <link rel="prev" title="LVOC Model of Stroop XOR Task (Bustamante et al. 2017)" href="BustamanteStroopXORLVOCModel.html" /> 
129c131
<               <ul>
---
>               <ul class="current">
134c136
< <li class="toctree-l1"><a class="reference internal" href="Library.html">Library</a></li>
---
> <li class="toctree-l1 current"><a class="reference internal" href="Library.html">Library</a></li>
177c179,183
<       <li>N-Back Model (Beukers et al., 2022)</li>
---
>           <li><a href="Library.html">Library</a> &gt;</li>
>         
>           <li><a href="Models.html">Models</a> &gt;</li>
>         
>       <li>Nback Model</li>
200,268c206,213
<   <section id="n-back-model-beukers-et-al-2022">
< <h1>N-Back Model (Beukers et al., 2022)<a class="headerlink" href="#n-back-model-beukers-et-al-2022" title="Permalink to this headline">¶</a></h1>
< <p><a class="reference external" href="https://psyarxiv.com/jtw5p">“When Working Memory is Just Working, Not Memory”</a></p>
< <section id="overview">
< <h2>Overview<a class="headerlink" href="#overview" title="Permalink to this headline">¶</a></h2>
< <p>This implements a model of the <a class="reference external" href="https://en.wikipedia.org/wiki/N-back#Neurobiology_of_n-back_task">N-back task</a>
< described in <a class="reference external" href="https://psyarxiv.com/jtw5p">Beukers et al. (2022)</a>.  The model uses a simple implementation of episodic
< memory (EM, as a form of content-retrieval memory) to store previous stimuli along with the temporal context in which
< they occured, and a feedforward neural network (FFN)to evaluate whether the current stimulus is a match to the n’th
< preceding stimulus (nback-level)retrieved from episodic memory.  The temporal context is provided by a randomly
< drifting high dimensional vector that maintains a constant norm (i.e., drifts on a sphere).  The FFN is
< trained, given an n-back level of <em>n</em>, to identify when the current stimulus matches one stored in EM
< with a temporal context vector that differs by an amount corresponding to <em>n</em> time steps of drift.  During n-back
< performance, the model encodes the current stimulus and temporal context, retrieves an item from EM that matches the
< current stimulus, weighted by the similarity of its temporal context vector (i.e., most recent), and then uses the
< FFN to evaluate whether it is an n-back match.  The model responds “match” if the FFN detects a match; otherwise, it
< either responds “non-match” or, with a fixed probability (hazard rate), it uses the current stimulus and temporal
< context to retrieve another sample from EM and repeat the evaluation.</p>
< <p>This model is an example of proposed interactions between working memory (e.g., in neocortex) and episodic memory
< e.g., in hippocampus and/or cerebellum) in the performance of tasks demanding of sequential processing and control,
< and along the lines of models emerging machine learning that augment the use of recurrent neural networks (e.g., long
< short-term memory mechanisms; LSTMs) for active memory and control with an external memory capable of rapid storage
< and content-based retrieval, such as the Neural Turing Machine (NTN;
< <a class="reference external" href="https://arxiv.org/abs/1410.5401">Graves et al., 2016</a>), Episodic Planning Networks (EPN;
< <a class="reference external" href="https://arxiv.org/abs/2006.03662">Ritter et al., 2020</a>), and Emergent Symbols through Binding Networks (ESBN;
< <a class="reference external" href="https://arxiv.org/abs/2012.14601">Webb et al., 2021</a>).</p>
< <p>The script respectively, to construct, train and run the model:</p>
< <ul class="simple">
< <li><p>construct_model(args):
< takes as arguments parameters used to construct the model; for convenience, defaults are defined toward the top
< of the script (see “Construction parameters”).</p></li>
< </ul>
< <ul class="simple">
< <li><p>train_network(args)
< takes as arguments the feedforward neural network Composition (FFN_COMPOSITION) and number of epochs to train.
< Note: learning_rate is set at construction (which can be specified using LEARNING_RATE under “Training parameters”).</p></li>
< </ul>
< <ul class="simple">
< <li><p>run_model()
< takes as arguments the drift rate in the temporal context vector to be applied on each trial,
< and the number of trials to execute, as well as reporting and animation specifications
< (see “Execution parameters”).</p></li>
< </ul>
< <p>The default parameters are ones that have been fit to empirical data concerning human performance
< (taken from <a class="reference external" href="https://psycnet.apa.org/record/2007-06096-010?doi=1">Kane et al., 2007</a>).</p>
< </section>
< <section id="the-model">
< <h2>The Model<a class="headerlink" href="#the-model" title="Permalink to this headline">¶</a></h2>
< <p>The models is composed of two <a class="reference internal" href="Composition.html"><span class="doc">Compositions</span></a>: an outer one that contains the full model (nback_model),
< and an <a class="reference internal" href="AutodiffComposition.html"><span class="doc">AutodiffComposition</span></a> (ffn), nested within nback_model (see red box in Figure), that implements the
< feedforward neural network (ffn).</p>
< <section id="nback-model">
< <h3>nback_model<a class="headerlink" href="#nback-model" title="Permalink to this headline">¶</a></h3>
< <p>This contains three input Mechanisms (</p>
< <p>Both of these are constructed in the construct_model function.
< The ffn Composition is trained use</p>
< <figure class="align-left" id="nback-fig">
< <img alt="N-Back Model Animation" src="_images/N-Back_Model_movie.gif" />
< </figure>
< </section>
< </section>
< <section id="training">
< <h2>Training<a class="headerlink" href="#training" title="Permalink to this headline">¶</a></h2>
< </section>
< <section id="execution">
< <h2>Execution<a class="headerlink" href="#execution" title="Permalink to this headline">¶</a></h2>
< <p>Script: <code class="xref download docutils literal notranslate"><span class="pre">N-back.py</span></code>
< .. Script: <code class="xref download docutils literal notranslate"><span class="pre">N-back.py</span></code></p>
< </section>
---
>   <section id="nback-model">
> <h1>Nback Model<a class="headerlink" href="#nback-model" title="Permalink to this headline">¶</a></h1>
> <p><strong>“When Working Memory is Just Working, Not Memory”</strong> (<a class="reference external" href="https://psyarxiv.com/jtw5p">Beukers et al., 2022</a>)</p>
> <p>Script: <a class="reference download internal" download="" href="_downloads/3ca6d8ab14ce9c785a6b5c1b1d60a955/nback.py"><code class="xref download docutils literal notranslate"><span class="pre">nback.py</span></code></a></p>
> <p>Example of Jupyter Notebook: <a class="reference internal" href="BeukersNBackModel_NB.html"><span class="doc">Nback Model Notebook</span></a></p>
> <p>(Instructions for running the actual notebook can be found <a class="reference internal" href="index_logo_with_text.html#tutorial"><span class="std std-ref">here</span></a>, replacing <code class="docutils literal notranslate"><span class="pre">nback_nb</span></code> for <code class="docutils literal notranslate"><span class="pre">tutorial</span></code>)</p>
> <div class="toctree-wrapper compound">
> </div>
276a222,230
>     <div class="rst-footer-buttons" role="navigation" aria-label="footer navigation">
>       
>         <a href="ContributorsGuide.html" class="btn btn-neutral float-right" title="Contributors Guide" accesskey="n" rel="next">Next <img src="_static/images/chevron-right-orange.svg" class="next-page"></a>
>       
>       
>         <a href="BustamanteStroopXORLVOCModel.html" class="btn btn-neutral" title="LVOC Model of Stroop XOR Task (Bustamante et al. 2017)" accesskey="p" rel="prev"><img src="_static/images/chevron-right-orange.svg" class="previous-page"> Previous</a>
>       
>     </div>
>   
305,314c259
< <li><a class="reference internal" href="#">N-Back Model
...

See CI logs for the full diff.

@jdcpni jdcpni merged commit d6d95f3 into devel Apr 2, 2023
@jdcpni jdcpni deleted the nback branch April 2, 2023 20:32
kmantel added a commit to kmantel/PsyNeuLink that referenced this pull request Apr 6, 2023
* origin/allclose-clean:
  tests: replace all unaffected assert np.allclose with np.testing.assert_allclose
  tests: replace assert np.allclose with np.testing.assert_allclose where tolerance fails
  tests: test_identicalness_of_control_and_gating: fix allclose shape
  tests: test_log: fix allclose shape
  tests: test_rpy.py test_multilayer: fix allclose
  tests: test_reset_run_2darray: fix allclose
  tests: test_transfer_mech_array_var_normal_array_noise: fix allclose
  tests: test_single_array: fix allclose
  tests: test_connect_outer_composition_to_only_input_node_in_inner_comp_option2: fix allclose
  tests: test_AGTUtility_valid: fix allclose
  tests: test_target_dict_spec_single_trial_scalar_and_lists_rl: fix allclose
  tests: test_transfer_mech_func: fix allclose
  tests: test_input_specification_multiple_nested_compositions: fix allclose
  tests: test_input_not_provided_to_run: fix allclose
  tests: test_example_11: fix allclose
  tests: test_buffer_standalone: fix allclose
  tests: test_assign_value: fix allclose
  tests: TestRunInputSpecifications::test_2_mechanisms_input_5: fix allclose
  tests: test_integrator_multiple_input: fix allclose
  tests: test_nested_composition_run: fix allclose
  tests: tests_output_port_variable_spec_composition: fix allclose
  tests: test_output_ports: fix allclose
  tests: test_LCAMechanism_threshold: fix allclose
  tests: test_user_def_func_numpy: match 2d output to 2d input
  tests: test_linear_combination_function_in_mechanism: fix allclose
  tests: test_connect_compositions_with_simple_states: fix allclose
  tests: TestInputSpecifications test_run_2_mechanisms_reuse_input: fix allclose
  tests: TestRun test_run_2_mechanisms_reuse_input: fix allclose
  tests: TestRunInputSpecifications test_run_2_mechanisms_reuse_input: fix allclose
  tests: test_nested_transfer_mechanism_composition_parallel shape mismatch
  Fix test_ddm_mechanism TestInputPorts
  Fix test_reset_run_array
  Fix TestReduce
  Fix test_reset_state_integrator_mechanism
  Fix more execute tests.
  Fix more composition.run() tests.
  Fix test_four_level_nested_transfer_mechanism_composition_parallel
  Fix test_processing_mechanism_function
  tests: allclose changes
  Fix log tests
  tests: allclose changes
  tests: allclose changes
  Fix test_nested_composition_execution
  Fix expected 3D outputs
  Fix test_run_no_inputs
  tests: allclose changes
  tests: allclose changes
  tests: allclose changes
  Fix expected results shape
  Fix test_xor_training_correctness
  tests: test_nested_composition_run_trials_inputs: undo disable test parametrizations
  tests: allclose changes
  Nback (PrincetonUniversity#2617)
  Fix/contentaddressablememory empty entry (PrincetonUniversity#2616)
  llvm: Add human readable name to _node_wrapper instances
  llvm: Use WeakRefDictionary for node wrappers
  llvm/builder_context: Use proxy object for _node_wrapper owning composition
kmantel added a commit to kmantel/PsyNeuLink that referenced this pull request Apr 14, 2023
* origin/devel:
  tests: remove asserts using zip loops where possible
  tests: replace all unaffected assert np.allclose with np.testing.assert_allclose
  tests: replace assert np.allclose with np.testing.assert_allclose where tolerance fails
  tests: test_gating_with_composition: fix allclose
  tests: test_log: fix allclose shape
  tests: test_rpy.py test_multilayer: fix allclose
  tests: test_reset_run_2darray: fix allclose
  tests: test_transfer_mech_array_var_normal_array_noise: fix allclose
  tests: test_single_array: fix allclose
  tests: test_connect_outer_composition_to_only_input_node_in_inner_comp_option2: fix allclose
  tests: test_AGTUtility_valid: fix allclose
  tests: test_target_dict_spec_single_trial_scalar_and_lists_rl: fix allclose
  tests: test_transfer_mech_func: fix allclose
  tests: test_input_specification_multiple_nested_compositions: fix allclose
  tests: test_input_not_provided_to_run: fix allclose
  tests: test_example_11: fix allclose
  tests: test_buffer_standalone: fix allclose
  tests: test_assign_value: fix allclose
  tests: TestRunInputSpecifications::test_2_mechanisms_input_5: fix allclose
  tests: test_integrator_multiple_input: fix allclose
  tests: tests_output_port_variable_spec_composition: fix allclose
  tests: test_output_ports: fix allclose
  tests: test_LCAMechanism_threshold: fix allclose
  tests: test_user_def_func_numpy: match 2d output to 2d input
  tests: test_linear_combination_function_in_mechanism: fix allclose
  tests: test_connect_compositions_with_simple_states: fix allclose
  tests: TestInputSpecifications test_run_2_mechanisms_reuse_input: fix allclose
  tests: TestRun test_run_2_mechanisms_reuse_input: fix allclose
  tests: TestRunInputSpecifications test_run_2_mechanisms_reuse_input: fix allclose
  tests: test_nested_transfer_mechanism_composition_parallel shape mismatch
  Fix test_ddm_mechanism TestInputPorts
  Fix test_reset_run_array
  Fix TestReduce
  Fix test_reset_state_integrator_mechanism
  Fix more execute tests.
  Fix more composition.run() tests.
  Fix test_four_level_nested_transfer_mechanism_composition_parallel
  Fix test_processing_mechanism_function
  tests: allclose changes
  Fix log tests
  tests: allclose changes
  Fix expected 3D outputs
  Fix test_run_no_inputs
  tests: allclose changes
  tests: allclose changes
  tests: allclose changes
  Fix expected results shape
  tests: test_nested_composition_run_trials_inputs: undo disable test parametrizations
  tests: allclose changes
  tests: test_documentation_models: reenable parametrizations
  github-actions: Add job running --benchmark-enable
  tests/AudodiffComposition: Add helper function to return the first set of learning results
  github-actions: Run all tests with --fp-precision=fp32
  tests/Autodiff: Check for execution mode PyTorch in test_optimizer specs
  tests/control: FP32 should only be used in compiled tests
  tests/xor_training_identicalness: Convert to use autodiff_mode
  tests/MemoryFunction: Prefer np.testing.assert_equal to assert np.all(==)
  test/AutodiffComposition: Use np.testing.assert_equal instead of assert np.all(==)
  requirements: update beartype requirement from <0.13.0 to <0.14.0 (PrincetonUniversity#2625)
  requirements: update pytest requirement from <7.2.3 to <7.3.1 (PrincetonUniversity#2627)
  requirements: Drop grpcio-tools
  requirements: Add protobuf to dependency list
  {dev,tutorial}_requirements: Use sharp inequality for jupyter upper bound
  dev_requirements: Use sharp inequality of pytest-profiling upper bound
  requirements: Sort alphabetically
  requirements: Use sharp inequality for leabra-psyneulink upper bound
  requirements: Fix dill dependency upper bound
  requirements: Use sharp inequality for bear type upper bound
  Tentative learning branch (PrincetonUniversity#2623)
  tests/llvm/multiple_executions: Use np.testing.assert_allclose to check resutls (PrincetonUniversity#2621)
  requirements: update pandas requirement from <1.5.4 to <2.0.1 (PrincetonUniversity#2619)
  Add back rate validation in constructor.
  Add back validation I added for offset.
  Remove validation I added for offset.
  Add validate methods as kmantel suggests
  requirements: update pillow requirement from <9.5.0 to <9.6.0 (PrincetonUniversity#2618)
  Remove typecheck-decorator requirment.
  Fix some annotations.
  requirements: update networkx requirement from <3.1 to <3.2 (PrincetonUniversity#2620)
  Nback (PrincetonUniversity#2617)
  Fix/contentaddressablememory empty entry (PrincetonUniversity#2616)
  Fix some annotations on learning methods.
  Fix some issues with merge of composition.
  Fix type annotation on _get_port_value_labels
  Remove check_user_specified in places.
  Replace typecheck decorator with beartype
  Add typecheck decorator back temporarily
  llvm: Add human readable name to _node_wrapper instances
  llvm: Use WeakRefDictionary for node wrappers
  llvm/builder_context: Use proxy object for _node_wrapper owning composition
  remove redefinition of ENTROPY keyword
  Raise KeyError on MechanismList.__seitem__
  fix some lgtm errors
  remove unused imports
  revert ci testing changes completely for now.
  Fix codestyle errors.
  Revert back to testing on Python 3.9.
  Fix specifcation of Python 3.10 in ci workflow
  Fixes for Python 3.7
  Enable tests for python 3.10 in CI.
  Add beartype for runtime type checking.
  Replace typecheck import with beartype
  Removed all the tc.optional typecheck annotations
  replaced all typecheck enums with Literal
  Replace is_function_type with static type
  Replace is_pref_set with static type
  Replace parameter_spec with static type.
  Remove dead code in parameter_spec
  ran some regexes to replace easy type hints
  fix for some tests that expect runtime type checks
  Comment out all typecheck decorations.
jvesely added a commit to jvesely/PsyNeuLink that referenced this pull request May 8, 2023
d6d95f3 ("Nback (PrincetonUniversity#2617)") changed the
use of cross entropy loss function to use "one hot" format for targts
instead of the previously used index.
This new format requires torch >= 1.12.0 [0]

Fixes: d6d95f3 ("Nback (PrincetonUniversity#2617)")
Closes: PrincetonUniversity#2665

[0] https://github.com/pytorch/pytorch/releases/tag/v1.12.0

Signed-off-by: Jan Vesely <[email protected]>
jvesely added a commit to jvesely/PsyNeuLink that referenced this pull request May 9, 2023
…ch<=1.11.x

1.12.0+ can use the CrossEntropyLoss instance directly.
d6d95f3 ("Nback (PrincetonUniversity#2617)") changed the
format of cross entropy target that requires torch >=1.12

Fixes: d6d95f3 ("Nback (PrincetonUniversity#2617)")
Closes: PrincetonUniversity#2665

[0] https://github.com/pytorch/pytorch/releases/tag/v1.12.0

Signed-off-by: Jan Vesely <[email protected]>
jvesely added a commit to jvesely/PsyNeuLink that referenced this pull request May 9, 2023
…ch<=1.11.x

1.12.0+ can use the CrossEntropyLoss instance directly.
d6d95f3 ("Nback (PrincetonUniversity#2617)") changed the
format of cross entropy target that requires torch >=1.12

Fixes: d6d95f3 ("Nback (PrincetonUniversity#2617)")
Closes: PrincetonUniversity#2665

[0] https://github.com/pytorch/pytorch/releases/tag/v1.12.0

Signed-off-by: Jan Vesely <[email protected]>
jvesely added a commit to jvesely/PsyNeuLink that referenced this pull request May 12, 2023
…torch<=1.11.x

d6d95f3 ("Nback (PrincetonUniversity#2617)") changed the
format of cross entropy target that requires torch >=1.12
1.12.0+ includes input handling path to consider inputs without batch
dimension and can be used directly. [0,1,2]

Fixes: d6d95f3 ("Nback (PrincetonUniversity#2617)")
Closes: PrincetonUniversity#2665

[0] https://github.com/pytorch/pytorch/releases/tag/v1.12.0
[1] pytorch/pytorch#77653
[2] pytorch/pytorch@8881d7a

Signed-off-by: Jan Vesely <[email protected]>
Sign up for free to join this conversation on GitHub. Already have an account? Sign in to comment
Labels
None yet
Projects
None yet
Development

Successfully merging this pull request may close these issues.

3 participants