diff --git a/README.md b/README.md
index 01cdfb62..e01134ae 100644
--- a/README.md
+++ b/README.md
@@ -1,31 +1,101 @@
 # pyiron_workflow
 
+[![Binder](https://mybinder.org/badge_logo.svg)](https://mybinder.org/v2/gh/pyiron/pyiron_workflow/HEAD)
+[![License](https://img.shields.io/badge/License-BSD_3--Clause-blue.svg)](https://opensource.org/licenses/BSD-3-Clause)
+[![Codacy Badge](https://app.codacy.com/project/badge/Grade/0b4c75adf30744a29de88b5959246882)](https://app.codacy.com/gh/pyiron/pyiron_workflow/dashboard?utm_source=gh&utm_medium=referral&utm_content=&utm_campaign=Badge_grade)
+[![Coverage Status](https://coveralls.io/repos/github/pyiron/pyiron_workflow/badge.svg?branch=main)](https://coveralls.io/github/pyiron/pyiron_workflow?branch=main)
+
+[//]: # ([![Documentation Status](https://readthedocs.org/projects/pyiron-workflow/badge/?version=latest)](https://pyiron-workflow.readthedocs.io/en/latest/?badge=latest))
+
+[![Anaconda](https://anaconda.org/conda-forge/pyiron_workflow/badges/version.svg)](https://anaconda.org/conda-forge/pyiron_workflow)
+[![Last Updated](https://anaconda.org/conda-forge/pyiron_workflow/badges/latest_release_date.svg)](https://anaconda.org/conda-forge/pyiron_workflow)
+[![Platform](https://anaconda.org/conda-forge/pyiron_workflow/badges/platforms.svg)](https://anaconda.org/conda-forge/pyiron_workflow)
+[![Downloads](https://anaconda.org/conda-forge/pyiron_workflow/badges/downloads.svg)](https://anaconda.org/conda-forge/pyiron_workflow)
 
 ## Overview
 
-This repository is home to the pyiron code for structuring workflows as graph objects, with different computational elements as nodes and data and execution signals travelling along edges. It is currently in an alpha state, changing quickly, and not yet feature-complete.
+`pyiron_workflow` is a framework for constructing workflows as computational graphs from simple python functions. Its objective is to make it as easy as possible to create reliable, reusable, and sharable workflows, with a special focus on research workflows for HPC environments.
+
+Nodes are formed from python functions with simple decorators, and the resulting nodes can have their data inputs and outputs connected. 
+
+By allowing (but not demanding, in the case of data DAGs) users to specify the execution flow, both cyclic and acyclic graphs are supported. 
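+
+As a point of intuition (a generic illustration only, not `pyiron_workflow`'s internals), the execution order for a data DAG can be derived from the data connections alone, e.g. with a topological sort:
+
+```python
+from graphlib import TopologicalSorter
+
+# Data connections, expressed as "node: the nodes it depends on"
+connections = {"middle": {"start"}, "end": {"middle"}}
+list(TopologicalSorter(connections).static_order())
+>>> ['start', 'middle', 'end']
+```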
+
+By scraping type hints from decorated functions, both new data values and new graph connections are (optionally) required to conform to hints, making workflows strongly typed.
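+
+Conceptually (a sketch of the idea only, not the library's implementation), the hint checking behaves like:
+
+```python
+from typing import get_type_hints
+
+def conforms(func, **kwargs):
+    # Check proposed input values against the function's scraped hints
+    hints = get_type_hints(func)
+    return all(isinstance(v, hints[k]) for k, v in kwargs.items() if k in hints)
+
+def add_one(x: int) -> int:
+    return x + 1
+
+conforms(add_one, x=1), conforms(add_one, x="one")
+>>> (True, False)
+```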
+
+Individual node computations can be shipped off to parallel processes for scalability. (This is an alpha-feature at time of writing and limited to single core parallel python processes; full support of [`pympipool`](https://github.com/pyiron/pympipool) is under active development)
+
+Once you're happy with a workflow, it can easily be turned into a macro for use in other workflows. This allows the clean construction of increasingly complex computation graphs by composing simpler graphs.
+
+Nodes (including macros) can be stored in plain text, and registered by future workflows for easy access. This encourages and supports an ecosystem of useful nodes, so you don't need to re-invent the wheel. (This is an alpha-feature, with full support of [FAIR](https://en.wikipedia.org/wiki/FAIR_data) principles for node packages planned.)
+
+## Example
+
+`pyiron_workflow` offers a single-point-of-entry in the form of the `Workflow` object, and uses decorators to make it easy to turn regular python functions into "nodes" that can be put in a computation graph.
+
+Nodes can be used by themselves and -- other than being "delayed" in that their computation needs to be requested after they're instantiated -- they feel an awful lot like the regular python functions they wrap:
+
+```python
+from pyiron_workflow import Workflow
+
+@Workflow.wrap_as.single_value_node()
+def add_one(x):
+    return x + 1
 
-## The absolute basics
+add_one(add_one(add_one(x=0)))()
+>>> 3
+```
 
-`pyiron_workflow` offers a single-point-of-entry in the form of the `Workflow` object, and uses decorators to make it easy to turn regular python functions into "nodes" that can be put in a computation graph:
+But the intent is to collect them into a workflow and leverage existing nodes:
 
 ```python
 from pyiron_workflow import Workflow
 
-@Workflow.wrap_as.function_node("sum")
-def x_plus_y(x: int = 0, y: int = 0) -> int:
-    return x + y
+@Workflow.wrap_as.single_value_node()
+def add_one(x):
+    return x + 1
+
+@Workflow.wrap_as.macro_node()
+def add_three_macro(macro):
+    macro.start = add_one()
+    macro.middle = add_one(x=macro.start)
+    macro.end = add_one(x=macro.middle)
+    macro.inputs_map = {"start__x": "x"}
+    macro.outputs_map = {"end__x + 1": "y"}
 
-wf = Workflow("my_workflow")
-wf.a1 = x_plus_y()
-wf.a2 = x_plus_y()
-wf.b = x_plus_y(x=wf.a1.outputs.sum, y=wf.a2.outputs.sum)
+Workflow.register(
+    "plotting", 
+    "pyiron_workflow.node_library.plotting"
+)
 
-out = wf(a1__x=0, a1__y=1, a2__x=2, a2__y=3)
-out.b__sum
->>> 6
+wf = Workflow("add_5_and_plot")
+wf.add_one = add_one()
+wf.add_three = add_three_macro(x=wf.add_one)
+wf.plot = wf.create.plotting.Scatter(
+    x=wf.add_one,
+    y=wf.add_three.outputs.y
+)
 
-wf.draw()
+diagram = wf.draw()
+
+import numpy as np
+fig = wf(add_one__x=np.arange(5)).plot__fig
 ```
 
-![](docs/_static/demo.png)
\ No newline at end of file
+Which gives the workflow `diagram`
+
+![](docs/_static/readme_diagram.png)
+
+And the resulting `fig`
+
+![](docs/_static/readme_shifted.png)
+
+## Installation
+
+`conda install -c conda-forge pyiron_workflow`
+
+To unlock the associated node packages and ensure that the demo notebooks run, also make sure your conda environment has the packages listed in our [notebooks dependencies](.ci_support/environment-notebooks.yml).
+
+## Learning more
+
+Check out the demo [notebooks](notebooks), read through the docstrings, and don't be scared to raise an issue on this GitHub repo!
\ No newline at end of file
diff --git a/docs/_static/demo.png b/docs/_static/demo.png
deleted file mode 100644
index 60cf5223..00000000
Binary files a/docs/_static/demo.png and /dev/null differ
diff --git a/docs/_static/readme_diagram.png b/docs/_static/readme_diagram.png
new file mode 100644
index 00000000..b5e24273
Binary files /dev/null and b/docs/_static/readme_diagram.png differ
diff --git a/docs/_static/readme_shifted.png b/docs/_static/readme_shifted.png
new file mode 100644
index 00000000..7daa81da
Binary files /dev/null and b/docs/_static/readme_shifted.png differ
diff --git a/notebooks/workflow_example.ipynb b/notebooks/deepdive.ipynb
similarity index 90%
rename from notebooks/workflow_example.ipynb
rename to notebooks/deepdive.ipynb
index 32414fef..d3549fe2 100644
--- a/notebooks/workflow_example.ipynb
+++ b/notebooks/deepdive.ipynb
@@ -5,16 +5,23 @@
    "id": "5edfe456-c5b8-4347-a74f-1fb19fdff91b",
    "metadata": {},
    "source": [
-    "# Pyiron workflows: Introduction and Syntax\n",
+    "# Pyiron workflows: A ground-up walkthrough of syntax and features\n",
     "\n",
-    "Here we will highlight:\n",
-    "- How to instantiate a node\n",
-    "- How to make reusable node classes\n",
-    "- How to connect node inputs and outputs together\n",
+    "Contents:\n",
+    "- From function to node\n",
+    "- Making reusable node classes\n",
+    "- Connecting nodes to form a graph\n",
     "- SingleValue nodes and syntactic sugar\n",
     "- Workflows: keeping your computational graphs organized\n",
-    "- Using pre-defined nodes \n",
-    "- Macro nodes"
+    "- Node packages: making nodes re-usable\n",
+    "- Macro nodes: complex computations by composing sub-graphs\n",
+    "- Dragons and the future: remote execution, cyclic flows, and more\n",
+    "\n",
+    "To jump straight to how to use `pyiron_workflow`, go look at the quickstart guide -- this jumps straight to using `Workflow` as a single-point-of-access, creating nodes with decorators, and leveraging node packages to form complex graphs.\n",
+    "\n",
+    "Here we start from the ground up and do \"silly\" things like importing _just_ the `Function` class directly from the `pyiron_workflow` package. This isn't meant to show practical, recommended usage, but rather is intended as a pedagogical deep-dive that builds knowledge from the ground up. While the quickstart is aimed at users who just want to get running, this is intended for people who want to develop nodes for others to use, or for people who are stuck or seeing unexpected behaviour and want a better understanding of what `pyiron_workflow` is doing under the hood.\n",
+    "\n",
+    "The next recommendation is to simply read the class and method docstrings directly!"
    ]
   },
   {
@@ -24,7 +31,7 @@
    "source": [
     "## Instantiating a node\n",
     "\n",
-    "Simple nodes can be defined on-the-fly by passing any callable to the `Function(Node)` class. This transforms the function into a node instance which has input and output, can be connected to other nodes in a workflow, and can run the function it stores.\n",
+    "Simple nodes can be defined on-the-fly by passing any callable to a special `Node` class, `Function(Node)`, which transforms the function into a class. Instances of this class have input and output, can be connected to other nodes in a workflow, and can run the function they store.\n",
     "\n",
     "Input and output channels are _automatically_ extracted from the signature and return value(s) of the function. (Note: \"Nodized\" functions must have _at most_ one `return` expression!)"
    ]
@@ -238,12 +245,41 @@
    "id": "58ed9b25-6dde-488d-9582-d49d405793c6",
    "metadata": {},
    "source": [
-    "This node also exploits type hinting! New values are checked against the node type hint, so trying to assign an incommensurate value will raise an error:"
+    "This node also exploits type hinting! Like the variable names, these hints get scraped automatically and added to the channels:"
    ]
   },
   {
    "cell_type": "code",
    "execution_count": 10,
+   "id": "09eee102-f8f1-4d2d-806f-01254c2483dc",
+   "metadata": {},
+   "outputs": [
+    {
+     "data": {
+      "text/plain": [
+       "(int, int, int)"
+      ]
+     },
+     "execution_count": 10,
+     "metadata": {},
+     "output_type": "execute_result"
+    }
+   ],
+   "source": [
+    "adder_node.inputs.x.type_hint, adder_node.inputs.y.type_hint, adder_node.outputs.sum_.type_hint"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "8382ba4d-9bf7-4057-8da8-0ee411d95c18",
+   "metadata": {},
+   "source": [
+    "New values are checked against the node type hint, so trying to assign an incommensurate value will raise an error:"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 11,
    "id": "ac0fe993-6c82-48c8-a780-cbd0c97fc386",
    "metadata": {},
    "outputs": [],
@@ -261,7 +297,7 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 11,
+   "execution_count": 12,
    "id": "a63b2cc0-9030-45ad-8d37-d11e16e61369",
    "metadata": {},
    "outputs": [],
@@ -272,7 +308,17 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 12,
+   "execution_count": 13,
+   "id": "3ed72790-899b-408b-bdfd-ebcf03f4c7e3",
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "# adder_node.failed = False  # Reset if you force-failed by messing with the private value"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 14,
    "id": "15742a49-4c23-4d4a-84d9-9bf19677544c",
    "metadata": {},
    "outputs": [
@@ -282,7 +328,7 @@
        "3"
       ]
      },
-     "execution_count": 12,
+     "execution_count": 14,
      "metadata": {},
      "output_type": "execute_result"
     }
@@ -302,7 +348,7 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 13,
+   "execution_count": 15,
    "id": "0c8f09a7-67c4-4c6c-a021-e3fea1a16576",
    "metadata": {},
    "outputs": [
@@ -312,7 +358,7 @@
        "30"
       ]
      },
-     "execution_count": 13,
+     "execution_count": 15,
      "metadata": {},
      "output_type": "execute_result"
     }
@@ -332,7 +378,7 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 14,
+   "execution_count": 16,
    "id": "69b59737-9e09-4b4b-a0e2-76a09de02c08",
    "metadata": {},
    "outputs": [
@@ -342,7 +388,7 @@
        "31"
       ]
      },
-     "execution_count": 14,
+     "execution_count": 16,
      "metadata": {},
      "output_type": "execute_result"
     }
@@ -356,7 +402,7 @@
    "id": "f233f3f7-9576-4400-8e92-a1f6109d7f9b",
    "metadata": {},
    "source": [
-    "Note for advanced users: when the node has an executor set, running returns a futures object for the calculation, whose `.result()` will eventually be the function output."
+    "Note for advanced users: when the node has an executor set, running returns a futures object for the calculation, whose `.result()` will eventually be the function output. This result object can also be accessed on the node's `.result` attribute as long as it's running."
    ]
   },
   {
@@ -368,14 +414,14 @@
     "\n",
     "If we're going to use a node many times, we may want to define a new sub-class of `Function` to handle this.\n",
     "\n",
-    "The can be done directly by inheriting from `Function` and overriding it's `__init__` function so that the core functionality of the node (i.e. the node function and output labels) are set in stone, but even easier is to use the `function_node` decorator to do this for you! \n",
+    "This can be done directly by inheriting from `Function` and overriding its `__init__` method and/or directly defining the `node_function` property so that the core functionality of the node (i.e. the node function and output labels) is set in stone, but even easier is to use the `function_node` decorator to do this for you! \n",
     "\n",
     "The decorator also lets us explicitly choose the names of our output channels by passing the `output_labels` argument to the decorator -- as a string to create a single channel for the returned values, or as a list of strings equal to the number of returned values in a returned tuple."
    ]
   },
   {
    "cell_type": "code",
-   "execution_count": 15,
+   "execution_count": 17,
    "id": "61b43a9b-8dad-48b7-9194-2045e465793b",
    "metadata": {},
    "outputs": [],
@@ -385,7 +431,7 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 16,
+   "execution_count": 18,
    "id": "647360a9-c971-4272-995c-aa01e5f5bb83",
    "metadata": {},
    "outputs": [
@@ -422,7 +468,7 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 17,
+   "execution_count": 19,
    "id": "b8c845b7-7088-43d7-b106-7a6ba1c571ec",
    "metadata": {},
    "outputs": [
@@ -455,11 +501,11 @@
     "\n",
     "Multiple nodes can be used together to build a computational graph, with each node performing a particular operation in the overall workflow.\n",
     "\n",
-    "The input and output of nodes can be chained together by connecting their data channels.\n",
+    "The input and output of nodes can be chained together by connecting their data channels to form a data graph.\n",
     "\n",
-    "The flow of execution can be manually configured by using other \"signal\" channels. However, for acyclic graphs (DAGs), execution flow can be automatically determined from the topology of the data connections.\n",
+    "The flow of execution can be manually configured by using other \"signal\" channels to form an execution graph. However, for data graphs that are a directed acyclic graph (DAG), the execution flow can be automatically determined from the topology of the data connections!\n",
     "\n",
-    "The `run` command we saw above has several boolean flags for controlling the style of execution. The two main run modes are with a \"pull\" paradigm, where everything upstream is run first then the node invoking `pull` gets run; and with a \"push\" paradigm (the default for `run`), where the node invoking `run` gets run and then runs everything downstream. Calling an instantiated node runs a particularly aggressive version of `pull`.\n",
+    "The `run` command we saw above has several boolean flags for controlling the style of execution. The two main run modes are with a \"pull\" paradigm, where everything upstream on the graph of data connections is run first, then the node invoking `pull` gets run; and with a \"push\" paradigm (the default for `run`), where the node invoking `run` gets run and then runs everything downstream on the execution graph. Calling an instantiated node runs a particularly aggressive version of `pull`.\n",
     "\n",
     "We'll talk more about grouping nodes together inside a `Workflow` object, but without a parent workflow, only the `pull` method will automate execution signals; trying to push data downstream using `run` requires specifying the execution flow manually.\n",
     "\n",
@@ -468,7 +514,7 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 18,
+   "execution_count": 20,
    "id": "2e418abf-7059-4e1e-9b9f-b3dc0a4b5e35",
    "metadata": {
     "tags": []
@@ -478,8 +524,6 @@
      "name": "stderr",
      "output_type": "stream",
      "text": [
-      "/Users/huber/work/pyiron/pyiron_workflow/pyiron_workflow/channels.py:158: UserWarning: The channel ran was not connected to run, andthus could not disconnect from it.\n",
-      "  warn(\n",
       "/Users/huber/work/pyiron/pyiron_workflow/pyiron_workflow/channels.py:158: UserWarning: The channel run was not connected to ran, andthus could not disconnect from it.\n",
       "  warn(\n"
      ]
@@ -490,7 +534,7 @@
        "2"
       ]
      },
-     "execution_count": 18,
+     "execution_count": 20,
      "metadata": {},
      "output_type": "execute_result"
     }
@@ -524,7 +568,7 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 19,
+   "execution_count": 21,
    "id": "f3b0b700-683e-43cb-b374-48735e413bc9",
    "metadata": {},
    "outputs": [
@@ -534,7 +578,7 @@
        "4"
       ]
      },
-     "execution_count": 19,
+     "execution_count": 21,
      "metadata": {},
      "output_type": "execute_result"
     }
@@ -560,7 +604,7 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 20,
+   "execution_count": 22,
    "id": "59c29856-c77e-48a1-9f17-15d4c58be588",
    "metadata": {},
    "outputs": [
@@ -596,7 +640,7 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 21,
+   "execution_count": 23,
    "id": "1a4e9693-0980-4435-aecc-3331d8b608dd",
    "metadata": {},
    "outputs": [],
@@ -608,7 +652,7 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 22,
+   "execution_count": 24,
    "id": "7c4d314b-33bb-4a67-bfb9-ed77fba3949c",
    "metadata": {},
    "outputs": [
@@ -647,7 +691,7 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 23,
+   "execution_count": 25,
    "id": "61ae572f-197b-4a60-8d3e-e19c1b9cc6e2",
    "metadata": {},
    "outputs": [
@@ -657,7 +701,7 @@
        "4"
       ]
      },
-     "execution_count": 23,
+     "execution_count": 25,
      "metadata": {},
      "output_type": "execute_result"
     }
@@ -688,7 +732,7 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 24,
+   "execution_count": 26,
    "id": "6569014a-815b-46dd-8b47-4e1cd4584b3b",
    "metadata": {},
    "outputs": [
@@ -702,7 +746,7 @@
     },
     {
      "data": {
-      "image/png": "iVBORw0KGgoAAAANSUhEUgAAAi4AAAGdCAYAAAA1/PiZAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjguMCwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy81sbWrAAAACXBIWXMAAA9hAAAPYQGoP6dpAAAjyklEQVR4nO3df0xUV/7/8dcAwlhXplErjMpadLWVklqBYME1zfZT8UdDvybdSNNVW9c2pT/WKmu3um6kuE1Iu1mztVvoL23TaLukP+xqwlL5Y6v4Y9cVoanFpI2yxR9DCZgO9Ada4Xz/8AOfjkDLHRnGM/N8JPePOZ7LvOde4b44596DyxhjBAAAYIGYcBcAAAAwWAQXAABgDYILAACwBsEFAABYg+ACAACsQXABAADWILgAAABrEFwAAIA14sJdwGB0d3fr7NmzGj16tFwuV7jLAQAAg2CMUUdHhyZMmKCYmKEZK7EiuJw9e1YpKSnhLgMAAATh1KlTmjRp0pB8LSuCy+jRoyVd+uCJiYlhrgYAAAxGe3u7UlJSeq/jQ8GK4NIzPZSYmEhwAQDAMkN5mwc35wIAAGsQXAAAgDUILgAAwBoEFwAAYA2CCwAAsAbBBQAAWIPgAgAArEFwAQAA1nAcXPbt26f8/HxNmDBBLpdL77///o/us3fvXmVmZsrtdmvKlCl68cUXg6kVAHp1dRsdOtGmv9ef0aETberqNuEuCcAwcLxy7tdff62ZM2dqxYoVuvvuu3+0f2NjoxYtWqQHH3xQ27dv14EDB/TII4/ouuuuG9T+AHC5qmM+lexukM/f2dvm9bhVnJ+mBeneMFYGINRcxpigf01xuVzauXOnFi9ePGCfJ598Urt27dLx48d72woLC/XRRx/p0KFDg3qf9vZ2eTwe+f1+lvwHolzVMZ8e3n5Ul//g6llQvHxpBuEFuEqE4vod8ntcDh06pLy8vIC2+fPn68iRI/ruu+9C/fYAIkhXt1HJ7oY+oUVSb1vJ7gamjYAIFvLg0tzcrKSkpIC2pKQkXbx4Ua2trf3uc/78ebW3twdsAHC48VzA9NDljCSfv1OHG88NX1EAhtWwPFV0+V+F7JmdGuivRZaWlsrj8fRuKSkpIa8RwNWvpWPg0BJMPwD2CXlwSU5OVnNzc0BbS0uL4uLiNHbs2H73Wb9+vfx+f+926tSpUJcJwALjR7uHtB8A+zh+qsipnJwc7d69O6Btz549ysrK0ogRI/rdJyEhQQkJCaEuDYBlslPHyOtxq9nf2e99Li5JyR63slPHDHdpAIaJ4xGXr776SvX19aqvr5d06XHn+vp6NTU1Sbo0WrJ8+fLe/oWFhfr8889VVFSk48ePa9u2bdq6davWrl07NJ8AQNSIjXGpOD9N0v89RdSj53VxfppiY/qfhgZgP8fB5ciRI5o1a5ZmzZolSSoqKtKsWbO0ceNGSZLP5+sNMZKUmpqqyspKffjhh7rlllv0xz/+UVu2bGENFwBBWZDuVfnSDCV7AqeDkj1uHoUGosAVreMyXFjHBcDlurqNDjeeU0tHp8aPvjQ9xEgLcHUJxfU75Pe4AEAoxMa4lDO1/xv8AUQu/sgiAACwBsEFAABYg+ACAACsQXABAADWILgAAABrEFwAAIA1CC4AAMAaBBcAAGANggsAALAGwQUAAFiD4AIAAKxBcAEAANYguAAAAGsQXAAAgDUILgAAwBoEFwAAYA2CCwAAsAbBBQAAWIPgAgAArEFwAQAA1iC4AAAAaxBcAACANQguAADAGgQXAABgDYILAACwBsEFAABYg+ACAACsQXABAADWILgAAABrEFwAAIA1CC4AAMAaBBcAAGANggsAALAGwQUAAFiD4AIAAKxBcAEAANYguAAAAGsQXAAAgDUILgAAwBoEFwAAYA2CCwAAsAbBBQAAWIPgAgAArEFwAQAA1iC4AAAAaxBcAACANQguAADAGgQXAABgDYILAACwBsE
FAABYg+ACAACsQXABAADWILgAAABrEFwAAIA1CC4AAMAaBBcAAGANggsAALAGwQUAAFgjqOBSVlam1NRUud1uZWZmqqam5gf779ixQzNnztQ111wjr9erFStWqK2tLaiCAQBA9HIcXCoqKrR69Wpt2LBBdXV1mjt3rhYuXKimpqZ+++/fv1/Lly/XypUr9cknn+jtt9/Wf/7zHz3wwANXXDwAAIgujoPL5s2btXLlSj3wwAOaMWOG/vKXvyglJUXl5eX99v/Xv/6l66+/XqtWrVJqaqp+/vOf66GHHtKRI0euuHgAABBdHAWXCxcuqLa2Vnl5eQHteXl5OnjwYL/75Obm6vTp06qsrJQxRl988YXeeecd3XnnnQO+z/nz59Xe3h6wAQAAOAoura2t6urqUlJSUkB7UlKSmpub+90nNzdXO3bsUEFBgeLj45WcnKxrr71Wzz///IDvU1paKo/H07ulpKQ4KRMAAESooG7OdblcAa+NMX3aejQ0NGjVqlXauHGjamtrVVVVpcbGRhUWFg749devXy+/39+7nTp1KpgyAQBAhIlz0nncuHGKjY3tM7rS0tLSZxSmR2lpqebMmaMnnnhCknTzzTdr1KhRmjt3rp5++ml5vd4++yQkJCghIcFJaQAAIAo4GnGJj49XZmamqqurA9qrq6uVm5vb7z7ffPONYmIC3yY2NlbSpZEaAACAwXI04iJJRUVFWrZsmbKyspSTk6OXX35ZTU1NvVM/69ev15kzZ/TGG29IkvLz8/Xggw+qvLxc8+fPl8/n0+rVq5Wdna0JEyYM7acBgEHo6jY63HhOLR2dGj/arezUMYqN6X+6G8DVxXFwKSgoUFtbmzZt2iSfz6f09HRVVlZq8uTJkiSfzxewpsv999+vjo4O/fWvf9Vvf/tbXXvttbr99tv1zDPPDN2nAIBBqjrmU8nuBvn8nb1tXo9bxflpWpDed+oawNXFZSyYr2lvb5fH45Hf71diYmK4ywFgqapjPj28/agu/6HXM9ZSvjSD8AIMoVBcv/lbRQCiQle3Ucnuhj6hRVJvW8nuBnV1X/W/ywFRjeACICocbjwXMD10OSPJ5+/U4cZzw1cUAMcILgCiQkvHwKElmH4AwoPgAiAqjB/tHtJ+AMKD4AIgKmSnjpHX49ZADz27dOnpouzUMcNZFgCHCC4AokJsjEvF+WmS1Ce89Lwuzk9jPRfgKkdwARA1FqR7Vb40Q8mewOmgZI+bR6EBSzhegA4AbLYg3at5acmsnAtYiuACIOrExriUM3VsuMsAEASmigAAgDUILgAAwBoEFwAAYA2CCwAAsAbBBQAAWIPgAgAArEFwAQAA1iC4AAAAaxBcAACANQguAADAGgQXAABgDYILAACwBsEFAABYg+ACAACsQXABAADWILgAAABrEFwAAIA1CC4AAMAaBBcAAGANggsAALAGwQUAAFiD4AIAAKxBcAEAANYguAAAAGsQXAAAgDUILgAAwBoEFwAAYA2CCwAAsAbBBQAAWIPgAgAArEFwAQAA1iC4AAAAaxBcAACANQguAADAGgQXAABgDYILAACwBsEFAABYg+ACAACsQXABAADWILgAAABrEFwAAIA1CC4AAMAaBBcAAGANggsAALAGwQUAAFiD4AIAAKxBcAEAANYguAAAAGsQXAAAgDWCCi5lZWVKTU2V2+1WZmamampqfrD/+fPntWHDBk2ePFkJCQmaOnWqtm3bFlTBAAAgesU53aGiokKrV69WWVmZ5syZo5deekkLFy5UQ0ODfvrTn/a7z5IlS/TFF19o69at+tnPfqaWlhZdvHjxiosHAADRxWWMMU52mD17tjIyMlReXt7bNmPGDC1evFilpaV9+ldVVemee+7RyZMnNWbMmKCKbG9vl8fjkd/vV2JiYlBfAwAADK9QXL8dTRVduHBBtbW1ysvLC2jPy8vTwYMH+91n165dysrK0rPPPquJEydq+vTpWrt2rb799tsB3+f8+fN
qb28P2AAAABxNFbW2tqqrq0tJSUkB7UlJSWpubu53n5MnT2r//v1yu93auXOnWltb9cgjj+jcuXMD3udSWlqqkpISJ6UBAIAoENTNuS6XK+C1MaZPW4/u7m65XC7t2LFD2dnZWrRokTZv3qzXX399wFGX9evXy+/3926nTp0KpkwAABBhHI24jBs3TrGxsX1GV1paWvqMwvTwer2aOHGiPB5Pb9uMGTNkjNHp06c1bdq0PvskJCQoISHBSWkAACAKOBpxiY+PV2ZmpqqrqwPaq6urlZub2+8+c+bM0dmzZ/XVV1/1tn366aeKiYnRpEmTgigZAABEK8dTRUVFRXr11Ve1bds2HT9+XGvWrFFTU5MKCwslXZrmWb58eW//e++9V2PHjtWKFSvU0NCgffv26YknntCvf/1rjRw5cug+CTBIXd1Gh0606e/1Z3ToRJu6uh09WAcACCPH67gUFBSora1NmzZtks/nU3p6uiorKzV58mRJks/nU1NTU2//n/zkJ6qurtZvfvMbZWVlaezYsVqyZImefvrpofsUwCBVHfOpZHeDfP7O3javx63i/DQtSPeGsTIAwGA4XsclHFjHBUOh6phPD28/qsv/w/fcVl6+NIPwAgBDKOzruAC26uo2Ktnd0Ce0SOptK9ndwLQRAFzlCC6ICocbzwVMD13OSPL5O3W48dzwFQUAcIzggqjQ0jFwaAmmHwAgPAguiArjR7uHtB8AIDwILogK2alj5PW41f/6zpdu0PV63MpODe4PgQIAhgfBBVEhNsal4vw0SeoTXnpeF+enKTZmoGgDALgaEFwQNRake1W+NEPJnsDpoGSPm0ehAcASjhegA2y2IN2reWnJOtx4Ti0dnRo/+tL0ECMtAGAHgguiTmyMSzlTx4a7DABAEJgqAgAA1iC4AAAAaxBcAACANQguAADAGgQXAABgDYILAACwBsEFAABYg+ACAACsQXABAADWILgAAABrEFwAAIA1CC4AAMAaBBcAAGANggsAALAGwQUAAFiD4AIAAKxBcAEAANYguAAAAGsQXAAAgDUILgAAwBoEFwAAYA2CCwAAsAbBBQAAWIPgAgAArEFwAQAA1iC4AAAAaxBcAACANQguAADAGgQXAABgjbhwF4CrQ1e30eHGc2rp6NT40W5lp45RbIwr3GUBABCA4AJVHfOpZHeDfP7O3javx63i/DQtSPeGsTIAAAIxVRTlqo759PD2owGhRZKa/Z16ePtRVR3zhakyAAD6IrhEsa5uo5LdDTL9/FtPW8nuBnV199cDAIDhR3CJYocbz/UZafk+I8nn79ThxnPDVxQAAD+A4BLFWjoGDi3B9AMAINQILlFs/Gj3kPYDACDUCC5RLDt1jLwetwZ66NmlS08XZaeOGc6yAAAYEMElisXGuFScnyZJfcJLz+vi/DTWcwEAXDUILlFuQbpX5UszlOwJnA5K9rhVvjSDdVwAAFcVFqCDFqR7NS8tmZVzAQBXPYILJF2aNsqZOjbcZQAA8IOYKgIAANYguAAAAGsQXAAAgDUILgAAwBoEFwAAYA2CCwAAsAbBBQAAWIPgAgAArBFUcCkrK1NqaqrcbrcyMzNVU1MzqP0OHDiguLg43XLLLcG8LQAAiHKOg0tFRYVWr16tDRs2qK6uTnPnztXChQvV1NT0g/v5/X4tX75c//M//xN0sQAAILq5jDHGyQ6zZ89WRkaGysvLe9tmzJihxYsXq7S0dMD97rnnHk2bNk2xsbF6//33VV9fP+j3bG9vl8fjkd/vV2JiopNyAQBAmITi+u1oxOXChQuqra1VXl5eQHteXp4OHjw44H6vvfaaTpw4oeLi4kG9z/nz59Xe3h6wAQAAOAoura2t6urqUlJSUkB7UlKSmpub+93ns88+07p167Rjxw7FxQ3ubzqWlpbK4/H0bikpKU7KBAAAESqom3NdLlfAa2NMnzZJ6urq0r333quSkhJNnz590F9//fr18vv9vdupU6eCKRM
AAESYwQ2B/K9x48YpNja2z+hKS0tLn1EYSero6NCRI0dUV1enxx57TJLU3d0tY4zi4uK0Z88e3X777X32S0hIUEJCgpPSAABAFHA04hIfH6/MzExVV1cHtFdXVys3N7dP/8TERH388ceqr6/v3QoLC3XDDTeovr5es2fPvrLqAQBAVHE04iJJRUVFWrZsmbKyspSTk6OXX35ZTU1NKiwslHRpmufMmTN64403FBMTo/T09ID9x48fL7fb3acdAADgxzgOLgUFBWpra9OmTZvk8/mUnp6uyspKTZ48WZLk8/l+dE0XAACAYDhexyUcWMcFAAD7hH0dFwAAgHAiuAAAAGsQXAAAgDUILgAAwBoEFwAAYA2CCwAAsAbBBQAAWIPgAgAArEFwAQAA1iC4AAAAaxBcAACANQguAADAGgQXAABgDYILAACwBsEFAABYg+ACAACsQXABAADWILgAAABrEFwAAIA1CC4AAMAaBBcAAGANggsAALBGXLgLAAD0r6vb6HDjObV0dGr8aLeyU8coNsYV7rKAsCK4AMBVqOqYTyW7G+Tzd/a2eT1uFeenaUG6N4yVAeHFVBEAXGWqjvn08PajAaFFkpr9nXp4+1FVHfOFqTIg/AguIdbVbXToRJv+Xn9Gh060qavbhLskAFexrm6jkt0N6u8nRU9bye4GfpYgajFVFEIM9QJw6nDjuT4jLd9nJPn8nTrceE45U8cOX2HAVYIRlxBhqBdAMFo6Bg4twfQDIg3BJQQY6gUQrPGj3UPaD4g0BJcQcDLUCwDfl506Rl6PWwM99OzSpSnn7NQxw1kWcNUguIQAQ70AghUb41Jxfpok9QkvPa+L89NYzwVRi+ASAgz1ArgSC9K9Kl+aoWRP4M+IZI9b5UszuLkfUY2nikKgZ6i32d/Z730uLl36AcRQL4CBLEj3al5aMivnApchuIRAz1Dvw9uPyiUFhBeGegEMVmyMi0eegcswVRQiDPUCADD0GHEJIYZ6AQAYWgSXEGOoFwCAocNUEQAAsAbBBQAAWIPgAgAArEFwAQAA1iC4AAAAaxBcAACANQguAADAGgQXAABgDYILAACwBsEFAABYg+ACAACsQXABAADWILgAAABrEFwAAIA1CC4AAMAaBBcAAGANggsAALAGwQUAAFiD4AIAAKxBcAEAANYguAAAAGsEFVzKysqUmpoqt9utzMxM1dTUDNj3vffe07x583TdddcpMTFROTk5+uCDD4IuGAAARC/HwaWiokKrV6/Whg0bVFdXp7lz52rhwoVqamrqt/++ffs0b948VVZWqra2Vr/4xS+Un5+vurq6Ky4eAABEF5cxxjjZYfbs2crIyFB5eXlv24wZM7R48WKVlpYO6mvcdNNNKigo0MaNGwfVv729XR6PR36/X4mJiU7KBQAAYRKK67ejEZcLFy6otrZWeXl5Ae15eXk6ePDgoL5Gd3e3Ojo6NGbMmAH7nD9/Xu3t7QEbAACAo+DS2tqqrq4uJSUlBbQnJSWpubl5UF/jz3/+s77++mstWbJkwD6lpaXyeDy9W0pKipMyAQBAhArq5lyXyxXw2hjTp60/b731lp566ilVVFRo/PjxA/Zbv369/H5/73bq1KlgygQAABEmzknncePGKTY2ts/oSktLS59RmMtVVFRo5cqVevvtt3XHHXf8YN+EhAQlJCQ4KQ0AAEQBRyMu8fHxyszMVHV1dUB7dXW1cnNzB9zvrbfe0v33368333xTd955Z3CVAgCAqOdoxEWSioqKtGzZMmVlZSknJ0cvv/yympqaVFhYKOnSNM+ZM2f0xhtvSLoUWpYvX67nnntOt956a+9ozciRI+XxeIbwowAAgEjnOLgUFBSora1NmzZtks/nU3p6uiorKzV58mRJks/nC1jT5aWXXtLFixf16KOP6tFHH+1tv++++/T6669f+ScAAABRw/E6LuHAOi4AANgn7Ou4AAAAhBPBBQAAWIPgAgAArEFwAQAA1iC4AAA
AaxBcAACANQguAADAGgQXAABgDYILAACwBsEFAABYg+ACAACsQXABAADWILgAAABrEFwAAIA1CC4AAMAaBBcAAGANggsAALAGwQUAAFiD4AIAAKxBcAEAANYguAAAAGsQXAAAgDUILgAAwBoEFwAAYI24cBcAwE5d3UaHG8+ppaNT40e7lZ06RrExrnCXBSDCEVwAOFZ1zKeS3Q3y+Tt727wet4rz07Qg3RvGygBEOqaKADhSdcynh7cfDQgtktTs79TD24+q6pgvTJUBiAYEFwCD1tVtVLK7Qaaff+tpK9ndoK7u/noAwJUjuAAYtMON5/qMtHyfkeTzd+pw47nhKwpAVCG4ABi0lo6BQ0sw/QDAKYILgEEbP9o9pP0AwCmCC4BBy04dI6/HrYEeenbp0tNF2aljhrMsAFGE4AJg0GJjXCrOT5OkPuGl53VxfhrruQCW6Oo2OnSiTX+vP6NDJ9qsuLGedVwAOLIg3avypRl91nFJZh0XwCq2rsfkMsZc9fGqvb1dHo9Hfr9fiYmJ4S4HgFg5F7BZz3pMlweAnu/g8qUZQxJeQnH9ZsQFQFBiY1zKmTo23GUAcOjH1mNy6dJ6TPPSkq/KX0a4xwUAgChi+3pMBBcAAKKI7esxEVwAAIgitq/HRHABACCK2L4eE8EFAIAoYvt6TAQXAACiTM96TMmewOmgZI97yB6FDhUehwYAIAotSPdqXlqydesxRW1wYfEsAEC0s3E9pqgMLrYucwwAQLSLuntcepY5vnzxnWZ/px7eflRVx3xhqgwAAPyYqAouP7bMsXRpmWMb/jomAADRKKqCi+3LHAMAEO2i6h4X25c5BiIZN8wDGIyoCi62L3MMRCpumAcwWFE1VWT7MsdAJOKGeQBORFVwsX2ZYyDScMM8AKeiKrhIdi9zDEQabpgH4FRU3ePSw9ZljoFIww3zAJyKyuAi2bnMMRBpuGEegFNRN1UE4OrBDfMAnCK4AAgbbpgH4BTBBUBYccM8ACeCCi5lZWVKTU2V2+1WZmamampqfrD/3r17lZmZKbfbrSlTpujFF18MqlgAkWlBulf7n7xdbz14q5675xa99eCt2v/k7YQWAH04Di4VFRVavXq1NmzYoLq6Os2dO1cLFy5UU1NTv/0bGxu1aNEizZ07V3V1dfr973+vVatW6d13373i4gFEjp4b5v/fLROVM3Us00MA+uUyxjha2Wn27NnKyMhQeXl5b9uMGTO0ePFilZaW9un/5JNPateuXTp+/HhvW2FhoT766CMdOnRoUO/Z3t4uj8cjv9+vxMREJ+UCAIAwCcX129GIy4ULF1RbW6u8vLyA9ry8PB08eLDffQ4dOtSn//z583XkyBF99913/e5z/vx5tbe3B2wAAACOgktra6u6urqUlJQU0J6UlKTm5uZ+92lubu63/8WLF9Xa2trvPqWlpfJ4PL1bSkqKkzIBAECECurmXJcrcO7ZGNOn7cf699feY/369fL7/b3bqVOngikTAABEGEcr544bN06xsbF9RldaWlr6jKr0SE5O7rd/XFycxo7tf+XahIQEJSQkOCkNAABEAUcjLvHx8crMzFR1dXVAe3V1tXJzc/vdJycnp0//PXv2KCsrSyNGjHBYLgAAiGaOp4qKior06quvatu2bTp+/LjWrFmjpqYmFRYWSro0zbN8+fLe/oWFhfr8889VVFSk48ePa9u2bdq6davWrl07dJ8CAABEBcd/ZLGgoEBtbW3atGmTfD6f0tPTVVlZqcmTJ0uSfD5fwJouqampqqys1Jo1a/TCCy9owoQJ2rJli+6+++6h+xQAACAqOF7HJRxYxwUAAPuE4vrteMQlHHqyFeu5AABgj57r9lCOkVgRXDo6OiSJ9VwAALBQR0eHPB7PkHwtK6aKuru7dfbsWY0ePfoH14vp0d7erpSUFJ06dYqppTDg+Icf5yD8OAfhxfEPv55z0NDQoBtuuEExMUEtHdeHFSMuMTExmjRpkuP9EhM
T+Q8bRhz/8OMchB/nILw4/uE3ceLEIQstUpAr5wIAAIQDwQUAAFgjIoNLQkKCiouL+bMBYcLxDz/OQfhxDsKL4x9+oToHVtycCwAAIEXoiAsAAIhMBBcAAGANggsAALAGwQUAAFjD2uBSVlam1NRUud1uZWZmqqam5gf77927V5mZmXK73ZoyZYpefPHFYao0Mjk5/u+9957mzZun6667TomJicrJydEHH3wwjNVGJqffAz0OHDiguLg43XLLLaEtMAo4PQfnz5/Xhg0bNHnyZCUkJGjq1Knatm3bMFUbeZwe/x07dmjmzJm65ppr5PV6tWLFCrW1tQ1TtZFn3759ys/P14QJE+RyufT+++//6D5Dci02Fvrb3/5mRowYYV555RXT0NBgHn/8cTNq1Cjz+eef99v/5MmT5pprrjGPP/64aWhoMK+88ooZMWKEeeedd4a58sjg9Pg//vjj5plnnjGHDx82n376qVm/fr0ZMWKEOXr06DBXHjmcnoMeX375pZkyZYrJy8szM2fOHJ5iI1Qw5+Cuu+4ys2fPNtXV1aaxsdH8+9//NgcOHBjGqiOH0+NfU1NjYmJizHPPPWdOnjxpampqzE033WQWL148zJVHjsrKSrNhwwbz7rvvGklm586dP9h/qK7FVgaX7OxsU1hYGNB24403mnXr1vXb/3e/+5258cYbA9oeeughc+utt4asxkjm9Pj3Jy0tzZSUlAx1aVEj2HNQUFBg/vCHP5ji4mKCyxVyeg7+8Y9/GI/HY9ra2oajvIjn9Pj/6U9/MlOmTAlo27Jli5k0aVLIaowmgwkuQ3Uttm6q6MKFC6qtrVVeXl5Ae15eng4ePNjvPocOHerTf/78+Tpy5Ii+++67kNUaiYI5/pfr7u5WR0eHxowZE4oSI16w5+C1117TiRMnVFxcHOoSI14w52DXrl3KysrSs88+q4kTJ2r69Olau3atvv322+EoOaIEc/xzc3N1+vRpVVZWyhijL774Qu+8847uvPPO4SgZGrprsRV/ZPH7Wltb1dXVpaSkpID2pKQkNTc397tPc3Nzv/0vXryo1tZWeb3ekNUbaYI5/pf785//rK+//lpLliwJRYkRL5hz8Nlnn2ndunWqqalRXJx13/ZXnWDOwcmTJ7V//3653W7t3LlTra2teuSRR3Tu3Dnuc3EomOOfm5urHTt2qKCgQJ2dnbp48aLuuusuPf/888NRMjR012LrRlx6uFyugNfGmD5tP9a/v3YMjtPj3+Ott97SU089pYqKCo0fPz5U5UWFwZ6Drq4u3XvvvSopKdH06dOHq7yo4OT7oLu7Wy6XSzt27FB2drYWLVqkzZs36/XXX2fUJUhOjn9DQ4NWrVqljRs3qra2VlVVVWpsbFRhYeFwlIr/NRTXYut+9Ro3bpxiY2P7pOqWlpY+Sa5HcnJyv/3j4uI0duzYkNUaiYI5/j0qKiq0cuVKvf3227rjjjtCWWZEc3oOOjo6dOTIEdXV1emxxx6TdOkiaoxRXFyc9uzZo9tvv31Yao8UwXwfeL1eTZw4UR6Pp7dtxowZMsbo9OnTmjZtWkhrjiTBHP/S0lLNmTNHTzzxhCTp5ptv1qhRozR37lw9/fTTjLwPg6G6Fls34hIfH6/MzExVV1cHtFdXVys3N7fffXJycvr037Nnj7KysjRixIiQ1RqJgjn+0qWRlvvvv19vvvkmc8pXyOk5SExM1Mcff6z6+vrerbCwUDfccIPq6+s1e/bs4So9YgTzfTBnzhydPXtWX331VW/bp59+qpiYGE2aNCmk9UaaYI7/N998o5iYwEtebGyspP/7rR+hNWTXYke38l4leh6D27p1q2loaDCrV682o0aNMv/973+NMcasW7fOLFu2rLd/zyNYa9asMQ0NDWbr1q08Dn0FnB7/N99808TFxZkXXnjB+Hy+3u3LL78M10ewntNzcDmeKrpyTs9BR0eHmTRpkvnlL39pPvnkE7N3714zbdo088ADD4T
rI1jN6fF/7bXXTFxcnCkrKzMnTpww+/fvN1lZWSY7OztcH8F6HR0dpq6uztTV1RlJZvPmzaaurq73kfRQXYutDC7GGPPCCy+YyZMnm/j4eJORkWH27t3b+2/33Xefue222wL6f/jhh2bWrFkmPj7eXH/99aa8vHyYK44sTo7/bbfdZiT12e67777hLzyCOP0e+D6Cy9Bweg6OHz9u7rjjDjNy5EgzadIkU1RUZL755pthrjpyOD3+W7ZsMWlpaWbkyJHG6/WaX/3qV+b06dPDXHXk+Oc///mDP9tDdS12GcMYGQAAsIN197gAAIDoRXABAADWILgAAABrEFwAAIA1CC4AAMAaBBcAAGANggsAALAGwQUAAFiD4AIAAKxBcAEAANYguAAAAGsQXAAAgDX+P0x3cfh7nhBhAAAAAElFTkSuQmCC",
+      "image/png": "iVBORw0KGgoAAAANSUhEUgAAAiMAAAGdCAYAAADAAnMpAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjguMCwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy81sbWrAAAACXBIWXMAAA9hAAAPYQGoP6dpAAAn/UlEQVR4nO3df1Tb133/8ZcQBrme0Q52DEqMqeIlDpg1CeJAwPOyNTG100O/Pt0W2sx20tpbcJM2mJPua+adEnx6Dm2Xek62QOLETo7nH2Ork6w+ozQ6Z62DQzYfY3xOPbIli+kgtggDTiXSFqjF5/uHD3ytCBw+QuYi8Xyc8znHurpXeuvc46MX9/P5XDksy7IEAABgSIrpAgAAwMJGGAEAAEYRRgAAgFGEEQAAYBRhBAAAGEUYAQAARhFGAACAUYQRAABgVKrpAmZifHxcly9f1tKlS+VwOEyXAwAAZsCyLA0PD+vmm29WSsr06x8JEUYuX76snJwc02UAAIAY9Pb2auXKldM+nxBhZOnSpZKufpiMjAzD1QAAgJkIhULKycmZ/B6fTkKEkYlTMxkZGYQRAAASzCddYsEFrAAAwCjCCAAAMIowAgAAjCKMAAAAowgjAADAKMIIAAAwijACAACMIowAAACjEmLTsxslPG7pTPeQ+odHtGKpS8XeTDlT+O0bAADm0oINI60XAqo/2aVAcGSyzeN2qa4iXxsLPAYrAwBgYVmQp2laLwS088i5iCAiSX3BEe08ck6tFwKGKgMAYOFZcGEkPG6p/mSXrCmem2irP9ml8PhUPQAAQLwtuDBypnsoakXkWpakQHBEZ7qH5q4oAAAWsAUXRvqHpw8isfQDAACzs+DCyIqlrrj2AwAAsxNTGGlsbJTX65XL5ZLP51NbW9u0fR955BE5HI6oY+3atTEXPRvF3kx53C5NdwOvQ1fvqin2Zs5lWbMSHrf09vuD+ufzl/T2+4Nc7wIASCi2b+1tbm5WdXW1GhsbtW7dOr3wwgvatGmTurq6tGrVqqj+zzzzjL7zne9MPr5y5YruvPNO/cmf/MnsKo+RM8Whuop87TxyTg4p4kLWiYBSV5GfMPuNcIsyACDROSzLsvVndElJiQoLC9XU1DTZlpeXp82bN6uhoeETx7/++uv64he/qO7ubuXm5s7oPUOhkNxut4LBoDIyMuyUO61k+BKfuEX54xM4EaOathQmzGcBACSfmX5/21oZGRsbU0dHh3bv3h3RXl5ervb29hm9xsGDB3X//fdfN4iMjo5qdHR08nEoFLJT5oxsLPBoQ352wu7A+km3KDt09RblDfnZCfOZAAALk60wMjAwoHA4rKysrIj2rKws9fX1feL4QCCgH/3oRzp27Nh1+zU0NKi+vt5OaTFxpjhUunrZDX+fG8HOLcqJ+hkBAAtDTBewOhyRf2lblhXVNpVXXnlFv/3bv63Nmzdft19tba2CweDk0dvbG0uZSY1blAEAycLWysjy5cvldDqjVkH6+/ujVks+zrIsHTp0SFu3blVaWtp1+6anpys9Pd1OaQsOtygDAJKFrZWRtLQ0+Xw++f3+iHa/36+ysrLrjj116pT++7//W9u3b7dfJaIk4y3KAICFyfZpmpqaGr300ks6dOiQ3nnnHe3atUs9PT2qqqqSdPUUy7Zt26LGHTx4UCUlJSooKJh91Zi8RVlSVCBJxFuUAQALl+19RiorKzU4OKi9e/cqEAiooKBALS0tk3fHBAIB9fT0RIwJBoM6ceKEnnnmmfhUDUlX7whq2lIYdYtydoLdogwAWNhs7zNiwo3YZySZhMethL1FGQCQvG7IPiOYnxL5FmUAABbcD+UBAID5hTACAACMIowAAACjCCMAAMAowggAADCKMAIAAIwijAAAAKMIIwAAwCjCCAAAMIowAgAAjCKMAAAAowgjAADAKMIIAAAwijACAACMIowAAACjCCMAAMAowggAADCKMAIAAIw
ijAAAAKMIIwAAwCjCCAAAMIowAgAAjCKMAAAAowgjAADAKMIIAAAwijACAACMIowAAACjCCMAAMAowggAADCKMAIAAIwijAAAAKMIIwAAwCjCCAAAMIowAgAAjCKMAAAAo2IKI42NjfJ6vXK5XPL5fGpra7tu/9HRUe3Zs0e5ublKT0/X6tWrdejQoZgKBgAAySXV7oDm5mZVV1ersbFR69at0wsvvKBNmzapq6tLq1atmnLMgw8+qA8//FAHDx7U7/zO76i/v19XrlyZdfEAACDxOSzLsuwMKCkpUWFhoZqamibb8vLytHnzZjU0NET1b21t1Ze+9CVdvHhRmZmZMRUZCoXkdrsVDAaVkZER02sAAIC5NdPvb1unacbGxtTR0aHy8vKI9vLycrW3t0855oc//KGKior0ve99T7fccotuv/12Pfnkk/r1r39t560BAECSsnWaZmBgQOFwWFlZWRHtWVlZ6uvrm3LMxYsXdfr0ablcLr322msaGBjQ1772NQ0NDU173cjo6KhGR0cnH4dCITtlAgCABBLTBawOhyPisWVZUW0TxsfH5XA4dPToURUXF+uBBx7Qvn379Morr0y7OtLQ0CC32z155OTkxFImAABIALbCyPLly+V0OqNWQfr7+6NWSyZ4PB7dcsstcrvdk215eXmyLEsffPDBlGNqa2sVDAYnj97eXjtlAgCABGIrjKSlpcnn88nv90e0+/1+lZWVTTlm3bp1unz5sj766KPJtnfffVcpKSlauXLllGPS09OVkZERcQAAgORk+zRNTU2NXnrpJR06dEjvvPOOdu3apZ6eHlVVVUm6uqqxbdu2yf4PPfSQli1bpq985Svq6urSm2++qW9+85v66le/qsWLF8fvkwAAgIRke5+RyspKDQ4Oau/evQoEAiooKFBLS4tyc3MlSYFAQD09PZP9f+u3fkt+v19f//rXVVRUpGXLlunBBx/Ut7/97fh9CgAAkLBs7zNiAvuMAACQeG7IPiMAAADxRhgBAABGEUYAAIBRhBEAAGAUYQQAABhFGAEAAEYRRgAAgFGEEQAAYBRhBAAAGEUYAQAARhFGAACAUYQRAABgFGEEAAAYRRgBAABGEUYAAIBRhBEAAGAUYQQAABhFGAEAAEYRRgAAgFGppgtAYgiPWzrTPaT+4RGtWOpSsTdTzhSH6bIAAEmAMIJP1HohoPqTXQoERybbPG6X6irytbHAY7AyAEAy4DQNrqv1QkA7j5yLCCKS1Bcc0c4j59R6IWCoMgBAsiCMYFrhcUv1J7tkTfHcRFv9yS6Fx6fqAQDAzBBGMK0z3UNRKyLXsiQFgiM60z00d0UBAJIOYQTT6h+ePojE0g8AgKkQRjCtFUtdce0HAMBUCCOYVrE3Ux63S9PdwOvQ1btqir2Zc1kWACDJEEYwLWeKQ3UV+ZIUFUgmHtdV5LPfCABgVggjuK6NBR41bSlUtjvyVEy226WmLYXsMwIAmDU2PcMn2ljg0Yb8bHZgBQDcEIQRzIgzxaHS1ctMlwEASEKcpgEAAEYRRgAAgFGEEQAAYBRhBAAAGEUYAQAARhFGAACAUTGFkcbGRnm9XrlcLvl8PrW1tU3b96c//akcDkfU8Z//+Z8xFw0AAJKH7TDS3Nys6upq7dmzR52dnVq/fr02bdqknp6e6477r//6LwUCgcnjtttui7loAACQPGyHkX379mn79u3asWOH8vLytH//fuXk5Kipqem641asWKHs7OzJw+l0xlw0AABIHrbCyNjYmDo6OlReXh7RXl5ervb29uuOvfvuu+XxeHTffffpJz/5yXX7jo6OKhQKRRwAACA52QojAwMDCofDysrKimjPyspSX1/flGM8Ho8OHDigEydO6NVXX9WaNWt033336c0335z2fRoaGuR2uyePnJwcO2UCAIAEEtNv0zgckT+QZllWVNuENWvWaM2aNZOPS0tL1dvbq6efflq///u/P+WY2tpa1dTUTD4OhUIEEgA
AkpStlZHly5fL6XRGrYL09/dHrZZczz333KP33ntv2ufT09OVkZERcQAAgORkK4ykpaXJ5/PJ7/dHtPv9fpWVlc34dTo7O+XxeOy8NQAASFK2T9PU1NRo69atKioqUmlpqQ4cOKCenh5VVVVJunqK5dKlSzp8+LAkaf/+/fr0pz+ttWvXamxsTEeOHNGJEyd04sSJ+H4SAACQkGyHkcrKSg0ODmrv3r0KBAIqKChQS0uLcnNzJUmBQCBiz5GxsTE9+eSTunTpkhYvXqy1a9fqX/7lX/TAAw/E71MAAICE5bAsyzJdxCcJhUJyu90KBoNcPwIAQIKY6fc3v00DAACMIowAAACjCCMAAMAowggAADCKMAIAAIwijAAAAKMIIwAAwCjCCAAAMIowAgAAjCKMAAAAowgjAADAKMIIAAAwijACAACMIowAAACjCCMAAMAowggAADCKMAIAAIwijAAAAKNSTRcAAMB8FR63dKZ7SP3DI1qx1KVib6acKQ7TZSUdwggAAFNovRBQ/ckuBYIjk20et0t1FfnaWOAxWFny4TQNAAAf03ohoJ1HzkUEEUnqC45o55Fzar0QMFRZciKMAABwjfC4pfqTXbKmeG6irf5kl8LjU/VALAgjAABc40z3UNSKyLUsSYHgiM50D81dUUmOMAIAwDX6h6cPIrH0wycjjAAAcI0VS11x7YdPRhgBAOAaxd5MedwuTXcDr0NX76op9mbOZVlJjTACAMA1nCkO1VXkS1JUIJl4XFeRz34jcUQYAQDgYzYWeNS0pVDZ7shTMdlul5q2FLLPSJyx6RkAAFPYWODRhvxsdmCdA4QRAACm4UxxqHT1MtNlJD1O0wAAAKMIIwAAwCjCCAAAMIowAgAAjCKMAAAAowgjAADAqJjCSGNjo7xer1wul3w+n9ra2mY07q233lJqaqruuuuuWN4WAAAkIdthpLm5WdXV1dqzZ486Ozu1fv16bdq0ST09PdcdFwwGtW3bNt13330xFwsAAJKPw7Isy86AkpISFRYWqqmpabItLy9PmzdvVkNDw7TjvvSlL+m2226T0+nU66+/rvPnz8/4PUOhkNxut4LBoDIyMuyUCwAADJnp97etlZGxsTF1dHSovLw8or28vFzt7e3Tjnv55Zf1/vvvq66ubkbvMzo6qlAoFHEAAIDkZCuMDAwMKBwOKysrK6I9KytLfX19U4557733tHv3bh09elSpqTPbfb6hoUFut3vyyMnJsVMmAABIIDFdwOpwRP5IkGVZUW2SFA6H9dBDD6m+vl633377jF+/trZWwWBw8ujt7Y2lTAAAkABs/VDe8uXL5XQ6o1ZB+vv7o1ZLJGl4eFhnz55VZ2enHn/8cUnS+Pi4LMtSamqq3njjDX32s5+NGpeenq709HQ7pQEAgARla2UkLS1NPp9Pfr8/ot3v96usrCyqf0ZGhn72s5/p/Pnzk0dVVZXWrFmj8+fPq6SkZHbVAwCAhGdrZUSSampqtHXrVhUVFam0tFQHDhxQT0+PqqqqJF09xXLp0iUdPnxYKSkpKigoiBi/YsUKuVyuqHYAALAw2Q4jlZWVGhwc1N69exUIBFRQUKCWlhbl5uZKkgKBwCfuOQIAADDB9j4jJrDPCAAAiWem39+2V0YAADApPG7pTPeQ+odHtGKpS8XeTDlTou/oROIgjAAAEkbrhYDqT3YpEByZbPO4XaqryNfGAo/ByjAb/GovACAhtF4IaOeRcxFBRJL6giPaeeScWi8EDFWG2SKMAADmvfC4pfqTXZrqIseJtvqTXQqPz/vLIDEFwggAYN470z0UtSJyLUtSIDiiM91Dc1cU4oYwAgCY9/qHpw8isfTD/EIYAQDMeyuWuuLaD/MLYQQAMO8VezPlcbs03Q28Dl29q6bYmzmXZSFOCCMAgHnPmeJQXUW+JEUFkonHdRX57DeSoAgjAICEsLHAo6Ythcp2R56KyXa71LSlkH1GEhibngEAEsbGAo825GezA2u
SIYwAABKKM8Wh0tXLTJeBOOI0DQAAMIowAgAAjCKMAAAAowgjAADAKMIIAAAwijACAACMIowAAACjCCMAAMAowggAADCKMAIAAIwijAAAAKMIIwAAwCjCCAAAMIowAgAAjCKMAAAAowgjAADAKMIIAAAwijACAACMIowAAACjCCMAAMAowggAADCKMAIAAIwijAAAAKMIIwAAwKiYwkhjY6O8Xq9cLpd8Pp/a2tqm7Xv69GmtW7dOy5Yt0+LFi3XHHXfob/7mb2IuGAAAJJdUuwOam5tVXV2txsZGrVu3Ti+88II2bdqkrq4urVq1Kqr/kiVL9Pjjj+szn/mMlixZotOnT+vRRx/VkiVL9Od//udx+RAAACBxOSzLsuwMKCkpUWFhoZqamibb8vLytHnzZjU0NMzoNb74xS9qyZIl+vu///sZ9Q+FQnK73QoGg8rIyLBTLgAAMGSm39+2TtOMjY2po6ND5eXlEe3l5eVqb2+f0Wt0dnaqvb1d995777R9RkdHFQqFIg4AAJCcbIWRgYEBhcNhZWVlRbRnZWWpr6/vumNXrlyp9PR0FRUV6bHHHtOOHTum7dvQ0CC32z155OTk2CkTAAAkkJguYHU4HBGPLcuKavu4trY2nT17Vs8//7z279+v48ePT9u3trZWwWBw8ujt7Y2lTAAAkABsXcC6fPlyOZ3OqFWQ/v7+qNWSj/N6vZKk3/3d39WHH36op556Sl/+8pen7Juenq709HQ7pQEAgARla2UkLS1NPp9Pfr8/ot3v96usrGzGr2NZlkZHR+28NQAASFK2b+2tqanR1q1bVVRUpNLSUh04cEA9PT2qqqqSdPUUy6VLl3T48GFJ0nPPPadVq1bpjjvukHR135Gnn35aX//61+P4MewLj1s60z2k/uERrVjqUrE3U86U659qAgAA8Wc7jFRWVmpwcFB79+5VIBBQQUGBWlpalJubK0kKBALq6emZ7D8+Pq7a2lp1d3crNTVVq1ev1ne+8x09+uij8fsUNrVeCKj+ZJcCwZHJNo/bpbqKfG0s8BirCwCAhcj2PiMmxHOfkdYLAe08ck4f/9ATayJNWwoJJAAAxMEN2Wck0YXHLdWf7IoKIpIm2+pPdik8Pu/zGQAASWNBhZEz3UMRp2Y+zpIUCI7oTPfQ3BUFAMACt6DCSP/w9EEkln4AAGD2FlQYWbHUFdd+AABg9hZUGCn2Zsrjdmm6G3gdunpXTbE3cy7LAgBgQVtQYcSZ4lBdRb4kRQWSicd1FfnsNwIAwBxaUGFEkjYWeNS0pVDZ7shTMdluF7f1AgBggO1Nz5LBxgKPNuRnswMrAADzwIIMI9LVUzalq5eZLgMAgAVvwZ2mAQAA8wthBAAAGEUYAQAARhFGAACAUYQRAABgFGEEAAAYRRgBAABGEUYAAIBRhBEAAGAUYQQAABhFGAEAAEYRRgAAgFGEEQAAYBRhBAAAGJVqugAASHThcUtnuofUPzyiFUtdKvZmypniMF0WkDAIIwAwC60XAqo/2aVAcGSyzeN2qa4iXxsLPAYrAxIHp2kAIEatFwLaeeRcRBCRpL7giHYeOafWCwFDlQGJhTACADEIj1uqP9kla4rnJtrqT3YpPD5VDwDXIowAQAzOdA9FrYhcy5IUCI7oTPfQ3BUFJCjCCADEoH94+iASSz9gISOMAEAMVix1xbUfsJARRgAgBsXeTHncLk13A69DV++qKfZmzmVZQEIijABADJwpDtVV5EtSVCCZeFxXkc9+I8AMEEYAIEYbCzxq2lKobHfkqZhst0tNWwrZZwSYITY9A4BZ2Fjg0Yb8bHZgBWaBMAIAs+RMcah09TLTZQAJi9M0AADAqJjCSGNjo7xer1wul3w+n9ra2qbt++qrr2rDhg266aablJGRodLSUv34xz+OuWAAAJBcbIeR5uZmVVdXa8+ePers7NT69eu1adMm9fT0TNn/zTff1IYNG9TS0qKOjg794R/+oSoqKtTZ2Tnr4gE
AQOJzWJZl64cTSkpKVFhYqKampsm2vLw8bd68WQ0NDTN6jbVr16qyslLf+ta3ZtQ/FArJ7XYrGAwqIyPDTrkAAMCQmX5/21oZGRsbU0dHh8rLyyPay8vL1d7ePqPXGB8f1/DwsDIz2QgIAADYvJtmYGBA4XBYWVlZEe1ZWVnq6+ub0Wt8//vf1y9/+Us9+OCD0/YZHR3V6Ojo5ONQKGSnTAAAkEBiuoDV4Yi8f96yrKi2qRw/flxPPfWUmpubtWLFimn7NTQ0yO12Tx45OTmxlAkAABKArTCyfPlyOZ3OqFWQ/v7+qNWSj2tubtb27dv1j//4j7r//vuv27e2tlbBYHDy6O3ttVMmAABIILbCSFpamnw+n/x+f0S73+9XWVnZtOOOHz+uRx55RMeOHdPnP//5T3yf9PR0ZWRkRBwAACA52d6BtaamRlu3blVRUZFKS0t14MAB9fT0qKqqStLVVY1Lly7p8OHDkq4GkW3btumZZ57RPffcM7mqsnjxYrnd7jh+FAAAkIhsh5HKykoNDg5q7969CgQCKigoUEtLi3JzcyVJgUAgYs+RF154QVeuXNFjjz2mxx57bLL94Ycf1iuvvDL7TwAAABKa7X1GTGCfEQAAEs8N2WcEAAAg3ggjAADAKMIIAAAwijACAACMIowAAACjCCMAAMAowggAADCKMAIAAIwijAAAAKMIIwAAwCjCCAAAMIowAgAAjCKMAAAAowgjAADAKMIIAAAwijACAACMIowAAACjCCMAAMAowggAADCKMAIAAIwijAAAAKMIIwAAwCjCCAAAMIowAgAAjCKMAAAAowgjAADAKMIIAAAwijACAACMIowAAACjCCMAAMAowggAADCKMAIAAIwijAAAAKMIIwAAwCjCCAAAMIowAgAAjEo1XQAAADAjPG7pTPeQ+odHtGKpS8XeTDlTHHNeR0wrI42NjfJ6vXK5XPL5fGpra5u2byAQ0EMPPaQ1a9YoJSVF1dXVsdYKAADipPVCQL/33X/Vl1/8Nz3xD+f15Rf/Tb/33X9V64XAnNdiO4w0Nzerurpae/bsUWdnp9avX69Nmzapp6dnyv6jo6O66aabtGfPHt15552zLhgAAMxO64WAdh45p0BwJKK9LziinUfOzXkgcViWZdkZUFJSosLCQjU1NU225eXlafPmzWpoaLju2D/4gz/QXXfdpf3799sqMhQKye12KxgMKiMjw9ZYAADw/4XHLf3ed/81KohMcEjKdrt0+v9+dtanbGb6/W1rZWRsbEwdHR0qLy+PaC8vL1d7e3tslU5hdHRUoVAo4gAAALN3pnto2iAiSZakQHBEZ7qH5qwmW2FkYGBA4XBYWVlZEe1ZWVnq6+uLW1ENDQ1yu92TR05OTtxeGwCAhax/ePogEku/eIjpAlaHI3LZxrKsqLbZqK2tVTAYnDx6e3vj9toAACxkK5a64tovHmzd2rt8+XI5nc6oVZD+/v6o1ZLZSE9PV3p6etxeDwAAXFXszZTH7VJfcERTXTQ6cc1IsTdzzmqytTKSlpYmn88nv98f0e73+1VWVhbXwgAAQPw5Uxyqq8iXdDV4XGvicV1F/pzuN2L7NE1NTY1eeuklHTp0SO+884527dqlnp4eVVVVSbp6imXbtm0RY86fP6/z58/ro48+0v/+7//q/Pnz6urqis8nAAAAtmws8KhpS6Gy3ZGnYrLdLjVtKdTGAs+c1mN7B9bKykoNDg5q7969CgQCKigoUEtLi3JzcyVd3eTs43uO3H333ZP/7ujo0LFjx5Sbm6uf//zns6seAADEZGOBRxvys+fFDqy29xkxgX1GAABIPDdknxEAAIB4I4wAAACjCCMAAMAowggAADCKMAIAAIwijAAAAKMIIwAAwCjCCAAAMIowAgAAjCKMAAAAowgjAADAKMIIAAAwijACAACMIowAAACjCCMAAMAowggAADCKMAIAAIwijAAAAKMIIwAAwCjCCAAAMIowAgAAjCKMAAA
AowgjAADAKMIIAAAwijACAACMIowAAACjCCMAAMAowggAADCKMAIAAIwijAAAAKNSTReA+S08bulM95D6h0e0YqlLxd5MOVMcpssCACQRwgim1XohoPqTXQoERybbPG6X6irytbHAY7AyAEAy4TQNptR6IaCdR85FBBFJ6guOaOeRc2q9EDBUGQAg2RBGECU8bqn+ZJesKZ6baKs/2aXw+FQ9AACwhzCCKGe6h6JWRK5lSQoER3Sme2juigIAxEV43NLb7w/qn89f0tvvD86LPyy5ZgRR+oenDyKx9AMAzA/z9VrAmFZGGhsb5fV65XK55PP51NbWdt3+p06dks/nk8vl0q233qrnn38+pmIxN1YsdcW1HwDAvPl8LaDtMNLc3Kzq6mrt2bNHnZ2dWr9+vTZt2qSenp4p+3d3d+uBBx7Q+vXr1dnZqb/8y7/UN77xDZ04cWLWxePGKPZmyuN2abobeB26mqSLvZlzWRYAIEbz/VpA22Fk37592r59u3bs2KG8vDzt379fOTk5ampqmrL/888/r1WrVmn//v3Ky8vTjh079NWvflVPP/30rIvHjeFMcaiuIl+SogLJxOO6inz2GwGABDHfrwW0FUbGxsbU0dGh8vLyiPby8nK1t7dPOebtt9+O6v+5z31OZ8+e1W9+85spx4yOjioUCkUcmFsbCzxq2lKobHfkqZhst0tNWwrZZwQAEsh8vxbQ1gWsAwMDCofDysrKimjPyspSX1/flGP6+vqm7H/lyhUNDAzI44n+UmtoaFB9fb2d0nADbCzwaEN+NjuwAkCCm+/XAsZ0AavDEfllZFlWVNsn9Z+qfUJtba2CweDk0dvbG0uZiANnikOlq5fp/9x1i0pXLyOIAEACmu/XAtoKI8uXL5fT6YxaBenv749a/ZiQnZ09Zf/U1FQtW7ZsyjHp6enKyMiIOAAAQGzm+7WAtsJIWlqafD6f/H5/RLvf71dZWdmUY0pLS6P6v/HGGyoqKtKiRYtslgsAAGIxn68FtL3pWU1NjbZu3aqioiKVlpbqwIED6unpUVVVlaSrp1guXbqkw4cPS5Kqqqr0d3/3d6qpqdGf/dmf6e2339bBgwd1/Pjx+H4SAABwXfP1WkDbYaSyslKDg4Pau3evAoGACgoK1NLSotzcXElSIBCI2HPE6/WqpaVFu3bt0nPPPaebb75Zzz77rP7oj/4ofp8CAADMyMS1gPOJw5q4mnQeC4VCcrvdCgaDXD8CAECCmOn3Nz+UBwAAjCKMAAAAowgjAADAKMIIAAAwijACAACMIowAAACjCCMAAMAowggAADDK9g6sJkzsyxYKhQxXAgAAZmrie/uT9ldNiDAyPDwsScrJyTFcCQAAsGt4eFhut3va5xNiO/jx8XFdvnxZS5culcNh9sd85pNQKKScnBz19vayTf48wrzMP8zJ/MS8zE/xnBfLsjQ8PKybb75ZKSnTXxmSECsjKSkpWrlypeky5q2MjAz+I89DzMv8w5zMT8zL/BSvebneisgELmAFAABGEUYAAIBRhJEElp6errq6OqWnp5suBddgXuYf5mR+Yl7mJxPzkhAXsAIAgOTFyggAADCKMAIAAIwijAAAAKMIIwAAwCjCyDzX2Ngor9crl8sln8+ntra2afu++uqr2rBhg2666SZlZGSotLRUP/7xj+ew2oXBzpxc66233lJqaqruuuuuG1vgAmV3XkZHR7Vnzx7l5uYqPT1dq1ev1qFDh+ao2oXD7rwcPXpUd955pz71qU/J4/HoK1/5igYHB+eo2uT35ptvqqKiQjfffLMcDodef/31Txxz6tQp+Xw+uVwu3XrrrXr++efjX5iFeesf/uEfrEWLFlkvvvii1dXVZT3xxBPWkiVLrP/5n/+Zsv8TTzxhffe737XOnDljvfvuu1Ztba21aNEi69y5c3NcefKyOycTfvGLX1i33nqrVV5ebt15551zU+wCEsu8fOELX7BKSkosv99vdXd3W//+7/9
uvfXWW3NYdfKzOy9tbW1WSkqK9cwzz1gXL1602trarLVr11qbN2+e48qTV0tLi7Vnzx7rxIkTliTrtddeu27/ixcvWp/61KesJ554wurq6rJefPFFa9GiRdYPfvCDuNZFGJnHiouLraqqqoi2O+64w9q9e/eMXyM/P9+qr6+Pd2kLVqxzUllZaf3VX/2VVVdXRxi5AezOy49+9CPL7XZbg4ODc1HegmV3Xv76r//auvXWWyPann32WWvlypU3rMaFbCZh5C/+4i+sO+64I6Lt0Ucfte6555641sJpmnlqbGxMHR0dKi8vj2gvLy9Xe3v7jF5jfHxcw8PDyszMvBElLjixzsnLL7+s999/X3V1dTe6xAUplnn54Q9/qKKiIn3ve9/TLbfcottvv11PPvmkfv3rX89FyQtCLPNSVlamDz74QC0tLbIsSx9++KF+8IMf6POf//xclIwpvP3221Fz+LnPfU5nz57Vb37zm7i9T0L8UN5CNDAwoHA4rKysrIj2rKws9fX1zeg1vv/97+uXv/ylHnzwwRtR4oITy5y899572r17t9ra2pSayn+3GyGWebl48aJOnz4tl8ul1157TQMDA/ra176moaEhrhuJk1jmpaysTEePHlVlZaVGRkZ05coVfeELX9Df/u3fzkXJmEJfX9+Uc3jlyhUNDAzI4/HE5X1YGZnnHA5HxGPLsqLapnL8+HE99dRTam5u1ooVK25UeQvSTOckHA7roYceUn19vW6//fa5Km/BsvN/ZXx8XA6HQ0ePHlVxcbEeeOAB7du3T6+88gqrI3FmZ166urr0jW98Q9/61rfU0dGh1tZWdXd3q6qqai5KxTSmmsOp2meDP9XmqeXLl8vpdEb9BdHf3x+VUj+uublZ27dv1z/90z/p/vvvv5FlLih252R4eFhnz55VZ2enHn/8cUlXvwQty1JqaqreeOMNffazn52T2pNZLP9XPB6PbrnlloifNs/Ly5NlWfrggw9022233dCaF4JY5qWhoUHr1q3TN7/5TUnSZz7zGS1ZskTr16/Xt7/97bj9FY6Zy87OnnIOU1NTtWzZsri9Dysj81RaWpp8Pp/8fn9Eu9/vV1lZ2bTjjh8/rkceeUTHjh3jPGuc2Z2TjIwM/exnP9P58+cnj6qqKq1Zs0bnz59XSUnJXJWe1GL5v7Ju3TpdvnxZH3300WTbu+++q5SUFK1cufKG1rtQxDIvv/rVr5SSEvm15HQ6Jf3/v8Yxt0pLS6Pm8I033lBRUZEWLVoUvzeK6+WwiKuJ2+IOHjxodXV1WdXV1daSJUusn//855ZlWdbu3butrVu3TvY/duyYlZqaaj333HNWIBCYPH7xi1+Y+ghJx+6cfBx309wYdudleHjYWrlypfXHf/zH1n/8x39Yp06dsm677TZrx44dpj5CUrI7Ly+//LKVmppqNTY2Wu+//751+vRpq6ioyCouLjb1EZLO8PCw1dnZaXV2dlqSrH379lmdnZ2Tt1t/fE4mbu3dtWuX1dXVZR08eJBbexei5557zsrNzbXS0tKswsJC69SpU5PPPfzww9a99947+fjee++1JEUdDz/88NwXnsTszMnHEUZuHLvz8s4771j333+/tXjxYmvlypVWTU2N9atf/WqOq05+dufl2WeftfLz863FixdbHo/H+tM//VPrgw8+mOOqk9dPfvKT635PTDUnP/3pT627777bSktLsz796U9bTU1Nca/LYVmsfQEAAHO4ZgQAABhFGAEAAEYRRgAAgFGEEQAAYBRhBAAAGEUYAQAARhFGAACAUYQRAABgFGEEAAAYRRgBAABGEUYAAIBRhBEAAGDU/wMHxRPbcikYmQAAAABJRU5ErkJggg==",
       "text/plain": [
        "<Figure size 640x480 with 1 Axes>"
       ]
@@ -757,7 +801,7 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 25,
+   "execution_count": 27,
    "id": "1cd000bd-9b24-4c39-9cac-70a3291d0660",
    "metadata": {},
    "outputs": [],
@@ -784,7 +828,7 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 26,
+   "execution_count": 28,
    "id": "7964df3c-55af-4c25-afc5-9e07accb606a",
    "metadata": {},
    "outputs": [
@@ -804,7 +848,8 @@
     "n1 = greater_than_half(label=\"n1\")\n",
     "\n",
     "wf = Workflow(\"my_wf\", n1)  # As args at init\n",
-    "wf.create.SingleValue(n1.node_function, output_labels=\"p1\", label=\"n2\")  # Instantiating from the class with a function\n",
+    "wf.create.SingleValue(n1.node_function, output_labels=\"p1\", label=\"n2\")  \n",
+    "# ^ Instantiating from the class with a function\n",
     "wf.add(greater_than_half(label=\"n3\"))  # Instantiating then passing to node adder\n",
     "wf.n4 = greater_than_half(label=\"will_get_overwritten_with_n4\")  # Set attribute to instance\n",
     "greater_than_half(label=\"n5\", parent=wf)  # By passing the workflow to the node\n",
@@ -825,7 +870,7 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 27,
+   "execution_count": 29,
    "id": "809178a5-2e6b-471d-89ef-0797db47c5ad",
    "metadata": {},
    "outputs": [
@@ -879,7 +924,7 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 28,
+   "execution_count": 30,
    "id": "52c48d19-10a2-4c48-ae81-eceea4129a60",
    "metadata": {},
    "outputs": [
@@ -889,7 +934,7 @@
        "{'ay': 3, 'a + b + 2': 7}"
       ]
      },
-     "execution_count": 28,
+     "execution_count": 30,
      "metadata": {},
      "output_type": "execute_result"
     }
@@ -912,12 +957,12 @@
    "id": "e3f4b51b-7c28-47f7-9822-b4755e12bd4d",
    "metadata": {},
    "source": [
-    "We can see now why we've been trying to givesuccinct string labels to our `Function` node outputs instead of just arbitrary expressions! The expressions are typically not dot-accessible:"
+    "We can see now why we've been trying to give succinct string labels to our `Function` node outputs instead of just arbitrary expressions! The expressions are typically not dot-accessible:"
    ]
   },
   {
    "cell_type": "code",
-   "execution_count": 29,
+   "execution_count": 31,
    "id": "bb35ba3e-602d-4c9c-b046-32da9401dd1c",
    "metadata": {},
    "outputs": [
@@ -927,7 +972,7 @@
        "(7, 3)"
       ]
      },
-     "execution_count": 29,
+     "execution_count": 31,
      "metadata": {},
      "output_type": "execute_result"
     }
@@ -946,7 +991,7 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 30,
+   "execution_count": 32,
    "id": "2b0d2c85-9049-417b-8739-8a8432a1efbe",
    "metadata": {},
    "outputs": [
@@ -965,127 +1010,127 @@
        "<title>clustersimple</title>\n",
        "<polygon fill=\"white\" stroke=\"none\" points=\"-4,4 -4,-328.25 759.93,-328.25 759.93,4 -4,4\"/>\n",
        "<text text-anchor=\"middle\" x=\"377.96\" y=\"-4.95\" font-family=\"Times,serif\" font-size=\"14.00\">simple: Workflow</text>\n",
+       "<g id=\"clust6\" class=\"cluster\">\n",
+       "<title>clustersimpleb</title>\n",
+       "<defs>\n",
+       "<linearGradient id=\"clust6_l_0\" gradientUnits=\"userSpaceOnUse\" x1=\"362.7\" y1=\"-75.25\" x2=\"362.7\" y2=\"-241.25\" >\n",
+       "<stop offset=\"0\" style=\"stop-color:#17becf;stop-opacity:1.;\"/>\n",
+       "<stop offset=\"1\" style=\"stop-color:#b9ecf1;stop-opacity:1.;\"/>\n",
+       "</linearGradient>\n",
+       "</defs>\n",
+       "<polygon fill=\"url(#clust6_l_0)\" stroke=\"black\" points=\"274.7,-75.25 274.7,-241.25 450.7,-241.25 450.7,-75.25 274.7,-75.25\"/>\n",
+       "<text text-anchor=\"middle\" x=\"362.7\" y=\"-223.95\" font-family=\"Times,serif\" font-size=\"14.00\">b: AddOne</text>\n",
+       "</g>\n",
+       "<g id=\"clust7\" class=\"cluster\">\n",
+       "<title>clustersimplebInputs</title>\n",
+       "<defs>\n",
+       "<linearGradient id=\"clust7_l_1\" gradientUnits=\"userSpaceOnUse\" x1=\"282.7\" y1=\"-147.25\" x2=\"352.7\" y2=\"-147.25\" >\n",
+       "<stop offset=\"0\" style=\"stop-color:#7f7f7f;stop-opacity:1.;\"/>\n",
+       "<stop offset=\"1\" style=\"stop-color:#d9d9d9;stop-opacity:1.;\"/>\n",
+       "</linearGradient>\n",
+       "</defs>\n",
+       "<polygon fill=\"url(#clust7_l_1)\" stroke=\"black\" points=\"282.7,-83.25 282.7,-211.25 352.7,-211.25 352.7,-83.25 282.7,-83.25\"/>\n",
+       "<text text-anchor=\"middle\" x=\"317.7\" y=\"-193.95\" font-family=\"Times,serif\" font-size=\"14.00\">Inputs</text>\n",
+       "</g>\n",
+       "<g id=\"clust8\" class=\"cluster\">\n",
+       "<title>clustersimplebOutputs</title>\n",
+       "<defs>\n",
+       "<linearGradient id=\"clust8_l_2\" gradientUnits=\"userSpaceOnUse\" x1=\"442.7\" y1=\"-147.25\" x2=\"372.7\" y2=\"-147.25\" >\n",
+       "<stop offset=\"0\" style=\"stop-color:#7f7f7f;stop-opacity:1.;\"/>\n",
+       "<stop offset=\"1\" style=\"stop-color:#d9d9d9;stop-opacity:1.;\"/>\n",
+       "</linearGradient>\n",
+       "</defs>\n",
+       "<polygon fill=\"url(#clust8_l_2)\" stroke=\"black\" points=\"372.7,-83.25 372.7,-211.25 442.7,-211.25 442.7,-83.25 372.7,-83.25\"/>\n",
+       "<text text-anchor=\"middle\" x=\"407.7\" y=\"-193.95\" font-family=\"Times,serif\" font-size=\"14.00\">Outputs</text>\n",
+       "</g>\n",
        "<g id=\"clust9\" class=\"cluster\">\n",
        "<title>clustersimplesum</title>\n",
        "<defs>\n",
-       "<linearGradient id=\"clust9_l_0\" gradientUnits=\"userSpaceOnUse\" x1=\"546.7\" y1=\"-75.25\" x2=\"546.7\" y2=\"-295.25\" >\n",
+       "<linearGradient id=\"clust9_l_3\" gradientUnits=\"userSpaceOnUse\" x1=\"546.7\" y1=\"-75.25\" x2=\"546.7\" y2=\"-295.25\" >\n",
        "<stop offset=\"0\" style=\"stop-color:#17becf;stop-opacity:1.;\"/>\n",
        "<stop offset=\"1\" style=\"stop-color:#b9ecf1;stop-opacity:1.;\"/>\n",
        "</linearGradient>\n",
        "</defs>\n",
-       "<polygon fill=\"url(#clust9_l_0)\" stroke=\"black\" points=\"458.7,-75.25 458.7,-295.25 634.7,-295.25 634.7,-75.25 458.7,-75.25\"/>\n",
+       "<polygon fill=\"url(#clust9_l_3)\" stroke=\"black\" points=\"458.7,-75.25 458.7,-295.25 634.7,-295.25 634.7,-75.25 458.7,-75.25\"/>\n",
        "<text text-anchor=\"middle\" x=\"546.7\" y=\"-277.95\" font-family=\"Times,serif\" font-size=\"14.00\">sum: AddNode</text>\n",
        "</g>\n",
        "<g id=\"clust10\" class=\"cluster\">\n",
        "<title>clustersimplesumInputs</title>\n",
        "<defs>\n",
-       "<linearGradient id=\"clust10_l_1\" gradientUnits=\"userSpaceOnUse\" x1=\"466.7\" y1=\"-174.25\" x2=\"536.7\" y2=\"-174.25\" >\n",
+       "<linearGradient id=\"clust10_l_4\" gradientUnits=\"userSpaceOnUse\" x1=\"466.7\" y1=\"-174.25\" x2=\"536.7\" y2=\"-174.25\" >\n",
        "<stop offset=\"0\" style=\"stop-color:#7f7f7f;stop-opacity:1.;\"/>\n",
        "<stop offset=\"1\" style=\"stop-color:#d9d9d9;stop-opacity:1.;\"/>\n",
        "</linearGradient>\n",
        "</defs>\n",
-       "<polygon fill=\"url(#clust10_l_1)\" stroke=\"black\" points=\"466.7,-83.25 466.7,-265.25 536.7,-265.25 536.7,-83.25 466.7,-83.25\"/>\n",
+       "<polygon fill=\"url(#clust10_l_4)\" stroke=\"black\" points=\"466.7,-83.25 466.7,-265.25 536.7,-265.25 536.7,-83.25 466.7,-83.25\"/>\n",
        "<text text-anchor=\"middle\" x=\"501.7\" y=\"-247.95\" font-family=\"Times,serif\" font-size=\"14.00\">Inputs</text>\n",
        "</g>\n",
        "<g id=\"clust11\" class=\"cluster\">\n",
        "<title>clustersimplesumOutputs</title>\n",
        "<defs>\n",
-       "<linearGradient id=\"clust11_l_2\" gradientUnits=\"userSpaceOnUse\" x1=\"626.7\" y1=\"-147.25\" x2=\"556.7\" y2=\"-147.25\" >\n",
+       "<linearGradient id=\"clust11_l_5\" gradientUnits=\"userSpaceOnUse\" x1=\"626.7\" y1=\"-147.25\" x2=\"556.7\" y2=\"-147.25\" >\n",
        "<stop offset=\"0\" style=\"stop-color:#7f7f7f;stop-opacity:1.;\"/>\n",
        "<stop offset=\"1\" style=\"stop-color:#d9d9d9;stop-opacity:1.;\"/>\n",
        "</linearGradient>\n",
        "</defs>\n",
-       "<polygon fill=\"url(#clust11_l_2)\" stroke=\"black\" points=\"556.7,-83.25 556.7,-211.25 626.7,-211.25 626.7,-83.25 556.7,-83.25\"/>\n",
+       "<polygon fill=\"url(#clust11_l_5)\" stroke=\"black\" points=\"556.7,-83.25 556.7,-211.25 626.7,-211.25 626.7,-83.25 556.7,-83.25\"/>\n",
        "<text text-anchor=\"middle\" x=\"591.7\" y=\"-193.95\" font-family=\"Times,serif\" font-size=\"14.00\">Outputs</text>\n",
        "</g>\n",
        "<g id=\"clust1\" class=\"cluster\">\n",
        "<title>clustersimpleInputs</title>\n",
        "<defs>\n",
-       "<linearGradient id=\"clust1_l_3\" gradientUnits=\"userSpaceOnUse\" x1=\"8\" y1=\"-121.25\" x2=\"78.7\" y2=\"-121.25\" >\n",
+       "<linearGradient id=\"clust1_l_6\" gradientUnits=\"userSpaceOnUse\" x1=\"8\" y1=\"-121.25\" x2=\"78.7\" y2=\"-121.25\" >\n",
        "<stop offset=\"0\" style=\"stop-color:#7f7f7f;stop-opacity:1.;\"/>\n",
        "<stop offset=\"1\" style=\"stop-color:#d9d9d9;stop-opacity:1.;\"/>\n",
        "</linearGradient>\n",
        "</defs>\n",
-       "<polygon fill=\"url(#clust1_l_3)\" stroke=\"black\" points=\"8,-30.25 8,-212.25 78.7,-212.25 78.7,-30.25 8,-30.25\"/>\n",
+       "<polygon fill=\"url(#clust1_l_6)\" stroke=\"black\" points=\"8,-30.25 8,-212.25 78.7,-212.25 78.7,-30.25 8,-30.25\"/>\n",
        "<text text-anchor=\"middle\" x=\"43.35\" y=\"-194.95\" font-family=\"Times,serif\" font-size=\"14.00\">Inputs</text>\n",
        "</g>\n",
        "<g id=\"clust2\" class=\"cluster\">\n",
        "<title>clustersimpleOutputs</title>\n",
        "<defs>\n",
-       "<linearGradient id=\"clust2_l_4\" gradientUnits=\"userSpaceOnUse\" x1=\"747.93\" y1=\"-121.25\" x2=\"646.7\" y2=\"-121.25\" >\n",
+       "<linearGradient id=\"clust2_l_7\" gradientUnits=\"userSpaceOnUse\" x1=\"747.93\" y1=\"-121.25\" x2=\"646.7\" y2=\"-121.25\" >\n",
        "<stop offset=\"0\" style=\"stop-color:#7f7f7f;stop-opacity:1.;\"/>\n",
        "<stop offset=\"1\" style=\"stop-color:#d9d9d9;stop-opacity:1.;\"/>\n",
        "</linearGradient>\n",
        "</defs>\n",
-       "<polygon fill=\"url(#clust2_l_4)\" stroke=\"black\" points=\"646.7,-30.25 646.7,-212.25 747.93,-212.25 747.93,-30.25 646.7,-30.25\"/>\n",
+       "<polygon fill=\"url(#clust2_l_7)\" stroke=\"black\" points=\"646.7,-30.25 646.7,-212.25 747.93,-212.25 747.93,-30.25 646.7,-30.25\"/>\n",
        "<text text-anchor=\"middle\" x=\"697.31\" y=\"-194.95\" font-family=\"Times,serif\" font-size=\"14.00\">Outputs</text>\n",
        "</g>\n",
        "<g id=\"clust3\" class=\"cluster\">\n",
        "<title>clustersimplea</title>\n",
        "<defs>\n",
-       "<linearGradient id=\"clust3_l_5\" gradientUnits=\"userSpaceOnUse\" x1=\"178.7\" y1=\"-150.25\" x2=\"178.7\" y2=\"-316.25\" >\n",
+       "<linearGradient id=\"clust3_l_8\" gradientUnits=\"userSpaceOnUse\" x1=\"178.7\" y1=\"-150.25\" x2=\"178.7\" y2=\"-316.25\" >\n",
        "<stop offset=\"0\" style=\"stop-color:#17becf;stop-opacity:1.;\"/>\n",
        "<stop offset=\"1\" style=\"stop-color:#b9ecf1;stop-opacity:1.;\"/>\n",
        "</linearGradient>\n",
        "</defs>\n",
-       "<polygon fill=\"url(#clust3_l_5)\" stroke=\"black\" points=\"90.7,-150.25 90.7,-316.25 266.7,-316.25 266.7,-150.25 90.7,-150.25\"/>\n",
+       "<polygon fill=\"url(#clust3_l_8)\" stroke=\"black\" points=\"90.7,-150.25 90.7,-316.25 266.7,-316.25 266.7,-150.25 90.7,-150.25\"/>\n",
        "<text text-anchor=\"middle\" x=\"178.7\" y=\"-298.95\" font-family=\"Times,serif\" font-size=\"14.00\">a: AddOne</text>\n",
        "</g>\n",
        "<g id=\"clust4\" class=\"cluster\">\n",
        "<title>clustersimpleaInputs</title>\n",
        "<defs>\n",
-       "<linearGradient id=\"clust4_l_6\" gradientUnits=\"userSpaceOnUse\" x1=\"98.7\" y1=\"-222.25\" x2=\"168.7\" y2=\"-222.25\" >\n",
+       "<linearGradient id=\"clust4_l_9\" gradientUnits=\"userSpaceOnUse\" x1=\"98.7\" y1=\"-222.25\" x2=\"168.7\" y2=\"-222.25\" >\n",
        "<stop offset=\"0\" style=\"stop-color:#7f7f7f;stop-opacity:1.;\"/>\n",
        "<stop offset=\"1\" style=\"stop-color:#d9d9d9;stop-opacity:1.;\"/>\n",
        "</linearGradient>\n",
        "</defs>\n",
-       "<polygon fill=\"url(#clust4_l_6)\" stroke=\"black\" points=\"98.7,-158.25 98.7,-286.25 168.7,-286.25 168.7,-158.25 98.7,-158.25\"/>\n",
+       "<polygon fill=\"url(#clust4_l_9)\" stroke=\"black\" points=\"98.7,-158.25 98.7,-286.25 168.7,-286.25 168.7,-158.25 98.7,-158.25\"/>\n",
        "<text text-anchor=\"middle\" x=\"133.7\" y=\"-268.95\" font-family=\"Times,serif\" font-size=\"14.00\">Inputs</text>\n",
        "</g>\n",
        "<g id=\"clust5\" class=\"cluster\">\n",
        "<title>clustersimpleaOutputs</title>\n",
        "<defs>\n",
-       "<linearGradient id=\"clust5_l_7\" gradientUnits=\"userSpaceOnUse\" x1=\"258.7\" y1=\"-222.25\" x2=\"188.7\" y2=\"-222.25\" >\n",
+       "<linearGradient id=\"clust5_l_10\" gradientUnits=\"userSpaceOnUse\" x1=\"258.7\" y1=\"-222.25\" x2=\"188.7\" y2=\"-222.25\" >\n",
        "<stop offset=\"0\" style=\"stop-color:#7f7f7f;stop-opacity:1.;\"/>\n",
        "<stop offset=\"1\" style=\"stop-color:#d9d9d9;stop-opacity:1.;\"/>\n",
        "</linearGradient>\n",
        "</defs>\n",
-       "<polygon fill=\"url(#clust5_l_7)\" stroke=\"black\" points=\"188.7,-158.25 188.7,-286.25 258.7,-286.25 258.7,-158.25 188.7,-158.25\"/>\n",
+       "<polygon fill=\"url(#clust5_l_10)\" stroke=\"black\" points=\"188.7,-158.25 188.7,-286.25 258.7,-286.25 258.7,-158.25 188.7,-158.25\"/>\n",
        "<text text-anchor=\"middle\" x=\"223.7\" y=\"-268.95\" font-family=\"Times,serif\" font-size=\"14.00\">Outputs</text>\n",
        "</g>\n",
-       "<g id=\"clust6\" class=\"cluster\">\n",
-       "<title>clustersimpleb</title>\n",
-       "<defs>\n",
-       "<linearGradient id=\"clust6_l_8\" gradientUnits=\"userSpaceOnUse\" x1=\"362.7\" y1=\"-75.25\" x2=\"362.7\" y2=\"-241.25\" >\n",
-       "<stop offset=\"0\" style=\"stop-color:#17becf;stop-opacity:1.;\"/>\n",
-       "<stop offset=\"1\" style=\"stop-color:#b9ecf1;stop-opacity:1.;\"/>\n",
-       "</linearGradient>\n",
-       "</defs>\n",
-       "<polygon fill=\"url(#clust6_l_8)\" stroke=\"black\" points=\"274.7,-75.25 274.7,-241.25 450.7,-241.25 450.7,-75.25 274.7,-75.25\"/>\n",
-       "<text text-anchor=\"middle\" x=\"362.7\" y=\"-223.95\" font-family=\"Times,serif\" font-size=\"14.00\">b: AddOne</text>\n",
-       "</g>\n",
-       "<g id=\"clust7\" class=\"cluster\">\n",
-       "<title>clustersimplebInputs</title>\n",
-       "<defs>\n",
-       "<linearGradient id=\"clust7_l_9\" gradientUnits=\"userSpaceOnUse\" x1=\"282.7\" y1=\"-147.25\" x2=\"352.7\" y2=\"-147.25\" >\n",
-       "<stop offset=\"0\" style=\"stop-color:#7f7f7f;stop-opacity:1.;\"/>\n",
-       "<stop offset=\"1\" style=\"stop-color:#d9d9d9;stop-opacity:1.;\"/>\n",
-       "</linearGradient>\n",
-       "</defs>\n",
-       "<polygon fill=\"url(#clust7_l_9)\" stroke=\"black\" points=\"282.7,-83.25 282.7,-211.25 352.7,-211.25 352.7,-83.25 282.7,-83.25\"/>\n",
-       "<text text-anchor=\"middle\" x=\"317.7\" y=\"-193.95\" font-family=\"Times,serif\" font-size=\"14.00\">Inputs</text>\n",
-       "</g>\n",
-       "<g id=\"clust8\" class=\"cluster\">\n",
-       "<title>clustersimplebOutputs</title>\n",
-       "<defs>\n",
-       "<linearGradient id=\"clust8_l_10\" gradientUnits=\"userSpaceOnUse\" x1=\"442.7\" y1=\"-147.25\" x2=\"372.7\" y2=\"-147.25\" >\n",
-       "<stop offset=\"0\" style=\"stop-color:#7f7f7f;stop-opacity:1.;\"/>\n",
-       "<stop offset=\"1\" style=\"stop-color:#d9d9d9;stop-opacity:1.;\"/>\n",
-       "</linearGradient>\n",
-       "</defs>\n",
-       "<polygon fill=\"url(#clust8_l_10)\" stroke=\"black\" points=\"372.7,-83.25 372.7,-211.25 442.7,-211.25 442.7,-83.25 372.7,-83.25\"/>\n",
-       "<text text-anchor=\"middle\" x=\"407.7\" y=\"-193.95\" font-family=\"Times,serif\" font-size=\"14.00\">Outputs</text>\n",
-       "</g>\n",
        "<!-- clustersimpleInputsrun -->\n",
        "<g id=\"node1\" class=\"node\">\n",
        "<title>clustersimpleInputsrun</title>\n",
@@ -1264,10 +1309,10 @@
        "</svg>\n"
       ],
       "text/plain": [
-       "<graphviz.graphs.Digraph at 0x146704050>"
+       "<graphviz.graphs.Digraph at 0x111b52f90>"
       ]
      },
-     "execution_count": 30,
+     "execution_count": 32,
      "metadata": {},
      "output_type": "execute_result"
     }
@@ -1294,14 +1339,14 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 31,
+   "execution_count": 33,
    "id": "ae500d5e-e55b-432c-8b5f-d5892193cdf5",
    "metadata": {},
    "outputs": [
     {
      "data": {
       "application/vnd.jupyter.widget-view+json": {
-       "model_id": "9fad22dbcc8940cbaa936a48e84d054c",
+       "model_id": "55f7b5a7a3704dd98ddfe767bd36d833",
        "version_major": 2,
        "version_minor": 0
       },
@@ -1320,10 +1365,10 @@
     {
      "data": {
       "text/plain": [
-       "<matplotlib.collections.PathCollection at 0x15240c450>"
+       "<matplotlib.collections.PathCollection at 0x158853190>"
       ]
      },
-     "execution_count": 31,
+     "execution_count": 33,
      "metadata": {},
      "output_type": "execute_result"
     },
@@ -1366,7 +1411,7 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 32,
+   "execution_count": 34,
    "id": "2114d0c3-cdad-43c7-9ffa-50c36d56d18f",
    "metadata": {},
    "outputs": [
@@ -1574,10 +1619,10 @@
        "</svg>\n"
       ],
       "text/plain": [
-       "<graphviz.graphs.Digraph at 0x15241edd0>"
+       "<graphviz.graphs.Digraph at 0x1587f8350>"
       ]
      },
-     "execution_count": 32,
+     "execution_count": 34,
      "metadata": {},
      "output_type": "execute_result"
     }
@@ -1594,6 +1639,14 @@
     "Note: the `draw` call returns a `graphviz.graphs.Digraphs` object; these get natively rendered alright in jupyter notebooks, as seen above, but you can also snag the object in a variable and do everything else graphviz allows, e.g. using the `render` method on the object to save it to file. Cf. the graphviz docs for details."
    ]
   },
+  {
+   "cell_type": "markdown",
+   "id": "7a4e235d-905f-4763-a1ff-d0c3e24c591c",
+   "metadata": {},
+   "source": [
+    "Workflows are \"living\" objects -- their IO is (re)created on access, so it is always up-to-date with the latest state of the workflow's children (who's there and who they're connected to), and (unless you explicitly tell it otherwise) a workflow will always re-compute the execution flow at run-time. This makes them incredibly convenient for working with as you put together a new computational graph, but is not particularly computationally efficeint."
+   ]
+  },
   {
    "cell_type": "markdown",
    "id": "d1f3b308-28b2-466b-8cf5-6bfd806c08ca",
@@ -1601,12 +1654,12 @@
    "source": [
     "# Macros\n",
     "\n",
-    "Once you have a workflow that you're happy with, you may want to store it as a macro so it can be stored in a human-readable way, reused, and shared. Automated conversion of an existing `Workflow` instance into a `Macro` subclass is still on the TODO list, but defining a new macro is pretty easy: they are just composite nodes that have a function defining their graph setup:"
+    "Once you have a workflow that you're happy with, you may want to store it as a macro so it can be stored in a human-readable way, reused, shared, and executed with more efficiency than the \"living\" `Workflow` instance. Automated conversion of an existing `Workflow` instance into a `Macro` subclass is still on the TODO list, but defining a new macro is pretty easy: they are just composite nodes that have a function defining their graph setup:"
    ]
   },
   {
    "cell_type": "code",
-   "execution_count": 33,
+   "execution_count": 35,
    "id": "c71a8308-f8a1-4041-bea0-1c841e072a6d",
    "metadata": {},
    "outputs": [],
@@ -1616,7 +1669,7 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 34,
+   "execution_count": 36,
    "id": "2b9bb21a-73cd-444e-84a9-100e202aa422",
    "metadata": {},
    "outputs": [
@@ -1634,7 +1687,7 @@
        "13"
       ]
      },
-     "execution_count": 34,
+     "execution_count": 36,
      "metadata": {},
      "output_type": "execute_result"
     }
@@ -1663,6 +1716,14 @@
     "macro(add_one__x=10).add_three__result"
    ]
   },
+  {
+   "cell_type": "markdown",
+   "id": "d4f797d6-8d88-415f-bb9c-00f3e1b15e37",
+   "metadata": {},
+   "source": [
+    "Even in the abscence of an automated converter, it should be easy to take the workflow you've been developing and copy-paste that code into a function -- then bam, you've got a macro!"
+   ]
+  },
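+  {
+   "cell_type": "markdown",
+   "id": "9c1d2e3f-4a5b-4c6d-8e7f-0a1b2c3d4e5f",
+   "metadata": {},
+   "source": [
+    "In plain-python terms (no `pyiron_workflow` required), the copy-paste recipe just means wrapping the steps of your workflow in a single function:\n",
+    "\n",
+    "```python\n",
+    "def add_one(x):\n",
+    "    return x + 1\n",
+    "\n",
+    "def add_three_macro(x):\n",
+    "    # the body is just the workflow's steps, pasted into a function\n",
+    "    return add_one(add_one(add_one(x)))\n",
+    "\n",
+    "add_three_macro(10)  # 13, the same result as the macro above\n",
+    "```"
+   ]
+  },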
   {
    "cell_type": "markdown",
    "id": "bd5099c4-1c01-4a45-a5bb-e5087595db9f",
@@ -1673,7 +1734,7 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 35,
+   "execution_count": 37,
    "id": "3668f9a9-adca-48a4-84ea-13add965897c",
    "metadata": {},
    "outputs": [
@@ -1683,7 +1744,7 @@
        "{'intermediate': 102, 'plus_three': 103}"
       ]
      },
-     "execution_count": 35,
+     "execution_count": 37,
      "metadata": {},
      "output_type": "execute_result"
     }
@@ -1714,14 +1775,14 @@
    "source": [
     "## Nesting\n",
     "\n",
-    "Composite nodes can be nested to abstract workflows into simpler components -- i.e. macros can be added to workflows, and macros can be used inside of macros.\n",
+    "Composite nodes can be nested to abstract workflows into simpler components -- i.e. macros can be added to workflows, and macros can be used inside of macros. This is a critically important feature because it allows us to easily create more and more complex workflows by \"composing\" simple(r) sub-graphs together!\n",
     "\n",
     "For our final example, let's define a macro for doing Lammps minimizations, then use this in a workflow to compare energies between different phases."
    ]
   },
   {
    "cell_type": "code",
-   "execution_count": 36,
+   "execution_count": 38,
    "id": "9aaeeec0-5f88-4c94-a6cc-45b56d2f0111",
    "metadata": {},
    "outputs": [],
@@ -1751,7 +1812,7 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 37,
+   "execution_count": 39,
    "id": "a832e552-b3cc-411a-a258-ef21574fc439",
    "metadata": {},
    "outputs": [],
@@ -1778,7 +1839,7 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 38,
+   "execution_count": 40,
    "id": "b764a447-236f-4cb7-952a-7cba4855087d",
    "metadata": {},
    "outputs": [
@@ -3002,10 +3063,10 @@
        "</svg>\n"
       ],
       "text/plain": [
-       "<graphviz.graphs.Digraph at 0x146a3b110>"
+       "<graphviz.graphs.Digraph at 0x1588e7f10>"
       ]
      },
-     "execution_count": 38,
+     "execution_count": 40,
      "metadata": {},
      "output_type": "execute_result"
     }
@@ -3016,7 +3077,7 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 39,
+   "execution_count": 41,
    "id": "b51bef25-86c5-4d57-80c1-ab733e703caf",
    "metadata": {},
    "outputs": [
@@ -3037,7 +3098,7 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 40,
+   "execution_count": 42,
    "id": "091e2386-0081-436c-a736-23d019bd9b91",
    "metadata": {},
    "outputs": [
@@ -3078,7 +3139,7 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 41,
+   "execution_count": 43,
    "id": "4cdffdca-48d3-4486-9045-48102c7e5f31",
    "metadata": {},
    "outputs": [
@@ -3116,7 +3177,7 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 42,
+   "execution_count": 44,
    "id": "ed4a3a22-fc3a-44c9-9d4f-c65bc1288889",
    "metadata": {},
    "outputs": [
@@ -3146,7 +3207,7 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 43,
+   "execution_count": 45,
    "id": "5a985cbf-c308-4369-9223-b8a37edb8ab1",
    "metadata": {},
    "outputs": [
@@ -3207,7 +3268,9 @@
    "source": [
     "## Parallelization\n",
     "\n",
-    "You can currently run nodes in a single-core background process by setting that node's `executor` to `True`. The plan is to eventually lean on `pympipool` for more powerful executors that allow for multiple cores, interaction with HPC clusters, etc. We may also leverage the `Submitter` in `pyiron_contrib.tinybase` so that multiple nodes can lean on the same resources.\n",
+    "You can currently run nodes in a single-core background process by setting that node's `executor` to `True`. If you're interested you can dig into the test suite to see a number of examples for this.\n",
+    "\n",
+    "The much more powerful executors in pyiron's `pympipool` appear to also be working, which should allow for graph nodes to run using multiple cores, interact with HPC clusters, etc.  We may also leverage the `Submitter` in `pyiron_contrib.tinybase` so that multiple nodes can lean on the same resources.\n",
     "\n",
     "Unfortunately, _nested_ executors are not yet working. So if you set a macro to use an executor, none of its (grand...)children may specify an executor.\n",
     "\n",
@@ -3253,7 +3316,7 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 44,
+   "execution_count": 46,
    "id": "0b373764-b389-4c24-8086-f3d33a4f7fd7",
    "metadata": {},
    "outputs": [
@@ -3267,7 +3330,7 @@
        " 17.230249999999995]"
       ]
      },
-     "execution_count": 44,
+     "execution_count": 46,
      "metadata": {},
      "output_type": "execute_result"
     }
@@ -3304,7 +3367,7 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 45,
+   "execution_count": 47,
    "id": "0dd04b4c-e3e7-4072-ad34-58f2c1e4f596",
    "metadata": {},
    "outputs": [
@@ -3363,7 +3426,7 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 46,
+   "execution_count": 48,
    "id": "2dfb967b-41ac-4463-b606-3e315e617f2a",
    "metadata": {},
    "outputs": [
@@ -3387,7 +3450,7 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 47,
+   "execution_count": 49,
    "id": "2e87f858-b327-4f6b-9237-c8a557f29aeb",
    "metadata": {},
    "outputs": [
@@ -3395,14 +3458,14 @@
      "name": "stdout",
      "output_type": "stream",
      "text": [
-      "0.460 > 0.2\n",
-      "0.990 > 0.2\n",
+      "0.851 > 0.2\n",
+      "0.497 > 0.2\n",
+      "0.779 > 0.2\n",
       "0.321 > 0.2\n",
-      "0.663 > 0.2\n",
-      "0.231 > 0.2\n",
-      "0.695 > 0.2\n",
-      "0.122 <= 0.2\n",
-      "Finally 0.122\n"
+      "0.560 > 0.2\n",
+      "0.462 > 0.2\n",
+      "0.049 <= 0.2\n",
+      "Finally 0.049\n"
      ]
     }
    ],
diff --git a/notebooks/quickstart.ipynb b/notebooks/quickstart.ipynb
new file mode 100644
index 00000000..7357ef2c
--- /dev/null
+++ b/notebooks/quickstart.ipynb
@@ -0,0 +1,657 @@
+{
+ "cells": [
+  {
+   "cell_type": "markdown",
+   "id": "96fdb45b-624c-4301-a0cf-44874b0693b1",
+   "metadata": {},
+   "source": [
+    "# Pyiron workflows: quickstart\n",
+    "\n",
+    "You can start converting python functions to `pyiron_workflow` nodes by wrapping them with decorators accessible from our single-point-of-entry, the `Workflow` class:"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 1,
+   "id": "4655322e-5755-455e-aff7-30067a999b7d",
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "from pyiron_workflow import Workflow"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "8d6274b4-880d-40d7-9ce9-63d05c4a60e2",
+   "metadata": {},
+   "source": [
+    "## From function to node\n",
+    "\n",
+    "Let's start with a super simple function that only returns a single thing"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 2,
+   "id": "4022f7b6-1192-454f-bc15-98d8242fedaf",
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "@Workflow.wrap_as.single_value_node()\n",
+    "def add_one(x):\n",
+    "    y = x + 1\n",
+    "    return y\n",
+    "\n",
+    "node = add_one()"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "7c04df9a-856d-4015-87f5-b8ce3b0d87df",
+   "metadata": {},
+   "source": [
+    "This node object can be run just like the function it wraps"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 3,
+   "id": "4520136f-d8a7-4721-9eb3-52b271cce33f",
+   "metadata": {},
+   "outputs": [
+    {
+     "data": {
+      "text/plain": [
+       "43"
+      ]
+     },
+     "execution_count": 3,
+     "metadata": {},
+     "output_type": "execute_result"
+    }
+   ],
+   "source": [
+    "node(42)"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "d5e804d6-93ab-43a0-a330-31b76b719a18",
+   "metadata": {},
+   "source": [
+    "But is also a class instance with input and output channels (note that here the output value takes its name based on what came after the `return` statement)"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 4,
+   "id": "e3577e45-f693-4ef4-80ed-743d2a8e0557",
+   "metadata": {},
+   "outputs": [
+    {
+     "data": {
+      "text/plain": [
+       "1"
+      ]
+     },
+     "execution_count": 4,
+     "metadata": {},
+     "output_type": "execute_result"
+    }
+   ],
+   "source": [
+    "node.inputs.x = 0\n",
+    "node.run()\n",
+    "node.outputs.y.value"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "4c4969d5-dd73-413d-af69-8f568b890247",
+   "metadata": {},
+   "source": [
+    "So other than being delayed, these nodes behave a _lot_ like the regular python functions that wrap them:"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 5,
+   "id": "768e99e8-901e-4f2b-9a80-4efe25d59e67",
+   "metadata": {},
+   "outputs": [
+    {
+     "name": "stderr",
+     "output_type": "stream",
+     "text": [
+      "/Users/huber/work/pyiron/pyiron_workflow/pyiron_workflow/channels.py:158: UserWarning: The channel run was not connected to ran, andthus could not disconnect from it.\n",
+      "  warn(\n"
+     ]
+    },
+    {
+     "data": {
+      "text/plain": [
+       "5"
+      ]
+     },
+     "execution_count": 5,
+     "metadata": {},
+     "output_type": "execute_result"
+    }
+   ],
+   "source": [
+    "calculation = add_one(add_one(add_one(2)))\n",
+    "calculation()"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "bfbdc0bb-fba0-45d9-b1bf-c0dfa07871c2",
+   "metadata": {},
+   "source": [
+    "But they are actually nodes, and what we saw above is just syntactic sugar for building a _graph_ connecting the inputs and outputs of the nodes:"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 6,
+   "id": "f1f7c7e2-0300-4be7-afd7-4a490bac06f9",
+   "metadata": {},
+   "outputs": [
+    {
+     "data": {
+      "text/plain": [
+       "3"
+      ]
+     },
+     "execution_count": 6,
+     "metadata": {},
+     "output_type": "execute_result"
+    }
+   ],
+   "source": [
+    "n1 = add_one()\n",
+    "n2 = add_one()\n",
+    "n3 = add_one()\n",
+    "\n",
+    "n2.inputs.x = n1.outputs.y\n",
+    "n3.inputs.x = n2.outputs.y\n",
+    "\n",
+    "n1.inputs.x = 0\n",
+    "n3()"
+   ]
+  },
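+  {
+   "cell_type": "markdown",
+   "id": "5e6f7a8b-9c0d-4e1f-8a2b-3c4d5e6f7a8b",
+   "metadata": {},
+   "source": [
+    "Conceptually, each connection just tells a node where to pull its input from at run-time. A toy sketch of that pull-based behaviour (illustrative only -- not `pyiron_workflow`'s actual implementation):\n",
+    "\n",
+    "```python\n",
+    "class ToyNode:\n",
+    "    def __init__(self, fn):\n",
+    "        self.fn = fn\n",
+    "        self.x = None  # a plain value, or an upstream ToyNode\n",
+    "\n",
+    "    def run(self):\n",
+    "        # pull the upstream result first, then apply our own function\n",
+    "        upstream = self.x.run() if isinstance(self.x, ToyNode) else self.x\n",
+    "        return self.fn(upstream)\n",
+    "\n",
+    "add_one = lambda v: v + 1\n",
+    "n1, n2, n3 = ToyNode(add_one), ToyNode(add_one), ToyNode(add_one)\n",
+    "n2.x = n1  # \"connect\" n1's output to n2's input\n",
+    "n3.x = n2\n",
+    "n1.x = 0\n",
+    "n3.run()  # 3\n",
+    "```"
+   ]
+  },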
+  {
+   "cell_type": "markdown",
+   "id": "dfa3db51-31d7-43c8-820a-6e5f3525837e",
+   "metadata": {},
+   "source": [
+    "## Putting it together in a workflow\n",
+    "\n",
+    "We can work with nodes all by themselves, but since the whole point is to connect them together to make a computation graph, we can get extra tools by intentionally making these children of a `Workflow` node.\n",
+    "\n",
+    "The `Workflow` class not only gives us access to the decorators for defining new nodes, but also lets us register modules of existing nodes and use them. Let's put together a workflow that uses both an existing node from a package, and a `Function` node that is more general than we used above in that it allows us to have multiple return values. This function node will also exploit our ability to name outputs and give type hints:"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 7,
+   "id": "4c80aee3-a8e4-444c-9260-3078f8d617a4",
+   "metadata": {},
+   "outputs": [
+    {
+     "data": {
+      "image/svg+xml": [
+       "<?xml version=\"1.0\" encoding=\"UTF-8\" standalone=\"no\"?>\n",
+       "<!DOCTYPE svg PUBLIC \"-//W3C//DTD SVG 1.1//EN\"\n",
+       " \"http://www.w3.org/Graphics/SVG/1.1/DTD/svg11.dtd\">\n",
+       "<!-- Generated by graphviz version 8.0.5 (0)\n",
+       " -->\n",
+       "<!-- Title: clustermy_workflow Pages: 1 -->\n",
+       "<svg width=\"729pt\" height=\"304pt\"\n",
+       " viewBox=\"0.00 0.00 728.94 304.25\" xmlns=\"http://www.w3.org/2000/svg\" xmlns:xlink=\"http://www.w3.org/1999/xlink\">\n",
+       "<g id=\"graph0\" class=\"graph\" transform=\"scale(1 1) rotate(0) translate(4 300.25)\">\n",
+       "<title>clustermy_workflow</title>\n",
+       "<polygon fill=\"white\" stroke=\"none\" points=\"-4,4 -4,-300.25 724.94,-300.25 724.94,4 -4,4\"/>\n",
+       "<text text-anchor=\"middle\" x=\"360.47\" y=\"-4.95\" font-family=\"Times,serif\" font-size=\"14.00\">my_workflow: Workflow</text>\n",
+       "<g id=\"clust1\" class=\"cluster\">\n",
+       "<title>clustermy_workflowInputs</title>\n",
+       "<defs>\n",
+       "<linearGradient id=\"clust1_l_0\" gradientUnits=\"userSpaceOnUse\" x1=\"8\" y1=\"-94.25\" x2=\"139.76\" y2=\"-94.25\" >\n",
+       "<stop offset=\"0\" style=\"stop-color:#7f7f7f;stop-opacity:1.;\"/>\n",
+       "<stop offset=\"1\" style=\"stop-color:#d9d9d9;stop-opacity:1.;\"/>\n",
+       "</linearGradient>\n",
+       "</defs>\n",
+       "<polygon fill=\"url(#clust1_l_0)\" stroke=\"black\" points=\"8,-30.25 8,-158.25 139.76,-158.25 139.76,-30.25 8,-30.25\"/>\n",
+       "<text text-anchor=\"middle\" x=\"73.88\" y=\"-140.95\" font-family=\"Times,serif\" font-size=\"14.00\">Inputs</text>\n",
+       "</g>\n",
+       "<g id=\"clust2\" class=\"cluster\">\n",
+       "<title>clustermy_workflowOutputs</title>\n",
+       "<defs>\n",
+       "<linearGradient id=\"clust2_l_1\" gradientUnits=\"userSpaceOnUse\" x1=\"712.94\" y1=\"-120.25\" x2=\"614.58\" y2=\"-120.25\" >\n",
+       "<stop offset=\"0\" style=\"stop-color:#7f7f7f;stop-opacity:1.;\"/>\n",
+       "<stop offset=\"1\" style=\"stop-color:#d9d9d9;stop-opacity:1.;\"/>\n",
+       "</linearGradient>\n",
+       "</defs>\n",
+       "<polygon fill=\"url(#clust2_l_1)\" stroke=\"black\" points=\"614.58,-56.25 614.58,-184.25 712.94,-184.25 712.94,-56.25 614.58,-56.25\"/>\n",
+       "<text text-anchor=\"middle\" x=\"663.76\" y=\"-166.95\" font-family=\"Times,serif\" font-size=\"14.00\">Outputs</text>\n",
+       "</g>\n",
+       "<g id=\"clust3\" class=\"cluster\">\n",
+       "<title>clustermy_workflowarrays</title>\n",
+       "<defs>\n",
+       "<linearGradient id=\"clust3_l_2\" gradientUnits=\"userSpaceOnUse\" x1=\"271.46\" y1=\"-68.25\" x2=\"271.46\" y2=\"-288.25\" >\n",
+       "<stop offset=\"0\" style=\"stop-color:#2ca02c;stop-opacity:1.;\"/>\n",
+       "<stop offset=\"1\" style=\"stop-color:#c0e2c0;stop-opacity:1.;\"/>\n",
+       "</linearGradient>\n",
+       "</defs>\n",
+       "<polygon fill=\"url(#clust3_l_2)\" stroke=\"black\" points=\"151.76,-68.25 151.76,-288.25 391.16,-288.25 391.16,-68.25 151.76,-68.25\"/>\n",
+       "<text text-anchor=\"middle\" x=\"271.46\" y=\"-270.95\" font-family=\"Times,serif\" font-size=\"14.00\">arrays: SquareRange</text>\n",
+       "</g>\n",
+       "<g id=\"clust4\" class=\"cluster\">\n",
+       "<title>clustermy_workflowarraysInputs</title>\n",
+       "<defs>\n",
+       "<linearGradient id=\"clust4_l_3\" gradientUnits=\"userSpaceOnUse\" x1=\"159.76\" y1=\"-174.25\" x2=\"232.36\" y2=\"-174.25\" >\n",
+       "<stop offset=\"0\" style=\"stop-color:#7f7f7f;stop-opacity:1.;\"/>\n",
+       "<stop offset=\"1\" style=\"stop-color:#d9d9d9;stop-opacity:1.;\"/>\n",
+       "</linearGradient>\n",
+       "</defs>\n",
+       "<polygon fill=\"url(#clust4_l_3)\" stroke=\"black\" points=\"159.76,-110.25 159.76,-238.25 232.36,-238.25 232.36,-110.25 159.76,-110.25\"/>\n",
+       "<text text-anchor=\"middle\" x=\"196.06\" y=\"-220.95\" font-family=\"Times,serif\" font-size=\"14.00\">Inputs</text>\n",
+       "</g>\n",
+       "<g id=\"clust5\" class=\"cluster\">\n",
+       "<title>clustermy_workflowarraysOutputs</title>\n",
+       "<defs>\n",
+       "<linearGradient id=\"clust5_l_4\" gradientUnits=\"userSpaceOnUse\" x1=\"383.16\" y1=\"-167.25\" x2=\"252.36\" y2=\"-167.25\" >\n",
+       "<stop offset=\"0\" style=\"stop-color:#7f7f7f;stop-opacity:1.;\"/>\n",
+       "<stop offset=\"1\" style=\"stop-color:#d9d9d9;stop-opacity:1.;\"/>\n",
+       "</linearGradient>\n",
+       "</defs>\n",
+       "<polygon fill=\"url(#clust5_l_4)\" stroke=\"black\" points=\"252.36,-76.25 252.36,-258.25 383.16,-258.25 383.16,-76.25 252.36,-76.25\"/>\n",
+       "<text text-anchor=\"middle\" x=\"317.76\" y=\"-240.95\" font-family=\"Times,serif\" font-size=\"14.00\">Outputs</text>\n",
+       "</g>\n",
+       "<g id=\"clust6\" class=\"cluster\">\n",
+       "<title>clustermy_workflowplot</title>\n",
+       "<defs>\n",
+       "<linearGradient id=\"clust6_l_5\" gradientUnits=\"userSpaceOnUse\" x1=\"500.87\" y1=\"-68.25\" x2=\"500.87\" y2=\"-288.25\" >\n",
+       "<stop offset=\"0\" style=\"stop-color:#17becf;stop-opacity:1.;\"/>\n",
+       "<stop offset=\"1\" style=\"stop-color:#b9ecf1;stop-opacity:1.;\"/>\n",
+       "</linearGradient>\n",
+       "</defs>\n",
+       "<polygon fill=\"url(#clust6_l_5)\" stroke=\"black\" points=\"399.16,-68.25 399.16,-288.25 602.58,-288.25 602.58,-68.25 399.16,-68.25\"/>\n",
+       "<text text-anchor=\"middle\" x=\"500.87\" y=\"-270.95\" font-family=\"Times,serif\" font-size=\"14.00\">plot: Scatter</text>\n",
+       "</g>\n",
+       "<g id=\"clust7\" class=\"cluster\">\n",
+       "<title>clustermy_workflowplotInputs</title>\n",
+       "<defs>\n",
+       "<linearGradient id=\"clust7_l_6\" gradientUnits=\"userSpaceOnUse\" x1=\"407.16\" y1=\"-167.25\" x2=\"504.58\" y2=\"-167.25\" >\n",
+       "<stop offset=\"0\" style=\"stop-color:#7f7f7f;stop-opacity:1.;\"/>\n",
+       "<stop offset=\"1\" style=\"stop-color:#d9d9d9;stop-opacity:1.;\"/>\n",
+       "</linearGradient>\n",
+       "</defs>\n",
+       "<polygon fill=\"url(#clust7_l_6)\" stroke=\"black\" points=\"407.16,-76.25 407.16,-258.25 504.58,-258.25 504.58,-76.25 407.16,-76.25\"/>\n",
+       "<text text-anchor=\"middle\" x=\"455.87\" y=\"-240.95\" font-family=\"Times,serif\" font-size=\"14.00\">Inputs</text>\n",
+       "</g>\n",
+       "<g id=\"clust8\" class=\"cluster\">\n",
+       "<title>clustermy_workflowplotOutputs</title>\n",
+       "<defs>\n",
+       "<linearGradient id=\"clust8_l_7\" gradientUnits=\"userSpaceOnUse\" x1=\"594.58\" y1=\"-187.25\" x2=\"524.58\" y2=\"-187.25\" >\n",
+       "<stop offset=\"0\" style=\"stop-color:#7f7f7f;stop-opacity:1.;\"/>\n",
+       "<stop offset=\"1\" style=\"stop-color:#d9d9d9;stop-opacity:1.;\"/>\n",
+       "</linearGradient>\n",
+       "</defs>\n",
+       "<polygon fill=\"url(#clust8_l_7)\" stroke=\"black\" points=\"524.58,-123.25 524.58,-251.25 594.58,-251.25 594.58,-123.25 524.58,-123.25\"/>\n",
+       "<text text-anchor=\"middle\" x=\"559.58\" y=\"-233.95\" font-family=\"Times,serif\" font-size=\"14.00\">Outputs</text>\n",
+       "</g>\n",
+       "<!-- clustermy_workflowInputsrun -->\n",
+       "<g id=\"node1\" class=\"node\">\n",
+       "<title>clustermy_workflowInputsrun</title>\n",
+       "<polygon fill=\"#1f77b4\" stroke=\"#1f77b4\" points=\"88.88,-68.25 46.88,-68.25 46.88,-44.25 88.88,-44.25 100.88,-56.25 88.88,-68.25\"/>\n",
+       "<text text-anchor=\"middle\" x=\"73.88\" y=\"-50.08\" font-family=\"Times,serif\" font-size=\"14.00\">run</text>\n",
+       "</g>\n",
+       "<!-- clustermy_workflowOutputsran -->\n",
+       "<g id=\"node3\" class=\"node\">\n",
+       "<title>clustermy_workflowOutputsran</title>\n",
+       "<polygon fill=\"#1f77b4\" stroke=\"#1f77b4\" points=\"678.76,-94.25 636.76,-94.25 636.76,-70.25 678.76,-70.25 690.76,-82.25 678.76,-94.25\"/>\n",
+       "<text text-anchor=\"middle\" x=\"663.76\" y=\"-76.08\" font-family=\"Times,serif\" font-size=\"14.00\">ran</text>\n",
+       "</g>\n",
+       "<!-- clustermy_workflowInputsrun&#45;&gt;clustermy_workflowOutputsran -->\n",
+       "<!-- clustermy_workflowInputsarrays__x -->\n",
+       "<g id=\"node2\" class=\"node\">\n",
+       "<title>clustermy_workflowInputsarrays__x</title>\n",
+       "<ellipse fill=\"#ff7f0e\" stroke=\"#ff7f0e\" cx=\"73.88\" cy=\"-110.25\" rx=\"57.88\" ry=\"18\"/>\n",
+       "<text text-anchor=\"middle\" x=\"73.88\" y=\"-104.08\" font-family=\"Times,serif\" font-size=\"14.00\">arrays__x: int</text>\n",
+       "</g>\n",
+       "<!-- clustermy_workflowarraysInputsx -->\n",
+       "<g id=\"node6\" class=\"node\">\n",
+       "<title>clustermy_workflowarraysInputsx</title>\n",
+       "<ellipse fill=\"#ff7f0e\" stroke=\"#ff7f0e\" cx=\"196.06\" cy=\"-136.25\" rx=\"28.3\" ry=\"18\"/>\n",
+       "<text text-anchor=\"middle\" x=\"196.06\" y=\"-130.07\" font-family=\"Times,serif\" font-size=\"14.00\">x: int</text>\n",
+       "</g>\n",
+       "<!-- clustermy_workflowInputsarrays__x&#45;&gt;clustermy_workflowarraysInputsx -->\n",
+       "<g id=\"edge6\" class=\"edge\">\n",
+       "<title>clustermy_workflowInputsarrays__x&#45;&gt;clustermy_workflowarraysInputsx</title>\n",
+       "<path fill=\"none\" stroke=\"#ff7f0e\" d=\"M122.38,-120.53C128.39,-121.83 134.59,-123.17 140.7,-124.49\"/>\n",
+       "<path fill=\"none\" stroke=\"#ff7f0e\" d=\"M140.7,-124.49C146.81,-125.81 152.84,-127.12 158.54,-128.35\"/>\n",
+       "<polygon fill=\"#ff7f0e\" stroke=\"#ff7f0e\" points=\"157.47,-131.92 167.99,-130.61 158.95,-125.07 157.47,-131.92\"/>\n",
+       "</g>\n",
+       "<!-- clustermy_workflowOutputsplot__fig -->\n",
+       "<g id=\"node4\" class=\"node\">\n",
+       "<title>clustermy_workflowOutputsplot__fig</title>\n",
+       "<ellipse fill=\"#ff7f0e\" stroke=\"#ff7f0e\" cx=\"663.76\" cy=\"-136.25\" rx=\"41.18\" ry=\"18\"/>\n",
+       "<text text-anchor=\"middle\" x=\"663.76\" y=\"-130.07\" font-family=\"Times,serif\" font-size=\"14.00\">plot__fig</text>\n",
+       "</g>\n",
+       "<!-- clustermy_workflowarraysInputsrun -->\n",
+       "<g id=\"node5\" class=\"node\">\n",
+       "<title>clustermy_workflowarraysInputsrun</title>\n",
+       "<polygon fill=\"#1f77b4\" stroke=\"#1f77b4\" points=\"211.06,-202.25 169.06,-202.25 169.06,-178.25 211.06,-178.25 223.06,-190.25 211.06,-202.25\"/>\n",
+       "<text text-anchor=\"middle\" x=\"196.06\" y=\"-184.07\" font-family=\"Times,serif\" font-size=\"14.00\">run</text>\n",
+       "</g>\n",
+       "<!-- clustermy_workflowarraysOutputsran -->\n",
+       "<g id=\"node7\" class=\"node\">\n",
+       "<title>clustermy_workflowarraysOutputsran</title>\n",
+       "<polygon fill=\"#1f77b4\" stroke=\"#1f77b4\" points=\"332.76,-222.25 290.76,-222.25 290.76,-198.25 332.76,-198.25 344.76,-210.25 332.76,-222.25\"/>\n",
+       "<text text-anchor=\"middle\" x=\"317.76\" y=\"-204.07\" font-family=\"Times,serif\" font-size=\"14.00\">ran</text>\n",
+       "</g>\n",
+       "<!-- clustermy_workflowarraysInputsrun&#45;&gt;clustermy_workflowarraysOutputsran -->\n",
+       "<!-- clustermy_workflowarraysOutputsx -->\n",
+       "<g id=\"node8\" class=\"node\">\n",
+       "<title>clustermy_workflowarraysOutputsx</title>\n",
+       "<ellipse fill=\"#ff7f0e\" stroke=\"#ff7f0e\" cx=\"317.76\" cy=\"-156.25\" rx=\"45.48\" ry=\"18\"/>\n",
+       "<text text-anchor=\"middle\" x=\"317.76\" y=\"-150.07\" font-family=\"Times,serif\" font-size=\"14.00\">x: ndarray</text>\n",
+       "</g>\n",
+       "<!-- clustermy_workflowplotInputsx -->\n",
+       "<g id=\"node11\" class=\"node\">\n",
+       "<title>clustermy_workflowplotInputsx</title>\n",
+       "<ellipse fill=\"#ff7f0e\" stroke=\"#ff7f0e\" cx=\"455.87\" cy=\"-156.25\" rx=\"40.71\" ry=\"18\"/>\n",
+       "<text text-anchor=\"middle\" x=\"455.87\" y=\"-150.07\" font-family=\"Times,serif\" font-size=\"14.00\">x: Union</text>\n",
+       "</g>\n",
+       "<!-- clustermy_workflowarraysOutputsx&#45;&gt;clustermy_workflowplotInputsx -->\n",
+       "<g id=\"edge4\" class=\"edge\">\n",
+       "<title>clustermy_workflowarraysOutputsx&#45;&gt;clustermy_workflowplotInputsx</title>\n",
+       "<path fill=\"none\" stroke=\"#ff7f0e\" d=\"M363.71,-156.25C370.18,-156.25 376.97,-156.25 383.8,-156.25\"/>\n",
+       "<path fill=\"none\" stroke=\"#ff7f0e\" d=\"M383.8,-156.25C390.63,-156.25 397.5,-156.25 404.13,-156.25\"/>\n",
+       "<polygon fill=\"#ff7f0e\" stroke=\"#ff7f0e\" points=\"404.01,-159.75 414.01,-156.25 404.01,-152.75 404.01,-159.75\"/>\n",
+       "</g>\n",
+       "<!-- clustermy_workflowarraysOutputsx_sq -->\n",
+       "<g id=\"node9\" class=\"node\">\n",
+       "<title>clustermy_workflowarraysOutputsx_sq</title>\n",
+       "<ellipse fill=\"#ff7f0e\" stroke=\"#ff7f0e\" cx=\"317.76\" cy=\"-102.25\" rx=\"57.4\" ry=\"18\"/>\n",
+       "<text text-anchor=\"middle\" x=\"317.76\" y=\"-96.08\" font-family=\"Times,serif\" font-size=\"14.00\">x_sq: ndarray</text>\n",
+       "</g>\n",
+       "<!-- clustermy_workflowplotInputsy -->\n",
+       "<g id=\"node12\" class=\"node\">\n",
+       "<title>clustermy_workflowplotInputsy</title>\n",
+       "<ellipse fill=\"#ff7f0e\" stroke=\"#ff7f0e\" cx=\"455.87\" cy=\"-102.25\" rx=\"40.71\" ry=\"18\"/>\n",
+       "<text text-anchor=\"middle\" x=\"455.87\" y=\"-96.08\" font-family=\"Times,serif\" font-size=\"14.00\">y: Union</text>\n",
+       "</g>\n",
+       "<!-- clustermy_workflowarraysOutputsx_sq&#45;&gt;clustermy_workflowplotInputsy -->\n",
+       "<g id=\"edge5\" class=\"edge\">\n",
+       "<title>clustermy_workflowarraysOutputsx_sq&#45;&gt;clustermy_workflowplotInputsy</title>\n",
+       "<path fill=\"none\" stroke=\"#ff7f0e\" d=\"M375.3,-102.25C380.08,-102.25 384.93,-102.25 389.77,-102.25\"/>\n",
+       "<path fill=\"none\" stroke=\"#ff7f0e\" d=\"M389.77,-102.25C394.6,-102.25 399.42,-102.25 404.12,-102.25\"/>\n",
+       "<polygon fill=\"#ff7f0e\" stroke=\"#ff7f0e\" points=\"403.88,-105.75 413.88,-102.25 403.88,-98.75 403.88,-105.75\"/>\n",
+       "</g>\n",
+       "<!-- clustermy_workflowplotInputsrun -->\n",
+       "<g id=\"node10\" class=\"node\">\n",
+       "<title>clustermy_workflowplotInputsrun</title>\n",
+       "<polygon fill=\"#1f77b4\" stroke=\"#1f77b4\" points=\"470.87,-222.25 428.87,-222.25 428.87,-198.25 470.87,-198.25 482.87,-210.25 470.87,-222.25\"/>\n",
+       "<text text-anchor=\"middle\" x=\"455.87\" y=\"-204.07\" font-family=\"Times,serif\" font-size=\"14.00\">run</text>\n",
+       "</g>\n",
+       "<!-- clustermy_workflowplotOutputsran -->\n",
+       "<g id=\"node13\" class=\"node\">\n",
+       "<title>clustermy_workflowplotOutputsran</title>\n",
+       "<polygon fill=\"#1f77b4\" stroke=\"#1f77b4\" points=\"574.58,-215.25 532.58,-215.25 532.58,-191.25 574.58,-191.25 586.58,-203.25 574.58,-215.25\"/>\n",
+       "<text text-anchor=\"middle\" x=\"559.58\" y=\"-197.07\" font-family=\"Times,serif\" font-size=\"14.00\">ran</text>\n",
+       "</g>\n",
+       "<!-- clustermy_workflowplotInputsrun&#45;&gt;clustermy_workflowplotOutputsran -->\n",
+       "<!-- clustermy_workflowplotOutputsfig -->\n",
+       "<g id=\"node14\" class=\"node\">\n",
+       "<title>clustermy_workflowplotOutputsfig</title>\n",
+       "<ellipse fill=\"#ff7f0e\" stroke=\"#ff7f0e\" cx=\"559.58\" cy=\"-149.25\" rx=\"27\" ry=\"18\"/>\n",
+       "<text text-anchor=\"middle\" x=\"559.58\" y=\"-143.07\" font-family=\"Times,serif\" font-size=\"14.00\">fig</text>\n",
+       "</g>\n",
+       "<!-- clustermy_workflowplotOutputsfig&#45;&gt;clustermy_workflowOutputsplot__fig -->\n",
+       "<g id=\"edge7\" class=\"edge\">\n",
+       "<title>clustermy_workflowplotOutputsfig&#45;&gt;clustermy_workflowOutputsplot__fig</title>\n",
+       "<path fill=\"none\" stroke=\"#ff7f0e\" d=\"M586.62,-145.94C590.66,-145.42 594.98,-144.87 599.42,-144.31\"/>\n",
+       "<path fill=\"none\" stroke=\"#ff7f0e\" d=\"M599.42,-144.31C603.87,-143.74 608.45,-143.16 613.01,-142.58\"/>\n",
+       "<polygon fill=\"#ff7f0e\" stroke=\"#ff7f0e\" points=\"613.15,-145.96 622.63,-141.23 612.27,-139.02 613.15,-145.96\"/>\n",
+       "</g>\n",
+       "</g>\n",
+       "</svg>\n"
+      ],
+      "text/plain": [
+       "<graphviz.graphs.Digraph at 0x14dc61990>"
+      ]
+     },
+     "execution_count": 7,
+     "metadata": {},
+     "output_type": "execute_result"
+    }
+   ],
+   "source": [
+    "import numpy as np\n",
+    "\n",
+    "wf = Workflow(\"my_workflow\")\n",
+    "\n",
+    "@Workflow.wrap_as.function_node([\"x\", \"x_sq\"])\n",
+    "def square_range(x: int) -> tuple[np.ndarray, np.ndarray]:\n",
+    "    x = np.arange(x)\n",
+    "    return x, (x**2)\n",
+    "\n",
+    "wf.register(\"plotting\", \"pyiron_workflow.node_library.plotting\")\n",
+    "\n",
+    "wf.arrays = square_range()\n",
+    "wf.plot = wf.create.plotting.Scatter(\n",
+    "    x=wf.arrays.outputs.x,\n",
+    "    y=wf.arrays.outputs.x_sq\n",
+    ")\n",
+    "\n",
+    "wf.draw()"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "ffc897e4-0f12-4231-8ebe-82862c890de5",
+   "metadata": {},
+   "source": [
+    "We can see that the workflow automatically exposes unconnected IO of its children and gives them a name based on the child node's name and that node's IO name.\n",
+    "\n",
+    "Let's run our workflow and look at the result:"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 8,
+   "id": "c499c0ed-7af5-491a-b340-2d2f4f48529c",
+   "metadata": {},
+   "outputs": [
+    {
+     "data": {
+      "text/plain": [
+       "<matplotlib.collections.PathCollection at 0x14e34ddd0>"
+      ]
+     },
+     "execution_count": 8,
+     "metadata": {},
+     "output_type": "execute_result"
+    },
+    {
+     "data": {
+      "image/png": "iVBORw0KGgoAAAANSUhEUgAAAh8AAAGdCAYAAACyzRGfAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjguMCwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy81sbWrAAAACXBIWXMAAA9hAAAPYQGoP6dpAAAhN0lEQVR4nO3de3BU5f3H8c8mIRu1yWqwIcmwQLQUDKmoXBQFhaII0gh12noBSrWdERrl1lGJ1mJsdaV1KHaoOHiDlkGcKYZKrUimkEQGKZckKqIgGiWVZFKL3Q2xWSE5vz/8ZceYzWXD2Wf3LO/XzPljn31OzveZh2E/85yby7IsSwAAAIYkxboAAABwZiF8AAAAowgfAADAKMIHAAAwivABAACMInwAAACjCB8AAMAowgcAADAqJdYFfF1bW5uOHTum9PR0uVyuWJcDAAB6wbIsNTU1KTc3V0lJ3a9txF34OHbsmLxeb6zLAAAAfVBXV6eBAwd22yfuwkd6erqkL4vPyMiIcTUAAKA3AoGAvF5v6He8O3EXPtpPtWRkZBA+AABwmN5cMsEFpwAAwCjCBwAAMIrwAQAAjCJ8AAAAowgfAADAKMIHAAAwivABAACMInwAAACj4u4hYwAAIDpa2yztqT2uxqYWZaWnaWxeppKTzL9HLeKVj8rKShUWFio3N1cul0ubN2/u1Ofdd9/VjTfeKI/Ho/T0dF1xxRU6evSoHfUCAIA+2HqgXuOXb9etT+/Wwo01uvXp3Rq/fLu2Hqg3XkvE4aO5uVkjR47UqlWrwn7/wQcfaPz48Ro+fLjKy8v15ptv6sEHH1RaWtppFwsAACK39UC95q+vUr2/pUN7g79F89dXGQ8gLsuyrD7v7HKptLRUM2fODLXdcsst6tevn/785z/36W8GAgF5PB75/X7e7QIAwGlqbbM0fvn2TsGjnUtStidNO+/77mmdgonk99vWC07b2tr0yiuv6Nvf/rauv/56ZWVl6fLLLw97aqZdMBhUIBDosAEAAHvsqT3eZfCQJEtSvb9Fe2qPG6vJ1vDR2NioEydO6LHHHtPUqVO1bds2ff/739dNN92kioqKsPv4fD55PJ7Q5vV67SwJAIAzWmNT18GjL/3sYPvKhyTNmDFDixcv1iWXXKKlS5fqe9/7np566qmw+xQXF8vv94e2uro6O0sCAOCMlpXeu2sue9vPDrbeanv++ecrJSVF+fn5Hdovuugi7dy5M+w+brdbbrfbzjIAAMD/G5uXqRxPmhr8LQp3kWf7NR9j8zKN1WTrykdqaqrGjBmjQ4cOdWg/fPiwBg8ebOehAABALyQnubSs8MtFga9fTtr+eVlhvtHnfUS88nHixAkdOXIk9Lm2tlY1NTXKzMzUoEGDdM899+jmm2/W1VdfrUmTJmnr1q3asmWLysvL7awbAAD00tSCHK2efZlKthzscPFptidNywrzNbUgx2g9Ed9qW15erkmTJnVqnzt3rtauXStJeu655+Tz+fSvf/1Lw4YNU0lJiWbMmNGrv8+ttgAAREc0n3Aaye/3aT3nIxoIHwAAOE/MnvMBAADQE8IHAAAwivABAACMInwAAACjCB8AAMAowgcAADCK8AEAAIwifAAAAKMIHwAAwCjCBwAAMIrwAQAAjCJ8AAAAowgfAADAKMIHAAAwivABAACMInwAAACjCB8AAMAowgcAADCK8AEAAIwifAAAAKMIHwAAwCjCBwAAMIrwAQAAjCJ8AAAAowgfAADAKMIHAAAwivABAACMInwAAACjCB8AAMAowgcAADCK8AEAAIyKOHxUVlaqsLBQubm5crlc2rx5c5d977zzTrlcLq1cufI0SgQAAIkk4vDR3NyskSNHatWqVd3227x5s/75z38qNze3z8UBAIDEkxLpDtOmTdO0adO67fPJJ5/orrvu0muvvabp06f3uTgAAJB4Ig4fPWlra9OcOXN0zz33aMSIET32DwaDCga
Doc+BQMDukgAAQByx/YLT5cuXKyUlRQsWLOhVf5/PJ4/HE9q8Xq/dJQEAgDhia/jYv3+/nnjiCa1du1Yul6tX+xQXF8vv94e2uro6O0sCAABxxtbw8frrr6uxsVGDBg1SSkqKUlJS9PHHH+sXv/iFhgwZEnYft9utjIyMDhsAAEhctl7zMWfOHF177bUd2q6//nrNmTNHt99+u52HAgAADhVx+Dhx4oSOHDkS+lxbW6uamhplZmZq0KBB6t+/f4f+/fr1U3Z2toYNG3b61QIAAMeLOHzs27dPkyZNCn1esmSJJGnu3Llau3atbYUBAIDEFHH4mDhxoizL6nX/jz76KNJDAACABMa7XQAAgFGEDwAAYBThAwAAGEX4AAAARhE+AACAUYQPAABgFOEDAAAYRfgAAABGET4AAIBRhA8AAGAU4QMAABhF+AAAAEYRPgAAgFGEDwAAYBThAwAAGEX4AAAARhE+AACAUYQPAABgFOEDAAAYRfgAAABGET4AAIBRhA8AAGAU4QMAABhF+AAAAEYRPgAAgFGEDwAAYBThAwAAGEX4AAAARhE+AACAUYQPAABgFOEDAAAYFXH4qKysVGFhoXJzc+VyubR58+bQdydPntR9992n73znOzrnnHOUm5urH//4xzp27JidNQMAAAeLOHw0Nzdr5MiRWrVqVafvPv/8c1VVVenBBx9UVVWVXnrpJR0+fFg33nijLcUCAADnc1mWZfV5Z5dLpaWlmjlzZpd99u7dq7Fjx+rjjz/WoEGDevybgUBAHo9Hfr9fGRkZfS0NAAAYFMnvd0q0i/H7/XK5XDr33HPDfh8MBhUMBkOfA4FAtEsCAAAxFNULTltaWrR06VLddtttXaYgn88nj8cT2rxebzRLAgAAMRa18HHy5Endcsstamtr05NPPtllv+LiYvn9/tBWV1cXrZIAAEAciMppl5MnT+pHP/qRamtrtX379m7P/bjdbrnd7miUAQAA4pDt4aM9eLz//vvasWOH+vfvb/chAACAg0UcPk6cOKEjR46EPtfW1qqmpkaZmZnKzc3VD37wA1VVVelvf/ubWltb1dDQIEnKzMxUamqqfZUDAABHivhW2/Lyck2aNKlT+9y5c/XQQw8pLy8v7H47duzQxIkTe/z73GoLAIDzRPVW24kTJ6q7vHIajw0BAABnAN7tAgAAjCJ8AAAAowgfAADAKMIHAAAwivABAACMInwAAACjCB8AAMAowgcAADCK8AEAAIwifAAAAKMIHwAAwCjCBwAAMIrwAQAAjCJ8AAAAowgfAADAKMIHAAAwivABAACMInwAAACjCB8AAMAowgcAADCK8AEAAIwifAAAAKMIHwAAwCjCBwAAMIrwAQAAjCJ8AAAAowgfAADAKMIHAAAwivABAACMInwAAACjCB8AAMCoiMNHZWWlCgsLlZubK5fLpc2bN3f43rIsPfTQQ8rNzdVZZ52liRMn6p133rGrXgAA4HARh4/m5maNHDlSq1atCvv9b3/7W61YsUKrVq3S3r17lZ2dreuuu05NTU2nXSwAAHC+lEh3mDZtmqZNmxb2O8uytHLlSj3wwAO66aabJEnr1q3TgAEDtGHDBt15552nVy0AAHA8W6/5qK2tVUNDg6ZMmRJqc7vduuaaa7Rr166w+wSDQQUCgQ4bAABIXLaGj4aGBknSgAEDOrQPGDAg9N3X+Xw+eTye0Ob1eu0sCQAAxJmo3O3icrk6fLYsq1Nbu+LiYvn9/tBWV1cXjZIAAECciPiaj+5kZ2dL+nIFJCcnJ9Te2NjYaTWkndvtltvttrMMAAAQx2xd+cjLy1N2drbKyspCbV988YUqKip05ZVX2nkoAADgUBGvfJw4cUJHjhwJfa6trVVNTY0yMzM1aNAgLVq0SI8++qiGDh2qoUOH6tFHH9XZZ5+t2267zdbCAQCAM0UcPvbt26dJkyaFPi9ZskSSNHfuXK1du1b33nuv/ve//+nnP/+5PvvsM11++eX
atm2b0tPT7asaAAA4lsuyLCvWRXxVIBCQx+OR3+9XRkZGrMsBAAC9EMnvN+92AQAARhE+AACAUYQPAABgFOEDAAAYRfgAAABGET4AAIBRhA8AAGAU4QMAABhF+AAAAEbZ+lZbAEDiam2ztKf2uBqbWpSVnqaxeZlKTnLFuiw4EOEDANCjrQfqVbLloOr9LaG2HE+alhXma2pBTgwrgxNx2gUA0K2tB+o1f31Vh+AhSQ3+Fs1fX6WtB+pjVBmcivABAOhSa5ulki0HFe4NpO1tJVsOqrUtrt5RijhH+AAAdGlP7fFOKx5fZUmq97doT+1xc0XB8QgfAIAuNTZ1HTz60g+QCB8AgG5kpafZ2g+QCB8AgG6MzctUjidNXd1Q69KXd72Mzcs0WRYcjvABAOhScpJLywrzJalTAGn/vKwwn+d9ICKEDwBAt6YW5Gj17MuU7el4aiXbk6bVsy/jOR+IGA8ZAwD0aGpBjq7Lz+YJp7AF4QMA0CvJSS6Nu7B/rMtAAuC0CwAAMIrwAQAAjCJ8AAAAowgfAADAKMIHAAAwivABAACMInwAAACjCB8AAMAowgcAADCK8AEAAIyyPXycOnVKv/zlL5WXl6ezzjpLF1xwgR5++GG1tbXZfSgAAOBAtr/bZfny5Xrqqae0bt06jRgxQvv27dPtt98uj8ejhQsX2n04AADgMLaHjzfeeEMzZszQ9OnTJUlDhgzRCy+8oH379tl9KAAA4EC2n3YZP368/vGPf+jw4cOSpDfffFM7d+7UDTfcELZ/MBhUIBDosAEAgMRl+8rHfffdJ7/fr+HDhys5OVmtra165JFHdOutt4bt7/P5VFJSYncZAAAgTtm+8vHiiy9q/fr12rBhg6qqqrRu3To9/vjjWrduXdj+xcXF8vv9oa2urs7ukgAAQBxxWZZl2fkHvV6vli5dqqKiolDbb37zG61fv17vvfdej/sHAgF5PB75/X5lZGTYWRoAAIiSSH6/bV/5+Pzzz5WU1PHPJicnc6stAACQFIVrPgoLC/XII49o0KBBGjFihKqrq7VixQrdcccddh8KAAA4kO2nXZqamvTggw+qtLRUjY2Nys3N1a233qpf/epXSk1N7XF/TrsAAOA8kfx+2x4+ThfhAwAA54npNR8AAADdIXwAAACjCB8AAMAowgcAADCK8AEAAIwifAAAAKMIHwAAwCjCBwAAMIrwAQAAjCJ8AAAAowgfAADAKMIHAAAwivABAACMInwAAACjCB8AAMAowgcAADCK8AEAAIwifAAAAKMIHwAAwCjCBwAAMIrwAQAAjCJ8AAAAowgfAADAKMIHAAAwivABAACMInwAAACjCB8AAMAowgcAADCK8AEAAIwifAAAAKMIHwAAwKiohI9PPvlEs2fPVv/+/XX22Wfrkksu0f79+6NxKAAA4DApdv/Bzz77TFdddZUmTZqkV199VVlZWfrggw907rnn2n0oAADgQLaHj+XLl8vr9er5558PtQ0ZMsTuwwAAAIey/bTLyy+/rNGjR+uHP/yhsrKydOmll+rpp5/usn8wGFQgEOiwAQCAxGV7+Pjwww+1evVqDR06VK+99prmzZunBQsW6E9/+lPY/j6fTx6PJ7R5vV67SwIAAHHEZVmWZecfTE1N1ejRo7Vr165Q24IFC7R371698cYbnfoHg0EFg8HQ50AgIK/XK7/fr4yMDDtLAwAAURIIBOTxeHr1+237ykdOTo7y8/M7tF100UU6evRo2P5ut1sZGRkdNgAAkLhsDx9XXXWVDh061KHt8OHDGjx4sN2HAgAADmR7+Fi8eLF2796tRx99VEeOHNGGDRu0Zs0aFRUV2X0oAADgQLaHjzFjxqi0tFQvvPCCCgoK9Otf/1orV67UrFmz7D4UAABwINsvOD1dkVywAgAA4kNMLzgFAADoDuEDAAAYRfgAAABGET4AAIBRhA8AAGAU4QMAABhF+AAAAEYRPgAAgFGEDwAAYBThAwAAGEX4AAA
ARhE+AACAUYQPAABgFOEDAAAYRfgAAABGET4AAIBRhA8AAGAU4QMAABhF+AAAAEYRPgAAgFGEDwAAYBThAwAAGEX4AAAARhE+AACAUYQPAABgFOEDAAAYRfgAAABGET4AAIBRhA8AAGAU4QMAABiVEusCAJwZWtss7ak9rsamFmWlp2lsXqaSk1yxLgtADER95cPn88nlcmnRokXRPhSAOLX1QL3GL9+uW5/erYUba3Tr07s1fvl2bT1QH+vSAMRAVMPH3r17tWbNGl188cXRPAyAOLb1QL3mr69Svb+lQ3uDv0Xz11cRQIAzUNTCx4kTJzRr1iw9/fTTOu+886J1GABxrLXNUsmWg7LCfNfeVrLloFrbwvUAkKiiFj6Kioo0ffp0XXvttd32CwaDCgQCHTYAiWFP7fFOKx5fZUmq97doT+1xc0UBiLmoXHC6ceNGVVVVae/evT329fl8KikpiUYZAGKssanr4NGXfgASg+0rH3V1dVq4cKHWr1+vtLS0HvsXFxfL7/eHtrq6OrtLAhAjWek9/x8QST8AicH2lY/9+/ersbFRo0aNCrW1traqsrJSq1atUjAYVHJycug7t9stt9ttdxkA4sDYvEzleNLU4G8Je92HS1K258vbbgGcOWxf+Zg8ebLefvtt1dTUhLbRo0dr1qxZqqmp6RA8ACS25CSXlhXmS/oyaHxV++dlhfk87wM4w9i+8pGenq6CgoIObeecc4769+/fqR1A4ptakKPVsy9TyZaDHS4+zfakaVlhvqYW5MSwOgCxwBNOAUTd1IIcXZefzRNOAUgyFD7Ky8tNHAZAHEtOcmnchf1jXQaAOMCL5QAAgFGEDwAAYBThAwAAGEX4AAAARhE+AACAUYQPAABgFOEDAAAYRfgAAABGET4AAIBRhA8AAGAU4QMAABhF+AAAAEYRPgAAgFGEDwAAYBThAwAAGEX4AAAARhE+AACAUYQPAABgFOEDAAAYRfgAAABGET4AAIBRhA8AAGAU4QMAABhF+AAAAEYRPgAAgFGEDwAAYBThAwAAGEX4AAAARhE+AACAUYQPAABgFOEDAAAYZXv48Pl8GjNmjNLT05WVlaWZM2fq0KFDdh8GAAA4lO3ho6KiQkVFRdq9e7fKysp06tQpTZkyRc3NzXYfCgAAOJDLsiwrmgf497//raysLFVUVOjqq6/usX8gEJDH45Hf71dGRkY0SwMAADaJ5Pc7JdrF+P1+SVJmZmbY74PBoILBYOhzIBCIdkkAACCGonrBqWVZWrJkicaPH6+CgoKwfXw+nzweT2jzer3RLAkAAMRYVE+7FBUV6ZVXXtHOnTs1cODAsH3CrXx4vV5OuwAA4CBxcdrl7rvv1ssvv6zKysoug4ckud1uud3uaJUBAADijO3hw7Is3X333SotLVV5ebny8vLsPgQAAHAw28NHUVGRNmzYoL/+9a9KT09XQ0ODJMnj8eiss86y+3AAAMBhbL/mw+VyhW1//vnn9ZOf/KTH/bnVFgAA54npNR9RfmwIAABwON7tAgAAjCJ8AAAAowgfAADAKMIHAAAwivABAACMInwAAACjCB8AAMAowgcAADCK8AEAAIyK2lttAbu1tlnaU3tcjU0tykpP09i8TCUnhX+cPwAgfhE+4AhbD9SrZMtB1ftbQm05njQtK8zX1IKcGFYGAIgUp10Q97YeqNf89VUdgockNfhbNH99lbYeqI9RZQCAviB8IK61tlkq2XJQ4V5X2N5WsuWgWtt4oSEAOAXhA3FtT+3xTiseX2VJqve3aE/tcXNFAQBOC+EDca2xqevg0Zd+AIDYI3wgrmWlp9naDwAQe4QPxLWxeZnK8aSpqxtqXfryrpexeZkmywIAnAbCB+JacpJLywrzJalTAGn/vKwwn+d9AICDED4Q96YW5Gj17MuU7el4aiXbk6bVsy/jOR8A4DA8ZAyOMLUgR9flZ/OEUwBIAIQPOEZykkvjLuwf6zIAAKeJ0y4AAMAowgc
AADCK8AEAAIwifAAAAKMIHwAAwCjCBwAAMIrwAQAAjCJ8AAAAowgfAADAqDPmCaetbRaP5gYAIA5ELXw8+eST+t3vfqf6+nqNGDFCK1eu1IQJE6J1uG5tPVCvki0HVe9vCbXleNK0rDCfl5IBAGBYVE67vPjii1q0aJEeeOABVVdXa8KECZo2bZqOHj0ajcN1a+uBes1fX9UheEhSg79F89dXaeuBeuM1AQBwJotK+FixYoV++tOf6mc/+5kuuugirVy5Ul6vV6tXr47G4brU2mapZMtBWWG+a28r2XJQrW3hegAAgGiwPXx88cUX2r9/v6ZMmdKhfcqUKdq1a1en/sFgUIFAoMNmlz21xzuteHyVJane36I9tcdtOyYAAOie7eHj008/VWtrqwYMGNChfcCAAWpoaOjU3+fzyePxhDav12tbLY1NXQePvvQDAACnL2q32rpcHe8ksSyrU5skFRcXy+/3h7a6ujrbashKT7O1HwAAOH223+1y/vnnKzk5udMqR2NjY6fVEElyu91yu912lyFJGpuXqRxPmhr8LWGv+3BJyvZ8edstAAAww/aVj9TUVI0aNUplZWUd2svKynTllVfafbhuJSe5tKwwX9KXQeOr2j8vK8zneR8AABgUldMuS5Ys0TPPPKPnnntO7777rhYvXqyjR49q3rx50Thct6YW5Gj17MuU7el4aiXbk6bVsy/jOR8AABgWlYeM3XzzzfrPf/6jhx9+WPX19SooKNDf//53DR48OBqH69HUghxdl5/NE04BAIgDLsuy4uohF4FAQB6PR36/XxkZGbEuBwAA9EIkv9+8WA4AABhF+AAAAEYRPgAAgFGEDwAAYBThAwAAGEX4AAAARhE+AACAUYQPAABgFOEDAAAYFZXHq5+O9geuBgKBGFcCAAB6q/13uzcPTo+78NHU1CRJ8nq9Ma4EAABEqqmpSR6Pp9s+cfdul7a2Nh07dkzp6elyuex98VsgEJDX61VdXV1Cvjcm0ccnJf4YGZ/zJfoYE318UuKPMVrjsyxLTU1Nys3NVVJS91d1xN3KR1JSkgYOHBjVY2RkZCTkP6h2iT4+KfHHyPicL9HHmOjjkxJ/jNEYX08rHu244BQAABhF+AAAAEadUeHD7XZr2bJlcrvdsS4lKhJ9fFLij5HxOV+ijzHRxycl/hjjYXxxd8EpAABIbGfUygcAAIg9wgcAADCK8AEAAIwifAAAAKMSLnw8+eSTysvLU1pamkaNGqXXX3+92/4VFRUaNWqU0tLSdMEFF+ipp54yVGnfRDK+8vJyuVyuTtt7771nsOLeq6ysVGFhoXJzc+VyubR58+Ye93Ha/EU6RifNoc/n05gxY5Senq6srCzNnDlThw4d6nE/J81hX8bopDlcvXq1Lr744tDDp8aNG6dXX321232cNH9S5GN00vyF4/P55HK5tGjRom77mZ7HhAofL774ohYtWqQHHnhA1dXVmjBhgqZNm6ajR4+G7V9bW6sbbrhBEyZMUHV1te6//34tWLBAmzZtMlx570Q6vnaHDh1SfX19aBs6dKihiiPT3NyskSNHatWqVb3q77T5kyIfYzsnzGFFRYWKioq0e/dulZWV6dSpU5oyZYqam5u73Mdpc9iXMbZzwhwOHDhQjz32mPbt26d9+/bpu9/9rmbMmKF33nknbH+nzZ8U+RjbOWH+vm7v3r1as2aNLr744m77xWQerQQyduxYa968eR3ahg8fbi1dujRs/3vvvdcaPnx4h7Y777zTuuKKK6JW4+mIdHw7duywJFmfffaZgersJckqLS3tto/T5u/rejNGJ89hY2OjJcmqqKjoso/T57A3Y3TyHFqWZZ133nnWM888E/Y7p89fu+7G6NT5a2pqsoYOHWqVlZVZ11xzjbVw4cIu+8ZiHhNm5eOLL77Q/v37NWXKlA7tU6ZM0a5du8Lu88Ybb3Tqf/3112vfvn06efJk1Grti76Mr92ll16qnJwcTZ48WTt27Ih
mmUY5af5OlxPn0O/3S5IyMzO77OP0OezNGNs5bQ5bW1u1ceNGNTc3a9y4cWH7OH3+ejPGdk6bv6KiIk2fPl3XXnttj31jMY8JEz4+/fRTtba2asCAAR3aBwwYoIaGhrD7NDQ0hO1/6tQpffrpp1GrtS/6Mr6cnBytWbNGmzZt0ksvvaRhw4Zp8uTJqqysNFFy1Dlp/vrKqXNoWZaWLFmi8ePHq6CgoMt+Tp7D3o7RaXP49ttv6xvf+IbcbrfmzZun0tJS5efnh+3r1PmLZIxOmz9J2rhxo6qqquTz+XrVPxbzGHdvtT1dLperw2fLsjq19dQ/XHu8iGR8w4YN07Bhw0Kfx40bp7q6Oj3++OO6+uqro1qnKU6bv0g5dQ7vuusuvfXWW9q5c2ePfZ06h70do9PmcNiwYaqpqdF///tfbdq0SXPnzlVFRUWXP85OnL9Ixui0+aurq9PChQu1bds2paWl9Xo/0/OYMCsf559/vpKTkzutAjQ2NnZKdO2ys7PD9k9JSVH//v2jVmtf9GV84VxxxRV6//337S4vJpw0f3aK9zm8++679fLLL2vHjh0aOHBgt32dOoeRjDGceJ7D1NRUfetb39Lo0aPl8/k0cuRIPfHEE2H7OnX+IhljOPE8f/v371djY6NGjRqllJQUpaSkqKKiQn/4wx+UkpKi1tbWTvvEYh4TJnykpqZq1KhRKisr69BeVlamK6+8Muw+48aN69R/27ZtGj16tPr16xe1WvuiL+MLp7q6Wjk5OXaXFxNOmj87xescWpalu+66Sy+99JK2b9+uvLy8Hvdx2hz2ZYzhxOschmNZloLBYNjvnDZ/XelujOHE8/xNnjxZb7/9tmpqakLb6NGjNWvWLNXU1Cg5ObnTPjGZx6hdyhoDGzdutPr162c9++yz1sGDB61FixZZ55xzjvXRRx9ZlmVZS5cutebMmRPq/+GHH1pnn322tXjxYuvgwYPWs88+a/Xr18/6y1/+EqshdCvS8f3+97+3SktLrcOHD1sHDhywli5dakmyNm3aFKshdKupqcmqrq62qqurLUnWihUrrOrqauvjjz+2LMv582dZkY/RSXM4f/58y+PxWOXl5VZ9fX1o+/zzz0N9nD6HfRmjk+awuLjYqqystGpra6233nrLuv/++62kpCRr27ZtlmU5f/4sK/IxOmn+uvL1u13iYR4TKnxYlmX98Y9/tAYPHmylpqZal112WYdb4ObOnWtdc801HfqXl5dbl156qZWammoNGTLEWr16teGKIxPJ+JYvX25deOGFVlpamnXeeedZ48ePt1555ZUYVN077be0fX2bO3euZVmJMX+RjtFJcxhuXJKs559/PtTH6XPYlzE6aQ7vuOOO0P8v3/zmN63JkyeHfpQty/nzZ1mRj9FJ89eVr4ePeJhHl2X9/1UlAAAABiTMNR8AAMAZCB8AAMAowgcAADCK8AEAAIwifAAAAKMIHwAAwCjCBwAAMIrwAQAAjCJ8AAAAowgfAADAKMIHAAAwivABAACM+j9THZLTIiRtrwAAAABJRU5ErkJggg==",
+      "text/plain": [
+       "<Figure size 640x480 with 1 Axes>"
+      ]
+     },
+     "metadata": {},
+     "output_type": "display_data"
+    }
+   ],
+   "source": [
+    "out = wf(arrays__x=5)\n",
+    "out.plot__fig"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "f69983f7-c110-4ea1-8da1-009b7c5410af",
+   "metadata": {},
+   "source": [
+    "Unless it's explicitly turned off, `pyiron_workflow` will make sure that all new values and connections obey type hints (where provided). For instance, if we try to pass a non-`int` to our `square_range` node, we'll get an error:"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 9,
+   "id": "04a19675-c98d-4255-8583-a567cda45e08",
+   "metadata": {},
+   "outputs": [
+    {
+     "name": "stdout",
+     "output_type": "stream",
+     "text": [
+      "The channel x cannot take the value `5.5` because it is not compliant with the type hint <class 'int'>\n"
+     ]
+    }
+   ],
+   "source": [
+    "try:\n",
+    "    wf.arrays.inputs.x = 5.5\n",
+    "except TypeError as e:\n",
+    "    message = e.args[0]\n",
+    "    print(message)"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "be52f21f-2aa3-4182-88a5-815f2153a703",
+   "metadata": {},
+   "source": [
+    "## Composing complex workflows from macros\n",
+    "\n",
+    "There's just one last step: once we have a workflow we're happy with, we can package it as a \"macro\"! This lets us make more and more complex workflows by composing sub-graphs.\n",
+    "\n",
+    "We don't yet have an automated tool for converting workflows into macros, but we can create them by decorating a function that takes a macro instance and builds its graph, so we can simply copy-and-paste our workflow above into such a decorated function. While we're here, we'll also take advantage of the option to define \"maps\" to give our IO prettier names (this is also available for workflows; we just didn't bother earlier)."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 10,
+   "id": "f67312c0-7028-4569-8b3a-d9e2fe88df48",
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "@Workflow.wrap_as.macro_node()\n",
+    "def my_square_plot(macro):\n",
+    "    macro.arrays = square_range()\n",
+    "    macro.plot = macro.create.plotting.Scatter(\n",
+    "        x=macro.arrays.outputs.x,\n",
+    "        y=macro.arrays.outputs.x_sq\n",
+    "    )\n",
+    "    macro.inputs_map = {\"arrays__x\": \"n\"}\n",
+    "    macro.outputs_map = {\n",
+    "        \"arrays__x\": \"x\",\n",
+    "        \"arrays__x_sq\": \"y\",\n",
+    "        \"plot__fig\": \"fig\"\n",
+    "    }\n",
+    "    # Note that we also forced regularly hidden IO to be exposed!\n",
+    "    # We can also hide IO that's usually exposed by mapping to `None`"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 11,
+   "id": "b43f7a86-4579-4476-89a9-9d7c5942c3fb",
+   "metadata": {},
+   "outputs": [
+    {
+     "data": {
+      "text/plain": [
+       "{'square_plot__fig': <matplotlib.collections.PathCollection at 0x14e4c8a50>,\n",
+       " 'shifted_square_plot__fig': <matplotlib.collections.PathCollection at 0x14e4e6650>}"
+      ]
+     },
+     "execution_count": 11,
+     "metadata": {},
+     "output_type": "execute_result"
+    },
+    {
+     "data": {
+      "image/png": "iVBORw0KGgoAAAANSUhEUgAAAh8AAAGdCAYAAACyzRGfAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjguMCwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy81sbWrAAAACXBIWXMAAA9hAAAPYQGoP6dpAAAuQElEQVR4nO3df3RUdWL//9ck6CTRmbFhzUxmDexo02LIugsirBEXurukYTlBD6faFVnZck6PAu4aOS2BZVuIRxPBsxxa040H21o8HKp7PvVH6Kkp2XUb10ZPIsguJFZ315yQQoZUoTPhR8LXzP3+Mc3IkB8wyeR9Z4bn45w5nnnPO8krOR7vy/e9930dlmVZAgAAMCTL7gAAAODqQvkAAABGUT4AAIBRlA8AAGAU5QMAABhF+QAAAEZRPgAAgFGUDwAAYNQ0uwNcKhKJ6MSJE3K5XHI4HHbHAQAAV8CyLPX398vv9ysra/y1jZQrHydOnFBRUZHdMQAAwAT09PTopptuGndOypUPl8slKRre7XbbnAYAAFyJcDisoqKi2HF8PClXPoZPtbjdbsoHAABp5koumeCCUwAAYBTlAwAAGEX5AAAARlE+AACAUZQPAABgFOUDAAAYRfkAAABGUT4AAIBRKbfJGAAA6WAoYqmt65T6+gdU4MrR/EC+srNS/JlkkSGpu1U6c1K63ivNLJOyso3HoHwAAJCgpqO9qtnfqd7QQGys0JOjrZUlqigttDHZODobpaZqKXzi8zG3X6rYLpUsNxqF0y4AACSg6Wiv1u49FFc8JCkYGtDavYfUdLTXpmTj6GyUfvpQfPGQpHBvdLyz0WgcygcAAFdoKGKpZn+nrFE+Gx6r2d+pochoM2wSGYqueIyXumlTdJ4hlA8AAK5QW9epESseF7Mk9YYG1NZ1ylyoy+luHbniEceSwsej8wxJqHx89tln+tGPfqRAIKDc3FzdfPPNeuKJJxSJRGJzLMvStm3b5Pf7lZubq8WLF6ujoyPpwQEAMK2vf+ziMZF5Rpw5mdx5SZBQ+di+fbuee+451dfX64MPPtCOHTv0zDPP6Nlnn43N2bFjh3bu3Kn6+nq1t7fL5/NpyZIl6u/vT3p4AABMKnDlJHWeEdd7kzsvCRIqH++8847uueceLVu2TF/60pf0J3/yJyovL9d7770nKbrqsWvXLm3ZskUrVqxQaWmp9uzZo3Pnzmnfvn1T8gsAAGDK/EC+Cj05GuuGWoeid73MD+SbjDW+mWXRu1rGS+3+YnSeIQmVj4ULF+rnP/+5PvroI0nSr371K7399tv69re/LUnq6upSMBhUeXl57GucTqcWLVqk1lZz55IAAJgK2VkOba0skTTyUD78fmtlSWrt95GVHb2dVtKYqSueNrrfR0Llo7q6Wg888IBmzZqla665RnPmzFFVVZUeeOABSVIwGJQkeb3xSzderzf22aUGBwcVDofjXgAApKqK0kI1rJornyf+1IrPk6OGVXNTc5+PkuXS/S9K7kuyuf3RccP7fCS0ydjLL7+svXv3at++fZo9e7YOHz6sqqoq+f1+rV69OjbP4YhvVpZljRgbVldXp5qamglEBwDAHhWlhVpS4kuvHU5LlkuzlqXEDqcOy7Ku+GbkoqIibdq0SevXr4+NPfnkk9q7d6/+67/+Sx9//LFuueUWHTp0SHPmzInNueeee3TDDTdoz549I77n4OCgBgcHY+/D4bCKiooUCoXkdrsn+nsBAACDwuGwPB7PFR2/Ezrtcu7cOWVlxX9JdnZ27FbbQCAgn8+n5ubm2OcXLlxQS0uLyspGv5DF6XTK7XbHvQAAQOZK6LRLZWWlnnrqKc2YMUOzZ8/W+++/r507d2rNmjWSoqdbqqqqVFtbq+LiYhUXF6u2tlZ5eXlauXLllPwCAAAgvSRUPp599ln91V/9ldatW6e+vj75/X49/PDD+uu//uvYnI0bN+r8+fNat26dTp8+rQU
LFujAgQNyuVxJDw8AANJPQtd8mJDIOSMAAJAapuyaDwAAgMmifAAAAKMoHwAAwCjKBwAAMIryAQAAjKJ8AAAAoygfAADAKMoHAAAwivIBAACMonwAAACjKB8AAMAoygcAADCK8gEAAIyifAAAAKMoHwAAwCjKBwAAMIryAQAAjKJ8AAAAoygfAADAKMoHAAAwivIBAACMonwAAACjKB8AAMAoygcAADCK8gEAAIyifAAAAKMoHwAAwCjKBwAAMIryAQAAjKJ8AAAAoygfAADAqITKx5e+9CU5HI4Rr/Xr10uSLMvStm3b5Pf7lZubq8WLF6ujo2NKggMAgPSUUPlob29Xb29v7NXc3CxJuu+++yRJO3bs0M6dO1VfX6/29nb5fD4tWbJE/f39yU8OAADSUkLl48Ybb5TP54u9/vVf/1W33HKLFi1aJMuytGvXLm3ZskUrVqxQaWmp9uzZo3Pnzmnfvn1TlR8AAKSZCV/zceHCBe3du1dr1qyRw+FQV1eXgsGgysvLY3OcTqcWLVqk1tbWMb/P4OCgwuFw3AsAAGSuCZeP1157Tf/7v/+r733ve5KkYDAoSfJ6vXHzvF5v7LPR1NXVyePxxF5FRUUTjQQAANLAhMvHP/zDP2jp0qXy+/1x4w6HI+69ZVkjxi62efNmhUKh2Kunp2eikQAAQBqYNpEv6u7u1s9+9jO98sorsTGfzycpugJSWFgYG+/r6xuxGnIxp9Mpp9M5kRgAACANTWjl44UXXlBBQYGWLVsWGwsEAvL5fLE7YKTodSEtLS0qKyubfFIAAJAREl75iEQieuGFF7R69WpNm/b5lzscDlVVVam2tlbFxcUqLi5WbW2t8vLytHLlyqSGBgAA6Svh8vGzn/1Mx44d05o1a0Z8tnHjRp0/f17r1q3T6dOntWDBAh04cEAulyspYQEAQPpzWJZl2R3iYuFwWB6PR6FQSG632+44AADgCiRy/ObZLgAAwCjKBwAAMIryAQAAjKJ8AAAAoygfAADAKMoHAAAwivIBAACMonwAAACjKB8AAMAoygcAADCK8gEAAIyifAAAAKMoHwAAwKhpdgcAAGAoYqmt65T6+gdU4MrR/EC+srMcdscaX2RI6m6VzpyUrvdKM8ukrGy7U6UFygcAwFZNR3tVs79TvaGB2FihJ0dbK0tUUVpoY7JxdDZKTdVS+MTnY26/VLFdKlluX640wWkXAIBtmo72au3eQ3HFQ5KCoQGt3XtITUd7bUo2js5G6acPxRcPSQr3Rsc7G+3JlUYoHwAAWwxFLNXs75Q1ymfDYzX7OzUUGW2GTSJD0RWP8VI3bYrOw5goHwAAW7R1nRqx4nExS1JvaEBtXafMhbqc7taRKx5xLCl8PDoPY6J8AABs0dc/dvGYyDwjzpxM7ryrFOUDAGCLAldOUucZcb03ufOuUpQPAIAt5gfyVejJ0Vg31DoUvetlfiDfZKzxzSyL3tUyXmr3F6PzMCbKBwDAFtlZDm2tLJE08lA+/H5rZUlq7feRlR29nVbSmKkrnma/j8ugfAAAbFNRWqiGVXPl88SfWvF5ctSwam5q7vNRsly6/0XJfUk2tz86zj4fl+WwLCuF7mGSwuGwPB6PQqGQ3G633XEAAAaww2n6S+T4zQ6nAADbZWc5dOct0+2OkZisbClwt90p0hKnXQAAgFGUDwAAYBTlAwAAGEX5AAAARlE+AACAUQmXj+PHj2vVqlWaPn268vLy9NWvflUHDx6MfW5ZlrZt2ya/36/c3FwtXrxYHR0dSQ0NAADSV0Ll4/Tp07rrrrt0zTXX6I033lBnZ6d+/OMf64YbbojN2bFjh3bu3Kn6+nq1t7fL5/NpyZIl6u/vT3Z2AACQhhLaZGzTpk36z//8T/3yl78c9XPLsuT3+1VVVaXq6mpJ0uDgoLxer7Zv366HH374sj+DTcYAAEg/iRy/E1r5aGxs1Lx583TfffepoKBAc+bM0fPPPx/7vKurS8FgUOXl5bE
xp9OpRYsWqbW1NcFfAwAAZKKEysfHH3+shoYGFRcX69///d/1yCOP6Ac/+IFefPFFSVIwGJQkeb3xjxL2er2xzy41ODiocDgc9wIAAJkroe3VI5GI5s2bp9raWknSnDlz1NHRoYaGBj300EOxeQ5H/H78lmWNGBtWV1enmpqaRHMDAIA0ldDKR2FhoUpKSuLGbr31Vh07dkyS5PP5JGnEKkdfX9+I1ZBhmzdvVigUir16enoSiQQAANJMQuXjrrvu0ocffhg39tFHH2nmzJmSpEAgIJ/Pp+bm5tjnFy5cUEtLi8rKykb9nk6nU263O+4FAAAyV0KnXR5//HGVlZWptrZW999/v9ra2rR7927t3r1bUvR0S1VVlWpra1VcXKzi4mLV1tYqLy9PK1eunJJfAAAApJeEyscdd9yhV199VZs3b9YTTzyhQCCgXbt26cEHH4zN2bhxo86fP69169bp9OnTWrBggQ4cOCCXy5X08AAAIP0ktM+HCezzAQBA+pmyfT4AAAAmi/IBAACMonwAAACjKB8AAMAoygcAADCK8gEAAIyifAAAAKMoHwAAwCjKBwAAMIryAQAAjKJ8AAAAoygfAADAKMoHAAAwivIBAACMonwAAACjKB8AAMAoygcAADCK8gEAAIyifAAAAKMoHwAAwCjKBwAAMIryAQAAjKJ8AAAAoygfAADAKMoHAAAwivIBAACMonwAAACjKB8AAMAoygcAADCK8gEAAIyifAAAAKMSKh/btm2Tw+GIe/l8vtjnlmVp27Zt8vv9ys3N1eLFi9XR0ZH00AAAIH0lvPIxe/Zs9fb2xl5HjhyJfbZjxw7t3LlT9fX1am9vl8/n05IlS9Tf35/U0AAAIH0lXD6mTZsmn88Xe914442Soqseu3bt0pYtW7RixQqVlpZqz549OnfunPbt25f04AAAID0lXD5+85vfyO/3KxAI6Dvf+Y4+/vhjSVJXV5eCwaDKy8tjc51OpxYtWqTW1tYxv9/g4KDC4XDcCwAwcUMRS+/87lO9fvi43vndpxqKWHZHurzIkNT1S+nI/4v+MzJkdyJMoWmJTF6wYIFefPFF/cEf/IFOnjypJ598UmVlZero6FAwGJQkeb3euK/xer3q7u4e83vW1dWppqZmAtEBAJdqOtqrmv2d6g0NxMYKPTnaWlmiitJCG5ONo7NRaqqWwic+H3P7pYrtUsly+3Jhyjgsy5pwJT579qxuueUWbdy4UV/72td011136cSJEyos/Pxf8D//8z9XT0+PmpqaRv0eg4ODGhwcjL0Ph8MqKipSKBSS2+2eaDQAuOo0He3V2r2HdOl/1B3/98+GVXNTr4B0Nko/fUgaK/X9L1JA0kQ4HJbH47mi4/ekbrW97rrr9OUvf1m/+c1vYne9DK+ADOvr6xuxGnIxp9Mpt9sd9wIAJGYoYqlmf+eIQ7j0+WG9Zn9nap2CiQxFVzzGS920iVMwGWhS5WNwcFAffPCBCgsLFQgE5PP51NzcHPv8woULamlpUVlZ2aSDAgDG1tZ1Ku5Uy6UsSb2hAbV1nTIX6nK6W+NPtYxgSeHj0XnIKAld8/EXf/EXqqys1IwZM9TX16cnn3xS4XBYq1evlsPhUFVVlWpra1VcXKzi4mLV1tYqLy9PK1eunKr8AABJff1jF4+JzDPizMnkzkPaSKh8/Pd//7ceeOABffLJJ7rxxhv1ta99Te+++65mzpwpSdq4caPOnz+vdevW6fTp01qwYIEOHDggl8s1JeEBAFEFrpykzjPi+rFPyU9oHtLGpC44nQqJXLACAIgailhauP1NBUMDo15B4ZDk8+To7epvKDvLMcoMG0SGpF2lUrhXo1/34Yje9VJ1RMrKNp0OCTJ2wSkAIDVkZzm0tbJE0ud3twwbfr+1siR1iocULRQV2//vzRipK56meGQgygcAZIiK0kI1rJornyf+1IrPk5Oat9lK0dto739Rcl+Sze3nNtsMxmkXAMgwQxFLbV2n1Nc
/oAJXjuYH8lNrxWM0kaHoXS1nTkav8ZhZxopHmknk+J3QBacAgNSXneXQnbdMtztGYrKypcDddqeAIZx2AQAARlE+AACAUZQPAABgFOUDAAAYRfkAAABGUT4AAIBRlA8AAGAU5QMAABhF+QAAAEZRPgAAgFGUDwAAYBTlAwAAGEX5AAAARlE+AACAUZQPAABgFOUDAAAYRfkAAABGUT4AAIBRlA8AAGAU5QMAABhF+QAAAEZRPgAAgFGUDwAAYBTlAwAAGEX5AAAARlE+AACAUZMqH3V1dXI4HKqqqoqNWZalbdu2ye/3Kzc3V4sXL1ZHR8dkcwIAgAwx4fLR3t6u3bt367bbbosb37Fjh3bu3Kn6+nq1t7fL5/NpyZIl6u/vn3RYAACQ/iZUPs6cOaMHH3xQzz//vH7v934vNm5Zlnbt2qUtW7ZoxYoVKi0t1Z49e3Tu3Dnt27cvaaEBAED6mlD5WL9+vZYtW6ZvfetbceNdXV0KBoMqLy+PjTmdTi1atEitra2TSwoAADLCtES/4KWXXtKhQ4fU3t4+4rNgMChJ8nq9ceNer1fd3d2jfr/BwUENDg7G3ofD4UQjAQCANJLQykdPT48ee+wx7d27Vzk5OWPOczgcce8tyxoxNqyurk4ejyf2KioqSiQSAABIMwmVj4MHD6qvr0+33367pk2bpmnTpqmlpUV/+7d/q2nTpsVWPIZXQIb19fWNWA0ZtnnzZoVCodirp6dngr8KAABIBwmddvnmN7+pI0eOxI392Z/9mWbNmqXq6mrdfPPN8vl8am5u1pw5cyRJFy5cUEtLi7Zv3z7q93Q6nXI6nROMDwAA0k1C5cPlcqm0tDRu7LrrrtP06dNj41VVVaqtrVVxcbGKi4tVW1urvLw8rVy5MnmpAQBA2kr4gtPL2bhxo86fP69169bp9OnTWrBggQ4cOCCXy5XsHwUAANKQw7Isy+4QFwuHw/J4PAqFQnK73XbHAQAAVyCR4zfPdgEAAEZRPgAAgFGUDwAAYBTlAwAAGEX5AAAARiX9VlsAyCRDEUttXafU1z+gAleO5gfylZ01+uMiUkZkSOpulc6clK73SjPLpKxsu1MBMZQPABhD09Fe1ezvVG9oIDZW6MnR1soSVZQW2phsHJ2NUlO1FD7x+ZjbL1Vsl0qW25cLuAinXQBgFE1He7V276G44iFJwdCA1u49pKajvTYlG0dno/TTh+KLhySFe6PjnY325AIuQfkAgEsMRSzV7O/UaDswDo/V7O/UUCSF9miMDEVXPMZL3bQpOg+wGeUDAC7R1nVqxIrHxSxJvaEBtXWdMhfqcrpbR654xLGk8PHoPMBmlA8AuERf/9jFYyLzjDhzMrnzgClE+QCASxS4cpI6z4jrvcmdB0whygcAXGJ+IF+FnhyNdUOtQ9G7XuYH8k3GGt/MsuhdLeOldn8xOg+wGeUDAC6RneXQ1soSSSMP5cPvt1aWpNZ+H1nZ0dtpJY2ZuuJp9vtASqB8AMAoKkoL1bBqrnye+FMrPk+OGlbNTc19PkqWS/e/KLkvyeb2R8fZ5wMpwmFZVgrdKyaFw2F5PB6FQiG53W674wC4yrHDKXBlEjl+s8MpAIwjO8uhO2+ZbneMxGRlS4G77U4BjInTLgAAwCjKBwAAMIryAQAAjKJ8AAAAoygfAADAKMoHAAAwivIBAACMonwAAACjKB8AAMAoygcAADCK8gEAAIyifAAAAKMoHwAAwCjKBwAAMCqh8tHQ0KDbbrtNbrdbbrdbd955p954443Y55Zladu2bfL7/crNzdXixYvV0dGR9NAAACB9JVQ+brrpJj399NN677339N577+kb3/iG7rnnnljB2LFjh3bu3Kn6+nq1t7fL5/NpyZIl6u/vn5LwAAAg/Tgsy7Im8w3y8/P1zDPPaM2aNfL7/aqqqlJ1dbUkaXBwUF6vV9u3b9fDDz98Rd8vHA7L4/EoFArJ7XZPJhoAADAkkeP3hK/5GBoa0ksvvaS
zZ8/qzjvvVFdXl4LBoMrLy2NznE6nFi1apNbW1jG/z+DgoMLhcNwLAABkroTLx5EjR3T99dfL6XTqkUce0auvvqqSkhIFg0FJktfrjZvv9Xpjn42mrq5OHo8n9ioqKko0EgAASCMJl48//MM/1OHDh/Xuu+9q7dq1Wr16tTo7O2OfOxyOuPmWZY0Yu9jmzZsVCoVir56enkQjAQCANDIt0S+49tpr9fu///uSpHnz5qm9vV1/8zd/E7vOIxgMqrCwMDa/r69vxGrIxZxOp5xOZ6IxAABAmpr0Ph+WZWlwcFCBQEA+n0/Nzc2xzy5cuKCWlhaVlZVN9scAAIAMkdDKxw9/+EMtXbpURUVF6u/v10svvaT/+I//UFNTkxwOh6qqqlRbW6vi4mIVFxertrZWeXl5Wrly5VTlBwAAaSah8nHy5El997vfVW9vrzwej2677TY1NTVpyZIlkqSNGzfq/PnzWrdunU6fPq0FCxbowIEDcrlcUxIeAACkn0nv85Fs7PMBAED6MbLPBwAAwERQPgAAgFGUDwAAYBTlAwAAGEX5AAAARlE+AACAUZQPAABgFOUDAAAYRfkAAABGJfxUWwCYiKGIpbauU+rrH1CBK0fzA/nKznLYHevyIkNSd6t05qR0vVeaWSZlZdudCkhrlA8AU67paK9q9neqNzQQGyv05GhrZYkqSgttTHYZnY1SU7UUPvH5mNsvVWyXSpbblwtIc5x2ATClmo72au3eQ3HFQ5KCoQGt3XtITUd7bUp2GZ2N0k8fii8ekhTujY53NtqTC8gAlA8AU2YoYqlmf6dGe3rl8FjN/k4NRVLq+ZbRUy1N1dJ4yZs2RecBSBjlA8CUaes6NWLF42KWpN7QgNq6TpkLdSW6W0eueMSxpPDx6DwACaN8AJgyff1jF4+JzDPmzMnkzgMQh/IBYMoUuHKSOs+Y673JnQcgDuUDwJSZH8hXoSdHY91Q61D0rpf5gXyTsS5vZln0rpbxkru/GJ0HIGGUDwBTJjvLoa2VJZJGHsaH32+tLEm9/T6ysqO300oaM3nF0+z3AUwQ5QPAlKooLVTDqrnyeeJPrfg8OWpYNTd19/koWS7d/6LkviSf2x8dZ58PYMIclmWl1D1u4XBYHo9HoVBIbrfb7jgAkoQdToHMlsjxmx1OARiRneXQnbdMtztG4rKypcDddqcAMgqnXQAAgFGUDwAAYBTlAwAAGEX5AAAARlE+AACAUZQPAABgFOUDAAAYRfkAAABGUT4AAIBRCZWPuro63XHHHXK5XCooKNC9996rDz/8MG6OZVnatm2b/H6/cnNztXjxYnV0dCQ1NAAASF8JlY+WlhatX79e7777rpqbm/XZZ5+pvLxcZ8+ejc3ZsWOHdu7cqfr6erW3t8vn82nJkiXq7+9PengAAJB+JvVguf/5n/9RQUGBWlpa9PWvf12WZcnv96uqqkrV1dWSpMHBQXm9Xm3fvl0PP/zwZb8nD5YDACD9JHL8ntQ1H6FQSJKUn58vSerq6lIwGFR5eXlsjtPp1KJFi9Ta2jqZHwUAADLEhJ9qa1mWNmzYoIULF6q0tFSSFAwGJUlerzdurtfrVXd396jfZ3BwUIODg7H34XB4opEAAEAamPDKx6OPPqpf//rX+ud//ucRnzkcjrj3lmWNGBtWV1cnj8cTexUVFU00EgAASAMTKh/f//731djYqF/84he66aabYuM+n0/S5ysgw/r6+kashgzbvHmzQqFQ7NXT0zORSAAAIE0kVD4sy9Kjjz6qV155RW+++aYCgUDc54FAQD6fT83NzbGxCxcuqKWlRWVlZaN+T6fTKbfbHfcCAACZK6FrPtavX699+/bp9ddfl8vliq1weDwe5ebmyuFwqKqqSrW1tSouLlZxcbFqa2uVl5enlStXTskvAAAA0ktC5aOhoUGStHjx4rjxF154Qd/73vckSRs3btT58+e1bt06nT59WgsWLNCBAwfkcrmSEhgAAKS3Se3zMRX
Y5wMAgPRjbJ8PAACARFE+AACAUZQPAABgFOUDAAAYRfkAAABGTfjZLgDsMxSx1NZ1Sn39Aypw5Wh+IF/ZWaM/wiBlRIak7lbpzEnpeq80s0zKyrY7FQAbUD6ANNN0tFc1+zvVGxqIjRV6crS1skQVpYU2JhtHZ6PUVC2FT3w+5vZLFdulkuX25QJgC067AGmk6Wiv1u49FFc8JCkYGtDavYfUdLTXpmTj6GyUfvpQfPGQpHBvdLyz0Z5cAGxD+QDSxFDEUs3+To22K+DwWM3+Tg1FUmjfwMhQdMVjvNRNm6LzAFw1KB9AmmjrOjVixeNilqTe0IDauk6ZC3U53a0jVzziWFL4eHQegKsG5QNIE339YxePicwz4szJ5M4DkBEoH0CaKHDlJHWeEdd7kzsPQEagfABpYn4gX4WeHI11Q61D0bte5gfyTcYa38yy6F0t46V2fzE6D8BVg/IBpInsLIe2VpZIGnkoH36/tbIktfb7yMqO3k4raczUFU+z3wdwlaF8AGmkorRQDavmyueJP7Xi8+SoYdXc1Nzno2S5dP+LkvuSbG5/dJx9PoCrjsOyrBS6L08Kh8PyeDwKhUJyu912xwFSEjucAkg1iRy/2eEUSEPZWQ7dect0u2MkJitbCtxtdwoAKYDTLgAAwCjKBwAAMIryAQAAjKJ8AAAAoygfAADAKMoHAAAwivIBAACMonwAAACjKB8AAMAoygcAADCK8gEAAIyifAAAAKMoHwAAwCjKBwAAMCrh8vHWW2+psrJSfr9fDodDr732WtznlmVp27Zt8vv9ys3N1eLFi9XR0ZGsvAAAIM0lXD7Onj2rr3zlK6qvrx/18x07dmjnzp2qr69Xe3u7fD6flixZov7+/kmHBQAA6W9aol+wdOlSLV26dNTPLMvSrl27tGXLFq1YsUKStGfPHnm9Xu3bt08PP/zw5NICAIC0l9RrPrq6uhQMBlVeXh4bczqdWrRokVpbW0f9msHBQYXD4bgXYNJQxNI7v/tUrx8+rnd+96mGIpbdkS4vMiR1/VI68v+i/4wM2Z0IAK5Ywisf4wkGg5Ikr9cbN+71etXd3T3q19TV1ammpiaZMYAr1nS0VzX7O9UbGoiNFXpytLWyRBWlhTYmG0dno9RULYVPfD7m9ksV26WS5fblAoArNCV3uzgcjrj3lmWNGBu2efNmhUKh2Kunp2cqIgEjNB3t1dq9h+KKhyQFQwNau/eQmo722pRsHJ2N0k8fii8ekhTujY53NtqTCwASkNTy4fP5JH2+AjKsr69vxGrIMKfTKbfbHfcCptpQxFLN/k6NdoJleKxmf2dqnYKJDEVXPMZL3bSJUzAAUl5Sy0cgEJDP51Nzc3Ns7MKFC2ppaVFZWVkyfxQwKW1dp0aseFzMktQbGlBb1ylzoS6nu3XkikccSwofj84DgBSW8DUfZ86c0W9/+9vY+66uLh0+fFj5+fmaMWOGqqqqVFtbq+LiYhUXF6u2tlZ5eXlauXJlUoMDk9HXP3bxmMg8I86cTO48ALBJwuXjvffe0x/90R/F3m/YsEGStHr1av3TP/2TNm7cqPPnz2vdunU6ffq0FixYoAMHDsjlciUvNTBJBa6cpM4z4vrRT11OeB4A2MRhWVYKndSWwuGwPB6PQqEQ139gygxFLC3c/qaCoYFRr6BwSPJ5cvR29TeUnTX6xdLGRYakXaXRi0vHSu32S1VHpKxs0+kAXOUSOX7zbBdclbKzHNpaWSIpWjQuNvx+a2VJ6hQPKVooKrb/35sxUlc8TfEAkPIoH7hqVZQWqmHVXPk88adWfJ4cNayam5r7fJQsl+5/UXJfks3tj46zzweANMBpF1z1hiKW2rpOqa9/QAWuHM0P5KfWisdoIkPRu1rOnIxe4zGzjBUPALZK5Pid1B1OgXSUneXQnbdMtztGYrKypcDddqcAgAnhtAsAADCK8gEAAIyifAAAAKMoHwAAwCjKBwAAMIryAQAAjKJ8AAAAoygfAAD
AKMoHAAAwih1OkVRsVQ4AuBzKB5Km6WivavZ3qjc0EBsr9ORoa2VJaj6kTZI6G6Wmail84vMxtz/69Fge0gYAU4LTLkiKpqO9Wrv3UFzxkKRgaEBr9x5S09Fem5KNo7NR+ulD8cVDksK90fHORntyAUCGo3xg0oYilmr2d2q0xyMPj9Xs79RQJIUeoBwZiq54jJe6aVN0HgAgqSgfmLS2rlMjVjwuZknqDQ2oreuUuVCX0906csUjjiWFj0fnAQCSivKBSevrH7t4TGSeEWdOJnceAOCKUT4waQWunKTOM+J6b3LnAQCuGOUDkzY/kK9CT47GuqHWoehdL/MD+SZjjW9mWfSulvFSu78YnQcASCrKByYtO8uhrZUlkkYeyoffb60sSa39PrKyo7fTShozdcXT7PcBAFOA8oGkqCgtVMOqufJ54k+t+Dw5alg1NzX3+ShZLt3/ouS+JJvbHx1nnw8AmBIOy7JS6P5HKRwOy+PxKBQKye122x0HCWKHUwC4OiVy/GaHUyRVdpZDd94y3e4YicnKlgJ3250CAK4alI8UxioCACATUT5SFM9JAQBkKi44TUE8JwUAkMkoHymG56QAADLdVVM+hiKW3vndp3r98HG987tPU+vgfZGLn5OSpYi+ltWp5Vmt+lpWp7IU4TkpAIC0N2XXfPzkJz/RM888o97eXs2ePVu7du3S3Xfbc0dBOl0/Mfz8kz/OatPWa16U3/F5yThh5avm/3tI/x6Zz3NSAABpa0pWPl5++WVVVVVpy5Ytev/993X33Xdr6dKlOnbs2FT8uHGl2/UTBa4c/XFWmxqu2SWf4lc3fDqlhmt26Y+z2nhOCgAgbU3JJmMLFizQ3Llz1dDQEBu79dZbde+996qurm7cr03mJmNDEUsLt7855uPeHYruwPl29TdS5hbWoc8+0ydP/oFutD7VaJEiltTnmK4bf/SRsqelyM1KkSFpV2n04tJRr/twRO96qTrCbbcAkKESOX4nfeXjwoULOnjwoMrLy+PGy8vL1do68pz/4OCgwuFw3CtZLr5+YjSpeP1Eds878mr04iFJWQ7Jp0+V3fOO2WDj4TkpAIAEJL18fPLJJxoaGpLXG7/E7vV6FQwGR8yvq6uTx+OJvYqKipKW5Uqvi+D6iSTgOSkAgCs0Zev2Dkf8/wFbljViTJI2b96sDRs2xN6Hw+GkFZArvS6C6yeSpGS5NGsZO5wCAMaV9PLxhS98QdnZ2SNWOfr6+kashkiS0+mU0+lMdgxJ0vxAvgo9OQqGBsa6EkE+T3Tb8pQxsyy6WnC56ydmlplOdmV4TgoA4DKSftrl2muv1e23367m5ua48ebmZpWVmT1gZmc5tLWyRNKYVyJoa2VJylxsKonrJwAAGW9KbrXdsGGD/v7v/17/+I//qA8++ECPP/64jh07pkceeWQqfty4KkoL1bBqrnye+FMrPk+OGlbNTbl9PiRx/QQAIKNNyTUff/qnf6pPP/1UTzzxhHp7e1VaWqp/+7d/08yZM6fix11WRWmhlpT40usJsVw/AQDIUFOyz8dkJHOfDwAAYIat+3wAAACMh/IBAACMonwAAACjKB8AAMAoygcAADCK8gEAAIyifAAAAKMoHwAAwCjKBwAAMGpKtlefjOENV8PhsM1JAADAlRo+bl/JxukpVz76+/slSUVFRTYnAQAAierv75fH4xl3Tso92yUSiejEiRNyuVxyOJL74LdwOKyioiL19PTw3JgpxN/ZDP7O5vC3NoO/sxlT9Xe2LEv9/f3y+/3Kyhr/qo6UW/nIysrSTTfdNKU/w+128y+2AfydzeDvbA5/azP4O5sxFX/ny614DOOCUwAAYBTlAwAAGHVVlQ+n06mtW7fK6XTaHSWj8Xc2g7+zOfytzeDvbEYq/J1T7oJTAACQ2a6qlQ8AAGA/ygcAADCK8gEAAIyifAAAAKOumvLxk5/8RIFAQDk5Obr99tv1y1/+0u5IGaeurk533HGHXC6XCgo
KdO+99+rDDz+0O1bGq6urk8PhUFVVld1RMs7x48e1atUqTZ8+XXl5efrqV7+qgwcP2h0ro3z22Wf60Y9+pEAgoNzcXN1888164oknFIlE7I6W9t566y1VVlbK7/fL4XDotddei/vcsixt27ZNfr9fubm5Wrx4sTo6OoxkuyrKx8svv6yqqipt2bJF77//vu6++24tXbpUx44dsztaRmlpadH69ev17rvvqrm5WZ999pnKy8t19uxZu6NlrPb2du3evVu33Xab3VEyzunTp3XXXXfpmmuu0RtvvKHOzk79+Mc/1g033GB3tIyyfft2Pffcc6qvr9cHH3ygHTt26JlnntGzzz5rd7S0d/bsWX3lK19RfX39qJ/v2LFDO3fuVH19vdrb2+Xz+bRkyZLYM9amlHUVmD9/vvXII4/Ejc2aNcvatGmTTYmuDn19fZYkq6Wlxe4oGam/v98qLi62mpubrUWLFlmPPfaY3ZEySnV1tbVw4UK7Y2S8ZcuWWWvWrIkbW7FihbVq1SqbEmUmSdarr74aex+JRCyfz2c9/fTTsbGBgQHL4/FYzz333JTnyfiVjwsXLujgwYMqLy+PGy8vL1dra6tNqa4OoVBIkpSfn29zksy0fv16LVu2TN/61rfsjpKRGhsbNW/ePN13330qKCjQnDlz9Pzzz9sdK+MsXLhQP//5z/XRRx9Jkn71q1/p7bff1re//W2bk2W2rq4uBYPBuGOj0+nUokWLjBwbU+7Bcsn2ySefaGhoSF6vN27c6/UqGAzalCrzWZalDRs2aOHChSotLbU7TsZ56aWXdOjQIbW3t9sdJWN9/PHHamho0IYNG/TDH/5QbW1t+sEPfiCn06mHHnrI7ngZo7q6WqFQSLNmzVJ2draGhob01FNP6YEHHrA7WkYbPv6Ndmzs7u6e8p+f8eVjmMPhiHtvWdaIMSTPo48+ql//+td6++237Y6ScXp6evTYY4/pwIEDysnJsTtOxopEIpo3b55qa2slSXPmzFFHR4caGhooH0n08ssva+/evdq3b59mz56tw4cPq6qqSn6/X6tXr7Y7Xsaz69iY8eXjC1/4grKzs0escvT19Y1ofEiO73//+2psbNRbb72lm266ye44GefgwYPq6+vT7bffHhsbGhrSW2+9pfr6eg0ODio7O9vGhJmhsLBQJSUlcWO33nqr/uVf/sWmRJnpL//yL7Vp0yZ95zvfkSR9+ctfVnd3t+rq6igfU8jn80mKroAUFhbGxk0dGzP+mo9rr71Wt99+u5qbm+PGm5ubVVZWZlOqzGRZlh599FG98sorevPNNxUIBOyOlJG++c1v6siRIzp8+HDsNW/ePD344IM6fPgwxSNJ7rrrrhG3in/00UeaOXOmTYky07lz55SVFX8oys7O5lbbKRYIBOTz+eKOjRcuXFBLS4uRY2PGr3xI0oYNG/Td735X8+bN05133qndu3fr2LFjeuSRR+yOllHWr1+vffv26fXXX5fL5YqtNnk8HuXm5tqcLnO4XK4R19Fcd911mj59OtfXJNHjjz+usrIy1dbW6v7771dbW5t2796t3bt32x0to1RWVuqpp57SjBkzNHv2bL3//vvauXOn1qxZY3e0tHfmzBn99re/jb3v6urS4cOHlZ+frxkzZqiqqkq1tbUqLi5WcXGxamtrlZeXp5UrV059uCm/nyZF/N3f/Z01c+ZM69prr7Xmzp3L7Z9TQNKorxdeeMHuaBmPW22nxv79+63S0lLL6XRas2bNsnbv3m13pIwTDoetxx57zJoxY4aVk5Nj3XzzzdaWLVuswcFBu6OlvV/84hej/jd59erVlmVFb7fdunWr5fP5LKfTaX3961+3jhw5YiSbw7Isa+orDgAAQFTGX/MBAABSC+UDAAAYRfkAAABGUT4AAIBRlA8AAGAU5QMAABhF+QAAAEZRPgAAgFGUDwAAYBTlAwAAGEX5AAAARlE+AACAUf8/WDEE/tpKzk4AAAAASUVORK5
CYII=",
+      "text/plain": [
+       "<Figure size 640x480 with 1 Axes>"
+      ]
+     },
+     "metadata": {},
+     "output_type": "display_data"
+    }
+   ],
+   "source": [
+    "wf2 = Workflow(\"my_composed_workflow\")\n",
+    "\n",
+    "wf2.square_plot = my_square_plot(n=10)\n",
+    "wf2.shift = add_one(wf2.square_plot.outputs.x)\n",
+    "wf2.shifted_square_plot = wf2.create.plotting.Scatter(\n",
+    "    x=wf2.shift,\n",
+    "    y=wf2.square_plot.outputs.y,\n",
+    ")\n",
+    "wf2()"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "3b30dbca-3d89-44df-b47a-951aedcb939d",
+   "metadata": {},
+   "source": [
+    "## What else?\n",
+    "\n",
+    "To learn more, take a look at the `deepdive.ipynb` notebook, and/or start looking through the class docstrings. Here's a brief map of what you're still missing:\n",
+    "\n",
+    "### Features that are currently available but in alpha stage\n",
+    "- Distributing node execution onto remote processes\n",
+    "  - Single-core parallel Python processes are available by setting `.executor = True`\n",
+    "- Acyclic and cyclic graphs\n",
+    "  - Execution order for graphs whose data-flow topology is a DAG is determined automatically, but you're always free to specify it manually with `Signals`; indeed, for cyclic graphs you _must_ specify the execution flow manually, but cyclic graphs _are_ possible!\n",
+    "- Complex flow nodes\n",
+    "  - If, While, and For nodes are all available for more complex flow control\n",
+    "- A node library for atomistic simulations with LAMMPS\n",
+    "  \n",
+    "### Features coming shortly\n",
+    "- Storing workflow results and restarting partially executed workflows\n",
+    "- More and richer node packages\n",
+    "\n",
+    "### Features planned\n",
+    "- \"FAIR\" principles for node packages and package registration\n",
+    "- Ontological typing and guided workflow design (see our `ironflow` project for a working prototype)"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "cf7a44a7-cf8e-4077-9683-909d89f5a5ef",
+   "metadata": {},
+   "outputs": [],
+   "source": []
+  }
+ ],
+ "metadata": {
+  "kernelspec": {
+   "display_name": "Python 3 (ipykernel)",
+   "language": "python",
+   "name": "python3"
+  },
+  "language_info": {
+   "codemirror_mode": {
+    "name": "ipython",
+    "version": 3
+   },
+   "file_extension": ".py",
+   "mimetype": "text/x-python",
+   "name": "python",
+   "nbconvert_exporter": "python",
+   "pygments_lexer": "ipython3",
+   "version": "3.11.4"
+  }
+ },
+ "nbformat": 4,
+ "nbformat_minor": 5
+}
diff --git a/pyiron_workflow/__init__.py b/pyiron_workflow/__init__.py
index b2636e88..566dfc9f 100644
--- a/pyiron_workflow/__init__.py
+++ b/pyiron_workflow/__init__.py
@@ -1 +1,30 @@
+"""
+`pyiron_workflow` is a Python framework for constructing computational workflows in a
+graph-based format.
+The intent of such a framework is to improve the reliability and shareability of
+computational workflows, as well as to provide supporting infrastructure for storing
+and retrieving data and for executing computations on remote resources (with a
+special emphasis on HPC environments common in academic research).
+It is a key goal that writing such workflows should be as easy as possible, and simple
+cases should be _almost_ as simple as writing and running plain Python functions.
+
+Key features:
+- Single point of import
+- Easy "nodeification" of regular Python code
+- Macro nodes, so complex workflows can be built by composition
+- (Optional) type checking for data connections
+- (Optional) remote execution of individual nodes (currently only very simple
+    single-core, same-machine parallel processes)
+- Both acyclic (execution flow is automated) and cyclic (execution flow must be
+    specified) graphs allowed
+- Easy extensibility by collecting packages of nodes together for sharing/reusing
+
+Planned:
+- Storage of executed workflows, including restarting from a partially executed workflow
+- Support for more complex remote execution, especially leveraging `pympipool`
+- Infrastructure that supports and encourages the use of FAIR principles for node
+  packages and finished workflows
+- Ontological hinting for data channels in order to provide guided workflow design
+- GUI on top for code-lite/code-free visual scripting
+"""
 from pyiron_workflow.workflow import Workflow
diff --git a/pyiron_workflow/channels.py b/pyiron_workflow/channels.py
index f2901ad6..196a3388 100644
--- a/pyiron_workflow/channels.py
+++ b/pyiron_workflow/channels.py
@@ -1,22 +1,9 @@
 """
 Channels are access points for information to flow into and out of nodes.
+They accomplish this by forming connections between each other, and it should be as
+easy as possible to form sensible and reliable connections.
 
-Data channels carry, unsurprisingly, data.
-Connections are only permissible between opposite sub-types, i.e. input-output.
-When input channels `fetch()` data in, they set their `value` to the first available
-data value among their connections -- i.e. the `value` of the first output channel in
-their connections who has something other than `NotData`.
-Input data channels will raise an error if a `fetch()` is attempted while their parent
- node is running.
-
-Signal channels are tools for procedurally exposing functionality on nodes.
-Input signal channels are connected to a callback function which gets invoked when the
-channel is called.
-Output signal channels call all the input channels they are connected to when they get
- called themselves.
-In this way, signal channels can force behaviour (node method calls) to propagate
-forwards through a graph.
-They do not hold any data and have no `value` attribute, but rather fire for an effect.
+Nodes get the attention, but channels are the real heroes.
 """
 
 from __future__ import annotations
@@ -44,16 +31,31 @@ class Channel(HasChannel, HasToDict, ABC):
     """
     Channels facilitate the flow of information (data or control signals) into and
     out of nodes.
-    They must have a label and belong to a node.
 
-    Input/output channels can be (dis)connected from other output/input channels of the
-    same generic type (i.e. data or signal), and store all of their current connections
-    in a list.
+    They must have an identifier (`label: str`) and belong to a parent node
+    (`node: pyiron_workflow.node.Node`).
+
+    Non-abstract channel classes should come in input/output pairs with a shared
+    ancestor (`generic_type: type[Channel]`).
+
+    Channels may form (`connect`/`disconnect`) and store (`connections: list[Channel]`)
+    connections with other channels.
+
     This connection information is reflexive, and is duplicated to be stored on _both_
     channels in the form of a reference to their counterpart in the connection.
 
-    Child classes must define a string representation, `__str__`, and their
-    `generic_type` which is a parent of both themselves and their output/input partner.
+    By using the provided methods to modify connections, the reflexive nature of
+    these (dis)connections is guaranteed to be handled, and new connections are
+    subjected to a validity test.
+
+    In this abstract class the only requirement is that the connecting channels form a
+    "conjugate pair" of classes, i.e. they are different classes but have the same
+    parent class (`generic_type: type[Channel]`) -- input/output connects to
+    output/input.
+
+    Iterating over channels yields their connections.
+
+    The length of a channel is the number of its connections.
 
     Attributes:
         label (str): The name of the channel.
@@ -72,8 +74,7 @@ def __init__(
 
         Args:
             label (str): A name for the channel.
-            node (pyiron_workflow.node.Node): The node to which the
-             channel belongs.
+            node (pyiron_workflow.node.Node): The node to which the channel belongs.
         """
         self.label: str = label
         self.node: Node = node
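A minimal sketch (not the real implementation) of the connection rules described in the docstring above: channels connect only to the conjugate member of their input/output pair, and connections are recorded reflexively on both participants.

```python
# Toy channels obeying the "conjugate pair" rule: same generic type,
# different concrete class, with reflexive connection bookkeeping.
class ToyChannel:
    generic_type = None  # shared ancestor of a conjugate input/output pair

    def __init__(self, label):
        self.label = label
        self.connections = []

    def _valid_connection(self, other):
        # Same generic type but a different concrete class, i.e. an
        # input may only connect to an output, and vice versa
        return (
            isinstance(other, self.generic_type)
            and type(other) is not type(self)
        )

    def connect(self, other):
        if not self._valid_connection(other):
            raise TypeError(f"Cannot connect {self.label} to {other.label}")
        self.connections.append(other)
        other.connections.append(self)  # reflexive: stored on both ends

class ToyInput(ToyChannel):
    pass

class ToyOutput(ToyChannel):
    pass

ToyInput.generic_type = ToyChannel
ToyOutput.generic_type = ToyChannel

inp, out = ToyInput("inp"), ToyOutput("out")
inp.connect(out)
print(out in inp.connections and inp in out.connections)  # True
```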
@@ -105,7 +106,6 @@ def connect(self, *others: Channel) -> None:
         Connections are reflexive, and must occur between input and output channels of
         the same `generic_type` (i.e. data or signal).
 
-
         Args:
             *others (Channel): The other channel objects to attempt to connect with.
 
@@ -145,8 +145,8 @@ def disconnect(self, *others: Channel) -> list[tuple[Channel, Channel]]:
             *others (Channel): The other channels to disconnect from.
 
         Returns:
-            [list[tuple[Channel, Channel]]]: A list of the pairs of channels that no
-                longer participate in a connection.
+            [list[tuple[Channel, Channel]]]: A list of the (input, output) conjugate
+                pairs of channels that no longer participate in a connection.
         """
         destroyed_connections = []
         for other in others:
@@ -227,25 +227,42 @@ def __repr__(cls):
 class DataChannel(Channel, ABC):
     """
     Data channels control the flow of data on the graph.
-    They store this data in a `value` attribute.
-    They may optionally have a type hint.
-    They have a `ready` attribute which tells whether their value matches their type
-    hint (if one is provided, else `True`).
-    (In the future they may optionally have a storage priority.)
-    (In the future they may optionally have a storage history limit.)
-    (In the future they may optionally have an ontological type.)
-
-    Note that type checking is performed on value updates. This is typically not super
-    expensive, but once you have a workflow you're happy with, you may wish to
-    deactivate `strict_hints` throughout the workflow for the sake of computational
-    efficiency during production runs.
-
-    When type checking channel connections, we insist that the output type hint be
-    _as or more specific_ than the input type hint, to ensure that the input always
-    receives output of a type it expects. This behaviour can be disabled and all
-    connections allowed by setting `strict_hints = False` on the relevant input
-    channel.
 
+    They store data persistently (`value`).
+
+    This value may have a default (`default`) and the default-default is to be
+    `NotData`.
+
+    They may optionally have a type hint (`type_hint`).
+
+    New data and new connections are tested against type hints (if any).
+
+    In addition to the requirement of being a "conjugate pair", if both connecting
+    channels have type hints, the output channel's hint must be as or more specific
+    than the input channel's.
+
+    In addition to connections, these channels can have a single partner
+    (`value_receiver: DataChannel`) that is of the _same_ class and obeys type hints as
+    though it were the "downstream" (input) partner in a connection.
+    Channels with such partners pass any data updates they receive directly to this
+    partner (via the `value` setter).
+    (This is helpful for passing data between scopes, where we want input at one scope
+    to be passed to the input of nodes at a deeper scope, i.e. macro input passing to
+    child node input, or vice versa for output.)
+
+    All these type hint tests can be disabled on the input/receiving channel
+    (`strict_hints: bool`), and doing so is recommended for optimal performance in
+    production runs.
+
+    Channels can indicate whether they hold data they are happy with (`ready: bool`),
+    which is to say it is data (not `NotData`) and that it conforms to the type hint
+    (if one is provided and checking is active).
+
+    TODO:
+        - Storage (including priority and history)
+        - Ontological hinting
+
+    Some comments on type hinting:
     For simple type hints like `int` or `str`, type hint comparison is trivial.
     However, some hints take arguments, e.g. `dict[str, int]` to specify key and value
     types; `tuple[int, int, str]` to specify a tuple with certain values;
@@ -261,10 +278,6 @@ class DataChannel(Channel, ABC):
     E.g. `Literal[1, 2]` is as or more specific that both `Literal[1, 2]` and
     `Literal[1, 2, "three"]`.
 
-    The data `value` will initialize to an instance of `NotData` by default.
-    The channel will identify as `ready` when the value is _not_ an instance of
-    `NotData`, and when the value conforms to type hints (if any).
-
     Warning:
         Type hinting in python is quite complex, and determining when a hint is
         "more specific" can be tricky. For instance, in python 3.11 you can now type
@@ -311,6 +324,12 @@ def value(self):
 
     @value.setter
     def value(self, new_value):
+        self._type_check_new_value(new_value)
+        if self.value_receiver is not None:
+            self.value_receiver.value = new_value
+        self._value = new_value
+
+    def _type_check_new_value(self, new_value):
         if (
             self.strict_hints
             and new_value is not NotData
@@ -321,9 +340,6 @@ def value(self, new_value):
                 f"The channel {self.label} cannot take the value `{new_value}` because "
                 f"it is not compliant with the type hint {self.type_hint}"
             )
-        if self.value_receiver is not None:
-            self.value_receiver.value = new_value
-        self._value = new_value
 
     @property
     def value_receiver(self) -> InputData | OutputData | None:
@@ -372,25 +388,25 @@ def generic_type(self) -> type[Channel]:
     @property
     def ready(self) -> bool:
         """
-        Check if the currently stored value satisfies the channel's type hint.
+        Check if the currently stored value is data and satisfies the channel's type
+        hint (if hint checking is activated).
 
         Returns:
-            (bool): Whether the value matches the type hint.
+            (bool): Whether the value is data and matches the type hint.
         """
-        if self.type_hint is not None:
-            return self._value_is_data and valid_value(self.value, self.type_hint)
-        else:
-            return self._value_is_data
+        return self._value_is_data and (
+            valid_value(self.value, self.type_hint) if self._has_hint else True
+        )
 
     @property
-    def _value_is_data(self):
+    def _value_is_data(self) -> bool:
         return self.value is not NotData
 
     @property
-    def _has_hint(self):
+    def _has_hint(self) -> bool:
         return self.type_hint is not None
 
-    def _valid_connection(self, other) -> bool:
+    def _valid_connection(self, other: DataChannel) -> bool:
         if super()._valid_connection(other):
             if self._both_typed(other):
                 out, inp = self._figure_out_who_is_who(other)
@@ -436,19 +452,32 @@ def fetch(self) -> None:
         `NotData`; if no such value exists (e.g. because there are no connections or
         because all the connected output channels have `NotData` as their value),
         `value` remains unchanged.
+        I.e., the connection with the highest priority for updating input data is the
+        0th connection; build graphs accordingly.
 
         Raises:
             RuntimeError: If the parent node is `running`.
         """
+        for out in self.connections:
+            if out.value is not NotData:
+                self.value = out.value
+                break
+
+    @property
+    def value(self):
+        return self._value
+
+    @value.setter
+    def value(self, new_value):
         if self.node.running:
             raise RuntimeError(
                 f"Parent node {self.node.label} of {self.label} is running, so value "
                 f"cannot be updated."
             )
-        for out in self.connections:
-            if out.value is not NotData:
-                self.value = out.value
-                break
+        self._type_check_new_value(new_value)
+        if self.value_receiver is not None:
+            self.value_receiver.value = new_value
+        self._value = new_value
 
 
 class OutputData(DataChannel):
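The `fetch()` priority described in the hunk above can be sketched in a few lines: input takes the value of the first connected output that actually holds data, so the 0th connection wins. `NotData` here is a stand-in sentinel, as in the library.

```python
# A minimal sketch of fetch(): scan connections in order and take the
# first value that is real data, leaving `value` untouched otherwise.
class NotData:
    pass

class ToyOutput:
    def __init__(self, value=NotData):
        self.value = value

class ToyInput:
    def __init__(self, *connections):
        self.connections = list(connections)
        self.value = NotData

    def fetch(self):
        for out in self.connections:
            if out.value is not NotData:
                self.value = out.value
                break

inp = ToyInput(ToyOutput(), ToyOutput(42), ToyOutput(7))
inp.fetch()
print(inp.value)  # 42 -- the first connection holding real data
```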
@@ -458,10 +487,9 @@ class OutputData(DataChannel):
 class SignalChannel(Channel, ABC):
     """
     Signal channels give the option control execution flow by triggering callback
-    functions.
+    functions when the channel is called.
 
-    Output channels can be called to trigger the callback functions of all input
-    channels to which they are connected.
+    Inputs hold a callback function to call, and outputs call each of their connections.
 
     Signal channels support `>` as syntactic sugar for their connections, i.e.
     `some_output > some_input` is equivalent to `some_input.connect(some_output)`.
@@ -476,15 +504,11 @@ def __call__(self) -> None:
     def generic_type(self) -> type[Channel]:
         return SignalChannel
 
-    def connect_output_signal(self, signal: OutputSignal):
+    def _connect_output_signal(self, signal: OutputSignal):
         self.connect(signal)
 
 
 class InputSignal(SignalChannel):
-    """
-    Invokes a callback when called.
-    """
-
     def __init__(
         self,
         label: str,
@@ -517,10 +541,6 @@ def to_dict(self) -> dict:
 
 
 class OutputSignal(SignalChannel):
-    """
-    Calls all the input signal objects in its connections list when called.
-    """
-
     def __call__(self) -> None:
         for c in self.connections:
             c()
@@ -532,5 +552,5 @@ def __str__(self):
         )
 
     def __gt__(self, other: InputSignal | Node):
-        other.connect_output_signal(self)
+        other._connect_output_signal(self)
         return True
diff --git a/pyiron_workflow/composite.py b/pyiron_workflow/composite.py
index 1f8376da..283bd1f4 100644
--- a/pyiron_workflow/composite.py
+++ b/pyiron_workflow/composite.py
@@ -24,37 +24,44 @@
 
 class Composite(Node, ABC):
     """
-    A base class for nodes that have internal structure -- i.e. they hold a sub-graph.
-
-    Item and attribute access is modified to give access to owned nodes.
-    Adding a node with the `add` functionality or by direct attribute assignment sets
-    this object as the parent of that node.
-
-    Guarantees that each owned node is unique, and does not belong to any other parents.
-
-    Offers a class method (`wrap_as`) to give easy access to the node-creating
-    decorators.
-
-    Offers a creator (the `create` method) which allows instantiation of other workflow
-    objects.
-    This method behaves _differently_ on the composite class and its instances -- on
-    instances, any created nodes get their `parent` attribute automatically set to the
-    composite instance being used.
-
-    Specifies the required `on_run()` and `run_args` to call `run()` on a subset of
-    owned `starting_nodes`, thus kick-starting computation on the owned sub-graph.
-    Both the specification of these starting nodes and specifying execution signals to
-    propagate execution through the graph is left to the user/child classes.
-    In the case of non-cyclic workflows (i.e. DAGs in terms of data flow), both
-    starting nodes and execution flow can be specified by invoking execution flow can
-    be determined automatically.
-
-    Also specifies `process_run_result` such that the `run` method (and its aliases)
-    return a new dot-accessible dictionary of keys and values created from the
-    composite output IO panel.
-
-    Does not specify `input` and `output` as demanded by the parent class; this
-    requirement is still passed on to children.
+    A base class for nodes that have internal graph structure -- i.e. they hold a
+    collection of child nodes and their computation is to execute that graph.
+
+    Promises (in addition to parent class promises):
+    - The class offers access...
+        - To the node-izing `pyiron_workflow` decorators
+        - To a creator for other `pyiron_workflow` objects (namely nodes)
+            - From the class level, this simply creates these objects
+            - From the instance level, created nodes get the instance as their parent
+    - Child nodes...
+        - Can be added by...
+            - Creating them from the creator on a composite _instance_
+            - Passing a node instance to the adding method
+            - Setting the composite instance as the node's parent at node instantiation
+            - Assigning a node instance as an attribute
+        - Can be accessed by...
+            - Attribute access using their node label
+            - Attribute or item access in the child nodes collection
+            - Iterating over the composite instance
+        - Can be removed by method
+        - Each have a unique label (within the scope of this composite)
+        - Have no other parent
+        - Can be replaced in-place with another node that has commensurate IO
+        - Have their working directory nested inside the composite's
+    - The length of a composite instance is its number of child nodes
+    - Running the composite...
+        - Runs the child nodes (either using manually specified execution signals, or
+            leveraging a helper tool that automates this process for data DAGs --
+            details are left to child classes)
+        - Returns a dot-dictionary of output IO
+    - Composite IO...
+        - Is some subset of the child nodes IO
+            - Default channel labels combine the child's label and the child's channel label
+            - Default behaviour is to expose all unconnected child nodes' IO
+        - Bijective maps can be used to...
+            - Rename IO
+            - Force a child node's IO to appear
+            - Force a child node's IO to _not_ appear
 
     Attributes:
         inputs/outputs_map (bidict|None): Maps in the form
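The composite IO mapping promised above can be sketched roughly. The library uses a `bidict` and its own `_build_io`; a plain dict suffices to illustrate the three behaviours: rename, force-expose, and disable (map to `None`).

```python
# A sketch of composite IO construction: unconnected child IO is exposed
# under "node__channel" labels by default; a key map can rename entries,
# hide them (None), or force-expose channels that are connected.
def build_io(child_channels, key_map, connected):
    io = {}
    for node_label, channel_label in child_channels:
        default_key = f"{node_label}__{channel_label}"
        if default_key in key_map:
            panel_key = key_map[default_key]
            if panel_key is not None:  # None means "do not expose"
                io[panel_key] = (node_label, channel_label)
        elif (node_label, channel_label) not in connected:
            # Default: expose all unconnected child IO under composed labels
            io[default_key] = (node_label, channel_label)
    return io

io = build_io(
    child_channels=[("a", "x"), ("b", "x"), ("c", "x")],
    key_map={"a__x": "renamed", "b__x": None},  # rename one, hide one
    connected={("c", "x")},  # internally connected -> hidden by default
)
print(sorted(io))  # ['renamed']

forced = build_io(
    child_channels=[("c", "x")],
    key_map={"c__x": "forced"},  # mapped, so exposed despite connection
    connected={("c", "x")},
)
```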
@@ -82,6 +89,11 @@ class Composite(Node, ABC):
         add(node: Node): Add the node instance to this subgraph.
         remove(node: Node): Break all connections the node has, remove it from this
          subgraph, and set its parent to `None`.
+        (de)activate_strict_hints(): Recursively (de)activate strict type hints.
+        replace(owned_node: Node | str, replacement: Node | type[Node]): Replaces an
+            owned node with a new node, as long as the new node's IO is commensurate
+            with the node being replaced.
+        register(): A short-cut to registering a new node package with the node creator.
     """
 
     wrap_as = Wrappers()
@@ -110,30 +122,43 @@ def __init__(
 
     @property
     def inputs_map(self) -> bidict | None:
+        self._deduplicate_nones(self._inputs_map)
         return self._inputs_map
 
     @inputs_map.setter
     def inputs_map(self, new_map: dict | bidict | None):
-        new_map = new_map if new_map is None else bidict(new_map)
+        self._deduplicate_nones(new_map)
+        if new_map is not None:
+            new_map = bidict(new_map)
         self._inputs_map = new_map
 
     @property
     def outputs_map(self) -> bidict | None:
+        self._deduplicate_nones(self._outputs_map)
         return self._outputs_map
 
     @outputs_map.setter
     def outputs_map(self, new_map: dict | bidict | None):
-        new_map = new_map if new_map is None else bidict(new_map)
+        self._deduplicate_nones(new_map)
+        if new_map is not None:
+            new_map = bidict(new_map)
         self._outputs_map = new_map
 
+    @staticmethod
+    def _deduplicate_nones(some_map: dict | bidict | None) -> None:
+        # Replace `None` values in-place with unique placeholders, since a
+        # bidict cannot hold duplicate values
+        if some_map is not None:
+            for k, v in some_map.items():
+                if v is None:
+                    some_map[k] = (None, f"{k} disabled")
+
     def activate_strict_hints(self):
         super().activate_strict_hints()
-        for node in self.nodes:
+        for node in self:
             node.activate_strict_hints()
 
     def deactivate_strict_hints(self):
         super().deactivate_strict_hints()
-        for node in self.nodes:
+        for node in self:
             node.deactivate_strict_hints()
 
     @property
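The motivation for `_deduplicate_nones` above can be shown directly: a bijective map cannot hold the value `None` twice, so each disabled channel gets a unique placeholder value instead. Sketched here with a plain dict standing in for `bidict`.

```python
# Give each disabled (None-valued) entry a unique placeholder so the map
# stays bijective (value-unique), as a bidict requires.
def deduplicate_nones(some_map):
    if some_map is not None:
        for k, v in some_map.items():
            if v is None:
                some_map[k] = (None, f"{k} disabled")

io_map = {"a__x": None, "b__x": None, "c__x": "kept"}
deduplicate_nones(io_map)
# The two disabled entries now have distinct values, so a value-unique
# mapping can store them side by side:
print(len(set(io_map.values())) == len(io_map))  # True
```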
@@ -178,7 +203,7 @@ def _update_children(self, children_from_another_process: DotDict[str, Node]):
         replace your own nodes with them, and set yourself as their parent.
         """
         for child in children_from_another_process.values():
-            child.parent = self
+            child._parent = self
         self.nodes = children_from_another_process
 
     def disconnect_run(self) -> list[tuple[Channel, Channel]]:
@@ -233,7 +258,8 @@ def _build_io(
                 default_key = f"{node.label}__{channel_label}"
                 try:
                     io_panel_key = key_map[default_key]
-                    if io_panel_key is not None:
+                    if not isinstance(io_panel_key, tuple):
+                        # Tuples indicate that the channel has been deactivated
                         io[io_panel_key] = self._get_linking_channel(
                             channel, io_panel_key
                         )
@@ -300,7 +326,7 @@ def add(self, node: Node, label: Optional[str] = None) -> None:
 
             self.nodes[label] = node
             node.label = label
-            node.parent = self
+            node._parent = self
         return node
 
     def _get_unique_label(self, label):
@@ -365,7 +391,7 @@ def remove(self, node: Node | str) -> list[tuple[Channel, Channel]]:
             (list[tuple[Channel, Channel]]): Any connections that node had.
         """
         node = self.nodes[node] if isinstance(node, str) else node
-        node.parent = None
+        node._parent = None
         disconnected = node.disconnect()
         if node in self.starting_nodes:
             self.starting_nodes.remove(node)
@@ -431,15 +457,67 @@ def replace(self, owned_node: Node | str, replacement: Node | type[Node]) -> Nod
         self.add(replacement)
         if is_starting_node:
             self.starting_nodes.append(replacement)
+
+        # Finally, make sure the IO is constructible with this new node, which will
+        # catch things like incompatible IO maps
+        try:
+            # Make sure node-level IO is pointing to the new node and that macro-level
+            # IO gets safely reconstructed
+            self._rebuild_data_io()
+        except Exception as e:
+            # If IO can't be successfully rebuilt using this node, revert changes and
+            # raise the exception
+            # Guaranteed to work since replacement in the other direction was
+            # already a success
+            self.replace(replacement, owned_node)
+            raise e
+
         return owned_node
 
+    def _rebuild_data_io(self):
+        """
+        Try to rebuild the IO.
+
+        If an error is encountered, revert back to the existing IO then raise it.
+        """
+        old_inputs = self.inputs
+        old_outputs = self.outputs
+        connection_changes = []  # For reversion if there's an error
+        try:
+            self._inputs = self._build_inputs()
+            self._outputs = self._build_outputs()
+            for old, new in [(old_inputs, self.inputs), (old_outputs, self.outputs)]:
+                for old_channel in old:
+                    if old_channel.connected:
+                        # If the old channel was connected to stuff, we'd better still
+                        # have a corresponding channel and be able to copy these, or we
+                        # should fail hard.
+                        # But, if it wasn't connected, we don't even care whether or not
+                        # we still have a corresponding channel to copy to
+                        new_channel = new[old_channel.label]
+                        new_channel.copy_connections(old_channel)
+                        swapped_connections = old_channel.disconnect_all()  # Purge old
+                        connection_changes.append(
+                            (new_channel, old_channel, swapped_connections)
+                        )
+        except Exception as e:
+            for new_channel, old_channel, swapped_connections in connection_changes:
+                new_channel.disconnect(*swapped_connections)
+                old_channel.connect(*swapped_connections)
+            self._inputs = old_inputs
+            self._outputs = old_outputs
+            e.args = (
+                f"Unable to rebuild IO for {self.label}; reverting to old IO. "
+                f"{e}",
+            )
+            raise e
+
     @classmethod
     @wraps(Creator.register)
     def register(cls, domain: str, package_identifier: str) -> None:
         cls.create.register(domain=domain, package_identifier=package_identifier)
 
     def __setattr__(self, key: str, node: Node):
-        if isinstance(node, Node) and key != "parent":
+        if isinstance(node, Node) and key != "_parent":
             self.add(node, label=key)
         elif (
             isinstance(node, type)
diff --git a/pyiron_workflow/function.py b/pyiron_workflow/function.py
index e16cff32..e4a36f92 100644
--- a/pyiron_workflow/function.py
+++ b/pyiron_workflow/function.py
@@ -20,30 +20,6 @@
 class Function(Node):
     """
     Function nodes wrap an arbitrary python function.
-    Node IO, including type hints, is generated automatically from the provided
-    function.
-    Input data for the wrapped function can be provided as any valid combination of
-    `*arg` and `**kwarg` at both initialization and on calling the node.
-
-    On running, the function node executes this wrapped function with its current input
-    and uses the results to populate the node output.
-
-    Function nodes must be instantiated with a callable to deterimine their function,
-    and a string to name each returned value of that callable. (If you really want to
-    return a tuple, just have multiple return values but only one output label -- there
-    is currently no way to mix-and-match, i.e. to have multiple return values at least
-    one of which is a tuple.)
-
-    The node label (unless otherwise provided), IO channel names, IO types, and input
-    defaults for the node are produced _automatically_ from introspection of the node
-    function.
-    Explicit output labels can be provided to modify the number of return values (from
-    $N$ to 1 in case you _want_ a tuple returned) and to dodge constraints on the
-    automatic scraping routine (namely, that there be _at most_ one `return`
-    expression).
-    (Additional properties like storage priority and ontological type are forthcoming
-    as kwarg dictionaries with keys corresponding to the channel labels (i.e. the node
-    arguments of the node function, or the output labels provided).)
 
     Actual function node instances can either be instances of the base node class, in
     which case the callable node function *must* be provided OR they can be instances
@@ -59,14 +35,18 @@ class Function(Node):
     Further, functions with multiple return branches that return different types or
     numbers of return values may or may not work smoothly, depending on the details.
 
-    Output is updated according to `process_run_result` -- which gets invoked by the
-    post-run callbacks defined in `Node` -- such that run results are used to populate
-    the output channels.
-
-    After a node is instantiated, its input can be updated as `*args` and/or `**kwargs`
-    on call.
-    `run()` and its aliases return the output of the executed function, or a futures
-    object if the node is set to use an executor.
+    Promises:
+    - IO channels are constructed automatically from the wrapped function
+        - This includes type hints (if any)
+        - This includes defaults (if any)
+        - By default, one output channel is created for each returned value (from a
+            tuple)
+        - Output channel labels are taken from the returned values, but may be
+            overridden
+        - A single tuple output channel can be forced by manually providing exactly one
+            output label
+    - Running the node executes the wrapped function and returns its result
+    - Input updates can be made with `*args` as well as the usual `**kwargs`, following
+        the same input order as the wrapped function.
+    - A default label can be scraped from the name of the wrapped function
 
     Args:
         node_function (callable): The function determining the behaviour of the node.
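The introspection promised above (channel labels, type hints, and defaults scraped from a plain function) can be sketched with `inspect`. The real scraping in `pyiron_workflow` is more involved; `scrape_input_channels` is an illustrative name, not the library's API.

```python
import inspect

# Scrape input-channel metadata from a function's signature: one channel
# per parameter, carrying its type hint and default (if any).
def scrape_input_channels(fn):
    channels = {}
    for name, p in inspect.signature(fn).parameters.items():
        channels[name] = {
            "type_hint": None if p.annotation is p.empty else p.annotation,
            "default": None if p.default is p.empty else p.default,
        }
    return channels

def example(x: int, y: float = 2.5):
    return x * y

channels = scrape_input_channels(example)
print(channels["y"])  # {'type_hint': <class 'float'>, 'default': 2.5}
```

(Note this sketch conflates "no default" with a default of `None`; a real implementation would use a sentinel.)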
@@ -305,8 +285,9 @@ class Function(Node):
         `Workflow` class.
 
     Comments:
-
-        Using the `self` argument for function nodes is not currently supported.
+        Using the `self` argument for function nodes is not fully supported; it will
+        raise an error when combined with an executor, and otherwise behaviour is not
+        guaranteed.
     """
 
     def __init__(
@@ -582,8 +563,10 @@ class SingleValue(Function, HasChannel):
     available directly at the node level (at least those which don't conflict with the
     existing node namespace).
 
-    This also allows the entire node to be used as a reference to its output channel
-    when making data connections, e.g. `some_node.input.some_channel = my_svn_instance`.
+    Promises (in addition to parent class promises):
+    - Attribute and item access falls back to the output value as a last resort
+    - The entire node can be used in place of its output value for connections, e.g.
+        `some_node.input.some_channel = my_svn_instance`.
     """
 
     def __init__(
diff --git a/pyiron_workflow/io.py b/pyiron_workflow/io.py
index 0932a57d..9a540be4 100644
--- a/pyiron_workflow/io.py
+++ b/pyiron_workflow/io.py
@@ -1,5 +1,8 @@
 """
 Collections of channel objects.
+
+These also support the syntactic sugar of treating value assignments and new
+connections on the same footing.
 """
 
 from __future__ import annotations
@@ -34,19 +37,9 @@ class IO(HasToDict, ABC):
     When assigning something to an attribute holding an existing channel, if the
     assigned object is a `Channel`, then an attempt is made to make a `connection`
     between the two channels, otherwise we fall back on a value assignment that must
-    be defined in child classes under `_assign_value_to_existing_channel`, i.e.
-    >>> some_io.some_existing_channel = 5
-
-    is equivalent to
-    >>> some_io._assign_value_to_existing_channel(
-    ...     some_io["some_existing_channel"], 5
-    ... )
-
-    and
-    >>> some_io.some_existing_channel = some_other_channel
-
-    is equivalent to
-    >>> some_io.some_existing_channel.connect(some_other_channel)
+    be defined in child classes under `_assign_value_to_existing_channel`.
+    This provides syntactic sugar such that both new connections and new values can
+    be assigned with a simple `=`.
     """
 
     def __init__(self, *channels: Channel):
@@ -172,10 +165,6 @@ def __setstate__(self, state):
 
 
 class DataIO(IO, ABC):
-    """
-    Extends the base IO class with helper methods relevant to data channels.
-    """
-
     def _assign_a_non_channel_value(self, channel: DataChannel, value) -> None:
         channel.value = value
 
diff --git a/pyiron_workflow/macro.py b/pyiron_workflow/macro.py
index f3a9edc3..ab0636c6 100644
--- a/pyiron_workflow/macro.py
+++ b/pyiron_workflow/macro.py
@@ -15,8 +15,6 @@
 if TYPE_CHECKING:
     from bidict import bidict
 
-    from pyiron_workflow.node import Node
-
 
 class Macro(Composite):
     """
@@ -24,16 +22,13 @@ class Macro(Composite):
     pre-populated workflow that is the same every time you instantiate it.
 
     At instantiation, the macro uses a provided callable to build and wire the graph,
-    then builds a static IO interface for this graph. (By default, unconnected IO is
-    passed using the same formalism as workflows to combine node and channel names, but
-    this can be overriden to rename the channels in the IO panel and/or to expose
-    channels that already have an internal connection.)
-
-    Like function nodes, initial values for input can be set using kwargs, and the node
-    will (by default) attempt to update at the end of the instantiation process.
-
-    It is intended that subclasses override the initialization signature and provide
-    the graph creation directly from their own method.
+    then builds a static IO interface for this graph. (See the parent class docstring
+    for more details; by default, and as with workflows, unconnected IO is
+    represented by combining node and channel names, but this can be controlled in
+    more detail with maps.)
+    This IO is _value linked_ to the child IO, so that their values stay synchronized,
+    but the child nodes of a macro form an isolated sub-graph.
+    As with function nodes, sub-classes may define a method for creating the graph.
 
     As with workflows, all DAG macros can determine their execution flow automatically,
     if you have cycles in your data flow, or otherwise want more control over the
@@ -43,6 +38,22 @@ class Macro(Composite):
     both then no further checks of their validity/reasonableness are performed, so be
     careful.
 
+    Promises (in addition to parent class promises):
+    - IO is...
+        - Only built at instantiation, after child node replacement, or at request, so
+            it is "static" for improved efficiency
+        - By value, i.e. the macro has its own IO channel instances and children are
+            duly encapsulated inside their own sub-graph
+        - Value-linked to the values of their corresponding child nodes' IO -- i.e.
+            updating a macro input value changes a child node's input value, and a
+            child node updating its output value changes a macro output value (if that
+            child's output is included in the macro's output, e.g. because it is
+            disconnected or otherwise included in the outputs map)
+    - Macros will attempt to set the execution graph automatically for DAGs, as long as
+        no execution flow is set in the function that builds the sub-graph
+    - A default node label can be generated using the name of the callable that builds
+        the graph.
+
     Examples:
         Let's consider the simplest case of macros that just consecutively add 1 to
         their input:
@@ -169,15 +180,32 @@ def __init__(
         outputs_map: Optional[dict | bidict] = None,
         **kwargs,
     ):
+        if not callable(graph_creator):
+            # Children of `Macro` may explicitly provide a `graph_creator` static
+            # method so the node has fixed behaviour.
+            # In this case, the `__init__` signature should be changed so that the
+            # `graph_creator` argument is just always `None` or some other non-callable.
+            # If a callable `graph_creator` is not received, it must already exist as
+            # an attribute!
+            if not hasattr(self, "graph_creator"):
+                raise AttributeError(
+                    f"If `None` is provided as a `graph_creator`, a `graph_creator` "
+                    f"property must be defined instead, e.g. when making child classes"
+                    f"of `Macro` with specific behaviour"
+                )
+        else:
+            # If a callable graph creator is received, use it
+            self.graph_creator = graph_creator
+
         self._parent = None
         super().__init__(
-            label=label if label is not None else graph_creator.__name__,
+            label=label if label is not None else self.graph_creator.__name__,
             parent=parent,
             strict_naming=strict_naming,
             inputs_map=inputs_map,
             outputs_map=outputs_map,
         )
-        graph_creator(self)
+        self.graph_creator(self)
         self._configure_graph_execution()
 
         self._inputs: Inputs = self._build_inputs()
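The `graph_creator` handling above follows a general "callable argument or class attribute" pattern. A minimal sketch of that pattern, with invented names (`Processor`, `Doubler`, `transform`) that are not part of pyiron_workflow:

```python
# Hypothetical sketch: accept a callable at instantiation, or fall back to
# one pinned on the class by a subclass; fail loudly if neither exists.


class Processor:
    def __init__(self, transform=None):
        if not callable(transform):
            # Subclasses may fix behaviour by defining `transform` themselves
            if not hasattr(self, "transform"):
                raise AttributeError(
                    "Provide a callable `transform`, or define one on a subclass"
                )
        else:
            self.transform = transform


class Doubler(Processor):
    @staticmethod
    def transform(x):
        return 2 * x


assert Processor(lambda x: x + 1).transform(1) == 2
assert Doubler().transform(3) == 6
```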
@@ -228,44 +256,6 @@ def _update_children(self, children_from_another_process):
         super()._update_children(children_from_another_process)
         self._rebuild_data_io()
 
-    def _rebuild_data_io(self):
-        """
-        Try to rebuild the IO.
-
-        If an error is encountered, revert back to the existing IO then raise it.
-        """
-        old_inputs = self.inputs
-        old_outputs = self.outputs
-        connection_changes = []  # For reversion if there's an error
-        try:
-            self._inputs = self._build_inputs()
-            self._outputs = self._build_outputs()
-            for old, new in [(old_inputs, self.inputs), (old_outputs, self.outputs)]:
-                for old_channel in old:
-                    if old_channel.connected:
-                        # If the old channel was connected to stuff, we'd better still
-                        # have a corresponding channel and be able to copy these, or we
-                        # should fail hard.
-                        # But, if it wasn't connected, we don't even care whether or not
-                        # we still have a corresponding channel to copy to
-                        new_channel = new[old_channel.label]
-                        new_channel.copy_connections(old_channel)
-                        swapped_conenctions = old_channel.disconnect_all()  # Purge old
-                        connection_changes.append(
-                            (new_channel, old_channel, swapped_conenctions)
-                        )
-        except Exception as e:
-            for new_channel, old_channel, swapped_conenctions in connection_changes:
-                new_channel.disconnect(*swapped_conenctions)
-                old_channel.connect(*swapped_conenctions)
-            self._inputs = old_inputs
-            self._outputs = old_outputs
-            e.message = (
-                f"Unable to rebuild IO for {self.label}; reverting to old IO."
-                f"{e.message}"
-            )
-            raise e
-
     def _configure_graph_execution(self):
         run_signals = self.disconnect_run()
 
@@ -292,19 +282,6 @@ def _reconnect_run(self, run_signal_pairs_to_restore):
         for pairs in run_signal_pairs_to_restore:
             pairs[0].connect(pairs[1])
 
-    def replace(self, owned_node: Node | str, replacement: Node | type[Node]):
-        replaced_node = super().replace(owned_node=owned_node, replacement=replacement)
-        try:
-            # Make sure node-level IO is pointing to the new node and that macro-level
-            # IO gets safely reconstructed
-            self._rebuild_data_io()
-        except Exception as e:
-            # If IO can't be successfully rebuilt using this node, revert changes and
-            # raise the exception
-            self.replace(replacement, replaced_node)  # Guaranteed to work since
-            # replacement in the other direction was already a success
-            raise e
-
     def to_workfow(self):
         raise NotImplementedError
 
@@ -326,11 +303,8 @@ def as_node(graph_creator: callable[[Macro], None]):
             graph_creator.__name__.title().replace("_", ""),  # fnc_name to CamelCase
             (Macro,),  # Define parentage
             {
-                "__init__": partialmethod(
-                    Macro.__init__,
-                    graph_creator,
-                    **node_class_kwargs,
-                )
+                "__init__": partialmethod(Macro.__init__, None, **node_class_kwargs),
+                "graph_creator": staticmethod(graph_creator),
             },
         )
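The `as_node` change above uses a class-factory idiom: build a subclass dynamically with `type`, pre-bind `__init__` arguments via `partialmethod`, and attach the user's function as a `staticmethod`. A generic sketch of that idiom (the names `Base` and `make` are illustrative, not pyiron_workflow API):

```python
from functools import partialmethod


class Base:
    def __init__(self, creator):
        # Fall back to a class-level `creator` when none is passed
        self.result = (creator or self.creator)()


def make(creator):
    # Dynamically define a subclass whose name is derived from the function
    return type(
        creator.__name__.title().replace("_", ""),  # fnc_name -> CamelCase
        (Base,),  # Define parentage
        {
            "__init__": partialmethod(Base.__init__, None),
            "creator": staticmethod(creator),
        },
    )


def the_answer():
    return 42


TheAnswer = make(the_answer)
assert TheAnswer.__name__ == "TheAnswer"
assert TheAnswer().result == 42
```

Using `staticmethod` means the attached function is not bound to the instance, so it can be called without an implicit `self` argument.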
 
diff --git a/pyiron_workflow/node.py b/pyiron_workflow/node.py
index 0a9fa5fc..80084ba5 100644
--- a/pyiron_workflow/node.py
+++ b/pyiron_workflow/node.py
@@ -1,6 +1,8 @@
 """
 A base class for objects that can form nodes in the graph representation of a
 computational workflow.
+
+The workhorse class for the entire concept.
 """
 
 from __future__ import annotations
@@ -65,65 +67,56 @@ def wrapped_method(node: Node, *args, **kwargs):  # rather node:Node
 class Node(HasToDict, ABC):
     """
     Nodes are elements of a computational graph.
-    They have input and output data channels that interface with the outside
-    world, and a callable that determines what they actually compute, and input and
-    output signal channels that can be used to customize the execution flow of their
-    graph;
-    Together these channels represent edges on the dual data and execution computational
-    graphs.
-
-    Nodes can be run in a variety of ways..
-    Non-exhaustively, they can be run in a "push" paradigm where they do their
-    calculation and then trigger downstream calculations; in a "pull" mode where they
-    first make sure all their upstream dependencies then run themselves (but not
-    anything downstream); or they may be forced to run their calculation with exactly
-    the input they have right now.
-    These and more options are available, and for more information look at the `run`
-    method.
-
-    Nodes may have a `parent` node that owns them as part of a sub-graph.
-
-    Every node must be named with a `label`, and may use this label to attempt to create
-    a working directory in memory for itself if requested.
-    These labels also help to identify nodes in the wider context of (potentially
-    nested) computational graphs.
-
-    By default, nodes' signals input comes with `run` and `ran` IO ports, which invoke
-    the `run()` method and emit after running the node, respectfully.
-    (Whether we get all the way to emitting the `ran` signal depends on how the node
-    was invoked -- it is possible to computing things with the node without sending
-    any more signals downstream.)
-    These signal connections can be made manually by reference to the node signals
-    channel, or with the `>` symbol to indicate a flow of execution. This syntactic
-    sugar can be mixed between actual signal channels (output signal > input signal),
-    or nodes, but when referring to nodes it is always a shortcut to the `run`/`ran`
-    channels.
-
-    The `run()` method returns a representation of the node output (possible a futures
-    object, if the node is running on an executor), and consequently the `pull`,
-    `execute`, and `__call__` shortcuts to `run` also return the same thing.
-
-    Invoking the `run` method (or one of its aliases) of an already instantiated node
-    allows its input channels to be updated using keyword arguments corresponding to
-    the channel labels, performing a batch-update of all supplied input and then
-    proceeding.
-    As such, _if_ the run invocation updates the input values some other way, these
-    supplied values will get overwritten.
-
-    Nodes have a status, which is currently represented by the `running` and `failed`
-    boolean flag attributes.
-    These are updated automatically when the node's operation is invoked, e.g. with
-    `run`, `execute`, `pull`, or by calling the node instance.
-
-    Nodes can be run on the main python process that owns them, or by setting their
-    `executor` attribute to `True`, in which case a
-    `pyiron_workflow.executors.CloudPickleExecutor` will be used to run the node on a
-    new process on a single core (in the future, the interface will look a little
-    different and you'll have more options than that).
-    In case they are run with an executor, their `future` attribute will be populated
-    with the resulting future object.
-    WARNING: Executors are currently only working when the node executable function does
-        not use `self`.
+    They have inputs and outputs to interface with the wider world, and perform some
+    operation.
+    By connecting multiple nodes' inputs and outputs together, computational graphs can
+    be formed.
+    These can be collected under a parent, such that new graphs can be composed of
+    one or more sub-graphs.
+
+    Promises:
+    - Nodes perform some computation, but this is delayed and won't happen until asked
+        for (the nature of the computation is left to child classes).
+    - Nodes have input and output for interfacing with the outside world
+        - Which can be connected to output/input to form a computation graph
+        - These have a data flavour, to control the flow of information
+        - And a signal flavour, to control the flow of execution
+            - Execution flows can be specified manually, but in the case of data flows
+                which form directed acyclic graphs (DAGs), this can be automated
+    - When running their computation, nodes may or may not:
+        - First update their input data values using kwargs
+            - (Note that since this happens first, if the "fetching" step later occurs,
+                any values provided here will get overwritten by data that is flowing
+                on the data graph)
+        - Then instruct their parent node to ask all of the nodes
+            upstream in its data connections to run (recursively to the parent-most
+            super-graph)
+        - Ask for the nodes upstream of them to run (in the local context of their own
+            parent)
+        - Fetch the latest output data, prioritizing the first actual data among
+            each input's connections
+        - Check if they are ready to run, i.e.
+            - Status is neither running nor failed
+            - Input is all ready, i.e. each input has data and that data is
+                commensurate with type hints (if any)
+        - Submit their computation to an executor for remote processing, or ignore any
+            executor suggested and force the computation to be local (i.e. in the same
+            python process that owns the node)
+            - If computation is non-local, the node status will stay running and the
+                futures object returned by the executor will be accessible
+        - Emit their run-completed output signal to trigger runs in nodes downstream in
+            the execution flow
+    - Running the node (and all aliases of running) returns a representation of the
+        data held by the output channels
+    - If an error is encountered _after_ reaching the state of actually computing the
+        node's task, the status will get set to failure
+    - Nodes have a label by which they are identified
+    - Nodes may open a working directory related to their label, their parent(age) and
+        the python process working directory
+
+    WARNING: Executors are currently only working when the node executable function
+        does not use `self`.
+
     NOTE: Executors are only allowed in a "push" paradigm, and you will get an
     exception if you try to `pull` and one of the upstream nodes uses an executor.
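The `>` execution-flow sugar promised in the docstring can be illustrated with a toy, self-contained stand-in (this is not pyiron_workflow code; `ToyNode` and its attributes are invented for illustration):

```python
# Toy illustration of the `a > b` sugar: it wires a's "ran" signal to b's
# "run" input, so running a triggers b downstream.


class ToyNode:
    def __init__(self, name):
        self.name = name
        self.downstream = []  # nodes triggered after this one runs
        self.log = []

    def __gt__(self, other):
        self.downstream.append(other)  # stand-in for ran -> run connection
        return True

    def run(self, log=None):
        log = self.log if log is None else log
        log.append(self.name)  # do the "work"
        for node in self.downstream:  # emit the "ran" signal
            node.run(log)
        return log


a, b = ToyNode("a"), ToyNode("b")
a > b  # syntactic sugar for connecting a's ran-signal to b's run-signal
assert a.run() == ["a", "b"]
```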
 
@@ -133,7 +126,12 @@ class Node(HasToDict, ABC):
     `process_run_result` once `on_run` finishes.
     They may optionally add additional signal channels to the signals IO.
 
-    # TODO: Everything with (de)serialization for storage
+    # TODO:
+        - Everything with (de)serialization for storage
+        - Integration with more powerful tools for remote execution (anything obeying
+            the standard interface of a `submit` method taking the callable and
+            arguments and returning a futures object should work, as long as it can
+            handle serializing dynamically defined objects)
 
     Attributes:
         connected (bool): Whether _any_ of the IO (including signals) are connected.
@@ -168,6 +166,7 @@ class Node(HasToDict, ABC):
     Methods:
         __call__: An alias for `pull` that aggressively runs upstream nodes even
             _outside_ the local scope (i.e. runs parents' dependencies as well).
+        (de)activate_strict_hints: Recursively (de)activate strict hints among data IO.
         disconnect: Remove all connections, including signals.
         draw: Use graphviz to visualize the node, its IO and, if composite in nature,
             its internal structure.
@@ -180,6 +179,8 @@ class Node(HasToDict, ABC):
             downstream). "Upstream" may optionally break out of the local scope to run
             parent nodes' dependencies as well (all the way until the parent-most
             object is encountered).
+        replace_with: If the node belongs to a parent, attempts to replace itself in
+            that parent with a new provided node.
         run: Run the node function from `on_run`. Handles status automatically. Various
             execution options are available as boolean flags.
         set_input_values: Allows input channels' values to be updated without any
@@ -204,7 +205,7 @@ def __init__(
         """
         super().__init__(*args, **kwargs)
         self.label: str = label
-        self.parent = parent
+        self._parent = None
         if parent is not None:
             parent.add(self)
         self.running = False
@@ -256,6 +257,17 @@ def process_run_result(self, run_output):
             run_output: The results of a `self.on_run(self.run_args)` call.
         """
 
+    @property
+    def parent(self) -> Composite | None:
+        return self._parent
+
+    @parent.setter
+    def parent(self, new_parent: Composite | None) -> None:
+        raise ValueError(
+            "Please change parentage by adding/removing the node to/from the relevant"
+            "parent"
+        )
+
     def run(
         self,
         run_data_tree: bool = False,
@@ -618,33 +630,16 @@ def __str__(self):
             f"{str(self.signals)}"
         )
 
-    def connect_output_signal(self, signal: OutputSignal):
+    def _connect_output_signal(self, signal: OutputSignal):
         self.signals.input.run.connect(signal)
 
     def __gt__(self, other: InputSignal | Node):
         """
         Allows users to connect run and ran signals like: `first_node > second_node`.
         """
-        other.connect_output_signal(self.signals.output.ran)
+        other._connect_output_signal(self.signals.output.ran)
         return True
 
-    def get_parent_proximate_to(self, composite: Composite) -> Composite | None:
-        parent = self.parent
-        while parent is not None and parent.parent is not composite:
-            parent = parent.parent
-        return parent
-
-    def get_first_shared_parent(self, other: Node) -> Composite | None:
-        our, their = self, other
-        while our.parent is not None:
-            while their.parent is not None:
-                if our.parent is their.parent:
-                    return our.parent
-                their = their.parent
-            our = our.parent
-            their = other
-        return None
-
     def copy_io(
         self,
         other: Node,
@@ -802,7 +797,7 @@ def replace_with(self, other: Node | type[Node]):
 
     def __getstate__(self):
         state = self.__dict__
-        state["parent"] = None
+        state["_parent"] = None
         # I am not at all confident that removing the parent here is the _right_
         # solution.
         # In order to run composites on a parallel process, we ship off just the nodes
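The `__getstate__` change above drops the parent reference before serialization. A minimal sketch of that strategy, with illustrative names (`Child` is not pyiron_workflow API); note the sketch copies the `__dict__` so the live object keeps its parent:

```python
# Sketch: strip a possibly unpicklable/cyclic parent reference from the
# pickled state, without mutating the live instance.


class Child:
    def __init__(self, parent=None):
        self._parent = parent

    def __getstate__(self):
        state = self.__dict__.copy()  # copy so the live object is untouched
        state["_parent"] = None  # the parent does not travel with the state
        return state


parent = object()
child = Child(parent=parent)
assert child.__getstate__()["_parent"] is None
assert child._parent is parent  # original reference survives
```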
diff --git a/pyiron_workflow/workflow.py b/pyiron_workflow/workflow.py
index 2b7611a6..789f57c9 100644
--- a/pyiron_workflow/workflow.py
+++ b/pyiron_workflow/workflow.py
@@ -46,6 +46,10 @@ class Workflow(Composite):
     you should consider reformulating it as a `Macro`, which operates somewhat more
     efficiently.
 
+    Promises (in addition to parent class promises):
+    - Workflows are living; their IO always reflects the current state of child nodes
+    - Workflows are parent-most objects; they cannot be a sub-graph of a larger graph
+
     Examples:
         We allow adding nodes to workflows in five equivalent ways:
         >>> from pyiron_workflow.workflow import Workflow
@@ -253,11 +257,11 @@ def deserialize(self, source):
         raise NotImplementedError
 
     @property
-    def parent(self) -> None:
+    def _parent(self) -> None:
         return None
 
-    @parent.setter
-    def parent(self, new_parent: None):
+    @_parent.setter
+    def _parent(self, new_parent: None):
         # Currently workflows are not allowed to have a parent -- maybe we want to
         # change our minds on this in the future? If we do, we can just expose `parent`
         # as a kwarg and roll back this private var/property/setter protection and let
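The guarded-parent idea in the hunks above (a read-only `parent` property, with mutation routed through explicit add/remove methods) can be sketched generically; `Owned` and `Owner` are illustrative names, not pyiron_workflow API:

```python
# Sketch: expose parentage read-only; only the owner's interface mutates it.


class Owned:
    def __init__(self):
        self._parent = None

    @property
    def parent(self):
        return self._parent


class Owner:
    def add(self, child):
        child._parent = self  # mutation only via the owner's interface


node = Owned()
assert node.parent is None
Owner().add(node)
assert node.parent is not None
try:
    node.parent = None  # direct assignment is rejected
except AttributeError:
    pass
else:
    raise AssertionError("expected AttributeError")
```

A property with no setter raises `AttributeError` on assignment, which is what makes the attribute effectively read-only from the outside.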
diff --git a/tests/unit/test_channels.py b/tests/unit/test_channels.py
index c6861954..2c694b42 100644
--- a/tests/unit/test_channels.py
+++ b/tests/unit/test_channels.py
@@ -2,7 +2,8 @@
 from sys import version_info
 
 from pyiron_workflow.channels import (
-    InputData, OutputData, InputSignal, OutputSignal, NotData, ChannelConnectionError
+    Channel, InputData, OutputData, InputSignal, OutputSignal, NotData,
+    ChannelConnectionError
 )
 
 
@@ -16,117 +17,191 @@ def update(self):
         self.foo.append(self.foo[-1] + 1)
 
 
+@skipUnless(version_info[0] == 3 and version_info[1] >= 10, "Only supported for 3.10+")
+class TestChannel(TestCase):
+
+    class InputChannel(Channel):
+        """Just to de-abstract the base class"""
+        def __str__(self):
+            return "non-abstract input"
+
+        @property
+        def generic_type(self) -> type[Channel]:
+            return Channel
+
+    class OutputChannel(Channel):
+        """Just to de-abstract the base class"""
+        def __str__(self):
+            return "non-abstract output"
+
+        @property
+        def generic_type(self) -> type[Channel]:
+            return Channel
+
+    def setUp(self) -> None:
+        self.inp = self.InputChannel("inp", DummyNode())
+        self.out = self.OutputChannel("out", DummyNode())
+        self.out2 = self.OutputChannel("out2", DummyNode())
+
+    def test_connection_validity(self):
+        with self.assertRaises(
+            TypeError,
+            msg="Can't connect to non-channels"
+        ):
+            self.inp.connect("not a node")
+
+        with self.assertRaises(
+            ChannelConnectionError,
+            msg="Can't connect non-conjugate pairs"
+        ):
+            self.inp.connect(self.InputChannel("also_input", DummyNode()))
+
+        self.inp.connect(self.out)
+        # A conjugate pair should work fine
+
+    def test_length(self):
+        self.inp.connect(self.out)
+        self.out2.connect(self.inp)
+        self.assertEqual(
+            2,
+            len(self.inp),
+            msg="Promised that channel length was number of connections"
+        )
+        self.assertEqual(
+            1,
+            len(self.out),
+            msg="Promised that channel length was number of connections"
+        )
+
+    def test_connection_reflexivity(self):
+        self.inp.connect(self.out)
+
+        self.assertIs(
+            self.inp.connections[0],
+            self.out,
+            msg="Connecting a conjugate pair should work fine"
+        )
+        self.assertIs(
+            self.out.connections[0],
+            self.inp,
+            msg="Promised connection to be reflexive"
+        )
+        self.out.disconnect_all()
+        self.assertListEqual(
+            [],
+            self.inp.connections,
+            msg="Promised disconnection to be reflexive too"
+        )
+
+        self.out.connect(self.inp)
+        self.assertIs(
+            self.inp.connections[0],
+            self.out,
+            msg="Connecting should work in either direction"
+        )
+
+    def test_connect_and_disconnect(self):
+        self.inp.connect(self.out, self.out2)
+        # Should allow multiple (dis)connections at once
+        disconnected = self.inp.disconnect(self.out2, self.out)
+        self.assertListEqual(
+            [(self.inp, self.out2), (self.inp, self.out)],
+            disconnected,
+            msg="Broken connection pairs should be returned in the order they were "
+                "broken"
+        )
+
+    def test_iterability(self):
+        self.inp.connect(self.out)
+        self.out2.connect(self.inp)
+        for i, conn in enumerate(self.inp):
+            self.assertIs(
+                self.inp.connections[i],
+                conn,
+                msg="Promised channels to be iterable over connections"
+            )
+
+
 @skipUnless(version_info[0] == 3 and version_info[1] >= 10, "Only supported for 3.10+")
 class TestDataChannels(TestCase):
 
     def setUp(self) -> None:
-        self.ni1 = InputData(label="numeric", node=DummyNode(), default=1, type_hint=int | float)
-        self.ni2 = InputData(label="numeric", node=DummyNode(), default=1, type_hint=int | float)
-        self.no = OutputData(label="numeric", node=DummyNode(), default=0, type_hint=int | float)
-        self.no_empty = OutputData(label="not_data", node=DummyNode(), type_hint=int | float)
+        self.ni1 = InputData(
+            label="numeric", node=DummyNode(), default=1, type_hint=int|float
+        )
+        self.ni2 = InputData(
+            label="numeric", node=DummyNode(), default=1, type_hint=int|float
+        )
+        self.no = OutputData(
+            label="numeric", node=DummyNode(), default=0, type_hint=int|float
+        )
+        self.no_empty = OutputData(
+            label="not_data", node=DummyNode(), type_hint=int|float
+        )
 
         self.si = InputData(label="list", node=DummyNode(), type_hint=list)
-        self.so1 = OutputData(label="list", node=DummyNode(), default=["foo"], type_hint=list)
-        self.so2 = OutputData(label="list", node=DummyNode(), default=["foo"], type_hint=list)
-
-        self.unhinted = InputData(label="unhinted", node=DummyNode)
+        self.so1 = OutputData(
+            label="list", node=DummyNode(), default=["foo"], type_hint=list
+        )
 
     def test_mutable_defaults(self):
+        so2 = OutputData(
+            label="list", node=DummyNode(), default=["foo"], type_hint=list
+        )
         self.so1.default.append("bar")
         self.assertEqual(
-            len(self.so2.default),
+            len(so2.default),
             len(self.so1.default) - 1,
-            msg="Mutable defaults should avoid sharing between instances"
+            msg="Mutable defaults should avoid sharing between different instances"
         )
 
-    def test_connections(self):
-
-        with self.subTest("Test connection reflexivity and value updating"):
-            self.assertEqual(self.no.value, 0)
-            self.ni1.connect(self.no)
-            self.assertIn(self.no, self.ni1.connections)
-            self.assertIn(self.ni1, self.no.connections)
-            self.assertNotEqual(self.no.value, self.ni1.value)
-            self.ni1.fetch()
-            self.assertEqual(self.no.value, self.ni1.value)
-
-        with self.subTest("Test disconnection"):
-            disconnected = self.ni2.disconnect(self.no)
-            self.assertEqual(
-                len(disconnected),
-                0,
-                msg="There were no connections to begin with, nothing should be there"
-            )
-            disconnected = self.ni1.disconnect(self.no)
-            self.assertEqual(
-                [], self.ni1.connections, msg="No connections should be left"
-            )
-            self.assertEqual(
-                [],
-                self.no.connections,
-                msg="Disconnection should also have been reflexive"
-            )
-            self.assertListEqual(
-                disconnected,
-                [(self.ni1, self.no)],
-                msg="Expected a list of the disconnected pairs."
-            )
+    def test_fetch(self):
+        self.no.value = NotData
+        self.ni1.value = 1
 
-        with self.subTest("Test multiple connections"):
-            self.no.connect(self.ni1, self.ni2)
-            self.assertEqual(2, len(self.no.connections), msg="Should connect to all")
+        self.ni1.connect(self.no_empty)
+        self.ni1.connect(self.no)
 
-        with self.subTest("Test iteration"):
-            self.assertTrue(all([con in self.no.connections for con in self.no]))
+        self.assertEqual(
+            self.ni1.value,
+            1,
+            msg="Data should not be getting pushed on connection"
+        )
 
-        with self.subTest("Data should update on fetch"):
-            self.ni1.disconnect_all()
+        self.ni1.fetch()
+        self.assertEqual(
+            self.ni1.value,
+            1,
+            msg="NotData values should not be getting pulled, so no update expected"
+        )
 
-            self.no.value = NotData
-            self.ni1.value = 1
+        self.no.value = 3
+        self.ni1.fetch()
+        self.assertEqual(
+            self.ni1.value,
+            3,
+            msg="Data fetch should to first connected value that's actually data,"
+                "in this case skipping over no_empty"
+        )
 
-            self.ni1.connect(self.no_empty)
-            self.ni1.connect(self.no)
-            self.assertEqual(
-                self.ni1.value,
-                1,
-                msg="Data should not be getting pushed on connection"
-            )
-            self.ni1.fetch()
-            self.assertEqual(
-                self.ni1.value,
-                1,
-                msg="NotData values should not be getting pulled"
-            )
-            self.no.value = 3
-            self.ni1.fetch()
-            self.assertEqual(
-                self.ni1.value,
-                3,
-                msg="Data fetch should to first connected value that's actually data,"
-                    "in this case skipping over no_empty"
-            )
-            self.no_empty.value = 4
-            self.ni1.fetch()
-            self.assertEqual(
-                self.ni1.value,
-                4,
-                msg="As soon as no_empty actually has data, it's position as 0th "
-                    "element in the connections list should give it priority"
-            )
+        self.no_empty.value = 4
+        self.ni1.fetch()
+        self.assertEqual(
+            self.ni1.value,
+            4,
+            msg="As soon as no_empty actually has data, it's position as 0th "
+                "element in the connections list should give it priority"
+        )
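The fetch behaviour exercised by `test_fetch` can be summarized as: take the first connection whose value is real data, skipping the not-data sentinel, and otherwise keep the current value. A toy sketch (names here are illustrative, not the pyiron_workflow API):

```python
NOT_DATA = object()  # stand-in sentinel for "no data yet"


def fetch(current_value, connections):
    """Return the first real value among connections, else keep the current one."""
    for value in connections:
        if value is not NOT_DATA:
            return value
    return current_value


assert fetch(1, [NOT_DATA, NOT_DATA]) == 1  # nothing to pull, keep old value
assert fetch(1, [NOT_DATA, 3]) == 3  # skip the empty channel, take first data
assert fetch(1, [4, 3]) == 4  # the 0th connection has priority once populated
```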
 
-    def test_connection_validity_tests(self):
+    def test_connection_validity(self):
         self.ni1.type_hint = int | float | bool  # Override with a larger set
         self.ni2.type_hint = int  # Override with a smaller set
 
-        with self.assertRaises(TypeError):
-            self.ni1.connect("Not a channel at all")
-
         self.no.connect(self.ni1)
         self.assertIn(
             self.no,
             self.ni1.connections,
-            "Input types should be allowed to be a super-set of output types"
+            msg="Input types should be allowed to be a super-set of output types"
         )
 
         with self.assertRaises(
@@ -146,7 +221,7 @@ def test_connection_validity_tests(self):
         self.assertIn(
             self.so1,
             self.ni2.connections,
-            "With strict connections turned off, we should allow type-violations"
+            msg="With strict connections turned off, we should allow type-violations"
         )
 
     def test_copy_connections(self):
@@ -189,24 +264,53 @@ def test_value_receiver(self):
             msg="Value-linked nodes should automatically get new values"
         )
 
+        self.ni2.value = 3
+        self.assertEqual(
+            self.ni1.value,
+            new_value,
+            msg="Coupling is uni-directional, the partner should not push values back"
+        )
+
+        with self.assertRaises(
+            TypeError,
+            msg="Only data channels of the same class are valid partners"
+        ):
+            self.ni1.value_receiver = self.no
+
+        with self.assertRaises(
+            ValueError,
+            msg="Must not couple to self to avoid infinite recursion"
+        ):
+            self.ni1.value_receiver = self.ni1
+
         with self.assertRaises(
             ValueError,
             msg="Linking should obey type hint requirements",
         ):
             self.ni1.value_receiver = self.si
 
-        self.si.strict_hints = False
-        self.ni1.value_receiver = self.si  # Should work fine if the receiver is not
-        # strictly checking hints
+        with self.subTest("Value receivers avoiding type checking"):
+            self.si.strict_hints = False
+            self.ni1.value_receiver = self.si  # Should work fine if the receiver is not
+            # strictly checking hints
 
-        self.ni1.value_receiver = self.unhinted
-        self.unhinted.value_receiver = self.ni2
-        # Should work fine if either is unhinted
+            unhinted = InputData(label="unhinted", node=DummyNode())
+            self.ni1.value_receiver = unhinted
+            unhinted.value_receiver = self.ni2
+            # Should work fine if either lacks a hint
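The uni-directional value link tested in `test_value_receiver` can be sketched with a toy class (not pyiron_workflow code; `LinkedValue` is an invented name): writes to the sender propagate to the receiver, never back.

```python
# Toy sketch of a one-way value link: setting the sender's value pushes it
# downstream to the receiver; the receiver never pushes back.


class LinkedValue:
    def __init__(self, value=None):
        self._value = value
        self.value_receiver = None

    @property
    def value(self):
        return self._value

    @value.setter
    def value(self, new):
        self._value = new
        if self.value_receiver is not None:
            self.value_receiver.value = new  # push downstream only


a, b = LinkedValue(1), LinkedValue(2)
a.value_receiver = b
a.value = 3
assert b.value == 3  # the receiver follows the sender
b.value = 4
assert a.value == 3  # but the sender never follows the receiver
```

This sketch also shows why self-coupling must be forbidden, as the test above asserts: `a.value_receiver = a` would recurse without bound on the next assignment.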
 
     def test_value_assignment(self):
         self.ni1.value = 2  # Should be fine when value matches hint
         self.ni1.value = NotData  # Should be able to clear the data
 
+        self.ni1.node.running = True
+        with self.assertRaises(
+            RuntimeError,
+            msg="Input data should be locked while its node runs"
+        ):
+            self.ni1.value = 3
+        self.ni1.node.running = False
+
         with self.assertRaises(
             TypeError,
             msg="Should not be able to take values of the wrong type"
@@ -241,44 +345,6 @@ def test_ready(self):
         self.ni1._value = "Not numeric at all"  # Bypass type checking
         self.assertFalse(self.ni1.ready)
 
-    def test_input_coupling(self):
-        self.assertNotEqual(
-            self.ni2.value,
-            2,
-            msg="Ensure we start from a setup that the next test is meaningful"
-        )
-        self.ni1.value = 2
-        self.ni1.value_receiver = self.ni2
-        self.assertEqual(
-            self.ni2.value,
-            2,
-            msg="Coupled value should get updated on coupling"
-        )
-        self.ni1.value = 3
-        self.assertEqual(
-            self.ni2.value,
-            3,
-            msg="Coupled value should get updated after partner update"
-        )
-        self.ni2.value = 4
-        self.assertEqual(
-            self.ni1.value,
-            3,
-            msg="Coupling is uni-directional, the partner should not push values back"
-        )
-
-        with self.assertRaises(
-            TypeError,
-            msg="Only input data channels are valid partners"
-        ):
-            self.ni1.value_receiver = self.no
-
-        with self.assertRaises(
-            ValueError,
-            msg="Must not couple to self to avoid infinite recursion"
-        ):
-            self.ni1.value_receiver = self.ni1
-
 
 class TestSignalChannels(TestCase):
     def setUp(self) -> None:
diff --git a/tests/unit/test_composite.py b/tests/unit/test_composite.py
new file mode 100644
index 00000000..a26cf025
--- /dev/null
+++ b/tests/unit/test_composite.py
@@ -0,0 +1,603 @@
+from sys import version_info
+import unittest
+
+from bidict import ValueDuplicationError
+
+from pyiron_workflow._tests import ensure_tests_in_python_path
+from pyiron_workflow.channels import NotData
+from pyiron_workflow.composite import Composite
+from pyiron_workflow.io import Outputs, Inputs
+from pyiron_workflow.topology import CircularDataFlowError
+
+
+def plus_one(x: int = 0) -> int:
+    y = x + 1
+    return y
+
+
+class AComposite(Composite):
+    def __init__(self, label):
+        super().__init__(label=label)
+
+    def _get_linking_channel(self, child_reference_channel, composite_io_key):
+        return child_reference_channel  # IO by reference
+
+    @property
+    def inputs(self) -> Inputs:
+        return self._build_inputs()  # Dynamic IO reflecting current children
+
+    @property
+    def outputs(self) -> Outputs:
+        return self._build_outputs()  # Dynamic IO reflecting current children
+
+
+@unittest.skipUnless(version_info[0] == 3 and version_info[1] >= 10, "Only supported for 3.10+")
+class TestComposite(unittest.TestCase):
+    @classmethod
+    def setUpClass(cls) -> None:
+        ensure_tests_in_python_path()
+        super().setUpClass()
+
+    def setUp(self) -> None:
+        self.comp = AComposite("my_composite")
+        super().setUp()
+
+    def test_node_decorator_access(self):
+        @Composite.wrap_as.function_node("y")
+        def foo(x: int = 0) -> int:
+            return x + 1
+
+        from_class = foo()
+        self.assertEqual(from_class.run(), 1, msg="Node should be fully functioning")
+        self.assertIsNone(
+            from_class.parent,
+            msg="Wrapping from the class should give no parent"
+        )
+
+        comp = self.comp
+        @comp.wrap_as.function_node("y")
+        def bar(x: int = 0) -> int:
+            return x + 2
+
+        from_instance = bar()
+        self.assertEqual(from_instance.run(), 2, msg="Node should be fully functioning")
+        self.assertIsNone(
+            from_instance.parent,
+            msg="Wrappers are not creators, wrapping from the instance makes no "
+                "difference"
+        )
+
+    def test_creator_access_and_registration(self):
+        self.comp.register("demo", "static.demo_nodes")
+
+        # Test invocation
+        self.comp.create.demo.OptionallyAdd(label="by_add")
+        # Test invocation with attribute assignment
+        self.comp.by_assignment = self.comp.create.demo.OptionallyAdd()
+        node = AComposite.create.demo.OptionallyAdd()
+
+        self.assertSetEqual(
+            set(self.comp.nodes.keys()),
+            set(["by_add", "by_assignment"]),
+            msg=f"Expected one node label generated automatically from the class and "
+                f"the other from the attribute assignment, but got {self.comp.nodes.keys()}"
+        )
+        self.assertIsNone(
+            node.parent,
+            msg="Creating from the class directly should not parent the created nodes"
+        )
+
+    def test_node_addition(self):
+        # Validate the four ways to add a node
+        self.comp.add(Composite.create.Function(plus_one, label="foo"))
+        self.comp.create.Function(plus_one, label="bar")
+        self.comp.baz = self.comp.create.Function(plus_one, label="whatever_baz_gets_used")
+        Composite.create.Function(plus_one, label="qux", parent=self.comp)
+        self.assertListEqual(
+            list(self.comp.nodes.keys()),
+            ["foo", "bar", "baz", "qux"],
+            msg="Expected every above syntax to add a node OK"
+        )
+        self.comp.boa = self.comp.qux
+        self.assertListEqual(
+            list(self.comp.nodes.keys()),
+            ["foo", "bar", "baz", "boa"],
+            msg="Reassignment should remove the original instance"
+        )
+
+    def test_node_access(self):
+        node = Composite.create.Function(plus_one)
+        self.comp.child = node
+        self.assertIs(
+            self.comp.child,
+            node,
+            msg="Access should be possible by attribute"
+        )
+        self.assertIs(
+            self.comp.nodes.child,
+            node,
+            msg="Access should be possible by attribute on nodes collection"
+        )
+        self.assertIs(
+            self.comp.nodes["child"],
+            node,
+            msg="Access should be possible by item on nodes collection"
+        )
+
+        for n in self.comp:
+            self.assertIs(
+                node,
+                n,
+                msg="Should be able to iterate over the (one and only) child node"
+            )
+
+    def test_node_removal(self):
+        self.comp.owned = Composite.create.Function(plus_one)
+        node = Composite.create.Function(plus_one)
+        self.comp.foo = node
+        # Add it to starting nodes manually, otherwise it's only there at run time
+        self.comp.starting_nodes = [self.comp.foo]
+        # Connect it inside the composite
+        self.comp.foo.inputs.x = self.comp.owned.outputs.y
+
+        disconnected = self.comp.remove(node)
+        self.assertIsNone(node.parent, msg="Removal should de-parent")
+        self.assertFalse(node.connected, msg="Removal should disconnect")
+        self.assertListEqual(
+            [(node.inputs.x, self.comp.owned.outputs.y)],
+            disconnected,
+            msg="Removal should return destroyed connections"
+        )
+        self.assertListEqual(
+            self.comp.starting_nodes,
+            [],
+            msg="Removal should also remove from starting nodes"
+        )
+
+        node_owned = self.comp.owned
+        disconnections = self.comp.remove(node_owned.label)
+        self.assertEqual(
+            node_owned.parent,
+            None,
+            msg="Should be able to remove nodes by label as well as by object"
+        )
+        self.assertListEqual(
+            [],
+            disconnections,
+            msg="The removed node should have no connections left"
+        )
+
+    def test_label_uniqueness(self):
+        self.comp.foo = Composite.create.Function(plus_one)
+
+        self.comp.strict_naming = True
+        # Validate name preservation for each node addition path
+        with self.assertRaises(AttributeError, msg="A child labeled 'foo' already exists"):
+            self.comp.add(self.comp.create.Function(plus_one, label="foo"))
+
+        with self.assertRaises(AttributeError, msg="A child labeled 'foo' already exists"):
+            self.comp.create.Function(plus_one, label="foo")
+
+        with self.assertRaises(
+            AttributeError,
+            msg="The provided label is ok, but then assigning to foo should give "
+                "trouble since that name is already occupied"
+        ):
+            self.comp.foo = Composite.create.Function(plus_one, label="whatever")
+
+        with self.assertRaises(AttributeError, msg="A child labeled 'foo' already exists"):
+            Composite.create.Function(plus_one, label="foo", parent=self.comp)
+
+        with self.assertRaises(ValueError, msg="Parentage can't be set directly"):
+            node = Composite.create.Function(plus_one, label="foo")
+            node.parent = self.comp
+
+        with self.subTest("Make sure trivial re-assignment has no impact"):
+            original_foo = self.comp.foo
+            n_nodes = len(self.comp.nodes)
+            self.comp.foo = original_foo
+            self.assertIs(
+                original_foo,
+                self.comp.foo,
+                msg="Reassigning a node to the same name should have no impact",
+            )
+            self.assertEqual(
+                n_nodes,
+                len(self.comp.nodes),
+                msg="Reassigning a node to the same name should have no impact",
+            )
+
+        self.comp.strict_naming = False
+        self.comp.add(Composite.create.Function(plus_one, label="foo"))
+        self.assertEqual(
+            2,
+            len(self.comp),
+            msg="Without strict naming, we should be able to add to an existing name"
+        )
+        self.assertListEqual(
+            ["foo", "foo0"],
+            list(self.comp.nodes.keys()),
+            msg="When adding a node with an existing name and relaxed naming, the new "
+                "node should get an index on its label so each label is still unique"
+        )
+
+    def test_singular_ownership(self):
+        comp1 = AComposite("one")
+        comp1.create.Function(plus_one, label="node1")
+        node2 = AComposite.create.Function(
+            plus_one, label="node2", parent=comp1, x=comp1.node1.outputs.y
+        )
+        self.assertTrue(node2.connected, msg="Sanity check that node connection works")
+
+        comp2 = AComposite("two")
+        with self.assertRaises(ValueError, msg="Can't belong to two parents"):
+            comp2.add(node2)
+        comp1.remove(node2)
+        comp2.add(node2)
+        self.assertEqual(
+            node2.parent,
+            comp2,
+            msg="Freed nodes should be able to join other parents"
+        )
+
+    def test_replace(self):
+        n1 = Composite.create.SingleValue(plus_one)
+        n2 = Composite.create.SingleValue(plus_one)
+        n3 = Composite.create.SingleValue(plus_one)
+
+        @Composite.wrap_as.function_node(("y", "minus"))
+        def x_plus_minus_z(x: int = 0, z=2) -> tuple[int, int]:
+            """
+            A commensurate but different node: has _more_ than the necessary channels,
+            but old channels are all there with the same hints
+            """
+            return x + z, x - z
+
+        replacement = x_plus_minus_z()
+
+        @Composite.wrap_as.single_value_node("y")
+        def different_input_channel(z: int = 0) -> int:
+            return z + 10
+
+        @Composite.wrap_as.single_value_node("z")
+        def different_output_channel(x: int = 0) -> int:
+            return x + 100
+
+        self.comp.n1 = n1
+        self.comp.n2 = n2
+        self.comp.n3 = n3
+        self.comp.n2.inputs.x = self.comp.n1
+        self.comp.n3.inputs.x = self.comp.n2
+        self.comp.inputs_map = {"n1__x": "x"}
+        self.comp.outputs_map = {"n3__y": "y"}
+        self.comp.set_run_signals_to_dag_execution()
+
+        with self.subTest("Verify success cases"):
+            self.assertEqual(3, self.comp.run().y, msg="Sanity check")
+
+            self.comp.replace(n1, replacement)
+            out = self.comp.run(x=0)
+            self.assertEqual(
+                (0+2) + 1 + 1, out.y, msg="Should be able to replace by instance"
+            )
+            self.assertEqual(
+                0 - 2, out.n1__minus, msg="Replacement output should also appear"
+            )
+            self.comp.replace(replacement, n1)
+            self.assertFalse(
+                replacement.connected, msg="Replaced nodes should be disconnected"
+            )
+            self.assertIsNone(
+                replacement.parent, msg="Replaced nodes should be orphaned"
+            )
+
+            self.comp.replace("n2", replacement)
+            out = self.comp.run(x=0)
+            self.assertEqual(
+                (0 + 1) + 2 + 1, out.y, msg="Should be able to replace by label"
+            )
+            self.assertEqual(1 - 2, out.n2__minus)
+            self.comp.replace(replacement, n2)
+
+            self.comp.replace(n3, x_plus_minus_z)
+            out = self.comp.run(x=0)
+            self.assertEqual(
+                (0 + 1) + 2 + 1, out.y, msg="Should be able to replace with a class"
+            )
+            self.assertEqual(2 - 2, out.n3__minus)
+            self.assertIsNot(
+                self.comp.n3,
+                replacement,
+                msg="Sanity check -- when replacing with class, a _new_ instance "
+                    "should be created"
+            )
+            self.comp.replace(self.comp.n3, n3)
+
+            self.comp.n1 = x_plus_minus_z
+            self.assertEqual(
+                (0+2) + 1 + 1,
+                self.comp.run(x=0).y,
+                msg="Assigning a new _class_ to an existing node should be a shortcut "
+                    "for replacement"
+            )
+            self.comp.replace(self.comp.n1, n1)  # Return to original state
+
+            self.comp.n1 = different_input_channel
+            self.assertEqual(
+                (0 + 10) + 1 + 1,
+                self.comp.run(n1__z=0).y,
+                msg="Different IO should be compatible as long as what's missing is "
+                    "not connected"
+            )
+            self.comp.replace(self.comp.n1, n1)
+
+            self.comp.n3 = different_output_channel
+            self.assertEqual(
+                (0 + 1) + 1 + 100,
+                self.comp.run(x=0).n3__z,
+                msg="Different IO should be compatible as long as what's missing is "
+                    "not connected"
+            )
+            self.comp.replace(self.comp.n3, n3)
+
+        with self.subTest("Verify failure cases"):
+            self.assertEqual(3, self.comp.run().y, msg="Sanity check")
+
+            another_comp = AComposite("another")
+            another_node = x_plus_minus_z(parent=another_comp)
+
+            with self.assertRaises(
+                ValueError,
+                msg="Should fail when replacement has a parent"
+            ):
+                self.comp.replace(self.comp.n1, another_node)
+
+            another_comp.remove(another_node)
+            another_node.inputs.x = replacement.outputs.y
+            with self.assertRaises(
+                ValueError,
+                msg="Should fail when replacement is connected"
+            ):
+                self.comp.replace(self.comp.n1, another_node)
+
+            another_node.disconnect()
+            with self.assertRaises(
+                ValueError,
+                msg="Should fail if the node being replaced isn't a child"
+            ):
+                self.comp.replace(replacement, another_node)
+
+            @Composite.wrap_as.single_value_node("y")
+            def wrong_hint(x: float = 0) -> float:
+                return x + 1.1
+
+            with self.assertRaises(
+                TypeError,
+                msg="Should not be able to replace with the wrong type hints"
+            ):
+                self.comp.n1 = wrong_hint
+
+            with self.assertRaises(
+                AttributeError,
+                msg="Should not be able to replace with any missing connected channels"
+            ):
+                self.comp.n2 = different_input_channel
+
+            with self.assertRaises(
+                AttributeError,
+                msg="Should not be able to replace with any missing connected channels"
+            ):
+                self.comp.n2 = different_output_channel
+
+            self.assertEqual(
+                3,
+                self.comp.run().y,
+                msg="Failed replacements should always restore the original state "
+                    "cleanly"
+            )
+
+    def test_working_directory(self):
+        self.comp.plus_one = Composite.create.Function(plus_one)
+        self.assertTrue(
+            str(self.comp.plus_one.working_directory.path).endswith(self.comp.plus_one.label),
+            msg="Child nodes should have their own working directories nested inside"
+        )
+        self.comp.working_directory.delete()  # Clean up
+
+    def test_length(self):
+        self.comp.child = Composite.create.Function(plus_one)
+        l1 = len(self.comp)
+        self.comp.child2 = Composite.create.Function(plus_one)
+        self.assertEqual(
+            l1 + 1,
+            len(self.comp),
+            msg="Expected length to count the number of children"
+        )
+
+    def test_run(self):
+        self.comp.create.SingleValue(plus_one, label="n1", x=0)
+        self.comp.create.SingleValue(plus_one, label="n2", x=self.comp.n1)
+        self.comp.create.SingleValue(plus_one, label="n3", x=42)
+        self.comp.n1 > self.comp.n2
+        self.comp.starting_nodes = [self.comp.n1]
+
+        self.comp.run()
+        self.assertEqual(
+            2,
+            self.comp.n2.outputs.y.value,
+            msg="Expected to start from starting node and propagate"
+        )
+        self.assertIs(
+            NotData,
+            self.comp.n3.outputs.y.value,
+            msg="n3 was omitted from the execution graph, so it should not have run"
+        )
+
+    def test_set_run_signals_to_dag(self):
+        # Like the run test, but manually invoking this first
+        self.comp.create.SingleValue(plus_one, label="n1", x=0)
+        self.comp.create.SingleValue(plus_one, label="n2", x=self.comp.n1)
+        self.comp.create.SingleValue(plus_one, label="n3", x=42)
+        self.comp.set_run_signals_to_dag_execution()
+        self.comp.run()
+        self.assertEqual(
+            1,
+            self.comp.n1.outputs.y.value,
+            msg="Expected all nodes to run"
+        )
+        self.assertEqual(
+            2,
+            self.comp.n2.outputs.y.value,
+            msg="Expected all nodes to run"
+        )
+        self.assertEqual(
+            43,
+            self.comp.n3.outputs.y.value,
+            msg="Expected all nodes to run"
+        )
+
+        self.comp.n1.inputs.x = self.comp.n2
+        with self.assertRaises(
+            CircularDataFlowError,
+            msg="Should not be able to automate graphs with circular data"
+        ):
+            self.comp.set_run_signals_to_dag_execution()
+
+    def test_return(self):
+        self.comp.n1 = Composite.create.SingleValue(plus_one, x=0)
+        not_dottable_string = "can't dot this"
+        not_dottable_name_node = self.comp.create.SingleValue(
+            plus_one, x=42, label=not_dottable_string
+        )
+        self.comp.starting_nodes = [self.comp.n1, not_dottable_name_node]
+        out = self.comp.run()
+        self.assertEqual(
+            1,
+            self.comp.outputs.n1__y.value,
+            msg="Sanity check that the output has been filled and is stored under the "
+                "name we think it is"
+        )
+        # Make sure the returned object is functionally a dot-dict
+        self.assertEqual(1, out["n1__y"], msg="Should work with item-access")
+        self.assertEqual(1, out.n1__y, msg="Should work with dot-access")
+        # We can give nodes crazy names, but then we're stuck with item access
+        self.assertIs(
+            not_dottable_name_node,
+            self.comp.nodes[not_dottable_string],
+            msg="Should be able to access the node by item"
+        )
+        self.assertEqual(
+            43,
+            out[not_dottable_string + "__y"],
+            msg="Should always be able to fall back to item access with crazy labels"
+        )
+
+    def test_io_maps(self):
+        # input and output, renaming, accessing connected, and deactivating disconnected
+        self.comp.n1 = Composite.create.SingleValue(plus_one, x=0)
+        self.comp.n2 = Composite.create.SingleValue(plus_one, x=self.comp.n1)
+        self.comp.n3 = Composite.create.SingleValue(plus_one, x=self.comp.n2)
+        self.comp.m = Composite.create.SingleValue(plus_one, x=42)
+        self.comp.inputs_map = {
+            "n1__x": "x",  # Rename
+            "n2__x": "intermediate_x",  # Expose
+            "m__x": None,  # Hide
+        }
+        self.comp.outputs_map = {
+            "n3__y": "y",  # Rename
+            "n2__y": "intermediate_y",  # Expose
+            "m__y": None,  # Hide
+        }
+        self.assertIn("x", self.comp.inputs.labels, msg="Should be renamed")
+        self.assertIn("y", self.comp.outputs.labels, msg="Should be renamed")
+        self.assertIn("intermediate_x", self.comp.inputs.labels, msg="Should be exposed")
+        self.assertIn("intermediate_y", self.comp.outputs.labels, msg="Should be exposed")
+        self.assertNotIn("m__x", self.comp.inputs.labels, msg="Should be hidden")
+        self.assertNotIn("m__y", self.comp.outputs.labels, msg="Should be hidden")
+
+        self.comp.set_run_signals_to_dag_execution()
+        out = self.comp.run()
+        self.assertEqual(
+            3,
+            out.y,
+            msg="New names should be propagated to the returned value"
+        )
+        self.assertNotIn(
+            "m__y",
+            list(out.keys()),
+            msg="IO filtering should be evident in returned value"
+        )
+        self.assertEqual(
+            43,
+            self.comp.m.outputs.y.value,
+            msg="The child channel should still exist and have run"
+        )
+        self.assertEqual(
+            1,
+            self.comp.inputs.intermediate_x.value,
+            msg="IO should be up-to-date post-run"
+        )
+        self.assertEqual(
+            2,
+            self.comp.outputs.intermediate_y.value,
+            msg="IO should be up-to-date post-run"
+        )
+
+    def test_io_map_bijectivity(self):
+        with self.assertRaises(
+            ValueDuplicationError,
+            msg="Should not be allowed to map two children's channels to the same label"
+        ):
+            self.comp.inputs_map = {"n1__x": "x", "n2__x": "x"}
+
+        self.comp.inputs_map = {"n1__x": "x"}
+        with self.assertRaises(
+            ValueDuplicationError,
+            msg="Should not be allowed to update a second child's channel onto an "
+                "existing mapped channel"
+        ):
+            self.comp.inputs_map["n2__x"] = "x"
+
+        with self.subTest("Ensure we can use None to turn multiple off"):
+            self.comp.inputs_map = {"n1__x": None, "n2__x": None}  # At once
+            # Or in a row
+            self.comp.inputs_map = {}
+            self.comp.inputs_map["n1__x"] = None
+            self.comp.inputs_map["n2__x"] = None
+            self.comp.inputs_map["n3__x"] = None
+            self.assertEqual(
+                3,
+                len(self.comp.inputs_map),
+                msg="All entries should be stored"
+            )
+            self.assertEqual(
+                0,
+                len(self.comp.inputs),
+                msg="No IO should be left exposed"
+            )
+
+    def test_de_activate_strict_connections(self):
+        self.comp.sub_comp = AComposite("sub")
+        self.comp.sub_comp.n1 = Composite.create.SingleValue(plus_one, x=0)
+        self.assertTrue(
+            self.comp.sub_comp.n1.inputs.x.strict_hints,
+            msg="Sanity check that test starts in the expected condition"
+        )
+        self.comp.deactivate_strict_hints()
+        self.assertFalse(
+            self.comp.sub_comp.n1.inputs.x.strict_hints,
+            msg="Deactivating should propagate to children"
+        )
+        self.comp.activate_strict_hints()
+        self.assertTrue(
+            self.comp.sub_comp.n1.inputs.x.strict_hints,
+            msg="Activating should propagate to children"
+        )
+
+
+if __name__ == '__main__':
+    unittest.main()
diff --git a/tests/unit/test_function.py b/tests/unit/test_function.py
index 1a33fcc1..3dc813dc 100644
--- a/tests/unit/test_function.py
+++ b/tests/unit/test_function.py
@@ -5,10 +5,7 @@
 import warnings
 
 from pyiron_workflow.channels import NotData, ChannelConnectionError
-from pyiron_workflow.files import DirectoryObject
-from pyiron_workflow.function import (
-    Function, SingleValue, function_node, single_value_node
-)
+from pyiron_workflow.function import Function, SingleValue, function_node
 
 
 def throw_error(x: Optional[int] = None):
@@ -49,9 +46,10 @@ def test_instantiation(self):
         with self.subTest("Args and kwargs at initialization"):
             node = Function(plus_one)
             self.assertIs(
-                node.outputs.y.value,
                 NotData,
-                msg="Nodes should not run at instantiation",
+                node.outputs.y.value,
+                msg="Sanity check that output just has the standard not-data value at "
+                    "instantiation",
             )
             node.inputs.x = 10
             self.assertIs(
@@ -63,8 +61,8 @@ def test_instantiation(self):
             self.assertEqual(
                 node.outputs.y.value,
                 11,
-                msg=f"Slow nodes should still run when asked! Expected 11 but got "
-                    f"{node.outputs.y.value}"
+                msg="Expected the run to update the output -- did the test "
+                    "function change or something?"
             )
 
             node = Function(no_default, 1, y=2, output_labels="output")
@@ -72,10 +70,9 @@ def test_instantiation(self):
             self.assertEqual(
                 no_default(1, 2),
                 node.outputs.output.value,
-                msg="Nodes should allow input initialization by arg and kwarg"
+                msg="Nodes should allow input initialization by arg _and_ kwarg"
             )
             node(2, y=3)
-            node.run()
             self.assertEqual(
                 no_default(2, 3),
                 node.outputs.output.value,
@@ -142,6 +139,10 @@ def test_label_choices(self):
             switch = Function(multiple_branches, output_labels="bool")
             self.assertListEqual(switch.outputs.labels, ["bool"])
 
+    def test_default_label(self):
+        n = Function(plus_one)
+        self.assertEqual(plus_one.__name__, n.label)
+
     def test_availability_of_node_function(self):
         @function_node()
         def linear(x):
@@ -159,108 +160,23 @@ def bilinear(x, y):
                 "use at the class level"
         )
 
-    def test_signals(self):
-        @function_node()
-        def linear(x):
-            return x
-
-        @function_node()
-        def times_two(y):
-            return 2 * y
-
-        l = linear(x=1)
-        t2 = times_two(
-            output_labels=["double"],
-            y=l.outputs.x
-        )
-        self.assertIs(
-            t2.outputs.double.value,
-            NotData,
-            msg=f"Without updates, expected the output to be {NotData} but got "
-                f"{t2.outputs.double.value}"
-        )
-
-        # Nodes should _all_ have the run and ran signals
-        t2.signals.input.run = l.signals.output.ran
-        l.run()
-        self.assertEqual(
-            t2.outputs.double.value, 2,
-            msg="Running the upstream node should trigger a run here"
-        )
-
-        with self.subTest("Test syntactic sugar"):
-            t2.signals.input.run.disconnect_all()
-            l > t2
-            self.assertIn(
-                l.signals.output.ran,
-                t2.signals.input.run.connections,
-                msg="> should be equivalent to run/ran connection"
-            )
-
-            t2.signals.input.run.disconnect_all()
-            l > t2.signals.input.run
-            self.assertIn(
-                l.signals.output.ran,
-                t2.signals.input.run.connections,
-                msg="> should allow us to mix and match nodes and signal channels"
-            )
-
-            t2.signals.input.run.disconnect_all()
-            l.signals.output.ran > t2
-            self.assertIn(
-                l.signals.output.ran,
-                t2.signals.input.run.connections,
-                msg="Mixing and matching should work both directions"
-            )
-
-            t2.signals.input.run.disconnect_all()
-            l > t2 > l
-            self.assertTrue(
-                l.signals.input.run.connections[0] is t2.signals.output.ran
-                and t2.signals.input.run.connections[0] is l.signals.output.ran,
-                msg="> should allow chaining signal connections"
-            )
-
     def test_statuses(self):
         n = Function(plus_one)
         self.assertTrue(n.ready)
         self.assertFalse(n.running)
         self.assertFalse(n.failed)
 
-        # Can't really test "running" until we have a background executor, so fake a bit
-        n.running = True
-        with self.assertRaises(RuntimeError):
-            # Running nodes can't be run
-            n.run()
-        n.running = False
-
         n.inputs.x = "Can't be added together with an int"
-        with self.assertRaises(TypeError):
-            # The function error should get passed up
+        with self.assertRaises(
+            TypeError,
+            msg="We expect the int+str type error because there were no type hints "
+                "guarding this function from running with bad data"
+        ):
             n.run()
         self.assertFalse(n.ready)
         self.assertFalse(n.running)
         self.assertTrue(n.failed)
 
-        n.inputs.x = 1
-        self.assertFalse(
-            n.ready,
-            msg="Should not be ready while it has failed status"
-        )
-
-        n.failed = False  # Manually reset the failed status
-        self.assertTrue(
-            n.ready,
-            msg="Input is ok, not running, not failed -- should be ready!"
-        )
-        n.run()
-        self.assertTrue(n.ready)
-        self.assertFalse(n.running)
-        self.assertFalse(
-            n.failed,
-            msg="Should be back to a good state and ready to run again"
-        )
-
     def test_with_self(self):
         def with_self(self, x: float) -> float:
             # Note: Adding internal state to the node like this goes against the best
@@ -296,7 +212,8 @@ def with_self(self, x: float) -> float:
         self.assertEqual(
             node.some_counter,
             1,
-            msg="Function functions should be able to modify attributes on the node object."
+            msg="Function functions should be able to modify attributes on the node "
+                "object."
         )
 
         node.executor = True
@@ -378,37 +295,23 @@ def test_return_value(self):
         node = Function(plus_one)
 
         with self.subTest("Run on main process"):
-            return_on_call = node(1)
-            self.assertEqual(
-                return_on_call,
-                plus_one(1),
-                msg="Run output should be returned on call"
-            )
-
             node.inputs.x = 2
             return_on_explicit_run = node.run()
             self.assertEqual(
                 return_on_explicit_run,
                 plus_one(2),
-                msg="On explicit run, the most recent input data should be used and the "
-                    "result should be returned"
+                msg="On explicit run, the most recent input data should be used and "
+                    "the result should be returned"
             )
 
-        with self.subTest("Run on executor"):
-            node.executor = True
-
-            return_on_explicit_run = node.run()
-            self.assertIsInstance(
-                return_on_explicit_run,
-                Future,
-                msg="Running with an executor should return the future"
+            return_on_call = node(1)
+            self.assertEqual(
+                return_on_call,
+                plus_one(1),
+                msg="Run output should be returned on call"
+                # This is a duplicate test, since __call__ just invokes run, but it
+                # is such a core promise that it's worth double-checking here
             )
-            with self.assertRaises(RuntimeError):
-                # The executor run should take a second
-                # So we can double check that attempting to run while already running
-                # raises an error
-                node.run()
-            node.future.result()  # Wait for the remote execution to finish
 
     def test_copy_connections(self):
         node = Function(plus_one)
@@ -662,41 +565,9 @@ def test_easy_output_connection(self):
                 "from assignment at instantiation"
         )
 
-    def test_working_directory(self):
-        n_f = Function(plus_one)
-        self.assertTrue(n_f._working_directory is None)
-        self.assertIsInstance(n_f.working_directory, DirectoryObject)
-        self.assertTrue(str(n_f.working_directory.path).endswith(n_f.label))
-        n_f.working_directory.delete()
-
-    def test_disconnection(self):
-        n1 = Function(no_default, output_labels="out")
-        n2 = Function(no_default, output_labels="out")
-        n3 = Function(no_default, output_labels="out")
-        n4 = Function(plus_one)
-
-        n3.inputs.x = n1.outputs.out
-        n3.inputs.y = n2.outputs.out
-        n4.inputs.x = n3.outputs.out
-        n2 > n3 > n4
-        disconnected = n3.disconnect()
-        self.assertListEqual(
-            disconnected,
-            [
-                # Inputs
-                (n3.inputs.x, n1.outputs.out),
-                (n3.inputs.y, n2.outputs.out),
-                # Outputs
-                (n3.outputs.out, n4.inputs.x),
-                # Signals (inputs, then output)
-                (n3.signals.input.run, n2.signals.output.ran),
-                (n3.signals.output.ran, n4.signals.input.run),
-            ],
-            msg="Expected to find pairs (starting with the node disconnect was called "
-                "on) of all broken connections among input, output, and signals."
-        )
-
-    def test_pulling_without_any_parents(self):
+    def test_nested_declaration(self):
+        # It's really just a silly case of running without a parent, where you don't
+        # store references to all the nodes declared
         node = SingleValue(
             plus_one,
             x=SingleValue(
diff --git a/tests/unit/test_macro.py b/tests/unit/test_macro.py
index 0ee9d78b..0a8006c5 100644
--- a/tests/unit/test_macro.py
+++ b/tests/unit/test_macro.py
@@ -26,7 +26,108 @@ def add_three_macro(macro):
 @unittest.skipUnless(version_info[0] == 3 and version_info[1] >= 10, "Only supported for 3.10+")
 class TestMacro(unittest.TestCase):
 
-    def test_labels(self):
+    def test_static_input(self):
+        m = Macro(add_three_macro)
+        inp = m.inputs
+        inp_again = m.inputs
+        self.assertIs(
+            inp, inp_again, msg="Should not be rebuilding just to look at it"
+        )
+        m._rebuild_data_io()
+        new_inp = m.inputs
+        self.assertIsNot(
+            inp, new_inp, msg="After rebuild we should get a new object"
+        )
+
+    def test_io_independence(self):
+        m = Macro(add_three_macro)
+        self.assertIsNot(
+            m.inputs.one__x,
+            m.one.inputs.x,
+            msg="Expect input to be by value, not by reference"
+        )
+        self.assertIsNot(
+            m.outputs.three__result,
+            m.three.outputs.result,
+            msg="Expect output to be by value, not by reference"
+        )
+        self.assertFalse(
+            m.connected,
+            msg="Macro should talk to its children by value links _not_ graph "
+                "connections"
+        )
+
+    def test_value_links(self):
+        m = Macro(add_three_macro)
+        self.assertIs(
+            m.one.inputs.x,
+            m.inputs.one__x.value_receiver,
+            msg="Sanity check that value link exists"
+        )
+        self.assertIs(
+            m.outputs.three__result,
+            m.three.outputs.result.value_receiver,
+            msg="Sanity check that value link exists"
+        )
+        self.assertNotEqual(
+            42, m.one.inputs.x.value, msg="Sanity check that we start from expected"
+        )
+        self.assertNotEqual(
+            42,
+            m.three.outputs.result.value,
+            msg="Sanity check that we start from expected"
+        )
+        m.inputs.one__x.value = 0
+        self.assertEqual(
+            0, m.one.inputs.x.value, msg="Expected values to stay synchronized"
+        )
+        m.three.outputs.result.value = 0
+        self.assertEqual(
+            0, m.outputs.three__result.value, msg="Expected values to stay synchronized"
+        )
+
+    def test_execution_automation(self):
+        fully_automatic = add_three_macro
+
+        def fully_defined(macro):
+            add_three_macro(macro)
+            macro.one > macro.two > macro.three
+            macro.starting_nodes = [macro.one]
+
+        def only_order(macro):
+            add_three_macro(macro)
+            macro.two > macro.three
+
+        def only_starting(macro):
+            add_three_macro(macro)
+            macro.starting_nodes = [macro.one]
+
+        m_auto = Macro(fully_automatic)
+        m_user = Macro(fully_defined)
+
+        x = 0
+        expected = add_one(add_one(add_one(x)))
+        self.assertEqual(
+            m_auto(one__x=x).three__result,
+            expected,
+            "DAG macros should run fine without user specification of execution."
+        )
+        self.assertEqual(
+            m_user(one__x=x).three__result,
+            expected,
+            "Macros should run fine if the user nicely specifies the execution graph."
+        )
+
+        with self.subTest("Partially specified execution should fail"):
+            # We don't yet check for _crappy_ user-defined execution,
+            # but we should make sure it's at least valid in principle
+            with self.assertRaises(ValueError):
+                Macro(only_order)
+
+            with self.assertRaises(ValueError):
+                Macro(only_starting)
+
+    def test_default_label(self):
         m = Macro(add_three_macro)
         self.assertEqual(
             m.label,
@@ -37,7 +138,7 @@ def test_labels(self):
         m2 = Macro(add_three_macro, label=label)
         self.assertEqual(m2.label, label, msg="Should be able to specify a label")
 
-    def test_wrapper_function(self):
+    def test_creation_from_decorator(self):
         m = Macro(add_three_macro)
 
         self.assertIs(
@@ -62,7 +163,7 @@ def test_wrapper_function(self):
             msg="Macros should get output updated, just like other nodes"
         )
 
-    def test_subclass(self):
+    def test_creation_from_subclass(self):
         class MyMacro(Macro):
             def build_graph(self):
                 add_three_macro(self)
@@ -81,58 +182,6 @@ def build_graph(self):
             msg="Subclasses should be able to simply override the graph_creator arg"
         )
 
-    def test_key_map(self):
-        m = Macro(
-            add_three_macro,
-            inputs_map={"one__x": "my_input"},
-            outputs_map={
-                "three__result": "my_output",
-                "two__result": "intermediate"
-            },
-        )
-        self.assertSetEqual(
-            set(m.inputs.labels),
-            set(("my_input",)),
-            msg="Input should be relabelled, but not added to or taken away from"
-        )
-        self.assertSetEqual(
-            set(m.outputs.labels),
-            set(("my_output", "intermediate")),
-            msg="Output should be relabelled and expanded"
-        )
-
-        with self.subTest("Make new names can be used as usual"):
-            x = 0
-            out = m(my_input=x)
-            self.assertEqual(
-                out.my_output,
-                add_one(add_one(add_one(x))),
-                msg="Expected output but relabeled should be accessible"
-            )
-            self.assertEqual(
-                out.intermediate,
-                add_one(add_one(x)),
-                msg="New, internally connected output that was specifically requested "
-                    "should be accessible"
-            )
-
-        with self.subTest("IO can be disabled"):
-            m = Macro(
-                add_three_macro,
-                inputs_map={"one__x": None},
-                outputs_map={"three__result": None},
-            )
-            self.assertEqual(
-                len(m.inputs.labels),
-                0,
-                msg="Only inputs should have been disabled"
-            )
-            self.assertEqual(
-                len(m.outputs.labels),
-                0,
-                msg="Only outputs should have been disabled"
-            )
-
     def test_nesting(self):
         def nested_macro(macro):
             macro.a = SingleValue(add_one)
@@ -160,329 +209,6 @@ def nested_macro(macro):
         m = Macro(nested_macro)
         self.assertEqual(m(a__x=0).d__result, 8)
 
-        m2 = Macro(nested_macro)
-
-        with self.subTest("Test Node.get_parent_proximate_to"):
-            self.assertIs(
-                m.b,
-                m.b.two.get_parent_proximate_to(m),
-                msg="Should return parent closest to the passed composite"
-            )
-
-            self.assertIsNone(
-                m.b.two.get_parent_proximate_to(m2),
-                msg="Should return None when composite is not in parentage"
-            )
-
-        with self.subTest("Test Node.get_first_shared_parent"):
-            self.assertIs(
-                m.b,
-                m.b.two.get_first_shared_parent(m.b.three),
-                msg="Should get the parent when parents are the same"
-            )
-            self.assertIs(
-                m,
-                m.b.two.get_first_shared_parent(m.c.two),
-                msg="Should find first matching object in parentage"
-            )
-            self.assertIs(
-                m,
-                m.b.two.get_first_shared_parent(m.d),
-                msg="Should work when depth is not equal"
-            )
-            self.assertIsNone(
-                m.b.two.get_first_shared_parent(m2.b.two),
-                msg="Should return None when no shared parent exists"
-            )
-            self.assertIsNone(
-                m.get_first_shared_parent(m.b),
-                msg="Should return None when parent is None"
-            )
-
-    def test_execution_automation(self):
-        fully_automatic = add_three_macro
-
-        def fully_defined(macro):
-            add_three_macro(macro)
-            macro.one > macro.two > macro.three
-            macro.starting_nodes = [macro.one]
-
-        def only_order(macro):
-            add_three_macro(macro)
-            macro.two > macro.three
-
-        def only_starting(macro):
-            add_three_macro(macro)
-            macro.starting_nodes = [macro.one]
-
-        m_auto = Macro(fully_automatic)
-        m_user = Macro(fully_defined)
-
-        x = 0
-        expected = add_one(add_one(add_one(x)))
-        self.assertEqual(
-            m_auto(one__x=x).three__result,
-            expected,
-            "DAG macros should run fine without user specification of execution."
-        )
-        self.assertEqual(
-            m_user(one__x=x).three__result,
-            expected,
-            "Macros should run fine if the user nicely specifies the exeuction graph."
-        )
-
-        with self.subTest("Partially specified execution should fail"):
-            # We don't yet check for _crappy_ user-defined execution,
-            # But we should make sure it's at least valid in principle
-            with self.assertRaises(ValueError):
-                Macro(only_order)
-
-            with self.assertRaises(ValueError):
-                Macro(only_starting)
-
-    def test_replace_node(self):
-        macro = Macro(add_three_macro)
-
-        adds_three_node = Macro(
-            add_three_macro,
-            inputs_map={"one__x": "x"},
-            outputs_map={"three__result": "result"}
-        )
-        adds_one_node = macro.two
-
-        self.assertEqual(
-            macro(one__x=0).three__result,
-            3,
-            msg="Sanity check"
-        )
-
-        with self.subTest("Verify successful cases"):
-
-            macro.replace(adds_one_node, adds_three_node)
-            self.assertEqual(
-                macro(one__x=0).three__result,
-                5,
-                msg="Result should be bigger after replacing an add_one node with an "
-                    "add_three macro"
-            )
-            self.assertFalse(
-                adds_one_node.connected,
-                msg="Replaced node should get disconnected"
-            )
-            self.assertIsNone(
-                adds_one_node.parent,
-                msg="Replaced node should get orphaned"
-            )
-
-            add_one_class = macro.wrap_as.single_value_node()(add_one)
-            self.assertTrue(issubclass(add_one_class, SingleValue), msg="Sanity check")
-            macro.replace(adds_three_node, add_one_class)
-            self.assertEqual(
-                macro(one__x=0).three__result,
-                3,
-                msg="Should be possible to replace with a class instead of an instance"
-            )
-
-            macro.replace("two", adds_three_node)
-            self.assertEqual(
-                macro(one__x=0).three__result,
-                5,
-                msg="Should be possible to replace by label"
-            )
-
-            macro.two.replace_with(adds_one_node)
-            self.assertEqual(
-                macro(one__x=0).three__result,
-                3,
-                msg="Nodes should have syntactic sugar for invoking replacement"
-            )
-
-            @Macro.wrap_as.function_node()
-            def add_two(x):
-                result = x + 2
-                return result
-            macro.two = add_two
-            self.assertEqual(
-                macro(one__x=0).three__result,
-                4,
-                msg="Composite should allow replacement when a class is assigned"
-            )
-
-            self.assertListEqual(
-                macro.starting_nodes,
-                [macro.one],
-                msg="Sanity check"
-            )
-            new_starter = add_two()
-            macro.one.replace_with(new_starter)
-            self.assertListEqual(
-                macro.starting_nodes,
-                [new_starter],
-                msg="Replacement should be reflected in the starting nodes"
-            )
-            self.assertIs(
-                macro.inputs.one__x.value_receiver,
-                new_starter.inputs.x,
-                msg="Replacement should be reflected in composite IO"
-            )
-
-        with self.subTest("Verify failure cases"):
-            another_macro = Macro(add_three_macro)
-            another_node = Macro(
-                add_three_macro,
-                inputs_map={"one__x": "x"},
-                outputs_map={"three__result": "result"},
-            )
-            another_macro.now_its_a_child = another_node
-
-            with self.assertRaises(
-                ValueError,
-                msg="Should fail when replacement has a parent"
-            ):
-                macro.replace(macro.two, another_node)
-
-            another_macro.remove(another_node)
-            another_node.inputs.x = another_macro.outputs.three__result
-            with self.assertRaises(
-                ValueError,
-                msg="Should fail when replacement is connected"
-            ):
-                macro.replace(macro.two, another_node)
-
-            another_node.disconnect()
-            an_ok_replacement = another_macro.two
-            another_macro.remove(an_ok_replacement)
-            with self.assertRaises(
-                ValueError,
-                msg="Should fail if the node being replaced isn't a child"
-            ):
-                macro.replace(another_node, an_ok_replacement)
-
-            @Macro.wrap_as.function_node()
-            def add_two_incompatible_io(not_x):
-                result_is_not_my_name = not_x + 2
-                return result_is_not_my_name
-
-            with self.assertRaises(
-                AttributeError,
-                msg="Replacing via class assignment should fail if the class has "
-                    "incompatible IO"
-            ):
-                macro.two = add_two_incompatible_io
-
-    def test_macro_connections_after_replace(self):
-        # If the macro-level IO is going to change after replacing a child,
-        # it had better still be able to recreate all the macro-level IO connections
-        # For macro IO channels that weren't connected, we don't really care
-        # If it fails to replace, it had better revert to its original state
-
-        macro = Macro(add_three_macro, one__x=0)
-        downstream = SingleValue(add_one, x=macro.outputs.three__result)
-        downstream.pull()
-        self.assertEqual(
-            0 + (1 + 1 + 1) + 1,
-            downstream.outputs.result.value,
-            msg="Sanity check that our test setup is what we want: macro->single"
-        )
-
-        def add_two(x):
-            result = x + 2
-            return result
-        compatible_replacement = SingleValue(add_two)
-
-        macro.replace(macro.three, compatible_replacement)
-        downstream.pull()
-        self.assertEqual(
-            len(downstream.inputs.x.connections),
-            1,
-            msg="After replacement, the downstream node should still have exactly one "
-                "connection to the macro"
-        )
-        self.assertIs(
-            downstream.inputs.x.connections[0],
-            macro.outputs.three__result,
-            msg="The one connection should be the living, updated macro IO channel"
-        )
-        self.assertEqual(
-            0 + (1 + 1 + 2) + 1,
-            downstream.outputs.result.value,
-            msg="The whole flow should still function after replacement, but with the "
-                "new behaviour (and extra 1 added)"
-        )
-
-        def different_signature(x):
-            # When replacing the final node of add_three_macro, the rebuilt IO will
-            # no longer have three__result, but rather three__changed_output_label,
-            # which will break existing macro-level IO if the macro output is connected
-            changed_output_label = x + 3
-            return changed_output_label
-
-        incompatible_replacement = SingleValue(
-            different_signature,
-            label="original_label"
-        )
-        with self.assertRaises(
-            AttributeError,
-            msg="macro.three__result is connected output, but can't be found in the "
-                "rebuilt IO, so an exception is expected"
-        ):
-            macro.replace(macro.three, incompatible_replacement)
-        self.assertIs(
-            macro.three,
-            compatible_replacement,
-            msg="Failed replacements should get reverted, putting the original node "
-                "back"
-        )
-        self.assertIs(
-            macro.three.outputs.result.value_receiver,
-            macro.outputs.three__result,
-            msg="Failed replacements should get reverted, restoring the link between "
-                "child IO and macro IO"
-        )
-        self.assertIs(
-            downstream.inputs.x.connections[0],
-            macro.outputs.three__result,
-            msg="Failed replacements should get reverted, and macro IO should be as "
-                "it was before"
-        )
-        self.assertFalse(
-            incompatible_replacement.connected,
-            msg="Failed replacements should get reverted, leaving the replacement in "
-                "its original state"
-        )
-        self.assertEqual(
-            "original_label",
-            incompatible_replacement.label,
-            msg="Failed replacements should get reverted, leaving the replacement in "
-                "its original state"
-        )
-        macro > downstream
-        # If we want to push, we need to define a connection formally
-        macro.run(one__x=1)
-        # Fresh input to make sure updates are actually going through
-        self.assertEqual(
-            1 + (1 + 1 + 2) + 1,
-            downstream.outputs.result.value,
-            msg="Final integration test that replacements get reverted, the macro "
-                "function and downstream results should be the same as before"
-        )
-
-        downstream.disconnect()
-        macro.replace(macro.three, incompatible_replacement)
-        self.assertIs(
-            macro.three,
-            incompatible_replacement,
-            msg="Since it is only incompatible with the external connections and we "
-                "broke those first, replacement is expected to work fine now"
-        )
-        macro(one__x=2)
-        self.assertEqual(
-            2 + (1 + 1 + 3),
-            macro.outputs.three__changed_output_label.value,
-            msg="For all to be working, we need the result with the new behaviour "
-                "at its new location"
-        )
-
     def test_with_executor(self):
         macro = Macro(add_three_macro)
         downstream = SingleValue(add_one, x=macro.outputs.three__result)
@@ -512,6 +238,8 @@ def test_with_executor(self):
         )
 
         returned_nodes = result.result()  # Wait for the process to finish
+        from time import sleep
+        sleep(1)  # Give the executor's callback a moment to finish updating state
         self.assertIsNot(
             original_one,
             returned_nodes.one,
@@ -561,8 +289,6 @@ def test_pulling_from_inside_a_macro(self):
         macro.inputs.one__x = 0  # Set value
         # Now macro.one.inputs.x has both value and a connection
 
-        print("MACRO ONE INPUT X", macro.one.inputs.x.value, macro.one.inputs.x.connections)
-
         self.assertEqual(
             0 + 1 + 1,
             macro.two.pull(run_parent_trees_too=False),
@@ -609,10 +335,13 @@ def grab_connections(macro):
             self.assertListEqual(
                 initial_labels,
                 list(m.nodes.keys()),
-                msg="Labels should be restored after failing to pull because of acyclicity"
+                msg="Labels should be restored after failing to pull because of "
+                    "acyclicity"
             )
             self.assertTrue(
-                all(c is ic for (c, ic) in zip(grab_connections(m), initial_connections)),
+                all(
+                    c is ic for (c, ic) in zip(grab_connections(m), initial_connections)
+                ),
                 msg="Connections should be restored after failing to pull because of "
                     "cyclic data flow"
             )
diff --git a/tests/unit/test_node.py b/tests/unit/test_node.py
new file mode 100644
index 00000000..5c18e8d6
--- /dev/null
+++ b/tests/unit/test_node.py
@@ -0,0 +1,307 @@
+from concurrent.futures import Future
+import os
+from sys import version_info
+import unittest
+
+from pyiron_workflow.channels import InputData, OutputData
+from pyiron_workflow.files import DirectoryObject
+from pyiron_workflow.io import Inputs, Outputs
+from pyiron_workflow.node import Node
+
+
+def add_one(x):
+    return x + 1
+
+
+class ANode(Node):
+    """To de-abstract the class"""
+
+    def __init__(self, label):
+        super().__init__(label=label)
+        self._inputs = Inputs(InputData("x", self, type_hint=int))
+        self._outputs = Outputs(OutputData("y", self, type_hint=int))
+
+    @property
+    def inputs(self) -> Inputs:
+        return self._inputs
+
+    @property
+    def outputs(self) -> Outputs:
+        return self._outputs
+
+    @property
+    def on_run(self):
+        return add_one
+
+    @property
+    def run_args(self) -> dict:
+        return {"x": self.inputs.x.value}
+
+    def process_run_result(self, run_output):
+        self.outputs.y.value = run_output
+        return run_output
+
+    def to_dict(self):
+        pass
+
+
+@unittest.skipUnless(version_info[0] == 3 and version_info[1] >= 10, "Only supported for 3.10+")
+class TestNode(unittest.TestCase):
+    def setUp(self):
+        n1 = ANode("start")
+        n2 = ANode("middle")
+        n3 = ANode("end")
+        n1.inputs.x = 0
+        n2.inputs.x = n1.outputs.y
+        n3.inputs.x = n2.outputs.y
+        self.n1 = n1
+        self.n2 = n2
+        self.n3 = n3
+
+    def test_set_input_values(self):
+        n = ANode("some_node")
+        n.set_input_values(x=2)
+        self.assertEqual(
+            2,
+            n.inputs.x.value,
+            msg="Post-instantiation update of inputs should also work"
+        )
+
+        n.set_input_values(y=3)
+        # Missing keys may throw a warning, but are otherwise allowed to pass
+
+        with self.assertRaises(
+            TypeError,
+            msg="Type checking should be applied",
+        ):
+            n.set_input_values(x="not an int")
+
+        n.deactivate_strict_hints()
+        n.set_input_values(x="not an int")
+        self.assertEqual(
+            "not an int",
+            n.inputs.x.value,
+            msg="It should be possible to deactivate type checking from the node level"
+        )
+
+    def test_run_data_tree(self):
+        self.assertEqual(
+            add_one(add_one(add_one(self.n1.inputs.x.value))),
+            self.n3.run(run_data_tree=True),
+            msg="Should pull start down to end, even with no flow defined"
+        )
+
+    def test_fetch_input(self):
+        self.n1.outputs.y.value = 0
+        with self.assertRaises(
+            ValueError,
+            msg="Without input, we should not achieve readiness"
+        ):
+            self.n2.run(run_data_tree=False, fetch_input=False, check_readiness=True)
+
+        self.assertEqual(
+            add_one(self.n1.outputs.y.value),
+            self.n2.run(run_data_tree=False, fetch_input=True),
+            msg="After fetching the upstream data, should run fine"
+        )
+
+    def test_check_readiness(self):
+        with self.assertRaises(
+            ValueError,
+            msg="When input is not data, we should fail early"
+        ):
+            self.n3.run(run_data_tree=False, fetch_input=False, check_readiness=True)
+
+        self.assertFalse(
+            self.n3.failed,
+            msg="The benefit of the readiness check should be that we don't actually "
+                "qualify as failed"
+        )
+
+        with self.assertRaises(
+            TypeError,
+            msg="If we bypass the check, we should get the failing function error"
+        ):
+            self.n3.run(run_data_tree=False, fetch_input=False, check_readiness=False)
+
+        self.assertTrue(
+            self.n3.failed,
+            msg="If the node operation itself fails, the status should be failed"
+        )
+
+        self.n3.inputs.x = 0
+        with self.assertRaises(
+            ValueError,
+            msg="When status is failed, we should fail early, even if input data is ok"
+        ):
+            self.n3.run(run_data_tree=False, fetch_input=False, check_readiness=True)
+
+        with self.assertRaises(
+            RuntimeError,
+            msg="If we manage to run with bad input, being in a failed state still "
+                "stops us"
+        ):
+            self.n3.run(run_data_tree=False, fetch_input=False, check_readiness=False)
+
+        self.n3.failed = False
+        self.assertEqual(
+            1,
+            self.n3.run(run_data_tree=False, fetch_input=False, check_readiness=True),
+            msg="After manually resetting the failed state and providing good input, "
+                "running should proceed"
+        )
+
+    def test_force_local_execution(self):
+        self.n1.executor = True
+        out = self.n1.run(force_local_execution=False)
+        with self.subTest("Test running with an executor fulfills promises"):
+            self.assertIsInstance(
+                out,
+                Future,
+                msg="With an executor, we expect a futures object back"
+            )
+            self.assertTrue(
+                self.n1.running,
+                msg="The running flag should be true while it's running, and "
+                    "(de)serialization is time consuming enough that we still expect "
+                    "this to be the case"
+            )
+            self.assertFalse(
+                self.n1.ready,
+                msg="While running, the node should not be ready."
+            )
+            with self.assertRaises(
+                RuntimeError,
+                msg="Running nodes should not be allowed to get their input updated",
+            ):
+                self.n1.inputs.x = 42
+            self.assertEqual(
+                1,
+                out.result(),
+                msg="If we wait for the remote execution to finish, it should give us "
+                    "the right thing"
+            )
+            self.assertEqual(
+                1,
+                self.n1.outputs.y.value,
+                msg="The callback on the executor should ensure the output processing "
+                    "happens"
+            )
+
+        self.n2.executor = True
+        self.n2.inputs.x = 0
+        self.assertEqual(
+            1,
+            self.n2.run(fetch_input=False, force_local_execution=True),
+            msg="Forcing local execution should do just that."
+        )
+
+    def test_emit_ran_signal(self):
+        self.n1 > self.n2 > self.n3  # Chained connection declaration
+
+        self.n1.run(emit_ran_signal=False)
+        self.assertFalse(
+            self.n3.inputs.x.ready,
+            msg="Without emitting the ran signal, nothing should happen downstream"
+        )
+
+        self.n1.run(emit_ran_signal=True)
+        self.assertEqual(
+            add_one(add_one(add_one(self.n1.inputs.x.value))),
+            self.n3.outputs.y.value,
+            msg="With the connection and signal, we should have pushed downstream "
+                "execution"
+        )
+
+    def test_execute(self):
+        self.n1.outputs.y = 0  # Prime the upstream data source for fetching
+        self.n2 > self.n3
+        self.assertEqual(
+            self.n2.run(fetch_input=False, emit_ran_signal=False, x=10) + 1,
+            self.n2.execute(x=11),
+            msg="Execute should _not_ fetch the upstream data"
+        )
+        self.assertFalse(
+            self.n3.ready,
+            msg="Executing should not be triggering downstream runs, even though we "
+                "made a ran/run connection"
+        )
+
+        self.n2.inputs.x._value = "manually override the desired int"
+        with self.assertRaises(
+            TypeError,
+            msg="Execute should be running without a readiness check and hitting the "
+                "string + int error"
+        ):
+            self.n2.execute()
+
+    def test_pull(self):
+        self.n2 > self.n3
+        self.n1.inputs.x = 0
+        by_run = self.n2.run(
+            run_data_tree=True,
+            fetch_input=True,
+            emit_ran_signal=False
+        )
+        self.n1.inputs.x = 1
+        self.assertEqual(
+            by_run + 1,
+            self.n2.pull(),
+            msg="Pull should be running the upstream node"
+        )
+        self.assertFalse(
+            self.n3.ready,
+            msg="Pulling should not be triggering downstream runs, even though we "
+                "made a ran/run connection"
+        )
+
+    def test___call__(self):
+        # __call__ is just a pull that punches through macro walls, so we'll need to
+        # test it again over in macro to really make sure it's working
+        self.n2 > self.n3
+        self.n1.inputs.x = 0
+        by_run = self.n2.run(
+            run_data_tree=True,
+            fetch_input=True,
+            emit_ran_signal=False
+        )
+        self.n1.inputs.x = 1
+        self.assertEqual(
+            by_run + 1,
+            self.n2(),
+            msg="A call should be running the upstream node"
+        )
+        self.assertFalse(
+            self.n3.ready,
+            msg="Calling should not be triggering downstream runs, even though we "
+                "made a ran/run connection"
+        )
+
+    def test_working_directory(self):
+        self.assertTrue(
+            self.n1._working_directory is None,
+            msg="Sanity check -- No working directory should be made unless asked for"
+        )
+        self.assertFalse(
+            os.path.isdir(self.n1.label),
+            msg="Sanity check -- No working directory should be made unless asked for"
+        )
+        self.assertIsInstance(
+            self.n1.working_directory,
+            DirectoryObject,
+            msg="Directory should be created on first access"
+        )
+        self.assertTrue(
+            str(self.n1.working_directory.path).endswith(self.n1.label),
+            msg="Directory name should be based on the label"
+        )
+        self.assertTrue(
+            os.path.isdir(self.n1.label),
+            msg="Now that we asked for it, it should be there"
+        )
+        self.n1.working_directory.delete()
+        self.assertFalse(
+            os.path.isdir(self.n1.label),
+            msg="Just want to make sure we cleaned up after ourselves"
+        )
+
diff --git a/tests/unit/test_pyiron_workflow.py b/tests/unit/test_pyiron_workflow.py
new file mode 100644
index 00000000..7f1f77e4
--- /dev/null
+++ b/tests/unit/test_pyiron_workflow.py
@@ -0,0 +1,10 @@
+from sys import version_info
+import unittest
+
+
+@unittest.skipUnless(version_info >= (3, 10), "Only supported for 3.10+")
+class TestModule(unittest.TestCase):
+    def test_single_point_of_entry(self):
+        from pyiron_workflow import Workflow
+        # That's it; we just make sure the main class is available at the
+        # topmost level
diff --git a/tests/unit/test_workflow.py b/tests/unit/test_workflow.py
index c54aa2b7..c4c73dc3 100644
--- a/tests/unit/test_workflow.py
+++ b/tests/unit/test_workflow.py
@@ -3,11 +3,8 @@
 from time import sleep
 import unittest
 
-from bidict import ValueDuplicationError
-
 from pyiron_workflow._tests import ensure_tests_in_python_path
 from pyiron_workflow.channels import NotData
-from pyiron_workflow.files import DirectoryObject
 from pyiron_workflow.util import DotDict
 from pyiron_workflow.workflow import Workflow
 
@@ -24,196 +21,58 @@ def setUpClass(cls) -> None:
         ensure_tests_in_python_path()
         super().setUpClass()
 
-    def test_node_addition(self):
-        wf = Workflow("my_workflow")
-
-        # Validate the four ways to add a node
-        wf.add(Workflow.create.Function(plus_one, label="foo"))
-        wf.create.Function(plus_one, label="bar")
-        wf.baz = wf.create.Function(plus_one, label="whatever_baz_gets_used")
-        Workflow.create.Function(plus_one, label="qux", parent=wf)
-        self.assertListEqual(list(wf.nodes.keys()), ["foo", "bar", "baz", "qux"])
-        wf.boa = wf.qux
-        self.assertListEqual(
-            list(wf.nodes.keys()),
-            ["foo", "bar", "baz", "boa"],
-            msg="Reassignment should remove the original instance"
-        )
-
-        wf.strict_naming = False
-        # Validate name incrementation
-        wf.add(Workflow.create.Function(plus_one, label="foo"))
-        wf.create.Function(plus_one, label="bar")
-        wf.baz = wf.create.Function(
-            plus_one,
-            label="without_strict_you_can_override_by_assignment"
-        )
-        Workflow.create.Function(plus_one, label="boa", parent=wf)
-        self.assertListEqual(
-            list(wf.nodes.keys()),
-            [
-                "foo", "bar", "baz", "boa",
-                "foo0", "bar0", "baz0", "boa0",
-            ]
-        )
-
-        with self.subTest("Make sure trivial re-assignment has no impact"):
-            original_foo = wf.foo
-            n_nodes = len(wf.nodes)
-            wf.foo = original_foo
-            self.assertIs(
-                original_foo,
-                wf.foo,
-                msg="Reassigning a node to the same name should have no impact",
-            )
-            self.assertEqual(
-                n_nodes,
-                len(wf.nodes),
-                msg="Reassigning a node to the same name should have no impact",
-            )
-
-        with self.subTest("Make sure strict naming causes a bunch of attribute errors"):
-            wf.strict_naming = True
-            # Validate name preservation
-            with self.assertRaises(AttributeError):
-                wf.add(wf.create.Function(plus_one, label="foo"))
-
-            with self.assertRaises(AttributeError):
-                wf.create.Function(plus_one, label="bar")
-
-            with self.assertRaises(AttributeError):
-                wf.baz = wf.create.Function(plus_one, label="whatever_baz_gets_used")
-
-            with self.assertRaises(AttributeError):
-                Workflow.create.Function(plus_one, label="boa", parent=wf)
-
-    def test_node_removal(self):
-        wf = Workflow("my_workflow")
-        wf.owned = Workflow.create.Function(plus_one)
-        node = Workflow.create.Function(plus_one)
-        wf.foo = node
-        # Add it to starting nodes manually, otherwise it's only there at run time
-        wf.starting_nodes = [wf.foo]
-        # Connect it inside the workflow
-        wf.foo.inputs.x = wf.owned.outputs.y
-
-        wf.remove(node)
-        self.assertIsNone(node.parent, msg="Removal should de-parent")
-        self.assertFalse(node.connected, msg="Removal should disconnect")
-        self.assertListEqual(
-            wf.starting_nodes,
-            [],
-            msg="Removal should also remove from starting nodes"
-        )
-
-    def test_node_packages(self):
-        wf = Workflow("my_workflow")
-        wf.register("demo", "static.demo_nodes")
-
-        # Test invocation
-        wf.create.demo.OptionallyAdd(label="by_add")
-        # Test invocation with attribute assignment
-        wf.by_assignment = wf.create.demo.OptionallyAdd()
-
-        self.assertSetEqual(
-            set(wf.nodes.keys()),
-            set(["by_add", "by_assignment"]),
-            msg=f"Expected one node label generated automatically from the class and "
-                f"the other from the attribute assignment, but got {wf.nodes.keys()}"
-        )
+    def test_io(self):
+        wf = Workflow("wf")
+        wf.create.Function(plus_one, label="n1")
+        wf.create.Function(plus_one, label="n2")
+        wf.create.Function(plus_one, label="n3")
 
-    def test_double_workfloage_and_node_removal(self):
-        wf1 = Workflow("one")
-        wf1.create.Function(plus_one, label="node1")
-        node2 = Workflow.create.Function(
-            plus_one, label="node2", parent=wf1, x=wf1.node1.outputs.y
-        )
-        self.assertTrue(node2.connected)
-
-        wf2 = Workflow("two")
-        with self.assertRaises(ValueError):
-            # Can't belong to two workflows at once
-            wf2.add(node2)
-        disconnections = wf1.remove(node2)
-        self.assertFalse(node2.connected, msg="Removal should first disconnect")
-        self.assertListEqual(
-            disconnections,
-            [(node2.inputs.x, wf1.node1.outputs.y)],
-            msg="Disconnections should be returned by removal"
+        inp = wf.inputs
+        inp_again = wf.inputs
+        self.assertIsNot(
+            inp, inp_again, msg="Workflow input should always get rebuilt"
         )
-        wf2.add(node2)
-        self.assertEqual(node2.parent, wf2)
 
-        node1 = wf1.node1
-        disconnections = wf1.remove(node1.label)
+        n_in = len(wf.inputs)
+        n_out = len(wf.outputs)
+        wf.create.Function(plus_one, label="n4")
         self.assertEqual(
-            node1.parent,
-            None,
-            msg="Should be able to remove nodes by label as well as by object"
+            n_in + 1, len(wf.inputs), msg="Workflow IO should be drawn from its nodes"
         )
-        self.assertListEqual(
-            [],
-            disconnections,
-            msg="node1 should have no connections left"
+        self.assertEqual(
+            n_out + 1, len(wf.outputs), msg="Workflow IO should be drawn from its nodes"
         )
 
-    def test_workflow_io(self):
-        wf = Workflow("wf")
-        wf.create.Function(plus_one, label="n1")
-        wf.create.Function(plus_one, label="n2")
-        wf.create.Function(plus_one, label="n3")
-
-        with self.subTest("Workflow IO should be drawn from its nodes"):
-            self.assertEqual(len(wf.inputs), 3)
-            self.assertEqual(len(wf.outputs), 3)
-
+        n_in = len(wf.inputs)
+        n_out = len(wf.outputs)
         wf.n3.inputs.x = wf.n2.outputs.y
         wf.n2.inputs.x = wf.n1.outputs.y
+        self.assertEqual(
+            n_in - 2, len(wf.inputs), msg="New connections should get reflected"
+        )
+        self.assertEqual(
+            n_out - 2, len(wf.outputs), msg="New connections should get reflected"
+        )
 
-        with self.subTest("Only unconnected channels should count"):
-            self.assertEqual(len(wf.inputs), 1)
-            self.assertEqual(len(wf.outputs), 1)
-
-        with self.subTest(
-                "IO should be re-mappable, including exposing internally connected "
-                "channels"
-        ):
-            wf.inputs_map = {"n1__x": "inp"}
-            wf.outputs_map = {"n3__y": "out", "n2__y": "intermediate"}
-            out = wf(inp=0)
-            self.assertEqual(out.out, 3)
-            self.assertEqual(out.intermediate, 2)
-
-    def test_node_decorator_access(self):
-        @Workflow.wrap_as.function_node("y")
-        def plus_one(x: int = 0) -> int:
-            return x + 1
-
-        self.assertEqual(plus_one().run(), 1)
+        wf.inputs_map = {"n1__x": "inp"}
+        self.assertIs(wf.n1.inputs.x, wf.inputs.inp, msg="IO should be renamable")
 
-    def test_working_directory(self):
-        wf = Workflow("wf")
-        self.assertTrue(wf._working_directory is None)
-        self.assertIsInstance(wf.working_directory, DirectoryObject)
-        self.assertTrue(str(wf.working_directory.path).endswith(wf.label))
-        wf.create.Function(plus_one)
-        self.assertTrue(
-            str(wf.plus_one.working_directory.path).endswith(wf.plus_one.label)
+        self.assertNotIn(wf.n2.outputs.y, wf.outputs, msg="Ensure starting condition")
+        self.assertIn(wf.n3.outputs.y, wf.outputs, msg="Ensure starting condition")
+        wf.outputs_map = {"n3__y": None, "n2__y": "intermediate"}
+        self.assertIn(wf.n2.outputs.y, wf.outputs, msg="IO should be exposable")
+        self.assertIs(
+            wf.n2.outputs.y, wf.outputs.intermediate, msg="IO should be by reference"
         )
-        wf.working_directory.delete()
+        self.assertNotIn(wf.n3.outputs.y, wf.outputs, msg="IO should be hidable")
 
-    def test_no_parents(self):
+    def test_is_parentmost(self):
         wf = Workflow("wf")
         wf2 = Workflow("wf2")
-        wf2.parent = None  # Is already the value and should ignore this
-        with self.assertRaises(TypeError):
-            # We currently specify workflows shouldn't get parents, this just verifies
-            # the spec. If that spec changes, test instead that you _can_ set parents!
-            wf2.parent = "not None"
 
         with self.assertRaises(TypeError):
             # Setting a non-None value to parent raises the type error from the setter
-            wf2.parent = wf
+            wf.sub_wf = wf2
 
     def test_with_executor(self):
 
@@ -357,13 +216,10 @@ def test_return_value(self):
             self.assertEqual(
                 return_on_explicit_run["b__y"],
                 2 + 2,
-                msg="On explicit run, the most recent input data should be used and the "
-                    "result should be returned"
+                msg="On explicit run, the most recent input data should be used and "
+                    "the result should be returned"
             )
 
-        # Note: We don't need to test running on an executor, because Workflows can't
-        #       do that yet
-
     def test_execution_automation(self):
         @Workflow.wrap_as.single_value_node("out")
         def foo(x, y):
@@ -436,52 +292,6 @@ def matches_expectations(results):
             with self.assertRaises(ValueError):
                 cyclic()
 
-    def test_io_label_maps_are_bijective(self):
-
-        with self.subTest("Null case"):
-            Workflow(
-                "my_workflow",
-                Workflow.create.Function(plus_one, label="foo1"),
-                Workflow.create.Function(plus_one, label="foo2"),
-                inputs_map={
-                    "foo1__x": "x1",
-                    "foo2__x": "x2"
-                },
-                outputs_map=None
-            )
-
-        with self.subTest("At instantiation"):
-            with self.assertRaises(ValueDuplicationError):
-                Workflow(
-                    "my_workflow",
-                    Workflow.create.Function(plus_one, label="foo1"),
-                    Workflow.create.Function(plus_one, label="foo2"),
-                    inputs_map={
-                        "foo1__x": "x",
-                        "foo2__x": "x"
-                    }
-                )
-
-        with self.subTest("Post-facto assignment"):
-            wf = Workflow(
-                "my_workflow",
-                Workflow.create.Function(plus_one, label="foo1"),
-                Workflow.create.Function(plus_one, label="foo2"),
-            )
-            wf.outputs_map = None
-            with self.assertRaises(ValueDuplicationError):
-                wf.inputs_map = {"foo1__x": "x", "foo2__x": "x"}
-
-        with self.subTest("Post-facto update"):
-            wf = Workflow(
-                "my_workflow",
-                Workflow.create.Function(plus_one, label="foo1"),
-                Workflow.create.Function(plus_one, label="foo2"),
-            )
-            wf.inputs_map = {"foo1__x": "x1", "foo2__x": "x2"}
-            with self.assertRaises(ValueDuplicationError):
-                wf.inputs_map["foo2__x"] = "x1"
-
     def test_pull_and_executors(self):
         def add_three_macro(macro):
             macro.one = Workflow.create.SingleValue(plus_one)