diff --git a/user-guide/docs/tools/jupyterhub/imgs/terminal.png b/user-guide/docs/tools/jupyterhub/imgs/terminal.png
new file mode 100644
index 00000000..388491bf
Binary files /dev/null and b/user-guide/docs/tools/jupyterhub/imgs/terminal.png differ
diff --git a/user-guide/docs/tools/jupyterhub/jupyterhub.md b/user-guide/docs/tools/jupyterhub/jupyterhub.md
index 47a167f0..9a40906f 100644
--- a/user-guide/docs/tools/jupyterhub/jupyterhub.md
+++ b/user-guide/docs/tools/jupyterhub/jupyterhub.md
@@ -87,9 +87,57 @@ Ephemoral or temporary user installations of Python packages is the preferrred a
#### Custom User-Defined Kernels { #installing-kernels }
-The objective of custom user-defined kernels is to allow users to build, customize, and share entire Python kernels to enable highly-customized Jupyter workflows and greater scientific reproducability. Each kernel includes their own Python interpreter and any number of user-selected Python packages installed via pip or conda. By being able to create and share their Python kernels, resarchers are able to easily create, share, and publish their development enviroments alongside their software avoiding any potential issue related to the environment and the dreaded "It works on my machine" issue.
+The objective of custom user-defined kernels is to allow users to build, customize, and share entire Python kernels to enable highly customized Jupyter workflows and greater scientific reproducibility. Each kernel includes its own Python interpreter and any number of user-selected Python packages installed via pip or conda. By being able to create and share their Python kernels, researchers can easily create, share, and publish their development environments alongside their software, avoiding environment-related problems and the dreaded "It works on my machine" issue.
+
+User-defined kernels are supported in JupyterLab by creating conda environments. These steps are performed using a terminal within the Jupyter Notebook as follows:
+
+Open a new Terminal in your JupyterLab session (go to New Launcher/Other/Terminal); see the figure below.
+
+![Launching a terminal on JupyterLab](./imgs/terminal.png)
+Figure 6. Launching a terminal on JupyterLab
+
+Create a directory to store your custom Python environments. For example, running the code below in the Terminal creates the Python_Envs folder in MyData. The second line configures conda to search this directory, in addition to its default directories, when listing and creating environments.
+
+```bash
+mkdir ~/MyData/Python_Envs
+conda config --add envs_dirs ~/MyData/Python_Envs
+```
+
+Create a new environment using the following command, replacing your_environment with a name of your choice:
+
+```bash
+conda create --name your_environment -y -c conda-forge pip python
+conda activate your_environment
+```
+
+Note: you can create environments with specific versions of Python or specific packages. For more information, check this link.
+
+Install your environment as a Jupyter kernel:
+
+```bash
+pip install ipykernel
+ipython kernel install --user --name=your_environment
+```
+
+Install the packages of your choice using standard pip or conda syntax (see the example below).
+
+```bash
+conda install networkx
+pip install tensorflow
+```
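+
+To confirm that the environment and kernel were registered, you can run a quick optional check from the same Terminal (shown with the example name used above):
+
+```bash
+# List conda environments; your_environment should appear under ~/MyData/Python_Envs
+conda env list
+# List the Jupyter kernel specs visible to your account; your_environment should be included
+jupyter kernelspec list
+```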
+
+Once you have created or added a new kernel, it will become selectable alongside the base Python 3, Julia, and R kernels in the Launcher tab. You may need to wait a few seconds or refresh the browser to see the newly available kernel. Note that you can create as many kernels as you like to manage your various projects and their dependencies on DesignSafe.
+
+The Jupyter session will end after a few days without any activity or when you shut down your server ("File" > "Hub Control Panel" > "Stop My Server" > "Log Out"). In such cases, your user-defined kernels will not be immediately visible on restart. To enable the custom Python environments, you must re-add each Python environment to your Jupyter kernel spec. To do so, open a Terminal (go to New Launcher/Other/Terminal) and run the following commands:
+
+```bash
+conda config --add envs_dirs ~/MyData/Python_Envs
+conda activate your_environment
+ipython kernel install --user --name=your_environment
+```
+
+If you do not see your kernels reappear, wait a few seconds, refresh your browser, and return to the Launcher tab.
-User-defined kernels are supported in the Updated Jupyter Image using the kernelutility Python package. To get started you will need to install the kernelutility which you can do using pip (e.g., in Jupyter "!pip install kernelutility", do not forget to restart your notebook after the installation is complete for Python to be able to see the new installation). To start using the kernelutility, run "from kernelutility import kernelset". Importing the kernelset will restore any kernels you created previously and allow you to manage your kernels via Python. The kernelset instance of the KernelSet class has four basic methods: create - that allows you to create new kernels, destroy - that allows you destroy previously created kernels, add - that allows you to add an existing kernel from a specified path by making a local copy, and remove - that allows you to remove a previously added or created kernel. Note that remove is similar to destroy except that it does not clean up the kernel's files on disk such that it can be added again later if desired. Once you have created or added a new kernel those will become selectable alongside the base Python 3, Julia, and R kernels in the Launcher tab. Note that you can create as many kernels as you like to manage your various projects and their dependencies on DeisgnSafe. When you shutdown your server your user-defined kernels will not be immediately visible on restart, to activate them all you need to do is open a Jupyter notebook and run "from kernelutility import kernelset". If you do not see your kernels reappear wait a few seconds, refresh your browser, and return to the Launcher tab. If you have any issues using DesignSafe's JupyterHub, please create a ticket (https://designsafe-ci.org/help).
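+
+If you no longer need a kernel, you can remove it and delete its conda environment from a Terminal. This is a minimal cleanup sketch, assuming the example environment name used above:
+
+```bash
+# Unregister the kernel from Jupyter (asks for confirmation)
+jupyter kernelspec remove your_environment
+# Delete the conda environment and all of its packages
+conda remove -y --name your_environment --all
+```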
diff --git a/user-guide/docs/tools/simulation/imgs/new_conda_environment.png b/user-guide/docs/tools/simulation/imgs/new_conda_environment.png
new file mode 100644
index 00000000..de7b9ef4
Binary files /dev/null and b/user-guide/docs/tools/simulation/imgs/new_conda_environment.png differ
diff --git a/user-guide/docs/tools/simulation/imgs/terminal.png b/user-guide/docs/tools/simulation/imgs/terminal.png
new file mode 100644
index 00000000..7a54c837
Binary files /dev/null and b/user-guide/docs/tools/simulation/imgs/terminal.png differ
diff --git a/user-guide/docs/tools/simulation/in-core.md b/user-guide/docs/tools/simulation/in-core.md
index f56ebd7a..f8328caf 100644
--- a/user-guide/docs/tools/simulation/in-core.md
+++ b/user-guide/docs/tools/simulation/in-core.md
@@ -2,13 +2,19 @@
The Interdependent Networked Community Resilience Modeling Environment (IN-CORE) platform, in continuous development by the Center of Excellence for Risk-Based Community Resilience Planning (CoE), is the result of a multi-university research center funded by The National Institute of Standards and Technology (NIST). Moreover, the platform is intended to offer the potential for community contributed code as resilience modeling research evolves. The platform focuses on measurement science to support community resilience assessment through a risk-based approach to support decision-making for definition, prioritization, and comparison of resilience strategies at the community level.
-The IN-CORE platform's main analysis tools correspond to the Python libraries [pyincore](https://incore.ncsa.illinois.edu/doc/incore/pyincore.html) and [pyincore-viz](https://incore.ncsa.illinois.edu/doc/incore/pyincore_viz.html). Users can access these using [IN-CORE lab](https://incore.ncsa.illinois.edu/doc/incore/incore_lab.html) (hosted on the NCSA cloud system) or by installing the Python libraries on local computers; the latter allows the user to run the job locally or submit the job through the [NCSA](https://www.ncsa.illinois.edu/) cloud system.
-This user guide presents how to launch IN-CORE with DesignSafe resources, leveraging the computational capabilities within the DesignSafe Cyberinfrastructure. Moreover, advantages of launching IN-CORE within DesignSafe include the potential to integrate shared data, streamline data curation and publication of results that emerge from simulation with IN-CORE, or even couple IN-CORE simulations and codes with those from other DesignSafe tools and resources.
+### **Pyincore**, the Python package to access and run IN-CORE models
-### IN-CORE on DesignSafe Cyberinfrastructure (DesignSafe-CI)
+The IN-CORE platform's main analysis and visualization tools correspond to the Python libraries [pyincore](https://incore.ncsa.illinois.edu/doc/incore/pyincore.html) and [pyincore-viz](https://incore.ncsa.illinois.edu/doc/incore/pyincore_viz.html).
+Users can access these using [IN-CORE lab](https://incore.ncsa.illinois.edu/doc/incore/incore_lab.html) (hosted on the NCSA cloud system) or by installing the Python libraries on local computers; the latter allows the user to run the job locally or submit the job through the [NCSA](https://www.ncsa.illinois.edu/) cloud system.
-The JupyterLab shell on DesignSafe-CI can be used to access the pyincore and pyincore-viz functions on DesignSafe-CI. Computational capabilities within the DesignSafe-CI are leveraged to enhance the regional-scale assessment tools within IN-CORE. DesignSafe users can also use the seamless communication of intermediate and final results from IN-CORE python packages with other DesignSafe tools through the DesignSafe-CI Jupyter Notebooks and Data Depot repositories. For example, high-fidelity hazard estimates can be obtained from different resources at DesignSafe and used as input data for risk and resilience analysis using IN-CORE Python packages. Monte Carlo simulations or optimization can be run leveraging the HPC resources of DesignSafe. The interaction between the data archived in Data Depot, tools and applications’ workflow in DesignSafe-CI, and the use of IN-CORE tools through JupyterLab allows the users to create different roadmaps for analysis, visualization, and results publication to advance the field of regional-scale community resilience estimation.
+This user guide presents how to use **pyincore** with DesignSafe resources, leveraging the computational capabilities within the DesignSafe Cyberinfrastructure. Moreover, advantages of launching **pyincore** within DesignSafe include the potential to integrate shared data, streamline data curation and publication of results that emerge from simulation with IN-CORE, or even couple IN-CORE outputs and codes with those from other DesignSafe tools and resources.
+
+### Pyincore on DesignSafe Cyberinfrastructure (DesignSafe-CI)
+
+The JupyterLab shell on DesignSafe-CI can be used to access the pyincore and pyincore-viz functions on DesignSafe-CI.
+Computational capabilities within the DesignSafe-CI are leveraged to enhance the regional-scale assessment tools within IN-CORE.
+DesignSafe users can also use the seamless communication of intermediate and final results from IN-CORE Python packages with other DesignSafe tools through the DesignSafe-CI Jupyter Notebooks and Data Depot repositories. For example, high-fidelity hazard estimates can be obtained from different resources at DesignSafe and used as input data for risk and resilience analysis using IN-CORE Python packages. Monte Carlo simulations or optimization can be run leveraging the HPC resources of DesignSafe. The interaction between the data archived in Data Depot, tools and applications’ workflow in DesignSafe-CI, and the use of IN-CORE tools through JupyterLab allows users to create different roadmaps for analysis, visualization, and results publication to advance the field of regional-scale community resilience estimation.
Using a client-based development, IN-CORE Python libraries can connect directly to the NCSA cloud system to retrieve published models and run analyses. However, to leverage the resources at DesignSafe-CI, the client mode must be disabled (more information is presented below), and the models must be created “locally” (on DesignSafe-CI JupyterHub).
@@ -19,7 +25,7 @@ The user can install pyincore using any of these two options:
1) the [temporary user installation](#Title1.1)
2) creating a [specific kernel for pyincore](#Title1.2)
-While option 1 may be faster, option 2 corresponds to the formal (recommended) approach for installing the IN-CORE packages. Additionally, some related packages to pyincore, e.g. pyincore-viz, may present installation conflicts when using the temporary option (option 1). For more information about installing Python libraries on DesignSafe-CI, refer to [Installing Packages](https://www.designsafe-ci.org/user-guide/tools/jupyterhub/#installing).
+While option 1 may be faster, option 2 corresponds to the formal (recommended) approach for installing the IN-CORE Python packages. Additionally, some packages related to pyincore, e.g., pyincore-viz, may present installation conflicts when using the temporary option (option 1). For more information about installing Python libraries on DesignSafe-CI, refer to [Installing Packages](https://www.designsafe-ci.org/user-guide/tools/jupyterhub/#installing).
To start, access DesignSafe JupyterHub via the DesignSafe-CI. Select "Tools & Applications" > "Analysis" > "Jupyter". When asked to select a notebook image, select the “Updated Jupyter Image” and click “Start My Server”.
![Figure 1. Access to the JupyterHub on DesignSafe-CI](./imgs/in-core-1.png)
@@ -36,50 +42,49 @@ After this, you may need to restart your kernel (click on Kernel/Restart Kernel
#### Installing pyincore creating a new environment (recommended)
-To install the maintained version of the pyincore and the pyincore-viz packages, a particular environment using `conda` must be created. This step requires installing the `kernelutility` Python package as follows:
-
-```python
-!pip3 -q install kernelutility
-```
-After this, you may need to restart your kernel (click on Kernel/Restart Kernel and Clear All Outputs). For more information on the use of `kernelutility` refer to [Custom User-Defined Kernels](https://www.designsafe-ci.org/user-guide/tools/jupyterhub/#installing-kernels).
+To install the maintained version of the pyincore and the pyincore-viz packages, a dedicated conda environment must be created. If you have not created a custom Python environment before, we recommend following the steps presented in the guide for installing Custom User-Defined Kernels. These steps are performed using a terminal within the Jupyter Notebook.
-Next, use the `kernelutility` package to create a sharable kernel supported by the Updated Jupyter Image on DesignSafe. Using the following command, create a new environment called 'pyincore_on_DS':
+Using this guide, create a new environment called _pyincore_on_DS_ and install it as a Jupyter kernel. Then, install the pyincore and pyincore-viz libraries. The steps are presented below; a detailed explanation of user-defined kernels can be found at the link above.
-```python
-from kernelutility import kernelset
-kernelset.create('pyincore_on_DS')
+```bash
+conda create --name pyincore_on_DS -y -c conda-forge python=3.10 pip
+conda activate pyincore_on_DS
+pip install ipykernel
+ipython kernel install --user --name=pyincore_on_DS
+conda install -c in-core pyincore
+conda install -c in-core pyincore-viz
```
-After this step, that is, the previous cell has finished running, select the newly created environment in the "switch kernel" panel (right upper corner of the notebook, as shown in Figure 2). Select specifically the one with the name **Python: [conda env:pyincore_on_DS]**. Then, restart the kernel (click on Kernel/Restart Kernel and Clear All Outputs).
-![Figure 2. Selecting the newly created conda environment](./imgs/in-core-2.png)
+After these steps, the new environment should appear in your Launcher.
+It must also appear in the "switch kernel" panel (upper right corner of the notebook, as shown in Figure 2). Select the newly created environment; it appears as `[conda env:pyincore_on_DS]`.
+
+![Figure 2. Selecting the newly created conda environment](./imgs/new_conda_environment.png)
*Figure 2. Selecting the newly created conda environment*
-Use the `%conda install` command to install pyincore and pyincore-viz and the recently created environment.
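+
+Optionally, you can verify from the Terminal that the packages were installed in the new environment before switching to it (a quick check using the environment name from this guide):
+
+```bash
+# Should list pyincore and pyincore-viz with their installed versions
+conda list -n pyincore_on_DS pyincore
+```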
-```python
-%conda install -c in-core pyincore
-%conda install -c in-core pyincore-viz
-```
+Then, restart the kernel (click on Kernel/Restart Kernel and Clear All Outputs).
-At this point, you have created a new environment, installed pyincore and pyincore-viz with their respective dependencies, and one last restart of the kernel is required. This created environment can be accessed throughout the current and future sessions.
+#### Reproducibility after shutting down your server (if you created a new Python environment)
-#### Reproducibility after shutting down your server (if you installed pyincore using kernelutility)
+The Jupyter session will end after a few days without any activity or when the user decides to shut down the server ("File" > "Hub Control Panel" > "Stop My Server" > "Log Out").
+In such cases, the next time the user accesses JupyterLab, the previously created environments will not be immediately visible.
+To enable the custom Python environments, you must re-add the Python environment to your Jupyter kernel spec. Open a Terminal (go to New Launcher/Other/Terminal) and run the following commands:
-The Jupyter Session will be ended after a few days without any activity or when the user has decided to shut down the server ("File" > "Hub Control Panel" > "Stop My Server" > "Log Out"). In such case, the next time the user accesses the Updated Jupyter Image, the user-defined kernels (pre-existing conda environments, such as the newly created environment 'pyincore_on_DS') will not be immediately visible. If this happens, you will have to run the following commands:
-```python
-!pip -q install kernelutility
-from kernelutility import kernelset
+```bash
+conda config --add envs_dirs ~/MyData/Python_Envs
+conda activate pyincore_on_DS
+ipython kernel install --user --name=pyincore_on_DS
```
-After waiting a few seconds, the pre-existing user-defined kernels may appear after clicking on the "switch kernel" panel (right upper corner, as shown in Figure 2). If not, refresh your browser and check the "switch kernel" panel again.
-For more information on accessing created environments, refer to [Custom User-Defined Kernels](https://www.designsafe-ci.org/user-guide/tools/jupyterhub/#installing-kernels).
+After waiting a few seconds, the pre-existing user-defined kernels should appear when you click on the "switch kernel" panel (upper right corner, as shown in Figure 3). If not, refresh your browser and check the "switch kernel" panel again.
### Example: IN-CORE tools within DesignSafe-CI
-The following example leverages the use case published in the Data Depot as [PRJ-4675 “IN-CORE on DesignSafe”](https://doi.org/10.17603/ds2-cx62-ve21). The notebook presents a use case focused on the risk analysis of a regional scale portfolio of bridges exposed to seismic events. The goal of this use case is to show the interaction of DesignSafe with IN-CORE Python tools. You can copy this folder to your “My Data” folder to enable editing permission, thus enabling working directly on the Jupyter Notebook. To access to the main Jupyter notebook of the published use case (called **main.ipynb**), click on the button below.
+The following example leverages the use case published in the Data Depot as [PRJ-4675 “IN-CORE on DesignSafe”](https://doi.org/10.17603/ds2-cx62-ve21).
+The notebook presents a use case focused on the risk analysis of a regional-scale portfolio of bridges exposed to seismic events. The goal of this use case is to show the interaction of DesignSafe with IN-CORE Python tools.
+You can copy this folder to your “My Data” folder to enable editing permission, so you can work directly on the Jupyter Notebook.
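+
+If you prefer the command line, the copy can also be made from a JupyterLab Terminal. The sketch below assumes the published project is mounted under ~/NHERI-Published and that My Data corresponds to ~/MyData in your session:
+
+```bash
+# Copy the published project into My Data so that the files become writable
+cp -r ~/NHERI-Published/PRJ-4675v2 ~/MyData/
+```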
+To access the main Jupyter notebook of the published use case (called **main.ipynb**), click on the button below.
-[![Open in DesignSafe](https://raw.githubusercontent.com/geoelements/LearnMPM/main/DesignSafe-Badge.svg)](https://jupyter.designsafe-ci.org/hub/user-redirect/lab/tree/NHERI-Published/PRJ-4675/main.ipynb)
+[![Open in DesignSafe](https://raw.githubusercontent.com/geoelements/LearnMPM/main/DesignSafe-Badge.svg)](https://jupyter.designsafe-ci.org/hub/user-redirect/lab/tree/NHERI-Published/PRJ-4675v2/main.ipynb)
For more information about advanced analyses in IN-CORE, including housing unit allocation, population dislocation evaluation, recovery analyses, and computable general equilibrium modeling for quantifying community-level recovery, the reader is referred to the IN-CORE user documentation at the [IN-CORE website](https://incore.ncsa.illinois.edu/doc/incore/introduction.html).
@@ -170,7 +175,7 @@ fragility_class1 = FragilityCurveSet.from_json_str(definition_frag1)
plt = plot.get_fragility_plot(fragility_class1, start=0, end=1.5)
```
-![Figure 4. Univariate visualization of the created fragility functions](./imgs/in-core-4.jpg)
+![Figure 4. Visualization of the created fragility functions](./imgs/in-core-4.jpg)
*Figure 4. Univariate visualization of the created fragility functions*
diff --git a/user-guide/docs/usecases/padgett/usecase_JN_viz.md b/user-guide/docs/usecases/padgett/usecase_JN_viz.md
index a7f0b09c..0e31f60a 100644
--- a/user-guide/docs/usecases/padgett/usecase_JN_viz.md
+++ b/user-guide/docs/usecases/padgett/usecase_JN_viz.md
@@ -12,7 +12,20 @@ Jupyter notebook for visualization of spatially distributed data in risk and res
_Keywords: visualization; risk and resilience; infrastructure systems; static, interactive, and animated maps and figures, effective communication_
### Resources
-
+
+
+
#### Jupyter Notebooks
The following Jupyter notebook is the basis for the use case described in this section. You can access and run it directly on DesignSafe by clicking on the "Open in DesignSafe" button.
@@ -52,7 +65,7 @@ This use case adopts a representative hazard and distributed infrastructure syst
To start working with this use case, open the Jupyter Notebook on the published project using the button below (*same notebook as above*).
-[![Open In DesignSafe](https://raw.githubusercontent.com/geoelements/LearnMPM/main/DesignSafe-Badge.svg)](https://jupyter.designsafe-ci.org/user/silvia/lab/workspaces/auto-k/tree/NHERI-Published/PRJ-3939v3/visualization_risk_resilience.ipynb)
+[![Open In DesignSafe](https://raw.githubusercontent.com/geoelements/LearnMPM/main/DesignSafe-Badge.svg)](https://jupyter.designsafe-ci.org/hub/user-redirect/lab/tree/NHERI-Published/PRJ-3939v3/visualization_risk_resilience.ipynb)
It may be necessary to click on "Run">"Run All Cells" to allow the visualization of some of the interactive figures. Note: Some cells are used to save figures, which will present an error because the published notebook is in a "Read Only" folder. To run these specific cells or save customized figures, copy the notebook and the input files to your "My Data" folder, as explained below.
@@ -101,7 +114,7 @@ These interactive maps are useful in situations when data visualization is not f In this use case, damage state exceedance probabilities are obtained for each bridge (i.e., considered as an outcome of applying fragility models to the hazard scenario). An example of this output is presented in the ‘bridge_result.csv’ file for one hazard scenario. In this file format, damage state exceedance probabilities $\mathbb{P}(DS \geq ds_i)$ are named as "$LS_i$", for $i=1,…,4$; similarly, probabilities of being in a damage state $\mathbb{P}(DS = ds_i)$ are named as "$DS_i$", for $i=1,…,4$. For such cases, there may be interest in visualizing the spatial distribution of damage to infrastructure components. -Here, interactive Python libraries are used to visualize and inspect fine information on the different components that comprise the map, such as bridge location, basic information, and damage condition (see [Figure 3](#Fig3)). These interactive functionalities are integrated using Python libraries such as Plotly and Folium; these allow the user to pan over the different geospatially distributed systems and inspect the region or assets of interest. Also, these enable the user to construct icon objects that display data of interest (e.g. the ‘construction year’ and the ‘exceeding probability of damage state 3’ in [Figure 3a](#Fig3)) when hovering over the bridge locations. If additional data is also important to display (e.g. hazard intensity, link, or bridge IDs, among others), ‘pop-up’ functionalities can be used to present this information when the user clicks on a particular object (shown in [Figure 3b](#Fig3)). +Here, interactive Python libraries are used to visualize and inspect fine information on the different components that comprise the map, such as bridge location, basic information, and damage condition (see [Figure 3](#Fig3)). These interactive functionalities are integrated using Python libraries such as Plotly and Folium; these allow the user to pan over the different geospatially distributed systems and inspect the region or assets of interest. Also, these enable the user to construct icon objects that display data of interest (e.g., the ‘construction year’ and the ‘exceeding probability of damage state 3’ in [Figure 3a](#Fig3)) when hovering over the bridge locations. If additional data is also important to display (e.g., hazard intensity, link, or bridge IDs, among others), ‘pop-up’ functionalities can be used to present this information when the user clicks on a particular object (shown in [Figure 3b](#Fig3)). As shown in this use case, interactive maps can be enhanced by handling the icons, points, and link characteristics such as type, icon figure, color, etc. [Figure 3](#Fig3) presents the bridge condition using a common color coding related to post-hazard tagging. Red tag is used here when $\mathbb{P}(DS≥ds_3 )≥0.15$, yellow tag is used if $0.05≤\mathbb{P}(DS≥ds_3 )<0.15$, and green tag is used if $\mathbb{P}(DS≥ds_3)<0.05$; note that these limits have been arbitrarily selected for display purposes. Moreover, objects such as legends and color bars can be easily included in such interactive maps to add additional layers of information. Given the possibility of presenting the data "online", these are very useful tools for communication with stakeholders, inspection teams, or simply for data analysis during damage simulation or recovery processes.