From 5a062db3d0d0fa7214a1d00081f03628051ae985 Mon Sep 17 00:00:00 2001 From: Gillian Petro <96886803+gspetro-NOAA@users.noreply.github.com> Date: Tue, 21 Jun 2022 10:17:55 -0400 Subject: [PATCH] [release/public-v2] Penultimate documentation edits (#282) * update hpc-stack module docs & MacOS config.sh * update machine file instructions * updates to BuildRun chapter * fix typo * merge changes from PR #240 to release-v2 * resolve conflicts w/PR #281 * I/O updates to wget commands * Update documentation for CSV file containing WE2E test info (#278) * Edits to documentation to match latest in code. * Edits to documentation to match latest in code. * Minor changes to documentation. * update HPSS file names * add links, remove comments * add links, remove comments * FAQ updates * ContribGuide updates * Rocoto ch updates * update wflow gen image * update LAM grid chapter * update docs links * update NC QS * glossary updates * add Stochastic Physics info * format valid values * edits for Config Params * final Config Param updates * Graphics edits * remove WCOSS info * Intro updates pt1 * Docs for Linux SRW build and run * Remove WCOSS-specifics * miscellaneous changes * Intro & QS updates * minor fixes * I/O updates * I/O updates * update RW hash * update file paths to NaturalEarth * change ESFM docs version * change ESFM docs version * finish I/O updates * update paths from develop to v2p0 * Update BuildRunSRW.rst "wcoss_dell_p3" is a valid value * Update ConfigWorkflow.rst placed "WCOSS_DELL_P3" as a valid value * updates to WE2E * WE2E updates * final WE2E updates * update METplus support info * update NC-QS Guide * QS updates * fix WE2E tables * add WE2E test files * WE2E test edits * WE2E table updates * QS & updates to config_defaults table in build/run * QS modulefiles load * fix typo * update mac/linux section of build/run * fix image * fix image * typo * update CCPP link * adjust Mac/Linux organization * mark's edits pt1 * mac/linux reorg * minor edits Co-authored-by: gspetro Co-authored-by: Will Mayfield <59745143+willmayfield@users.noreply.github.com> Co-authored-by: Natalie Perlin <68030316+natalie-perlin@users.noreply.github.com> Co-authored-by: gsketefian <31046882+gsketefian@users.noreply.github.com> Co-authored-by: Natalie Perlin --- README.md | 6 +- docs/UsersGuide/source/BuildRunSRW.rst | 933 ++++++++++-------- docs/UsersGuide/source/CompleteTests.csv | 28 - docs/UsersGuide/source/Components.rst | 33 +- docs/UsersGuide/source/ConfigWorkflow.rst | 399 ++++---- docs/UsersGuide/source/ContributorsGuide.rst | 36 +- docs/UsersGuide/source/FAQ.rst | 50 +- docs/UsersGuide/source/Glossary.rst | 75 +- docs/UsersGuide/source/Graphics.rst | 136 ++- docs/UsersGuide/source/InputOutputFiles.rst | 187 ++-- docs/UsersGuide/source/Introduction.rst | 246 +++-- docs/UsersGuide/source/LAMGrids.rst | 58 +- docs/UsersGuide/source/Non-ContainerQS.rst | 50 +- docs/UsersGuide/source/Quickstart.rst | 161 +-- docs/UsersGuide/source/RocotoInfo.rst | 18 +- docs/UsersGuide/source/Tests.csv | 30 + .../source/{CompleteTests.rst => Tests.rst} | 6 +- docs/UsersGuide/source/WE2Etests.rst | 209 ++-- .../_static/FV3regional_workflow_gen_v2.png | Bin 0 -> 550227 bytes docs/UsersGuide/source/fix_file_list.rst | 823 +++++++++++++++ docs/UsersGuide/source/index.rst | 4 +- 21 files changed, 2149 insertions(+), 1339 deletions(-) delete mode 100644 docs/UsersGuide/source/CompleteTests.csv create mode 100644 docs/UsersGuide/source/Tests.csv rename docs/UsersGuide/source/{CompleteTests.rst => Tests.rst} (65%) create mode 
100644 docs/UsersGuide/source/_static/FV3regional_workflow_gen_v2.png create mode 100644 docs/UsersGuide/source/fix_file_list.rst diff --git a/README.md b/README.md index 7306cf9250..1ec8497218 100644 --- a/README.md +++ b/README.md @@ -2,9 +2,11 @@ The Unified Forecast System (UFS) is a community-based, coupled, comprehensive Earth modeling system. It is designed to be the source system for NOAA’s operational numerical weather prediction applications while enabling research, development, and contribution opportunities for the broader weather enterprise. For more information about the UFS, visit the UFS Portal at https://ufscommunity.org/. -The UFS includes multiple applications (see a complete list at https://ufscommunity.org/science/aboutapps/) that support different forecast durations and spatial domains. This documentation describes the UFS Short-Range Weather (SRW) Application, which targets predictions of atmospheric behavior on a limited spatial domain and on time scales from minutes to several days. The development branch of the application is continually evolving as the system undergoes open development. The SRW App release branches represent a snapshot of this continuously evolving system. The SRW Application includes a prognostic atmospheric model, pre- and post-processing, and a community workflow for running the system end-to-end. These components are documented within the User's Guide and supported through a community forum (https://forums.ufscommunity.org/). +The UFS includes multiple applications (see a complete list at https://ufscommunity.org/science/aboutapps/) that support different forecast durations and spatial domains. This documentation describes the UFS Short-Range Weather (SRW) Application release v2.0.0, which targets predictions of atmospheric behavior on a limited spatial domain and on time scales from minutes to several days. The development branch of the application is continually evolving as the system undergoes open development. The SRW App v2.0.0 represents a snapshot of this continuously evolving system. -The UFS SRW App User's Guide associated with the development branch can be found at: https://ufs-srweather-app.readthedocs.io/en/develop/, while the guide specific to the SRW App v2.0.0 release can be found at: https://srw-users-guide.readthedocs.io/en/release-public-v2/. The GitHub repository link is: https://github.com/ufs-community/ufs-srweather-app. +The SRW Application v2.0.0 includes a prognostic atmospheric model, pre- and post-processing, and a community workflow for running the system end-to-end. These components are documented within this User's Guide and supported through a community forum (https://forums.ufscommunity.org/). New and improved capabilities for this release include the addition of a verification package (METplus) for both deterministic and ensemble simulations and support for four Stochastically Perturbed Parameterization (SPP) schemes. Future work will expand the capabilities of the application to include data assimilation (DA) and a forecast restart/cycling capability. + +The UFS SRW App User's Guide associated with the development branch is at: https://ufs-srweather-app.readthedocs.io/en/develop/, while the guide specific to the SRW App v2.0.0 release can be found at: https://ufs-srweather-app.readthedocs.io/en/release-public-v2/. The repository is at: https://github.com/ufs-community/ufs-srweather-app. 
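+For example, the v2.0.0 release code can be obtained with the following commands (the branch name matches the ``release/public-v2`` branch used throughout this release's documentation):
+
+```
+git clone -b release/public-v2 https://github.com/ufs-community/ufs-srweather-app.git
+cd ufs-srweather-app
+```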
For instructions on how to clone the repository, build the code, and run the workflow, see: https://github.com/ufs-community/ufs-srweather-app/wiki/Getting-Started diff --git a/docs/UsersGuide/source/BuildRunSRW.rst b/docs/UsersGuide/source/BuildRunSRW.rst index 94e7deda10..7f04b51bfe 100644 --- a/docs/UsersGuide/source/BuildRunSRW.rst +++ b/docs/UsersGuide/source/BuildRunSRW.rst @@ -4,36 +4,39 @@ Building and Running the SRW App ===================================== -The Unified Forecast System (:term:`UFS`) Short-Range Weather (SRW) Application is an :term:`umbrella repository` consisting of a number of different :ref:`components ` housed in external repositories. Once the SRW App is configured and built, users can generate predictions of atmospheric behavior over a limited spatial area and on time scales ranging from minutes out to several days. +The Unified Forecast System (:term:`UFS`) Short-Range Weather (SRW) Application is an :term:`umbrella repository` consisting of a number of different :ref:`components ` housed in external repositories. Once the SRW App is built, users can configure an experiment and generate predictions of atmospheric behavior over a limited spatial area and on time scales ranging from minutes out to several days. -This chapter walks users through how to build and run the "out-of-the-box" case for the SRW App. However, the steps are relevant to any SRW Application experiment and can be modified to suit user goals. The "out-of-the-box" SRW App case builds a weather forecast for June 15-16, 2019. Multiple convective weather events during these two days produced over 200 filtered storm reports. Severe weather was clustered in two areas: the Upper Midwest through the Ohio Valley and the Southern Great Plains. This forecast uses a predefined 25-km Continental United States (:term:`CONUS`) domain (RRFS_CONUS_25km), the Global Forecast System (:term:`GFS`) version 15.2 physics suite (FV3_GFS_v16 :term:`CCPP`), and :term:`FV3`-based GFS raw external model data for initialization. +This chapter walks users through how to build and run the "out-of-the-box" case for the SRW App. However, the steps are relevant to any SRW Application experiment and can be modified to suit user goals. The out-of-the-box SRW App case builds a weather forecast for June 15-16, 2019. Multiple convective weather events during these two days produced over 200 filtered storm reports. Severe weather was clustered in two areas: the Upper Midwest through the Ohio Valley and the Southern Great Plains. This forecast uses a predefined 25-km Continental United States (:term:`CONUS`) domain (RRFS_CONUS_25km), the Global Forecast System (:term:`GFS`) version 16 physics suite (FV3_GFS_v16 :term:`CCPP`), and :term:`FV3`-based GFS raw external model data for initialization. .. attention:: - All UFS applications support `four platform levels `_. The steps described in this chapter will work most smoothly on preconfigured (Level 1) systems. On Level 1 systems, all of the required libraries for building community releases of UFS models and applications are available in a central location. This guide can serve as a starting point for running the SRW App on other systems, too, but the user may need to perform additional troubleshooting. + The SRW Application has `four levels of support `__. The steps described in this chapter will work most smoothly on preconfigured (Level 1) systems. 
This chapter can also serve as a starting point for running the SRW App on other systems (including generic Linux/Mac systems), but the user may need to perform additional troubleshooting. .. note:: - The :ref:`container approach ` is recommended for a smoother build and run experience. Building without a container allows for the use of the Rocoto workflow manager and may allow for more customization. However, the non-container approach requires more in-depth system-based knowledge, especially on Level 3 and 4 systems; it is less appropriate for beginners. + The :ref:`container approach ` is recommended for a smoother first-time build and run experience. Building without a container allows for the use of the Rocoto workflow manager and may allow for more customization. However, the non-container approach requires more in-depth system-based knowledge, especially on Level 3 and 4 systems, so it is less appropriate for beginners. The overall procedure for generating an experiment is shown in :numref:`Figure %s `, with the scripts to generate and run the workflow shown in red. The steps are as follows: - * :ref:`Install prerequisites ` - * :ref:`Clone the SRW App from GitHub ` - * :ref:`Check out the external repositories ` - * :ref:`Set up the build environment and build the executables ` - * :ref:`Download and stage data ` - * :ref:`Optional: Configure a new grid ` - * :ref:`Generate a regional workflow experiment ` + #. :ref:`Install prerequisites ` + #. :ref:`Clone the SRW App from GitHub ` + #. :ref:`Check out the external repositories ` + #. :ref:`Set up the build environment and build the executables ` + #. :ref:`Download and stage data ` + #. :ref:`Optional: Configure a new grid ` + #. :ref:`Generate a regional workflow experiment ` + * :ref:`Configure the experiment parameters ` * :ref:`Load the python environment for the regional workflow ` - * :ref:`Run the regional workflow ` - * :ref:`Optional: Plot the output ` + + #. :ref:`Run the regional workflow ` + #. :ref:`Optional: Plot the output ` .. _AppOverallProc: .. figure:: _static/FV3LAM_wflow_overall.png + :alt: Flowchart describing the SRW App workflow steps. - *Overall layout of the SRW App Workflow* + *Overall layout of the SRW App Workflow* .. _HPCstackInfo: @@ -44,7 +47,7 @@ Install the HPC-Stack .. Attention:: Skip the HPC-Stack installation if working on a `Level 1 system `_ (e.g., Cheyenne, Hera, Orion, NOAA Cloud). -**Definition:** :term:`HPC-Stack` is a repository that provides a unified, shell script-based build system and builds the software stack required for `UFS `_ applications such as the SRW App. +**Definition:** :term:`HPC-Stack` is a repository that provides a unified, shell script-based build system to build the software stack required for `UFS `_ applications such as the SRW App. Background ---------------- @@ -53,7 +56,7 @@ The UFS Weather Model draws on over 50 code libraries to run its applications. T Instructions ------------------------- -Users working on systems that fall under `Support Levels 2-4 `_ will need to install the HPC-Stack the first time they try to build applications (such as the SRW App) or models that depend on it. Users can either build the HPC-Stack on their local system or use the centrally maintained stacks on each HPC platform if they are working on a Level 1 system. 
Before installing the HPC-Stack, users on both Linux and MacOS systems should set the stack size to "unlimited" (if allowed) or to the largest possible value: +Users working on systems that fall under `Support Levels 2-4 `_ will need to install the HPC-Stack the first time they try to build applications (such as the SRW App) that depend on it. Users can either build the HPC-Stack on their local system or use the centrally maintained stacks on each HPC platform if they are working on a Level 1 system. Before installing the HPC-Stack, users on both Linux and MacOS systems should set the stack size to "unlimited" (if allowed) or to the largest possible value: .. code-block:: console @@ -65,27 +68,25 @@ Users working on systems that fall under `Support Levels 2-4 `. -After completing installation, continue to the next section. +After completing installation, continue to the next section (:numref:`Section %s: Download the UFS SRW Application Code `). .. _DownloadSRWApp: Download the UFS SRW Application Code ====================================== -The SRW Application source code is publicly available on GitHub. To download the SRW App, clone the ``develop`` branch of the repository: +The SRW Application source code is publicly available on GitHub. To download the SRW App code, clone the ``release/public-v2`` branch of the repository: .. code-block:: console - git clone -b develop https://github.com/ufs-community/ufs-srweather-app.git - -.. - COMMENT: This will need to be changed to the updated release branch of the SRW repo once it exists. + git clone -b release/public-v2 https://github.com/ufs-community/ufs-srweather-app.git The cloned repository contains the configuration files and sub-directories shown in -:numref:`Table %s `. The user may set an ``$SRW`` environmental variable to point to the location of the new ``ufs-srweather-app`` repository. For example, if ``ufs-srweather-app`` was cloned into the $HOME directory: +:numref:`Table %s `. The user may set an ``$SRW`` environment variable to point to the location of the new ``ufs-srweather-app`` repository. For example, if ``ufs-srweather-app`` was cloned into the ``$HOME`` directory, the following commands will set an ``$SRW`` environment variable in a bash or csh shell, respectively: .. code-block:: console export SRW=$HOME/ufs-srweather-app + setenv SRW $HOME/ufs-srweather-app .. 
_FilesAndSubDirs: @@ -94,7 +95,7 @@ The cloned repository contains the configuration files and sub-directories shown +--------------------------------+--------------------------------------------------------+ | **File/Directory Name** | **Description** | +================================+========================================================+ - | CMakeLists.txt | Main cmake file for SRW App | + | CMakeLists.txt | Main CMake file for SRW App | +--------------------------------+--------------------------------------------------------+ | Externals.cfg | Includes tags pointing to the correct version of the | | | external GitHub repositories/branches used in the SRW | @@ -109,7 +110,7 @@ The cloned repository contains the configuration files and sub-directories shown +--------------------------------+--------------------------------------------------------+ | ufs_srweather_app.settings.in | SRW App configuration summary | +--------------------------------+--------------------------------------------------------+ - | modulefiles | Contains build and workflow module files | + | modulefiles | Contains build and workflow modulefiles | +--------------------------------+--------------------------------------------------------+ | etc | Contains Lmod startup scripts | +--------------------------------+--------------------------------------------------------+ @@ -118,7 +119,7 @@ The cloned repository contains the configuration files and sub-directories shown | manage_externals | Utility for checking out external repositories | +--------------------------------+--------------------------------------------------------+ | src | Contains CMakeLists.txt; external repositories | - | | will be cloned in this directory. | + | | will be cloned into this directory. | +--------------------------------+--------------------------------------------------------+ @@ -127,7 +128,7 @@ The cloned repository contains the configuration files and sub-directories shown Check Out External Components ================================ -The SRW App relies on a variety of components (e.g., regional_workflow, UFS_UTILS, ufs-weather-model, and UPP) detailed in :numref:`Chapter %s ` of this User's Guide. Each component has its own :term:`repository`. Users must run the ``checkout_externals`` script to collect the individual components of the SRW App from their respective git repositories. The ``checkout_externals`` script uses the configuration file ``Externals.cfg`` in the top level directory of the SRW App to clone the correct tags (code versions) of the external repositories listed in :numref:`Section %s ` into the appropriate directories under the ``regional_workflow`` and ``src`` directories. +The SRW App relies on a variety of components (e.g., regional_workflow, UFS_UTILS, ufs-weather-model, and UPP) detailed in :numref:`Chapter %s ` of this User's Guide. Each component has its own repository. Users must run the ``checkout_externals`` script to collect the individual components of the SRW App from their respective git repositories. The ``checkout_externals`` script uses the configuration file ``Externals.cfg`` in the top level directory of the SRW App to clone the correct tags (code versions) of the external repositories listed in :numref:`Section %s ` into the appropriate directories under the ``regional_workflow`` and ``src`` directories. 
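For reference, a minimal sketch of this checkout step appears below. It assumes the directory layout shown in the table above, with the ``manage_externals`` utility at the top level of the clone; the ``-S`` status flag comes from the upstream ``manage_externals`` tool and is an assumption that may vary by version:

.. code-block:: console

   cd $SRW
   # Clone the external components at the tags listed in Externals.cfg
   ./manage_externals/checkout_externals
   # Optionally report the status of each managed external (assumed flag)
   ./manage_externals/checkout_externals -S
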
Run the executable that pulls in SRW App components from external repositories: @@ -150,13 +151,13 @@ Set Up the Environment and Build the Executables ``devbuild.sh`` Approach ----------------------------- -On Level 1 systems for which a modulefile is provided under the ``modulefiles`` directory, we can build the SRW App binaries with: +On Level 1 systems for which a modulefile is provided under the ``modulefiles`` directory, users can build the SRW App binaries with: .. code-block:: console ./devbuild.sh --platform= -where ```` is replaced with the name of the platform the user is working on. Valid values are: ``cheyenne`` | ``gaea`` | ``hera`` | ``jet`` | ``macos`` | ``odin`` | ``orion`` | ``singularity`` | ``wcoss_dell_p3`` +where ```` is replaced with the name of the platform the user is working on. Valid values are: ``cheyenne`` | ``gaea`` | ``hera`` | ``jet`` | ``linux`` | ``macos`` | ``noaacloud`` | ``odin`` | ``orion`` | ``singularity`` | ``wcoss_dell_p3`` If compiler auto-detection fails for some reason, specify it using the ``--compiler`` argument. For example: @@ -168,7 +169,7 @@ where valid values are ``intel`` or ``gnu``. The last line of the console output should be ``[100%] Built target ufs-weather-model``, indicating that the UFS Weather Model executable has been built successfully. -The executables listed in :numref:`Table %s ` should appear in the ``ufs-srweather-app/bin`` directory. If this build method doesn't work, or it users are not on a supported machine, they will have to manually setup the environment and build the SRW App binaries with CMake as described in :numref:`Section %s `. +The executables listed in :numref:`Table %s ` should appear in the ``ufs-srweather-app/bin`` directory. If this build method doesn't work, or if users are not on a supported machine, they will have to manually set up the environment and build the SRW App binaries with CMake as described in :numref:`Section %s `. .. 
_ExecDescription: @@ -181,19 +182,39 @@ The executables listed in :numref:`Table %s ` should appear in | chgres_cube | Reads in raw external model (global or regional) and surface climatology data | | | to create initial and lateral boundary conditions | +------------------------+---------------------------------------------------------------------------------+ + | emcsfc_ice_blend | Blends National Ice Center sea ice cover and EMC sea ice concentration data to | + | | create a global sea ice analysis used to update the GFS once per day | + +------------------------+---------------------------------------------------------------------------------+ + | emcsfc_snow2mdl | Blends National Ice Center snow cover and Air Force snow depth data to create a | + | | global depth analysis used to update the GFS snow field once per day | + +------------------------+---------------------------------------------------------------------------------+ | filter_topo | Filters topography based on resolution | +------------------------+---------------------------------------------------------------------------------+ + | fregrid | Remaps data from the input mosaic grid to the output mosaic grid | + +------------------------+---------------------------------------------------------------------------------+ + | fvcom_to_FV3 | Determines lake surface conditions for the Great Lakes | + +------------------------+---------------------------------------------------------------------------------+ + | global_cycle | Updates the GFS surface conditions using external snow and sea ice analyses | + +------------------------+---------------------------------------------------------------------------------+ | global_equiv_resol | Calculates a global, uniform, cubed-sphere equivalent resolution for the | | | regional Extended Schmidt Gnomonic (ESG) grid | +------------------------+---------------------------------------------------------------------------------+ - | make_solo_mosaic | Creates mosaic files with halos | + | inland | Creates an inland land mask by determining in-land (i.e. non-coastal) points | + | | and assigning a value of 1. Default value is 0. | +------------------------+---------------------------------------------------------------------------------+ - | upp.x | Post-processor for the model output | + | lakefrac | Calculates the ratio of the lake area to the grid cell area at each atmospheric | + | | grid point. 
| +------------------------+---------------------------------------------------------------------------------+ - | ufs_model | UFS Weather Model executable | + | make_hgrid | Computes geo-referencing parameters (e.g., latitude, longitude, grid cell area) | + | | for global uniform grids | + +------------------------+---------------------------------------------------------------------------------+ + | make_solo_mosaic | Creates mosaic files with halos | +------------------------+---------------------------------------------------------------------------------+ | orog | Generates orography, land mask, and gravity wave drag files from fixed files | +------------------------+---------------------------------------------------------------------------------+ + | orog_gsl | Creates orographic statistics fields required for the orographic drag suite | + | | developed by NOAA's Global Systems Laboratory (GSL) | + +------------------------+---------------------------------------------------------------------------------+ | regional_esg_grid | Generates an ESG regional grid based on a user-defined namelist | +------------------------+---------------------------------------------------------------------------------+ | sfc_climo_gen | Creates surface climatology fields from fixed files for use in ``chgres_cube`` | @@ -201,79 +222,58 @@ | shave | Shaves the excess halo rows down to what is required for the lateral boundary | | | conditions (LBC's) in the orography and grid files | +------------------------+---------------------------------------------------------------------------------+ - | vcoord_gen | Generates hybrid coordinate interface profiles | - +------------------------+---------------------------------------------------------------------------------+ - | fvcom_to_FV3 | Determines lake surface conditions for the Great Lakes | - +------------------------+---------------------------------------------------------------------------------+ - | make_hgrid | Computes geo-referencing parameters (e.g., latitude, longitude, grid cell area) | - | | for global uniform grids | - +------------------------+---------------------------------------------------------------------------------+ - | emcsfc_ice_blend | Blends National Ice Center sea ice cover and EMC sea ice concentration data to | - | | create a global sea ice analysis used to update the GFS once per day | - +------------------------+---------------------------------------------------------------------------------+ - | emcsfc_snow2mdl | Blends National Ice Center snow cover and Air Force snow depth data to create a | - | | global depth analysis used to update the GFS snow field once per day | - +------------------------+---------------------------------------------------------------------------------+ - | global_cycle | Updates the GFS surface conditions using external snow and sea ice analyses | - +------------------------+---------------------------------------------------------------------------------+ - | inland | Creates an inland land mask by determining in-land (i.e. non-coastal) points | - | | and assigning a value of 1. Default value is 0. 
| - +------------------------+---------------------------------------------------------------------------------+ - | orog_gsl | Ceates orographic statistics fields required for the orographic drag suite | - | | developed by NOAA's Global Systems Laboratory (GSL) | + | upp.x | Post-processor for the model output | +------------------------+---------------------------------------------------------------------------------+ - | fregrid | Remaps data from the input mosaic grid to the output mosaic grid | + | ufs_model | UFS Weather Model executable | +------------------------+---------------------------------------------------------------------------------+ - | lakefrac | Calculates the ratio of the lake area to the grid cell area at each atmospheric | - | | grid point. | + | vcoord_gen | Generates hybrid coordinate interface profiles | +------------------------+---------------------------------------------------------------------------------+ - + .. _CMakeApproach: CMake Approach ----------------- -Set Up the Workflow Environment +Set Up the Build Environment ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ .. attention:: - If users successfully built the executables in :numref:`Step %s `, they should skip to step :numref:`Step %s `. - For the CMake steps on MacOS systems, follow the approach in :numref:`Step %s `. + * If users successfully built the executables in :numref:`Table %s `, they should skip to step :numref:`Step %s `. + * Users who want to build the SRW App on a generic MacOS should skip to :numref:`Step %s ` and follow the approach there. -If the ``devbuild.sh`` approach failed, users need to set up their environment to run a workflow on their specific platform. First, users should make sure ``Lmod`` is the app used for loading modulefiles. This is the case on most Level 1 systems; however, on systems such as Gaea/Odin, the default modulefile loader is from Cray and must be switched to Lmod. For example, on Gaea, assuming a ``bash`` login shell, run: +If the ``devbuild.sh`` approach failed, users need to set up their environment to run a workflow on their specific platform. First, users should make sure ``Lmod`` is the app used for loading modulefiles. This is the case on most Level 1 systems; however, on systems such as Gaea/Odin, the default modulefile loader is from Cray and must be switched to Lmod. For example, on Gaea, users can run one of the following two commands depending on whether they have a bash or csh shell, respectively: .. code-block:: console source etc/lmod-setup.sh gaea + source etc/lmod-setup.csh gaea -or if the login shell is ``csh`` or ``tcsh``, run ``source etc/lmod-setup.csh`` instead. If users execute the above command on systems that don't need it, it will not cause any problems (it will simply do a ``module purge``). From here on, ``Lmod`` is ready to load the modulefiles needed by the SRW App. These modulefiles are located in ``modulefiles`` directory. To load the necessary modulefile for a specific ```` using ````, run: +If users execute one of the above commands on systems that don't need it, it will not cause any problems (it will simply do a ``module purge``). + +From here on, ``Lmod`` is ready to load the modulefiles needed by the SRW App. These modulefiles are located in the ``modulefiles`` directory. To load the necessary modulefile for a specific ```` using a given ````, run: .. code-block:: console - module use + module use module load build__ -where ```` is the full path to the ``modulefiles`` directory. 
This will work on Level 1 systems, where a modulefile is available in the ``modulefiles`` directory. +where ```` is the full path to the ``modulefiles`` directory. -On Level 2-4 systems, users will need to modify certain environment variables, such as the path to HPC-Stack, so that the SRW App can find and load the appropriate modules. For systems with Lmod installed, one of the current ``build__`` modulefiles can be copied and used as a template. To check whether Lmod is installed, run ``echo $LMOD_PKG``, and see if it outputs a path to the Lmod package. On systems without Lmod, users can modify or set the required environment variables with the ``export`` or ``setenv`` commands despending on whether they are using a bash or csh/tcsh shell, respectively: +This will work on Level 1 systems, where a modulefile is available in the ``modulefiles`` directory. On Level 2-4 systems, users will need to modify certain environment variables, such as the path to HPC-Stack, so that the SRW App can find and load the appropriate modules. For systems with Lmod installed, one of the current ``build__`` modulefiles can be copied and used as a template. To check whether Lmod is installed, run ``echo $LMOD_PKG``, and see if it outputs a path to the Lmod package. On systems without Lmod, users can modify or set the required environment variables with the ``export`` or ``setenv`` commands, depending on whether they are using a bash or csh/tcsh shell, respectively: .. code-block:: export = setenv -.. - COMMENT: Might be good to list an example here... +Note that building the SRW App without Lmod is not supported for this release. It should be possible to do so, but it has not been tested. Users are encouraged to install Lmod on their system. .. _BuildCMake: Build the Executables Using CMake ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -.. attention:: - If users successfully built the executables in :numref:`Step %s `, they should skip to step :numref:`Step %s `. - -In the ``ufs-srweather-app`` directory, create a subdirectory to hold the build's executables: +After setting up the build environment in the preceding section, users need to build the executables required to run the SRW App. In the ``ufs-srweather-app`` directory, create a subdirectory to hold the build's executables: .. code-block:: console @@ -289,11 +289,11 @@ From the build directory, run the following commands to build the pre-processing ``-DCMAKE_INSTALL_PREFIX`` specifies the location in which the ``bin``, ``include``, ``lib``, and ``share`` directories will be created. These directories will contain various components of the SRW App. Its recommended value ``..`` denotes one directory up from the build directory. In the next line, the ``make`` call argument ``-j 4`` indicates that the build will run in parallel with 4 threads. Although users can specify a larger or smaller number of threads (e.g., ``-j8``, ``-j2``), it is highly recommended to use at least 4 parallel threads to prevent overly long installation times. -The build will take a few minutes to complete. When it starts, a random number is printed to the console, and when it is done, a ``[1]+ Done`` message is printed to the console. ``[1]+ Exit`` indicates an error. Output from the build will be in the ``ufs-srweather-app/build/build.out`` file. When the build completes, users should see the forecast model executable ``ufs_model`` and several pre- and post-processing executables in the ``ufs-srweather-app/bin`` directory. These executables are described in :numref:`Table %s `. 
+The build will take several minutes to complete. When it starts, a random number is printed to the console, and when it is done, a ``[1]+ Done`` message is printed to the console. ``[1]+ Exit`` indicates an error. Output from the build will be in the ``ufs-srweather-app/build/build.out`` file. When the build completes, users should see the forecast model executable ``ufs_model`` and several pre- and post-processing executables in the ``ufs-srweather-app/bin`` directory. These executables are described in :numref:`Table %s `. Once all executables have been built, users may continue to :numref:`Step %s `. .. hint:: - If you see the build.out file, but there is no ``ufs-srweather-app/bin`` directory, wait a few more minutes for the build to complete. + If you see the ``build.out`` file, but there is no ``ufs-srweather-app/bin`` directory, wait a few more minutes for the build to complete. .. _MacDetails: Additional Details for Building on MacOS ------------------------------------------ .. note:: - Users not building the SRW App to run on MacOS may skip to the :ref:`next section `. + Users who are **not** building the SRW App on a MacOS machine may skip to the :ref:`next section `. -The SRW App can be built on MacOS systems, presuming HPC-Stack has already been successfully installed. The following two options have been tested: +The SRW App can be built on MacOS machines, presuming HPC-Stack has already been installed successfully. The following two options have been tested: * **Option 1:** MacBookAir 2020, M1 chip (arm64, running natively), 4+4 cores, Big Sur 11.6.4, GNU compiler suite v.11.2.0_3 (gcc, gfortran, g++); no MPI pre-installed @@ -333,7 +333,7 @@ An excerpt of the ``build_macos_gnu`` contents appears below for Option 1. To us #setenv FC "/usr/local/bin/gfortran" #setenv CXX "/usr/local/bin/g++" -Then, users must source the Lmod setup file, just as they would on other systems, and load the modulefiles needed for building and running SRW App: +Then, users must source the Lmod setup file, just as they would on other systems, and load the modulefiles needed for building and running the SRW App: .. code-block:: console @@ -342,32 +342,30 @@ Then, users must source the Lmod setup file, just as they would on other systems module load build_macos_gnu export LDFLAGS="-L${MPI_ROOT}/lib" -In a csh/tcsh shell, users would run ``source etc/lmod-setup.csh macos`` in place of the first line in the code above. - -.. note:: - If you execute ``source etc/lmod-setup.sh`` on systems that don't need it, it will simply do a ``module purge``. +In a csh/tcsh shell, users would run ``source etc/lmod-setup.csh macos`` in place of the first line in the code block above. -Additionally, for Option 1 systems, set the variable ``ENABLE_QUAD_PRECISION`` to ``OFF`` in ``$SRW/src/ufs-weather-model/FV3/atmos_cubed_sphere/CMakeLists.txt`` file. This change is optional if using Option 2 to build the SRW App. You could use a streamline editor `sed` to change it: +Additionally, for Option 1 systems, set the variable ``ENABLE_QUAD_PRECISION`` to ``OFF`` in the ``$SRW/src/ufs-weather-model/FV3/atmos_cubed_sphere/CMakeLists.txt`` file. This change is optional if using Option 2 to build the SRW App. To make this change using a stream editor (``sed``), run: .. 
code-block:: console sed -i .bak 's/QUAD_PRECISION\" ON)/QUAD_PRECISION\" OFF)/' $SRW/src/ufs-weather-model/FV3/atmos_cubed_sphere/CMakeLists.txt -Proceed to building executables using CMake in :numref:`Step %s ` +Proceed to building the executables using the process outlined in :numref:`Step %s `. + .. _Data: Download and Stage the Data ============================ -The SRW App requires input files to run. These include static datasets, initial and boundary conditions files, and model configuration files. On Level 1 and 2 systems, the data required to run SRW App tests are already available. For Level 3 and 4 systems, the data must be added. Detailed instructions on how to add the data can be found in :numref:`Section %s Downloading and Staging Input Data `. :numref:`Sections %s ` and :numref:`%s ` contain useful background information on the input and output files used in the SRW App. +The SRW App requires input files to run. These include static datasets, initial and boundary conditions files, and model configuration files. On Level 1 systems, the data required to run SRW App tests are already available. For Level 2-4 systems, the data must be added. Detailed instructions on how to add the data can be found in :numref:`Section %s `. Sections :numref:`%s ` and :numref:`%s ` contain useful background information on the input and output files used in the SRW App. .. _GridSpecificConfig: Grid Configuration ======================= -The SRW App officially supports four different predefined grids as shown in :numref:`Table %s `. The "out-of-the-box" SRW App case uses the ``RRFS_CONUS_25km`` predefined grid option. More information on the predefined and user-generated grid options can be found in :numref:`Chapter %s ` for those who are curious. Users who plan to utilize one of the four predefined domain (grid) options may continue to :numref:`Step %s `. Users who plan to create a new domain should refer to :numref:`Chapter %s ` for details on how to do so. At a minimum, these users will need to add the new grid name to the ``valid_param_vals`` script and add the corresponding grid-specific parameters in the ``set_predef_grid_params`` script. +The SRW App officially supports four different predefined grids as shown in :numref:`Table %s `. The out-of-the-box SRW App case uses the ``RRFS_CONUS_25km`` predefined grid option. More information on the predefined and user-generated grid options can be found in :numref:`Chapter %s ` for those who are curious. Users who plan to utilize one of the four predefined domain (grid) options may continue to :numref:`Step %s `. Users who plan to create a new domain should refer to :numref:`Section %s ` for details on how to do so. At a minimum, these users will need to add the new grid name to the ``valid_param_vals.sh`` script and add the corresponding grid-specific parameters in the ``set_predef_grid_params.sh`` script. .. _PredefinedGrids: @@ -403,224 +401,310 @@ The first two steps depend on the platform being used and are described here for Set Experiment Parameters ---------------------------- -Each experiment requires certain basic information to run (e.g., date, grid, physics suite). This information is specified in ``config_defaults.sh`` and in the user-specific ``config.sh`` file. When generating a new experiment, the SRW App first reads and assigns default values from the ``config_defaults.sh`` file. Then, it reads and (re)assigns variables from the user's custom ``config.sh`` file. 
For background info on ``config_defaults.sh``, read :numref:`Section %s `, or jump to :numref:`Section %s ` to continue configuring the experiment. +Each experiment requires certain basic information to run (e.g., date, grid, physics suite). This information is specified in ``config_defaults.sh`` and in the user-specified ``config.sh`` file. When generating a new experiment, the SRW App first reads and assigns default values from the ``config_defaults.sh`` file. Then, it reads and (re)assigns variables from the user's custom ``config.sh`` file. + +Section Overview: + + * For background information on ``config_defaults.sh``, read :numref:`Section %s `. + * Jump to :numref:`Section %s ` to continue configuring the experiment. + * Go to :numref:`Section %s ` for additional details on configuring an experiment on a generic Linux or MacOS system. + .. _DefaultConfigSection: Default configuration: ``config_defaults.sh`` ------------------------------------------------- +^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ .. note:: This section provides background information on how the SRW App uses the ``config_defaults.sh`` file. This information is informative, but users do not need to modify ``config_defaults.sh`` to run the out-of-the-box case for the SRW App. Users may skip to :numref:`Step %s ` to continue configuring their experiment. Important configuration variables in the ``config_defaults.sh`` file appear in -:numref:`Table %s `. Some of these default values are intentionally invalid in order to ensure that the user assigns valid values in the user-specified ``config.sh`` file. Any settings provided in ``config.sh`` will override the default ``config_defaults.sh`` +:numref:`Table %s `. Some of these default values are intentionally invalid in order to ensure that the user assigns valid values in the user-specified ``config.sh`` file. Any settings provided in ``config.sh`` will override the ``config_defaults.sh`` settings. There is usually no need for a user to modify the default configuration file. Additional information on the default settings can be found in the file itself and in :numref:`Chapter %s `. .. _ConfigVarsDefault: .. table:: Configuration variables specified in the config_defaults.sh script. - +----------------------+------------------------------------------------------------+ - | **Group Name** | **Configuration variables** | - +======================+============================================================+ - | Experiment mode | RUN_ENVIR | - +----------------------+------------------------------------------------------------+ - | Machine and queue | MACHINE, ACCOUNT, SCHED, PARTITION_DEFAULT, QUEUE_DEFAULT, | - | | PARTITION_HPSS, QUEUE_HPSS, PARTITION_FCST, QUEUE_FCST | - +----------------------+------------------------------------------------------------+ - | Cron | USE_CRON_TO_RELAUNCH, CRON_RELAUNCH_INTVL_MNTS | - +----------------------+------------------------------------------------------------+ - | Experiment Dir. 
| EXPT_BASEDIR, EXPT_SUBDIR | - +----------------------+------------------------------------------------------------+ - | NCO mode | COMINgfs, STMP, NET, envir, RUN, PTMP | - +----------------------+------------------------------------------------------------+ - | Separator | DOT_OR_USCORE | - +----------------------+------------------------------------------------------------+ - | File name | EXPT_CONFIG_FN, RGNL_GRID_NML_FN, DATA_TABLE_FN, | - | | DIAG_TABLE_FN, FIELD_TABLE_FN, FV3_NML_BASE_SUITE_FN, | - | | FV3_NML_YALM_CONFIG_FN, FV3_NML_BASE_ENS_FN, | - | | MODEL_CONFIG_FN, NEMS_CONFIG_FN, FV3_EXEC_FN, | - | | WFLOW_XML_FN, GLOBAL_VAR_DEFNS_FN, | - | | EXTRN_MDL_ICS_VAR_DEFNS_FN, EXTRN_MDL_LBCS_VAR_DEFNS_FN, | - | | WFLOW_LAUNCH_SCRIPT_FN, WFLOW_LAUNCH_LOG_FN | - +----------------------+------------------------------------------------------------+ - | Forecast | DATE_FIRST_CYCL, DATE_LAST_CYCL, CYCL_HRS, FCST_LEN_HRS | - +----------------------+------------------------------------------------------------+ - | IC/LBC | EXTRN_MDL_NAME_ICS, EXTRN_MDL_NAME_LBCS, | - | | LBC_SPEC_INTVL_HRS, FV3GFS_FILE_FMT_ICS, | - | | FV3GFS_FILE_FMT_LBCS | - +----------------------+------------------------------------------------------------+ - | NOMADS | NOMADS, NOMADS_file_type | - +----------------------+------------------------------------------------------------+ - | External model | USE_USER_STAGED_EXTRN_FILES, EXTRN_MDL_SOURCE_BASEDIR_ICS, | - | | EXTRN_MDL_FILES_ICS, EXTRN_MDL_SOURCE_BASEDIR_LBCS, | - | | EXTRN_MDL_FILES_LBCS | - +----------------------+------------------------------------------------------------+ - | CCPP | CCPP_PHYS_SUITE | - +----------------------+------------------------------------------------------------+ - | GRID | GRID_GEN_METHOD | - +----------------------+------------------------------------------------------------+ - | ESG grid | ESGgrid_LON_CTR, ESGgrid_LAT_CTR, ESGgrid_DELX, | - | | ESGgrid_DELY, ESGgrid_NX, ESGgrid_NY, | - | | ESGgrid_WIDE_HALO_WIDTH | - +----------------------+------------------------------------------------------------+ - | Input configuration | DT_ATMOS, LAYOUT_X, LAYOUT_Y, BLOCKSIZE, QUILTING, | - | | PRINT_ESMF, WRTCMP_write_groups, | - | | WRTCMP_write_tasks_per_group, WRTCMP_output_grid, | - | | WRTCMP_cen_lon, WRTCMP_cen_lat, WRTCMP_lon_lwr_left, | - | | WRTCMP_lat_lwr_left, WRTCMP_lon_upr_rght, | - | | WRTCMP_lat_upr_rght, WRTCMP_dlon, WRTCMP_dlat, | - | | WRTCMP_stdlat1, WRTCMP_stdlat2, WRTCMP_nx, WRTCMP_ny, | - | | WRTCMP_dx, WRTCMP_dy | - +----------------------+------------------------------------------------------------+ - | Pre-existing grid | PREDEF_GRID_NAME, PREEXISTING_DIR_METHOD, VERBOSE | - +----------------------+------------------------------------------------------------+ - | Cycle-independent | RUN_TASK_MAKE_GRID, GRID_DIR, RUN_TASK_MAKE_OROG, | - | | OROG_DIR, RUN_TASK_MAKE_SFC_CLIMO, SFC_CLIMO_DIR | - +----------------------+------------------------------------------------------------+ - | Surface climatology | SFC_CLIMO_FIELDS, FIXgsm, TOPO_DIR, SFC_CLIMO_INPUT_DIR, | - | | FNGLAC, FNMXIC, FNTSFC, FNSNOC, FNZORC, FNAISC, FNSMCC, | - | | FNMSKH, FIXgsm_FILES_TO_COPY_TO_FIXam, | - | | FV3_NML_VARNAME_TO_FIXam_FILES_MAPPING, | - | | FV3_NML_VARNAME_TO_SFC_CLIMO_FIELD_MAPPING, | - | | CYCLEDIR_LINKS_TO_FIXam_FILES_MAPPING | - +----------------------+------------------------------------------------------------+ - | Workflow task | MAKE_GRID_TN, MAKE_OROG_TN, MAKE_SFC_CLIMO_TN, | - | | GET_EXTRN_ICS_TN, GET_EXTRN_LBCS_TN, 
MAKE_ICS_TN, | - | | MAKE_LBCS_TN, RUN_FCST_TN, RUN_POST_TN | - +----------------------+------------------------------------------------------------+ - | NODE | NNODES_MAKE_GRID, NNODES_MAKE_OROG, NNODES_MAKE_SFC_CLIMO, | - | | NNODES_GET_EXTRN_ICS, NNODES_GET_EXTRN_LBCS, | - | | NNODES_MAKE_ICS, NNODES_MAKE_LBCS, NNODES_RUN_FCST, | - | | NNODES_RUN_POST | - +----------------------+------------------------------------------------------------+ - | MPI processes | PPN_MAKE_GRID, PPN_MAKE_OROG, PPN_MAKE_SFC_CLIMO, | - | | PPN_GET_EXTRN_ICS, PPN_GET_EXTRN_LBCS, PPN_MAKE_ICS, | - | | PPN_MAKE_LBCS, PPN_RUN_FCST, PPN_RUN_POST | - +----------------------+------------------------------------------------------------+ - | Walltime | WTIME_MAKE_GRID, WTIME_MAKE_OROG, WTIME_MAKE_SFC_CLIMO, | - | | WTIME_GET_EXTRN_ICS, WTIME_GET_EXTRN_LBCS, WTIME_MAKE_ICS, | - | | WTIME_MAKE_LBCS, WTIME_RUN_FCST, WTIME_RUN_POST | - +----------------------+------------------------------------------------------------+ - | Maximum attempt | MAXTRIES_MAKE_GRID, MAXTRIES_MAKE_OROG, | - | | MAXTRIES_MAKE_SFC_CLIMO, MAXTRIES_GET_EXTRN_ICS, | - | | MAXTRIES_GET_EXTRN_LBCS, MAXTRIES_MAKE_ICS, | - | | MAXTRIES_MAKE_LBCS, MAXTRIES_RUN_FCST, MAXTRIES_RUN_POST | - +----------------------+------------------------------------------------------------+ - | Post configuration | USE_CUSTOM_POST_CONFIG_FILE, CUSTOM_POST_CONFIG_FP | - +----------------------+------------------------------------------------------------+ - | Running ensembles | DO_ENSEMBLE, NUM_ENS_MEMBERS | - +----------------------+------------------------------------------------------------+ - | Stochastic physics | DO_SHUM, DO_SPPT, DO_SKEB, SHUM_MAG, SHUM_LSCALE, | - | | SHUM_TSCALE, SHUM_INT, SPPT_MAG, SPPT_LSCALE, SPPT_TSCALE, | - | | SPPT_INT, SKEB_MAG, SKEB_LSCALE, SKEP_TSCALE, SKEB_INT, | - | | SKEB_VDOF, USE_ZMTNBLCK | - +----------------------+------------------------------------------------------------+ - | Boundary blending | HALO_BLEND | - +----------------------+------------------------------------------------------------+ - | FVCOM | USE_FVCOM, FVCOM_DIR, FVCOM_FILE | - +----------------------+------------------------------------------------------------+ - | Compiler | COMPILER | - +----------------------+------------------------------------------------------------+ - | METplus | MODEL, MET_INSTALL_DIR, MET_BIN_EXEC, METPLUS_PATH, | - | | CCPA_OBS_DIR, MRMS_OBS_DIR, NDAS_OBS_DIR | - +----------------------+------------------------------------------------------------+ - - + +----------------------+--------------------------------------------------------------+ + | **Group Name** | **Configuration variables** | + +======================+==============================================================+ + | Experiment mode | RUN_ENVIR | + +----------------------+--------------------------------------------------------------+ + | Machine and queue | MACHINE, MACHINE_FILE, ACCOUNT, COMPILER | + | | NCORES_PER_NODE, LMOD_PATH, BUILD_MOD_FN, WFLOW_MOD_FN, | + | | SCHED, PARTITION_DEFAULT, CLUSTERS_DEFAULT, QUEUE_DEFAULT, | + | | PARTITION_HPSS, CLUSTERS_HPSS, QUEUE_HPSS, PARTITION_FCST, | + | | CLUSTERS_FCST, QUEUE_FCST | + +----------------------+--------------------------------------------------------------+ + | Workflow management | WORKFLOW_MANAGER, RUN_CMD_UTILS, RUN_CMD_FCST, RUN_CMD_POST | + +----------------------+--------------------------------------------------------------+ + | Cron | USE_CRON_TO_RELAUNCH, CRON_RELAUNCH_INTVL_MNTS | + 
+   +----------------------+--------------------------------------------------------------+
+   | Directory parameters | EXPT_BASEDIR, EXPT_SUBDIR, EXEC_SUBDIR                       |
+   +----------------------+--------------------------------------------------------------+
+   | NCO mode             | COMINgfs, FIXLAM_NCO_BASEDIR, STMP, NET, envir, RUN, PTMP    |
+   +----------------------+--------------------------------------------------------------+
+   | Separator            | DOT_OR_USCORE                                                |
+   +----------------------+--------------------------------------------------------------+
+   | File name            | EXPT_CONFIG_FN, RGNL_GRID_NML_FN, DATA_TABLE_FN,             |
+   |                      | DIAG_TABLE_FN, FIELD_TABLE_FN, FV3_NML_BASE_SUITE_FN,        |
+   |                      | FV3_NML_YAML_CONFIG_FN, FV3_NML_BASE_ENS_FN,                 |
+   |                      | MODEL_CONFIG_FN, NEMS_CONFIG_FN, FV3_EXEC_FN,                |
+   |                      | FCST_MODEL, WFLOW_XML_FN, GLOBAL_VAR_DEFNS_FN,               |
+   |                      | EXTRN_MDL_ICS_VAR_DEFNS_FN, EXTRN_MDL_LBCS_VAR_DEFNS_FN,     |
+   |                      | WFLOW_LAUNCH_SCRIPT_FN, WFLOW_LAUNCH_LOG_FN                  |
+   +----------------------+--------------------------------------------------------------+
+   | Forecast             | DATE_FIRST_CYCL, DATE_LAST_CYCL, CYCL_HRS, INCR_CYCL_FREQ,   |
+   |                      | FCST_LEN_HRS                                                 |
+   +----------------------+--------------------------------------------------------------+
+   | IC/LBC               | EXTRN_MDL_NAME_ICS, EXTRN_MDL_NAME_LBCS,                     |
+   |                      | LBC_SPEC_INTVL_HRS, EXTRN_MDL_ICS_OFFSET_HRS,                |
+   |                      | EXTRN_MDL_LBCS_OFFSET_HRS, FV3GFS_FILE_FMT_ICS,              |
+   |                      | FV3GFS_FILE_FMT_LBCS                                         |
+   +----------------------+--------------------------------------------------------------+
+   | NOMADS               | NOMADS, NOMADS_file_type                                     |
+   +----------------------+--------------------------------------------------------------+
+   | External model       | EXTRN_MDL_SYSBASEDIR_ICS, EXTRN_MDL_SYSBASEDIR_LBCS,         |
+   |                      | USE_USER_STAGED_EXTRN_FILES, EXTRN_MDL_SOURCE_BASEDIR_ICS,   |
+   |                      | EXTRN_MDL_FILES_ICS, EXTRN_MDL_SOURCE_BASEDIR_LBCS,          |
+   |                      | EXTRN_MDL_FILES_LBCS                                         |
+   +----------------------+--------------------------------------------------------------+
+   | CCPP                 | CCPP_PHYS_SUITE                                              |
+   +----------------------+--------------------------------------------------------------+
+   | Stochastic physics   | NEW_LSCALE, DO_SHUM, DO_SPPT, DO_SKEB, DO_SPP, DO_LSM_SPP,   |
+   |                      | ISEED_SHUM, SHUM_MAG, SHUM_LSCALE, SHUM_TSCALE, SHUM_INT,    |
+   |                      | ISEED_SPPT, SPPT_MAG, SPPT_LOGIT, SPPT_LSCALE, SPPT_TSCALE,  |
+   |                      | SPPT_INT, SPPT_SFCLIMIT, USE_ZMTNBLCK, ISEED_SKEB,           |
+   |                      | SKEB_MAG, SKEB_LSCALE, SKEB_TSCALE, SKEB_INT, SKEBNORM,      |
+   |                      | SKEB_VDOF, ISEED_SPP, SPP_MAG_LIST, SPP_LSCALE, SPP_TSCALE,  |
+   |                      | SPP_SIGTOP1, SPP_SIGTOP2, SPP_STDDEV_CUTOFF, SPP_VAR_LIST,   |
+   |                      | LSM_SPP_TSCALE, LSM_SPP_LSCALE, ISEED_LSM_SPP,               |
+   |                      | LSM_SPP_VAR_LIST, LSM_SPP_MAG_LIST, LSM_SPP_EACH_STEP        |
+   +----------------------+--------------------------------------------------------------+
+   | GRID                 | GRID_GEN_METHOD, PREDEF_GRID_NAME                            |
+   +----------------------+--------------------------------------------------------------+
+   | ESG grid             | ESGgrid_LON_CTR, ESGgrid_LAT_CTR, ESGgrid_DELX,              |
+   |                      | ESGgrid_DELY, ESGgrid_NX, ESGgrid_NY, ESGgrid_PAZI,          |
+   |                      | ESGgrid_WIDE_HALO_WIDTH                                      |
+   +----------------------+--------------------------------------------------------------+
+   | GFDL grid            | GFDLgrid_LON_T6_CTR, GFDLgrid_LAT_T6_CTR, GFDLgrid_RES,      |
+   |                      | GFDLgrid_STRETCH_FAC, GFDLgrid_REFINE_RATIO,                 |
+   |                      | GFDLgrid_ISTART_OF_RGNL_DOM_ON_T6G,                          |
+   |                      | GFDLgrid_IEND_OF_RGNL_DOM_ON_T6G,                            |
+   |                      | GFDLgrid_JSTART_OF_RGNL_DOM_ON_T6G,                          |
+   |                      | GFDLgrid_JEND_OF_RGNL_DOM_ON_T6G,                            |
+   |                      | GFDLgrid_USE_GFDLgrid_RES_IN_FILENAMES                       |
+   +----------------------+--------------------------------------------------------------+
+   | Input configuration  | DT_ATMOS, RESTART_INTERVAL, WRITE_DOPOST, LAYOUT_X,          |
+   |                      | LAYOUT_Y, BLOCKSIZE, QUILTING,                               |
+   |                      | PRINT_ESMF, WRTCMP_write_groups,                             |
+   |                      | WRTCMP_write_tasks_per_group, WRTCMP_output_grid,            |
+   |                      | WRTCMP_cen_lon, WRTCMP_cen_lat, WRTCMP_lon_lwr_left,         |
+   |                      | WRTCMP_lat_lwr_left, WRTCMP_lon_upr_rght,                    |
+   |                      | WRTCMP_lat_upr_rght, WRTCMP_dlon, WRTCMP_dlat,               |
+   |                      | WRTCMP_stdlat1, WRTCMP_stdlat2, WRTCMP_nx, WRTCMP_ny,        |
+   |                      | WRTCMP_dx, WRTCMP_dy                                         |
+   +----------------------+--------------------------------------------------------------+
+   | Experiment generation| PREEXISTING_DIR_METHOD, VERBOSE, DEBUG                       |
+   +----------------------+--------------------------------------------------------------+
+   | Cycle-independent    | RUN_TASK_MAKE_GRID, GRID_DIR, RUN_TASK_MAKE_OROG,            |
+   |                      | OROG_DIR, RUN_TASK_MAKE_SFC_CLIMO, SFC_CLIMO_DIR             |
+   +----------------------+--------------------------------------------------------------+
+   | Cycle dependent      | RUN_TASK_GET_EXTRN_ICS, RUN_TASK_GET_EXTRN_LBCS,             |
+   |                      | RUN_TASK_MAKE_ICS, RUN_TASK_MAKE_LBCS, RUN_TASK_RUN_FCST,    |
+   |                      | RUN_TASK_RUN_POST                                            |
+   +----------------------+--------------------------------------------------------------+
+   | VX run tasks         | RUN_TASK_GET_OBS_CCPA, RUN_TASK_GET_OBS_MRMS,                |
+   |                      | RUN_TASK_GET_OBS_NDAS, RUN_TASK_VX_GRIDSTAT,                 |
+   |                      | RUN_TASK_VX_POINTSTAT, RUN_TASK_VX_ENSGRID,                  |
+   |                      | RUN_TASK_VX_ENSPOINT                                         |
+   +----------------------+--------------------------------------------------------------+
+   | Surface climatology  | SFC_CLIMO_FIELDS, FIXgsm, TOPO_DIR, SFC_CLIMO_INPUT_DIR,     |
+   |                      | FNGLAC, FNMXIC, FNTSFC, FNSNOC, FNZORC, FNAISC, FNSMCC,      |
+   |                      | FNMSKH, FIXgsm_FILES_TO_COPY_TO_FIXam,                       |
+   |                      | FV3_NML_VARNAME_TO_FIXam_FILES_MAPPING,                      |
+   |                      | FV3_NML_VARNAME_TO_SFC_CLIMO_FIELD_MAPPING,                  |
+   |                      | CYCLEDIR_LINKS_TO_FIXam_FILES_MAPPING                        |
+   +----------------------+--------------------------------------------------------------+
+   | Workflow tasks       | MAKE_GRID_TN, MAKE_OROG_TN, MAKE_SFC_CLIMO_TN,               |
+   |                      | GET_EXTRN_ICS_TN, GET_EXTRN_LBCS_TN, MAKE_ICS_TN,            |
+   |                      | MAKE_LBCS_TN, RUN_FCST_TN, RUN_POST_TN                       |
+   +----------------------+--------------------------------------------------------------+
+   | Verification tasks   | GET_OBS, GET_OBS_CCPA_TN, GET_OBS_MRMS_TN, GET_OBS_NDAS_TN,  |
+   |                      | VX_TN, VX_GRIDSTAT_TN, VX_GRIDSTAT_REFC_TN,                  |
+   |                      | VX_GRIDSTAT_RETOP_TN, VX_GRIDSTAT_##h_TN, VX_POINTSTAT_TN,   |
+   |                      | VX_ENSGRID_TN, VX_ENSGRID_##h_TN, VX_ENSGRID_REFC_TN,        |
+   |                      | VX_ENSGRID_RETOP_TN, VX_ENSGRID_MEAN_TN, VX_ENSGRID_PROB_TN, |
+   |                      | VX_ENSGRID_MEAN_##h_TN, VX_ENSGRID_PROB_03h_TN,              |
+   |                      | VX_ENSGRID_PROB_REFC_TN, VX_ENSGRID_PROB_RETOP_TN,           |
+   |                      | VX_ENSPOINT_TN, VX_ENSPOINT_MEAN_TN, VX_ENSPOINT_PROB_TN     |
+   +----------------------+--------------------------------------------------------------+
+   | NODE                 | NNODES_MAKE_GRID, NNODES_MAKE_OROG, NNODES_MAKE_SFC_CLIMO,   |
+   |                      | NNODES_GET_EXTRN_ICS, NNODES_GET_EXTRN_LBCS,                 |
+   |                      | NNODES_MAKE_ICS, NNODES_MAKE_LBCS, NNODES_RUN_FCST,          |
+   |                      | NNODES_RUN_POST, NNODES_GET_OBS_CCPA, NNODES_GET_OBS_MRMS,   |
+   |                      | NNODES_GET_OBS_NDAS, NNODES_VX_GRIDSTAT,                     |
+   |                      | NNODES_VX_POINTSTAT, NNODES_VX_ENSGRID,                      |
+   |                      | NNODES_VX_ENSGRID_MEAN, NNODES_VX_ENSGRID_PROB,              |
+   |                      | NNODES_VX_ENSPOINT, NNODES_VX_ENSPOINT_MEAN,                 |
+   |                      | NNODES_VX_ENSPOINT_PROB                                      |
+   +----------------------+--------------------------------------------------------------+
+   | MPI processes        | PPN_MAKE_GRID, PPN_MAKE_OROG, PPN_MAKE_SFC_CLIMO,            |
+   |                      | PPN_GET_EXTRN_ICS, PPN_GET_EXTRN_LBCS, PPN_MAKE_ICS,         |
+   |                      | PPN_MAKE_LBCS, PPN_RUN_FCST, PPN_RUN_POST,                   |
+   |                      | PPN_GET_OBS_CCPA, PPN_GET_OBS_MRMS, PPN_GET_OBS_NDAS,        |
+   |                      | PPN_VX_GRIDSTAT, PPN_VX_POINTSTAT, PPN_VX_ENSGRID,           |
+   |                      | PPN_VX_ENSGRID_MEAN, PPN_VX_ENSGRID_PROB, PPN_VX_ENSPOINT,   |
+   |                      | PPN_VX_ENSPOINT_MEAN, PPN_VX_ENSPOINT_PROB                   |
+   +----------------------+--------------------------------------------------------------+
+   | Walltime             | WTIME_MAKE_GRID, WTIME_MAKE_OROG, WTIME_MAKE_SFC_CLIMO,      |
+   |                      | WTIME_GET_EXTRN_ICS, WTIME_GET_EXTRN_LBCS, WTIME_MAKE_ICS,   |
+   |                      | WTIME_MAKE_LBCS, WTIME_RUN_FCST, WTIME_RUN_POST,             |
+   |                      | WTIME_GET_OBS_CCPA, WTIME_GET_OBS_MRMS, WTIME_GET_OBS_NDAS,  |
+   |                      | WTIME_VX_GRIDSTAT, WTIME_VX_POINTSTAT, WTIME_VX_ENSGRID,     |
+   |                      | WTIME_VX_ENSGRID_MEAN, WTIME_VX_ENSGRID_PROB,                |
+   |                      | WTIME_VX_ENSPOINT, WTIME_VX_ENSPOINT_MEAN,                   |
+   |                      | WTIME_VX_ENSPOINT_PROB                                       |
+   +----------------------+--------------------------------------------------------------+
+   | Maximum attempt      | MAXTRIES_MAKE_GRID, MAXTRIES_MAKE_OROG,                      |
+   |                      | MAXTRIES_MAKE_SFC_CLIMO, MAXTRIES_GET_EXTRN_ICS,             |
+   |                      | MAXTRIES_GET_EXTRN_LBCS, MAXTRIES_MAKE_ICS,                  |
+   |                      | MAXTRIES_MAKE_LBCS, MAXTRIES_RUN_FCST, MAXTRIES_RUN_POST,    |
+   |                      | MAXTRIES_GET_OBS_CCPA, MAXTRIES_GET_OBS_MRMS,                |
+   |                      | MAXTRIES_GET_OBS_NDAS, MAXTRIES_VX_GRIDSTAT,                 |
+   |                      | MAXTRIES_VX_GRIDSTAT_REFC, MAXTRIES_VX_GRIDSTAT_RETOP,       |
+   |                      | MAXTRIES_VX_GRIDSTAT_##h, MAXTRIES_VX_POINTSTAT,             |
+   |                      | MAXTRIES_VX_ENSGRID, MAXTRIES_VX_ENSGRID_REFC,               |
+   |                      | MAXTRIES_VX_ENSGRID_RETOP, MAXTRIES_VX_ENSGRID_##h,          |
+   |                      | MAXTRIES_VX_ENSGRID_MEAN, MAXTRIES_VX_ENSGRID_PROB,          |
+   |                      | MAXTRIES_VX_ENSGRID_MEAN_##h, MAXTRIES_VX_ENSGRID_PROB_##h,  |
+   |                      | MAXTRIES_VX_ENSGRID_PROB_REFC,                               |
+   |                      | MAXTRIES_VX_ENSGRID_PROB_RETOP, MAXTRIES_VX_ENSPOINT,        |
+   |                      | MAXTRIES_VX_ENSPOINT_MEAN, MAXTRIES_VX_ENSPOINT_PROB         |
+   +----------------------+--------------------------------------------------------------+
+   | Aerosol climatology  | USE_MERRA_CLIMO, FIXaer                                      |
+   +----------------------+--------------------------------------------------------------+
+   | Fixed file params    | FIXlut                                                       |
+   +----------------------+--------------------------------------------------------------+
+   | CRTM                 | USE_CRTM, CRTM_DIR                                           |
+   +----------------------+--------------------------------------------------------------+
+   | Post configuration   | USE_CUSTOM_POST_CONFIG_FILE, CUSTOM_POST_CONFIG_FP,          |
+   |                      | SUB_HOURLY_POST, DT_SUB_HOURLY_POST_MNTS                     |
+   +----------------------+--------------------------------------------------------------+
+   | METplus              | MODEL, MET_INSTALL_DIR, MET_BIN_EXEC, METPLUS_PATH,          |
+   |                      | CCPA_OBS_DIR, MRMS_OBS_DIR, NDAS_OBS_DIR                     |
+   +----------------------+--------------------------------------------------------------+
+   | Running ensembles    | DO_ENSEMBLE, NUM_ENS_MEMBERS                                 |
+   +----------------------+--------------------------------------------------------------+
+   | Boundary blending    | HALO_BLEND                                                   |
+   +----------------------+--------------------------------------------------------------+
+   | FVCOM                | USE_FVCOM, FVCOM_WCSTART, FVCOM_DIR, FVCOM_FILE              |
+   +----------------------+--------------------------------------------------------------+
+   | Thread Affinity      | KMP_AFFINITY_*, OMP_NUM_THREADS_*, OMP_STACKSIZE_*           |
+   +----------------------+--------------------------------------------------------------+
 
 .. _UserSpecificConfig:
 
 User-specific configuration: ``config.sh``
---------------------------------------------
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 
-The user must specify certain basic information about the experiment in a ``config.sh`` file located in the ``ufs-srweather-app/regional_workflow/ush`` directory. Two example templates are provided in that directory: ``config.community.sh`` and ``config.nco.sh``. The first file is a minimal example for creating and running an experiment in the *community* mode (with ``RUN_ENVIR`` set to ``community``). The second is an example for creating and running an experiment in the *NCO* (operational) mode (with ``RUN_ENVIR`` set to ``nco``). The *community* mode is recommended in most cases and will be fully supported for this release. The operational/NCO mode will typically be used by those at the NOAA/NCEP/Environmental Modeling Center (EMC) and the NOAA/Global Systems Laboratory (GSL) working on pre-implementation testing for the Rapid Refresh Forecast System (RRFS). :numref:`Table %s ` shows the configuration variables, along with their default values in ``config_default.sh`` and the values defined in ``config.community.sh``.
+The user must specify certain basic information about the experiment in a ``config.sh`` file located in the ``ufs-srweather-app/regional_workflow/ush`` directory. Two example templates are provided in that directory: ``config.community.sh`` and ``config.nco.sh``. The first file is a minimal example for creating and running an experiment in the *community* mode (with ``RUN_ENVIR`` set to ``community``). The second is an example for creating and running an experiment in the *NCO* (operational) mode (with ``RUN_ENVIR`` set to ``nco``). The *community* mode is recommended in most cases and is fully supported for this release. The operational/NCO mode is typically used by those at the NOAA/NCEP/Environmental Modeling Center (EMC) and the NOAA/Global Systems Laboratory (GSL) working on pre-implementation testing for the Rapid Refresh Forecast System (RRFS). :numref:`Table %s ` shows the configuration variables that appear in ``config.community.sh``, along with their default values in ``config_default.sh`` and the values defined in ``config.community.sh``.
 
 .. _ConfigCommunity:
 
 .. 
table:: Configuration variables specified in the config.community.sh script - +--------------------------------+-------------------+--------------------------------------------------------+ - | **Parameter** | **Default Value** | **config.community.sh Value** | - +================================+===================+========================================================+ - | MACHINE | "BIG_COMPUTER" | "hera" | - +--------------------------------+-------------------+--------------------------------------------------------+ - | ACCOUNT | "project_name" | "an_account" | - +--------------------------------+-------------------+--------------------------------------------------------+ - | EXPT_SUBDIR | "" | "test_CONUS_25km_GFSv16" | - +--------------------------------+-------------------+--------------------------------------------------------+ - | VERBOSE | "TRUE" | "TRUE" | - +--------------------------------+-------------------+--------------------------------------------------------+ - | RUN_ENVIR | "nco" | "community" | - +--------------------------------+-------------------+--------------------------------------------------------+ - | PREEXISTING_DIR_METHOD | "delete" | "rename" | - +--------------------------------+-------------------+--------------------------------------------------------+ - | PREDEF_GRID_NAME | "" | "RRFS_CONUS_25km" | - +--------------------------------+-------------------+--------------------------------------------------------+ - | GRID_GEN_METHOD | "ESGgrid" | "ESGgrid" | - +--------------------------------+-------------------+--------------------------------------------------------+ - | QUILTING | "TRUE" | "TRUE" | - +--------------------------------+-------------------+--------------------------------------------------------+ - | CCPP_PHYS_SUITE | "FV3_GSD_V0" | "FV3_GFS_v16" | - +--------------------------------+-------------------+--------------------------------------------------------+ - | FCST_LEN_HRS | "24" | "48" | - +--------------------------------+-------------------+--------------------------------------------------------+ - | LBC_SPEC_INTVL_HRS | "6" | "6" | - +--------------------------------+-------------------+--------------------------------------------------------+ - | DATE_FIRST_CYCL | "YYYYMMDD" | "20190615" | - +--------------------------------+-------------------+--------------------------------------------------------+ - | DATE_LAST_CYCL | "YYYYMMDD" | "20190615" | - +--------------------------------+-------------------+--------------------------------------------------------+ - | CYCL_HRS | ("HH1" "HH2") | "00" | - +--------------------------------+-------------------+--------------------------------------------------------+ - | EXTRN_MDL_NAME_ICS | "FV3GFS" | "FV3GFS" | - +--------------------------------+-------------------+--------------------------------------------------------+ - | EXTRN_MDL_NAME_LBCS | "FV3GFS" | "FV3GFS" | - +--------------------------------+-------------------+--------------------------------------------------------+ - | FV3GFS_FILE_FMT_ICS | "nemsio" | "grib2" | - +--------------------------------+-------------------+--------------------------------------------------------+ - | FV3GFS_FILE_FMT_LBCS | "nemsio" | "grib2" | - +--------------------------------+-------------------+--------------------------------------------------------+ - | WTIME_RUN_FCST | "04:30:00" | "01:00:00" | - +--------------------------------+-------------------+--------------------------------------------------------+ - | 
USE_USER_STAGED_EXTRN_FILES | "FALSE" | "TRUE" | - +--------------------------------+-------------------+--------------------------------------------------------+ - | EXTRN_MDL_SOURCE_BASE_DIR_ICS | "" | "/scratch2/BMC/det/UFS_SRW_app/v1p0/model_data/FV3GFS" | - +--------------------------------+-------------------+--------------------------------------------------------+ - | EXTRN_MDL_FILES_ICS | "" | "gfs.pgrb2.0p25.f000" | - +--------------------------------+-------------------+--------------------------------------------------------+ - | EXTRN_MDL_SOURCE_BASEDIR_LBCS | "" | "/scratch2/BMC/det/UFS_SRW_app/v1p0/model_data/FV3GFS" | - +--------------------------------+-------------------+--------------------------------------------------------+ - | EXTRN_MDL_FILES_LBCS | "" | "gfs.pgrb2.0p25.f006" | - +--------------------------------+-------------------+--------------------------------------------------------+ - | MODEL | "" | FV3_GFS_v16_CONUS_25km" | - +--------------------------------+-------------------+--------------------------------------------------------+ - | METPLUS_PATH | "" | "/path/to/METPlus" | - +--------------------------------+-------------------+--------------------------------------------------------+ - | MET_INSTALL_DIR | "" | "/path/to/MET" | - +--------------------------------+-------------------+--------------------------------------------------------+ - | CCPA_OBS_DIR | "" | "/path/to/processed/CCPA/data" | - +--------------------------------+-------------------+--------------------------------------------------------+ - | MRMS_OBS_DIR | "" | "/path/to/processed/MRMS/data" | - +--------------------------------+-------------------+--------------------------------------------------------+ - | NDAS_OBS_DIR | "" | "/path/to/processed/NDAS/data" | - +--------------------------------+-------------------+--------------------------------------------------------+ - | RUN_TASK_GET_OBS_CCPA | "FALSE" | "FALSE" | - +--------------------------------+-------------------+--------------------------------------------------------+ - | RUN_TASK_GET_OBS_MRMS | "FALSE" | "FALSE" | - +--------------------------------+-------------------+--------------------------------------------------------+ - | RUN_TASK_GET_OBS_NDAS | "FALSE" | "FALSE" | - +--------------------------------+-------------------+--------------------------------------------------------+ - | RUN_TASK_VX_GRIDSTAT | "FALSE" | "FALSE" | - +--------------------------------+-------------------+--------------------------------------------------------+ - | RUN_TASK_VX_POINTSTAT | "FALSE" | "FALSE" | - +--------------------------------+-------------------+--------------------------------------------------------+ - | RUN_TASK_VX_ENSGRID | "FALSE" | "FALSE" | - +--------------------------------+-------------------+--------------------------------------------------------+ - | RUN_TASK_VX_ENSPOINT | "FALSE" | "FALSE" | - +--------------------------------+-------------------+--------------------------------------------------------+ - + +--------------------------------+-------------------+----------------------------------------------------------------------------------+ + | **Parameter** | **Default Value** | **config.community.sh Value** | + +================================+===================+==================================================================================+ + | MACHINE | "BIG_COMPUTER" | "hera" | + 
+--------------------------------+-------------------+----------------------------------------------------------------------------------+ + | ACCOUNT | "project_name" | "an_account" | + +--------------------------------+-------------------+----------------------------------------------------------------------------------+ + | EXPT_SUBDIR | "" | "test_CONUS_25km_GFSv16" | + +--------------------------------+-------------------+----------------------------------------------------------------------------------+ + | COMPILER | "intel" | "intel" | + +--------------------------------+-------------------+----------------------------------------------------------------------------------+ + | VERBOSE | "TRUE" | "TRUE" | + +--------------------------------+-------------------+----------------------------------------------------------------------------------+ + | RUN_ENVIR | "nco" | "community" | + +--------------------------------+-------------------+----------------------------------------------------------------------------------+ + | PREEXISTING_DIR_METHOD | "delete" | "rename" | + +--------------------------------+-------------------+----------------------------------------------------------------------------------+ + | PREDEF_GRID_NAME | "" | "RRFS_CONUS_25km" | + +--------------------------------+-------------------+----------------------------------------------------------------------------------+ + | DO_ENSEMBLE | "FALSE" | "FALSE" | + +--------------------------------+-------------------+----------------------------------------------------------------------------------+ + | NUM_ENS_MEMBERS | "1" | "2" | + +--------------------------------+-------------------+----------------------------------------------------------------------------------+ + | QUILTING | "TRUE" | "TRUE" | + +--------------------------------+-------------------+----------------------------------------------------------------------------------+ + | CCPP_PHYS_SUITE | "FV3_GFS_v16" | "FV3_GFS_v16" | + +--------------------------------+-------------------+----------------------------------------------------------------------------------+ + | FCST_LEN_HRS | "24" | "12" | + +--------------------------------+-------------------+----------------------------------------------------------------------------------+ + | LBC_SPEC_INTVL_HRS | "6" | "6" | + +--------------------------------+-------------------+----------------------------------------------------------------------------------+ + | DATE_FIRST_CYCL | "YYYYMMDD" | "20190615" | + +--------------------------------+-------------------+----------------------------------------------------------------------------------+ + | DATE_LAST_CYCL | "YYYYMMDD" | "20190615" | + +--------------------------------+-------------------+----------------------------------------------------------------------------------+ + | CYCL_HRS | ("HH1" "HH2") | "18" | + +--------------------------------+-------------------+----------------------------------------------------------------------------------+ + | EXTRN_MDL_NAME_ICS | "FV3GFS" | "FV3GFS" | + +--------------------------------+-------------------+----------------------------------------------------------------------------------+ + | EXTRN_MDL_NAME_LBCS | "FV3GFS" | "FV3GFS" | + +--------------------------------+-------------------+----------------------------------------------------------------------------------+ + | FV3GFS_FILE_FMT_ICS | "nemsio" | "grib2" | + 
+   +--------------------------------+-------------------+----------------------------------------------------------------------------------+
+   | FV3GFS_FILE_FMT_LBCS           | "nemsio"          | "grib2"                                                                          |
+   +--------------------------------+-------------------+----------------------------------------------------------------------------------+
+   | WTIME_RUN_FCST                 | "04:30:00"        | "02:00:00"                                                                       |
+   +--------------------------------+-------------------+----------------------------------------------------------------------------------+
+   | USE_USER_STAGED_EXTRN_FILES    | "FALSE"           | "TRUE"                                                                           |
+   +--------------------------------+-------------------+----------------------------------------------------------------------------------+
+   | EXTRN_MDL_SOURCE_BASEDIR_ICS   | ""                | "/scratch2/BMC/det/UFS_SRW_App/develop/input_model_data/FV3GFS/grib2/2019061518" |
+   +--------------------------------+-------------------+----------------------------------------------------------------------------------+
+   | EXTRN_MDL_FILES_ICS            | ""                | "gfs.pgrb2.0p25.f000"                                                            |
+   +--------------------------------+-------------------+----------------------------------------------------------------------------------+
+   | EXTRN_MDL_SOURCE_BASEDIR_LBCS  | ""                | "/scratch2/BMC/det/UFS_SRW_App/develop/input_model_data/FV3GFS/grib2/2019061518" |
+   +--------------------------------+-------------------+----------------------------------------------------------------------------------+
+   | EXTRN_MDL_FILES_LBCS           | ""                | "gfs.pgrb2.0p25.f006" "gfs.pgrb2.0p25.f012"                                      |
+   +--------------------------------+-------------------+----------------------------------------------------------------------------------+
+   | MODEL                          | ""                | "FV3_GFS_v16_CONUS_25km"                                                         |
+   +--------------------------------+-------------------+----------------------------------------------------------------------------------+
+   | METPLUS_PATH                   | ""                | "/path/to/METPlus"                                                               |
+   +--------------------------------+-------------------+----------------------------------------------------------------------------------+
+   | MET_INSTALL_DIR                | ""                | "/path/to/MET"                                                                   |
+   +--------------------------------+-------------------+----------------------------------------------------------------------------------+
+   | CCPA_OBS_DIR                   | ""                | "/path/to/processed/CCPA/data"                                                   |
+   +--------------------------------+-------------------+----------------------------------------------------------------------------------+
+   | MRMS_OBS_DIR                   | ""                | "/path/to/processed/MRMS/data"                                                   |
+   +--------------------------------+-------------------+----------------------------------------------------------------------------------+
+   | NDAS_OBS_DIR                   | ""                | "/path/to/processed/NDAS/data"                                                   |
+   +--------------------------------+-------------------+----------------------------------------------------------------------------------+
+   | RUN_TASK_MAKE_GRID             | "TRUE"            | "TRUE"                                                                           |
+   +--------------------------------+-------------------+----------------------------------------------------------------------------------+
+   | RUN_TASK_MAKE_OROG             | "TRUE"            | "TRUE"                                                                           |
+   +--------------------------------+-------------------+----------------------------------------------------------------------------------+
+   | RUN_TASK_MAKE_SFC_CLIMO        | "TRUE"            | "TRUE"                                                                           |
+   +--------------------------------+-------------------+----------------------------------------------------------------------------------+
+   | RUN_TASK_GET_OBS_CCPA          | "FALSE"           | "FALSE"                                                                          |
+   +--------------------------------+-------------------+----------------------------------------------------------------------------------+
+   | RUN_TASK_GET_OBS_MRMS          | "FALSE"           | "FALSE"                                                                          |
+   +--------------------------------+-------------------+----------------------------------------------------------------------------------+
+   | RUN_TASK_GET_OBS_NDAS          | "FALSE"           | "FALSE"                                                                          |
+   +--------------------------------+-------------------+----------------------------------------------------------------------------------+
+   | RUN_TASK_VX_GRIDSTAT           | "FALSE"           | "FALSE"                                                                          |
+   +--------------------------------+-------------------+----------------------------------------------------------------------------------+
+   | RUN_TASK_VX_POINTSTAT          | "FALSE"           | "FALSE"                                                                          |
+   +--------------------------------+-------------------+----------------------------------------------------------------------------------+
+   | RUN_TASK_VX_ENSGRID            | "FALSE"           | "FALSE"                                                                          |
+   +--------------------------------+-------------------+----------------------------------------------------------------------------------+
+   | RUN_TASK_VX_ENSPOINT           | "FALSE"           | "FALSE"                                                                          |
+   +--------------------------------+-------------------+----------------------------------------------------------------------------------+
 
 To get started, make a copy of ``config.community.sh``. From the ``ufs-srweather-app`` directory, run:
 
@@ -636,7 +720,7 @@ Next, edit the new ``config.sh`` file to customize it for your machine. At a min
 
 .. note::
 
-   MacOS users should refer to :numref:`Section %s ` for details on configuring an experiment on MacOS.
+   Generic Linux and MacOS users should refer to :numref:`Section %s ` for details on configuring an experiment and Python environment.
 
 Sample settings are indicated below for Level 1 platforms. Detailed guidance applicable to all systems can be found in :numref:`Chapter %s: Configuring the Workflow `, which discusses each variable and the options available. Additionally, information about the three predefined Limited Area Model (LAM) Grid options can be found in :numref:`Chapter %s: Limited Area Model (LAM) Grids `.
@@ -650,6 +734,8 @@ Sample settings are indicated below for Level 1 platforms. Detailed guidance app
 
 Minimum parameter settings for running the out-of-the-box SRW App case on Level 1 machines:
 
+.. _SystemData:
+
 **Cheyenne:**
 
 .. code-block:: console
@@ -658,13 +744,13 @@ Minimum parameter settings for running the out-of-the-box SRW App case on Level
 
    ACCOUNT=""
    EXPT_SUBDIR=""
    USE_USER_STAGED_EXTRN_FILES="TRUE"
-   EXTRN_MDL_SOURCE_BASEDIR_ICS="/glade/p/ral/jntp/UFS_SRW_App/develop/input_model_data/<model_type>/<data_type>/<YYYYMMDDHH>/"
-   EXTRN_MDL_SOURCE_BASEDIR_LBCS="/glade/p/ral/jntp/UFS_SRW_App/develop/input_model_data/<model_type>/<data_type>/<YYYYMMDDHH>/"
+   EXTRN_MDL_SOURCE_BASEDIR_ICS="/glade/p/ral/jntp/UFS_SRW_App/v2p0/input_model_data/<model_type>/<data_type>/<YYYYMMDDHH>/"
+   EXTRN_MDL_SOURCE_BASEDIR_LBCS="/glade/p/ral/jntp/UFS_SRW_App/v2p0/input_model_data/<model_type>/<data_type>/<YYYYMMDDHH>/"
 
 where:
-* <model_type> refers to a subdirectory such as "FV3GFS" or "HRRR" containing the experiment data.
-* <data_type> refers to one of 3 possible data formats: ``grib2``, ``nemsio``, or ``netcdf``.
-* YYYYMMDDHH refers to a subdirectory containing data for the :term:`cycle` date.
+  * ``<model_type>`` refers to a subdirectory such as "FV3GFS" or "HRRR" containing the experiment data.
+  * ``<data_type>`` refers to one of 3 possible data formats: ``grib2``, ``nemsio``, or ``netcdf``.
+  * ``YYYYMMDDHH`` refers to a subdirectory containing data for the :term:`cycle` date.
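+
+For illustration only: with FV3GFS GRIB2 data for the 2019-06-15 18z cycle (the case used throughout this chapter), the Cheyenne setting above would expand to:
+
+.. code-block:: console
+
+   EXTRN_MDL_SOURCE_BASEDIR_ICS="/glade/p/ral/jntp/UFS_SRW_App/v2p0/input_model_data/FV3GFS/grib2/2019061518/"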
 
 **Hera, Jet, Orion, Gaea:**
 
@@ -675,55 +761,37 @@ On Hera:
 
 .. code-block:: console
 
-   "/scratch2/BMC/det/UFS_SRW_App/develop/input_model_data/<model_type>/<data_type>/YYYYMMDDHH/"
+   "/scratch2/BMC/det/UFS_SRW_App/v2p0/input_model_data/<model_type>/<data_type>/YYYYMMDDHH/"
 
 On Jet:
 
 .. code-block:: console
 
-   "/mnt/lfs4/BMC/wrfruc/UFS_SRW_App/develop/input_model_data/<model_type>/<data_type>/YYYYMMDDHH/"
+   "/mnt/lfs4/BMC/wrfruc/UFS_SRW_App/v2p0/input_model_data/<model_type>/<data_type>/YYYYMMDDHH/"
 
 On Orion:
 
 .. code-block:: console
 
-   "/work/noaa/fv3-cam/UFS_SRW_App/develop/input_model_data/<model_type>/<data_type>/YYYYMMDDHH/"
+   "/work/noaa/fv3-cam/UFS_SRW_App/v2p0/input_model_data/<model_type>/<data_type>/YYYYMMDDHH/"
 
 On Gaea:
 
 .. code-block:: console
 
-   "/lustre/f2/pdata/ncep/UFS_SRW_App/develop/input_model_data/<model_type>/<data_type>/YYYYMMDDHH/"
+   "/lustre/f2/pdata/ncep/UFS_SRW_App/v2p0/input_model_data/<model_type>/<data_type>/YYYYMMDDHH/"
 
-For **WCOSS** systems, edit ``config.sh`` with these WCOSS-specific parameters, and use a valid WCOSS project code for the account parameter:
-
-.. code-block:: console
-
-   MACHINE="wcoss_cray" or MACHINE="wcoss_dell_p3"
-   ACCOUNT="valid_wcoss_project_code"
-   EXPT_SUBDIR="my_expt_name"
-   USE_USER_STAGED_EXTRN_FILES="TRUE"
-
-For WCOSS_DELL_P3:
-
-.. code-block:: console
-
-   EXTRN_MDL_SOURCE_BASEDIR_ICS="/gpfs/dell2/emc/modeling/noscrub/UFS_SRW_App/develop/model_data/<model_type>/<data_type>/YYYYMMDDHH/ICS"
-   EXTRN_MDL_SOURCE_BASEDIR_LBCS="/gpfs/dell2/emc/modeling/noscrub/UFS_SRW_App/develop/input_model_data/<model_type>/<data_type>/YYYYMMDDHH/LBCS"
-
-**NOAA Cloud Systems:**
+On NOAA Cloud Systems:
 
 .. code-block:: console
 
    MACHINE="NOAACLOUD"
    ACCOUNT="none"
    EXPT_SUBDIR=""
-   EXPT_BASEDIR="lustre/$USER/expt_dirs"
-   COMPILER="gnu"
    USE_USER_STAGED_EXTRN_FILES="TRUE"
-   EXTRN_MDL_SOURCE_BASEDIR_ICS="/contrib/EPIC/UFS_SRW_App/develop/input_model_data/FV3GFS"
+   EXTRN_MDL_SOURCE_BASEDIR_ICS="/contrib/EPIC/UFS_SRW_App/v2p0/input_model_data/FV3GFS/grib2/YYYYMMDDHH/"
    EXTRN_MDL_FILES_ICS=( "gfs.t18z.pgrb2.0p25.f000" )
-   EXTRN_MDL_SOURCE_BASEDIR_LBCS="/contrib/EPIC/UFS_SRW_App/develop/input_model_data/FV3GFS"
+   EXTRN_MDL_SOURCE_BASEDIR_LBCS="/contrib/EPIC/UFS_SRW_App/v2p0/input_model_data/FV3GFS/grib2/YYYYMMDDHH/"
   EXTRN_MDL_FILES_LBCS=( "gfs.t18z.pgrb2.0p25.f006" "gfs.t18z.pgrb2.0p25.f012" )
 
 .. note::
@@ -731,132 +799,73 @@ For WCOSS_DELL_P3:
 
    The values of the configuration variables should be consistent with those in the ``valid_param_vals`` script. In addition, various example configuration files can be found in the ``regional_workflow/tests/baseline_configs`` directory.
 
-.. _VXConfig:
-
-Configure METplus Verification Suite (Optional)
---------------------------------------------------
-
-Users who want to use the METplus verification suite to evaluate their forecasts need to add additional information to their ``config.sh`` file. Other users may skip to the :ref:`next section `.
-
-.. attention::
-   METplus *installation* is not included as part of the build process for this release of the SRW App. However, METplus is preinstalled on `Level 1 `__ systems. For the v2 release, METplus *use* is supported on systems with a functioning METplus installation, although installation itself is not supported. For more information about METplus, see :numref:`Section %s `.
-
-.. note::
-   If METplus users update their METplus installation, they must update the module load statements in ``ufs-srweather-app/regional_workflow/modulefiles/tasks//run_vx.local`` file to correspond to their system's updated installation:
-
-   .. code-block:: console
-
-      module use -a
-      module load met/
-
-To use METplus verification, the path to the MET and METplus directories must be added to ``config.sh``:
-
-.. 
code-block:: console - - METPLUS_PATH="" - MET_INSTALL_DIR="" - -Users who have already staged the observation data needed for METplus (i.e., the :term:`CCPA`, :term:`MRMS`, and :term:`NDAS` data) on their system should set the path to this data and set the corresponding ``RUN_TASK_GET_OBS_*`` parameters to "FALSE" in ``config.sh``. - -.. code-block:: console - - CCPA_OBS_DIR="/path/to/UFS_SRW_app/develop/obs_data/ccpa/proc" - MRMS_OBS_DIR="/path/to/UFS_SRW_app/develop/obs_data/mrms/proc" - NDAS_OBS_DIR="/path/to/UFS_SRW_app/develop/obs_data/ndas/proc" - RUN_TASK_GET_OBS_CCPA="FALSE" - RUN_TASK_GET_OBS_MRMS="FALSE" - RUN_TASK_GET_OBS_NDAS="FALSE" - -If users have access to NOAA HPSS but have not pre-staged the data, they can simply set the ``RUN_TASK_GET_OBS_*`` tasks to "TRUE", and the machine will attempt to download the appropriate data from NOAA HPSS. The ``*_OBS_DIR`` paths must be set to the location where users want the downloaded data to reside. - -Users who do not have access to NOAA HPSS and do not have the data on their system will need to download :term:`CCPA`, :term:`MRMS`, and :term:`NDAS` data manually from collections of publicly available data, such as the ones listed `here `__. - -Next, the verification tasks must be turned on according to the user's needs. Users should add some or all of the following tasks to ``config.sh``, depending on the verification procedure(s) they have in mind: - -.. code-block:: console - RUN_TASK_VX_GRIDSTAT="TRUE" - RUN_TASK_VX_POINTSTAT="TRUE" - RUN_TASK_VX_ENSGRID="TRUE" - RUN_TASK_VX_ENSPOINT="TRUE" +To configure an experiment and python environment for a general Linux or Mac system, see the :ref:`next section `. Otherwise, skip to :numref:`Section %s `. -These tasks are independent, so users may set some values to "TRUE" and others to "FALSE" depending on the needs of their experiment. Note that the ENSGRID and ENSPOINT tasks apply only to ensemble model verification. Additional verification tasks appear in :numref:`Table %s ` More details on all of the parameters in this section are available in :numref:`Chapter %s `. +.. _LinuxMacEnvConfig: -.. _SetUpPythonEnv: +User-specific Configuration on a General Linux/MacOS System +^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -Set Up the Python and Other Environment Parameters ----------------------------------------------------- -The workflow requires Python 3 with the packages 'PyYAML', 'Jinja2', and 'f90nml' available. This Python environment has already been set up on Level 1 platforms, and it can be activated in the following way (from ``/ufs-srweather-app/regional_workflow/ush``): - -.. code-block:: console - - module use - module load wflow_ - conda activate regional_workflow - -This command will activate the ``regional_workflow`` conda environment. The user should see ``(regional_workflow)`` in front of the Terminal prompt at this point. If this is not the case, activate the regional workflow from the ``ush`` directory by running: - -.. code-block:: console - - conda init - source ~/.bashrc - conda activate regional_workflow - -.. _MacConfig: - -Configuring an Experiment on MacOS ------------------------------------------------------------- - -In principle, the configuration process for MacOS systems is the same as for other systems. However, the details of the configuration process on MacOS require a few extra steps. +The configuration process for Linux and MacOS systems is similar to the process for other systems, but it requires a few extra steps. .. 
note::
   Examples in this subsection presume that the user is running Terminal.app with a bash shell environment. If this is not the case, users will need to adjust the commands to fit their command line application and shell environment.
 
 .. _MacMorePackages:
 
-Install Additional Packages
-^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-Check the version of bash, and upgrade it if it is lower than 4. Additionally, install the ``coreutils`` package:
+Install/Upgrade Mac-Specific Packages
+````````````````````````````````````````
+MacOS requires the installation of a few additional packages and, possibly, an upgrade to bash. Users running on MacOS should execute the following commands:
 
 .. code-block:: console
 
    bash --version
   brew upgrade bash
   brew install coreutils
-
-.. _MacVEnv:
-
-Create a Python Virtual Environment
-^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+.. _LinuxMacVEnv:
+
+Creating a Virtual Environment on Linux and Mac
+``````````````````````````````````````````````````
+
+Users should ensure that the following packages are installed and up-to-date:
-
-Users must create a python virtual environment for running the SRW on MacOS. This involves setting python3 as default, adding required python modules, and sourcing the ``regional_workflow``.
-
 .. code-block:: console
 
    python3 -m pip --version
   python3 -m pip install --upgrade pip
   python3 -m ensurepip --default-pip
+   python3 -m pip install ruby OR (on MacOS only): brew install ruby
+
+Users must create a virtual environment (``regional_workflow``), store it in their ``$HOME/venv/`` directory, and install additional Python packages:
+
+.. code-block:: console
+
+   [[ -d $HOME/venv ]] || mkdir -p $HOME/venv
   python3 -m venv $HOME/venv/regional_workflow
   source $HOME/venv/regional_workflow/bin/activate
   python3 -m pip install jinja2
   python3 -m pip install pyyaml
   python3 -m pip install f90nml
-   python3 -m pip install ruby OR: brew install ruby
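+
+As an optional sanity check (an illustrative command, not part of the original instructions; note that ``pyyaml`` is imported as ``yaml``), users can confirm that the three required packages import cleanly inside the activated environment:
+
+.. code-block:: console
+
+   python3 -c "import jinja2, yaml, f90nml; print('OK')"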
-The virtual environment can be deactivated by running the ``deactivate`` command. The virtual environment built here will be reactivated in :numref:`Step %s ` and needs to be used to generate the workflow and run the experiment.
+The virtual environment can be deactivated by running the ``deactivate`` command. The virtual environment built here will be reactivated in :numref:`Step %s ` and needs to be used to generate the workflow and run the experiment.
+
+
+.. _LinuxMacExptConfig:
+
+Configuring an Experiment on General Linux and MacOS Systems
+``````````````````````````````````````````````````````````````
+
+**Optional: Install Rocoto**
 
 .. note::
-   Users may `install Rocoto `__ if they want to make use of a workflow manager to run their experiments. However, this option has not been tested yet on MacOS and is not supported for this release.
+   Users may `install Rocoto `__ if they want to make use of a workflow manager to run their experiments. However, this option has not been tested yet on MacOS and has had only limited testing on general Linux platforms.
 
-Configure the SRW App
-^^^^^^^^^^^^^^^^^^^^^^^^
+**Configure the SRW App:**
 
-Users will need to configure their experiment just like on any other system. From the ``$SRW/regional_workflow/ush`` directory, users can copy the settings from ``config.community.sh`` into a ``config.sh`` file (see :numref:`Section %s `) above. In the ``config.sh`` file, users should set ``MACHINE="macos"`` and modify additional variables as needed. For example:
+Configure an experiment using a template: copy the ``config.community.sh`` file in the ``$SRW/regional_workflow/ush`` directory to a new ``config.sh`` file (see :numref:`Section %s ` above). In the ``config.sh`` file, set ``MACHINE="macos"`` or ``MACHINE="linux"``, and modify the account and experiment information. For example:
 
 .. code-block:: console
@@ -891,11 +900,12 @@ For :ref:`Option 2 `, add the following information to ``config.sh``:
 
    WRTCMP_write_groups="1"
   WRTCMP_write_tasks_per_group="1"
 
-Configure the Machine File
-^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-Configure the machine file based on the number of CPUs in the system (8 or 4). Specify the following variables in ``$SRW/regional_workflow/ush/machine/macos.sh``:
+.. note::
+   The number of MPI processes required by the forecast will be equal to ``LAYOUT_X * LAYOUT_Y + WRTCMP_write_tasks_per_group``.
+
+**Configure the Machine File**
 
-For Option 1 (8 CPUs):
+Configure a ``macos.sh`` or ``linux.sh`` machine file in ``$SRW/regional_workflow/ush/machine/`` based on the number of CPUs in the system (``<ncores>``): 8 or 4 on MacOS, or the appropriate number on a Linux system. Set the job scheduler (``<scheduler>``) to ``none``, ``slurm``, or another scheduler used by the system.
 
 .. code-block:: console
@@ -904,8 +914,8 @@ For Option 1 (8 CPUs):
 
   # Architecture information
   WORKFLOW_MANAGER="none"
-  NCORES_PER_NODE=${NCORES_PER_NODE:-8}
-  SCHED=${SCHED:-"none"}
+  NCORES_PER_NODE=${NCORES_PER_NODE:-<ncores>}
+  SCHED=${SCHED:-"<scheduler>"}
 
   # UFS SRW App specific paths
   FIXgsm="path/to/FIXgsm/files"
@@ -922,21 +932,99 @@ For Option 1 (8 CPUs):
 
   RUN_CMD_FCST='mpirun -np ${PE_MEMBER01}'
   RUN_CMD_POST="mpirun -np 4"
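+
+As an illustration only, on a 4-CPU MacOS or Linux system with no batch scheduler, the two placeholder lines above could be filled in as follows:
+
+.. code-block:: console
+
+   NCORES_PER_NODE=${NCORES_PER_NODE:-4}
+   SCHED=${SCHED:-"none"}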
-Using Option 2 with 4 CPUs requires ``NCORES_PER_NODE=${NCORES_PER_NODE:-4}`` in the above example.
-
-.. _MacActivateWFenv:
-
-Activate the Workflow Environment
-^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+.. _VXConfig:
+
+Configure METplus Verification Suite (Optional)
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+Users who want to use the METplus verification suite to evaluate their forecasts need to add additional information to their ``config.sh`` file. Other users may skip to the :ref:`next section `.
+
+.. attention::
+   METplus *installation* is not included as part of the build process for this release of the SRW App. However, METplus is preinstalled on many `Level 1 & 2 `__ systems. For the v2 release, METplus *use* is supported on systems with a functioning METplus installation, although installation itself is not supported. For more information about METplus, see :numref:`Section %s `.
 
-The ``regional_workflow`` environment can be activated on MacOS as it is for any other system:
+.. note::
+   If METplus users update their METplus installation, they must update the module load statements in the ``ufs-srweather-app/regional_workflow/modulefiles/tasks//run_vx.local`` file to correspond to their system's updated installation:
+
+   .. code-block:: console
+
+      module use -a
+      module load met/
+
+To use METplus verification, the path to the MET and METplus directories must be added to ``config.sh``:
+
+.. code-block:: console
+
+   METPLUS_PATH=""
+   MET_INSTALL_DIR=""
+
+Users who have already staged the observation data needed for METplus (i.e., the :term:`CCPA`, :term:`MRMS`, and :term:`NDAS` data) on their system should set the path to this data and set the corresponding ``RUN_TASK_GET_OBS_*`` parameters to "FALSE" in ``config.sh``.
+
+.. code-block:: console
+
+   CCPA_OBS_DIR="/path/to/UFS_SRW_App/v2p0/obs_data/ccpa/proc"
+   MRMS_OBS_DIR="/path/to/UFS_SRW_App/v2p0/obs_data/mrms/proc"
+   NDAS_OBS_DIR="/path/to/UFS_SRW_App/v2p0/obs_data/ndas/proc"
+   RUN_TASK_GET_OBS_CCPA="FALSE"
+   RUN_TASK_GET_OBS_MRMS="FALSE"
+   RUN_TASK_GET_OBS_NDAS="FALSE"
+
+If users have access to NOAA HPSS but have not pre-staged the data, they can simply set the ``RUN_TASK_GET_OBS_*`` tasks to "TRUE", and the machine will attempt to download the appropriate data from NOAA HPSS. The ``*_OBS_DIR`` paths must be set to the location where users want the downloaded data to reside.
+
+Users who do not have access to NOAA HPSS and do not have the data on their system will need to download :term:`CCPA`, :term:`MRMS`, and :term:`NDAS` data manually from collections of publicly available data, such as the ones listed `here `__.
+
+Next, the verification tasks must be turned on according to the user's needs. Users should add some or all of the following tasks to ``config.sh``, depending on the verification procedure(s) they have in mind:
+
+.. code-block:: console
+
+   RUN_TASK_VX_GRIDSTAT="TRUE"
+   RUN_TASK_VX_POINTSTAT="TRUE"
+   RUN_TASK_VX_ENSGRID="TRUE"
+   RUN_TASK_VX_ENSPOINT="TRUE"
+
+These tasks are independent, so users may set some values to "TRUE" and others to "FALSE" depending on the needs of their experiment. Note that the ENSGRID and ENSPOINT tasks apply only to ensemble model verification. Additional verification tasks appear in :numref:`Table %s `. More details on all of the parameters in this section are available in :numref:`Chapter %s `.
+
+.. _SetUpPythonEnv:
+
+Set Up the Python and Other Environment Parameters
+----------------------------------------------------
+
+The workflow requires Python 3 with the packages ``PyYAML``, ``Jinja2``, and ``f90nml`` available. This Python environment has already been set up on Level 1 platforms, and it can be activated in the following way (from ``/ufs-srweather-app/regional_workflow/ush``):
+
+.. code-block:: console
+
+   module use <path/to/modulefiles>
+   module load wflow_<platform>
+
+The ``wflow_<platform>`` modulefile will then output instructions to activate the regional workflow. The user should run the commands specified in the modulefile output. For example, if the output says:
+
+.. code-block:: console
+
+   Please do the following to activate conda:
+   > conda activate regional_workflow
+
+then the user should run ``conda activate regional_workflow``. This will activate the ``regional_workflow`` conda environment. However, the command(s) will vary from system to system. Regardless, the user should see ``(regional_workflow)`` in front of the Terminal prompt at this point. If this is not the case, activate the regional workflow from the ``ush`` directory by running:
+
+.. code-block:: console
+
+   conda init
+   source ~/.bashrc
+   conda activate regional_workflow
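+
+After activation by either method, the start of the Terminal prompt should resemble the following (illustrative only; the user name, host, and working directory will differ):
+
+.. code-block:: console
+
+   (regional_workflow) [user@hostname ush]$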
+
+.. _LinuxMacActivateWFenv:
+
+Activate the Workflow Environment on General MacOS/Linux Systems
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+The ``regional_workflow`` environment can be activated as follows for ``<platform>="macos"`` or ``<platform>="linux"``:
 
 .. code-block:: console
 
    cd $SRW/regional_workflow/ush
-   module load wflow_macos
+   module load wflow_<platform>
 
-This should activate the ``regional_workflow`` environment created in :numref:`Step %s `. From here, the user may continue to the :ref:`next step ` and generate the regional workflow.
+This should activate the ``regional_workflow`` environment created in :numref:`Step %s `. From here, the user may continue to :numref:`Section %s ` to generate the regional workflow.
 
 .. _GenerateWorkflow:
@@ -952,6 +1040,8 @@ Run the following command from the ``ufs-srweather-app/regional_workflow/ush`` d
 
 The last line of output from this script, starting with ``*/1 * * * *`` or ``*/3 * * * *``, can be saved and :ref:`used later ` to automatically run portions of the workflow.
 
+When not using Rocoto on Linux or MacOS systems, the experiment can be launched using stand-alone scripts, as described in :numref:`Section %s `.
+
 This workflow generation script creates an experiment directory and populates it with all the data needed to run through the workflow. The flowchart in :numref:`Figure %s ` describes the experiment generation process. First, ``generate_FV3LAM_wflow.sh`` runs the ``setup.sh`` script to set the configuration parameters. Second, it copies the time-independent (fix) files and other necessary data input files from their location in the ufs-weather-model directory to the experiment directory (``EXPTDIR``). Third, it copies the weather model executable (``ufs_model``) from the ``bin`` directory to ``EXPTDIR`` and creates the input namelist file ``input.nml`` based on the ``input.nml.FV3`` file in the regional_workflow/ush/templates directory. Lastly, it creates the workflow XML file ``FV3LAM_wflow.xml`` that is executed when running the experiment with the Rocoto workflow manager.
 
 The ``setup.sh`` script reads three other configuration scripts in order: (1) ``config_default.sh`` (:numref:`Section %s `), (2) ``config.sh`` (:numref:`Section %s `), and (3) ``set_predef_grid_params.sh`` (:numref:`Section %s `). If a parameter is specified differently in these scripts, the file containing the last defined value will be used.
 
@@ -960,9 +1050,11 @@ The generated workflow will appear in ``EXPTDIR``, where ``EXPTDIR=${EXPT_BASEDI
 
 .. _WorkflowGeneration:
 
-.. figure:: _static/FV3regional_workflow_gen.png
+.. figure:: _static/FV3regional_workflow_gen_v2.png
+   :alt: Flowchart of the workflow generation process. Scripts are called in the following order: source_util_funcs.sh (which calls bash_utils), then set_FV3nml_sfc_climo_filenames.sh, set_FV3nml_stock_params.sh, create_diag_table_files.sh, and setup.sh. setup.sh calls several scripts: set_cycle_dates.sh, set_grid_params_GFDLgrid.sh, set_grid_params_ESGgrid.sh, link_fix.sh, set_ozone_param.sh, set_Thompson_mp_fix_files.sh, config_defaults.sh, config.sh, and valid_param_vals.sh. Then, it sets a number of variables, including FIXgsm, TOPO_DIR, and SFC_CLIMO_INPUT_DIR variables. Next, set_predef_grid_params.sh is called, and the FIXam and FIXLAM directories are set, along with the forecast input files. The setup script also calls set_extrn_mdl_params.sh, sets the GRID_GEN_METHOD with HALO, checks various parameters, and generates shell scripts. Then, the workflow generation script sets up YAML-compliant strings and generates the actual Rocoto workflow XML file from the template file (fill_jinja_template.py). The workflow generation script checks the crontab file and, if applicable, copies certain fix files to the experiment directory. Then, it copies templates of various input files to the experiment directory and sets parameters for the input.nml file. Finally, it generates the workflow. Additional information on each step appears in comments within each script.
+
+   *Experiment generation description*
 
-   *Experiment generation description*
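+
+As an illustration (assuming the community configuration shown earlier, so that ``EXPT_SUBDIR="test_CONUS_25km_GFSv16"``; the listing below is abbreviated), the key generated files can be verified with:
+
+.. code-block:: console
+
+   ls $EXPTDIR
+   FV3LAM_wflow.xml  input.nml  ufs_model  ...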
 
 .. _WorkflowTaskDescription:
 
@@ -984,8 +1076,9 @@ Description of Workflow Tasks
 
 .. _WorkflowTasksFig:
 
 .. figure:: _static/FV3LAM_wflow_flowchart_v2.png
+   :alt: Flowchart of the workflow tasks. If the make_grid, make_orog, and make_sfc_climo tasks are toggled off, they will not be run. If toggled on, make_grid, make_orog, and make_sfc_climo will run consecutively by calling the corresponding exregional script in the regional_workflow/scripts directory. The get_ics, get_lbcs, make_ics, make_lbcs, and run_fcst tasks call their respective exregional scripts. The run_post task will run, and if METplus verification tasks have been configured, those will run during post-processing by calling their exregional scripts.
 
-   *Flowchart of the workflow tasks*
+   *Flowchart of the workflow tasks*
 
 The ``FV3LAM_wflow.xml`` file runs the specific j-job scripts (``regional_workflow/jobs/JREGIONAL_[task name]``) in the prescribed order when the experiment is launched via the ``launch_FV3LAM_wflow.sh`` script or the ``rocotorun`` command. Each j-job task has its own source script (or "ex-script") named ``exregional_[task name].sh`` in the ``regional_workflow/scripts`` directory. Two database files named ``FV3LAM_wflow.db`` and ``FV3LAM_wflow_lock.db`` are generated and updated by the Rocoto calls. There is usually no need for users to modify these files. To relaunch the workflow from scratch, delete these two ``*.db`` files and then call the launch script repeatedly for each task.
 
@@ -1239,6 +1332,7 @@ If users choose to run METplus verification tasks as part of their experiment, t
 
    201906150000       run_gridstatvx_24h       30468493       SUCCEEDED       0       1       20.0
   201906150000          run_pointstatvx       30468423       SUCCEEDED       0       1      670.0
 
+.. _RocotoManualRun:
 
 Launch the Rocoto Workflow Manually
 ---------------------------------------
 
@@ -1283,7 +1377,16 @@ Automated Option
 ----------------------
 
-For automatic resubmission of the workflow at regular intervals (e.g., every minute), the user can add a crontab entry using the ``crontab -e`` command. As mentioned in :numref:`Section %s `, the last line of output from ``./generate_FV3LAM_wflow.sh`` (starting with ``*/1 * * * *`` or ``*/3 * * * *``), can be pasted into the crontab file. It can also be found in the ``$EXPTDIR/log.generate_FV3LAM_wflow`` file. The crontab entry should resemble the following:
+
+For automatic resubmission of the workflow at regular intervals (e.g., every minute), the user can add the following settings to their ``config.sh`` file:
+
+.. code-block:: console
+
+   USE_CRON_TO_RELAUNCH="TRUE"
+   CRON_RELAUNCH_INTVL_MNTS="02"
+
+
+Alternatively, the user can add a crontab entry using the ``crontab -e`` command. As mentioned in :numref:`Section %s `, the last line of output from ``./generate_FV3LAM_wflow.sh`` (starting with ``*/1 * * * *`` or ``*/3 * * * *``) can be pasted into the crontab file. It can also be found in the ``$EXPTDIR/log.generate_FV3LAM_wflow`` file. The crontab entry should resemble the following:
 
 .. 
code-block:: console diff --git a/docs/UsersGuide/source/CompleteTests.csv b/docs/UsersGuide/source/CompleteTests.csv deleted file mode 100644 index 28cc46162f..0000000000 --- a/docs/UsersGuide/source/CompleteTests.csv +++ /dev/null @@ -1,28 +0,0 @@ -Grid,ICS,LBCS,Suite,Date,Time (UTC),Script Name,Test Type -RRFS_CONUS_3km,FV3GFS,FV3GFS,GFS_v16,2019-07-01,00,config.grid_RRFS_CONUS_3km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v16.sh,Complete -RRFS_CONUS_25km,HRRR,RAP,RRFS_v1beta,2020-08-10,00,config.grid_RRFS_CONUScompact_25km_ics_HRRR_lbcs_RAP_suite_HRRR.sh,Complete -RRFS_CONUS_13km,HRRR,RAP,RRFS_v1beta,2020-08-01,00,config.grid_RRFS_CONUScompact_13km_ics_HRRR_lbcs_RAP_suite_RRFS_v1beta.sh,Complete -RRFS_CONUS_3km,HRRR,RAP,RRFS_v1beta,2020-08-01,00,config.grid_RRFS_CONUScompact_3km_ics_HRRR_lbcs_RAP_suite_RRFS_v1beta.sh,Complete -RRFS_CONUS_25km,HRRR,RAP,HRRR,2020-08-10,00,config.grid_RRFS_CONUScompact_25km_ics_HRRR_lbcs_RAP_suite_HRRR.sh,Complete -RRFS_CONUS_13km,HRRR,RAP,HRRR,2020-08-10,00,config.grid_RRFS_CONUScompact_13km_ics_HRRR_lbcs_RAP_suite_HRRR.sh,Complete -RRFS_CONUS_3km,HRRR,RAP,HRRR,2020-08-10,00,config.grid_RRFS_CONUScompact_3km_ics_HRRR_lbcs_RAP_suite_HRRR.sh,Complete -RRFS_CONUS_25km,FV3GFS,FV3GFS,FV3_GFS_2017_gfdlmp,2019-07-01,"00,12",config.community_ensemble_008mems.sh,Complete/wflow -RRFS_CONUS_25km,FV3GFS,FV3GFS,FV3_GFS_2017_gfdlmp,2019-07-01,"00,12",config.community_ensemble_2mems.sh,Complete/wflow -RRFS_CONUS_25km,FV3GFS,FV3GFS,FV3_GFS_2017_gfdlmp,2019-07-02,"00,12",config.community_ensemble_008mems.sh,Complete/wflow -RRFS_CONUS_25km,FV3GFS,FV3GFS,FV3_GFS_2017_gfdlmp,2019-07-02,"00,12",config.community_ensemble_2mems.sh,Complete/wflow -RRFS_CONUS_25km,FV3GFS,FV3GFS,FV3_GFS_v15p2,2019-07-01,00,config.deactivate_tasks.sh,Complete/wflow -RRFS_CONUS_25km,FV3GFS,FV3GFS,FV3_GFS_v15p2,2019-07-01,00,config.inline_post.sh,Complete/wflow -RRFS_CONUS_25km,FV3GFS,FV3GFS,FV3_GFS_v15p2,2019-06-15,00,config.MET_ensemble_verification.sh,Complete/wflow -RRFS_CONUS_25km,FV3GFS,FV3GFS,FV3_GFS_v15p2,2019-06-15,00,config.MET_verification.sh,Complete/wflow -ESGgrid,FV3GFS,FV3GFS,FV3_GFS_2017_gfdlmp_regional,2019-07-01,00,config.new_ESGgrid.sh,Complete/wflow -GFDLgrid,FV3GFS,FV3GFS,FV3_GFS_2017_gfdlmp,2019-07-01,00,config.new_GFDLgrid.sh,Complete/wflow -GFDLgrid,FV3GFS,FV3GFS,FV3_GFS_2017_gfdlmp,2019-07-01,00,config.new_GFDLgrid__GFDLgrid_USE_GFDLgrid_RES_IN_FILENAMES_eq_FALSE.sh,Complete/wflow -GFDLgrid,FV3GFS,FV3GFS,FV3_GFS_2017_gfdlmp,2019-07-01,00,config.new_GFDLgrid__GFDLgrid_USE_GFDLgrid_RES_IN_FILENAMES_eq_TRUE.sh,Complete/wflow -RRFS_CONUS_25km,FV3GFS,FV3GFS,FV3_GFS_v15p2,2019-07-01,00,config.pregen_grid_orog_sfc_climo.sh,Complete/wflow -RRFS_CONUS_25km,GSMGFS,GSMGFS,FV3_GFS_2017_gfdlmp,2019-05-20,00,config.specify_DOT_OR_USCORE.sh,Complete/wflow -RRFS_CONUScompact_25km,HRRR,RAP,FV3_HRRR,2020-08-01,00,config.specify_DT_ATMOS_LAYOUT_XY_BLOCKSIZE.sh,Complete/wflow -RRFS_CONUS_25km,FV3GFS,FV3GFS,FV3_GFS_v15p2,2021-06-03,06,config.specify_EXTRN_MDL_SYSBASEDIR_ICS_LBCS.sh,Complete/wflow -RRFS_CONUS_25km,FV3GFS,FV3GFS,FV3_GFS_v15p2,2019-07-01,00,config.specify_RESTART_INTERVAL.sh,Complete/wflow -RRFS_CONUScompact_25km,HRRR,RAP,FV3_RRFS_v1beta,2020-08-10,00,config.subhourly_post_ensemble_2mems.sh,Complete/wflow -RRFS_CONUScompact_25km,HRRR,RAP,FV3_RRFS_v1beta,2020-08-10,00,config.subhourly_post.sh,Complete/wflow -RRFS_CONUS_25km,FV3GFS,FV3GFS,FV3_GFS_v15p2,2019-07-01,00,config.specify_template_filenames.sh,Complete/wflow \ No newline at end of file diff --git 
a/docs/UsersGuide/source/Components.rst b/docs/UsersGuide/source/Components.rst index 028c87c851..cbf9a5d28a 100644 --- a/docs/UsersGuide/source/Components.rst +++ b/docs/UsersGuide/source/Components.rst @@ -19,30 +19,33 @@ These components are documented within this User's Guide and supported through a Pre-processor Utilities and Initial Conditions ============================================== -The SRW Application includes a number of pre-processing utilities that initialize and prepare the model. Since the SRW App provides forecast predictions over a limited area (rather than globally), it is necessary to first generate a regional grid (``regional_esg_grid/make_hgrid``) along with :term:`orography` (``orog``) and surface climatology (``sfc_climo_gen``) files on that grid. Grids include a strip, or "halo," of six cells that surround the regional grid and feed in lateral boundary condition data. Since different grid and orography files require different numbers of halo cells, additional utilities handle topography filtering and shave the number of halo points (based on downstream workflow component requirements). The pre-processing software ``chgres_cube`` is used to convert the raw external model data into initial and lateral boundary condition files in netCDF format. These are needed as input to the FV3-LAM. Additional information about the UFS pre-processor utilities can be found in the `UFS_UTILS User's Guide `_. +The SRW Application includes a number of pre-processing utilities that initialize and prepare the model. Since the SRW App provides forecast predictions over a limited area (rather than globally), it is necessary to first generate a regional grid (``regional_esg_grid/make_hgrid``) along with :term:`orography` (``orog``) and surface climatology (``sfc_climo_gen``) files on that grid. Grids include a strip, or "halo," of six cells that surround the regional grid and feed in lateral boundary condition data. Since different grid and orography files require different numbers of :term:`halo` cells, additional utilities handle topography filtering and shave the number of halo points (based on downstream workflow component requirements). The pre-processing software :term:`chgres_cube` is used to convert the raw external model data into initial and lateral boundary condition files in :term:`netCDF` format. These are needed as input to the :term:`FV3`-:term:`LAM`. Additional information about the UFS pre-processor utilities can be found in the `UFS_UTILS User's Guide `__. + +.. + COMMENT: Update link! The SRW Application can be initialized from a range of operational initial condition files. It is possible to initialize the model from the Global Forecast System (:term:`GFS`), North American Mesoscale (:term:`NAM`) Forecast System, Rapid Refresh (:term:`RAP`), and High-Resolution Rapid Refresh (:term:`HRRR`) files in Gridded Binary v2 (:term:`GRIB2`) format. GFS files also come in :term:`NEMSIO` format for past dates. .. WARNING:: - For GFS data, dates prior to 1 January 2018 may work but are not guaranteed. Public archives of model data can be accessed through the `National Centers for Environmental Information `_ (NCEI) or through the `NOAA Operational Model Archive and Distribution System `_ (NOMADS). Raw external model data may be pre-staged on disk by the user. + For GFS data, dates prior to 1 January 2018 may work but are not guaranteed. Public archives of model data can be accessed through the `NOAA Operational Model Archive and Distribution System `__ (NOMADS). 
Raw external model data may be pre-staged on disk by the user. Forecast Model ============== The prognostic atmospheric model in the UFS SRW Application is the Finite-Volume Cubed-Sphere -(:term:`FV3`) dynamical core configured with a Limited Area Model (:term:`LAM`) capability :cite:`BlackEtAl2021`. The dynamical core is the computational part of a model that solves the equations of fluid motion. A User’s Guide for the UFS :term:`Weather Model` is `here `__. +(:term:`FV3`) dynamical core configured with a Limited Area Model (:term:`LAM`) capability :cite:`BlackEtAl2021`. The :term:`dynamical core` is the computational part of a model that solves the equations of fluid motion. A User’s Guide for the UFS :term:`Weather Model` is `here `__. -Supported model resolutions in this release include 3-, 13-, and 25-km predefined Contiguous U.S. (:term:`CONUS`) domains, each with 64 vertical levels. Preliminary tools for users to define their own domain are also available in the release with full, formal support of these tools to be provided in future releases. The Extended Schmidt Gnomonic (ESG) grid is used with the FV3-LAM, which features relatively uniform grid cells across the entirety of the domain. Additional information about the FV3 dynamical core can be found `here `__ and on the `NOAA Geophysical Fluid Dynamics Laboratory website `_. +Supported model resolutions in this release include 3-, 13-, and 25-km predefined contiguous U.S. (:term:`CONUS`) domains, each with 127 vertical levels. Preliminary tools for users to define their own domain are also available in the release with full, formal support of these tools to be provided in future releases. The Extended Schmidt Gnomonic (ESG) grid is used with the FV3-LAM, which features relatively uniform grid cells across the entirety of the domain. Additional information about the FV3 dynamical core can be found in the `scientific documentation `__, the `technical documentation `__, and on the `NOAA Geophysical Fluid Dynamics Laboratory website `__. -Interoperable atmospheric physics, along with various land surface model options, are supported through the Common Community Physics Package (:term:`CCPP`), described `here `__. Atmospheric physics are a set of numerical methods describing small-scale processes such as clouds, turbulence, radiation, and their interactions. There will be four physics options supported for the v2.0 release. The first is the FV3_RRFS_v1beta physics suite, which is being tested for use in the future operational implementation of the Rapid Refresh Forecast System (RRFS) planned for 2023-2024, and the second is an updated version of the physics suite used in the operational Global Forecast System (GFS) v16. Additionally, FV3_WoFS and FV3_HRRR will be supported. A scientific description of the CCPP parameterizations and suites can be found in the `CCPP Scientific Documentation `_, and CCPP technical aspects are described in the `CCPP Technical Documentation `_. The model namelist has many settings beyond the physics options that can optimize various aspects of the model for use with each of the supported suites. +Interoperable atmospheric physics, along with various land surface model options, are supported through the Common Community Physics Package (CCPP), described `here `__. Atmospheric physics are a set of numerical methods describing small-scale processes such as clouds, turbulence, radiation, and their interactions. There will be four physics suites supported for the SRW App v2.0.0 release. 
The first is the FV3_RRFS_v1beta physics suite, which is being tested for use in the future operational implementation of the Rapid Refresh Forecast System (RRFS) planned for 2023-2024, and the second is an updated version of the physics suite used in the operational Global Forecast System (GFS) v16. Additionally, FV3_WoFS and FV3_HRRR will be supported. A scientific description of the CCPP parameterizations and suites can be found in the `CCPP Scientific Documentation `__, and CCPP technical aspects are described in the `CCPP Technical Documentation `__. The model namelist has many settings beyond the physics options that can optimize various aspects of the model for use with each of the supported suites. Additional information on Stochastic Physics options is available `here `__. The SRW App supports the use of both :term:`GRIB2` and :term:`NEMSIO` input data. The UFS Weather Model ingests initial and lateral boundary condition files produced by :term:`chgres_cube` and outputs files in netCDF format on a specific projection (e.g., Lambert Conformal) in the horizontal direction and model levels in the vertical direction. Post-processor ============== -The SRW Application is distributed with the Unified Post Processor (:term:`UPP`) included in the workflow as a way to convert the netCDF output on the native model grid to :term:`GRIB2` format on standard isobaric vertical coordinates. The UPP can also be used to compute a variety of useful diagnostic fields, as described in the `UPP User’s Guide `__. +The SRW Application is distributed with the Unified Post Processor (:term:`UPP`) included in the workflow as a way to convert the netCDF output on the native model grid to :term:`GRIB2` format on standard isobaric vertical coordinates. The UPP can also be used to compute a variety of useful diagnostic fields, as described in the `UPP User’s Guide `__. Output from UPP can be used with visualization, plotting, and verification packages or in further downstream post-processing (e.g., statistical post-processing techniques). @@ -54,18 +57,18 @@ METplus Verification Suite The enhanced Model Evaluation Tools (`METplus `__) verification system has been integrated into the SRW App to facilitate forecast evaluation. METplus is a verification framework that spans a wide range of temporal scales (warn-on-forecast to climate) and spatial scales (storm to global). It is supported by the `Developmental Testbed Center (DTC) `__. -METplus *installation* is not included as part of the build process for this release of the SRW App. However, METplus is preinstalled on all `Level 1 `__ systems; existing builds can be viewed `here `__. METplus can be installed on other systems individually or as part of :term:`HPC-Stack`. Users on non-Level 1 systems can follow the `MET Installation `__ and `METplus Installation `__ Guides for individual installation. Currently, METplus *installation* is not a supported feature for this release of the SRW App. However, METplus *use* is supported on systems with a functioning METplus installation. +METplus *installation* is not included as part of the build process for the v2.0.0 release of the SRW App. However, METplus is preinstalled on many `Level 1 & 2 `__ systems; existing builds can be viewed `here `__. METplus can be installed on other systems individually or as part of :term:`HPC-Stack` installation. Users on systems without a previous installation of METplus can follow the `MET Installation Guide `__ and `METplus Installation Guide `__ for individual installation. 
Currently, METplus *installation* is not a supported feature for this release of the SRW App. However, METplus *use* is supported on systems with a functioning METplus installation. -The core components of the METplus framework include the statistical driver, MET, the associated database and display systems known as METviewer and METexpress, and a suite of Python wrappers to provide low-level automation and examples, also called use-cases. MET is a set of verification tools developed for use by the :term:`NWP` community. It matches up grids with either gridded analyses or point observations and applies configurable methods to compute statistics and diagnostics. Extensive documentation is available in the `METplus User’s Guide `__ and `MET User’s Guide `__. Documentation for all other components of the framework can be found at the Documentation link for each component on the METplus `downloads `__ page. +The core components of the METplus framework include the statistical driver, MET, the associated database and display systems known as METviewer and METexpress, and a suite of Python wrappers to provide low-level automation and examples, also called use-cases. MET is a set of verification tools developed for use by the :term:`NWP` community. It matches up grids with either gridded analyses or point observations and applies configurable methods to compute statistics and diagnostics. Extensive documentation is available in the `METplus User’s Guide `__ and `MET User’s Guide `__. Documentation for all other components of the framework can be found at the Documentation link for each component on the METplus `downloads `__ page. Among other techniques, MET provides the capability to compute standard verification scores for comparing deterministic gridded model data to point-based and gridded observations. It also provides ensemble and probabilistic verification methods for comparing gridded model data to point-based or gridded observations. Verification tasks to accomplish these comparisons are defined in the SRW App in :numref:`Table %s `. Currently, the SRW App supports the use of :term:`NDAS` observation files in `prepBUFR format `__ (which include conventional point-based surface and upper-air data) for point-based verification. It also supports gridded Climatology-Calibrated Precipitation Analysis (:term:`CCPA`) data for accumulated precipitation evaluation and Multi-Radar/Multi-Sensor (:term:`MRMS`) gridded analysis data for composite reflectivity and :term:`echo top` verification. -METplus is being actively developed by :term:`NCAR`/Research Applications Laboratory (RAL), NOAA/Earth Systems Research Laboratories (ESRL), and NOAA/Environmental Modeling Center (EMC), and it is open to community contributions. +METplus is being actively developed by :term:`NCAR`/Research Applications Laboratory (RAL), NOAA/Earth Systems Research Laboratories (ESRL), and NOAA/Environmental Modeling Center (:term:`EMC`), and it is open to community contributions. Visualization Example ===================== -A Python script is provided to create basic visualization of the model output. The script +A Python script is provided to create basic visualizations of the model output. The script is designed to output graphics in PNG format for 14 standard meteorological variables when using the pre-defined :term:`CONUS` domain. A difference plotting script is also included to visually compare two runs for the same domain and resolution. These scripts are provided only as an example for users familiar with Python. 
They may be used to perform a visual check to verify that the application is producing reasonable results. @@ -79,16 +82,14 @@ The SRW Application has a portable build system and a user-friendly, modular, an An umbrella CMake-based build system is used for building the components necessary for running the end-to-end SRW Application, including the UFS Weather Model and the pre- and post-processing software. Additional libraries necessary for the application (e.g., :term:`NCEPLIBS-external` and :term:`NCEPLIBS`) are not included in the SRW Application build system but are available pre-built on pre-configured platforms. On other systems, they can be installed via the HPC-Stack (see :numref:`Chapter %s: Installing the HPC-Stack `). There is a small set of system libraries and utilities that are assumed to be present on the target computer: the CMake build software, a Fortran, C, and C++ compiler, and an :term:`MPI` library. Once built, the provided experiment generator script can be used to create a Rocoto-based -workflow file that will run each task in the system in the proper sequence (see :numref:`Chapter %s ` or the `Rocoto documentation `_) for more information. If Rocoto and/or a batch system is not present on the available platform, the individual components can be run in a stand-alone, command line fashion with provided run scripts. The generated namelist for the atmospheric model can be modified in order to vary settings such as forecast starting and ending dates, forecast length hours, the :term:`CCPP` physics suite, integration time step, history file output frequency, and more. It also allows for configuration of other elements of the workflow; for example, whether to run some or all of the pre-processing, forecast model, and post-processing steps. +workflow file that will run each task in the system in the proper sequence (see :numref:`Chapter %s ` or the `Rocoto documentation `_ for more information on Rocoto). If Rocoto and/or a batch system is not present on the available platform, the individual components can be run in a stand-alone, command line fashion with provided run scripts. The generated namelist for the atmospheric model can be modified in order to vary settings such as forecast starting and ending dates, forecast length hours, the :term:`CCPP` physics suite, integration time step, history file output frequency, and more. It also allows for configuration of other elements of the workflow; for example, users can choose whether to run some or all of the pre-processing, forecast model, and post-processing steps. -This SRW Application release has been tested on a variety of platforms widely used by +This SRW Application v2.0.0 release has been tested on a variety of platforms widely used by researchers, such as the NOAA Research and Development High-Performance Computing Systems -(RDHPCS), including Hera, Orion, and Jet; NOAA’s Weather and Climate Operational -Supercomputing System (WCOSS); the National Center for Atmospheric Research (:term:`NCAR`) Cheyenne -system; the National Severe Storms Laboratory (NSSL) HPC machine, Odin; the National Science Foundation Stampede2 system; and generic Linux and macOS systems using Intel and GNU compilers. Four `levels of support `_ have been defined for the SRW Application, including pre-configured (Level 1), configurable (Level 2), limited test platforms (Level 3), and build only platforms (Level 4). Each level is further described below. 
+(RDHPCS), including Hera, Orion, and Jet; the National Center for Atmospheric Research (:term:`NCAR`) Cheyenne system; the National Severe Storms Laboratory (NSSL) HPC machine, Odin; the National Science Foundation Stampede2 system; and generic Linux and MacOS systems using Intel and GNU compilers. Four `levels of support `_ have been defined for the SRW Application, including pre-configured (Level 1), configurable (Level 2), limited test platforms (Level 3), and build only platforms (Level 4). Each level is further described below. On pre-configured (Level 1) computational platforms, all the required libraries for building the SRW Application are available in a central place. That means bundled libraries (NCEPLIBS) and third-party libraries (NCEPLIBS-external) have both been built. The SRW Application is expected to build and run out-of-the-box on these pre-configured platforms. A few additional computational platforms are considered configurable for the SRW Application release. Configurable platforms (Level 2) are platforms where all of the required libraries for building the SRW Application are expected to install successfully but are not available in a central location. Applications and models are expected to build and run once the required bundled libraries (e.g., NCEPLIBS) and third-party libraries (e.g., NCEPLIBS-external) are built. -Limited-Test (Level 3) and Build-Only (Level 4) computational platforms are those in which the developers have built the code but little or no pre-release testing has been conducted, respectively. A complete description of the levels of support, along with a list of preconfigured and configurable platforms can be found in the `SRW Application wiki page `_. +Limited-Test (Level 3) and Build-Only (Level 4) computational platforms are those in which the developers have built the code but little or no pre-release testing has been conducted, respectively. A complete description of the levels of support, along with a list of preconfigured and configurable platforms can be found in the `SRW Application Wiki `_. diff --git a/docs/UsersGuide/source/ConfigWorkflow.rst b/docs/UsersGuide/source/ConfigWorkflow.rst index 9fcfbd8b11..127435b82b 100644 --- a/docs/UsersGuide/source/ConfigWorkflow.rst +++ b/docs/UsersGuide/source/ConfigWorkflow.rst @@ -3,35 +3,38 @@ ============================================================================================ Workflow Parameters: Configuring the Workflow in ``config.sh`` and ``config_defaults.sh`` ============================================================================================ -To create the experiment directory and workflow when running the SRW App, the user must create an experiment configuration file named ``config.sh``. This file contains experiment-specific information, such as dates, external model data, observation data, directories, and other relevant settings. To help the user, two sample configuration files have been included in the ``regional_workflow`` repository’s ``ush`` directory: ``config.community.sh`` and ``config.nco.sh``. The first is for running experiments in community mode (``RUN_ENVIR`` set to "community"; see below), and the second is for running experiments in "nco" mode (``RUN_ENVIR`` set to "nco"). Note that for this release, only "community" mode is supported. These files can be used as the starting point from which to generate a variety of experiment configurations in which to run the SRW App. 
+To create the experiment directory and workflow when running the SRW Application, the user must create an experiment configuration file named ``config.sh``. This file contains experiment-specific information, such as dates, external model data, observation data, directories, and other relevant settings. To help the user, two sample configuration files have been included in the ``regional_workflow`` repository’s ``ush`` directory: ``config.community.sh`` and ``config.nco.sh``. The first is for running experiments in community mode (``RUN_ENVIR`` set to "community"), and the second is for running experiments in "nco" mode (``RUN_ENVIR`` set to "nco"). Note that for this release, only "community" mode is supported. These files can be used as the starting point from which to generate a variety of experiment configurations for the SRW App. -There is an extensive list of experiment parameters that a user can set when configuring the experiment. Not all of these need to be explicitly set by the user in ``config.sh``. If a user does not define an entry in the ``config.sh`` script, either its value in ``config_defaults.sh`` will be used, or it will be reset depending on other parameters, such as the platform on which the experiment will be run (specified by ``MACHINE``). Note that ``config_defaults.sh`` contains the full list of experiment parameters that a user may set in ``config.sh`` (i.e., the user cannot set parameters in config.sh that are not initialized in ``config_defaults.sh``). +There is an extensive list of experiment parameters that a user can set when configuring the experiment. Not all of these need to be explicitly set by the user in ``config.sh``. If a user does not define an entry in the ``config.sh`` script, either its value in ``config_defaults.sh`` will be used, or it will be reset depending on other parameters, such as the platform on which the experiment will be run (specified by ``MACHINE``). Note that ``config_defaults.sh`` contains the full list of experiment parameters that a user may set in ``config.sh`` (i.e., the user cannot set parameters in ``config.sh`` that are not initialized in ``config_defaults.sh``). -The following is a list of the parameters in the ``config_defaults.sh`` file. For each parameter, the default value and a brief description is given. In addition, any relevant information on features and settings supported or unsupported in this release is specified. +The following is a list of the parameters in the ``config_defaults.sh`` file. For each parameter, the default value and a brief description is given. In addition, relevant information on support for features and settings in the v2.0.0 release is provided. Platform Environment ==================== ``RUN_ENVIR``: (Default: "nco") - This variable determines the mode that the workflow will run in. The user can choose between two modes: "nco" and "community". The "nco" mode uses a directory structure that mimics what is used in operations at NOAA/NCEP Central Operations (NCO) and at the NOAA/NCEP/Environmental Modeling Center (EMC), which is working with NCO on pre-implementation testing. Specifics of the conventions used in "nco" mode can be found in the following `WCOSS Implementation Standards `__ document: + This variable determines the workflow mode. The user can choose between two options: "nco" and "community". 
The "nco" mode uses a directory structure that mimics what is used in operations at NOAA/NCEP Central Operations (NCO) and at the NOAA/NCEP/Environmental Modeling Center (EMC), which works with NCO on pre-implementation testing. Specifics of the conventions used in "nco" mode can be found in the following `WCOSS Implementation Standards `__ document: | NCEP Central Operations | WCOSS Implementation Standards | January 19, 2022 | Version 11.0.0 - Setting ``RUN_ENVIR`` to "community" will use the standard directory structure and variable naming convention and is recommended in most cases for users who are not planning to implement their code into operations at NCO. + Setting ``RUN_ENVIR`` to "community" is recommended in most cases for users who are not planning to implement their code into operations at NCO. ``MACHINE``: (Default: "BIG_COMPUTER") - The machine (a.k.a. platform) on which the workflow will run. Currently supported platforms include "WCOSS_DELL_P3", "HERA", "ORION", "JET", "ODIN", "CHEYENNE", "STAMPEDE", "GAEA", "SINGULARITY", "NOAACLOUD", "MACOS", and "LINUX". When running the SRW App in a container, set ``MACHINE`` to "SINGULARITY" regardless of the underlying platform. + The machine (a.k.a. platform or system) on which the workflow will run. Currently supported platforms are listed on the `SRW App Wiki page `__. When running the SRW App on any ParallelWorks/NOAA Cloud system, use "NOAACLOUD" regardless of the underlying system (AWS, GCP, or Azure). When running the SRW App in a container, set ``MACHINE`` to "SINGULARITY" regardless of the underlying platform (including on NOAA Cloud systems). Valid values: ``"HERA"`` | ``"ORION"`` | ``"JET"`` | ``"CHEYENNE"`` | ``"GAEA"`` | ``"NOAACLOUD"`` | ``"STAMPEDE"`` | ``"ODIN"`` | ``"MACOS"`` | ``"LINUX"`` | ``"SINGULARITY"`` | ``"WCOSS_DELL_P3"`` ``MACHINE_FILE``: (Default: "") Path to a configuration file with machine-specific settings. If none is provided, ``setup.sh`` will attempt to set the path to a configuration file for a supported platform. ``ACCOUNT``: (Default: "project_name") - The account under which to submit jobs to the queue on the specified ``MACHINE``. To determine an appropriate ``ACCOUNT`` field for `Level 1 `__ systems, users may run the ``groups`` command, which will return a list of projects that the user has permissions for. Not all of the listed projects/groups have an HPC allocation, but those that do are potentially valid account names. On some systems, the ``saccount_params`` command will display additional account details. + The account under which users submit jobs to the queue on the specified ``MACHINE``. To determine an appropriate ``ACCOUNT`` field for `Level 1 `__ systems, users may run the ``groups`` command, which will return a list of projects that the user has permissions for. Not all of the listed projects/groups have an HPC allocation, but those that do are potentially valid account names. On some systems, the ``saccount_params`` command will display additional account details. + +``COMPILER``: (Default: "intel") + Type of compiler invoked during the build step. Currently, this must be set manually (i.e., it is not inherited from the build system in the ``ufs-srweather-app`` directory). Valid values: ``"intel"`` | ``"gnu"`` ``WORKFLOW_MANAGER``: (Default: "none") - The workflow manager to use (e.g. "ROCOTO"). This is set to "none" by default, but if the machine name is set to a platform that supports Rocoto, this will be overwritten and set to "ROCOTO."
Valid values: "rocoto" "none" + The workflow manager to use (e.g., "ROCOTO"). This is set to "none" by default, but if the machine name is set to a platform that supports Rocoto, this will be overwritten and set to "ROCOTO." Valid values: ``"rocoto"`` | ``"none"`` ``NCORES_PER_NODE``: (Default: "") The number of cores available per node on the compute platform. Set for supported platforms in ``setup.sh``, but it is now also configurable for all platforms. @@ -46,41 +49,38 @@ Platform Environment Name of alternative workflow module file to use if running on an unsupported platform. Is set automatically for supported machines. ``SCHED``: (Default: "") - The job scheduler to use (e.g., Slurm) on the specified ``MACHINE``. Set this to an empty string in order for the experiment generation script to set it automatically depending on the machine the workflow is running on. Valid values: "slurm" "pbspro" "lsf" "lsfcray" "none" + The job scheduler to use (e.g., Slurm) on the specified ``MACHINE``. Leaving this an empty string allows the experiment generation script to set it automatically depending on the machine the workflow is running on. Valid values: ``"slurm"`` | ``"pbspro"`` | ``"lsf"`` | ``"lsfcray"`` | ``"none"`` Machine-Dependent Parameters: ------------------------------- These parameters vary depending on machine. On `Level 1 and 2 `__ systems, the appropriate values for each machine can be viewed in the ``regional_workflow/ush/machine/.sh`` scripts. To specify a value other than the default, add these variables and the desired value in the ``config.sh`` file so that they override the ``config_defaults.sh`` and machine default values. ``PARTITION_DEFAULT``: (Default: "") - This variable is only used with the Slurm job scheduler (i.e., if ``SCHED`` is set to "slurm"). This is the default partition to which Slurm submits workflow tasks. When a variable that designates the partition (e.g., ``PARTITION_HPSS``, ``PARTITION_FCST``; see below) is **not** specified, the task will be submitted to the default partition indicated in the ``PARTITION_DEFAULT`` variable. If this value is not set or is set to an empty string, it will be (re)set to a machine-dependent value. Valid values: "" "hera" "normal" "orion" "sjet,vjet,kjet,xjet" "workq" + This variable is only used with the Slurm job scheduler (i.e., if ``SCHED`` is set to "slurm"). This is the default partition to which Slurm submits workflow tasks. When a variable that designates the partition (e.g., ``PARTITION_HPSS``, ``PARTITION_FCST``; see below) is **not** specified, the task will be submitted to the default partition indicated in the ``PARTITION_DEFAULT`` variable. If this value is not set or is set to an empty string, it will be (re)set to a machine-dependent value. Valid values: ``""`` | ``"hera"`` | ``"normal"`` | ``"orion"`` | ``"sjet,vjet,kjet,xjet"`` | ``"workq"`` ``CLUSTERS_DEFAULT``: (Default: "") This variable is only used with the Slurm job scheduler (i.e., if ``SCHED`` is set to "slurm"). These are the default clusters to which Slurm submits workflow tasks. If ``CLUSTERS_HPSS`` or ``CLUSTERS_FCST`` (see below) are not specified, the task will be submitted to the default clusters indicated in this variable. If this value is not set or is set to an empty string, it will be (re)set to a machine-dependent value. ``QUEUE_DEFAULT``: (Default: "") - The default queue or QOS to which workflow tasks are submitted (QOS is Slurm's term for queue; it stands for "Quality of Service"). 
If the task's ``QUEUE_HPSS`` or ``QUEUE_FCST`` parameters (see below) are not specified, the task will be submitted to the queue indicated by this variable. If this value is not set or is set to an empty string, it will be (re)set to a machine-dependent value. Valid values: "" "batch" "dev" "normal" "regular" "workq" + The default queue or QOS to which workflow tasks are submitted (QOS is Slurm's term for queue; it stands for "Quality of Service"). If the task's ``QUEUE_HPSS`` or ``QUEUE_FCST`` parameters (see below) are not specified, the task will be submitted to the queue indicated by this variable. If this value is not set or is set to an empty string, it will be (re)set to a machine-dependent value. Valid values: ``""`` | ``"batch"`` | ``"dev"`` | ``"normal"`` | ``"regular"`` | ``"workq"`` ``PARTITION_HPSS``: (Default: "") - This variable is only used with the Slurm job scheduler (i.e., if ``SCHED`` is set to "slurm"). Tasks that get or create links to external model files are submitted to the partition specified in this variable. These links are needed to generate initial conditions (ICs) and lateral boundary conditions (LBCs) for the experiment. If this variable is not set or is set to an empty string, it will be (re)set to a machine-dependent value. Valid values: "" "normal" "service" "workq" - -.. - COMMENT: Wouldn't it be reset to the PARTITION_DEFAULT value? + This variable is only used with the Slurm job scheduler (i.e., if ``SCHED`` is set to "slurm"). Tasks that get or create links to external model files are submitted to the partition specified in this variable. These links are needed to generate initial conditions (:term:`ICs`) and lateral boundary conditions (:term:`LBCs`) for the experiment. If this variable is not set or is set to an empty string, it will be (re)set to the ``PARTITION_DEFAULT`` value (if set) or to a machine-dependent value. Valid values: ``""`` | ``"normal"`` | ``"service"`` | ``"workq"`` ``CLUSTERS_HPSS``: (Default: "") This variable is only used with the Slurm job scheduler (i.e., if ``SCHED`` is set to "slurm"). Tasks that get or create links to external model files are submitted to the clusters specified in this variable. These links are needed to generate initial conditions (ICs) and lateral boundary conditions (LBCs) for the experiment. If this variable is not set or is set to an empty string, it will be (re)set to a machine-dependent value. ``QUEUE_HPSS``: (Default: "") - Tasks that get or create links to external model files are submitted to this queue, or QOS (QOS is Slurm's term for queue; it stands for "Quality of Service"). If this value is not set or is set to an empty string, it will be (re)set to a machine-dependent value. Valid values: "" "batch" "dev_transfer" "normal" "regular" "workq" + Tasks that get or create links to external model files are submitted to this queue, or QOS (QOS is Slurm's term for queue; it stands for "Quality of Service"). If this value is not set or is set to an empty string, it will be (re)set to a machine-dependent value. Valid values: ``""`` | ``"batch"`` | ``"dev_transfer"`` | ``"normal"`` | ``"regular"`` | ``"workq"`` ``PARTITION_FCST``: (Default: "") - This variable is only used with the Slurm job scheduler (i.e., if ``SCHED`` is set to "slurm"). The task that runs forecasts is submitted to this partition. If this variable is not set or is set to an empty string, it will be (re)set to a machine-dependent value. 
Valid values: "" "hera" "normal" "orion" "sjet,vjet,kjet,xjet" "workq" + This variable is only used with the Slurm job scheduler (i.e., if ``SCHED`` is set to "slurm"). The task that runs forecasts is submitted to this partition. If this variable is not set or is set to an empty string, it will be (re)set to a machine-dependent value. Valid values: ``""`` | ``"hera"`` | ``"normal"`` | ``"orion"`` | ``"sjet,vjet,kjet,xjet"`` | ``"workq"`` ``CLUSTERS_FCST``: (Default: "") This variable is only used with the Slurm job scheduler (i.e., if ``SCHED`` is set to "slurm"). The task that runs forecasts is submitted to this cluster. If this variable is not set or is set to an empty string, it will be (re)set to a machine-dependent value. ``QUEUE_FCST``: (Default: "") - The task that runs a forecast is submitted to this queue, or QOS (QOS is Slurm's term for queue; it stands for "Quality of Service"). If this variable is not set or set to an empty string, it will be (re)set to a machine-dependent value. Valid values: "" "batch" "dev" "normal" "regular" "workq" + The task that runs a forecast is submitted to this queue, or QOS (QOS is Slurm's term for queue; it stands for "Quality of Service"). If this variable is not set or set to an empty string, it will be (re)set to a machine-dependent value. Valid values: ``""`` | ``"batch"`` | ``"dev"`` | ``"normal"`` | ``"regular"`` | ``"workq"`` Parameters for Running Without a Workflow Manager ================================================= @@ -93,26 +93,30 @@ These settings control run commands for platforms without a workflow manager. Va The run command for the model forecast step. This will be appended to the end of the variable definitions file (``var_defns.sh``). Changing the ``${PE_MEMBER01}`` variable is **not** recommended; it refers to the number of MPI tasks that the Weather Model will expect to run with. Running the Weather Model with a different number of MPI tasks than the workflow has been set up for can lead to segmentation faults and other errors. It is also important to escape the ``$`` character or use single quotes here so that ``PE_MEMBER01`` is not referenced until runtime, since it is not defined at the beginning of the workflow generation script. ``RUN_CMD_POST``: (Default: "mpirun -np 1") - The run command for post-processing (:term:`UPP`). Can be left blank for smaller domains, in which case UPP will run without :term:`MPI`. + The run command for post-processing (via the :term:`UPP`). Can be left blank for smaller domains, in which case UPP will run without :term:`MPI`. + +.. _Cron: Cron-Associated Parameters ========================== + +Cron is a job scheduler accessed from the command line on UNIX-like operating systems. It is useful for automating tasks such as the ``rocotorun`` command, which launches each workflow task in the SRW App. Cron periodically checks a cron table (a.k.a. crontab) to see if any tasks are ready to execute; if so, it runs them. + ``USE_CRON_TO_RELAUNCH``: (Default: "FALSE") Flag that determines whether or not a line is added to the user's cron table, which calls the experiment launch script every ``CRON_RELAUNCH_INTVL_MNTS`` minutes. ``CRON_RELAUNCH_INTVL_MNTS``: (Default: "03") The interval (in minutes) between successive calls of the experiment launch script by a cron job to (re)launch the experiment (so that the workflow for the experiment kicks off where it left off). This is used only if ``USE_CRON_TO_RELAUNCH`` is set to "TRUE". -..
- COMMENT: Are these variables set in a machine script somewhere for Level 1 systems? I've used cron but never had to set these. It seems like the default for NOAA Cloud is "01". +.. _DirParams: Directory Parameters ==================== ``EXPT_BASEDIR``: (Default: "") - The base directory in which the experiment directory will be created. If this is not specified or if it is set to an empty string, it will default to ``${HOMErrfs}/../../expt_dirs``, where ``${HOMErrfs}`` contains the full path to the ``regional_workflow`` directory. + The full path to the base directory inside of which the experiment directory (``EXPT_SUBDIR``) will be created. If this is not specified or if it is set to an empty string, it will default to ``${HOMErrfs}/../../expt_dirs``, where ``${HOMErrfs}`` contains the full path to the ``regional_workflow`` directory. ``EXPT_SUBDIR``: (Default: "") - The name that the experiment directory (without the full path) will have. The full path to the experiment directory, which will be contained in the variable ``EXPTDIR``, will be: + A descriptive name of the user's choice for the experiment directory (*not* its full path). The full path to the experiment directory, which will be contained in the variable ``EXPTDIR``, will be: .. code-block:: console @@ -127,7 +131,7 @@ Directory Parameters NCO Mode Parameters =================== -These variables apply only when using NCO mode (i.e. when ``RUN_ENVIR`` is set to "nco"). +These variables apply only when using NCO mode (i.e., when ``RUN_ENVIR`` is set to "nco"). ``COMINgfs``: (Default: "/base/path/of/directory/containing/gfs/input/files") The beginning portion of the path to the directory that contains files generated by the external model (FV3GFS). The initial and lateral boundary condition generation tasks need this path in order to create initial and boundary condition files for a given cycle on the native FV3-LAM grid. For a cycle that starts on the date specified by the variable YYYYMMDD (consisting of the 4-digit year, 2-digit month, and 2-digit day of the month) and the hour specified by the variable HH (consisting of the 2-digit hour of the day), the directory in which the workflow will look for the external model files is: @@ -136,17 +140,14 @@ These variables apply only when using NCO mode (i.e. when ``RUN_ENVIR`` is set t $COMINgfs/gfs.$yyyymmdd/$hh/atmos -.. - COMMENT: Should "atmos" be at the end of this file path? If so, is it standing in for something (like FV3GFS), or is "atmos" actually part of the file path? Are the files created directly in the "atmos" folder? Or is there an "ICS" and "LBCS" directory generated? - ``FIXLAM_NCO_BASEDIR``: (Default: "") - The base directory containing pregenerated grid, orography, and surface climatology files. For the pregenerated grid specified by ``PREDEF_GRID_NAME``, these "fixed" files are located in: + The base directory containing pregenerated grid, orography, and surface climatology files. For the pregenerated grid type specified in the variable ``PREDEF_GRID_NAME``, these "fixed" files are located in: .. code-block:: console ${FIXLAM_NCO_BASEDIR}/${PREDEF_GRID_NAME} - The workflow scripts will create a symlink in the experiment directory that will point to a subdirectory (having the name of the grid being used) under this directory. This variable should be set to a null string in this file, but it can be specified in the user-specified workflow configuration file (e.g., ``config.sh``). 
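   # A minimal sketch of the code block's content, based on the definitions of
   # EXPT_BASEDIR and EXPT_SUBDIR given above: EXPTDIR is their concatenation.
   EXPTDIR="${EXPT_BASEDIR}/${EXPT_SUBDIR}"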
+ The workflow scripts will create a symlink in the experiment directory that will point to a subdirectory (having the name of the grid being used) under this directory. This variable should be set to a null string in ``config_defaults.sh`` and specified by the user in the workflow configuration file (``config.sh``). ``STMP``: (Default: "/base/path/of/directory/containing/model/input/and/raw/output/files") The beginning portion of the path to the directory that will contain :term:`cycle-dependent` model input files, symlinks to :term:`cycle-independent` input files, and raw (i.e., before post-processing) forecast output files for a given :term:`cycle`. The format for cycle dates (cdate) is ``cdate="${YYYYMMDD}${HH}"``, where the date is specified using YYYYMMDD format, and the hour is specified using HH format. The files for a cycle date will be located in the following directory: @@ -177,7 +178,7 @@ These variables apply only when using NCO mode (i.e. when ``RUN_ENVIR`` is set t Pre-Processing File Separator Parameters ======================================== ``DOT_OR_USCORE``: (Default: "_") - This variable sets the separator character(s) to use in the names of the grid, mosaic, and orography fixed files. Ideally, the same separator should be used in the names of these fixed files as in the surface climatology fixed files. Valid values: "_" "." + This variable sets the separator character(s) to use in the names of the grid, mosaic, and orography fixed files. Ideally, the same separator should be used in the names of these fixed files as in the surface climatology fixed files. Valid values: ``"_"`` | ``"."`` File Name Parameters ==================== @@ -188,19 +189,19 @@ File Name Parameters Name of the file containing namelist settings for the code that generates an "ESGgrid" regional grid. ``FV3_NML_BASE_SUITE_FN``: (Default: "input.nml.FV3") - Name of the Fortran namelist file containing the forecast model's base suite namelist (i.e., the portion of the namelist that is common to all physics suites). + Name of the Fortran file containing the forecast model's base suite namelist (i.e., the portion of the namelist that is common to all physics suites). ``FV3_NML_YAML_CONFIG_FN``: (Default: "FV3.input.yml") Name of YAML configuration file containing the forecast model's namelist settings for various physics suites. ``FV3_NML_BASE_ENS_FN``: (Default: "input.nml.base_ens") - Name of the Fortran namelist file containing the forecast model's base ensemble namelist, i.e., the the namelist file that is the starting point from which the namelist files for each of the enesemble members are generated. + Name of the Fortran file containing the forecast model's base ensemble namelist (i.e., the original namelist file from which each of the ensemble members' namelist files are generated). ``DIAG_TABLE_FN``: (Default: "diag_table") - Name of the file that specifies the fields that the forecast model will output. + Name of the file specifying the fields that the forecast model will output. ``FIELD_TABLE_FN``: (Default: "field_table") - Name of the file that specifies the tracers that the forecast model will read in from the :term:`IC/LBC` files. + Name of the file specifying the :term:`tracers` that the forecast model will read in from the :term:`IC/LBC` files. ``DATA_TABLE_FN``: (Default: "data_table") Name of the file containing the data table read in by the forecast model. 
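The parameters above follow the override rule stated at the beginning of this chapter: any variable set in ``config.sh`` supersedes its counterpart in ``config_defaults.sh``, and only variables initialized in ``config_defaults.sh`` may be set. As an illustration only, a minimal community-mode ``config.sh`` might contain entries like the following sketch (the account and experiment names are hypothetical placeholders, not values taken from this document):

.. code-block:: console

   RUN_ENVIR="community"
   MACHINE="HERA"                 # must be one of the valid MACHINE values listed above
   ACCOUNT="an_hpc_allocation"    # hypothetical; use a project returned by the groups command
   EXPT_SUBDIR="test_community"   # hypothetical experiment directory name
   CCPP_PHYS_SUITE="FV3_GFS_v16"  # the default physics suite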
@@ -215,7 +216,7 @@ File Name Parameters Name of the forecast model executable stored in the executables directory (``EXECDIR``; set during experiment generation). ``FCST_MODEL``: (Default: "ufs-weather-model") - Name of forecast model. Valid values: "ufs-weather-model" "fv3gfs_aqm" + Name of forecast model. Valid values: ``"ufs-weather-model"`` | ``"fv3gfs_aqm"`` ``WFLOW_XML_FN``: (Default: "FV3LAM_wflow.xml") Name of the Rocoto workflow XML file that the experiment generation script creates. This file defines the workflow for the experiment. @@ -249,9 +250,6 @@ Forecast Parameters ``INCR_CYCL_FREQ``: (Default: "24") Increment in hours for cycle frequency (cycl_freq). The default is "24", which means cycl_freq=24:00:00. -.. - COMMENT: What is cycl_freq from? It's not mentioned anywhere else here... In general, this definition could be better... need more info. - ``FCST_LEN_HRS``: (Default: "24") The length of each forecast, in integer hours. @@ -259,10 +257,7 @@ Model Configuration Parameters ================================= ``DT_ATMOS``: (Default: "") - Time step for the outermost atmospheric model loop in seconds. This corresponds to the frequency at which the physics routines and the top level dynamics routine are called. (Note that one call to the top-level dynamics routine results in multiple calls to the horizontal dynamics, tracer transport, and vertical dynamics routines; see the `FV3 dycore documentation `__ for details.) Must be set. Takes an integer value. - -.. - COMMENT: FV3 documentation says DT_ATMOS must be set, but in our code, the default value is "". What is the actual default value? And is the default set by the FV3 dycore (or somewhere else) rather than in the SRW App itself? + Time step for the outermost atmospheric model loop in seconds. This corresponds to the frequency at which the physics routines and the top level dynamics routine are called. (Note that one call to the top-level dynamics routine results in multiple calls to the horizontal dynamics, tracer transport, and vertical dynamics routines; see the `FV3 dycore scientific documentation `__ for details.) Must be set. Takes an integer value. In the SRW App, a default value for ``DT_ATMOS`` appears in the ``set_predef_grid_params.sh`` script, but a different value can be set in ``config.sh``. ``RESTART_INTERVAL``: (Default: "0") Frequency of the output restart files in hours. Using the default interval ("0"), restart files are produced at the end of a forecast run. When ``RESTART_INTERVAL="1"``, restart files are produced every hour with the prefix "YYYYMMDD.HHmmSS." in the ``RESTART`` directory. @@ -270,11 +265,7 @@ Model Configuration Parameters .. _InlinePost: ``WRITE_DOPOST``: (Default: "FALSE") - Flag that determines whether to use the INLINE POST option. If TRUE, the ``WRITE_DOPOST`` flag in the ``model_configure`` file will be set to "TRUE", and the post-processing tasks get called from within the weather model so that the post files (:term:`grib2`) are output by the weather model at the same time that it outputs the ``dynf###.nc`` and ``phyf###.nc`` files. Setting ``WRITE_DOPOST="TRUE"`` - turns off the separate ``run_post`` task (i.e., ``RUN_TASK_RUN_POST`` is set to "FALSE") in ``setup.sh``. - - .. - Should there be an underscore in inline post? + Flag that determines whether to use the INLINE POST option. 
If TRUE, the ``WRITE_DOPOST`` flag in the ``model_configure`` file will be set to "TRUE", and the post-processing tasks get called from within the weather model so that the post-processed files (in :term:`grib2` format) are output by the weather model at the same time that it outputs the ``dynf###.nc`` and ``phyf###.nc`` files. Setting ``WRITE_DOPOST="TRUE"`` turns off the separate ``run_post`` task (i.e., ``RUN_TASK_RUN_POST`` is set to "FALSE") in ``setup.sh``. METplus Parameters ===================== @@ -297,52 +288,57 @@ METplus Parameters .. note:: Where a date field is required: - * YYYY refers to the 4-digit valid year - * MM refers to the 2-digit valid month - * DD refers to the 2-digit valid day of the month - * HH refers to the 2-digit valid hour of the day - * mm refers to the 2-digit valid minutes of the hour - * SS refers to the two-digit valid seconds of the hour + * ``YYYY`` refers to the 4-digit valid year + * ``MM`` refers to the 2-digit valid month + * ``DD`` refers to the 2-digit valid day of the month + * ``HH`` refers to the 2-digit valid hour of the day + * ``mm`` refers to the 2-digit valid minutes of the hour + * ``SS`` refers to the two-digit valid seconds of the hour ``CCPA_OBS_DIR``: (Default: "") - User-specified location of top-level directory where CCPA hourly precipitation files used by METplus are located. This parameter needs to be set for both user-provided observations and for observations that are retrieved from the NOAA HPSS (if the user has access) via the ``get_obs_ccpa_tn`` task. (This task is activated in the workflow by setting ``RUN_TASK_GET_OBS_CCPA="TRUE"``). - METplus configuration files require the use of predetermined directory structure and file names. If the CCPA files are user-provided, they need to follow the anticipated naming structure: ``{YYYYMMDD}/ccpa.t{HH}z.01h.hrap.conus.gb2``, where YYYYMMDD and HH are as described in the note :ref:`above `. When pulling observations from NOAA HPSS, the data retrieved will be placed in the ``CCPA_OBS_DIR`` directory. This path must be defind as ``//ccpa/proc``. METplus is configured to verify 01-, 03-, 06-, and 24-h accumulated precipitation using hourly CCPA files. + User-specified location of top-level directory where CCPA hourly precipitation files used by METplus are located. This parameter needs to be set for both user-provided observations and for observations that are retrieved from the NOAA :term:`HPSS` (if the user has access) via the ``get_obs_ccpa_tn`` task. (This task is activated in the workflow by setting ``RUN_TASK_GET_OBS_CCPA="TRUE"``). + + METplus configuration files require the use of a predetermined directory structure and file names. If the CCPA files are user-provided, they need to follow the anticipated naming structure: ``{YYYYMMDD}/ccpa.t{HH}z.01h.hrap.conus.gb2``, where YYYYMMDD and HH are as described in the note :ref:`above `. When pulling observations from NOAA HPSS, the data retrieved will be placed in the ``CCPA_OBS_DIR`` directory. This path must be defined as ``//ccpa/proc``. METplus is configured to verify 01-, 03-, 06-, and 24-h accumulated precipitation using hourly CCPA files. .. note:: There is a problem with the valid time in the metadata for files valid from 19 - 00 UTC (i.e., files under the "00" directory). The script to pull the CCPA data from the NOAA HPSS (``regional_workflow/scripts/exregional_get_ccpa_files.sh``) has an example of how to account for this and organize the data into a more intuitive format.
When a fix is provided, it will be accounted for in the ``exregional_get_ccpa_files.sh`` script. ``MRMS_OBS_DIR``: (Default: "") - User-specified location of top-level directory where MRMS composite reflectivity files used by METplus are located. This parameter needs to be set for both user-provided observations and for observations that are retrieved from the NOAA HPSS (if the user has access) via the ``get_obs_mrms_tn`` task (activated in the workflow by setting ``RUN_TASK_GET_OBS_MRMS="TRUE"``). When pulling observations directly from NOAA HPSS, the data retrieved will be placed in this directory. Please note, this path must be defind as ``//mrms/proc``. METplus configuration files require the use of a predetermined directory structure and file names. Therefore, if the MRMS files are user-provided, they need to follow the anticipated naming structure: ``{YYYYMMDD}/MergedReflectivityQCComposite_00.50_{YYYYMMDD}-{HH}{mm}{SS}.grib2``, where YYYYMMDD and {HH}{mm}{SS} are as described in the note :ref:`above `. + User-specified location of top-level directory where MRMS composite reflectivity files used by METplus are located. This parameter needs to be set for both user-provided observations and for observations that are retrieved from the NOAA :term:`HPSS` (if the user has access) via the ``get_obs_mrms_tn`` task (activated in the workflow by setting ``RUN_TASK_GET_OBS_MRMS="TRUE"``). When pulling observations directly from NOAA HPSS, the data retrieved will be placed in this directory. Please note, this path must be defined as ``//mrms/proc``. + + METplus configuration files require the use of a predetermined directory structure and file names. Therefore, if the MRMS files are user-provided, they need to follow the anticipated naming structure: ``{YYYYMMDD}/MergedReflectivityQCComposite_00.50_{YYYYMMDD}-{HH}{mm}{SS}.grib2``, where YYYYMMDD and {HH}{mm}{SS} are as described in the note :ref:`above `. .. note:: - METplus is configured to look for a MRMS composite reflectivity file for the valid time of the forecast being verified; since MRMS composite reflectivity files do not always exactly match the valid time, a script, within the main script to retrieve MRMS data from the NOAA HPSS, is used to identify and rename the MRMS composite reflectivity file to match the valid time of the forecast. The script to pull the MRMS data from the NOAA HPSS has an example of the expected file naming structure: ``regional_workflow/scripts/exregional_get_mrms_files.sh``. This script calls the script used to identify the MRMS file closest to the valid time: ``regional_workflow/ush/mrms_pull_topofhour.py``. + METplus is configured to look for an MRMS composite reflectivity file for the valid time of the forecast being verified; since MRMS composite reflectivity files do not always exactly match the valid time, a script (within the main script that retrieves MRMS data from the NOAA HPSS) is used to identify and rename the MRMS composite reflectivity file to match the valid time of the forecast. The script to pull the MRMS data from the NOAA HPSS has an example of the expected file-naming structure: ``regional_workflow/scripts/exregional_get_mrms_files.sh``. This script calls the script used to identify the MRMS file closest to the valid time: ``regional_workflow/ush/mrms_pull_topofhour.py``. ``NDAS_OBS_DIR``: (Default: "") - User-specified location of top-level directory where NDAS prepbufr files used by METplus are located.
This parameter needs to be set for both user-provided observations and for observations that are retrieved from the NOAA HPSS (if the user has access) via the ``get_obs_ndas_tn`` task (activated in the workflow by setting ``RUN_TASK_GET_OBS_NDAS="TRUE"``). When pulling observations directly from NOAA HPSS, the data retrieved will be placed in this directory. Please note, this path must be defined as ``//ndas/proc``. METplus is configured to verify near-surface variables hourly and upper-air variables at 00 and 12 UTC with NDAS prepbufr files. METplus configuration files require the use of predetermined file names. Therefore, if the NDAS files are user provided, they need to follow the anticipated naming structure: ``prepbufr.ndas.{YYYYMMDDHH}``, where YYYYMMDD and HH are as described in the note :ref:`above `. The script to pull the NDAS data from the NOAA HPSS (``regional_workflow/scripts/exregional_get_ndas_files.sh``) has an example of how to rename the NDAS data into a more intuitive format with the valid time listed in the file name. + User-specified location of top-level directory where NDAS prepbufr files used by METplus are located. This parameter needs to be set for both user-provided observations and for observations that are retrieved from the NOAA :term:`HPSS` (if the user has access) via the ``get_obs_ndas_tn`` task (activated in the workflow by setting ``RUN_TASK_GET_OBS_NDAS="TRUE"``). When pulling observations directly from NOAA HPSS, the data retrieved will be placed in this directory. Please note, this path must be defined as ``//ndas/proc``. METplus is configured to verify near-surface variables hourly and upper-air variables at 00 and 12 UTC with NDAS prepbufr files. + + METplus configuration files require the use of predetermined file names. Therefore, if the NDAS files are user-provided, they need to follow the anticipated naming structure: ``prepbufr.ndas.{YYYYMMDDHH}``, where YYYYMMDD and HH are as described in the note :ref:`above `. The script to pull the NDAS data from the NOAA HPSS (``regional_workflow/scripts/exregional_get_ndas_files.sh``) has an example of how to rename the NDAS data into a more intuitive format with the valid time listed in the file name. Initial and Lateral Boundary Condition Generation Parameters ============================================================ ``EXTRN_MDL_NAME_ICS``: (Default: "FV3GFS") - The name of the external model that will provide fields from which initial condition (IC) files, surface files, and 0-th hour boundary condition files will be generated for input into the forecast model. Valid values: "GSMGFS" "FV3GFS" "RAP" "HRRR" "NAM" + The name of the external model that will provide fields from which initial condition (IC) files, surface files, and 0-th hour boundary condition files will be generated for input into the forecast model. Valid values: ``"GSMGFS"`` | ``"FV3GFS"`` | ``"RAP"`` | ``"HRRR"`` | ``"NAM"`` ``EXTRN_MDL_NAME_LBCS``: (Default: "FV3GFS") - The name of the external model that will provide fields from which lateral boundary condition (LBC) files (except for the 0-th hour LBC file) will be generated for input into the forecast model. Valid values: "GSMGFS" "FV3GFS" "RAP" "HRRR" "NAM" + The name of the external model that will provide fields from which lateral boundary condition (LBC) files (except for the 0-th hour LBC file) will be generated for input into the forecast model. 
Valid values: ``"GSMGFS"`` | ``"FV3GFS"`` | ``"RAP"`` | ``"HRRR"`` | ``"NAM"`` ``LBC_SPEC_INTVL_HRS``: (Default: "6") - The interval (in integer hours) at which LBC files will be generated. This is also referred to as the *boundary specification interval*. Note that the model specified in ``EXTRN_MDL_NAME_LBCS`` must have data available at a frequency greater than or equal to that implied by ``LBC_SPEC_INTVL_HRS``. For example, if ``LBC_SPEC_INTVL_HRS`` is set to "6", then the model must have data available at least every 6 hours. It is up to the user to ensure that this is the case. + The interval (in integer hours) at which LBC files will be generated. This is also referred to as the *boundary specification interval*. Note that the model selected in ``EXTRN_MDL_NAME_LBCS`` must have data available at a frequency greater than or equal to that implied by ``LBC_SPEC_INTVL_HRS``. For example, if ``LBC_SPEC_INTVL_HRS`` is set to "6", then the model must have data available at least every 6 hours. It is up to the user to ensure that this is the case. ``EXTRN_MDL_ICS_OFFSET_HRS``: (Default: "0") - Users may wish to start a forecast using forecast data from a previous cycle of an external model. This variable sets the number of hours earlier the external model started than when the FV3 forecast configured here should start. For example, if the forecast should start from a 6 hour forecast of the GFS, then ``EXTRN_MDL_ICS_OFFSET_HRS="6"``. + Users may wish to start a forecast using forecast data from a previous cycle of an external model. This variable indicates how many hours earlier the external model started than the FV3 forecast configured here. For example, if the forecast should start from a 6-hour forecast of the GFS, then ``EXTRN_MDL_ICS_OFFSET_HRS="6"``. ``EXTRN_MDL_LBCS_OFFSET_HRS``: (Default: "") - Users may wish to use lateral boundary conditions from a forecast that was started earlier than the initial time for the FV3 forecast configured here. This variable sets the number of hours earlier the external model started than when the FV3 forecast configured here should start. For example, if the forecast should use lateral boundary conditions from the GFS started 6 hours earlier, then ``EXTRN_MDL_LBCS_OFFSET_HRS="6"``. Note: the default value is model-dependent and set in ``set_extrn_mdl_params.sh``. + Users may wish to use lateral boundary conditions from a forecast that was started earlier than the start of the forecast configured here. This variable indicates how many hours earlier the external model started than the FV3 forecast configured here. For example, if the forecast should use lateral boundary conditions from the GFS started 6 hours earlier, then ``EXTRN_MDL_LBCS_OFFSET_HRS="6"``. Note: the default value is model-dependent and is set in ``set_extrn_mdl_params.sh``. ``FV3GFS_FILE_FMT_ICS``: (Default: "nemsio") - If using the FV3GFS model as the source of the :term:`ICs` (i.e., if ``EXTRN_MDL_NAME_ICS="FV3GFS"``), this variable specifies the format of the model files to use when generating the ICs. Valid values: "nemsio" "grib2" "netcdf" + If using the FV3GFS model as the source of the :term:`ICs` (i.e., if ``EXTRN_MDL_NAME_ICS="FV3GFS"``), this variable specifies the format of the model files to use when generating the ICs. 
Valid values: ``"nemsio"`` | ``"grib2"`` | ``"netcdf"`` ``FV3GFS_FILE_FMT_LBCS``: (Default: "nemsio") - If using the FV3GFS model as the source of the :term:`LBCs` (i.e., if ``EXTRN_MDL_NAME_ICS="FV3GFS"``), this variable specifies the format of the model files to use when generating the LBCs. Valid values: "nemsio" "grib2" "netcdf" + If using the FV3GFS model as the source of the :term:`LBCs` (i.e., if ``EXTRN_MDL_NAME_LBCS="FV3GFS"``), this variable specifies the format of the model files to use when generating the LBCs. Valid values: ``"nemsio"`` | ``"grib2"`` | ``"netcdf"`` @@ -350,7 +346,7 @@ Base Directories for External Model Files =========================================== .. note:: - Note that these must be defined as null strings in ``config_defaults.sh`` so that if they are specified by the user in the experiment configuration file (i.e., ``config.sh``), they remain set to those values, and if not, they get set to machine-dependent values. + These variables must be defined as null strings in ``config_defaults.sh`` so that if they are specified by the user in the experiment configuration file (``config.sh``), they remain set to those values, and if not, they get set to machine-dependent values. ``EXTRN_MDL_SYSBASEDIR_ICS``: (Default: "") Base directory on the local machine containing external model files for generating :term:`ICs` on the native grid. The way the full path containing these files is constructed depends on the user-specified external model for ICs (defined in ``EXTRN_MDL_NAME_ICS`` above). @@ -362,7 +358,7 @@ User-Staged External Model Directory and File Parameters ======================================================== ``USE_USER_STAGED_EXTRN_FILES``: (Default: "FALSE") - Flag that determines whether or not the workflow will look for the external model files needed for generating :term:`ICs` and :term:`LBCs` in user-specified directories (as opposed to fetching them from mass storage like NOAA HPSS). + Flag that determines whether the workflow will look for the external model files needed for generating :term:`ICs` and :term:`LBCs` in user-specified directories (rather than fetching them from mass storage like NOAA :term:`HPSS`). ``EXTRN_MDL_SOURCE_BASEDIR_ICS``: (Default: "/base/dir/containing/user/staged/extrn/mdl/files/for/ICs") Directory containing external model files for generating ICs. If ``USE_USER_STAGED_EXTRN_FILES`` is set to "TRUE", the workflow looks within this directory for a subdirectory named "YYYYMMDDHH", which contains the external model files specified by the array ``EXTRN_MDL_FILES_ICS``. This "YYYYMMDDHH" subdirectory corresponds to the start date and cycle hour of the forecast (see :ref:`above `). These files will be used to generate the :term:`ICs` on the native FV3-LAM grid. This variable is not used if ``USE_USER_STAGED_EXTRN_FILES`` is set to "FALSE". @@ -372,9 +368,10 @@ User-Staged External Model Directory and File Parameters ``EXTRN_MDL_SOURCE_BASEDIR_LBCS``: (Default: "/base/dir/containing/user/staged/extrn/mdl/files/for/ICs") Analogous to ``EXTRN_MDL_SOURCE_BASEDIR_ICS`` but for :term:`LBCs` instead of :term:`ICs`. + Directory containing external model files for generating LBCs. If ``USE_USER_STAGED_EXTRN_FILES`` is set to "TRUE", the workflow looks within this directory for a subdirectory named "YYYYMMDDHH", which contains the external model files specified by the array ``EXTRN_MDL_FILES_LBCS``.
This "YYYYMMDDHH" subdirectory corresponds to the start date and cycle hour of the forecast (see :ref:`above `). These files will be used to generate the :term:`LBCs` on the native FV3-LAM grid. This variable is not used if ``USE_USER_STAGED_EXTRN_FILES`` is set to "FALSE". ``EXTRN_MDL_FILES_LBCS``: (Default: " "LBCS_file1" "LBCS_file2" "...") - Analogous to ``EXTRN_MDL_FILES_ICS`` but for :term:`LBCs` instead of :term:`ICs`. + Analogous to ``EXTRN_MDL_FILES_ICS`` but for :term:`LBCs` instead of :term:`ICs`. Array containing the file names to search for in the ``EXTRN_MDL_SOURCE_BASEDIR_LBCS`` directory. This variable is not used if ``USE_USER_STAGED_EXTRN_FILES`` is set to "FALSE". NOMADS Parameters @@ -386,35 +383,34 @@ Set parameters associated with NOMADS online data. Flag controlling whether to use NOMADS online data. ``NOMADS_file_type``: (Default: "nemsio") - Flag controlling the format of the data. Valid values: "GRIB2" "grib2" "NEMSIO" "nemsio" + Flag controlling the format of the data. Valid values: ``"GRIB2"`` | ``"grib2"`` | ``"NEMSIO"`` | ``"nemsio"`` +.. _CCPP_Params: CCPP Parameter -============== +=============== ``CCPP_PHYS_SUITE``: (Default: "FV3_GFS_v16") This parameter indicates which :term:`CCPP` (Common Community Physics Package) physics suite to use for the forecast(s). The choice of physics suite determines the forecast model's namelist file, the diagnostics table file, the field table file, and the XML physics suite definition file, which are staged in the experiment directory or the :term:`cycle` directories under it. **Current supported settings for this parameter are:** - | "FV3_GFS_v16" - | "FV3_RRFS_v1beta" - | "FV3_HRRR" - | "FV3_WoFS" + | ``"FV3_GFS_v16"`` + | ``"FV3_RRFS_v1beta"`` + | ``"FV3_HRRR"`` + | ``"FV3_WoFS"`` **Other valid values include:** - | "FV3_GFS_2017_gfdlmp" - | "FV3_GFS_2017_gfdlmp_regional" - | "FV3_GFS_v15p2" - | "FV3_GFS_v15_thompson_mynn_lam3km" - | "FV3_RRFS_v1alpha" - + | ``"FV3_GFS_2017_gfdlmp"`` + | ``"FV3_GFS_2017_gfdlmp_regional"`` + | ``"FV3_GFS_v15p2"`` + | ``"FV3_GFS_v15_thompson_mynn_lam3km"`` + | ``"FV3_RRFS_v1alpha"`` Stochastic Physics Parameters ================================ -For the most updated and detailed documentation of these parameters, see the `UFS Stochastic Physics Documentation `__. - +For the most updated and detailed documentation of these parameters, see the `UFS Stochastic Physics Documentation `__. ``NEW_LSCALE``: (Default: "TRUE") Use correct formula for converting a spatial legnth scale into spectral space. @@ -469,7 +465,7 @@ SPPT perturbs full physics tendencies *after* the call to the physics suite, unl Interval in seconds to update random pattern (optional parameter). Perturbations still get applied at every time-step. Corresponds to the variable ``spptint`` in ``input.nml``. ``SPPT_SFCLIMIT``: (Default: "TRUE") - When "TRUE", tapers the SPPT perturbations to zero at the model’s lowest level, which reduces model crashes. + When "TRUE", tapers the SPPT perturbations to zero at the model's lowest level, which reduces model crashes. ``USE_ZMTNBLCK``: (Default: "FALSE") When "TRUE", do not apply perturbations below the dividing streamline that is diagnosed by the gravity wave drag, mountain blocking scheme @@ -509,77 +505,107 @@ Stochastic Kinetic Energy Backscatter (SKEB) Parameters Parameters for Stochastically Perturbed Parameterizations (SPP) ------------------------------------------------------------------ -Set default Stochastically Perturbed Parameterizations (SPP) stochastic physics options. 
Stochastic Physics Parameters ================================ -For the most updated and detailed documentation of these parameters, see the `UFS Stochastic Physics Documentation `__. - +For the most updated and detailed documentation of these parameters, see the `UFS Stochastic Physics Documentation `__. ``NEW_LSCALE``: (Default: "TRUE") Use the correct formula for converting a spatial length scale into spectral space. @@ -469,7 +465,7 @@ SPPT perturbs full physics tendencies *after* the call to the physics suite, unl Interval in seconds to update random pattern (optional parameter). Perturbations still get applied at every time-step. Corresponds to the variable ``spptint`` in ``input.nml``. ``SPPT_SFCLIMIT``: (Default: "TRUE") - When "TRUE", tapers the SPPT perturbations to zero at the model’s lowest level, which reduces model crashes. + When "TRUE", tapers the SPPT perturbations to zero at the model's lowest level, which reduces model crashes. ``USE_ZMTNBLCK``: (Default: "FALSE") When "TRUE", do not apply perturbations below the dividing streamline that is diagnosed by the gravity wave drag, mountain blocking scheme. @@ -509,77 +505,107 @@ Stochastic Kinetic Energy Backscatter (SKEB) Parameters Parameters for Stochastically Perturbed Parameterizations (SPP) ------------------------------------------------------------------ -Set default Stochastically Perturbed Parameterizations (SPP) stochastic physics options. Unlike :ref:`SPPT physics `, SPP is applied within the physics, not afterward. SPP perturbs specific tuning parameters within a physics :term:`parameterization` (unlike `SPPT `, which multiplies overall physics tendencies by a random perturbation field *after* the call to the physics suite). Each SPP option is an array, applicable (in order) to the :term:`RAP`/:term:`HRRR`-based parameterization listed in ``SPP_VAR_LIST``. Enter each value of the array in ``config.sh`` as shown below without commas or single quotes (e.g., ``SPP_VAR_LIST=( "pbl" "sfc" "mp" "rad" "gwd"`` ). Both commas and single quotes will be added by Jinja when creating the namelist. +SPP perturbs specific tuning parameters within a physics :term:`parameterization` (unlike :ref:`SPPT `, which multiplies overall physics tendencies by a random perturbation field *after* the call to the physics suite). Each SPP option is an array, applicable (in order) to the :term:`RAP`/:term:`HRRR`-based parameterization listed in ``SPP_VAR_LIST``. Enter each value of the array in ``config.sh`` as shown below without commas or single quotes (e.g., ``SPP_VAR_LIST=( "pbl" "sfc" "mp" "rad" "gwd" )``). Both commas and single quotes will be added by Jinja when creating the namelist. .. note:: - SPP is currently only available for specific physics schemes used in the RAP/HRRR physics suite. Users need to be aware of which :term:`SDF` is chosen when turning this option on. + SPP is currently only available for specific physics schemes used in the RAP/HRRR physics suite. Users need to be aware of which :term:`SDF` is chosen when turning this option on. Among the supported physics suites, the full set of parameterizations can only be used with the ``FV3_HRRR`` option for ``CCPP_PHYS_SUITE``. ``DO_SPP``: (Default: "false") Flag to turn SPP on or off. SPP perturbs parameters or variables with unknown or uncertain magnitudes within the physics code based on ranges provided by physics experts. ``ISEED_SPP``: (Default: ( "4" "4" "4" "4" "4" ) ) - The initial seed value for the perturbation pattern. + Seed for setting the random number sequence for the perturbation pattern. ``SPP_MAG_LIST``: (Default: ( "0.2" "0.2" "0.75" "0.2" "0.2" ) ) - Corresponds to the variable ``spp_prt_list`` in ``input.nml`` + SPP perturbation magnitudes used in each parameterization. Corresponds to the variable ``spp_prt_list`` in ``input.nml``. ``SPP_LSCALE``: (Default: ( "150000.0" "150000.0" "150000.0" "150000.0" "150000.0" ) ) - Length scale in meters. + Decorrelation spatial scales in meters. ``SPP_TSCALE``: (Default: ( "21600.0" "21600.0" "21600.0" "21600.0" "21600.0" ) ) - Time decorrelation length in seconds. Corresponds to the variable ``spp_tau`` in ``input.nml``. + Decorrelation timescales in seconds. Corresponds to the variable ``spp_tau`` in ``input.nml``. ``SPP_SIGTOP1``: (Default: ( "0.1" "0.1" "0.1" "0.1" "0.1") ) Controls vertical tapering of perturbations at the tropopause and corresponds to the lower sigma level at which to taper perturbations to zero. -.. - COMMENT: Needs review. - ``SPP_SIGTOP2``: (Default: ( "0.025" "0.025" "0.025" "0.025" "0.025" ) ) Controls vertical tapering of perturbations at the tropopause and corresponds to the upper sigma level at which to taper perturbations to zero. -.. - COMMENT: Needs review. - ``SPP_STDDEV_CUTOFF``: (Default: ( "1.5" "1.5" "2.5" "1.5" "1.5" ) ) - Perturbation magnitude cutoff in number of standard deviations from the mean. - -.. - COMMENT: Needs review. + Limit for possible perturbation values in standard deviations from the mean. ``SPP_VAR_LIST``: (Default: ( "pbl" "sfc" "mp" "rad" "gwd" ) ) - The list of parameterizations to perturb: planetary boundary layer (PBL), surface physics (SFC), microphysics (MP), radiation (RAD), gravity wave drag (GWD). Valid values: "pbl", "sfc", "rad", "gwd", and "mp". + The list of parameterizations to perturb: planetary boundary layer (PBL), surface physics (SFC), microphysics (MP), radiation (RAD), gravity wave drag (GWD). Valid values: ``"pbl"`` | ``"sfc"`` | ``"rad"`` | ``"gwd"`` | ``"mp"``
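As an illustrative sketch (not a recommended configuration), the following ``config.sh`` lines would turn SPP on for just the PBL and microphysics schemes, keeping all of the parallel arrays the same length; the magnitudes simply reuse the defaults listed above:

.. code-block:: console

   DO_SPP="TRUE"
   SPP_VAR_LIST=( "pbl" "mp" )
   SPP_MAG_LIST=( "0.2" "0.75" )
   SPP_LSCALE=( "150000.0" "150000.0" )
   SPP_TSCALE=( "21600.0" "21600.0" )
   ISEED_SPP=( "4" "4" )

Per the note above, this also requires an SDF based on the RAP/HRRR physics (e.g., ``CCPP_PHYS_SUITE="FV3_HRRR"``).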
Land Surface Model (LSM) SPP ------------------------------- -Land surface perturbations can be applied to land model parameters and land model prognostic variables. The LSM scheme is intended to address errors in the land model and land-atmosphere interactions. LSM perturbations include soil moisture content [SMC] (volume fraction), vegetation fraction (VGF), albedo (ALB), salinity (SAL), emissivity (EMI), surface roughness (ZOL) (in cm), and soil temperature (STC). Perturbations to soil moisture content (SMC) are only applied at the first time step. Only five perturbations at a time can be applied currently, but all seven are shown below. In addition, only one unique *iseed* value is allowed at the moment, and it is used for each pattern. +Land surface perturbations can be applied to land model parameters and land model prognostic variables. The LSM scheme is intended to address errors in the land model and land-atmosphere interactions. LSM perturbations include soil moisture content (SMC) (volume fraction), vegetation fraction (VGF), albedo (ALB), salinity (SAL), emissivity (EMI), surface roughness (ZOL) (in cm), and soil temperature (STC). Perturbations to soil moisture content (SMC) are only applied at the first time step. Only five perturbations at a time can be applied currently, but all seven are shown below. In addition, only one unique *iseed* value is allowed at the moment, and it is used for each pattern. The parameters below turn on SPP in Noah or RUC LSM (support for Noah MP is in progress). Please be aware of the :term:`SDF` that you choose if you wish to turn on Land Surface Model (LSM) SPP. SPP in LSM schemes is handled in the ``&nam_sfcperts`` namelist block instead of in ``&nam_sppperts``, where all other SPP is implemented. The default perturbation frequency is determined by the ``fhcyc`` namelist entry. Since that parameter is set to zero in the SRW App, use ``LSM_SPP_EACH_STEP`` to perturb every time step. ``DO_LSM_SPP``: (Default: "false") Turns on Land Surface Model (LSM) Stochastic Physics Parameterizations (SPP). When "TRUE", sets ``lndp_type=2``, which applies land perturbations to the selected parameters using a newer scheme designed for data assimilation (DA) ensemble spread. LSM SPP perturbs uncertain land surface fields ("smc" "vgf" "alb" "sal" "emi" "zol" "stc") based on recommendations from physics experts. -``LSM_SPP_TSCALE``: (Default: ( ( "21600" "21600" "21600" "21600" "21600" "21600" "21600" ) ) - Decorrelation timescale in seconds. +``LSM_SPP_TSCALE``: (Default: ( ( "21600" "21600" "21600" "21600" "21600" "21600" "21600" ) ) ) + Decorrelation timescales in seconds. -``LSM_SPP_LSCALE``: (Default: ( ( "150000" "150000" "150000" "150000" "150000" "150000" "150000" ) ) - Decorrelation spatial scale in meters. +``LSM_SPP_LSCALE``: (Default: ( ( "150000" "150000" "150000" "150000" "150000" "150000" "150000" ) ) ) + Decorrelation spatial scales in meters. ``ISEED_LSM_SPP``: (Default: ("9") ) Seed to initialize the random perturbation pattern. -``LSM_SPP_VAR_LIST``: (Default: ( ( "smc" "vgf" "alb" "sal" "emi" "zol" "stc" ) ) +``LSM_SPP_VAR_LIST``: (Default: ( ( "smc" "vgf" "alb" "sal" "emi" "zol" "stc" ) ) ) Indicates which LSM variables to perturb. -``LSM_SPP_MAG_LIST``: (Default: ( ( "0.2" "0.001" "0.001" "0.001" "0.001" "0.001" "0.2" ) ) +``LSM_SPP_MAG_LIST``: (Default: ( ( "0.2" "0.001" "0.001" "0.001" "0.001" "0.001" "0.2" ) ) ) Sets the maximum random pattern amplitude for each of the LSM perturbations. ``LSM_SPP_EACH_STEP``: (Default: "true") When set to "TRUE", it sets ``lndp_each_step=.true.`` and perturbs each time step.
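A minimal sketch of an LSM SPP setup in ``config.sh``, perturbing only soil moisture content and vegetation fraction at every time step (the magnitudes reuse the corresponding defaults and are illustrative, not recommendations):

.. code-block:: console

   DO_LSM_SPP="TRUE"
   LSM_SPP_VAR_LIST=( "smc" "vgf" )
   LSM_SPP_MAG_LIST=( "0.2" "0.001" )
   LSM_SPP_EACH_STEP="TRUE"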
-.. This is a continuation of the ConfigWorkflow.rst chapter +Predefined Grid Parameters +========================== +``PREDEF_GRID_NAME``: (Default: "") + This parameter indicates which (if any) predefined regional grid to use for the experiment. Setting ``PREDEF_GRID_NAME`` provides a convenient method of specifying a commonly used set of grid-dependent parameters. The predefined grid settings can be viewed in the script ``ush/set_predef_grid_params.sh``. + + **Currently supported options:** + + | ``"RRFS_CONUS_25km"`` + | ``"RRFS_CONUS_13km"`` + | ``"RRFS_CONUS_3km"`` + | ``"SUBCONUS_Ind_3km"`` + + **Other valid values include:** + + | ``"CONUS_25km_GFDLgrid"`` + | ``"CONUS_3km_GFDLgrid"`` + | ``"EMC_AK"`` + | ``"EMC_HI"`` + | ``"EMC_PR"`` + | ``"EMC_GU"`` + | ``"GSL_HAFSV0.A_25km"`` + | ``"GSL_HAFSV0.A_13km"`` + | ``"GSL_HAFSV0.A_3km"`` + | ``"GSD_HRRR_AK_50km"`` + | ``"RRFS_AK_13km"`` + | ``"RRFS_AK_3km"`` + | ``"RRFS_CONUScompact_25km"`` + | ``"RRFS_CONUScompact_13km"`` + | ``"RRFS_CONUScompact_3km"`` + | ``"RRFS_NA_13km"`` + | ``"RRFS_NA_3km"`` + | ``"RRFS_SUBCONUS_3km"`` + | ``"WoFS_3km"`` + +.. note:: + + * If ``PREDEF_GRID_NAME`` is set to a valid predefined grid name, the grid generation method, the (native) grid parameters, and the write component grid parameters are set to predefined values for the specified grid, overwriting any settings of these parameters in the user-specified experiment configuration file (``config.sh``). In addition, if the time step ``DT_ATMOS`` and the computational parameters (``LAYOUT_X``, ``LAYOUT_Y``, and ``BLOCKSIZE``) are not specified in that configuration file, they are also set to predefined values for the specified grid. + + * If ``PREDEF_GRID_NAME`` is set to an empty string, it implies that the user will provide the native grid parameters in the user-specified experiment configuration file (``config.sh``). In this case, the grid generation method, the native grid parameters, the write component grid parameters, the main time step (``DT_ATMOS``), and the computational parameters (``LAYOUT_X``, ``LAYOUT_Y``, and ``BLOCKSIZE``) must be set in the configuration file. Otherwise, the values of the parameters in the default experiment configuration file (``config_defaults.sh``) will be used.
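For instance, selecting the supported 25-km CONUS grid takes a single line in ``config.sh``, after which the grid, write-component, and (unless explicitly overridden) computational parameters are all filled in automatically:

.. code-block:: console

   PREDEF_GRID_NAME="RRFS_CONUS_25km"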
.. _ConfigParameters: @@ -590,7 +616,7 @@ Grid Generation Parameters * **"ESGgrid":** The "ESGgrid" method will generate a regional version of the Extended Schmidt Gnomonic (ESG) grid using the map projection developed by Jim Purser of EMC (:cite:t:`Purser_2020`). "ESGgrid" is the preferred grid option. - * **"GFDLgrid":** The "GFDLgrid" method first generates a "parent" global cubed-sphere grid. Then a portion from tile 6 of the global grid is used as the regional grid. This regional grid is referred to in the grid generation scripts as "tile 7," even though it does not correspond to a complete tile. The forecast is run only on the regional grid (i.e., on tile 7, not on tiles 1 through 6). Note that the "GFDLgrid" grid generation method is the legacy grid generation method. It is not supported in *all* predefined domains. + * **"GFDLgrid":** The "GFDLgrid" method first generates a "parent" global cubed-sphere grid. Then a portion from tile 6 of the global grid is used as the regional grid. This regional grid is referred to in the grid generation scripts as "tile 7," even though it does not correspond to a complete tile. The forecast is run only on the regional grid (i.e., on tile 7, not on tiles 1 through 6). Note that the "GFDLgrid" method is the legacy grid generation method. It is not supported in *all* predefined domains. .. attention:: @@ -598,14 +624,14 @@ Grid Generation Parameters .. note:: - If the experiment uses a **user-defined grid** (i.e. if ``PREDEF_GRID_NAME`` is set to a null string), then ``GRID_GEN_METHOD`` must be set in the experiment configuration file. Otherwise, the experiment generation will fail because the generation scripts check to ensure that the grid name is set to a non-empty string before creating the experiment directory. + If the experiment uses a **user-defined grid** (i.e., if ``PREDEF_GRID_NAME`` is set to a null string), then ``GRID_GEN_METHOD`` must be set in the experiment configuration file. Otherwise, the experiment generation will fail because the generation scripts check to ensure that the grid name is set to a non-empty string before creating the experiment directory. .. _ESGgrid: ESGgrid Settings ------------------- -The following parameters must be set if using the "ESGgrid" method of generating a regional grid (i.e., when ``GRID_GEN_METHOD="ESGgrid"``). +The following parameters must be set if using the "ESGgrid" method to generate a regional grid (i.e., when ``GRID_GEN_METHOD="ESGgrid"``). ``ESGgrid_LON_CTR``: (Default: "") The longitude of the center of the grid (in degrees). @@ -625,25 +651,21 @@ The following parameters must be set if using the "ESGgrid" method of generating ``ESGgrid_NY``: (Default: "") The number of cells in the meridional direction on the regional grid. -``ESGgrid_WIDE_HALO_WIDTH``: (Default: "") - The width (in number of grid cells) of the :term:`halo` to add around the regional grid before shaving the halo down to the width(s) expected by the forecast model. - ``ESGgrid_PAZI``: (Default: "") The rotational parameter for the "ESGgrid" (in degrees). +``ESGgrid_WIDE_HALO_WIDTH``: (Default: "") + The width (in number of grid cells) of the :term:`halo` to add around the regional grid before shaving the halo down to the width(s) expected by the forecast model. + .. _WideHalo: .. note:: - A :term:`halo` is the strip of cells surrounding the regional grid; the halo is used to feed in the lateral boundary conditions to the grid. The forecast model requires **grid** files containing 3-cell- and 4-cell-wide halos and **orography** files with 0-cell- and 3-cell- wide halos. In order to generate grid and orography files with appropriately-sized halos, the grid and orography tasks create preliminary files with halos around the regional domain of width ``ESGgrid_WIDE_HALO_WIDTH`` cells. The files are then read in and "shaved" down to obtain grid files with 3-cell-wide and 4-cell-wide halos and orography files with 0-cell-wide and 3-cell-wide halos.
The original halo that gets shaved down is referred to as the "wide" halo because it is wider than the 0-cell-wide, 3-cell-wide, and 4-cell-wide halos that we eventually end up with. Note that the grid and orography files with the wide halo are only needed as intermediates in generating the files with 0-cell-, 3-cell-, and 4-cell-wide halos; they are not needed by the forecast model. - -.. - COMMENT: There's a note that we "probably don't need to make ESGgrid_WIDE_HALO_WIDTH a user-specified variable. Just set it in the function set_gridparams_ESGgrid.sh". Has this been done? I thought there was a default value of 6. Does this come from set_gridparams_ESGgrid.sh? Will it overwirte what's added here? - + A :term:`halo` is the strip of cells surrounding the regional grid; the halo is used to feed in the lateral boundary conditions to the grid. The forecast model requires **grid** files containing 3-cell- and 4-cell-wide halos and **orography** files with 0-cell- and 3-cell-wide halos. In order to generate grid and orography files with appropriately-sized halos, the grid and orography tasks create preliminary files with halos around the regional domain of width ``ESGgrid_WIDE_HALO_WIDTH`` cells. The files are then read in and "shaved" down to obtain grid files with 3-cell-wide and 4-cell-wide halos and orography files with 0-cell-wide and 3-cell-wide halos. The original halo that gets shaved down is referred to as the "wide" halo because it is wider than the 0-cell-wide, 3-cell-wide, and 4-cell-wide halos that users eventually end up with. Note that the grid and orography files with the wide halo are only needed as intermediates in generating the files with 0-cell-, 3-cell-, and 4-cell-wide halos; they are not needed by the forecast model. GFDLgrid Settings --------------------- -The following parameters must be set if using the "GFDLgrid" method of generating a regional grid (i.e., when ``GRID_GEN_METHOD="GFDLgrid"``). Note that the regional grid is defined with respect to a "parent" global cubed-sphere grid. Thus, all the parameters for a global cubed-sphere grid must be specified even though the model equations are integrated only on the regional grid. Tile 6 has arbitrarily been chosen as the tile to use to orient the global parent grid on the sphere (Earth). For convenience, the regional grid is denoted as "tile 7" even though it is embedded within tile 6 (i.e., it doesn't extend beyond the boundary of tile 6). Its exact location within tile 6 is determined by specifying the starting and ending i- and j-indices of the regional grid on tile 6, where i is the grid index in the x direction and j is the grid index in the y direction. All of this information is set in the variables below. +The following parameters must be set if using the "GFDLgrid" method to generate a regional grid (i.e., when ``GRID_GEN_METHOD="GFDLgrid"``). Note that the regional grid is defined with respect to a "parent" global cubed-sphere grid. Thus, all the parameters for a global cubed-sphere grid must be specified even though the model equations are integrated only on the regional grid. Tile 6 has arbitrarily been chosen as the tile to use to orient the global parent grid on the sphere (Earth). For convenience, the regional grid is denoted as "tile 7" even though it is embedded within tile 6 (i.e., it doesn't extend beyond the boundary of tile 6). 
Its exact location within tile 6 is determined by specifying the starting and ending i- and j-indices of the regional grid on tile 6, where ``i`` is the grid index in the x direction and ``j`` is the grid index in the y direction. All of this information is set in the variables below. ``GFDLgrid_LON_T6_CTR``: (Default: "") Longitude of the center of tile 6 (in degrees). @@ -652,13 +674,10 @@ The following parameters must be set if using the "GFDLgrid" method of generatin Latitude of the center of tile 6 (in degrees). ``GFDLgrid_RES``: (Default: "") - Number of points in either of the two horizontal directions (x and y) on each tile of the parent global cubed-sphere grid. Valid values: "48" "96" "192" "384" "768" "1152" "3072" + Number of points in either of the two horizontal directions (x and y) on each tile of the parent global cubed-sphere grid. Valid values: ``"48"`` | ``"96"`` | ``"192"`` | ``"384"`` | ``"768"`` | ``"1152"`` | ``"3072"`` - .. - COMMENT: Are these still the valid values? Are there others? - .. note:: - ``GFDLgrid_RES`` is a misnomer because it specifies *number* of grid cells, not grid size (in meters or kilometers). However, we keep this name in order to remain consistent with the usage of the word "resolution" in the global forecast model and auxiliary codes. The mapping from ``GFDLgrid_RES`` to a nominal resolution (grid cell size) for several values of ``GFDLgrid_RES`` is as follows (assuming a uniform global grid, i.e., with Schmidt stretch factor ``GFDLgrid_STRETCH_FAC="1"``): + ``GFDLgrid_RES`` is a misnomer because it specifies *number* of grid cells, not grid size (in meters or kilometers). However, we keep this name in order to remain consistent with the usage of the word "resolution" in the global forecast model and auxiliary codes. The mapping from ``GFDLgrid_RES`` to a nominal resolution (grid cell size) for several values of ``GFDLgrid_RES`` is as follows (assuming a uniform global grid, i.e., with Schmidt stretch factor of 1: ``GFDLgrid_STRETCH_FAC="1"``): +----------------+--------------------+ | GFDLgrid_RES | typical cell size | @@ -678,7 +697,7 @@ The following parameters must be set if using the "GFDLgrid" method of generatin ``GFDLgrid_STRETCH_FAC``: (Default: "") - Stretching factor used in the Schmidt transformation applied to the parent cubed-sphere grid. Setting the Schmidt stretching factor (``GFDLgrid_STRETCH_FAC``) to a value greater than 1 shrinks tile 6, while setting it to a value less than 1 (but still greater than 0) expands it. The remaining 5 tiles change shape as necessary to maintain global coverage of the grid. + Stretching factor used in the Schmidt transformation applied to the parent cubed-sphere grid. Setting the Schmidt stretching factor to a value greater than 1 shrinks tile 6, while setting it to a value less than 1 (but still greater than 0) expands it. The remaining 5 tiles change shape as necessary to maintain global coverage of the grid. ``GFDLgrid_REFINE_RATIO``: (Default: "") Cell refinement ratio for the regional grid. It refers to the number of cells in either the x or y direction on the regional grid (tile 7) that abut one cell on its parent tile (tile 6). @@ -696,7 +715,7 @@ The following parameters must be set if using the "GFDLgrid" method of generatin j-index on tile 6 at which the regional grid (tile 7) ends. 
``GFDLgrid_USE_GFDLgrid_RES_IN_FILENAMES``: (Default: "") - Flag that determines the file naming convention to use for grid, orography, and surface climatology files (or, if using pregenerated files, the naming convention that was used to name these files). These files usually start with the string ``"C${RES}_"``, where ``RES`` is an integer. In the global forecast model, ``RES`` is the number of points in each of the two horizontal directions (x and y) on each tile of the global grid (defined here as ``GFDLgrid_RES``). If this flag is set to "TRUE", ``RES`` will be set to ``GFDLgrid_RES`` just as in the global forecast model. If it is set to "FALSE", we calculate (in the grid generation task) an "equivalent global uniform cubed-sphere resolution" -- call it ``RES_EQUIV`` -- and then set ``RES`` equal to it. ``RES_EQUIV`` is the number of grid points in each of the x and y directions on each tile that a global UNIFORM (i.e., stretch factor of 1) cubed-sphere grid would need to have in order to have the same average grid size as the regional grid. This is a more useful indicator of the grid size because it takes into account the effects of ``GFDLgrid_RES``, ``GFDLgrid_STRETCH_FAC``, and ``GFDLgrid_REFINE_RATIO`` in determining the regional grid's typical grid size, whereas simply setting RES to ``GFDLgrid_RES`` doesn't take into account the effects of ``GFDLgrid_STRETCH_FAC`` and ``GFDLgrid_REFINE_RATIO`` on the regional grid's resolution. Nevertheless, some users still prefer to use ``GFDLgrid_RES`` in the file names, so we allow for that here by setting this flag to "TRUE". + Flag that determines the file naming convention to use for grid, orography, and surface climatology files (or, if using pregenerated files, the naming convention that was used to name these files). These files usually start with the string ``"C${RES}_"``, where ``RES`` is an integer. In the global forecast model, ``RES`` is the number of points in each of the two horizontal directions (x and y) on each tile of the global grid (defined here as ``GFDLgrid_RES``). If this flag is set to "TRUE", ``RES`` will be set to ``GFDLgrid_RES`` just as in the global forecast model. If it is set to "FALSE", we calculate (in the grid generation task) an "equivalent global uniform cubed-sphere resolution" --- call it ``RES_EQUIV`` --- and then set ``RES`` equal to it. ``RES_EQUIV`` is the number of grid points in each of the x and y directions on each tile that a global uniform cubed-sphere grid (i.e., stretch factor of 1) would need to have in order to have the same average grid size as the regional grid. This is a more useful indicator of the grid size because it takes into account the effects of ``GFDLgrid_RES``, ``GFDLgrid_STRETCH_FAC``, and ``GFDLgrid_REFINE_RATIO`` in determining the regional grid's typical grid size, whereas simply setting RES to ``GFDLgrid_RES`` doesn't take into account the effects of ``GFDLgrid_STRETCH_FAC`` and ``GFDLgrid_REFINE_RATIO`` on the regional grid's resolution. Nevertheless, some users still prefer to use ``GFDLgrid_RES`` in the file names, so we allow for that here by setting this flag to "TRUE". Computational Forecast Parameters ================================= @@ -720,17 +739,17 @@ Write-Component (Quilting) Parameters ====================================== .. 
note:: - The :term:`UPP` (called by the ``RUN_POST_TN`` task) cannot process output on the native grid types ("GFDLgrid" and "ESGgrid"), so output fields are interpolated to a **write-component grid** before writing them to an output file. The output files written by the UFS Weather Model model use an Earth System Modeling Framework (ESMF) component, referred to as the **write component**. This model component is configured with settings in the ``model_configure`` file, as described in `Section 4.2.3 `__ of the UFS Weather Model documentation. + The :term:`UPP` (called by the ``RUN_POST_TN`` task) cannot process output on the native grid types ("GFDLgrid" and "ESGgrid"), so output fields are interpolated to a **write-component grid** before writing them to an output file. The output files written by the UFS Weather Model use an Earth System Modeling Framework (:term:`ESMF`) component, referred to as the **write component**. This model component is configured with settings in the ``model_configure`` file, as described in `Section 4.2.3 `__ of the UFS Weather Model documentation. ``QUILTING``: (Default: "TRUE") -.. attention:: - The regional grid requires the use of the write component, so users generally should not need to change the default value for ``QUILTING``. + .. attention:: + The regional grid requires the use of the write component, so users generally should not need to change the default value for ``QUILTING``. - Flag that determines whether to use the write component for writing forecast output files to disk. If set to "TRUE", the forecast model will output files named ``dynf$HHH.nc`` and ``phyf$HHH.nc`` (where HHH is the 3-digit forecast hour) containing dynamics and physics fields, respectively, on the write-component grid. For example, the output files for the 3rd hour of the forecast would be ``dynf$003.nc`` and ``phyf$003.nc``. (The regridding from the native FV3-LAM grid to the write-component grid is done by the forecast model.) If ``QUILTING`` is set to "FALSE", then the output file names are ``fv3_history.nc`` and ``fv3_history2d.nc``, and they contain fields on the native grid. Although the UFS Weather Model can run without quilting, the regional grid requires the use of the write component. Therefore, QUILTING should be set to "TRUE" when running the SRW App. If ``QUILTING`` is set to "FALSE", the ``RUN_POST_TN`` (meta)task cannot run because the :term:`UPP` code that this task calls cannot process fields on the native grid. In that case, the ``RUN_POST_TN`` (meta)task will be automatically removed from the Rocoto workflow XML. The :ref:`INLINE POST ` option also requires ``QUILTING`` to be set to "TRUE" in the SRW App. + Flag that determines whether to use the write component for writing forecast output files to disk. If set to "TRUE", the forecast model will output files named ``dynfHHH.nc`` and ``phyfHHH.nc`` (where ``HHH`` is the 3-digit forecast hour) containing dynamics and physics fields, respectively, on the write-component grid. For example, the output files for the 3rd hour of the forecast would be ``dynf003.nc`` and ``phyf003.nc``. (The regridding from the native FV3-LAM grid to the write-component grid is done by the forecast model.) If ``QUILTING`` is set to "FALSE", then the output file names are ``fv3_history.nc`` and ``fv3_history2d.nc``, and they contain fields on the native grid. Although the UFS Weather Model can run without quilting, the regional grid requires the use of the write component.
Therefore, ``QUILTING`` should be set to "TRUE" when running the SRW App. If ``QUILTING`` is set to "FALSE", the ``RUN_POST_TN`` (meta)task cannot run because the :term:`UPP` code that this task calls cannot process fields on the native grid. In that case, the ``RUN_POST_TN`` (meta)task will be automatically removed from the Rocoto workflow XML. The :ref:`INLINE POST ` option also requires ``QUILTING`` to be set to "TRUE" in the SRW App. ``PRINT_ESMF``: (Default: "FALSE") - Flag that determines whether to output extra (debugging) information from ESMF routines. Must be "TRUE" or "FALSE". Note that the write component uses ESMF library routines to interpolate from the native forecast model grid to the user-specified output grid (which is defined in the model configuration file ``model_configure`` in the forecast run directory). + Flag that determines whether to output extra (debugging) information from :term:`ESMF` routines. Must be "TRUE" or "FALSE". Note that the write component uses ESMF library routines to interpolate from the native forecast model grid to the user-specified output grid (which is defined in the model configuration file ``model_configure`` in the forecast run directory). ``WRTCMP_write_groups``: (Default: "1") The number of write groups (i.e., groups of :term:`MPI` tasks) to use in the write component. ``WRTCMP_write_tasks_per_group``: (Default: "20") The number of MPI tasks to allocate for each write group. ``WRTCMP_output_grid``: (Default: "''") - Sets the type (coordinate system) of the write component grid. The default empty string forces the user to set a valid value for ``WRTCMP_output_grid`` in ``config.sh`` if specifying a *custom* grid. When creating an experiment with a user-defined grid, this parameter must be specified or the experiment will fail. Valid values: "lambert_conformal" "regional_latlon" "rotated_latlon" + Sets the type (coordinate system) of the write component grid. The default empty string forces the user to set a valid value for ``WRTCMP_output_grid`` in ``config.sh`` if specifying a *custom* grid. When creating an experiment with a user-defined grid, this parameter must be specified or the experiment will fail. Valid values: ``"lambert_conformal"`` | ``"regional_latlon"`` | ``"rotated_latlon"`` ``WRTCMP_cen_lon``: (Default: "") Longitude (in degrees) of the center of the write component grid. Can usually be set to the corresponding value from the native grid. @@ -787,56 +806,14 @@ Write-Component (Quilting) Parameters ``WRTCMP_dy``: (Default: "") Grid cell size (in meters) along the y-axis of the Lambert conformal projection.
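When a predefined grid is used, these write-component parameters are set automatically. For a *custom* grid, a block along the following lines would need to be added to ``config.sh``; all of the numbers here are hypothetical placeholders for a Lambert conformal output grid, and the additional ``WRTCMP_*`` parameters not shown in this excerpt must be set as well:

.. code-block:: console

   QUILTING="TRUE"
   WRTCMP_write_groups="1"
   WRTCMP_write_tasks_per_group="20"
   WRTCMP_output_grid="lambert_conformal"
   WRTCMP_cen_lon="-97.5"
   WRTCMP_cen_lat="38.5"
   WRTCMP_dx="25000.0"
   WRTCMP_dy="25000.0"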
- -Predefined Grid Parameters -========================== -``PREDEF_GRID_NAME``: (Default: "") - This parameter specifies the name of a predefined regional grid. Setting ``PREDEF_GRID_NAME`` provides a convenient method of specifying a commonly used set of grid-dependent parameters. The predefined grid parameters are specified in the script ``ush/set_predef_grid_params.sh``. - - **Currently supported options:** - - | "RRFS_CONUS_25km" - | "RRFS_CONUS_13km" - | "RRFS_CONUS_3km" - | "SUBCONUS_Ind_3km" - - **Other valid values include:** - - | "CONUS_25km_GFDLgrid" - | "CONUS_3km_GFDLgrid" - | "EMC_AK" - | "EMC_HI" - | "EMC_PR" - | "EMC_GU" - | "GSL_HAFSV0.A_25km" - | "GSL_HAFSV0.A_13km" - | "GSL_HAFSV0.A_3km" - | "GSD_HRRR_AK_50km" - | "RRFS_AK_13km" - | "RRFS_AK_3km" - | "RRFS_CONUScompact_25km" - | "RRFS_CONUScompact_13km" - | "RRFS_CONUScompact_3km" - | "RRFS_NA_13km" - | "RRFS_NA_3km" - | "RRFS_SUBCONUS_3km" - | "WoFS_3km" - -.. note:: - - * If ``PREDEF_GRID_NAME`` is set to a valid predefined grid name, the grid generation method ``GRID_GEN_METHOD``, the (native) grid parameters, and the write-component grid parameters are set to predefined values for the specified grid, overwriting any settings of these parameters in the user-specified experiment configuration file (``config.sh``). In addition, if the time step ``DT_ATMOS`` and the computational parameters ``LAYOUT_X``, ``LAYOUT_Y``, and ``BLOCKSIZE`` are not specified in that configuration file, they are also set to predefined values for the specified grid. - - * If ``PREDEF_GRID_NAME`` is set to an empty string, it implies the user is providing the native grid parameters in the user-specified experiment configuration file (``EXPT_CONFIG_FN``). In this case, the grid generation method ``GRID_GEN_METHOD``, the native grid parameters, and the write-component grid parameters as well as the main time step (``DT_ATMOS``) and the computational parameters ``LAYOUT_X``, ``LAYOUT_Y``, and ``BLOCKSIZE`` must be set in that configuration file. Otherwise, the values of all of these parameters in this default experiment configuration file will be used. - - Pre-existing Directory Parameter ================================ ``PREEXISTING_DIR_METHOD``: (Default: "delete") - This variable determines the method to use to deal with pre-existing directories (generated by previous calls to the experiment generation script using the same experiment name (``EXPT_SUBDIR``) as the current experiment). This variable must be set to one of three valid values: "delete", "rename", and "quit". The resulting behavior for each of these values is as follows: + This variable determines how to deal with pre-existing directories (resulting from previous calls to the experiment generation script using the same experiment name [``EXPT_SUBDIR``] as the current experiment). This variable must be set to one of three valid values: ``"delete"``, ``"rename"``, or ``"quit"``. The behavior for each of these values is as follows: * **"delete":** The preexisting directory is deleted and a new directory (having the same name as the original preexisting directory) is created. - * **"rename":** The preexisting directory is renamed and a new directory (having the same name as the original pre-existing directory) is created. The new name of the preexisting directory consists of its original name and the suffix "_oldNNN", where NNN is a 3-digit integer chosen to make the new name unique. + * **"rename":** The preexisting directory is renamed and a new directory (having the same name as the original preexisting directory) is created. The new name of the preexisting directory consists of its original name and the suffix "_old###", where ``###`` is a 3-digit integer chosen to make the new name unique. * **"quit":** The preexisting directory is left unchanged, but execution of the currently running script is terminated.
In this case, the preexisting directory must be dealt with manually before rerunning the script. @@ -844,12 +821,12 @@ Pre-existing Directory Parameter Verbose Parameter ================= ``VERBOSE``: (Default: "TRUE") - Flag that determines whether the experiment generation and workflow task scripts print out extra informational messages. Valid values: "TRUE" "true" "YES" "yes" "FALSE" "false" "NO" "no" + Flag that determines whether the experiment generation and workflow task scripts print out extra informational messages. Valid values: ``"TRUE"`` | ``"true"`` | ``"YES"`` | ``"yes"`` | ``"FALSE"`` | ``"false"`` | ``"NO"`` | ``"no"`` Debug Parameter ================= ``DEBUG``: (Default: "FALSE") - Flag that determines whether to print out very detailed debugging messages. Note that if DEBUG is set to TRUE, then VERBOSE will also get reset to TRUE if it isn't already. Valid values: "TRUE" "true" "YES" "yes" "FALSE" "false" "NO" "no" + Flag that determines whether to print out very detailed debugging messages. Note that if DEBUG is set to TRUE, then VERBOSE will also be reset to TRUE if it isn't already. Valid values: ``"TRUE"`` | ``"true"`` | ``"YES"`` | ``"yes"`` | ``"FALSE"`` | ``"false"`` | ``"NO"`` | ``"no"`` .. _WFTasks: @@ -907,7 +884,7 @@ Set the names of the various Rocoto workflow tasks. These names usually do not n Workflow Task Parameters ======================== -For each workflow task, additional parameters set the values to pass to the job scheduler (e.g., Slurm) that will submit a job for each task to be run. Parameters include the number of nodes to use to run the job, the number of MPI processes per node, the maximum walltime to allow for the job to complete, and the maximum number of times to attempt to run each task. +For each workflow task, additional parameters determine the values to pass to the job scheduler (e.g., Slurm), which submits a job for each task. Parameters include the number of nodes to use for the job, the number of :term:`MPI` processes per node, the maximum walltime to allow for the job to complete, and the maximum number of times to attempt each task. **Number of nodes:** @@ -1043,16 +1020,16 @@ Baseline Workflow Tasks The directory containing pre-generated grid files when ``RUN_TASK_MAKE_GRID`` is set to "FALSE". ``RUN_TASK_MAKE_OROG``: (Default: "TRUE") - Same as ``RUN_TASK_MAKE_GRID`` but for the orography generation task (``MAKE_OROG_TN``). + Same as ``RUN_TASK_MAKE_GRID`` but for the orography generation task (``MAKE_OROG_TN``). If this is set to "TRUE", the orography generation task is run and new orography files are generated. If it is set to "FALSE", then the scripts look for pre-generated orography files in the directory specified by ``OROG_DIR`` (see below). ``OROG_DIR``: (Default: "/path/to/pregenerated/orog/files") - Same as ``GRID_DIR`` but for the orography generation task (``MAKE_OROG_TN``). + The directory containing pre-generated orography files when ``RUN_TASK_MAKE_OROG`` is set to "FALSE". ``RUN_TASK_MAKE_SFC_CLIMO``: (Default: "TRUE") - Same as ``RUN_TASK_MAKE_GRID`` but for the surface climatology generation task (``MAKE_SFC_CLIMO_TN``). + Same as ``RUN_TASK_MAKE_GRID`` but for the surface climatology generation task (``MAKE_SFC_CLIMO_TN``).
If this is set to "TRUE", the surface climatology generation task is run and new surface climatology files are generated. If it is set to "FALSE", then the scripts look for pre-generated surface climatology files in the directory specified by ``SFC_CLIMO_DIR`` (see below). ``SFC_CLIMO_DIR``: (Default: "/path/to/pregenerated/surface/climo/files") - Same as ``GRID_DIR`` but for the surface climatology generation task (``MAKE_SFC_CLIMO_TN``). + The directory containing pre-generated surface climatology files when ``MAKE_SFC_CLIMO_TN`` is set to "FALSE". ``RUN_TASK_GET_EXTRN_ICS``: (Default: "TRUE") Flag that determines whether to run the ``GET_EXTRN_ICS_TN`` task. @@ -1078,7 +1055,7 @@ Verification Tasks -------------------- ``RUN_TASK_GET_OBS_CCPA``: (Default: "FALSE") - Flag that determines whether to run the ``GET_OBS_CCPA_TN`` task, which retrieves the :term:`CCPA` hourly precipitation files used by METplus from NOAA HPSS. + Flag that determines whether to run the ``GET_OBS_CCPA_TN`` task, which retrieves the :term:`CCPA` hourly precipitation files used by METplus from NOAA :term:`HPSS`. ``RUN_TASK_GET_OBS_MRMS``: (Default: "FALSE") Flag that determines whether to run the ``GET_OBS_MRMS_TN`` task, which retrieves the :term:`MRMS` composite reflectivity files used by METplus from NOAA HPSS. @@ -1098,17 +1075,11 @@ Verification Tasks ``RUN_TASK_VX_ENSPOINT``: (Default: "FALSE") Flag that determines whether to run the ensemble point verification task. If this flag is set, both ensemble-stat point verification and point verification of ensemble-stat output is computed. -.. - COMMENT: Might be worth defining "ensemble-stat verification for gridded data," "ensemble point verification," "ensemble-stat point verification," and "point verification of ensemble-stat output" - Aerosol Climatology Parameter ================================ ``USE_MERRA_CLIMO``: (Default: "FALSE") - Flag that determines whether MERRA2 aerosol climatology data and lookup tables for optics properties are obtained. - -.. - COMMENT: When would it be appropriate to obtain these files? + Flag that determines whether :term:`MERRA2` aerosol climatology data and lookup tables for optics properties are obtained. Surface Climatology Parameter ============================= @@ -1123,7 +1094,7 @@ These parameters are associated with the fixed (i.e., static) files. On `Level 1 System directory in which the majority of fixed (i.e., time-independent) files that are needed to run the FV3-LAM model are located. ``FIXaer``: (Default: "") - System directory where MERRA2 aerosol climatology files are located. + System directory where :term:`MERRA2` aerosol climatology files are located. ``FIXlut``: (Default: "") System directory where the lookup tables for optics properties are located. @@ -1202,11 +1173,6 @@ These parameters are associated with the fixed (i.e., static) files. On `Level 1 This array is used to set some of the :term:`namelist` variables in the forecast model's namelist file. It maps file symlinks to the actual fixed file locations in the ``FIXam`` directory. The symlink names appear in the first column (to the left of the "|" symbol), and the paths to these files (in the ``FIXam`` directory) are held in workflow variables, which appear to the right of the "|" symbol. It is possible to remove ``FV3_NML_VARNAME_TO_FIXam_FILES_MAPPING`` as a workflow variable and make it only a local one since it is used in only one script. -.. - COMMENT: Why is #"FNZORC | $FNZORC" \ commented out in config_defaults.sh? 
- COMMENT: Is this an accurate rewording of the original? - - ``FV3_NML_VARNAME_TO_SFC_CLIMO_FIELD_MAPPING``: (Default: see below) .. code-block:: console "FNABSC | maximum_snow_albedo" ) This array is used to set some of the :term:`namelist` variables in the forecast model's namelist file. The variable names appear in the first column (to the left of the "|" symbol), and the paths to these surface climatology files on the native FV3-LAM grid (in the ``FIXLAM`` directory) are derived from the corresponding surface climatology fields (the second column of the array). ``CYCLEDIR_LINKS_TO_FIXam_FILES_MAPPING``: (Default: see below) .. code-block:: console "solarconstant_noaa_an.txt | global_solarconstant_noaa_an.txt" \ "global_o3prdlos.f77 | " ) - This array specifies the mapping to use between the symlinks that need to be created in each cycle directory (these are the "files" that FV3 looks for) and their targets in the ``FIXam`` directory. The first column of the array specifies the symlink to be created, and the second column specifies its target file in ``FIXam`` (where columns are delineated by the pipe symbol "|"). + This array specifies the mapping to use between the symlinks that need to be created in each cycle directory (these are the "files" that :term:`FV3` looks for) and their targets in the ``FIXam`` directory. The first column of the array specifies the symlink to be created, and the second column specifies its target file in ``FIXam`` (where columns are delineated by the pipe symbol "|"). Subhourly Forecast Parameters ================================= ``SUB_HOURLY_POST``: (Default: "FALSE") - Flag that indicates whether the forecast model will generate output files on a sub-hourly time interval (e.g., 10 minutes, 15 minutes). This will also cause the post-processor to process these sub-hourly files. If this variable is set to "TRUE", then ``DT_SUBHOURLY_POST_MNTS`` should be set to a value between "01" and "59". + Flag that indicates whether the forecast model will generate output files on a sub-hourly time interval (e.g., 10 minutes, 15 minutes). This will also cause the post-processor to process these sub-hourly files. If this variable is set to "TRUE", then ``DT_SUB_HOURLY_POST_MNTS`` should be set to a valid value between "01" and "59". ``DT_SUB_HOURLY_POST_MNTS``: (Default: "00") - Time interval in minutes between the forecast model output files. If ``SUB_HOURLY_POST`` is set to "TRUE", this needs to be set to a two-digit integer between "01" and "59". Note that if ``SUB_HOURLY_POST`` is set to "TRUE" but ``DT_SUB_HOURLY_POST_MNTS`` is set to "00", ``SUB_HOURLY_POST`` will get reset to "FALSE" in the experiment generation scripts (there will be an informational message in the log file to emphasize this). Valid values: "1" "01" "2" "02" "3" "03" "4" "04" "5" "05" "6" "06" "10" "12" "15" "20" "30". + Time interval in minutes between the forecast model output files. If ``SUB_HOURLY_POST`` is set to "TRUE", this needs to be set to a valid two-digit integer between "01" and "59". Note that if ``SUB_HOURLY_POST`` is set to "TRUE" but ``DT_SUB_HOURLY_POST_MNTS`` is set to "00", ``SUB_HOURLY_POST`` will get reset to "FALSE" in the experiment generation scripts (there will be an informational message in the log file to emphasize this). Valid values: ``"1"`` | ``"01"`` | ``"2"`` | ``"02"`` | ``"3"`` | ``"03"`` | ``"4"`` | ``"04"`` | ``"5"`` | ``"05"`` | ``"6"`` | ``"06"`` | ``"10"`` | ``"12"`` | ``"15"`` | ``"20"`` | ``"30"``
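For example, to have the forecast model write output files (and the post-processor process them) every 15 minutes:

.. code-block:: console

   SUB_HOURLY_POST="TRUE"
   DT_SUB_HOURLY_POST_MNTS="15"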
Customized Post Configuration Parameters ======================================== @@ -1270,7 +1233,7 @@ Customized Post Configuration Parameters Flag that determines whether a user-provided custom configuration file should be used for post-processing the model data. If this is set to "TRUE", then the workflow will use the custom post-processing (:term:`UPP`) configuration file specified in ``CUSTOM_POST_CONFIG_FP``. Otherwise, a default configuration file provided in the UPP repository will be used. ``CUSTOM_POST_CONFIG_FP``: (Default: "") - The full path to the custom post flat file, including filename, to be used for post-processing. This is only used if ``CUSTOM_POST_CONFIG_FILE`` is set to "TRUE". + The full path to the custom flat file, including filename, to be used for post-processing. This is only used if ``USE_CUSTOM_POST_CONFIG_FILE`` is set to "TRUE". Community Radiative Transfer Model (CRTM) Parameters @@ -1278,9 +1241,6 @@ These variables set parameters associated with outputting satellite fields in the :term:`UPP` :term:`grib2` files using the Community Radiative Transfer Model (:term:`CRTM`). :numref:`Section %s ` includes further instructions on how to do this. -.. - COMMENT: What actually happens here? Where are the satellite fields outputted to? When/why would this be used? What kind of satellites? - ``USE_CRTM``: (Default: "FALSE") Flag that defines whether external :term:`CRTM` coefficient files have been staged by the user in order to output synthetic satellite products available within the :term:`UPP`. If this is set to "TRUE", then the workflow will check for these files in the directory ``CRTM_DIR``. Otherwise, it is assumed that no satellite fields are being requested in the UPP configuration. @@ -1301,7 +1261,7 @@ Ensemble Model Parameters Halo Blend Parameter ==================== ``HALO_BLEND``: (Default: "10") - Number of cells to use for “blending” the external solution (obtained from the :term:`LBCs`) with the internal solution from the FV3LAM dycore. Specifically, it refers to the number of rows into the computational domain that should be blended with the LBCs. Cells at which blending occurs are all within the boundary of the native grid; they don’t involve the 4 cells outside the boundary where the LBCs are specified (which is a different :term:`halo`). Blending is necessary to smooth out waves generated due to mismatch between the external and internal solutions. To shut :term:`halo` blending off, set this to zero. + Number of cells to use for “blending” the external solution (obtained from the :term:`LBCs`) with the internal solution from the FV3LAM :term:`dycore`. Specifically, it refers to the number of rows into the computational domain that should be blended with the LBCs. Cells at which blending occurs are all within the boundary of the native grid; they don’t involve the 4 cells outside the boundary where the LBCs are specified (which is a different :term:`halo`).
Blending is necessary to smooth out waves generated due to mismatch between the external and internal solutions. To shut :term:`halo` blending off, set this to zero. FVCOM Parameter @@ -1310,7 +1270,7 @@ FVCOM Parameter Flag that specifies whether or not to update surface conditions in FV3-LAM with fields generated from the Finite Volume Community Ocean Model (:term:`FVCOM`). If set to "TRUE", lake/sea surface temperatures, ice surface temperatures, and ice placement will be overwritten using data provided by FVCOM. Setting ``USE_FVCOM`` to "TRUE" causes the executable ``process_FVCOM.exe`` in the ``MAKE_ICS_TN`` task to run. This, in turn, modifies the file ``sfc_data.nc`` generated by ``chgres_cube``. Note that the FVCOM data must already be interpolated to the desired FV3-LAM grid. ``FVCOM_WCSTART``: (Default: "cold") - Define if this is a "warm" start or a "cold" start. Setting this to "warm" will read in ``sfc_data.nc`` generated in a RESTART directory. Setting this to "cold" will read in the ``sfc_data.nc`` generated from ``chgres_cube`` in the ``make_ics`` portion of the workflow. Valid values: "cold" "warm" + Define if this is a "warm" start or a "cold" start. Setting this to "warm" will read in ``sfc_data.nc`` generated in a RESTART directory. Setting this to "cold" will read in the ``sfc_data.nc`` generated from ``chgres_cube`` in the ``make_ics`` portion of the workflow. Valid values: ``"cold"`` | ``"warm"`` ``FVCOM_DIR``: (Default: "/user/defined/dir/to/fvcom/data") User-defined directory where the ``fvcom.nc`` file containing :term:`FVCOM` data on the FV3-LAM native grid is located. The file name in this directory must be ``fvcom.nc``. @@ -1318,20 +1278,11 @@ FVCOM Parameter ``FVCOM_FILE``: (Default: "fvcom.nc") Name of file located in ``FVCOM_DIR`` that has :term:`FVCOM` data interpolated to the FV3-LAM grid. This file will be copied later to a new location and the name changed to ``fvcom.nc`` if a name other than ``fvcom.nc`` is selected. -Compiler Parameter -================== -``COMPILER``: (Default: "intel") - Type of compiler invoked during the build step. Currently, this must be set manually (i.e., it is not inherited from the build system in the ``ufs-srweather-app`` directory). Valid values: "intel" "gnu" - - Thread Affinity Interface =========================== .. note:: - Note that settings for the ``make_grid`` and ``make_orog`` tasks are not included below because they do not use parallelized code. - -.. - COMMENT: The note above is in config_defaults.sh comments, but make_orog does seem to be included below... should I remove it? + Note that settings for the ``make_grid`` and ``make_orog`` tasks are disabled or not included below because they do not use parallelized code. ``KMP_AFFINITY_*``: (Default: see below) @@ -1344,7 +1295,7 @@ Thread Affinity Interface KMP_AFFINITY_RUN_FCST="scatter" KMP_AFFINITY_RUN_POST="scatter" - Intel's runtime library can bind OpenMP threads to physical processing units. The interface is controlled using the KMP_AFFINITY environment variable. Thread affinity restricts execution of certain threads to a subset of the physical processing units in a multiprocessor computer. Depending on the system (machine) topology, application, and operating system, thread affinity can have a dramatic effect on the application speed and on the execution speed of a program." Valid values: "scatter" "disabled" "balanced" "compact" "explicit" "none" + "Intel's runtime library can bind OpenMP threads to physical processing units. 
The interface is controlled using the KMP_AFFINITY environment variable. Thread affinity restricts execution of certain threads to a subset of the physical processing units in a multiprocessor computer. Depending on the system (machine) topology, application, and operating system, thread affinity can have a dramatic effect on the application speed and on the execution speed of a program." Valid values: ``"scatter"`` | ``"disabled"`` | ``"balanced"`` | ``"compact"`` | ``"explicit"`` | ``"none"`` For more information, see the `Intel Development Reference Guide `__. @@ -1356,15 +1307,11 @@ Thread Affinity Interface OMP_NUM_THREADS_MAKE_SFC_CLIMO="1" OMP_NUM_THREADS_MAKE_ICS="1" OMP_NUM_THREADS_MAKE_LBCS="1" - OMP_NUM_THREADS_RUN_FCST="2" # atmos_nthreads in model_configure + OMP_NUM_THREADS_RUN_FCST="2" OMP_NUM_THREADS_RUN_POST="1" The number of OpenMP threads to use for parallel regions. -.. - COMMENT: What does the #atmos_nthreads comment mean? Can it be removed? - - ``OMP_STACKSIZE_*``: (Default: see below) .. code-block:: console diff --git a/docs/UsersGuide/source/ContributorsGuide.rst b/docs/UsersGuide/source/ContributorsGuide.rst index 7d5813582c..ffb0575e16 100644 --- a/docs/UsersGuide/source/ContributorsGuide.rst +++ b/docs/UsersGuide/source/ContributorsGuide.rst @@ -39,6 +39,12 @@ Scientists from across multiple labs and organizations have volunteered to revie | EPIC | Mark Potts (@mark-a-potts) | | | | | | Jong Kim (@jkbk2004) | + | | | + | | Natalie Perlin (@natalie-perlin) | + | | | + | | Gillian Petro (@gspetro-NOAA) | + | | | + | | Edward Snyder (@EdwardSnyder-NOAA) | +------------------+------------------------------------------------+ | GLERL/UM | David Wright (@dmwright526) | +------------------+------------------------------------------------+ @@ -57,8 +63,6 @@ Scientists from across multiple labs and organizations have volunteered to revie | NCAR | Mike Kavulich (@mkavulich) | | | | | | Will Mayfield (@willmayfield) | - | | | - | | Jamie Wolff (@jwolff-ncar) | +------------------+------------------------------------------------+ | NSSL | Yunheng Wang (@ywangwof) | +------------------+------------------------------------------------+ @@ -83,7 +87,7 @@ The steps below should be followed in order to make changes to the ``develop`` b #. **Development** - Perform and test changes in the branch. Document work in the issue and mention the issue number in commit messages to link your work to the issue (e.g., ``commit -m "Issue #23 - "``). Test code modifications on as many platforms as possible, and request help with further testing from the code management team when unable to test on all platforms. Document changes to the workflow and capabilities (either in the ``.rst`` files or separately) so that the SRW App documentation stays up-to-date. #. **Pull request** - When ready to merge changes back to the ``develop`` branch, the code developer should initiate a pull request (PR) of the feature branch into the ``develop`` branch. Read `here `__ about pull requests in GitHub. When a PR is initiated, the :ref:`PR Template