From 370bb6c5011bb42f68740ac7649694e59f77f8c3 Mon Sep 17 00:00:00 2001 From: gspetro Date: Fri, 11 Feb 2022 14:07:50 -0500 Subject: [PATCH 001/118] updated docs --- .gitignore | 1 + docs/UsersGuide/source/Components.rst | 217 ++++++++++++++++++ docs/UsersGuide/source/Glossary.rst | 24 ++ docs/UsersGuide/source/Include-HPCInstall.rst | 6 + docs/UsersGuide/source/Introduction.rst | 147 +++++------- docs/UsersGuide/source/Quickstart.rst | 213 ++++++++++------- docs/UsersGuide/source/conf.py | 4 +- docs/UsersGuide/source/index.rst | 7 +- 8 files changed, 442 insertions(+), 177 deletions(-) create mode 100644 .gitignore create mode 100644 docs/UsersGuide/source/Components.rst create mode 100644 docs/UsersGuide/source/Include-HPCInstall.rst diff --git a/.gitignore b/.gitignore new file mode 100644 index 0000000000..e43b0f9889 --- /dev/null +++ b/.gitignore @@ -0,0 +1 @@ +.DS_Store diff --git a/docs/UsersGuide/source/Components.rst b/docs/UsersGuide/source/Components.rst new file mode 100644 index 0000000000..26822fe341 --- /dev/null +++ b/docs/UsersGuide/source/Components.rst @@ -0,0 +1,217 @@ +.. _Components: + +=============== +SRW Components +=============== + +The SRW Application v2.0 release assembles a variety of components, including: +* Pre-processor Utilities & Initial Conditions +* Forecast Model +* Post-Processor +* Visualization Example +* Build System and Workflow +* User Support, Documentation, and Contributing Development + +These components are documented within this User's Guide and supported through a `community forum `_. + + +Pre-processor Utilities and Initial Conditions +============================================== + +The SRW Application includes a number of pre-processing utilities that initialize and prepare the +model. Tasks include generating a regional grid, along with :term:`orography` and surface climatology files for that grid. The pre-processing software converts the raw external model data into initial and lateral boundary condition files in netCDF format. Later, this is used as input to the atmospheric model (FV3-LAM). Additional information about the UFS pre-processor utilities can be found in the `UFS_UTILS User’s Guide `_. + +The SRW Application includes a number of pre-processing utilities to initialize and prepare the +model. For the limited area model (LAM), it is necessary to first generate a +regional grid ``regional_esg_grid/make_hgrid`` along with orography ``orog`` and surface climatology ``sfc_climo_gen`` files on that grid. There are additional utilities included to handle the correct number of halo ``shave`` points and topography filtering ``filter_topo``. The pre-processing software ``chgres_cube`` is used to convert the raw external model data into initial and lateral boundary condition files in netCDF format, needed as input to the FV3-LAM. Additional information about the UFS pre-processor utilities can be found in the `UFS_UTILS User’s Guide `_. + +.. + COMMENT: "Integration" with what?!?! (1st sentence) --> Try "prepare the model data" instead of "prepare the model for integration." + COMMENT: Why are we using code/commands in an overview doc?! A newbie is going to glaze over and give up. + +The SRW Application can be initialized from a range of operational initial condition files. 
It is +possible to initialize the model from the Global Forecast System (:term:`GFS`), North American Mesoscale (:term:`NAM`) Forecast System, Rapid Refresh (:term:`RAP`), and High-Resolution Rapid Refresh (:term:`HRRR`) files in Gridded Binary v2 (GRIB2) format and GFS in NEMSIO format for past dates. + +.. WARNING:: + Please note, for GFS data, dates prior to 1 January 2018 may work but are not guaranteed. Public archives of model data can be accessed through the `National Centers for Environmental Information `_ (NCEI) or through the `NOAA Operational Model Archive and Distribution System `_ (NOMADS). Raw external model data may be pre-staged on disk by the user. + + +Forecast Model +============== + +The prognostic atmospheric model in the UFS SRW Application is the Finite-Volume Cubed-Sphere +(:term:`FV3`) dynamical core configured with a Limited Area Model (LAM) capability :cite:`BlackEtAl2020`. +The dynamical core is the computational part of a model that solves the equations of fluid motion. A User’s Guide for the UFS :term:`Weather Model` is `here `_. + +Supported model resolutions in this release include a 3-, 13-, and 25-km predefined Contiguous +U.S. (CONUS) domain, all with 64 vertical levels. Preliminary tools for users to define their +own domain are also available in the release with full, formal support of these tools to be +provided in future releases. The Extended Schmidt Gnomonic (ESG) grid is used with the FV3-LAM, +which features relatively uniform grid cells across the entirety of the domain. Additional +information about the FV3 dynamical core can be found `here +`_ and on the `NOAA Geophysical +Fluid Dynamics Laboratory website `_. + +Interoperable atmospheric physics, along with the Noah Multi-parameterization (Noah MP) +Land Surface Model options, are supported through the Common Community Physics Package +(:term:`CCPP`; described `here `_). +Atmospheric physics are a set of numerical methods describing small-scale processes such +as clouds, turbulence, radiation, and their interactions. There are two physics options +supported for the release. The first is an experimental physics suite being tested for use +in the future operational implementation of the Rapid Refresh Forecast System (RRFS) planned +for 2023-2024, and the second is an updated version of the physics suite used in the operational +Global Forecast System (GFS) v15. A scientific description of the CCPP parameterizations and +suites can be found in the `CCPP Scientific Documentation `_, +and CCPP technical aspects are described in the `CCPP Technical Documentation +`_. The model namelist has many settings +beyond the physics options that can optimize various aspects of the model for use with each +of the supported suites. + +The SRW App supports the use of both GRIB2 and :term:`NEMSIO` input data. The UFS Weather Model +ingests initial and lateral boundary condition files produced by :term:`chgres_cube` and outputs files in +netCDF format on a specific projection (e.g., Lambert Conformal) in the horizontal and model +levels in the vertical. + +Post-processor +============== + +The SRW Application is distributed with the Unified Post Processor (:term:`UPP`) included in the +workflow as a way to convert the netCDF output on the native model grid to GRIB2 format on +standard isobaric vertical coordinates. UPP can also be used to compute a variety of useful +diagnostic fields, as described in the `UPP user’s guide `_. 
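+
+As an optional sanity check (this step is not part of the workflow itself), the records in a UPP GRIB2 output file can be listed with the ``wgrib2`` utility, assuming it is installed on the system. The file name below is only a placeholder; actual output file names depend on the experiment configuration:
+
+.. code-block:: console
+
+   # Print an inventory of the GRIB2 records (field, level, forecast hour) in a post-processed file
+   wgrib2 <path-to-UPP-output-file>.grib2 | head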
+ +Output from UPP can be used with visualization, plotting, and verification packages, or for +further downstream post-processing, e.g. statistical post-processing techniques. + +Visualization Example +===================== +A Python script is provided to create basic visualization of the model output. The script +is designed to output graphics in PNG format for 14 standard meteorological variables +when using the pre-defined CONUS domain. In addition, a difference plotting script is included +to visually compare two runs for the same domain and resolution. These scripts are provided only +as an example for users familiar with Python, and may be used to do a visual check to verify +that the application is producing reasonable results. + +The scripts are available in the `regional_workflow repository +`_ +under ush/Python. Usage information and instructions are described in +:numref:`Chapter %s ` and are also included at the top of the script. + +Build System and Workflow +========================= + +The SRW Application has a portable build system and a user-friendly, modular, and +expandable workflow framework. + +An umbrella CMake-based build system is used for building the components necessary +for running the end-to-end SRW Application: the UFS Weather Model and the pre- and +post-processing software. Additional libraries (:term:`NCEPLIBS-external` and :term:`NCEPLIBS`) necessary +for the application are not included in the SRW Application build system, but are available +pre-built on pre-configured platforms. There is a small set of system libraries and utilities +that are assumed to be present on the target computer: the CMake build software, a Fortran, +C, and C++ compiler, and MPI library. + +Once built, the provided experiment generator script can be used to create a Rocoto-based +workflow file that will run each task in the system (see `Rocoto documentation +`_) in the proper sequence. +If Rocoto and/or a batch system is not present on the available platform, the individual +components can be run in a stand-alone, command line fashion with provided run scripts. The +generated namelist for the atmospheric model can be modified in order to vary settings such +as forecast starting and ending dates, forecast length hours, the CCPP physics suite, +integration time step, history file output frequency, and more. It also allows for configuration +of other elements of the workflow; for example, whether to run some or all of the pre-processing, +forecast model, and post-processing steps. + +This SRW Application release has been tested on a variety of platforms widely used by +researchers, such as the NOAA Research and Development High-Performance Computing Systems +(RDHPCS), including Hera, Orion, and Jet; NOAA’s Weather and Climate Operational +Supercomputing System (WCOSS); the National Center for Atmospheric Research (NCAR) Cheyenne +system; NSSL’s HPC machine, Odin; the National Science Foundation Stampede2 system; and +generic Linux and macOS systems using Intel and GNU compilers. Four `levels of support +`_ +have been defined for the SRW Application, including pre-configured (level 1), configurable +(level 2), limited test platforms (level 3), and build only platforms (level 4). Each +level is further described below. + +For the selected computational platforms that have been pre-configured (level 1), all the +required libraries for building the SRW Application are available in a central place. 
That +means bundled libraries (NCEPLIBS) and third-party libraries (NCEPLIBS-external) have both +been built. The SRW Application is expected to build and run out of the box on these +pre-configured platforms and users can proceed directly to the using the workflow, as +described in the Quick Start (:numref:`Chapter %s `). + +A few additional computational platforms are considered configurable for the SRW +Application release. Configurable platforms (level 2) are platforms where all of +the required libraries for building the SRW Application are expected to install successfully, +but are not available in a central place. Applications and models are expected to build +and run once the required bundled libraries (NCEPLIBS) and third-party libraries (NCEPLIBS-external) +are built. + +Limited-Test (level 3) and Build-Only (level 4) computational platforms are those in which +the developers have built the code but little or no pre-release testing has been conducted, +respectively. A complete description of the levels of support, along with a list of preconfigured +and configurable platforms can be found in the `SRW Application wiki page +`_. + +User Support, Documentation, and Contributing Development +========================================================= + +A forum-based, online `support system `_ organized by topic +provides a centralized location for UFS users and developers to post questions and exchange +information. + +A list of available documentation is shown in :numref:`Table %s `. + +.. _list_of_documentation: + +.. table:: Centralized list of documentation + + +----------------------------+---------------------------------------------------------------------------------+ + | **Documentation** | **Location** | + +============================+=================================================================================+ + | UFS SRW Application v1.0 | https://ufs-srweather-app.readthedocs.io/en/ufs-v1.0.0 | + | User's Guide | | + +----------------------------+---------------------------------------------------------------------------------+ + | UFS_UTILS v2.0 User's | https://noaa-emcufs-utils.readthedocs.io/en/ufs-v2.0.0/ | + | Guide | | + +----------------------------+---------------------------------------------------------------------------------+ + | UFS Weather Model v2.0 | https://ufs-weather-model.readthedocs.io/en/ufs-v2.0.0 | + | User's Guide | | + +----------------------------+---------------------------------------------------------------------------------+ + | NCEPLIBS Documentation | https://github.com/NOAA-EMC/NCEPLIBS/wiki | + +----------------------------+---------------------------------------------------------------------------------+ + | NCEPLIBS-external | https://github.com/NOAA-EMC/NCEPLIBS-external/wiki | + | Documentation | | + +----------------------------+---------------------------------------------------------------------------------+ + | FV3 Documentation | https://noaa-emc.github.io/FV3_Dycore_ufs-v2.0.0/html/index.html | + +----------------------------+---------------------------------------------------------------------------------+ + | CCPP Scientific | https://dtcenter.ucar.edu/GMTB/v5.0.0/sci_doc/index.html | + | Documentation | | + +----------------------------+---------------------------------------------------------------------------------+ + | CCPP Technical | https://ccpp-techdoc.readthedocs.io/en/v5.0.0/ | + | Documentation | | + 
+----------------------------+---------------------------------------------------------------------------------+ + | ESMF manual | http://earthsystemmodeling.org/docs/release/ESMF_8_0_0/ESMF_usrdoc/ | + +----------------------------+---------------------------------------------------------------------------------+ + | Unified Post Processor | https://upp.readthedocs.io/en/upp-v9.0.0/ | + +----------------------------+---------------------------------------------------------------------------------+ + +The UFS community is encouraged to contribute to the development effort of all related +utilities, model code, and infrastructure. Issues can be posted in SRW-related GitHub repositories to report bugs or to announce upcoming contributions to the code base. For code to be accepted in the authoritative repositories, users must follow the code management rules of each component (described in the User’s Guides listed in :numref:`Table %s `. + +Future Direction +================ + +Users can expect to see incremental improvements and additional capabilities in upcoming +releases of the SRW Application to enhance research opportunities and support operational +forecast implementations. Planned advancements include: + +* A more extensive set of supported developmental physics suites. +* A larger number of pre-defined domains/resolutions and a fully supported capability to create a user-defined domain. +* Inclusion of data assimilation, cycling, and ensemble capabilities. +* A verification package (i.e., METplus) integrated into the workflow. +* Inclusion of stochastic perturbation techniques. + +In addition to the above list, other improvements will be addressed in future releases. + +.. bibliography:: references.bib diff --git a/docs/UsersGuide/source/Glossary.rst b/docs/UsersGuide/source/Glossary.rst index 622814368a..d3deb40672 100644 --- a/docs/UsersGuide/source/Glossary.rst +++ b/docs/UsersGuide/source/Glossary.rst @@ -11,19 +11,34 @@ Glossary parameterizations and suites of parameterizations for use in Numerical Weather Prediction (NWP) along with a framework that connects the physics to the host forecast model. + CONUS + Continental United States + chgres_cube The preprocessing software used to create initial and boundary condition files to “coldstart” the forecast model. + HRRR + `High Resolution Rapid Refresh `. The HRRR is a NOAA real-time 3-km resolution, hourly updated, cloud-resolving, convection-allowing atmospheric model, initialized by 3km grids with 3km radar assimilation. Radar data is assimilated in the HRRR every 15 min over a 1-h period adding further detail to that provided by the hourly data assimilation from the 13km radar-enhanced Rapid Refresh. + + .. + COMMENT: Clarify HRRR definition! + FV3 The Finite-Volume Cubed-Sphere dynamical core (dycore). Developed at NOAA's Geophysical Fluid Dynamics Laboratory (GFDL), it is a scalable and flexible dycore capable of both hydrostatic and non-hydrostatic atmospheric simulations. It is the dycore used in the UFS Weather Model. + GFS + `Global Forecast System `_. The GFS is a National Centers for Environmental Prediction (NCEP) weather forecast model that generates data for dozens of atmospheric and land-soil variables, including temperatures, winds, precipitation, soil moisture, and atmospheric ozone concentration. The system couples four separate models (atmosphere, ocean model, land/soil model, and sea ice) that work together to accurately depict weather conditions. 
+ GRIB2 The second version of the World Meterological Organization's (WMO) standard for distributing gridded data. + NAM + `North American Mesoscale Forecast System `_. NAM generates multiple grids (or domains) of weather forecasts over the North American continent at various horizontal resolutions. Each grid contains data for dozens of weather parameters, including temperature, precipitation, lightning, and turbulent kinetic energy. NAM uses additional numerical weather models to generate high-resolution forecasts over fixed regions, and occasionally to follow significant weather events like hurricanes. + NCEP National Centers for Environmental Prediction, an arm of the National Weather Service, consisting of nine centers. More information can be found at https://www.ncep.noaa.gov. @@ -47,6 +62,12 @@ Glossary NEMSIO A binary format for atmospheric model output from :term:`NCEP`'s Global Forecast System (GFS). + Orography + The branch of physical geography dealing with mountains + + RAP + `Rapid Refresh `. The continental-scale NOAA hourly-updated assimilation/modeling system operational at NCEP. RAP covers North America and is comprised primarily of a numerical forecast model and an analysis/assimilation system to initialize that model. RAP is complemented by the higher-resolution 3km High-Resolution Rapid Refresh (HRRR) model. + UFS The Unified Forecast System is a community-based, coupled comprehensive Earth modeling system consisting of several applications (apps). These apps span regional to global @@ -64,6 +85,9 @@ Glossary The Unified Post Processor is software developed at :term:`NCEP` and used operationally to post-process raw output from a variety of :term:`NCEP`'s NWP models, including the FV3. + Weather Enterprise + Individuals and organizations from public, private, and academic sectors that contribute to the research, development, and production of weather forecast products; primary consumers of these weather forecast products. + Weather Model A prognostic model that can be used for short- and medium-range research and operational forecasts. It can be an atmosphere-only model or an atmospheric diff --git a/docs/UsersGuide/source/Include-HPCInstall.rst b/docs/UsersGuide/source/Include-HPCInstall.rst new file mode 100644 index 0000000000..4fa2a9989b --- /dev/null +++ b/docs/UsersGuide/source/Include-HPCInstall.rst @@ -0,0 +1,6 @@ +.. _InstallHPCstack: + +====================== +Install the HPC-Stack +====================== +.. include:: ../../../docs/source/hpc-install.rst \ No newline at end of file diff --git a/docs/UsersGuide/source/Introduction.rst b/docs/UsersGuide/source/Introduction.rst index ac1bedb1b0..363bb745c5 100644 --- a/docs/UsersGuide/source/Introduction.rst +++ b/docs/UsersGuide/source/Introduction.rst @@ -4,109 +4,69 @@ Introduction ============ -The Unified Forecast System (:term:`UFS`) is a community-based, coupled, comprehensive Earth modeling system. -It is designed to be the source system for NOAA’s operational numerical weather prediction applications -while enabling research, development, and contribution opportunities for the broader weather enterprise. -For more information about the UFS, visit the UFS Portal at https://ufscommunity.org/. - -The UFS can be configured for multiple applications (see a complete list at -https://ufscommunity.org/science/aboutapps/). 
The configuration described here is the UFS Short-Range -Weather (SRW) Application, which targets predictions of atmospheric behavior on a limited spatial domain -and on time scales from less than an hour out to several days. The SRW Application v1.0 release includes a -prognostic atmospheric model, pre- and post-processing, and a community workflow for running the system -end-to-end, which are documented within the User's Guide and supported through a community forum. -Future work will include expanding the capabilities of the application to include data assimilation -(DA) and a verification package (e.g. METplus) as part of the workflow. This documentation provides an -overview of the release components, a description of the supported capabilities, a quick start guide -for running the application, and information on where to find more information and obtain support. +The Unified Forecast System (:term:`UFS`) is a community-based, coupled, comprehensive Earth modeling system. The UFS is the source system for NOAA’s operational numerical weather prediction applications. It enables research, development, and contribution opportunities within the broader :term:`weather enterprise` (e.g. government, industry, and academia). For more information about the UFS, visit the UFS Portal at https://ufscommunity.org/. + +The UFS can be configured for multiple applications (see a complete list at https://ufscommunity.org/science/aboutapps/). The configuration described here is the UFS Short-Range Weather (SRW) Application, which targets predictions of atmospheric behavior on a limited spatial domain and on time scales from minutes out to several days. The SRW Application v1.0 release includes a prognostic atmospheric model, pre- and post-processing, and a community workflow for running the system end-to-end. These components are documented within this User's Guide and supported through a `community forum `_. Future work will expand the capabilities of the application to include data assimilation (DA) and a verification package (e.g. METplus). This documentation provides a quick start guide for running the application, in addition to an overview of the release components, a description of the supported capabilities, and information on where to find more information and obtain support. The SRW App v1.0.0 citation is as follows and should be used when presenting results based on research -conducted with the App. +conducted with the App: + +UFS Development Team. (2021, March 4). Unified Forecast System (UFS) Short-Range Weather (SRW) Application (Version v1.0.0). Zenodo. https://doi.org/10.5281/zenodo.4534994 -UFS Development Team. (2021, March 4). Unified Forecast System (UFS) Short-Range Weather (SRW) Application -(Version v1.0.0). Zenodo. https://doi.org/10.5281/zenodo.4534994 +.. + COMMENT: What will be the numbering for this release? It will need to be changed above and throughout the docs. + COMMENT: Can this app be used beyond the CONUS? Change "limited spatial domain" above to CONUS or something more specific. + COMMENT: Where are we on the DA and verification package? Can we update that line? + COMMENT: Update citation date & version number. What is Zenodo? + COMMENT: Disagree: "This documentation provides... information on where to find more information and obtain support." We need to add this (i.e. link in the docs) or remove the line. + COMMENT: "Components" doc created but not added to TOC. Contains more detailed info on components discussed here below. 
Pre-processor Utilities and Initial Conditions ============================================== -The SRW Application includes a number of pre-processing utilities to initialize and prepare the -model for integration. For the limited area model (LAM), it is necessary to first generate a -regional grid ``regional_esg_grid/make_hgrid`` along with orography ``orog`` and surface climatology ``sfc_climo_gen`` -files on that grid. There are additional utilities included to handle the correct number of halo ``shave`` -points and topography filtering ``filter_topo``. The pre-processing software ``chgres_cube`` -is used to convert the raw external model data into initial and lateral boundary condition files in netCDF -format, needed as input to the FV3-LAM. Additional information about the UFS pre-processor utilities can -be found in the `UFS_UTILS User’s Guide `_. +The SRW Application includes a number of pre-processing utilities that initialize and prepare the +model. Tasks include generating a regional grid, along with :term:`orography` and surface climatology files for that grid. The pre-processing software converts the raw external model data into initial and lateral boundary condition files in netCDF format. Later, this is used as input to the atmospheric model (FV3-LAM). Additional information about the UFS pre-processor utilities can be found in the `UFS_UTILS User’s Guide `_. -The SRW Application can be initialized from a range of operational initial condition files. It is -possible to initialize the model from GFS, NAM, RAP, and HRRR files in Gridded Binary v2 (GRIB2) -format and GFS in NEMSIO format for past dates. Please note, for GFS data, dates prior to 1 January 2018 may work but are -not guaranteed. Public archives of model data can be accessed through the `National Centers for -Environmental Information `_ -(NCEI) or through the `NOAA Operational Model Archive and Distribution System `_ -(NOMADS). Raw external model data may be pre-staged on disk by the user. + +.. + COMMENT: "Integration" with what?!?! (1st sentence) --> Try "prepare the model data" instead of "prepare the model for integration." + COMMENT: Deleted code/commands bc it's an introduction. + COMMENT: What is a "halo shave point" or wide-halo grid?! Forecast Model ============== +Atmospheric Model +-------------------- + The prognostic atmospheric model in the UFS SRW Application is the Finite-Volume Cubed-Sphere (:term:`FV3`) dynamical core configured with a Limited Area Model (LAM) capability :cite:`BlackEtAl2020`. -The dynamical core is the computational part of a model that solves the equations of fluid motion. A User’s -Guide for the UFS :term:`Weather Model` is `here `_. - -Supported model resolutions in this release include a 3-, 13-, and 25-km predefined Contiguous -U.S. (CONUS) domain, all with 64 vertical levels. Preliminary tools for users to define their -own domain are also available in the release with full, formal support of these tools to be -provided in future releases. The Extended Schmidt Gnomonic (ESG) grid is used with the FV3-LAM, -which features relatively uniform grid cells across the entirety of the domain. Additional -information about the FV3 dynamical core can be found `here -`_ and on the `NOAA Geophysical -Fluid Dynamics Laboratory website `_. - -Interoperable atmospheric physics, along with the Noah Multi-parameterization (Noah MP) -Land Surface Model options, are supported through the Common Community Physics Package -(:term:`CCPP`; described `here `_). 
-Atmospheric physics are a set of numerical methods describing small-scale processes such
-as clouds, turbulence, radiation, and their interactions. There are two physics options
-supported for the release. The first is an experimental physics suite being tested for use
-in the future operational implementation of the Rapid Refresh Forecast System (RRFS) planned
-for 2023-2024, and the second is an updated version of the physics suite used in the operational
-Global Forecast System (GFS) v15. A scientific description of the CCPP parameterizations and
-suites can be found in the `CCPP Scientific Documentation `_,
-and CCPP technical aspects are described in the `CCPP Technical Documentation
-`_. The model namelist has many settings
-beyond the physics options that can optimize various aspects of the model for use with each
-of the supported suites.
-
-The SRW App supports the use of both GRIB2 and :term:`NEMSIO` input data. The UFS Weather Model
-ingests initial and lateral boundary condition files produced by :term:`chgres_cube` and outputs files in
-netCDF format on a specific projection (e.g., Lambert Conformal) in the horizontal and model
-levels in the vertical.
+The dynamical core is the computational part of a model that solves the equations of fluid motion. A User’s Guide for the UFS :term:`Weather Model` can be found `here `_.
+
+Common Community Physics Package
+---------------------------------
+
+The `Common Community Physics Package `_ (:term:`CCPP`) supports interoperable atmospheric physics and Noah Multi-parameterization (Noah MP) Land Surface Model options. Atmospheric physics are a set of numerical methods describing small-scale processes such as clouds, turbulence, radiation, and their interactions. The SRW release includes an experimental physics version and an updated operational version.
+
+Data Format
+--------------
+
+The SRW App supports the use of both :term:`GRIB2` and :term:`NEMSIO` input data. The UFS Weather Model
+ingests initial and lateral boundary condition files produced by :term:`chgres_cube`.
+
 
 Post-processor
 ==============
-The SRW Application is distributed with the Unified Post Processor (:term:`UPP`) included in the
-workflow as a way to convert the netCDF output on the native model grid to GRIB2 format on
-standard isobaric vertical coordinates. UPP can also be used to compute a variety of useful
-diagnostic fields, as described in the `UPP user’s guide `_.
+The Unified Post Processor (:term:`UPP`) is included in the SRW Application workflow. The UPP is designed to generate useful products from raw model output. In the SRW App, it converts model output from netCDF format on the native model grid to GRIB2 format. The UPP can also be used to compute a variety of useful diagnostic fields, as described in the `UPP user’s guide `_. Output from UPP can be used with visualization, plotting, and verification packages, or for further downstream post-processing, e.g., statistical post-processing techniques.
 
-Output from UPP can be used with visualization, plotting, and verification packages, or for
-further downstream post-processing, e.g. statistical post-processing techniques.
 
 Visualization Example
 =====================
-A Python script is provided to create basic visualization of the model output. The script
-is designed to output graphics in PNG format for 14 standard meteorological variables
-when using the pre-defined CONUS domain. In addition, a difference plotting script is included
-to visually compare two runs for the same domain and resolution. 
These scripts are provided only -as an example for users familiar with Python, and may be used to do a visual check to verify -that the application is producing reasonable results. - -The scripts are available in the `regional_workflow repository -`_ -under ush/Python. Usage information and instructions are described in +This SRW Application distribution provides Python scripts to create basic visualization of the model output. The scripts are available in the ```regional_workflow`` repository +`_ +under ``ush/Python``. Usage information and instructions are described in :numref:`Chapter %s ` and are also included at the top of the script. Build System and Workflow @@ -115,6 +75,9 @@ Build System and Workflow The SRW Application has a portable build system and a user-friendly, modular, and expandable workflow framework. +.. + COMMENT: Define build system and workflow... + An umbrella CMake-based build system is used for building the components necessary for running the end-to-end SRW Application: the UFS Weather Model and the pre- and post-processing software. Additional libraries (:term:`NCEPLIBS-external` and :term:`NCEPLIBS`) necessary @@ -138,12 +101,7 @@ This SRW Application release has been tested on a variety of platforms widely us researchers, such as the NOAA Research and Development High-Performance Computing Systems (RDHPCS), including Hera, Orion, and Jet; NOAA’s Weather and Climate Operational Supercomputing System (WCOSS); the National Center for Atmospheric Research (NCAR) Cheyenne -system; NSSL’s HPC machine, Odin; the National Science Foundation Stampede2 system; and -generic Linux and macOS systems using Intel and GNU compilers. Four `levels of support -`_ -have been defined for the SRW Application, including pre-configured (level 1), configurable -(level 2), limited test platforms (level 3), and build only platforms (level 4). Each -level is further described below. +system; the National Severe Storms Laboratory (NSSL) HPC machine, called Odin; the National Science Foundation (NSF) Stampede2 system; and generic Linux and macOS systems using Intel and GNU compilers. Four `levels of support `_have been defined for the SRW Application, including pre-configured (level 1), configurable (level 2), limited test platforms (level 3), and build only platforms (level 4). Each level is further described below. For the selected computational platforms that have been pre-configured (level 1), all the required libraries for building the SRW Application are available in a central place. That @@ -168,10 +126,9 @@ and configurable platforms can be found in the `SRW Application wiki page User Support, Documentation, and Contributing Development ========================================================= -A forum-based, online `support system `_ with topical sections +A forum-based, online `support system `_ organized by topic provides a centralized location for UFS users and developers to post questions and exchange -information. The forum complements the formal, written documentation, summarized here for ease of -use. +information. A list of available documentation is shown in :numref:`Table %s `. @@ -210,11 +167,7 @@ A list of available documentation is shown in :numref:`Table %s ` need to be followed. +utilities, model code, and infrastructure. Issues can be posted in SRW-related GitHub repositories to report bugs or to announce upcoming contributions to the code base. 
For code to be accepted in the authoritative repositories, users must follow the code management rules of each component (described in the User’s Guides listed in :numref:`Table %s `. Future Direction ================ @@ -231,6 +184,8 @@ forecast implementations. Planned advancements include: In addition to the above list, other improvements will be addressed in future releases. +.. + COMMENT: How is a domain different from a grid? Can we say "user-defined grid," for example? Might be clearer. How to Use This Document ======================== @@ -251,3 +206,7 @@ UFS forum at https://forums.ufscommunity.org/. in scripts, names of files and directories. .. bibliography:: references.bib + + + + diff --git a/docs/UsersGuide/source/Quickstart.rst b/docs/UsersGuide/source/Quickstart.rst index 662d4c92b8..4bcc217076 100644 --- a/docs/UsersGuide/source/Quickstart.rst +++ b/docs/UsersGuide/source/Quickstart.rst @@ -3,73 +3,137 @@ ==================== Workflow Quick Start ==================== -To build and run the out-of-the-box case of the UFS Short-Range Weather (SRW) Application the user -must get the source code for multiple components, including: the regional workflow, the UFS_UTILS -pre-processor utilities, the UFS Weather Model, and the Unified Post Processor (UPP). Once the UFS -SRW Application umbrella repository is cloned, obtaining the necessary external repositories is -simplified by the use of ``manage_externals``. The out-of-the-box case uses a predefined 25-km -CONUS grid (RRFS_CONUS_25km), the GFS version 15.2 physics suite (FV3_GFS_v15p2 CCPP), and -FV3-based GFS raw external model data for initialization. + + + +This Workflow Quick Start Guide will help users to build and run the "out-of-the-box" case for the Unified Forecast System (UFS) Short-Range Weather (SRW) Application. The "out-of-the-box" case builds a weather forecast for June 15-16, 2019. Multiple convective weather events during these two days produced over 200 filtered storm reports. Severe weather was clustered in two areas: the Upper Midwest through the Ohio Valley and the Southern Great Plains. This forecast uses a predefined 25-km Continental United States (CONUS) grid (RRFS_CONUS_25km), the Global Forecast System (GFS) version 15.2 physics suite (FV3_GFS_v15p2 CCPP), and FV3-based GFS raw external model data for initialization. .. note:: - The steps described in this chapter are applicable to preconfigured (Level 1) machines where - all of the required libraries for building community releases of UFS models and applications - are available in a central place (i.e. the bundled libraries (NCEPLIBS) and third-party - libraries (NCEPLIBS-external) have both been built). The Level 1 platforms are listed `here - `_. - For more information on compiling NCEPLIBS-external and NCEPLIBS, please refer to the - NCEPLIBS-external `wiki `_. + The steps described in this chapter are most applicable to preconfigured (Level 1) systems. On Level 1 systems, all of the required libraries for building community releases of UFS models and applications are available in a central location. This guide can serve as a starting point for running the SRW App on other systems as well but may require additional troubleshooting by the user. The various platform levels are listed `here `_. + + +.. _HPCstackInfo: + +Install the HPC-Stack +======================== + +.. note:: + Skip the HPC-stack installation if working on a `Level 1 system `_ (e.g., Cheyenne, Hera, Orion). +.. 
include:: ../../../hpc-stack/docs/source/hpc-intro.rst + + +After completing installation, continue to the :ref:`next section <_DownloadCode>`. + + +.. _DownloadCode: + Download the UFS SRW Application Code ===================================== -The necessary source code is publicly available on GitHub. To clone the release branch of the repository: +The SRW Application source code is publicly available on GitHub and can be run in a container or locally, depending on user preference. The SRW Application relies on a variety of components detailed in the :ref:`Components Chapter ` of this User's Guide. Users must (1) clone the UFS SRW Application umbrella repository and then (2) run the ``checkout_externals`` script to link the necessary external repositories to the SRW App. The ``checkout_externals`` script uses the configuration file ``Externals.cfg`` in the top level directory of the SRW App and will clone the correct version of the regional workflow, pre-processing utilities, UFS Weather Model, and UPP source code into the appropriate directories under your ``regional_workflow`` and ``src`` directories. + +Run the UFS SRW in a Singularity Container +------------------------------------------- + +Pull the Singularity container: .. code-block:: console - git clone -b ufs-v1.0.0 https://github.com/ufs-community/ufs-srweather-app.git - cd ufs-srweather-app + singularity pull ubuntu20.04-hpc-stack-0.1.sif docker://noaaepic/ubuntu20.04-hpc-stack:0.1 + +Build the container and make a ``home`` directory inside it: + +.. code-block:: console + + singularity build --sandbox ubuntu20.04-hpc-stack-0.1 ubuntu20.04-hpc-stack-0.1.sif + cd ubuntu20.04-hpc-stack-0.1 + mkdir home + +Start the container and run an interactive shell within it. This command also binds the local home directory to the container so that data can be shared between them. + +.. code-block:: console + + singularity shell -e --writable --bind /home:/home ubuntu20.04-hpc-stack + +Clone the develop branch of the UFS-SRW weather application repository: + +.. code-block:: console + + git clone https://github.com/jkbk2004/ufs-srweather-app -Then, check out the submodules for the SRW application: +Check out submodules for the SRW Application: .. code-block:: console + cd ufs-srweather-app ./manage_externals/checkout_externals -The ``checkout_externals`` script uses the configuration file ``Externals.cfg`` in the top level directory -and will clone the regional workflow, pre-processing utilities, UFS Weather Model, and UPP source code -into the appropriate directories under your ``regional_workflow`` and ``src`` directories. + +Run the UFS SRW Without a Container +------------------------------------ + +Clone the release branch of the repository: + +.. code-block:: console + + git clone -b ufs-v1.0.0 https://github.com/ufs-community/ufs-srweather-app.git + +Then, check out the submodules for the SRW Application: + +.. code-block:: console + + cd ufs-srweather-app + ./manage_externals/checkout_externals .. _SetUpBuild: Set up the Build Environment ============================ -Instructions for loading the proper modules and/or setting the correct environment variables can be -found in the ``env/`` directory in files named ``build__.env``. -The commands in these files can be directly copy-pasted to the command line or the file can be sourced. 
-You may need to modify certain variables such as the path to NCEP libraries for your individual platform, -or use ``setenv`` rather than ``export`` depending on your environment: + +Container Approach +-------------------- +If the SRW Application has been built in an EPIC-provided Singularity container, set build environments and modules within the `ufs-srweather-app` directory as follows: .. code-block:: console - $ ls -l env/ - -rw-rw-r-- 1 user ral 1062 Apr 27 10:09 build_cheyenne_gnu.env - -rw-rw-r-- 1 user ral 1061 Apr 27 10:09 build_cheyenne_intel.env - -rw-rw-r-- 1 user ral 1023 Apr 27 10:09 build_hera_intel.env - -rw-rw-r-- 1 user ral 1017 Apr 27 10:09 build_jet_intel.env + ln -s /usr/bin/python3 /usr/bin/python + source /usr/share/lmod/6.6/init/profile + module use /opt/hpc-modules/modulefiles/stack + module load hpc hpc-gnu hpc-openmpi hpc-python + module load netcdf hdf5 bacio sfcio sigio nemsio w3emc esmf fms crtm g2 png zlib g2tmpl ip sp w3nco cmake gfsio wgrib2 upp + + +On Other Systems (Non-Container Approach) +------------------------------------------ + +Otherwise, for Level 1 and 2 systems, scripts for loading the proper modules and/or setting the +correct environment variables can be found in the ``env/`` directory of the SRW App in files named +``build__.env``. The commands in these files can be directly copy-pasted +to the command line, or the file can be sourced from the ufs-srweather-app ``env/`` directory. +For example, on Hera, run ``source env/build_hera_intel.env`` from the main ufs-srweather-app +directory to source the appropriate file. + +On Level 3-4 systems, users will need to modify certain environment variables, such as the path to NCEP libraries, so that the SRW App can find and load the appropriate modules. For systems with Lmod installed, one of the current ``build__.env`` files can be copied and used as a template. On systems without Lmod, this process will typically involve commands in the form `export =`. You may need to use ``setenv`` rather than ``export`` depending on your environment. + +Troubleshooting +------------------ +* If the system cannot find a module (i.e., a "module unknown" message appears), check whether the module version numbers match in ``ufs-srweather-app/env/build__.env`` and the ``hpc-stack/stack/stack_custom.yaml``. + Build the Executables ===================== -Build the executables as follows: + +Create a directory to hold the build's executables: .. code-block:: console mkdir build cd build -Run ``cmake`` to set up the ``Makefile``, then run ``make``: +From the build directory, run the ``cmake`` command below to set up the ``Makefile``, then run the ``make`` command to build the executables: .. code-block:: console @@ -78,53 +142,52 @@ Run ``cmake`` to set up the ``Makefile``, then run ``make``: Output from the build will be in the ``ufs-srweather-app/build/build.out`` file. When the build completes, you should see the forecast model executable ``NEMS.exe`` and eleven -pre- and post-processing executables in the ``ufs-srweather-app/bin`` directory which are +pre- and post-processing executables in the ``ufs-srweather-app/bin`` directory. These executables are described in :numref:`Table %s `. -Generate the Workflow Experiment -================================ -Generating the workflow experiment requires three steps: +Download and Stage the Data +============================ + +The SRW requires input files to run. These include static datasets, initial and boundary conditions +files, and model configuration files. 
On Level 1 and 2 systems, the data required to run SRW tests are +already available. For Level 3 and 4 systems, the data must be added. Detailed instructions on how to add the data can be found in the :doc:`Input and Output Files `, Section 3. Section 1 contains useful background information on the input files required by the SRW. + -* Set experiment parameters in config.sh +Generate the Forecast Experiment +================================= +Generating the forecast experiment requires three steps: + +* Set experiment parameters * Set Python and other environment parameters -* Run the ``generate_FV3LAM_wflow.sh`` script +* Run the ``generate_FV3LAM_wflow.sh`` script to generate the experiment workflow The first two steps depend on the platform being used and are described here for each Level 1 platform. +Users will need to adjust the instructions to their machine if they are working on a Level 2-4 platform. .. _SetUpConfigFile: -Set up ``config.sh`` file +Set Experiment Parameters ------------------------- -The workflow requires a file called ``config.sh`` to specify the values of your experiment parameters. -Two example templates are provided: ``config.community.sh`` and ``config.nco.sh`` and can be found in -the ``ufs-srweather-app/regional_workflow/ush directory``. The first file is a minimal example for -creating and running an experiment in the *community* mode (with ``RUN_ENVIR`` set to ``community``), -while the second is an example of creating and running an experiment in the *NCO* (operational) mode -(with ``RUN_ENVIR`` set to ``nco``). The *community* mode is recommended in most cases and will be -fully supported for this release while the operational mode will be more exclusively used by NOAA/NCEP -Central Operations (NCO) and those in the NOAA/NCEP/Environmental Modeling Center (EMC) working with -NCO on pre-implementation testing. Sample config.sh files are discussed in this section for Level 1 platforms. - -Make a copy of ``config.community.sh`` to get started (under /path-to-ufs-srweather-app/regional_workflow/ush): +Each experiment requires certain basic information to run (e.g., date, grid, physics suite). This information is specified in the ``config.sh`` file. Two example ``config.sh`` templates are provided: ``config.community.sh`` and ``config.nco.sh``. They can be found in the ``ufs-srweather-app/regional_workflow/ush`` directory. The first file is a minimal example for creating and running an experiment in the *community* mode (with ``RUN_ENVIR`` set to ``community``). The second is an example for creating and running an experiment in the *NCO* (operational) mode (with ``RUN_ENVIR`` set to ``nco``). The *community* mode is recommended in most cases and will be fully supported for this release. + +Make a copy of ``config.community.sh`` to get started (under /regional_workflow/ush): .. code-block:: console cd ../regional_workflow/ush cp config.community.sh config.sh -Edit the ``config.sh`` file to set the machine you are running on to ``MACHINE``, use an account you can charge for -``ACCOUNT``, and set the name of the experiment with ``EXPT_SUBDIR``. If you have access to the NOAA HPSS from the -machine you are running on, those changes should be sufficient; however, if that is not the case (for example, -on Cheyenne), or if you have pre-staged the initialization data you would like to use, you will also want to set -``USE_USER_STAGED_EXTRN_FILES="TRUE"`` and set the paths to the data for ``EXTRN_MDL_SOURCE_BASEDIR_ICS`` and -``EXTRN_MDL_SOURCE_BASEDIR_LBCS``. 
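+
+For orientation, ``config.sh`` is a plain shell-variable file. The entries that most users edit (each is described below) look similar to the following sketch; every value shown here is only a placeholder and must be replaced with settings appropriate for your machine and experiment:
+
+.. code-block:: console
+
+   MACHINE="hera"
+   ACCOUNT="my_account"
+   EXPT_SUBDIR="my_expt_name"
+   USE_USER_STAGED_EXTRN_FILES="TRUE"
+   EXTRN_MDL_SOURCE_BASEDIR_ICS="/path/to/staged/ICS"
+   EXTRN_MDL_SOURCE_BASEDIR_LBCS="/path/to/staged/LBCS"
+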
+The default settings in this file include a predefined 25-km CONUS grid (RRFS_CONUS_25km), the GFS v15.2 physics suite (FV3_GFS_v15p2 CCPP), and FV3-based GFS raw external model data for initialization. + +Next, edit the new ``config.sh`` file to customize it for your machine. At a minimum, change the ``MACHINE`` and ``ACCOUNT`` variables; then choose a name for the experiment directory by setting ``EXPT_SUBDIR``. If you have pre-staged the initialization data for the experiment, set ``USE_USER_STAGED_EXTRN_FILES="TRUE"``, and set the paths to the data for ``EXTRN_MDL_SOURCE_BASEDIR_ICS`` and ``EXTRN_MDL_SOURCE_BASEDIR_LBCS``. + +Sample settings are indicated below for Level 1 platforms. Detailed guidance applicable to all systems can be found in :doc:`Configuring the Workflow `, which discusses each variable and the options available. Additionally, information about the three predefined Limited Area Model (LAM) Grid options can be found in the section on :doc:`Limited Area Model (LAM) Grids `. .. note:: - If you set up the build environment with the GNU compiler in :numref:`Section %s `, you will - have to add the line ``COMPILER="gnu"`` to the ``config.sh`` file. + If you set up the build environment with the GNU compiler in :numref:`Section %s `, you will have to add the line ``COMPILER="gnu"`` to the ``config.sh`` file. -At a minimum, the following parameters should be set for the machine you are using: +Minimum parameter settings for Level 1 machines: For Cheyenne: @@ -169,8 +232,7 @@ For Gaea: ACCOUNT="my_account" EXPT_SUBDIR="my_expt_name" -For WCOSS, edit ``config.sh`` with these WCOSS-specific parameters, and use a valid WCOSS -project code for the account parameter: +For WCOSS, edit ``config.sh`` with these WCOSS-specific parameters, and use a valid WCOSS project code for the account parameter: .. code-block:: console @@ -182,37 +244,28 @@ project code for the account parameter: Set up the Python and other Environment Parameters -------------------------------------------------- -Next, it is necessary to load the appropriate Python environment for the workflow. -The workflow requires Python 3, with the packages 'PyYAML', 'Jinja2', and 'f90nml' available. -This Python environment has already been set up on Level 1 platforms, and can be activated in -the following way (when in /path-to-ufs-srweather-app/regional_workflow/ush): +Next, it is necessary to load the appropriate Python environment for the workflow. The workflow requires Python 3, with the packages 'PyYAML', 'Jinja2', and 'f90nml' available. This Python environment has already been set up on Level 1 platforms, and it can be activated in the following way (from ``/ufs-srweather-app/regional_workflow/ush``): .. code-block:: console source ../../env/wflow_.env + Run the ``generate_FV3LAM_wflow.sh`` script ------------------------------------------- -For all platforms, the workflow can then be generated with the command: +For all platforms, the workflow can then be generated from the ``ush`` directory with the command: .. code-block:: console ./generate_FV3LAM_wflow.sh -The generated workflow will be in ``$EXPTDIR``, where ``EXPTDIR=${EXPT_BASEDIR}/${EXPT_SUBDIR}``. A -log file called ``log.generate_FV3LAM_wflow`` is generated by this step and can also be found in -``$EXPTDIR``. The settings for these paths can be found in the output from the -``./generate_FV3LAM_wflow.sh`` script. +The generated workflow will be in ``$EXPTDIR``, where ``EXPTDIR=${EXPT_BASEDIR}/${EXPT_SUBDIR}``. 
A log file called ``log.generate_FV3LAM_wflow`` is generated by this step and can also be found in ``$EXPTDIR``. The settings for these paths can be found in the output from the ``./generate_FV3LAM_wflow.sh`` script. Run the Workflow Using Rocoto ============================= -The information in this section assumes that Rocoto is available on the desired platform. -If Rocoto is not available, it is still possible to run the workflow using stand-alone scripts -described in :numref:`Section %s `. There are two ways you can run -the workflow with Rocoto using either the ``./launch_FV3LAM_wflow.sh`` or by hand. +The information in this section assumes that Rocoto is available on the desired platform. If Rocoto is not available, it is still possible to run the workflow using stand-alone scripts described in :numref:`Section %s `. There are two ways you can run the workflow with Rocoto using either the ``./launch_FV3LAM_wflow.sh`` or by hand. -An environment variable may be set to navigate to the ``$EXPTDIR`` more easily. If the login -shell is bash, it can be set as follows: +An environment variable may be set to navigate to the ``$EXPTDIR`` more easily. If the login shell is bash, it can be set as follows: .. code-block:: console @@ -231,8 +284,7 @@ To run Rocoto using the script: cd $EXPTDIR ./launch_FV3LAM_wflow.sh -Once the workflow is launched with the ``launch_FV3LAM_wflow.sh`` script, a log file named -``log.launch_FV3LAM_wflow`` will be created (or appended to it if it already exists) in ``EXPTDIR``. +Once the workflow is launched with the ``launch_FV3LAM_wflow.sh`` script, a log file named ``log.launch_FV3LAM_wflow`` will be created (or appended to it if it already exists) in ``EXPTDIR``. Or to manually call Rocoto: @@ -325,5 +377,4 @@ The workflow run is completed when all tasks have “SUCCEEDED”, and the rocot Plot the Output =============== -Two python scripts are provided to generate plots from the FV3-LAM post-processed GRIB2 output. Information -on how to generate the graphics can be found in :numref:`Chapter %s `. +Two python scripts are provided to generate plots from the FV3-LAM post-processed GRIB2 output. Information on how to generate the graphics can be found in :numref:`Chapter %s `. diff --git a/docs/UsersGuide/source/conf.py b/docs/UsersGuide/source/conf.py index 5b5b02ea4c..8e43ea2c75 100644 --- a/docs/UsersGuide/source/conf.py +++ b/docs/UsersGuide/source/conf.py @@ -17,6 +17,7 @@ sys.path.insert(0, os.path.abspath('.')) + # -- Project information ----------------------------------------------------- project = 'UFS Short-Range Weather App Users Guide' @@ -50,7 +51,8 @@ 'sphinx.ext.viewcode', 'sphinx.ext.githubpages', 'sphinx.ext.napoleon', - 'sphinxcontrib.bibtex' + 'sphinxcontrib.bibtex', + 'myst_parser' ] bibtex_bibfiles = ['references.bib'] diff --git a/docs/UsersGuide/source/index.rst b/docs/UsersGuide/source/index.rst index f26e03482c..d9ffc68370 100644 --- a/docs/UsersGuide/source/index.rst +++ b/docs/UsersGuide/source/index.rst @@ -6,17 +6,22 @@ UFS Short-Range Weather App Users Guide ======================================= +.. index:: + .. 
toctree:: :numbered: :maxdepth: 3 + Introduction Quickstart CodeReposAndDirs SRWAppOverview + Components + Include-HPCInstall + InputOutputFiles ConfigWorkflow LAMGrids - InputOutputFiles ConfigNewPlatform WE2Etests Graphics From a907487431c0a984c9fd0df3fe691492d2f70c1d Mon Sep 17 00:00:00 2001 From: gspetro Date: Fri, 11 Feb 2022 14:26:23 -0500 Subject: [PATCH 002/118] added git submodule --- .gitmodules | 3 +++ hpc-stack-mod | 1 + 2 files changed, 4 insertions(+) create mode 100644 .gitmodules create mode 160000 hpc-stack-mod diff --git a/.gitmodules b/.gitmodules new file mode 100644 index 0000000000..d115eb5c82 --- /dev/null +++ b/.gitmodules @@ -0,0 +1,3 @@ +[submodule "hpc-stack-mod"] + path = hpc-stack-mod + url = https://github.com/gspetro-NOAA/hpc-stack.git diff --git a/hpc-stack-mod b/hpc-stack-mod new file mode 160000 index 0000000000..7888f1aecf --- /dev/null +++ b/hpc-stack-mod @@ -0,0 +1 @@ +Subproject commit 7888f1aecfb655e957dd6ce0532dfd95f330fc96 From b628dd6748d31d577e0250bd9fd66f61f862caeb Mon Sep 17 00:00:00 2001 From: gspetro Date: Fri, 11 Feb 2022 14:55:58 -0500 Subject: [PATCH 003/118] fix formatting --- docs/UsersGuide/source/ConfigNewPlatform.rst | 2 +- docs/UsersGuide/source/Quickstart.rst | 2 +- docs/UsersGuide/source/SRWAppOverview.rst | 8 ++++---- 3 files changed, 6 insertions(+), 6 deletions(-) diff --git a/docs/UsersGuide/source/ConfigNewPlatform.rst b/docs/UsersGuide/source/ConfigNewPlatform.rst index 381ffb98cb..ce2475026a 100644 --- a/docs/UsersGuide/source/ConfigNewPlatform.rst +++ b/docs/UsersGuide/source/ConfigNewPlatform.rst @@ -213,7 +213,7 @@ Once the data has been staged, setting up your experiment on a platform without ``LAYOUT_X=2`` ``LAYOUT_Y=2`` - These are the settings that control the MPI decomposition when running the weather model. There are default values, but for your machine it is recommended that you specify your own layout to achieve the correct number of MPI processes for your application. In total, your machine should be able to handle ``LAYOUT_X×LAYOUT_Y+WRTCMP_write_tasks_per_group`` tasks. ``WRTCMP_write_tasks_per_group`` is the number of MPI tasks that will be set aside for writing model output, and it is a setting dependent on the domain you have selected. You can find and edit the value of this variable in the file ``regional_workflow/ush/set_predef_grid_params.sh``. + These are the settings that control the MPI decomposition when running the weather model. There are default values, but for your machine it is recommended that you specify your own layout to achieve the correct number of MPI processes for your application. In total, your machine should be able to handle ``LAYOUT_X×LAYOUT_Y+WRTCMP_write_tasks_per_group`` tasks. ``WRTCMP_write_tasks_per_group`` is the number of MPI tasks that will be set aside for writing model output, and it is a setting dependent on the domain you have selected. You can find and edit the value of this variable in the file ``regional_workflow/ush/set_predef_grid_params.sh``. ``RUN_CMD_UTILS="mpirun -np 4"`` This is the run command for MPI-enabled pre-processing utilities. Depending on your machine and your MPI installation, you may need to use a different command for launching an MPI-enabled executable. diff --git a/docs/UsersGuide/source/Quickstart.rst b/docs/UsersGuide/source/Quickstart.rst index 4bcc217076..523a3f1399 100644 --- a/docs/UsersGuide/source/Quickstart.rst +++ b/docs/UsersGuide/source/Quickstart.rst @@ -25,7 +25,7 @@ Install the HPC-Stack .. 
include:: ../../../hpc-stack/docs/source/hpc-intro.rst -After completing installation, continue to the :ref:`next section <_DownloadCode>`. +After completing installation, continue to the :ref:`next section `. .. _DownloadCode: diff --git a/docs/UsersGuide/source/SRWAppOverview.rst b/docs/UsersGuide/source/SRWAppOverview.rst index 5f8cf98780..61d9c057f5 100644 --- a/docs/UsersGuide/source/SRWAppOverview.rst +++ b/docs/UsersGuide/source/SRWAppOverview.rst @@ -186,12 +186,12 @@ can be found in :numref:`Chapter %s `. +----------------------+-------------------+--------------------------------+ Case-specific Configuration -=========================== +============================= .. _DefaultConfigSection: Default configuration: ``config_defaults.sh`` --------------------------------------------- +------------------------------------------------ When generating a new experiment (described in detail in :numref:`Section %s `), the ``config_defaults.sh`` file is read first and assigns default values to the experiment parameters. Important configuration variables in the ``config_defaults.sh`` file are shown in @@ -408,7 +408,7 @@ Generating a Regional Workflow Experiment ========================================= Steps to a Generate a New Experiment ----------------------------------- +---------------------------------------- Generating an experiment requires running .. code-block:: console @@ -454,7 +454,7 @@ when the ``launch_FV3LAM_wflow.sh`` is submitted. Each j-job task has its own so ``exregional_[task name].sh`` in the ``regional_workflow/scripts`` directory. Two database files ``FV3LAM_wflow.db`` and ``FV3LAM_wflow_lock.db`` are generated and updated by the Rocoto calls. There is usually no need for users to modify these files. To relaunch the workflow from scratch, -delete these two *.db files and then call the launch script repeatedly for each task. +delete these two ``*.db`` files and then call the launch script repeatedly for each task. .. _WorkflowTasksFig: From 467071f359093ef2c0042a13741b783d5e9a9a82 Mon Sep 17 00:00:00 2001 From: gspetro Date: Fri, 11 Feb 2022 15:25:20 -0500 Subject: [PATCH 004/118] added new submodule commits --- docs/UsersGuide/source/ConfigNewPlatform.rst | 3 ++- docs/UsersGuide/source/Include-HPCInstall.rst | 9 +++++---- docs/UsersGuide/source/Quickstart.rst | 2 +- hpc-stack-mod | 2 +- 4 files changed, 9 insertions(+), 7 deletions(-) diff --git a/docs/UsersGuide/source/ConfigNewPlatform.rst b/docs/UsersGuide/source/ConfigNewPlatform.rst index ce2475026a..798a287b29 100644 --- a/docs/UsersGuide/source/ConfigNewPlatform.rst +++ b/docs/UsersGuide/source/ConfigNewPlatform.rst @@ -212,8 +212,9 @@ Once the data has been staged, setting up your experiment on a platform without These are the two ``MACHINE`` settings for generic, non-Rocoto-based platforms; you should choose the one most appropriate for your machine. ``MACOS`` has its own setting due to some differences in how command-line utilities function on Darwin-based operating systems. ``LAYOUT_X=2`` + ``LAYOUT_Y=2`` - These are the settings that control the MPI decomposition when running the weather model. There are default values, but for your machine it is recommended that you specify your own layout to achieve the correct number of MPI processes for your application. In total, your machine should be able to handle ``LAYOUT_X×LAYOUT_Y+WRTCMP_write_tasks_per_group`` tasks. 
``WRTCMP_write_tasks_per_group`` is the number of MPI tasks that will be set aside for writing model output, and it is a setting dependent on the domain you have selected. You can find and edit the value of this variable in the file ``regional_workflow/ush/set_predef_grid_params.sh``. + These are the settings that control the MPI decomposition when running the weather model. There are default values, but for your machine it is recommended that you specify your own layout to achieve the correct number of MPI processes for your application. In total, your machine should be able to handle ``LAYOUT_X×LAYOUT_Y+WRTCMP_write_tasks_per_group`` tasks. ``WRTCMP_write_tasks_per_group`` is the number of MPI tasks that will be set aside for writing model output, and it is a setting dependent on the domain you have selected. You can find and edit the value of this variable in the file ``regional_workflow/ush/set_predef_grid_params.sh``. ``RUN_CMD_UTILS="mpirun -np 4"`` This is the run command for MPI-enabled pre-processing utilities. Depending on your machine and your MPI installation, you may need to use a different command for launching an MPI-enabled executable. diff --git a/docs/UsersGuide/source/Include-HPCInstall.rst b/docs/UsersGuide/source/Include-HPCInstall.rst index 4fa2a9989b..097519db2d 100644 --- a/docs/UsersGuide/source/Include-HPCInstall.rst +++ b/docs/UsersGuide/source/Include-HPCInstall.rst @@ -1,6 +1,7 @@ .. _InstallHPCstack: -====================== -Install the HPC-Stack -====================== -.. include:: ../../../docs/source/hpc-install.rst \ No newline at end of file +.. include:: ../../../hpc-stack-mod/docs/source/hpc-install.rst + +.. include:: ../../../hpc-stack-mod/docs/source/hpc-prereqs.rst +.. include:: ../../../hpc-stack-mod/docs/source/hpc-parameters.rst +.. include:: ../../../hpc-stack-mod/docs/source/hpc-components.rst \ No newline at end of file diff --git a/docs/UsersGuide/source/Quickstart.rst b/docs/UsersGuide/source/Quickstart.rst index 523a3f1399..e2c935f2f0 100644 --- a/docs/UsersGuide/source/Quickstart.rst +++ b/docs/UsersGuide/source/Quickstart.rst @@ -22,7 +22,7 @@ Install the HPC-Stack Skip the HPC-stack installation if working on a `Level 1 system `_ (e.g., Cheyenne, Hera, Orion). -.. include:: ../../../hpc-stack/docs/source/hpc-intro.rst +.. include:: ../../../hpc-stack-mod/docs/source/hpc-intro.rst After completing installation, continue to the :ref:`next section `. diff --git a/hpc-stack-mod b/hpc-stack-mod index 7888f1aecf..055cab8ec5 160000 --- a/hpc-stack-mod +++ b/hpc-stack-mod @@ -1 +1 @@ -Subproject commit 7888f1aecfb655e957dd6ce0532dfd95f330fc96 +Subproject commit 055cab8ec5f7f87413e73185663215b0d151fe3e From de00e4c80bd4443998ce0beb2e863b2943e3bb61 Mon Sep 17 00:00:00 2001 From: gspetro Date: Fri, 11 Feb 2022 15:34:50 -0500 Subject: [PATCH 005/118] fixed ref links --- docs/UsersGuide/source/Introduction.rst | 2 +- docs/UsersGuide/source/Quickstart.rst | 2 +- hpc-stack-mod | 2 +- 3 files changed, 3 insertions(+), 3 deletions(-) diff --git a/docs/UsersGuide/source/Introduction.rst b/docs/UsersGuide/source/Introduction.rst index 363bb745c5..cd5807d4c7 100644 --- a/docs/UsersGuide/source/Introduction.rst +++ b/docs/UsersGuide/source/Introduction.rst @@ -99,7 +99,7 @@ forecast model, and post-processing steps. 
This SRW Application release has been tested on a variety of platforms widely used by researchers, such as the NOAA Research and Development High-Performance Computing Systems -(RDHPCS), including Hera, Orion, and Jet; NOAA’s Weather and Climate Operational +(RDHPCS), including Hera, Orion, and Jet; NOAA's Weather and Climate Operational Supercomputing System (WCOSS); the National Center for Atmospheric Research (NCAR) Cheyenne system; the National Severe Storms Laboratory (NSSL) HPC machine, called Odin; the National Science Foundation (NSF) Stampede2 system; and generic Linux and macOS systems using Intel and GNU compilers. Four `levels of support `_have been defined for the SRW Application, including pre-configured (level 1), configurable (level 2), limited test platforms (level 3), and build only platforms (level 4). Each level is further described below. diff --git a/docs/UsersGuide/source/Quickstart.rst b/docs/UsersGuide/source/Quickstart.rst index e2c935f2f0..59826f5609 100644 --- a/docs/UsersGuide/source/Quickstart.rst +++ b/docs/UsersGuide/source/Quickstart.rst @@ -22,7 +22,7 @@ Install the HPC-Stack Skip the HPC-stack installation if working on a `Level 1 system `_ (e.g., Cheyenne, Hera, Orion). -.. include:: ../../../hpc-stack-mod/docs/source/hpc-intro.rst +.. include:: ../../../hpc-stack-mod/docs/source/hpc-intro-text.rst After completing installation, continue to the :ref:`next section `. diff --git a/hpc-stack-mod b/hpc-stack-mod index 055cab8ec5..b038f39569 160000 --- a/hpc-stack-mod +++ b/hpc-stack-mod @@ -1 +1 @@ -Subproject commit 055cab8ec5f7f87413e73185663215b0d151fe3e +Subproject commit b038f395699af83c533c61b24563e281ee082b28 From fb341006f6ca3d9e5365831839349cc32bc9af70 Mon Sep 17 00:00:00 2001 From: gspetro Date: Fri, 11 Feb 2022 17:33:43 -0500 Subject: [PATCH 006/118] finished Intro --- docs/UsersGuide/source/Components.rst | 67 +------------- docs/UsersGuide/source/Introduction.rst | 117 ++++++------------------ docs/UsersGuide/source/Quickstart.rst | 1 + 3 files changed, 30 insertions(+), 155 deletions(-) diff --git a/docs/UsersGuide/source/Components.rst b/docs/UsersGuide/source/Components.rst index 26822fe341..a0bb57c447 100644 --- a/docs/UsersGuide/source/Components.rst +++ b/docs/UsersGuide/source/Components.rst @@ -10,7 +10,6 @@ The SRW Application v2.0 release assembles a variety of components, including: * Post-Processor * Visualization Example * Build System and Workflow -* User Support, Documentation, and Contributing Development These components are documented within this User's Guide and supported through a `community forum `_. @@ -97,6 +96,9 @@ The scripts are available in the `regional_workflow repository under ush/Python. Usage information and instructions are described in :numref:`Chapter %s ` and are also included at the top of the script. +.. + COMMENT: only after running manage_externals/checkout_externals + Build System and Workflow ========================= @@ -152,66 +154,3 @@ the developers have built the code but little or no pre-release testing has been respectively. A complete description of the levels of support, along with a list of preconfigured and configurable platforms can be found in the `SRW Application wiki page `_. - -User Support, Documentation, and Contributing Development -========================================================= - -A forum-based, online `support system `_ organized by topic -provides a centralized location for UFS users and developers to post questions and exchange -information. 
- -A list of available documentation is shown in :numref:`Table %s `. - -.. _list_of_documentation: - -.. table:: Centralized list of documentation - - +----------------------------+---------------------------------------------------------------------------------+ - | **Documentation** | **Location** | - +============================+=================================================================================+ - | UFS SRW Application v1.0 | https://ufs-srweather-app.readthedocs.io/en/ufs-v1.0.0 | - | User's Guide | | - +----------------------------+---------------------------------------------------------------------------------+ - | UFS_UTILS v2.0 User's | https://noaa-emcufs-utils.readthedocs.io/en/ufs-v2.0.0/ | - | Guide | | - +----------------------------+---------------------------------------------------------------------------------+ - | UFS Weather Model v2.0 | https://ufs-weather-model.readthedocs.io/en/ufs-v2.0.0 | - | User's Guide | | - +----------------------------+---------------------------------------------------------------------------------+ - | NCEPLIBS Documentation | https://github.com/NOAA-EMC/NCEPLIBS/wiki | - +----------------------------+---------------------------------------------------------------------------------+ - | NCEPLIBS-external | https://github.com/NOAA-EMC/NCEPLIBS-external/wiki | - | Documentation | | - +----------------------------+---------------------------------------------------------------------------------+ - | FV3 Documentation | https://noaa-emc.github.io/FV3_Dycore_ufs-v2.0.0/html/index.html | - +----------------------------+---------------------------------------------------------------------------------+ - | CCPP Scientific | https://dtcenter.ucar.edu/GMTB/v5.0.0/sci_doc/index.html | - | Documentation | | - +----------------------------+---------------------------------------------------------------------------------+ - | CCPP Technical | https://ccpp-techdoc.readthedocs.io/en/v5.0.0/ | - | Documentation | | - +----------------------------+---------------------------------------------------------------------------------+ - | ESMF manual | http://earthsystemmodeling.org/docs/release/ESMF_8_0_0/ESMF_usrdoc/ | - +----------------------------+---------------------------------------------------------------------------------+ - | Unified Post Processor | https://upp.readthedocs.io/en/upp-v9.0.0/ | - +----------------------------+---------------------------------------------------------------------------------+ - -The UFS community is encouraged to contribute to the development effort of all related -utilities, model code, and infrastructure. Issues can be posted in SRW-related GitHub repositories to report bugs or to announce upcoming contributions to the code base. For code to be accepted in the authoritative repositories, users must follow the code management rules of each component (described in the User’s Guides listed in :numref:`Table %s `. - -Future Direction -================ - -Users can expect to see incremental improvements and additional capabilities in upcoming -releases of the SRW Application to enhance research opportunities and support operational -forecast implementations. Planned advancements include: - -* A more extensive set of supported developmental physics suites. -* A larger number of pre-defined domains/resolutions and a fully supported capability to create a user-defined domain. -* Inclusion of data assimilation, cycling, and ensemble capabilities. 
-* A verification package (i.e., METplus) integrated into the workflow. -* Inclusion of stochastic perturbation techniques. - -In addition to the above list, other improvements will be addressed in future releases. - -.. bibliography:: references.bib diff --git a/docs/UsersGuide/source/Introduction.rst b/docs/UsersGuide/source/Introduction.rst index cd5807d4c7..ae50c60ef1 100644 --- a/docs/UsersGuide/source/Introduction.rst +++ b/docs/UsersGuide/source/Introduction.rst @@ -13,13 +13,24 @@ conducted with the App: UFS Development Team. (2021, March 4). Unified Forecast System (UFS) Short-Range Weather (SRW) Application (Version v1.0.0). Zenodo. https://doi.org/10.5281/zenodo.4534994 -.. - COMMENT: What will be the numbering for this release? It will need to be changed above and throughout the docs. - COMMENT: Can this app be used beyond the CONUS? Change "limited spatial domain" above to CONUS or something more specific. - COMMENT: Where are we on the DA and verification package? Can we update that line? - COMMENT: Update citation date & version number. What is Zenodo? - COMMENT: Disagree: "This documentation provides... information on where to find more information and obtain support." We need to add this (i.e. link in the docs) or remove the line. - COMMENT: "Components" doc created but not added to TOC. Contains more detailed info on components discussed here below. +How to Use This Document +======================== + +This guide instructs both novice and experienced users on downloading, building, and running the SRW Application. Please post questions in the UFS forum at https://forums.ufscommunity.org/. + +.. code-block:: console + + Throughout the guide, this presentation style indicates shell commands and options, code examples, etc. + +.. note:: + + Variables presented as ``AaBbCc123`` in this document typically refer to variables + in scripts, names of files, and directories. + +.. note:: + + File paths or code that include angle brackets (e.g., ``env/build__.env``) indicate that users should insert options appropriate to their SRW configuration (e.g., ``env/build_aws_gcc.env``). + Pre-processor Utilities and Initial Conditions ============================================== @@ -28,12 +39,6 @@ The SRW Application includes a number of pre-processing utilities that initializ model. Tasks include generating a regional grid, along with :term:`orography` and surface climatology files for that grid. The pre-processing software converts the raw external model data into initial and lateral boundary condition files in netCDF format. Later, this is used as input to the atmospheric model (FV3-LAM). Additional information about the UFS pre-processor utilities can be found in the `UFS_UTILS User’s Guide `_. -.. - COMMENT: "Integration" with what?!?! (1st sentence) --> Try "prepare the model data" instead of "prepare the model for integration." - COMMENT: Deleted code/commands bc it's an introduction. - COMMENT: What is a "halo shave point" or wide-halo grid?! - - Forecast Model ============== @@ -64,71 +69,21 @@ The Unified Post Processor (:term:`UPP`) is included in the SRW Application work Visualization Example ===================== -This SRW Application distribution provides Python scripts to create basic visualization of the model output. The scripts are available in the ```regional_workflow`` repository -`_ -under ``ush/Python``. Usage information and instructions are described in -:numref:`Chapter %s ` and are also included at the top of the script. 
+ +This SRW Application distribution provides Python scripts to create basic visualizations of the model output. Usage information and instructions are described in :numref:`Chapter %s ` and are also included at the top of the script. Build System and Workflow ========================= -The SRW Application has a portable build system and a user-friendly, modular, and -expandable workflow framework. - -.. - COMMENT: Define build system and workflow... - -An umbrella CMake-based build system is used for building the components necessary -for running the end-to-end SRW Application: the UFS Weather Model and the pre- and -post-processing software. Additional libraries (:term:`NCEPLIBS-external` and :term:`NCEPLIBS`) necessary -for the application are not included in the SRW Application build system, but are available -pre-built on pre-configured platforms. There is a small set of system libraries and utilities -that are assumed to be present on the target computer: the CMake build software, a Fortran, -C, and C++ compiler, and MPI library. - -Once built, the provided experiment generator script can be used to create a Rocoto-based -workflow file that will run each task in the system (see `Rocoto documentation -`_) in the proper sequence. -If Rocoto and/or a batch system is not present on the available platform, the individual -components can be run in a stand-alone, command line fashion with provided run scripts. The -generated namelist for the atmospheric model can be modified in order to vary settings such -as forecast starting and ending dates, forecast length hours, the CCPP physics suite, -integration time step, history file output frequency, and more. It also allows for configuration -of other elements of the workflow; for example, whether to run some or all of the pre-processing, -forecast model, and post-processing steps. - -This SRW Application release has been tested on a variety of platforms widely used by -researchers, such as the NOAA Research and Development High-Performance Computing Systems -(RDHPCS), including Hera, Orion, and Jet; NOAA's Weather and Climate Operational -Supercomputing System (WCOSS); the National Center for Atmospheric Research (NCAR) Cheyenne -system; the National Severe Storms Laboratory (NSSL) HPC machine, called Odin; the National Science Foundation (NSF) Stampede2 system; and generic Linux and macOS systems using Intel and GNU compilers. Four `levels of support `_have been defined for the SRW Application, including pre-configured (level 1), configurable (level 2), limited test platforms (level 3), and build only platforms (level 4). Each level is further described below. - -For the selected computational platforms that have been pre-configured (level 1), all the -required libraries for building the SRW Application are available in a central place. That -means bundled libraries (NCEPLIBS) and third-party libraries (NCEPLIBS-external) have both -been built. The SRW Application is expected to build and run out of the box on these -pre-configured platforms and users can proceed directly to the using the workflow, as -described in the Quick Start (:numref:`Chapter %s `). - -A few additional computational platforms are considered configurable for the SRW -Application release. Configurable platforms (level 2) are platforms where all of -the required libraries for building the SRW Application are expected to install successfully, -but are not available in a central place. 
Applications and models are expected to build -and run once the required bundled libraries (NCEPLIBS) and third-party libraries (NCEPLIBS-external) -are built. - -Limited-Test (level 3) and Build-Only (level 4) computational platforms are those in which -the developers have built the code but little or no pre-release testing has been conducted, -respectively. A complete description of the levels of support, along with a list of preconfigured -and configurable platforms can be found in the `SRW Application wiki page -`_. +The SRW Application has a portable CMake-based build system that packages together all the components required to build the SRW Application. Once built, users can generate a Rocoto-based workflow that will run each task in the proper sequence (see `Rocoto documentation `_). Individual components can also be run in a stand-alone, command line fashion. The SRW Application allows for configuration of various elements of the workflow. For example, users can modify the parameters of the atmospheric model, such as start and end dates, duration, time step, and the physics suite for the forecast. + +This SRW Application release has been tested on a variety of platforms widely used by researchers, including NOAA High-Performance Comuting (HPC) systems (e.g. Hera, Orion), cloud environments, and generic Linux and macOS systems. Four `levels of support `_ have been defined for the SRW Application, including pre-configured (Level 1), configurable (Level 2), limited test platforms (Level 3), and build-only platforms (Level 4). Preconfigured (Level 1) systems already have the required external libraries (e.g., NCEPLIBS) available in a central location. The SRW Application is expected to build and run out-of-the-box on these systems, and users can proceed directly to using the workflow, as +described in the Quick Start Guide (:numref:`Section %s <_GenerateForecast>`). On other platforms, the required libraries will need to be installed via the HPC-Stack. Once these libraries are built, applications and models should build and run successfully. However, users may need to perform additional troubleshooting on Level 3 or 4 systems since little or no pre-release testing has been conducted on these systems. User Support, Documentation, and Contributing Development ========================================================= -A forum-based, online `support system `_ organized by topic -provides a centralized location for UFS users and developers to post questions and exchange -information. +A forum-based, online `support system `_ organized by topic provides a centralized location for UFS users and developers to post questions and exchange information. A list of available documentation is shown in :numref:`Table %s `. @@ -167,7 +122,7 @@ A list of available documentation is shown in :numref:`Table %s `. +utilities, model code, and infrastructure. Issues can be posted in the related GitHub repositories to report bugs or to announce upcoming contributions to the code base. For code to be accepted in the authoritative repositories, users must follow the code management rules of each component (described in the User’s Guides listed in :numref:`Table %s `. Future Direction ================ @@ -184,26 +139,6 @@ forecast implementations. Planned advancements include: In addition to the above list, other improvements will be addressed in future releases. -.. - COMMENT: How is a domain different from a grid? Can we say "user-defined grid," for example? Might be clearer. 
- -How to Use This Document -======================== - -This guide instructs both novice and experienced users on downloading, -building and running the SRW Application. Please post questions in the -UFS forum at https://forums.ufscommunity.org/. - -.. code-block:: console - - Throughout the guide, this presentation style indicates shell - commands and options, code examples, etc. - - -.. note:: - - Variables presented as ``AaBbCc123`` in this document typically refer to variables - in scripts, names of files and directories. .. bibliography:: references.bib diff --git a/docs/UsersGuide/source/Quickstart.rst b/docs/UsersGuide/source/Quickstart.rst index 59826f5609..e6e37c8c1a 100644 --- a/docs/UsersGuide/source/Quickstart.rst +++ b/docs/UsersGuide/source/Quickstart.rst @@ -152,6 +152,7 @@ The SRW requires input files to run. These include static datasets, initial and files, and model configuration files. On Level 1 and 2 systems, the data required to run SRW tests are already available. For Level 3 and 4 systems, the data must be added. Detailed instructions on how to add the data can be found in the :doc:`Input and Output Files `, Section 3. Section 1 contains useful background information on the input files required by the SRW. +.. _GenerateForecast: Generate the Forecast Experiment ================================= From 701f9e9f2765d465d78d2647320d20ac9ac64864 Mon Sep 17 00:00:00 2001 From: gspetro Date: Mon, 14 Feb 2022 13:47:35 -0500 Subject: [PATCH 007/118] finish Components & Intro edits --- docs/UsersGuide/source/Components.rst | 121 ++++++------------------ docs/UsersGuide/source/Glossary.rst | 21 ++-- docs/UsersGuide/source/Introduction.rst | 7 +- 3 files changed, 42 insertions(+), 107 deletions(-) diff --git a/docs/UsersGuide/source/Components.rst b/docs/UsersGuide/source/Components.rst index a0bb57c447..cf5180ba40 100644 --- a/docs/UsersGuide/source/Components.rst +++ b/docs/UsersGuide/source/Components.rst @@ -18,66 +18,37 @@ Pre-processor Utilities and Initial Conditions ============================================== The SRW Application includes a number of pre-processing utilities that initialize and prepare the -model. Tasks include generating a regional grid, along with :term:`orography` and surface climatology files for that grid. The pre-processing software converts the raw external model data into initial and lateral boundary condition files in netCDF format. Later, this is used as input to the atmospheric model (FV3-LAM). Additional information about the UFS pre-processor utilities can be found in the `UFS_UTILS User’s Guide `_. - -The SRW Application includes a number of pre-processing utilities to initialize and prepare the -model. For the limited area model (LAM), it is necessary to first generate a -regional grid ``regional_esg_grid/make_hgrid`` along with orography ``orog`` and surface climatology ``sfc_climo_gen`` files on that grid. There are additional utilities included to handle the correct number of halo ``shave`` points and topography filtering ``filter_topo``. The pre-processing software ``chgres_cube`` is used to convert the raw external model data into initial and lateral boundary condition files in netCDF format, needed as input to the FV3-LAM. Additional information about the UFS pre-processor utilities can be found in the `UFS_UTILS User’s Guide `_. - -.. - COMMENT: "Integration" with what?!?! (1st sentence) --> Try "prepare the model data" instead of "prepare the model for integration." - COMMENT: Why are we using code/commands in an overview doc?! 
A newbie is going to glaze over and give up. +model. Since the SRW App provides forecast predictions over a limited area (rather than globally), it is necessary to first generate a regional grid ``regional_esg_grid/make_hgrid`` along with orography ``orog`` and surface climatology ``sfc_climo_gen`` files on that grid. Grids include a strip, or "halo," of six cells that surround the regional grid and feed in lateral boundary condition data. Since different grid and orography files require different numbers of halo cells, additional utilities handle the correct number of halo shave points and topography filtering. The pre-processing software ``chgres_cube`` is used to convert the raw external model data into initial and lateral boundary condition files in netCDF format, needed as input to the FV3-LAM. Additional information about the UFS pre-processor utilities can be found in the `UFS_UTILS User’s Guide `_. The SRW Application can be initialized from a range of operational initial condition files. It is -possible to initialize the model from the Global Forecast System (:term:`GFS`), North American Mesoscale (:term:`NAM`) Forecast System, Rapid Refresh (:term:`RAP`), and High-Resolution Rapid Refresh (:term:`HRRR`) files in Gridded Binary v2 (GRIB2) format and GFS in NEMSIO format for past dates. +possible to initialize the model from the Global Forecast System (:term:`GFS`), North American Mesoscale (:term:`NAM`) Forecast System, Rapid Refresh (:term:`RAP`), and High-Resolution Rapid Refresh (:term:`HRRR`) files in Gridded Binary v2 (:term:`GRIB2`) format. GFS files also come in :term:`NEMSIO` format for past dates. .. WARNING:: - Please note, for GFS data, dates prior to 1 January 2018 may work but are not guaranteed. Public archives of model data can be accessed through the `National Centers for Environmental Information `_ (NCEI) or through the `NOAA Operational Model Archive and Distribution System `_ (NOMADS). Raw external model data may be pre-staged on disk by the user. + For GFS data, dates prior to 1 January 2018 may work but are not guaranteed. Public archives of model data can be accessed through the `National Centers for Environmental Information `_ (NCEI) or through the `NOAA Operational Model Archive and Distribution System `_ (NOMADS). Raw external model data may be pre-staged on disk by the user. Forecast Model ============== The prognostic atmospheric model in the UFS SRW Application is the Finite-Volume Cubed-Sphere -(:term:`FV3`) dynamical core configured with a Limited Area Model (LAM) capability :cite:`BlackEtAl2020`. -The dynamical core is the computational part of a model that solves the equations of fluid motion. A User’s Guide for the UFS :term:`Weather Model` is `here `_. - -Supported model resolutions in this release include a 3-, 13-, and 25-km predefined Contiguous -U.S. (CONUS) domain, all with 64 vertical levels. Preliminary tools for users to define their -own domain are also available in the release with full, formal support of these tools to be -provided in future releases. The Extended Schmidt Gnomonic (ESG) grid is used with the FV3-LAM, -which features relatively uniform grid cells across the entirety of the domain. Additional -information about the FV3 dynamical core can be found `here -`_ and on the `NOAA Geophysical -Fluid Dynamics Laboratory website `_. 
- -Interoperable atmospheric physics, along with the Noah Multi-parameterization (Noah MP) -Land Surface Model options, are supported through the Common Community Physics Package -(:term:`CCPP`; described `here `_). -Atmospheric physics are a set of numerical methods describing small-scale processes such -as clouds, turbulence, radiation, and their interactions. There are two physics options -supported for the release. The first is an experimental physics suite being tested for use -in the future operational implementation of the Rapid Refresh Forecast System (RRFS) planned -for 2023-2024, and the second is an updated version of the physics suite used in the operational -Global Forecast System (GFS) v15. A scientific description of the CCPP parameterizations and -suites can be found in the `CCPP Scientific Documentation `_, -and CCPP technical aspects are described in the `CCPP Technical Documentation -`_. The model namelist has many settings -beyond the physics options that can optimize various aspects of the model for use with each +(:term:`FV3`) dynamical core configured with a Limited Area Model (:term:`LAM`) capability :cite:`BlackEtAl2020`. The dynamical core is the computational part of a model that solves the equations of fluid motion. A User’s Guide for the UFS :term:`Weather Model` is `here `_. + +Supported model resolutions in this release include 3-, 13-, and 25-km predefined Contiguous U.S. (:term:`CONUS`) domains, each with 64 vertical levels. Preliminary tools for users to define their own domain are also available in the release with full, formal support of these tools to be provided in future releases. The Extended Schmidt Gnomonic (ESG) grid is used with the FV3-LAM, which features relatively uniform grid cells across the entirety of the domain. Additional information about the FV3 dynamical core can be found `here `_ and on the `NOAA Geophysical Fluid Dynamics Laboratory website `_. + +Interoperable atmospheric physics, along with the Noah Multi-parameterization (Noah MP) Land Surface Model options, are supported through the Common Community Physics Package (:term:`CCPP`; described `here `_).Atmospheric physics are a set of numerical methods describing small-scale processes such as clouds, turbulence, radiation, and their interactions. There are two physics options supported for the release. The first is an experimental physics suite being tested for use in the future operational implementation of the Rapid Refresh Forecast System (RRFS) planned for 2023-2024, and the second is an updated version of the physics suite used in the operational Global Forecast System (GFS) v15. A scientific description of the CCPP parameterizations and suites can be found in the `CCPP Scientific Documentation `_, and CCPP technical aspects are described in the `CCPP Technical Documentation `_. The model namelist has many settings beyond the physics options that can optimize various aspects of the model for use with each of the supported suites. -The SRW App supports the use of both GRIB2 and :term:`NEMSIO` input data. The UFS Weather Model +The SRW App supports the use of both :term:`GRIB2` and :term:`NEMSIO` input data. The UFS Weather Model ingests initial and lateral boundary condition files produced by :term:`chgres_cube` and outputs files in -netCDF format on a specific projection (e.g., Lambert Conformal) in the horizontal and model -levels in the vertical. 
+netCDF format on a specific projection (e.g., Lambert Conformal) in the horizontal direction and model levels in the vertical direction. Post-processor ============== The SRW Application is distributed with the Unified Post Processor (:term:`UPP`) included in the -workflow as a way to convert the netCDF output on the native model grid to GRIB2 format on +workflow as a way to convert the netCDF output on the native model grid to :term:`GRIB2` format on standard isobaric vertical coordinates. UPP can also be used to compute a variety of useful -diagnostic fields, as described in the `UPP user’s guide `_. +diagnostic fields, as described in the `UPP User’s Guide `_. Output from UPP can be used with visualization, plotting, and verification packages, or for further downstream post-processing, e.g. statistical post-processing techniques. @@ -86,18 +57,12 @@ Visualization Example ===================== A Python script is provided to create basic visualization of the model output. The script is designed to output graphics in PNG format for 14 standard meteorological variables -when using the pre-defined CONUS domain. In addition, a difference plotting script is included +when using the pre-defined :term:`CONUS` domain. In addition, a difference plotting script is included to visually compare two runs for the same domain and resolution. These scripts are provided only -as an example for users familiar with Python, and may be used to do a visual check to verify +as an example for users familiar with Python and may be used to do a visual check to verify that the application is producing reasonable results. -The scripts are available in the `regional_workflow repository -`_ -under ush/Python. Usage information and instructions are described in -:numref:`Chapter %s ` and are also included at the top of the script. - -.. - COMMENT: only after running manage_externals/checkout_externals +After running ``manage_externals/checkout_externals``, the visualization scripts will be available in the `regional_workflow repository `_ under ush/Python. Usage information and instructions are described in :numref:`Chapter %s ` and are also included at the top of the script. Build System and Workflow ========================= @@ -105,52 +70,28 @@ Build System and Workflow The SRW Application has a portable build system and a user-friendly, modular, and expandable workflow framework. -An umbrella CMake-based build system is used for building the components necessary -for running the end-to-end SRW Application: the UFS Weather Model and the pre- and -post-processing software. Additional libraries (:term:`NCEPLIBS-external` and :term:`NCEPLIBS`) necessary -for the application are not included in the SRW Application build system, but are available -pre-built on pre-configured platforms. There is a small set of system libraries and utilities -that are assumed to be present on the target computer: the CMake build software, a Fortran, -C, and C++ compiler, and MPI library. +An umbrella CMake-based build system is used for building the components necessary for running the end-to-end SRW Application: the UFS Weather Model and the pre- and post-processing software. Additional libraries necessary for the application (e.g., :term:`NCEPLIBS-external` and :term:`NCEPLIBS`) are not included in the SRW Application build system but are available pre-built on pre-configured platforms. On other systems, they can be installed via the HPC-Stack. 
There is a small set of system libraries and utilities that are assumed to be present on the target computer: the CMake build software, a Fortran, +C, and C++ compiler, and an MPI library. Once built, the provided experiment generator script can be used to create a Rocoto-based -workflow file that will run each task in the system (see `Rocoto documentation -`_) in the proper sequence. -If Rocoto and/or a batch system is not present on the available platform, the individual -components can be run in a stand-alone, command line fashion with provided run scripts. The -generated namelist for the atmospheric model can be modified in order to vary settings such -as forecast starting and ending dates, forecast length hours, the CCPP physics suite, -integration time step, history file output frequency, and more. It also allows for configuration -of other elements of the workflow; for example, whether to run some or all of the pre-processing, -forecast model, and post-processing steps. +workflow file that will run each task in the system in the proper sequence (see `Rocoto documentation +`_). If Rocoto and/or a batch system is not present on the available platform, the individual components can be run in a stand-alone, command line fashion with provided run scripts. The generated namelist for the atmospheric model can be modified in order to vary settings such as forecast starting and ending dates, forecast length hours, the :term:`CCPP` physics suite, integration time step, history file output frequency, and more. It also allows for configuration of other elements of the workflow; for example, whether to run some or all of the pre-processing, forecast model, and post-processing steps. This SRW Application release has been tested on a variety of platforms widely used by researchers, such as the NOAA Research and Development High-Performance Computing Systems (RDHPCS), including Hera, Orion, and Jet; NOAA’s Weather and Climate Operational Supercomputing System (WCOSS); the National Center for Atmospheric Research (NCAR) Cheyenne -system; NSSL’s HPC machine, Odin; the National Science Foundation Stampede2 system; and -generic Linux and macOS systems using Intel and GNU compilers. Four `levels of support -`_ -have been defined for the SRW Application, including pre-configured (level 1), configurable -(level 2), limited test platforms (level 3), and build only platforms (level 4). Each -level is further described below. - -For the selected computational platforms that have been pre-configured (level 1), all the +system; the National Severe Storms Laboratory (NSSL) HPC machine called Odin; the National Science Foundation Stampede2 system; and generic Linux and macOS systems using Intel and GNU compilers. Four `levels of support `_ have been defined for the SRW Application, including pre-configured (Level 1), configurable (Level 2), limited test platforms (Level 3), and build only platforms (Level 4). Each level is further described below. + +For the selected computational platforms that have been pre-configured (Level 1), all the required libraries for building the SRW Application are available in a central place. That means bundled libraries (NCEPLIBS) and third-party libraries (NCEPLIBS-external) have both -been built. The SRW Application is expected to build and run out of the box on these -pre-configured platforms and users can proceed directly to the using the workflow, as -described in the Quick Start (:numref:`Chapter %s `). 
- -A few additional computational platforms are considered configurable for the SRW -Application release. Configurable platforms (level 2) are platforms where all of -the required libraries for building the SRW Application are expected to install successfully, -but are not available in a central place. Applications and models are expected to build -and run once the required bundled libraries (NCEPLIBS) and third-party libraries (NCEPLIBS-external) -are built. - -Limited-Test (level 3) and Build-Only (level 4) computational platforms are those in which -the developers have built the code but little or no pre-release testing has been conducted, -respectively. A complete description of the levels of support, along with a list of preconfigured -and configurable platforms can be found in the `SRW Application wiki page -`_. +been built. The SRW Application is expected to build and run out-of-the-box on these +pre-configured platforms, and users can proceed directly to the using the workflow, as +described in the Quick Start (:numref:`Section %s `). + +A few additional computational platforms are considered configurable for the SRW Application release. Configurable platforms (Level 2) are platforms where all of the required libraries for building the SRW Application are expected to install successfully but are not available in a central location. Applications and models are expected to build and run once the required bundled libraries (e.g., NCEPLIBS) and third-party libraries (e.g., NCEPLIBS-external) are built. + +Limited-Test (Level 3) and Build-Only (Level 4) computational platforms are those in which the developers have built the code but little or no pre-release testing has been conducted, respectively. A complete description of the levels of support, along with a list of preconfigured and configurable platforms can be found in the `SRW Application wiki page `_. + +.. bibliography:: references.bib \ No newline at end of file diff --git a/docs/UsersGuide/source/Glossary.rst b/docs/UsersGuide/source/Glossary.rst index d3deb40672..856692c801 100644 --- a/docs/UsersGuide/source/Glossary.rst +++ b/docs/UsersGuide/source/Glossary.rst @@ -7,9 +7,7 @@ Glossary .. glossary:: CCPP - A forecast-model agnostic, vetted collection of codes containing atmospheric physical - parameterizations and suites of parameterizations for use in Numerical Weather Prediction - (NWP) along with a framework that connects the physics to the host forecast model. + `Common Community Physics Package `_. A forecast-model agnostic, vetted collection of codes containing atmospheric physical parameterizations and suites of parameterizations for use in Numerical Weather Prediction (NWP) along with a framework that connects the physics to the host forecast model. CONUS Continental United States @@ -18,12 +16,6 @@ Glossary The preprocessing software used to create initial and boundary condition files to “coldstart” the forecast model. - HRRR - `High Resolution Rapid Refresh `. The HRRR is a NOAA real-time 3-km resolution, hourly updated, cloud-resolving, convection-allowing atmospheric model, initialized by 3km grids with 3km radar assimilation. Radar data is assimilated in the HRRR every 15 min over a 1-h period adding further detail to that provided by the hourly data assimilation from the 13km radar-enhanced Rapid Refresh. - - .. - COMMENT: Clarify HRRR definition! - FV3 The Finite-Volume Cubed-Sphere dynamical core (dycore). 
Developed at NOAA's Geophysical Fluid Dynamics Laboratory (GFDL), it is a scalable and flexible dycore capable of both @@ -36,24 +28,27 @@ Glossary GRIB2 The second version of the World Meterological Organization's (WMO) standard for distributing gridded data. + HRRR + `High Resolution Rapid Refresh `. The HRRR is a NOAA real-time 3-km resolution, hourly updated, cloud-resolving, convection-allowing atmospheric model, initialized by 3km grids with 3km radar assimilation. Radar data is assimilated in the HRRR every 15 min over a 1-h period adding further detail to that provided by the hourly data assimilation from the 13km radar-enhanced Rapid Refresh. + NAM `North American Mesoscale Forecast System `_. NAM generates multiple grids (or domains) of weather forecasts over the North American continent at various horizontal resolutions. Each grid contains data for dozens of weather parameters, including temperature, precipitation, lightning, and turbulent kinetic energy. NAM uses additional numerical weather models to generate high-resolution forecasts over fixed regions, and occasionally to follow significant weather events like hurricanes. NCEP National Centers for Environmental Prediction, an arm of the National Weather Service, - consisting of nine centers. More information can be found at https://www.ncep.noaa.gov. + consisting of nine centers. More information can be found at https://www.ncep.noaa.gov. NCEPLIBS The software libraries created and maintained by :term:`NCEP` that are required for running - :term:`chgres_cube`, the UFS Weather Model, and :term:`UPP`. + :term:`chgres_cube`, the UFS Weather Model, and :term:`UPP`. They are part of the HPC-Stack. NCEPLIBS-external A collection of third-party libraries required to build :term:`NCEPLIBS`, :term:`chgres_cube`, - the UFS Weather Model, and :term:`UPP`. + the UFS Weather Model, and :term:`UPP`. They are part of the HPC-Stack. NCL An interpreted programming language designed specifically for scientific data analysis and - visualization. More information can be found at https://www.ncl.ucar.edu. + visualization. Stands for NCAR Command Language. More information can be found at https://www.ncl.ucar.edu. NEMS The NOAA Environmental Modeling System is a common modeling framework whose purpose is diff --git a/docs/UsersGuide/source/Introduction.rst b/docs/UsersGuide/source/Introduction.rst index ae50c60ef1..1c834e47f4 100644 --- a/docs/UsersGuide/source/Introduction.rst +++ b/docs/UsersGuide/source/Introduction.rst @@ -6,7 +6,7 @@ Introduction The Unified Forecast System (:term:`UFS`) is a community-based, coupled, comprehensive Earth modeling system. The UFS is the source system for NOAA’s operational numerical weather prediction applications. It enables research, development, and contribution opportunities within the broader :term:`weather enterprise` (e.g. government, industry, and academia). For more information about the UFS, visit the UFS Portal at https://ufscommunity.org/. -The UFS can be configured for multiple applications (see a complete list at https://ufscommunity.org/science/aboutapps/). The configuration described here is the UFS Short-Range Weather (SRW) Application, which targets predictions of atmospheric behavior on a limited spatial domain and on time scales from minutes out to several days. The SRW Application v1.0 release includes a prognostic atmospheric model, pre- and post-processing, and a community workflow for running the system end-to-end. 
These components are documented within this User's Guide and supported through a `community forum `_. Future work will expand the capabilities of the application to include data assimilation (DA) and a verification package (e.g. METplus). This documentation provides a quick start guide for running the application, in addition to an overview of the release components, a description of the supported capabilities, and information on where to find more information and obtain support. +The UFS can be configured for multiple applications (see a complete list at https://ufscommunity.org/science/aboutapps/). The configuration described here is the UFS Short-Range Weather (SRW) Application, which targets predictions of atmospheric behavior on a limited spatial domain and on time scales from minutes out to several days. The SRW Application v2.0 release includes a prognostic atmospheric model, pre- and post-processing, and a community workflow for running the system end-to-end. These components are documented within this User's Guide and supported through a `community forum `_. Future work will expand the capabilities of the application to include data assimilation (DA) and a verification package (e.g. METplus). This documentation provides a quick start guide for running the application, in addition to an overview of the release components, a description of the supported capabilities, and information on where to find more information and obtain support. The SRW App v1.0.0 citation is as follows and should be used when presenting results based on research conducted with the App: @@ -64,7 +64,7 @@ ingests initial and lateral boundary condition files produced by :term:`chgres_c Post-processor ============== -The Unified Post Processor (:term:`UPP`) is included in the SRW Application workflow. The UPP is designed to generate useful products from raw model output. In the SRW, it converts data output formats from netCDF output on the native model grid to GRIB2 format. The UPP can also be used to compute a variety of useful diagnostic fields, as described in the `UPP user’s guide `_. Output from UPP can be used with visualization, plotting, and verification packages, or for further downstream post-processing, e.g. statistical post-processing techniques. +The Unified Post Processor (:term:`UPP`) is included in the SRW Application workflow. The UPP is designed to generate useful products from raw model output. In the SRW, it converts data output formats from netCDF output on the native model grid to GRIB2 format. The UPP can also be used to compute a variety of useful diagnostic fields, as described in the `UPP User’s Guide `_. Output from UPP can be used with visualization, plotting, and verification packages, or for further downstream post-processing, e.g. statistical post-processing techniques. Visualization Example @@ -122,7 +122,7 @@ A list of available documentation is shown in :numref:`Table %s `. +utilities, model code, and infrastructure. Users can post issues in the related GitHub repositories to report bugs or to announce upcoming contributions to the code base. For code to be accepted in the authoritative repositories, users must follow the code management rules of each component (described in the User’s Guides listed in :numref:`Table %s `. Future Direction ================ @@ -139,7 +139,6 @@ forecast implementations. Planned advancements include: In addition to the above list, other improvements will be addressed in future releases. - .. 
bibliography:: references.bib From cab1c6f12ca855442af3135c24bf7d05416200e1 Mon Sep 17 00:00:00 2001 From: gspetro Date: Mon, 14 Feb 2022 17:16:13 -0500 Subject: [PATCH 008/118] edited Rocoto workflow section of Quickstart --- docs/UsersGuide/source/Quickstart.rst | 50 +++++++++++++++++++-------- 1 file changed, 35 insertions(+), 15 deletions(-) diff --git a/docs/UsersGuide/source/Quickstart.rst b/docs/UsersGuide/source/Quickstart.rst index e6e37c8c1a..a214e52cb8 100644 --- a/docs/UsersGuide/source/Quickstart.rst +++ b/docs/UsersGuide/source/Quickstart.rst @@ -95,7 +95,7 @@ Set up the Build Environment Container Approach -------------------- -If the SRW Application has been built in an EPIC-provided Singularity container, set build environments and modules within the `ufs-srweather-app` directory as follows: +If the SRW Application has been built in a container provided by the Earth Prediction Innovation Center (EPIC), set build environments and modules within the `ufs-srweather-app` directory as follows: .. code-block:: console @@ -178,7 +178,7 @@ Make a copy of ``config.community.sh`` to get started (under `, you will have to add the line ``COMPILER="gnu"`` to the ``config.sh`` file. - + Minimum parameter settings for Level 1 machines: For Cheyenne: @@ -260,25 +260,31 @@ For all platforms, the workflow can then be generated from the ``ush`` directory ./generate_FV3LAM_wflow.sh -The generated workflow will be in ``$EXPTDIR``, where ``EXPTDIR=${EXPT_BASEDIR}/${EXPT_SUBDIR}``. A log file called ``log.generate_FV3LAM_wflow`` is generated by this step and can also be found in ``$EXPTDIR``. The settings for these paths can be found in the output from the ``./generate_FV3LAM_wflow.sh`` script. +This script creates an experiment directory and populates it with all the data needed to run through the workflow. The generated workflow will be in ``$EXPTDIR``, where ``EXPTDIR=${EXPT_BASEDIR}/${EXPT_SUBDIR}``. A log file called ``log.generate_FV3LAM_wflow`` is generated by this step and can also be found in ``$EXPTDIR``. The settings for these paths can be found in the output from the ``./generate_FV3LAM_wflow.sh`` script. +The last line of output, which starts with ``*/1 * * * * ``, can be saved and used later to automatically run portions of the workflow. The ``1`` can be replaced with any number and specifies how many minutes pass between automatic reruns of the workflow. + Run the Workflow Using Rocoto ============================= The information in this section assumes that Rocoto is available on the desired platform. If Rocoto is not available, it is still possible to run the workflow using stand-alone scripts described in :numref:`Section %s `. There are two ways to run the workflow with Rocoto: using the ``./launch_FV3LAM_wflow.sh`` script or by hand. -An environment variable may be set to navigate to the ``$EXPTDIR`` more easily. If the login shell is bash, it can be set as follows: +An environment variable can be set to navigate to the ``$EXPTDIR`` more easily. If the login shell is bash, it can be set as follows: .. code-block:: console - export EXPTDIR=/path-to-experiment/directory + export EXPTDIR=// Or if the login shell is csh/tcsh, it can be set using: .. code-block:: console - setenv EXPTDIR /path-to-experiment/directory + setenv EXPTDIR // -To run Rocoto using the script: +Launch the Rocoto Workflow Using a Script +----------------------------------------------- + +To run Rocoto using the script provided: ..
code-block:: console @@ -287,9 +293,10 @@ To run Rocoto using the script: Once the workflow is launched with the ``launch_FV3LAM_wflow.sh`` script, a log file named ``log.launch_FV3LAM_wflow`` will be created (or appended to it if it already exists) in ``EXPTDIR``. -Or to manually call Rocoto: +Launch the Rocoto Workflow Manually +--------------------------------------- -First load the Rocoto module, depending on the platform used. +Instead of running the ``./launch_FV3LAM_wflow.sh`` script, users can manually load Rocoto and any other required modules. The commands for specific Level 1 platforms are described here: For Cheyenne: @@ -337,8 +344,18 @@ For WCOSS_CRAY: module use -a /usrx/local/emc_rocoto/modulefiles module load rocoto/1.2.4 -Then manually call ``rocotorun`` to launch the tasks that have all dependencies satisfied -and ``rocotostat`` to monitor the progress: +For other systems, a variant on the following commands will be necessary: + +.. code-block:: console + + module use + module load rocoto + + +Run the Rocoto Workflow +--------------------------- + +After loading Rocoto, call ``rocotorun`` from the experiment directory to launch the workflow tasks. As the workflow progresses through its stages, ``rocotostat`` will show the state of the process and allow users to monitor progress: .. code-block:: console @@ -346,12 +363,15 @@ and ``rocotostat`` to monitor the progress: rocotorun -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10 rocotostat -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10 -For automatic resubmission of the workflow (e.g., every 3 minutes), the following line can be added -to the user's crontab (use ``crontab -e`` to edit the cron table). +Additional Options +---------------------- +For automatic resubmission of the workflow at regular intervals (e.g., every 3 minutes), the user can add a chrontab entry by entering the ``crontab -e`` command. The last line of output from the ``./generate_FV3LAM_wflow.sh``, which starts with ``*/1 * * * * ``, can be pasted into the crontab at this point. Alternatively, if users preferred to use the ``./launch_FV3LAM_wflow.sh`` to run the workflow, they can past the following command into the crontab: .. code-block:: console - */3 * * * * cd /glade/p/ral/jntp/$USER/expt_dirs/test_CONUS_25km_GFSv15p2 && ./launch_FV3LAM_wflow.sh + */3 * * * * cd && ./launch_FV3LAM_wflow.sh + +The number 3 can be changed to resubmit the workflow more or less frequently. .. note:: From 290364ec37ade1f004ca01badbc498e7d3b9b9b6 Mon Sep 17 00:00:00 2001 From: gspetro Date: Tue, 15 Feb 2022 11:42:11 -0500 Subject: [PATCH 009/118] added minor hpc submodule commits --- docs/UsersGuide/source/Quickstart.rst | 6 +++--- hpc-stack-mod | 2 +- 2 files changed, 4 insertions(+), 4 deletions(-) diff --git a/docs/UsersGuide/source/Quickstart.rst b/docs/UsersGuide/source/Quickstart.rst index a214e52cb8..0f60ba8638 100644 --- a/docs/UsersGuide/source/Quickstart.rst +++ b/docs/UsersGuide/source/Quickstart.rst @@ -348,7 +348,7 @@ For other systems, a variant on the following commands will be necessary: .. code-block:: console - module use + module use module load rocoto @@ -365,13 +365,13 @@ After loading Rocoto, call ``rocotorun`` from the experiment directory to launch Additional Options ---------------------- -For automatic resubmission of the workflow at regular intervals (e.g., every 3 minutes), the user can add a chrontab entry by entering the ``crontab -e`` command. 
The last line of output from the ``./generate_FV3LAM_wflow.sh``, which starts with ``*/1 * * * * ``, can be pasted into the crontab at this point. Alternatively, if users preferred to use the ``./launch_FV3LAM_wflow.sh`` to run the workflow, they can past the following command into the crontab: +For automatic resubmission of the workflow at regular intervals (e.g., every 3 minutes), the user can add a chrontab entry by entering the ``crontab -e`` command. This opens a crontab file. The last line of output from the ``./generate_FV3LAM_wflow.sh``, which starts with ``*/1 * * * * ``, can be pasted into the crontab file at this point. Alternatively, if users preferred to use the ``./launch_FV3LAM_wflow.sh`` to run the workflow, they can paste the following command into the crontab: .. code-block:: console */3 * * * * cd && ./launch_FV3LAM_wflow.sh -The number 3 can be changed to resubmit the workflow more or less frequently. +The number ``3`` can be changed to resubmit the workflow more or less frequently. .. note:: diff --git a/hpc-stack-mod b/hpc-stack-mod index b038f39569..9002252d5a 160000 --- a/hpc-stack-mod +++ b/hpc-stack-mod @@ -1 +1 @@ -Subproject commit b038f395699af83c533c61b24563e281ee082b28 +Subproject commit 9002252d5a39f69ff084ad2e969862557281bef3 From 80291a1b232f9827a726538f6f2566b55a72cb01 Mon Sep 17 00:00:00 2001 From: gspetro Date: Tue, 15 Feb 2022 19:49:42 -0500 Subject: [PATCH 010/118] Updates to Rocoto Workflow in Quick Start --- docs/UsersGuide/source/Glossary.rst | 2 +- docs/UsersGuide/source/Quickstart.rst | 158 +++++++++++++++----------- docs/UsersGuide/source/conf.py | 1 + 3 files changed, 91 insertions(+), 70 deletions(-) diff --git a/docs/UsersGuide/source/Glossary.rst b/docs/UsersGuide/source/Glossary.rst index 856692c801..4432edc884 100644 --- a/docs/UsersGuide/source/Glossary.rst +++ b/docs/UsersGuide/source/Glossary.rst @@ -7,7 +7,7 @@ Glossary .. glossary:: CCPP - `Common Community Physics Package `_. A forecast-model agnostic, vetted collection of codes containing atmospheric physical parameterizations and suites of parameterizations for use in Numerical Weather Prediction (NWP) along with a framework that connects the physics to the host forecast model. + The `Common Community Physics Package `_ is a forecast-model agnostic, vetted collection of codes containing atmospheric physical parameterizations and suites of parameterizations for use in Numerical Weather Prediction (NWP) along with a framework that connects the physics to the host forecast model. CONUS Continental United States diff --git a/docs/UsersGuide/source/Quickstart.rst b/docs/UsersGuide/source/Quickstart.rst index 0f60ba8638..00dfeb1278 100644 --- a/docs/UsersGuide/source/Quickstart.rst +++ b/docs/UsersGuide/source/Quickstart.rst @@ -4,13 +4,11 @@ Workflow Quick Start ==================== - - -This Workflow Quick Start Guide will help users to build and run the "out-of-the-box" case for the Unified Forecast System (UFS) Short-Range Weather (SRW) Application. The "out-of-the-box" case builds a weather forecast for June 15-16, 2019. Multiple convective weather events during these two days produced over 200 filtered storm reports. Severe weather was clustered in two areas: the Upper Midwest through the Ohio Valley and the Southern Great Plains. This forecast uses a predefined 25-km Continental United States (CONUS) grid (RRFS_CONUS_25km), the Global Forecast System (GFS) version 15.2 physics suite (FV3_GFS_v15p2 CCPP), and FV3-based GFS raw external model data for initialization. 
+This Workflow Quick Start Guide will help users to build and run the "out-of-the-box" case for the Unified Forecast System (:term:`UFS`) Short-Range Weather (SRW) Application. The "out-of-the-box" case builds a weather forecast for June 15-16, 2019. Multiple convective weather events during these two days produced over 200 filtered storm reports. Severe weather was clustered in two areas: the Upper Midwest through the Ohio Valley and the Southern Great Plains. This forecast uses a predefined 25-km Continental United States (:term:`CONUS`) grid (RRFS_CONUS_25km), the Global Forecast System (:term:`GFS`) version 15.2 physics suite (FV3_GFS_v15p2 CCPP), and :term:`FV3`-based GFS raw external model data for initialization. .. note:: - The steps described in this chapter are most applicable to preconfigured (Level 1) systems. On Level 1 systems, all of the required libraries for building community releases of UFS models and applications are available in a central location. This guide can serve as a starting point for running the SRW App on other systems as well but may require additional troubleshooting by the user. The various platform levels are listed `here `_. + The UFS defines `four platform levels `_. The steps described in this chapter are most applicable to preconfigured (Level 1) systems. On Level 1 systems, all of the required libraries for building community releases of UFS models and applications are available in a central location. This guide can serve as a starting point for running the SRW App on other systems as well but may require additional troubleshooting by the user. .. _HPCstackInfo: @@ -32,7 +30,7 @@ After completing installation, continue to the :ref:`next section Download the UFS SRW Application Code ===================================== -The SRW Application source code is publicly available on GitHub and can be run in a container or locally, depending on user preference. The SRW Application relies on a variety of components detailed in the :ref:`Components Chapter ` of this User's Guide. Users must (1) clone the UFS SRW Application umbrella repository and then (2) run the ``checkout_externals`` script to link the necessary external repositories to the SRW App. The ``checkout_externals`` script uses the configuration file ``Externals.cfg`` in the top level directory of the SRW App and will clone the correct version of the regional workflow, pre-processing utilities, UFS Weather Model, and UPP source code into the appropriate directories under your ``regional_workflow`` and ``src`` directories. +The SRW Application source code is publicly available on GitHub and can be run in a container or locally, depending on user preference. The SRW Application relies on a variety of components detailed in the :ref:`Components Chapter ` of this User's Guide. Users must (1) clone the UFS SRW Application umbrella repository and then (2) run the ``checkout_externals`` script to link the necessary external repositories to the SRW App. The ``checkout_externals`` script uses the configuration file ``Externals.cfg`` in the top level directory of the SRW App and will clone the correct version of the regional workflow, pre-processing utilities, UFS Weather Model, and UPP source code into the appropriate directories under the ``regional_workflow`` and ``src`` directories. 
Run the UFS SRW in a Singularity Container ------------------------------------------- @@ -63,6 +61,9 @@ Clone the develop branch of the UFS-SRW weather application repository: git clone https://github.com/jkbk2004/ufs-srweather-app +.. + COMMENT: This will need to be changed to release branch of the SRW repo once it exists. + Check out submodules for the SRW Application: .. code-block:: console @@ -80,7 +81,10 @@ Clone the release branch of the repository: git clone -b ufs-v1.0.0 https://github.com/ufs-community/ufs-srweather-app.git -Then, check out the submodules for the SRW Application: +.. + COMMENT: This will need to be changed to the updated release branch of the SRW repo once it exists. + +Then, run the executable that pulls in the submodules for the SRW Application: .. code-block:: console @@ -180,7 +184,14 @@ Make a copy of ``config.community.sh`` to get started (under `, which discusses each variable and the options available. Additionally, information about the three predefined Limited Area Model (LAM) Grid options can be found in the section on :doc:`Limited Area Model (LAM) Grids `. @@ -190,50 +201,30 @@ Sample settings are indicated below for Level 1 platforms. Detailed guidance app Minimum parameter settings for Level 1 machines: -For Cheyenne: +**Cheyenne:** .. code-block:: console MACHINE="cheyenne" - ACCOUNT="my_account" - EXPT_SUBDIR="my_expt_name" + ACCOUNT="" + EXPT_SUBDIR="" USE_USER_STAGED_EXTRN_FILES="TRUE" EXTRN_MDL_SOURCE_BASEDIR_ICS="/glade/p/ral/jntp/UFS_SRW_app/model_data/FV3GFS" EXTRN_MDL_SOURCE_BASEDIR_LBCS="/glade/p/ral/jntp/UFS_SRW_app/model_data/FV3GFS" -For Hera: +**Hera:** .. code-block:: console MACHINE="hera" - ACCOUNT="my_account" - EXPT_SUBDIR="my_expt_name" - -For Jet: - -.. code-block:: console - - MACHINE="jet" - ACCOUNT="my_account" - EXPT_SUBDIR="my_expt_name" - -For Orion: - -.. code-block:: console - - MACHINE="orion" - ACCOUNT="my_account" - EXPT_SUBDIR="my_expt_name" - -For Gaea: + ACCOUNT="" + EXPT_SUBDIR="" -.. code-block:: console +**Jet, Orion, Gaea:** - MACHINE="gaea" - ACCOUNT="my_account" - EXPT_SUBDIR="my_expt_name" +The settings are the same as for Hera, except that ``"hera"`` should be switched to ``"jet"``, ``"orion"``, or ``"gaea"``, respectively. -For WCOSS, edit ``config.sh`` with these WCOSS-specific parameters, and use a valid WCOSS project code for the account parameter: +For **WCOSS**, edit ``config.sh`` with these WCOSS-specific parameters, and use a valid WCOSS project code for the account parameter: .. code-block:: console @@ -241,33 +232,37 @@ For WCOSS, edit ``config.sh`` with these WCOSS-specific parameters, and use a va ACCOUNT="my_account" EXPT_SUBDIR="my_expt_name" + .. _SetUpPythonEnv: Set up the Python and other Environment Parameters -------------------------------------------------- -Next, it is necessary to load the appropriate Python environment for the workflow. The workflow requires Python 3, with the packages 'PyYAML', 'Jinja2', and 'f90nml' available. This Python environment has already been set up on Level 1 platforms, and it can be activated in the following way (from ``/ufs-srweather-app/regional_workflow/ush``): +Next, load the appropriate Python environment for the workflow. The workflow requires Python 3, with the packages 'PyYAML', 'Jinja2', and 'f90nml' available. This Python environment has already been set up on Level 1 platforms, and it can be activated in the following way (from ``/ufs-srweather-app/regional_workflow/ush``): .. 
code-block:: console source ../../env/wflow_.env -Run the ``generate_FV3LAM_wflow.sh`` script +.. _GenerateWorkflow:: + +Generate the Regional Workflow ------------------------------------------- -For all platforms, the workflow can then be generated from the ``ush`` directory with the command: +First, activate the regional workflow from the ``ush`` directory: .. code-block:: console - ./generate_FV3LAM_wflow.sh + conda activate regional_workflow -This script creates an experiment directory and populates it will all the data needed to run through the workflow. The generated workflow will be in ``$EXPTDIR``, where ``EXPTDIR=${EXPT_BASEDIR}/${EXPT_SUBDIR}``. A log file called ``log.generate_FV3LAM_wflow`` is generated by this step and can also be found in ``$EXPTDIR``. The settings for these paths can be found in the output from the ``./generate_FV3LAM_wflow.sh`` script. +Then, run the following command to generate the workflow: + +.. code-block:: console -The last line of output, which starts with ``*/1 * * * * ``, can be saved and used later to automatically run portions of the workflow. The ``1`` could be any number and simply refers to the frequency of the reruns. + ./generate_FV3LAM_wflow.sh +The last line of output from this script, starting with ``*/1 * * * * ``, can be saved and `used later ` to automatically run portions of the workflow. -Run the Workflow Using Rocoto -============================= -The information in this section assumes that Rocoto is available on the desired platform. If Rocoto is not available, it is still possible to run the workflow using stand-alone scripts described in :numref:`Section %s `. There are two ways you can run the workflow with Rocoto using either the ``./launch_FV3LAM_wflow.sh`` or by hand. +This workflow generation script creates an experiment directory and populates it with all the data needed to run through the workflow. The generated workflow will be in ``$EXPTDIR``, where ``EXPTDIR=${EXPT_BASEDIR}/${EXPT_SUBDIR}``. These variables were specified in the ``config.sh`` file in `Step %s `. The settings for these paths can also be viewed in the console output from the ``./generate_FV3LAM_wflow.sh`` script or in the ``log.generate_FV3LAM_wflow`` file, which can be found in $EXPTDIR. An environment variable can be set to navigate to the ``$EXPTDIR`` more easily. If the login shell is bash, it can be set as follows: @@ -275,11 +270,12 @@ An environment variable can be set to navigate to the ``$EXPTDIR`` more easily. export EXPTDIR=// -Or if the login shell is csh/tcsh, it can be set using: +If the login shell is csh/tcsh, replace ``export`` with ``setenv`` in the command above. -.. code-block:: console - setenv EXPTDIR // +Run the Workflow Using Rocoto +============================= +The information in this section assumes that Rocoto is available on the desired platform. If Rocoto is not available, it is still possible to run the workflow using stand-alone scripts described in :numref:`Section %s `. There are two main ways to run the workflow with Rocoto: using the ``./launch_FV3LAM_wflow.sh`` or by hand. Launch the Rocoto Workflow Using a Script ----------------------------------------------- @@ -291,42 +287,60 @@ To run Rocoto using the script provided: cd $EXPTDIR ./launch_FV3LAM_wflow.sh -Once the workflow is launched with the ``launch_FV3LAM_wflow.sh`` script, a log file named ``log.launch_FV3LAM_wflow`` will be created (or appended to it if it already exists) in ``EXPTDIR``. 
+Once the workflow is launched with the ``launch_FV3LAM_wflow.sh`` script, a log file named ``log.launch_FV3LAM_wflow`` will be created (or appended) in the ``EXPTDIR``. Check the end of the log file periodically to see how the experiment is progressing: + +.. code-block:: console + + cd $EXPTDIR + vi ``log.launch_FV3LAM_wflow`` + Launch the Rocoto Workflow Manually --------------------------------------- -Instead of running the ``./launch_FV3LAM_wflow.sh`` script, users can manually load Rocoto and any other required modules. The commands for specific Level 1 platforms are described here: +Load Rocoto +^^^^^^^^^^^^^^^^ + +Instead of running the ``./launch_FV3LAM_wflow.sh`` script, users can manually load Rocoto and any other required modules. This gives the user more control over the process and allows them to view experiment progress more easily. + +For most systems, a variant on the following commands will be necessary to load the Rocoto module: + +.. code-block:: console + + module use + module load rocoto -For Cheyenne: +The commands for specific Level 1 platforms are described here: + +Cheyenne: .. code-block:: console module use -a /glade/p/ral/jntp/UFS_SRW_app/modules/ module load rocoto -For Hera or Jet: +Hera and Jet: .. code-block:: console module purge module load rocoto -For Orion: +Orion: .. code-block:: console module purge module load contrib rocoto -For Gaea: +Gaea: .. code-block:: console module use /lustre/f2/pdata/esrl/gsd/contrib/modulefiles module load rocoto/1.3.3 -For WCOSS_DELL_P3: +WCOSS_DELL_P3: .. code-block:: console @@ -335,7 +349,7 @@ For WCOSS_DELL_P3: module use /gpfs/dell3/usrx/local/dev/emc_rocoto/modulefiles/ module load ruby/2.5.1 rocoto/1.2.4 -For WCOSS_CRAY: +WCOSS_CRAY: .. code-block:: console @@ -344,18 +358,11 @@ For WCOSS_CRAY: module use -a /usrx/local/emc_rocoto/modulefiles module load rocoto/1.2.4 -For other systems, a variant on the following commands will be necessary: - -.. code-block:: console - - module use - module load rocoto - Run the Rocoto Workflow ---------------------------- +^^^^^^^^^^^^^^^^^^^^^^^^^^ -After loading Rocoto, call ``rocotorun`` from the experiment directory to launch the workflow tasks. As the workflow progresses through its stages, ``rocotostat`` will show the state of the process and allow users to monitor progress: +After loading Rocoto, call ``rocotorun`` from the experiment directory to launch the workflow tasks. This will start any tasks that do not have a dependency. As the workflow progresses through its stages, ``rocotostat`` will show the state of each task and allow users to monitor progress: .. code-block:: console @@ -363,19 +370,32 @@ After loading Rocoto, call ``rocotorun`` from the experiment directory to launch rocotorun -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10 rocotostat -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10 +The ``rocotorun`` and ``rocotostat`` commands will need to be resubmitted regularly and repeatedlyuntil the experiment is finished. In part, this is to avoid having the system time out. This also ensures that when one task ends, tasks dependent on it will run as soon as possible, and ``rocotostat`` will capture the new progress. + +If the experiment fails, the ``rocotostat`` command will indicate which task failed. Users can look at the log file in the ``log`` subdirectory for the failed task to determine what caused the failure. For example, if the ``make_grid`` task failed: + +.. 
code-block:: console + + cd $EXPTDIR/log + vi make_grid.log + +If users have the `Slurm workload manager `_ on their system, they can run the ``squeue`` command in lieu of ``rocotostat`` to check what jobs are currently running. + +.. _AdditionalOptions:: + Additional Options ---------------------- -For automatic resubmission of the workflow at regular intervals (e.g., every 3 minutes), the user can add a chrontab entry by entering the ``crontab -e`` command. This opens a crontab file. The last line of output from the ``./generate_FV3LAM_wflow.sh``, which starts with ``*/1 * * * * ``, can be pasted into the crontab file at this point. Alternatively, if users preferred to use the ``./launch_FV3LAM_wflow.sh`` to run the workflow, they can paste the following command into the crontab: +For automatic resubmission of the workflow at regular intervals (e.g., every minute), the user can add a crontab entry by entering the ``crontab -e`` command, which opens a crontab file. As mentioned in `Section %s `, the last line of output from ``./generate_FV3LAM_wflow.sh`` (starting with ``*/1 * * * * ``), can be pasted into the crontab file. It can also be found in the``$EXPTDIR/log.generate_FV3LAM_wflow`` file. The crontab entry should resemble the following: .. code-block:: console - */3 * * * * cd && ./launch_FV3LAM_wflow.sh + */1 * * * * cd && /apps/rocoto/1.3.3/bin/rocotorun -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10 -The number ``3`` can be changed to resubmit the workflow more or less frequently. +where is changed to correspond to the user's machine, and "/apps/rocoto/1.3.3/bin/rocotorun" corresponds to the location of the ``rocotorun`` command on the user's system. The number ``1`` can also be changed and simply means that the workflow will be resubmitted every minute. .. note:: - Currently cron is only available on the orion-login-1 node, so please use that node. + On Orion, cron is only available on the orion-login-1 node, so please use that node when running cron jobs on Orion. The workflow run is completed when all tasks have “SUCCEEDED”, and the rocotostat command will output the following: diff --git a/docs/UsersGuide/source/conf.py b/docs/UsersGuide/source/conf.py index 8e43ea2c75..3d7e48c137 100644 --- a/docs/UsersGuide/source/conf.py +++ b/docs/UsersGuide/source/conf.py @@ -56,6 +56,7 @@ ] bibtex_bibfiles = ['references.bib'] +#bibtex_bibfiles = ['refs.bib'] # Add any paths that contain templates here, relative to this directory. templates_path = ['_templates'] From ab97b749d009ee7eb8571a4b4f24ddf6d340ea2f Mon Sep 17 00:00:00 2001 From: gspetro Date: Wed, 16 Feb 2022 16:59:15 -0500 Subject: [PATCH 011/118] add to HPC-stack intro --- docs/UsersGuide/source/Quickstart.rst | 3 ++- 1 file changed, 2 insertions(+), 1 deletion(-) diff --git a/docs/UsersGuide/source/Quickstart.rst b/docs/UsersGuide/source/Quickstart.rst index 00dfeb1278..4e08b02cf7 100644 --- a/docs/UsersGuide/source/Quickstart.rst +++ b/docs/UsersGuide/source/Quickstart.rst @@ -19,6 +19,7 @@ Install the HPC-Stack .. note:: Skip the HPC-stack installation if working on a `Level 1 system `_ (e.g., Cheyenne, Hera, Orion). +The UFS Weather Model draws on over 50 code libraries to run its applications. These libraries range from libraries developed in-house at NOAA (e.g. NCEPLIBS, FMS, etc.) to libraries developed by NOAA's partners (e.g. PIO, ESMF etc) to truly third party libraries (e.g. NETCDF). 
Individual installation of these libraries is not practical, so the `HPC-Stack `_ was developed as a central installation system to ensure that the infrastructure environment across multiple platforms is as similar as possible. Installation of the HPC-Stack is required to run the SRW. .. include:: ../../../hpc-stack-mod/docs/source/hpc-intro-text.rst @@ -370,7 +371,7 @@ After loading Rocoto, call ``rocotorun`` from the experiment directory to launch rocotorun -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10 rocotostat -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10 -The ``rocotorun`` and ``rocotostat`` commands will need to be resubmitted regularly and repeatedlyuntil the experiment is finished. In part, this is to avoid having the system time out. This also ensures that when one task ends, tasks dependent on it will run as soon as possible, and ``rocotostat`` will capture the new progress. +The ``rocotorun`` and ``rocotostat`` commands will need to be resubmitted regularly and repeatedly until the experiment is finished. In part, this is to avoid having the system time out. This also ensures that when one task ends, tasks dependent on it will run as soon as possible, and ``rocotostat`` will capture the new progress. If the experiment fails, the ``rocotostat`` command will indicate which task failed. Users can look at the log file in the ``log`` subdirectory for the failed task to determine what caused the failure. For example, if the ``make_grid`` task failed: From 805620031b13a10d4210771a128381187010919d Mon Sep 17 00:00:00 2001 From: gspetro Date: Wed, 16 Feb 2022 17:08:56 -0500 Subject: [PATCH 012/118] submodule updates --- hpc-stack-mod | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/hpc-stack-mod b/hpc-stack-mod index 9002252d5a..b2e87e1d5e 160000 --- a/hpc-stack-mod +++ b/hpc-stack-mod @@ -1 +1 @@ -Subproject commit 9002252d5a39f69ff084ad2e969862557281bef3 +Subproject commit b2e87e1d5ecc6208903cb00e715b6ed25000d41c From 17504fcfe21ca4b581f4b82ffcc433b748884da6 Mon Sep 17 00:00:00 2001 From: gspetro Date: Thu, 17 Feb 2022 14:54:51 -0500 Subject: [PATCH 013/118] added submodule docs edits --- docs/UsersGuide/build/.gitignore | 4 ---- docs/UsersGuide/source/Quickstart.rst | 22 ++++++++++++++++++++-- docs/UsersGuide/source/conf.py | 6 +++--- hpc-stack-mod | 2 +- 4 files changed, 24 insertions(+), 10 deletions(-) delete mode 100644 docs/UsersGuide/build/.gitignore diff --git a/docs/UsersGuide/build/.gitignore b/docs/UsersGuide/build/.gitignore deleted file mode 100644 index 5e7d2734cf..0000000000 --- a/docs/UsersGuide/build/.gitignore +++ /dev/null @@ -1,4 +0,0 @@ -# Ignore everything in this directory -* -# Except this file -!.gitignore diff --git a/docs/UsersGuide/source/Quickstart.rst b/docs/UsersGuide/source/Quickstart.rst index 4e08b02cf7..09fd88fc29 100644 --- a/docs/UsersGuide/source/Quickstart.rst +++ b/docs/UsersGuide/source/Quickstart.rst @@ -295,6 +295,14 @@ Once the workflow is launched with the ``launch_FV3LAM_wflow.sh`` script, a log cd $EXPTDIR vi ``log.launch_FV3LAM_wflow`` +Alternatively, to (re)launch the workflow and check its progress on a single line: + +.. code-block:: console + + ./launch_FV3LAM_wflow.sh; tail -n 40 log.launch_FV3LAM_wflow + +This will output the last 40 lines of the log file. The number 40 can be changed according to the user's preferences. 
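
Another option, offered here only as a generic shell sketch rather than an SRW-specific step, is to follow the same log continuously while the experiment runs and stop watching with ``Ctrl+C``:

.. code-block:: console

   # Follow log.launch_FV3LAM_wflow as new lines are appended (press Ctrl+C to exit)
   tail -f log.launch_FV3LAM_wflow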
+ Launch the Rocoto Workflow Manually --------------------------------------- @@ -380,7 +388,9 @@ If the experiment fails, the ``rocotostat`` command will indicate which task fai cd $EXPTDIR/log vi make_grid.log -If users have the `Slurm workload manager `_ on their system, they can run the ``squeue`` command in lieu of ``rocotostat`` to check what jobs are currently running. +.. note:: + + If users have the `Slurm workload manager `_ on their system, they can run the ``squeue`` command in lieu of ``rocotostat`` to check what jobs are currently running. .. _AdditionalOptions:: @@ -390,10 +400,18 @@ For automatic resubmission of the workflow at regular intervals (e.g., every min .. code-block:: console - */1 * * * * cd && /apps/rocoto/1.3.3/bin/rocotorun -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10 + */1 * * * * cd && /apps/rocoto/1.3.3/bin/rocotorun -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10 where is changed to correspond to the user's machine, and "/apps/rocoto/1.3.3/bin/rocotorun" corresponds to the location of the ``rocotorun`` command on the user's system. The number ``1`` can also be changed and simply means that the workflow will be resubmitted every minute. +Then, check the experiment progress with: + +.. code-block:: console + + cd $EXPTDIR + rocotostat -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10 + + .. note:: On Orion, cron is only available on the orion-login-1 node, so please use that node when running cron jobs on Orion. diff --git a/docs/UsersGuide/source/conf.py b/docs/UsersGuide/source/conf.py index 3d7e48c137..93f6bcb9a5 100644 --- a/docs/UsersGuide/source/conf.py +++ b/docs/UsersGuide/source/conf.py @@ -107,9 +107,9 @@ html_static_path = ['_static'] html_context = { - 'css_files': [ - '_static/theme_overrides.css', # override wide tables in RTD theme - ], + # 'css_files': [ + # '_static/theme_overrides.css', # override wide tables in RTD theme +# ], } def setup(app): diff --git a/hpc-stack-mod b/hpc-stack-mod index b2e87e1d5e..485e7c6280 160000 --- a/hpc-stack-mod +++ b/hpc-stack-mod @@ -1 +1 @@ -Subproject commit b2e87e1d5ecc6208903cb00e715b6ed25000d41c +Subproject commit 485e7c62808566e35cfaf37ba850c08cab4c1683 From 357e151213231aa0691cfc511c348d91eda94d2f Mon Sep 17 00:00:00 2001 From: gspetro Date: Thu, 17 Feb 2022 18:22:46 -0500 Subject: [PATCH 014/118] hpc-stack updates & formatting fixes --- docs/UsersGuide/source/CodeReposAndDirs.rst | 4 ++-- docs/UsersGuide/source/Components.rst | 8 +++---- docs/UsersGuide/source/ConfigNewPlatform.rst | 2 +- docs/UsersGuide/source/Glossary.rst | 9 ++++++++ docs/UsersGuide/source/InputOutputFiles.rst | 22 ++++++++++---------- docs/UsersGuide/source/Introduction.rst | 4 ++-- docs/UsersGuide/source/Quickstart.rst | 21 ++++++++++--------- docs/UsersGuide/source/conf.py | 8 ++----- docs/UsersGuide/source/index.rst | 2 -- hpc-stack-mod | 2 +- 10 files changed, 42 insertions(+), 40 deletions(-) diff --git a/docs/UsersGuide/source/CodeReposAndDirs.rst b/docs/UsersGuide/source/CodeReposAndDirs.rst index e52f5512c0..3031f84573 100644 --- a/docs/UsersGuide/source/CodeReposAndDirs.rst +++ b/docs/UsersGuide/source/CodeReposAndDirs.rst @@ -42,7 +42,7 @@ repositories associated with this umbrella repo (see :numref:`Table %s `_. +documented `here `__. Note that the prerequisite libraries (including NCEP Libraries and external libraries) are not included in the UFS SRW Application repository. The source code for these components resides in @@ -50,7 +50,7 @@ the repositories `NCEPLIBS `_ and `NCEPLIB `_. 
These external components are already built on the preconfigured platforms listed `here -`_. +`__. However, they must be cloned and built on other platforms according to the instructions provided in the wiki pages of those repositories: https://github.com/NOAA-EMC/NCEPLIBS/wiki and https://github.com/NOAA-EMC/NCEPLIBS-external/wiki. diff --git a/docs/UsersGuide/source/Components.rst b/docs/UsersGuide/source/Components.rst index cf5180ba40..9562242a59 100644 --- a/docs/UsersGuide/source/Components.rst +++ b/docs/UsersGuide/source/Components.rst @@ -31,11 +31,11 @@ Forecast Model ============== The prognostic atmospheric model in the UFS SRW Application is the Finite-Volume Cubed-Sphere -(:term:`FV3`) dynamical core configured with a Limited Area Model (:term:`LAM`) capability :cite:`BlackEtAl2020`. The dynamical core is the computational part of a model that solves the equations of fluid motion. A User’s Guide for the UFS :term:`Weather Model` is `here `_. +(:term:`FV3`) dynamical core configured with a Limited Area Model (:term:`LAM`) capability :cite:`BlackEtAl2020`. The dynamical core is the computational part of a model that solves the equations of fluid motion. A User’s Guide for the UFS :term:`Weather Model` is `here `__. -Supported model resolutions in this release include 3-, 13-, and 25-km predefined Contiguous U.S. (:term:`CONUS`) domains, each with 64 vertical levels. Preliminary tools for users to define their own domain are also available in the release with full, formal support of these tools to be provided in future releases. The Extended Schmidt Gnomonic (ESG) grid is used with the FV3-LAM, which features relatively uniform grid cells across the entirety of the domain. Additional information about the FV3 dynamical core can be found `here `_ and on the `NOAA Geophysical Fluid Dynamics Laboratory website `_. +Supported model resolutions in this release include 3-, 13-, and 25-km predefined Contiguous U.S. (:term:`CONUS`) domains, each with 64 vertical levels. Preliminary tools for users to define their own domain are also available in the release with full, formal support of these tools to be provided in future releases. The Extended Schmidt Gnomonic (ESG) grid is used with the FV3-LAM, which features relatively uniform grid cells across the entirety of the domain. Additional information about the FV3 dynamical core can be found `here `__ and on the `NOAA Geophysical Fluid Dynamics Laboratory website `_. -Interoperable atmospheric physics, along with the Noah Multi-parameterization (Noah MP) Land Surface Model options, are supported through the Common Community Physics Package (:term:`CCPP`; described `here `_).Atmospheric physics are a set of numerical methods describing small-scale processes such as clouds, turbulence, radiation, and their interactions. There are two physics options supported for the release. The first is an experimental physics suite being tested for use in the future operational implementation of the Rapid Refresh Forecast System (RRFS) planned for 2023-2024, and the second is an updated version of the physics suite used in the operational Global Forecast System (GFS) v15. A scientific description of the CCPP parameterizations and suites can be found in the `CCPP Scientific Documentation `_, and CCPP technical aspects are described in the `CCPP Technical Documentation `_. 
The model namelist has many settings beyond the physics options that can optimize various aspects of the model for use with each +Interoperable atmospheric physics, along with the Noah Multi-parameterization (Noah MP) Land Surface Model options, are supported through the Common Community Physics Package (:term:`CCPP`; described `here `__).Atmospheric physics are a set of numerical methods describing small-scale processes such as clouds, turbulence, radiation, and their interactions. There are two physics options supported for the release. The first is an experimental physics suite being tested for use in the future operational implementation of the Rapid Refresh Forecast System (RRFS) planned for 2023-2024, and the second is an updated version of the physics suite used in the operational Global Forecast System (GFS) v15. A scientific description of the CCPP parameterizations and suites can be found in the `CCPP Scientific Documentation `_, and CCPP technical aspects are described in the `CCPP Technical Documentation `_. The model namelist has many settings beyond the physics options that can optimize various aspects of the model for use with each of the supported suites. The SRW App supports the use of both :term:`GRIB2` and :term:`NEMSIO` input data. The UFS Weather Model @@ -93,5 +93,3 @@ described in the Quick Start (:numref:`Section %s `). A few additional computational platforms are considered configurable for the SRW Application release. Configurable platforms (Level 2) are platforms where all of the required libraries for building the SRW Application are expected to install successfully but are not available in a central location. Applications and models are expected to build and run once the required bundled libraries (e.g., NCEPLIBS) and third-party libraries (e.g., NCEPLIBS-external) are built. Limited-Test (Level 3) and Build-Only (Level 4) computational platforms are those in which the developers have built the code but little or no pre-release testing has been conducted, respectively. A complete description of the levels of support, along with a list of preconfigured and configurable platforms can be found in the `SRW Application wiki page `_. - -.. bibliography:: references.bib \ No newline at end of file diff --git a/docs/UsersGuide/source/ConfigNewPlatform.rst b/docs/UsersGuide/source/ConfigNewPlatform.rst index 798a287b29..4d972d7e31 100644 --- a/docs/UsersGuide/source/ConfigNewPlatform.rst +++ b/docs/UsersGuide/source/ConfigNewPlatform.rst @@ -4,7 +4,7 @@ Configuring a New Platform ========================== -The UFS SRW Application has been designed to work primarily on a number of Level 1 and 2 support platforms, as specified `here `_. However, it is also designed with flexibility in mind, so that any sufficiently up-to-date machine with a UNIX-based operating system should be capable of running the application. A full list of prerequisites for installing the UFS SRW App and running the Graduate Student Test can be found in :numref:`Section %s `. +The UFS SRW Application has been designed to work primarily on a number of Level 1 and 2 support platforms, as specified `here `__. However, it is also designed with flexibility in mind, so that any sufficiently up-to-date machine with a UNIX-based operating system should be capable of running the application. A full list of prerequisites for installing the UFS SRW App and running the Graduate Student Test can be found in :numref:`Section %s `. 
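
Before beginning an installation on a new machine, it can also be worth confirming that a Fortran/C compiler tool chain, CMake, and Python 3 are already present. The commands below are a generic illustration only; the tool names and versions vary by system and are not requirements imposed by the SRW App itself:

.. code-block:: console

   # Quick sanity check of baseline tools on a new machine (illustrative only)
   gcc --version
   gfortran --version
   cmake --version
   python3 --version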
The first step to installing on a new machine is to install :term:`NCEPLIBS` (https://github.com/NOAA-EMC/NCEPLIBS), the NCEP libraries package, which is a set of libraries created and maintained by NCEP and EMC that are used in many parts of the UFS. NCEPLIBS comes with a large number of prerequisites (see :numref:`Section %s ` for more info), but the only required software prior to starting the installation process are as follows: diff --git a/docs/UsersGuide/source/Glossary.rst b/docs/UsersGuide/source/Glossary.rst index 4432edc884..1cd7cb8e5d 100644 --- a/docs/UsersGuide/source/Glossary.rst +++ b/docs/UsersGuide/source/Glossary.rst @@ -16,6 +16,9 @@ Glossary The preprocessing software used to create initial and boundary condition files to “coldstart” the forecast model. + dynamical core + Global atmospheric model based on fluid dynamics principles, including Euler's equations of motion. + FV3 The Finite-Volume Cubed-Sphere dynamical core (dycore). Developed at NOAA's Geophysical Fluid Dynamics Laboratory (GFDL), it is a scalable and flexible dycore capable of both @@ -28,9 +31,15 @@ Glossary GRIB2 The second version of the World Meterological Organization's (WMO) standard for distributing gridded data. + HPC-Stack + The `HPC-stack `__ is a repository that provides a unified, shell script-based build system for building the software stack required for numerical weather prediction (NWP) tools such as the `Unified Forecast System (UFS) `__ and the `Joint Effort for Data assimilation Integration (JEDI) `__ framework. + HRRR `High Resolution Rapid Refresh `. The HRRR is a NOAA real-time 3-km resolution, hourly updated, cloud-resolving, convection-allowing atmospheric model, initialized by 3km grids with 3km radar assimilation. Radar data is assimilated in the HRRR every 15 min over a 1-h period adding further detail to that provided by the hourly data assimilation from the 13km radar-enhanced Rapid Refresh. + LAM + Limited Area Model. LAM grids use a regional (rather than global) configuration of the FV3 dynamical core. + NAM `North American Mesoscale Forecast System `_. NAM generates multiple grids (or domains) of weather forecasts over the North American continent at various horizontal resolutions. Each grid contains data for dozens of weather parameters, including temperature, precipitation, lightning, and turbulent kinetic energy. NAM uses additional numerical weather models to generate high-resolution forecasts over fixed regions, and occasionally to follow significant weather events like hurricanes. diff --git a/docs/UsersGuide/source/InputOutputFiles.rst b/docs/UsersGuide/source/InputOutputFiles.rst index 2cce0786d2..a103de9131 100644 --- a/docs/UsersGuide/source/InputOutputFiles.rst +++ b/docs/UsersGuide/source/InputOutputFiles.rst @@ -31,7 +31,7 @@ from a location on disk to your experiment directory by the workflow generation pre-processing utilities use many different datasets to create grids, and to generate model input datasets from the external model files. A detailed description of the input files for the pre-processing utilities can be found `here -`_. +`__. UFS Weather Model ----------------- @@ -41,14 +41,14 @@ must be staged by the user unless you are running on a pre-configured platform, you can link to the existing copy on that machine. See :numref:`Section %s ` for more information. The static, grid, and date specific files are linked in the experiment directory by the workflow scripts. 
An extensive description of the input files for the weather -model can be found in the `UFS Weather Model User's Guide `_. +model can be found in the `UFS Weather Model User's Guide `__. The namelists and configuration files for the SRW Application are created from templates by the workflow, as described in :numref:`Section %s `. Unified Post Processor (UPP) ---------------------------- Documentation for the UPP input files can be found in the `UPP User's Guide -`_. +`__. .. _WorkflowTemplates: @@ -110,7 +110,7 @@ and are shown in :numref:`Table %s `. Additional information related to the ``diag_table_[CCPP]``, ``field_table_[CCPP]``, ``input.nml.FV3``, ``model_conigure``, and ``nems.configure`` can be found in the `UFS Weather Model User's Guide -`_, +`__, while information on the ``regional_grid.nml`` can be found in the `UFS_UTILS User’s Guide `_. @@ -162,7 +162,7 @@ experiment run directory ``EXPTDIR/YYYYMMDDHH/INPUT`` and consist of the followi * ``sfc_data.nc -> sfc_data.tile7.halo0.nc`` These output files are used as inputs for the UFS weather model, and are described in the `Users Guide -`_. +`__. UFS Weather Model ----------------- @@ -182,11 +182,11 @@ the file names are specified in the input file ``model_configure`` and are set t * ``phyfHHH.nc`` Additional details may be found in the UFS Weather Model `Users Guide -`_. +`__. Unified Post Processor (UPP) ---------------------------- -Documentation for the UPP output files can be found `here `_. +Documentation for the UPP output files can be found `here `__. For the SRW Application, the weather model netCDF output files are written to the ``EXPTDIR/YYYYMMDDHH/postprd`` directory and have the naming convention (file->linked to): @@ -205,7 +205,7 @@ located in ``ufs-srweather-app/src/UPP/parm``. .. note:: This process requires advanced knowledge of which fields can be output for the UFS Weather Model. -Use the directions in the `UPP User's Guide `_ +Use the directions in the `UPP User's Guide `__ for details on how to make modifications to the ``fv3lam.xml`` file and for remaking the flat text file that the UPP reads, which is called ``postxconfig-NT-fv3lam.txt`` (default). @@ -240,8 +240,8 @@ where the static files are located. If you are on a pre-configured or configurab need to stage the fixed files manually because they have been prestaged and the paths are set in ``regional_workflow/ush/setup.sh``. If the user's platform is not defined in that file, the static files can be pulled individually or as a full tar file from the `FTP data repository -`_ or from `Amazon Web Services (AWS) cloud storage -`_ +`__ or from `Amazon Web Services (AWS) cloud storage +`__ and staged on your machine. The paths to the staged files must then be set in ``config.sh`` as follows: @@ -268,7 +268,7 @@ not have access to the NOAA HPSS and you need to pull and stage the data manuall set ``USE_USER_STAGED_EXTRN_FILES`` to ``TRUE`` and then set the paths to the where the IC/LBC files are located. A small sample of IC/LBCs is available at the `FTP data repository -`_ or from `AWS cloud storage +`__ or from `AWS cloud storage `_. 
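
As an illustrative sketch, using the same variable names shown in the Quick Start chapter's sample configurations (the paths below are placeholders, not real locations), the user-staged case in ``config.sh`` might look like:

.. code-block:: console

   # Point the workflow at manually staged IC/LBC files (paths are placeholders)
   USE_USER_STAGED_EXTRN_FILES="TRUE"
   EXTRN_MDL_SOURCE_BASEDIR_ICS="/path/to/staged/model_data/FV3GFS"
   EXTRN_MDL_SOURCE_BASEDIR_LBCS="/path/to/staged/model_data/FV3GFS"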
Initial and Lateral Boundary Condition Organization diff --git a/docs/UsersGuide/source/Introduction.rst b/docs/UsersGuide/source/Introduction.rst index 1c834e47f4..f7c84f02a1 100644 --- a/docs/UsersGuide/source/Introduction.rst +++ b/docs/UsersGuide/source/Introduction.rst @@ -47,7 +47,7 @@ Atmospheric Model The prognostic atmospheric model in the UFS SRW Application is the Finite-Volume Cubed-Sphere (:term:`FV3`) dynamical core configured with a Limited Area Model (LAM) capability :cite:`BlackEtAl2020`. -The dynamical core is the computational part of a model that solves the equations of fluid motion. A User’s Guide for the UFS :term:`Weather Model` can be found `here `_. +The dynamical core is the computational part of a model that solves the equations of fluid motion. A User’s Guide for the UFS :term:`Weather Model` can be found `here `__. Common Community Physics Package --------------------------------- @@ -78,7 +78,7 @@ Build System and Workflow The SRW Application has a portable CMake-based build system that packages together all the components required to build the SRW Application. Once built, users can generate a Rocoto-based workflow that will run each task in the proper sequence (see `Rocoto documentation `_). Individual components can also be run in a stand-alone, command line fashion. The SRW Application allows for configuration of various elements of the workflow. For example, users can modify the parameters of the atmospheric model, such as start and end dates, duration, time step, and the physics suite for the forecast. This SRW Application release has been tested on a variety of platforms widely used by researchers, including NOAA High-Performance Comuting (HPC) systems (e.g. Hera, Orion), cloud environments, and generic Linux and macOS systems. Four `levels of support `_ have been defined for the SRW Application, including pre-configured (Level 1), configurable (Level 2), limited test platforms (Level 3), and build-only platforms (Level 4). Preconfigured (Level 1) systems already have the required external libraries (e.g., NCEPLIBS) available in a central location. The SRW Application is expected to build and run out-of-the-box on these systems, and users can proceed directly to using the workflow, as -described in the Quick Start Guide (:numref:`Section %s <_GenerateForecast>`). On other platforms, the required libraries will need to be installed via the HPC-Stack. Once these libraries are built, applications and models should build and run successfully. However, users may need to perform additional troubleshooting on Level 3 or 4 systems since little or no pre-release testing has been conducted on these systems. +described in the Quick Start Guide (:numref:`Section %s `). On other platforms, the required libraries will need to be installed via the HPC-Stack. Once these libraries are built, applications and models should build and run successfully. However, users may need to perform additional troubleshooting on Level 3 or 4 systems since little or no pre-release testing has been conducted on these systems. 
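
As a rough sketch of what such workflow configuration looks like in practice, a handful of experiment settings control the forecast dates, length, grid, and physics suite. The variable names below follow the sample community configuration distributed with the App, but they are shown here as an illustration rather than an authoritative listing:

.. code-block:: console

   # Illustrative excerpt of an experiment configuration (values are examples only)
   PREDEF_GRID_NAME="RRFS_CONUS_25km"
   CCPP_PHYS_SUITE="FV3_GFS_v15p2"
   DATE_FIRST_CYCL="20190615"
   DATE_LAST_CYCL="20190615"
   FCST_LEN_HRS="48"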
User Support, Documentation, and Contributing Development ========================================================= diff --git a/docs/UsersGuide/source/Quickstart.rst b/docs/UsersGuide/source/Quickstart.rst index 09fd88fc29..865c863db1 100644 --- a/docs/UsersGuide/source/Quickstart.rst +++ b/docs/UsersGuide/source/Quickstart.rst @@ -6,7 +6,7 @@ Workflow Quick Start This Workflow Quick Start Guide will help users to build and run the "out-of-the-box" case for the Unified Forecast System (:term:`UFS`) Short-Range Weather (SRW) Application. The "out-of-the-box" case builds a weather forecast for June 15-16, 2019. Multiple convective weather events during these two days produced over 200 filtered storm reports. Severe weather was clustered in two areas: the Upper Midwest through the Ohio Valley and the Southern Great Plains. This forecast uses a predefined 25-km Continental United States (:term:`CONUS`) grid (RRFS_CONUS_25km), the Global Forecast System (:term:`GFS`) version 15.2 physics suite (FV3_GFS_v15p2 CCPP), and :term:`FV3`-based GFS raw external model data for initialization. -.. note:: +.. attention:: The UFS defines `four platform levels `_. The steps described in this chapter are most applicable to preconfigured (Level 1) systems. On Level 1 systems, all of the required libraries for building community releases of UFS models and applications are available in a central location. This guide can serve as a starting point for running the SRW App on other systems as well but may require additional troubleshooting by the user. @@ -16,10 +16,10 @@ This Workflow Quick Start Guide will help users to build and run the "out-of-the Install the HPC-Stack ======================== -.. note:: +.. Hint:: Skip the HPC-stack installation if working on a `Level 1 system `_ (e.g., Cheyenne, Hera, Orion). -The UFS Weather Model draws on over 50 code libraries to run its applications. These libraries range from libraries developed in-house at NOAA (e.g. NCEPLIBS, FMS, etc.) to libraries developed by NOAA's partners (e.g. PIO, ESMF etc) to truly third party libraries (e.g. NETCDF). Individual installation of these libraries is not practical, so the `HPC-Stack `_ was developed as a central installation system to ensure that the infrastructure environment across multiple platforms is as similar as possible. Installation of the HPC-Stack is required to run the SRW. +The UFS Weather Model draws on over 50 code libraries to run its applications. These libraries range from libraries developed in-house at NOAA (e.g. NCEPLIBS, FMS, etc.) to libraries developed by NOAA's partners (e.g. PIO, ESMF etc) to truly third party libraries (e.g. NETCDF). Individual installation of these libraries is not practical, so the `HPC-Stack `__ was developed as a central installation system to ensure that the infrastructure environment across multiple platforms is as similar as possible. Installation of the HPC-Stack is required to run the SRW. .. include:: ../../../hpc-stack-mod/docs/source/hpc-intro-text.rst @@ -245,7 +245,7 @@ Next, load the appropriate Python environment for the workflow. The workflow req source ../../env/wflow_.env -.. _GenerateWorkflow:: +.. _GenerateWorkflow: Generate the Regional Workflow ------------------------------------------- @@ -261,9 +261,9 @@ Then, run the following command to generate the workflow: ./generate_FV3LAM_wflow.sh -The last line of output from this script, starting with ``*/1 * * * * ``, can be saved and `used later ` to automatically run portions of the workflow. 
+The last line of output from this script, starting with ``*/1 * * * *``, can be saved and :ref:`used later ` to automatically run portions of the workflow. -This workflow generation script creates an experiment directory and populates it with all the data needed to run through the workflow. The generated workflow will be in ``$EXPTDIR``, where ``EXPTDIR=${EXPT_BASEDIR}/${EXPT_SUBDIR}``. These variables were specified in the ``config.sh`` file in `Step %s `. The settings for these paths can also be viewed in the console output from the ``./generate_FV3LAM_wflow.sh`` script or in the ``log.generate_FV3LAM_wflow`` file, which can be found in $EXPTDIR. +This workflow generation script creates an experiment directory and populates it with all the data needed to run through the workflow. The generated workflow will be in ``$EXPTDIR``, where ``EXPTDIR=${EXPT_BASEDIR}/${EXPT_SUBDIR}``. These variables were specified in the ``config.sh`` file in :numref:`Step %s `. The settings for these paths can also be viewed in the console output from the ``./generate_FV3LAM_wflow.sh`` script or in the ``log.generate_FV3LAM_wflow`` file, which can be found in $EXPTDIR. An environment variable can be set to navigate to the ``$EXPTDIR`` more easily. If the login shell is bash, it can be set as follows: @@ -392,17 +392,17 @@ If the experiment fails, the ``rocotostat`` command will indicate which task fai If users have the `Slurm workload manager `_ on their system, they can run the ``squeue`` command in lieu of ``rocotostat`` to check what jobs are currently running. -.. _AdditionalOptions:: +.. _AdditionalOptions: Additional Options ---------------------- -For automatic resubmission of the workflow at regular intervals (e.g., every minute), the user can add a crontab entry by entering the ``crontab -e`` command, which opens a crontab file. As mentioned in `Section %s `, the last line of output from ``./generate_FV3LAM_wflow.sh`` (starting with ``*/1 * * * * ``), can be pasted into the crontab file. It can also be found in the``$EXPTDIR/log.generate_FV3LAM_wflow`` file. The crontab entry should resemble the following: +For automatic resubmission of the workflow at regular intervals (e.g., every minute), the user can add a crontab entry by entering the ``crontab -e`` command, which opens a crontab file. As mentioned in `Section %s `, the last line of output from ``./generate_FV3LAM_wflow.sh`` (starting with ``*/1 * * * *``), can be pasted into the crontab file. It can also be found in the``$EXPTDIR/log.generate_FV3LAM_wflow`` file. The crontab entry should resemble the following: .. code-block:: console - */1 * * * * cd && /apps/rocoto/1.3.3/bin/rocotorun -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10 + */1 * * * * cd && /apps/rocoto/1.3.3/bin/rocotorun -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10 -where is changed to correspond to the user's machine, and "/apps/rocoto/1.3.3/bin/rocotorun" corresponds to the location of the ``rocotorun`` command on the user's system. The number ``1`` can also be changed and simply means that the workflow will be resubmitted every minute. +where ```` is changed to correspond to the user's machine, and ``"/apps/rocoto/1.3.3/bin/rocotorun"`` corresponds to the location of the ``rocotorun`` command on the user's system. The number ``1`` can also be changed and simply means that the workflow will be resubmitted every minute. 
Then, check the experiment progress with: @@ -411,6 +411,7 @@ Then, check the experiment progress with: cd $EXPTDIR rocotostat -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10 +After finishing the experiment, open the crontab using `` crontab -e`` and delete the crontab entry. .. note:: diff --git a/docs/UsersGuide/source/conf.py b/docs/UsersGuide/source/conf.py index 93f6bcb9a5..42148bea63 100644 --- a/docs/UsersGuide/source/conf.py +++ b/docs/UsersGuide/source/conf.py @@ -104,13 +104,9 @@ # Add any paths that contain custom static files (such as style sheets) here, # relative to this directory. They are copied after the builtin static files, # so a file named "default.css" will overwrite the builtin "default.css". -html_static_path = ['_static'] +html_static_path = [] -html_context = { - # 'css_files': [ - # '_static/theme_overrides.css', # override wide tables in RTD theme -# ], - } +html_context = {} def setup(app): app.add_css_file('custom.css') # may also be an URL diff --git a/docs/UsersGuide/source/index.rst b/docs/UsersGuide/source/index.rst index d9ffc68370..96ac2e51c4 100644 --- a/docs/UsersGuide/source/index.rst +++ b/docs/UsersGuide/source/index.rst @@ -5,8 +5,6 @@ UFS Short-Range Weather App Users Guide ======================================= - -.. index:: .. toctree:: :numbered: diff --git a/hpc-stack-mod b/hpc-stack-mod index 485e7c6280..52d227f571 160000 --- a/hpc-stack-mod +++ b/hpc-stack-mod @@ -1 +1 @@ -Subproject commit 485e7c62808566e35cfaf37ba850c08cab4c1683 +Subproject commit 52d227f5718f46eee017f55b17a51f3fc1354cf3 From acf555b4ef1d1125b160d6eebefcfbbacd3ae939 Mon Sep 17 00:00:00 2001 From: gspetro Date: Thu, 17 Feb 2022 18:37:55 -0500 Subject: [PATCH 015/118] hpc-stack intro edits --- docs/UsersGuide/source/Quickstart.rst | 10 +++++++++- hpc-stack-mod | 2 +- 2 files changed, 10 insertions(+), 2 deletions(-) diff --git a/docs/UsersGuide/source/Quickstart.rst b/docs/UsersGuide/source/Quickstart.rst index 865c863db1..4df02d1993 100644 --- a/docs/UsersGuide/source/Quickstart.rst +++ b/docs/UsersGuide/source/Quickstart.rst @@ -19,10 +19,18 @@ Install the HPC-Stack .. Hint:: Skip the HPC-stack installation if working on a `Level 1 system `_ (e.g., Cheyenne, Hera, Orion). +**Definition:** HPC-stack is a repository that provides a unified, shell script-based build system that builds the software stack required for the `Unified Forecast System (UFS) `_ and applications. + +Background +---------------- + The UFS Weather Model draws on over 50 code libraries to run its applications. These libraries range from libraries developed in-house at NOAA (e.g. NCEPLIBS, FMS, etc.) to libraries developed by NOAA's partners (e.g. PIO, ESMF etc) to truly third party libraries (e.g. NETCDF). Individual installation of these libraries is not practical, so the `HPC-Stack `__ was developed as a central installation system to ensure that the infrastructure environment across multiple platforms is as similar as possible. Installation of the HPC-Stack is required to run the SRW. -.. include:: ../../../hpc-stack-mod/docs/source/hpc-intro-text.rst +Instructions +------------------------- +`Level 1 `_ platforms (e.g. Cheyenne, Hera) already have the HPC-Stack installed. Users on those platforms do *not* need to install the HPC-Stack before building applications or models that require the HPC-Stack. Users working on systems that fall under `Support Levels 2-4 `_ will need to install the HPC-Stack the first time they try to run applications or models that depend on it. 
+Users can either build the HPC-stack on their local system or use the centrally maintained stacks on each HPC platform. For a detailed description of installation options, see :ref:`Installing the HPC-Stack `. After completing installation, continue to the :ref:`next section `. diff --git a/hpc-stack-mod b/hpc-stack-mod index 52d227f571..174286c1fe 160000 --- a/hpc-stack-mod +++ b/hpc-stack-mod @@ -1 +1 @@ -Subproject commit 52d227f5718f46eee017f55b17a51f3fc1354cf3 +Subproject commit 174286c1fe8ee3c59f54539389ce9fbe2d1ef0d1 From 36349a610e1a6c163986af84a5c0d6d56c5af5dd Mon Sep 17 00:00:00 2001 From: gspetro Date: Fri, 18 Feb 2022 13:12:26 -0500 Subject: [PATCH 016/118] bibtex attempted fix --- docs/UsersGuide/requirements.txt | 2 +- docs/UsersGuide/source/Quickstart.rst | 24 ++++++++++++------------ 2 files changed, 13 insertions(+), 13 deletions(-) diff --git a/docs/UsersGuide/requirements.txt b/docs/UsersGuide/requirements.txt index 9c7258463b..8e21f437b4 100644 --- a/docs/UsersGuide/requirements.txt +++ b/docs/UsersGuide/requirements.txt @@ -1,2 +1,2 @@ -sphinxcontrib-bibtex +sphinxcontrib-bibtex==1.0.0 sphinx_rtd_theme diff --git a/docs/UsersGuide/source/Quickstart.rst b/docs/UsersGuide/source/Quickstart.rst index 4df02d1993..a312748d08 100644 --- a/docs/UsersGuide/source/Quickstart.rst +++ b/docs/UsersGuide/source/Quickstart.rst @@ -16,10 +16,10 @@ This Workflow Quick Start Guide will help users to build and run the "out-of-the Install the HPC-Stack ======================== -.. Hint:: +.. Attention:: Skip the HPC-stack installation if working on a `Level 1 system `_ (e.g., Cheyenne, Hera, Orion). -**Definition:** HPC-stack is a repository that provides a unified, shell script-based build system that builds the software stack required for the `Unified Forecast System (UFS) `_ and applications. +**Definition:** :term:`HPC-stack` is a repository that provides a unified, shell script-based build system that builds the software stack required for the `Unified Forecast System (UFS) `_ and applications. Background ---------------- @@ -32,8 +32,7 @@ Instructions Users can either build the HPC-stack on their local system or use the centrally maintained stacks on each HPC platform. For a detailed description of installation options, see :ref:`Installing the HPC-Stack `. -After completing installation, continue to the :ref:`next section `. - +After completing installation, continue to the next section. .. _DownloadCode: @@ -129,11 +128,12 @@ to the command line, or the file can be sourced from the ufs-srweather-app ``env For example, on Hera, run ``source env/build_hera_intel.env`` from the main ufs-srweather-app directory to source the appropriate file. -On Level 3-4 systems, users will need to modify certain environment variables, such as the path to NCEP libraries, so that the SRW App can find and load the appropriate modules. For systems with Lmod installed, one of the current ``build__.env`` files can be copied and used as a template. On systems without Lmod, this process will typically involve commands in the form `export =`. You may need to use ``setenv`` rather than ``export`` depending on your environment. +On Level 3-4 systems, users will need to modify certain environment variables, such as the path to NCEP libraries, so that the SRW App can find and load the appropriate modules. For systems with Lmod installed, one of the current ``build__.env`` files can be copied and used as a template. On systems without Lmod, this process will typically involve commands in the form ``export =``. 
You may need to use ``setenv`` rather than ``export`` depending on your environment. + -Troubleshooting ------------------- -* If the system cannot find a module (i.e., a "module unknown" message appears), check whether the module version numbers match in ``ufs-srweather-app/env/build__.env`` and the ``hpc-stack/stack/stack_custom.yaml``. +.. hint:: + + If the system cannot find a module (i.e., a "module unknown" message appears), check whether the module version numbers match in ``ufs-srweather-app/env/build__.env`` and the ``hpc-stack/stack/stack_custom.yaml``. Build the Executables @@ -184,7 +184,7 @@ Set Experiment Parameters ------------------------- Each experiment requires certain basic information to run (e.g., date, grid, physics suite). This information is specified in the ``config.sh`` file. Two example ``config.sh`` templates are provided: ``config.community.sh`` and ``config.nco.sh``. They can be found in the ``ufs-srweather-app/regional_workflow/ush`` directory. The first file is a minimal example for creating and running an experiment in the *community* mode (with ``RUN_ENVIR`` set to ``community``). The second is an example for creating and running an experiment in the *NCO* (operational) mode (with ``RUN_ENVIR`` set to ``nco``). The *community* mode is recommended in most cases and will be fully supported for this release. -Make a copy of ``config.community.sh`` to get started (under /regional_workflow/ush): +Make a copy of ``config.community.sh`` to get started (under ``/regional_workflow/ush``): .. code-block:: console @@ -204,7 +204,7 @@ Next, edit the new ``config.sh`` file to customize it for your machine. At a min Sample settings are indicated below for Level 1 platforms. Detailed guidance applicable to all systems can be found in :doc:`Configuring the Workflow `, which discusses each variable and the options available. Additionally, information about the three predefined Limited Area Model (LAM) Grid options can be found in the section on :doc:`Limited Area Model (LAM) Grids `. -.. note:: +.. Important:: If you set up the build environment with the GNU compiler in :numref:`Section %s `, you will have to add the line ``COMPILER="gnu"`` to the ``config.sh`` file. @@ -410,7 +410,7 @@ For automatic resubmission of the workflow at regular intervals (e.g., every min */1 * * * * cd && /apps/rocoto/1.3.3/bin/rocotorun -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10 -where ```` is changed to correspond to the user's machine, and ``"/apps/rocoto/1.3.3/bin/rocotorun"`` corresponds to the location of the ``rocotorun`` command on the user's system. The number ``1`` can also be changed and simply means that the workflow will be resubmitted every minute. +where ```` is changed to correspond to the user's machine, and ``/apps/rocoto/1.3.3/bin/rocotorun`` corresponds to the location of the ``rocotorun`` command on the user's system. The number ``1`` can also be changed and simply means that the workflow will be resubmitted every minute. Then, check the experiment progress with: @@ -423,7 +423,7 @@ After finishing the experiment, open the crontab using `` crontab -e`` and delet .. note:: - On Orion, cron is only available on the orion-login-1 node, so please use that node when running cron jobs on Orion. + On Orion, *cron* is only available on the orion-login-1 node, so please use that node when running cron jobs on Orion. 
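For reference, the status check can also be run manually from the experiment directory; a minimal sketch (with a placeholder path) is:

.. code-block:: console

   cd /path-to/experiment/directory
   rocotostat -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10
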
The workflow run is completed when all tasks have “SUCCEEDED”, and the rocotostat command will output the following: From 838271f80aa2be403ecf172fa045052bb1c9bfa9 Mon Sep 17 00:00:00 2001 From: gspetro Date: Fri, 18 Feb 2022 17:06:08 -0500 Subject: [PATCH 017/118] add hpc-stack module edits --- hpc-stack-mod | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/hpc-stack-mod b/hpc-stack-mod index 174286c1fe..d1b88151e3 160000 --- a/hpc-stack-mod +++ b/hpc-stack-mod @@ -1 +1 @@ -Subproject commit 174286c1fe8ee3c59f54539389ce9fbe2d1ef0d1 +Subproject commit d1b88151e367255748b150db9de0d77b90aec5ba From 863b7de5cfe0af769b6f4b4351db0f307c816cc9 Mon Sep 17 00:00:00 2001 From: gspetro Date: Tue, 22 Feb 2022 13:19:32 -0500 Subject: [PATCH 018/118] update sphinxcontrib version --- docs/UsersGuide/requirements.txt | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/UsersGuide/requirements.txt b/docs/UsersGuide/requirements.txt index 8e21f437b4..d3a02244d2 100644 --- a/docs/UsersGuide/requirements.txt +++ b/docs/UsersGuide/requirements.txt @@ -1,2 +1,2 @@ -sphinxcontrib-bibtex==1.0.0 +sphinxcontrib-bibtex<2.0.0 sphinx_rtd_theme From 2b100d9dd14d0757ca6fdbc302e0d8a04f00eb9d Mon Sep 17 00:00:00 2001 From: gspetro Date: Tue, 22 Feb 2022 13:25:48 -0500 Subject: [PATCH 019/118] add .readthedocs.yaml file --- .readthedocs.yaml | 29 +++++++++++++++++++++++++++++ 1 file changed, 29 insertions(+) create mode 100644 .readthedocs.yaml diff --git a/.readthedocs.yaml b/.readthedocs.yaml new file mode 100644 index 0000000000..e6f30f6274 --- /dev/null +++ b/.readthedocs.yaml @@ -0,0 +1,29 @@ +# .readthedocs.yaml +# Read the Docs configuration file +# See https://docs.readthedocs.io/en/stable/config-file/v2.html for details + +# Required +version: 2 + +# Set the version of Python and other tools you might need +build: + os: ubuntu-20.04 + tools: + python: "3.9" + # You can also specify other tool versions: + # nodejs: "16" + # rust: "1.55" + # golang: "1.17" + +# Build documentation in the docs/ directory with Sphinx +sphinx: + configuration: docs/conf.py + +# If using Sphinx, optionally build your docs in additional formats such as PDF +# formats: +# - pdf + +# Optionally declare the Python requirements required to build your docs +python: + install: + - requirements: docs/requirements.txt From 9e58e67ebe8a61b23d057c76fa4938e09a267abf Mon Sep 17 00:00:00 2001 From: gspetro Date: Tue, 22 Feb 2022 13:28:35 -0500 Subject: [PATCH 020/118] update .readthedocs.yaml file --- .readthedocs.yaml | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/.readthedocs.yaml b/.readthedocs.yaml index e6f30f6274..dc8246b926 100644 --- a/.readthedocs.yaml +++ b/.readthedocs.yaml @@ -26,4 +26,4 @@ sphinx: # Optionally declare the Python requirements required to build your docs python: install: - - requirements: docs/requirements.txt + - requirements: docs/UsersGuide/requirements.txt From 1830b4935e943abe0eb546813b47603ffa7f84d7 Mon Sep 17 00:00:00 2001 From: gspetro Date: Tue, 22 Feb 2022 13:51:10 -0500 Subject: [PATCH 021/118] update .readthedocs.yaml file --- .readthedocs.yaml | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/.readthedocs.yaml b/.readthedocs.yaml index dc8246b926..6fd3e6514e 100644 --- a/.readthedocs.yaml +++ b/.readthedocs.yaml @@ -17,7 +17,7 @@ build: # Build documentation in the docs/ directory with Sphinx sphinx: - configuration: docs/conf.py + configuration: docs/UsersGuide/source/conf.py # If using Sphinx, optionally build your docs in additional formats 
such as PDF # formats: From 54a647e251a5ed66383bbbf34d90ae78b32f3aca Mon Sep 17 00:00:00 2001 From: gspetro Date: Tue, 22 Feb 2022 13:53:43 -0500 Subject: [PATCH 022/118] update conf.py --- docs/UsersGuide/source/conf.py | 1 - 1 file changed, 1 deletion(-) diff --git a/docs/UsersGuide/source/conf.py b/docs/UsersGuide/source/conf.py index 42148bea63..d4404bb8b5 100644 --- a/docs/UsersGuide/source/conf.py +++ b/docs/UsersGuide/source/conf.py @@ -52,7 +52,6 @@ 'sphinx.ext.githubpages', 'sphinx.ext.napoleon', 'sphinxcontrib.bibtex', - 'myst_parser' ] bibtex_bibfiles = ['references.bib'] From 46d381fa10dffcde964c50041e9100ffcefbc26d Mon Sep 17 00:00:00 2001 From: gspetro Date: Tue, 22 Feb 2022 15:20:01 -0500 Subject: [PATCH 023/118] updates .readthedocs.yaml with submodules --- .readthedocs.yaml | 6 ++++++ 1 file changed, 6 insertions(+) diff --git a/.readthedocs.yaml b/.readthedocs.yaml index 6fd3e6514e..6879430f88 100644 --- a/.readthedocs.yaml +++ b/.readthedocs.yaml @@ -27,3 +27,9 @@ sphinx: python: install: - requirements: docs/UsersGuide/requirements.txt + +submodules: + include: + - hpc-stack-mod + #recursive: true + From 91af03d335a786ccd6e4e671b31edc577d3855c7 Mon Sep 17 00:00:00 2001 From: gspetro Date: Tue, 22 Feb 2022 15:35:24 -0500 Subject: [PATCH 024/118] updates .readthedocs.yaml with submodules --- .readthedocs.yaml | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/.readthedocs.yaml b/.readthedocs.yaml index 6879430f88..e0987f8926 100644 --- a/.readthedocs.yaml +++ b/.readthedocs.yaml @@ -31,5 +31,5 @@ python: submodules: include: - hpc-stack-mod - #recursive: true + recursive: true From 97616fdc204fccd2cee9e35dac57fca368b19394 Mon Sep 17 00:00:00 2001 From: gspetro Date: Tue, 22 Feb 2022 15:51:19 -0500 Subject: [PATCH 025/118] submodule updates --- hpc-stack-mod | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/hpc-stack-mod b/hpc-stack-mod index d1b88151e3..bf2320bc5c 160000 --- a/hpc-stack-mod +++ b/hpc-stack-mod @@ -1 +1 @@ -Subproject commit d1b88151e367255748b150db9de0d77b90aec5ba +Subproject commit bf2320bc5ceb6926bca9cebc0dcb94a8e8a60c01 From 21d3e271939a8160fde1ca2636ec2a1fff7326a5 Mon Sep 17 00:00:00 2001 From: gspetro Date: Tue, 22 Feb 2022 16:10:30 -0500 Subject: [PATCH 026/118] submodule updates --- hpc-stack-mod | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/hpc-stack-mod b/hpc-stack-mod index bf2320bc5c..5784e7b9a9 160000 --- a/hpc-stack-mod +++ b/hpc-stack-mod @@ -1 +1 @@ -Subproject commit bf2320bc5ceb6926bca9cebc0dcb94a8e8a60c01 +Subproject commit 5784e7b9a9db83bbc76f04da13ff7b7f4bbaa067 From 5af69e522e5dd0812d3953e740938d1be61e662c Mon Sep 17 00:00:00 2001 From: gspetro Date: Wed, 23 Feb 2022 12:17:29 -0500 Subject: [PATCH 027/118] minor Intro edits --- docs/UsersGuide/requirements.txt | 2 +- docs/UsersGuide/source/Introduction.rst | 44 +++++++++++-------------- 2 files changed, 21 insertions(+), 25 deletions(-) diff --git a/docs/UsersGuide/requirements.txt b/docs/UsersGuide/requirements.txt index d3a02244d2..9c7258463b 100644 --- a/docs/UsersGuide/requirements.txt +++ b/docs/UsersGuide/requirements.txt @@ -1,2 +1,2 @@ -sphinxcontrib-bibtex<2.0.0 +sphinxcontrib-bibtex sphinx_rtd_theme diff --git a/docs/UsersGuide/source/Introduction.rst b/docs/UsersGuide/source/Introduction.rst index f7c84f02a1..f052e741ea 100644 --- a/docs/UsersGuide/source/Introduction.rst +++ b/docs/UsersGuide/source/Introduction.rst @@ -4,39 +4,34 @@ Introduction ============ -The Unified Forecast System (:term:`UFS`) is a community-based, 
coupled, comprehensive Earth modeling system. The UFS is the source system for NOAA’s operational numerical weather prediction applications. It enables research, development, and contribution opportunities within the broader :term:`weather enterprise` (e.g. government, industry, and academia). For more information about the UFS, visit the UFS Portal at https://ufscommunity.org/. +The Unified Forecast System (:term:`UFS`) is a community-based, coupled, comprehensive Earth modeling system. The UFS is the source system for NOAA’s operational numerical weather prediction applications. It enables research, development, and contribution opportunities within the broader :term:`weather enterprise` (e.g. government, industry, and academia). For more information about the UFS, visit the `UFS Portal `__. -The UFS can be configured for multiple applications (see a complete list at https://ufscommunity.org/science/aboutapps/). The configuration described here is the UFS Short-Range Weather (SRW) Application, which targets predictions of atmospheric behavior on a limited spatial domain and on time scales from minutes out to several days. The SRW Application v2.0 release includes a prognostic atmospheric model, pre- and post-processing, and a community workflow for running the system end-to-end. These components are documented within this User's Guide and supported through a `community forum `_. Future work will expand the capabilities of the application to include data assimilation (DA) and a verification package (e.g. METplus). This documentation provides a quick start guide for running the application, in addition to an overview of the release components, a description of the supported capabilities, and information on where to find more information and obtain support. +The UFS can be configured for multiple applications (see the `complete list here `__). The configuration described in this documentation is the UFS Short-Range Weather (SRW) Application, which targets predictions of atmospheric behavior on a limited spatial domain and on time scales from minutes out to several days. The SRW Application v2.0 release includes a prognostic atmospheric model, pre- and post-processing, and a community workflow for running the system end-to-end. These components are documented within this User's Guide and supported through a `community forum `_. Future work will expand the capabilities of the application to include data assimilation (DA) and a verification package (e.g., METplus). This documentation provides a `Quick Start Guide ` for running the application, in addition to an overview of the `release components `, a description of the supported capabilities, and details on where to find more information and obtain support. -The SRW App v1.0.0 citation is as follows and should be used when presenting results based on research -conducted with the App: +The SRW App v1.0.0 citation is as follows and should be used when presenting results based on research conducted with the App: UFS Development Team. (2021, March 4). Unified Forecast System (UFS) Short-Range Weather (SRW) Application (Version v1.0.0). Zenodo. https://doi.org/10.5281/zenodo.4534994 How to Use This Document ======================== -This guide instructs both novice and experienced users on downloading, building, and running the SRW Application. Please post questions in the UFS forum at https://forums.ufscommunity.org/. +This guide instructs both novice and experienced users on downloading, building, and running the SRW Application. 
Please post questions in the `UFS forum `__. .. code-block:: console - Throughout the guide, this presentation style indicates shell commands and options, code examples, etc. + Throughout the guide, this presentation style indicates shell commands and options, + code examples, etc. -.. note:: +Variables presented as ``AaBbCc123`` in this document typically refer to variables in scripts, names of files, and directories. - Variables presented as ``AaBbCc123`` in this document typically refer to variables - in scripts, names of files, and directories. - -.. note:: - - File paths or code that include angle brackets (e.g., ``env/build__.env``) indicate that users should insert options appropriate to their SRW configuration (e.g., ``env/build_aws_gcc.env``). +File paths or code that include angle brackets (e.g., ``env/build__.env``) indicate that users should insert options appropriate to their SRW configuration (e.g., ``env/build_aws_gcc.env``). Pre-processor Utilities and Initial Conditions ============================================== The SRW Application includes a number of pre-processing utilities that initialize and prepare the -model. Tasks include generating a regional grid, along with :term:`orography` and surface climatology files for that grid. The pre-processing software converts the raw external model data into initial and lateral boundary condition files in netCDF format. Later, this is used as input to the atmospheric model (FV3-LAM). Additional information about the UFS pre-processor utilities can be found in the `UFS_UTILS User’s Guide `_. +model. Tasks include generating a regional grid along with :term:`orography` and surface climatology files for that grid. The pre-processing software converts the raw external model data into initial and lateral boundary condition files in netCDF format. Later, this is used as input to the atmospheric model (FV3-LAM). Additional information about the UFS pre-processor utilities can be found in the `UFS_UTILS User’s Guide `_. Forecast Model @@ -46,7 +41,7 @@ Atmospheric Model -------------------- The prognostic atmospheric model in the UFS SRW Application is the Finite-Volume Cubed-Sphere -(:term:`FV3`) dynamical core configured with a Limited Area Model (LAM) capability :cite:`BlackEtAl2020`. +(:term:`FV3`) dynamical core configured with a Limited Area Model (LAM) capability :cite:t:`BlackEtAl2020`. The dynamical core is the computational part of a model that solves the equations of fluid motion. A User’s Guide for the UFS :term:`Weather Model` can be found `here `__. Common Community Physics Package @@ -64,21 +59,22 @@ ingests initial and lateral boundary condition files produced by :term:`chgres_c Post-processor ============== -The Unified Post Processor (:term:`UPP`) is included in the SRW Application workflow. The UPP is designed to generate useful products from raw model output. In the SRW, it converts data output formats from netCDF output on the native model grid to GRIB2 format. The UPP can also be used to compute a variety of useful diagnostic fields, as described in the `UPP User’s Guide `_. Output from UPP can be used with visualization, plotting, and verification packages, or for further downstream post-processing, e.g. statistical post-processing techniques. +The `Unified Post Processor `__ (:term:`UPP`) is included in the SRW Application workflow. The UPP is designed to generate useful products from raw model output. In the SRW, it converts data output formats from netCDF format on the native model grid to GRIB2 format. 
The UPP can also be used to compute a variety of useful diagnostic fields, as described in the `UPP User’s Guide `_. Output from UPP can be used with visualization, plotting, and verification packages, or for further downstream post-processing (e.g., statistical post-processing techniques). Visualization Example ===================== -This SRW Application distribution provides Python scripts to create basic visualizations of the model output. Usage information and instructions are described in :numref:`Chapter %s ` and are also included at the top of the script. +This SRW Application provides Python scripts to create basic visualizations of the model output. Usage information and instructions are described in :numref:`Chapter %s ` and are also included at the top of the script. Build System and Workflow ========================= -The SRW Application has a portable CMake-based build system that packages together all the components required to build the SRW Application. Once built, users can generate a Rocoto-based workflow that will run each task in the proper sequence (see `Rocoto documentation `_). Individual components can also be run in a stand-alone, command line fashion. The SRW Application allows for configuration of various elements of the workflow. For example, users can modify the parameters of the atmospheric model, such as start and end dates, duration, time step, and the physics suite for the forecast. +The SRW Application has a portable CMake-based build system that packages together all the components required to build the SRW Application. Once built, users can generate a Rocoto-based workflow that will run each task in the proper sequence (see `Rocoto documentation `__ for more on workflow management). Individual components can also be run in a stand-alone, command line fashion. + +The SRW Application allows for configuration of various elements of the workflow. For example, users can modify the parameters of the atmospheric model, such as start and end dates, duration, time step, and the physics suite for the forecast. -This SRW Application release has been tested on a variety of platforms widely used by researchers, including NOAA High-Performance Comuting (HPC) systems (e.g. Hera, Orion), cloud environments, and generic Linux and macOS systems. Four `levels of support `_ have been defined for the SRW Application, including pre-configured (Level 1), configurable (Level 2), limited test platforms (Level 3), and build-only platforms (Level 4). Preconfigured (Level 1) systems already have the required external libraries (e.g., NCEPLIBS) available in a central location. The SRW Application is expected to build and run out-of-the-box on these systems, and users can proceed directly to using the workflow, as -described in the Quick Start Guide (:numref:`Section %s `). On other platforms, the required libraries will need to be installed via the HPC-Stack. Once these libraries are built, applications and models should build and run successfully. However, users may need to perform additional troubleshooting on Level 3 or 4 systems since little or no pre-release testing has been conducted on these systems. +This SRW Application release has been tested on a variety of platforms widely used by researchers, including NOAA High-Performance Comuting (HPC) systems (e.g. Hera, Orion), cloud environments, and generic Linux and macOS systems. 
Four `levels of support `_ have been defined for the SRW Application, including pre-configured (Level 1), configurable (Level 2), limited test platforms (Level 3), and build-only platforms (Level 4). Preconfigured (Level 1) systems already have the required external libraries (e.g., NCEPLIBS) available in a central location. The SRW Application is expected to build and run out-of-the-box on these systems, and users can proceed directly to using the workflow to generate an experiment, as described in the Quick Start Guide (:numref:`Section %s Generate the Forecast Experiment `). On other platforms, the required libraries will need to be installed via the HPC_Stack (see :numref:`Section %s Installing the HPC-Stack `). Once these libraries are built, applications and models should build and run successfully. However, users may need to perform additional troubleshooting on Level 3 or 4 systems since little or no pre-release testing has been conducted on these systems. User Support, Documentation, and Contributing Development ========================================================= @@ -89,7 +85,7 @@ A list of available documentation is shown in :numref:`Table %s `. +utilities, model code, and infrastructure. Users can post issues in the related GitHub repositories to report bugs or to announce upcoming contributions to the code base. For code to be accepted in the authoritative repositories, users must follow the code management rules of each component (described in the User’s Guides listed in :numref:`Table %s `). Future Direction ================ Users can expect to see incremental improvements and additional capabilities in upcoming releases of the SRW Application to enhance research opportunities and support operational -forecast implementations. Planned advancements include: +forecast implementations. Planned enhancements include: * A more extensive set of supported developmental physics suites. * A larger number of pre-defined domains/resolutions and a fully supported capability to create a user-defined domain. * Inclusion of data assimilation, cycling, and ensemble capabilities. -* A verification package (i.e., METplus) integrated into the workflow. +* A verification package (e.g., METplus) integrated into the workflow. * Inclusion of stochastic perturbation techniques. In addition to the above list, other improvements will be addressed in future releases. From ee901e6c9c45b3aa4636bf1f95758c5d02e56e10 Mon Sep 17 00:00:00 2001 From: gspetro Date: Wed, 23 Feb 2022 13:53:29 -0500 Subject: [PATCH 028/118] minor Intro edits --- docs/UsersGuide/source/Introduction.rst | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/docs/UsersGuide/source/Introduction.rst b/docs/UsersGuide/source/Introduction.rst index f052e741ea..21517a36c0 100644 --- a/docs/UsersGuide/source/Introduction.rst +++ b/docs/UsersGuide/source/Introduction.rst @@ -6,7 +6,7 @@ Introduction The Unified Forecast System (:term:`UFS`) is a community-based, coupled, comprehensive Earth modeling system. The UFS is the source system for NOAA’s operational numerical weather prediction applications. It enables research, development, and contribution opportunities within the broader :term:`weather enterprise` (e.g. government, industry, and academia). For more information about the UFS, visit the `UFS Portal `__. -The UFS can be configured for multiple applications (see the `complete list here `__). 
The configuration described in this documentation is the UFS Short-Range Weather (SRW) Application, which targets predictions of atmospheric behavior on a limited spatial domain and on time scales from minutes out to several days. The SRW Application v2.0 release includes a prognostic atmospheric model, pre- and post-processing, and a community workflow for running the system end-to-end. These components are documented within this User's Guide and supported through a `community forum `_. Future work will expand the capabilities of the application to include data assimilation (DA) and a verification package (e.g., METplus). This documentation provides a `Quick Start Guide ` for running the application, in addition to an overview of the `release components `, a description of the supported capabilities, and details on where to find more information and obtain support. +The UFS can be configured for multiple applications (see the `complete list here `__). The configuration described in this documentation is the UFS Short-Range Weather (SRW) Application, which targets predictions of atmospheric behavior on a limited spatial domain and on time scales from minutes out to several days. The SRW Application v2.0 release includes a prognostic atmospheric model, pre- and post-processing, and a community workflow for running the system end-to-end. These components are documented within this User's Guide and supported through a `community forum `_. Future work will expand the capabilities of the application to include data assimilation (DA) and a verification package (e.g., METplus). This documentation provides a :ref:`Quick Start Guide ` for running the application, in addition to an overview of the :ref:`release components `, a description of the supported capabilities, and details on where to find more information and obtain support. The SRW App v1.0.0 citation is as follows and should be used when presenting results based on research conducted with the App: @@ -41,7 +41,7 @@ Atmospheric Model -------------------- The prognostic atmospheric model in the UFS SRW Application is the Finite-Volume Cubed-Sphere -(:term:`FV3`) dynamical core configured with a Limited Area Model (LAM) capability :cite:t:`BlackEtAl2020`. +(:term:`FV3`) dynamical core configured with a Limited Area Model (LAM) capability (:cite:t:`BlackEtAl2020`). The dynamical core is the computational part of a model that solves the equations of fluid motion. A User’s Guide for the UFS :term:`Weather Model` can be found `here `__. Common Community Physics Package From f77cba93a6db2ff6ac75d4c516633d85c89cb188 Mon Sep 17 00:00:00 2001 From: gspetro Date: Wed, 23 Feb 2022 15:13:22 -0500 Subject: [PATCH 029/118] minor Intro edits --- docs/UsersGuide/source/Glossary.rst | 5 ++++- docs/UsersGuide/source/Introduction.rst | 23 ++++++++++------------- 2 files changed, 14 insertions(+), 14 deletions(-) diff --git a/docs/UsersGuide/source/Glossary.rst b/docs/UsersGuide/source/Glossary.rst index 1cd7cb8e5d..db0b1cd140 100644 --- a/docs/UsersGuide/source/Glossary.rst +++ b/docs/UsersGuide/source/Glossary.rst @@ -66,6 +66,9 @@ Glossary NEMSIO A binary format for atmospheric model output from :term:`NCEP`'s Global Forecast System (GFS). + NWP + Numerical Weather Prediction + Orography The branch of physical geography dealing with mountains @@ -86,7 +89,7 @@ Glossary part of this collection. 
UPP - The Unified Post Processor is software developed at :term:`NCEP` and used operationally to + The `Unified Post Processor `__ is software developed at :term:`NCEP` and used operationally to post-process raw output from a variety of :term:`NCEP`'s NWP models, including the FV3. Weather Enterprise diff --git a/docs/UsersGuide/source/Introduction.rst b/docs/UsersGuide/source/Introduction.rst index 21517a36c0..877dee4ada 100644 --- a/docs/UsersGuide/source/Introduction.rst +++ b/docs/UsersGuide/source/Introduction.rst @@ -1,8 +1,8 @@ .. _Introduction: -============ +============= Introduction -============ +============= The Unified Forecast System (:term:`UFS`) is a community-based, coupled, comprehensive Earth modeling system. The UFS is the source system for NOAA’s operational numerical weather prediction applications. It enables research, development, and contribution opportunities within the broader :term:`weather enterprise` (e.g. government, industry, and academia). For more information about the UFS, visit the `UFS Portal `__. @@ -47,19 +47,18 @@ The dynamical core is the computational part of a model that solves the equation Common Community Physics Package --------------------------------- -The `Common Community Physics Package `_ (:term:`CCPP`) supports interoperable atmospheric physics and Noah Multi-parameterization (Noah MP) Land Surface Model options. Atmospheric physics are a set of numerical methods describing small-scale processes such as clouds, turbulence, radiation, and their interactions.The SRW release includes an experimental physics version and an updated operational version. +The `Common Community Physics Package `_ (:term:`CCPP`) supports interoperable atmospheric physics and Noah Multi-parameterization (Noah MP) Land Surface Model options. Atmospheric physics are a set of numerical methods describing small-scale processes such as clouds, turbulence, radiation, and their interactions. The SRW release includes an experimental physics version and an updated operational version. Data Format -------------- -The SRW App supports the use of both :term:`GRIB2` and :term:`NEMSIO` input data. The UFS Weather Model -ingests initial and lateral boundary condition files produced by :term:`chgres_cube`. +The SRW App supports the use of both :term:`GRIB2` and :term:`NEMSIO` input data. The UFS Weather Model ingests initial and lateral boundary condition files produced by :term:`chgres_cube`. Post-processor ============== -The `Unified Post Processor `__ (:term:`UPP`) is included in the SRW Application workflow. The UPP is designed to generate useful products from raw model output. In the SRW, it converts data output formats from netCDF format on the native model grid to GRIB2 format. The UPP can also be used to compute a variety of useful diagnostic fields, as described in the `UPP User’s Guide `_. Output from UPP can be used with visualization, plotting, and verification packages, or for further downstream post-processing (e.g., statistical post-processing techniques). +The `Unified Post Processor `__ (:term:`UPP`) is included in the SRW Application workflow. The UPP is designed to generate useful products from raw model output. In the SRW, it converts data output formats from netCDF format on the native model grid to GRIB2 format. The UPP can also be used to compute a variety of useful diagnostic fields, as described in the `UPP User’s Guide `_. 
Output from the UPP can be used with visualization, plotting, and verification packages, or for further downstream post-processing (e.g., statistical post-processing techniques). Visualization Example @@ -74,10 +73,10 @@ The SRW Application has a portable CMake-based build system that packages togeth The SRW Application allows for configuration of various elements of the workflow. For example, users can modify the parameters of the atmospheric model, such as start and end dates, duration, time step, and the physics suite for the forecast. -This SRW Application release has been tested on a variety of platforms widely used by researchers, including NOAA High-Performance Comuting (HPC) systems (e.g. Hera, Orion), cloud environments, and generic Linux and macOS systems. Four `levels of support `_ have been defined for the SRW Application, including pre-configured (Level 1), configurable (Level 2), limited test platforms (Level 3), and build-only platforms (Level 4). Preconfigured (Level 1) systems already have the required external libraries (e.g., NCEPLIBS) available in a central location. The SRW Application is expected to build and run out-of-the-box on these systems, and users can proceed directly to using the workflow to generate an experiment, as described in the Quick Start Guide (:numref:`Section %s Generate the Forecast Experiment `). On other platforms, the required libraries will need to be installed via the HPC_Stack (see :numref:`Section %s Installing the HPC-Stack `). Once these libraries are built, applications and models should build and run successfully. However, users may need to perform additional troubleshooting on Level 3 or 4 systems since little or no pre-release testing has been conducted on these systems. +This SRW Application release has been tested on a variety of platforms widely used by researchers, including NOAA High-Performance Computing (HPC) systems (e.g. Hera, Orion), cloud environments, and generic Linux and macOS systems. Four `levels of support `_ have been defined for the SRW Application, including pre-configured (Level 1), configurable (Level 2), limited test platforms (Level 3), and build-only platforms (Level 4). Preconfigured (Level 1) systems already have the required external libraries (e.g., NCEPLIBS) available in a central location. The SRW Application is expected to build and run out-of-the-box on these systems, and users can proceed directly to using the workflow to generate an experiment, as described in the Quick Start Guide :numref:`Section %s Generate the Forecast Experiment `. On other platforms, the required libraries will need to be installed via the HPC-Stack (see :numref:`Section %s Installing the HPC-Stack `). Once these libraries are built, applications and models should build and run successfully. However, users may need to perform additional troubleshooting on Level 3 or 4 systems since little or no pre-release testing has been conducted on these systems. -User Support, Documentation, and Contributing Development -========================================================= +User Support, Documentation, and Contributions to Development +=============================================================== A forum-based, online `support system `_ organized by topic provides a centralized location for UFS users and developers to post questions and exchange information. @@ -118,10 +117,10 @@ A list of available documentation is shown in :numref:`Table %s `). +utilities, model code, and infrastructure. 
Users can post issues in the related GitHub repositories to report bugs or to announce upcoming contributions to the code base. For code to be accepted in the authoritative repositories, users must follow the code management rules of each component, which are outlined in the respective User's Guides listed in :numref:`Table %s `. Future Direction -================ +================= Users can expect to see incremental improvements and additional capabilities in upcoming releases of the SRW Application to enhance research opportunities and support operational @@ -133,8 +132,6 @@ forecast implementations. Planned enhancements include: * A verification package (e.g., METplus) integrated into the workflow. * Inclusion of stochastic perturbation techniques. -In addition to the above list, other improvements will be addressed in future releases. - .. bibliography:: references.bib From bc0748c81a8464cad1079887b8bb0a280a8afafb Mon Sep 17 00:00:00 2001 From: gspetro Date: Wed, 23 Feb 2022 15:31:20 -0500 Subject: [PATCH 030/118] submodule updates --- hpc-stack-mod | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/hpc-stack-mod b/hpc-stack-mod index 5784e7b9a9..f8b32160eb 160000 --- a/hpc-stack-mod +++ b/hpc-stack-mod @@ -1 +1 @@ -Subproject commit 5784e7b9a9db83bbc76f04da13ff7b7f4bbaa067 +Subproject commit f8b32160eb1c1165a51b2df7479b2e9a158aadfc From fef6d27824968bcca681a9f82221691901542fbc Mon Sep 17 00:00:00 2001 From: gspetro Date: Wed, 23 Feb 2022 18:24:06 -0500 Subject: [PATCH 031/118] fixed typos in QS --- docs/UsersGuide/source/Quickstart.rst | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/docs/UsersGuide/source/Quickstart.rst b/docs/UsersGuide/source/Quickstart.rst index a312748d08..018e295312 100644 --- a/docs/UsersGuide/source/Quickstart.rst +++ b/docs/UsersGuide/source/Quickstart.rst @@ -61,7 +61,7 @@ Start the container and run an interactive shell within it. This command also bi .. code-block:: console - singularity shell -e --writable --bind /home:/home ubuntu20.04-hpc-stack + singularity shell -e --writable --bind /home:/home ubuntu20.04-hpc-stack-0.1 Clone the develop branch of the UFS-SRW weather application repository: @@ -121,7 +121,7 @@ If the SRW Application has been built in a container provided by the Earth Predi On Other Systems (Non-Container Approach) ------------------------------------------ -Otherwise, for Level 1 and 2 systems, scripts for loading the proper modules and/or setting the +For Level 1 and 2 systems, scripts for loading the proper modules and/or setting the correct environment variables can be found in the ``env/`` directory of the SRW App in files named ``build__.env``. The commands in these files can be directly copy-pasted to the command line, or the file can be sourced from the ufs-srweather-app ``env/`` directory. From 0d16101ce9d82689deb1ccd186dddd6721d93fb1 Mon Sep 17 00:00:00 2001 From: gspetro Date: Thu, 24 Feb 2022 18:02:09 -0500 Subject: [PATCH 032/118] QS updates --- docs/UsersGuide/source/Quickstart.rst | 33 +++++++++++++++------------ 1 file changed, 18 insertions(+), 15 deletions(-) diff --git a/docs/UsersGuide/source/Quickstart.rst b/docs/UsersGuide/source/Quickstart.rst index 018e295312..7cd52c7dc1 100644 --- a/docs/UsersGuide/source/Quickstart.rst +++ b/docs/UsersGuide/source/Quickstart.rst @@ -47,21 +47,22 @@ Pull the Singularity container: .. 
code-block:: console - singularity pull ubuntu20.04-hpc-stack-0.1.sif docker://noaaepic/ubuntu20.04-hpc-stack:0.1 + singularity pull ubuntu20.04-epic-srwapp-1.0.sif docker://noaaepic/ubuntu20.04-epic-srwapp:1.0 -Build the container and make a ``home`` directory inside it: +Build the container and make a ``contrib`` directory inside it if one does not already exist: .. code-block:: console - singularity build --sandbox ubuntu20.04-hpc-stack-0.1 ubuntu20.04-hpc-stack-0.1.sif - cd ubuntu20.04-hpc-stack-0.1 - mkdir home + singularity build --sandbox ubuntu20.04-epic-srwapp-1.0 ubuntu20.04-epic-srwapp-1.0.sif + cd ubuntu20.04-epic-srwapp-1.0 + mkdir contrib + cd .. Start the container and run an interactive shell within it. This command also binds the local home directory to the container so that data can be shared between them. .. code-block:: console - singularity shell -e --writable --bind /home:/home ubuntu20.04-hpc-stack-0.1 + singularity shell -e --writable --bind /:/contrib ubuntu20.04-epic-srwapp-1.0 Clone the develop branch of the UFS-SRW weather application repository: @@ -79,6 +80,7 @@ Check out submodules for the SRW Application: cd ufs-srweather-app ./manage_externals/checkout_externals +If the ``manage_externals`` command brings up an error, it may be necessary to run ``ln -s /usr/bin/python3 /usr/bin/python`` first. Run the UFS SRW Without a Container ------------------------------------ @@ -173,10 +175,9 @@ Generating the forecast experiment requires three steps: * Set experiment parameters * Set Python and other environment parameters -* Run the ``generate_FV3LAM_wflow.sh`` script to generate the experiment workflow +* Run a script to generate the experiment workflow -The first two steps depend on the platform being used and are described here for each Level 1 platform. -Users will need to adjust the instructions to their machine if they are working on a Level 2-4 platform. +The first two steps depend on the platform being used and are described here for each Level 1 platform. Users will need to adjust the instructions to their machine if they are working on a Level 2-4 platform. .. _SetUpConfigFile: @@ -252,18 +253,20 @@ Next, load the appropriate Python environment for the workflow. The workflow req source ../../env/wflow_.env +This command will activate the ``regional_workflow``. The user should see ``(regional_workflow)`` in front of the Terminal prompt at this point. If this is not the case, activate the regional workflow from the ``ush`` directory by running: + +.. code-block:: console + + conda activate regional_workflow + + .. _GenerateWorkflow: Generate the Regional Workflow ------------------------------------------- -First, activate the regional workflow from the ``ush`` directory: - -.. code-block:: console - - conda activate regional_workflow -Then, run the following command to generate the workflow: +Run the following command to generate the workflow: .. code-block:: console From 418a40bfa828bb09a94439a8c107d9509ae445d0 Mon Sep 17 00:00:00 2001 From: gspetro Date: Thu, 24 Feb 2022 18:57:28 -0500 Subject: [PATCH 033/118] QS updates --- docs/UsersGuide/source/Quickstart.rst | 18 +++++++++--------- 1 file changed, 9 insertions(+), 9 deletions(-) diff --git a/docs/UsersGuide/source/Quickstart.rst b/docs/UsersGuide/source/Quickstart.rst index 7cd52c7dc1..2609b91797 100644 --- a/docs/UsersGuide/source/Quickstart.rst +++ b/docs/UsersGuide/source/Quickstart.rst @@ -155,17 +155,17 @@ From the build directory, run the ``cmake`` command below to set up the ``Makefi cmake .. 
-DCMAKE_INSTALL_PREFIX=.. make -j 4 >& build.out & -Output from the build will be in the ``ufs-srweather-app/build/build.out`` file. -When the build completes, you should see the forecast model executable ``NEMS.exe`` and eleven -pre- and post-processing executables in the ``ufs-srweather-app/bin`` directory. These executables are -described in :numref:`Table %s `. +The build will take a few minutes to complete. Output from the build will be in the ``ufs-srweather-app/build/build.out`` file. When the build completes, you should see the forecast model executable ``ufs_model`` and several pre- and post-processing executables in the ``ufs-srweather-app/bin`` directory. These executables are described in :numref:`Table %s `. + +.. hint:: + + If you do not see a ``ufs-srweather-app/bin`` directory, wait a few more minutes for the build to complete. Download and Stage the Data ============================ The SRW requires input files to run. These include static datasets, initial and boundary conditions -files, and model configuration files. On Level 1 and 2 systems, the data required to run SRW tests are -already available. For Level 3 and 4 systems, the data must be added. Detailed instructions on how to add the data can be found in the :doc:`Input and Output Files `, Section 3. Section 1 contains useful background information on the input files required by the SRW. +files, and model configuration files. On Level 1 and 2 systems, the data required to run SRW tests are already available. For Level 3 and 4 systems, the data must be added. Detailed instructions on how to add the data can be found in the :doc:`Input and Output Files `, Section 3. Section 1 contains useful background information on the input files required by the SRW. .. _GenerateForecast: @@ -185,11 +185,11 @@ Set Experiment Parameters ------------------------- Each experiment requires certain basic information to run (e.g., date, grid, physics suite). This information is specified in the ``config.sh`` file. Two example ``config.sh`` templates are provided: ``config.community.sh`` and ``config.nco.sh``. They can be found in the ``ufs-srweather-app/regional_workflow/ush`` directory. The first file is a minimal example for creating and running an experiment in the *community* mode (with ``RUN_ENVIR`` set to ``community``). The second is an example for creating and running an experiment in the *NCO* (operational) mode (with ``RUN_ENVIR`` set to ``nco``). The *community* mode is recommended in most cases and will be fully supported for this release. -Make a copy of ``config.community.sh`` to get started (under ``/regional_workflow/ush``): +Make a copy of ``config.community.sh`` to get started (under ``/regional_workflow/ush``). From the ``ufs-srweather-app`` directory, run: .. code-block:: console - cd ../regional_workflow/ush + cd regional_workflow/ush cp config.community.sh config.sh The default settings in this file include a predefined 25-km :term:`CONUS` grid (RRFS_CONUS_25km), the :term:`GFS` v15.2 physics suite (FV3_GFS_v15p2 CCPP), and :term:`FV3`-based GFS raw external model data for initialization. @@ -203,7 +203,7 @@ Next, edit the new ``config.sh`` file to customize it for your machine. At a min EXPT_SUBDIR="GST" EXPT_BASEDIR="home/$USER/expt_dirs" -Sample settings are indicated below for Level 1 platforms. Detailed guidance applicable to all systems can be found in :doc:`Configuring the Workflow `, which discusses each variable and the options available. 
Additionally, information about the three predefined Limited Area Model (LAM) Grid options can be found in the section on :doc:`Limited Area Model (LAM) Grids `. +Sample settings are indicated below for Level 1 platforms. Detailed guidance applicable to all systems can be found in :ref:`Chapter %s: Configuring the Workflow `, which discusses each variable and the options available. Additionally, information about the three predefined Limited Area Model (LAM) Grid options can be found in :ref:`Chapter %s: Limited Area Model (LAM) Grids `. .. Important:: From 77d565db54acf9cb39c2c294e5183186d2e8d197 Mon Sep 17 00:00:00 2001 From: gspetro Date: Thu, 24 Feb 2022 19:03:43 -0500 Subject: [PATCH 034/118] QS updates --- docs/UsersGuide/source/Quickstart.rst | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/UsersGuide/source/Quickstart.rst b/docs/UsersGuide/source/Quickstart.rst index 2609b91797..0c0782a708 100644 --- a/docs/UsersGuide/source/Quickstart.rst +++ b/docs/UsersGuide/source/Quickstart.rst @@ -203,7 +203,7 @@ Next, edit the new ``config.sh`` file to customize it for your machine. At a min EXPT_SUBDIR="GST" EXPT_BASEDIR="home/$USER/expt_dirs" -Sample settings are indicated below for Level 1 platforms. Detailed guidance applicable to all systems can be found in :ref:`Chapter %s: Configuring the Workflow `, which discusses each variable and the options available. Additionally, information about the three predefined Limited Area Model (LAM) Grid options can be found in :ref:`Chapter %s: Limited Area Model (LAM) Grids `. +Sample settings are indicated below for Level 1 platforms. Detailed guidance applicable to all systems can be found in :numref:`Chapter %s: Configuring the Workflow `, which discusses each variable and the options available. Additionally, information about the three predefined Limited Area Model (LAM) Grid options can be found in :numref:`Chapter %s: Limited Area Model (LAM) Grids `. .. Important:: From 2e1a03fc6f90f6a11e85a9306740370c13f69c6e Mon Sep 17 00:00:00 2001 From: gspetro Date: Fri, 25 Feb 2022 18:36:37 -0500 Subject: [PATCH 035/118] updates to InputOutput and QS --- docs/UsersGuide/source/Glossary.rst | 7 +- docs/UsersGuide/source/InputOutputFiles.rst | 156 +++++++++----------- docs/UsersGuide/source/Quickstart.rst | 24 ++- 3 files changed, 94 insertions(+), 93 deletions(-) diff --git a/docs/UsersGuide/source/Glossary.rst b/docs/UsersGuide/source/Glossary.rst index db0b1cd140..f426ac8a9c 100644 --- a/docs/UsersGuide/source/Glossary.rst +++ b/docs/UsersGuide/source/Glossary.rst @@ -37,6 +37,9 @@ Glossary HRRR `High Resolution Rapid Refresh `. The HRRR is a NOAA real-time 3-km resolution, hourly updated, cloud-resolving, convection-allowing atmospheric model, initialized by 3km grids with 3km radar assimilation. Radar data is assimilated in the HRRR every 15 min over a 1-h period adding further detail to that provided by the hourly data assimilation from the 13km radar-enhanced Rapid Refresh. + IC/LBC + Initial conditions/lateral boundary conditions + LAM Limited Area Model. LAM grids use a regional (rather than global) configuration of the FV3 dynamical core. @@ -66,8 +69,8 @@ Glossary NEMSIO A binary format for atmospheric model output from :term:`NCEP`'s Global Forecast System (GFS). - NWP - Numerical Weather Prediction + NWP (Numerical Weather Prediction) + Numerical Weather Prediction (NWP) takes current observations of weather and processes them with computer models to forecast the future state of the weather. 
Orography The branch of physical geography dealing with mountains diff --git a/docs/UsersGuide/source/InputOutputFiles.rst b/docs/UsersGuide/source/InputOutputFiles.rst index a103de9131..c508bcda85 100644 --- a/docs/UsersGuide/source/InputOutputFiles.rst +++ b/docs/UsersGuide/source/InputOutputFiles.rst @@ -4,8 +4,7 @@ Input and Output Files ====================== This chapter provides an overview of the input and output files needed by the components -of the UFS SRW Application (:term:`UFS_UTILS`, the UFS :term:`Weather Model`, and :term:`UPP`). -Links to more detailed documentation for each of the components are provided. +of the UFS SRW Application (i.e., :term:`UFS_UTILS`, the UFS :term:`Weather Model`, and the :term:`UPP`). Links to more detailed documentation for each of the components are provided. Input Files =========== @@ -19,31 +18,19 @@ The external model files needed for initializing the runs can be obtained in a n ways, including: pulled directly from `NOMADS `_; limited data availability), pulled from the NOAA HPSS during the workflow execution (requires user access), or obtained and staged by the user from a different source. The data format for -these files can be :term:`GRIB2` or :term:`NEMSIO`. More information on downloading and staging -the external model data can be found in :numref:`Section %s `. Once staged, -the end-to-end application will run the system and write output files to disk. +these files can be :term:`GRIB2` or :term:`NEMSIO`. More information on downloading and setting up +the external model data can be found in :numref:`Section %s `. Once the data is set up, the end-to-end application will run the system and write output files to disk. Pre-processing (UFS_UTILS) -------------------------- -When a user runs the SRW Application as described in the quickstart guide -:numref:`Section %s `, input data for the pre-processing utilities is linked -from a location on disk to your experiment directory by the workflow generation step. The -pre-processing utilities use many different datasets to create grids, and to generate model -input datasets from the external model files. A detailed description of the input files -for the pre-processing utilities can be found `here -`__. +When a user runs the SRW Application as described in the Quick Start Guide :numref:`Chapter %s `, the workflow generation step (:numref:`Step %s `) links the input data for the pre-processing utilities from a location on disk to the experiment directory. The +pre-processing utilities use many different datasets to create grids and to generate model input datasets from the external model files. A detailed description of the input files for the pre-processing utilities can be found `here `__. UFS Weather Model ----------------- The input files for the weather model include both static (fixed) files and grid and date specific files (terrain, initial conditions, boundary conditions, etc). The static fix files -must be staged by the user unless you are running on a pre-configured platform, in which case -you can link to the existing copy on that machine. See :numref:`Section %s ` -for more information. The static, grid, and date specific files are linked in the experiment -directory by the workflow scripts. An extensive description of the input files for the weather -model can be found in the `UFS Weather Model User's Guide `__. -The namelists and configuration files for the SRW Application are created from templates by the -workflow, as described in :numref:`Section %s `. 
+must be staged by the user unless you are running on a Level 1/pre-configured platform, in which case you can link to the existing copy of the data on that machine. See :numref:`Section %s ` for more information. The static, grid, and date-specific files are linked in the experiment directory by the workflow scripts. An extensive description of the input files for the weather model can be found in the `UFS Weather Model User's Guide `__. The namelists and configuration files for the SRW Application are created from templates by the workflow, as described in :numref:`Section %s `. Unified Post Processor (UPP) ---------------------------- @@ -54,10 +41,10 @@ Documentation for the UPP input files can be found in the `UPP User's Guide Workflow -------- -The SRW Application uses a series of template files, combined with user selected settings, +The SRW Application uses a series of template files, combined with user-selected settings, to create the required namelists and parameter files needed by the Application. These -templates can be reviewed to see what defaults are being used, and where configuration parameters -are assigned from the ``config.sh`` file. +templates can be reviewed to see what defaults are being used and where configuration parameters +from the ``config.sh`` file are assigned. List of Template Files ^^^^^^^^^^^^^^^^^^^^^^ @@ -81,7 +68,7 @@ and are shown in :numref:`Table %s `. +-----------------------------+-------------------------------------------------------------+ | field_table_[CCPP] | Cycle-independent file that the forecast model reads in at | | | the start of each forecast. It specifies the tracers that | - | | the forecast model will advect. A different field_table | + | | the forecast model will advect. A different field_table | | | may be needed for different CCPP suites. | +-----------------------------+-------------------------------------------------------------+ | FV3.input.yml | YAML configuration file containing the forecast model’s | @@ -108,9 +95,7 @@ and are shown in :numref:`Table %s `. | README.xml_templating.md | Instruction of Rocoto XML templating with Jinja. | +-----------------------------+-------------------------------------------------------------+ -Additional information related to the ``diag_table_[CCPP]``, ``field_table_[CCPP]``, ``input.nml.FV3``, -``model_conigure``, and ``nems.configure`` can be found in the `UFS Weather Model User's Guide -`__, +Additional information related to the ``diag_table_[CCPP]``, ``field_table_[CCPP]``, ``input.nml.FV3``, ``model_conigure``, and ``nems.configure`` can be found in the `UFS Weather Model User's Guide `__, while information on the ``regional_grid.nml`` can be found in the `UFS_UTILS User’s Guide `_. @@ -119,13 +104,8 @@ Migratory Route of the Input Files in the Workflow :numref:`Figure %s ` shows how the case-specific input files in the ``ufs-srweather-app/regional_workflow/ush/templates/`` directory flow to the experiment directory. The value of ``CCPP_PHYS_SUITE`` is specified in the configuration file ``config.sh``. The template -input files corresponding to ``CCPP_PHYS_SUITE``, such as ``field_table`` and ``nems_configure``, are -copied to the experiment directory ``EXPTDIR`` and the namelist file of the weather model ``input.nml`` -is created from the ``input.nml.FV3`` and ``FV3.input.yml`` files by running the script ``generate_FV3LAM_wflow.sh``. 
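That generation step is the same one described in the Quick Start Guide; as a minimal sketch, assuming ``config.sh`` has already been prepared in the ``ush`` directory and the workflow environment has been loaded, it amounts to:

.. code-block:: console

   cd ufs-srweather-app/regional_workflow/ush
   ./generate_FV3LAM_wflow.sh
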
-While running the task ``RUN_FCST`` in the regional workflow as shown in :numref:`Figure %s `, -the ``field_table``, ``nems.configure``, and ``input.nml`` files, located in ``EXPTDIR`` are linked to the -cycle directory ``CYCLE_DIR/``, and ``diag_table`` and ``model_configure`` are copied from the ``templates`` -directory. Finally, these files are updated with the variables specified in ``var_defn.sh``. +input files corresponding to ``CCPP_PHYS_SUITE``, such as ``field_table`` and ``nems_configure``, are copied to the experiment directory ``EXPTDIR``, and the namelist file of the weather model ``input.nml`` is created from the ``input.nml.FV3`` and ``FV3.input.yml`` files by running the script ``generate_FV3LAM_wflow.sh``. +While running the task ``RUN_FCST`` in the regional workflow as shown in :numref:`Figure %s `, the ``field_table``, ``nems.configure``, and ``input.nml`` files, located in ``EXPTDIR``, are linked to the cycle directory ``CYCLE_DIR/``. Additionally, ``diag_table`` and ``model_configure`` are copied from the ``templates`` directory. Finally, these files are updated with the variables specified in ``var_defn.sh``. .. _MigratoryRoute: @@ -194,20 +174,15 @@ directory and have the naming convention (file->linked to): * ``BGRD3D_{YY}{JJJ}{hh}{mm}f{fhr}00 -> {domain}.t{cyc}z.bgrd3df{fhr}.tmXX.grib2`` * ``BGDAWP_{YY}{JJJ}{hh}{mm}f{fhr}00 -> {domain}.t{cyc}z.bgdawpf{fhr}.tmXX.grib2`` -The default setting for the output file names uses ``rrfs`` for ``{domain}``. This may be overridden by -the user in the ``config.sh`` settings. +The default setting for the output file names uses ``rrfs`` for ``{domain}``. This may be overridden by the user in the ``config.sh`` settings. If you wish to modify the fields or levels that are output from the UPP, you will need to make -modifications to file ``fv3lam.xml``, which resides in the UPP repository distributed with the UFS SRW -Application. Specifically, if the code was cloned in the directory ``ufs-srweather-app``, the file will be -located in ``ufs-srweather-app/src/UPP/parm``. +modifications to file ``fv3lam.xml``, which resides in the UPP repository distributed with the UFS SRW Application. Specifically, if the code was cloned in the directory ``ufs-srweather-app``, the file will be located in ``ufs-srweather-app/src/UPP/parm``. .. note:: This process requires advanced knowledge of which fields can be output for the UFS Weather Model. -Use the directions in the `UPP User's Guide `__ -for details on how to make modifications to the ``fv3lam.xml`` file and for remaking the flat text file that -the UPP reads, which is called ``postxconfig-NT-fv3lam.txt`` (default). +Use the directions in the `UPP User's Guide `__ for details on how to make modifications to the ``fv3lam.xml`` file and for remaking the flat text file that the UPP reads, which is called ``postxconfig-NT-fv3lam.txt`` (default). Once you have created the new flat text file reflecting your changes, you will need to modify your ``config.sh`` to point the workflow to the new text file. In your ``config.sh``, set the following: @@ -215,11 +190,10 @@ Once you have created the new flat text file reflecting your changes, you will n .. code-block:: console USE_CUSTOM_POST_CONFIG_FILE=”TRUE” - CUSTOM_POST_CONFIG_PATH=”/path/to/custom/postxconfig-NT-fv3lam.txt” + CUSTOM_POST_CONFIG_PATH=”” which tells the workflow to use the custom file located in the user-defined path. The path should -include the filename. 
If this is set to true and the file path is not found, then an error will occur -when trying to generate the SRW Application workflow. +include the filename. If this is set to true and the file path is not found, then an error will occur when trying to generate the SRW Application workflow. You may then start your case workflow as usual and the UPP will use the new flat ``*.txt`` file. @@ -228,22 +202,21 @@ You may then start your case workflow as usual and the UPP will use the new flat Downloading and Staging Input Data ================================== A set of input files, including static (fix) data and raw initial and lateral boundary conditions -(IC/LBCs), are needed to run the SRW Application. +(:term:`IC/LBC`'s), are needed to run the SRW Application. .. _StaticFixFiles: Static Files ------------ -A set of fix files are necessary to run the SRW Application. Environment variables describe the -location of the static files: ``FIXgsm``, ``TOPO_DIR``, and ``SFC_CLIMO_INPUT_DIR`` are the directories -where the static files are located. If you are on a pre-configured or configurable platform, there is no -need to stage the fixed files manually because they have been prestaged and the paths -are set in ``regional_workflow/ush/setup.sh``. If the user's platform is not defined -in that file, the static files can be pulled individually or as a full tar file from the `FTP data repository -`__ or from `Amazon Web Services (AWS) cloud storage -`__ -and staged on your machine. The paths to the staged files must then be set in ``config.sh`` -as follows: +The environment variables ``FIXgsm``, ``TOPO_DIR``, and ``SFC_CLIMO_INPUT_DIR`` indicate the path to +the directories where the static files are located. If you are on a pre-configured or configurable platform (i.e., a Level 1 or 2 platform), there is no need to stage the fixed files manually because they have been prestaged, and the paths are set in ``regional_workflow/ush/setup.sh``. On Level 3 & 4 systems, the static files can be downloaded individually or as a full tar file from the `FTP data repository `__ or from `Amazon Web Services (AWS) cloud storage `__ using the ``wget`` command. Then ``tar -xf `` will extract the compressed file: + +.. code-block:: console + + wget https://ufs-data.s3.amazonaws.com/public_release/ufs-srweather-app-v1.0.0/fix/fix_files.tar.gz + tar -xf fix_files.tar.gz + +The paths to the staged files must then be set in ``config.sh``. Add the following code or alter the variable paths if they are already listed in the ``config.sh`` file: * ``FIXgsm=/path-to/fix/fix_am`` * ``TOPO_DIR=/path-to/fix/fix_am/fix_orog`` @@ -251,25 +224,35 @@ as follows: Initial Condition Formats and Source ------------------------------------ -The SRW Application currently supports raw initial and lateral boundary conditions from numerous models -(i.e., FV3GFS, NAM, RAP, HRRR). The data can be provided in three formats: :term:`NEMSIO`, netCDF, -or :term:`GRIB2`. The SRW Application currently only supports the use of NEMSIO and netCDF input files -from the GFS. - -Environment variables describe what IC/LBC files to use (pre-staged files or files to be automatically -pulled from the NOAA HPSS) and the location of the and IC/LBC files: ``USE_USER_STAGED_EXTRN_FILES`` -is the ``T/F`` flag defining what raw data files to use, ``EXTRN_MDL_SOURCE_BASEDIR_ICS`` is the -directory where the initial conditions are located, and ``EXTRN_MDL_SOURCE_BASEDIR_LBCS`` is the -directory where the lateral boundary conditions are located. 
- -If you have access to the NOAA HPSS and want to automatically download the IC/LBC files using the -workflow, these environment variables can be left out of the ``config.sh`` file. However, if you do -not have access to the NOAA HPSS and you need to pull and stage the data manually, you will need to -set ``USE_USER_STAGED_EXTRN_FILES`` to ``TRUE`` and then set the paths to the where the IC/LBC files are located. - -A small sample of IC/LBCs is available at the `FTP data repository -`__ or from `AWS cloud storage -`_. +The SRW Application currently supports raw initial and lateral boundary conditions from numerous models (i.e., FV3GFS, NAM, RAP, HRRR). The data can be provided in three formats: :term:`NEMSIO`, netCDF, or :term:`GRIB2`. The SRW Application currently only supports the use of NEMSIO and netCDF input files from the GFS. + +The data required to run the "out-of'the-box" SRW case described in :numref:`Chapter %s ` is already preinstalled on `Level 1 `__ systems. Users on other systems can find the required IC/LBC data in the `FTP data repository `__ or on `AWS cloud storage `_. + +To add this data to your system, run the following commands from the ``ufs-srweather-app`` directory: + +.. code-block:: console + + wget https://ftp.emc.ncep.noaa.gov/EIB/UFS/SRW/v1p0/simple_test_case/gst_model_data.tar.gz + tar -xf gst_model_data.tar.gz + +This will extract the files and place them within a new ``model_data`` directory inside the ``ufs-srweather-app``. + +Then, the paths to ``EXTRN_MDL_SOURCE_BASEDIR_ICS`` and ``EXTRN_MDL_SOURCE_BASEDIR_LBCS`` must be set in the ``config.sh`` file. + +.. code-block:: console + + cd + vi config.sh + +Then, in ``config.sh``, set the following environment variables: + +.. code-block:: console + + USE_USER_STAGED_EXTRN_FILES=TRUE + EXTRN_MDL_SOURCE_BASEDIR_ICS= + EXTRN_MDL_SOURCE_BASEDIR_LBCS= + +These environment variables describe what :term:`IC/LBC` files to use (pre-staged files or files to be automatically pulled from the NOAA HPSS) and the location of the IC/LBC files. ``EXTRN_MDL_SOURCE_BASEDIR_ICS`` is the directory where the initial conditions are located, and ``EXTRN_MDL_SOURCE_BASEDIR_LBCS`` is the directory where the lateral boundary conditions are located. Initial and Lateral Boundary Condition Organization --------------------------------------------------- @@ -278,10 +261,9 @@ below. While there is flexibility to modify these settings, this will provide th for multiple dates when using the SRW Application workflow. For ease of reusing the ``config.sh`` for multiple dates and cycles, it is recommended to set up -your raw IC/LBC files such that it includes the model name (e.g., FV3GFS, NAM, RAP, HRRR) and -``YYYYMMDDHH``, for example: ``/path-to/model_data/FV3GFS/2019061518``. Since both initial +your raw :term:`IC/LBC` files such that it includes the model name (e.g., FV3GFS, NAM, RAP, HRRR) and ``YYYYMMDDHH``, for example: ``/path-to/model_data/FV3GFS/2019061518``. Since both initial and lateral boundary condition files are necessary, you can also include an ICS and LBCS directory. 
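For instance, a directory layout along these lines could be created by hand before staging the data (a minimal sketch; the base path is illustrative and should match wherever raw model data is kept on your system):

.. code-block:: console

   mkdir -p /path-to/model_data/FV3GFS/2019061518/ICS
   mkdir -p /path-to/model_data/FV3GFS/2019061518/LBCS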
-The sample IC/LBCs available at the FTP data repository are structured as follows: +The sample IC/LBC's available at the FTP data repository are structured as follows: * ``/path-to/model_data/MODEL/YYYYMMDDHH/ICS`` * ``/path-to/model_data/MODEL/YYYYMMDDHH/LBCS`` @@ -289,8 +271,9 @@ The sample IC/LBCs available at the FTP data repository are structured as follow When files are pulled from the NOAA HPSS, the naming convention looks something like: * FV3GFS (GRIB2): ``gfs.t{cycle}z.pgrb2.0p25.f{fhr}`` -* FV3GFS (NEMSIO): ICs: ``gfs.t{cycle}z.atmanl.nemsio`` and ``gfs.t{cycle}z.sfcanl.nemsio``; - LBCs: ``gfs.t{cycle}z.atmf{fhr}.nemsio`` +* FV3GFS (NEMSIO): + *ICs: ``gfs.t{cycle}z.atmanl.nemsio`` and ``gfs.t{cycle}z.sfcanl.nemsio``; + *LBCs: ``gfs.t{cycle}z.atmf{fhr}.nemsio`` * RAP (GRIB2): ``rap.t{cycle}z.wrfprsf{fhr}.grib2`` * HRRR (GRIB2): ``hrrr.t{cycle}z.wrfprsf{fhr}.grib2`` @@ -313,7 +296,7 @@ Doing this allows for the following to be set in the ``config.sh`` regardless of EXTRN_MDL_SOURCE_BASEDIR_LBCS="/path-to/model_data/RAP" EXTRN_MDL_FILES_LBCS=( "rap.wrfprsf03.grib2" "rap.wrfprsf06.grib2" ) -If you choose to forgo the extra ``ICS`` and ``LBCS`` directory, you may also simply either +If you choose to forgo the extra ``ICS`` and ``LBCS`` directory, you may either rename the original files to remove the cycle or modify the ``config.sh`` to set: .. code-block:: console @@ -327,15 +310,14 @@ The default initial and lateral boundary condition files are set to be a severe from 20190615 at 00 UTC. FV3GFS GRIB2 files are the default model and file format. A tar file (``gst_model_data.tar.gz``) containing the model data for this case is available on EMC's FTP data repository at https://ftp.emc.ncep.noaa.gov/EIB/UFS/SRW/v1p0/simple_test_case/. It is -also available on Amazon Web Services (AWS) at -https://ufs-data.s3.amazonaws.com/public_release/ufs-srweather-app-v1.0.0/ic/gst_model_data.tar.gz. +also available on Amazon Web Services (AWS) at https://ufs-data.s3.amazonaws.com/public_release/ufs-srweather-app-v1.0.0/ic/gst_model_data.tar.gz. Running the App for Different Dates ----------------------------------- If users want to run the SRW Application for dates other than 06-15-2019, you will need to make a change in the case to specify the desired data. This is done by modifying the ``config.sh`` ``DATE_FIRST_CYCL``, ``DATE_LAST_CYCL``, and ``CYCL_HRS`` settings. The -forecast length can be modified by changed the ``FCST_LEN_HRS``. In addition, the lateral +forecast length can be modified by changing the ``FCST_LEN_HRS``. In addition, the lateral boundary interval can be specified using the ``LBC_SPEC_INTVL_HRS`` variable. Users will need to ensure that the initial and lateral boundary condition files are available @@ -346,7 +328,7 @@ Staging Initial Conditions Manually If users want to run the SRW Application with raw model files for dates other than what are currently available on the preconfigured platforms, they need to stage the data manually. The data should be placed in ``EXTRN_MDL_SOURCE_BASEDIR_ICS`` and ``EXTRN_MDL_SOURCE_BASEDIR_LBCS``. -Raw model files may be available from a number of sources. A few examples are provided here for convenience. +The path to these variables can be set in the ``config.sh`` file. Raw model files are available from a number of sources. A few examples are provided here for convenience. 
NOMADS: https://nomads.ncep.noaa.gov/pub/data/nccf/com/{model}/prod, where model may be: @@ -392,7 +374,7 @@ GRIB2 and NEMSIO files your directory structure might look like: /path-to/model_data/FV3GFS/YYYYMMDDHH/ICS and LBCS /path-to/model_data/FV3GFS_nemsio/YYYYMMDDHH/ICS and LBCS -If you want to use GRIB2 format files for FV3GFS you must also set two additional environment +If you want to use GRIB2 format files for FV3GFS you must also set additional environment variables, including: .. code-block:: console @@ -411,6 +393,4 @@ that the users share the same ``EXTRN_MDL_SOURCE_BASEDIR_ICS`` and ``EXTRN_MDL_S directories. That way, if raw model input files are already on disk for a given date they do not need to be replicated. -The files in the subdirectories of the ``EXTRN_MDL_SOURCE_BASEDIR_ICS`` and ``EXTRN_MDL_SOURCE_BASEDIR_LBCS`` -directories should be write-protected. This prevents these files from being accidentally modified or deleted. -The directories should generally be group writable so the directory can be shared among multiple users. +The files in the subdirectories of the ``EXTRN_MDL_SOURCE_BASEDIR_ICS`` and ``EXTRN_MDL_SOURCE_BASEDIR_LBCS`` directories should be write-protected. This prevents these files from being accidentally modified or deleted. The directories should generally be group writable so the directory can be shared among multiple users. diff --git a/docs/UsersGuide/source/Quickstart.rst b/docs/UsersGuide/source/Quickstart.rst index 0c0782a708..179acdf9e4 100644 --- a/docs/UsersGuide/source/Quickstart.rst +++ b/docs/UsersGuide/source/Quickstart.rst @@ -68,7 +68,7 @@ Clone the develop branch of the UFS-SRW weather application repository: .. code-block:: console - git clone https://github.com/jkbk2004/ufs-srweather-app + git clone -b feature/singularity --single-branch https://github.com/NOAA-EPIC/ufs-srweather-app.git .. COMMENT: This will need to be changed to release branch of the SRW repo once it exists. @@ -202,6 +202,7 @@ Next, edit the new ``config.sh`` file to customize it for your machine. At a min ACCOUNT="none" EXPT_SUBDIR="GST" EXPT_BASEDIR="home/$USER/expt_dirs" + COMPILER="gnu" Sample settings are indicated below for Level 1 platforms. Detailed guidance applicable to all systems can be found in :numref:`Chapter %s: Configuring the Workflow `, which discusses each variable and the options available. Additionally, information about the three predefined Limited Area Model (LAM) Grid options can be found in :numref:`Chapter %s: Limited Area Model (LAM) Grids `. @@ -243,6 +244,22 @@ For **WCOSS**, edit ``config.sh`` with these WCOSS-specific parameters, and use EXPT_SUBDIR="my_expt_name" +**NOAA Cloud Systems:** + +.. code-block:: console + + MACHINE="" + ACCOUNT="none" + EXPT_SUBDIR="" + EXPT_BASEDIR="lustre/$USER/expt_dirs" + COMPILER="gnu" + USE_USER_STAGED_EXTRN_FILES="TRUE" + EXTRN_MDL_SOURCE_BASEDIR_ICS="/contrib/GST/model_data/FV3GFS" + EXTRN_MDL_FILES_ICS=( "gfs.pgrb2.0p25.f000" ) + EXTRN_MDL_SOURCE_BASEDIR_LBCS="/contrib/GST/model_data/FV3GFS" + EXTRN_MDL_FILES_LBCS=( "gfs.pgrb2.0p25.f006" "gfs.pgrb2.0p25.f012" ) + + .. _SetUpPythonEnv: Set up the Python and other Environment Parameters @@ -257,8 +274,9 @@ This command will activate the ``regional_workflow``. The user should see ``(reg .. code-block:: console - conda activate regional_workflow - + conda init + source ~/.bashrc + conda activate regional_workflow .. 
_GenerateWorkflow: From 80519d4731bc209b5fe976247b6ae370e7cecc5f Mon Sep 17 00:00:00 2001 From: gspetro Date: Fri, 25 Feb 2022 18:47:52 -0500 Subject: [PATCH 036/118] fix I/O doc typos --- docs/UsersGuide/source/InputOutputFiles.rst | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/docs/UsersGuide/source/InputOutputFiles.rst b/docs/UsersGuide/source/InputOutputFiles.rst index c508bcda85..4cfae309e8 100644 --- a/docs/UsersGuide/source/InputOutputFiles.rst +++ b/docs/UsersGuide/source/InputOutputFiles.rst @@ -272,8 +272,8 @@ When files are pulled from the NOAA HPSS, the naming convention looks something * FV3GFS (GRIB2): ``gfs.t{cycle}z.pgrb2.0p25.f{fhr}`` * FV3GFS (NEMSIO): - *ICs: ``gfs.t{cycle}z.atmanl.nemsio`` and ``gfs.t{cycle}z.sfcanl.nemsio``; - *LBCs: ``gfs.t{cycle}z.atmf{fhr}.nemsio`` + * ICs: ``gfs.t{cycle}z.atmanl.nemsio`` and ``gfs.t{cycle}z.sfcanl.nemsio``; + * LBCs: ``gfs.t{cycle}z.atmf{fhr}.nemsio`` * RAP (GRIB2): ``rap.t{cycle}z.wrfprsf{fhr}.grib2`` * HRRR (GRIB2): ``hrrr.t{cycle}z.wrfprsf{fhr}.grib2`` From 6f11030326e80e1f93035f10fa90d20721148c30 Mon Sep 17 00:00:00 2001 From: gspetro Date: Mon, 28 Feb 2022 16:52:39 -0500 Subject: [PATCH 037/118] pull updates to hpc-stack docs --- hpc-stack-mod | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/hpc-stack-mod b/hpc-stack-mod index f8b32160eb..66d3cd635c 160000 --- a/hpc-stack-mod +++ b/hpc-stack-mod @@ -1 +1 @@ -Subproject commit f8b32160eb1c1165a51b2df7479b2e9a158aadfc +Subproject commit 66d3cd635c836930e821a6a500572e81505f25d3 From 999a417f4b26d5bfdb4b375e9ee635607f0a3347 Mon Sep 17 00:00:00 2001 From: gspetro Date: Mon, 28 Feb 2022 19:10:55 -0500 Subject: [PATCH 038/118] pull updates to hpc-stack docs --- docs/UsersGuide/source/InputOutputFiles.rst | 11 +++++------ docs/UsersGuide/source/_static/theme_overrides.css | 1 + hpc-stack-mod | 2 +- 3 files changed, 7 insertions(+), 7 deletions(-) diff --git a/docs/UsersGuide/source/InputOutputFiles.rst b/docs/UsersGuide/source/InputOutputFiles.rst index 4cfae309e8..3ae544e9cc 100644 --- a/docs/UsersGuide/source/InputOutputFiles.rst +++ b/docs/UsersGuide/source/InputOutputFiles.rst @@ -4,7 +4,7 @@ Input and Output Files ====================== This chapter provides an overview of the input and output files needed by the components -of the UFS SRW Application (i.e., :term:`UFS_UTILS`, the UFS :term:`Weather Model`, and the :term:`UPP`). Links to more detailed documentation for each of the components are provided. +of the UFS SRW Application (i.e., :term:`UFS_UTILS`, the UFS :term:`Weather Model`, and the :term:`UPP`). Links to more detailed documentation for each of the components are provided. For SRW users who want to jump straight to downloading and staging the files, see :numref:`Section %s `. Input Files =========== @@ -23,13 +23,12 @@ the external model data can be found in :numref:`Section %s `, the workflow generation step (:numref:`Step %s `) links the input data for the pre-processing utilities from a location on disk to the experiment directory. The +When a user runs the SRW Application as described in the Quick Start Guide :numref:`Chapter %s `, :numref:`Step %s Generate the Forecast Experiment ` links the input data for the pre-processing utilities from a location on disk to the experiment directory. The pre-processing utilities use many different datasets to create grids and to generate model input datasets from the external model files. A detailed description of the input files for the pre-processing utilities can be found `here `__. 
UFS Weather Model ----------------- -The input files for the weather model include both static (fixed) files and grid and date -specific files (terrain, initial conditions, boundary conditions, etc). The static fix files +The input files for the weather model include both static (fixed) files and grid- and date-specific files (terrain, initial conditions, boundary conditions, etc). The static fix files must be staged by the user unless you are running on a Level 1/pre-configured platform, in which case you can link to the existing copy of the data on that machine. See :numref:`Section %s ` for more information. The static, grid, and date-specific files are linked in the experiment directory by the workflow scripts. An extensive description of the input files for the weather model can be found in the `UFS Weather Model User's Guide `__. The namelists and configuration files for the SRW Application are created from templates by the workflow, as described in :numref:`Section %s `. Unified Post Processor (UPP) @@ -53,7 +52,7 @@ and are shown in :numref:`Table %s `. .. _TemplateFiles: -.. table:: Template files for a regional workflow. +.. table:: Template Files for a Regional Workflow +-----------------------------+-------------------------------------------------------------+ | **File Name** | **Description** | @@ -68,7 +67,7 @@ and are shown in :numref:`Table %s `. +-----------------------------+-------------------------------------------------------------+ | field_table_[CCPP] | Cycle-independent file that the forecast model reads in at | | | the start of each forecast. It specifies the tracers that | - | | the forecast model will advect. A different field_table | + | | the forecast model will advect. A different field_table | | | may be needed for different CCPP suites. | +-----------------------------+-------------------------------------------------------------+ | FV3.input.yml | YAML configuration file containing the forecast model’s | diff --git a/docs/UsersGuide/source/_static/theme_overrides.css b/docs/UsersGuide/source/_static/theme_overrides.css index 63ee6cc74c..f2b48b594c 100644 --- a/docs/UsersGuide/source/_static/theme_overrides.css +++ b/docs/UsersGuide/source/_static/theme_overrides.css @@ -11,3 +11,4 @@ overflow: visible !important; } } + diff --git a/hpc-stack-mod b/hpc-stack-mod index 66d3cd635c..5e73fc49cd 160000 --- a/hpc-stack-mod +++ b/hpc-stack-mod @@ -1 +1 @@ -Subproject commit 66d3cd635c836930e821a6a500572e81505f25d3 +Subproject commit 5e73fc49cd0053833e78266c15ee996df48ce855 From f07fe8ab05b73da4c2e1729580e8722a5b8768fb Mon Sep 17 00:00:00 2001 From: gspetro Date: Mon, 28 Feb 2022 19:30:48 -0500 Subject: [PATCH 039/118] fix table wrapping --- docs/UsersGuide/source/Include-HPCInstall.rst | 3 ++- docs/UsersGuide/source/conf.py | 5 +++-- 2 files changed, 5 insertions(+), 3 deletions(-) diff --git a/docs/UsersGuide/source/Include-HPCInstall.rst b/docs/UsersGuide/source/Include-HPCInstall.rst index 097519db2d..b467d96d23 100644 --- a/docs/UsersGuide/source/Include-HPCInstall.rst +++ b/docs/UsersGuide/source/Include-HPCInstall.rst @@ -4,4 +4,5 @@ .. include:: ../../../hpc-stack-mod/docs/source/hpc-prereqs.rst .. include:: ../../../hpc-stack-mod/docs/source/hpc-parameters.rst -.. include:: ../../../hpc-stack-mod/docs/source/hpc-components.rst \ No newline at end of file +.. include:: ../../../hpc-stack-mod/docs/source/hpc-components.rst +.. 
include:: ../../../hpc-stack-mod/docs/source/hpc-notes.rst \ No newline at end of file diff --git a/docs/UsersGuide/source/conf.py b/docs/UsersGuide/source/conf.py index d4404bb8b5..4f96cbed60 100644 --- a/docs/UsersGuide/source/conf.py +++ b/docs/UsersGuide/source/conf.py @@ -103,12 +103,13 @@ # Add any paths that contain custom static files (such as style sheets) here, # relative to this directory. They are copied after the builtin static files, # so a file named "default.css" will overwrite the builtin "default.css". -html_static_path = [] - +#html_static_path = [] +html_static_path = ['_static'] html_context = {} def setup(app): app.add_css_file('custom.css') # may also be an URL + app.add_css_file('theme_overrides.css') # may also be a URL # Custom sidebar templates, must be a dictionary that maps document names # to template names. From b58d66173f77e72dbbd61f5f95f314ed10fa8194 Mon Sep 17 00:00:00 2001 From: gspetro Date: Thu, 3 Mar 2022 14:39:38 -0500 Subject: [PATCH 040/118] updates to QS for cloud --- .gitmodules | 3 -- docs/UsersGuide/source/InputOutputFiles.rst | 2 + docs/UsersGuide/source/Introduction.rst | 3 ++ docs/UsersGuide/source/Quickstart.rst | 46 +++++++++++---------- docs/UsersGuide/source/SRWAppOverview.rst | 22 +++++++++- hpc-stack-mod | 1 - 6 files changed, 49 insertions(+), 28 deletions(-) delete mode 160000 hpc-stack-mod diff --git a/.gitmodules b/.gitmodules index d115eb5c82..e69de29bb2 100644 --- a/.gitmodules +++ b/.gitmodules @@ -1,3 +0,0 @@ -[submodule "hpc-stack-mod"] - path = hpc-stack-mod - url = https://github.com/gspetro-NOAA/hpc-stack.git diff --git a/docs/UsersGuide/source/InputOutputFiles.rst b/docs/UsersGuide/source/InputOutputFiles.rst index 3ae544e9cc..1f298211af 100644 --- a/docs/UsersGuide/source/InputOutputFiles.rst +++ b/docs/UsersGuide/source/InputOutputFiles.rst @@ -6,6 +6,8 @@ Input and Output Files This chapter provides an overview of the input and output files needed by the components of the UFS SRW Application (i.e., :term:`UFS_UTILS`, the UFS :term:`Weather Model`, and the :term:`UPP`). Links to more detailed documentation for each of the components are provided. For SRW users who want to jump straight to downloading and staging the files, see :numref:`Section %s `. +.. _Input: + Input Files =========== The SRW Application requires numerous input files to run: static datasets (fix files diff --git a/docs/UsersGuide/source/Introduction.rst b/docs/UsersGuide/source/Introduction.rst index 877dee4ada..66e89337af 100644 --- a/docs/UsersGuide/source/Introduction.rst +++ b/docs/UsersGuide/source/Introduction.rst @@ -26,6 +26,9 @@ Variables presented as ``AaBbCc123`` in this document typically refer to variabl File paths or code that include angle brackets (e.g., ``env/build__.env``) indicate that users should insert options appropriate to their SRW configuration (e.g., ``env/build_aws_gcc.env``). +.. hint:: + To get started running the SRW, see the :ref:`Quick Start Guide ` sections on running the SRW in a container. 
+ Pre-processor Utilities and Initial Conditions ============================================== diff --git a/docs/UsersGuide/source/Quickstart.rst b/docs/UsersGuide/source/Quickstart.rst index 179acdf9e4..9acd0da091 100644 --- a/docs/UsersGuide/source/Quickstart.rst +++ b/docs/UsersGuide/source/Quickstart.rst @@ -43,26 +43,34 @@ The SRW Application source code is publicly available on GitHub and can be run i Run the UFS SRW in a Singularity Container ------------------------------------------- -Pull the Singularity container: +.. note:: + On NOAA Cloud systems, certain environment variables must be set *before* building the container: + + .. code-block:: console + sudo su + export SINGULARITY_CACHEDIR=/lustre/cache + export SINGULARITY_TEMPDIR=/lustre/tmp -.. code-block:: console + If the ``cache`` and ``tmp`` directories do not exist already, they must be created. - singularity pull ubuntu20.04-epic-srwapp-1.0.sif docker://noaaepic/ubuntu20.04-epic-srwapp:1.0 +.. important:: + ``/lustre`` is a fast but non-persistent file system used on NOAA cloud systems. To retain work completed in this directory, tar the file and move it to your ``/contrib`` directory, which is much slower but persistent. -Build the container and make a ``contrib`` directory inside it if one does not already exist: +Build the container: .. code-block:: console - singularity build --sandbox ubuntu20.04-epic-srwapp-1.0 ubuntu20.04-epic-srwapp-1.0.sif - cd ubuntu20.04-epic-srwapp-1.0 - mkdir contrib - cd .. + singularity build --sandbox ubuntu20.04-epic-srwapp-1.0 docker://noaaepic/ubuntu20.04-epic-srwapp:1.0 + +.. note:: + If a ``singularity: command not found`` error message appears, try running: ``module load singularity``. + Start the container and run an interactive shell within it. This command also binds the local home directory to the container so that data can be shared between them. .. code-block:: console - singularity shell -e --writable --bind /:/contrib ubuntu20.04-epic-srwapp-1.0 + singularity shell -e --writable --bind /:/lustre ubuntu20.04-epic-srwapp-1.0 Clone the develop branch of the UFS-SRW weather application repository: @@ -70,8 +78,8 @@ Clone the develop branch of the UFS-SRW weather application repository: git clone -b feature/singularity --single-branch https://github.com/NOAA-EPIC/ufs-srweather-app.git -.. - COMMENT: This will need to be changed to release branch of the SRW repo once it exists. +.. + COMMENT: change repo for release Check out submodules for the SRW Application: @@ -109,7 +117,7 @@ Set up the Build Environment Container Approach -------------------- -If the SRW Application has been built in a container provided by the Earth Prediction Innovation Center (EPIC), set build environments and modules within the `ufs-srweather-app` directory as follows: +If the SRW Application has been built in a container provided by the Earth Prediction Innovation Center (EPIC), set build environments and modules within the ``ufs-srweather-app`` directory as follows: .. code-block:: console @@ -133,11 +141,6 @@ directory to source the appropriate file. On Level 3-4 systems, users will need to modify certain environment variables, such as the path to NCEP libraries, so that the SRW App can find and load the appropriate modules. For systems with Lmod installed, one of the current ``build__.env`` files can be copied and used as a template. On systems without Lmod, this process will typically involve commands in the form ``export =``. 
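As a rough illustration only (the install path below is hypothetical, and the exact variable names should be taken from one of the existing ``env/build_*.env`` files rather than from this sketch), such a setup might look like:

.. code-block:: console

   export CMAKE_C_COMPILER=mpicc                    # compiler wrappers provided by the local MPI installation
   export CMAKE_CXX_COMPILER=mpicxx
   export CMAKE_Fortran_COMPILER=mpifort
   export NCEPLIBS_DIR=/path-to/hpc-stack/install   # hypothetical location of the prerequisite libraries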
You may need to use ``setenv`` rather than ``export`` depending on your environment. -.. hint:: - - If the system cannot find a module (i.e., a "module unknown" message appears), check whether the module version numbers match in ``ufs-srweather-app/env/build__.env`` and the ``hpc-stack/stack/stack_custom.yaml``. - - Build the Executables ===================== @@ -155,17 +158,17 @@ From the build directory, run the ``cmake`` command below to set up the ``Makefi cmake .. -DCMAKE_INSTALL_PREFIX=.. make -j 4 >& build.out & -The build will take a few minutes to complete. Output from the build will be in the ``ufs-srweather-app/build/build.out`` file. When the build completes, you should see the forecast model executable ``ufs_model`` and several pre- and post-processing executables in the ``ufs-srweather-app/bin`` directory. These executables are described in :numref:`Table %s `. +The build will take a few minutes to complete. When it starts, a random number is printed to the console, and when it is done, a ``[1]+ Done`` message is printed to the console when you list the files in ``ufs-srweather-app/bin`` (``[1]+ Exit`` may indicate an error). Output from the build will be in the ``ufs-srweather-app/build/build.out`` file. When the build completes, you should see the forecast model executable ``ufs_model`` and several pre- and post-processing executables in the ``ufs-srweather-app/bin`` directory. These executables are described in :numref:`Table %s `. .. hint:: - If you do not see a ``ufs-srweather-app/bin`` directory, wait a few more minutes for the build to complete. + If you see the build.out file, but there is no ``ufs-srweather-app/bin`` directory, wait a few more minutes for the build to complete. Download and Stage the Data ============================ The SRW requires input files to run. These include static datasets, initial and boundary conditions -files, and model configuration files. On Level 1 and 2 systems, the data required to run SRW tests are already available. For Level 3 and 4 systems, the data must be added. Detailed instructions on how to add the data can be found in the :doc:`Input and Output Files `, Section 3. Section 1 contains useful background information on the input files required by the SRW. +files, and model configuration files. On Level 1 and 2 systems, the data required to run SRW tests are already available. For Level 3 and 4 systems, the data must be added. Detailed instructions on how to add the data can be found in the :numref:`Section %s Downloading and Staging Input Data `. :numref:`Sections %s ` and :numref:`%s ` contain useful background information on the input and output files used in the SRW. .. _GenerateForecast: @@ -302,10 +305,9 @@ An environment variable can be set to navigate to the ``$EXPTDIR`` more easily. If the login shell is csh/tcsh, replace ``export`` with ``setenv`` in the command above. - Run the Workflow Using Rocoto ============================= -The information in this section assumes that Rocoto is available on the desired platform. If Rocoto is not available, it is still possible to run the workflow using stand-alone scripts described in :numref:`Section %s `. There are two main ways to run the workflow with Rocoto: using the ``./launch_FV3LAM_wflow.sh`` or by hand. +The information in this section assumes that Rocoto is available on the desired platform. Rocoto cannot be used when running the workflow within a container. 
If Rocoto is not available, it is still possible to run the workflow using stand-alone scripts described in :numref:`Section %s `. There are two main ways to run the workflow with Rocoto: using the ``./launch_FV3LAM_wflow.sh`` or by hand. Launch the Rocoto Workflow Using a Script ----------------------------------------------- diff --git a/docs/UsersGuide/source/SRWAppOverview.rst b/docs/UsersGuide/source/SRWAppOverview.rst index 61d9c057f5..1e2c96766f 100644 --- a/docs/UsersGuide/source/SRWAppOverview.rst +++ b/docs/UsersGuide/source/SRWAppOverview.rst @@ -140,9 +140,9 @@ executables listed in :numref:`Table %s ` will be located in th +------------------------+---------------------------------------------------------------------------------+ | make_solo_mosaic | Creates mosaic files with halos | +------------------------+---------------------------------------------------------------------------------+ - | ncep_post | Post-processor for the model output | + | upp.x | Post-processor for the model output | +------------------------+---------------------------------------------------------------------------------+ - | NEMS.exe | UFS Weather Model executable | + | ufs_model | UFS Weather Model executable | +------------------------+---------------------------------------------------------------------------------+ | orog | Generates orography, land mask, and gravity wave drag files from fixed files | +------------------------+---------------------------------------------------------------------------------+ @@ -155,6 +155,24 @@ executables listed in :numref:`Table %s ` will be located in th +------------------------+---------------------------------------------------------------------------------+ | vcoord_gen | Generates hybrid coordinate interface profiles | +------------------------+---------------------------------------------------------------------------------+ + | fvcom_to_FV3 | | + +------------------------+---------------------------------------------------------------------------------+ + | make_hgrid | | + +------------------------+---------------------------------------------------------------------------------+ + | emcsfc_ice_blend | | + +------------------------+---------------------------------------------------------------------------------+ + | emcsfc_snow2mdl | | + +------------------------+---------------------------------------------------------------------------------+ + | global_cycle | | + +------------------------+---------------------------------------------------------------------------------+ + | inland | | + +------------------------+---------------------------------------------------------------------------------+ + | orog_gsl | | + +------------------------+---------------------------------------------------------------------------------+ + | fregrid | | + +------------------------+---------------------------------------------------------------------------------+ + | lakefrac | | + +------------------------+---------------------------------------------------------------------------------+ .. 
_GridSpecificConfig: diff --git a/hpc-stack-mod b/hpc-stack-mod deleted file mode 160000 index 5e73fc49cd..0000000000 --- a/hpc-stack-mod +++ /dev/null @@ -1 +0,0 @@ -Subproject commit 5e73fc49cd0053833e78266c15ee996df48ce855 From 0b50e046e310e856ad905184fb25e86e1863033d Mon Sep 17 00:00:00 2001 From: gspetro Date: Thu, 3 Mar 2022 15:06:32 -0500 Subject: [PATCH 041/118] fix QS export statements --- docs/UsersGuide/source/Quickstart.rst | 11 +++++------ 1 file changed, 5 insertions(+), 6 deletions(-) diff --git a/docs/UsersGuide/source/Quickstart.rst b/docs/UsersGuide/source/Quickstart.rst index 9acd0da091..1f35407cfd 100644 --- a/docs/UsersGuide/source/Quickstart.rst +++ b/docs/UsersGuide/source/Quickstart.rst @@ -46,14 +46,13 @@ Run the UFS SRW in a Singularity Container .. note:: On NOAA Cloud systems, certain environment variables must be set *before* building the container: - .. code-block:: console - sudo su - export SINGULARITY_CACHEDIR=/lustre/cache - export SINGULARITY_TEMPDIR=/lustre/tmp + ``sudo su`` + ``export SINGULARITY_CACHEDIR=/lustre/cache`` + ``export SINGULARITY_TEMPDIR=/lustre/tmp`` If the ``cache`` and ``tmp`` directories do not exist already, they must be created. -.. important:: +.. note:: ``/lustre`` is a fast but non-persistent file system used on NOAA cloud systems. To retain work completed in this directory, tar the file and move it to your ``/contrib`` directory, which is much slower but persistent. Build the container: @@ -76,7 +75,7 @@ Clone the develop branch of the UFS-SRW weather application repository: .. code-block:: console - git clone -b feature/singularity --single-branch https://github.com/NOAA-EPIC/ufs-srweather-app.git + git clone -b feature/singularity-addition https://github.com/EdwardSnyder-NOAA/ufs-srweather-app .. COMMENT: change repo for release From 8786b3246c47db3b628a0dfa29e00039fd96928f Mon Sep 17 00:00:00 2001 From: gspetro Date: Thu, 3 Mar 2022 15:09:48 -0500 Subject: [PATCH 042/118] fix QS export statements --- docs/UsersGuide/source/Quickstart.rst | 2 ++ 1 file changed, 2 insertions(+) diff --git a/docs/UsersGuide/source/Quickstart.rst b/docs/UsersGuide/source/Quickstart.rst index 1f35407cfd..044d920b65 100644 --- a/docs/UsersGuide/source/Quickstart.rst +++ b/docs/UsersGuide/source/Quickstart.rst @@ -47,7 +47,9 @@ Run the UFS SRW in a Singularity Container On NOAA Cloud systems, certain environment variables must be set *before* building the container: ``sudo su`` + ``export SINGULARITY_CACHEDIR=/lustre/cache`` + ``export SINGULARITY_TEMPDIR=/lustre/tmp`` If the ``cache`` and ``tmp`` directories do not exist already, they must be created. From 3a442d668d66d3d3c564a6966ce44d18884d3705 Mon Sep 17 00:00:00 2001 From: gspetro Date: Thu, 3 Mar 2022 17:15:46 -0500 Subject: [PATCH 043/118] QS edits on bind, config --- docs/UsersGuide/source/Quickstart.rst | 26 +++++++++++++++----------- 1 file changed, 15 insertions(+), 11 deletions(-) diff --git a/docs/UsersGuide/source/Quickstart.rst b/docs/UsersGuide/source/Quickstart.rst index 044d920b65..b4a9c55275 100644 --- a/docs/UsersGuide/source/Quickstart.rst +++ b/docs/UsersGuide/source/Quickstart.rst @@ -46,16 +46,15 @@ Run the UFS SRW in a Singularity Container .. note:: On NOAA Cloud systems, certain environment variables must be set *before* building the container: - ``sudo su`` - - ``export SINGULARITY_CACHEDIR=/lustre/cache`` - - ``export SINGULARITY_TEMPDIR=/lustre/tmp`` + .. 
code-block:: + + sudo su + export SINGULARITY_CACHEDIR=/lustre/cache + export SINGULARITY_TEMPDIR=/lustre/tmp If the ``cache`` and ``tmp`` directories do not exist already, they must be created. -.. note:: - ``/lustre`` is a fast but non-persistent file system used on NOAA cloud systems. To retain work completed in this directory, tar the file and move it to your ``/contrib`` directory, which is much slower but persistent. + ``/lustre`` is a fast but non-persistent file system used on NOAA cloud systems. To retain work completed in this directory, tar the file and move it to the ``/contrib`` directory, which is much slower but persistent. Build the container: @@ -67,11 +66,16 @@ Build the container: If a ``singularity: command not found`` error message appears, try running: ``module load singularity``. -Start the container and run an interactive shell within it. This command also binds the local home directory to the container so that data can be shared between them. +Start the container and run an interactive shell within it. This command also binds the local directory to the container so that data can be shared between them. On NOAA systems, the local directory is usually the topmost/base/root directory (e.g., /lustre, /contrib, /work, or /home). Additional directories can be bound by adding another ``--bind /:/`` argument before the name of the container. .. code-block:: console - singularity shell -e --writable --bind /:/lustre ubuntu20.04-epic-srwapp-1.0 + singularity shell -e --writable --bind /:/ ubuntu20.04-epic-srwapp-1.0 + +.. important:: + When binding the two directories, they must have the same name! It may be necessary to create an appropriately named directory in the container using the ``mkdir`` command if one is not already there. + + Be sure to bind the directory that contains the data the experiment will access. Clone the develop branch of the UFS-SRW weather application repository: @@ -202,7 +206,7 @@ Next, edit the new ``config.sh`` file to customize it for your machine. At a min .. code-block:: console - MACHINE="AWS" + MACHINE="SINGULARITY" ACCOUNT="none" EXPT_SUBDIR="GST" EXPT_BASEDIR="home/$USER/expt_dirs" @@ -252,7 +256,7 @@ For **WCOSS**, edit ``config.sh`` with these WCOSS-specific parameters, and use .. code-block:: console - MACHINE="" + MACHINE="SINGULARITY" ACCOUNT="none" EXPT_SUBDIR="" EXPT_BASEDIR="lustre/$USER/expt_dirs" From 0cae1601464d95c8596f502bee542ec6ea40e184 Mon Sep 17 00:00:00 2001 From: gspetro Date: Fri, 4 Mar 2022 10:48:40 -0500 Subject: [PATCH 044/118] add bullet points to notes --- docs/UsersGuide/source/Quickstart.rst | 9 ++++----- 1 file changed, 4 insertions(+), 5 deletions(-) diff --git a/docs/UsersGuide/source/Quickstart.rst b/docs/UsersGuide/source/Quickstart.rst index b4a9c55275..6007744424 100644 --- a/docs/UsersGuide/source/Quickstart.rst +++ b/docs/UsersGuide/source/Quickstart.rst @@ -52,9 +52,9 @@ Run the UFS SRW in a Singularity Container export SINGULARITY_CACHEDIR=/lustre/cache export SINGULARITY_TEMPDIR=/lustre/tmp - If the ``cache`` and ``tmp`` directories do not exist already, they must be created. + * If the ``cache`` and ``tmp`` directories do not exist already, they must be created. - ``/lustre`` is a fast but non-persistent file system used on NOAA cloud systems. To retain work completed in this directory, tar the file and move it to the ``/contrib`` directory, which is much slower but persistent. + * ``/lustre`` is a fast but non-persistent file system used on NOAA cloud systems. 
To retain work completed in this directory, tar the file and move it to the ``/contrib`` directory, which is much slower but persistent. Build the container: @@ -73,9 +73,8 @@ Start the container and run an interactive shell within it. This command also bi singularity shell -e --writable --bind /:/ ubuntu20.04-epic-srwapp-1.0 .. important:: - When binding the two directories, they must have the same name! It may be necessary to create an appropriately named directory in the container using the ``mkdir`` command if one is not already there. - - Be sure to bind the directory that contains the data the experiment will access. + * When binding two directories, they must have the same name. It may be necessary to create an appropriately named directory in the container using the ``mkdir`` command if one is not already there. + * Be sure to bind the directory that contains the data the experiment will access. Clone the develop branch of the UFS-SRW weather application repository: From 27247a5cac1ab30a8968afe05fb62b1538b9cd1a Mon Sep 17 00:00:00 2001 From: gspetro Date: Fri, 4 Mar 2022 13:59:10 -0500 Subject: [PATCH 045/118] running without rocoto --- docs/UsersGuide/source/Quickstart.rst | 2 +- docs/UsersGuide/source/SRWAppOverview.rst | 20 ++++++++++++++++++++ 2 files changed, 21 insertions(+), 1 deletion(-) diff --git a/docs/UsersGuide/source/Quickstart.rst b/docs/UsersGuide/source/Quickstart.rst index 6007744424..a894f580e2 100644 --- a/docs/UsersGuide/source/Quickstart.rst +++ b/docs/UsersGuide/source/Quickstart.rst @@ -196,7 +196,7 @@ Make a copy of ``config.community.sh`` to get started (under `` Date: Fri, 4 Mar 2022 14:11:10 -0500 Subject: [PATCH 046/118] add HPC-Stack submodule w/docs --- .gitmodules | 3 +++ hpc-stack-mod | 1 + 2 files changed, 4 insertions(+) create mode 160000 hpc-stack-mod diff --git a/.gitmodules b/.gitmodules index e69de29bb2..ca914133d5 100644 --- a/.gitmodules +++ b/.gitmodules @@ -0,0 +1,3 @@ +[submodule "hpc-stack-mod"] + path = hpc-stack-mod + url = https://github.com/NOAA-EMC/hpc-stack.git diff --git a/hpc-stack-mod b/hpc-stack-mod new file mode 160000 index 0000000000..0199b163a2 --- /dev/null +++ b/hpc-stack-mod @@ -0,0 +1 @@ +Subproject commit 0199b163a28d410524ebd9586699ca20620aa509 From 29cf292b94ba18eb26b427f1c7064d14f9b35682 Mon Sep 17 00:00:00 2001 From: gspetro Date: Tue, 8 Mar 2022 16:10:34 -0500 Subject: [PATCH 047/118] split QS into container/non-container approaches --- docs/UsersGuide/source/Components.rst | 3 +- docs/UsersGuide/source/FAQ.rst | 2 +- docs/UsersGuide/source/Glossary.rst | 3 + docs/UsersGuide/source/InputOutputFiles.rst | 4 +- docs/UsersGuide/source/Introduction.rst | 6 +- .../source/Quickstart_Container.rst | 306 ++++++++++++++++++ ...kstart.rst => Quickstart_NonContainer.rst} | 98 +----- docs/UsersGuide/source/SRWAppOverview.rst | 63 ++-- docs/UsersGuide/source/WE2Etests.rst | 2 +- docs/UsersGuide/source/index.rst | 3 +- 10 files changed, 365 insertions(+), 125 deletions(-) create mode 100644 docs/UsersGuide/source/Quickstart_Container.rst rename docs/UsersGuide/source/{Quickstart.rst => Quickstart_NonContainer.rst} (85%) diff --git a/docs/UsersGuide/source/Components.rst b/docs/UsersGuide/source/Components.rst index 9562242a59..50513a303e 100644 --- a/docs/UsersGuide/source/Components.rst +++ b/docs/UsersGuide/source/Components.rst @@ -87,8 +87,7 @@ For the selected computational platforms that have been pre-configured (Level 1) required libraries for building the SRW Application are available in a central place. 
That means bundled libraries (NCEPLIBS) and third-party libraries (NCEPLIBS-external) have both been built. The SRW Application is expected to build and run out-of-the-box on these -pre-configured platforms, and users can proceed directly to the using the workflow, as -described in the Quick Start (:numref:`Section %s `). +pre-configured platforms. Users can download the SRW code and choose whether to run it :ref:`in a container ` or :ref:`locally `. A few additional computational platforms are considered configurable for the SRW Application release. Configurable platforms (Level 2) are platforms where all of the required libraries for building the SRW Application are expected to install successfully but are not available in a central location. Applications and models are expected to build and run once the required bundled libraries (e.g., NCEPLIBS) and third-party libraries (e.g., NCEPLIBS-external) are built. diff --git a/docs/UsersGuide/source/FAQ.rst b/docs/UsersGuide/source/FAQ.rst index 05313a998c..ee744db726 100644 --- a/docs/UsersGuide/source/FAQ.rst +++ b/docs/UsersGuide/source/FAQ.rst @@ -36,7 +36,7 @@ run the ``make_grid``, ``make_orog``, and ``make_sfc_climo`` tasks. How do I define an experiment name? =================================== The name of the experiment is set in the ``config.sh`` file using the variable ``EXPT_SUBDIR``. -See :numref:`Section %s ` for more details. +See :numref:`Section %s ` for more details. ================================================ How do I change the Suite Definition File (SDF)? diff --git a/docs/UsersGuide/source/Glossary.rst b/docs/UsersGuide/source/Glossary.rst index f426ac8a9c..dbac0aca42 100644 --- a/docs/UsersGuide/source/Glossary.rst +++ b/docs/UsersGuide/source/Glossary.rst @@ -43,6 +43,9 @@ Glossary LAM Limited Area Model. LAM grids use a regional (rather than global) configuration of the FV3 dynamical core. + MPI + MPI stands for Message Passing Interface. An MPI is a standardized communication system used in parallel programming. It establishes portable and efficient syntax for the exchange of messages and data between multiple processors that are used by a single computer program. An MPI is required for high-performance computing (HPC). + NAM `North American Mesoscale Forecast System `_. NAM generates multiple grids (or domains) of weather forecasts over the North American continent at various horizontal resolutions. Each grid contains data for dozens of weather parameters, including temperature, precipitation, lightning, and turbulent kinetic energy. NAM uses additional numerical weather models to generate high-resolution forecasts over fixed regions, and occasionally to follow significant weather events like hurricanes. diff --git a/docs/UsersGuide/source/InputOutputFiles.rst b/docs/UsersGuide/source/InputOutputFiles.rst index 1f298211af..65883c796f 100644 --- a/docs/UsersGuide/source/InputOutputFiles.rst +++ b/docs/UsersGuide/source/InputOutputFiles.rst @@ -25,7 +25,7 @@ the external model data can be found in :numref:`Section %s `, :numref:`Step %s Generate the Forecast Experiment ` links the input data for the pre-processing utilities from a location on disk to the experiment directory. The +When a user runs the SRW Application as described in the Quick Start Guide :numref:`Chapter %s `, :numref:`Step %s Generate the Forecast Experiment ` links the input data for the pre-processing utilities from a location on disk to the experiment directory. 
The pre-processing utilities use many different datasets to create grids and to generate model input datasets from the external model files. A detailed description of the input files for the pre-processing utilities can be found `here `__. UFS Weather Model @@ -227,7 +227,7 @@ Initial Condition Formats and Source ------------------------------------ The SRW Application currently supports raw initial and lateral boundary conditions from numerous models (i.e., FV3GFS, NAM, RAP, HRRR). The data can be provided in three formats: :term:`NEMSIO`, netCDF, or :term:`GRIB2`. The SRW Application currently only supports the use of NEMSIO and netCDF input files from the GFS. -The data required to run the "out-of'the-box" SRW case described in :numref:`Chapter %s ` is already preinstalled on `Level 1 `__ systems. Users on other systems can find the required IC/LBC data in the `FTP data repository `__ or on `AWS cloud storage `_. +The data required to run the "out-of'the-box" SRW case described in :numref:`Chapter %s ` is already preinstalled on `Level 1 `__ systems. Users on other systems can find the required IC/LBC data in the `FTP data repository `__ or on `AWS cloud storage `_. To add this data to your system, run the following commands from the ``ufs-srweather-app`` directory: diff --git a/docs/UsersGuide/source/Introduction.rst b/docs/UsersGuide/source/Introduction.rst index 66e89337af..0a4ba41d9a 100644 --- a/docs/UsersGuide/source/Introduction.rst +++ b/docs/UsersGuide/source/Introduction.rst @@ -6,7 +6,7 @@ Introduction The Unified Forecast System (:term:`UFS`) is a community-based, coupled, comprehensive Earth modeling system. The UFS is the source system for NOAA’s operational numerical weather prediction applications. It enables research, development, and contribution opportunities within the broader :term:`weather enterprise` (e.g. government, industry, and academia). For more information about the UFS, visit the `UFS Portal `__. -The UFS can be configured for multiple applications (see the `complete list here `__). The configuration described in this documentation is the UFS Short-Range Weather (SRW) Application, which targets predictions of atmospheric behavior on a limited spatial domain and on time scales from minutes out to several days. The SRW Application v2.0 release includes a prognostic atmospheric model, pre- and post-processing, and a community workflow for running the system end-to-end. These components are documented within this User's Guide and supported through a `community forum `_. Future work will expand the capabilities of the application to include data assimilation (DA) and a verification package (e.g., METplus). This documentation provides a :ref:`Quick Start Guide ` for running the application, in addition to an overview of the :ref:`release components `, a description of the supported capabilities, and details on where to find more information and obtain support. +The UFS can be configured for multiple applications (see the `complete list here `__). The configuration described in this documentation is the UFS Short-Range Weather (SRW) Application, which targets predictions of atmospheric behavior on a limited spatial domain and on time scales from minutes out to several days. The SRW Application v2.0 release includes a prognostic atmospheric model, pre- and post-processing, and a community workflow for running the system end-to-end. These components are documented within this User's Guide and supported through a `community forum `_. 
Future work will expand the capabilities of the application to include data assimilation (DA) and a verification package (e.g., METplus). This documentation provides two Quick Start Guides for running the application :ref:`in a conainer ` or :ref:`locally `, in addition to an overview of the :ref:`release components `, a description of the supported capabilities, and details on where to find more information and obtain support. The SRW App v1.0.0 citation is as follows and should be used when presenting results based on research conducted with the App: @@ -27,7 +27,7 @@ Variables presented as ``AaBbCc123`` in this document typically refer to variabl File paths or code that include angle brackets (e.g., ``env/build__.env``) indicate that users should insert options appropriate to their SRW configuration (e.g., ``env/build_aws_gcc.env``). .. hint:: - To get started running the SRW, see the :ref:`Quick Start Guide ` sections on running the SRW in a container. + To get started running the SRW, see the :ref:`Containerized Quick Start Guide `. Pre-processor Utilities and Initial Conditions @@ -76,7 +76,7 @@ The SRW Application has a portable CMake-based build system that packages togeth The SRW Application allows for configuration of various elements of the workflow. For example, users can modify the parameters of the atmospheric model, such as start and end dates, duration, time step, and the physics suite for the forecast. -This SRW Application release has been tested on a variety of platforms widely used by researchers, including NOAA High-Performance Computing (HPC) systems (e.g. Hera, Orion), cloud environments, and generic Linux and macOS systems. Four `levels of support `_ have been defined for the SRW Application, including pre-configured (Level 1), configurable (Level 2), limited test platforms (Level 3), and build-only platforms (Level 4). Preconfigured (Level 1) systems already have the required external libraries (e.g., NCEPLIBS) available in a central location. The SRW Application is expected to build and run out-of-the-box on these systems, and users can proceed directly to using the workflow to generate an experiment, as described in the Quick Start Guide :numref:`Section %s Generate the Forecast Experiment `. On other platforms, the required libraries will need to be installed via the HPC-Stack (see :numref:`Section %s Installing the HPC-Stack `). Once these libraries are built, applications and models should build and run successfully. However, users may need to perform additional troubleshooting on Level 3 or 4 systems since little or no pre-release testing has been conducted on these systems. +This SRW Application release has been tested on a variety of platforms widely used by researchers, including NOAA High-Performance Computing (HPC) systems (e.g. Hera, Orion), cloud environments, and generic Linux and macOS systems. Four `levels of support `_ have been defined for the SRW Application, including pre-configured (Level 1), configurable (Level 2), limited test platforms (Level 3), and build-only platforms (Level 4). Preconfigured (Level 1) systems already have the required external libraries (e.g., NCEPLIBS) available in a central location. The SRW Application is expected to build and run out-of-the-box on these systems, and users can :ref:`download the SRW code ` without first installing prerequisites. 
On other platforms, the SRW must be :ref:`run within a container ` that contains the HPC-Stack, or the required libraries (i.e., HPC-Stack) will need to be installed as part of the :ref:`non-container ` SRW installation process. Once these prerequisite libraries are built, applications and models should build and run successfully. However, users may need to perform additional troubleshooting on Level 3 or 4 systems since little or no pre-release testing has been conducted on these systems.

 User Support, Documentation, and Contributions to Development
 ===============================================================
diff --git a/docs/UsersGuide/source/Quickstart_Container.rst b/docs/UsersGuide/source/Quickstart_Container.rst
new file mode 100644
index 0000000000..c8ae4cafca
--- /dev/null
+++ b/docs/UsersGuide/source/Quickstart_Container.rst
@@ -0,0 +1,306 @@
+.. _QuickstartC:
+
+=================================================
+Containerized Workflow Quick Start (Recommended)
+=================================================
+
+This Workflow Quick Start Guide will help users build and run the "out-of-the-box" case for the Unified Forecast System (:term:`UFS`) Short-Range Weather (SRW) Application using a container. The container approach provides a uniform environment in which to build and run the SRW. Normally, the details of building and running the SRW vary from system to system due to the many possible combinations of operating systems, compilers, :term:`MPI` implementations, and package versions. Installation via an EPIC-provided container reduces this variability and allows for a smoother SRW build and run experience.
+
+The "out-of-the-box" SRW case described in this guide builds a weather forecast for June 15-16, 2019. Multiple convective weather events during these two days produced over 200 filtered storm reports. Severe weather was clustered in two areas: the Upper Midwest through the Ohio Valley and the Southern Great Plains. This forecast uses a predefined 25-km Continental United States (:term:`CONUS`) grid (RRFS_CONUS_25km), the Global Forecast System (:term:`GFS`) version 15.2 physics suite (FV3_GFS_v15p2 CCPP), and :term:`FV3`-based GFS raw external model data for initialization.
+
+.. _DownloadCodeC:
+
+Download the UFS SRW Application Code
+===========================================
+The SRW Application source code is publicly available on GitHub and can be run in a container or locally, depending on user preference. The SRW Application relies on a variety of components detailed in the :ref:`Components Chapter ` of this User's Guide. Users must (1) clone the UFS SRW Application umbrella repository and then (2) run the ``checkout_externals`` script to link the necessary external repositories to the SRW App. The ``checkout_externals`` script uses the configuration file ``Externals.cfg`` in the top-level directory of the SRW App and will clone the correct version of the regional workflow, pre-processing utilities, UFS Weather Model, and UPP source code into the appropriate directories under the ``regional_workflow`` and ``src`` directories.
+
+Working in the Cloud
+-----------------------
+
+Users building the SRW using NOAA's Cloud resources must complete a few additional steps to ensure the SRW builds and runs correctly. For those working on non-cloud-based systems, skip to :numref:`Step %s `.
+
+On NOAA Cloud systems, certain environment variables must be set *before* building the container:
+
+..
code-block:: + + sudo su + export SINGULARITY_CACHEDIR=/lustre/cache + export SINGULARITY_TEMPDIR=/lustre/tmp + +* If the ``cache`` and ``tmp`` directories do not exist already, they must be created. + +* ``/lustre`` is a fast but non-persistent file system used on NOAA cloud systems. To retain work completed in this directory, `tar the file `__ and move it to the ``/contrib`` directory, which is much slower but persistent. + +On NOAA Cloud systems, allocate a compute node on which to run the SRW. Then, build and run the SRW from that node: + +.. code-block:: console + + salloc -N 1 + module load gnu openmpi + mpirun -n 1 hostname + +This last command will output a hostname. Next, run ``ssh ``, replacing ```` with the actual hostname output in the prior command. + +.. _BuildC: + +Set Up the Container +------------------------ + +Build the container: + +.. code-block:: console + + singularity build --sandbox ubuntu20.04-epic-srwapp-1.0 docker://noaaepic/ubuntu20.04-epic-srwapp:1.0 + +.. hint:: + If a ``singularity: command not found`` error message appears, try running: ``module load singularity``. + +Start the container and run an interactive shell within it: + +.. code-block:: console + + singularity shell -e --writable --bind /:/ ubuntu20.04-epic-srwapp-1.0 + +The command above also binds the local directory to the container so that data can be shared between them. On NOAA systems, the local directory is usually the topmost directory (e.g., /lustre, /contrib, /work, or /home). Additional directories can be bound by adding another ``--bind /:/`` argument before the name of the container. + +.. important:: + * When binding two directories, they must have the same name. It may be necessary to ``cd`` into the container and create an appropriately named directory in the container using the ``mkdir`` command if one is not already there. + * Be sure to bind the directory that contains the data the experiment will access. + +Download the SRW Code +------------------------ + +Clone the develop branch of the UFS-SRW weather application repository: + +.. code-block:: console + + git clone -b feature/singularity-addition https://github.com/EdwardSnyder-NOAA/ufs-srweather-app + +.. + COMMENT: change repo for release + +Check out submodules for the SRW Application: + +.. code-block:: console + + cd ufs-srweather-app + ./manage_externals/checkout_externals + +If the ``manage_externals`` command brings up an error, it may be necessary to run ``ln -s /usr/bin/python3 /usr/bin/python`` first. + + +.. _SetUpBuildC: + +Set up the Build Environment +============================ + +If the SRW Application has been built in a container provided by the Earth Prediction Innovation Center (EPIC), set build environments and modules within the ``ufs-srweather-app`` directory as follows: + +.. code-block:: console + + ln -s /usr/bin/python3 /usr/bin/python + source /usr/share/lmod/6.6/init/profile + module use /opt/hpc-modules/modulefiles/stack + module load hpc hpc-gnu hpc-openmpi hpc-python + module load netcdf hdf5 bacio sfcio sigio nemsio w3emc esmf fms crtm g2 png zlib g2tmpl ip sp w3nco cmake gfsio wgrib2 upp + + + +Build the Executables +===================== + +Create a directory to hold the build's executables: + +.. code-block:: console + + mkdir build + cd build + +From the build directory, run the ``cmake`` command below to set up the ``Makefile``, then run the ``make`` command to build the executables: + +.. code-block:: console + + cmake .. -DCMAKE_INSTALL_PREFIX=.. 
+ make -j 4 >& build.out &
+
+The build will take a few minutes to complete. Because the ``make`` command runs in the background, the shell prints the background job number and process ID when the build starts. When the build finishes, the shell reports ``[1]+ Done`` the next time a command is entered (for example, when listing the files in ``ufs-srweather-app/bin``); a ``[1]+ Exit`` message instead indicates that the build failed. Output from the build will be in the ``ufs-srweather-app/build/build.out`` file. When the build completes, you should see the forecast model executable ``ufs_model`` and several pre- and post-processing executables in the ``ufs-srweather-app/bin`` directory. These executables are described in :numref:`Table %s `.
+
+.. hint::
+
+   If you see the build.out file, but there is no ``ufs-srweather-app/bin`` directory, wait a few more minutes for the build to complete.
+
+Download and Stage the Data
+============================
+
+The SRW requires input files to run. These include static datasets, initial and boundary conditions files, and model configuration files. On Level 1 and 2 systems, the data required to run SRW tests are already available. For Level 3 and 4 systems, the data must be added. Detailed instructions on how to add the data can be found in the :numref:`Section %s Downloading and Staging Input Data `. :numref:`Sections %s ` and :numref:`%s ` contain useful background information on the input and output files used in the SRW.
+
+.. _GenerateForecastC:
+
+Generate the Forecast Experiment
+=================================
+Generating the forecast experiment requires three steps:
+
+* Set experiment parameters
+* Set Python and other environment parameters
+* Run a script to generate the experiment workflow
+
+The first two steps depend on the platform being used and are described here for each Level 1 platform. Users will need to adjust the instructions to their machine if they are working on a Level 2-4 platform.
+
+.. _SetUpConfigFileC:
+
+Set Experiment Parameters
+-------------------------
+Each experiment requires certain basic information to run (e.g., date, grid, physics suite). This information is specified in the ``config.sh`` file. Two example ``config.sh`` templates are provided: ``config.community.sh`` and ``config.nco.sh``. They can be found in the ``ufs-srweather-app/regional_workflow/ush`` directory. The first file is a minimal example for creating and running an experiment in the *community* mode (with ``RUN_ENVIR`` set to ``community``). The second is an example for creating and running an experiment in the *NCO* (operational) mode (with ``RUN_ENVIR`` set to ``nco``). The *community* mode is recommended in most cases and will be fully supported for this release.
+
+Make a copy of ``config.community.sh`` to get started (under ``/regional_workflow/ush``). From the ``ufs-srweather-app`` directory, run:
+
+.. code-block:: console
+
+   cd ../regional_workflow/ush
+   cp config.community.sh config.sh
+
+The default settings in this file include a predefined 25-km :term:`CONUS` grid (RRFS_CONUS_25km), the :term:`GFS` v15.2 physics suite (FV3_GFS_v15p2 CCPP), and :term:`FV3`-based GFS raw external model data for initialization.
+
+Next, edit the new ``config.sh`` file to customize it for your machine. At a minimum, change the ``MACHINE`` and ``ACCOUNT`` variables; then choose a name for the experiment directory by setting ``EXPT_SUBDIR``. If you have pre-staged the initialization data for the experiment, set ``USE_USER_STAGED_EXTRN_FILES="TRUE"``, and set the paths to the data for ``EXTRN_MDL_SOURCE_BASEDIR_ICS`` and ``EXTRN_MDL_SOURCE_BASEDIR_LBCS``. For example:
+
+..
code-block:: console + + MACHINE="SINGULARITY" + ACCOUNT="none" + EXPT_SUBDIR="GST" + EXPT_BASEDIR="home/$USER/expt_dirs" + COMPILER="gnu" + +Sample settings are indicated below for Level 1 platforms. Detailed guidance applicable to all systems can be found in :numref:`Chapter %s: Configuring the Workflow `, which discusses each variable and the options available. Additionally, information about the three predefined Limited Area Model (LAM) Grid options can be found in :numref:`Chapter %s: Limited Area Model (LAM) Grids `. + +.. Important:: + + If you set up the build environment with the GNU compiler in :numref:`Section %s `, you will have to add the line ``COMPILER="gnu"`` to the ``config.sh`` file. + +Minimum parameter settings for Level 1 machines: + +**Cheyenne:** + +.. code-block:: console + + MACHINE="cheyenne" + ACCOUNT="" + EXPT_SUBDIR="" + USE_USER_STAGED_EXTRN_FILES="TRUE" + EXTRN_MDL_SOURCE_BASEDIR_ICS="/glade/p/ral/jntp/UFS_SRW_app/model_data/FV3GFS" + EXTRN_MDL_SOURCE_BASEDIR_LBCS="/glade/p/ral/jntp/UFS_SRW_app/model_data/FV3GFS" + +**Hera:** + +.. code-block:: console + + MACHINE="hera" + ACCOUNT="" + EXPT_SUBDIR="" + +**Jet, Orion, Gaea:** + +The settings are the same as for Hera, except that ``"hera"`` should be switched to ``"jet"``, ``"orion"``, or ``"gaea"``, respectively. + +For **WCOSS**, edit ``config.sh`` with these WCOSS-specific parameters, and use a valid WCOSS project code for the account parameter: + +.. code-block:: console + + MACHINE=”wcoss_cray” or MACHINE=”wcoss_dell_p3” + ACCOUNT="my_account" + EXPT_SUBDIR="my_expt_name" + + +**NOAA Cloud Systems:** + +.. code-block:: console + + MACHINE="SINGULARITY" + ACCOUNT="none" + EXPT_SUBDIR="" + EXPT_BASEDIR="lustre/$USER/expt_dirs" + COMPILER="gnu" + USE_USER_STAGED_EXTRN_FILES="TRUE" + EXTRN_MDL_SOURCE_BASEDIR_ICS="/contrib/GST/model_data/FV3GFS" + EXTRN_MDL_FILES_ICS=( "gfs.pgrb2.0p25.f000" ) + EXTRN_MDL_SOURCE_BASEDIR_LBCS="/contrib/GST/model_data/FV3GFS" + EXTRN_MDL_FILES_LBCS=( "gfs.pgrb2.0p25.f006" "gfs.pgrb2.0p25.f012" ) + + +.. _SetUpPythonEnvC: + +Activate the Regional Workflow +---------------------------------------------- +Next, activate the regional workflow. + +.. code-block:: console + + conda init + source ~/.bashrc + conda activate regional_workflow + +The user should see ``(regional_workflow)`` in front of the Terminal prompt at this point. + + +.. _GenerateWorkflowC: + +Generate the Regional Workflow +------------------------------------------- + +Run the following command to generate the workflow: + +.. code-block:: console + + ./generate_FV3LAM_wflow.sh + +This workflow generation script creates an experiment directory and populates it with all the data needed to run through the workflow. The last line of output from this script should start with ``*/1 * * * *`` or ``*/3 * * * *``. + +The generated workflow will be in experiment directory specified in the ``config.sh`` file in :numref:`Step %s `. The settings for these paths can also be viewed in the console output from the ``./generate_FV3LAM_wflow.sh`` script or in the ``log.generate_FV3LAM_wflow`` file, which can be found in the experiment directory. + +Run the Workflow Using Stand-Alone Scripts +============================================= + +.. note:: + The Rocoto workflow manager cannot be used inside a container. + +#. ``cd`` into the experiment directory + +#. Set the environment variable ``EXPTDIR`` for either bash or csh, respectively: + + .. code-block:: console + + export EXPTDIR=`pwd` + setenv EXPTDIR `pwd` + +#. 
Copy the wrapper scripts from the regional_workflow directory into your experiment directory:
+
+   .. code-block:: console
+
+      cp ufs-srweather-app/regional_workflow/ush/wrappers/* .
+
+#. Set the OMP_NUM_THREADS variable and fix dash/bash shell issue (this ensures the system does not use an alias of ``sh`` to dash).
+
+   .. code-block:: console
+
+      export OMP_NUM_THREADS=1
+      sed -i 's/bin\/sh/bin\/bash/g' *sh
+
+#. Run each of the listed scripts in order. Scripts with the same stage number (listed in :numref:`Table %s `) may be run simultaneously.
+
+   .. code-block:: console
+
+      ./run_make_grid.sh
+      ./run_get_ics.sh
+      ./run_get_lbcs.sh
+      ./run_make_orog.sh
+      ./run_make_sfc_climo.sh
+      ./run_make_ics.sh
+      ./run_make_lbcs.sh
+      ./run_fcst.sh
+      ./run_post.sh
+
+
+Plot the Output
+===============
+Two Python scripts are provided to generate plots from the FV3-LAM post-processed GRIB2 output. Information on how to generate the graphics can be found in :numref:`Chapter %s `.
diff --git a/docs/UsersGuide/source/Quickstart.rst b/docs/UsersGuide/source/Quickstart_NonContainer.rst
similarity index 85%
rename from docs/UsersGuide/source/Quickstart.rst
rename to docs/UsersGuide/source/Quickstart_NonContainer.rst
index a894f580e2..ad6c20ee04 100644
--- a/docs/UsersGuide/source/Quickstart.rst
+++ b/docs/UsersGuide/source/Quickstart_NonContainer.rst
@@ -1,8 +1,8 @@
-.. _Quickstart:
+.. _QuickstartNC:

-====================
-Workflow Quick Start
-====================
+======================================
+Workflow Quick Start (Non-Container)
+======================================

This Workflow Quick Start Guide will help users to build and run the "out-of-the-box" case for the Unified Forecast System (:term:`UFS`) Short-Range Weather (SRW) Application. The "out-of-the-box" case builds a weather forecast for June 15-16, 2019. Multiple convective weather events during these two days produced over 200 filtered storm reports. Severe weather was clustered in two areas: the Upper Midwest through the Ohio Valley and the Southern Great Plains. This forecast uses a predefined 25-km Continental United States (:term:`CONUS`) grid (RRFS_CONUS_25km), the Global Forecast System (:term:`GFS`) version 15.2 physics suite (FV3_GFS_v15p2 CCPP), and :term:`FV3`-based GFS raw external model data for initialization.
@@ -10,6 +10,9 @@ This Workflow Quick Start Guide will help users to build and run the "out-of-the
The UFS defines `four platform levels `_. The steps described in this chapter are most applicable to preconfigured (Level 1) systems. On Level 1 systems, all of the required libraries for building community releases of UFS models and applications are available in a central location. This guide can serve as a starting point for running the SRW App on other systems as well but may require additional troubleshooting by the user.
+.. note::
+   The :ref:`container approach ` is recommended when possible for a smoother build and run experience. Building without a container allows for use of the Rocoto workflow manager and may allow for more customization; however, this comes at the expense of more in-depth troubleshooting, especially on Level 3 and 4 systems.
+
.. _HPCstackInfo:
@@ -34,69 +37,12 @@ Users can either build the HPC-stack on their local system or use the centrally
After completing installation, continue to the next section.
-.. _DownloadCode:
+..
_DownloadCodeNC: Download the UFS SRW Application Code ===================================== The SRW Application source code is publicly available on GitHub and can be run in a container or locally, depending on user preference. The SRW Application relies on a variety of components detailed in the :ref:`Components Chapter ` of this User's Guide. Users must (1) clone the UFS SRW Application umbrella repository and then (2) run the ``checkout_externals`` script to link the necessary external repositories to the SRW App. The ``checkout_externals`` script uses the configuration file ``Externals.cfg`` in the top level directory of the SRW App and will clone the correct version of the regional workflow, pre-processing utilities, UFS Weather Model, and UPP source code into the appropriate directories under the ``regional_workflow`` and ``src`` directories. -Run the UFS SRW in a Singularity Container -------------------------------------------- - -.. note:: - On NOAA Cloud systems, certain environment variables must be set *before* building the container: - - .. code-block:: - - sudo su - export SINGULARITY_CACHEDIR=/lustre/cache - export SINGULARITY_TEMPDIR=/lustre/tmp - - * If the ``cache`` and ``tmp`` directories do not exist already, they must be created. - - * ``/lustre`` is a fast but non-persistent file system used on NOAA cloud systems. To retain work completed in this directory, tar the file and move it to the ``/contrib`` directory, which is much slower but persistent. - -Build the container: - -.. code-block:: console - - singularity build --sandbox ubuntu20.04-epic-srwapp-1.0 docker://noaaepic/ubuntu20.04-epic-srwapp:1.0 - -.. note:: - If a ``singularity: command not found`` error message appears, try running: ``module load singularity``. - - -Start the container and run an interactive shell within it. This command also binds the local directory to the container so that data can be shared between them. On NOAA systems, the local directory is usually the topmost/base/root directory (e.g., /lustre, /contrib, /work, or /home). Additional directories can be bound by adding another ``--bind /:/`` argument before the name of the container. - -.. code-block:: console - - singularity shell -e --writable --bind /:/ ubuntu20.04-epic-srwapp-1.0 - -.. important:: - * When binding two directories, they must have the same name. It may be necessary to create an appropriately named directory in the container using the ``mkdir`` command if one is not already there. - * Be sure to bind the directory that contains the data the experiment will access. - -Clone the develop branch of the UFS-SRW weather application repository: - -.. code-block:: console - - git clone -b feature/singularity-addition https://github.com/EdwardSnyder-NOAA/ufs-srweather-app - -.. - COMMENT: change repo for release - -Check out submodules for the SRW Application: - -.. code-block:: console - - cd ufs-srweather-app - ./manage_externals/checkout_externals - -If the ``manage_externals`` command brings up an error, it may be necessary to run ``ln -s /usr/bin/python3 /usr/bin/python`` first. - -Run the UFS SRW Without a Container ------------------------------------- - Clone the release branch of the repository: .. code-block:: console @@ -114,27 +60,11 @@ Then, run the executable that pulls in the submodules for the SRW Application: ./manage_externals/checkout_externals -.. _SetUpBuild: +.. 
_SetUpBuildNC: Set up the Build Environment ============================ -Container Approach --------------------- -If the SRW Application has been built in a container provided by the Earth Prediction Innovation Center (EPIC), set build environments and modules within the ``ufs-srweather-app`` directory as follows: - -.. code-block:: console - - ln -s /usr/bin/python3 /usr/bin/python - source /usr/share/lmod/6.6/init/profile - module use /opt/hpc-modules/modulefiles/stack - module load hpc hpc-gnu hpc-openmpi hpc-python - module load netcdf hdf5 bacio sfcio sigio nemsio w3emc esmf fms crtm g2 png zlib g2tmpl ip sp w3nco cmake gfsio wgrib2 upp - - -On Other Systems (Non-Container Approach) ------------------------------------------- - For Level 1 and 2 systems, scripts for loading the proper modules and/or setting the correct environment variables can be found in the ``env/`` directory of the SRW App in files named ``build__.env``. The commands in these files can be directly copy-pasted @@ -174,7 +104,7 @@ Download and Stage the Data The SRW requires input files to run. These include static datasets, initial and boundary conditions files, and model configuration files. On Level 1 and 2 systems, the data required to run SRW tests are already available. For Level 3 and 4 systems, the data must be added. Detailed instructions on how to add the data can be found in the :numref:`Section %s Downloading and Staging Input Data `. :numref:`Sections %s ` and :numref:`%s ` contain useful background information on the input and output files used in the SRW. -.. _GenerateForecast: +.. _GenerateForecastNC: Generate the Forecast Experiment ================================= @@ -186,7 +116,7 @@ Generating the forecast experiment requires three steps: The first two steps depend on the platform being used and are described here for each Level 1 platform. Users will need to adjust the instructions to their machine if they are working on a Level 2-4 platform. -.. _SetUpConfigFile: +.. _SetUpConfigFileNC: Set Experiment Parameters ------------------------- @@ -215,7 +145,7 @@ Sample settings are indicated below for Level 1 platforms. Detailed guidance app .. Important:: - If you set up the build environment with the GNU compiler in :numref:`Section %s `, you will have to add the line ``COMPILER="gnu"`` to the ``config.sh`` file. + If you set up the build environment with the GNU compiler in :numref:`Section %s `, you will have to add the line ``COMPILER="gnu"`` to the ``config.sh`` file. Minimum parameter settings for Level 1 machines: @@ -286,7 +216,7 @@ This command will activate the ``regional_workflow``. The user should see ``(reg conda activate regional_workflow -.. _GenerateWorkflow: +.. _GenerateWorkflowNC: Generate the Regional Workflow ------------------------------------------- @@ -299,7 +229,7 @@ Run the following command to generate the workflow: The last line of output from this script, starting with ``*/1 * * * *``, can be saved and :ref:`used later ` to automatically run portions of the workflow. -This workflow generation script creates an experiment directory and populates it with all the data needed to run through the workflow. The generated workflow will be in ``$EXPTDIR``, where ``EXPTDIR=${EXPT_BASEDIR}/${EXPT_SUBDIR}``. These variables were specified in the ``config.sh`` file in :numref:`Step %s `. The settings for these paths can also be viewed in the console output from the ``./generate_FV3LAM_wflow.sh`` script or in the ``log.generate_FV3LAM_wflow`` file, which can be found in $EXPTDIR. 
+This workflow generation script creates an experiment directory and populates it with all the data needed to run through the workflow. The generated workflow will be in ``$EXPTDIR``, where ``EXPTDIR=${EXPT_BASEDIR}/${EXPT_SUBDIR}``. These variables were specified in the ``config.sh`` file in :numref:`Step %s `. The settings for these paths can also be viewed in the console output from the ``./generate_FV3LAM_wflow.sh`` script or in the ``log.generate_FV3LAM_wflow`` file, which can be found in $EXPTDIR. An environment variable can be set to navigate to the ``$EXPTDIR`` more easily. If the login shell is bash, it can be set as follows: diff --git a/docs/UsersGuide/source/SRWAppOverview.rst b/docs/UsersGuide/source/SRWAppOverview.rst index c93b9e8a28..9810f9955b 100644 --- a/docs/UsersGuide/source/SRWAppOverview.rst +++ b/docs/UsersGuide/source/SRWAppOverview.rst @@ -155,23 +155,23 @@ executables listed in :numref:`Table %s ` will be located in th +------------------------+---------------------------------------------------------------------------------+ | vcoord_gen | Generates hybrid coordinate interface profiles | +------------------------+---------------------------------------------------------------------------------+ - | fvcom_to_FV3 | | + | fvcom_to_FV3 | | +------------------------+---------------------------------------------------------------------------------+ - | make_hgrid | | + | make_hgrid | | +------------------------+---------------------------------------------------------------------------------+ - | emcsfc_ice_blend | | + | emcsfc_ice_blend | | +------------------------+---------------------------------------------------------------------------------+ - | emcsfc_snow2mdl | | + | emcsfc_snow2mdl | | +------------------------+---------------------------------------------------------------------------------+ - | global_cycle | | + | global_cycle | | +------------------------+---------------------------------------------------------------------------------+ - | inland | | + | inland | | +------------------------+---------------------------------------------------------------------------------+ - | orog_gsl | | + | orog_gsl | | +------------------------+---------------------------------------------------------------------------------+ - | fregrid | | + | fregrid | | +------------------------+---------------------------------------------------------------------------------+ - | lakefrac | | + | lakefrac | | +------------------------+---------------------------------------------------------------------------------+ .. _GridSpecificConfig: @@ -640,20 +640,11 @@ Wait a few seconds and issue a second set of ``rocotorun`` and ``rocotostat`` co Run the Workflow Using the Stand-alone Scripts ---------------------------------------------- -The regional workflow has the capability to be run using standalone shell scripts if the -Rocoto software is not available on a given platform. These scripts are located in the -``ufs-srweather-app/regional_workflow/ush/wrappers`` directory. Each workflow task has -a wrapper script to set environment variables and run the job script. +The regional workflow has the capability to be run using standalone shell scripts if the Rocoto software is not available on a given platform. These scripts are located in the ``ufs-srweather-app/regional_workflow/ush/wrappers`` directory. Each workflow task has a wrapper script to set environment variables and run the job script. 
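+
+For orientation, a single task can be submitted to a batch system by wrapping one of these scripts in a short job script. The sketch below is illustrative only and assumes a Slurm system; the account name, wall-clock time, and module set are placeholders that must be replaced with values valid on the user's platform (the ``sq_job.sh`` and ``qsub_job.sh`` scripts described below are the supported starting points):
+
+.. code-block:: console
+
+   #!/bin/bash
+   #SBATCH --account=<my_account>     # hypothetical account name
+   #SBATCH --nodes=1
+   #SBATCH --time=00:30:00
+
+   # Load the same modules used to build the SRW so that run-time libraries (e.g., netCDF, MPI) match
+   module load <build_modules>
+
+   # Run one wrapper script from the experiment directory
+   cd $EXPTDIR
+   ./run_fcst.sh
+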
Example batch-submit scripts for Hera (Slurm) and Cheyenne (PBS) are included: ``sq_job.sh`` -and ``qsub_job.sh``, respectively. These examples set the build and run environment for Hera or Cheyenne -so that run-time libraries match the compiled libraries (i.e. netCDF, MPI). Users may either -modify the submit batch script as each task is submitted, or duplicate this batch wrapper -for their system settings for each task. Alternatively, some batch systems allow users to -specify most of the settings on the command line (with the ``sbatch`` or ``qsub`` command, -for example). This piece will be unique to your platform. The tasks run by the regional workflow -are shown in :numref:`Table %s `. Tasks with the same stage level may -be run concurrently (no dependency). +and ``qsub_job.sh``, respectively. These examples set the build and run environment for Hera or Cheyenne so that run-time libraries match the compiled libraries (i.e. netCDF, MPI). Users may either modify the submit batch script as each task is submitted, or duplicate this batch wrapper +for their system settings for each task. Alternatively, some batch systems allow users to specify most of the settings on the command line (with the ``sbatch`` or ``qsub`` command, for example). This piece will be unique to your platform. The tasks run by the regional workflow are shown in :numref:`Table %s `. Tasks with the same stage level may be run concurrently (no dependency). .. _RegionalWflowTasks: @@ -713,15 +704,14 @@ The steps to run the standalone scripts are as follows: cp ufs-srweather-app/regional_workflow/ush/wrappers/* . -#. f00 +#. Set the OMP_NUM_THREADS variable and fix dash/bash shell issue (this ensures the system does not use an alias of ``sh`` to dash). -.. code-block:: + .. code-block:: console - export OMP_NUM_THREADS=1 - sed -i 's/bin\/sh/bin\/bash/g' *sh + export OMP_NUM_THREADS=1 + sed -i 's/bin\/sh/bin\/bash/g' *sh -#. RUN each of the listed scripts in order. Scripts with the same stage number - may be run simultaneously. +#. RUN each of the listed scripts in order. Scripts with the same stage number (listed in :numref:`Table %s `) may be run simultaneously. .. code-block:: console @@ -735,14 +725,25 @@ The steps to run the standalone scripts are as follows: ./run_fcst.sh ./run_post.sh + .. note:: + If any of the scripts return an error that "Primary job terminated normally, but one process returned a non-zero exit code," there may not be enough space on one node to run the process. To allocate a second node: + + .. code-block:: console + + salloc -N 1 + module load gnu openmpi + mpirun -n 1 hostname + + This last command will output a hostname. Then, run ``ssh ``, replacing ```` with the actual hostname output in the prior command. + - #. On most HPC systems, you will need to submit a batch job to run multi-processor jobs. + #. On most HPC systems, you will need to submit a batch job to run multi-processor jobs. - #. On some HPC systems, you may be able to run the first two jobs (serial) on a login node/command-line + #. On some HPC systems, you may be able to run the first two jobs (serial) on a login node/command-line - #. Example scripts for Slurm (Hera) and PBS (Cheyenne) are provided. These will need to be adapted to your system. + #. Example scripts for Slurm (Hera) and PBS (Cheyenne) are provided. These will need to be adapted to your system. - #. This submit batch script is hard-coded per task, so will need to be modified or copied to run each task. + #. 
This submit batch script is hard-coded per task, so will need to be modified or copied to run each task. Check the batch script output file in your experiment directory for a “SUCCESS” message near the end of the file. diff --git a/docs/UsersGuide/source/WE2Etests.rst b/docs/UsersGuide/source/WE2Etests.rst index 7332bc4b5f..3afa207d2a 100644 --- a/docs/UsersGuide/source/WE2Etests.rst +++ b/docs/UsersGuide/source/WE2Etests.rst @@ -15,7 +15,7 @@ in the file ``testlist.release_public_v1.txt``. The base experiment configuration file for each test is located in the ``baseline_configs`` subdirectory. Each file is named ``config.${expt_name}.sh``, where ``${expt_name}`` is the name of the corresponding test configuration. These base configuration files are subsets of -the full ``config.sh`` experiment configuration file used in :numref:`Section %s ` +the full ``config.sh`` experiment configuration file used in :numref:`Section %s ` and described in :numref:`Section %s `. For each test that the user wants to run, the ``run_experiments.sh`` script reads in its base configuration file and generates from it a full ``config.sh`` file (a copy of which is placed in the experiment directory for the test). diff --git a/docs/UsersGuide/source/index.rst b/docs/UsersGuide/source/index.rst index 96ac2e51c4..290b1ec482 100644 --- a/docs/UsersGuide/source/index.rst +++ b/docs/UsersGuide/source/index.rst @@ -12,7 +12,8 @@ UFS Short-Range Weather App Users Guide Introduction - Quickstart + Quickstart_Container + Quickstart_NonContainer CodeReposAndDirs SRWAppOverview Components From 3e300980ed04e41b2a9a3287855485eab5fe79bc Mon Sep 17 00:00:00 2001 From: gspetro Date: Wed, 9 Mar 2022 16:50:09 -0500 Subject: [PATCH 048/118] added filepath changes for running in container on Orion, et al. --- docs/UsersGuide/source/InputOutputFiles.rst | 2 + .../source/Quickstart_Container.rst | 97 +++++++++---------- .../source/Quickstart_NonContainer.rst | 2 +- 3 files changed, 49 insertions(+), 52 deletions(-) diff --git a/docs/UsersGuide/source/InputOutputFiles.rst b/docs/UsersGuide/source/InputOutputFiles.rst index 65883c796f..d1079c78fb 100644 --- a/docs/UsersGuide/source/InputOutputFiles.rst +++ b/docs/UsersGuide/source/InputOutputFiles.rst @@ -223,6 +223,8 @@ The paths to the staged files must then be set in ``config.sh``. Add the followi * ``TOPO_DIR=/path-to/fix/fix_am/fix_orog`` * ``SFC_CLIMO_INPUT_DIR=/path-to/fix_am/fix/sfc_climo/`` +.. _InitialConditions: + Initial Condition Formats and Source ------------------------------------ The SRW Application currently supports raw initial and lateral boundary conditions from numerous models (i.e., FV3GFS, NAM, RAP, HRRR). The data can be provided in three formats: :term:`NEMSIO`, netCDF, or :term:`GRIB2`. The SRW Application currently only supports the use of NEMSIO and netCDF input files from the GFS. diff --git a/docs/UsersGuide/source/Quickstart_Container.rst b/docs/UsersGuide/source/Quickstart_Container.rst index c8ae4cafca..87b65a31a6 100644 --- a/docs/UsersGuide/source/Quickstart_Container.rst +++ b/docs/UsersGuide/source/Quickstart_Container.rst @@ -14,6 +14,15 @@ Download the UFS SRW Application Code =========================================== The SRW Application source code is publicly available on GitHub and can be run in a container or locally, depending on user preference. The SRW Application relies on a variety of components detailed in the :ref:`Components Chapter ` of this User's Guide. 
Users must (1) clone the UFS SRW Application umbrella repository and then (2) run the ``checkout_externals`` script to link the necessary external repositories to the SRW App. The ``checkout_externals`` script uses the configuration file ``Externals.cfg`` in the top level directory of the SRW App and will clone the correct version of the regional workflow, pre-processing utilities, UFS Weather Model, and UPP source code into the appropriate directories under the ``regional_workflow`` and ``src`` directories. +Prerequisites: Install Singularity +------------------------------------ + +To build and run the SRW App using a Singularity container, first install the Singularity package according to the `Singularity Installation Guide `_. This will include the installation of dependencies and the installation of the Go programming language. SingularityCE Version 3.7 or above is recommended. + +.. warning:: + Docker containers can only be run with root privileges, and users cannot have root privileges on HPC's. Therefore, it is not possible to build the HPC-Stack inside a Docker container on an HPC system. A Docker image may be pulled, but it must be run inside a container such as Singularity. + + Working in the Cloud ----------------------- @@ -63,7 +72,7 @@ Start the container and run an interactive shell within it: The command above also binds the local directory to the container so that data can be shared between them. On NOAA systems, the local directory is usually the topmost directory (e.g., /lustre, /contrib, /work, or /home). Additional directories can be bound by adding another ``--bind /:/`` argument before the name of the container. -.. important:: +.. attention:: * When binding two directories, they must have the same name. It may be necessary to ``cd`` into the container and create an appropriately named directory in the container using the ``mkdir`` command if one is not already there. * Be sure to bind the directory that contains the data the experiment will access. @@ -74,7 +83,7 @@ Clone the develop branch of the UFS-SRW weather application repository: .. code-block:: console - git clone -b feature/singularity-addition https://github.com/EdwardSnyder-NOAA/ufs-srweather-app + git clone -b develop https://github.com/ufs-community/ufs-srweather-app.git .. COMMENT: change repo for release @@ -141,9 +150,9 @@ Generate the Forecast Experiment ================================= Generating the forecast experiment requires three steps: -* Set experiment parameters -* Set Python and other environment parameters -* Run a script to generate the experiment workflow +* :ref:`Set experiment parameters ` +* :ref:`Set Python and other environment parameters ` +* :ref:`Run a script to generate the experiment workflow ` The first two steps depend on the platform being used and are described here for each Level 1 platform. Users will need to adjust the instructions to their machine if they are working on a Level 2-4 platform. @@ -162,71 +171,57 @@ Make a copy of ``config.community.sh`` to get started (under ```, which discusses each variable and the options available. Additionally, information about the three predefined Limited Area Model (LAM) Grid options can be found in :numref:`Chapter %s: Limited Area Model (LAM) Grids `. - -.. Important:: +Additionally, set ``USE_USER_STAGED_EXTRN_FILES="TRUE"``, and add the correct paths to the data. 
The following is a sample for a 48-hour forecast: - If you set up the build environment with the GNU compiler in :numref:`Section %s `, you will have to add the line ``COMPILER="gnu"`` to the ``config.sh`` file. +.. code-block:: -Minimum parameter settings for Level 1 machines: - -**Cheyenne:** - -.. code-block:: console - - MACHINE="cheyenne" - ACCOUNT="" - EXPT_SUBDIR="" USE_USER_STAGED_EXTRN_FILES="TRUE" - EXTRN_MDL_SOURCE_BASEDIR_ICS="/glade/p/ral/jntp/UFS_SRW_app/model_data/FV3GFS" - EXTRN_MDL_SOURCE_BASEDIR_LBCS="/glade/p/ral/jntp/UFS_SRW_app/model_data/FV3GFS" - -**Hera:** - -.. code-block:: console - - MACHINE="hera" - ACCOUNT="" - EXPT_SUBDIR="" - -**Jet, Orion, Gaea:** + EXTRN_MDL_SOURCE_BASEDIR_ICS="/path/to/model_data/FV3GFS" + EXTRN_MDL_FILES_ICS=( "gfs.pgrb2.0p25.f000" ) + EXTRN_MDL_SOURCE_BASEDIR_LBCS="/path/to/model_data/FV3GFS" + EXTRN_MDL_FILES_LBCS=( "gfs.pgrb2.0p25.f006" "gfs.pgrb2.0p25.f012" "gfs.pgrb2.0p25.f018" "gfs.pgrb2.0p25.f024" \ "gfs.pgrb2.0p25.f030" "gfs.pgrb2.0p25.f036" "gfs.pgrb2.0p25.f042" "gfs.pgrb2.0p25.f048" ) -The settings are the same as for Hera, except that ``"hera"`` should be switched to ``"jet"``, ``"orion"``, or ``"gaea"``, respectively. +On Level 1 systems, ``/path/to/model_data/FV3GFS`` should correspond to the location of the machine's global data. Alternatively, the user can add the path to their local data if they downloaded it as described in :numref:`Step %s `. -For **WCOSS**, edit ``config.sh`` with these WCOSS-specific parameters, and use a valid WCOSS project code for the account parameter: +On NOAA Cloud platforms, users may continue to the :ref:`next step `. On other Level 1 systems, additional file paths must be set: -.. code-block:: console + #. From the ``regional_workflow/ush`` directory, run: ``cd machine``. + #. Open the file corresponding to the Level 1 platform in use (e.g., ``vi orion.sh``). + #. Copy the section of code starting after ``#UFS SRW App specific paths``. For example, on Orion, the following text must be copied: - MACHINE=”wcoss_cray” or MACHINE=”wcoss_dell_p3” - ACCOUNT="my_account" - EXPT_SUBDIR="my_expt_name" + .. code-block:: console + FIXgsm=${FIXgsm:-"/work/noaa/global/glopara/fix/fix_am"} + FIXaer=${FIXaer:-"/work/noaa/global/glopara/fix/fix_aer"} + FIXlut=${FIXlut:-"/work/noaa/global/glopara/fix/fix_lut"} + TOPO_DIR=${TOPO_DIR:-"/work/noaa/global/glopara/fix/fix_orog"} + SFC_CLIMO_INPUT_DIR=${SFC_CLIMO_INPUT_DIR:-"/work/noaa/global/glopara/fix/fix_sfc_climo"} + FIXLAM_NCO_BASEDIR=${FIXLAM_NCO_BASEDIR:-"/needs/to/be/specified"} -**NOAA Cloud Systems:** + #. Exit the system-specific file and open the ``singularity.sh`` file. + #. Comment out or delete the corresponding chunk of text in the ``singularity.sh`` file, and paste the correct paths from the system-specific file in its place. For example, on Orion, delete the text below, and replace it with the Orion-specific text copied in the previous step. -.. code-block:: console + .. 
code-block:: console - MACHINE="SINGULARITY" - ACCOUNT="none" - EXPT_SUBDIR="" - EXPT_BASEDIR="lustre/$USER/expt_dirs" - COMPILER="gnu" - USE_USER_STAGED_EXTRN_FILES="TRUE" - EXTRN_MDL_SOURCE_BASEDIR_ICS="/contrib/GST/model_data/FV3GFS" - EXTRN_MDL_FILES_ICS=( "gfs.pgrb2.0p25.f000" ) - EXTRN_MDL_SOURCE_BASEDIR_LBCS="/contrib/GST/model_data/FV3GFS" - EXTRN_MDL_FILES_LBCS=( "gfs.pgrb2.0p25.f006" "gfs.pgrb2.0p25.f012" ) + # UFS SRW App specific paths + FIXgsm=${FIXgsm:-"/contrib/global/glopara/fix/fix_am"} + FIXaer=${FIXaer:-"/contrib/global/glopara/fix/fix_aer"} + FIXlut=${FIXlut:-"/contrib/global/glopara/fix/fix_lut"} + TOPO_DIR=${TOPO_DIR:-"/contrib/global/glopara/fix/fix_orog"} + SFC_CLIMO_INPUT_DIR=${SFC_CLIMO_INPUT_DIR:-"/contrib/global/glopara/fix/fix_sfc_climo"} + FIXLAM_NCO_BASEDIR=${FIXLAM_NCO_BASEDIR:-"/needs/to/be/specified"} +From here, it should be possible to continue to the :ref:`next step ` on Level 1 systems. Detailed guidance applicable to all systems can be found in :numref:`Chapter %s: Configuring the Workflow `, which discusses each variable and the options available. Additionally, information about the three predefined Limited Area Model (LAM) Grid options can be found in :numref:`Chapter %s: Limited Area Model (LAM) Grids `. .. _SetUpPythonEnvC: diff --git a/docs/UsersGuide/source/Quickstart_NonContainer.rst b/docs/UsersGuide/source/Quickstart_NonContainer.rst index ad6c20ee04..4339c5ab79 100644 --- a/docs/UsersGuide/source/Quickstart_NonContainer.rst +++ b/docs/UsersGuide/source/Quickstart_NonContainer.rst @@ -47,7 +47,7 @@ Clone the release branch of the repository: .. code-block:: console - git clone -b ufs-v1.0.0 https://github.com/ufs-community/ufs-srweather-app.git + git clone -b develop https://github.com/ufs-community/ufs-srweather-app.git .. COMMENT: This will need to be changed to the updated release branch of the SRW repo once it exists. From 53807fa7f18917b1501119eb07a41538b62aec8b Mon Sep 17 00:00:00 2001 From: gspetro Date: Thu, 10 Mar 2022 16:55:03 -0500 Subject: [PATCH 049/118] edits to overview and container QS --- .../source/Quickstart_Container.rst | 19 +++-- docs/UsersGuide/source/SRWAppOverview.rst | 70 ++++++++----------- 2 files changed, 41 insertions(+), 48 deletions(-) diff --git a/docs/UsersGuide/source/Quickstart_Container.rst b/docs/UsersGuide/source/Quickstart_Container.rst index 87b65a31a6..413c72e283 100644 --- a/docs/UsersGuide/source/Quickstart_Container.rst +++ b/docs/UsersGuide/source/Quickstart_Container.rst @@ -20,13 +20,13 @@ Prerequisites: Install Singularity To build and run the SRW App using a Singularity container, first install the Singularity package according to the `Singularity Installation Guide `_. This will include the installation of dependencies and the installation of the Go programming language. SingularityCE Version 3.7 or above is recommended. .. warning:: - Docker containers can only be run with root privileges, and users cannot have root privileges on HPC's. Therefore, it is not possible to build the HPC-Stack inside a Docker container on an HPC system. A Docker image may be pulled, but it must be run inside a container such as Singularity. + Docker containers can only be run with root privileges, and users cannot have root privileges on HPC's. Therefore, it is not possible to build the SRW, which uses the HPC-Stack, inside a Docker container on an HPC system. A Docker image may be pulled, but it must be run inside a container such as Singularity. 
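+
+As an optional sanity check, users can confirm that Singularity is available and meets the version recommendation before proceeding. On systems that provide Singularity as a module, the module may need to be loaded first; the module name shown here is an assumption and may differ by platform:
+
+.. code-block:: console
+
+   module load singularity    # only needed on systems that provide Singularity as a module
+   singularity --version
+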
Working in the Cloud ----------------------- -Users building the SRW using NOAA's Cloud resources must complete a few additional steps to ensure the SRW builds and runs correctly. For those working on non-cloud-based systems, skip to :numref:`Step %s `. +For those working on non-cloud-based systems, skip to :numref:`Step %s `. Users building the SRW using NOAA's Cloud resources must complete a few additional steps to ensure the SRW builds and runs correctly. On NOAA Cloud systems, certain environment variables must be set *before* building the container: @@ -40,15 +40,22 @@ On NOAA Cloud systems, certain environment variables must be set *before* buildi * ``/lustre`` is a fast but non-persistent file system used on NOAA cloud systems. To retain work completed in this directory, `tar the file `__ and move it to the ``/contrib`` directory, which is much slower but persistent. -On NOAA Cloud systems, allocate a compute node on which to run the SRW. Then, build and run the SRW from that node: +.. _WorkOnHPC: + +Working on HPC Systems +-------------------------- + +Those *not* working on HPC systems may skip to the `next step `. +On HPC systems (including NOAA's Cloud platforms), allocate a compute node on which to run the SRW. On some systems, it may be necessary to run the command ``module load singularity`` first. .. code-block:: console salloc -N 1 - module load gnu openmpi + module load openmpi mpirun -n 1 hostname - -This last command will output a hostname. Next, run ``ssh ``, replacing ```` with the actual hostname output in the prior command. + ssh + +The compiler options are ``gnu`` or ``intel``. The third command will output a hostname. This hostname should replace ```` in the last command. After "ssh-ing" to the compute node in the last command, build and run the SRW from that node. .. _BuildC: diff --git a/docs/UsersGuide/source/SRWAppOverview.rst b/docs/UsersGuide/source/SRWAppOverview.rst index 9810f9955b..cc4c88d8f2 100644 --- a/docs/UsersGuide/source/SRWAppOverview.rst +++ b/docs/UsersGuide/source/SRWAppOverview.rst @@ -3,14 +3,10 @@ ======================================== Short-Range Weather Application Overview ======================================== -The UFS Short-Range Weather Application (SRW App) is an umbrella repository that contains the tool -``manage_externals`` to check out all of the components required for the application. Once the + +The UFS Short-Range Weather Application (SRW App) is an umbrella repository consisting of a number of different :ref:`components ` housed in external repositories. The SRW APP assembles the required components using the ``manage_externals/checkout_externals`` script. Once the build process is complete, all the files and executables necessary for a regional experiment are -located in the ``regional_workflow`` and ``bin`` directories, respectively, under the ``ufs-srweather-app`` directory. -Users can utilize the pre-defined domains or build their own domain (details provided in :numref:`Chapter %s `). -In either case, users must create/modify the case-specific (``config.sh``) and/or grid-specific configuration -files (``set_predef_grid_params.sh``). The overall procedure is shown in :numref:`Figure %s `, -with the scripts to generate and run the workflow shown in red. The steps are as follows: +located in the ``regional_workflow`` and ``bin`` directories, respectively, under the ``ufs-srweather-app`` directory. Users can utilize the pre-defined domains (grids) or build their own domain (details provided in :numref:`Chapter %s `). 
In either case, users must create/modify the case-specific (``config.sh``) and/or grid-specific configuration files (``set_predef_grid_params.sh``). The overall procedure is shown in :numref:`Figure %s `, with the scripts to generate and run the workflow shown in red. The steps are as follows: #. Clone the UFS Short Range Weather Application from GitHub. #. Check out the external repositories. @@ -27,7 +23,7 @@ Each step will be described in detail in the following sections. .. figure:: _static/FV3LAM_wflow_overall.png - *Overall layout of the SRW App.* + *Overall layout of the SRW App* .. _DownloadSRWApp: @@ -40,12 +36,15 @@ Retrieve the UFS Short Range Weather Application (SRW App) repository from GitHu git clone -b ufs-v1.0.0 https://github.com/ufs-community/ufs-srweather-app.git cd ufs-srweather-app +.. + COMMENT: Change version number in 2 places above! + The cloned repository contains the configuration files and sub-directories shown in :numref:`Table %s `. .. _FilesAndSubDirs: -.. table:: Files and sub-directories of the ufs-srweather-app repository. +.. table:: Files and sub-directories of the ufs-srweather-app repository +--------------------------------+--------------------------------------------------------+ | **File/directory Name** | **Description** | @@ -57,7 +56,7 @@ The cloned repository contains the configuration files and sub-directories shown +--------------------------------+--------------------------------------------------------+ | LICENSE.md | CC0 license information | +--------------------------------+--------------------------------------------------------+ - | README.md | Quick start guide | + | README.md | Quick Start Guide | +--------------------------------+--------------------------------------------------------+ | ufs_srweather_app_meta.h.in | Meta information for SRW App which can be used by | | | other packages | @@ -78,14 +77,13 @@ The cloned repository contains the configuration files and sub-directories shown External Components =================== -Check out the external repositories, including regional_workflow, ufs-weather-model, ufs_utils, and emc_post for the SRW App. +Check out the external repositories, including regional_workflow, ufs-weather-model, ufs_utils, and upp.x for the SRW App. .. code-block:: console ./manage_externals/checkout_externals -This step will use the configuration ``Externals.cfg`` file in the ``ufs-srweather-app`` directory to -clone the specific tags (version of codes) of the external repositories as listed in +This step will use the configuration file ``Externals.cfg`` in the ``ufs-srweather-app`` directory to clone the correct tags (code versions) of the external repositories as listed in :numref:`Section %s `. .. _BuildExecutables: @@ -96,8 +94,7 @@ Before building the executables, the build environment must be set up for your s Instructions for loading the proper modules and/or setting the correct environment variables can be found in the ``env/`` directory in files named ``build__.env.`` For the most part, the commands in those files can be directly copied and pasted, but you may need to modify -certain variables such as the path to NCEP libraries for your specific platform. Here is a directory -listing example of these kinds of files: +certain variables such as the path to NCEP libraries for your specific platform. Here is a directory listing example of these kinds of files: .. code-block:: console @@ -111,17 +108,14 @@ The following steps will build the pre-processing utilities, forecast model, and .. 
code-block:: console - make dir + mkdir build cd build cmake .. -DCMAKE_INSTALL_PREFIX=.. make -j 4 >& build.out & where ``-DCMAKE_INSTALL_PREFIX`` specifies the location in which the ``bin``, ``include``, ``lib``, and ``share`` directories containing various components of the SRW App will be created, and its -recommended value ``..`` denotes one directory up from the build directory. In the next line for -the ``make`` call, ``-j 4`` indicates the build will run in parallel with 4 threads. If this step is successful, the -executables listed in :numref:`Table %s ` will be located in the -``ufs-srweather-app/bin`` directory. +recommended value ``..`` denotes one directory up from the build directory. In the next line, the ``make`` call argument ``-j 4`` indicates that the build will run in parallel with 4 threads. If this step is successful, the executables listed in :numref:`Table %s ` will be located in the ``ufs-srweather-app/bin`` directory. .. _ExecDescription: @@ -157,7 +151,8 @@ executables listed in :numref:`Table %s ` will be located in th +------------------------+---------------------------------------------------------------------------------+ | fvcom_to_FV3 | | +------------------------+---------------------------------------------------------------------------------+ - | make_hgrid | | + | make_hgrid | Computes geo-referencing parameters (e.g., latitude, longitude, grid cell area) | + | | for global uniform grids | | +------------------------+---------------------------------------------------------------------------------+ | emcsfc_ice_blend | | +------------------------+---------------------------------------------------------------------------------+ @@ -206,18 +201,17 @@ can be found in :numref:`Chapter %s `. Case-specific Configuration ============================= +When generating a new experiment (described in detail in :numref:`Section %s `), the SRW App first reads and assigns default values from the ``config_defaults.sh`` file. Then, it reads and (re)assigns variables from the user's custom ``config.sh`` file, located in the +``ufs-srweather-app/regional_workflow/ush`` directory. + .. _DefaultConfigSection: Default configuration: ``config_defaults.sh`` ------------------------------------------------ -When generating a new experiment (described in detail in :numref:`Section %s `), -the ``config_defaults.sh`` file is read first and assigns default values to the experiment -parameters. Important configuration variables in the ``config_defaults.sh`` file are shown in -:numref:`Table %s `, with more documentation found in the file itself, and -in :numref:`Chapter %s `. Some of these default values are intentionally invalid in order -to ensure that the user assigns valid values in the user-specified configuration ``config.sh`` file. -Therefore, any settings provided in ``config.sh`` will override the default ``config_defaults.sh`` -settings. Note that there is usually no need for a user to modify the default configuration file. +Important configuration variables in the ``config_defaults.sh`` file appear in +:numref:`Table %s `. Some of these default values are intentionally invalid in order to ensure that the user assigns valid values in the user-specified configuration ``config.sh`` file. Any settings provided in ``config.sh`` will override the default ``config_defaults.sh`` +settings. Note that there is usually no need for a user to modify the default configuration file. 
Additional information on the default settings can be found in the file itself and +in :numref:`Chapter %s `. .. _ConfigVarsDefault: @@ -330,15 +324,7 @@ settings. Note that there is usually no need for a user to modify the default co User-specific configuration: ``config.sh`` ------------------------------------------ -Before generating an experiment, the user must create a ``config.sh`` file in the -``ufs-srweather-app/regional_workflow/ush`` directory by copying either of the example -configuration files, ``config.community.sh`` for the community mode or ``config.nco.sh`` for -the NCO mode, or creating their own ``config.sh`` file. Note that the *community mode* is -recommended in most cases and will be fully supported for this release while the operational/NCO -mode will be more exclusively used by those at the NOAA/NCEP/Environmental Modeling Center (EMC) -and the NOAA/Global Systems Laboratory (GSL) working on pre-implementation testing. -:numref:`Table %s ` shows the configuration variables, along with their default -values in ``config_default.sh`` and the values defined in ``config.community.sh``. +The user must create a ``config.sh`` file in the ``ufs-srweather-app/regional_workflow/ush`` directory by copying either of the example configuration files (``config.community.sh`` for the community mode or ``config.nco.sh`` for the operational mode). Alternatively, the user can create a custom ``config.sh`` file from scratch. Note that the *community mode* is recommended in most cases and will be fully supported for this release while the operational/NCO mode will typically be used by those at the NOAA/NCEP/Environmental Modeling Center (EMC) and the NOAA/Global Systems Laboratory (GSL) working on pre-implementation testing. :numref:`Table %s ` shows the configuration variables, along with their default values in ``config_default.sh`` and the values defined in ``config.community.sh``. .. note:: @@ -348,7 +334,7 @@ values in ``config_default.sh`` and the values defined in ``config.community.sh` .. _ConfigCommunity: -.. table:: Configuration variables specified in the config.community.sh script. +.. 
table:: Configuration variables specified in the config.community.sh script +--------------------------------+-------------------+--------------------------------------------------------+ | **Parameter** | **Default Value** | **``config.community.sh`` Value** | @@ -397,11 +383,11 @@ values in ``config_default.sh`` and the values defined in ``config.community.sh` +--------------------------------+-------------------+--------------------------------------------------------+ | EXTRN_MDL_SOURCE_BASE_DIR_ICS | "" | "/scratch2/BMC/det/UFS_SRW_app/v1p0/model_data/FV3GFS" | +--------------------------------+-------------------+--------------------------------------------------------+ - | EXTRN_MDL_FILES_ICS | "" | "gfs.pgrb2.0p25.f000" | + | EXTRN_MDL_FILES_ICS | "" | "gfs.pgrb2.0p25.f000" | +--------------------------------+-------------------+--------------------------------------------------------+ - | EXTRN_MDL_SOURCE_BASEDIR_LBCS | "" | "/scratch2/BMC/det/UFS_SRW_app/v1p0/model_data/FV3GFS" | + | EXTRN_MDL_SOURCE_BASEDIR_LBCS | "" | "/scratch2/BMC/det/UFS_SRW_app/v1p0/model_data/FV3GFS" | +--------------------------------+-------------------+--------------------------------------------------------+ - | EXTRN_MDL_FILES_LBCS | "" | "gfs.pgrb2.0p25.f006" | + | EXTRN_MDL_FILES_LBCS | "" | "gfs.pgrb2.0p25.f006" | +--------------------------------+-------------------+--------------------------------------------------------+ From 93bfe9b619d510d742272703154d7b6abb023d81 Mon Sep 17 00:00:00 2001 From: gspetro Date: Fri, 11 Mar 2022 15:15:46 -0500 Subject: [PATCH 050/118] moved CodeReposAndDirs.rst info to the Introduction & deleted file --- docs/UsersGuide/source/CodeReposAndDirs.rst | 261 ----------------- docs/UsersGuide/source/Introduction.rst | 277 +++++++++++++++++- .../source/Quickstart_Container.rst | 52 +--- docs/UsersGuide/source/SRWAppOverview.rst | 23 +- docs/UsersGuide/source/index.rst | 1 - 5 files changed, 285 insertions(+), 329 deletions(-) delete mode 100644 docs/UsersGuide/source/CodeReposAndDirs.rst diff --git a/docs/UsersGuide/source/CodeReposAndDirs.rst b/docs/UsersGuide/source/CodeReposAndDirs.rst deleted file mode 100644 index 3031f84573..0000000000 --- a/docs/UsersGuide/source/CodeReposAndDirs.rst +++ /dev/null @@ -1,261 +0,0 @@ -.. _CodeReposAndDirs: - -========================================= -Code Repositories and Directory Structure -========================================= -This chapter describes the code repositories that comprise the UFS SRW Application, -without describing any of the components in detail. - -.. _HierarchicalRepoStr: - -Hierarchical Repository Structure -================================= -The umbrella repository for the UFS SRW Application is named ufs-srweather-app and is -available on GitHub at https://github.com/ufs-community/ufs-srweather-app. An umbrella -repository is defined as a repository that houses external code, called "externals," from -additional repositories. The UFS SRW Application includes the ``manage_externals`` tools -along with a configuration file called ``Externals.cfg``, which describes the external -repositories associated with this umbrella repo (see :numref:`Table %s `). - -.. _top_level_repos: - -.. table:: List of top-level repositories that comprise the UFS SRW Application. 
- - +---------------------------------+---------------------------------------------------------+ - | **Repository Description** | **Authoritative repository URL** | - +=================================+=========================================================+ - | Umbrella repository for the UFS | https://github.com/ufs-community/ufs-srweather-app | - | Short-Range Weather Application | | - +---------------------------------+---------------------------------------------------------+ - | Repository for | https://github.com/ufs-community/ufs-weather-model | - | the UFS Weather Model | | - +---------------------------------+---------------------------------------------------------+ - | Repository for the regional | https://github.com/ufs-community/regional_workflow | - | workflow | | - +---------------------------------+---------------------------------------------------------+ - | Repository for UFS utilities, | https://github.com/ufs-community/UFS_UTILS | - | including pre-processing, | | - | chgres_cube, and more | | - +---------------------------------+---------------------------------------------------------+ - | Repository for the Unified Post | https://github.com/NOAA-EMC/UPP | - | Processor (UPP) | | - +---------------------------------+---------------------------------------------------------+ - -The UFS Weather Model contains a number of sub-repositories used by the model as -documented `here `__. - -Note that the prerequisite libraries (including NCEP Libraries and external libraries) are not -included in the UFS SRW Application repository. The source code for these components resides in -the repositories `NCEPLIBS `_ and `NCEPLIBS-external -`_. - -These external components are already built on the preconfigured platforms listed `here -`__. -However, they must be cloned and built on other platforms according to the instructions provided -in the wiki pages of those repositories: https://github.com/NOAA-EMC/NCEPLIBS/wiki and -https://github.com/NOAA-EMC/NCEPLIBS-external/wiki. - -.. _TopLevelDirStructure: - -Directory Structure -=================== -The directory structure for the SRW Application is determined by the ``local_path`` settings in -the ``Externals.cfg`` file, which is in the directory where the umbrella repository has -been cloned. After ``manage_externals/checkout_externals`` is run, the specific GitHub repositories -that are described in :numref:`Table %s ` are cloned into the target -subdirectories shown below. The directories that will be created later by running the -scripts are presented in parentheses. Some directories have been removed for brevity. - -.. 
code-block:: console - - ufs-srweather-app - ├── (bin) - ├── (build) - ├── docs - │ └── UsersGuide - ├── (include) - ├── (lib) - ├── manage_externals - ├── regional_workflow - │ ├── docs - │ │ └── UsersGuide - │ ├── (fix) - │ ├── jobs - │ ├── modulefiles - │ ├── scripts - │ ├── tests - │ │ └── baseline_configs - │ └── ush - │ ├── Python - │ ├── rocoto - │ ├── templates - │ └── wrappers - ├── (share) - └── src - ├── UPP - │ ├── parm - │ └── sorc - │ └── ncep_post.fd - ├── UFS_UTILS - │ ├── sorc - │ │ ├── chgres_cube.fd - │ │ ├── fre-nctools.fd - | │ ├── grid_tools.fd - │ │ ├── orog_mask_tools.fd - │ │ └── sfc_climo_gen.fd - │ └── ush - └── ufs_weather_model - └── FV3 - ├── atmos_cubed_sphere - └── ccpp - -Regional Workflow Sub-Directories ---------------------------------- -Under the ``regional_workflow`` directory shown in :numref:`TopLevelDirStructure` there are -a number of sub-directories that are created when the regional workflow is cloned. The -contents of these sub-directories are described in :numref:`Table %s `. - -.. _Subdirectories: - -.. table:: Sub-directories of the regional workflow. - - +-------------------------+---------------------------------------------------------+ - | **Directory Name** | **Description** | - +=========================+=========================================================+ - | docs | Users' Guide Documentation | - +-------------------------+---------------------------------------------------------+ - | jobs | J-job scripts launched by Rocoto | - +-------------------------+---------------------------------------------------------+ - | modulefiles | Files used to load modules needed for building and | - | | running the workflow | - +-------------------------+---------------------------------------------------------+ - | scripts | Run scripts launched by the J-jobs | - +-------------------------+---------------------------------------------------------+ - | tests | Baseline experiment configuration | - +-------------------------+---------------------------------------------------------+ - | ush | Utility scripts used by the workflow | - +-------------------------+---------------------------------------------------------+ - -.. _ExperimentDirSection: - -Experiment Directory Structure -============================== -When the ``generate_FV3LAM_wflow.sh`` script is run, the user-defined experimental directory -``EXPTDIR=/path-to/ufs-srweather-app/../expt_dirs/${EXPT_SUBDIR}`` is created, where ``EXPT_SUBDIR`` -is specified in the ``config.sh`` file. The contents of the ``EXPTDIR`` directory, before the -workflow is run, is shown in :numref:`Table %s `. - -.. _ExptDirStructure: - -.. table:: Files and sub-directory initially created in the experimental directory. 
- :widths: 33 67 - - +---------------------------+-------------------------------------------------------------------------------------------------------+ - | **File Name** | **Description** | - +===========================+=======================================================================================================+ - | config.sh | User-specified configuration file, see :numref:`Section %s ` | - +---------------------------+-------------------------------------------------------------------------------------------------------+ - | data_table | Cycle-independent input file (empty) | - +---------------------------+-------------------------------------------------------------------------------------------------------+ - | field_table | Tracers in the `forecast model | - | | `_ | - +---------------------------+-------------------------------------------------------------------------------------------------------+ - | FV3LAM_wflow.xml | Rocoto XML file to run the workflow | - +---------------------------+-------------------------------------------------------------------------------------------------------+ - | input.nml | Namelist for the `UFS Weather model | - | | `_ | - +---------------------------+-------------------------------------------------------------------------------------------------------+ - | launch_FV3LAM_wflow.sh | Symlink to the shell script of | - | | ``ufs-srweather-app/regional_workflow/ush/launch_FV3LAM_wflow.sh`` | - | | that can be used to (re)launch the Rocoto workflow. | - | | Each time this script is called, it appends to a log | - | | file named ``log.launch_FV3LAM_wflow``. | - +---------------------------+-------------------------------------------------------------------------------------------------------+ - | log.generate_FV3LAM_wflow | Log of the output from the experiment generation script | - | | ``generate_FV3LAM_wflow.sh`` | - +---------------------------+-------------------------------------------------------------------------------------------------------+ - | nems.configure | See `NEMS configuration file | - | | `_ | - +---------------------------+-------------------------------------------------------------------------------------------------------+ - | suite_{CCPP}.xml | CCPP suite definition file used by the forecast model | - +---------------------------+-------------------------------------------------------------------------------------------------------+ - | var_defns.sh | Shell script defining the experiment parameters. It contains all | - | | of the primary parameters specified in the default and | - | | user-specified configuration files plus many secondary parameters | - | | that are derived from the primary ones by the experiment | - | | generation script. This file is sourced by various other scripts | - | | in order to make all the experiment variables available to these | - | | scripts. | - +---------------------------+-------------------------------------------------------------------------------------------------------+ - | YYYYMMDDHH | Cycle directory (empty) | - +---------------------------+-------------------------------------------------------------------------------------------------------+ - -In addition, the *community* mode creates the ``fix_am`` and ``fix_lam`` directories in ``EXPTDIR``. -The ``fix_lam`` directory is initially empty but will contain some *fix* (time-independent) files -after the grid, orography, and/or surface climatology generation tasks are run. - -.. _FixDirectories: - -.. 
table:: Description of the fix directories - - +-------------------------+----------------------------------------------------------+ - | **Directory Name** | **Description** | - +=========================+==========================================================+ - | fix_am | Directory containing the global `fix` (time-independent) | - | | data files. The experiment generation script copies | - | | these files from a machine-dependent system directory. | - +-------------------------+----------------------------------------------------------+ - | fix_lam | Directory containing the regional fix (time-independent) | - | | data files that describe the regional grid, orography, | - | | and various surface climatology fields as well as | - | | symlinks to pre-generated files. | - +-------------------------+----------------------------------------------------------+ - -Once the workflow is launched with the ``launch_FV3LAM_wflow.sh`` script, a log file named -``log.launch_FV3LAM_wflow`` will be created (or appended to it if it already exists) in ``EXPTDIR``. -Once the ``make_grid``, ``make_orog``, and ``make_sfc_climo`` tasks and the ``get_extrn_ics`` -and ``get_extrn_lbc`` tasks for the YYYYMMDDHH cycle have completed successfully, new files and -sub-directories are created, as described in :numref:`Table %s `. - -.. _CreatedByWorkflow: - -.. table:: New directories and files created when the workflow is launched. - :widths: 30 70 - - +---------------------------+--------------------------------------------------------------------+ - | **Directory/file Name** | **Description** | - +===========================+====================================================================+ - | YYYYMMDDHH | This is updated when the first cycle-specific workflow tasks are | - | | run, which are ``get_extrn_ics`` and ``get_extrn_lbcs`` (they are | - | | launched simultaneously for each cycle in the experiment). We | - | | refer to this as a “cycle directory”. Cycle directories are | - | | created to contain cycle-specific files for each cycle that the | - | | experiment runs. If ``DATE_FIRST_CYCL`` and ``DATE_LAST_CYCL`` | - | | were different, and/or ``CYCL_HRS`` contained more than one | - | | element in the ``config.sh`` file, then more than one cycle | - | | directory would be created under the experiment directory. | - +---------------------------+--------------------------------------------------------------------+ - | grid | Directory generated by the ``make_grid`` task containing grid | - | | files for the experiment | - +---------------------------+--------------------------------------------------------------------+ - | log | Contains log files generated by the overall workflow and its | - | | various tasks. Look in these files to trace why a task may have | - | | failed. | - +---------------------------+--------------------------------------------------------------------+ - | orog | Directory generated by the ``make_orog`` task containing the | - | | orography files for the experiment | - +---------------------------+--------------------------------------------------------------------+ - | sfc_climo | Directory generated by the ``make_sfc_climo`` task containing the | - | | surface climatology files for the experiment | - +---------------------------+--------------------------------------------------------------------+ - | FV3LAM_wflow.db | Database files that are generated when Rocoto is called (by the | - | FV3LAM_wflow_lock.db | launch script) to launch the workflow. 
| - +---------------------------+--------------------------------------------------------------------+ - | log.launch_FV3LAM_wflow | This is the log file to which the launch script | - | | ``launch_FV3LAM_wflow.sh`` appends its output each time it is | - | | called. Take a look at the last 30–50 lines of this file to check | - | | the status of the workflow. | - +---------------------------+--------------------------------------------------------------------+ - -The output files for an experiment are described in :numref:`Section %s `. -The workflow tasks are described in :numref:`Section %s `). diff --git a/docs/UsersGuide/source/Introduction.rst b/docs/UsersGuide/source/Introduction.rst index 0a4ba41d9a..b6e1eb4f61 100644 --- a/docs/UsersGuide/source/Introduction.rst +++ b/docs/UsersGuide/source/Introduction.rst @@ -1,8 +1,8 @@ .. _Introduction: -============= +============== Introduction -============= +============== The Unified Forecast System (:term:`UFS`) is a community-based, coupled, comprehensive Earth modeling system. The UFS is the source system for NOAA’s operational numerical weather prediction applications. It enables research, development, and contribution opportunities within the broader :term:`weather enterprise` (e.g. government, industry, and academia). For more information about the UFS, visit the `UFS Portal `__. @@ -12,6 +12,10 @@ The SRW App v1.0.0 citation is as follows and should be used when presenting res UFS Development Team. (2021, March 4). Unified Forecast System (UFS) Short-Range Weather (SRW) Application (Version v1.0.0). Zenodo. https://doi.org/10.5281/zenodo.4534994 +.. + COMMENT: Update version numbers/citation for release! + + How to Use This Document ======================== @@ -27,50 +31,293 @@ Variables presented as ``AaBbCc123`` in this document typically refer to variabl File paths or code that include angle brackets (e.g., ``env/build__.env``) indicate that users should insert options appropriate to their SRW configuration (e.g., ``env/build_aws_gcc.env``). .. hint:: - To get started running the SRW, see the :ref:`Containerized Quick Start Guide `. + * To get started running the SRW, see the :ref:`Containerized Quick Start Guide ` or refer to the in-depth chapter on :ref:`Running the Short-Range Weather Application `. + * For background information on the SRW code repositories and directory structure, see :numref:`Section %s ` below. + * For an outline of SRW components, see section :numref:`Section %s ` below or refer to :numref:`Chapter %s: Components ` for a more in-depth treatment. + + +.. _SRWStructure: + +Code Repositories and Directory Structure +========================================= + +.. _HierarchicalRepoStr: + +Hierarchical Repository Structure +----------------------------------- +The umbrella repository for the UFS SRW Application is named *ufs-srweather-app* and is available on GitHub at https://github.com/ufs-community/ufs-srweather-app. An umbrella repository is a repository that houses external code, called "externals," from additional repositories. The UFS SRW Application includes the ``manage_externals`` tools along with a configuration file called ``Externals.cfg``, which describes the external repositories associated with this umbrella repo (see :numref:`Table %s `). + +.. _top_level_repos: +.. 
table:: List of top-level repositories that comprise the UFS SRW Application -Pre-processor Utilities and Initial Conditions -============================================== + +---------------------------------+---------------------------------------------------------+ + | **Repository Description** | **Authoritative repository URL** | + +=================================+=========================================================+ + | Umbrella repository for the UFS | https://github.com/ufs-community/ufs-srweather-app | + | Short-Range Weather Application | | + +---------------------------------+---------------------------------------------------------+ + | Repository for | https://github.com/ufs-community/ufs-weather-model | + | the UFS Weather Model | | + +---------------------------------+---------------------------------------------------------+ + | Repository for the regional | https://github.com/ufs-community/regional_workflow | + | workflow | | + +---------------------------------+---------------------------------------------------------+ + | Repository for UFS utilities, | https://github.com/ufs-community/UFS_UTILS | + | including pre-processing, | | + | chgres_cube, and more | | + +---------------------------------+---------------------------------------------------------+ + | Repository for the Unified Post | https://github.com/NOAA-EMC/UPP | + | Processor (UPP) | | + +---------------------------------+---------------------------------------------------------+ -The SRW Application includes a number of pre-processing utilities that initialize and prepare the -model. Tasks include generating a regional grid along with :term:`orography` and surface climatology files for that grid. The pre-processing software converts the raw external model data into initial and lateral boundary condition files in netCDF format. Later, this is used as input to the atmospheric model (FV3-LAM). Additional information about the UFS pre-processor utilities can be found in the `UFS_UTILS User’s Guide `_. +The UFS Weather Model contains a number of sub-repositories used by the model as +documented `here `__. + +Note that the prerequisite libraries (including NCEP Libraries and external libraries) are not +included in the UFS SRW Application repository. The `HPC-Stack `__ repository assembles these prerequisite libraries. The HPC-Stack has already been built on the preconfigured (Level 1) platforms listed `here +`__. +However, it must be built on other systems. :numref:`Chapter %s ` contains details on installing the HPC-Stack. + + +.. _TopLevelDirStructure: + +Directory Structure +---------------------- +The ``ufs-srweather-app`` umbrella repository structure is determined by the ``local_path`` settings contained within the ``Externals.cfg`` file. After ``manage_externals/checkout_externals`` is run (:numref:`Step %s `), the specific GitHub repositories described in :numref:`Table %s ` are cloned into the target subdirectories shown below. Directories that will be created as part of the build process appear in parentheses and will not be visible until after the build is complete. Some directories have been removed for brevity. + +.. 
code-block:: console
+
+   ufs-srweather-app
+   ├── (bin)
+   ├── (build)
+   ├── docs
+   │     └── UsersGuide
+   ├── (include)
+   ├── (lib)
+   ├── manage_externals
+   ├── regional_workflow
+   │     ├── docs
+   │     │     └── UsersGuide
+   │     ├── (fix)
+   │     ├── jobs
+   │     ├── modulefiles
+   │     ├── scripts
+   │     ├── tests
+   │     │     └── baseline_configs
+   │     └── ush
+   │        ├── Python
+   │        ├── rocoto
+   │        ├── templates
+   │        └── wrappers
+   ├── (share)
+   └── src
+        ├── UPP
+        │     ├── parm
+        │     └── sorc
+        │          └── ncep_post.fd
+        ├── UFS_UTILS
+        │     ├── sorc
+        │     │      ├── chgres_cube.fd
+        │     │      ├── fre-nctools.fd
+        |     │      ├── grid_tools.fd
+        │     │      ├── orog_mask_tools.fd
+        │     │      └── sfc_climo_gen.fd
+        │     └── ush
+        └── ufs_weather_model
+             └── FV3
+                  ├── atmos_cubed_sphere
+                  └── ccpp
+
+Regional Workflow Sub-Directories
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+Under the ``regional_workflow`` directory shown in :numref:`TopLevelDirStructure`, a number of sub-directories are created when the regional workflow is cloned. The contents of these sub-directories are described in :numref:`Table %s `.
+
+.. _Subdirectories:
+
+.. table:: Sub-directories of the regional workflow
+
+   +-------------------------+---------------------------------------------------------+
+   | **Directory Name**      | **Description**                                         |
+   +=========================+=========================================================+
+   | docs                    | Users' Guide Documentation                              |
+   +-------------------------+---------------------------------------------------------+
+   | jobs                    | J-job scripts launched by Rocoto                        |
+   +-------------------------+---------------------------------------------------------+
+   | modulefiles             | Files used to load modules needed for building and      |
+   |                         | running the workflow                                    |
+   +-------------------------+---------------------------------------------------------+
+   | scripts                 | Run scripts launched by the J-jobs                      |
+   +-------------------------+---------------------------------------------------------+
+   | tests                   | Baseline experiment configuration                       |
+   +-------------------------+---------------------------------------------------------+
+   | ush                     | Utility scripts used by the workflow                    |
+   +-------------------------+---------------------------------------------------------+
+
+.. _ExperimentDirSection:
+
+Experiment Directory Structure
+--------------------------------
+When the user generates an experiment using the ``generate_FV3LAM_wflow.sh`` script (:numref:`Step %s `), a user-defined experimental directory is created based on information specified in the ``config.sh`` file. :numref:`Table %s ` shows the contents of the experiment directory before the experiment workflow is run.
+
+.. _ExptDirStructure:
+
+.. 
table:: Files and sub-directory initially created in the experimental directory + :widths: 33 67 + + +---------------------------+-------------------------------------------------------------------------------------------------------+ + | **File Name** | **Description** | + +===========================+=======================================================================================================+ + | config.sh | User-specified configuration file, see :numref:`Section %s ` | + +---------------------------+-------------------------------------------------------------------------------------------------------+ + | data_table | Cycle-independent input file (empty) | + +---------------------------+-------------------------------------------------------------------------------------------------------+ + | field_table | Tracers in the `forecast model | + | | `_ | + +---------------------------+-------------------------------------------------------------------------------------------------------+ + | FV3LAM_wflow.xml | Rocoto XML file to run the workflow | + +---------------------------+-------------------------------------------------------------------------------------------------------+ + | input.nml | Namelist for the `UFS Weather model | + | | `_ | + +---------------------------+-------------------------------------------------------------------------------------------------------+ + | launch_FV3LAM_wflow.sh | Symlink to the shell script of | + | | ``ufs-srweather-app/regional_workflow/ush/launch_FV3LAM_wflow.sh`` | + | | that can be used to (re)launch the Rocoto workflow. | + | | Each time this script is called, it appends to a log | + | | file named ``log.launch_FV3LAM_wflow``. | + +---------------------------+-------------------------------------------------------------------------------------------------------+ + | log.generate_FV3LAM_wflow | Log of the output from the experiment generation script | + | | ``generate_FV3LAM_wflow.sh`` | + +---------------------------+-------------------------------------------------------------------------------------------------------+ + | nems.configure | See `NEMS configuration file | + | | `_ | + +---------------------------+-------------------------------------------------------------------------------------------------------+ + | suite_{CCPP}.xml | CCPP suite definition file used by the forecast model | + +---------------------------+-------------------------------------------------------------------------------------------------------+ + | var_defns.sh | Shell script defining the experiment parameters. It contains all | + | | of the primary parameters specified in the default and | + | | user-specified configuration files plus many secondary parameters | + | | that are derived from the primary ones by the experiment | + | | generation script. This file is sourced by various other scripts | + | | in order to make all the experiment variables available to these | + | | scripts. | + +---------------------------+-------------------------------------------------------------------------------------------------------+ + | YYYYMMDDHH | Cycle directory (empty) | + +---------------------------+-------------------------------------------------------------------------------------------------------+ + +In addition, the *community* mode creates the ``fix_am`` and ``fix_lam`` directories in ``EXPTDIR``. 
+The ``fix_lam`` directory is initially empty but will contain some *fix* (time-independent) files +after the grid, orography, and/or surface climatology generation tasks are run. + +.. _FixDirectories: + +.. table:: Description of the fix directories + + +-------------------------+----------------------------------------------------------+ + | **Directory Name** | **Description** | + +=========================+==========================================================+ + | fix_am | Directory containing the global `fix` (time-independent) | + | | data files. The experiment generation script copies | + | | these files from a machine-dependent system directory. | + +-------------------------+----------------------------------------------------------+ + | fix_lam | Directory containing the regional fix (time-independent) | + | | data files that describe the regional grid, orography, | + | | and various surface climatology fields as well as | + | | symlinks to pre-generated files. | + +-------------------------+----------------------------------------------------------+ + +Once the workflow is launched with the ``launch_FV3LAM_wflow.sh`` script, a log file named +``log.launch_FV3LAM_wflow`` will be created (or appended to it if it already exists) in ``EXPTDIR``. +Once the ``make_grid``, ``make_orog``, and ``make_sfc_climo`` tasks and the ``get_extrn_ics`` +and ``get_extrn_lbc`` tasks for the YYYYMMDDHH cycle have completed successfully, new files and +sub-directories are created, as described in :numref:`Table %s `. + +.. _CreatedByWorkflow: + +.. table:: New directories and files created when the workflow is launched. + :widths: 30 70 + + +---------------------------+--------------------------------------------------------------------+ + | **Directory/file Name** | **Description** | + +===========================+====================================================================+ + | YYYYMMDDHH | This is updated when the first cycle-specific workflow tasks are | + | | run, which are ``get_extrn_ics`` and ``get_extrn_lbcs`` (they are | + | | launched simultaneously for each cycle in the experiment). We | + | | refer to this as a “cycle directory”. Cycle directories are | + | | created to contain cycle-specific files for each cycle that the | + | | experiment runs. If ``DATE_FIRST_CYCL`` and ``DATE_LAST_CYCL`` | + | | were different, and/or ``CYCL_HRS`` contained more than one | + | | element in the ``config.sh`` file, then more than one cycle | + | | directory would be created under the experiment directory. | + +---------------------------+--------------------------------------------------------------------+ + | grid | Directory generated by the ``make_grid`` task containing grid | + | | files for the experiment | + +---------------------------+--------------------------------------------------------------------+ + | log | Contains log files generated by the overall workflow and its | + | | various tasks. Look in these files to trace why a task may have | + | | failed. 
| + +---------------------------+--------------------------------------------------------------------+ + | orog | Directory generated by the ``make_orog`` task containing the | + | | orography files for the experiment | + +---------------------------+--------------------------------------------------------------------+ + | sfc_climo | Directory generated by the ``make_sfc_climo`` task containing the | + | | surface climatology files for the experiment | + +---------------------------+--------------------------------------------------------------------+ + | FV3LAM_wflow.db | Database files that are generated when Rocoto is called (by the | + | FV3LAM_wflow_lock.db | launch script) to launch the workflow. | + +---------------------------+--------------------------------------------------------------------+ + | log.launch_FV3LAM_wflow | This is the log file to which the launch script | + | | ``launch_FV3LAM_wflow.sh`` appends its output each time it is | + | | called. Take a look at the last 30–50 lines of this file to check | + | | the status of the workflow. | + +---------------------------+--------------------------------------------------------------------+ + +The output files for an experiment are described in :numref:`Section %s `. +The workflow tasks are described in :numref:`Section %s `). + + +.. _Utilities: + +SRW Component Summary: Pre-processor Utilities and Initial Conditions +========================================================================= + +The SRW Application includes a number of pre-processing utilities that initialize and prepare the model. Tasks include generating a regional grid along with :term:`orography` and surface climatology files for that grid. The pre-processing software converts the raw external model data into initial and lateral boundary condition files in netCDF format. Later, this is used as input to the atmospheric model (FV3-LAM). Additional information about the UFS pre-processor utilities can be found in the `UFS_UTILS User’s Guide `_. Forecast Model -============== +----------------- Atmospheric Model --------------------- +^^^^^^^^^^^^^^^^^^^^^^ The prognostic atmospheric model in the UFS SRW Application is the Finite-Volume Cubed-Sphere (:term:`FV3`) dynamical core configured with a Limited Area Model (LAM) capability (:cite:t:`BlackEtAl2020`). The dynamical core is the computational part of a model that solves the equations of fluid motion. A User’s Guide for the UFS :term:`Weather Model` can be found `here `__. Common Community Physics Package ---------------------------------- +^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ The `Common Community Physics Package `_ (:term:`CCPP`) supports interoperable atmospheric physics and Noah Multi-parameterization (Noah MP) Land Surface Model options. Atmospheric physics are a set of numerical methods describing small-scale processes such as clouds, turbulence, radiation, and their interactions. The SRW release includes an experimental physics version and an updated operational version. Data Format --------------- +^^^^^^^^^^^^^^^^^^^^^^ The SRW App supports the use of both :term:`GRIB2` and :term:`NEMSIO` input data. The UFS Weather Model ingests initial and lateral boundary condition files produced by :term:`chgres_cube`. -Post-processor -============== +Unified Post-Processor (UPP) +-------------------------------- The `Unified Post Processor `__ (:term:`UPP`) is included in the SRW Application workflow. The UPP is designed to generate useful products from raw model output. 
In the SRW, it converts data output formats from netCDF format on the native model grid to GRIB2 format. The UPP can also be used to compute a variety of useful diagnostic fields, as described in the `UPP User’s Guide `_. Output from the UPP can be used with visualization, plotting, and verification packages, or for further downstream post-processing (e.g., statistical post-processing techniques).
 
 Visualization Example
-=====================
+-------------------------
 This SRW Application provides Python scripts to create basic visualizations of the model output. Usage information and instructions are described in :numref:`Chapter %s ` and are also included at the top of the script.
 
 Build System and Workflow
-=========================
+----------------------------
 The SRW Application has a portable CMake-based build system that packages together all the components required to build the SRW Application. Once built, users can generate a Rocoto-based workflow that will run each task in the proper sequence (see `Rocoto documentation `__ for more on workflow management). Individual components can also be run in a stand-alone, command line fashion.
 
diff --git a/docs/UsersGuide/source/Quickstart_Container.rst b/docs/UsersGuide/source/Quickstart_Container.rst
index 413c72e283..82bcd0f960 100644
--- a/docs/UsersGuide/source/Quickstart_Container.rst
+++ b/docs/UsersGuide/source/Quickstart_Container.rst
@@ -10,7 +10,7 @@ The "out-of-the-box" SRW case described in this guide builds a weather forecast
 
 .. _DownloadCodeC:
 
-Download the UFS SRW Application Code
+Building the UFS SRW Application
 ===========================================
 
 The SRW Application source code is publicly available on GitHub and can be run in a container or locally, depending on user preference. The SRW Application relies on a variety of components detailed in the :ref:`Components Chapter ` of this User's Guide. Users must (1) clone the UFS SRW Application umbrella repository and then (2) run the ``checkout_externals`` script to link the necessary external repositories to the SRW App. The ``checkout_externals`` script uses the configuration file ``Externals.cfg`` in the top level directory of the SRW App and will clone the correct version of the regional workflow, pre-processing utilities, UFS Weather Model, and UPP source code into the appropriate directories under the ``regional_workflow`` and ``src`` directories.
@@ -38,7 +38,7 @@ On NOAA Cloud systems, certain environment variables must be set *before* buildi
 
 * If the ``cache`` and ``tmp`` directories do not exist already, they must be created.
 
-* ``/lustre`` is a fast but non-persistent file system used on NOAA cloud systems. To retain work completed in this directory, `tar the file `__ and move it to the ``/contrib`` directory, which is much slower but persistent.
+* ``/lustre`` is a fast but non-persistent file system used on NOAA cloud systems. To retain work completed in this directory, `tar the files `__ and move them to the ``/contrib`` directory, which is much slower but persistent.
 
 .. _WorkOnHPC:
 
Working on HPC Systems
 --------------------------
 Those *not* working on HPC systems may skip to the `next step `.
 
-On HPC systems (including NOAA's Cloud platforms), allocate a compute node on which to run the SRW. On some systems, it may be necessary to run the command ``module load singularity`` first.
+On HPC systems (including NOAA's Cloud platforms), allocate a compute node on which to run the SRW. 
On NOAA's Cloud platforms, the following commands should work: .. code-block:: console salloc -N 1 - module load openmpi + module load gnu openmpi mpirun -n 1 hostname ssh -The compiler options are ``gnu`` or ``intel``. The third command will output a hostname. This hostname should replace ```` in the last command. After "ssh-ing" to the compute node in the last command, build and run the SRW from that node. +The third command will output a hostname. This hostname should replace ```` in the last command. After "ssh-ing" to the compute node in the last command, build and run the SRW from that node. + +The appropriate commands on other Level 1 platforms will vary, and users should consult the documentation for those platforms. .. _BuildC: @@ -83,27 +85,6 @@ The command above also binds the local directory to the container so that data c * When binding two directories, they must have the same name. It may be necessary to ``cd`` into the container and create an appropriately named directory in the container using the ``mkdir`` command if one is not already there. * Be sure to bind the directory that contains the data the experiment will access. -Download the SRW Code ------------------------- - -Clone the develop branch of the UFS-SRW weather application repository: - -.. code-block:: console - - git clone -b develop https://github.com/ufs-community/ufs-srweather-app.git - -.. - COMMENT: change repo for release - -Check out submodules for the SRW Application: - -.. code-block:: console - - cd ufs-srweather-app - ./manage_externals/checkout_externals - -If the ``manage_externals`` command brings up an error, it may be necessary to run ``ln -s /usr/bin/python3 /usr/bin/python`` first. - .. _SetUpBuildC: @@ -123,27 +104,14 @@ If the SRW Application has been built in a container provided by the Earth Predi Build the Executables -===================== +====================== Create a directory to hold the build's executables: .. code-block:: console - mkdir build - cd build - -From the build directory, run the ``cmake`` command below to set up the ``Makefile``, then run the ``make`` command to build the executables: - -.. code-block:: console - - cmake .. -DCMAKE_INSTALL_PREFIX=.. - make -j 4 >& build.out & - -The build will take a few minutes to complete. When it starts, a random number is printed to the console, and when it is done, a ``[1]+ Done`` message is printed to the console when you list the files in ``ufs-srweather-app/bin`` (``[1]+ Exit`` may indicate an error). Output from the build will be in the ``ufs-srweather-app/build/build.out`` file. When the build completes, you should see the forecast model executable ``ufs_model`` and several pre- and post-processing executables in the ``ufs-srweather-app/bin`` directory. These executables are described in :numref:`Table %s `. - -.. hint:: - - If you see the build.out file, but there is no ``ufs-srweather-app/bin`` directory, wait a few more minutes for the build to complete. 
+   cd ubuntu20.04-epic-srwapp-1.0/opt/ufs-srweather-app/build
+   source build-srw.sh
 
 Download and Stage the Data
 ============================
diff --git a/docs/UsersGuide/source/SRWAppOverview.rst b/docs/UsersGuide/source/SRWAppOverview.rst
index cc4c88d8f2..4d0f2f20a7 100644
--- a/docs/UsersGuide/source/SRWAppOverview.rst
+++ b/docs/UsersGuide/source/SRWAppOverview.rst
@@ -56,7 +56,7 @@ The cloned repository contains the configuration files and sub-directories shown
    +--------------------------------+--------------------------------------------------------+
    | LICENSE.md                     | CC0 license information                                |
    +--------------------------------+--------------------------------------------------------+
-   | README.md                      | Quick Start Guide                                      |
+   | README.md                      | Getting Started Guide                                  |
    +--------------------------------+--------------------------------------------------------+
    | ufs_srweather_app_meta.h.in    | Meta information for SRW App which can be used by     |
    |                                | other packages                                         |
@@ -77,7 +77,7 @@ The cloned repository contains the configuration files and sub-directories shown
 
 External Components
 ===================
 
-Check out the external repositories, including regional_workflow, ufs-weather-model, ufs_utils, and upp.x for the SRW App.
+Retrieve required components from external repositories, including regional_workflow, ufs-weather-model, ufs_utils, and upp.x:
 
 .. code-block:: console
 
@@ -152,21 +152,24 @@ recommended value ``..`` denotes one directory up from the build directory. In t
    |                        |                                                                                 |
    | fvcom_to_FV3           |                                                                                 |
    +------------------------+---------------------------------------------------------------------------------+
    | make_hgrid             | Computes geo-referencing parameters (e.g., latitude, longitude, grid cell area) |
-   |                        | for global uniform grids                                        |               |
+   |                        | for global uniform grids                                                        |
    +------------------------+---------------------------------------------------------------------------------+
-   | emcsfc_ice_blend       |                                                                                 |
+   | emcsfc_ice_blend       | Blends National Ice Center sea ice cover and EMC sea ice concentration data to  |
+   |                        | create a global sea ice analysis used to update the GFS once per day            |
    +------------------------+---------------------------------------------------------------------------------+
-   | emcsfc_snow2mdl        |                                                                                 |
+   | emcsfc_snow2mdl        | Blends National Ice Center snow cover and Air Force snow depth data to create a |
+   |                        | global depth analysis used to update the GFS snow field once per day            |
    +------------------------+---------------------------------------------------------------------------------+
-   | global_cycle           |                                                                                 |
+   | global_cycle           | Updates the GFS surface conditions using external snow and sea ice analyses     |
    +------------------------+---------------------------------------------------------------------------------+
-   | inland                 |                                                                                 |
+   | inland                 | Creates an inland land mask                                                     |
    +------------------------+---------------------------------------------------------------------------------+
-   | orog_gsl               |                                                                                 |
+   | orog_gsl               | Creates orographic statistics fields required for the orographic drag suite     |
+   |                        | developed by NOAA's Global Systems Laboratory (GSL)                             |
    +------------------------+---------------------------------------------------------------------------------+
-   | fregrid                |                                                                                 |
+   | fregrid                | Remaps data from the input mosaic grid to the output mosaic grid                |
    +------------------------+---------------------------------------------------------------------------------+
-   | lakefrac               |                                                                                 |
+   | lakefrac               | Sets lake fraction and depth                                                    | 
+------------------------+---------------------------------------------------------------------------------+ .. _GridSpecificConfig: diff --git a/docs/UsersGuide/source/index.rst b/docs/UsersGuide/source/index.rst index 290b1ec482..4057dcee0d 100644 --- a/docs/UsersGuide/source/index.rst +++ b/docs/UsersGuide/source/index.rst @@ -14,7 +14,6 @@ UFS Short-Range Weather App Users Guide Introduction Quickstart_Container Quickstart_NonContainer - CodeReposAndDirs SRWAppOverview Components Include-HPCInstall From eb00397e23397f4245f3fcb03b37818e20e8cf49 Mon Sep 17 00:00:00 2001 From: gspetro Date: Fri, 11 Mar 2022 16:28:20 -0500 Subject: [PATCH 051/118] continued edits to SRWAppOverview --- docs/UsersGuide/source/Glossary.rst | 12 ++++ docs/UsersGuide/source/Introduction.rst | 13 ++-- .../source/Quickstart_Container.rst | 2 +- docs/UsersGuide/source/SRWAppOverview.rst | 71 ++++++------------- 4 files changed, 39 insertions(+), 59 deletions(-) diff --git a/docs/UsersGuide/source/Glossary.rst b/docs/UsersGuide/source/Glossary.rst index dbac0aca42..7099a6b1a2 100644 --- a/docs/UsersGuide/source/Glossary.rst +++ b/docs/UsersGuide/source/Glossary.rst @@ -9,6 +9,12 @@ Glossary CCPP The `Common Community Physics Package `_ is a forecast-model agnostic, vetted collection of codes containing atmospheric physical parameterizations and suites of parameterizations for use in Numerical Weather Prediction (NWP) along with a framework that connects the physics to the host forecast model. + Component + A software element that has a clear function and interface. In Earth system models, components are often single portions of the Earth system (e.g. atmosphere, ocean, or land surface) that are assembled to form a whole. + + Component Repository + A :term:`repository` that contains, at a minimum, source code for a single component. + CONUS Continental United States @@ -81,6 +87,9 @@ Glossary RAP `Rapid Refresh `. The continental-scale NOAA hourly-updated assimilation/modeling system operational at NCEP. RAP covers North America and is comprised primarily of a numerical forecast model and an analysis/assimilation system to initialize that model. RAP is complemented by the higher-resolution 3km High-Resolution Rapid Refresh (HRRR) model. + Repository + A central location in which files (e.g., data, code, documentation) are stored and managed. + UFS The Unified Forecast System is a community-based, coupled comprehensive Earth modeling system consisting of several applications (apps). These apps span regional to global @@ -94,6 +103,9 @@ Glossary and boundary condition generation codes used by the UFS Short-Range Weather App are all part of this collection. + Umbrella repository + A repository that houses external code, or “externals,” from additional repositories. + UPP The `Unified Post Processor `__ is software developed at :term:`NCEP` and used operationally to post-process raw output from a variety of :term:`NCEP`'s NWP models, including the FV3. diff --git a/docs/UsersGuide/source/Introduction.rst b/docs/UsersGuide/source/Introduction.rst index b6e1eb4f61..c9b1ee6491 100644 --- a/docs/UsersGuide/source/Introduction.rst +++ b/docs/UsersGuide/source/Introduction.rst @@ -45,7 +45,7 @@ Code Repositories and Directory Structure Hierarchical Repository Structure ----------------------------------- -The umbrella repository for the UFS SRW Application is named *ufs-srweather-app* and is available on GitHub at https://github.com/ufs-community/ufs-srweather-app. 
An umbrella repository is a repository that houses external code, called "externals," from additional repositories. The UFS SRW Application includes the ``manage_externals`` tools along with a configuration file called ``Externals.cfg``, which describes the external repositories associated with this umbrella repo (see :numref:`Table %s `). +The umbrella repository for the UFS SRW Application is named *ufs-srweather-app* and is available on GitHub at https://github.com/ufs-community/ufs-srweather-app. An umbrella repository is a repository that houses external code, called "externals," from additional repositories. The UFS SRW Application includes the ``manage_externals`` tools along with a configuration file called ``Externals.cfg``, which describes the external repositories associated with this umbrella repository (see :numref:`Table %s `). .. _top_level_repos: @@ -74,17 +74,14 @@ The umbrella repository for the UFS SRW Application is named *ufs-srweather-app* The UFS Weather Model contains a number of sub-repositories used by the model as documented `here `__. -Note that the prerequisite libraries (including NCEP Libraries and external libraries) are not -included in the UFS SRW Application repository. The `HPC-Stack `__ repository assembles these prerequisite libraries. The HPC-Stack has already been built on the preconfigured (Level 1) platforms listed `here -`__. -However, it must be built on other systems. :numref:`Chapter %s ` contains details on installing the HPC-Stack. +Note that the prerequisite libraries (including NCEP Libraries and external libraries) are not included in the UFS SRW Application repository. The `HPC-Stack `__ repository assembles these prerequisite libraries. The HPC-Stack has already been built on the preconfigured (Level 1) platforms listed `here `__. However, it must be built on other systems. :numref:`Chapter %s ` contains details on installing the HPC-Stack. .. _TopLevelDirStructure: Directory Structure ---------------------- -The ``ufs-srweather-app`` umbrella repository structure is determined by the ``local_path`` settings contained within the ``Externals.cfg`` file. After ``manage_externals/checkout_externals`` is run (:numref:`Step %s `), the specific GitHub repositories described in :numref:`Table %s ` are cloned into the target subdirectories shown below. Directories that will be created as part of the build process appear in parentheses and will not be visible until after the build is complete. Some directories have been removed for brevity. +The ``ufs-srweather-app`` :term:`umbrella repository` structure is determined by the ``local_path`` settings contained within the ``Externals.cfg`` file. After ``manage_externals/checkout_externals`` is run (:numref:`Step %s `), the specific GitHub repositories described in :numref:`Table %s ` are cloned into the target subdirectories shown below. Directories that will be created as part of the build process appear in parentheses and will not be visible until after the build is complete. Some directories have been removed for brevity. .. code-block:: console @@ -372,9 +369,7 @@ utilities, model code, and infrastructure. Users can post issues in the related Future Direction ================= -Users can expect to see incremental improvements and additional capabilities in upcoming -releases of the SRW Application to enhance research opportunities and support operational -forecast implementations. 
Planned enhancements include: +Users can expect to see incremental improvements and additional capabilities in upcoming releases of the SRW Application to enhance research opportunities and support operational forecast implementations. Planned enhancements include: * A more extensive set of supported developmental physics suites. * A larger number of pre-defined domains/resolutions and a fully supported capability to create a user-defined domain. diff --git a/docs/UsersGuide/source/Quickstart_Container.rst b/docs/UsersGuide/source/Quickstart_Container.rst index 82bcd0f960..c66a3abb5d 100644 --- a/docs/UsersGuide/source/Quickstart_Container.rst +++ b/docs/UsersGuide/source/Quickstart_Container.rst @@ -1,7 +1,7 @@ .. _QuickstartC: ================================================= -Containerized Workflow Quick Start (Recommended) +Containerized Quick Start Guide (Recommended) ================================================= This Workflow Quick Start Guide will help users to build and run the "out-of-the-box" case for the Unified Forecast System (:term:`UFS`) Short-Range Weather (SRW) Application using a container. The container approach provides a uniform enviroment in which to build and run the SRW. Normally, the details of building and running the SRW vary from system to system due to the many possible combinations of operating systems, compilers, :term:`MPI`’s, and package versions. Installation via an EPIC-provided container reduces this variability and allows for a smoother SRW build and run experience. diff --git a/docs/UsersGuide/source/SRWAppOverview.rst b/docs/UsersGuide/source/SRWAppOverview.rst index 4d0f2f20a7..0e320c7dd8 100644 --- a/docs/UsersGuide/source/SRWAppOverview.rst +++ b/docs/UsersGuide/source/SRWAppOverview.rst @@ -1,21 +1,21 @@ .. _SRWAppOverview: -======================================== -Short-Range Weather Application Overview -======================================== +=========================================================== +Building and Running the Short-Range Weather Application +=========================================================== -The UFS Short-Range Weather Application (SRW App) is an umbrella repository consisting of a number of different :ref:`components ` housed in external repositories. The SRW APP assembles the required components using the ``manage_externals/checkout_externals`` script. Once the +The UFS Short-Range Weather Application (SRW App) is an :term:`umbrella repository` consisting of a number of different :ref:`components ` housed in external repositories. The SRW App assembles the required components using the ``manage_externals/checkout_externals`` script. Once the build process is complete, all the files and executables necessary for a regional experiment are -located in the ``regional_workflow`` and ``bin`` directories, respectively, under the ``ufs-srweather-app`` directory. Users can utilize the pre-defined domains (grids) or build their own domain (details provided in :numref:`Chapter %s `). In either case, users must create/modify the case-specific (``config.sh``) and/or grid-specific configuration files (``set_predef_grid_params.sh``). The overall procedure is shown in :numref:`Figure %s `, with the scripts to generate and run the workflow shown in red. The steps are as follows: +located in the ``regional_workflow`` and ``bin`` directories, respectively, under the ``ufs-srweather-app`` directory. Users can utilize the pre-defined domains (grids) or build their own domain (see :numref:`Chapter %s ` for details). 
In either case, users must create/modify the case-specific (``config.sh``) and/or grid-specific configuration files (``set_predef_grid_params.sh``). The overall procedure is shown in :numref:`Figure %s `, with the scripts to generate and run the workflow shown in red. The steps are as follows: -#. Clone the UFS Short Range Weather Application from GitHub. -#. Check out the external repositories. -#. Set up the build environment and build the regional workflow system using ``cmake/make``. -#. Optional: Add new grid information to the ``set_predef_grid_param.sh`` configuration file and update ``valid_param_vals.sh``. -#. Modify the case-specific ``config.sh`` configuration file. -#. Load the python environment for the regional workflow -#. Generate a regional workflow experiment. -#. Run the regional workflow as needed. + * :ref:`Clone the SRW App from GitHub. ` + * :ref:`Check out the external repositories. ` + * :ref:`Set up the build environment and build the regional workflow system. ` + * :ref:`Optional: Configure a new grid. ` + * :ref:`Configure the experiment. ` + * :ref:`Load the python environment for the regional workflow. ` + * :ref:`Generate a regional workflow experiment. ` + * :ref:`Run the regional workflow. ` Each step will be described in detail in the following sections. @@ -27,8 +27,8 @@ Each step will be described in detail in the following sections. .. _DownloadSRWApp: -Download from GitHub -==================== +Download the SRW App +======================== Retrieve the UFS Short Range Weather Application (SRW App) repository from GitHub and checkout the ``ufs-v1.0.0`` tag: .. code-block:: console @@ -83,18 +83,13 @@ Retrieve required components from external repositories, including regional_work ./manage_externals/checkout_externals -This step will use the configuration file ``Externals.cfg`` in the ``ufs-srweather-app`` directory to clone the correct tags (code versions) of the external repositories as listed in -:numref:`Section %s `. +This step will use the configuration file ``Externals.cfg`` in the ``ufs-srweather-app`` directory to clone the correct tags (code versions) of the external repositories as listed in :numref:`Section %s `. .. _BuildExecutables: Building the Executables for the Application ============================================ -Before building the executables, the build environment must be set up for your specific platform. -Instructions for loading the proper modules and/or setting the correct environment variables -can be found in the ``env/`` directory in files named ``build__.env.`` For the -most part, the commands in those files can be directly copied and pasted, but you may need to modify -certain variables such as the path to NCEP libraries for your specific platform. Here is a directory listing example of these kinds of files: +Before building the executables, the build environment must be set up for your specific platform. Instructions for loading the proper modules and/or setting the correct environment variables can be found in the ``env/`` directory in files named ``build__.env.`` For the most part, the commands in those files can be directly copied and pasted, but you may need to modify certain variables such as the path to NCEP libraries for your specific platform. Here is a directory listing example of these kinds of files: .. code-block:: console @@ -113,9 +108,7 @@ The following steps will build the pre-processing utilities, forecast model, and cmake .. -DCMAKE_INSTALL_PREFIX=.. 
make -j 4 >& build.out & -where ``-DCMAKE_INSTALL_PREFIX`` specifies the location in which the ``bin``, ``include``, ``lib``, -and ``share`` directories containing various components of the SRW App will be created, and its -recommended value ``..`` denotes one directory up from the build directory. In the next line, the ``make`` call argument ``-j 4`` indicates that the build will run in parallel with 4 threads. If this step is successful, the executables listed in :numref:`Table %s ` will be located in the ``ufs-srweather-app/bin`` directory. +where ``-DCMAKE_INSTALL_PREFIX`` specifies the location in which the ``bin``, ``include``, ``lib``, and ``share`` directories containing various components of the SRW App will be created, and its recommended value ``..`` denotes one directory up from the build directory. In the next line, the ``make`` call argument ``-j 4`` indicates that the build will run in parallel with 4 threads. If this step is successful, the executables listed in :numref:`Table %s ` will be located in the ``ufs-srweather-app/bin`` directory. .. _ExecDescription: @@ -177,15 +170,8 @@ recommended value ``..`` denotes one directory up from the build directory. In t Grid-specific Configuration =========================== -Some SRW App parameters depend on the characteristics of the grid such as resolution and domain size. -These include ``ESG grid`` and ``Input configuration`` as well as the variables -related to the write component (quilting). The SRW App officially supports three different predefined -grids as shown in :numref:`Table %s `. Their names can be found under -``valid_vals_PREDEF_GRID_NAME`` in the ``valid_param_vals`` script, and their grid-specific configuration -variables are specified in the ``set_predef_grid_params`` script. If users want to create a new domain, -they should put its name in the ``valid_param_vals`` script and the corresponding grid-specific -parameters in the ``set_predef_grid_params`` script. More information on the predefined and user-generated options -can be found in :numref:`Chapter %s `. +Some SRW App parameters depend on the characteristics of the grid such as resolution and domain size. These include ``ESG grid`` and ``Input configuration`` as well as the variables related to the write component (quilting). The SRW App officially supports three different predefined grids as shown in :numref:`Table %s `. Their names can be found under ``valid_vals_PREDEF_GRID_NAME`` in the ``valid_param_vals`` script, and their grid-specific configuration variables are specified in the ``set_predef_grid_params`` script. If users want to create a new domain, they should put its name in the ``valid_param_vals`` script and the corresponding grid-specific parameters in the ``set_predef_grid_params`` script. More information on the predefined and user-generated options +can be found in :numref:`Chapter %s `. .. _PredefinedGrids: @@ -439,11 +425,7 @@ that is executed when running the experiment with the Rocoto workflow manager. *Experiment generation description* -The ``setup.sh`` script reads three other configuration scripts: (1) ``config_default.sh`` -(:numref:`Section %s `), (2) ``config.sh`` (:numref:`Section %s `), -and (3) ``set_predef_grid_params.sh`` (:numref:`Section %s `). Note that these three -scripts are read in order: ``config_default.sh``, ``config.sh``, then ``set_predef_grid_params.sh``. -If a parameter is specified differently in these scripts, the file containing the last defined value will be used. 
+The ``setup.sh`` script reads three other configuration scripts: (1) ``config_default.sh`` (:numref:`Section %s `), (2) ``config.sh`` (:numref:`Section %s `), and (3) ``set_predef_grid_params.sh`` (:numref:`Section %s `). Note that these three scripts are read in order: ``config_default.sh``, ``config.sh``, then ``set_predef_grid_params.sh``. If a parameter is specified differently in these scripts, the file containing the last defined value will be used. .. _WorkflowTaskDescription: @@ -452,16 +434,7 @@ Description of Workflow Tasks The flowchart of the workflow tasks that are specified in the ``FV3LAM_wflow.xml`` file are illustrated in :numref:`Figure %s `, and each task is described in :numref:`Table %s `. The first three pre-processing tasks; ``MAKE_GRID``, -``MAKE_OROG``, and ``MAKE_SFC_CLIMO`` are optional. If the user stages pre-generated grid, orography, and -surface climatology fix files, these three tasks can be skipped by setting ``RUN_TASK_MAKE_GRID=”FALSE”``, -``RUN_TASK_MAKE_OROG=”FALSE”``, and ``RUN_TASK_MAKE_SFC_CLIMO=”FALSE”`` in the ``regional_workflow/ush/config.sh`` -file before running the ``generate_FV3LAM_wflow.sh`` script. As shown in the figure, the ``FV3LAM_wflow.xml`` -file runs the specific j-job scripts in the prescribed order (``regional_workflow/jobs/JREGIONAL_[task name]``) -when the ``launch_FV3LAM_wflow.sh`` is submitted. Each j-job task has its own source script named -``exregional_[task name].sh`` in the ``regional_workflow/scripts`` directory. Two database files -``FV3LAM_wflow.db`` and ``FV3LAM_wflow_lock.db`` are generated and updated by the Rocoto calls. -There is usually no need for users to modify these files. To relaunch the workflow from scratch, -delete these two ``*.db`` files and then call the launch script repeatedly for each task. +``MAKE_OROG``, and ``MAKE_SFC_CLIMO`` are optional. If the user stages pre-generated grid, orography, and surface climatology fix files, these three tasks can be skipped by setting ``RUN_TASK_MAKE_GRID=”FALSE”``, ``RUN_TASK_MAKE_OROG=”FALSE”``, and ``RUN_TASK_MAKE_SFC_CLIMO=”FALSE”`` in the ``regional_workflow/ush/config.sh`` file before running the ``generate_FV3LAM_wflow.sh`` script. As shown in the figure, the ``FV3LAM_wflow.xml`` file runs the specific j-job scripts in the prescribed order (``regional_workflow/jobs/JREGIONAL_[task name]``) when the ``launch_FV3LAM_wflow.sh`` is submitted. Each j-job task has its own source script named ``exregional_[task name].sh`` in the ``regional_workflow/scripts`` directory. Two database files ``FV3LAM_wflow.db`` and ``FV3LAM_wflow_lock.db`` are generated and updated by the Rocoto calls. There is usually no need for users to modify these files. To relaunch the workflow from scratch, delete these two ``*.db`` files and then call the launch script repeatedly for each task. .. 
_WorkflowTasksFig: From f4d2043ba6fe9a82c42331af02c53140112dfa58 Mon Sep 17 00:00:00 2001 From: gspetro Date: Tue, 15 Mar 2022 18:53:29 -0400 Subject: [PATCH 052/118] combine overview w/non-container docs --- docs/UsersGuide/source/Components.rst | 2 +- docs/UsersGuide/source/Introduction.rst | 6 +- .../source/Quickstart_Container.rst | 78 ++- .../source/Quickstart_NonContainer.rst | 459 +++++++++++++-- docs/UsersGuide/source/SRWAppOverview.rst | 547 +----------------- .../source/_static/theme_overrides.css | 1 + docs/UsersGuide/source/index.rst | 2 +- 7 files changed, 485 insertions(+), 610 deletions(-) diff --git a/docs/UsersGuide/source/Components.rst b/docs/UsersGuide/source/Components.rst index 50513a303e..bd8b22c254 100644 --- a/docs/UsersGuide/source/Components.rst +++ b/docs/UsersGuide/source/Components.rst @@ -87,7 +87,7 @@ For the selected computational platforms that have been pre-configured (Level 1) required libraries for building the SRW Application are available in a central place. That means bundled libraries (NCEPLIBS) and third-party libraries (NCEPLIBS-external) have both been built. The SRW Application is expected to build and run out-of-the-box on these -pre-configured platforms. Users can download the SRW code and choose whether to run it :ref:`in a container ` or :ref:`locally `. +pre-configured platforms. Users can download the SRW code and choose whether to run it :ref:`in a container ` or :ref:`locally `. A few additional computational platforms are considered configurable for the SRW Application release. Configurable platforms (Level 2) are platforms where all of the required libraries for building the SRW Application are expected to install successfully but are not available in a central location. Applications and models are expected to build and run once the required bundled libraries (e.g., NCEPLIBS) and third-party libraries (e.g., NCEPLIBS-external) are built. diff --git a/docs/UsersGuide/source/Introduction.rst b/docs/UsersGuide/source/Introduction.rst index c9b1ee6491..f67f240f25 100644 --- a/docs/UsersGuide/source/Introduction.rst +++ b/docs/UsersGuide/source/Introduction.rst @@ -6,7 +6,7 @@ Introduction The Unified Forecast System (:term:`UFS`) is a community-based, coupled, comprehensive Earth modeling system. The UFS is the source system for NOAA’s operational numerical weather prediction applications. It enables research, development, and contribution opportunities within the broader :term:`weather enterprise` (e.g. government, industry, and academia). For more information about the UFS, visit the `UFS Portal `__. -The UFS can be configured for multiple applications (see the `complete list here `__). The configuration described in this documentation is the UFS Short-Range Weather (SRW) Application, which targets predictions of atmospheric behavior on a limited spatial domain and on time scales from minutes out to several days. The SRW Application v2.0 release includes a prognostic atmospheric model, pre- and post-processing, and a community workflow for running the system end-to-end. These components are documented within this User's Guide and supported through a `community forum `_. Future work will expand the capabilities of the application to include data assimilation (DA) and a verification package (e.g., METplus). 
This documentation provides two Quick Start Guides for running the application :ref:`in a conainer ` or :ref:`locally `, in addition to an overview of the :ref:`release components `, a description of the supported capabilities, and details on where to find more information and obtain support.
+The UFS can be configured for multiple applications (see the `complete list here `__). The configuration described in this documentation is the UFS Short-Range Weather (SRW) Application, which targets predictions of atmospheric behavior on a limited spatial domain and on time scales from minutes out to several days. The SRW Application v2.0 release includes a prognostic atmospheric model, pre- and post-processing, and a community workflow for running the system end-to-end. These components are documented within this User's Guide and supported through a `community forum `_. Future work will expand the capabilities of the application to include data assimilation (DA) and a verification package (e.g., METplus). This documentation provides a Quick Start Guide for running the application :ref:`in a container ` and a detailed guide for running the SRW :ref:`locally `, in addition to an overview of the :ref:`release components `, a description of the supported capabilities, and details on where to find more information and obtain support.

 The SRW App v1.0.0 citation is as follows and should be used when presenting results based on research conducted with the App:
@@ -155,7 +155,7 @@ Under the ``regional_workflow`` directory shown in :numref:`TopLevelDirStructure
 Experiment Directory Structure
 --------------------------------
-When the user generates an experiment using the ``generate_FV3LAM_wflow.sh`` script (:numref:`Step %s `), a user-defined experimental directory is created based on information specified in the ``config.sh`` file. :numref:`Table %s ` shouws the contents of the experiment directory before the experiment workflow is run.
+When the user generates an experiment using the ``generate_FV3LAM_wflow.sh`` script (:numref:`Step %s `), a user-defined experimental directory is created based on information specified in the ``config.sh`` file. :numref:`Table %s ` shows the contents of the experiment directory before the experiment workflow is run.

 .. _ExptDirStructure:
@@ -320,7 +320,7 @@ The SRW Application has a portable CMake-based build system that packages togeth
 The SRW Application allows for configuration of various elements of the workflow. For example, users can modify the parameters of the atmospheric model, such as start and end dates, duration, time step, and the physics suite for the forecast.

-This SRW Application release has been tested on a variety of platforms widely used by researchers, including NOAA High-Performance Computing (HPC) systems (e.g. Hera, Orion), cloud environments, and generic Linux and macOS systems. Four `levels of support `_ have been defined for the SRW Application, including pre-configured (Level 1), configurable (Level 2), limited test platforms (Level 3), and build-only platforms (Level 4). Preconfigured (Level 1) systems already have the required external libraries (e.g., NCEPLIBS) available in a central location. The SRW Application is expected to build and run out-of-the-box on these systems, and users can :ref:`download the SRW code ` without first installing prerequisites.
On other platforms, the SRW must be :ref:`run within a container ` that contains the HPC-Stack, or the required libraries (i.e., HPC-Stack) will need to be installed as part of the :ref:`non-container `) SRW installation process. Once these prerequisite libraries are built, applications and models should build and run successfully. However, users may need to perform additional troubleshooting on Level 3 or 4 systems since little or no pre-release testing has been conducted on these systems.
+This SRW Application release has been tested on a variety of platforms widely used by researchers, including NOAA High-Performance Computing (HPC) systems (e.g. Hera, Orion), cloud environments, and generic Linux and macOS systems. Four `levels of support `_ have been defined for the SRW Application, including pre-configured (Level 1), configurable (Level 2), limited test platforms (Level 3), and build-only platforms (Level 4). Preconfigured (Level 1) systems already have the required external libraries (e.g., NCEPLIBS) available in a central location. The SRW Application is expected to build and run out-of-the-box on these systems, and users can :ref:`download the SRW code ` without first installing prerequisites. On other platforms, the SRW must be :ref:`run within a container ` that contains the HPC-Stack, or the required libraries (i.e., HPC-Stack) will need to be installed as part of the :ref:`non-container ` SRW installation process. Once these prerequisite libraries are built, applications and models should build and run successfully. However, users may need to perform additional troubleshooting on Level 3 or 4 systems since little or no pre-release testing has been conducted on these systems.

 User Support, Documentation, and Contributions to Development
 ===============================================================
diff --git a/docs/UsersGuide/source/Quickstart_Container.rst b/docs/UsersGuide/source/Quickstart_Container.rst
index c66a3abb5d..554ba9b743 100644
--- a/docs/UsersGuide/source/Quickstart_Container.rst
+++ b/docs/UsersGuide/source/Quickstart_Container.rst
@@ -4,7 +4,7 @@
 Containerized Quick Start Guide (Recommended)
 =================================================
-This Workflow Quick Start Guide will help users to build and run the "out-of-the-box" case for the Unified Forecast System (:term:`UFS`) Short-Range Weather (SRW) Application using a container. The container approach provides a uniform enviroment in which to build and run the SRW. Normally, the details of building and running the SRW vary from system to system due to the many possible combinations of operating systems, compilers, :term:`MPI`’s, and package versions. Installation via an EPIC-provided container reduces this variability and allows for a smoother SRW build and run experience.
+This Workflow Quick Start Guide will help users to build and run the "out-of-the-box" case for the Unified Forecast System (:term:`UFS`) Short-Range Weather (SRW) Application using a container. The container approach provides a uniform environment in which to build and run the SRW. Normally, the details of building and running the SRW vary from system to system due to the many possible combinations of operating systems, compilers, :term:`MPI` implementations, and package versions available. Installation via an EPIC-provided container reduces this variability and allows for a smoother SRW build and run experience.
However, the :ref:`non-container approach ` may be more appropriate for those users who desire additional customizability, particularly if they already have experience running the SRW App. The "out-of-the-box" SRW case described in this guide builds a weather forecast for June 15-16, 2019. Multiple convective weather events during these two days produced over 200 filtered storm reports. Severe weather was clustered in two areas: the Upper Midwest through the Ohio Valley and the Southern Great Plains. This forecast uses a predefined 25-km Continental United States (:term:`CONUS`) grid (RRFS_CONUS_25km), the Global Forecast System (:term:`GFS`) version 15.2 physics suite (FV3_GFS_v15p2 CCPP), and :term:`FV3`-based GFS raw external model data for initialization. @@ -45,7 +45,7 @@ On NOAA Cloud systems, certain environment variables must be set *before* buildi Working on HPC Systems -------------------------- -Those *not* working on HPC systems may skip to the `next step `. +Those *not* working on HPC systems may skip to the :ref:`next step `. On HPC systems (including NOAA's Cloud platforms), allocate a compute node on which to run the SRW. On NOAA's Cloud platforms, the following commands should work: .. code-block:: console @@ -83,7 +83,7 @@ The command above also binds the local directory to the container so that data c .. attention:: * When binding two directories, they must have the same name. It may be necessary to ``cd`` into the container and create an appropriately named directory in the container using the ``mkdir`` command if one is not already there. - * Be sure to bind the directory that contains the data the experiment will access. + * Be sure to bind the directory that contains the experiment data. .. _SetUpBuildC: @@ -95,6 +95,7 @@ If the SRW Application has been built in a container provided by the Earth Predi .. code-block:: console + cd ubuntu20.04-epic-srwapp-1.0/opt/ufs-srweather-app/ ln -s /usr/bin/python3 /usr/bin/python source /usr/share/lmod/6.6/init/profile module use /opt/hpc-modules/modulefiles/stack @@ -106,17 +107,17 @@ If the SRW Application has been built in a container provided by the Earth Predi Build the Executables ====================== -Create a directory to hold the build's executables: +From the ``ufs-srweather-app`` directory, ``cd`` into the build directory and run the script that builds the SRW App: .. code-block:: console - cd ubuntu20.04-epic-srwapp-1.0/opt/ufs-srweather-app/build + cd build source build-srw.sh Download and Stage the Data ============================ -The SRW requires input files to run. These include static datasets, initial and boundary conditions +The SRW requires input files to run. These include static datasets, initial and boundary condition files, and model configuration files. On Level 1 and 2 systems, the data required to run SRW tests are already available. For Level 3 and 4 systems, the data must be added. Detailed instructions on how to add the data can be found in the :numref:`Section %s Downloading and Staging Input Data `. :numref:`Sections %s ` and :numref:`%s ` contain useful background information on the input and output files used in the SRW. .. _GenerateForecastC: @@ -133,15 +134,15 @@ The first two steps depend on the platform being used and are described here for .. _SetUpConfigFileC: -Set Experiment Parameters -------------------------- +Set the Experiment Parameters +------------------------------- Each experiment requires certain basic information to run (e.g., date, grid, physics suite). 
This information is specified in the ``config.sh`` file. Two example ``config.sh`` templates are provided: ``config.community.sh`` and ``config.nco.sh``. They can be found in the ``ufs-srweather-app/regional_workflow/ush`` directory. The first file is a minimal example for creating and running an experiment in the *community* mode (with ``RUN_ENVIR`` set to ``community``). The second is an example for creating and running an experiment in the *NCO* (operational) mode (with ``RUN_ENVIR`` set to ``nco``). The *community* mode is recommended in most cases and will be fully supported for this release. Make a copy of ``config.community.sh`` to get started (under ``/regional_workflow/ush``). From the ``ufs-srweather-app`` directory, run: .. code-block:: console - cd ../regional_workflow/ush + cd regional_workflow/ush cp config.community.sh config.sh The default settings in this file include a predefined 25-km :term:`CONUS` grid (RRFS_CONUS_25km), the :term:`GFS` v15.2 physics suite (FV3_GFS_v15p2 CCPP), and :term:`FV3`-based GFS raw external model data for initialization. @@ -183,8 +184,8 @@ On NOAA Cloud platforms, users may continue to the :ref:`next step `. The settings for these paths can also be viewed in the console output from the ``./generate_FV3LAM_wflow.sh`` script or in the ``log.generate_FV3LAM_wflow`` file, which can be found in the experiment directory. +The generated workflow will be in the experiment directory specified in the ``config.sh`` file in :numref:`Step %s `. The settings for these paths can also be viewed in the console output from the ``./generate_FV3LAM_wflow.sh`` script or in the ``log.generate_FV3LAM_wflow`` file, which can be found in the experiment directory. + +.. _RunUsingStandaloneScripts: Run the Workflow Using Stand-Alone Scripts ============================================= @@ -234,6 +237,8 @@ Run the Workflow Using Stand-Alone Scripts .. note:: The Rocoto workflow manager cannot be used inside a container. +The regional workflow can be run using standalone shell scripts if the Rocoto software is not available on a given platform. If Rocoto *is* available, see `Section %s ` to run the workflow using Rocoto. + #. ``cd`` into the experiment directory #. Set the environment variable ``EXPTDIR`` for either bash or csh, respectively: @@ -243,7 +248,7 @@ Run the Workflow Using Stand-Alone Scripts export EXPTDIR=`pwd` setenv EXPTDIR `pwd` -#. COPY the wrapper scripts from the regional_workflow directory into your experiment directory: +#. Copy the wrapper scripts from the regional_workflow directory into your experiment directory. Each workflow task has a wrapper script that sets environment variables and run the job script. .. code-block:: console @@ -256,7 +261,7 @@ Run the Workflow Using Stand-Alone Scripts export OMP_NUM_THREADS=1 sed -i 's/bin\/sh/bin\/bash/g' *sh -#. RUN each of the listed scripts in order. Scripts with the same stage number (listed in :numref:`Table %s `) may be run simultaneously. +#. Run each of the listed scripts in order. Scripts with the same stage number (listed in :numref:`Table %s `) may be run simultaneously. .. code-block:: console @@ -270,6 +275,51 @@ Run the Workflow Using Stand-Alone Scripts ./run_fcst.sh ./run_post.sh +Check the batch script output file in your experiment directory for a “SUCCESS” message near the end of the file. + +.. 
hint:: + If any of the scripts return an error that "Primary job terminated normally, but one process returned a non-zero exit code," there may not be enough space on one node to run the process. On an HPC system, the user will need to allocate a(nother) compute node. The process for doing so is system-dependent, and users should check the documentation available for their HPC system. Instructions for allocating a compute node on NOAA Cloud systems can be viewed in the :numref:`Step %s ` as an example. + +.. note:: + #. On most HPC systems, users will need to submit a batch job to run multi-processor jobs. On some HPC systems, users may be able to run the first two jobs (serial) on a login node/command-line. Example scripts for Slurm (Hera) and PBS (Cheyenne) resource managers are provided. These will need to be adapted to each user's system. This submit batch script is hard-coded per task, so it will need to be modified or copied to run each task. + + +.. _RegionalWflowTasks: + +.. table:: List of tasks in the regional workflow in the order that they are executed. + Scripts with the same stage number may be run simultaneously. The number of + processors and wall clock time is a good starting point for Cheyenne or Hera + when running a 48-h forecast on the 25-km CONUS domain. + + +------------+------------------------+----------------+----------------------------+ + | **Stage/** | **Task Run Script** | **Number of** | **Wall clock time (H:MM)** | + | **step** | | **Processors** | | + +============+========================+================+============================+ + | 1 | run_get_ics.sh | 1 | 0:20 (depends on HPSS vs | + | | | | FTP vs staged-on-disk) | + +------------+------------------------+----------------+----------------------------+ + | 1 | run_get_lbcs.sh | 1 | 0:20 (depends on HPSS vs | + | | | | FTP vs staged-on-disk) | + +------------+------------------------+----------------+----------------------------+ + | 1 | run_make_grid.sh | 24 | 0:20 | + +------------+------------------------+----------------+----------------------------+ + | 2 | run_make_orog.sh | 24 | 0:20 | + +------------+------------------------+----------------+----------------------------+ + | 3 | run_make_sfc_climo.sh | 48 | 0:20 | + +------------+------------------------+----------------+----------------------------+ + | 4 | run_make_ics.sh | 48 | 0:30 | + +------------+------------------------+----------------+----------------------------+ + | 4 | run_make_lbcs.sh | 48 | 0:30 | + +------------+------------------------+----------------+----------------------------+ + | 5 | run_fcst.sh | 48 | 0:30 | + +------------+------------------------+----------------+----------------------------+ + | 6 | run_post.sh | 48 | 0:25 (2 min per output | + | | | | forecast hour) | + +------------+------------------------+----------------+----------------------------+ + +Example batch-submit scripts for Hera (Slurm) and Cheyenne (PBS) are included (``sq_job.sh`` +and ``qsub_job.sh``, respectively). These examples set the build and run environment for Hera or Cheyenne so that run-time libraries match the compiled libraries (i.e. netCDF, MPI). Users may either modify the submit batch script as each task is submitted, or duplicate this batch wrapper +for their system settings for each task. Alternatively, some batch systems allow users to specify most of the settings on the command line (with the ``sbatch`` or ``qsub`` command, for example). This piece will be unique to your platform. 
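For illustration only, on a Slurm-based system a user might pass these settings on the command line when submitting one of the wrapper scripts; the account name below is a placeholder, and the node, task, and wall-clock values simply mirror the ``run_make_grid.sh`` entry in the table above:

.. code-block:: console

   sbatch --account=an_account --job-name=make_grid --nodes=1 --ntasks-per-node=24 \
          --time=00:20:00 --output=make_grid.log ./run_make_grid.sh

An equivalent PBS submission would use ``qsub`` with the corresponding ``-A``, ``-l``, and ``-N`` options.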
The tasks run by the regional workflow are shown in :numref:`Table %s `. Tasks with the same stage level may be run concurrently (no dependency).

Plot the Output
===============
diff --git a/docs/UsersGuide/source/Quickstart_NonContainer.rst b/docs/UsersGuide/source/Quickstart_NonContainer.rst
index 4339c5ab79..7b0bd6195f 100644
--- a/docs/UsersGuide/source/Quickstart_NonContainer.rst
+++ b/docs/UsersGuide/source/Quickstart_NonContainer.rst
@@ -1,17 +1,39 @@
-.. _QuickstartNC:
+.. _BuildRunSRW:

-======================================
-Workflow Quick Start (Non-Container)
-======================================
+==============================================
+Building and Running the SRW (Non-Container)
+==============================================

-This Workflow Quick Start Guide will help users to build and run the "out-of-the-box" case for the Unified Forecast System (:term:`UFS`) Short-Range Weather (SRW) Application. The "out-of-the-box" case builds a weather forecast for June 15-16, 2019. Multiple convective weather events during these two days produced over 200 filtered storm reports. Severe weather was clustered in two areas: the Upper Midwest through the Ohio Valley and the Southern Great Plains. This forecast uses a predefined 25-km Continental United States (:term:`CONUS`) grid (RRFS_CONUS_25km), the Global Forecast System (:term:`GFS`) version 15.2 physics suite (FV3_GFS_v15p2 CCPP), and :term:`FV3`-based GFS raw external model data for initialization.
+The UFS Short-Range Weather Application (SRW App) is an :term:`umbrella repository` consisting of a number of different :ref:`components ` housed in external repositories. Once the SRW App is configured and built, users can generate predictions of atmospheric behavior over a limited spatial area and on time scales ranging from minutes out to several days.
+
+This chapter walks users through how to build and run the "out-of-the-box" case for the Unified Forecast System (:term:`UFS`) Short-Range Weather (SRW) Application. However, the steps are relevant to any SRW experiment and can be modified to suit user goals. The "out-of-the-box" SRW case builds a weather forecast for June 15-16, 2019. Multiple convective weather events during these two days produced over 200 filtered storm reports. Severe weather was clustered in two areas: the Upper Midwest through the Ohio Valley and the Southern Great Plains. This forecast uses a predefined 25-km Continental United States (:term:`CONUS`) grid (RRFS_CONUS_25km), the Global Forecast System (:term:`GFS`) version 15.2 physics suite (FV3_GFS_v15p2 CCPP), and :term:`FV3`-based GFS raw external model data for initialization.

 .. attention::
-   The UFS defines `four platform levels `_. The steps described in this chapter are most applicable to preconfigured (Level 1) systems. On Level 1 systems, all of the required libraries for building community releases of UFS models and applications are available in a central location. This guide can serve as a starting point for running the SRW App on other systems as well but may require additional troubleshooting by the user.
+   The UFS defines `four platform levels `_. The steps described in this chapter will work most smoothly on preconfigured (Level 1) systems. On Level 1 systems, all of the required libraries for building community releases of UFS models and applications are available in a central location. This guide can serve as a starting point for running the SRW App on other systems, too, but users may need to perform additional troubleshooting.

..
note:: - The :ref:`container approach ` is recommended when possible for a smoother build and run experience. Building without a container allows for use of the Rocoto workflow manager and may allow for more cutomization; however, this comes at the expense of more in-depth troubleshooting, especially on Level 3 and 4 systems. + The :ref:`container approach ` is recommended for a smoother build and run experience. Building without a container allows for the use of the Rocoto workflow manager and may allow for more cutomization. However, the non-container approach requires more in-depth troubleshooting skills, especially on Level 3 and 4 systems, and is less appropriate for beginners. + +The overall procedure for generating an experiment is shown in :numref:`Figure %s `, with the scripts to generate and run the workflow shown in red. The steps are as follows: + + * :ref:`Install prerequisites ` + * :ref:`Clone the SRW App from GitHub ` + * :ref:`Check out the external repositories ` + * :ref:`Set up the build environment ` + * :ref:`Build the executables ` + * :ref:`Download and stage data ` + * :ref:`Optional: Configure a new grid ` + * :ref:`Configure the experiment ` + * :ref:`Load the python environment for the regional workflow ` + * :ref:`Generate a regional workflow experiment ` + * :ref:`Run the regional workflow ` + +.. _AppOverallProc: + +.. figure:: _static/FV3LAM_wflow_overall.png + + *Overall layout of the SRW App* .. _HPCstackInfo: @@ -20,28 +42,26 @@ Install the HPC-Stack ======================== .. Attention:: - Skip the HPC-stack installation if working on a `Level 1 system `_ (e.g., Cheyenne, Hera, Orion). + Skip the HPC-Stack installation if working on a `Level 1 system `_ (e.g., Cheyenne, Hera, Orion, NOAA Cloud). -**Definition:** :term:`HPC-stack` is a repository that provides a unified, shell script-based build system that builds the software stack required for the `Unified Forecast System (UFS) `_ and applications. +**Definition:** :term:`HPC-Stack` is a repository that provides a unified, shell script-based build system and builds the software stack required for the `Unified Forecast System (UFS) `_ and applications. Background ---------------- -The UFS Weather Model draws on over 50 code libraries to run its applications. These libraries range from libraries developed in-house at NOAA (e.g. NCEPLIBS, FMS, etc.) to libraries developed by NOAA's partners (e.g. PIO, ESMF etc) to truly third party libraries (e.g. NETCDF). Individual installation of these libraries is not practical, so the `HPC-Stack `__ was developed as a central installation system to ensure that the infrastructure environment across multiple platforms is as similar as possible. Installation of the HPC-Stack is required to run the SRW. +The UFS Weather Model draws on over 50 code libraries to run its applications. These libraries range from libraries developed in-house at NOAA (e.g. NCEPLIBS, FMS, etc.) to libraries developed by NOAA's partners (e.g. PIO, ESMF, etc.) to truly third party libraries (e.g. NETCDF). Individual installation of these libraries is not practical, so the `HPC-Stack `__ was developed as a central installation system to ensure that the infrastructure environment across multiple platforms is as similar as possible. Installation of the HPC-Stack is required to run the SRW. Instructions ------------------------- -`Level 1 `_ platforms (e.g. Cheyenne, Hera) already have the HPC-Stack installed. 
Users on those platforms do *not* need to install the HPC-Stack before building applications or models that require the HPC-Stack. Users working on systems that fall under `Support Levels 2-4 `_ will need to install the HPC-Stack the first time they try to run applications or models that depend on it. - -Users can either build the HPC-stack on their local system or use the centrally maintained stacks on each HPC platform. For a detailed description of installation options, see :ref:`Installing the HPC-Stack `. +Users working on systems that fall under `Support Levels 2-4 `_ will need to install the HPC-Stack the first time they try to run applications (such as the SRW) or models that depend on it. Users can either build the HPC-stack on their local system or use the centrally maintained stacks on each HPC platform. For a detailed description of installation options, see :ref:`Installing the HPC-Stack `. After completing installation, continue to the next section. -.. _DownloadCodeNC: +.. _DownloadSRWApp: Download the UFS SRW Application Code ===================================== -The SRW Application source code is publicly available on GitHub and can be run in a container or locally, depending on user preference. The SRW Application relies on a variety of components detailed in the :ref:`Components Chapter ` of this User's Guide. Users must (1) clone the UFS SRW Application umbrella repository and then (2) run the ``checkout_externals`` script to link the necessary external repositories to the SRW App. The ``checkout_externals`` script uses the configuration file ``Externals.cfg`` in the top level directory of the SRW App and will clone the correct version of the regional workflow, pre-processing utilities, UFS Weather Model, and UPP source code into the appropriate directories under the ``regional_workflow`` and ``src`` directories. +The SRW Application source code is publicly available on GitHub. It relies on a variety of components detailed in the :ref:`Components Chapter ` of this User's Guide. Users must (1) clone the UFS SRW Application umbrella repository and then (2) run the ``checkout_externals`` script to link the necessary external repositories to the SRW App. The ``checkout_externals`` script uses the configuration file ``Externals.cfg`` in the top level directory of the SRW App and will clone the correct version of the regional workflow, pre-processing utilities, UFS Weather Model, and UPP source code into the appropriate directories under the ``regional_workflow`` and ``src`` directories. Clone the release branch of the repository: @@ -52,31 +72,84 @@ Clone the release branch of the repository: .. COMMENT: This will need to be changed to the updated release branch of the SRW repo once it exists. -Then, run the executable that pulls in the submodules for the SRW Application: +The cloned repository contains the configuration files and sub-directories shown in +:numref:`Table %s `. + +.. _FilesAndSubDirs: + +.. 
table:: Files and sub-directories of the ufs-srweather-app repository + + +--------------------------------+--------------------------------------------------------+ + | **File/directory Name** | **Description** | + +================================+========================================================+ + | CMakeLists.txt | Main cmake file for SRW App | + +--------------------------------+--------------------------------------------------------+ + | Externals.cfg | Tags of the GitHub repositories/branches for the | + | | external repositories | + +--------------------------------+--------------------------------------------------------+ + | LICENSE.md | CC0 license information | + +--------------------------------+--------------------------------------------------------+ + | README.md | Getting Started Guide | + +--------------------------------+--------------------------------------------------------+ + | ufs_srweather_app_meta.h.in | Meta information for SRW App which can be used by | + | | other packages | + +--------------------------------+--------------------------------------------------------+ + | ufs_srweather_app.settings.in | SRW App configuration summary | + +--------------------------------+--------------------------------------------------------+ + | env | Contains build and workflow environment files | + +--------------------------------+--------------------------------------------------------+ + | docs | Contains release notes, documentation, and Users' Guide| + +--------------------------------+--------------------------------------------------------+ + | manage_externals | Utility for checking out external repositories | + +--------------------------------+--------------------------------------------------------+ + | src | Contains CMakeLists.txt; external repositories | + | | will be cloned in this directory. | + +--------------------------------+--------------------------------------------------------+ + + +.. _CheckoutExternals: + +Check Out External Components +================================ + +Next, run the executable that pulls in SRW components from external repositories, including the regional_workflow, ufs-weather-model, ufs_utils, and upp repositories: .. code-block:: console cd ufs-srweather-app ./manage_externals/checkout_externals +This step will use the configuration file ``Externals.cfg`` in the ``ufs-srweather-app`` directory to clone the correct tags (code versions) of the external repositories listed in :numref:`Section %s `. -.. _SetUpBuildNC: +.. _SetUpBuild: Set up the Build Environment ============================ -For Level 1 and 2 systems, scripts for loading the proper modules and/or setting the -correct environment variables can be found in the ``env/`` directory of the SRW App in files named -``build__.env``. The commands in these files can be directly copy-pasted -to the command line, or the file can be sourced from the ufs-srweather-app ``env/`` directory. -For example, on Hera, run ``source env/build_hera_intel.env`` from the main ufs-srweather-app -directory to source the appropriate file. +Before building the SRW App, the build environment must be set up for the user's specific platform. For Level 1 systems, scripts for loading the proper modules and/or setting the correct environment variables can be found in the ``env`` directory of the SRW App in files named ``build__.env``. Here is a sample directory listing of these build files: + +.. 
code-block:: console + + $ ls -l env/ + -rw-rw-r-- 1 user ral 1228 Oct 9 10:09 build_cheyenne_intel.env + -rw-rw-r-- 1 user ral 1134 Oct 9 10:09 build_hera_intel.env + -rw-rw-r-- 1 user ral 1228 Oct 9 10:09 build_jet_intel.env + ... + +On Level 1 systems, the commands in the ``build__.env`` files can be directly copy-pasted into the command line, or the file can be sourced from the ufs-srweather-app ``env`` directory. For example, on Hera, run: + +.. code-block:: -On Level 3-4 systems, users will need to modify certain environment variables, such as the path to NCEP libraries, so that the SRW App can find and load the appropriate modules. For systems with Lmod installed, one of the current ``build__.env`` files can be copied and used as a template. On systems without Lmod, this process will typically involve commands in the form ``export =``. You may need to use ``setenv`` rather than ``export`` depending on your environment. + source env/build_hera_intel.env +from the main ufs-srweather-app directory to source the appropriate file. + +On Level 2-4 systems, users will need to modify certain environment variables, such as the path to NCEP libraries, so that the SRW App can find and load the appropriate modules. For systems with Lmod installed, one of the current ``build__.env`` files can be copied and used as a template. To check if Lmod is installed, run ``echo $LMOD_PKG``, and see if it outputs a path to the Lmod package. On systems without Lmod, this process will typically involve commands in the form ``export =``. You may need to use ``setenv`` rather than ``export`` depending on your shell environment. + +.. _BuildExecutables: Build the Executables -===================== +======================= Create a directory to hold the build's executables: @@ -85,26 +158,109 @@ Create a directory to hold the build's executables: mkdir build cd build -From the build directory, run the ``cmake`` command below to set up the ``Makefile``, then run the ``make`` command to build the executables: +From the build directory, run the following commands to build the pre-processing utilities, forecast model, and post-processor: .. code-block:: console cmake .. -DCMAKE_INSTALL_PREFIX=.. make -j 4 >& build.out & -The build will take a few minutes to complete. When it starts, a random number is printed to the console, and when it is done, a ``[1]+ Done`` message is printed to the console when you list the files in ``ufs-srweather-app/bin`` (``[1]+ Exit`` may indicate an error). Output from the build will be in the ``ufs-srweather-app/build/build.out`` file. When the build completes, you should see the forecast model executable ``ufs_model`` and several pre- and post-processing executables in the ``ufs-srweather-app/bin`` directory. These executables are described in :numref:`Table %s `. +``-DCMAKE_INSTALL_PREFIX`` specifies the location in which the ``bin``, ``include``, ``lib``, and ``share`` directories will be created. These directories will contain various components of the SRW App. Its recommended value ``..`` denotes one directory up from the build directory. In the next line, the ``make`` call argument ``-j 4`` indicates that the build will run in parallel with 4 threads. + +The build will take a few minutes to complete. When it starts, a random number is printed to the console, and when it is done, a ``[1]+ Done`` message is printed to the console. ``[1]+ Exit`` indicates an error. Output from the build will be in the ``ufs-srweather-app/build/build.out`` file. 
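While the build runs, users can optionally monitor progress and, once it finishes, confirm that the executables were created. The commands below are only a sketch, assuming they are run from the ``build`` directory and that the recommended ``-DCMAKE_INSTALL_PREFIX=..`` setting was used:

.. code-block:: console

   tail -f build.out    # follow the build log; press Ctrl-C to stop
   ls ../bin            # after completion, the executables should appear here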
When the build completes, users should see the forecast model executable ``ufs_model`` and several pre- and post-processing executables in the ``ufs-srweather-app/bin`` directory. These executables are described in :numref:`Table %s `. .. hint:: If you see the build.out file, but there is no ``ufs-srweather-app/bin`` directory, wait a few more minutes for the build to complete. +.. _ExecDescription: + +.. table:: Names and descriptions of the executables produced by the build step and used by the SRW App + + +------------------------+---------------------------------------------------------------------------------+ + | **Executable Name** | **Description** | + +========================+=================================================================================+ + | chgres_cube | Reads in raw external model (global or regional) and surface climatology data | + | | to create initial and lateral boundary conditions | + +------------------------+---------------------------------------------------------------------------------+ + | filter_topo | Filters topography based on resolution | + +------------------------+---------------------------------------------------------------------------------+ + | global_equiv_resol | Calculates a global, uniform, cubed-sphere equivalent resolution for the | + | | regional Extended Schmidt Gnomonic (ESG) grid | + +------------------------+---------------------------------------------------------------------------------+ + | make_solo_mosaic | Creates mosaic files with halos | + +------------------------+---------------------------------------------------------------------------------+ + | upp.x | Post-processor for the model output | + +------------------------+---------------------------------------------------------------------------------+ + | ufs_model | UFS Weather Model executable | + +------------------------+---------------------------------------------------------------------------------+ + | orog | Generates orography, land mask, and gravity wave drag files from fixed files | + +------------------------+---------------------------------------------------------------------------------+ + | regional_esg_grid | Generates an ESG regional grid based on a user-defined namelist | + +------------------------+---------------------------------------------------------------------------------+ + | sfc_climo_gen | Creates surface climatology fields from fixed files for use in ``chgres_cube`` | + +------------------------+---------------------------------------------------------------------------------+ + | shave | Shaves the excess halo rows down to what is required for the LBCs in the | + | | orography and grid files | + +------------------------+---------------------------------------------------------------------------------+ + | vcoord_gen | Generates hybrid coordinate interface profiles | + +------------------------+---------------------------------------------------------------------------------+ + | fvcom_to_FV3 | Determine lake surface conditions for the Great Lakes | + +------------------------+---------------------------------------------------------------------------------+ + | make_hgrid | Computes geo-referencing parameters (e.g., latitude, longitude, grid cell area) | + | | for global uniform grids | + +------------------------+---------------------------------------------------------------------------------+ + | emcsfc_ice_blend | Blends National Ice Center sea ice cover and EMC sea ice concentration data to | + | | create a global sea ice 
analysis used to update the GFS once per day | + +------------------------+---------------------------------------------------------------------------------+ + | emcsfc_snow2mdl | Blends National Ice Center snow cover and Air Force snow depth data to create a | + | | global depth analysis used to update the GFS snow field once per day | + +------------------------+---------------------------------------------------------------------------------+ + | global_cycle | Updates the GFS surface conditions using external snow and sea ice analyses | + +------------------------+---------------------------------------------------------------------------------+ + | inland | Creates an inland land mask by determining in-land (i.e. non-coastal) points | + | | and assigning a value of 1. Default value is 0. | + +------------------------+---------------------------------------------------------------------------------+ + | orog_gsl | Ceates orographic statistics fields required for the orographic drag suite | + | | developed by NOAA's Global Systems Laboratory (GSL) | + +------------------------+---------------------------------------------------------------------------------+ + | fregrid | Remaps data from the input mosaic grid to the output mosaic grid | + +------------------------+---------------------------------------------------------------------------------+ + | lakefrac | Calculates the ratio of the lake area to the grid cell area at each atmospheric | + | | grid point. | + +------------------------+---------------------------------------------------------------------------------+ + +.. _Data: + Download and Stage the Data ============================ The SRW requires input files to run. These include static datasets, initial and boundary conditions files, and model configuration files. On Level 1 and 2 systems, the data required to run SRW tests are already available. For Level 3 and 4 systems, the data must be added. Detailed instructions on how to add the data can be found in the :numref:`Section %s Downloading and Staging Input Data `. :numref:`Sections %s ` and :numref:`%s ` contain useful background information on the input and output files used in the SRW. -.. _GenerateForecastNC: +.. _GridSpecificConfig: + +Grid Configuration +======================= + +The SRW App officially supports three different predefined grids as shown in :numref:`Table %s `. The "out-of-the-box" SRW case uses the ``RRFS_CONUS_25km`` predefined grid option. More information on the predefined and user-generated grid options can be found in :numref:`Chapter %s ` for those who are curious. Users who plan to utilize one of the three pre-defined domain (grid) options may continue to :numref:`Step %s `. Users who plan to create a new domain should refer to :numref:`Chapter %s ` for details on how to do so. At a minimum, they will need to add the new grid name to the ``valid_param_vals`` script and add the corresponding grid-specific parameters in the ``set_predef_grid_params`` script. + +.. _PredefinedGrids: + +.. 
table:: Predefined grids in the SRW App + + +----------------------+-------------------+--------------------------------+ + | **Grid Name** | **Grid Type** | **Quilting (write component)** | + +======================+===================+================================+ + | RRFS_CONUS_25km | ESG grid | lambert_conformal | + +----------------------+-------------------+--------------------------------+ + | RRFS_CONUS_13km | ESG grid | lambert_conformal | + +----------------------+-------------------+--------------------------------+ + | RRFS_CONUS_3km | ESG grid | lambert_conformal | + +----------------------+-------------------+--------------------------------+ + + +.. _GenerateForecast: Generate the Forecast Experiment ================================= @@ -116,36 +272,219 @@ Generating the forecast experiment requires three steps: The first two steps depend on the platform being used and are described here for each Level 1 platform. Users will need to adjust the instructions to their machine if they are working on a Level 2-4 platform. -.. _SetUpConfigFileNC: +.. _ExptConfig: Set Experiment Parameters -------------------------- -Each experiment requires certain basic information to run (e.g., date, grid, physics suite). This information is specified in the ``config.sh`` file. Two example ``config.sh`` templates are provided: ``config.community.sh`` and ``config.nco.sh``. They can be found in the ``ufs-srweather-app/regional_workflow/ush`` directory. The first file is a minimal example for creating and running an experiment in the *community* mode (with ``RUN_ENVIR`` set to ``community``). The second is an example for creating and running an experiment in the *NCO* (operational) mode (with ``RUN_ENVIR`` set to ``nco``). The *community* mode is recommended in most cases and will be fully supported for this release. +---------------------------- + +Each experiment requires certain basic information to run (e.g., date, grid, physics suite). This information is specified in ``config_defaults.sh`` and in the user-specific ``config.sh`` file. When generating a new experiment, the SRW App first reads and assigns default values from the ``config_defaults.sh`` file. Then, it reads and (re)assigns variables from the user's custom ``config.sh`` file. For background info on ``config_defaults.sh``, read :numref:`Section %s ` or jump to :numref:`Section %s ` to continue configuring the experiment. + +.. _DefaultConfigSection: -Make a copy of ``config.community.sh`` to get started (under ``/regional_workflow/ush``). From the ``ufs-srweather-app`` directory, run: +Default configuration: ``config_defaults.sh`` +------------------------------------------------ + +.. note:: + Users may skip to :numref:`Step %s `. This section provides background information on how the SRW App uses the ``config_defaults.sh`` file, but this information is not necessary for running the SRW. + +Important configuration variables in the ``config_defaults.sh`` file appear in +:numref:`Table %s `. Some of these default values are intentionally invalid in order to ensure that the user assigns valid values in the user-specified configuration ``config.sh`` file. Any settings provided in ``config.sh`` will override the default ``config_defaults.sh`` +settings. There is usually no need for a user to modify the default configuration file. Additional information on the default settings can be found in the file itself and in :numref:`Chapter %s `. + +.. _ConfigVarsDefault: + +.. 
table:: Configuration variables specified in the config_defaults.sh script. + + +----------------------+------------------------------------------------------------+ + | **Group Name** | **Configuration variables** | + +======================+============================================================+ + | Experiment mode | RUN_ENVIR | + +----------------------+------------------------------------------------------------+ + | Machine and queue | MACHINE, ACCOUNT, SCHED, PARTITION_DEFAULT, QUEUE_DEFAULT, | + | | PARTITION_HPSS, QUEUE_HPSS, PARTITION_FCST, QUEUE_FCST | + +----------------------+------------------------------------------------------------+ + | Cron | USE_CRON_TO_RELAUNCH, CRON_RELAUNCH_INTVL_MNTS | + +----------------------+------------------------------------------------------------+ + | Experiment Dir. | EXPT_BASEDIR, EXPT_SUBDIR | + +----------------------+------------------------------------------------------------+ + | NCO mode | COMINgfs, STMP, NET, envir, RUN, PTMP | + +----------------------+------------------------------------------------------------+ + | Separator | DOT_OR_USCORE | + +----------------------+------------------------------------------------------------+ + | File name | EXPT_CONFIG_FN, RGNL_GRID_NML_FN, DATA_TABLE_FN, | + | | DIAG_TABLE_FN, FIELD_TABLE_FN, FV3_NML_BASE_SUITE_FN, | + | | FV3_NML_YALM_CONFIG_FN, FV3_NML_BASE_ENS_FN, | + | | MODEL_CONFIG_FN, NEMS_CONFIG_FN, FV3_EXEC_FN, | + | | WFLOW_XML_FN, GLOBAL_VAR_DEFNS_FN, | + | | EXTRN_MDL_ICS_VAR_DEFNS_FN, EXTRN_MDL_LBCS_VAR_DEFNS_FN, | + | | WFLOW_LAUNCH_SCRIPT_FN, WFLOW_LAUNCH_LOG_FN | + +----------------------+------------------------------------------------------------+ + | Forecast | DATE_FIRST_CYCL, DATE_LAST_CYCL, CYCL_HRS, FCST_LEN_HRS | + +----------------------+------------------------------------------------------------+ + | IC/LBC | EXTRN_MDL_NAME_ICS, EXTRN_MDL_NAME_LBCS, | + | | LBC_SPEC_INTVL_HRS, FV3GFS_FILE_FMT_ICS, | + | | FV3GFS_FILE_FMT_LBCS | + +----------------------+------------------------------------------------------------+ + | NOMADS | NOMADS, NOMADS_file_type | + +----------------------+------------------------------------------------------------+ + | External model | USE_USER_STAGED_EXTRN_FILES, EXTRN_MDL_SOURCE_BASEDRI_ICS, | + | | EXTRN_MDL_FILES_ICS, EXTRN_MDL_SOURCE_BASEDIR_LBCS, | + | | EXTRN_MDL_FILES_LBCS | + +----------------------+------------------------------------------------------------+ + | CCPP | CCPP_PHYS_SUITE | + +----------------------+------------------------------------------------------------+ + | GRID | GRID_GEN_METHOD | + +----------------------+------------------------------------------------------------+ + | ESG grid | ESGgrid_LON_CTR, ESGgrid_LAT_CTR, ESGgrid_DELX, | + | | ESGgrid_DELY, ESGgrid_NX, ESGgrid_NY, | + | | ESGgrid_WIDE_HALO_WIDTH | + +----------------------+------------------------------------------------------------+ + | Input configuration | DT_ATMOS, LAYOUT_X, LAYOUT_Y, BLOCKSIZE, QUILTING, | + | | PRINT_ESMF, WRTCMP_write_groups, | + | | WRTCMP_write_tasks_per_group, WRTCMP_output_grid, | + | | WRTCMP_cen_lon, WRTCMP_cen_lat, WRTCMP_lon_lwr_left, | + | | WRTCMP_lat_lwr_left, WRTCMP_lon_upr_rght, | + | | WRTCMP_lat_upr_rght, WRTCMP_dlon, WRTCMP_dlat, | + | | WRTCMP_stdlat1, WRTCMP_stdlat2, WRTCMP_nx, WRTCMP_ny, | + | | WRTCMP_dx, WRTCMP_dy | + +----------------------+------------------------------------------------------------+ + | Pre-existing grid | PREDEF_GRID_NAME, PREEXISTING_DIR_METHOD, VERBOSE | + 
+----------------------+------------------------------------------------------------+ + | Cycle-independent | RUN_TASK_MAKE_GRID, GRID_DIR, RUN_TASK_MAKE_OROG, | + | | OROG_DIR, RUN_TASK_MAKE_SFC_CLIMO, SFC_CLIMO_DIR | + +----------------------+------------------------------------------------------------+ + | Surface climatology | SFC_CLIMO_FIELDS, FIXgsm, TOPO_DIR, SFC_CLIMO_INPUT_DIR, | + | | FNGLAC, FNMXIC, FNTSFC, FNSNOC, FNZORC, FNAISC, FNSMCC, | + | | FNMSKH, FIXgsm_FILES_TO_COPY_TO_FIXam, | + | | FV3_NML_VARNAME_TO_FIXam_FILES_MAPPING, | + | | FV3_NML_VARNAME_TO_SFC_CLIMO_FIELD_MAPPING, | + | | CYCLEDIR_LINKS_TO_FIXam_FILES_MAPPING | + +----------------------+------------------------------------------------------------+ + | Workflow task | MAKE_GRID_TN, MAKE_OROG_TN, MAKE_SFC_CLIMO_TN, | + | | GET_EXTRN_ICS_TN, GET_EXTRN_LBCS_TN, MAKE_ICS_TN, | + | | MAKE_LBCS_TN, RUN_FCST_TN, RUN_POST_TN | + +----------------------+------------------------------------------------------------+ + | NODE | NNODES_MAKE_GRID, NNODES_MAKE_OROG, NNODES_MAKE_SFC_CLIMO, | + | | NNODES_GET_EXTRN_ICS, NNODES_GET_EXTRN_LBCS, | + | | NNODES_MAKE_ICS, NNODES_MAKE_LBCS, NNODES_RUN_FCST, | + | | NNODES_RUN_POST | + +----------------------+------------------------------------------------------------+ + | MPI processes | PPN_MAKE_GRID, PPN_MAKE_OROG, PPN_MAKE_SFC_CLIMO, | + | | PPN_GET_EXTRN_ICS, PPN_GET_EXTRN_LBCS, PPN_MAKE_ICS, | + | | PPN_MAKE_LBCS, PPN_RUN_FCST, PPN_RUN_POST | + +----------------------+------------------------------------------------------------+ + | Walltime | WTIME_MAKE_GRID, WTIME_MAKE_OROG, WTIME_MAKE_SFC_CLIMO, | + | | WTIME_GET_EXTRN_ICS, WTIME_GET_EXTRN_LBCS, WTIME_MAKE_ICS, | + | | WTIME_MAKE_LBCS, WTIME_RUN_FCST, WTIME_RUN_POST | + +----------------------+------------------------------------------------------------+ + | Maximum attempt | MAXTRIES_MAKE_GRID, MAXTRIES_MAKE_OROG, | + | | MAXTRIES_MAKE_SFC_CLIMO, MAXTRIES_GET_EXTRN_ICS, | + | | MAXTRIES_GET_EXTRN_LBCS, MAXTRIES_MAKE_ICS, | + | | MAXTRIES_MAKE_LBCS, MAXTRIES_RUN_FCST, MAXTRIES_RUN_POST | + +----------------------+------------------------------------------------------------+ + | Post configuration | USE_CUSTOM_POST_CONFIG_FILE, CUSTOM_POST_CONFIG_FP | + +----------------------+------------------------------------------------------------+ + | Running ensembles | DO_ENSEMBLE, NUM_ENS_MEMBERS | + +----------------------+------------------------------------------------------------+ + | Stochastic physics | DO_SHUM, DO_SPPT, DO_SKEB, SHUM_MAG, SHUM_LSCALE, | + | | SHUM_TSCALE, SHUM_INT, SPPT_MAG, SPPT_LSCALE, SPPT_TSCALE, | + | | SPPT_INT, SKEB_MAG, SKEB_LSCALE, SKEP_TSCALE, SKEB_INT, | + | | SKEB_VDOF, USE_ZMTNBLCK | + +----------------------+------------------------------------------------------------+ + | Boundary blending | HALO_BLEND | + +----------------------+------------------------------------------------------------+ + | FVCOM | USE_FVCOM, FVCOM_DIR, FVCOM_FILE | + +----------------------+------------------------------------------------------------+ + | Compiler | COMPILER | + +----------------------+------------------------------------------------------------+ + + +.. _UserSpecificConfig: + +User-specific configuration: ``config.sh`` +-------------------------------------------- + +The user must specify certain basic information about the experiment in a ``config.sh`` file located in the ``ufs-srweather-app/regional_workflow/ush`` directory. 
Two example templates are provided in the ``ufs-srweather-app/regional_workflow/ush`` directory: ``config.community.sh`` and ``config.nco.sh``. The first file is a minimal example for creating and running an experiment in the *community* mode (with ``RUN_ENVIR`` set to ``community``). The second is an example for creating and running an experiment in the *NCO* (operational) mode (with ``RUN_ENVIR`` set to ``nco``). The *community* mode is recommended in most cases and will be fully supported for this release. The operational/NCO mode will typically be used by those at the NOAA/NCEP/Environmental Modeling Center (EMC) and the NOAA/Global Systems Laboratory (GSL) working on pre-implementation testing. :numref:`Table %s ` shows the configuration variables, along with their default values in ``config_default.sh`` and the values defined in ``config.community.sh``. + +.. _ConfigCommunity: + +.. table:: Configuration variables specified in the config.community.sh script + + +--------------------------------+-------------------+--------------------------------------------------------+ + | **Parameter** | **Default Value** | **config.community.sh Value** | + +================================+===================+========================================================+ + | MACHINE | "BIG_COMPUTER" | "hera" | + +--------------------------------+-------------------+--------------------------------------------------------+ + | ACCOUNT | "project_name" | "an_account" | + +--------------------------------+-------------------+--------------------------------------------------------+ + | EXPT_SUBDIR | "" | "test_CONUS_25km_GFSv15p2" | + +--------------------------------+-------------------+--------------------------------------------------------+ + | VERBOSE | "TRUE" | "TRUE" | + +--------------------------------+-------------------+--------------------------------------------------------+ + | RUN_ENVIR | "nco" | "community" | + +--------------------------------+-------------------+--------------------------------------------------------+ + | PREEXISTING_DIR_METHOD | "delete" | "rename" | + +--------------------------------+-------------------+--------------------------------------------------------+ + | PREDEF_GRID_NAME | "" | "RRFS_CONUS_25km" | + +--------------------------------+-------------------+--------------------------------------------------------+ + | GRID_GEN_METHOD | "ESGgrid" | "ESGgrid" | + +--------------------------------+-------------------+--------------------------------------------------------+ + | QUILTING | "TRUE" | "TRUE" | + +--------------------------------+-------------------+--------------------------------------------------------+ + | CCPP_PHYS_SUITE | "FV3_GSD_V0" | "FV3_GFS_v15p2" | + +--------------------------------+-------------------+--------------------------------------------------------+ + | FCST_LEN_HRS | "24" | "48" | + +--------------------------------+-------------------+--------------------------------------------------------+ + | LBC_SPEC_INTVL_HRS | "6" | "6" | + +--------------------------------+-------------------+--------------------------------------------------------+ + | DATE_FIRST_CYCL | "YYYYMMDD" | "20190615" | + +--------------------------------+-------------------+--------------------------------------------------------+ + | DATE_LAST_CYCL | "YYYYMMDD" | "20190615" | + +--------------------------------+-------------------+--------------------------------------------------------+ + | CYCL_HRS | ("HH1" "HH2") | "00" | + 
+--------------------------------+-------------------+--------------------------------------------------------+ + | EXTRN_MDL_NAME_ICS | "FV3GFS" | "FV3GFS" | + +--------------------------------+-------------------+--------------------------------------------------------+ + | EXTRN_MDL_NAME_LBCS | "FV3GFS" | "FV3GFS" | + +--------------------------------+-------------------+--------------------------------------------------------+ + | FV3GFS_FILE_FMT_ICS | "nemsio" | "grib2" | + +--------------------------------+-------------------+--------------------------------------------------------+ + | FV3GFS_FILE_FMT_LBCS | "nemsio" | "grib2" | + +--------------------------------+-------------------+--------------------------------------------------------+ + | WTIME_RUN_FCST | "04:30:00" | "01:00:00" | + +--------------------------------+-------------------+--------------------------------------------------------+ + | USE_USER_STAGED_EXTRN_FILES | "FALSE" | "TRUE" | + +--------------------------------+-------------------+--------------------------------------------------------+ + | EXTRN_MDL_SOURCE_BASE_DIR_ICS | "" | "/scratch2/BMC/det/UFS_SRW_app/v1p0/model_data/FV3GFS" | + +--------------------------------+-------------------+--------------------------------------------------------+ + | EXTRN_MDL_FILES_ICS | "" | "gfs.pgrb2.0p25.f000" | + +--------------------------------+-------------------+--------------------------------------------------------+ + | EXTRN_MDL_SOURCE_BASEDIR_LBCS | "" | "/scratch2/BMC/det/UFS_SRW_app/v1p0/model_data/FV3GFS" | + +--------------------------------+-------------------+--------------------------------------------------------+ + | EXTRN_MDL_FILES_LBCS | "" | "gfs.pgrb2.0p25.f006" | + +--------------------------------+-------------------+--------------------------------------------------------+ + + +To get started, make a copy of ``config.community.sh`` (under ``/regional_workflow/ush``). From the ``ufs-srweather-app`` directory, run: .. code-block:: console - cd ../regional_workflow/ush + cd regional_workflow/ush cp config.community.sh config.sh The default settings in this file include a predefined 25-km :term:`CONUS` grid (RRFS_CONUS_25km), the :term:`GFS` v15.2 physics suite (FV3_GFS_v15p2 CCPP), and :term:`FV3`-based GFS raw external model data for initialization. -Next, edit the new ``config.sh`` file to customize it for your machine. At a minimum, change the ``MACHINE`` and ``ACCOUNT`` variables; then choose a name for the experiment directory by setting ``EXPT_SUBDIR``. If you have pre-staged the initialization data for the experiment, set ``USE_USER_STAGED_EXTRN_FILES="TRUE"``, and set the paths to the data for ``EXTRN_MDL_SOURCE_BASEDIR_ICS`` and ``EXTRN_MDL_SOURCE_BASEDIR_LBCS``. For example: +Next, edit the new ``config.sh`` file to customize it for your machine. At a minimum, change the ``MACHINE`` and ``ACCOUNT`` variables; then choose a name for the experiment directory by setting ``EXPT_SUBDIR``. If you have pre-staged the initialization data for the experiment, set ``USE_USER_STAGED_EXTRN_FILES="TRUE"``, and set the paths to the data for ``EXTRN_MDL_SOURCE_BASEDIR_ICS`` and ``EXTRN_MDL_SOURCE_BASEDIR_LBCS``. -.. code-block:: console +Sample settings are indicated below for Level 1 platforms. Detailed guidance applicable to all systems can be found in :numref:`Chapter %s: Configuring the Workflow `, which discusses each variable and the options available. 
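
For example, on a hypothetical Level 1 system the edited lines might look like the following sketch; the account name, experiment name, and data paths shown here are placeholders and must be replaced with values valid on the user's system:

.. code-block:: console

   MACHINE="hera"
   ACCOUNT="an_account"                  # replace with a valid project/allocation name
   EXPT_SUBDIR="test_CONUS_25km_GFSv15p2"
   USE_USER_STAGED_EXTRN_FILES="TRUE"
   EXTRN_MDL_SOURCE_BASEDIR_ICS="/path/to/model_data/FV3GFS"
   EXTRN_MDL_FILES_ICS=( "gfs.pgrb2.0p25.f000" )
   EXTRN_MDL_SOURCE_BASEDIR_LBCS="/path/to/model_data/FV3GFS"
   EXTRN_MDL_FILES_LBCS=( "gfs.pgrb2.0p25.f006" "gfs.pgrb2.0p25.f012" )
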
Additionally, information about the three predefined Limited Area Model (LAM) Grid options can be found in :numref:`Chapter %s: Limited Area Model (LAM) Grids `. - MACHINE="SINGULARITY" - ACCOUNT="none" - EXPT_SUBDIR="GST" - EXPT_BASEDIR="home/$USER/expt_dirs" - COMPILER="gnu" +.. important:: -Sample settings are indicated below for Level 1 platforms. Detailed guidance applicable to all systems can be found in :numref:`Chapter %s: Configuring the Workflow `, which discusses each variable and the options available. Additionally, information about the three predefined Limited Area Model (LAM) Grid options can be found in :numref:`Chapter %s: Limited Area Model (LAM) Grids `. + If you set up the build environment with the GNU compiler in :numref:`Section %s `, you will have to add the line ``COMPILER="gnu"`` to the ``config.sh`` file. -.. Important:: +.. hint:: - If you set up the build environment with the GNU compiler in :numref:`Section %s `, you will have to add the line ``COMPILER="gnu"`` to the ``config.sh`` file. + To determine an appropriate ACCOUNT field for Level 1 systems, run ``groups``, and it will return a list of projects you have permissions for. Not all of the listed projects/groups have an HPC allocation, but those that do are potentially valid account names. Minimum parameter settings for Level 1 machines: @@ -191,11 +530,18 @@ For **WCOSS**, edit ``config.sh`` with these WCOSS-specific parameters, and use EXPT_BASEDIR="lustre/$USER/expt_dirs" COMPILER="gnu" USE_USER_STAGED_EXTRN_FILES="TRUE" - EXTRN_MDL_SOURCE_BASEDIR_ICS="/contrib/GST/model_data/FV3GFS" + EXTRN_MDL_SOURCE_BASEDIR_ICS="/contrib/EPIC/model_data/FV3GFS" EXTRN_MDL_FILES_ICS=( "gfs.pgrb2.0p25.f000" ) - EXTRN_MDL_SOURCE_BASEDIR_LBCS="/contrib/GST/model_data/FV3GFS" + EXTRN_MDL_SOURCE_BASEDIR_LBCS="/contrib/EPIC/model_data/FV3GFS" EXTRN_MDL_FILES_LBCS=( "gfs.pgrb2.0p25.f006" "gfs.pgrb2.0p25.f012" ) +.. note:: + + The values of the configuration variables should be consistent with those in the + ``valid_param_vals script``. In addition, various example configuration files can be + found in the ``regional_workflow/tests/baseline_configs`` directory. + + .. _SetUpPythonEnv: @@ -221,15 +567,26 @@ This command will activate the ``regional_workflow``. The user should see ``(reg Generate the Regional Workflow ------------------------------------------- -Run the following command to generate the workflow: +Run the following command from the ``ufs-srweather-app/regional_workflow/ush`` directory to generate the workflow: .. code-block:: console ./generate_FV3LAM_wflow.sh -The last line of output from this script, starting with ``*/1 * * * *``, can be saved and :ref:`used later ` to automatically run portions of the workflow. +The last line of output from this script, starting with ``*/1 * * * *`` or ``*/3 * * * *``, can be saved and :ref:`used later ` to automatically run portions of the workflow. + +This workflow generation script creates an experiment directory and populates it with all the data needed to run through the workflow. The flowchart in :numref:`Figure %s ` describes the experiment generation process. First, ``generate_FV3LAM_wflow.sh`` runs the ``setup.sh`` script to set the configuration parameters. Second, it copies the time-independent (fix) files and other necessary data input files from their location in the ufs-weather-model directory to the experiment directory (``$EXPTDIR``). 
Third, it copies the weather model executable (``ufs_model``) from the ``bin`` directory to ``$EXPTDIR`` and creates the input namelist file ``input.nml`` based on the ``input.nml.FV3`` file in the regional_workflow/ush/templates directory. Lastly, it creates the workflow XML file ``FV3LAM_wflow.xml`` that is executed when running the experiment with the Rocoto workflow manager. + +The ``setup.sh`` script reads three other configuration scripts in order: (1) ``config_default.sh`` (:numref:`Section %s `), (2) ``config.sh`` (:numref:`Section %s `), and (3) ``set_predef_grid_params.sh`` (:numref:`Section %s `). If a parameter is specified differently in these scripts, the file containing the last defined value will be used. + +The generated workflow will appear in ``$EXPTDIR``, where ``EXPTDIR=${EXPT_BASEDIR}/${EXPT_SUBDIR}``. These variables were specified in the ``config.sh`` file in :numref:`Step %s `. The settings for these paths can also be viewed in the console output from the ``./generate_FV3LAM_wflow.sh`` script or in the ``log.generate_FV3LAM_wflow`` file, which can be found in ``$EXPTDIR``. + +.. _WorkflowGeneration: + +.. figure:: _static/FV3regional_workflow_gen.png + + *Experiment generation description* -This workflow generation script creates an experiment directory and populates it with all the data needed to run through the workflow. The generated workflow will be in ``$EXPTDIR``, where ``EXPTDIR=${EXPT_BASEDIR}/${EXPT_SUBDIR}``. These variables were specified in the ``config.sh`` file in :numref:`Step %s `. The settings for these paths can also be viewed in the console output from the ``./generate_FV3LAM_wflow.sh`` script or in the ``log.generate_FV3LAM_wflow`` file, which can be found in $EXPTDIR. An environment variable can be set to navigate to the ``$EXPTDIR`` more easily. If the login shell is bash, it can be set as follows: @@ -239,9 +596,11 @@ An environment variable can be set to navigate to the ``$EXPTDIR`` more easily. If the login shell is csh/tcsh, replace ``export`` with ``setenv`` in the command above. +.. _RocotoRun: + Run the Workflow Using Rocoto ============================= -The information in this section assumes that Rocoto is available on the desired platform. Rocoto cannot be used when running the workflow within a container. If Rocoto is not available, it is still possible to run the workflow using stand-alone scripts described in :numref:`Section %s `. There are two main ways to run the workflow with Rocoto: using the ``./launch_FV3LAM_wflow.sh`` or by hand. +The information in this section assumes that Rocoto is available on the desired platform. (Note that Rocoto cannot be used when running the workflow within a container.) If Rocoto is not available, it is still possible to run the workflow using stand-alone scripts described in :numref:`Section %s `. There are two main ways to run the workflow with Rocoto: using the ``./launch_FV3LAM_wflow.sh`` or by hand. Launch the Rocoto Workflow Using a Script ----------------------------------------------- @@ -361,7 +720,7 @@ If the experiment fails, the ``rocotostat`` command will indicate which task fai Additional Options ---------------------- -For automatic resubmission of the workflow at regular intervals (e.g., every minute), the user can add a crontab entry by entering the ``crontab -e`` command, which opens a crontab file. As mentioned in `Section %s `, the last line of output from ``./generate_FV3LAM_wflow.sh`` (starting with ``*/1 * * * *``), can be pasted into the crontab file. 
It can also be found in the``$EXPTDIR/log.generate_FV3LAM_wflow`` file. The crontab entry should resemble the following: +For automatic resubmission of the workflow at regular intervals (e.g., every minute), the user can add a crontab entry by entering the ``crontab -e`` command, which opens a crontab file. As mentioned in :ref:`Section %s `, the last line of output from ``./generate_FV3LAM_wflow.sh`` (starting with ``*/1 * * * *``), can be pasted into the crontab file. It can also be found in the``$EXPTDIR/log.generate_FV3LAM_wflow`` file. The crontab entry should resemble the following: .. code-block:: console diff --git a/docs/UsersGuide/source/SRWAppOverview.rst b/docs/UsersGuide/source/SRWAppOverview.rst index 0e320c7dd8..2d846bb5da 100644 --- a/docs/UsersGuide/source/SRWAppOverview.rst +++ b/docs/UsersGuide/source/SRWAppOverview.rst @@ -1,440 +1,15 @@ .. _SRWAppOverview: =========================================================== -Building and Running the Short-Range Weather Application +Overview of the Short-Range Weather Application Workflow =========================================================== -The UFS Short-Range Weather Application (SRW App) is an :term:`umbrella repository` consisting of a number of different :ref:`components ` housed in external repositories. The SRW App assembles the required components using the ``manage_externals/checkout_externals`` script. Once the -build process is complete, all the files and executables necessary for a regional experiment are -located in the ``regional_workflow`` and ``bin`` directories, respectively, under the ``ufs-srweather-app`` directory. Users can utilize the pre-defined domains (grids) or build their own domain (see :numref:`Chapter %s ` for details). In either case, users must create/modify the case-specific (``config.sh``) and/or grid-specific configuration files (``set_predef_grid_params.sh``). The overall procedure is shown in :numref:`Figure %s `, with the scripts to generate and run the workflow shown in red. The steps are as follows: - - * :ref:`Clone the SRW App from GitHub. ` - * :ref:`Check out the external repositories. ` - * :ref:`Set up the build environment and build the regional workflow system. ` - * :ref:`Optional: Configure a new grid. ` - * :ref:`Configure the experiment. ` - * :ref:`Load the python environment for the regional workflow. ` - * :ref:`Generate a regional workflow experiment. ` - * :ref:`Run the regional workflow. ` - -Each step will be described in detail in the following sections. - -.. _AppOverallProc: - -.. figure:: _static/FV3LAM_wflow_overall.png - - *Overall layout of the SRW App* - -.. _DownloadSRWApp: - -Download the SRW App -======================== -Retrieve the UFS Short Range Weather Application (SRW App) repository from GitHub and checkout the ``ufs-v1.0.0`` tag: - -.. code-block:: console - - git clone -b ufs-v1.0.0 https://github.com/ufs-community/ufs-srweather-app.git - cd ufs-srweather-app - -.. - COMMENT: Change version number in 2 places above! - -The cloned repository contains the configuration files and sub-directories shown in -:numref:`Table %s `. - -.. _FilesAndSubDirs: - -.. 
table:: Files and sub-directories of the ufs-srweather-app repository - - +--------------------------------+--------------------------------------------------------+ - | **File/directory Name** | **Description** | - +================================+========================================================+ - | CMakeLists.txt | Main cmake file for SRW App | - +--------------------------------+--------------------------------------------------------+ - | Externals.cfg | Tags of the GitHub repositories/branches for the | - | | external repositories | - +--------------------------------+--------------------------------------------------------+ - | LICENSE.md | CC0 license information | - +--------------------------------+--------------------------------------------------------+ - | README.md | Getting Started Guide | - +--------------------------------+--------------------------------------------------------+ - | ufs_srweather_app_meta.h.in | Meta information for SRW App which can be used by | - | | other packages | - +--------------------------------+--------------------------------------------------------+ - | ufs_srweather_app.settings.in | SRW App configuration summary | - +--------------------------------+--------------------------------------------------------+ - | env | Contains build and workflow environment files | - +--------------------------------+--------------------------------------------------------+ - | docs | Contains release notes, documentation, and Users' Guide| - +--------------------------------+--------------------------------------------------------+ - | manage_externals | Utility for checking out external repositories | - +--------------------------------+--------------------------------------------------------+ - | src | Contains CMakeLists.txt; external repositories | - | | will be cloned in this directory. | - +--------------------------------+--------------------------------------------------------+ - -.. _CheckoutExternals: - -External Components -=================== -Retrieve required components from external repositories, including regional_workflow, ufs-weather-model, ufs_utils, and upp.x: - -.. code-block:: console - - ./manage_externals/checkout_externals - -This step will use the configuration file ``Externals.cfg`` in the ``ufs-srweather-app`` directory to clone the correct tags (code versions) of the external repositories as listed in :numref:`Section %s `. - -.. _BuildExecutables: - -Building the Executables for the Application -============================================ -Before building the executables, the build environment must be set up for your specific platform. Instructions for loading the proper modules and/or setting the correct environment variables can be found in the ``env/`` directory in files named ``build__.env.`` For the most part, the commands in those files can be directly copied and pasted, but you may need to modify certain variables such as the path to NCEP libraries for your specific platform. Here is a directory listing example of these kinds of files: - -.. code-block:: console - - $ ls -l env/ - -rw-rw-r-- 1 user ral 1228 Oct 9 10:09 build_cheyenne_intel.env - -rw-rw-r-- 1 user ral 1134 Oct 9 10:09 build_hera_intel.env - -rw-rw-r-- 1 user ral 1228 Oct 9 10:09 build_jet_intel.env - ... - -The following steps will build the pre-processing utilities, forecast model, and post-processor: - -.. code-block:: console - - mkdir build - cd build - cmake .. -DCMAKE_INSTALL_PREFIX=.. 
- make -j 4 >& build.out & - -where ``-DCMAKE_INSTALL_PREFIX`` specifies the location in which the ``bin``, ``include``, ``lib``, and ``share`` directories containing various components of the SRW App will be created, and its recommended value ``..`` denotes one directory up from the build directory. In the next line, the ``make`` call argument ``-j 4`` indicates that the build will run in parallel with 4 threads. If this step is successful, the executables listed in :numref:`Table %s ` will be located in the ``ufs-srweather-app/bin`` directory. - -.. _ExecDescription: - -.. table:: Names and descriptions of the executables produced by the build step and used by the SRW App. - - +------------------------+---------------------------------------------------------------------------------+ - | **Executable Name** | **Description** | - +========================+=================================================================================+ - | chgres_cube | Reads in raw external model (global or regional) and surface climatology data | - | | to create initial and lateral boundary conditions | - +------------------------+---------------------------------------------------------------------------------+ - | filter_topo | Filters topography based on resolution | - +------------------------+---------------------------------------------------------------------------------+ - | global_equiv_resol | Calculates a global, uniform, cubed-sphere equivalent resolution for the | - | | regional Extended Schmidt Gnomonic (ESG) grid | - +------------------------+---------------------------------------------------------------------------------+ - | make_solo_mosaic | Creates mosaic files with halos | - +------------------------+---------------------------------------------------------------------------------+ - | upp.x | Post-processor for the model output | - +------------------------+---------------------------------------------------------------------------------+ - | ufs_model | UFS Weather Model executable | - +------------------------+---------------------------------------------------------------------------------+ - | orog | Generates orography, land mask, and gravity wave drag files from fixed files | - +------------------------+---------------------------------------------------------------------------------+ - | regional_esg_grid | Generates an ESG regional grid based on a user-defined namelist | - +------------------------+---------------------------------------------------------------------------------+ - | sfc_climo_gen | Creates surface climatology fields from fixed files for use in ``chgres_cube`` | - +------------------------+---------------------------------------------------------------------------------+ - | shave | Shaves the excess halo rows down to what is required for the LBCs in the | - | | orography and grid files | - +------------------------+---------------------------------------------------------------------------------+ - | vcoord_gen | Generates hybrid coordinate interface profiles | - +------------------------+---------------------------------------------------------------------------------+ - | fvcom_to_FV3 | | - +------------------------+---------------------------------------------------------------------------------+ - | make_hgrid | Computes geo-referencing parameters (e.g., latitude, longitude, grid cell area) | - | | for global uniform grids | - +------------------------+---------------------------------------------------------------------------------+ - | 
emcsfc_ice_blend | Blends National Ice Center sea ice cover and EMC sea ice concentration data to | - | | create a global sea ice analysis used to update the GFS once per day | - +------------------------+---------------------------------------------------------------------------------+ - | emcsfc_snow2mdl | Blends National Ice Center snow cover and Air Force snow depth data to create a | - | | global depth analysis used to update the GFS snow field once per day | - +------------------------+---------------------------------------------------------------------------------+ - | global_cycle | Updates the GFS surface conditions using external snow and sea ice analyses | - +------------------------+---------------------------------------------------------------------------------+ - | inland | Create an inland land mask | - +------------------------+---------------------------------------------------------------------------------+ - | orog_gsl | Ceates orographic statistics fields required for the orographic drag suite | - | | developed by NOAA's Global Systems Laboratory (GSL) | - +------------------------+---------------------------------------------------------------------------------+ - | fregrid | Remaps data from the input mosaic grid to the output mosaic grid | - +------------------------+---------------------------------------------------------------------------------+ - | lakefrac | Set lake fraction and depth | - +------------------------+---------------------------------------------------------------------------------+ - -.. _GridSpecificConfig: - -Grid-specific Configuration -=========================== - -Some SRW App parameters depend on the characteristics of the grid such as resolution and domain size. These include ``ESG grid`` and ``Input configuration`` as well as the variables related to the write component (quilting). The SRW App officially supports three different predefined grids as shown in :numref:`Table %s `. Their names can be found under ``valid_vals_PREDEF_GRID_NAME`` in the ``valid_param_vals`` script, and their grid-specific configuration variables are specified in the ``set_predef_grid_params`` script. If users want to create a new domain, they should put its name in the ``valid_param_vals`` script and the corresponding grid-specific parameters in the ``set_predef_grid_params`` script. More information on the predefined and user-generated options -can be found in :numref:`Chapter %s `. - -.. _PredefinedGrids: - -.. table:: Predefined grids in the SRW App. - - +----------------------+-------------------+--------------------------------+ - | **Grid Name** | **Grid Type** | **Quilting (write component)** | - +======================+===================+================================+ - | RRFS_CONUS_25km | ESG grid | lambert_conformal | - +----------------------+-------------------+--------------------------------+ - | RRFS_CONUS_13km | ESG grid | lambert_conformal | - +----------------------+-------------------+--------------------------------+ - | RRFS_CONUS_3km | ESG grid | lambert_conformal | - +----------------------+-------------------+--------------------------------+ - -Case-specific Configuration -============================= - -When generating a new experiment (described in detail in :numref:`Section %s `), the SRW App first reads and assigns default values from the ``config_defaults.sh`` file. Then, it reads and (re)assigns variables from the user's custom ``config.sh`` file, located in the -``ufs-srweather-app/regional_workflow/ush`` directory. - -.. 
_DefaultConfigSection: - -Default configuration: ``config_defaults.sh`` ------------------------------------------------- -Important configuration variables in the ``config_defaults.sh`` file appear in -:numref:`Table %s `. Some of these default values are intentionally invalid in order to ensure that the user assigns valid values in the user-specified configuration ``config.sh`` file. Any settings provided in ``config.sh`` will override the default ``config_defaults.sh`` -settings. Note that there is usually no need for a user to modify the default configuration file. Additional information on the default settings can be found in the file itself and -in :numref:`Chapter %s `. - -.. _ConfigVarsDefault: - -.. table:: Configuration variables specified in the config_defaults.sh script. - - +----------------------+------------------------------------------------------------+ - | **Group Name** | **Configuration variables** | - +======================+============================================================+ - | Experiment mode | RUN_ENVIR | - +----------------------+------------------------------------------------------------+ - | Machine and queue | MACHINE, ACCOUNT, SCHED, PARTITION_DEFAULT, QUEUE_DEFAULT, | - | | PARTITION_HPSS, QUEUE_HPSS, PARTITION_FCST, QUEUE_FCST | - +----------------------+------------------------------------------------------------+ - | Cron | USE_CRON_TO_RELAUNCH, CRON_RELAUNCH_INTVL_MNTS | - +----------------------+------------------------------------------------------------+ - | Experiment Dir. | EXPT_BASEDIR, EXPT_SUBDIR | - +----------------------+------------------------------------------------------------+ - | NCO mode | COMINgfs, STMP, NET, envir, RUN, PTMP | - +----------------------+------------------------------------------------------------+ - | Separator | DOT_OR_USCORE | - +----------------------+------------------------------------------------------------+ - | File name | EXPT_CONFIG_FN, RGNL_GRID_NML_FN, DATA_TABLE_FN, | - | | DIAG_TABLE_FN, FIELD_TABLE_FN, FV3_NML_BASE_SUITE_FN, | - | | FV3_NML_YALM_CONFIG_FN, FV3_NML_BASE_ENS_FN, | - | | MODEL_CONFIG_FN, NEMS_CONFIG_FN, FV3_EXEC_FN, | - | | WFLOW_XML_FN, GLOBAL_VAR_DEFNS_FN, | - | | EXTRN_MDL_ICS_VAR_DEFNS_FN, EXTRN_MDL_LBCS_VAR_DEFNS_FN, | - | | WFLOW_LAUNCH_SCRIPT_FN, WFLOW_LAUNCH_LOG_FN | - +----------------------+------------------------------------------------------------+ - | Forecast | DATE_FIRST_CYCL, DATE_LAST_CYCL, CYCL_HRS, FCST_LEN_HRS | - +----------------------+------------------------------------------------------------+ - | IC/LBC | EXTRN_MDL_NAME_ICS, EXTRN_MDL_NAME_LBCS, | - | | LBC_SPEC_INTVL_HRS, FV3GFS_FILE_FMT_ICS, | - | | FV3GFS_FILE_FMT_LBCS | - +----------------------+------------------------------------------------------------+ - | NOMADS | NOMADS, NOMADS_file_type | - +----------------------+------------------------------------------------------------+ - | External model | USE_USER_STAGED_EXTRN_FILES, EXTRN_MDL_SOURCE_BASEDRI_ICS, | - | | EXTRN_MDL_FILES_ICS, EXTRN_MDL_SOURCE_BASEDIR_LBCS, | - | | EXTRN_MDL_FILES_LBCS | - +----------------------+------------------------------------------------------------+ - | CCPP | CCPP_PHYS_SUITE | - +----------------------+------------------------------------------------------------+ - | GRID | GRID_GEN_METHOD | - +----------------------+------------------------------------------------------------+ - | ESG grid | ESGgrid_LON_CTR, ESGgrid_LAT_CTR, ESGgrid_DELX, | - | | ESGgrid_DELY, ESGgrid_NX, ESGgrid_NY, | - | | 
ESGgrid_WIDE_HALO_WIDTH | - +----------------------+------------------------------------------------------------+ - | Input configuration | DT_ATMOS, LAYOUT_X, LAYOUT_Y, BLOCKSIZE, QUILTING, | - | | PRINT_ESMF, WRTCMP_write_groups, | - | | WRTCMP_write_tasks_per_group, WRTCMP_output_grid, | - | | WRTCMP_cen_lon, WRTCMP_cen_lat, WRTCMP_lon_lwr_left, | - | | WRTCMP_lat_lwr_left, WRTCMP_lon_upr_rght, | - | | WRTCMP_lat_upr_rght, WRTCMP_dlon, WRTCMP_dlat, | - | | WRTCMP_stdlat1, WRTCMP_stdlat2, WRTCMP_nx, WRTCMP_ny, | - | | WRTCMP_dx, WRTCMP_dy | - +----------------------+------------------------------------------------------------+ - | Pre-existing grid | PREDEF_GRID_NAME, PREEXISTING_DIR_METHOD, VERBOSE | - +----------------------+------------------------------------------------------------+ - | Cycle-independent | RUN_TASK_MAKE_GRID, GRID_DIR, RUN_TASK_MAKE_OROG, | - | | OROG_DIR, RUN_TASK_MAKE_SFC_CLIMO, SFC_CLIMO_DIR | - +----------------------+------------------------------------------------------------+ - | Surface climatology | SFC_CLIMO_FIELDS, FIXgsm, TOPO_DIR, SFC_CLIMO_INPUT_DIR, | - | | FNGLAC, FNMXIC, FNTSFC, FNSNOC, FNZORC, FNAISC, FNSMCC, | - | | FNMSKH, FIXgsm_FILES_TO_COPY_TO_FIXam, | - | | FV3_NML_VARNAME_TO_FIXam_FILES_MAPPING, | - | | FV3_NML_VARNAME_TO_SFC_CLIMO_FIELD_MAPPING, | - | | CYCLEDIR_LINKS_TO_FIXam_FILES_MAPPING | - +----------------------+------------------------------------------------------------+ - | Workflow task | MAKE_GRID_TN, MAKE_OROG_TN, MAKE_SFC_CLIMO_TN, | - | | GET_EXTRN_ICS_TN, GET_EXTRN_LBCS_TN, MAKE_ICS_TN, | - | | MAKE_LBCS_TN, RUN_FCST_TN, RUN_POST_TN | - +----------------------+------------------------------------------------------------+ - | NODE | NNODES_MAKE_GRID, NNODES_MAKE_OROG, NNODES_MAKE_SFC_CLIMO, | - | | NNODES_GET_EXTRN_ICS, NNODES_GET_EXTRN_LBCS, | - | | NNODES_MAKE_ICS, NNODES_MAKE_LBCS, NNODES_RUN_FCST, | - | | NNODES_RUN_POST | - +----------------------+------------------------------------------------------------+ - | MPI processes | PPN_MAKE_GRID, PPN_MAKE_OROG, PPN_MAKE_SFC_CLIMO, | - | | PPN_GET_EXTRN_ICS, PPN_GET_EXTRN_LBCS, PPN_MAKE_ICS, | - | | PPN_MAKE_LBCS, PPN_RUN_FCST, PPN_RUN_POST | - +----------------------+------------------------------------------------------------+ - | Walltime | WTIME_MAKE_GRID, WTIME_MAKE_OROG, WTIME_MAKE_SFC_CLIMO, | - | | WTIME_GET_EXTRN_ICS, WTIME_GET_EXTRN_LBCS, WTIME_MAKE_ICS, | - | | WTIME_MAKE_LBCS, WTIME_RUN_FCST, WTIME_RUN_POST | - +----------------------+------------------------------------------------------------+ - | Maximum attempt | MAXTRIES_MAKE_GRID, MAXTRIES_MAKE_OROG, | - | | MAXTRIES_MAKE_SFC_CLIMO, MAXTRIES_GET_EXTRN_ICS, | - | | MAXTRIES_GET_EXTRN_LBCS, MAXTRIES_MAKE_ICS, | - | | MAXTRIES_MAKE_LBCS, MAXTRIES_RUN_FCST, MAXTRIES_RUN_POST | - +----------------------+------------------------------------------------------------+ - | Post configuration | USE_CUSTOM_POST_CONFIG_FILE, CUSTOM_POST_CONFIG_FP | - +----------------------+------------------------------------------------------------+ - | Running ensembles | DO_ENSEMBLE, NUM_ENS_MEMBERS | - +----------------------+------------------------------------------------------------+ - | Stochastic physics | DO_SHUM, DO_SPPT, DO_SKEB, SHUM_MAG, SHUM_LSCALE, | - | | SHUM_TSCALE, SHUM_INT, SPPT_MAG, SPPT_LSCALE, SPPT_TSCALE, | - | | SPPT_INT, SKEB_MAG, SKEB_LSCALE, SKEP_TSCALE, SKEB_INT, | - | | SKEB_VDOF, USE_ZMTNBLCK | - +----------------------+------------------------------------------------------------+ - | Boundary blending | 
HALO_BLEND | - +----------------------+------------------------------------------------------------+ - | FVCOM | USE_FVCOM, FVCOM_DIR, FVCOM_FILE | - +----------------------+------------------------------------------------------------+ - | Compiler | COMPILER | - +----------------------+------------------------------------------------------------+ - -.. _UserSpecificConfig: - -User-specific configuration: ``config.sh`` ------------------------------------------- -The user must create a ``config.sh`` file in the ``ufs-srweather-app/regional_workflow/ush`` directory by copying either of the example configuration files (``config.community.sh`` for the community mode or ``config.nco.sh`` for the operational mode). Alternatively, the user can create a custom ``config.sh`` file from scratch. Note that the *community mode* is recommended in most cases and will be fully supported for this release while the operational/NCO mode will typically be used by those at the NOAA/NCEP/Environmental Modeling Center (EMC) and the NOAA/Global Systems Laboratory (GSL) working on pre-implementation testing. :numref:`Table %s ` shows the configuration variables, along with their default values in ``config_default.sh`` and the values defined in ``config.community.sh``. - -.. note:: - - The values of the configuration variables should be consistent with those in the - ``valid_param_vals script``. In addition, various example configuration files can be - found in the ``regional_workflow/tests/baseline_configs`` directory. - -.. _ConfigCommunity: - -.. table:: Configuration variables specified in the config.community.sh script - - +--------------------------------+-------------------+--------------------------------------------------------+ - | **Parameter** | **Default Value** | **``config.community.sh`` Value** | - +================================+===================+========================================================+ - | MACHINE | "BIG_COMPUTER" | "hera" | - +--------------------------------+-------------------+--------------------------------------------------------+ - | ACCOUNT | "project_name" | "an_account" | - +--------------------------------+-------------------+--------------------------------------------------------+ - | EXPT_SUBDIR | "" | "test_CONUS_25km_GFSv15p2" | - +--------------------------------+-------------------+--------------------------------------------------------+ - | VERBOSE | "TRUE" | "TRUE" | - +--------------------------------+-------------------+--------------------------------------------------------+ - | RUN_ENVIR | "nco" | "community" | - +--------------------------------+-------------------+--------------------------------------------------------+ - | PREEXISTING_DIR_METHOD | "delete" | "rename" | - +--------------------------------+-------------------+--------------------------------------------------------+ - | PREDEF_GRID_NAME | "" | "RRFS_CONUS_25km" | - +--------------------------------+-------------------+--------------------------------------------------------+ - | GRID_GEN_METHOD | "ESGgrid" | "ESGgrid" | - +--------------------------------+-------------------+--------------------------------------------------------+ - | QUILTING | "TRUE" | "TRUE" | - +--------------------------------+-------------------+--------------------------------------------------------+ - | CCPP_PHYS_SUITE | "FV3_GSD_V0" | "FV3_GFS_v15p2" | - +--------------------------------+-------------------+--------------------------------------------------------+ - | FCST_LEN_HRS | "24" | "48" | - 
+--------------------------------+-------------------+--------------------------------------------------------+ - | LBC_SPEC_INTVL_HRS | "6" | "6" | - +--------------------------------+-------------------+--------------------------------------------------------+ - | DATE_FIRST_CYCL | "YYYYMMDD" | "20190615" | - +--------------------------------+-------------------+--------------------------------------------------------+ - | DATE_LAST_CYCL | "YYYYMMDD" | "20190615" | - +--------------------------------+-------------------+--------------------------------------------------------+ - | CYCL_HRS | ("HH1" "HH2") | "00" | - +--------------------------------+-------------------+--------------------------------------------------------+ - | EXTRN_MDL_NAME_ICS | "FV3GFS" | "FV3GFS" | - +--------------------------------+-------------------+--------------------------------------------------------+ - | EXTRN_MDL_NAME_LBCS | "FV3GFS" | "FV3GFS" | - +--------------------------------+-------------------+--------------------------------------------------------+ - | FV3GFS_FILE_FMT_ICS | "nemsio" | "grib2" | - +--------------------------------+-------------------+--------------------------------------------------------+ - | FV3GFS_FILE_FMT_LBCS | "nemsio" | "grib2" | - +--------------------------------+-------------------+--------------------------------------------------------+ - | WTIME_RUN_FCST | "04:30:00" | "01:00:00" | - +--------------------------------+-------------------+--------------------------------------------------------+ - | USE_USER_STAGED_EXTRN_FILES | "FALSE" | "TRUE" | - +--------------------------------+-------------------+--------------------------------------------------------+ - | EXTRN_MDL_SOURCE_BASE_DIR_ICS | "" | "/scratch2/BMC/det/UFS_SRW_app/v1p0/model_data/FV3GFS" | - +--------------------------------+-------------------+--------------------------------------------------------+ - | EXTRN_MDL_FILES_ICS | "" | "gfs.pgrb2.0p25.f000" | - +--------------------------------+-------------------+--------------------------------------------------------+ - | EXTRN_MDL_SOURCE_BASEDIR_LBCS | "" | "/scratch2/BMC/det/UFS_SRW_app/v1p0/model_data/FV3GFS" | - +--------------------------------+-------------------+--------------------------------------------------------+ - | EXTRN_MDL_FILES_LBCS | "" | "gfs.pgrb2.0p25.f006" | - +--------------------------------+-------------------+--------------------------------------------------------+ - - -.. _LoadPythonEnv: - -Python Environment for Workflow -=============================== -It is necessary to load the appropriate Python environment for the workflow. -The workflow requires Python 3, with the packages 'PyYAML', 'Jinja2', and 'f90nml' available. -This Python environment has already been set up on Level 1 platforms, and can be activated in -the following way: - -.. code-block:: console - - source ../../env/wflow_.env - -when in the ``ufs-srweather-app/regional_workflow/ush`` directory. - -.. _GeneratingWflowExpt: - -Generating a Regional Workflow Experiment -========================================= - -Steps to a Generate a New Experiment ----------------------------------------- -Generating an experiment requires running - -.. code-block:: console - - generate_FV3LAM_wflow.sh - -in the ``ufs-srweather-app/regional_workflow/ush`` directory. This is the all-in-one script for users -to set up their experiment with ease. :numref:`Figure %s ` shows the flowchart -for generating an experiment. 
First, it sets up the configuration parameters by running -the ``setup.sh`` script. Second, it copies the time-independent (fix) files and other necessary -input files such as ``data_table``, ``field_table``, ``nems.configure``, ``model_configure``, -and the CCPP suite file from its location in the ufs-weather-model directory to the experiment directory (``EXPTDIR``). -Third, it copies the weather model executable (``NEMS.exe``) from the ``bin`` directory to ``EXPTDIR``, -and creates the input namelist file ``input.nml`` based on the ``input.nml.FV3`` -file in the regional_workflow/ush/templates directory. Lastly, it creates the workflow XML file ``FV3LAM_wflow.xml`` -that is executed when running the experiment with the Rocoto workflow manager. - -.. _WorkflowGeneration: - -.. figure:: _static/FV3regional_workflow_gen.png - - *Experiment generation description* - -The ``setup.sh`` script reads three other configuration scripts: (1) ``config_default.sh`` (:numref:`Section %s `), (2) ``config.sh`` (:numref:`Section %s `), and (3) ``set_predef_grid_params.sh`` (:numref:`Section %s `). Note that these three scripts are read in order: ``config_default.sh``, ``config.sh``, then ``set_predef_grid_params.sh``. If a parameter is specified differently in these scripts, the file containing the last defined value will be used. .. _WorkflowTaskDescription: Description of Workflow Tasks ----------------------------- -The flowchart of the workflow tasks that are specified in the ``FV3LAM_wflow.xml`` file are -illustrated in :numref:`Figure %s `, and each task is described in -:numref:`Table %s `. The first three pre-processing tasks; ``MAKE_GRID``, -``MAKE_OROG``, and ``MAKE_SFC_CLIMO`` are optional. If the user stages pre-generated grid, orography, and surface climatology fix files, these three tasks can be skipped by setting ``RUN_TASK_MAKE_GRID=”FALSE”``, ``RUN_TASK_MAKE_OROG=”FALSE”``, and ``RUN_TASK_MAKE_SFC_CLIMO=”FALSE”`` in the ``regional_workflow/ush/config.sh`` file before running the ``generate_FV3LAM_wflow.sh`` script. As shown in the figure, the ``FV3LAM_wflow.xml`` file runs the specific j-job scripts in the prescribed order (``regional_workflow/jobs/JREGIONAL_[task name]``) when the ``launch_FV3LAM_wflow.sh`` is submitted. Each j-job task has its own source script named ``exregional_[task name].sh`` in the ``regional_workflow/scripts`` directory. Two database files ``FV3LAM_wflow.db`` and ``FV3LAM_wflow_lock.db`` are generated and updated by the Rocoto calls. There is usually no need for users to modify these files. To relaunch the workflow from scratch, delete these two ``*.db`` files and then call the launch script repeatedly for each task. +Workflow tasks are specified in the ``FV3LAM_wflow.xml`` file and illustrated in :numref:`Figure %s `. Each task is described in :numref:`Table %s `. The first three pre-processing tasks; ``MAKE_GRID``, ``MAKE_OROG``, and ``MAKE_SFC_CLIMO`` are optional. If the user stages pre-generated grid, orography, and surface climatology fix files, these three tasks can be skipped by setting ``RUN_TASK_MAKE_GRID=”FALSE”``, ``RUN_TASK_MAKE_OROG=”FALSE”``, and ``RUN_TASK_MAKE_SFC_CLIMO=”FALSE”`` in the ``regional_workflow/ush/config.sh`` file before running the ``generate_FV3LAM_wflow.sh`` script. As shown in the figure, the ``FV3LAM_wflow.xml`` file runs the specific j-job scripts in the prescribed order (``regional_workflow/jobs/JREGIONAL_[task name]``) when the ``launch_FV3LAM_wflow.sh`` is submitted. 
Each j-job task has its own source script named ``exregional_[task name].sh`` in the ``regional_workflow/scripts`` directory. Two database files ``FV3LAM_wflow.db`` and ``FV3LAM_wflow_lock.db`` are generated and updated by the Rocoto calls. There is usually no need for users to modify these files. To relaunch the workflow from scratch, delete these two ``*.db`` files and then call the launch script repeatedly for each task. .. _WorkflowTasksFig: @@ -537,8 +112,7 @@ This command will print out the status of the workflow tasks as follows: 0 out of 1 cycles completed. Workflow status: IN PROGRESS -Error messages for each task can be found in the task log files located in the ``EXPTDIR/log`` directory. In order to launch -more tasks in the workflow, you just need to call the launch script again as follows: +Error messages for each task can be found in the task log files located in the ``EXPTDIR/log`` directory. In order to launch more tasks in the workflow, you just need to call the launch script again: .. code-block:: console @@ -575,16 +149,17 @@ To launch the workflow manually, the ``rocoto`` module should be loaded: .. code-block:: console + module use rocoto module load rocoto -Then, launch the workflow as follows: +Then, launch the workflow: .. code-block:: console cd ${EXPTDIR} rocotorun -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10 -To check the status of the workflow, issue a ``rocotostat`` command as follows: +To check the status of the workflow, issue a ``rocotostat`` command: .. code-block:: console @@ -598,114 +173,4 @@ Wait a few seconds and issue a second set of ``rocotorun`` and ``rocotostat`` co rocotostat -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10 -.. _RunUsingStandaloneScripts: - -Run the Workflow Using the Stand-alone Scripts ----------------------------------------------- -The regional workflow has the capability to be run using standalone shell scripts if the Rocoto software is not available on a given platform. These scripts are located in the ``ufs-srweather-app/regional_workflow/ush/wrappers`` directory. Each workflow task has a wrapper script to set environment variables and run the job script. - -Example batch-submit scripts for Hera (Slurm) and Cheyenne (PBS) are included: ``sq_job.sh`` -and ``qsub_job.sh``, respectively. These examples set the build and run environment for Hera or Cheyenne so that run-time libraries match the compiled libraries (i.e. netCDF, MPI). Users may either modify the submit batch script as each task is submitted, or duplicate this batch wrapper -for their system settings for each task. Alternatively, some batch systems allow users to specify most of the settings on the command line (with the ``sbatch`` or ``qsub`` command, for example). This piece will be unique to your platform. The tasks run by the regional workflow are shown in :numref:`Table %s `. Tasks with the same stage level may be run concurrently (no dependency). - -.. _RegionalWflowTasks: - -.. table:: List of tasks in the regional workflow in the order that they are executed. - Scripts with the same stage number may be run simultaneously. The number of - processors and wall clock time is a good starting point for Cheyenne or Hera - when running a 48-h forecast on the 25-km CONUS domain. 
- - +------------+------------------------+----------------+----------------------------+ - | **Stage/** | **Task Run Script** | **Number of** | **Wall clock time (H:MM)** | - | **step** | | **Processors** | | - +============+========================+================+============================+ - | 1 | run_get_ics.sh | 1 | 0:20 (depends on HPSS vs | - | | | | FTP vs staged-on-disk) | - +------------+------------------------+----------------+----------------------------+ - | 1 | run_get_lbcs.sh | 1 | 0:20 (depends on HPSS vs | - | | | | FTP vs staged-on-disk) | - +------------+------------------------+----------------+----------------------------+ - | 1 | run_make_grid.sh | 24 | 0:20 | - +------------+------------------------+----------------+----------------------------+ - | 2 | run_make_orog.sh | 24 | 0:20 | - +------------+------------------------+----------------+----------------------------+ - | 3 | run_make_sfc_climo.sh | 48 | 0:20 | - +------------+------------------------+----------------+----------------------------+ - | 4 | run_make_ics.sh | 48 | 0:30 | - +------------+------------------------+----------------+----------------------------+ - | 4 | run_make_lbcs.sh | 48 | 0:30 | - +------------+------------------------+----------------+----------------------------+ - | 5 | run_fcst.sh | 48 | 0:30 | - +------------+------------------------+----------------+----------------------------+ - | 6 | run_post.sh | 48 | 0:25 (2 min per output | - | | | | forecast hour) | - +------------+------------------------+----------------+----------------------------+ - -The steps to run the standalone scripts are as follows: - -#. Clone and build the ufs-srweather-app following the steps - `here `_, or in - :numref:`Sections %s ` to :numref:`Section %s ` above. - -#. Generate an experiment configuration following the steps - `here `_, or in - :numref:`Section %s ` above. - -#. ``cd`` into the experiment directory - -#. Set the environment variable ``EXPTDIR`` for either csh and bash, respectively: - - .. code-block:: console - - setenv EXPTDIR `pwd` - export EXPTDIR=`pwd` - -#. COPY the wrapper scripts from the regional_workflow directory into your experiment directory: - - .. code-block:: console - - cp ufs-srweather-app/regional_workflow/ush/wrappers/* . - -#. Set the OMP_NUM_THREADS variable and fix dash/bash shell issue (this ensures the system does not use an alias of ``sh`` to dash). - - .. code-block:: console - - export OMP_NUM_THREADS=1 - sed -i 's/bin\/sh/bin\/bash/g' *sh - -#. RUN each of the listed scripts in order. Scripts with the same stage number (listed in :numref:`Table %s `) may be run simultaneously. - - .. code-block:: console - - ./run_make_grid.sh - ./run_get_ics.sh - ./run_get_lbcs.sh - ./run_make_orog.sh - ./run_make_sfc_climo.sh - ./run_make_ics.sh - ./run_make_lbcs.sh - ./run_fcst.sh - ./run_post.sh - - .. note:: - If any of the scripts return an error that "Primary job terminated normally, but one process returned a non-zero exit code," there may not be enough space on one node to run the process. To allocate a second node: - - .. code-block:: console - - salloc -N 1 - module load gnu openmpi - mpirun -n 1 hostname - - This last command will output a hostname. Then, run ``ssh ``, replacing ```` with the actual hostname output in the prior command. - - - #. On most HPC systems, you will need to submit a batch job to run multi-processor jobs. - - #. On some HPC systems, you may be able to run the first two jobs (serial) on a login node/command-line - - #. 
Example scripts for Slurm (Hera) and PBS (Cheyenne) are provided. These will need to be adapted to your system. - - #. This submit batch script is hard-coded per task, so will need to be modified or copied to run each task. - -Check the batch script output file in your experiment directory for a “SUCCESS” message near the end of the file. diff --git a/docs/UsersGuide/source/_static/theme_overrides.css b/docs/UsersGuide/source/_static/theme_overrides.css index f2b48b594c..9143850a43 100644 --- a/docs/UsersGuide/source/_static/theme_overrides.css +++ b/docs/UsersGuide/source/_static/theme_overrides.css @@ -10,5 +10,6 @@ .wy-table-responsive { overflow: visible !important; } + } diff --git a/docs/UsersGuide/source/index.rst b/docs/UsersGuide/source/index.rst index 4057dcee0d..ad0d16b25e 100644 --- a/docs/UsersGuide/source/index.rst +++ b/docs/UsersGuide/source/index.rst @@ -12,9 +12,9 @@ UFS Short-Range Weather App Users Guide Introduction + SRWAppOverview Quickstart_Container Quickstart_NonContainer - SRWAppOverview Components Include-HPCInstall InputOutputFiles From d1addf82b700b343779fdd737a2509061218461c Mon Sep 17 00:00:00 2001 From: gspetro Date: Wed, 16 Mar 2022 14:50:38 -0400 Subject: [PATCH 053/118] finish merging non-container guide & SRWOverview, rename/remove files, update FAQ --- ...start_NonContainer.rst => BuildRunSRW.rst} | 163 +++++++++++++--- docs/UsersGuide/source/FAQ.rst | 46 +++-- docs/UsersGuide/source/Introduction.rst | 25 ++- ...uickstart_Container.rst => Quickstart.rst} | 6 +- docs/UsersGuide/source/SRWAppOverview.rst | 176 ------------------ docs/UsersGuide/source/index.rst | 5 +- 6 files changed, 186 insertions(+), 235 deletions(-) rename docs/UsersGuide/source/{Quickstart_NonContainer.rst => BuildRunSRW.rst} (82%) rename docs/UsersGuide/source/{Quickstart_Container.rst => Quickstart.rst} (99%) delete mode 100644 docs/UsersGuide/source/SRWAppOverview.rst diff --git a/docs/UsersGuide/source/Quickstart_NonContainer.rst b/docs/UsersGuide/source/BuildRunSRW.rst similarity index 82% rename from docs/UsersGuide/source/Quickstart_NonContainer.rst rename to docs/UsersGuide/source/BuildRunSRW.rst index 7b0bd6195f..1bfa8c1e2d 100644 --- a/docs/UsersGuide/source/Quickstart_NonContainer.rst +++ b/docs/UsersGuide/source/BuildRunSRW.rst @@ -1,8 +1,8 @@ .. _BuildRunSRW: -============================================== -Building and Running the SRW (Non-Container) -============================================== +===================================== +Building and Running the SRW +===================================== The UFS Short-Range Weather Application (SRW App) is an :term:`umbrella repository` consisting of a number of different :ref:`components ` housed in external repositories. Once the SRW App is configured and built, users can generate predictions of atmospheric behavior over a limited spatial area and on time scales ranging from minutes out to several days. @@ -24,10 +24,11 @@ The overall procedure for generating an experiment is shown in :numref:`Figure % * :ref:`Build the executables ` * :ref:`Download and stage data ` * :ref:`Optional: Configure a new grid ` - * :ref:`Configure the experiment ` - * :ref:`Load the python environment for the regional workflow ` * :ref:`Generate a regional workflow experiment ` + * :ref:`Configure the experiment parameters ` + * :ref:`Load the python environment for the regional workflow ` * :ref:`Run the regional workflow ` + * :ref:`Optional: Plot the output ` .. 
_AppOverallProc: @@ -266,9 +267,9 @@ Generate the Forecast Experiment ================================= Generating the forecast experiment requires three steps: -* Set experiment parameters -* Set Python and other environment parameters -* Run a script to generate the experiment workflow +* :ref:`Set experiment parameters ` +* :ref:`Set Python and other environment parameters ` +* :ref:`Run a script to generate the experiment workflow ` The first two steps depend on the platform being used and are described here for each Level 1 platform. Users will need to adjust the instructions to their machine if they are working on a Level 2-4 platform. @@ -562,7 +563,7 @@ This command will activate the ``regional_workflow``. The user should see ``(reg conda activate regional_workflow -.. _GenerateWorkflowNC: +.. _GenerateWorkflow: Generate the Regional Workflow ------------------------------------------- @@ -573,7 +574,7 @@ Run the following command from the ``ufs-srweather-app/regional_workflow/ush`` d ./generate_FV3LAM_wflow.sh -The last line of output from this script, starting with ``*/1 * * * *`` or ``*/3 * * * *``, can be saved and :ref:`used later ` to automatically run portions of the workflow. +The last line of output from this script, starting with ``*/1 * * * *`` or ``*/3 * * * *``, can be saved and :ref:`used later ` to automatically run portions of the workflow. This workflow generation script creates an experiment directory and populates it with all the data needed to run through the workflow. The flowchart in :numref:`Figure %s ` describes the experiment generation process. First, ``generate_FV3LAM_wflow.sh`` runs the ``setup.sh`` script to set the configuration parameters. Second, it copies the time-independent (fix) files and other necessary data input files from their location in the ufs-weather-model directory to the experiment directory (``$EXPTDIR``). Third, it copies the weather model executable (``ufs_model``) from the ``bin`` directory to ``$EXPTDIR`` and creates the input namelist file ``input.nml`` based on the ``input.nml.FV3`` file in the regional_workflow/ush/templates directory. Lastly, it creates the workflow XML file ``FV3LAM_wflow.xml`` that is executed when running the experiment with the Rocoto workflow manager. @@ -587,45 +588,149 @@ The generated workflow will appear in ``$EXPTDIR``, where ``EXPTDIR=${EXPT_BASED *Experiment generation description* +.. _WorkflowTaskDescription: -An environment variable can be set to navigate to the ``$EXPTDIR`` more easily. If the login shell is bash, it can be set as follows: +Description of Workflow Tasks +-------------------------------- -.. code-block:: console +.. note:: + This section gives an overview of workflow tasks. To begin running the workflow, skip to :numref:`Step %s ` + +Workflow tasks are specified in the ``FV3LAM_wflow.xml`` file and illustrated in :numref:`Figure %s `. Each task is described in :numref:`Table %s `. The first three pre-processing tasks; ``MAKE_GRID``, ``MAKE_OROG``, and ``MAKE_SFC_CLIMO`` are optional. If the user stages pre-generated grid, orography, and surface climatology fix files, these three tasks can be skipped by setting ``RUN_TASK_MAKE_GRID=”FALSE”``, ``RUN_TASK_MAKE_OROG=”FALSE”``, and ``RUN_TASK_MAKE_SFC_CLIMO=”FALSE”`` in the ``regional_workflow/ush/config.sh`` file before running the ``generate_FV3LAM_wflow.sh`` script. 
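
For example, when pre-generated grid, orography, and surface climatology files are already staged on disk, the relevant ``config.sh`` entries might look like the following sketch; the directory paths are placeholders, and ``GRID_DIR``, ``OROG_DIR``, and ``SFC_CLIMO_DIR`` should point to the staged files:

.. code-block:: console

   RUN_TASK_MAKE_GRID="FALSE"
   GRID_DIR="/path/to/pregenerated/grid_files"
   RUN_TASK_MAKE_OROG="FALSE"
   OROG_DIR="/path/to/pregenerated/orog_files"
   RUN_TASK_MAKE_SFC_CLIMO="FALSE"
   SFC_CLIMO_DIR="/path/to/pregenerated/sfc_climo_files"
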
As shown in the figure, the ``FV3LAM_wflow.xml`` file runs the specific j-job scripts in the prescribed order (``regional_workflow/jobs/JREGIONAL_[task name]``) when the ``launch_FV3LAM_wflow.sh`` is submitted. Each j-job task has its own source script named ``exregional_[task name].sh`` in the ``regional_workflow/scripts`` directory. Two database files ``FV3LAM_wflow.db`` and ``FV3LAM_wflow_lock.db`` are generated and updated by the Rocoto calls. There is usually no need for users to modify these files. To relaunch the workflow from scratch, delete these two ``*.db`` files and then call the launch script repeatedly for each task. + +.. _WorkflowTasksFig: + +.. figure:: _static/FV3LAM_wflow_flowchart.png + + *Flowchart of the workflow tasks* + +.. _WorkflowTasksTable: + +.. table:: Workflow tasks in SRW App + + +----------------------+------------------------------------------------------------+ + | **Workflow Task** | **Task Description** | + +======================+============================================================+ + | make_grid | Pre-processing task to generate regional grid files. Can | + | | be run, at most, once per experiment. | + +----------------------+------------------------------------------------------------+ + | make_orog | Pre-processing task to generate orography files. Can be | + | | run, at most, once per experiment. | + +----------------------+------------------------------------------------------------+ + | make_sfc_climo | Pre-processing task to generate surface climatology files. | + | | Can be run, at most, once per experiment. | + +----------------------+------------------------------------------------------------+ + | get_extrn_ics | Cycle-specific task to obtain external data for the | + | | initial conditions | + +----------------------+------------------------------------------------------------+ + | get_extrn_lbcs | Cycle-specific task to obtain external data for the | + | | lateral boundary (LB) conditions | + +----------------------+------------------------------------------------------------+ + | make_ics | Generate initial conditions from the external data | + +----------------------+------------------------------------------------------------+ + | make_lbcs | Generate lateral boundary conditions from the external data| + +----------------------+------------------------------------------------------------+ + | run_fcst | Run the forecast model (UFS weather model) | + +----------------------+------------------------------------------------------------+ + | run_post | Run the post-processing tool (UPP) | + +----------------------+------------------------------------------------------------+ - export EXPTDIR=// -If the login shell is csh/tcsh, replace ``export`` with ``setenv`` in the command above. .. _RocotoRun: Run the Workflow Using Rocoto ============================= -The information in this section assumes that Rocoto is available on the desired platform. (Note that Rocoto cannot be used when running the workflow within a container.) If Rocoto is not available, it is still possible to run the workflow using stand-alone scripts described in :numref:`Section %s `. There are two main ways to run the workflow with Rocoto: using the ``./launch_FV3LAM_wflow.sh`` or by hand. +The information in this section assumes that Rocoto is available on the desired platform. (Note that Rocoto cannot be used when running the workflow within a container.) 
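A quick way to confirm that Rocoto is available is to look for it in the module system and on the current path. The commands below are only illustrative, since the module name, version, and installation location vary by platform:

.. code-block:: console

   module avail rocoto
   which rocotorun
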
If Rocoto is not available, it is still possible to run the workflow using stand-alone scripts described in :numref:`Section %s `. There are two main ways to run the workflow with Rocoto: (1) with the ``launch_FV3LAM_wflow.sh`` script, and (2) by manually calling the ``rocotorun`` command. Users can also automate the workflow using a crontab. + +Optionally, an environment variable can be set to navigate to the ``$EXPTDIR`` more easily. If the login shell is bash, it can be set as follows: + +.. code-block:: console + + export EXPTDIR=// + +If the login shell is csh/tcsh, it can be set using: + +.. code-block:: console + + setenv EXPTDIR /path-to-experiment/directory + Launch the Rocoto Workflow Using a Script ----------------------------------------------- -To run Rocoto using the script provided: +To run Rocoto using the ``launch_FV3LAM_wflow.sh`` script provided, simply call it without any arguments: .. code-block:: console cd $EXPTDIR ./launch_FV3LAM_wflow.sh -Once the workflow is launched with the ``launch_FV3LAM_wflow.sh`` script, a log file named ``log.launch_FV3LAM_wflow`` will be created (or appended) in the ``EXPTDIR``. Check the end of the log file periodically to see how the experiment is progressing: +This script creates a log file named ``log.launch_FV3LAM_wflow`` in the EXPTDIR directory or appends information to it if the file already exists. Check the end of the log file periodically to see how the experiment is progressing: .. code-block:: console - cd $EXPTDIR - vi ``log.launch_FV3LAM_wflow`` + tail -n 30 log.launch_FV3LAM_wflow + +In order to launch additional tasks in the workflow, call the launch script again; this action will need to be repeated until all tasks in the workflow have been launched. To (re)launch the workflow and check its progress on a single line, run: + +.. code-block:: console + + ./launch_FV3LAM_wflow.sh; tail -n 80 log.launch_FV3LAM_wflow -Alternatively, to (re)launch the workflow and check its progress on a single line: +This will output the last 80 lines of the log file, which includes the status of the workflow tasks (e.g., SUCCEEDED, DEAD, RUNNING, SUBMITTING, QUEUED). The number 80 can be changed according to the user's preferences. The output will look like this: .. code-block:: console - ./launch_FV3LAM_wflow.sh; tail -n 40 log.launch_FV3LAM_wflow + CYCLE TASK JOBID STATE EXIT STATUS TRIES DURATION + ====================================================================================================== + 202006170000 make_grid druby://hfe01:33728 SUBMITTING - 0 0.0 + 202006170000 make_orog - - - - - + 202006170000 make_sfc_climo - - - - - + 202006170000 get_extrn_ics druby://hfe01:33728 SUBMITTING - 0 0.0 + 202006170000 get_extrn_lbcs druby://hfe01:33728 SUBMITTING - 0 0.0 + 202006170000 make_ics - - - - - + 202006170000 make_lbcs - - - - - + 202006170000 run_fcst - - - - - + 202006170000 run_post_00 - - - - - + 202006170000 run_post_01 - - - - - + 202006170000 run_post_02 - - - - - + 202006170000 run_post_03 - - - - - + 202006170000 run_post_04 - - - - - + 202006170000 run_post_05 - - - - - + 202006170000 run_post_06 - - - - - + + Summary of workflow status: + ~~~~~~~~~~~~~~~~~~~~~~~~~~ + + 0 out of 1 cycles completed. + Workflow status: IN PROGRESS + +Error messages for each specific task can be found in the task log files located in the ``$EXPTDIR/log`` directory. + +If everything goes smoothly, you will eventually get the following workflow status table as follows: -This will output the last 40 lines of the log file. 
The number 40 can be changed according to the user's preferences. +.. code-block:: console + + CYCLE TASK JOBID STATE EXIT STATUS TRIES DURATION + ====================================================================================================== + 202006170000 make_grid 8854765 SUCCEEDED 0 1 6.0 + 202006170000 make_orog 8854809 SUCCEEDED 0 1 27.0 + 202006170000 make_sfc_climo 8854849 SUCCEEDED 0 1 36.0 + 202006170000 get_extrn_ics 8854763 SUCCEEDED 0 1 54.0 + 202006170000 get_extrn_lbcs 8854764 SUCCEEDED 0 1 61.0 + 202006170000 make_ics 8854914 SUCCEEDED 0 1 119.0 + 202006170000 make_lbcs 8854913 SUCCEEDED 0 1 98.0 + 202006170000 run_fcst 8854992 SUCCEEDED 0 1 655.0 + 202006170000 run_post_00 8855459 SUCCEEDED 0 1 6.0 + 202006170000 run_post_01 8855460 SUCCEEDED 0 1 6.0 + 202006170000 run_post_02 8855461 SUCCEEDED 0 1 6.0 + 202006170000 run_post_03 8855462 SUCCEEDED 0 1 6.0 + 202006170000 run_post_04 8855463 SUCCEEDED 0 1 6.0 + 202006170000 run_post_05 8855464 SUCCEEDED 0 1 6.0 + 202006170000 run_post_06 8855465 SUCCEEDED 0 1 6.0 + +If all the tasks complete successfully, the workflow status in the log file will include the word “SUCCESS." Otherwise, the workflow status will include the word “FAILURE." Launch the Rocoto Workflow Manually @@ -634,7 +739,7 @@ Launch the Rocoto Workflow Manually Load Rocoto ^^^^^^^^^^^^^^^^ -Instead of running the ``./launch_FV3LAM_wflow.sh`` script, users can manually load Rocoto and any other required modules. This gives the user more control over the process and allows them to view experiment progress more easily. +Instead of running the ``./launch_FV3LAM_wflow.sh`` script, users can load Rocoto and any other required modules. This gives the user more control over the process and allows them to view experiment progress more easily. For most systems, a variant on the following commands will be necessary to load the Rocoto module: @@ -705,7 +810,7 @@ After loading Rocoto, call ``rocotorun`` from the experiment directory to launch The ``rocotorun`` and ``rocotostat`` commands will need to be resubmitted regularly and repeatedly until the experiment is finished. In part, this is to avoid having the system time out. This also ensures that when one task ends, tasks dependent on it will run as soon as possible, and ``rocotostat`` will capture the new progress. -If the experiment fails, the ``rocotostat`` command will indicate which task failed. Users can look at the log file in the ``log`` subdirectory for the failed task to determine what caused the failure. For example, if the ``make_grid`` task failed: +If the experiment fails, the ``rocotostat`` command will indicate which task failed. Users can look at the log file in the ``log`` subdirectory for the failed task to determine what caused the failure. For example, if the ``make_grid`` task failed, users can open the ``make_grid.log`` file to see what caused the problem: .. code-block:: console @@ -716,11 +821,11 @@ If the experiment fails, the ``rocotostat`` command will indicate which task fai If users have the `Slurm workload manager `_ on their system, they can run the ``squeue`` command in lieu of ``rocotostat`` to check what jobs are currently running. -.. _AdditionalOptions: +.. _Automate: -Additional Options +Automated Option ---------------------- -For automatic resubmission of the workflow at regular intervals (e.g., every minute), the user can add a crontab entry by entering the ``crontab -e`` command, which opens a crontab file. 
As mentioned in :ref:`Section %s `, the last line of output from ``./generate_FV3LAM_wflow.sh`` (starting with ``*/1 * * * *``), can be pasted into the crontab file. It can also be found in the``$EXPTDIR/log.generate_FV3LAM_wflow`` file. The crontab entry should resemble the following: +For automatic resubmission of the workflow at regular intervals (e.g., every minute), the user can add a crontab entry by entering the ``crontab -e`` command, which opens a crontab file. As mentioned in :ref:`Section %s `, the last line of output from ``./generate_FV3LAM_wflow.sh`` (starting with ``*/1 * * * *``), can be pasted into the crontab file. It can also be found in the``$EXPTDIR/log.generate_FV3LAM_wflow`` file. The crontab entry should resemble the following: .. code-block:: console @@ -728,7 +833,7 @@ For automatic resubmission of the workflow at regular intervals (e.g., every min where ```` is changed to correspond to the user's machine, and ``/apps/rocoto/1.3.3/bin/rocotorun`` corresponds to the location of the ``rocotorun`` command on the user's system. The number ``1`` can also be changed and simply means that the workflow will be resubmitted every minute. -Then, check the experiment progress with: +To check the experiment progress: .. code-block:: console @@ -760,6 +865,8 @@ The workflow run is completed when all tasks have “SUCCEEDED”, and the rocot ... 201906150000 run_post_f048 4953381 SUCCEEDED 0 1 7.0 +.. _PlotOutput: + Plot the Output =============== Two python scripts are provided to generate plots from the FV3-LAM post-processed GRIB2 output. Information on how to generate the graphics can be found in :numref:`Chapter %s `. diff --git a/docs/UsersGuide/source/FAQ.rst b/docs/UsersGuide/source/FAQ.rst index ee744db726..636e27ad21 100644 --- a/docs/UsersGuide/source/FAQ.rst +++ b/docs/UsersGuide/source/FAQ.rst @@ -1,17 +1,26 @@ .. _FAQ: -*** +**** FAQ -*** +**** + +* :ref:`How do I turn on/off the cycle-independent workflow tasks? ` +* :ref:`How do I define an experiment name? ` +* :ref:`How do I change the Physics Suite Definition File (SDF)? ` +* :ref:`How do I restart a DEAD task? ` +* :ref:`How do I change the grid? ` + +.. _CycleInd: + +=========================================================== +How do I turn on/off the cycle-independent workflow tasks? +=========================================================== -========================================================= -How do I turn On/Off the Cycle-Independent Workflow Tasks -========================================================= The first three pre-processing tasks ``make_grid``, ``make_orog``, and ``make_sfc_climo`` are cycle-independent, meaning that they only need to be run once per experiment. If the grid, orography, and surface climatology files that these tasks generate are already available (e.g. from a previous experiment that used the same grid as the current), then -these tasks can be skipped by having the workflow use those pre-generated files. This +these tasks can be skipped, and the workflow can use those pre-generated files. This can be done by adding the following lines to the ``config.sh`` script before running the ``generate_FV3LAM_wflow.sh`` script: @@ -28,26 +37,36 @@ The ``RUN_TASK_MAKE_GRID``, ``RUN_TASK_MAKE_OROG``, and ``RUN_TASK_MAKE_SFC_CLIM disable their respective tasks, and ``GRID_DIR``, ``OROG_DIR``, and ``SFC_CLIMO_DIR`` specify the directories in which the workflow can find the pre-generated grid, orography, and surface climatology files, respectively (these directories may be the same, i.e. 
all -three sets of files may be placed in the same location). By default, the ``RUN_TASK_MAKE_...`` +three sets of files may be placed in the same location). By default, the ``RUN_TASK_MAKE_...`` flags are set to ``TRUE`` in ``config_defaults.sh``, i.e. the workflow will by default run the ``make_grid``, ``make_orog``, and ``make_sfc_climo`` tasks. +.. _DefineExptName: + =================================== How do I define an experiment name? =================================== + The name of the experiment is set in the ``config.sh`` file using the variable ``EXPT_SUBDIR``. -See :numref:`Section %s ` for more details. +See :numref:`Section %s ` for more details. + + +.. _ChangePhysics: + +========================================================= +How do I change the Physics Suite Definition File (SDF)? +========================================================= -================================================ -How do I change the Suite Definition File (SDF)? -================================================ The SDF is set in the ``config.sh`` file using the variable ``CCPP_PHYS_SUITE``. When the ``generate_FV3LAM_wflow.sh`` script is run, the SDF file is copied from its location in the forecast model directory to the experiment directory ``EXPTDIR``. +.. _RestartTask: + ============================= How do I restart a DEAD task? ============================= + On platforms that utilize Rocoto workflow software (such as NCAR’s Cheyenne machine), sometimes if something goes wrong with the workflow a task may end up in the DEAD state: @@ -70,12 +89,15 @@ command: rocotorewind -w FV3SAR_wflow.xml -d FV3SAR_wflow.db -v 10 -c 201905200000 -t get_extrn_ics where ``-c`` specifies the cycle date (first column of rocotostat output) and ``-t`` represents the task name -(second column of rocotostat output). After using ``rocotorewind``, the next time ``rocotorun`` is used to +(second column of rocotostat output). After using ``rocotorewind``, the next time ``rocotorun`` is used to advance the workflow, the job will be resubmitted. +.. _ChangeGrid: + =========================== How do I change the grid? =========================== + To change the predefined grid, you need to modify the ``PREDEF_GRID_NAME`` variable in the ``config.sh`` script which the user has created to generate an experiment configuration and workflow. Users can choose from one of three predefined grids for the SRW Application: diff --git a/docs/UsersGuide/source/Introduction.rst b/docs/UsersGuide/source/Introduction.rst index f67f240f25..cd8f699da7 100644 --- a/docs/UsersGuide/source/Introduction.rst +++ b/docs/UsersGuide/source/Introduction.rst @@ -6,7 +6,7 @@ Introduction The Unified Forecast System (:term:`UFS`) is a community-based, coupled, comprehensive Earth modeling system. The UFS is the source system for NOAA’s operational numerical weather prediction applications. It enables research, development, and contribution opportunities within the broader :term:`weather enterprise` (e.g. government, industry, and academia). For more information about the UFS, visit the `UFS Portal `__. -The UFS can be configured for multiple applications (see the `complete list here `__). The configuration described in this documentation is the UFS Short-Range Weather (SRW) Application, which targets predictions of atmospheric behavior on a limited spatial domain and on time scales from minutes out to several days. 
The SRW Application v2.0 release includes a prognostic atmospheric model, pre- and post-processing, and a community workflow for running the system end-to-end. These components are documented within this User's Guide and supported through a `community forum `_. Future work will expand the capabilities of the application to include data assimilation (DA) and a verification package (e.g., METplus). This documentation provides a Quick Start Guide for running the application :ref:`in a container ` and a detailed guide for running the SRW :ref:`locally `, in addition to an overview of the :ref:`release components `, a description of the supported capabilities, and details on where to find more information and obtain support. +The UFS can be configured for `multiple applications `__. The configuration described in this documentation is the UFS Short-Range Weather (SRW) Application, which targets predictions of atmospheric behavior on a limited spatial domain and on time scales from minutes out to several days. The SRW Application v2.0 release includes a prognostic atmospheric model, pre- and post-processing, and a community workflow for running the system end-to-end. These components are documented within this User's Guide and supported through a `community forum `_. Future work will expand the capabilities of the application to include data assimilation (DA) and a verification package (e.g., METplus). This documentation provides a :ref:`Quick Start Guide ` for running the application in a container and a :ref:`detailed guide ` for running the SRW on supported platforms. It also provides an overview of the :ref:`release components ` and details on how to customize or modify different portions of the workflow. The SRW App v1.0.0 citation is as follows and should be used when presenting results based on research conducted with the App: @@ -19,21 +19,21 @@ UFS Development Team. (2021, March 4). Unified Forecast System (UFS) Short-Range How to Use This Document ======================== -This guide instructs both novice and experienced users on downloading, building, and running the SRW Application. Please post questions in the `UFS forum `__. +This guide instructs both novice and experienced users on downloading, building, and running the SRW Application. Please post questions in the `UFS Forum `__. .. code-block:: console Throughout the guide, this presentation style indicates shell commands and options, code examples, etc. -Variables presented as ``AaBbCc123`` in this document typically refer to variables in scripts, names of files, and directories. +Variables presented as ``AaBbCc123`` in this User's Guide typically refer to variables in scripts, names of files, and directories. -File paths or code that include angle brackets (e.g., ``env/build__.env``) indicate that users should insert options appropriate to their SRW configuration (e.g., ``env/build_aws_gcc.env``). +File paths or code that include angle brackets (e.g., ``build__.env``) indicate that users should insert options appropriate to their SRW configuration (e.g., ``build_orion_intel.env``). .. hint:: - * To get started running the SRW, see the :ref:`Containerized Quick Start Guide ` or refer to the in-depth chapter on :ref:`Running the Short-Range Weather Application `. + * To get started running the SRW, see the :ref:`Quick Start Guide ` for beginners or refer to the in-depth chapter on :ref:`Running the Short-Range Weather Application `. 
* For background information on the SRW code repositories and directory structure, see :numref:`Section %s ` below. - * For an outline of SRW components, see section :numref:`Section %s ` below or refer to :numref:`Chapter %s: Components ` for a more in-depth treatment. + * For an outline of SRW components, see section :numref:`Section %s ` below or refer to :numref:`Chapter %s ` for a more in-depth treatment. .. _SRWStructure: @@ -45,7 +45,7 @@ Code Repositories and Directory Structure Hierarchical Repository Structure ----------------------------------- -The umbrella repository for the UFS SRW Application is named *ufs-srweather-app* and is available on GitHub at https://github.com/ufs-community/ufs-srweather-app. An umbrella repository is a repository that houses external code, called "externals," from additional repositories. The UFS SRW Application includes the ``manage_externals`` tools along with a configuration file called ``Externals.cfg``, which describes the external repositories associated with this umbrella repository (see :numref:`Table %s `). +The :term:`umbrella repository` for the SRW Application is named ``ufs-srweather-app`` and is available on GitHub at https://github.com/ufs-community/ufs-srweather-app. An umbrella repository is a repository that houses external code, called "externals," from additional repositories. The SRW Application includes the ``manage_externals`` tool and a configuration file called ``Externals.cfg``, which describes the external repositories associated with the SRW umbrella repository (see :numref:`Table %s `). .. _top_level_repos: @@ -71,10 +71,9 @@ The umbrella repository for the UFS SRW Application is named *ufs-srweather-app* | Processor (UPP) | | +---------------------------------+---------------------------------------------------------+ -The UFS Weather Model contains a number of sub-repositories used by the model as -documented `here `__. +The UFS Weather Model contains a number of sub-repositories, which are documented `here `__. -Note that the prerequisite libraries (including NCEP Libraries and external libraries) are not included in the UFS SRW Application repository. The `HPC-Stack `__ repository assembles these prerequisite libraries. The HPC-Stack has already been built on the preconfigured (Level 1) platforms listed `here `__. However, it must be built on other systems. :numref:`Chapter %s ` contains details on installing the HPC-Stack. +Note that the prerequisite libraries (including NCEP Libraries and external libraries) are not included in the UFS SRW Application repository. The `HPC-Stack `__ repository assembles these prerequisite libraries. The HPC-Stack has already been built on `preconfigured (Level 1) platforms `__. However, it must be built on other systems. :numref:`Chapter %s ` contains details on installing the HPC-Stack. .. _TopLevelDirStructure: @@ -128,7 +127,7 @@ The ``ufs-srweather-app`` :term:`umbrella repository` structure is determined by Regional Workflow Sub-Directories ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -Under the ``regional_workflow`` directory shown in :numref:`TopLevelDirStructure`, a number of sub-directories are created when the regional workflow is cloned. The contents of these sub-directories are described in :numref:`Table %s `. +A number of sub-directories are created under the ``regional_workflow`` directory when the regional workflow is cloned (see directory diagram :ref:`above `). The contents of these sub-directories are described in :numref:`Table %s `. .. 
_Subdirectories: @@ -137,7 +136,7 @@ Under the ``regional_workflow`` directory shown in :numref:`TopLevelDirStructure +-------------------------+---------------------------------------------------------+ | **Directory Name** | **Description** | +=========================+=========================================================+ - | docs | Users' Guide Documentation | + | docs | User's Guide Documentation | +-------------------------+---------------------------------------------------------+ | jobs | J-job scripts launched by Rocoto | +-------------------------+---------------------------------------------------------+ @@ -155,7 +154,7 @@ Under the ``regional_workflow`` directory shown in :numref:`TopLevelDirStructure Experiment Directory Structure -------------------------------- -When the user generates an experiment using the ``generate_FV3LAM_wflow.sh`` script (:numref:`Step %s `), a user-defined experimental directory is created based on information specified in the ``config.sh`` file. :numref:`Table %s ` shouws the contents of the experiment directory before the experiment workflow is run. +When the user generates an experiment using the ``generate_FV3LAM_wflow.sh`` script (:numref:`Step %s `), a user-defined experimental directory is created based on information specified in the ``config.sh`` file. :numref:`Table %s ` shows the contents of the experiment directory before the experiment workflow is run. .. _ExptDirStructure: diff --git a/docs/UsersGuide/source/Quickstart_Container.rst b/docs/UsersGuide/source/Quickstart.rst similarity index 99% rename from docs/UsersGuide/source/Quickstart_Container.rst rename to docs/UsersGuide/source/Quickstart.rst index 554ba9b743..8e7f1f4aaf 100644 --- a/docs/UsersGuide/source/Quickstart_Container.rst +++ b/docs/UsersGuide/source/Quickstart.rst @@ -1,8 +1,8 @@ .. _QuickstartC: -================================================= -Containerized Quick Start Guide (Recommended) -================================================= +==================================== +Quick Start Guide +==================================== This Workflow Quick Start Guide will help users to build and run the "out-of-the-box" case for the Unified Forecast System (:term:`UFS`) Short-Range Weather (SRW) Application using a container. The container approach provides a uniform enviroment in which to build and run the SRW. Normally, the details of building and running the SRW vary from system to system due to the many possible combinations of operating systems, compilers, :term:`MPI`’s, and package versions available. Installation via an EPIC-provided container reduces this variability and allows for a smoother SRW build and run experience. However, the :ref:`non-container approach ` may be more appropriate for those users who desire additional customizability, particularly if they already have experience running the SRW App. diff --git a/docs/UsersGuide/source/SRWAppOverview.rst b/docs/UsersGuide/source/SRWAppOverview.rst deleted file mode 100644 index 2d846bb5da..0000000000 --- a/docs/UsersGuide/source/SRWAppOverview.rst +++ /dev/null @@ -1,176 +0,0 @@ -.. _SRWAppOverview: - -=========================================================== -Overview of the Short-Range Weather Application Workflow -=========================================================== - - -.. _WorkflowTaskDescription: - -Description of Workflow Tasks ------------------------------ -Workflow tasks are specified in the ``FV3LAM_wflow.xml`` file and illustrated in :numref:`Figure %s `. 
Each task is described in :numref:`Table %s `. The first three pre-processing tasks; ``MAKE_GRID``, ``MAKE_OROG``, and ``MAKE_SFC_CLIMO`` are optional. If the user stages pre-generated grid, orography, and surface climatology fix files, these three tasks can be skipped by setting ``RUN_TASK_MAKE_GRID=”FALSE”``, ``RUN_TASK_MAKE_OROG=”FALSE”``, and ``RUN_TASK_MAKE_SFC_CLIMO=”FALSE”`` in the ``regional_workflow/ush/config.sh`` file before running the ``generate_FV3LAM_wflow.sh`` script. As shown in the figure, the ``FV3LAM_wflow.xml`` file runs the specific j-job scripts in the prescribed order (``regional_workflow/jobs/JREGIONAL_[task name]``) when the ``launch_FV3LAM_wflow.sh`` is submitted. Each j-job task has its own source script named ``exregional_[task name].sh`` in the ``regional_workflow/scripts`` directory. Two database files ``FV3LAM_wflow.db`` and ``FV3LAM_wflow_lock.db`` are generated and updated by the Rocoto calls. There is usually no need for users to modify these files. To relaunch the workflow from scratch, delete these two ``*.db`` files and then call the launch script repeatedly for each task. - -.. _WorkflowTasksFig: - -.. figure:: _static/FV3LAM_wflow_flowchart.png - - *Flowchart of the workflow tasks* - -.. _WorkflowTasksTable: - -.. table:: Workflow tasks in SRW App - - +----------------------+------------------------------------------------------------+ - | **Workflow Task** | **Task Description** | - +======================+============================================================+ - | make_grid | Pre-processing task to generate regional grid files. Can | - | | be run, at most, once per experiment. | - +----------------------+------------------------------------------------------------+ - | make_orog | Pre-processing task to generate orography files. Can be | - | | run, at most, once per experiment. | - +----------------------+------------------------------------------------------------+ - | make_sfc_climo | Pre-processing task to generate surface climatology files. | - | | Can be run, at most, once per experiment. | - +----------------------+------------------------------------------------------------+ - | get_extrn_ics | Cycle-specific task to obtain external data for the | - | | initial conditions | - +----------------------+------------------------------------------------------------+ - | get_extrn_lbcs | Cycle-specific task to obtain external data for the | - | | lateral boundary (LB) conditions | - +----------------------+------------------------------------------------------------+ - | make_ics | Generate initial conditions from the external data | - +----------------------+------------------------------------------------------------+ - | make_lbcs | Generate lateral boundary conditions from the external data| - +----------------------+------------------------------------------------------------+ - | run_fcst | Run the forecast model (UFS weather model) | - +----------------------+------------------------------------------------------------+ - | run_post | Run the post-processing tool (UPP) | - +----------------------+------------------------------------------------------------+ - -Launch of Workflow -================== -There are two ways to launch the workflow using Rocoto: (1) with the ``launch_FV3LAM_wflow.sh`` -script, and (2) manually calling the ``rocotorun`` command. Moreover, you can run the workflow -separately using stand-alone scripts. - -An environment variable may be set to navigate to the ``$EXPTDIR`` more easily. 
If the login -shell is bash, it can be set as follows: - -.. code-block:: console - - export EXPTDIR=/path-to-experiment/directory - -Or if the login shell is csh/tcsh, it can be set using: - -.. code-block:: console - - setenv EXPTDIR /path-to-experiment/directory - -Launch with the ``launch_FV3LAM_wflow.sh`` script -------------------------------------------------- -To launch the ``launch_FV3LAM_wflow.sh`` script, simply call it without any arguments as follows: - -.. code-block:: console - - cd ${EXPTDIR} - ./launch_FV3LAM_wflow.sh - -This script creates a log file named ``log.launch_FV3LAM_wflow`` in the EXPTDIR directory -(described in :numref:`Section %s `) or appends to it if it already exists. -You can check the contents of the end of the log file (e.g. last 30 lines) using the command: - -.. code-block:: console - - tail -n 30 log.launch_FV3LAM_wflow - -This command will print out the status of the workflow tasks as follows: - -.. code-block:: console - - CYCLE TASK JOBID STATE EXIT STATUS TRIES DURATION - ====================================================================================================== - 202006170000 make_grid druby://hfe01:33728 SUBMITTING - 0 0.0 - 202006170000 make_orog - - - - - - 202006170000 make_sfc_climo - - - - - - 202006170000 get_extrn_ics druby://hfe01:33728 SUBMITTING - 0 0.0 - 202006170000 get_extrn_lbcs druby://hfe01:33728 SUBMITTING - 0 0.0 - 202006170000 make_ics - - - - - - 202006170000 make_lbcs - - - - - - 202006170000 run_fcst - - - - - - 202006170000 run_post_00 - - - - - - 202006170000 run_post_01 - - - - - - 202006170000 run_post_02 - - - - - - 202006170000 run_post_03 - - - - - - 202006170000 run_post_04 - - - - - - 202006170000 run_post_05 - - - - - - 202006170000 run_post_06 - - - - - - - Summary of workflow status: - ~~~~~~~~~~~~~~~~~~~~~~~~~~ - - 0 out of 1 cycles completed. - Workflow status: IN PROGRESS - -Error messages for each task can be found in the task log files located in the ``EXPTDIR/log`` directory. In order to launch more tasks in the workflow, you just need to call the launch script again: - -.. code-block:: console - - ./launch_FV3LAM_wflow - -If everything goes smoothly, you will eventually get the following workflow status table as follows: - -.. code-block:: console - - CYCLE TASK JOBID STATE EXIT STATUS TRIES DURATION - ====================================================================================================== - 202006170000 make_grid 8854765 SUCCEEDED 0 1 6.0 - 202006170000 make_orog 8854809 SUCCEEDED 0 1 27.0 - 202006170000 make_sfc_climo 8854849 SUCCEEDED 0 1 36.0 - 202006170000 get_extrn_ics 8854763 SUCCEEDED 0 1 54.0 - 202006170000 get_extrn_lbcs 8854764 SUCCEEDED 0 1 61.0 - 202006170000 make_ics 8854914 SUCCEEDED 0 1 119.0 - 202006170000 make_lbcs 8854913 SUCCEEDED 0 1 98.0 - 202006170000 run_fcst 8854992 SUCCEEDED 0 1 655.0 - 202006170000 run_post_00 8855459 SUCCEEDED 0 1 6.0 - 202006170000 run_post_01 8855460 SUCCEEDED 0 1 6.0 - 202006170000 run_post_02 8855461 SUCCEEDED 0 1 6.0 - 202006170000 run_post_03 8855462 SUCCEEDED 0 1 6.0 - 202006170000 run_post_04 8855463 SUCCEEDED 0 1 6.0 - 202006170000 run_post_05 8855464 SUCCEEDED 0 1 6.0 - 202006170000 run_post_06 8855465 SUCCEEDED 0 1 6.0 - -If all the tasks complete successfully, the workflow status in the log file will include the word “SUCCESS." -Otherwise, the workflow status will include the word “FAILURE." 
- -Manually launch by calling the ``rocotorun`` command ----------------------------------------------------- -To launch the workflow manually, the ``rocoto`` module should be loaded: - -.. code-block:: console - - module use rocoto - module load rocoto - -Then, launch the workflow: - -.. code-block:: console - - cd ${EXPTDIR} - rocotorun -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10 - -To check the status of the workflow, issue a ``rocotostat`` command: - -.. code-block:: console - - rocotostat -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10 - -Wait a few seconds and issue a second set of ``rocotorun`` and ``rocotostat`` commands: - -.. code-block:: console - - rocotorun -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10 - rocotostat -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10 - - - diff --git a/docs/UsersGuide/source/index.rst b/docs/UsersGuide/source/index.rst index ad0d16b25e..e66471d1e6 100644 --- a/docs/UsersGuide/source/index.rst +++ b/docs/UsersGuide/source/index.rst @@ -12,9 +12,8 @@ UFS Short-Range Weather App Users Guide Introduction - SRWAppOverview - Quickstart_Container - Quickstart_NonContainer + Quickstart + BuildRunSRW Components Include-HPCInstall InputOutputFiles From fc1a1d4cc40316355773996b74813dbf781d03c5 Mon Sep 17 00:00:00 2001 From: gspetro Date: Thu, 17 Mar 2022 09:58:11 -0400 Subject: [PATCH 054/118] minor edits for Intro & QS --- docs/UsersGuide/source/Components.rst | 1 + docs/UsersGuide/source/FAQ.rst | 2 +- docs/UsersGuide/source/Glossary.rst | 8 +- docs/UsersGuide/source/Introduction.rst | 158 ++++++++++++------------ docs/UsersGuide/source/Quickstart.rst | 72 +++++------ 5 files changed, 124 insertions(+), 117 deletions(-) diff --git a/docs/UsersGuide/source/Components.rst b/docs/UsersGuide/source/Components.rst index bd8b22c254..ed3eff1f2a 100644 --- a/docs/UsersGuide/source/Components.rst +++ b/docs/UsersGuide/source/Components.rst @@ -13,6 +13,7 @@ The SRW Application v2.0 release assembles a variety of components, including: These components are documented within this User's Guide and supported through a `community forum `_. +.. _Utils: Pre-processor Utilities and Initial Conditions ============================================== diff --git a/docs/UsersGuide/source/FAQ.rst b/docs/UsersGuide/source/FAQ.rst index 636e27ad21..50d3141006 100644 --- a/docs/UsersGuide/source/FAQ.rst +++ b/docs/UsersGuide/source/FAQ.rst @@ -86,7 +86,7 @@ command: .. code-block:: console - rocotorewind -w FV3SAR_wflow.xml -d FV3SAR_wflow.db -v 10 -c 201905200000 -t get_extrn_ics + rocotorewind -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10 -c 201905200000 -t get_extrn_ics where ``-c`` specifies the cycle date (first column of rocotostat output) and ``-t`` represents the task name (second column of rocotostat output). After using ``rocotorewind``, the next time ``rocotorun`` is used to diff --git a/docs/UsersGuide/source/Glossary.rst b/docs/UsersGuide/source/Glossary.rst index 7099a6b1a2..ebded5df4a 100644 --- a/docs/UsersGuide/source/Glossary.rst +++ b/docs/UsersGuide/source/Glossary.rst @@ -15,6 +15,9 @@ Glossary Component Repository A :term:`repository` that contains, at a minimum, source code for a single component. + Container + `Docker `__ describes a container as "a standard unit of software that packages up code and all its dependencies so the application runs quickly and reliably from one computing environment to another." 
+ CONUS Continental United States @@ -25,6 +28,9 @@ Glossary dynamical core Global atmospheric model based on fluid dynamics principles, including Euler's equations of motion. + EPIC + EPIC stands for the `Earth Prediction Innovation Center `__. EPIC seeks to accelerate scientific research and modeling contributions through continuous and sustained community engagement to produce the most accurate and reliable operational modeling system in the world. + FV3 The Finite-Volume Cubed-Sphere dynamical core (dycore). Developed at NOAA's Geophysical Fluid Dynamics Laboratory (GFDL), it is a scalable and flexible dycore capable of both @@ -78,7 +84,7 @@ Glossary NEMSIO A binary format for atmospheric model output from :term:`NCEP`'s Global Forecast System (GFS). - NWP (Numerical Weather Prediction) + NWP Numerical Weather Prediction (NWP) takes current observations of weather and processes them with computer models to forecast the future state of the weather. Orography diff --git a/docs/UsersGuide/source/Introduction.rst b/docs/UsersGuide/source/Introduction.rst index cd8f699da7..0cd418f402 100644 --- a/docs/UsersGuide/source/Introduction.rst +++ b/docs/UsersGuide/source/Introduction.rst @@ -33,7 +33,61 @@ File paths or code that include angle brackets (e.g., ``build__` for beginners or refer to the in-depth chapter on :ref:`Running the Short-Range Weather Application `. * For background information on the SRW code repositories and directory structure, see :numref:`Section %s ` below. - * For an outline of SRW components, see section :numref:`Section %s ` below or refer to :numref:`Chapter %s ` for a more in-depth treatment. + * For an outline of SRW components, see section :numref:`Section %s ` below or refer to :numref:`Chapter %s ` for a more in-depth treatment. + + +.. _ComponentsOverview: + +SRW Components Overview +============================ + +Pre-processor Utilities and Initial Conditions +------------------------------------------------ + +The SRW Application includes a number of pre-processing utilities that initialize and prepare the model. Tasks include generating a regional grid along with :term:`orography` and surface climatology files for that grid. The pre-processing software converts the raw external model data into initial and lateral boundary condition files in netCDF format. Later, these files are used as input to the atmospheric model (FV3-LAM). Additional information about the pre-processor utilities can be found in :numref:`Chapter %s ` and in the `UFS_UTILS User’s Guide `_. + + +Forecast Model +----------------- + +Atmospheric Model +^^^^^^^^^^^^^^^^^^^^^^ + +The prognostic atmospheric model in the UFS SRW Application is the Finite-Volume Cubed-Sphere +(:term:`FV3`) dynamical core configured with a Limited Area Model (LAM) capability (:cite:t:`BlackEtAl2020`). +The dynamical core is the computational part of a model that solves the equations of fluid motion. A User’s Guide for the UFS :term:`Weather Model` can be found `here `__. + +Common Community Physics Package +^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ + +The `Common Community Physics Package `_ (:term:`CCPP`) supports interoperable atmospheric physics and Noah Multi-parameterization (Noah MP) Land Surface Model options. Atmospheric physics are a set of numerical methods describing small-scale processes such as clouds, turbulence, radiation, and their interactions. The SRW release includes an experimental physics version and an updated operational version. 
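As a brief illustration (the details are covered in the configuration chapters and vary by experiment), the physics suite is selected in the ``config.sh`` file through the ``CCPP_PHYS_SUITE`` variable; for instance, the out-of-the-box case uses the GFS v15.2 suite:

.. code-block:: console

   CCPP_PHYS_SUITE="FV3_GFS_v15p2"
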
+ +Data Format +^^^^^^^^^^^^^^^^^^^^^^ + +The SRW App supports the use of both :term:`GRIB2` and :term:`NEMSIO` input data. The UFS Weather Model ingests initial and lateral boundary condition files produced by :term:`chgres_cube`. + + +Unified Post-Processor (UPP) +-------------------------------- + +The `Unified Post Processor `__ (:term:`UPP`) processes raw output from a variety of numerical weather prediction (:term:`NWP`) models. In the SRW, it converts data output formats from netCDF format on the native model grid to GRIB2 format. The UPP can also be used to compute a variety of useful diagnostic fields, as described in the `UPP User’s Guide `_. Output from the UPP can be used with visualization, plotting, and verification packages, or for further downstream post-processing (e.g., statistical post-processing techniques). + + +Visualization Example +------------------------- + +This SRW Application release provides Python scripts to create basic visualizations of the model output. :numref:`Chapter %s ` contains usage information and instructions; instructions also appear at the top of the scripts. + +Build System and Workflow +---------------------------- + +The SRW Application has a portable CMake-based build system that packages together all the components required to build the SRW Application. Once built, users can generate a Rocoto-based workflow that will run each task in the proper sequence (see `Rocoto documentation `__ for more on workflow management). Individual components can also be run in a stand-alone, command line fashion. + +The SRW Application allows for configuration of various elements of the workflow. For example, users can modify the parameters of the atmospheric model, such as start and end dates, duration, time step, and the physics suite for the forecast. + +This SRW Application release has been tested on a variety of platforms widely used by researchers, including NOAA High-Performance Computing (HPC) systems (e.g. Hera, Orion), cloud environments, and generic Linux and macOS systems. Four `levels of support `_ have been defined for the SRW Application. Preconfigured (Level 1) systems already have the required external libraries (e.g., NCEPLIBS) available in a central location. The SRW Application is expected to build and run out-of-the-box on these systems, and users can :ref:`download the SRW code ` without first installing prerequisites. On other platforms, the SRW can be :ref:`run within a container ` that includes the HPC-Stack, or the required libraries will need to be installed as part of the :ref:`SRW build ` process. Once these prerequisite libraries are installed, applications and models should build and run successfully. However, users may need to perform additional troubleshooting on Level 3 or 4 systems since little or no pre-release testing has been conducted on these systems. + .. _SRWStructure: @@ -127,7 +181,7 @@ The ``ufs-srweather-app`` :term:`umbrella repository` structure is determined by Regional Workflow Sub-Directories ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -A number of sub-directories are created under the ``regional_workflow`` directory when the regional workflow is cloned (see directory diagram :ref:`above `). The contents of these sub-directories are described in :numref:`Table %s `. +A number of sub-directories are created under the ``regional_workflow`` directory when the regional workflow is cloned (see directory diagram :ref:`above `). :numref:`Table %s ` describes the contents of these sub-directories. .. 
_Subdirectories: @@ -154,7 +208,7 @@ A number of sub-directories are created under the ``regional_workflow`` director Experiment Directory Structure -------------------------------- -When the user generates an experiment using the ``generate_FV3LAM_wflow.sh`` script (:numref:`Step %s `), a user-defined experimental directory is created based on information specified in the ``config.sh`` file. :numref:`Table %s ` shows the contents of the experiment directory before the experiment workflow is run. +When the user generates an experiment using the ``generate_FV3LAM_wflow.sh`` script (:numref:`Step %s `), a user-defined experimental directory (``EXPTDIR``) is created based on information specified in the ``config.sh`` file. :numref:`Table %s ` shows the contents of the experiment directory before running the experiment workflow. .. _ExptDirStructure: @@ -201,9 +255,7 @@ When the user generates an experiment using the ``generate_FV3LAM_wflow.sh`` scr | YYYYMMDDHH | Cycle directory (empty) | +---------------------------+-------------------------------------------------------------------------------------------------------+ -In addition, the *community* mode creates the ``fix_am`` and ``fix_lam`` directories in ``EXPTDIR``. -The ``fix_lam`` directory is initially empty but will contain some *fix* (time-independent) files -after the grid, orography, and/or surface climatology generation tasks are run. +In addition, running the SRW in *community* mode creates the ``fix_am`` and ``fix_lam`` directories in ``EXPTDIR``. The ``fix_lam`` directory is initially empty but will contain some *fix* (time-independent) files after the grid, orography, and/or surface climatology generation tasks are run. .. _FixDirectories: @@ -212,7 +264,7 @@ after the grid, orography, and/or surface climatology generation tasks are run. +-------------------------+----------------------------------------------------------+ | **Directory Name** | **Description** | +=========================+==========================================================+ - | fix_am | Directory containing the global `fix` (time-independent) | + | fix_am | Directory containing the global fix (time-independent) | | | data files. The experiment generation script copies | | | these files from a machine-dependent system directory. | +-------------------------+----------------------------------------------------------+ @@ -223,33 +275,31 @@ after the grid, orography, and/or surface climatology generation tasks are run. +-------------------------+----------------------------------------------------------+ Once the workflow is launched with the ``launch_FV3LAM_wflow.sh`` script, a log file named -``log.launch_FV3LAM_wflow`` will be created (or appended to it if it already exists) in ``EXPTDIR``. -Once the ``make_grid``, ``make_orog``, and ``make_sfc_climo`` tasks and the ``get_extrn_ics`` -and ``get_extrn_lbc`` tasks for the YYYYMMDDHH cycle have completed successfully, new files and -sub-directories are created, as described in :numref:`Table %s `. +``log.launch_FV3LAM_wflow`` will be created (unless it already exists) in ``EXPTDIR``. The first several workflow tasks (i.e., ``make_grid``, ``make_orog``, ``make_sfc_climo``, ``get_extrn_ics``, and ``get_extrn_lbc``) are preprocessing tasks, which result in the creation of new files and +sub-directories, described in :numref:`Table %s `. .. _CreatedByWorkflow: -.. table:: New directories and files created when the workflow is launched. +.. 
table:: New directories and files created when the workflow is launched :widths: 30 70 +---------------------------+--------------------------------------------------------------------+ - | **Directory/file Name** | **Description** | + | **Directory/File Name** | **Description** | +===========================+====================================================================+ - | YYYYMMDDHH | This is updated when the first cycle-specific workflow tasks are | - | | run, which are ``get_extrn_ics`` and ``get_extrn_lbcs`` (they are | - | | launched simultaneously for each cycle in the experiment). We | - | | refer to this as a “cycle directory”. Cycle directories are | - | | created to contain cycle-specific files for each cycle that the | - | | experiment runs. If ``DATE_FIRST_CYCL`` and ``DATE_LAST_CYCL`` | - | | were different, and/or ``CYCL_HRS`` contained more than one | - | | element in the ``config.sh`` file, then more than one cycle | - | | directory would be created under the experiment directory. | + | YYYYMMDDHH | This is a “cycle directory” that is updated when the first | + | | cycle-specific workflow tasks (``get_extrn_ics`` and | + | | ``get_extrn_lbcs``) are run. These tasks are launched | + | | simultaneously for each cycle in the experiment. Cycle directories | + | | are created to contain cycle-specific files for each cycle that | + | | the experiment runs. If ``DATE_FIRST_CYCL`` and ``DATE_LAST_CYCL`` | + | | are different, and/or if ``CYCL_HRS`` contains more than one | + | | element in the ``config.sh`` file, more than one cycle directory | + | | will be created under the experiment directory. | +---------------------------+--------------------------------------------------------------------+ - | grid | Directory generated by the ``make_grid`` task containing grid | - | | files for the experiment | + | grid | Directory generated by the ``make_grid`` task to store grid files | + | | for the experiment | +---------------------------+--------------------------------------------------------------------+ - | log | Contains log files generated by the overall workflow and its | + | log | Contains log files generated by the overall workflow and by its | | | various tasks. Look in these files to trace why a task may have | | | failed. | +---------------------------+--------------------------------------------------------------------+ @@ -262,65 +312,15 @@ sub-directories are created, as described in :numref:`Table %s `. The workflow tasks are described in :numref:`Section %s `). -.. _Utilities: - -SRW Component Summary: Pre-processor Utilities and Initial Conditions -========================================================================= - -The SRW Application includes a number of pre-processing utilities that initialize and prepare the model. Tasks include generating a regional grid along with :term:`orography` and surface climatology files for that grid. The pre-processing software converts the raw external model data into initial and lateral boundary condition files in netCDF format. Later, this is used as input to the atmospheric model (FV3-LAM). Additional information about the UFS pre-processor utilities can be found in the `UFS_UTILS User’s Guide `_. - - -Forecast Model ------------------ - -Atmospheric Model -^^^^^^^^^^^^^^^^^^^^^^ - -The prognostic atmospheric model in the UFS SRW Application is the Finite-Volume Cubed-Sphere -(:term:`FV3`) dynamical core configured with a Limited Area Model (LAM) capability (:cite:t:`BlackEtAl2020`). 
-The dynamical core is the computational part of a model that solves the equations of fluid motion. A User’s Guide for the UFS :term:`Weather Model` can be found `here `__. - -Common Community Physics Package -^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -The `Common Community Physics Package `_ (:term:`CCPP`) supports interoperable atmospheric physics and Noah Multi-parameterization (Noah MP) Land Surface Model options. Atmospheric physics are a set of numerical methods describing small-scale processes such as clouds, turbulence, radiation, and their interactions. The SRW release includes an experimental physics version and an updated operational version. - -Data Format -^^^^^^^^^^^^^^^^^^^^^^ - -The SRW App supports the use of both :term:`GRIB2` and :term:`NEMSIO` input data. The UFS Weather Model ingests initial and lateral boundary condition files produced by :term:`chgres_cube`. - - -Unified Post-Processor (UPP) --------------------------------- - -The `Unified Post Processor `__ (:term:`UPP`) is included in the SRW Application workflow. The UPP is designed to generate useful products from raw model output. In the SRW, it converts data output formats from netCDF format on the native model grid to GRIB2 format. The UPP can also be used to compute a variety of useful diagnostic fields, as described in the `UPP User’s Guide `_. Output from the UPP can be used with visualization, plotting, and verification packages, or for further downstream post-processing (e.g., statistical post-processing techniques). - - -Visualization Example -------------------------- - -This SRW Application provides Python scripts to create basic visualizations of the model output. Usage information and instructions are described in :numref:`Chapter %s ` and are also included at the top of the script. - -Build System and Workflow ----------------------------- - -The SRW Application has a portable CMake-based build system that packages together all the components required to build the SRW Application. Once built, users can generate a Rocoto-based workflow that will run each task in the proper sequence (see `Rocoto documentation `__ for more on workflow management). Individual components can also be run in a stand-alone, command line fashion. - -The SRW Application allows for configuration of various elements of the workflow. For example, users can modify the parameters of the atmospheric model, such as start and end dates, duration, time step, and the physics suite for the forecast. - -This SRW Application release has been tested on a variety of platforms widely used by researchers, including NOAA High-Performance Computing (HPC) systems (e.g. Hera, Orion), cloud environments, and generic Linux and macOS systems. Four `levels of support `_ have been defined for the SRW Application, including pre-configured (Level 1), configurable (Level 2), limited test platforms (Level 3), and build-only platforms (Level 4). Preconfigured (Level 1) systems already have the required external libraries (e.g., NCEPLIBS) available in a central location. The SRW Application is expected to build and run out-of-the-box on these systems, and users can :ref:`download the SRW code ` without first installing prerequisites. On other platforms, the SRW must be :ref:`run within a container ` that contains the HPC-Stack, or the required libraries (i.e., HPC-Stack) will need to be installed as part of the :ref:`non-container `) SRW installation process. 
Once these prerequisite libraries are built, applications and models should build and run successfully. However, users may need to perform additional troubleshooting on Level 3 or 4 systems since little or no pre-release testing has been conducted on these systems. - User Support, Documentation, and Contributions to Development =============================================================== @@ -330,7 +330,7 @@ A list of available documentation is shown in :numref:`Table %s ` may be more appropriate for those users who desire additional customizability, particularly if they already have experience running the SRW App. +This Quick Start Guide will help users to build and run the "out-of-the-box" case for the Unified Forecast System (:term:`UFS`) Short-Range Weather (SRW) Application using a :term:`container`. The container approach provides a uniform enviroment in which to build and run the SRW App. Normally, the details of building and running the SRW App vary from system to system due to the many possible combinations of operating systems, compilers, :term:`MPI`’s, and package versions available. Installation via an :term:`EPIC`-provided container reduces this variability and allows for a smoother SRW build and run experience. However, the :ref:`non-container approach ` may be more appropriate for those users who desire additional customizability, particularly if they already have experience running the SRW. -The "out-of-the-box" SRW case described in this guide builds a weather forecast for June 15-16, 2019. Multiple convective weather events during these two days produced over 200 filtered storm reports. Severe weather was clustered in two areas: the Upper Midwest through the Ohio Valley and the Southern Great Plains. This forecast uses a predefined 25-km Continental United States (:term:`CONUS`) grid (RRFS_CONUS_25km), the Global Forecast System (:term:`GFS`) version 15.2 physics suite (FV3_GFS_v15p2 CCPP), and :term:`FV3`-based GFS raw external model data for initialization. +The "out-of-the-box" SRW case described in this User's Guide builds a weather forecast for June 15-16, 2019. Multiple convective weather events during these two days produced over 200 filtered storm reports. Severe weather was clustered in two areas: the Upper Midwest through the Ohio Valley and the Southern Great Plains. This forecast uses a predefined 25-km Continental United States (:term:`CONUS`) grid (RRFS_CONUS_25km), the Global Forecast System (:term:`GFS`) version 15.2 physics suite (FV3_GFS_v15p2 CCPP), and :term:`FV3`-based GFS raw external model data for initialization. + +.. attention:: + + The UFS defines `four platform levels `_. The steps described in this chapter will work most smoothly on preconfigured (Level 1) systems. However, this guide can serve as a starting point for running the SRW App on other systems, too. .. _DownloadCodeC: Building the UFS SRW Application -=========================================== -The SRW Application source code is publicly available on GitHub and can be run in a container or locally, depending on user preference. The SRW Application relies on a variety of components detailed in the :ref:`Components Chapter ` of this User's Guide. Users must (1) clone the UFS SRW Application umbrella repository and then (2) run the ``checkout_externals`` script to link the necessary external repositories to the SRW App. 
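In practice, obtaining the code typically amounts to a short sequence of commands along the following lines. This is a sketch only: the repository address is the GitHub location referenced in this guide, and users may need to check out a specific release branch or tag rather than the default branch.

.. code-block:: console

   git clone https://github.com/ufs-community/ufs-srweather-app.git
   cd ufs-srweather-app
   ./manage_externals/checkout_externals
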
The ``checkout_externals`` script uses the configuration file ``Externals.cfg`` in the top level directory of the SRW App and will clone the correct version of the regional workflow, pre-processing utilities, UFS Weather Model, and UPP source code into the appropriate directories under the ``regional_workflow`` and ``src`` directories. +=========================================== Prerequisites: Install Singularity ------------------------------------ @@ -26,7 +29,7 @@ To build and run the SRW App using a Singularity container, first install the Si Working in the Cloud ----------------------- -For those working on non-cloud-based systems, skip to :numref:`Step %s `. Users building the SRW using NOAA's Cloud resources must complete a few additional steps to ensure the SRW builds and runs correctly. +For those working on non-cloud-based systems, skip to :numref:`Step %s `. Users building the SRW using NOAA's Cloud resources must complete a few additional steps to ensure that the SRW builds and runs correctly. On NOAA Cloud systems, certain environment variables must be set *before* building the container: @@ -36,9 +39,10 @@ On NOAA Cloud systems, certain environment variables must be set *before* buildi export SINGULARITY_CACHEDIR=/lustre/cache export SINGULARITY_TEMPDIR=/lustre/tmp -* If the ``cache`` and ``tmp`` directories do not exist already, they must be created. +If the ``cache`` and ``tmp`` directories do not exist already, they must be created with a ``mkdir`` command. -* ``/lustre`` is a fast but non-persistent file system used on NOAA cloud systems. To retain work completed in this directory, `tar the files `__ and move it to the ``/contrib`` directory, which is much slower but persistent. +.. note:: + ``/lustre`` is a fast but non-persistent file system used on NOAA cloud systems. To retain work completed in this directory, `tar the files `__ and move them to the ``/contrib`` directory, which is much slower but persistent. .. _WorkOnHPC: @@ -46,7 +50,7 @@ Working on HPC Systems -------------------------- Those *not* working on HPC systems may skip to the :ref:`next step `. -On HPC systems (including NOAA's Cloud platforms), allocate a compute node on which to run the SRW. On NOAA's Cloud platforms, the following commands should work: +On HPC systems (including NOAA's Cloud platforms), allocate a compute node on which to run the SRW. On NOAA's Cloud platforms, the following commands will allocate a compute node: .. code-block:: console @@ -55,7 +59,7 @@ On HPC systems (including NOAA's Cloud platforms), allocate a compute node on wh mpirun -n 1 hostname ssh -The third command will output a hostname. This hostname should replace ```` in the last command. After "ssh-ing" to the compute node in the last command, build and run the SRW from that node. +The third command will output a hostname. Replace ```` in the last command with the output from the third command. After "ssh-ing" to the compute node in the last command, build and run the SRW from that node. The appropriate commands on other Level 1 platforms will vary, and users should consult the documentation for those platforms. @@ -79,7 +83,7 @@ Start the container and run an interactive shell within it: singularity shell -e --writable --bind /:/ ubuntu20.04-epic-srwapp-1.0 -The command above also binds the local directory to the container so that data can be shared between them. On NOAA systems, the local directory is usually the topmost directory (e.g., /lustre, /contrib, /work, or /home). 
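For instance, on a system whose top-level directory is ``/lustre``, the ``singularity shell`` command shown above (with the same container image name) would take a form like the following; the bound directory here is only an example:

.. code-block:: console

   singularity shell -e --writable --bind /lustre:/lustre ubuntu20.04-epic-srwapp-1.0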
Additional directories can be bound by adding another ``--bind /:/`` argument before the name of the container. +The command above also binds the local directory to the container so that data can be shared between them. On Level 1 systems, ```` is usually the topmost directory (e.g., /lustre, /contrib, /work, or /home). Additional directories can be bound by adding another ``--bind /:/`` argument before the name of the container. .. attention:: * When binding two directories, they must have the same name. It may be necessary to ``cd`` into the container and create an appropriately named directory in the container using the ``mkdir`` command if one is not already there. @@ -91,7 +95,7 @@ The command above also binds the local directory to the container so that data c Set up the Build Environment ============================ -If the SRW Application has been built in a container provided by the Earth Prediction Innovation Center (EPIC), set build environments and modules within the ``ufs-srweather-app`` directory as follows: +Set the build environments and modules within the ``ufs-srweather-app`` directory as follows: .. code-block:: console @@ -118,17 +122,17 @@ Download and Stage the Data ============================ The SRW requires input files to run. These include static datasets, initial and boundary condition -files, and model configuration files. On Level 1 and 2 systems, the data required to run SRW tests are already available. For Level 3 and 4 systems, the data must be added. Detailed instructions on how to add the data can be found in the :numref:`Section %s Downloading and Staging Input Data `. :numref:`Sections %s ` and :numref:`%s ` contain useful background information on the input and output files used in the SRW. +files, and model configuration files. On Level 1 and 2 systems, the data required to run SRW tests are already available. For Level 3 and 4 systems, the data must be added. Detailed instructions on how to add the data can be found in the :numref:`Section %s `. :numref:`Sections %s ` and :numref:`%s ` contain useful background information on the input and output files used in the SRW. .. _GenerateForecastC: Generate the Forecast Experiment ================================= -Generating the forecast experiment requires three steps: +To generate the forecast experiment, users must: -* :ref:`Set experiment parameters ` -* :ref:`Set Python and other environment parameters ` -* :ref:`Run a script to generate the experiment workflow ` +#. :ref:`Set experiment parameters ` +#. :ref:`Set Python and other environment parameters ` +#. :ref:`Run a script to generate the experiment workflow ` The first two steps depend on the platform being used and are described here for each Level 1 platform. Users will need to adjust the instructions to their machine if they are working on a Level 2-4 platform. @@ -138,7 +142,7 @@ Set the Experiment Parameters ------------------------------- Each experiment requires certain basic information to run (e.g., date, grid, physics suite). This information is specified in the ``config.sh`` file. Two example ``config.sh`` templates are provided: ``config.community.sh`` and ``config.nco.sh``. They can be found in the ``ufs-srweather-app/regional_workflow/ush`` directory. The first file is a minimal example for creating and running an experiment in the *community* mode (with ``RUN_ENVIR`` set to ``community``). The second is an example for creating and running an experiment in the *NCO* (operational) mode (with ``RUN_ENVIR`` set to ``nco``). 
The *community* mode is recommended in most cases and will be fully supported for this release. -Make a copy of ``config.community.sh`` to get started (under ``/regional_workflow/ush``). From the ``ufs-srweather-app`` directory, run: +Make a copy of ``config.community.sh`` to get started. From the ``ufs-srweather-app`` directory, run the following commands: .. code-block:: console @@ -157,7 +161,7 @@ Next, edit the new ``config.sh`` file to customize it for your experiment. At a EXPT_BASEDIR="/home/$USER/expt_dirs" COMPILER="gnu" -Additionally, set ``USE_USER_STAGED_EXTRN_FILES="TRUE"``, and add the correct paths to the data. The following is a sample for a 48-hour forecast: +Additionally, set ``USE_USER_STAGED_EXTRN_FILES="TRUE"``, and add the correct paths to the data. The following is a sample for a 24-hour forecast: .. code-block:: @@ -165,7 +169,7 @@ Additionally, set ``USE_USER_STAGED_EXTRN_FILES="TRUE"``, and add the correct pa EXTRN_MDL_SOURCE_BASEDIR_ICS="/path/to/model_data/FV3GFS" EXTRN_MDL_FILES_ICS=( "gfs.pgrb2.0p25.f000" ) EXTRN_MDL_SOURCE_BASEDIR_LBCS="/path/to/model_data/FV3GFS" - EXTRN_MDL_FILES_LBCS=( "gfs.pgrb2.0p25.f006" "gfs.pgrb2.0p25.f012" "gfs.pgrb2.0p25.f018" "gfs.pgrb2.0p25.f024" \ "gfs.pgrb2.0p25.f030" "gfs.pgrb2.0p25.f036" "gfs.pgrb2.0p25.f042" "gfs.pgrb2.0p25.f048" ) + EXTRN_MDL_FILES_LBCS=( "gfs.pgrb2.0p25.f006" "gfs.pgrb2.0p25.f012" "gfs.pgrb2.0p25.f018" "gfs.pgrb2.0p25.f024" ) On Level 1 systems, ``/path/to/model_data/FV3GFS`` should correspond to the location of the machine's global data. Alternatively, the user can add the path to their local data if they downloaded it as described in :numref:`Step %s `. @@ -185,7 +189,7 @@ On NOAA Cloud platforms, users may continue to the :ref:`next step ` on Level 1 systems. Detailed guidance applicable to all systems can be found in :numref:`Chapter %s: Configuring the Workflow `, which discusses each variable and the options available. Additionally, information about the three predefined Limited Area Model (LAM) Grid options can be found in :numref:`Chapter %s: Limited Area Model (LAM) Grids `. +On Level 1 systems, it should be possible to continue to the :ref:`next step ` after changing the settings above. Detailed guidance applicable to all systems can be found in :numref:`Chapter %s: Configuring the Workflow `, which discusses each variable and the options available. For users interested in experimenting with a different grid, information about the three predefined Limited Area Model (LAM) Grid options can be found in :numref:`Chapter %s: Limited Area Model (LAM) Grids `. .. _SetUpPythonEnvC: Activate the Regional Workflow ---------------------------------------------- -Next, activate the regional workflow. +Next, activate the regional workflow: .. code-block:: console @@ -227,7 +231,7 @@ Run the following command to generate the workflow: This workflow generation script creates an experiment directory and populates it with all the data needed to run through the workflow. The last line of output from this script should start with ``*/1 * * * *`` or ``*/3 * * * *``. -The generated workflow will be in the experiment directory specified in the ``config.sh`` file in :numref:`Step %s `. The settings for these paths can also be viewed in the console output from the ``./generate_FV3LAM_wflow.sh`` script or in the ``log.generate_FV3LAM_wflow`` file, which can be found in the experiment directory. +The generated workflow will be in the experiment directory specified in the ``config.sh`` file in :numref:`Step %s `. .. 
_RunUsingStandaloneScripts: @@ -237,7 +241,7 @@ Run the Workflow Using Stand-Alone Scripts .. note:: The Rocoto workflow manager cannot be used inside a container. -The regional workflow can be run using standalone shell scripts if the Rocoto software is not available on a given platform. If Rocoto *is* available, see `Section %s ` to run the workflow using Rocoto. +The regional workflow can be run using standalone shell scripts in cases where the Rocoto software is not available on a given platform. If Rocoto *is* available, see :numref:`Section %s ` to run the workflow using Rocoto. #. ``cd`` into the experiment directory @@ -248,13 +252,13 @@ The regional workflow can be run using standalone shell scripts if the Rocoto so export EXPTDIR=`pwd` setenv EXPTDIR `pwd` -#. Copy the wrapper scripts from the regional_workflow directory into your experiment directory. Each workflow task has a wrapper script that sets environment variables and run the job script. +#. Copy the wrapper scripts from the regional_workflow directory into the experiment directory. Each workflow task has a wrapper script that sets environment variables and runs the job script. .. code-block:: console cp ufs-srweather-app/regional_workflow/ush/wrappers/* . -#. Set the OMP_NUM_THREADS variable and fix dash/bash shell issue (this ensures the system does not use an alias of ``sh`` to dash). +#. Set the ``OMP_NUM_THREADS`` variable and fix dash/bash shell issue (this ensures the system does not use an alias of ``sh`` to dash). .. code-block:: console @@ -277,13 +281,6 @@ The regional workflow can be run using standalone shell scripts if the Rocoto so Check the batch script output file in your experiment directory for a “SUCCESS” message near the end of the file. -.. hint:: - If any of the scripts return an error that "Primary job terminated normally, but one process returned a non-zero exit code," there may not be enough space on one node to run the process. On an HPC system, the user will need to allocate a(nother) compute node. The process for doing so is system-dependent, and users should check the documentation available for their HPC system. Instructions for allocating a compute node on NOAA Cloud systems can be viewed in the :numref:`Step %s ` as an example. - -.. note:: - #. On most HPC systems, users will need to submit a batch job to run multi-processor jobs. On some HPC systems, users may be able to run the first two jobs (serial) on a login node/command-line. Example scripts for Slurm (Hera) and PBS (Cheyenne) resource managers are provided. These will need to be adapted to each user's system. This submit batch script is hard-coded per task, so it will need to be modified or copied to run each task. - - .. _RegionalWflowTasks: .. table:: List of tasks in the regional workflow in the order that they are executed. @@ -317,9 +314,12 @@ Check the batch script output file in your experiment directory for a “SUCCESS | | | | forecast hour) | +------------+------------------------+----------------+----------------------------+ -Example batch-submit scripts for Hera (Slurm) and Cheyenne (PBS) are included (``sq_job.sh`` -and ``qsub_job.sh``, respectively). These examples set the build and run environment for Hera or Cheyenne so that run-time libraries match the compiled libraries (i.e. netCDF, MPI). Users may either modify the submit batch script as each task is submitted, or duplicate this batch wrapper -for their system settings for each task. 
Alternatively, some batch systems allow users to specify most of the settings on the command line (with the ``sbatch`` or ``qsub`` command, for example). This piece will be unique to your platform. The tasks run by the regional workflow are shown in :numref:`Table %s `. Tasks with the same stage level may be run concurrently (no dependency). + +.. hint:: + If any of the scripts return an error that "Primary job terminated normally, but one process returned a non-zero exit code," there may not be enough space on one node to run the process. On an HPC system, the user will need to allocate a(nother) compute node. The process for doing so is system-dependent, and users should check the documentation available for their HPC system. Instructions for allocating a compute node on NOAA Cloud systems can be viewed in the :numref:`Step %s ` as an example. + +.. note:: + On most HPC systems, users will need to submit a batch job to run multi-processor jobs. On some HPC systems, users may be able to run the first two jobs (serial) on a login node/command-line. Example scripts for Slurm (Hera) and PBS (Cheyenne) resource managers are provided (``sq_job.sh`` and ``qsub_job.sh``, respectively). These examples will need to be adapted to each user's system. Alternatively, some batch systems allow users to specify most of the settings on the command line (with the ``sbatch`` or ``qsub`` command, for example). Plot the Output =============== From acb77c836ac88673c34695a3ecbfdd6274e84c64 Mon Sep 17 00:00:00 2001 From: gspetro Date: Thu, 17 Mar 2022 11:35:37 -0400 Subject: [PATCH 055/118] updates to BuildRun doc through 3.8.1 --- docs/UsersGuide/source/BuildRunSRW.rst | 54 +++++++++++++------------- 1 file changed, 27 insertions(+), 27 deletions(-) diff --git a/docs/UsersGuide/source/BuildRunSRW.rst b/docs/UsersGuide/source/BuildRunSRW.rst index 1bfa8c1e2d..aaf59faca1 100644 --- a/docs/UsersGuide/source/BuildRunSRW.rst +++ b/docs/UsersGuide/source/BuildRunSRW.rst @@ -4,13 +4,13 @@ Building and Running the SRW ===================================== -The UFS Short-Range Weather Application (SRW App) is an :term:`umbrella repository` consisting of a number of different :ref:`components ` housed in external repositories. Once the SRW App is configured and built, users can generate predictions of atmospheric behavior over a limited spatial area and on time scales ranging from minutes out to several days. +The Unified Forecast System (:term:`UFS`) Short-Range Weather (SRW) Application is an :term:`umbrella repository` consisting of a number of different :ref:`components ` housed in external repositories. Once the SRW App is configured and built, users can generate predictions of atmospheric behavior over a limited spatial area and on time scales ranging from minutes out to several days. -This chapter walks users through how to build and run the "out-of-the-box" case for the Unified Forecast System (:term:`UFS`) Short-Range Weather (SRW) Application. However, the steps are relevant to any SRW experiment and can be modified to suit user goals. The "out-of-the-box" SRW case builds a weather forecast for June 15-16, 2019. Multiple convective weather events during these two days produced over 200 filtered storm reports. Severe weather was clustered in two areas: the Upper Midwest through the Ohio Valley and the Southern Great Plains. 
This forecast uses a predefined 25-km Continental United States (:term:`CONUS`) grid (RRFS_CONUS_25km), the Global Forecast System (:term:`GFS`) version 15.2 physics suite (FV3_GFS_v15p2 CCPP), and :term:`FV3`-based GFS raw external model data for initialization. +This chapter walks users through how to build and run the "out-of-the-box" case for the SRW App. However, the steps are relevant to any SRW experiment and can be modified to suit user goals. The "out-of-the-box" SRW case builds a weather forecast for June 15-16, 2019. Multiple convective weather events during these two days produced over 200 filtered storm reports. Severe weather was clustered in two areas: the Upper Midwest through the Ohio Valley and the Southern Great Plains. This forecast uses a predefined 25-km Continental United States (:term:`CONUS`) grid (RRFS_CONUS_25km), the Global Forecast System (:term:`GFS`) version 15.2 physics suite (FV3_GFS_v15p2 CCPP), and :term:`FV3`-based GFS raw external model data for initialization. .. attention:: - The UFS defines `four platform levels `_. The steps described in this chapter will work most smoothly on preconfigured (Level 1) systems. On Level 1 systems, all of the required libraries for building community releases of UFS models and applications are available in a central location. This guide can serve as a starting point for running the SRW App on other systems, too, but the user need to perform additional troubleshooting. + The UFS defines `four platform levels `_. The steps described in this chapter will work most smoothly on preconfigured (Level 1) systems. On Level 1 systems, all of the required libraries for building community releases of UFS models and applications are available in a central location. This guide can serve as a starting point for running the SRW App on other systems, too, but the user may need to perform additional troubleshooting. .. note:: The :ref:`container approach ` is recommended for a smoother build and run experience. Building without a container allows for the use of the Rocoto workflow manager and may allow for more cutomization. However, the non-container approach requires more in-depth troubleshooting skills, especially on Level 3 and 4 systems, and is less appropriate for beginners. @@ -27,14 +27,14 @@ The overall procedure for generating an experiment is shown in :numref:`Figure % * :ref:`Generate a regional workflow experiment ` * :ref:`Configure the experiment parameters ` * :ref:`Load the python environment for the regional workflow ` - * :ref:`Run the regional workflow ` + * :ref:`Run the regional workflow ` * :ref:`Optional: Plot the output ` .. _AppOverallProc: .. figure:: _static/FV3LAM_wflow_overall.png - *Overall layout of the SRW App* + *Overall layout of the SRW App Workflow* .. _HPCstackInfo: @@ -45,7 +45,7 @@ Install the HPC-Stack .. Attention:: Skip the HPC-Stack installation if working on a `Level 1 system `_ (e.g., Cheyenne, Hera, Orion, NOAA Cloud). -**Definition:** :term:`HPC-Stack` is a repository that provides a unified, shell script-based build system and builds the software stack required for the `Unified Forecast System (UFS) `_ and applications. +**Definition:** :term:`HPC-Stack` is a repository that provides a unified, shell script-based build system and builds the software stack required for `UFS `_ applications such as the SRW App. Background ---------------- @@ -54,7 +54,7 @@ The UFS Weather Model draws on over 50 code libraries to run its applications. 
T Instructions ------------------------- -Users working on systems that fall under `Support Levels 2-4 `_ will need to install the HPC-Stack the first time they try to run applications (such as the SRW) or models that depend on it. Users can either build the HPC-stack on their local system or use the centrally maintained stacks on each HPC platform. For a detailed description of installation options, see :ref:`Installing the HPC-Stack `. +Users working on systems that fall under `Support Levels 2-4 `_ will need to install the HPC-Stack the first time they try to run applications (such as the SRW) or models that depend on it. Users can either build the HPC-stack on their local system or use the centrally maintained stacks on each HPC platform if they are working on a Level 1 system. For a detailed description of installation options, see :ref:`Installing the HPC-Stack `. After completing installation, continue to the next section. @@ -62,9 +62,7 @@ After completing installation, continue to the next section. Download the UFS SRW Application Code ===================================== -The SRW Application source code is publicly available on GitHub. It relies on a variety of components detailed in the :ref:`Components Chapter ` of this User's Guide. Users must (1) clone the UFS SRW Application umbrella repository and then (2) run the ``checkout_externals`` script to link the necessary external repositories to the SRW App. The ``checkout_externals`` script uses the configuration file ``Externals.cfg`` in the top level directory of the SRW App and will clone the correct version of the regional workflow, pre-processing utilities, UFS Weather Model, and UPP source code into the appropriate directories under the ``regional_workflow`` and ``src`` directories. - -Clone the release branch of the repository: +The SRW Application source code is publicly available on GitHub. To download the SRW App, clone the release branch of the repository: .. code-block:: console @@ -81,12 +79,12 @@ The cloned repository contains the configuration files and sub-directories shown .. table:: Files and sub-directories of the ufs-srweather-app repository +--------------------------------+--------------------------------------------------------+ - | **File/directory Name** | **Description** | + | **File/Directory Name** | **Description** | +================================+========================================================+ | CMakeLists.txt | Main cmake file for SRW App | +--------------------------------+--------------------------------------------------------+ - | Externals.cfg | Tags of the GitHub repositories/branches for the | - | | external repositories | + | Externals.cfg | Includes tags pointing to the correct version of the | + | | external GitHub repositories/branches used in the SRW. | +--------------------------------+--------------------------------------------------------+ | LICENSE.md | CC0 license information | +--------------------------------+--------------------------------------------------------+ @@ -113,14 +111,16 @@ The cloned repository contains the configuration files and sub-directories shown Check Out External Components ================================ -Next, run the executable that pulls in SRW components from external repositories, including the regional_workflow, ufs-weather-model, ufs_utils, and upp repositories: +The SRW App relies on a variety of components (e.g., regional_workflow, UFS_UTILS, ufs-weather-model, and UPP) detailed in :numref:`Chapter %s ` of this User's Guide. 
Users must run the ``checkout_externals`` script to link the necessary external repositories to the SRW App. The ``checkout_externals`` script uses the configuration file ``Externals.cfg`` in the top level directory of the SRW App to clone the correct tags (code versions) of the external repositories listed in :numref:`Section %s ` into the appropriate directories under the ``regional_workflow`` and ``src`` directories. + +Run the executable that pulls in SRW components from external repositories: .. code-block:: console cd ufs-srweather-app ./manage_externals/checkout_externals -This step will use the configuration file ``Externals.cfg`` in the ``ufs-srweather-app`` directory to clone the correct tags (code versions) of the external repositories listed in :numref:`Section %s `. + .. _SetUpBuild: @@ -137,15 +137,15 @@ Before building the SRW App, the build environment must be set up for the user's -rw-rw-r-- 1 user ral 1228 Oct 9 10:09 build_jet_intel.env ... -On Level 1 systems, the commands in the ``build__.env`` files can be directly copy-pasted into the command line, or the file can be sourced from the ufs-srweather-app ``env`` directory. For example, on Hera, run: +On Level 1 systems, the commands in the ``build__.env`` files can be directly copy-pasted into the command line, or the file can be sourced from the ``ufs-srweather-app/env`` directory. For example, on Hera, run: .. code-block:: source env/build_hera_intel.env -from the main ufs-srweather-app directory to source the appropriate file. +from the main ``ufs-srweather-app`` directory to source the appropriate file. -On Level 2-4 systems, users will need to modify certain environment variables, such as the path to NCEP libraries, so that the SRW App can find and load the appropriate modules. For systems with Lmod installed, one of the current ``build__.env`` files can be copied and used as a template. To check if Lmod is installed, run ``echo $LMOD_PKG``, and see if it outputs a path to the Lmod package. On systems without Lmod, this process will typically involve commands in the form ``export =``. You may need to use ``setenv`` rather than ``export`` depending on your shell environment. +On Level 2-4 systems, users will need to modify certain environment variables, such as the path to NCEP libraries, so that the SRW App can find and load the appropriate modules. For systems with Lmod installed, one of the current ``build__.env`` files can be copied and used as a template. To check if Lmod is installed, run ``echo $LMOD_PKG``, and see if it outputs a path to the Lmod package. On systems without Lmod, users can modify or set the required environment variables using commands in the form ``export =``. Users may need to use ``setenv`` rather than ``export`` depending on their shell environment. .. _BuildExecutables: @@ -201,12 +201,12 @@ The build will take a few minutes to complete. 
When it starts, a random number i +------------------------+---------------------------------------------------------------------------------+ | sfc_climo_gen | Creates surface climatology fields from fixed files for use in ``chgres_cube`` | +------------------------+---------------------------------------------------------------------------------+ - | shave | Shaves the excess halo rows down to what is required for the LBCs in the | - | | orography and grid files | + | shave | Shaves the excess halo rows down to what is required for the lateral boundary | + | | conditions (LBC's) in the orography and grid files | +------------------------+---------------------------------------------------------------------------------+ | vcoord_gen | Generates hybrid coordinate interface profiles | +------------------------+---------------------------------------------------------------------------------+ - | fvcom_to_FV3 | Determine lake surface conditions for the Great Lakes | + | fvcom_to_FV3 | Determines lake surface conditions for the Great Lakes | +------------------------+---------------------------------------------------------------------------------+ | make_hgrid | Computes geo-referencing parameters (e.g., latitude, longitude, grid cell area) | | | for global uniform grids | @@ -237,14 +237,14 @@ Download and Stage the Data ============================ The SRW requires input files to run. These include static datasets, initial and boundary conditions -files, and model configuration files. On Level 1 and 2 systems, the data required to run SRW tests are already available. For Level 3 and 4 systems, the data must be added. Detailed instructions on how to add the data can be found in the :numref:`Section %s Downloading and Staging Input Data `. :numref:`Sections %s ` and :numref:`%s ` contain useful background information on the input and output files used in the SRW. +files, and model configuration files. On Level 1 and 2 systems, the data required to run SRW tests are already available. For Level 3 and 4 systems, the data must be added. Detailed instructions on how to add the data can be found in the :numref:`Section %s Downloading and Staging Input Data `. :numref:`Sections %s ` and :numref:`%s ` contain useful background information on the input and output files used in the SRW App. .. _GridSpecificConfig: Grid Configuration ======================= -The SRW App officially supports three different predefined grids as shown in :numref:`Table %s `. The "out-of-the-box" SRW case uses the ``RRFS_CONUS_25km`` predefined grid option. More information on the predefined and user-generated grid options can be found in :numref:`Chapter %s ` for those who are curious. Users who plan to utilize one of the three pre-defined domain (grid) options may continue to :numref:`Step %s `. Users who plan to create a new domain should refer to :numref:`Chapter %s ` for details on how to do so. At a minimum, they will need to add the new grid name to the ``valid_param_vals`` script and add the corresponding grid-specific parameters in the ``set_predef_grid_params`` script. +The SRW App officially supports three different predefined grids as shown in :numref:`Table %s `. The "out-of-the-box" SRW case uses the ``RRFS_CONUS_25km`` predefined grid option. More information on the predefined and user-generated grid options can be found in :numref:`Chapter %s ` for those who are curious. Users who plan to utilize one of the three pre-defined domain (grid) options may continue to :numref:`Step %s `. 
Users who plan to create a new domain should refer to :numref:`Chapter %s ` for details on how to do so. At a minimum, these users will need to add the new grid name to the ``valid_param_vals`` script and add the corresponding grid-specific parameters in the ``set_predef_grid_params`` script. .. _PredefinedGrids: @@ -271,14 +271,14 @@ Generating the forecast experiment requires three steps: * :ref:`Set Python and other environment parameters ` * :ref:`Run a script to generate the experiment workflow ` -The first two steps depend on the platform being used and are described here for each Level 1 platform. Users will need to adjust the instructions to their machine if they are working on a Level 2-4 platform. +The first two steps depend on the platform being used and are described here for each Level 1 platform. Users will need to adjust the instructions to their machine if they are working on a Level 2-4 platform. Information in :numref:`Chapter %s: Configuring the Workflow ` can help with this. .. _ExptConfig: Set Experiment Parameters ---------------------------- -Each experiment requires certain basic information to run (e.g., date, grid, physics suite). This information is specified in ``config_defaults.sh`` and in the user-specific ``config.sh`` file. When generating a new experiment, the SRW App first reads and assigns default values from the ``config_defaults.sh`` file. Then, it reads and (re)assigns variables from the user's custom ``config.sh`` file. For background info on ``config_defaults.sh``, read :numref:`Section %s ` or jump to :numref:`Section %s ` to continue configuring the experiment. +Each experiment requires certain basic information to run (e.g., date, grid, physics suite). This information is specified in ``config_defaults.sh`` and in the user-specific ``config.sh`` file. When generating a new experiment, the SRW App first reads and assigns default values from the ``config_defaults.sh`` file. Then, it reads and (re)assigns variables from the user's custom ``config.sh`` file. For background info on ``config_defaults.sh``, read :numref:`Section %s `, or jump to :numref:`Section %s ` to continue configuring the experiment. .. _DefaultConfigSection: @@ -286,10 +286,10 @@ Default configuration: ``config_defaults.sh`` ------------------------------------------------ .. note:: - Users may skip to :numref:`Step %s `. This section provides background information on how the SRW App uses the ``config_defaults.sh`` file, but this information is not necessary for running the SRW. + This section provides background information on how the SRW App uses the ``config_defaults.sh`` file, but this information is not necessary for running the SRW. Users may skip to :numref:`Step %s ` to continue configuring their experiment. Important configuration variables in the ``config_defaults.sh`` file appear in -:numref:`Table %s `. Some of these default values are intentionally invalid in order to ensure that the user assigns valid values in the user-specified configuration ``config.sh`` file. Any settings provided in ``config.sh`` will override the default ``config_defaults.sh`` +:numref:`Table %s `. Some of these default values are intentionally invalid in order to ensure that the user assigns valid values in the user-specified ``config.sh`` file. Any settings provided in ``config.sh`` will override the default ``config_defaults.sh`` settings. There is usually no need for a user to modify the default configuration file. 
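As a minimal illustration of this precedence (the default value shown here is invented for the example; only the override mechanism matters), a setting assigned in ``config_defaults.sh`` is simply superseded by the value the user assigns in ``config.sh``:

.. code-block:: console

   # In config_defaults.sh (hypothetical, intentionally unusable default):
   MACHINE="BIG_COMPUTER"

   # In config.sh (user-specified value that takes precedence):
   MACHINE="hera"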
Additional information on the default settings can be found in the file itself and in :numref:`Chapter %s `. .. _ConfigVarsDefault: From 70a051b54aeda17039cec94dd850667223b0e1ed Mon Sep 17 00:00:00 2001 From: gspetro Date: Thu, 17 Mar 2022 15:47:36 -0400 Subject: [PATCH 056/118] edits to Build/Run and Components --- docs/UsersGuide/source/BuildRunSRW.rst | 59 +++++++++++++++---------- docs/UsersGuide/source/Components.rst | 42 ++++++++---------- docs/UsersGuide/source/Glossary.rst | 17 ++++--- docs/UsersGuide/source/Introduction.rst | 2 +- 4 files changed, 64 insertions(+), 56 deletions(-) diff --git a/docs/UsersGuide/source/BuildRunSRW.rst b/docs/UsersGuide/source/BuildRunSRW.rst index aaf59faca1..332592e907 100644 --- a/docs/UsersGuide/source/BuildRunSRW.rst +++ b/docs/UsersGuide/source/BuildRunSRW.rst @@ -286,7 +286,7 @@ Default configuration: ``config_defaults.sh`` ------------------------------------------------ .. note:: - This section provides background information on how the SRW App uses the ``config_defaults.sh`` file, but this information is not necessary for running the SRW. Users may skip to :numref:`Step %s ` to continue configuring their experiment. + This section provides background information on how the SRW App uses the ``config_defaults.sh`` file. This information is helpful but not essential to running the SRW App. Users may skip to :numref:`Step %s ` to continue configuring their experiment. Important configuration variables in the ``config_defaults.sh`` file appear in :numref:`Table %s `. Some of these default values are intentionally invalid in order to ensure that the user assigns valid values in the user-specified ``config.sh`` file. Any settings provided in ``config.sh`` will override the default ``config_defaults.sh`` @@ -405,7 +405,7 @@ settings. There is usually no need for a user to modify the default configuratio User-specific configuration: ``config.sh`` -------------------------------------------- -The user must specify certain basic information about the experiment in a ``config.sh`` file located in the ``ufs-srweather-app/regional_workflow/ush`` directory. Two example templates are provided in the ``ufs-srweather-app/regional_workflow/ush`` directory: ``config.community.sh`` and ``config.nco.sh``. The first file is a minimal example for creating and running an experiment in the *community* mode (with ``RUN_ENVIR`` set to ``community``). The second is an example for creating and running an experiment in the *NCO* (operational) mode (with ``RUN_ENVIR`` set to ``nco``). The *community* mode is recommended in most cases and will be fully supported for this release. The operational/NCO mode will typically be used by those at the NOAA/NCEP/Environmental Modeling Center (EMC) and the NOAA/Global Systems Laboratory (GSL) working on pre-implementation testing. :numref:`Table %s ` shows the configuration variables, along with their default values in ``config_default.sh`` and the values defined in ``config.community.sh``. +The user must specify certain basic information about the experiment in a ``config.sh`` file located in the ``ufs-srweather-app/regional_workflow/ush`` directory. Two example templates are provided in that directory: ``config.community.sh`` and ``config.nco.sh``. The first file is a minimal example for creating and running an experiment in the *community* mode (with ``RUN_ENVIR`` set to ``community``). The second is an example for creating and running an experiment in the *NCO* (operational) mode (with ``RUN_ENVIR`` set to ``nco``).
The *community* mode is recommended in most cases and will be fully supported for this release. The operational/NCO mode will typically be used by those at the NOAA/NCEP/Environmental Modeling Center (EMC) and the NOAA/Global Systems Laboratory (GSL) working on pre-implementation testing. :numref:`Table %s ` shows the configuration variables, along with their default values in ``config_default.sh`` and the values defined in ``config.community.sh``. .. _ConfigCommunity: @@ -466,7 +466,7 @@ The user must specify certain basic information about the experiment in a ``conf +--------------------------------+-------------------+--------------------------------------------------------+ -To get started, make a copy of ``config.community.sh`` (under ``/regional_workflow/ush``). From the ``ufs-srweather-app`` directory, run: +To get started, make a copy of ``config.community.sh``. From the ``ufs-srweather-app`` directory, run: .. code-block:: console @@ -481,7 +481,7 @@ Sample settings are indicated below for Level 1 platforms. Detailed guidance app .. important:: - If you set up the build environment with the GNU compiler in :numref:`Section %s `, you will have to add the line ``COMPILER="gnu"`` to the ``config.sh`` file. + If you set up the build environment with the GNU compiler in :numref:`Section %s `, you will have to check that the line ``COMPILER="gnu"`` appears in the ``config.sh`` file. .. hint:: @@ -548,7 +548,7 @@ For **WCOSS**, edit ``config.sh`` with these WCOSS-specific parameters, and use Set up the Python and other Environment Parameters -------------------------------------------------- -Next, load the appropriate Python environment for the workflow. The workflow requires Python 3, with the packages 'PyYAML', 'Jinja2', and 'f90nml' available. This Python environment has already been set up on Level 1 platforms, and it can be activated in the following way (from ``/ufs-srweather-app/regional_workflow/ush``): +The workflow requires Python 3 with the packages 'PyYAML', 'Jinja2', and 'f90nml' available. This Python environment has already been set up on Level 1 platforms, and it can be activated in the following way (from ``/ufs-srweather-app/regional_workflow/ush``): .. code-block:: console @@ -576,11 +576,11 @@ Run the following command from the ``ufs-srweather-app/regional_workflow/ush`` d The last line of output from this script, starting with ``*/1 * * * *`` or ``*/3 * * * *``, can be saved and :ref:`used later ` to automatically run portions of the workflow. -This workflow generation script creates an experiment directory and populates it with all the data needed to run through the workflow. The flowchart in :numref:`Figure %s ` describes the experiment generation process. First, ``generate_FV3LAM_wflow.sh`` runs the ``setup.sh`` script to set the configuration parameters. Second, it copies the time-independent (fix) files and other necessary data input files from their location in the ufs-weather-model directory to the experiment directory (``$EXPTDIR``). Third, it copies the weather model executable (``ufs_model``) from the ``bin`` directory to ``$EXPTDIR`` and creates the input namelist file ``input.nml`` based on the ``input.nml.FV3`` file in the regional_workflow/ush/templates directory. Lastly, it creates the workflow XML file ``FV3LAM_wflow.xml`` that is executed when running the experiment with the Rocoto workflow manager. +This workflow generation script creates an experiment directory and populates it with all the data needed to run through the workflow. 
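As a sketch of this step (the experiment path and the Rocoto installation path in the comment are illustrative; the crontab line itself is simply copied from the script's final line of output), generating the experiment might look like:

.. code-block:: console

   cd ufs-srweather-app/regional_workflow/ush
   ./generate_FV3LAM_wflow.sh

   # Example of the final line of output, saved for later use in a crontab:
   # */3 * * * * cd /home/$USER/expt_dirs/test_CONUS_25km_GFSv15p2 && /apps/rocoto/1.3.3/bin/rocotorun -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10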
The flowchart in :numref:`Figure %s ` describes the experiment generation process. First, ``generate_FV3LAM_wflow.sh`` runs the ``setup.sh`` script to set the configuration parameters. Second, it copies the time-independent (fix) files and other necessary data input files from their location in the ufs-weather-model directory to the experiment directory (``EXPTDIR``). Third, it copies the weather model executable (``ufs_model``) from the ``bin`` directory to ``EXPTDIR`` and creates the input namelist file ``input.nml`` based on the ``input.nml.FV3`` file in the regional_workflow/ush/templates directory. Lastly, it creates the workflow XML file ``FV3LAM_wflow.xml`` that is executed when running the experiment with the Rocoto workflow manager. The ``setup.sh`` script reads three other configuration scripts in order: (1) ``config_default.sh`` (:numref:`Section %s `), (2) ``config.sh`` (:numref:`Section %s `), and (3) ``set_predef_grid_params.sh`` (:numref:`Section %s `). If a parameter is specified differently in these scripts, the file containing the last defined value will be used. -The generated workflow will appear in ``$EXPTDIR``, where ``EXPTDIR=${EXPT_BASEDIR}/${EXPT_SUBDIR}``. These variables were specified in the ``config.sh`` file in :numref:`Step %s `. The settings for these paths can also be viewed in the console output from the ``./generate_FV3LAM_wflow.sh`` script or in the ``log.generate_FV3LAM_wflow`` file, which can be found in ``$EXPTDIR``. +The generated workflow will appear in ``EXPTDIR``, where ``EXPTDIR=${EXPT_BASEDIR}/${EXPT_SUBDIR}``. These variables were specified in the ``config.sh`` file in :numref:`Step %s `. The settings for these paths can also be viewed in the console output from the ``./generate_FV3LAM_wflow.sh`` script or in the ``log.generate_FV3LAM_wflow`` file, which can be found in ``$EXPTDIR``. .. _WorkflowGeneration: @@ -594,9 +594,16 @@ Description of Workflow Tasks -------------------------------- .. note:: - This section gives an overview of workflow tasks. To begin running the workflow, skip to :numref:`Step %s ` + This section gives a general overview of workflow tasks. To begin running the workflow, skip to :numref:`Step %s ` + +:numref:`Figure %s ` illustrates the overall workflow. Individual tasks that make up the workflow are specified in the ``FV3LAM_wflow.xml`` file. :numref:`Table %s ` describes the function of each task. The first three pre-processing tasks; ``MAKE_GRID``, ``MAKE_OROG``, and ``MAKE_SFC_CLIMO`` are optional. If the user stages pre-generated grid, orography, and surface climatology fix files, these three tasks can be skipped by adding the following lines to the ``config.sh`` file before running the ``generate_FV3LAM_wflow.sh`` script: + +.. code-block:: console + + RUN_TASK_MAKE_GRID=”FALSE” + RUN_TASK_MAKE_OROG=”FALSE” + RUN_TASK_MAKE_SFC_CLIMO=”FALSE” -Workflow tasks are specified in the ``FV3LAM_wflow.xml`` file and illustrated in :numref:`Figure %s `. Each task is described in :numref:`Table %s `. The first three pre-processing tasks; ``MAKE_GRID``, ``MAKE_OROG``, and ``MAKE_SFC_CLIMO`` are optional. If the user stages pre-generated grid, orography, and surface climatology fix files, these three tasks can be skipped by setting ``RUN_TASK_MAKE_GRID=”FALSE”``, ``RUN_TASK_MAKE_OROG=”FALSE”``, and ``RUN_TASK_MAKE_SFC_CLIMO=”FALSE”`` in the ``regional_workflow/ush/config.sh`` file before running the ``generate_FV3LAM_wflow.sh`` script. 
As shown in the figure, the ``FV3LAM_wflow.xml`` file runs the specific j-job scripts in the prescribed order (``regional_workflow/jobs/JREGIONAL_[task name]``) when the ``launch_FV3LAM_wflow.sh`` is submitted. Each j-job task has its own source script named ``exregional_[task name].sh`` in the ``regional_workflow/scripts`` directory. Two database files ``FV3LAM_wflow.db`` and ``FV3LAM_wflow_lock.db`` are generated and updated by the Rocoto calls. There is usually no need for users to modify these files. To relaunch the workflow from scratch, delete these two ``*.db`` files and then call the launch script repeatedly for each task. .. _WorkflowTasksFig: @@ -604,6 +611,10 @@ Workflow tasks are specified in the ``FV3LAM_wflow.xml`` file and illustrated in *Flowchart of the workflow tasks* + +The ``FV3LAM_wflow.xml`` file runs the specific j-job scripts (``regional_workflow/jobs/JREGIONAL_[task name]``) in the prescribed order when the experiment is launched via the ``launch_FV3LAM_wflow.sh`` script or the ``rocotorun`` command. Each j-job task has its own source script named ``exregional_[task name].sh`` in the ``regional_workflow/scripts`` directory. Two database files named ``FV3LAM_wflow.db`` and ``FV3LAM_wflow_lock.db`` are generated and updated by the Rocoto calls. There is usually no need for users to modify these files. To relaunch the workflow from scratch, delete these two ``*.db`` files and then call the launch script repeatedly for each task. + + .. _WorkflowTasksTable: .. table:: Workflow tasks in SRW App @@ -624,11 +635,11 @@ Workflow tasks are specified in the ``FV3LAM_wflow.xml`` file and illustrated in | | initial conditions | +----------------------+------------------------------------------------------------+ | get_extrn_lbcs | Cycle-specific task to obtain external data for the | - | | lateral boundary (LB) conditions | + | | lateral boundary conditions (LBC's) | +----------------------+------------------------------------------------------------+ | make_ics | Generate initial conditions from the external data | +----------------------+------------------------------------------------------------+ - | make_lbcs | Generate lateral boundary conditions from the external data| + | make_lbcs | Generate LBC's from the external data | +----------------------+------------------------------------------------------------+ | run_fcst | Run the forecast model (UFS weather model) | +----------------------+------------------------------------------------------------+ @@ -641,7 +652,7 @@ Workflow tasks are specified in the ``FV3LAM_wflow.xml`` file and illustrated in Run the Workflow Using Rocoto ============================= -The information in this section assumes that Rocoto is available on the desired platform. (Note that Rocoto cannot be used when running the workflow within a container.) If Rocoto is not available, it is still possible to run the workflow using stand-alone scripts described in :numref:`Section %s `. There are two main ways to run the workflow with Rocoto: (1) with the ``launch_FV3LAM_wflow.sh`` script, and (2) by manually calling the ``rocotorun`` command. Users can also automate the workflow using a crontab. +The information in this section assumes that Rocoto is available on the desired platform. (Note that Rocoto cannot be used when running the workflow within a container.) If Rocoto is not available, it is still possible to run the workflow using stand-alone scripts according to the process outlined in :numref:`Section %s `. 
There are two main ways to run the workflow with Rocoto: (1) with the ``launch_FV3LAM_wflow.sh`` script, and (2) by manually calling the ``rocotorun`` command. Users can also automate the workflow using a crontab. Optionally, an environment variable can be set to navigate to the ``$EXPTDIR`` more easily. If the login shell is bash, it can be set as follows: @@ -666,19 +677,19 @@ To run Rocoto using the ``launch_FV3LAM_wflow.sh`` script provided, simply call cd $EXPTDIR ./launch_FV3LAM_wflow.sh -This script creates a log file named ``log.launch_FV3LAM_wflow`` in the EXPTDIR directory or appends information to it if the file already exists. Check the end of the log file periodically to see how the experiment is progressing: +This script creates a log file named ``log.launch_FV3LAM_wflow`` in ``$EXPTDIR`` or appends information to it if the file already exists. Check the end of the log file periodically to see how the experiment is progressing: .. code-block:: console - tail -n 30 log.launch_FV3LAM_wflow + tail -n 40 log.launch_FV3LAM_wflow In order to launch additional tasks in the workflow, call the launch script again; this action will need to be repeated until all tasks in the workflow have been launched. To (re)launch the workflow and check its progress on a single line, run: .. code-block:: console - ./launch_FV3LAM_wflow.sh; tail -n 80 log.launch_FV3LAM_wflow + ./launch_FV3LAM_wflow.sh; tail -n 40 log.launch_FV3LAM_wflow -This will output the last 80 lines of the log file, which includes the status of the workflow tasks (e.g., SUCCEEDED, DEAD, RUNNING, SUBMITTING, QUEUED). The number 80 can be changed according to the user's preferences. The output will look like this: +This will output the last 40 lines of the log file, which list the status of the workflow tasks (e.g., SUCCEEDED, DEAD, RUNNING, SUBMITTING, QUEUED). The number 40 can be changed according to the user's preferences. The output will look like this: .. code-block:: console @@ -706,7 +717,7 @@ This will output the last 80 lines of the log file, which includes the status of 0 out of 1 cycles completed. Workflow status: IN PROGRESS -Error messages for each specific task can be found in the task log files located in the ``$EXPTDIR/log`` directory. +Error messages for each specific task can be found in the task log files located in ``$EXPTDIR/log``. If everything goes smoothly, you will eventually get the following workflow status table as follows: @@ -730,7 +741,7 @@ If everything goes smoothly, you will eventually get the following workflow stat 202006170000 run_post_05 8855464 SUCCEEDED 0 1 6.0 202006170000 run_post_06 8855465 SUCCEEDED 0 1 6.0 -If all the tasks complete successfully, the workflow status in the log file will include the word “SUCCESS." Otherwise, the workflow status will include the word “FAILURE." +If all the tasks complete successfully, the workflow status in the log file will indicate “SUCCESS." Otherwise, the workflow status will indicate “FAILURE." Launch the Rocoto Workflow Manually @@ -808,7 +819,7 @@ After loading Rocoto, call ``rocotorun`` from the experiment directory to launch rocotorun -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10 rocotostat -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10 -The ``rocotorun`` and ``rocotostat`` commands will need to be resubmitted regularly and repeatedly until the experiment is finished. In part, this is to avoid having the system time out. 
This also ensures that when one task ends, tasks dependent on it will run as soon as possible, and ``rocotostat`` will capture the new progress. +The ``rocotorun`` and ``rocotostat`` commands above will need to be resubmitted regularly and repeatedly until the experiment is finished. In part, this is to avoid having the system time out. This also ensures that when one task ends, tasks dependent on it will run as soon as possible, and ``rocotostat`` will capture the new progress. If the experiment fails, the ``rocotostat`` command will indicate which task failed. Users can look at the log file in the ``log`` subdirectory for the failed task to determine what caused the failure. For example, if the ``make_grid`` task failed, users can open the ``make_grid.log`` file to see what caused the problem: @@ -825,13 +836,13 @@ If the experiment fails, the ``rocotostat`` command will indicate which task fai Automated Option ---------------------- -For automatic resubmission of the workflow at regular intervals (e.g., every minute), the user can add a crontab entry by entering the ``crontab -e`` command, which opens a crontab file. As mentioned in :ref:`Section %s `, the last line of output from ``./generate_FV3LAM_wflow.sh`` (starting with ``*/1 * * * *``), can be pasted into the crontab file. It can also be found in the``$EXPTDIR/log.generate_FV3LAM_wflow`` file. The crontab entry should resemble the following: +For automatic resubmission of the workflow at regular intervals (e.g., every minute), the user can add a crontab entry using the ``crontab -e`` command. As mentioned in :numref:`Section %s `, the last line of output from ``./generate_FV3LAM_wflow.sh`` (starting with ``*/1 * * * *``), can be pasted into the crontab file. It can also be found in the ``$EXPTDIR/log.generate_FV3LAM_wflow`` file. The crontab entry should resemble the following: .. code-block:: console */1 * * * * cd && /apps/rocoto/1.3.3/bin/rocotorun -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10 -where ```` is changed to correspond to the user's machine, and ``/apps/rocoto/1.3.3/bin/rocotorun`` corresponds to the location of the ``rocotorun`` command on the user's system. The number ``1`` can also be changed and simply means that the workflow will be resubmitted every minute. +where ```` is changed to correspond to the user's ``$EXPTDIR``, and ``/apps/rocoto/1.3.3/bin/rocotorun`` corresponds to the location of the ``rocotorun`` command on the user's system. The number ``1`` can be changed to a different positive integer and simply means that the workflow will be resubmitted every minute. To check the experiment progress: @@ -844,9 +855,9 @@ After finishing the experiment, open the crontab using `` crontab -e`` and delet .. note:: - On Orion, *cron* is only available on the orion-login-1 node, so please use that node when running cron jobs on Orion. + On Orion, *cron* is only available on the orion-login-1 node, so users will need to work on that node when running *cron* jobs on Orion. -The workflow run is completed when all tasks have “SUCCEEDED”, and the rocotostat command will output the following: +The workflow run is complete when all tasks have “SUCCEEDED”, and the rocotostat command outputs the following: .. code-block:: console @@ -869,4 +880,4 @@ The workflow run is completed when all tasks have “SUCCEEDED”, and the rocot Plot the Output =============== -Two python scripts are provided to generate plots from the FV3-LAM post-processed GRIB2 output. 
Information on how to generate the graphics can be found in :numref:`Chapter %s `. +Two python scripts are provided to generate plots from the :term:`FV3`-LAM post-processed :term:`GRIB2` output. Information on how to generate the graphics can be found in :numref:`Chapter %s `. diff --git a/docs/UsersGuide/source/Components.rst b/docs/UsersGuide/source/Components.rst index ed3eff1f2a..cf3728fbeb 100644 --- a/docs/UsersGuide/source/Components.rst +++ b/docs/UsersGuide/source/Components.rst @@ -4,11 +4,12 @@ SRW Components =============== -The SRW Application v2.0 release assembles a variety of components, including: +The SRW Application assembles a variety of components, including: + * Pre-processor Utilities & Initial Conditions -* Forecast Model -* Post-Processor -* Visualization Example +* UFS Weather Forecast Model +* Unified Post-Processor +* Visualization Examples * Build System and Workflow These components are documented within this User's Guide and supported through a `community forum `_. @@ -19,7 +20,7 @@ Pre-processor Utilities and Initial Conditions ============================================== The SRW Application includes a number of pre-processing utilities that initialize and prepare the -model. Since the SRW App provides forecast predictions over a limited area (rather than globally), it is necessary to first generate a regional grid ``regional_esg_grid/make_hgrid`` along with orography ``orog`` and surface climatology ``sfc_climo_gen`` files on that grid. Grids include a strip, or "halo," of six cells that surround the regional grid and feed in lateral boundary condition data. Since different grid and orography files require different numbers of halo cells, additional utilities handle the correct number of halo shave points and topography filtering. The pre-processing software ``chgres_cube`` is used to convert the raw external model data into initial and lateral boundary condition files in netCDF format, needed as input to the FV3-LAM. Additional information about the UFS pre-processor utilities can be found in the `UFS_UTILS User’s Guide `_. +model. Since the SRW App provides forecast predictions over a limited area (rather than globally), it is necessary to first generate a regional grid (``regional_esg_grid/make_hgrid``) along with :term:`orography` (``orog``) and surface climatology (``sfc_climo_gen``) files on that grid. Grids include a strip, or "halo," of six cells that surround the regional grid and feed in lateral boundary condition data. Since different grid and orography files require different numbers of halo cells, additional utilities handle the correct number of halo shave points and topography filtering. The pre-processing software ``chgres_cube`` is used to convert the raw external model data into initial and lateral boundary condition files in netCDF format. These are needed as input to the FV3-LAM. Additional information about the UFS pre-processor utilities can be found in the `UFS_UTILS User’s Guide `_. The SRW Application can be initialized from a range of operational initial condition files. It is possible to initialize the model from the Global Forecast System (:term:`GFS`), North American Mesoscale (:term:`NAM`) Forecast System, Rapid Refresh (:term:`RAP`), and High-Resolution Rapid Refresh (:term:`HRRR`) files in Gridded Binary v2 (:term:`GRIB2`) format. GFS files also come in :term:`NEMSIO` format for past dates. 
@@ -36,34 +37,32 @@ The prognostic atmospheric model in the UFS SRW Application is the Finite-Volume Supported model resolutions in this release include 3-, 13-, and 25-km predefined Contiguous U.S. (:term:`CONUS`) domains, each with 64 vertical levels. Preliminary tools for users to define their own domain are also available in the release with full, formal support of these tools to be provided in future releases. The Extended Schmidt Gnomonic (ESG) grid is used with the FV3-LAM, which features relatively uniform grid cells across the entirety of the domain. Additional information about the FV3 dynamical core can be found `here `__ and on the `NOAA Geophysical Fluid Dynamics Laboratory website `_. -Interoperable atmospheric physics, along with the Noah Multi-parameterization (Noah MP) Land Surface Model options, are supported through the Common Community Physics Package (:term:`CCPP`; described `here `__).Atmospheric physics are a set of numerical methods describing small-scale processes such as clouds, turbulence, radiation, and their interactions. There are two physics options supported for the release. The first is an experimental physics suite being tested for use in the future operational implementation of the Rapid Refresh Forecast System (RRFS) planned for 2023-2024, and the second is an updated version of the physics suite used in the operational Global Forecast System (GFS) v15. A scientific description of the CCPP parameterizations and suites can be found in the `CCPP Scientific Documentation `_, and CCPP technical aspects are described in the `CCPP Technical Documentation `_. The model namelist has many settings beyond the physics options that can optimize various aspects of the model for use with each +Interoperable atmospheric physics, along with the Noah Multi-parameterization (Noah MP) Land Surface Model options, are supported through the Common Community Physics Package (:term:`CCPP`), described `here `__. Atmospheric physics are a set of numerical methods describing small-scale processes such as clouds, turbulence, radiation, and their interactions. There are two physics options supported for the release. The first is an experimental physics suite being tested for use in the future operational implementation of the Rapid Refresh Forecast System (RRFS) planned for 2023-2024, and the second is an updated version of the physics suite used in the operational Global Forecast System (GFS) v15. A scientific description of the CCPP parameterizations and suites can be found in the `CCPP Scientific Documentation `_, and CCPP technical aspects are described in the `CCPP Technical Documentation `_. The model namelist has many settings beyond the physics options that can optimize various aspects of the model for use with each of the supported suites. -The SRW App supports the use of both :term:`GRIB2` and :term:`NEMSIO` input data. The UFS Weather Model -ingests initial and lateral boundary condition files produced by :term:`chgres_cube` and outputs files in -netCDF format on a specific projection (e.g., Lambert Conformal) in the horizontal direction and model levels in the vertical direction. +The SRW App supports the use of both :term:`GRIB2` and :term:`NEMSIO` input data. The UFS Weather Model ingests initial and lateral boundary condition files produced by :term:`chgres_cube` and outputs files in netCDF format on a specific projection (e.g., Lambert Conformal) in the horizontal direction and model levels in the vertical direction. 
Post-processor ============== The SRW Application is distributed with the Unified Post Processor (:term:`UPP`) included in the workflow as a way to convert the netCDF output on the native model grid to :term:`GRIB2` format on -standard isobaric vertical coordinates. UPP can also be used to compute a variety of useful -diagnostic fields, as described in the `UPP User’s Guide `_. +standard isobaric vertical coordinates. The UPP can also be used to compute a variety of useful +diagnostic fields, as described in the `UPP User’s Guide `__. -Output from UPP can be used with visualization, plotting, and verification packages, or for -further downstream post-processing, e.g. statistical post-processing techniques. +Output from UPP can be used with visualization, plotting, and verification packages or in +further downstream post-processing (e.g., statistical post-processing techniques). Visualization Example ===================== A Python script is provided to create basic visualization of the model output. The script is designed to output graphics in PNG format for 14 standard meteorological variables -when using the pre-defined :term:`CONUS` domain. In addition, a difference plotting script is included +when using the pre-defined :term:`CONUS` domain. A difference plotting script is also included to visually compare two runs for the same domain and resolution. These scripts are provided only -as an example for users familiar with Python and may be used to do a visual check to verify +as an example for users familiar with Python. They may be used to perform a visual check to verify that the application is producing reasonable results. -After running ``manage_externals/checkout_externals``, the visualization scripts will be available in the `regional_workflow repository `_ under ush/Python. Usage information and instructions are described in :numref:`Chapter %s ` and are also included at the top of the script. +After running ``manage_externals/checkout_externals``, the visualization scripts will be available in the ``ufs-srweather-app/regional_workflow/ush/Python`` directory. Usage information and instructions are described in :numref:`Chapter %s ` and are also included at the top of the script. Build System and Workflow ========================= @@ -71,8 +70,7 @@ Build System and Workflow The SRW Application has a portable build system and a user-friendly, modular, and expandable workflow framework. -An umbrella CMake-based build system is used for building the components necessary for running the end-to-end SRW Application: the UFS Weather Model and the pre- and post-processing software. Additional libraries necessary for the application (e.g., :term:`NCEPLIBS-external` and :term:`NCEPLIBS`) are not included in the SRW Application build system but are available pre-built on pre-configured platforms. On other systems, they can be installed via the HPC-Stack. There is a small set of system libraries and utilities that are assumed to be present on the target computer: the CMake build software, a Fortran, -C, and C++ compiler, and an MPI library. +An umbrella CMake-based build system is used for building the components necessary for running the end-to-end SRW Application, including the UFS Weather Model and the pre- and post-processing software. Additional libraries necessary for the application (e.g., :term:`NCEPLIBS-external` and :term:`NCEPLIBS`) are not included in the SRW Application build system but are available pre-built on pre-configured platforms. 
On other systems, they can be installed via the HPC-Stack (see :numref:`Chapter %s: Installing the HPC-Stack `). There is a small set of system libraries and utilities that are assumed to be present on the target computer: the CMake build software, a Fortran, C, and C++ compiler, and an :term:`MPI` library. Once built, the provided experiment generator script can be used to create a Rocoto-based workflow file that will run each task in the system in the proper sequence (see `Rocoto documentation @@ -82,13 +80,9 @@ This SRW Application release has been tested on a variety of platforms widely us researchers, such as the NOAA Research and Development High-Performance Computing Systems (RDHPCS), including Hera, Orion, and Jet; NOAA’s Weather and Climate Operational Supercomputing System (WCOSS); the National Center for Atmospheric Research (NCAR) Cheyenne -system; the National Severe Storms Laboratory (NSSL) HPC machine called Odin; the National Science Foundation Stampede2 system; and generic Linux and macOS systems using Intel and GNU compilers. Four `levels of support `_ have been defined for the SRW Application, including pre-configured (Level 1), configurable (Level 2), limited test platforms (Level 3), and build only platforms (Level 4). Each level is further described below. +system; the National Severe Storms Laboratory (NSSL) HPC machine, Odin; the National Science Foundation Stampede2 system; and generic Linux and macOS systems using Intel and GNU compilers. Four `levels of support `_ have been defined for the SRW Application, including pre-configured (Level 1), configurable (Level 2), limited test platforms (Level 3), and build only platforms (Level 4). Each level is further described below. -For the selected computational platforms that have been pre-configured (Level 1), all the -required libraries for building the SRW Application are available in a central place. That -means bundled libraries (NCEPLIBS) and third-party libraries (NCEPLIBS-external) have both -been built. The SRW Application is expected to build and run out-of-the-box on these -pre-configured platforms. Users can download the SRW code and choose whether to run it :ref:`in a container ` or :ref:`locally `. +On pre-configured (Level 1) computational platforms, all the required libraries for building the SRW Application are available in a central place. That means bundled libraries (NCEPLIBS) and third-party libraries (NCEPLIBS-external) have both been built. The SRW Application is expected to build and run out-of-the-box on these pre-configured platforms. A few additional computational platforms are considered configurable for the SRW Application release. Configurable platforms (Level 2) are platforms where all of the required libraries for building the SRW Application are expected to install successfully but are not available in a central location. Applications and models are expected to build and run once the required bundled libraries (e.g., NCEPLIBS) and third-party libraries (e.g., NCEPLIBS-external) are built. diff --git a/docs/UsersGuide/source/Glossary.rst b/docs/UsersGuide/source/Glossary.rst index ebded5df4a..3873298cfe 100644 --- a/docs/UsersGuide/source/Glossary.rst +++ b/docs/UsersGuide/source/Glossary.rst @@ -44,16 +44,19 @@ Glossary The second version of the World Meterological Organization's (WMO) standard for distributing gridded data. 
HPC-Stack - The `HPC-stack `__ is a repository that provides a unified, shell script-based build system for building the software stack required for numerical weather prediction (NWP) tools such as the `Unified Forecast System (UFS) `__ and the `Joint Effort for Data assimilation Integration (JEDI) `__ framework. + The `HPC-Stack `__ is a repository that provides a unified, shell script-based build system for building the software stack required for numerical weather prediction (NWP) tools such as the `Unified Forecast System (UFS) `__ and the `Joint Effort for Data assimilation Integration (JEDI) `__ framework. HRRR - `High Resolution Rapid Refresh `. The HRRR is a NOAA real-time 3-km resolution, hourly updated, cloud-resolving, convection-allowing atmospheric model, initialized by 3km grids with 3km radar assimilation. Radar data is assimilated in the HRRR every 15 min over a 1-h period adding further detail to that provided by the hourly data assimilation from the 13km radar-enhanced Rapid Refresh. + `High Resolution Rapid Refresh `__. The HRRR is a NOAA real-time 3-km resolution, hourly updated, cloud-resolving, convection-allowing atmospheric model, initialized by 3km grids with 3km radar assimilation. Radar data is assimilated in the HRRR every 15 min over a 1-h period adding further detail to that provided by the hourly data assimilation from the 13km radar-enhanced Rapid Refresh. IC/LBC Initial conditions/lateral boundary conditions LAM - Limited Area Model. LAM grids use a regional (rather than global) configuration of the FV3 dynamical core. + Limited Area Model, formerly known as the "Stand-Alone Regional Model," or SAR. LAM grids use a regional (rather than global) configuration of the FV3 dynamical core. + + LBC + Lateral boundary conditions. MPI MPI stands for Message Passing Interface. An MPI is a standardized communication system used in parallel programming. It establishes portable and efficient syntax for the exchange of messages and data between multiple processors that are used by a single computer program. An MPI is required for high-performance computing (HPC). @@ -67,11 +70,11 @@ Glossary NCEPLIBS The software libraries created and maintained by :term:`NCEP` that are required for running - :term:`chgres_cube`, the UFS Weather Model, and :term:`UPP`. They are part of the HPC-Stack. + :term:`chgres_cube`, the UFS Weather Model, and :term:`UPP`. They are included in the `HPC-Stack `__. NCEPLIBS-external A collection of third-party libraries required to build :term:`NCEPLIBS`, :term:`chgres_cube`, - the UFS Weather Model, and :term:`UPP`. They are part of the HPC-Stack. + the UFS Weather Model, and :term:`UPP`. They are included in the `HPC-Stack `__. NCL An interpreted programming language designed specifically for scientific data analysis and @@ -88,10 +91,10 @@ Glossary Numerical Weather Prediction (NWP) takes current observations of weather and processes them with computer models to forecast the future state of the weather. Orography - The branch of physical geography dealing with mountains + The branch of physical geography dealing with mountains. RAP - `Rapid Refresh `. The continental-scale NOAA hourly-updated assimilation/modeling system operational at NCEP. RAP covers North America and is comprised primarily of a numerical forecast model and an analysis/assimilation system to initialize that model. RAP is complemented by the higher-resolution 3km High-Resolution Rapid Refresh (HRRR) model. + `Rapid Refresh `__. 
The continental-scale NOAA hourly-updated assimilation/modeling system operational at NCEP. RAP covers North America and is comprised primarily of a numerical forecast model and an analysis/assimilation system to initialize that model. RAP is complemented by the higher-resolution 3km High-Resolution Rapid Refresh (HRRR) model. Repository A central location in which files (e.g., data, code, documentation) are stored and managed. diff --git a/docs/UsersGuide/source/Introduction.rst b/docs/UsersGuide/source/Introduction.rst index 0cd418f402..6f8f0b6ac8 100644 --- a/docs/UsersGuide/source/Introduction.rst +++ b/docs/UsersGuide/source/Introduction.rst @@ -6,7 +6,7 @@ Introduction The Unified Forecast System (:term:`UFS`) is a community-based, coupled, comprehensive Earth modeling system. The UFS is the source system for NOAA’s operational numerical weather prediction applications. It enables research, development, and contribution opportunities within the broader :term:`weather enterprise` (e.g. government, industry, and academia). For more information about the UFS, visit the `UFS Portal `__. -The UFS can be configured for `multiple applications `__. The configuration described in this documentation is the UFS Short-Range Weather (SRW) Application, which targets predictions of atmospheric behavior on a limited spatial domain and on time scales from minutes out to several days. The SRW Application v2.0 release includes a prognostic atmospheric model, pre- and post-processing, and a community workflow for running the system end-to-end. These components are documented within this User's Guide and supported through a `community forum `_. Future work will expand the capabilities of the application to include data assimilation (DA) and a verification package (e.g., METplus). This documentation provides a :ref:`Quick Start Guide ` for running the application in a container and a :ref:`detailed guide ` for running the SRW on supported platforms. It also provides an overview of the :ref:`release components ` and details on how to customize or modify different portions of the workflow. +The UFS can be configured for `multiple applications `__. The configuration described in this documentation is the UFS Short-Range Weather (SRW) Application, which targets predictions of atmospheric behavior on a limited spatial domain and on time scales from minutes out to several days. The SRW Application v1.0 release includes a prognostic atmospheric model, pre- and post-processing, and a community workflow for running the system end-to-end. These components are documented within this User's Guide and supported through a `community forum `_. Future work will expand the capabilities of the application to include data assimilation (DA) and a verification package (e.g., METplus). This documentation provides a :ref:`Quick Start Guide ` for running the application in a container and a :ref:`detailed guide ` for running the SRW on supported platforms. It also provides an overview of the :ref:`release components ` and details on how to customize or modify different portions of the workflow. 
The SRW App v1.0.0 citation is as follows and should be used when presenting results based on research conducted with the App: From 99127e7484eea9b177041dc39fa0b3bf1de50f7d Mon Sep 17 00:00:00 2001 From: gspetro Date: Fri, 18 Mar 2022 11:06:32 -0400 Subject: [PATCH 057/118] remove .gitignore --- .gitignore | 1 - 1 file changed, 1 deletion(-) delete mode 100644 .gitignore diff --git a/.gitignore b/.gitignore deleted file mode 100644 index e43b0f9889..0000000000 --- a/.gitignore +++ /dev/null @@ -1 +0,0 @@ -.DS_Store From b01268d5433f0db0119d9655d2cb086f3404da74 Mon Sep 17 00:00:00 2001 From: gspetro Date: Fri, 18 Mar 2022 11:14:26 -0400 Subject: [PATCH 058/118] fix Ch 3 title, 4 supported platform levels note --- docs/UsersGuide/source/BuildRunSRW.rst | 4 ++-- docs/UsersGuide/source/Quickstart.rst | 2 +- 2 files changed, 3 insertions(+), 3 deletions(-) diff --git a/docs/UsersGuide/source/BuildRunSRW.rst b/docs/UsersGuide/source/BuildRunSRW.rst index 332592e907..015401ac4d 100644 --- a/docs/UsersGuide/source/BuildRunSRW.rst +++ b/docs/UsersGuide/source/BuildRunSRW.rst @@ -1,7 +1,7 @@ .. _BuildRunSRW: ===================================== -Building and Running the SRW +Building and Running the SRW App ===================================== The Unified Forecast System (:term:`UFS`) Short-Range Weather (SRW) Application is an :term:`umbrella repository` consisting of a number of different :ref:`components ` housed in external repositories. Once the SRW App is configured and built, users can generate predictions of atmospheric behavior over a limited spatial area and on time scales ranging from minutes out to several days. @@ -10,7 +10,7 @@ This chapter walks users through how to build and run the "out-of-the-box" case .. attention:: - The UFS defines `four platform levels `_. The steps described in this chapter will work most smoothly on preconfigured (Level 1) systems. On Level 1 systems, all of the required libraries for building community releases of UFS models and applications are available in a central location. This guide can serve as a starting point for running the SRW App on other systems, too, but the user may need to perform additional troubleshooting. + All UFS applications support `four platform levels `_. The steps described in this chapter will work most smoothly on preconfigured (Level 1) systems. On Level 1 systems, all of the required libraries for building community releases of UFS models and applications are available in a central location. This guide can serve as a starting point for running the SRW App on other systems, too, but the user may need to perform additional troubleshooting. .. note:: The :ref:`container approach ` is recommended for a smoother build and run experience. Building without a container allows for the use of the Rocoto workflow manager and may allow for more cutomization. However, the non-container approach requires more in-depth troubleshooting skills, especially on Level 3 and 4 systems, and is less appropriate for beginners. diff --git a/docs/UsersGuide/source/Quickstart.rst b/docs/UsersGuide/source/Quickstart.rst index 88c9295de6..e963c528df 100644 --- a/docs/UsersGuide/source/Quickstart.rst +++ b/docs/UsersGuide/source/Quickstart.rst @@ -10,7 +10,7 @@ The "out-of-the-box" SRW case described in this User's Guide builds a weather fo .. attention:: - The UFS defines `four platform levels `_. The steps described in this chapter will work most smoothly on preconfigured (Level 1) systems. 
However, this guide can serve as a starting point for running the SRW App on other systems, too. + All UFS applications support `four platform levels `_. The steps described in this chapter will work most smoothly on preconfigured (Level 1) systems. However, this guide can serve as a starting point for running the SRW App on other systems, too. .. _DownloadCodeC: From da35184e6b5c4a1b2cc952d27d687ed34eb41981 Mon Sep 17 00:00:00 2001 From: gspetro Date: Fri, 18 Mar 2022 11:39:54 -0400 Subject: [PATCH 059/118] fix typos, add term links --- docs/UsersGuide/source/BuildRunSRW.rst | 10 +++++----- docs/UsersGuide/source/Quickstart.rst | 4 ++-- 2 files changed, 7 insertions(+), 7 deletions(-) diff --git a/docs/UsersGuide/source/BuildRunSRW.rst b/docs/UsersGuide/source/BuildRunSRW.rst index 015401ac4d..d634d0bdf0 100644 --- a/docs/UsersGuide/source/BuildRunSRW.rst +++ b/docs/UsersGuide/source/BuildRunSRW.rst @@ -6,14 +6,14 @@ Building and Running the SRW App The Unified Forecast System (:term:`UFS`) Short-Range Weather (SRW) Application is an :term:`umbrella repository` consisting of a number of different :ref:`components ` housed in external repositories. Once the SRW App is configured and built, users can generate predictions of atmospheric behavior over a limited spatial area and on time scales ranging from minutes out to several days. -This chapter walks users through how to build and run the "out-of-the-box" case for the SRW App. However, the steps are relevant to any SRW experiment and can be modified to suit user goals. The "out-of-the-box" SRW case builds a weather forecast for June 15-16, 2019. Multiple convective weather events during these two days produced over 200 filtered storm reports. Severe weather was clustered in two areas: the Upper Midwest through the Ohio Valley and the Southern Great Plains. This forecast uses a predefined 25-km Continental United States (:term:`CONUS`) grid (RRFS_CONUS_25km), the Global Forecast System (:term:`GFS`) version 15.2 physics suite (FV3_GFS_v15p2 CCPP), and :term:`FV3`-based GFS raw external model data for initialization. +This chapter walks users through how to build and run the "out-of-the-box" case for the SRW App. However, the steps are relevant to any SRW experiment and can be modified to suit user goals. The "out-of-the-box" SRW case builds a weather forecast for June 15-16, 2019. Multiple convective weather events during these two days produced over 200 filtered storm reports. Severe weather was clustered in two areas: the Upper Midwest through the Ohio Valley and the Southern Great Plains. This forecast uses a predefined 25-km Continental United States (:term:`CONUS`) grid (RRFS_CONUS_25km), the Global Forecast System (:term:`GFS`) version 15.2 physics suite (FV3_GFS_v15p2 :term:`CCPP`), and :term:`FV3`-based GFS raw external model data for initialization. .. attention:: All UFS applications support `four platform levels `_. The steps described in this chapter will work most smoothly on preconfigured (Level 1) systems. On Level 1 systems, all of the required libraries for building community releases of UFS models and applications are available in a central location. This guide can serve as a starting point for running the SRW App on other systems, too, but the user may need to perform additional troubleshooting. .. note:: - The :ref:`container approach ` is recommended for a smoother build and run experience. Building without a container allows for the use of the Rocoto workflow manager and may allow for more cutomization. 
However, the non-container approach requires more in-depth troubleshooting skills, especially on Level 3 and 4 systems, and is less appropriate for beginners. + The :ref:`container approach ` is recommended for a smoother build and run experience. Building without a container allows for the use of the Rocoto workflow manager and may allow for more customization. However, the non-container approach requires more in-depth troubleshooting skills, especially on Level 3 and 4 systems, and is less appropriate for beginners. The overall procedure for generating an experiment is shown in :numref:`Figure %s `, with the scripts to generate and run the workflow shown in red. The steps are as follows: @@ -286,7 +286,7 @@ Default configuration: ``config_defaults.sh`` ------------------------------------------------ .. note:: - This section provides background information on how the SRW App uses the ``config_defaults.sh`` file. This information is not helpful but not essential to running the SRW App. Users may skip to :numref:`Step %s ` to continue configuring their experiment. + This section provides background information on how the SRW App uses the ``config_defaults.sh`` file. This information is helpful but not essential to running the SRW App. Users may skip to :numref:`Step %s ` to continue configuring their experiment. Important configuration variables in the ``config_defaults.sh`` file appear in :numref:`Table %s `. Some of these default values are intentionally invalid in order to ensure that the user assigns valid values in the user-specified ``config.sh`` file. Any settings provided in ``config.sh`` will override the default ``config_defaults.sh`` @@ -405,7 +405,7 @@ settings. There is usually no need for a user to modify the default configuratio User-specific configuration: ``config.sh`` -------------------------------------------- -The user must specify certain basic information about the experiment in a ``config.sh`` file located in the ``ufs-srweather-app/regional_workflow/ush`` directory. Two example templates are provided in that directory: ``config.community.sh`` and ``config.nco.sh``. The first file is a minimal example for creating and running an experiment in the *community* mode (with ``RUN_ENVIR`` set to ``community``). The second is an example for creating and running an experiment in the *NCO* (operational) mode (with ``RUN_ENVIR`` set to ``nco``). The *community* mode is recommended in most cases and will be fully supported for this release. The operational/NCO mode will typically be used by those at the NOAA/NCEP/Environmental Modeling Center (EMC) and the NOAA/Global Systems Laboratory (GSL) working on pre-implementation testing. :numref:`Table %s ` shows the configuration variables, along with their default values in ``config_default.sh`` and the values defined in ``config.community.sh``. +The user must specify certain basic information about the experiment in a ``config.sh`` file located in the ``ufs-srweather-app/regional_workflow/ush`` directory. Two example templates are provided in that directory: ``config.community.sh`` and ``config.nco.sh``. The first file is a minimal example for creating and running an experiment in the *community* mode (with ``RUN_ENVIR`` set to ``community``). The second is an example for creating and running an experiment in the *NCO* (operational) mode (with ``RUN_ENVIR`` set to ``nco``). The *community* mode is recommended in most cases and will be fully supported for this release. 
The operational/NCO mode will typically be used by those at the NOAA/NCEP/Environmental Modeling Center (EMC) and the NOAA/Global Systems Laboratory (GSL) working on pre-implementation testing for the Rapid-Refresh Forecast System (RRFS). :numref:`Table %s ` shows the configuration variables, along with their default values in ``config_default.sh`` and the values defined in ``config.community.sh``. .. _ConfigCommunity: @@ -473,7 +473,7 @@ To get started, make a copy of ``config.community.sh``. From the ``ufs-srweather cd regional_workflow/ush cp config.community.sh config.sh -The default settings in this file include a predefined 25-km :term:`CONUS` grid (RRFS_CONUS_25km), the :term:`GFS` v15.2 physics suite (FV3_GFS_v15p2 CCPP), and :term:`FV3`-based GFS raw external model data for initialization. +The default settings in this file include a predefined 25-km :term:`CONUS` grid (RRFS_CONUS_25km), the :term:`GFS` v15.2 physics suite (FV3_GFS_v15p2 :term:`CCPP`), and :term:`FV3`-based GFS raw external model data for initialization. Next, edit the new ``config.sh`` file to customize it for your machine. At a minimum, change the ``MACHINE`` and ``ACCOUNT`` variables; then choose a name for the experiment directory by setting ``EXPT_SUBDIR``. If you have pre-staged the initialization data for the experiment, set ``USE_USER_STAGED_EXTRN_FILES="TRUE"``, and set the paths to the data for ``EXTRN_MDL_SOURCE_BASEDIR_ICS`` and ``EXTRN_MDL_SOURCE_BASEDIR_LBCS``. diff --git a/docs/UsersGuide/source/Quickstart.rst b/docs/UsersGuide/source/Quickstart.rst index e963c528df..50436af86a 100644 --- a/docs/UsersGuide/source/Quickstart.rst +++ b/docs/UsersGuide/source/Quickstart.rst @@ -6,7 +6,7 @@ Quick Start Guide This Quick Start Guide will help users to build and run the "out-of-the-box" case for the Unified Forecast System (:term:`UFS`) Short-Range Weather (SRW) Application using a :term:`container`. The container approach provides a uniform enviroment in which to build and run the SRW App. Normally, the details of building and running the SRW App vary from system to system due to the many possible combinations of operating systems, compilers, :term:`MPI`’s, and package versions available. Installation via an :term:`EPIC`-provided container reduces this variability and allows for a smoother SRW build and run experience. However, the :ref:`non-container approach ` may be more appropriate for those users who desire additional customizability, particularly if they already have experience running the SRW. -The "out-of-the-box" SRW case described in this User's Guide builds a weather forecast for June 15-16, 2019. Multiple convective weather events during these two days produced over 200 filtered storm reports. Severe weather was clustered in two areas: the Upper Midwest through the Ohio Valley and the Southern Great Plains. This forecast uses a predefined 25-km Continental United States (:term:`CONUS`) grid (RRFS_CONUS_25km), the Global Forecast System (:term:`GFS`) version 15.2 physics suite (FV3_GFS_v15p2 CCPP), and :term:`FV3`-based GFS raw external model data for initialization. +The "out-of-the-box" SRW case described in this User's Guide builds a weather forecast for June 15-16, 2019. Multiple convective weather events during these two days produced over 200 filtered storm reports. Severe weather was clustered in two areas: the Upper Midwest through the Ohio Valley and the Southern Great Plains. 
This forecast uses a predefined 25-km Continental United States (:term:`CONUS`) grid (RRFS_CONUS_25km), the Global Forecast System (:term:`GFS`) version 15.2 physics suite (FV3_GFS_v15p2 :term:`CCPP`), and :term:`FV3`-based GFS raw external model data for initialization. .. attention:: @@ -149,7 +149,7 @@ Make a copy of ``config.community.sh`` to get started. From the ``ufs-srweather- cd regional_workflow/ush cp config.community.sh config.sh -The default settings in this file include a predefined 25-km :term:`CONUS` grid (RRFS_CONUS_25km), the :term:`GFS` v15.2 physics suite (FV3_GFS_v15p2 CCPP), and :term:`FV3`-based GFS raw external model data for initialization. +The default settings in this file include a predefined 25-km :term:`CONUS` grid (RRFS_CONUS_25km), the :term:`GFS` v15.2 physics suite (FV3_GFS_v15p2 :term:`CCPP`), and :term:`FV3`-based GFS raw external model data for initialization. Next, edit the new ``config.sh`` file to customize it for your experiment. At a minimum, update the ``MACHINE`` and ``ACCOUNT`` variables; then choose a name for the experiment directory by setting ``EXPT_SUBDIR``: From 1302868646eb26d61916ae953c7798ea242abe03 Mon Sep 17 00:00:00 2001 From: gspetro Date: Fri, 18 Mar 2022 13:11:22 -0400 Subject: [PATCH 060/118] other minor fixes/suggestions implemented --- docs/UsersGuide/source/BuildRunSRW.rst | 74 +++++--------------------- docs/UsersGuide/source/Quickstart.rst | 9 ++-- 2 files changed, 16 insertions(+), 67 deletions(-) diff --git a/docs/UsersGuide/source/BuildRunSRW.rst b/docs/UsersGuide/source/BuildRunSRW.rst index d634d0bdf0..e7baac2ebc 100644 --- a/docs/UsersGuide/source/BuildRunSRW.rst +++ b/docs/UsersGuide/source/BuildRunSRW.rst @@ -113,7 +113,7 @@ Check Out External Components The SRW App relies on a variety of components (e.g., regional_workflow, UFS_UTILS, ufs-weather-model, and UPP) detailed in :numref:`Chapter %s ` of this User's Guide. Users must run the ``checkout_externals`` script to link the necessary external repositories to the SRW App. The ``checkout_externals`` script uses the configuration file ``Externals.cfg`` in the top level directory of the SRW App to clone the correct tags (code versions) of the external repositories listed in :numref:`Section %s ` into the appropriate directories under the ``regional_workflow`` and ``src`` directories. -Run the executable that pulls in SRW components from external repositories: +Run the executable that pulls in SRW App components from external repositories: .. code-block:: console @@ -236,8 +236,7 @@ The build will take a few minutes to complete. When it starts, a random number i Download and Stage the Data ============================ -The SRW requires input files to run. These include static datasets, initial and boundary conditions -files, and model configuration files. On Level 1 and 2 systems, the data required to run SRW tests are already available. For Level 3 and 4 systems, the data must be added. Detailed instructions on how to add the data can be found in the :numref:`Section %s Downloading and Staging Input Data `. :numref:`Sections %s ` and :numref:`%s ` contain useful background information on the input and output files used in the SRW App. +The SRW App requires input files to run. These include static datasets, initial and boundary conditions files, and model configuration files. On Level 1 and 2 systems, the data required to run SRW tests are already available. For Level 3 and 4 systems, the data must be added. 
Detailed instructions on how to add the data can be found in the :numref:`Section %s Downloading and Staging Input Data `. :numref:`Sections %s ` and :numref:`%s ` contain useful background information on the input and output files used in the SRW App. .. _GridSpecificConfig: @@ -554,7 +553,7 @@ The workflow requires Python 3 with the packages 'PyYAML', 'Jinja2', and 'f90nml source ../../env/wflow_.env -This command will activate the ``regional_workflow``. The user should see ``(regional_workflow)`` in front of the Terminal prompt at this point. If this is not the case, activate the regional workflow from the ``ush`` directory by running: +This command will activate the ``regional_workflow`` conda environment. The user should see ``(regional_workflow)`` in front of the Terminal prompt at this point. If this is not the case, activate the regional workflow from the ``ush`` directory by running: .. code-block:: console @@ -612,12 +611,12 @@ Description of Workflow Tasks *Flowchart of the workflow tasks* -The ``FV3LAM_wflow.xml`` file runs the specific j-job scripts (``regional_workflow/jobs/JREGIONAL_[task name]``) in the prescribed order when the experiment is launched via the ``launch_FV3LAM_wflow.sh`` script or the ``rocotorun`` command. Each j-job task has its own source script named ``exregional_[task name].sh`` in the ``regional_workflow/scripts`` directory. Two database files named ``FV3LAM_wflow.db`` and ``FV3LAM_wflow_lock.db`` are generated and updated by the Rocoto calls. There is usually no need for users to modify these files. To relaunch the workflow from scratch, delete these two ``*.db`` files and then call the launch script repeatedly for each task. +The ``FV3LAM_wflow.xml`` file runs the specific j-job scripts (``regional_workflow/jobs/JREGIONAL_[task name]``) in the prescribed order when the experiment is launched via the ``launch_FV3LAM_wflow.sh`` script or the ``rocotorun`` command. Each j-job task has its own source script (or "ex-script") named ``exregional_[task name].sh`` in the ``regional_workflow/scripts`` directory. Two database files named ``FV3LAM_wflow.db`` and ``FV3LAM_wflow_lock.db`` are generated and updated by the Rocoto calls. There is usually no need for users to modify these files. To relaunch the workflow from scratch, delete these two ``*.db`` files and then call the launch script repeatedly for each task. .. _WorkflowTasksTable: -.. table:: Workflow tasks in SRW App +.. table:: Workflow tasks in the SRW App +----------------------+------------------------------------------------------------+ | **Workflow Task** | **Task Description** | @@ -677,7 +676,7 @@ To run Rocoto using the ``launch_FV3LAM_wflow.sh`` script provided, simply call cd $EXPTDIR ./launch_FV3LAM_wflow.sh -This script creates a log file named ``log.launch_FV3LAM_wflow`` in ``$EXPTDIR`` or appends information to it if the file already exists. Check the end of the log file periodically to see how the experiment is progressing: +This script creates a log file named ``log.launch_FV3LAM_wflow`` in ``$EXPTDIR`` or appends information to it if the file already exists. The launch script also creates the ``log/FV3LAM_wflow.log`` file, which shows Rocoto task information. Check the end of the log files periodically to see how the experiment is progressing: .. code-block:: console @@ -750,63 +749,14 @@ Launch the Rocoto Workflow Manually Load Rocoto ^^^^^^^^^^^^^^^^ -Instead of running the ``./launch_FV3LAM_wflow.sh`` script, users can load Rocoto and any other required modules. 
This gives the user more control over the process and allows them to view experiment progress more easily. - -For most systems, a variant on the following commands will be necessary to load the Rocoto module: +Instead of running the ``./launch_FV3LAM_wflow.sh`` script, users can load Rocoto and any other required modules. This gives the user more control over the process and allows them to view experiment progress more easily. On Level 1 systems, the Rocoto modules are loaded automatically in :numref:`Step %s `. For most other systems, a variant on the following commands will be necessary to load the Rocoto module: .. code-block:: console module use module load rocoto -The commands for specific Level 1 platforms are described here: - -Cheyenne: - -.. code-block:: console - - module use -a /glade/p/ral/jntp/UFS_SRW_app/modules/ - module load rocoto - -Hera and Jet: - -.. code-block:: console - - module purge - module load rocoto - -Orion: - -.. code-block:: console - - module purge - module load contrib rocoto - -Gaea: - -.. code-block:: console - - module use /lustre/f2/pdata/esrl/gsd/contrib/modulefiles - module load rocoto/1.3.3 - -WCOSS_DELL_P3: - -.. code-block:: console - - module purge - module load lsf/10.1 - module use /gpfs/dell3/usrx/local/dev/emc_rocoto/modulefiles/ - module load ruby/2.5.1 rocoto/1.2.4 - -WCOSS_CRAY: - -.. code-block:: console - - module purge - module load xt-lsfhpc/9.1.3 - module use -a /usrx/local/emc_rocoto/modulefiles - module load rocoto/1.2.4 - +Some systems may require a version number (e.g., ``module load rocoto/1.3.3``) Run the Rocoto Workflow ^^^^^^^^^^^^^^^^^^^^^^^^^^ @@ -836,13 +786,13 @@ If the experiment fails, the ``rocotostat`` command will indicate which task fai Automated Option ---------------------- -For automatic resubmission of the workflow at regular intervals (e.g., every minute), the user can add a crontab entry using the ``crontab -e`` command. As mentioned in :numref:`Section %s `, the last line of output from ``./generate_FV3LAM_wflow.sh`` (starting with ``*/1 * * * *``), can be pasted into the crontab file. It can also be found in the ``$EXPTDIR/log.generate_FV3LAM_wflow`` file. The crontab entry should resemble the following: +For automatic resubmission of the workflow at regular intervals (e.g., every minute), the user can add a crontab entry using the ``crontab -e`` command. As mentioned in :numref:`Section %s `, the last line of output from ``./generate_FV3LAM_wflow.sh`` (starting with ``*/1 * * * *`` or ``*/3 * * * *``), can be pasted into the crontab file. It can also be found in the ``$EXPTDIR/log.generate_FV3LAM_wflow`` file. The crontab entry should resemble the following: .. code-block:: console - */1 * * * * cd && /apps/rocoto/1.3.3/bin/rocotorun -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10 + */3 * * * * cd && /apps/rocoto/1.3.3/bin/rocotorun -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10 -where ```` is changed to correspond to the user's ``$EXPTDIR``, and ``/apps/rocoto/1.3.3/bin/rocotorun`` corresponds to the location of the ``rocotorun`` command on the user's system. The number ``1`` can be changed to a different positive integer and simply means that the workflow will be resubmitted every minute. +where ```` is changed to correspond to the user's ``$EXPTDIR``, and ``/apps/rocoto/1.3.3/bin/rocotorun`` corresponds to the location of the ``rocotorun`` command on the user's system. The number ``3`` can be changed to a different positive integer and simply means that the workflow will be resubmitted every three minutes. 
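As an illustration, a completed entry might resemble the following sketch. The experiment directory path shown below is hypothetical and should be replaced with the user's actual ``$EXPTDIR``, and the ``rocotorun`` path should match the location of that command on the user's system:

.. code-block:: console

   # Hypothetical example; substitute the actual experiment directory and rocotorun path
   */3 * * * * cd /scratch/jdoe/expt_dirs/test_CONUS_25km_GFSv15p2 && /apps/rocoto/1.3.3/bin/rocotorun -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10

After saving the file, ``crontab -l`` lists the entries currently installed, so users can confirm that the new line was added:

.. code-block:: console

   crontab -l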
To check the experiment progress: @@ -851,7 +801,7 @@ To check the experiment progress: cd $EXPTDIR rocotostat -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10 -After finishing the experiment, open the crontab using `` crontab -e`` and delete the crontab entry. +After finishing the experiment, open the crontab using ``crontab -e`` and delete the crontab entry. .. note:: diff --git a/docs/UsersGuide/source/Quickstart.rst b/docs/UsersGuide/source/Quickstart.rst index 50436af86a..e79ad9e316 100644 --- a/docs/UsersGuide/source/Quickstart.rst +++ b/docs/UsersGuide/source/Quickstart.rst @@ -4,7 +4,7 @@ Quick Start Guide ==================================== -This Quick Start Guide will help users to build and run the "out-of-the-box" case for the Unified Forecast System (:term:`UFS`) Short-Range Weather (SRW) Application using a :term:`container`. The container approach provides a uniform enviroment in which to build and run the SRW App. Normally, the details of building and running the SRW App vary from system to system due to the many possible combinations of operating systems, compilers, :term:`MPI`’s, and package versions available. Installation via an :term:`EPIC`-provided container reduces this variability and allows for a smoother SRW build and run experience. However, the :ref:`non-container approach ` may be more appropriate for those users who desire additional customizability, particularly if they already have experience running the SRW. +This Quick Start Guide will help users to build and run the "out-of-the-box" case for the Unified Forecast System (:term:`UFS`) Short-Range Weather (SRW) Application using a :term:`container`. The container approach provides a uniform enviroment in which to build and run the SRW App. Normally, the details of building and running the SRW App vary from system to system due to the many possible combinations of operating systems, compilers, :term:`MPI`’s, and package versions available. Installation via an :term:`EPIC`-provided container reduces this variability and allows for a smoother SRW App build and run experience. However, the :ref:`non-container approach ` may be more appropriate for those users who desire additional customizability, particularly if they already have experience running the SRW. The "out-of-the-box" SRW case described in this User's Guide builds a weather forecast for June 15-16, 2019. Multiple convective weather events during these two days produced over 200 filtered storm reports. Severe weather was clustered in two areas: the Upper Midwest through the Ohio Valley and the Southern Great Plains. This forecast uses a predefined 25-km Continental United States (:term:`CONUS`) grid (RRFS_CONUS_25km), the Global Forecast System (:term:`GFS`) version 15.2 physics suite (FV3_GFS_v15p2 :term:`CCPP`), and :term:`FV3`-based GFS raw external model data for initialization. @@ -29,7 +29,7 @@ To build and run the SRW App using a Singularity container, first install the Si Working in the Cloud ----------------------- -For those working on non-cloud-based systems, skip to :numref:`Step %s `. Users building the SRW using NOAA's Cloud resources must complete a few additional steps to ensure that the SRW builds and runs correctly. +For those working on non-cloud-based systems, skip to :numref:`Step %s `. Users building the SRW App using NOAA's Cloud resources must complete a few additional steps to ensure that the SRW App builds and runs correctly. 
On NOAA Cloud systems, certain environment variables must be set *before* building the container: @@ -59,7 +59,7 @@ On HPC systems (including NOAA's Cloud platforms), allocate a compute node on wh mpirun -n 1 hostname ssh -The third command will output a hostname. Replace ```` in the last command with the output from the third command. After "ssh-ing" to the compute node in the last command, build and run the SRW from that node. +The third command will output a hostname. Replace ```` in the last command with the output from the third command. After "ssh-ing" to the compute node in the last command, build and run the SRW App from that node. The appropriate commands on other Level 1 platforms will vary, and users should consult the documentation for those platforms. @@ -121,8 +121,7 @@ From the ``ufs-srweather-app`` directory, ``cd`` into the build directory and ru Download and Stage the Data ============================ -The SRW requires input files to run. These include static datasets, initial and boundary condition -files, and model configuration files. On Level 1 and 2 systems, the data required to run SRW tests are already available. For Level 3 and 4 systems, the data must be added. Detailed instructions on how to add the data can be found in the :numref:`Section %s `. :numref:`Sections %s ` and :numref:`%s ` contain useful background information on the input and output files used in the SRW. +The SRW App requires input files to run. These include static datasets, initial and boundary condition files, and model configuration files. On Level 1 and 2 systems, the data required to run SRW tests are already available. For Level 3 and 4 systems, the data must be added. Detailed instructions on how to add the data can be found in the :numref:`Section %s `. :numref:`Sections %s ` and :numref:`%s ` contain useful background information on the input and output files used in the SRW. .. _GenerateForecastC: From a704a2fa61001d5944210d6ab36a5cc38ddfd4c0 Mon Sep 17 00:00:00 2001 From: gspetro Date: Mon, 21 Mar 2022 15:46:56 -0400 Subject: [PATCH 061/118] updated Intro based on feedback; changed SRW to SRW App throughout --- docs/UsersGuide/source/BuildRunSRW.rst | 15 +++--- docs/UsersGuide/source/Components.rst | 22 +++------ docs/UsersGuide/source/ConfigNewPlatform.rst | 6 +-- docs/UsersGuide/source/InputOutputFiles.rst | 48 +++++++------------- docs/UsersGuide/source/Introduction.rst | 44 +++++++++--------- docs/UsersGuide/source/Quickstart.rst | 8 ++-- 6 files changed, 60 insertions(+), 83 deletions(-) diff --git a/docs/UsersGuide/source/BuildRunSRW.rst b/docs/UsersGuide/source/BuildRunSRW.rst index e7baac2ebc..d9967dce23 100644 --- a/docs/UsersGuide/source/BuildRunSRW.rst +++ b/docs/UsersGuide/source/BuildRunSRW.rst @@ -6,7 +6,7 @@ Building and Running the SRW App The Unified Forecast System (:term:`UFS`) Short-Range Weather (SRW) Application is an :term:`umbrella repository` consisting of a number of different :ref:`components ` housed in external repositories. Once the SRW App is configured and built, users can generate predictions of atmospheric behavior over a limited spatial area and on time scales ranging from minutes out to several days. -This chapter walks users through how to build and run the "out-of-the-box" case for the SRW App. However, the steps are relevant to any SRW experiment and can be modified to suit user goals. The "out-of-the-box" SRW case builds a weather forecast for June 15-16, 2019. 
Multiple convective weather events during these two days produced over 200 filtered storm reports. Severe weather was clustered in two areas: the Upper Midwest through the Ohio Valley and the Southern Great Plains. This forecast uses a predefined 25-km Continental United States (:term:`CONUS`) grid (RRFS_CONUS_25km), the Global Forecast System (:term:`GFS`) version 15.2 physics suite (FV3_GFS_v15p2 :term:`CCPP`), and :term:`FV3`-based GFS raw external model data for initialization. +This chapter walks users through how to build and run the "out-of-the-box" case for the SRW App. However, the steps are relevant to any SRW Application experiment and can be modified to suit user goals. The "out-of-the-box" SRW App case builds a weather forecast for June 15-16, 2019. Multiple convective weather events during these two days produced over 200 filtered storm reports. Severe weather was clustered in two areas: the Upper Midwest through the Ohio Valley and the Southern Great Plains. This forecast uses a predefined 25-km Continental United States (:term:`CONUS`) grid (RRFS_CONUS_25km), the Global Forecast System (:term:`GFS`) version 15.2 physics suite (FV3_GFS_v15p2 :term:`CCPP`), and :term:`FV3`-based GFS raw external model data for initialization. .. attention:: @@ -50,18 +50,18 @@ Install the HPC-Stack Background ---------------- -The UFS Weather Model draws on over 50 code libraries to run its applications. These libraries range from libraries developed in-house at NOAA (e.g. NCEPLIBS, FMS, etc.) to libraries developed by NOAA's partners (e.g. PIO, ESMF, etc.) to truly third party libraries (e.g. NETCDF). Individual installation of these libraries is not practical, so the `HPC-Stack `__ was developed as a central installation system to ensure that the infrastructure environment across multiple platforms is as similar as possible. Installation of the HPC-Stack is required to run the SRW. +The UFS Weather Model draws on over 50 code libraries to run its applications. These libraries range from libraries developed in-house at NOAA (e.g. NCEPLIBS, FMS, etc.) to libraries developed by NOAA's partners (e.g. PIO, ESMF, etc.) to truly third party libraries (e.g. NETCDF). Individual installation of these libraries is not practical, so the `HPC-Stack `__ was developed as a central installation system to ensure that the infrastructure environment across multiple platforms is as similar as possible. Installation of the HPC-Stack is required to run the SRW App. Instructions ------------------------- -Users working on systems that fall under `Support Levels 2-4 `_ will need to install the HPC-Stack the first time they try to run applications (such as the SRW) or models that depend on it. Users can either build the HPC-stack on their local system or use the centrally maintained stacks on each HPC platform if they are working on a Level 1 system. For a detailed description of installation options, see :ref:`Installing the HPC-Stack `. +Users working on systems that fall under `Support Levels 2-4 `_ will need to install the HPC-Stack the first time they try to run applications (such as the SRW App) or models that depend on it. Users can either build the HPC-stack on their local system or use the centrally maintained stacks on each HPC platform if they are working on a Level 1 system. For a detailed description of installation options, see :ref:`Installing the HPC-Stack `. After completing installation, continue to the next section. .. 
_DownloadSRWApp: Download the UFS SRW Application Code -===================================== +====================================== The SRW Application source code is publicly available on GitHub. To download the SRW App, clone the release branch of the repository: .. code-block:: console @@ -84,7 +84,8 @@ The cloned repository contains the configuration files and sub-directories shown | CMakeLists.txt | Main cmake file for SRW App | +--------------------------------+--------------------------------------------------------+ | Externals.cfg | Includes tags pointing to the correct version of the | - | | external GitHub repositories/branches used in the SRW. | + | | external GitHub repositories/branches used in the SRW | + | | App. | +--------------------------------+--------------------------------------------------------+ | LICENSE.md | CC0 license information | +--------------------------------+--------------------------------------------------------+ @@ -236,14 +237,14 @@ The build will take a few minutes to complete. When it starts, a random number i Download and Stage the Data ============================ -The SRW App requires input files to run. These include static datasets, initial and boundary conditions files, and model configuration files. On Level 1 and 2 systems, the data required to run SRW tests are already available. For Level 3 and 4 systems, the data must be added. Detailed instructions on how to add the data can be found in the :numref:`Section %s Downloading and Staging Input Data `. :numref:`Sections %s ` and :numref:`%s ` contain useful background information on the input and output files used in the SRW App. +The SRW App requires input files to run. These include static datasets, initial and boundary conditions files, and model configuration files. On Level 1 and 2 systems, the data required to run SRW App tests are already available. For Level 3 and 4 systems, the data must be added. Detailed instructions on how to add the data can be found in the :numref:`Section %s Downloading and Staging Input Data `. :numref:`Sections %s ` and :numref:`%s ` contain useful background information on the input and output files used in the SRW App. .. _GridSpecificConfig: Grid Configuration ======================= -The SRW App officially supports three different predefined grids as shown in :numref:`Table %s `. The "out-of-the-box" SRW case uses the ``RRFS_CONUS_25km`` predefined grid option. More information on the predefined and user-generated grid options can be found in :numref:`Chapter %s ` for those who are curious. Users who plan to utilize one of the three pre-defined domain (grid) options may continue to :numref:`Step %s `. Users who plan to create a new domain should refer to :numref:`Chapter %s ` for details on how to do so. At a minimum, these users will need to add the new grid name to the ``valid_param_vals`` script and add the corresponding grid-specific parameters in the ``set_predef_grid_params`` script. +The SRW App officially supports three different predefined grids as shown in :numref:`Table %s `. The "out-of-the-box" SRW App case uses the ``RRFS_CONUS_25km`` predefined grid option. More information on the predefined and user-generated grid options can be found in :numref:`Chapter %s ` for those who are curious. Users who plan to utilize one of the three pre-defined domain (grid) options may continue to :numref:`Step %s `. Users who plan to create a new domain should refer to :numref:`Chapter %s ` for details on how to do so. 
At a minimum, these users will need to add the new grid name to the ``valid_param_vals`` script and add the corresponding grid-specific parameters in the ``set_predef_grid_params`` script. .. _PredefinedGrids: diff --git a/docs/UsersGuide/source/Components.rst b/docs/UsersGuide/source/Components.rst index cf3728fbeb..6a358594e0 100644 --- a/docs/UsersGuide/source/Components.rst +++ b/docs/UsersGuide/source/Components.rst @@ -1,8 +1,8 @@ .. _Components: -=============== -SRW Components -=============== +============================ +SRW Application Components +============================ The SRW Application assembles a variety of components, including: @@ -19,11 +19,9 @@ These components are documented within this User's Guide and supported through a Pre-processor Utilities and Initial Conditions ============================================== -The SRW Application includes a number of pre-processing utilities that initialize and prepare the -model. Since the SRW App provides forecast predictions over a limited area (rather than globally), it is necessary to first generate a regional grid (``regional_esg_grid/make_hgrid``) along with :term:`orography` (``orog``) and surface climatology (``sfc_climo_gen``) files on that grid. Grids include a strip, or "halo," of six cells that surround the regional grid and feed in lateral boundary condition data. Since different grid and orography files require different numbers of halo cells, additional utilities handle the correct number of halo shave points and topography filtering. The pre-processing software ``chgres_cube`` is used to convert the raw external model data into initial and lateral boundary condition files in netCDF format. These are needed as input to the FV3-LAM. Additional information about the UFS pre-processor utilities can be found in the `UFS_UTILS User’s Guide `_. +The SRW Application includes a number of pre-processing utilities that initialize and prepare the model. Since the SRW App provides forecast predictions over a limited area (rather than globally), it is necessary to first generate a regional grid (``regional_esg_grid/make_hgrid``) along with :term:`orography` (``orog``) and surface climatology (``sfc_climo_gen``) files on that grid. Grids include a strip, or "halo," of six cells that surround the regional grid and feed in lateral boundary condition data. Since different grid and orography files require different numbers of halo cells, additional utilities handle the correct number of halo shave points and topography filtering. The pre-processing software ``chgres_cube`` is used to convert the raw external model data into initial and lateral boundary condition files in netCDF format. These are needed as input to the FV3-LAM. Additional information about the UFS pre-processor utilities can be found in the `UFS_UTILS User’s Guide `_. -The SRW Application can be initialized from a range of operational initial condition files. It is -possible to initialize the model from the Global Forecast System (:term:`GFS`), North American Mesoscale (:term:`NAM`) Forecast System, Rapid Refresh (:term:`RAP`), and High-Resolution Rapid Refresh (:term:`HRRR`) files in Gridded Binary v2 (:term:`GRIB2`) format. GFS files also come in :term:`NEMSIO` format for past dates. +The SRW Application can be initialized from a range of operational initial condition files. 
It is possible to initialize the model from the Global Forecast System (:term:`GFS`), North American Mesoscale (:term:`NAM`) Forecast System, Rapid Refresh (:term:`RAP`), and High-Resolution Rapid Refresh (:term:`HRRR`) files in Gridded Binary v2 (:term:`GRIB2`) format. GFS files also come in :term:`NEMSIO` format for past dates. .. WARNING:: For GFS data, dates prior to 1 January 2018 may work but are not guaranteed. Public archives of model data can be accessed through the `National Centers for Environmental Information `_ (NCEI) or through the `NOAA Operational Model Archive and Distribution System `_ (NOMADS). Raw external model data may be pre-staged on disk by the user. @@ -45,10 +43,7 @@ The SRW App supports the use of both :term:`GRIB2` and :term:`NEMSIO` input data Post-processor ============== -The SRW Application is distributed with the Unified Post Processor (:term:`UPP`) included in the -workflow as a way to convert the netCDF output on the native model grid to :term:`GRIB2` format on -standard isobaric vertical coordinates. The UPP can also be used to compute a variety of useful -diagnostic fields, as described in the `UPP User’s Guide `__. +The SRW Application is distributed with the Unified Post Processor (:term:`UPP`) included in the workflow as a way to convert the netCDF output on the native model grid to :term:`GRIB2` format on standard isobaric vertical coordinates. The UPP can also be used to compute a variety of useful diagnostic fields, as described in the `UPP User’s Guide `__. Output from UPP can be used with visualization, plotting, and verification packages or in further downstream post-processing (e.g., statistical post-processing techniques). @@ -57,10 +52,7 @@ Visualization Example ===================== A Python script is provided to create basic visualization of the model output. The script is designed to output graphics in PNG format for 14 standard meteorological variables -when using the pre-defined :term:`CONUS` domain. A difference plotting script is also included -to visually compare two runs for the same domain and resolution. These scripts are provided only -as an example for users familiar with Python. They may be used to perform a visual check to verify -that the application is producing reasonable results. +when using the pre-defined :term:`CONUS` domain. A difference plotting script is also included to visually compare two runs for the same domain and resolution. These scripts are provided only as an example for users familiar with Python. They may be used to perform a visual check to verify that the application is producing reasonable results. After running ``manage_externals/checkout_externals``, the visualization scripts will be available in the ``ufs-srweather-app/regional_workflow/ush/Python`` directory. Usage information and instructions are described in :numref:`Chapter %s ` and are also included at the top of the script. 
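As a brief illustration of the steps just described, assuming the repository has already been cloned into a directory named ``ufs-srweather-app``:

.. code-block:: console

   cd ufs-srweather-app
   # Check out the external components (including regional_workflow)
   ./manage_externals/checkout_externals
   # The example plotting scripts should then be visible here
   ls regional_workflow/ush/Python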
diff --git a/docs/UsersGuide/source/ConfigNewPlatform.rst b/docs/UsersGuide/source/ConfigNewPlatform.rst index 4d972d7e31..9e6f719851 100644 --- a/docs/UsersGuide/source/ConfigNewPlatform.rst +++ b/docs/UsersGuide/source/ConfigNewPlatform.rst @@ -57,7 +57,7 @@ However, it is also possible to install these utilities via Macports (https://ww Installing NCEPLIBS-external ============================ -In order to facilitate the installation of NCEPLIBS (and therefore, the SRW and other UFS applications) on new platforms, EMC maintains a one-stop package containing most of the prerequisite libraries and software necessary for installing NCEPLIBS. This package is known as NCEPLIBS-external, and is maintained in a git repository at https://github.com/NOAA-EMC/NCEPLIBS-external. Instructions for installing these will depend on your platform, but generally so long as all the above-mentioned prerequisites have been installed you can follow the proceeding instructions verbatim (in bash; a csh-based shell will require different commands). Some examples for installing on specific platforms can be found in the `NCEPLIBS-external/doc directory `. +In order to facilitate the installation of NCEPLIBS (and therefore, the SRW App and other UFS applications) on new platforms, EMC maintains a one-stop package containing most of the prerequisite libraries and software necessary for installing NCEPLIBS. This package is known as NCEPLIBS-external, and is maintained in a git repository at https://github.com/NOAA-EMC/NCEPLIBS-external. Instructions for installing these will depend on your platform, but generally so long as all the above-mentioned prerequisites have been installed you can follow the proceeding instructions verbatim (in bash; a csh-based shell will require different commands). Some examples for installing on specific platforms can be found in the `NCEPLIBS-external/doc directory `. These instructions will install the NCEPLIBS-external in the current directory tree, so be sure you are in the desired location before starting. @@ -126,8 +126,8 @@ Further information on including prerequisite libraries, as well as other helpfu Once the NCEPLIBS package has been successfully installed, you can move on to building the UFS SRW Application. -Building the UFS Short-Range Weather Application (UFS SRW App) -============================================================== +Building the UFS SRW Application +======================================= Building the UFS SRW App is similar to building NCEPLIBS, in that the code is stored in a git repository and is built using CMake software. The first step is to retrieve the code from GitHub, using the variables defined earlier: .. code-block:: console diff --git a/docs/UsersGuide/source/InputOutputFiles.rst b/docs/UsersGuide/source/InputOutputFiles.rst index d1079c78fb..5e2f118cd0 100644 --- a/docs/UsersGuide/source/InputOutputFiles.rst +++ b/docs/UsersGuide/source/InputOutputFiles.rst @@ -1,10 +1,10 @@ .. _InputOutputFiles: -====================== +======================= Input and Output Files -====================== +======================= This chapter provides an overview of the input and output files needed by the components -of the UFS SRW Application (i.e., :term:`UFS_UTILS`, the UFS :term:`Weather Model`, and the :term:`UPP`). Links to more detailed documentation for each of the components are provided. For SRW users who want to jump straight to downloading and staging the files, see :numref:`Section %s `. 
+of the UFS SRW Application (i.e., :term:`UFS_UTILS`, the UFS :term:`Weather Model`, and the :term:`UPP`). Links to more detailed documentation for each of the components are provided. For SRW App users who want to jump straight to downloading and staging the files, see :numref:`Section %s `. .. _Input: @@ -25,8 +25,7 @@ the external model data can be found in :numref:`Section %s `, :numref:`Step %s Generate the Forecast Experiment ` links the input data for the pre-processing utilities from a location on disk to the experiment directory. The -pre-processing utilities use many different datasets to create grids and to generate model input datasets from the external model files. A detailed description of the input files for the pre-processing utilities can be found `here `__. +When a user runs the SRW Application as described in the Quick Start Guide :numref:`Chapter %s `, :numref:`Step %s Generate the Forecast Experiment ` links the input data for the pre-processing utilities from a location on disk to the experiment directory. The pre-processing utilities use many different datasets to create grids and to generate model input datasets from the external model files. A detailed description of the input files for the pre-processing utilities can be found `here `__. UFS Weather Model ----------------- @@ -44,8 +43,7 @@ Workflow -------- The SRW Application uses a series of template files, combined with user-selected settings, to create the required namelists and parameter files needed by the Application. These -templates can be reviewed to see what defaults are being used and where configuration parameters -from the ``config.sh`` file are assigned. +templates can be reviewed to see what defaults are being used and where configuration parameters from the ``config.sh`` file are assigned. List of Template Files ^^^^^^^^^^^^^^^^^^^^^^ @@ -103,10 +101,7 @@ while information on the ``regional_grid.nml`` can be found in the `UFS_UTILS Us Migratory Route of the Input Files in the Workflow ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ :numref:`Figure %s ` shows how the case-specific input files in the -``ufs-srweather-app/regional_workflow/ush/templates/`` directory flow to the experiment directory. -The value of ``CCPP_PHYS_SUITE`` is specified in the configuration file ``config.sh``. The template -input files corresponding to ``CCPP_PHYS_SUITE``, such as ``field_table`` and ``nems_configure``, are copied to the experiment directory ``EXPTDIR``, and the namelist file of the weather model ``input.nml`` is created from the ``input.nml.FV3`` and ``FV3.input.yml`` files by running the script ``generate_FV3LAM_wflow.sh``. -While running the task ``RUN_FCST`` in the regional workflow as shown in :numref:`Figure %s `, the ``field_table``, ``nems.configure``, and ``input.nml`` files, located in ``EXPTDIR``, are linked to the cycle directory ``CYCLE_DIR/``. Additionally, ``diag_table`` and ``model_configure`` are copied from the ``templates`` directory. Finally, these files are updated with the variables specified in ``var_defn.sh``. +``ufs-srweather-app/regional_workflow/ush/templates/`` directory flow to the experiment directory. The value of ``CCPP_PHYS_SUITE`` is specified in the configuration file ``config.sh``. 
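As a minimal illustration of the setting just mentioned (the suite name shown is one of the supported options referenced in this guide; substitute the suite intended for the experiment), the relevant ``config.sh`` entry might look like:

.. code-block:: console

   # In ufs-srweather-app/regional_workflow/ush/config.sh
   CCPP_PHYS_SUITE="FV3_GFS_v16"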
The template input files corresponding to ``CCPP_PHYS_SUITE``, such as ``field_table`` and ``nems_configure``, are copied to the experiment directory ``EXPTDIR``, and the namelist file of the weather model ``input.nml`` is created from the ``input.nml.FV3`` and ``FV3.input.yml`` files by running the script ``generate_FV3LAM_wflow.sh``. While running the task ``RUN_FCST`` in the regional workflow as shown in :numref:`Figure %s `, the ``field_table``, ``nems.configure``, and ``input.nml`` files, located in ``EXPTDIR``, are linked to the cycle directory ``CYCLE_DIR/``. Additionally, ``diag_table`` and ``model_configure`` are copied from the ``templates`` directory. Finally, these files are updated with the variables specified in ``var_defn.sh``. .. _MigratoryRoute: @@ -150,14 +145,12 @@ UFS Weather Model As mentioned previously, the workflow can be run in ‘community’ or ‘nco’ mode, which determines the location and names of the output files. In addition to this option, output can also be in netCDF or NEMSIO format. The output file format is set in the ``model_configure`` files using the -``output_file`` variable. At this time, due to limitations in the post-processing component, only netCDF -format output is recommended for the SRW application. +``output_file`` variable. At this time, due to limitations in the post-processing component, only netCDF format output is recommended for the SRW Application. .. note:: In summary, the fully supported options for this release include running in ‘community’ mode with netCDF format output files. -In this case, the netCDF output files are written to the ``EXPTDIR/YYYYMMDDHH`` directory. The bases of -the file names are specified in the input file ``model_configure`` and are set to the following in the SRW Application: +In this case, the netCDF output files are written to the ``EXPTDIR/YYYYMMDDHH`` directory. The bases of the file names are specified in the input file ``model_configure`` and are set to the following in the SRW Application: * ``dynfHHH.nc`` * ``phyfHHH.nc`` @@ -169,16 +162,14 @@ Unified Post Processor (UPP) ---------------------------- Documentation for the UPP output files can be found `here `__. -For the SRW Application, the weather model netCDF output files are written to the ``EXPTDIR/YYYYMMDDHH/postprd`` -directory and have the naming convention (file->linked to): +For the SRW Application, the weather model netCDF output files are written to the ``EXPTDIR/YYYYMMDDHH/postprd`` directory and have the naming convention (file->linked to): * ``BGRD3D_{YY}{JJJ}{hh}{mm}f{fhr}00 -> {domain}.t{cyc}z.bgrd3df{fhr}.tmXX.grib2`` * ``BGDAWP_{YY}{JJJ}{hh}{mm}f{fhr}00 -> {domain}.t{cyc}z.bgdawpf{fhr}.tmXX.grib2`` The default setting for the output file names uses ``rrfs`` for ``{domain}``. This may be overridden by the user in the ``config.sh`` settings. -If you wish to modify the fields or levels that are output from the UPP, you will need to make -modifications to file ``fv3lam.xml``, which resides in the UPP repository distributed with the UFS SRW Application. Specifically, if the code was cloned in the directory ``ufs-srweather-app``, the file will be located in ``ufs-srweather-app/src/UPP/parm``. +If you wish to modify the fields or levels that are output from the UPP, you will need to make modifications to file ``fv3lam.xml``, which resides in the UPP repository distributed with the UFS SRW Application. Specifically, if the code was cloned in the directory ``ufs-srweather-app``, the file will be located in ``ufs-srweather-app/src/UPP/parm``. .. 
note:: This process requires advanced knowledge of which fields can be output for the UFS Weather Model. @@ -193,8 +184,7 @@ Once you have created the new flat text file reflecting your changes, you will n USE_CUSTOM_POST_CONFIG_FILE=”TRUE” CUSTOM_POST_CONFIG_PATH=”” -which tells the workflow to use the custom file located in the user-defined path. The path should -include the filename. If this is set to true and the file path is not found, then an error will occur when trying to generate the SRW Application workflow. +which tells the workflow to use the custom file located in the user-defined path. The path should include the filename. If this is set to true and the file path is not found, then an error will occur when trying to generate the SRW Application workflow. You may then start your case workflow as usual and the UPP will use the new flat ``*.txt`` file. @@ -202,8 +192,7 @@ You may then start your case workflow as usual and the UPP will use the new flat Downloading and Staging Input Data ================================== -A set of input files, including static (fix) data and raw initial and lateral boundary conditions -(:term:`IC/LBC`'s), are needed to run the SRW Application. +A set of input files, including static (fix) data and raw initial and lateral boundary conditions (:term:`IC/LBC`'s), are needed to run the SRW Application. .. _StaticFixFiles: @@ -229,7 +218,7 @@ Initial Condition Formats and Source ------------------------------------ The SRW Application currently supports raw initial and lateral boundary conditions from numerous models (i.e., FV3GFS, NAM, RAP, HRRR). The data can be provided in three formats: :term:`NEMSIO`, netCDF, or :term:`GRIB2`. The SRW Application currently only supports the use of NEMSIO and netCDF input files from the GFS. -The data required to run the "out-of'the-box" SRW case described in :numref:`Chapter %s ` is already preinstalled on `Level 1 `__ systems. Users on other systems can find the required IC/LBC data in the `FTP data repository `__ or on `AWS cloud storage `_. +The data required to run the "out-of-the-box" SRW App case described in :numref:`Chapter %s ` is already preinstalled on `Level 1 `__ systems. Users on other systems can find the required IC/LBC data in the `FTP data repository `__ or on `AWS cloud storage `_. To add this data to your system, run the following commands from the ``ufs-srweather-app`` directory: @@ -260,13 +249,9 @@ These environment variables describe what :term:`IC/LBC` files to use (pre-stage Initial and Lateral Boundary Condition Organization --------------------------------------------------- The suggested directory structure and naming convention for the raw input files is described -below. While there is flexibility to modify these settings, this will provide the most reusability -for multiple dates when using the SRW Application workflow. +below. While there is flexibility to modify these settings, this will provide the most reusability for multiple dates when using the SRW Application workflow. -For ease of reusing the ``config.sh`` for multiple dates and cycles, it is recommended to set up -your raw :term:`IC/LBC` files such that it includes the model name (e.g., FV3GFS, NAM, RAP, HRRR) and ``YYYYMMDDHH``, for example: ``/path-to/model_data/FV3GFS/2019061518``. Since both initial -and lateral boundary condition files are necessary, you can also include an ICS and LBCS directory. 
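A short sketch of the suggested staging layout, using the example model and date given above (the base path is a placeholder to be adapted to the user's system):

.. code-block:: console

   mkdir -p /path-to/model_data/FV3GFS/2019061518/ICS
   mkdir -p /path-to/model_data/FV3GFS/2019061518/LBCS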
-The sample IC/LBC's available at the FTP data repository are structured as follows: +For ease of reusing the ``config.sh`` for multiple dates and cycles, it is recommended to set up your raw :term:`IC/LBC` files such that it includes the model name (e.g., FV3GFS, NAM, RAP, HRRR) and ``YYYYMMDDHH``, for example: ``/path-to/model_data/FV3GFS/2019061518``. Since both initial and lateral boundary condition files are necessary, you can also include an ICS and LBCS directory. The sample IC/LBC's available at the FTP data repository are structured as follows: * ``/path-to/model_data/MODEL/YYYYMMDDHH/ICS`` * ``/path-to/model_data/MODEL/YYYYMMDDHH/LBCS`` @@ -330,8 +315,7 @@ Staging Initial Conditions Manually ----------------------------------- If users want to run the SRW Application with raw model files for dates other than what are currently available on the preconfigured platforms, they need to stage the data manually. -The data should be placed in ``EXTRN_MDL_SOURCE_BASEDIR_ICS`` and ``EXTRN_MDL_SOURCE_BASEDIR_LBCS``. -The path to these variables can be set in the ``config.sh`` file. Raw model files are available from a number of sources. A few examples are provided here for convenience. +The data should be placed in ``EXTRN_MDL_SOURCE_BASEDIR_ICS`` and ``EXTRN_MDL_SOURCE_BASEDIR_LBCS``. The path to these variables can be set in the ``config.sh`` file. Raw model files are available from a number of sources. A few examples are provided here for convenience. NOMADS: https://nomads.ncep.noaa.gov/pub/data/nccf/com/{model}/prod, where model may be: diff --git a/docs/UsersGuide/source/Introduction.rst b/docs/UsersGuide/source/Introduction.rst index 6f8f0b6ac8..4f94a9ae2c 100644 --- a/docs/UsersGuide/source/Introduction.rst +++ b/docs/UsersGuide/source/Introduction.rst @@ -4,9 +4,11 @@ Introduction ============== -The Unified Forecast System (:term:`UFS`) is a community-based, coupled, comprehensive Earth modeling system. The UFS is the source system for NOAA’s operational numerical weather prediction applications. It enables research, development, and contribution opportunities within the broader :term:`weather enterprise` (e.g. government, industry, and academia). For more information about the UFS, visit the `UFS Portal `__. +The Unified Forecast System (:term:`UFS`) is a community-based, coupled, comprehensive Earth modeling system. NOAA’s operational model suite for numerical weather prediction (:term:`NWP`) is quickly transitioning to the UFS from a number of different modeling systems. The UFS enables research, development, and contribution opportunities within the broader :term:`weather enterprise` (e.g. government, industry, and academia). For more information about the UFS, visit the `UFS Portal `__. -The UFS can be configured for `multiple applications `__. The configuration described in this documentation is the UFS Short-Range Weather (SRW) Application, which targets predictions of atmospheric behavior on a limited spatial domain and on time scales from minutes out to several days. The SRW Application v1.0 release includes a prognostic atmospheric model, pre- and post-processing, and a community workflow for running the system end-to-end. These components are documented within this User's Guide and supported through a `community forum `_. Future work will expand the capabilities of the application to include data assimilation (DA) and a verification package (e.g., METplus). 
This documentation provides a :ref:`Quick Start Guide ` for running the application in a container and a :ref:`detailed guide ` for running the SRW on supported platforms. It also provides an overview of the :ref:`release components ` and details on how to customize or modify different portions of the workflow. +The UFS includes `multiple applications `__ that support different forecast durations and spatial domains. This documentation describes the UFS Short-Range Weather (SRW) Application, which targets predictions of atmospheric behavior on a limited spatial domain and on time scales from minutes to several days. The SRW Application v1.0 release includes a prognostic atmospheric model, pre- and post-processing, and a community workflow for running the system end-to-end. These components are documented within this User's Guide and supported through a `community forum `_. New and improved capabilities for this release include the addition of a verification package (MetPLUS) for both deterministic and ensemble simulations and support for four Stochastically Perturbed Perturbation (SPP) schemes. Future work will expand the capabilities of the application to include data assimilation (DA) and a forecast restart/cycling capability. + +This documentation provides a :ref:`Quick Start Guide ` for running the SRW Application in a container and a :ref:`detailed guide ` for running the SRW App on supported platforms. It also provides an overview of the :ref:`release components ` and details on how to customize or modify different portions of the workflow. The SRW App v1.0.0 citation is as follows and should be used when presenting results based on research conducted with the App: @@ -28,23 +30,23 @@ This guide instructs both novice and experienced users on downloading, building, Variables presented as ``AaBbCc123`` in this User's Guide typically refer to variables in scripts, names of files, and directories. -File paths or code that include angle brackets (e.g., ``build__.env``) indicate that users should insert options appropriate to their SRW configuration (e.g., ``build_orion_intel.env``). +File paths or code that include angle brackets (e.g., ``build__.env``) indicate that users should insert options appropriate to their SRW App configuration (e.g., ``build_orion_intel.env``). .. hint:: - * To get started running the SRW, see the :ref:`Quick Start Guide ` for beginners or refer to the in-depth chapter on :ref:`Running the Short-Range Weather Application `. - * For background information on the SRW code repositories and directory structure, see :numref:`Section %s ` below. - * For an outline of SRW components, see section :numref:`Section %s ` below or refer to :numref:`Chapter %s ` for a more in-depth treatment. + * To get started running the SRW App, see the :ref:`Quick Start Guide ` for beginners or refer to the in-depth chapter on :ref:`Running the Short-Range Weather Application `. + * For background information on the SRW App code repositories and directory structure, see :numref:`Section %s ` below. + * For an outline of SRW App components, see section :numref:`Section %s ` below or refer to :numref:`Chapter %s ` for a more in-depth treatment. .. _ComponentsOverview: -SRW Components Overview -============================ +SRW App Components Overview +============================== Pre-processor Utilities and Initial Conditions ------------------------------------------------ -The SRW Application includes a number of pre-processing utilities that initialize and prepare the model. 
Tasks include generating a regional grid along with :term:`orography` and surface climatology files for that grid. The pre-processing software converts the raw external model data into initial and lateral boundary condition files in netCDF format. Later, these files are used as input to the atmospheric model (FV3-LAM). Additional information about the pre-processor utilities can be found in :numref:`Chapter %s ` and in the `UFS_UTILS User’s Guide `_. +The SRW Application includes a number of pre-processing utilities that initialize and prepare the model. Tasks include generating a regional grid along with :term:`orography` and surface climatology files for that grid. One pre-processing utility converts the raw external model data into initial and lateral boundary condition files in netCDF format. Later, these files are used as input to the atmospheric model (FV3-LAM). Additional information about the pre-processor utilities can be found in :numref:`Chapter %s ` and in the `UFS_UTILS User’s Guide `_. Forecast Model @@ -54,24 +56,23 @@ Atmospheric Model ^^^^^^^^^^^^^^^^^^^^^^ The prognostic atmospheric model in the UFS SRW Application is the Finite-Volume Cubed-Sphere -(:term:`FV3`) dynamical core configured with a Limited Area Model (LAM) capability (:cite:t:`BlackEtAl2020`). -The dynamical core is the computational part of a model that solves the equations of fluid motion. A User’s Guide for the UFS :term:`Weather Model` can be found `here `__. +(:term:`FV3`) dynamical core configured with a Limited Area Model (LAM) capability (:cite:t:`BlackEtAl2020`). The dynamical core is the computational part of a model that solves the equations of fluid motion. A User’s Guide for the UFS :term:`Weather Model` can be found `here `__. Common Community Physics Package ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -The `Common Community Physics Package `_ (:term:`CCPP`) supports interoperable atmospheric physics and Noah Multi-parameterization (Noah MP) Land Surface Model options. Atmospheric physics are a set of numerical methods describing small-scale processes such as clouds, turbulence, radiation, and their interactions. The SRW release includes an experimental physics version and an updated operational version. +The `Common Community Physics Package `_ (:term:`CCPP`) supports interoperable atmospheric physics and land surface model options. Atmospheric physics are a set of numerical methods describing small-scale processes such as clouds, turbulence, radiation, and their interactions. The SRW App release includes an experimental physics version and an updated operational version. Data Format ^^^^^^^^^^^^^^^^^^^^^^ -The SRW App supports the use of both :term:`GRIB2` and :term:`NEMSIO` input data. The UFS Weather Model ingests initial and lateral boundary condition files produced by :term:`chgres_cube`. +The SRW App supports the use of external model data in :term:`GRIB2`, :term:`NEMSIO`, and netCDF format when generating initial and boundary conditions. The UFS Weather Model ingests initial and lateral boundary condition files produced by :term:`chgres_cube`. Unified Post-Processor (UPP) -------------------------------- -The `Unified Post Processor `__ (:term:`UPP`) processes raw output from a variety of numerical weather prediction (:term:`NWP`) models. In the SRW, it converts data output formats from netCDF format on the native model grid to GRIB2 format. The UPP can also be used to compute a variety of useful diagnostic fields, as described in the `UPP User’s Guide `_. 
Output from the UPP can be used with visualization, plotting, and verification packages, or for further downstream post-processing (e.g., statistical post-processing techniques). +The `Unified Post Processor `__ (:term:`UPP`) processes raw output from a variety of numerical weather prediction (:term:`NWP`) models. In the SRW App, it converts data output from netCDF format to GRIB2 format. The UPP can also be used to compute a variety of useful diagnostic fields, as described in the `UPP User’s Guide `_. Output from the UPP can be used with visualization, plotting, and verification packages, or for further downstream post-processing (e.g., statistical post-processing techniques). Visualization Example @@ -84,9 +85,9 @@ Build System and Workflow The SRW Application has a portable CMake-based build system that packages together all the components required to build the SRW Application. Once built, users can generate a Rocoto-based workflow that will run each task in the proper sequence (see `Rocoto documentation `__ for more on workflow management). Individual components can also be run in a stand-alone, command line fashion. -The SRW Application allows for configuration of various elements of the workflow. For example, users can modify the parameters of the atmospheric model, such as start and end dates, duration, time step, and the physics suite for the forecast. +The SRW Application allows for configuration of various elements of the workflow. For example, users can modify the parameters of the atmospheric model, such as start and end dates, duration, time step, and the physics suite used for the simulation. -This SRW Application release has been tested on a variety of platforms widely used by researchers, including NOAA High-Performance Computing (HPC) systems (e.g. Hera, Orion), cloud environments, and generic Linux and macOS systems. Four `levels of support `_ have been defined for the SRW Application. Preconfigured (Level 1) systems already have the required external libraries (e.g., NCEPLIBS) available in a central location. The SRW Application is expected to build and run out-of-the-box on these systems, and users can :ref:`download the SRW code ` without first installing prerequisites. On other platforms, the SRW can be :ref:`run within a container ` that includes the HPC-Stack, or the required libraries will need to be installed as part of the :ref:`SRW build ` process. Once these prerequisite libraries are installed, applications and models should build and run successfully. However, users may need to perform additional troubleshooting on Level 3 or 4 systems since little or no pre-release testing has been conducted on these systems. +This SRW Application release has been tested on a variety of platforms widely used by researchers, including NOAA High-Performance Computing (HPC) systems (e.g. Hera, Orion), cloud environments, and generic Linux and macOS systems. Four `levels of support `_ have been defined for the SRW Application. Preconfigured (Level 1) systems already have the required external libraries (e.g., NCEPLIBS) available in a central location. The SRW Application is expected to build and run out-of-the-box on these systems, and users can :ref:`download the SRW App code ` without first installing prerequisites. On other platforms, the SRW App can be :ref:`run within a container ` that includes the HPC-Stack, or the required libraries will need to be installed as part of the :ref:`SRW Application build ` process. 
Once these prerequisite libraries are installed, applications and models should build and run successfully. However, users may need to perform additional troubleshooting on Level 3 or 4 systems since little or no pre-release testing has been conducted on these systems. @@ -99,7 +100,7 @@ Code Repositories and Directory Structure Hierarchical Repository Structure ----------------------------------- -The :term:`umbrella repository` for the SRW Application is named ``ufs-srweather-app`` and is available on GitHub at https://github.com/ufs-community/ufs-srweather-app. An umbrella repository is a repository that houses external code, called "externals," from additional repositories. The SRW Application includes the ``manage_externals`` tool and a configuration file called ``Externals.cfg``, which describes the external repositories associated with the SRW umbrella repository (see :numref:`Table %s `). +The :term:`umbrella repository` for the SRW Application is named ``ufs-srweather-app`` and is available on GitHub at https://github.com/ufs-community/ufs-srweather-app. An umbrella repository is a repository that houses external code, called "externals," from additional repositories. The SRW Application includes the ``manage_externals`` tool and a configuration file called ``Externals.cfg``, which describes the external repositories associated with the SRW App umbrella repository (see :numref:`Table %s `). .. _top_level_repos: @@ -255,7 +256,7 @@ When the user generates an experiment using the ``generate_FV3LAM_wflow.sh`` scr | YYYYMMDDHH | Cycle directory (empty) | +---------------------------+-------------------------------------------------------------------------------------------------------+ -In addition, running the SRW in *community* mode creates the ``fix_am`` and ``fix_lam`` directories in ``EXPTDIR``. The ``fix_lam`` directory is initially empty but will contain some *fix* (time-independent) files after the grid, orography, and/or surface climatology generation tasks are run. +In addition, running the SRW App in *community* mode creates the ``fix_am`` and ``fix_lam`` directories in ``EXPTDIR``. The ``fix_lam`` directory is initially empty but will contain some *fix* (time-independent) files after the grid, orography, and/or surface climatology generation tasks are run. .. _FixDirectories: @@ -363,7 +364,7 @@ A list of available documentation is shown in :numref:`Table %s `. +utilities, model code, and infrastructure. Users can post issues in the related GitHub repositories to report bugs or to announce upcoming contributions to the code base. For code to be accepted in the authoritative repositories, users must follow the code management rules of each UFS component repository, which are outlined in the respective User's Guides listed in :numref:`Table %s `. Future Direction ================= @@ -372,9 +373,8 @@ Users can expect to see incremental improvements and additional capabilities in * A more extensive set of supported developmental physics suites. * A larger number of pre-defined domains/resolutions and a fully supported capability to create a user-defined domain. -* Inclusion of data assimilation, cycling, and ensemble capabilities. -* A verification package (e.g., METplus) integrated into the workflow. -* Inclusion of stochastic perturbation techniques. +* Inclusion of data assimilation and forecast restart/cycling capabilities. + .. 
bibliography:: references.bib diff --git a/docs/UsersGuide/source/Quickstart.rst b/docs/UsersGuide/source/Quickstart.rst index e79ad9e316..af48734a60 100644 --- a/docs/UsersGuide/source/Quickstart.rst +++ b/docs/UsersGuide/source/Quickstart.rst @@ -4,9 +4,9 @@ Quick Start Guide ==================================== -This Quick Start Guide will help users to build and run the "out-of-the-box" case for the Unified Forecast System (:term:`UFS`) Short-Range Weather (SRW) Application using a :term:`container`. The container approach provides a uniform enviroment in which to build and run the SRW App. Normally, the details of building and running the SRW App vary from system to system due to the many possible combinations of operating systems, compilers, :term:`MPI`’s, and package versions available. Installation via an :term:`EPIC`-provided container reduces this variability and allows for a smoother SRW App build and run experience. However, the :ref:`non-container approach ` may be more appropriate for those users who desire additional customizability, particularly if they already have experience running the SRW. +This Quick Start Guide will help users to build and run the "out-of-the-box" case for the Unified Forecast System (:term:`UFS`) Short-Range Weather (SRW) Application using a :term:`container`. The container approach provides a uniform enviroment in which to build and run the SRW App. Normally, the details of building and running the SRW App vary from system to system due to the many possible combinations of operating systems, compilers, :term:`MPI`’s, and package versions available. Installation via an :term:`EPIC`-provided container reduces this variability and allows for a smoother SRW App build and run experience. However, the :ref:`non-container approach ` may be more appropriate for those users who desire additional customizability, particularly if they already have experience running the SRW App. -The "out-of-the-box" SRW case described in this User's Guide builds a weather forecast for June 15-16, 2019. Multiple convective weather events during these two days produced over 200 filtered storm reports. Severe weather was clustered in two areas: the Upper Midwest through the Ohio Valley and the Southern Great Plains. This forecast uses a predefined 25-km Continental United States (:term:`CONUS`) grid (RRFS_CONUS_25km), the Global Forecast System (:term:`GFS`) version 15.2 physics suite (FV3_GFS_v15p2 :term:`CCPP`), and :term:`FV3`-based GFS raw external model data for initialization. +The "out-of-the-box" SRW App case described in this User's Guide builds a weather forecast for June 15-16, 2019. Multiple convective weather events during these two days produced over 200 filtered storm reports. Severe weather was clustered in two areas: the Upper Midwest through the Ohio Valley and the Southern Great Plains. This forecast uses a predefined 25-km Continental United States (:term:`CONUS`) grid (RRFS_CONUS_25km), the Global Forecast System (:term:`GFS`) version 15.2 physics suite (FV3_GFS_v15p2 :term:`CCPP`), and :term:`FV3`-based GFS raw external model data for initialization. .. attention:: @@ -50,7 +50,7 @@ Working on HPC Systems -------------------------- Those *not* working on HPC systems may skip to the :ref:`next step `. -On HPC systems (including NOAA's Cloud platforms), allocate a compute node on which to run the SRW. 
On NOAA's Cloud platforms, the following commands will allocate a compute node: +On HPC systems (including NOAA's Cloud platforms), allocate a compute node on which to run the SRW App. On NOAA's Cloud platforms, the following commands will allocate a compute node: .. code-block:: console @@ -121,7 +121,7 @@ From the ``ufs-srweather-app`` directory, ``cd`` into the build directory and ru Download and Stage the Data ============================ -The SRW App requires input files to run. These include static datasets, initial and boundary condition files, and model configuration files. On Level 1 and 2 systems, the data required to run SRW tests are already available. For Level 3 and 4 systems, the data must be added. Detailed instructions on how to add the data can be found in the :numref:`Section %s `. :numref:`Sections %s ` and :numref:`%s ` contain useful background information on the input and output files used in the SRW. +The SRW App requires input files to run. These include static datasets, initial and boundary condition files, and model configuration files. On Level 1 and 2 systems, the data required to run SRW App tests are already available. For Level 3 and 4 systems, the data must be added. Detailed instructions on how to add the data can be found in the :numref:`Section %s `. :numref:`Sections %s ` and :numref:`%s ` contain useful background information on the input and output files used in the SRW App. .. _GenerateForecastC: From 7fc263d169aa88fbd8aa61561ef200ca119b7761 Mon Sep 17 00:00:00 2001 From: gspetro Date: Mon, 21 Mar 2022 16:21:17 -0400 Subject: [PATCH 062/118] update comment to Intro citation --- docs/UsersGuide/source/Introduction.rst | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/UsersGuide/source/Introduction.rst b/docs/UsersGuide/source/Introduction.rst index 4f94a9ae2c..b63fedc989 100644 --- a/docs/UsersGuide/source/Introduction.rst +++ b/docs/UsersGuide/source/Introduction.rst @@ -15,7 +15,7 @@ The SRW App v1.0.0 citation is as follows and should be used when presenting res UFS Development Team. (2021, March 4). Unified Forecast System (UFS) Short-Range Weather (SRW) Application (Version v1.0.0). Zenodo. https://doi.org/10.5281/zenodo.4534994 .. - COMMENT: Update version numbers/citation for release! + COMMENT: Update version numbers/citation for release! Also update release date for citation! How to Use This Document From 10de71f603f34bf1f8ba6557409247e89517a269 Mon Sep 17 00:00:00 2001 From: gspetro Date: Tue, 22 Mar 2022 10:48:29 -0400 Subject: [PATCH 063/118] add user-defined vertical levels to future work --- docs/UsersGuide/source/Introduction.rst | 1 + 1 file changed, 1 insertion(+) diff --git a/docs/UsersGuide/source/Introduction.rst b/docs/UsersGuide/source/Introduction.rst index b63fedc989..0f57424d20 100644 --- a/docs/UsersGuide/source/Introduction.rst +++ b/docs/UsersGuide/source/Introduction.rst @@ -373,6 +373,7 @@ Users can expect to see incremental improvements and additional capabilities in * A more extensive set of supported developmental physics suites. * A larger number of pre-defined domains/resolutions and a fully supported capability to create a user-defined domain. +* Add user-defined vertical levels (number and distribution). * Inclusion of data assimilation and forecast restart/cycling capabilities. 
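As a purely illustrative sketch of the compute-node allocation step mentioned in the Quick Start above (the scheduler, account, and wall time are assumptions; the exact allocation command is platform-specific):

.. code-block:: console

   # Request one interactive compute node via Slurm (adjust account, partition, and wall time for your system)
   salloc -N 1 -A an_account -t 01:00:00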
From 92bddcae8da6ca734826fb1a145bae08b4c43937 Mon Sep 17 00:00:00 2001 From: gspetro Date: Wed, 23 Mar 2022 11:41:03 -0400 Subject: [PATCH 064/118] Add instructions for srw_common module load --- docs/UsersGuide/source/BuildRunSRW.rst | 9 ++++++++- 1 file changed, 8 insertions(+), 1 deletion(-) diff --git a/docs/UsersGuide/source/BuildRunSRW.rst b/docs/UsersGuide/source/BuildRunSRW.rst index d9967dce23..0be096c74c 100644 --- a/docs/UsersGuide/source/BuildRunSRW.rst +++ b/docs/UsersGuide/source/BuildRunSRW.rst @@ -128,7 +128,14 @@ Run the executable that pulls in SRW App components from external repositories: Set up the Build Environment ============================ -Before building the SRW App, the build environment must be set up for the user's specific platform. For Level 1 systems, scripts for loading the proper modules and/or setting the correct environment variables can be found in the ``env`` directory of the SRW App in files named ``build__.env``. Here is a sample directory listing of these build files: +Before building the SRW App, the build environment must be set up for the user's specific platform. There is a set of common modules required to build the SRW App. These are located in the ``env/srw_common`` file. To load the set of common modules, run: + +.. code-block:: console + module use + +where ```` is the full path to the ``env`` directory. + +Then, users must set up the platform-specific elements of the build environment. For Level 1 systems, scripts for loading the proper modules and/or setting the correct environment variables can be found in the ``env`` directory of the SRW App in files named ``build__.env``. Here is a sample directory listing of these build files: .. code-block:: console From 6fa50748e64d1dad7d63b7a2b333b3eadbad6c73 Mon Sep 17 00:00:00 2001 From: gspetro Date: Wed, 23 Mar 2022 11:52:35 -0400 Subject: [PATCH 065/118] fix typo --- docs/UsersGuide/source/BuildRunSRW.rst | 1 + 1 file changed, 1 insertion(+) diff --git a/docs/UsersGuide/source/BuildRunSRW.rst b/docs/UsersGuide/source/BuildRunSRW.rst index 0be096c74c..5481f6b25e 100644 --- a/docs/UsersGuide/source/BuildRunSRW.rst +++ b/docs/UsersGuide/source/BuildRunSRW.rst @@ -131,6 +131,7 @@ Set up the Build Environment Before building the SRW App, the build environment must be set up for the user's specific platform. There is a set of common modules required to build the SRW App. These are located in the ``env/srw_common`` file. To load the set of common modules, run: .. code-block:: console + module use where ```` is the full path to the ``env`` directory. From a5ae76eff8261007ea1f46023dd0372bf2b96802 Mon Sep 17 00:00:00 2001 From: gspetro Date: Wed, 23 Mar 2022 18:51:42 -0400 Subject: [PATCH 066/118] update Intro & BuildRunSRW based on Mark's feedback --- docs/UsersGuide/source/BuildRunSRW.rst | 7 ++++++- docs/UsersGuide/source/Introduction.rst | 4 +++- 2 files changed, 9 insertions(+), 2 deletions(-) diff --git a/docs/UsersGuide/source/BuildRunSRW.rst b/docs/UsersGuide/source/BuildRunSRW.rst index 5481f6b25e..85ae658c5e 100644 --- a/docs/UsersGuide/source/BuildRunSRW.rst +++ b/docs/UsersGuide/source/BuildRunSRW.rst @@ -801,7 +801,12 @@ For automatic resubmission of the workflow at regular intervals (e.g., every min */3 * * * * cd && /apps/rocoto/1.3.3/bin/rocotorun -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10 -where ```` is changed to correspond to the user's ``$EXPTDIR``, and ``/apps/rocoto/1.3.3/bin/rocotorun`` corresponds to the location of the ``rocotorun`` command on the user's system. 
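An illustrative version of the ``module use`` step from the build-environment instructions above, with a hypothetical clone location standing in for the full path to the ``env`` directory:

.. code-block:: console

   # Hypothetical path; replace with the actual location of ufs-srweather-app/env on your system
   module use /home/$USER/ufs-srweather-app/env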
The number ``3`` can be changed to a different positive integer and simply means that the workflow will be resubmitted every three minutes. +where ```` is changed to correspond to the user's ``$EXPTDIR``, and ``/apps/rocoto/1.3.3/bin/rocotorun`` corresponds to the location of the ``rocotorun`` command on the user's system. The number ``3`` can be changed to a different positive integer and simply means that the workflow will be resubmitted every three minutes. + +.. hint:: + + * On NOAA Cloud instances, ``*/1 * * * *`` is the preferred option for cron jobs because compute nodes will shut down if they remain idle too long. If the compute node shuts down, it can take 15-20 minutes to start up a new one. + * On other NOAA HPC systems, admins discourage the ``*/1 * * * *`` due to load problems. ``*/3 * * * *`` is the preferred option for cron jobs on non-Cloud systems. To check the experiment progress: diff --git a/docs/UsersGuide/source/Introduction.rst b/docs/UsersGuide/source/Introduction.rst index 0f57424d20..62c98c7c47 100644 --- a/docs/UsersGuide/source/Introduction.rst +++ b/docs/UsersGuide/source/Introduction.rst @@ -87,7 +87,7 @@ The SRW Application has a portable CMake-based build system that packages togeth The SRW Application allows for configuration of various elements of the workflow. For example, users can modify the parameters of the atmospheric model, such as start and end dates, duration, time step, and the physics suite used for the simulation. -This SRW Application release has been tested on a variety of platforms widely used by researchers, including NOAA High-Performance Computing (HPC) systems (e.g. Hera, Orion), cloud environments, and generic Linux and macOS systems. Four `levels of support `_ have been defined for the SRW Application. Preconfigured (Level 1) systems already have the required external libraries (e.g., NCEPLIBS) available in a central location. The SRW Application is expected to build and run out-of-the-box on these systems, and users can :ref:`download the SRW App code ` without first installing prerequisites. On other platforms, the SRW App can be :ref:`run within a container ` that includes the HPC-Stack, or the required libraries will need to be installed as part of the :ref:`SRW Application build ` process. Once these prerequisite libraries are installed, applications and models should build and run successfully. However, users may need to perform additional troubleshooting on Level 3 or 4 systems since little or no pre-release testing has been conducted on these systems. +This SRW Application release has been tested on a variety of platforms widely used by researchers, including NOAA High-Performance Computing (HPC) systems (e.g. Hera, Orion), cloud environments, and generic Linux and macOS systems. Four `levels of support `_ have been defined for the SRW Application. Preconfigured (Level 1) systems already have the required external libraries (e.g., HPC-Stack) available in a central location. The SRW Application is expected to build and run out-of-the-box on these systems, and users can :ref:`download the SRW App code ` without first installing prerequisites. On other platforms, the SRW App can be :ref:`run within a container ` that includes the HPC-Stack, or the required libraries will need to be installed as part of the :ref:`SRW Application build ` process. Once these prerequisite libraries are installed, applications and models should build and run successfully. 
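One common way to check experiment progress with the Rocoto tools referenced above is ``rocotostat``, sketched here as a general illustration rather than as the exact command intended by the guide:

.. code-block:: console

   # Run from the experiment directory ($EXPTDIR)
   rocotostat -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10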
However, users may need to perform additional troubleshooting on Level 3 or 4 systems since little or no pre-release testing has been conducted on these systems. @@ -345,6 +345,8 @@ A list of available documentation is shown in :numref:`Table %s ` | + +----------------------------+---------------------------------------------------------------------------------+ | NCEPLIBS Documentation | https://github.com/NOAA-EMC/NCEPLIBS/wiki | +----------------------------+---------------------------------------------------------------------------------+ | NCEPLIBS-external | https://github.com/NOAA-EMC/NCEPLIBS-external/wiki | From ea17b193f04db2f3989d524ecee2007a8a5c6a1a Mon Sep 17 00:00:00 2001 From: gspetro Date: Wed, 23 Mar 2022 19:01:44 -0400 Subject: [PATCH 067/118] minor intro updates --- docs/UsersGuide/source/Introduction.rst | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/UsersGuide/source/Introduction.rst b/docs/UsersGuide/source/Introduction.rst index 62c98c7c47..f266640f3d 100644 --- a/docs/UsersGuide/source/Introduction.rst +++ b/docs/UsersGuide/source/Introduction.rst @@ -87,7 +87,7 @@ The SRW Application has a portable CMake-based build system that packages togeth The SRW Application allows for configuration of various elements of the workflow. For example, users can modify the parameters of the atmospheric model, such as start and end dates, duration, time step, and the physics suite used for the simulation. -This SRW Application release has been tested on a variety of platforms widely used by researchers, including NOAA High-Performance Computing (HPC) systems (e.g. Hera, Orion), cloud environments, and generic Linux and macOS systems. Four `levels of support `_ have been defined for the SRW Application. Preconfigured (Level 1) systems already have the required external libraries (e.g., HPC-Stack) available in a central location. The SRW Application is expected to build and run out-of-the-box on these systems, and users can :ref:`download the SRW App code ` without first installing prerequisites. On other platforms, the SRW App can be :ref:`run within a container ` that includes the HPC-Stack, or the required libraries will need to be installed as part of the :ref:`SRW Application build ` process. Once these prerequisite libraries are installed, applications and models should build and run successfully. However, users may need to perform additional troubleshooting on Level 3 or 4 systems since little or no pre-release testing has been conducted on these systems. +This SRW Application release has been tested on a variety of platforms widely used by researchers, including NOAA High-Performance Computing (HPC) systems (e.g. Hera, Orion), cloud environments, and generic Linux and macOS systems. Four `levels of support `_ have been defined for the SRW Application. Preconfigured (Level 1) systems already have the required external libraries (HPC-Stack) available in a central location. The SRW Application is expected to build and run out-of-the-box on these systems, and users can :ref:`download the SRW App code ` without first installing prerequisites. On other platforms, the SRW App can be :ref:`run within a container ` that includes the HPC-Stack, or the required libraries will need to be installed as part of the :ref:`SRW Application build ` process. Once these prerequisite libraries are installed, applications and models should build and run successfully. 
However, users may need to perform additional troubleshooting on Level 3 or 4 systems since little or no pre-release testing has been conducted on these systems. From 1aa932247cabc1a2d8d0801e786c82ab6923a59b Mon Sep 17 00:00:00 2001 From: gspetro Date: Fri, 25 Mar 2022 15:23:50 -0400 Subject: [PATCH 068/118] 1st round of jwolff's edits --- docs/UsersGuide/source/BuildRunSRW.rst | 26 +++++++++++++------------- docs/UsersGuide/source/Components.rst | 2 +- 2 files changed, 14 insertions(+), 14 deletions(-) diff --git a/docs/UsersGuide/source/BuildRunSRW.rst b/docs/UsersGuide/source/BuildRunSRW.rst index 85ae658c5e..4a22720537 100644 --- a/docs/UsersGuide/source/BuildRunSRW.rst +++ b/docs/UsersGuide/source/BuildRunSRW.rst @@ -6,14 +6,14 @@ Building and Running the SRW App The Unified Forecast System (:term:`UFS`) Short-Range Weather (SRW) Application is an :term:`umbrella repository` consisting of a number of different :ref:`components ` housed in external repositories. Once the SRW App is configured and built, users can generate predictions of atmospheric behavior over a limited spatial area and on time scales ranging from minutes out to several days. -This chapter walks users through how to build and run the "out-of-the-box" case for the SRW App. However, the steps are relevant to any SRW Application experiment and can be modified to suit user goals. The "out-of-the-box" SRW App case builds a weather forecast for June 15-16, 2019. Multiple convective weather events during these two days produced over 200 filtered storm reports. Severe weather was clustered in two areas: the Upper Midwest through the Ohio Valley and the Southern Great Plains. This forecast uses a predefined 25-km Continental United States (:term:`CONUS`) grid (RRFS_CONUS_25km), the Global Forecast System (:term:`GFS`) version 15.2 physics suite (FV3_GFS_v15p2 :term:`CCPP`), and :term:`FV3`-based GFS raw external model data for initialization. +This chapter walks users through how to build and run the "out-of-the-box" case for the SRW App. However, the steps are relevant to any SRW Application experiment and can be modified to suit user goals. The "out-of-the-box" SRW App case builds a weather forecast for June 15-16, 2019. Multiple convective weather events during these two days produced over 200 filtered storm reports. Severe weather was clustered in two areas: the Upper Midwest through the Ohio Valley and the Southern Great Plains. This forecast uses a predefined 25-km Continental United States (:term:`CONUS`) domain (RRFS_CONUS_25km), the Global Forecast System (:term:`GFS`) version 15.2 physics suite (FV3_GFS_v16 :term:`CCPP`), and :term:`FV3`-based GFS raw external model data for initialization. .. attention:: All UFS applications support `four platform levels `_. The steps described in this chapter will work most smoothly on preconfigured (Level 1) systems. On Level 1 systems, all of the required libraries for building community releases of UFS models and applications are available in a central location. This guide can serve as a starting point for running the SRW App on other systems, too, but the user may need to perform additional troubleshooting. .. note:: - The :ref:`container approach ` is recommended for a smoother build and run experience. Building without a container allows for the use of the Rocoto workflow manager and may allow for more customization. However, the non-container approach requires more in-depth troubleshooting skills, especially on Level 3 and 4 systems, and is less appropriate for beginners. 
+ The :ref:`container approach ` is recommended for a smoother build and run experience. Building without a container allows for the use of the Rocoto workflow manager and may allow for more customization. However, the non-container approach requires more in-depth system-based knowledge, especially on Level 3 and 4 systems; it is less appropriate for beginners. The overall procedure for generating an experiment is shown in :numref:`Figure %s `, with the scripts to generate and run the workflow shown in red. The steps are as follows: @@ -54,7 +54,7 @@ The UFS Weather Model draws on over 50 code libraries to run its applications. T Instructions ------------------------- -Users working on systems that fall under `Support Levels 2-4 `_ will need to install the HPC-Stack the first time they try to run applications (such as the SRW App) or models that depend on it. Users can either build the HPC-stack on their local system or use the centrally maintained stacks on each HPC platform if they are working on a Level 1 system. For a detailed description of installation options, see :ref:`Installing the HPC-Stack `. +Users working on systems that fall under `Support Levels 2-4 `_ will need to install the HPC-Stack the first time they try to build applications (such as the SRW App) or models that depend on it. Users can either build the HPC-stack on their local system or use the centrally maintained stacks on each HPC platform if they are working on a Level 1 system. For a detailed description of installation options, see :ref:`Installing the HPC-Stack `. After completing installation, continue to the next section. @@ -62,7 +62,7 @@ After completing installation, continue to the next section. Download the UFS SRW Application Code ====================================== -The SRW Application source code is publicly available on GitHub. To download the SRW App, clone the release branch of the repository: +The SRW Application source code is publicly available on GitHub. To download the SRW App, clone the ``develop`` branch of the repository: .. code-block:: console @@ -98,7 +98,7 @@ The cloned repository contains the configuration files and sub-directories shown +--------------------------------+--------------------------------------------------------+ | env | Contains build and workflow environment files | +--------------------------------+--------------------------------------------------------+ - | docs | Contains release notes, documentation, and Users' Guide| + | docs | Contains release notes, documentation, and User's Guide| +--------------------------------+--------------------------------------------------------+ | manage_externals | Utility for checking out external repositories | +--------------------------------+--------------------------------------------------------+ @@ -154,7 +154,7 @@ On Level 1 systems, the commands in the ``build__.env`` file from the main ``ufs-srweather-app`` directory to source the appropriate file. -On Level 2-4 systems, users will need to modify certain environment variables, such as the path to NCEP libraries, so that the SRW App can find and load the appropriate modules. For systems with Lmod installed, one of the current ``build__.env`` files can be copied and used as a template. To check if Lmod is installed, run ``echo $LMOD_PKG``, and see if it outputs a path to the Lmod package. On systems without Lmod, users can modify or set the required environment variables using commands in the form ``export =``. 
Users may need to use ``setenv`` rather than ``export`` depending on their shell environment. +On Level 2-4 systems, users will need to modify certain environment variables, such as the path to NCEP libraries, so that the SRW App can find and load the appropriate modules. For systems with Lmod installed, one of the current ``build__.env`` files can be copied and used as a template. To check if Lmod is installed, run ``echo $LMOD_PKG``, and see if it outputs a path to the Lmod package. On systems without Lmod, users can modify or set the required environment variables using commands in the form ``export =``. Users may need to use ``setenv`` rather than ``export`` depending on their shell environment. For example, ``setenv ``. .. _BuildExecutables: @@ -294,7 +294,7 @@ Default configuration: ``config_defaults.sh`` ------------------------------------------------ .. note:: - This section provides background information on how the SRW App uses the ``config_defaults.sh`` file. This information is helpful but not essential to running the SRW App. Users may skip to :numref:`Step %s ` to continue configuring their experiment. + This section provides background information on how the SRW App uses the ``config_defaults.sh`` file. This information is informative, but users do not need to modify ``config_defaults.sh`` to run the out-of-the-box case for the SRW App. Users may skip to :numref:`Step %s ` to continue configuring their experiment. Important configuration variables in the ``config_defaults.sh`` file appear in :numref:`Table %s `. Some of these default values are intentionally invalid in order to ensure that the user assigns valid values in the user-specified ``config.sh`` file. Any settings provided in ``config.sh`` will override the default ``config_defaults.sh`` @@ -413,7 +413,7 @@ settings. There is usually no need for a user to modify the default configuratio User-specific configuration: ``config.sh`` -------------------------------------------- -The user must specify certain basic information about the experiment in a ``config.sh`` file located in the ``ufs-srweather-app/regional_workflow/ush`` directory. Two example templates are provided in that directory: ``config.community.sh`` and ``config.nco.sh``. The first file is a minimal example for creating and running an experiment in the *community* mode (with ``RUN_ENVIR`` set to ``community``). The second is an example for creating and running an experiment in the *NCO* (operational) mode (with ``RUN_ENVIR`` set to ``nco``). The *community* mode is recommended in most cases and will be fully supported for this release. The operational/NCO mode will typically be used by those at the NOAA/NCEP/Environmental Modeling Center (EMC) and the NOAA/Global Systems Laboratory (GSL) working on pre-implementation testing for the Rapid-Refresh Forecast System (RRFS). :numref:`Table %s ` shows the configuration variables, along with their default values in ``config_default.sh`` and the values defined in ``config.community.sh``. +The user must specify certain basic information about the experiment in a ``config.sh`` file located in the ``ufs-srweather-app/regional_workflow/ush`` directory. Two example templates are provided in that directory: ``config.community.sh`` and ``config.nco.sh``. The first file is a minimal example for creating and running an experiment in the *community* mode (with ``RUN_ENVIR`` set to ``community``). 
The second is an example for creating and running an experiment in the *NCO* (operational) mode (with ``RUN_ENVIR`` set to ``nco``). The *community* mode is recommended in most cases and will be fully supported for this release. The operational/NCO mode will typically be used by those at the NOAA/NCEP/Environmental Modeling Center (EMC) and the NOAA/Global Systems Laboratory (GSL) working on pre-implementation testing for the Rapid Refresh Forecast System (RRFS). :numref:`Table %s ` shows the configuration variables, along with their default values in ``config_default.sh`` and the values defined in ``config.community.sh``. .. _ConfigCommunity: @@ -426,7 +426,7 @@ The user must specify certain basic information about the experiment in a ``conf +--------------------------------+-------------------+--------------------------------------------------------+ | ACCOUNT | "project_name" | "an_account" | +--------------------------------+-------------------+--------------------------------------------------------+ - | EXPT_SUBDIR | "" | "test_CONUS_25km_GFSv15p2" | + | EXPT_SUBDIR | "" | "test_CONUS_25km_GFSv16" | +--------------------------------+-------------------+--------------------------------------------------------+ | VERBOSE | "TRUE" | "TRUE" | +--------------------------------+-------------------+--------------------------------------------------------+ @@ -440,7 +440,7 @@ The user must specify certain basic information about the experiment in a ``conf +--------------------------------+-------------------+--------------------------------------------------------+ | QUILTING | "TRUE" | "TRUE" | +--------------------------------+-------------------+--------------------------------------------------------+ - | CCPP_PHYS_SUITE | "FV3_GSD_V0" | "FV3_GFS_v15p2" | + | CCPP_PHYS_SUITE | "FV3_GSD_V0" | "FV3_GFS_v16" | +--------------------------------+-------------------+--------------------------------------------------------+ | FCST_LEN_HRS | "24" | "48" | +--------------------------------+-------------------+--------------------------------------------------------+ @@ -481,7 +481,7 @@ To get started, make a copy of ``config.community.sh``. From the ``ufs-srweather cd regional_workflow/ush cp config.community.sh config.sh -The default settings in this file include a predefined 25-km :term:`CONUS` grid (RRFS_CONUS_25km), the :term:`GFS` v15.2 physics suite (FV3_GFS_v15p2 :term:`CCPP`), and :term:`FV3`-based GFS raw external model data for initialization. +The default settings in this file include a predefined 25-km :term:`CONUS` grid (RRFS_CONUS_25km), the :term:`GFS` v16 physics suite (FV3_GFS_v16 :term:`CCPP`), and :term:`FV3`-based GFS raw external model data for initialization. Next, edit the new ``config.sh`` file to customize it for your machine. At a minimum, change the ``MACHINE`` and ``ACCOUNT`` variables; then choose a name for the experiment directory by setting ``EXPT_SUBDIR``. If you have pre-staged the initialization data for the experiment, set ``USE_USER_STAGED_EXTRN_FILES="TRUE"``, and set the paths to the data for ``EXTRN_MDL_SOURCE_BASEDIR_ICS`` and ``EXTRN_MDL_SOURCE_BASEDIR_LBCS``. 
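As a quick illustration, a minimal set of ``config.sh`` edits might look like the following sketch. The machine name, account, and data paths below are placeholders standing in for a user's own values rather than settings prescribed by this guide:

.. code-block:: console

   MACHINE="hera"
   ACCOUNT="an_account"
   EXPT_SUBDIR="test_CONUS_25km_GFSv16"
   USE_USER_STAGED_EXTRN_FILES="TRUE"
   EXTRN_MDL_SOURCE_BASEDIR_ICS="/path/to/staged/FV3GFS/grib2"
   EXTRN_MDL_SOURCE_BASEDIR_LBCS="/path/to/staged/FV3GFS/grib2"

Platform-specific values for these variables on Level 1 systems are listed below.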
@@ -505,8 +505,8 @@ Minimum parameter settings for Level 1 machines: ACCOUNT="" EXPT_SUBDIR="" USE_USER_STAGED_EXTRN_FILES="TRUE" - EXTRN_MDL_SOURCE_BASEDIR_ICS="/glade/p/ral/jntp/UFS_SRW_app/model_data/FV3GFS" - EXTRN_MDL_SOURCE_BASEDIR_LBCS="/glade/p/ral/jntp/UFS_SRW_app/model_data/FV3GFS" + EXTRN_MDL_SOURCE_BASEDIR_ICS="/glade/p/ral/jntp/UFS_SRW_app/develop/staged_extrn_mdl_files" + EXTRN_MDL_SOURCE_BASEDIR_LBCS="/glade/p/ral/jntp/UFS_SRW_app/develop/staged_extrn_mdl_files" **Hera:** diff --git a/docs/UsersGuide/source/Components.rst b/docs/UsersGuide/source/Components.rst index 6a358594e0..eb76b19c10 100644 --- a/docs/UsersGuide/source/Components.rst +++ b/docs/UsersGuide/source/Components.rst @@ -35,7 +35,7 @@ The prognostic atmospheric model in the UFS SRW Application is the Finite-Volume Supported model resolutions in this release include 3-, 13-, and 25-km predefined Contiguous U.S. (:term:`CONUS`) domains, each with 64 vertical levels. Preliminary tools for users to define their own domain are also available in the release with full, formal support of these tools to be provided in future releases. The Extended Schmidt Gnomonic (ESG) grid is used with the FV3-LAM, which features relatively uniform grid cells across the entirety of the domain. Additional information about the FV3 dynamical core can be found `here `__ and on the `NOAA Geophysical Fluid Dynamics Laboratory website `_. -Interoperable atmospheric physics, along with the Noah Multi-parameterization (Noah MP) Land Surface Model options, are supported through the Common Community Physics Package (:term:`CCPP`), described `here `__. Atmospheric physics are a set of numerical methods describing small-scale processes such as clouds, turbulence, radiation, and their interactions. There are two physics options supported for the release. The first is an experimental physics suite being tested for use in the future operational implementation of the Rapid Refresh Forecast System (RRFS) planned for 2023-2024, and the second is an updated version of the physics suite used in the operational Global Forecast System (GFS) v15. A scientific description of the CCPP parameterizations and suites can be found in the `CCPP Scientific Documentation `_, and CCPP technical aspects are described in the `CCPP Technical Documentation `_. The model namelist has many settings beyond the physics options that can optimize various aspects of the model for use with each +Interoperable atmospheric physics, along with the Noah Multi-parameterization (Noah MP) Land Surface Model options, are supported through the Common Community Physics Package (:term:`CCPP`), described `here `__. Atmospheric physics are a set of numerical methods describing small-scale processes such as clouds, turbulence, radiation, and their interactions. There are two physics options supported for the release. The first is an experimental physics suite being tested for use in the future operational implementation of the Rapid Refresh Forecast System (RRFS) planned for 2023-2024, and the second is an updated version of the physics suite used in the operational Global Forecast System (GFS) v16. A scientific description of the CCPP parameterizations and suites can be found in the `CCPP Scientific Documentation `_, and CCPP technical aspects are described in the `CCPP Technical Documentation `_. The model namelist has many settings beyond the physics options that can optimize various aspects of the model for use with each of the supported suites. 
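In the experiment configuration, the physics suite itself is selected with the ``CCPP_PHYS_SUITE`` variable shown in the configuration table above. A brief sketch, with an alternate suite left commented out:

.. code-block:: console

   CCPP_PHYS_SUITE="FV3_GFS_v16"         # GFS v16 physics suite
   # CCPP_PHYS_SUITE="FV3_RRFS_v1beta"   # experimental RRFS physics suite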
The SRW App supports the use of both :term:`GRIB2` and :term:`NEMSIO` input data. The UFS Weather Model ingests initial and lateral boundary condition files produced by :term:`chgres_cube` and outputs files in netCDF format on a specific projection (e.g., Lambert Conformal) in the horizontal direction and model levels in the vertical direction. From 3d1cddbebc7e3f3664f6887ca614e82a7a33c740 Mon Sep 17 00:00:00 2001 From: gspetro Date: Mon, 28 Mar 2022 19:20:31 -0400 Subject: [PATCH 069/118] 2nd round of jwolff updates --- docs/UsersGuide/source/BuildRunSRW.rst | 71 ++++++++++++++++----- docs/UsersGuide/source/Components.rst | 9 ++- docs/UsersGuide/source/InputOutputFiles.rst | 18 ++++-- docs/UsersGuide/source/Introduction.rst | 16 ++--- docs/UsersGuide/source/Quickstart.rst | 15 ++--- 5 files changed, 85 insertions(+), 44 deletions(-) diff --git a/docs/UsersGuide/source/BuildRunSRW.rst b/docs/UsersGuide/source/BuildRunSRW.rst index 4a22720537..5eee295f30 100644 --- a/docs/UsersGuide/source/BuildRunSRW.rst +++ b/docs/UsersGuide/source/BuildRunSRW.rst @@ -154,7 +154,13 @@ On Level 1 systems, the commands in the ``build__.env`` file from the main ``ufs-srweather-app`` directory to source the appropriate file. -On Level 2-4 systems, users will need to modify certain environment variables, such as the path to NCEP libraries, so that the SRW App can find and load the appropriate modules. For systems with Lmod installed, one of the current ``build__.env`` files can be copied and used as a template. To check if Lmod is installed, run ``echo $LMOD_PKG``, and see if it outputs a path to the Lmod package. On systems without Lmod, users can modify or set the required environment variables using commands in the form ``export =``. Users may need to use ``setenv`` rather than ``export`` depending on their shell environment. For example, ``setenv ``. +On Level 2-4 systems, users will need to modify certain environment variables, such as the path to NCEP libraries, so that the SRW App can find and load the appropriate modules. For systems with Lmod installed, one of the current ``build__.env`` files can be copied and used as a template. To check whether Lmod is installed, run ``echo $LMOD_PKG``, and see if it outputs a path to the Lmod package. On systems without Lmod, users can modify or set the required environment variables with the ``export`` or ``setenv`` commands despending on whether they are using a bash or csh/tcsh shell, respectively: + +.. code-block:: + + export = + setenv + .. _BuildExecutables: @@ -495,7 +501,7 @@ Sample settings are indicated below for Level 1 platforms. Detailed guidance app To determine an appropriate ACCOUNT field for Level 1 systems, run ``groups``, and it will return a list of projects you have permissions for. Not all of the listed projects/groups have an HPC allocation, but those that do are potentially valid account names. 
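For example, the check might look like this (the group names shown are hypothetical and will differ for each user):

.. code-block:: console

   groups
   # hypothetical output: myuser an_account cfd_lab
   # "an_account" could then be used as the ACCOUNT value in config.sh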
-Minimum parameter settings for Level 1 machines: +Minimum parameter settings for running the out-of-the-box SRW App case on Level 1 machines: **Cheyenne:** @@ -505,28 +511,61 @@ Minimum parameter settings for Level 1 machines: ACCOUNT="" EXPT_SUBDIR="" USE_USER_STAGED_EXTRN_FILES="TRUE" - EXTRN_MDL_SOURCE_BASEDIR_ICS="/glade/p/ral/jntp/UFS_SRW_app/develop/staged_extrn_mdl_files" - EXTRN_MDL_SOURCE_BASEDIR_LBCS="/glade/p/ral/jntp/UFS_SRW_app/develop/staged_extrn_mdl_files" + EXTRN_MDL_SOURCE_BASEDIR_ICS="/glade/p/ral/jntp/UFS_SRW_app/staged_extrn_mdl_files" + EXTRN_MDL_SOURCE_BASEDIR_LBCS="/glade/p/ral/jntp/UFS_SRW_app/staged_extrn_mdl_files" -**Hera:** +**Hera, Jet, Orion, Gaea:** + +The ``MACHINE``, ``ACCOUNT``, and ``EXPT_SUBDIR`` settings are the same as for Cheyenne, except that ``"cheyenne"`` should be switched to ``"hera"``, ``"jet"``, ``"orion"``, or ``"gaea"``, respectively. Set ``USE_USER_STAGED_EXTRN_FILES="TRUE"``, but replace the file paths to Cheyenne's data with the file paths for the correct machine. ``EXTRN_MDL_SOURCE_BASEDIR_ICS`` and ``EXTRN_MDL_SOURCE_BASEDIR_LBCS`` use the same file path. + +On Hera: .. code-block:: console - MACHINE="hera" - ACCOUNT="" - EXPT_SUBDIR="" + "/scratch2/BMC/det/UFS_SRW_app/v1p0/model_data" + +On Jet: + +.. code-block:: console + + "/lfs4/BMC/wrfruc/FV3-LAM/model_data" + +On Orion: + +.. code-block:: console -**Jet, Orion, Gaea:** + "/work/noaa/fv3-cam/UFS_SRW_app/v1p0/model_data" -The settings are the same as for Hera, except that ``"hera"`` should be switched to ``"jet"``, ``"orion"``, or ``"gaea"``, respectively. -For **WCOSS**, edit ``config.sh`` with these WCOSS-specific parameters, and use a valid WCOSS project code for the account parameter: +On Gaea: + +.. code-block:: console + + "/lustre/f2/pdata/esrl/gsd/ufs/ufs-srw-release-v1.0.0/staged_extrn_mdl_files" + + +For **WCOSS** systems, edit ``config.sh`` with these WCOSS-specific parameters, and use a valid WCOSS project code for the account parameter: .. code-block:: console MACHINE=”wcoss_cray” or MACHINE=”wcoss_dell_p3” ACCOUNT="my_account" EXPT_SUBDIR="my_expt_name" + USE_USER_STAGED_EXTRN_FILES="TRUE" + +For WCOSS_DELL_P3: + +.. code-block:: console + + EXTRN_MDL_SOURCE_BASEDIR_ICS="/gpfs/dell2/emc/modeling/noscrub/UFS_SRW_App/model_data" + EXTRN_MDL_SOURCE_BASEDIR_LBCS="/gpfs/dell2/emc/modeling/noscrub/UFS_SRW_App/model_data" + +For WCOSS_CRAY: + +.. code-block:: console + + EXTRN_MDL_SOURCE_BASEDIR_ICS="/gpfs/hps3/emc/meso/noscrub/UFS_SRW_App/model_data" + EXTRN_MDL_SOURCE_BASEDIR_LBCS="/gpfs/hps3/emc/meso/noscrub/UFS_SRW_App/model_data" **NOAA Cloud Systems:** @@ -630,14 +669,14 @@ The ``FV3LAM_wflow.xml`` file runs the specific j-job scripts (``regional_workfl +----------------------+------------------------------------------------------------+ | **Workflow Task** | **Task Description** | +======================+============================================================+ - | make_grid | Pre-processing task to generate regional grid files. Can | - | | be run, at most, once per experiment. | + | make_grid | Pre-processing task to generate regional grid files. Only | + | | needs to be run once per experiment. | +----------------------+------------------------------------------------------------+ - | make_orog | Pre-processing task to generate orography files. Can be | - | | run, at most, once per experiment. | + | make_orog | Pre-processing task to generate orography files. Only | + | | needs to be run once per experiment. 
| +----------------------+------------------------------------------------------------+ | make_sfc_climo | Pre-processing task to generate surface climatology files. | - | | Can be run, at most, once per experiment. | + | | Only needs to be run, at most, once per experiment. | +----------------------+------------------------------------------------------------+ | get_extrn_ics | Cycle-specific task to obtain external data for the | | | initial conditions | diff --git a/docs/UsersGuide/source/Components.rst b/docs/UsersGuide/source/Components.rst index eb76b19c10..5576837164 100644 --- a/docs/UsersGuide/source/Components.rst +++ b/docs/UsersGuide/source/Components.rst @@ -19,7 +19,7 @@ These components are documented within this User's Guide and supported through a Pre-processor Utilities and Initial Conditions ============================================== -The SRW Application includes a number of pre-processing utilities that initialize and prepare the model. Since the SRW App provides forecast predictions over a limited area (rather than globally), it is necessary to first generate a regional grid (``regional_esg_grid/make_hgrid``) along with :term:`orography` (``orog``) and surface climatology (``sfc_climo_gen``) files on that grid. Grids include a strip, or "halo," of six cells that surround the regional grid and feed in lateral boundary condition data. Since different grid and orography files require different numbers of halo cells, additional utilities handle the correct number of halo shave points and topography filtering. The pre-processing software ``chgres_cube`` is used to convert the raw external model data into initial and lateral boundary condition files in netCDF format. These are needed as input to the FV3-LAM. Additional information about the UFS pre-processor utilities can be found in the `UFS_UTILS User’s Guide `_. +The SRW Application includes a number of pre-processing utilities that initialize and prepare the model. Since the SRW App provides forecast predictions over a limited area (rather than globally), it is necessary to first generate a regional grid (``regional_esg_grid/make_hgrid``) along with :term:`orography` (``orog``) and surface climatology (``sfc_climo_gen``) files on that grid. Grids include a strip, or "halo," of six cells that surround the regional grid and feed in lateral boundary condition data. Since different grid and orography files require different numbers of halo cells, additional utilities handle topography filtering and shave the number of halo points (based on downstream workflow component requirements). The pre-processing software ``chgres_cube`` is used to convert the raw external model data into initial and lateral boundary condition files in netCDF format. These are needed as input to the FV3-LAM. Additional information about the UFS pre-processor utilities can be found in the `UFS_UTILS User's Guide `_. The SRW Application can be initialized from a range of operational initial condition files. It is possible to initialize the model from the Global Forecast System (:term:`GFS`), North American Mesoscale (:term:`NAM`) Forecast System, Rapid Refresh (:term:`RAP`), and High-Resolution Rapid Refresh (:term:`HRRR`) files in Gridded Binary v2 (:term:`GRIB2`) format. GFS files also come in :term:`NEMSIO` format for past dates. 
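As a hedged sketch of how this choice is expressed when setting up an experiment, the external model and its file format are selected through configuration variables; the ``EXTRN_MDL_NAME_*`` and ``FV3GFS_FILE_FMT_*`` names below are assumed from the experiment configuration files rather than defined in this overview:

.. code-block:: console

   EXTRN_MDL_NAME_ICS="FV3GFS"      # or "NAM", "RAP", "HRRR"
   EXTRN_MDL_NAME_LBCS="FV3GFS"
   FV3GFS_FILE_FMT_ICS="grib2"      # GFS NEMSIO files are also accepted for past dates
   FV3GFS_FILE_FMT_LBCS="grib2"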
@@ -31,19 +31,18 @@ Forecast Model ============== The prognostic atmospheric model in the UFS SRW Application is the Finite-Volume Cubed-Sphere -(:term:`FV3`) dynamical core configured with a Limited Area Model (:term:`LAM`) capability :cite:`BlackEtAl2020`. The dynamical core is the computational part of a model that solves the equations of fluid motion. A User’s Guide for the UFS :term:`Weather Model` is `here `__. +(:term:`FV3`) dynamical core configured with a Limited Area Model (:term:`LAM`) capability :cite:`BlackEtAl2020`. The dynamical core is the computational part of a model that solves the equations of fluid motion. A User’s Guide for the UFS :term:`Weather Model` is `here `__. Supported model resolutions in this release include 3-, 13-, and 25-km predefined Contiguous U.S. (:term:`CONUS`) domains, each with 64 vertical levels. Preliminary tools for users to define their own domain are also available in the release with full, formal support of these tools to be provided in future releases. The Extended Schmidt Gnomonic (ESG) grid is used with the FV3-LAM, which features relatively uniform grid cells across the entirety of the domain. Additional information about the FV3 dynamical core can be found `here `__ and on the `NOAA Geophysical Fluid Dynamics Laboratory website `_. -Interoperable atmospheric physics, along with the Noah Multi-parameterization (Noah MP) Land Surface Model options, are supported through the Common Community Physics Package (:term:`CCPP`), described `here `__. Atmospheric physics are a set of numerical methods describing small-scale processes such as clouds, turbulence, radiation, and their interactions. There are two physics options supported for the release. The first is an experimental physics suite being tested for use in the future operational implementation of the Rapid Refresh Forecast System (RRFS) planned for 2023-2024, and the second is an updated version of the physics suite used in the operational Global Forecast System (GFS) v16. A scientific description of the CCPP parameterizations and suites can be found in the `CCPP Scientific Documentation `_, and CCPP technical aspects are described in the `CCPP Technical Documentation `_. The model namelist has many settings beyond the physics options that can optimize various aspects of the model for use with each -of the supported suites. +Interoperable atmospheric physics, along with various land surface model options, are supported through the Common Community Physics Package (:term:`CCPP`), described `here `__. Atmospheric physics are a set of numerical methods describing small-scale processes such as clouds, turbulence, radiation, and their interactions. There will be four physics options supported for the v2.0 release. The first is an experimental physics suite being tested for use in the future operational implementation of the Rapid Refresh Forecast System (RRFS) planned for 2023-2024, and the second is an updated version of the physics suite used in the operational Global Forecast System (GFS) v16. Additionally, FV3_WoFS and FV3_HRRR will be supported. A scientific description of the CCPP parameterizations and suites can be found in the `CCPP Scientific Documentation `_, and CCPP technical aspects are described in the `CCPP Technical Documentation `_. The model namelist has many settings beyond the physics options that can optimize various aspects of the model for use with each of the supported suites. The SRW App supports the use of both :term:`GRIB2` and :term:`NEMSIO` input data. 
The UFS Weather Model ingests initial and lateral boundary condition files produced by :term:`chgres_cube` and outputs files in netCDF format on a specific projection (e.g., Lambert Conformal) in the horizontal direction and model levels in the vertical direction. Post-processor ============== -The SRW Application is distributed with the Unified Post Processor (:term:`UPP`) included in the workflow as a way to convert the netCDF output on the native model grid to :term:`GRIB2` format on standard isobaric vertical coordinates. The UPP can also be used to compute a variety of useful diagnostic fields, as described in the `UPP User’s Guide `__. +The SRW Application is distributed with the Unified Post Processor (:term:`UPP`) included in the workflow as a way to convert the netCDF output on the native model grid to :term:`GRIB2` format on standard isobaric vertical coordinates. The UPP can also be used to compute a variety of useful diagnostic fields, as described in the `UPP User’s Guide `__. Output from UPP can be used with visualization, plotting, and verification packages or in further downstream post-processing (e.g., statistical post-processing techniques). diff --git a/docs/UsersGuide/source/InputOutputFiles.rst b/docs/UsersGuide/source/InputOutputFiles.rst index 5e2f118cd0..cfda134b60 100644 --- a/docs/UsersGuide/source/InputOutputFiles.rst +++ b/docs/UsersGuide/source/InputOutputFiles.rst @@ -25,17 +25,17 @@ the external model data can be found in :numref:`Section %s `, :numref:`Step %s Generate the Forecast Experiment ` links the input data for the pre-processing utilities from a location on disk to the experiment directory. The pre-processing utilities use many different datasets to create grids and to generate model input datasets from the external model files. A detailed description of the input files for the pre-processing utilities can be found `here `__. +When a user runs the SRW Application as described in the Quick Start Guide :numref:`Chapter %s `, :numref:`Step %s Generate the Forecast Experiment ` links the input data for the pre-processing utilities from a location on disk to the experiment directory. The pre-processing utilities use many different datasets to create grids and to generate model input datasets from the external model files. A detailed description of the input files for the pre-processing utilities can be found `here `__. UFS Weather Model ----------------- The input files for the weather model include both static (fixed) files and grid- and date-specific files (terrain, initial conditions, boundary conditions, etc). The static fix files -must be staged by the user unless you are running on a Level 1/pre-configured platform, in which case you can link to the existing copy of the data on that machine. See :numref:`Section %s ` for more information. The static, grid, and date-specific files are linked in the experiment directory by the workflow scripts. An extensive description of the input files for the weather model can be found in the `UFS Weather Model User's Guide `__. The namelists and configuration files for the SRW Application are created from templates by the workflow, as described in :numref:`Section %s `. +must be staged by the user unless you are running on a Level 1/pre-configured platform, in which case you can link to the existing copy of the data on that machine. See :numref:`Section %s ` for more information. The static, grid, and date-specific files are linked in the experiment directory by the workflow scripts. 
An extensive description of the input files for the weather model can be found in the `UFS Weather Model User's Guide `__. The namelists and configuration files for the SRW Application are created from templates by the workflow, as described in :numref:`Section %s `. Unified Post Processor (UPP) ---------------------------- Documentation for the UPP input files can be found in the `UPP User's Guide -`__. +`__. .. _WorkflowTemplates: @@ -96,7 +96,7 @@ and are shown in :numref:`Table %s `. Additional information related to the ``diag_table_[CCPP]``, ``field_table_[CCPP]``, ``input.nml.FV3``, ``model_conigure``, and ``nems.configure`` can be found in the `UFS Weather Model User's Guide `__, while information on the ``regional_grid.nml`` can be found in the `UFS_UTILS User’s Guide -`_. +`_. Migratory Route of the Input Files in the Workflow ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ @@ -140,6 +140,9 @@ experiment run directory ``EXPTDIR/YYYYMMDDHH/INPUT`` and consist of the followi These output files are used as inputs for the UFS weather model, and are described in the `Users Guide `__. +.. + COMMENT: Change link above (structure of "latest" is significantly different) + UFS Weather Model ----------------- As mentioned previously, the workflow can be run in ‘community’ or ‘nco’ mode, which determines @@ -156,11 +159,12 @@ In this case, the netCDF output files are written to the ``EXPTDIR/YYYYMMDDHH`` * ``phyfHHH.nc`` Additional details may be found in the UFS Weather Model `Users Guide -`__. +`__. + Unified Post Processor (UPP) ---------------------------- -Documentation for the UPP output files can be found `here `__. +Documentation for the UPP output files can be found `here `__. For the SRW Application, the weather model netCDF output files are written to the ``EXPTDIR/YYYYMMDDHH/postprd`` directory and have the naming convention (file->linked to): @@ -174,7 +178,7 @@ If you wish to modify the fields or levels that are output from the UPP, you wil .. note:: This process requires advanced knowledge of which fields can be output for the UFS Weather Model. -Use the directions in the `UPP User's Guide `__ for details on how to make modifications to the ``fv3lam.xml`` file and for remaking the flat text file that the UPP reads, which is called ``postxconfig-NT-fv3lam.txt`` (default). +Use the directions in the `UPP User's Guide `__ for details on how to make modifications to the ``fv3lam.xml`` file and for remaking the flat text file that the UPP reads, which is called ``postxconfig-NT-fv3lam.txt`` (default). Once you have created the new flat text file reflecting your changes, you will need to modify your ``config.sh`` to point the workflow to the new text file. In your ``config.sh``, set the following: diff --git a/docs/UsersGuide/source/Introduction.rst b/docs/UsersGuide/source/Introduction.rst index f266640f3d..6905e5a846 100644 --- a/docs/UsersGuide/source/Introduction.rst +++ b/docs/UsersGuide/source/Introduction.rst @@ -6,7 +6,7 @@ Introduction The Unified Forecast System (:term:`UFS`) is a community-based, coupled, comprehensive Earth modeling system. NOAA’s operational model suite for numerical weather prediction (:term:`NWP`) is quickly transitioning to the UFS from a number of different modeling systems. The UFS enables research, development, and contribution opportunities within the broader :term:`weather enterprise` (e.g. government, industry, and academia). For more information about the UFS, visit the `UFS Portal `__. 
-The UFS includes `multiple applications `__ that support different forecast durations and spatial domains. This documentation describes the UFS Short-Range Weather (SRW) Application, which targets predictions of atmospheric behavior on a limited spatial domain and on time scales from minutes to several days. The SRW Application v1.0 release includes a prognostic atmospheric model, pre- and post-processing, and a community workflow for running the system end-to-end. These components are documented within this User's Guide and supported through a `community forum `_. New and improved capabilities for this release include the addition of a verification package (MetPLUS) for both deterministic and ensemble simulations and support for four Stochastically Perturbed Perturbation (SPP) schemes. Future work will expand the capabilities of the application to include data assimilation (DA) and a forecast restart/cycling capability. +The UFS includes `multiple applications `__ that support different forecast durations and spatial domains. This documentation describes the UFS Short-Range Weather (SRW) Application, which targets predictions of atmospheric behavior on a limited spatial domain and on time scales from minutes to several days. The SRW Application v2.0 release includes a prognostic atmospheric model, pre- and post-processing, and a community workflow for running the system end-to-end. These components are documented within this User's Guide and supported through a `community forum `_. New and improved capabilities for this release include the addition of a verification package (METplus) for both deterministic and ensemble simulations and support for four Stochastically Perturbed Perturbation (SPP) schemes. Future work will expand the capabilities of the application to include data assimilation (DA) and a forecast restart/cycling capability. This documentation provides a :ref:`Quick Start Guide ` for running the SRW Application in a container and a :ref:`detailed guide ` for running the SRW App on supported platforms. It also provides an overview of the :ref:`release components ` and details on how to customize or modify different portions of the workflow. @@ -61,7 +61,7 @@ The prognostic atmospheric model in the UFS SRW Application is the Finite-Volume Common Community Physics Package ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -The `Common Community Physics Package `_ (:term:`CCPP`) supports interoperable atmospheric physics and land surface model options. Atmospheric physics are a set of numerical methods describing small-scale processes such as clouds, turbulence, radiation, and their interactions. The SRW App release includes an experimental physics version and an updated operational version. +The `Common Community Physics Package `_ (:term:`CCPP`) supports interoperable atmospheric physics and land surface model options. Atmospheric physics are a set of numerical methods describing small-scale processes such as clouds, turbulence, radiation, and their interactions. The SRW App release includes an four physics suites. Data Format ^^^^^^^^^^^^^^^^^^^^^^ @@ -336,13 +336,13 @@ A list of available documentation is shown in :numref:`Table %s ` | @@ -360,9 +360,9 @@ A list of available documentation is shown in :numref:`Table %s ` may be more appropriate for those users who desire additional customizability, particularly if they already have experience running the SRW App. 
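For orientation, the container workflow generally amounts to pulling a container image and binding local directories into it before building and running inside it. A rough sketch using generic Singularity commands follows; the image location and the bound paths are placeholders rather than the release's actual image or directories:

.. code-block:: console

   singularity pull srw_app.sif docker://<registry>/<srw-image>:<tag>
   singularity shell -B /path/to/local/data:/data srw_app.sif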
+This Quick Start Guide will help users to build and run the "out-of-the-box" case for the Unified Forecast System (:term:`UFS`) Short-Range Weather (SRW) Application using a :term:`container`. The container approach provides a uniform enviroment in which to build and run the SRW App. Normally, the details of building and running the SRW App vary from system to system due to the many possible combinations of operating systems, compilers, :term:`MPI`’s, and package versions available. Installation via container reduces this variability and allows for a smoother SRW App build and run experience. However, the :ref:`non-container approach ` may be more appropriate for those users who desire additional customizability, particularly if they already have experience running the SRW App. -The "out-of-the-box" SRW App case described in this User's Guide builds a weather forecast for June 15-16, 2019. Multiple convective weather events during these two days produced over 200 filtered storm reports. Severe weather was clustered in two areas: the Upper Midwest through the Ohio Valley and the Southern Great Plains. This forecast uses a predefined 25-km Continental United States (:term:`CONUS`) grid (RRFS_CONUS_25km), the Global Forecast System (:term:`GFS`) version 15.2 physics suite (FV3_GFS_v15p2 :term:`CCPP`), and :term:`FV3`-based GFS raw external model data for initialization. +The "out-of-the-box" SRW App case described in this User's Guide builds a weather forecast for June 15-16, 2019. Multiple convective weather events during these two days produced over 200 filtered storm reports. Severe weather was clustered in two areas: the Upper Midwest through the Ohio Valley and the Southern Great Plains. This forecast uses a predefined 25-km Continental United States (:term:`CONUS`) grid (RRFS_CONUS_25km), the Global Forecast System (:term:`GFS`) version 16 physics suite (FV3_GFS_v16 :term:`CCPP`), and :term:`FV3`-based GFS raw external model data for initialization. .. attention:: @@ -121,7 +121,7 @@ From the ``ufs-srweather-app`` directory, ``cd`` into the build directory and ru Download and Stage the Data ============================ -The SRW App requires input files to run. These include static datasets, initial and boundary condition files, and model configuration files. On Level 1 and 2 systems, the data required to run SRW App tests are already available. For Level 3 and 4 systems, the data must be added. Detailed instructions on how to add the data can be found in the :numref:`Section %s `. :numref:`Sections %s ` and :numref:`%s ` contain useful background information on the input and output files used in the SRW App. +The SRW App requires input files to run. These include static datasets, initial and boundary condition files, and model configuration files. On Level 1 and 2 systems, the data required to run SRW App tests are already available, as long as the ``--bind`` command in :numref:`Step %s ` included the directory with the data. For Level 3 and 4 systems, the data must be added. Detailed instructions on how to add the data can be found in the :numref:`Section %s `. :numref:`Sections %s ` and :numref:`%s ` contain useful background information on the input and output files used in the SRW App. .. _GenerateForecastC: @@ -130,7 +130,7 @@ Generate the Forecast Experiment To generate the forecast experiment, users must: #. :ref:`Set experiment parameters ` -#. :ref:`Set Python and other environment parameters ` +#. 
:ref:`Set Python and other environment parameters to activate the regional workflow ` #. :ref:`Run a script to generate the experiment workflow ` The first two steps depend on the platform being used and are described here for each Level 1 platform. Users will need to adjust the instructions to their machine if they are working on a Level 2-4 platform. @@ -148,7 +148,7 @@ Make a copy of ``config.community.sh`` to get started. From the ``ufs-srweather- cd regional_workflow/ush cp config.community.sh config.sh -The default settings in this file include a predefined 25-km :term:`CONUS` grid (RRFS_CONUS_25km), the :term:`GFS` v15.2 physics suite (FV3_GFS_v15p2 :term:`CCPP`), and :term:`FV3`-based GFS raw external model data for initialization. +The default settings in this file include a predefined 25-km :term:`CONUS` grid (RRFS_CONUS_25km), the :term:`GFS` v16 physics suite (FV3_GFS_v16 :term:`CCPP`), and :term:`FV3`-based GFS raw external model data for initialization. Next, edit the new ``config.sh`` file to customize it for your experiment. At a minimum, update the ``MACHINE`` and ``ACCOUNT`` variables; then choose a name for the experiment directory by setting ``EXPT_SUBDIR``: @@ -285,7 +285,7 @@ Check the batch script output file in your experiment directory for a “SUCCESS .. table:: List of tasks in the regional workflow in the order that they are executed. Scripts with the same stage number may be run simultaneously. The number of processors and wall clock time is a good starting point for Cheyenne or Hera - when running a 48-h forecast on the 25-km CONUS domain. + when running a 48-h forecast on the 25-km CONUS domain. For a brief description of tasks, see :numref:`Table %s `. +------------+------------------------+----------------+----------------------------+ | **Stage/** | **Task Run Script** | **Number of** | **Wall clock time (H:MM)** | @@ -313,7 +313,6 @@ Check the batch script output file in your experiment directory for a “SUCCESS | | | | forecast hour) | +------------+------------------------+----------------+----------------------------+ - .. hint:: If any of the scripts return an error that "Primary job terminated normally, but one process returned a non-zero exit code," there may not be enough space on one node to run the process. On an HPC system, the user will need to allocate a(nother) compute node. The process for doing so is system-dependent, and users should check the documentation available for their HPC system. Instructions for allocating a compute node on NOAA Cloud systems can be viewed in the :numref:`Step %s ` as an example. From 173b8380fd1e01b04f12dfbcd8990d9c6a8ce33b Mon Sep 17 00:00:00 2001 From: gspetro Date: Tue, 29 Mar 2022 10:43:38 -0400 Subject: [PATCH 070/118] update QS intro --- docs/UsersGuide/source/Quickstart.rst | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/UsersGuide/source/Quickstart.rst b/docs/UsersGuide/source/Quickstart.rst index bf94555a6f..8485ce1d2a 100644 --- a/docs/UsersGuide/source/Quickstart.rst +++ b/docs/UsersGuide/source/Quickstart.rst @@ -4,7 +4,7 @@ Container-Based Quick Start Guide ==================================== -This Quick Start Guide will help users to build and run the "out-of-the-box" case for the Unified Forecast System (:term:`UFS`) Short-Range Weather (SRW) Application using a :term:`container`. The container approach provides a uniform enviroment in which to build and run the SRW App. 
Normally, the details of building and running the SRW App vary from system to system due to the many possible combinations of operating systems, compilers, :term:`MPI`’s, and package versions available. Installation via container reduces this variability and allows for a smoother SRW App build and run experience. However, the :ref:`non-container approach ` may be more appropriate for those users who desire additional customizability, particularly if they already have experience running the SRW App. +This Quick Start Guide will help users to build and run the "out-of-the-box" case for the Unified Forecast System (:term:`UFS`) Short-Range Weather (SRW) Application using a `Singularity `__ :term:`container`. The container approach provides a uniform enviroment in which to build and run the SRW App. Normally, the details of building and running the SRW App vary from system to system due to the many possible combinations of operating systems, compilers, :term:`MPI`’s, and package versions available. Installation via Singularity container reduces this variability and allows for a smoother SRW App build experience. However, the container is not compatible with the `Rocoto workflow manager `__, so users must run each task in the workflow manually. Additionally, the Singularity container can only run on a single compute node, which makes the container-based approach inadequate for large experiments. It is an excellent starting point for running the "out-of-the-box" SRW App case and other small experiments. However, the :ref:`non-container approach ` may be more appropriate for those users who desire additional customizability or more compute power, particularly if they already have experience running the SRW App. The "out-of-the-box" SRW App case described in this User's Guide builds a weather forecast for June 15-16, 2019. Multiple convective weather events during these two days produced over 200 filtered storm reports. Severe weather was clustered in two areas: the Upper Midwest through the Ohio Valley and the Southern Great Plains. This forecast uses a predefined 25-km Continental United States (:term:`CONUS`) grid (RRFS_CONUS_25km), the Global Forecast System (:term:`GFS`) version 16 physics suite (FV3_GFS_v16 :term:`CCPP`), and :term:`FV3`-based GFS raw external model data for initialization. From 09581c810e0fc6322981cf9a85764e56cb347cee Mon Sep 17 00:00:00 2001 From: gspetro Date: Tue, 29 Mar 2022 12:11:27 -0400 Subject: [PATCH 071/118] fix minor physics details --- docs/UsersGuide/source/Components.rst | 2 +- docs/UsersGuide/source/Introduction.rst | 2 +- 2 files changed, 2 insertions(+), 2 deletions(-) diff --git a/docs/UsersGuide/source/Components.rst b/docs/UsersGuide/source/Components.rst index 5576837164..e92d8c922e 100644 --- a/docs/UsersGuide/source/Components.rst +++ b/docs/UsersGuide/source/Components.rst @@ -35,7 +35,7 @@ The prognostic atmospheric model in the UFS SRW Application is the Finite-Volume Supported model resolutions in this release include 3-, 13-, and 25-km predefined Contiguous U.S. (:term:`CONUS`) domains, each with 64 vertical levels. Preliminary tools for users to define their own domain are also available in the release with full, formal support of these tools to be provided in future releases. The Extended Schmidt Gnomonic (ESG) grid is used with the FV3-LAM, which features relatively uniform grid cells across the entirety of the domain. 
Additional information about the FV3 dynamical core can be found `here `__ and on the `NOAA Geophysical Fluid Dynamics Laboratory website `_. -Interoperable atmospheric physics, along with various land surface model options, are supported through the Common Community Physics Package (:term:`CCPP`), described `here `__. Atmospheric physics are a set of numerical methods describing small-scale processes such as clouds, turbulence, radiation, and their interactions. There will be four physics options supported for the v2.0 release. The first is an experimental physics suite being tested for use in the future operational implementation of the Rapid Refresh Forecast System (RRFS) planned for 2023-2024, and the second is an updated version of the physics suite used in the operational Global Forecast System (GFS) v16. Additionally, FV3_WoFS and FV3_HRRR will be supported. A scientific description of the CCPP parameterizations and suites can be found in the `CCPP Scientific Documentation `_, and CCPP technical aspects are described in the `CCPP Technical Documentation `_. The model namelist has many settings beyond the physics options that can optimize various aspects of the model for use with each of the supported suites. +Interoperable atmospheric physics, along with various land surface model options, are supported through the Common Community Physics Package (:term:`CCPP`), described `here `__. Atmospheric physics are a set of numerical methods describing small-scale processes such as clouds, turbulence, radiation, and their interactions. There will be four physics options supported for the v2.0 release. The first is a physics suite being tested for use in the future operational implementation of the Rapid Refresh Forecast System (RRFS) planned for 2023-2024, and the second is an updated version of the physics suite used in the operational Global Forecast System (GFS) v16. Additionally, FV3_WoFS and FV3_HRRR will be supported. A scientific description of the CCPP parameterizations and suites can be found in the `CCPP Scientific Documentation `_, and CCPP technical aspects are described in the `CCPP Technical Documentation `_. The model namelist has many settings beyond the physics options that can optimize various aspects of the model for use with each of the supported suites. The SRW App supports the use of both :term:`GRIB2` and :term:`NEMSIO` input data. The UFS Weather Model ingests initial and lateral boundary condition files produced by :term:`chgres_cube` and outputs files in netCDF format on a specific projection (e.g., Lambert Conformal) in the horizontal direction and model levels in the vertical direction. diff --git a/docs/UsersGuide/source/Introduction.rst b/docs/UsersGuide/source/Introduction.rst index 6905e5a846..03d3dff1f5 100644 --- a/docs/UsersGuide/source/Introduction.rst +++ b/docs/UsersGuide/source/Introduction.rst @@ -61,7 +61,7 @@ The prognostic atmospheric model in the UFS SRW Application is the Finite-Volume Common Community Physics Package ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -The `Common Community Physics Package `_ (:term:`CCPP`) supports interoperable atmospheric physics and land surface model options. Atmospheric physics are a set of numerical methods describing small-scale processes such as clouds, turbulence, radiation, and their interactions. The SRW App release includes an four physics suites. +The `Common Community Physics Package `_ (:term:`CCPP`) supports interoperable atmospheric physics and land surface model options. 
Atmospheric physics are a set of numerical methods describing small-scale processes such as clouds, turbulence, radiation, and their interactions. The upcoming SRW App release includes four physics suites. Data Format ^^^^^^^^^^^^^^^^^^^^^^ From a714d43581caaa774339e3c7f3066a63631e7014 Mon Sep 17 00:00:00 2001 From: gspetro Date: Tue, 29 Mar 2022 12:44:27 -0400 Subject: [PATCH 072/118] update citation and physics suite name --- docs/UsersGuide/source/Components.rst | 4 ++-- docs/UsersGuide/source/Introduction.rst | 2 +- docs/UsersGuide/source/references.bib | 8 ++++---- 3 files changed, 7 insertions(+), 7 deletions(-) diff --git a/docs/UsersGuide/source/Components.rst b/docs/UsersGuide/source/Components.rst index e92d8c922e..f2f17c149a 100644 --- a/docs/UsersGuide/source/Components.rst +++ b/docs/UsersGuide/source/Components.rst @@ -31,11 +31,11 @@ Forecast Model ============== The prognostic atmospheric model in the UFS SRW Application is the Finite-Volume Cubed-Sphere -(:term:`FV3`) dynamical core configured with a Limited Area Model (:term:`LAM`) capability :cite:`BlackEtAl2020`. The dynamical core is the computational part of a model that solves the equations of fluid motion. A User’s Guide for the UFS :term:`Weather Model` is `here `__. +(:term:`FV3`) dynamical core configured with a Limited Area Model (:term:`LAM`) capability :cite:`BlackEtAl2021`. The dynamical core is the computational part of a model that solves the equations of fluid motion. A User’s Guide for the UFS :term:`Weather Model` is `here `__. Supported model resolutions in this release include 3-, 13-, and 25-km predefined Contiguous U.S. (:term:`CONUS`) domains, each with 64 vertical levels. Preliminary tools for users to define their own domain are also available in the release with full, formal support of these tools to be provided in future releases. The Extended Schmidt Gnomonic (ESG) grid is used with the FV3-LAM, which features relatively uniform grid cells across the entirety of the domain. Additional information about the FV3 dynamical core can be found `here `__ and on the `NOAA Geophysical Fluid Dynamics Laboratory website `_. -Interoperable atmospheric physics, along with various land surface model options, are supported through the Common Community Physics Package (:term:`CCPP`), described `here `__. Atmospheric physics are a set of numerical methods describing small-scale processes such as clouds, turbulence, radiation, and their interactions. There will be four physics options supported for the v2.0 release. The first is a physics suite being tested for use in the future operational implementation of the Rapid Refresh Forecast System (RRFS) planned for 2023-2024, and the second is an updated version of the physics suite used in the operational Global Forecast System (GFS) v16. Additionally, FV3_WoFS and FV3_HRRR will be supported. A scientific description of the CCPP parameterizations and suites can be found in the `CCPP Scientific Documentation `_, and CCPP technical aspects are described in the `CCPP Technical Documentation `_. The model namelist has many settings beyond the physics options that can optimize various aspects of the model for use with each of the supported suites. +Interoperable atmospheric physics, along with various land surface model options, are supported through the Common Community Physics Package (:term:`CCPP`), described `here `__. Atmospheric physics are a set of numerical methods describing small-scale processes such as clouds, turbulence, radiation, and their interactions. 
There will be four physics options supported for the v2.0 release. The first is the FV3_RRFS_v1beta physics suite, which is being tested for use in the future operational implementation of the Rapid Refresh Forecast System (RRFS) planned for 2023-2024, and the second is an updated version of the physics suite used in the operational Global Forecast System (GFS) v16. Additionally, FV3_WoFS and FV3_HRRR will be supported. A scientific description of the CCPP parameterizations and suites can be found in the `CCPP Scientific Documentation `_, and CCPP technical aspects are described in the `CCPP Technical Documentation `_. The model namelist has many settings beyond the physics options that can optimize various aspects of the model for use with each of the supported suites. The SRW App supports the use of both :term:`GRIB2` and :term:`NEMSIO` input data. The UFS Weather Model ingests initial and lateral boundary condition files produced by :term:`chgres_cube` and outputs files in netCDF format on a specific projection (e.g., Lambert Conformal) in the horizontal direction and model levels in the vertical direction. diff --git a/docs/UsersGuide/source/Introduction.rst b/docs/UsersGuide/source/Introduction.rst index 03d3dff1f5..91473c9f7a 100644 --- a/docs/UsersGuide/source/Introduction.rst +++ b/docs/UsersGuide/source/Introduction.rst @@ -56,7 +56,7 @@ Atmospheric Model ^^^^^^^^^^^^^^^^^^^^^^ The prognostic atmospheric model in the UFS SRW Application is the Finite-Volume Cubed-Sphere -(:term:`FV3`) dynamical core configured with a Limited Area Model (LAM) capability (:cite:t:`BlackEtAl2020`). The dynamical core is the computational part of a model that solves the equations of fluid motion. A User’s Guide for the UFS :term:`Weather Model` can be found `here `__. +(:term:`FV3`) dynamical core configured with a Limited Area Model (LAM) capability (:cite:t:`BlackEtAl2021`). The dynamical core is the computational part of a model that solves the equations of fluid motion. A User’s Guide for the UFS :term:`Weather Model` can be found `here `__. Common Community Physics Package ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ diff --git a/docs/UsersGuide/source/references.bib b/docs/UsersGuide/source/references.bib index 984bff6f8a..836b246d10 100644 --- a/docs/UsersGuide/source/references.bib +++ b/docs/UsersGuide/source/references.bib @@ -1,6 +1,6 @@ -@article{BlackEtAl2020, - title={A Limited Area Modeling Capability for the Finite-Volume Cubed-Sphere (FV3) Dynamical Core}, +@article{BlackEtAl2021, + title={A limited area modeling capability for the finite-volume cubed-sphere (fv3) dynamical core}, author={T.L. Black and J.A. Abeles and B.T. Blake and D. Jovic and E. Rogers and X. Zhang and E.A. Aligo and L.C. Dawson and Y. Lin and E. Strobach and P.C. Shafran, and J.R. 
Carley}, - journal={Monthly Weather Review}, - year={Submitted}, + journal={AGU Journal of Advances in Earth Modeling Systems}, + year={2021}, } From 4757b40c307fce66d796dd3d6c91965b7ef30c18 Mon Sep 17 00:00:00 2001 From: gspetro Date: Tue, 29 Mar 2022 15:48:18 -0400 Subject: [PATCH 073/118] add compute node allocation info to QS --- docs/UsersGuide/source/Quickstart.rst | 23 ++++++++++++++++++++++- docs/UsersGuide/source/conf.py | 2 +- 2 files changed, 23 insertions(+), 2 deletions(-) diff --git a/docs/UsersGuide/source/Quickstart.rst b/docs/UsersGuide/source/Quickstart.rst index 8485ce1d2a..23c3f0aeea 100644 --- a/docs/UsersGuide/source/Quickstart.rst +++ b/docs/UsersGuide/source/Quickstart.rst @@ -61,7 +61,28 @@ On HPC systems (including NOAA's Cloud platforms), allocate a compute node on wh The third command will output a hostname. Replace ```` in the last command with the output from the third command. After "ssh-ing" to the compute node in the last command, build and run the SRW App from that node. -The appropriate commands on other Level 1 platforms will vary, and users should consult the documentation for those platforms. +The appropriate commands on other Level 1 platforms will vary, and users should consult the `documentation `__ for those platforms. In general, the allocation command will follow one of these two patterns depending on whether the system uses the Slurm or PBS resource manager respectively: + +.. code-block:: console + + salloc -N 1 -n -A -t + module load met/<10.0.0> -Then, the path to the MET and METplus directories must be set: +Then, the path to the MET and METplus directories must be added to ``config.sh``: .. code-block:: console - METPLUS_PATH="/contrib/METplus/METplus-4.1.0" - MET_INSTALL_DIR="/contrib/met/10.1.0" + METPLUS_PATH="" + MET_INSTALL_DIR="" -Users who have already staged the METplus verification data (i.e., the :term:`CCPA`, :term:`MRMS`, and :term:`NDAS` data) on their system should set the path to this data and set the corresponding ``RUN_TASK_GET_OBS_*`` parameters to "FALSE". +Users who have already staged the METplus verification data (i.e., the :term:`CCPA`, :term:`MRMS`, and :term:`NDAS` data) on their system should set the path to this data and set the corresponding ``RUN_TASK_GET_OBS_*`` parameters to "FALSE" in ``config.sh``. .. code-block:: console @@ -647,7 +660,7 @@ If users have access to NOAA HPSS but have not pre-staged the data, they can sim Users who do not have access to NOAA HPSS and do not have the data on their system will need to download :term:`CCPA`, :term:`MRMS`, and :term:`NDAS` data manually from collections of publicly available data, such as the ones listed `here `__. -Next, the verification tasks must be turned on according to the user's needs: +Next, the verification tasks must be turned on according to the user's needs. Users should add the some or all of the following tasks to ``config.sh``, depending on the verification procedure(s) they have in mind: .. code-block:: console @@ -656,7 +669,7 @@ Next, the verification tasks must be turned on according to the user's needs: RUN_TASK_VX_ENSGRID="TRUE" RUN_TASK_VX_ENSPOINT="TRUE" -These tasks are independent, so users may set some values to "TRUE" and others to "FALSE" depending on the needs of their experiment. Note that the ENSGRID and ENSPOINT tasks apply only to ensemble model verification. More details on all of the parameters in this section are available in :numref:`Chapter %s `. 
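Pulling these pieces together, a user who has already staged the observations might include settings like the following sketch in ``config.sh``; the ``*_OBS_DIR`` variable names are assumed from the workflow configuration, and the paths are placeholders:

.. code-block:: console

   CCPA_OBS_DIR="/path/to/staged/ccpa/proc"
   MRMS_OBS_DIR="/path/to/staged/mrms/proc"
   NDAS_OBS_DIR="/path/to/staged/ndas/proc"
   RUN_TASK_GET_OBS_CCPA="FALSE"
   RUN_TASK_GET_OBS_MRMS="FALSE"
   RUN_TASK_GET_OBS_NDAS="FALSE"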
+These tasks are independent, so users may set some values to "TRUE" and others to "FALSE" depending on the needs of their experiment. Note that the ENSGRID and ENSPOINT tasks apply only to ensemble model verification. Additional verification tasks appear in :numref:`Table %s ` More details on all of the parameters in this section are available in :numref:`Chapter %s `. .. _SetUpPythonEnv: @@ -770,114 +783,114 @@ In addition to the baseline tasks described in :numref:`Table %s `__) verification system has been integrated into the SRW App to facilitate forecast evaluation. METplus is a verification framework that spans a wide range of temporal (warn-on-forecast to climate) and spatial (storm to global) scales. It is supported via the `Developmental Testbed Center (DTC) `__. +The enhanced Model Evaluation Tools (`METplus `__) verification system has been integrated into the SRW App to facilitate forecast evaluation. METplus is a verification framework that spans a wide range of temporal scales (warn-on-forecast to climate) and spatial scales (storm to global). It is supported by the `Developmental Testbed Center (DTC) `__. -The core components of the framework include the statistical driver, MET, the associated database and display systems known as METviewer and METexpress, and a suite of Python wrappers to provide low-level automation and examples, also called use-cases. MET is a set of verification tools developed for use by the :term:`NWP` community. It matches up grids with either gridded analyses or point observations and applies configurable methods to compute statistics and diagnostics. Extensive documentation is available in the `METplus User’s Guide `__ and `MET User’s Guide `__, along with documentation for all other components of the framework found at the Documentation link for each component on the METplus `downloads `__ page. +METplus is preinstalled on all `Level 1 `__ systems; existing builds can be viewed `here `__. METplus can be installed on other systems individually or as part of :term:`HPC-Stack`. Users on non-Level 1 systems can follow the `MET Installation `__ and `METplus Installation `__ Guides for individual installation. Currently, METplus *installation* is not a supported feature for this release of the SRW App. However, METplus *use* is supported on systems with a functioning METplus installation. -Among other techniques, MET provides the capability to compute standard verification scores for comparing deterministic gridded model data to point-based and gridded observations, along with ensemble and probabilistic verification methods for comparing gridded model data to point-based or gridded observations. Verification tasks to accomplish these comparisons are defined in the SRW App. Currently, the SRW App supports the use of NDAS observation files in prepBUFR format (which include conventional point-based surface and upper-air data) for point-based verification, gridded Climatology-Calibrated Precipitation Analysis (CCPA) data for accumulated precipitation evaluation, and Multi-Radar/Multi-Sensor (MRMS) gridded analysis data for composite reflectivity and echo top verification. +The core components of the METplus framework include the statistical driver, MET, the associated database and display systems known as METviewer and METexpress, and a suite of Python wrappers to provide low-level automation and examples, also called use-cases. MET is a set of verification tools developed for use by the :term:`NWP` community. 
It matches up grids with either gridded analyses or point observations and applies configurable methods to compute statistics and diagnostics. Extensive documentation is available in the `METplus User’s Guide `__ and `MET User’s Guide `__. Documentation for all other components of the framework can be found at the Documentation link for each component on the METplus `downloads `__ page. -METplus is being actively developed by :term:`NCAR`/Research Applications Laboratory (RAL), NOAA/Earth Systems Research Laboratories (ESRL), NOAA/Environmental Modeling Center (EMC), and is open to community contributions. +Among other techniques, MET provides the capability to compute standard verification scores for comparing deterministic gridded model data to point-based and gridded observations. It also provides ensemble and probabilistic verification methods for comparing gridded model data to point-based or gridded observations. Verification tasks to accomplish these comparisons are defined in the SRW App in :numref:`Table %s `. Currently, the SRW App supports the use of :term:`NDAS` observation files in `prepBUFR format `__ (which include conventional point-based surface and upper-air data) for point-based verification. It also supports gridded Climatology-Calibrated Precipitation Analysis (:term:`CCPA`) data for accumulated precipitation evaluation and Multi-Radar/Multi-Sensor (:term:`MRMS`) gridded analysis data for composite reflectivity and :term:`echo top` verification. + +METplus is being actively developed by :term:`NCAR`/Research Applications Laboratory (RAL), NOAA/Earth Systems Research Laboratories (ESRL), and NOAA/Environmental Modeling Center (EMC), and it is open to community contributions. Visualization Example diff --git a/docs/UsersGuide/source/ConfigWorkflow.rst b/docs/UsersGuide/source/ConfigWorkflow.rst index 17194c6b6f..0a8e0dd7db 100644 --- a/docs/UsersGuide/source/ConfigWorkflow.rst +++ b/docs/UsersGuide/source/ConfigWorkflow.rst @@ -295,7 +295,7 @@ METplus Parameters Path to top-level directory of METplus installation. ``MET_BIN_EXEC``: (Default: "bin") - Name of the directory where the METplus executable is installed. + Location where METplus executables are installed. .. _METParamNote: diff --git a/docs/UsersGuide/source/Glossary.rst b/docs/UsersGuide/source/Glossary.rst index de5bd4ab3a..135b2c2a5e 100644 --- a/docs/UsersGuide/source/Glossary.rst +++ b/docs/UsersGuide/source/Glossary.rst @@ -46,6 +46,9 @@ Glossary dynamical core Global atmospheric model based on fluid dynamics principles, including Euler's equations of motion. + echo top + The radar-indicated top of an area of precipitation. Specifically, it contains the height of the 18 dBZ reflectivity value. + EPIC EPIC stands for the `Earth Prediction Innovation Center `__. EPIC seeks to accelerate scientific research and modeling contributions through continuous and sustained community engagement to produce the most accurate and reliable operational modeling system in the world. @@ -55,7 +58,7 @@ Glossary FV3 The Finite-Volume Cubed-Sphere dynamical core (dycore). Developed at NOAA's Geophysical Fluid Dynamics Laboratory (GFDL), it is a scalable and flexible dycore capable of both - hydrostatic and non-hydrostatic atmospheric simulations. It is the dycore used in the + hydrostatic and non-hydrostatic atmospheric simulations. It is the dycore used in the UFS Weather Model. FVCOM @@ -120,7 +123,7 @@ Glossary visualization. Stands for NCAR Command Language. More information can be found at https://www.ncl.ucar.edu. 
NDAS - :term:`NAM` Data Assimilation System (NDAS) data. This data is required for use of the METplus verification suite within the SRW App. The most recent 1-2 days worth of data are publicly available in PrepBufr format and can be accessed `here `__. + :term:`NAM` Data Assimilation System (NDAS) data. This data is required for use of the METplus verification suite within the SRW App. The most recent 1-2 days worth of data are publicly available in PrepBufr format and can be accessed `here `__. The most recent 8 days of data can be accessed `here `__. NEMS The NOAA Environmental Modeling System is a common modeling framework whose purpose is diff --git a/docs/UsersGuide/source/Introduction.rst b/docs/UsersGuide/source/Introduction.rst index 0a5a39f08b..96aa65c1d2 100644 --- a/docs/UsersGuide/source/Introduction.rst +++ b/docs/UsersGuide/source/Introduction.rst @@ -77,7 +77,7 @@ The `Unified Post Processor ` and on the `METplus website `__. +The Model Evaluation Tools (MET) package is a set of statistical verification tools developed by the `Developmental Testbed Center `__ (DTC) for use by the :term:`NWP` community to help them assess and evaluate the performance of numerical weather predictions. MET is the core component of the unified METplus verification framework. The suite also includes the associated database and display systems called METviewer and METexpress. METplus spans a wide range of temporal and spatial scales. It is intended to be extensible through additional capabilities developed by the community. More details about METplus can be found in :numref:`Chapter %s ` and on the `METplus website `__. Visualization Example ------------------------- From 676a3512a91cf736e0d007aaf1b374378987852a Mon Sep 17 00:00:00 2001 From: gspetro Date: Thu, 28 Apr 2022 17:48:37 -0400 Subject: [PATCH 113/118] add workflow svg diagram --- docs/UsersGuide/source/BuildRunSRW.rst | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/UsersGuide/source/BuildRunSRW.rst b/docs/UsersGuide/source/BuildRunSRW.rst index 046f7c87a0..566a50b015 100644 --- a/docs/UsersGuide/source/BuildRunSRW.rst +++ b/docs/UsersGuide/source/BuildRunSRW.rst @@ -734,7 +734,7 @@ Description of Workflow Tasks .. _WorkflowTasksFig: -.. figure:: _static/FV3LAM_wflow_flowchart.png +.. figure:: _static/FV3LAM_wflow_flowchart_v2.svg *Flowchart of the workflow tasks* From a6c81430cd7b11f2143e75b07de2bb02fa134ce7 Mon Sep 17 00:00:00 2001 From: gspetro Date: Fri, 29 Apr 2022 12:01:36 -0400 Subject: [PATCH 114/118] condense VX task table using ## --- docs/UsersGuide/source/BuildRunSRW.rst | 63 ++++++++------------------ 1 file changed, 18 insertions(+), 45 deletions(-) diff --git a/docs/UsersGuide/source/BuildRunSRW.rst b/docs/UsersGuide/source/BuildRunSRW.rst index 566a50b015..b1f5953ca4 100644 --- a/docs/UsersGuide/source/BuildRunSRW.rst +++ b/docs/UsersGuide/source/BuildRunSRW.rst @@ -800,14 +800,9 @@ In addition to the baseline tasks described in :numref:`Table %s Date: Fri, 29 Apr 2022 13:35:51 -0400 Subject: [PATCH 115/118] update README --- README.md | 7 +++---- 1 file changed, 3 insertions(+), 4 deletions(-) diff --git a/README.md b/README.md index 7eb717cd9f..744909ed42 100644 --- a/README.md +++ b/README.md @@ -2,12 +2,11 @@ The Unified Forecast System (UFS) is a community-based, coupled, comprehensive Earth modeling system. 
It is designed to be the source system for NOAA’s operational numerical weather prediction applications while enabling research, development, and contribution opportunities for the broader weather enterprise. For more information about the UFS, visit the UFS Portal at https://ufscommunity.org/. -The UFS can be configured for multiple applications (see a complete list at https://ufscommunity.org/#/science/aboutapps). The configuration described here is the UFS Short-Range Weather (SRW) Application, which targets predictions of atmospheric behavior on a limited spatial domain and on time scales from less than an hour out to several days. The development branch of the application is continually evolving as the system undergoes open development. The SRW App v1.0.0 represents a snapshot of this continuously evolving system. The SRW App includes a prognostic atmospheric model, pre- and post-processing, and a community workflow for running the system end-to-end. +The UFS includes multiple applications (see a complete list at https://ufscommunity.org/science/aboutapps/) that support different forecast durations and spatial domains. This documentation describes the UFS Short-Range Weather (SRW) Application, which targets predictions of atmospheric behavior on a limited spatial domain and on time scales from minutes to several days. The development branch of the application is continually evolving as the system undergoes open development. The SRW App release branches represent a snapshot of this continuously evolving system. The SRW Application includes a prognostic atmospheric model, pre- and post-processing, and a community workflow for running the system end-to-end. These components are documented within the User's Guide and supported through a community forum (https://forums.ufscommunity.org/). -The UFS SRW App User's Guide associated with the development branch is at: https://ufs-srweather-app.readthedocs.io/en/latest/, while that specific to the SRW App v1.0.0 release can be found at: https://ufs-srweather-app.readthedocs.io/en/ufs-v1.0.0/. The repository is at: https://github.com/ufs-community/ufs-srweather-app. +The UFS SRW App User's Guide associated with the development branch can be found at: https://ufs-srweather-app.readthedocs.io/en/develop/, while the guide specific to the SRW App v1.0.1 release can be found at: https://ufs-srweather-app.readthedocs.io/en/ufs-v1.0.1/. The GitHub repository link is: https://github.com/ufs-community/ufs-srweather-app. For instructions on how to clone the repository, build the code, and run the workflow, see: https://github.com/ufs-community/ufs-srweather-app/wiki/Getting-Started -UFS Development Team. (2021, March 4). Unified Forecast System (UFS) Short-Range Weather (SRW) Application (Version v1.0.0). Zenodo. https://doi.org/10.5281/zenodo.4534994 - +UFS Development Team. (2021, March 4). Unified Forecast System (UFS) Short-Range Weather (SRW) Application (Version v1.0.0). Zenodo. 
https://doi.org/10.5281/zenodo.4534994 \ No newline at end of file From c0c79497148472e21c7bef5b7d6eb33fd9504b4f Mon Sep 17 00:00:00 2001 From: gspetro Date: Mon, 2 May 2022 15:08:28 -0400 Subject: [PATCH 116/118] add png and revert hpc-stack commits until PR#240 (mac docs) is approved --- docs/UsersGuide/source/BuildRunSRW.rst | 2 +- hpc-stack-mod | 2 +- 2 files changed, 2 insertions(+), 2 deletions(-) diff --git a/docs/UsersGuide/source/BuildRunSRW.rst b/docs/UsersGuide/source/BuildRunSRW.rst index b1f5953ca4..0326ee79fa 100644 --- a/docs/UsersGuide/source/BuildRunSRW.rst +++ b/docs/UsersGuide/source/BuildRunSRW.rst @@ -734,7 +734,7 @@ Description of Workflow Tasks .. _WorkflowTasksFig: -.. figure:: _static/FV3LAM_wflow_flowchart_v2.svg +.. figure:: _static/FV3LAM_wflow_flowchart_v2.png *Flowchart of the workflow tasks* diff --git a/hpc-stack-mod b/hpc-stack-mod index 9ab5b476e9..0199b163a2 160000 --- a/hpc-stack-mod +++ b/hpc-stack-mod @@ -1 +1 @@ -Subproject commit 9ab5b476e97f83ace7def75254a361a9e1780484 +Subproject commit 0199b163a28d410524ebd9586699ca20620aa509 From 45f20354d118c757fa84e11e280e8f86bd982181 Mon Sep 17 00:00:00 2001 From: gspetro Date: Wed, 4 May 2022 12:55:51 -0400 Subject: [PATCH 117/118] jwolff edits --- docs/UsersGuide/source/BuildRunSRW.rst | 17 ++++++++++------- docs/UsersGuide/source/Components.rst | 2 +- docs/UsersGuide/source/ConfigNewPlatform.rst | 2 +- docs/UsersGuide/source/Glossary.rst | 6 +++--- docs/UsersGuide/source/Introduction.rst | 2 +- 5 files changed, 16 insertions(+), 13 deletions(-) diff --git a/docs/UsersGuide/source/BuildRunSRW.rst b/docs/UsersGuide/source/BuildRunSRW.rst index a77e494f39..3946316ee6 100644 --- a/docs/UsersGuide/source/BuildRunSRW.rst +++ b/docs/UsersGuide/source/BuildRunSRW.rst @@ -638,10 +638,10 @@ For WCOSS_CRAY: Configure METplus Verification Suite (Optional) -------------------------------------------------- -Users who want to use the METplus verification suite to test their forecasts need to load the appropriate modules and add additional information to their ``config.sh`` file. Other users may skip to the :ref:`next section `. +Users who want to use the METplus verification suite to evaluate their forecasts need to load the appropriate modules and add additional information to their ``config.sh`` file. Other users may skip to the :ref:`next section `. .. note:: - METplus is preinstalled on `Level 1 `__ systems. METplus *installation* is currently not supported for this release of the SRW App, but METplus *use* is supported on systems with a functioning METplus installation. For more information about METplus, see :numref:`Section %s `. + METplus *installation* is not included as part of the build process for this release of the SRW App. However, METplus is preinstalled on `Level 1 `__ systems. For the v2 release, METplus *use* is supported on systems with a functioning METplus installation, although installation itself is not supported. For more information about METplus, see :numref:`Section %s `. 
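Before adding METplus settings to ``config.sh``, it can be useful to confirm that a MET/METplus build is actually visible on the target system. A minimal check, assuming an Lmod-style module environment (module names and installed versions vary by platform and are not specified here), might look like:

.. code-block:: console

    # List any MET/METplus modules known to the module system (names are platform-dependent).
    module avail met
    module spider metplus

If no matching modules are found, users should consult their system administrators or the MET and METplus installation guides referenced elsewhere in this guide.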
Once installed, METplus users must load the appropriate modules (changing the module location and MET version to correspond to their system's installation): @@ -657,7 +657,7 @@ Then, the path to the MET and METplus directories must be added to ``config.sh`` METPLUS_PATH="" MET_INSTALL_DIR="" -Users who have already staged the METplus verification data (i.e., the :term:`CCPA`, :term:`MRMS`, and :term:`NDAS` data) on their system should set the path to this data and set the corresponding ``RUN_TASK_GET_OBS_*`` parameters to "FALSE" in ``config.sh``. +Users who have already staged the observation data needed for METplus (i.e., the :term:`CCPA`, :term:`MRMS`, and :term:`NDAS` data) on their system should set the path to this data and set the corresponding ``RUN_TASK_GET_OBS_*`` parameters to "FALSE" in ``config.sh``. .. code-block:: console @@ -672,7 +672,7 @@ If users have access to NOAA HPSS but have not pre-staged the data, they can sim Users who do not have access to NOAA HPSS and do not have the data on their system will need to download :term:`CCPA`, :term:`MRMS`, and :term:`NDAS` data manually from collections of publicly available data, such as the ones listed `here `__. -Next, the verification tasks must be turned on according to the user's needs. Users should add the some or all of the following tasks to ``config.sh``, depending on the verification procedure(s) they have in mind: +Next, the verification tasks must be turned on according to the user's needs. Users should add some or all of the following tasks to ``config.sh``, depending on the verification procedure(s) they have in mind: .. code-block:: console @@ -795,14 +795,17 @@ In addition to the baseline tasks described in :numref:`Table %s `__) verification system has been integrated into the SRW App to facilitate forecast evaluation. METplus is a verification framework that spans a wide range of temporal scales (warn-on-forecast to climate) and spatial scales (storm to global). It is supported by the `Developmental Testbed Center (DTC) `__. -METplus is preinstalled on all `Level 1 `__ systems; existing builds can be viewed `here `__. METplus can be installed on other systems individually or as part of :term:`HPC-Stack`. Users on non-Level 1 systems can follow the `MET Installation `__ and `METplus Installation `__ Guides for individual installation. Currently, METplus *installation* is not a supported feature for this release of the SRW App. However, METplus *use* is supported on systems with a functioning METplus installation. +METplus *installation* is not included as part of the build process for this release of the SRW App. However, METplus is preinstalled on all `Level 1 `__ systems; existing builds can be viewed `here `__. METplus can be installed on other systems individually or as part of :term:`HPC-Stack`. Users on non-Level 1 systems can follow the `MET Installation `__ and `METplus Installation `__ Guides for individual installation. Currently, METplus *installation* is not a supported feature for this release of the SRW App. However, METplus *use* is supported on systems with a functioning METplus installation. The core components of the METplus framework include the statistical driver, MET, the associated database and display systems known as METviewer and METexpress, and a suite of Python wrappers to provide low-level automation and examples, also called use-cases. MET is a set of verification tools developed for use by the :term:`NWP` community. 
It matches up grids with either gridded analyses or point observations and applies configurable methods to compute statistics and diagnostics. Extensive documentation is available in the `METplus User’s Guide `__ and `MET User’s Guide `__. Documentation for all other components of the framework can be found at the Documentation link for each component on the METplus `downloads `__ page. diff --git a/docs/UsersGuide/source/ConfigNewPlatform.rst b/docs/UsersGuide/source/ConfigNewPlatform.rst index 399a7c3a99..29b2912978 100644 --- a/docs/UsersGuide/source/ConfigNewPlatform.rst +++ b/docs/UsersGuide/source/ConfigNewPlatform.rst @@ -364,7 +364,7 @@ Those requirements highlighted in **bold** are included in the NCEPLIBS-external * **netCDF (C and Fortran libraries)** * **HDF5** - * **ESMF** 8.0.0 + * **ESMF** 8.2.0 * **Jasper** * **libJPG** * **libPNG** diff --git a/docs/UsersGuide/source/Glossary.rst b/docs/UsersGuide/source/Glossary.rst index 135b2c2a5e..ba07c66f26 100644 --- a/docs/UsersGuide/source/Glossary.rst +++ b/docs/UsersGuide/source/Glossary.rst @@ -7,7 +7,7 @@ Glossary .. glossary:: CCPA - Climatology-Calibrated Precipitation Analysis (CCPA) data. This data is required for use of the METplus verification suite within the SRW App. The most recent 8 days worth of data are publicly available and can be accessed `here `__. + Climatology-Calibrated Precipitation Analysis (CCPA) data. This data is required for METplus precipitation verification tasks within the SRW App. The most recent 8 days worth of data are publicly available and can be accessed `here `__. CCPP The `Common Community Physics Package `_ is a forecast-model agnostic, vetted collection of codes containing atmospheric physical parameterizations and suites of parameterizations for use in Numerical Weather Prediction (NWP) along with a framework that connects the physics to the host forecast model. @@ -95,7 +95,7 @@ Glossary MPI stands for Message Passing Interface. An MPI is a standardized communication system used in parallel programming. It establishes portable and efficient syntax for the exchange of messages and data between multiple processors that are used by a single computer program. An MPI is required for high-performance computing (HPC). MRMS - Multi-Radar/Multi-Sensor (MRMS) System Analysis data. This data is required for use of the METplus verification suite within the SRW App. A two-day archive of precipitation, radar, and aviation and severe weather fields is publicly available and can be accessed `here `__. + Multi-Radar/Multi-Sensor (MRMS) System Analysis data. This data is required for METplus composite reflectivity or :term:`echo top` verification tasks within the SRW App. A two-day archive of precipitation, radar, and aviation and severe weather fields is publicly available and can be accessed `here `__. NAM `North American Mesoscale Forecast System `_. NAM generates multiple grids (or domains) of weather forecasts over the North American continent at various horizontal resolutions. Each grid contains data for dozens of weather parameters, including temperature, precipitation, lightning, and turbulent kinetic energy. NAM uses additional numerical weather models to generate high-resolution forecasts over fixed regions, and occasionally to follow significant weather events like hurricanes. @@ -123,7 +123,7 @@ Glossary visualization. Stands for NCAR Command Language. More information can be found at https://www.ncl.ucar.edu. NDAS - :term:`NAM` Data Assimilation System (NDAS) data. 
This data is required for use of the METplus verification suite within the SRW App. The most recent 1-2 days worth of data are publicly available in PrepBufr format and can be accessed `here `__. The most recent 8 days of data can be accessed `here `__. + :term:`NAM` Data Assimilation System (NDAS) data. This data is required for METplus surface and upper-air verification tasks within the SRW App. The most recent 1-2 days worth of data are publicly available in PrepBufr format and can be accessed `here `__. The most recent 8 days of data can be accessed `here `__. NEMS The NOAA Environmental Modeling System is a common modeling framework whose purpose is diff --git a/docs/UsersGuide/source/Introduction.rst b/docs/UsersGuide/source/Introduction.rst index 96aa65c1d2..b151fcc481 100644 --- a/docs/UsersGuide/source/Introduction.rst +++ b/docs/UsersGuide/source/Introduction.rst @@ -77,7 +77,7 @@ The `Unified Post Processor `__ (DTC) for use by the :term:`NWP` community to help them assess and evaluate the performance of numerical weather predictions. MET is the core component of the unified METplus verification framework. The suite also includes the associated database and display systems called METviewer and METexpress. METplus spans a wide range of temporal and spatial scales. It is intended to be extensible through additional capabilities developed by the community. More details about METplus can be found in :numref:`Chapter %s ` and on the `METplus website `__. +The Model Evaluation Tools (MET) package is a set of statistical verification tools developed by the `Developmental Testbed Center `__ (DTC) for use by the :term:`NWP` community to help them assess and evaluate the performance of numerical weather predictions. MET is the core component of the enhanced METplus verification framework. The suite also includes the associated database and display systems called METviewer and METexpress. METplus spans a wide range of temporal and spatial scales. It is intended to be extensible through additional capabilities developed by the community. More details about METplus can be found in :numref:`Chapter %s ` and on the `METplus website `__. Visualization Example ------------------------- From a02ab7df9e607c7803d1f6d29a4d53571deeec3c Mon Sep 17 00:00:00 2001 From: gspetro Date: Thu, 5 May 2022 11:29:18 -0400 Subject: [PATCH 118/118] add info on run_vx.local --- docs/UsersGuide/source/BuildRunSRW.rst | 17 +++++++++-------- 1 file changed, 9 insertions(+), 8 deletions(-) diff --git a/docs/UsersGuide/source/BuildRunSRW.rst b/docs/UsersGuide/source/BuildRunSRW.rst index 3946316ee6..063ab79e98 100644 --- a/docs/UsersGuide/source/BuildRunSRW.rst +++ b/docs/UsersGuide/source/BuildRunSRW.rst @@ -638,19 +638,20 @@ For WCOSS_CRAY: Configure METplus Verification Suite (Optional) -------------------------------------------------- -Users who want to use the METplus verification suite to evaluate their forecasts need to load the appropriate modules and add additional information to their ``config.sh`` file. Other users may skip to the :ref:`next section `. +Users who want to use the METplus verification suite to evaluate their forecasts need to add additional information to their ``config.sh`` file. Other users may skip to the :ref:`next section `. -.. note:: +.. attention:: METplus *installation* is not included as part of the build process for this release of the SRW App. However, METplus is preinstalled on `Level 1 `__ systems. 
For the v2 release, METplus *use* is supported on systems with a functioning METplus installation, although installation itself is not supported. For more information about METplus, see :numref:`Section %s `. -Once installed, METplus users must load the appropriate modules (changing the module location and MET version to correspond to their system's installation): +.. note:: + If METplus users update their METplus installation, they must update the module load statements in ``ufs-srweather-app/regional_workflow/modulefiles/tasks//run_vx.local`` file to correspond to their system's updated installation: -.. code-block:: console - - module use -a - module load met/<10.0.0> + .. code-block:: console + + module use -a + module load met/ -Then, the path to the MET and METplus directories must be added to ``config.sh``: +To use METplus verification, the path to the MET and METplus directories must be added to ``config.sh``: .. code-block:: console
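    # Hypothetical example values for illustration only; the actual METplus and MET
    # install locations are system-specific and must be replaced with the correct
    # paths for the user's platform.
    METPLUS_PATH="/path/to/METplus"
    MET_INSTALL_DIR="/path/to/met"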