
Using GDASApp in Standalone Mode: Atmospheric applications


The GDASApp is designed for two use cases:

  • Within NOAA-EMC/global-workflow to support GFS/GDAS cycled forecasts
  • As a standalone app to run forward operators or variational DA for evaluation/development purposes

This page describes how to build and run the GDASApp on supported platforms. The instructions work out of the box for the time periods for which input datasets have been staged (see Staged Input Datasets below).

Building GDASApp on Supported Platforms

Currently supported platforms

  • NOAA RDHPCS Orion
  • NOAA RDHPCS Hera

Clone GDASApp

git clone https://github.com/NOAA-EMC/GDASApp.git

Build GDASApp

There are two methods to build GDASApp; the preferred way is described first. With no modules loaded (besides git/2.28.0 on Orion):

cd GDASApp
./build.sh -t $platform

where $platform is one of the supported options: orion or hera.

This will load all necessary modules, create a build directory, run ecbuild to grab JEDI repositories, and build all libraries and executables needed for the GDASApp.

If you get a message like $PS1 unbound variable when running the build script, try this alternative approach:

module use modulefiles
module load GDAS/$platform
./build.sh

This runs the build script in generic mode, assuming that all necessary libraries are already available through the platform-specific lua modules loaded above.

The build process will take some time (roughly 30-60 minutes), so grab a coffee, check your email, and come back later to confirm that everything built properly.
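
One quick, hedged way to confirm the build finished is to check that the JEDI executables referenced later on this page exist under the build directory (the exact set of executables built may vary):

ls build/bin/fv3jedi_hofx_nomodel.x build/bin/fv3jedi_hofx.x build/bin/fv3jedi_var.x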

Running GDASApp in Standalone Mode

Standalone Run Driver Script

In standalone mode, the user executes a single driver script that creates a working directory, generates YAML from templates, stages the necessary files, and creates and submits a batch job to run the specified JEDI executable:

ush/run_jedi_exe.py

To run this script, first ensure that you have the proper python virtual environment loaded. The easiest way to do this is just to load all of the necessary modules for your supported platform:

module use /path/to/clone/of/GDASApp/modulefiles
module load GDAS/$platform

Once your environment is set up, run the script with an argument pointing to the input YAML file for your case (more information on that YAML file below):

./run_jedi_exe.py --config /path/to/my/gdasapp_yamlfile.yaml

This should produce lots of output to the screen and, if all goes well, submit a job to the batch scheduler.
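
Both Orion and Hera use Slurm, so a hedged way to follow the job after submission is to check the queue and then tail the JEDI log file that appears in your working directory (the log file name assumes the default shown later on this page):

squeue -u $USER                                  # check whether the submitted job is pending or running
tail -f /path/to/working/directory/GDASApp.o*    # follow the JEDI executable log once the job starts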

YAML File Examples

There are several example YAML files in the GDASApp repository: ush/examples/run_jedi_exe

Let's take a look at one; details on each entry are given below:

working directory: /work2/noaa/stmp/cmartin/gdas_single_test_hofx3d
GDASApp home: /work2/noaa/da/cmartin/GDASApp/work/GDASApp
GDASApp mode: hofx
executable options:
  obs_yaml_dir: /work2/noaa/da/cmartin/GDASApp/work/GDASApp/parm/atm/obs/config
  yaml_template: /work2/noaa/da/cmartin/GDASApp/work/GDASApp/parm/atm/hofx/hofx_nomodel.yaml
  executable: /work2/noaa/da/cmartin/GDASApp/work/GDASApp/build/bin/fv3jedi_hofx_nomodel.x
  obs_list: /work2/noaa/da/cmartin/GDASApp/work/GDASApp/parm/atm/obs/lists/gdas_prototype.yaml
  gdas_fix_root: /work2/noaa/da/cmartin/GDASApp/fix
  atm: true
  layout_x: 4
  layout_y: 4
  atm_window_length: PT6H
  valid_time: 2021-08-01T00:00:00Z
  dump: gdas
  case: C768
  levs: 128
  interp_method: barycentric
job options:
  machine: orion
  account: da-cpu
  queue: debug
  partition: debug
  walltime: '30:00'
  ntasks: 96
  modulepath: /work2/noaa/da/cmartin/GDASApp/work/GDASApp/modulefiles

The exact list of options may depend on the mode and use case; the example shown here is for 3D H(x). Now let's break down each of these lines individually (a short sketch of adapting one of the provided example files follows this list):

  • working directory: - the path where all files will be staged and where output will reside; it must be writable by you, the user
  • GDASApp home: - the path to where you cloned the GDASApp repository
  • GDASApp mode: - this tells the driver script which mode (hofx or variational) you wish to run in
  • executable options: - this section contains all of the options for running GDASApp in standalone mode
    • obs_yaml_dir: - the path to the directory of YAML templates for each observation type (containing forward operator, QC, etc. information); the default should be /path/to/GDASApp/parm/atm/obs/config
    • yaml_template: - this is the path to the high level YAML template to use for your standalone run (it should be consistent with GDASApp mode and executable options)
    • executable: - the path to the JEDI executable you wish to run (currently supported are fv3jedi_hofx_nomodel.x, fv3jedi_hofx.x, and fv3jedi_var.x)
    • obs_list: - the path to a YAML file containing a list of templated paths to the observation YAML templates (that contain forward operator, QC, etc. information)
    • gdas_fix_root: - the path to the staged fix files; this should not be changed except by advanced users and should always point to the single staged location on each supported platform
    • atm: - true or false; currently only true is supported, but this option exists for future support of soca and aero
    • layout_x and layout_y - FV3-specific variables that define how many subdomains each model tile is split into in each direction. These two numbers multiplied together and then multiplied by 6 (the number of cube-sphere tiles) must equal the ntasks option below; for example, layout_x: 4 and layout_y: 4 gives 4 × 4 × 6 = 96, matching ntasks: 96 in the example above.
    • atm_window_length: - length of the assimilation window (e.g. PT6H for six hours)
    • valid_time: - ISO 8601 string of the analysis/valid time
    • dump: - which dump/type of cycle to run (gdas or gfs)
    • case: - resolution of model backgrounds (C768 for example)
    • levs: - number of model vertical levels (128 normally)
    • interp_method: - select the horizontal interpolation method for the forward operators
  • job options: - configuration for the batch job submission
    • machine: - name of the machine; only orion and hera are currently supported
    • account: - name of CPU account to charge job to (e.g. da-cpu)
    • queue: - queue to submit to
    • partition: - partition to submit to
    • walltime: - amount of wall clock time to request for the job
    • ntasks: - number of tasks (CPUs) for the job; must be consistent with layout_x and layout_y for FV3-based jobs
    • tasks-per-node: - (optional) number of tasks per node (defaults: hera=18, orion=24, other machines=12)
    • modulepath: - path to modulefiles, should normally be /path/to/clone/GDASApp/modulefiles
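
If you would rather start from one of the provided examples than write a YAML file from scratch, a minimal sketch of that workflow is shown below (the example filename is an assumption; check ush/examples/run_jedi_exe for the actual files available):

cd /path/to/clone/of/GDASApp/ush
cp examples/run_jedi_exe/hofx_nomodel_orion.yaml my_case.yaml    # illustrative filename
# edit 'working directory', 'GDASApp home', and the 'job options' section for your paths/account
./run_jedi_exe.py --config my_case.yaml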

Working Directory Details

An ls -l in your working directory will show something like this:

lrwxrwxrwx 1 cmartin stmp      44 Jul  7 12:29 berror -> /work2/noaa/da/cmartin/GDASApp/fix/bump/C768
drwxr-s--- 2 cmartin stmp    4096 Jul  7 10:50 bkg
lrwxrwxrwx 1 cmartin stmp      45 Jul  7 12:29 crtm -> /work2/noaa/da/cmartin/GDASApp/fix/crtm/2.3.0
drwxr-s--- 4 cmartin stmp    4096 Jul  7 10:50 Data
drwxr-s--- 2 cmartin stmp    4096 Jul  7 12:31 diags
lrwxrwxrwx 1 cmartin stmp      76 Jul  7 12:29 fv3jedi_hofx_nomodel.x -> /work2/noaa/da/cmartin/GDASApp/work/GDASApp/build/bin/fv3jedi_hofx_nomodel.x
-rw-r----- 1 cmartin stmp 1024050 Jul  7 12:31 GDASApp.o6130151
-rw-r----- 1 cmartin stmp   16879 Jul  7 12:35 gdas_hofx.yaml
-rw-r----- 1 cmartin stmp   24252 Jul  7 12:41 logfile.000000.out
drwxr-s--- 2 cmartin stmp    4096 Jul  7 10:50 obs
-rw-r----- 1 cmartin stmp     508 Jul  7 12:39 submit_job.sh

  • berror is a symbolic link to the staged background error files needed for variational DA
  • bkg contains model backgrounds copied from R2D2
  • crtm is a symbolic link to the staged CRTM coefficient files needed by CRTM
  • Data contains FV3-JEDI miscellaneous input fix files
  • diags will be where the output in observation space (e.g. H(x), OmF) netCDF IODA files are written (see the inspection sketch after this list)
  • anl (not shown) will be the output from variational DA (increments or analysis)
  • bc (not shown) will be the output VarBC files (if applicable)
  • fv3jedi_hofx_nomodel.x is a symbolic link to the executable specified in your YAML file
  • GDASApp.oNNNNNNN is the logfile of the output from the JEDI executable
  • gdas_hofx.yaml (or gdas_variational.yaml) is the YAML file that is provided as input to the JEDI executable
  • logfile.000000.out is a logfile from FMS produced by FV3
  • obs contains observation input files (IODA format) and bias correction input files copied from R2D2
  • submit_job.sh is the batch job submission script produced by run_jedi_exe.py
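
Once the job completes, a hedged way to take a quick look at the observation-space output in diags is with standard netCDF tools (the actual IODA diag filenames depend on the observation types in your obs_list):

cd /path/to/working/directory/diags
ls                                          # list the per-observation-type IODA output files
ncdump -h <one_of_the_diag_files> | less    # inspect dimensions and variables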

Staged Input Datasets

As of July 8, 2022, the following are available for testing:

  • C768 model backgrounds, bias correction and obs for H(x) evaluation for 2021 August 1 00z through August 7 18z
  • C96 model backgrounds, bias correction and obs for prototype variational analyses for December 21 (00-18z)