[develop] Add workflow for radar reflectivity processing. #807

Merged 52 commits on May 30, 2023 (changes shown from 50 commits)

Commits
7cbc445
First draft, doesn't work
mkavulich Mar 23, 2023
340f9ab
Emergency commit, still not working
mkavulich Mar 23, 2023
5aa39f6
Mods to retrieve_data script for retrieving observations
mkavulich Mar 24, 2023
61af001
First draft of get obs exscript. Modified data_locations to look for
mkavulich Mar 25, 2023
01fb25b
First version that makes a correct XML! Still a few kinks to work out:
mkavulich Mar 25, 2023
5429625
It works! Just needed to specify HPSS partition
mkavulich Mar 25, 2023
d0472d8
Start new workflow yaml "warmstart.yaml" that will be used for GSI/DA…
mkavulich Apr 10, 2023
7471a0a
Prevent user from entering empty --tests argument to run_WE2E_tests.py
mkavulich Apr 10, 2023
9d1b37b
Fix incorrect path after rebase, add alternate path for older RAP data
mkavulich Apr 11, 2023
d59885c
Starting to link obs to tasks
mkavulich Apr 20, 2023
158b7e1
Some better design choices after putting some more thought into it
mkavulich Apr 20, 2023
24aac12
Data now successfully staged for process_bufr task!
mkavulich Apr 20, 2023
a99fd5b
Add NLDN lightning data to get_da_obs
mkavulich Apr 20, 2023
6679d59
Adding NLDN lightning to get_da_obs. Not fully tested yet, need to mo…
mkavulich Apr 22, 2023
64dbc63
Use "else" statement in final try block for generate_FV3LAM_wflow.py …
mkavulich Apr 22, 2023
1f1c497
Hot diggity, got the process_bufr and process_lightning tasks working…
mkavulich Apr 22, 2023
252eade
Untested first pass at defining the early workflow
christinaholtNOAA Apr 11, 2023
a612a0f
Partial incorporation of Christina's changes from #725
mkavulich Apr 24, 2023
870722e
CYCLE_TYPE is a run-time arg.
christinaholtNOAA Apr 28, 2023
769c1af
Make paths and data stores configurable.
christinaholtNOAA Apr 28, 2023
de99914
Remove duplicative information from machine files.
christinaholtNOAA Apr 28, 2023
4f56a54
Renaming process_bufr to process_bufrobs.
christinaholtNOAA Apr 28, 2023
412fa6d
Working prepbufr getting and processing workflow.
christinaholtNOAA Apr 28, 2023
5530100
This version of rrfs_beta might work.
christinaholtNOAA Apr 28, 2023
c9fe408
Turn it off since we can't use it.
christinaholtNOAA Apr 29, 2023
ec0eae1
Merge remote-tracking branch 'upstream/develop' into feature/add_get_…
mkavulich May 1, 2023
f2aca37
Updates to fix failing unit tests
mkavulich May 2, 2023
7e3171e
WIP. Waiting on HPSS maintenance.
christinaholtNOAA May 2, 2023
7ae52b5
Retrieve data task works here.
christinaholtNOAA May 5, 2023
c88704a
Merge remote-tracking branch 'upstream/develop' into feature/add_get_…
mkavulich May 6, 2023
5ab55b8
Update get_da_obs modulefiles for latest develop changes
mkavulich May 8, 2023
5bef9d4
Forgot to add CI skip for new HPSS tests
mkavulich May 8, 2023
445c003
Remove UFS case study skip, will go with proposal from #776
mkavulich May 8, 2023
54df375
Merge remote-tracking branch 'upstream/develop' into feature/add_get_…
mkavulich May 8, 2023
0c4e3c6
Add command-line instructions for unit tests
mkavulich May 10, 2023
cbccf74
Change "two days ago" tests to "three days ago"...occasionally some d…
mkavulich May 10, 2023
751a11f
Remove references to current working directory; this causes occasiona…
mkavulich May 10, 2023
b158542
Allow this sort of similarity in tests.
christinaholtNOAA May 10, 2023
69c6abd
Merge remote-tracking branch 'upstream/develop' into feature/add_get_…
mkavulich May 15, 2023
f729373
Merge branch 'feature/add_get_obs_task' into add_radar_obs
christinaholtNOAA May 16, 2023
de5560c
Handle getting data in real time.
christinaholtNOAA May 16, 2023
b119d3f
Add back accidentally deleted file.
christinaholtNOAA May 16, 2023
50e4fb3
Process radar refl runs.
christinaholtNOAA May 16, 2023
52591c3
Testing new function for getting disk files.
christinaholtNOAA May 16, 2023
c7e9b04
Merge remote-tracking branch 'origin/develop' into add_radar_obs
christinaholtNOAA May 19, 2023
d668a0b
Fix a bad merge on this renamed file.
christinaholtNOAA May 19, 2023
aaa987f
Update ush/machine/jet.yaml
christinaholtNOAA May 22, 2023
e773db0
Update ush/machine/jet.yaml
christinaholtNOAA May 22, 2023
1689e43
Attempt to fix failure with nco input data.
christinaholtNOAA May 22, 2023
4d7ece9
Merge remote-tracking branch 'crh/add_radar_obs' into add_radar_obs
christinaholtNOAA May 22, 2023
c36dccc
Addressing Eddie's comments.
christinaholtNOAA May 30, 2023
48856d6
Merge remote-tracking branch 'origin/develop' into add_radar_obs
christinaholtNOAA May 30, 2023
2 changes: 2 additions & 0 deletions jobs/JREGIONAL_GET_DA_OBS
Original file line number Diff line number Diff line change
@@ -54,7 +54,9 @@ This script retrieves observation data for RRFS data assimilation tasks.
#-----------------------------------------------------------------------
#
export DATA="${COMIN}/obs"
export RADAR_DATA="${COMIN}/radar"
mkdir_vrfy -p "${DATA}"
mkdir_vrfy -p "${RADAR_DATA}"

# Set needed date/time variables
export START_DATE=$(echo "${PDY} ${cyc}")
27 changes: 25 additions & 2 deletions parm/data_locations.yml
@@ -88,7 +88,7 @@ FV3GFS:
- gpfs_dell1_nco_ops_com_gfs_prod_gfs.{yyyymmdd}_{hh}.gfs_pgrb2.tar
- com_gfs_prod_gfs.{yyyymmdd}_{hh}.gfs_pgrb2.tar
- com_gfs_v16.2_gfs.{yyyymmdd}_{hh}.gfs_pgrb2.tar
- com_gfs_v16.3_gfs.{yyyymmdd}_{hh}.gfs_pgrb2.tar
- com_gfs_v16.3_gfs.{yyyymmdd}_{hh}.gfs_pgrb2.tar
fcst:
- gpfs_dell1_nco_ops_com_gfs_prod_gfs.{yyyymmdd}_{hh}.gfs_pgrb2.tar
- com_gfs_prod_gfs.{yyyymmdd}_{hh}.gfs_pgrb2.tar
@@ -111,7 +111,7 @@ FV3GFS:
- ['gpfs_dell1_nco_ops_com_gfs_prod_gfs.{yyyymmdd}_{hh}.gfs_nca.tar', 'gpfs_dell1_nco_ops_com_gfs_prod_gfs.{yyyymmdd}_{hh}.gfs_ncb.tar']
- ['com_gfs_prod_gfs.{yyyymmdd}_{hh}.gfs_nca.tar', 'com_gfs_prod_gfs.{yyyymmdd}_{hh}.gfs_ncb.tar']
- ['com_gfs_v16.2_gfs.{yyyymmdd}_{hh}.gfs_nca.tar', 'com_gfs_v16.2_gfs.{yyyymmdd}_{hh}.gfs_ncb.tar']
- ['com_gfs_v16.3_gfs.{yyyymmdd}_{hh}.gfs_nca.tar', 'com_gfs_v16.3_gfs.{yyyymmdd}_{hh}.gfs_ncb.tar']
- ['com_gfs_v16.3_gfs.{yyyymmdd}_{hh}.gfs_nca.tar', 'com_gfs_v16.3_gfs.{yyyymmdd}_{hh}.gfs_ncb.tar']
file_names:
<<: *gfs_file_names
aws:
@@ -316,6 +316,28 @@ GFS_obs:
obs:
- "{yy}{jjj}{hh}00.gfs.t{hh}z.syndata.tcvitals.tm00"

NSSL_mrms:

hpss:
protocol: htar
archive_format: tar
archive_path:
- /NCEPPROD/hpssprod/runhistory/rh{yyyy}/{yyyymm}/{yyyymmdd}
- /NCEPPROD/hpssprod/runhistory/rh{yyyy}/{yyyymm}/{yyyymmdd}
- /NCEPPROD/hpssprod/runhistory/rh{yyyy}/{yyyymm}/{yyyymmdd}
- /NCEPPROD/hpssprod/runhistory/rh{yyyy}/{yyyymm}/{yyyymmdd}
archive_internal_dir:
- ./upperair/mrms/conus/MergedReflectivityQC
archive_file_names:
- dcom_ldmdata_obs.tar
- dcom_prod_ldmdata_obs.tar
- ldmdata.tide.{yyyymmdd}.tar
- ldmdata.gyre.{yyyymmdd}.tar
file_names:
obs:
- "*MergedReflectivityQC_*_{yyyymmdd}-{hh}{min}*.grib2*"
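The `file_names` pattern above uses the same brace placeholders ({yyyymmdd}, {hh}, {min}) as the other entries in data_locations.yml. A minimal sketch of how such a pattern could be filled in for one cycle; the `fill_template` helper below is illustrative only, not the actual substitution code in ush/retrieve_data.py:

```python
# Sketch: filling the date/time placeholders of a data_locations.yml
# file-name pattern. Hypothetical helper, for illustration only.
from datetime import datetime

def fill_template(template: str, cycle: datetime) -> str:
    """Substitute {yyyymmdd}, {hh}, and {min} in a file-name pattern."""
    return template.format(
        yyyymmdd=cycle.strftime("%Y%m%d"),
        hh=cycle.strftime("%H"),
        min=cycle.strftime("%M"),
    )

pattern = "*MergedReflectivityQC_*_{yyyymmdd}-{hh}{min}*.grib2*"
print(fill_template(pattern, datetime(2023, 5, 30, 12, 0)))
# -> *MergedReflectivityQC_*_20230530-1200*.grib2*
```

The leading and trailing `*` are shell-glob wildcards that survive the substitution, so the result is matched against archive contents rather than used as a literal name.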


RAP_obs:
hpss:
protocol: htar
@@ -367,6 +389,7 @@ RAP_obs:
- "{yyyymmddhh}.rap_e.t{hh}z.lgycld.tm00.bufr_d"
- "{yyyymmddhh}.rap_e.t{hh}z.nexrad.tm00.bufr_d"
- "{yyyymmddhh}.rap_e.t{hh}z.satwnd.tm00.bufr_d"
- "{yyyymmddhh}.rap_e.t{hh}z.lghtng.tm00.bufr_d"
aws:
protocol: download
url: https://noaa-rap-pds.s3.amazonaws.com/rap.{yyyymmdd}
2 changes: 2 additions & 0 deletions parm/wflow/da_data_preproc.yaml
@@ -14,6 +14,7 @@ default_data_preproc_task: &default_preproc
subcyc: !cycstr "@M"
LOGDIR: !cycstr "&LOGDIR;"
CYCLE_TYPE: '#cycle_type#'
nprocs: '{{ parent.nnodes * parent.ppn }}'
native: '{{ platform.SCHED_NATIVE_CMD }}'
nodes: '{{ nnodes }}:ppn={{ ppn }}'
nnodes: 1
@@ -44,6 +45,7 @@ metatask_process_obs_cycle_type:
task_process_radarref_#cycle_type#:
<<: *default_preproc
command: '&LOAD_MODULES_RUN_TASK_FP; "process_obs" "&JOBSdir;/JREGIONAL_PROCESS_RADARREF"'
nnodes: 2
ppn: 24
join: !cycstr '&LOGDIR;/{{ jobname }}_@Y@m@d@H&LOGEXT;'
dependency:
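The new `nprocs` entry is rendered by Jinja from the task's `nnodes` and `ppn`, replacing the hard-coded shell arithmetic removed from exregional_process_radarref.sh. With the values this PR sets for the radar reflectivity task (nnodes: 2, ppn: 24), the computation is simply:

```python
# Sketch: what '{{ parent.nnodes * parent.ppn }}' resolves to for
# task_process_radarref with the resource values set in this PR.
nnodes = 2   # nodes requested for the radar reflectivity task
ppn = 24     # processes per node
nprocs = nnodes * ppn
print(nprocs)  # -> 48
```

Computing the rank count in the workflow definition keeps the scripts free of machine-specific resource math: changing `nnodes` or `ppn` in one place updates `nprocs` automatically.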
69 changes: 65 additions & 4 deletions scripts/exregional_get_da_obs.sh
@@ -8,7 +8,8 @@
#-----------------------------------------------------------------------
#
. $USHdir/source_util_funcs.sh
source_config_for_task "task_get_da_obs" ${GLOBAL_VAR_DEFNS_FP}
. $USHdir/get_mrms_files.sh
source_config_for_task "task_get_da_obs|task_process_radarref" ${GLOBAL_VAR_DEFNS_FP}
#
#-----------------------------------------------------------------------
#
@@ -120,9 +121,9 @@ The command was:
${cmd}
"
# Link to GSI-expected filenames
mv_vrfy "${DATA}/${template_arr[0]}" "${DATA}/lghtngbufr"
mv_vrfy "${DATA}/${template_arr[1]}" "${DATA}/lgycld.bufr_d"
mv_vrfy "${DATA}/${template_arr[2]}" "${DATA}/prepbufr"
ln_vrfy "${DATA}/${template_arr[0]}" "${DATA}/lghtngbufr"
ln_vrfy "${DATA}/${template_arr[1]}" "${DATA}/lgycld.bufr_d"
ln_vrfy "${DATA}/${template_arr[2]}" "${DATA}/prepbufr"

#
#-----------------------------------------------------------------------
@@ -184,6 +185,66 @@ if [ "${NLDN_NEEDED:-}" = "TRUE" ]; then
done

fi
#
#-----------------------------------------------------------------------
#
# retrieve NSSL Mosaic Reflectivity observations
#
#-----------------------------------------------------------------------
#


# If files are available on disk, copy them here in the code. This
# approach is used here because of its efficiency compared to repeated
# calls to retrieve_data.py for this observation file type.

#
#-----------------------------------------------------------------------
#
# Check for files on disk first
#
#-----------------------------------------------------------------------
#
mrms="MergedReflectivityQC"

for timelevel in ${RADARREFL_TIMELEVEL[@]}; do
echo "timelevel = ${timelevel}"
timelevel=$( printf %2.2i $timelevel )
radar_output_path=${RADAR_DATA}/${timelevel}
mkdir -p $radar_output_path

#-----------------------------------------------------------------------
# copy observation files to staging location
#-----------------------------------------------------------------------
if [ -n "${NSSLMOSAIC:-}" ] ; then
get_mrms_files $timelevel $radar_output_path $mrms
fi

# Check to see if files were retrieved from disk
# Try other resources, if not
if [ ! -s ${radar_output_path}/filelist_mrms ]; then

cmd="
python3 -u ${USHdir}/retrieve_data.py \
--debug \
--file_set obs \
--config ${PARMdir}/data_locations.yml \
--cycle_date ${PDY}${cyc}${timelevel} \
--data_stores ${EXTRN_MDL_DATA_STORES} \
--data_type NSSL_mrms \
--output_path ${radar_output_path}
"

$cmd || print_err_msg_exit "\
Call to retrieve_data.py failed with a non-zero exit status.

The command was:
${cmd}
"
ls $radar_output_path/*${mrms}* > $radar_output_path/filelist_mrms
fi
done


#
#-----------------------------------------------------------------------
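The retrieval flow added above is disk-first with a fallback: `get_mrms_files` tries locally staged data, and only if `filelist_mrms` ends up missing or empty does the script fall back to retrieve_data.py. A condensed, self-contained sketch of that control flow (the path and the stub are hypothetical, standing in for `get_mrms_files` and the retrieve_data.py call):

```shell
#!/bin/sh
# Condensed sketch of the disk-first pattern in exregional_get_da_obs.sh.
# A real run would call get_mrms_files here; this stub stages nothing,
# so the empty file list triggers the fallback branch.
radar_output_path="$(mktemp -d)/00"
mkdir -p "$radar_output_path"

: > "$radar_output_path/filelist_mrms"   # stand-in for get_mrms_files output

# "-s" is true only for a file that exists AND is non-empty, so an
# empty list means nothing usable was found on disk.
if [ ! -s "$radar_output_path/filelist_mrms" ]; then
  echo "fallback: would call retrieve_data.py"
else
  echo "using files staged from disk"
fi
```

The `-s` test is what makes the pattern robust: a `filelist_mrms` that was created but left empty (no matching files on disk) still routes the task to the HPSS/AWS fallback.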
110 changes: 38 additions & 72 deletions scripts/exregional_process_radarref.sh
@@ -8,6 +8,7 @@
#-----------------------------------------------------------------------
#
. $USHdir/source_util_funcs.sh
. $USHdir/get_mrms_files.sh
source_config_for_task "task_process_radarref|task_run_fcst" ${GLOBAL_VAR_DEFNS_FP}
#
#-----------------------------------------------------------------------
@@ -53,8 +54,6 @@ with FV3 for the specified cycle.
#
eval ${PRE_TASK_CMDS}

nprocs=$(( NNODES_PROCESS_RADARREF*PPN_PROCESS_RADARREF))

#
#-----------------------------------------------------------------------
#
@@ -80,17 +79,6 @@ DD=${YYYYMMDDHH:6:2}
#-----------------------------------------------------------------------
#
BKTYPE=0
if [ ${DO_SPINUP} == "TRUE" ]; then
if [ ${CYCLE_TYPE} == "spinup" ]; then
if [[ ${CYCL_HRS_SPINSTART[@]} =~ "$cyc" ]] ; then
BKTYPE=1
fi
fi
else
if [[ ${CYCL_HRS_PRODSTART[@]} =~ "$cyc" ]] ; then
BKTYPE=1
fi
fi

n_iolayouty=$(($IO_LAYOUT_Y-1))

@@ -103,84 +91,62 @@ n_iolayouty=$(($IO_LAYOUT_Y-1))
#-----------------------------------------------------------------------
#
print_info_msg "$VERBOSE" "
Getting into working directory for radar reflectivity process ..."
Getting into working directory for radar reflectivity process ... ${DATA}"

pregen_grid_dir=$DOMAIN_PREGEN_BASEDIR/${PREDEF_GRID_NAME}
print_info_msg "$VERBOSE" "pregen_grid_dir is $pregen_grid_dir"

for timelevel in ${RADARREFL_TIMELEVEL[@]}; do
echo "timelevel = ${timelevel}"
timelevel=$( printf %2.2i $timelevel )
mkdir_vrfy ${DATA}/${timelevel}
cd ${DATA}/${timelevel}

pregen_grid_dir=$DOMAIN_PREGEN_BASEDIR/${PREDEF_GRID_NAME}

print_info_msg "$VERBOSE" "pregen_grid_dir is $pregen_grid_dir"
mkdir_vrfy -p ${DATA}/${timelevel}
cd_vrfy ${DATA}/${timelevel}

#
#-----------------------------------------------------------------------
#
# link or copy background files
#
#-----------------------------------------------------------------------
#
#-----------------------------------------------------------------------
#
# copy background files
#
#-----------------------------------------------------------------------

if [ ${BKTYPE} -eq 1 ]; then
cp_vrfy ${pregen_grid_dir}/fv3_grid_spec fv3sar_grid_spec.nc
cp -f ${pregen_grid_dir}/fv3_grid_spec fv3sar_grid_spec.nc
else
if [ "${IO_LAYOUT_Y}" == "1" ]; then
cp_vrfy ${pregen_grid_dir}/fv3_grid_spec fv3sar_grid_spec.nc
cp -f ${pregen_grid_dir}/fv3_grid_spec fv3sar_grid_spec.nc
else
for iii in $(seq -w 0 $(printf %4.4i $n_iolayouty))
do
cp_vrfy ${pregen_grid_dir}/fv3_grid_spec.${iii} fv3sar_grid_spec.nc.${iii}
cp -f ${pregen_grid_dir}/fv3_grid_spec.${iii} fv3sar_grid_spec.nc.${iii}
done
fi
fi

#
#-----------------------------------------------------------------------
#
# link/copy observation files to working directory
#
#-----------------------------------------------------------------------

NSSL=${OBSPATH_NSSLMOSIAC}
#
#-----------------------------------------------------------------------
#
# copy observation files to working directory, if running in real
# time. otherwise, the get_da_obs data task will do this for you.
#
#-----------------------------------------------------------------------

mrms="MergedReflectivityQC"

# Link to the MRMS operational data
echo "timelevel = ${timelevel}"
echo "RADARREFL_MINS = ${RADARREFL_MINS[@]}"

# Link to the MRMS operational data
# This loop finds files closest to the given "timelevel"
for min in ${RADARREFL_MINS[@]}
do
min=$( printf %2.2i $((timelevel+min)) )
echo "Looking for data valid:"${YYYY}"-"${MM}"-"${DD}" "${cyc}":"${min}
sec=0
while [[ $sec -le 59 ]]; do
ss=$(printf %2.2i ${sec})
nsslfile=${NSSL}/*${mrms}_00.50_${YYYY}${MM}${DD}-${cyc}${min}${ss}.${OBS_SUFFIX}
if [ -s $nsslfile ]; then
echo 'Found '${nsslfile}
nsslfile1=*${mrms}_*_${YYYY}${MM}${DD}-${cyc}${min}*.${OBS_SUFFIX}
numgrib2=$(ls ${NSSL}/${nsslfile1} | wc -l)
echo 'Number of GRIB-2 files: '${numgrib2}
if [ ${numgrib2} -ge 10 ] && [ ! -e filelist_mrms ]; then
cp ${NSSL}/${nsslfile1} .
ls ${nsslfile1} > filelist_mrms
echo 'Creating links for ${YYYY}${MM}${DD}-${cyc}${min}'
fi
fi
((sec+=1))
done
done
if [ "${DO_REAL_TIME}" = true ] ; then
get_mrms_files $timelevel "./" $mrms
else
# The data was staged by the get_da_obs task, so copy from COMIN.
# Use copy here so that we can unzip if necessary.
cp_vrfy ${COMIN}/radar/${timelevel}/* .
fi # DO_REAL_TIME

if [ -s filelist_mrms ]; then

if [ ${OBS_SUFFIX} == "grib2.gz" ]; then
gzip -d *.gz
mv filelist_mrms filelist_mrms_org
ls MergedReflectivityQC_*_${YYYY}${MM}${DD}-${cyc}????.grib2 > filelist_mrms
fi
# Unzip files if needed, and update filelist_mrms
if [ $(ls *.gz 2> /dev/null | wc -l) -gt 0 ]; then
gzip -d *.gz
mv filelist_mrms filelist_mrms_org
ls ${mrms}_*_${YYYY}${MM}${DD}-${cyc}????.grib2 > filelist_mrms
fi

numgrib2=$(more filelist_mrms | wc -l)
print_info_msg "$VERBOSE" "Using radar data from: `head -1 filelist_mrms | cut -c10-15`"
@@ -189,7 +155,7 @@ for timelevel in ${RADARREFL_TIMELEVEL[@]}; do
# remove filelist_mrms if zero bytes
rm -f filelist_mrms

echo "WARNING: Not enough radar reflectivity files available for loop ${timelevel}."
echo "WARNING: Not enough radar reflectivity files available for timelevel ${timelevel}."
continue
fi

@@ -9,7 +9,9 @@ workflow:
PREDEF_GRID_NAME: RRFS_CONUS_3km
DATE_FIRST_CYCL: '2022072000'
DATE_LAST_CYCL: '2022072000'

PREEXISTING_DIR_METHOD: rename
task_process_radarref:
RADARREFL_TIMELEVEL: [0]
rocoto:
entities:
START_TIME_NSSLMOSAIC: "00:45:00"
@@ -20,5 +22,4 @@
tasks:
taskgroups: '{{ ["parm/wflow/da_data_preproc.yaml"]|include }}'
metatask_process_obs_cycle_type:
task_process_radarref_#cycle_type#:
task_process_lightning_#cycle_type#:
1 change: 0 additions & 1 deletion ush/config.da_cycling.yaml
@@ -41,7 +41,6 @@ rocoto:
tasks:
taskgroups: '{{ ["parm/wflow/prep.yaml", "parm/wflow/da_data_preproc.yaml", "parm/wflow/coldstart.yaml", "parm/wflow/post.yaml"]|include }}'
metatask_process_obs_cycle_type:
task_process_radarref_#cycle_type#:
task_process_lightning_#cycle_type#:
metatask_run_ensemble:
task_run_fcst_mem#mem#: