Updates for HR4 tag #2914

Merged (2 commits), Sep 13, 2024
parm/config/gefs/config.ufs: 16 additions & 23 deletions
@@ -80,8 +80,8 @@ case "${fv3_res}" in
     export nthreads_fv3_gfs=1
     export nthreads_ufs=1
     export nthreads_ufs_gfs=1
-    export xr_cnvcld=.false. # Do not pass conv. clouds to Xu-Randall cloud fraction
-    export cdmbgwd="0.071,2.1,1.0,1.0" # mountain blocking, ogwd, cgwd, cgwd src scaling
+    export xr_cnvcld=.false. # Do not pass conv. clouds to Xu-Randall cloud fraction
+    export cdmbgwd="0.071,2.1,1.0,1.0" # mountain blocking, ogwd, cgwd, cgwd src scaling
     export cdmbgwd_gsl="40.0,1.77,1.0,1.0" # settings for GSL drag suite
     export k_split=1
     export n_split=4
@@ -104,8 +104,8 @@ case "${fv3_res}" in
     export nthreads_fv3_gfs=1
     export nthreads_ufs=1
     export nthreads_ufs_gfs=1
-    export xr_cnvcld=".false." # Do not pass conv. clouds to Xu-Randall cloud fraction
-    export cdmbgwd="0.14,1.8,1.0,1.0" # mountain blocking, ogwd, cgwd, cgwd src scaling
+    export xr_cnvcld=".false." # Do not pass conv. clouds to Xu-Randall cloud fraction
+    export cdmbgwd="0.14,1.8,1.0,1.0" # mountain blocking, ogwd, cgwd, cgwd src scaling
     export cdmbgwd_gsl="20.0,2.5,1.0,1.0" # settings for GSL drag suite
     export knob_ugwp_tauamp=3.0e-3 # setting for UGWPv1 non-stationary GWD
     export k_split=1
@@ -254,40 +254,33 @@ export ntasks_fv3_gfs
 export ntasks_quilt
 export ntasks_quilt_gfs
 
-# Determine whether to use compression in the write grid component based on resolution
+# Determine whether to use compression in the write grid component
+# and whether to use parallel NetCDF based on resolution
 case ${fv3_res} in
-  "C48" | "C96" | "C192" | "C384")
+  "C48" | "C96" | "C192")
     zstandard_level=0
     ideflate=0
     quantize_nsd=0
+    OUTPUT_FILETYPE_ATM="netcdf"
+    OUTPUT_FILETYPE_SFC="netcdf"
     ;;
-  "C768" | "C1152" | "C3072")
+  "C384" | "C768" | "C1152" | "C3072")
     zstandard_level=0
     ideflate=1
     quantize_nsd=5
-    ;;
-  *)
-    echo "FATAL ERROR: Unrecognized FV3 resolution ${fv3_res}"
-    exit 15
-    ;;
-esac
-export zstandard_level ideflate quantize_nsd
-
-# Determine whether to use parallel NetCDF based on resolution
-case ${fv3_res} in
-  "C48" | "C96" | "C192" | "C384")
-    OUTPUT_FILETYPE_ATM="netcdf"
-    OUTPUT_FILETYPE_SFC="netcdf"
-    ;;
-  "C768" | "C1152" | "C3072")
     OUTPUT_FILETYPE_ATM="netcdf_parallel"
-    OUTPUT_FILETYPE_SFC="netcdf_parallel"
+    if [[ "${fv3_res}" == "C384" ]]; then
+      OUTPUT_FILETYPE_SFC="netcdf" # For C384, the write grid component is better off with serial netcdf
+    else
+      OUTPUT_FILETYPE_SFC="netcdf_parallel"
+    fi
     ;;
   *)
     echo "FATAL ERROR: Unrecognized FV3 resolution ${fv3_res}"
     exit 15
     ;;
 esac
+export zstandard_level ideflate quantize_nsd
 export OUTPUT_FILETYPE_ATM OUTPUT_FILETYPE_SFC
 
 # cpl defaults
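The net effect of the consolidation is easiest to see by running the new block in isolation. Below is a minimal spot-check sketch (illustrative, not workflow code) that mirrors the merged logic; the headline change is that C384 moves into the compressed tier and now gets parallel NetCDF for the atm stream while keeping serial NetCDF for the sfc stream:

    #!/usr/bin/env bash
    # Spot-check sketch mirroring the merged case block (illustrative only).
    for fv3_res in C96 C384 C768; do
      case ${fv3_res} in
        "C48" | "C96" | "C192")
          zstandard_level=0; ideflate=0; quantize_nsd=0
          OUTPUT_FILETYPE_ATM="netcdf"
          OUTPUT_FILETYPE_SFC="netcdf"
          ;;
        "C384" | "C768" | "C1152" | "C3072")
          zstandard_level=0; ideflate=1; quantize_nsd=5
          OUTPUT_FILETYPE_ATM="netcdf_parallel"
          if [[ "${fv3_res}" == "C384" ]]; then
            OUTPUT_FILETYPE_SFC="netcdf"  # serial netcdf works better for C384 sfc output
          else
            OUTPUT_FILETYPE_SFC="netcdf_parallel"
          fi
          ;;
      esac
      echo "${fv3_res}: ideflate=${ideflate} quantize_nsd=${quantize_nsd}" \
           "atm=${OUTPUT_FILETYPE_ATM} sfc=${OUTPUT_FILETYPE_SFC}"
    done
    # Expected output:
    #   C96: ideflate=0 quantize_nsd=0 atm=netcdf sfc=netcdf
    #   C384: ideflate=1 quantize_nsd=5 atm=netcdf_parallel sfc=netcdf
    #   C768: ideflate=1 quantize_nsd=5 atm=netcdf_parallel sfc=netcdf_parallel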
parm/config/gfs/config.ufs: 12 additions & 19 deletions
@@ -356,40 +356,33 @@ export ntasks_fv3_gfs
 export ntasks_quilt_gdas
 export ntasks_quilt_gfs
 
-# Determine whether to use compression in the write grid component based on resolution
+# Determine whether to use compression in the write grid component
+# and whether to use parallel NetCDF based on resolution
 case ${fv3_res} in
-  "C48" | "C96" | "C192" | "C384")
+  "C48" | "C96" | "C192")
     zstandard_level=0
     ideflate=0
     quantize_nsd=0
+    OUTPUT_FILETYPE_ATM="netcdf"
+    OUTPUT_FILETYPE_SFC="netcdf"
     ;;
-  "C768" | "C1152" | "C3072")
+  "C384" | "C768" | "C1152" | "C3072")
     zstandard_level=0
     ideflate=1
     quantize_nsd=5
-    ;;
-  *)
-    echo "FATAL ERROR: Unrecognized FV3 resolution ${fv3_res}"
-    exit 15
-    ;;
-esac
-export zstandard_level ideflate quantize_nsd
-
-# Determine whether to use parallel NetCDF based on resolution
-case ${fv3_res} in
-  "C48" | "C96" | "C192" | "C384")
-    OUTPUT_FILETYPE_ATM="netcdf"
-    OUTPUT_FILETYPE_SFC="netcdf"
-    ;;
-  "C768" | "C1152" | "C3072")
     OUTPUT_FILETYPE_ATM="netcdf_parallel"
-    OUTPUT_FILETYPE_SFC="netcdf_parallel"
+    if [[ "${fv3_res}" == "C384" ]]; then
+      OUTPUT_FILETYPE_SFC="netcdf" # For C384, the write grid component is better off with serial netcdf
+    else
+      OUTPUT_FILETYPE_SFC="netcdf_parallel"
+    fi
     ;;
   *)
     echo "FATAL ERROR: Unrecognized FV3 resolution ${fv3_res}"
     exit 15
     ;;
 esac
+export zstandard_level ideflate quantize_nsd
 export OUTPUT_FILETYPE_ATM OUTPUT_FILETYPE_SFC
 
 # cpl defaults
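For orientation, these exported values are consumed later by ush/parsing_model_configure_FV3.sh (the last file in this PR) when it renders the write-component section of model_configure. A hedged sketch of that rendering, with key names assumed from recent UFS model_configure templates and values stubbed for C384 (this is not the actual template):

    #!/usr/bin/env bash
    # Illustrative sketch: how the exports could surface in model_configure.
    # Key names are assumptions from recent UFS templates; values stubbed for C384.
    zstandard_level=0; ideflate=1; quantize_nsd=5
    OUTPUT_FILETYPE_ATM="netcdf_parallel"; OUTPUT_FILETYPE_SFC="netcdf"
    echo "output_file:     '${OUTPUT_FILETYPE_ATM}' '${OUTPUT_FILETYPE_SFC}'"
    echo "zstandard_level: ${zstandard_level}"
    echo "ideflate:        ${ideflate}"
    echo "quantize_nsd:    ${quantize_nsd}"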
ush/forecast_postdet.sh: 4 additions & 4 deletions
@@ -233,14 +233,14 @@ EOF
     ${NLN} "${COMOUT_ATMOS_HISTORY}/${RUN}.t${cyc}z.atmf${FH3}.nc" "atmf${f_hhmmss}.nc"
     ${NLN} "${COMOUT_ATMOS_HISTORY}/${RUN}.t${cyc}z.sfcf${FH3}.nc" "sfcf${f_hhmmss}.nc"
     ${NLN} "${COMOUT_ATMOS_HISTORY}/${RUN}.t${cyc}z.atm.logf${FH3}.txt" "log.atm.f${f_hhmmss}"
-    ${NLN} "${COMOUT_ATMOS_HISTORY}/${RUN}.t${cyc}z.cubed_sphere_grid_atmf${FH3}.nc" "cubed_sphere_grid_atmf${f_hhmmss}.nc"
-    ${NLN} "${COMOUT_ATMOS_HISTORY}/${RUN}.t${cyc}z.cubed_sphere_grid_sfcf${FH3}.nc" "cubed_sphere_grid_sfcf${f_hhmmss}.nc"
   else
     ${NLN} "${COMOUT_ATMOS_HISTORY}/${RUN}.t${cyc}z.atmf${FH3}.nc" "atmf${FH3}.nc"
     ${NLN} "${COMOUT_ATMOS_HISTORY}/${RUN}.t${cyc}z.sfcf${FH3}.nc" "sfcf${FH3}.nc"
     ${NLN} "${COMOUT_ATMOS_HISTORY}/${RUN}.t${cyc}z.atm.logf${FH3}.txt" "log.atm.f${FH3}"
-    ${NLN} "${COMOUT_ATMOS_HISTORY}/${RUN}.t${cyc}z.cubed_sphere_grid_atmf${FH3}.nc" "cubed_sphere_grid_atmf${FH3}.nc"
-    ${NLN} "${COMOUT_ATMOS_HISTORY}/${RUN}.t${cyc}z.cubed_sphere_grid_sfcf${FH3}.nc" "cubed_sphere_grid_sfcf${FH3}.nc"
+    if [[ "${DO_JEDIATMVAR:-}" == "YES" ]]; then
+      ${NLN} "${COMOUT_ATMOS_HISTORY}/${RUN}.t${cyc}z.cubed_sphere_grid_atmf${FH3}.nc" "cubed_sphere_grid_atmf${FH3}.nc"
+      ${NLN} "${COMOUT_ATMOS_HISTORY}/${RUN}.t${cyc}z.cubed_sphere_grid_sfcf${FH3}.nc" "cubed_sphere_grid_sfcf${FH3}.nc"
+    fi
   fi
   if [[ "${WRITE_DOPOST}" == ".true." ]]; then
     ${NLN} "${COMOUT_ATMOS_MASTER}/${RUN}.t${cyc}z.master.grb2f${FH3}" "GFSPRS.GrbF${FH2}"
ush/parsing_model_configure_FV3.sh: 5 additions & 1 deletion
@@ -31,7 +31,11 @@ local WRITE_GROUP=${WRITE_GROUP:-1}
 local WRTTASK_PER_GROUP=${WRTTASK_PER_GROUP:-24}
 local ITASKS=1
 local OUTPUT_HISTORY=${OUTPUT_HISTORY:-".true."}
-local HISTORY_FILE_ON_NATIVE_GRID=".true."
+if [[ "${DO_JEDIATMVAR:-}" == "YES" ]]; then
DavidNew-NOAA (Contributor) commented on Sep 12, 2024:

Should this be set in a config file, so that these lines in parsing_model_configure_FV3.sh become:

    local HISTORY_FILE_ON_NATIVE_GRID=${history_file_on_native_grid:-".false."}

and history_file_on_native_grid is set in config.ufs?

Maybe not, but I don't know what the design philosophy is and how all these scripts are supposed to fit together. It seems like some variables in this script are indeed coming from config.ufs.
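A minimal sketch of the suggested split, assuming the reviewer's proposed lowercase knob history_file_on_native_grid (hypothetical names; this is the suggestion, not the merged change):

    #!/usr/bin/env bash
    # Hypothetical sketch of the suggestion above (not the merged code).
    # config.ufs would export a lowercase knob:
    export history_file_on_native_grid=".true."

    # parsing_model_configure_FV3.sh would then consume it with a safe default
    # (function name assumed to match that script's convention):
    FV3_model_configure() {
      local HISTORY_FILE_ON_NATIVE_GRID=${history_file_on_native_grid:-".false."}
      echo "HISTORY_FILE_ON_NATIVE_GRID=${HISTORY_FILE_ON_NATIVE_GRID}"
    }
    FV3_model_configure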

+  local HISTORY_FILE_ON_NATIVE_GRID=".true."
+else
+  local HISTORY_FILE_ON_NATIVE_GRID=".false."
junwang-noaa (Contributor) commented:

Quick question: when HISTORY_FILE_ON_NATIVE_GRID=".false.", will the softlinks cubed_sphere_grid_atmf???.nc / cubed_sphere_grid_sfcf???.nc still be created?

DavidNew-NOAA (Contributor) commented on Sep 12, 2024:

Good point @junwang-noaa. @aerorahul, logic should be added to forecast_postdet.sh where those soft links are created, depending on whether JEDI is being used. Starting here:

    ${NLN} "${COMOUT_ATMOS_HISTORY}/${RUN}.t${cyc}z.cubed_sphere_grid_atmf${FH3}.nc" "cubed_sphere_grid_atmf${f_hhmmss}.nc"
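In sketch form, the resolution that followed (and that the ush/forecast_postdet.sh hunk above merges): create the cubed-sphere links only when JEDI atmospheric DA is enabled. Variables are stubbed so the snippet stands alone, and NLN is assumed to be the workflow's forced-symlink helper:

    #!/usr/bin/env bash
    # Sketch only: guard the cubed-sphere history links behind the JEDI switch.
    NLN="${NLN:-ln -sf}"                                        # assumed helper definition
    DO_JEDIATMVAR="${DO_JEDIATMVAR:-NO}"
    COMOUT_ATMOS_HISTORY="${COMOUT_ATMOS_HISTORY:-/tmp/comout}" # hypothetical path
    RUN="gfs"; cyc="00"; FH3="006"

    if [[ "${DO_JEDIATMVAR}" == "YES" ]]; then
      ${NLN} "${COMOUT_ATMOS_HISTORY}/${RUN}.t${cyc}z.cubed_sphere_grid_atmf${FH3}.nc" "cubed_sphere_grid_atmf${FH3}.nc"
      ${NLN} "${COMOUT_ATMOS_HISTORY}/${RUN}.t${cyc}z.cubed_sphere_grid_sfcf${FH3}.nc" "cubed_sphere_grid_sfcf${FH3}.nc"
    fi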

aerorahul (Contributor, Author) commented:

Good catch @junwang-noaa. I'll remove the linking for when this is false.

aerorahul (Contributor, Author) commented:

@DavidNew-NOAA Not sure how the cubed-sphere restarts got linked inside the output_grid = Gaussian if-block.

DavidNew-NOAA (Contributor) commented:

@aerorahul Yeah, that was a mistake on my part.

DavidNew-NOAA (Contributor) commented on Sep 13, 2024:

@aerorahul Well, maybe it was more sloppiness. That OUTPUT_GRID parameter is somewhat useless, because one cannot write only cubed-sphere histories the way UFS is configured. You either write Gaussian histories or both Gaussian and cubed-sphere histories, depending on how you set HISTORY_FILE_ON_NATIVE_GRID.

+fi
 local WRITE_DOPOST=${WRITE_DOPOST:-".false."}
 local WRITE_NSFLIP=${WRITE_NSFLIP:-".false."}
 local NUM_FILES=${NUM_FILES:-2}
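The behavior described in the last comment reduces to two write-component settings. A hedged sketch of how they would appear in a rendered model_configure, with key names assumed from recent UFS templates: with output_grid set to gaussian_grid, HISTORY_FILE_ON_NATIVE_GRID=".true." writes Gaussian plus native cubed-sphere histories, while ".false." writes Gaussian histories only.

    #!/usr/bin/env bash
    # Illustrative only: the two write-component knobs discussed above.
    # Key names are assumptions from recent UFS model_configure templates.
    HISTORY_FILE_ON_NATIVE_GRID=".true."   # ".false." -> Gaussian histories only
    echo "output_grid:                 gaussian_grid"
    echo "history_file_on_native_grid: ${HISTORY_FILE_ON_NATIVE_GRID}"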