UFS-WM testing w/ spack-stack #1651
@ulmononian Do you have any update regarding the spack-stack docker container? The library update for hdf-1.14.0/netcdf-4.9.1/esmf-8.4.1/mapl-2.35.2 is an ongoing priority. Following the update, EPIC needs to keep the container used for the Jenkins-CI pipeline in sync. Please let me know if we need a quick tag-up on this.
the ufs-wm container based on the spack-stack unified environment package versions/variants will be delivered shortly after the release of [email protected]. for the interim, if you are interested, please have a look at the JEDI Skylab containers, which utilize spack-stack. they are available with (i) clang/mpich and (ii) gnu/openmpi. note that Skylab has a different set of package versions than the UE.
What about spack-stack itself? Is it going to have debug versions of MAPL or ESMF?
Yes, it’s got both as part of the unified environment on the HPCs: consistent builds of ESMF debug with MAPL debug, and ESMF release with MAPL release.
if you look at the beta version of the unified environment i shared in the issue description, you will see the debug builds of esmf and mapl there.
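for example, something along these lines should show both variants (the module names/versions below are assumptions based on the library versions mentioned earlier in this thread; check the actual beta installation for the real labels):

```bash
# list the available esmf / mapl builds provided by the UE
module av esmf mapl

# hypothetical: debug variants exposed as separate module versions
module load esmf/8.4.1-debug mapl/2.35.2-debug
```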
We should stop building debug versions of esmf (and mapl).
Yes, please! And even more important, get rid of the annoying I_MPI debug library requirement.
@rhaesung @ulmononian @yichengt90 can we run a quick build test with land DA and noah-mp as well?
do you mean ensure the land DA / noah-mp system builds & runs using the unified environment? or to add a specific land DA env into the unified environment (as is done for global workflow, srw, ufs-wm, etc.)?
land DA / noah-mp build cases need features of both the jedi and ufs-wm environments. a build test with the land DA release branch is enough for now, but I hope to follow up on the noah-mp component build along with that. Let me know if we need a quick tag-up.
the unified environment contains all necessary modules for building any of the jedi bundles. for example, in the case of land DA (which currently uses fv3-bundle): the land DA system (cloned/built from https://github.com/NOAA-EPIC/land-offline_workflow/tree/release/public-v1.0.0) was run for the 2016 case using this UE-built fv3-bundle and a modified modulefile.
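for reference, a rough sketch of what building fv3-bundle on top of the UE looks like (the ecbuild/make invocation follows the generic JEDI bundle pattern and is an assumption, not the exact commands used for this test):

```bash
# with the UE modules loaded (see the module use/load steps elsewhere in this issue):
git clone https://github.com/JCSDA/fv3-bundle.git
mkdir -p fv3-bundle/build && cd fv3-bundle/build

# ecbuild (provided by the UE) configures CMake for the whole bundle
ecbuild ..
make -j4
```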
@ulmononian there is a re-syncing issue on the land DA side (NOAA-PSL/land-offline_workflow#29). Is it possible to install a similar version of this spack-stack on hera?
@rhaesung FYI
a beta installation of the UE on hera is underway. i will share the path and an updated modulefile.
@jkbk2004 @rhaesung i installed a beta UE on hera; an updated modulefile for land DA on hera can be found here: https://github.com/ulmononian/land-offline_workflow/blob/release/public-v1.0.0/modulefiles/landda_hera.intel.lua.
Can we close this issue? |
Why is this issue still open? |
this can be closed! |
Description
As the transition from hpc-stack to spack-stack is ongoing (e.g., #1448, #1621, Acorn spack testing, spack-stack #454, spack-stack #478), a new spack-based Unified Environment (UE) has been developed to help facilitate the switch. This environment contains a "unified" set of compiler+MPI (Intel & GNU), libraries/packages, and modules to support the UFS-WM and various related apps (e.g., global-workflow, SRW, JEDI Skylab, and GSI).
A preliminary (beta) environment has been installed by @climbfuji here on Orion:

```
/work2/noaa/da/role-da/spack-stack-feature-r2d2-mysql/envs/unified-4.0.0-rc1/install
```

and can be loaded via:

```
module use /work2/noaa/da/role-da/spack-stack-feature-r2d2-mysql/envs/unified-4.0.0-rc1/install/modulefiles/Core
module av
```
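From there, building the UFS-WM on top of the UE looks roughly like the following sketch (the meta-module names/versions are assumptions based on common spack-stack conventions; check `module av` for the actual ones in this beta installation):

```bash
# point the module system at the beta UE, then load the compiler/MPI stack
module use /work2/noaa/da/role-da/spack-stack-feature-r2d2-mysql/envs/unified-4.0.0-rc1/install/modulefiles/Core
module load stack-intel stack-intel-oneapi-mpi   # hypothetical meta-module names

# build the coupled model; APP selects the application configuration
cd ufs-weather-model
CMAKE_FLAGS="-DAPP=S2SW" ./build.sh
```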
An initial round of testing the UFS-WM (as well as the global-workflow, SRW, SkyLab, and GSI) using the UE has been completed on Orion (@mark-a-potts successfully completed the full rt.sh suite with a new baseline). For a recent sample compile/run of cpld_control_p8, see:

```
/work/noaa/stmp/cbook/stmp/cbook/FV3_RT/rt_198650
```

Some additional UFS-WM RTs have been performed with the UE on Parallel Works - AWS (e.g., cpld_control_c48); however, this testing is ongoing in collaboration with @yichengt90 / @clouden90.

Solution
Upon release of [email protected], the Unified Environment will be installed in official NOAA-EPIC & JCSDA locations on these spack-stack pre-configured sites. Given that, testing of the UFS-WM with the spack-stack UE will need to be expanded significantly. Ideally, the full set of RTs should be run on each machine; new baselines will more than likely be required.
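As an illustration, the per-machine testing boils down to commands along these lines (rt.sh options have changed over time, so treat the flags below as indicative rather than exact):

```bash
cd tests

# run a single regression test against the existing baseline,
# keeping the run directory for inspection
./rt.sh -k -n cpld_control_p8

# generate new baselines for the full suite, which will likely be
# needed once library versions change under the spack-stack UE
./rt.sh -c -l rt.conf
```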
Module files will need to be updated concomitantly with this testing (e.g.: https://github.com/ulmononian/ufs-weather-model/blob/test_spack/modulefiles/ufs_orion.intel.lua). For running on the cloud, various modifications also need to be made to the RT scripts and configuration files (e.g.: #1650; see https://github.com/ulmononian/ufs-weather-model/tree/feature/noaacloud_rt).
Further, ESMF library naming and linking needs to be addressed (see #1498); this is currently handled in spack via NOAA-EMC/spack #238. Note that the recently merged PR #1645 removed the static parallelio requirement, which is pertinent to the spack-stack transition because the UE uses shared parallelio (with an exception for operational machines).
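One quick, informal way to confirm that a UE build picked up the shared parallelio (an illustrative check, not part of the RT procedure; the executable name may differ by configuration):

```bash
# a shared libpio line should appear for UE builds; a statically
# linked PIO would not show up here
ldd ufs_model | grep -i pio
```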
This issue can be used to track some of the testing (successes & failures!) and hopefully facilitate some discussion about the transition.
Related to
may help address #1147, #1448
pertains to #1621
Butterfly test results look good: cpld_control_p8. A comparison of the 500 mb temperature impact between this PR and the develop branch is attached.
Originally posted by @jkbk2004 in #1707 (comment)