Upgrade subcomponents and the global workflow to use spack-stack #1868

Closed
DavidHuber-NOAA opened this issue Sep 19, 2023 · 22 comments · Fixed by #2084
@DavidHuber-NOAA

What new functionality do you need?

Migrate to spack-stack libraries for all subcomponents and the global workflow modules.

What are the requirements for the new functionality?

All module systems/components use spack-stack libraries. This has already been completed for the UFS, UFS_utils, GDAS App, and UPP repositories. The repos remaining are the GSI (NOAA-EMC/GSI#589), GSI-Utils (NOAA-EMC/GSI-utils#18), GSI-Monitor (NOAA-EMC/GSI-Monitor#98), gfs-utils, and verif-global.

Acceptance Criteria

The global workflow is able to run forecast-only and cycled experiments at all resolutions referencing spack-stack libraries on at least Hera and Orion.

Suggest a solution (optional)

No response

DavidHuber-NOAA added the feature and triage labels on Sep 19, 2023
DavidHuber-NOAA self-assigned this on Sep 19, 2023
@DavidHuber-NOAA

Opened a PR to upgrade the GSI to spack-stack: NOAA-EMC/GSI#624.

WalterKolczynski-NOAA removed the triage label on Sep 25, 2023
@DavidHuber-NOAA

MET/METplus will be an issue with spack-stack. Currently, the available builds are 10.1.1/4.1.1, respectively, while I believe the verif-global system requires 9.1.x/3.1.x. I think I will need to create a separate module file for each system that loads the appropriate hpc-stack builds of these modules, though reading through #1756 and #1342, it seems this package needs an update anyway. Am I correct about the MET/METplus versions @malloryprow?

That said, the spack repo includes met/9.1.3 and metplus/3.1.1, so they could at least in theory be installed. @AlexanderRichert-NOAA Would it be possible to install these under spack-stack 1.4.1 and/or 1.5.0 alongside the existing 10.1.1/4.1.1 builds?

@malloryprow

It'd be MET v9.1.3 and METplus v3.1.1.

@AlexanderRichert-NOAA

@DavidHuber-NOAA yeah that shouldn't be a problem. Can you file an issue under spack-stack so we can track things there?

@DavidHuber-NOAA

@AlexanderRichert-NOAA installed a test environment for spack-stack on Hera:/scratch1/NCEPDEV/nems/Alexander.Richert/spack-stack-1.4.1-gw that was used to build UFS_Utils, GFS-Utils, GSI, GSI-Monitor, and GSI-utils. I then removed all of the module use/load statements from the jobs/rocoto/* scripts, updated the global-workflow modulefiles on Hera to point to Alex's build, ran a test C96/C48 case for 1.5 cycles, and compared the results against develop.

The half-cycle gdas and enkfgdas outputs are nearly identical, with just 7 data points differing across all variables at forecast hour 9 in the archived grib2 files when compared with grib_compare.
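
For reference, a check along these lines can be scripted; the sketch below assumes ecCodes' grib_compare is available on PATH and that the two archives use identical file names (the directory paths are hypothetical placeholders):

```python
"""Sketch of the grib2 comparison described above (paths are placeholders)."""
import subprocess
from pathlib import Path

CONTROL = Path("/path/to/develop/archive")     # hypothetical
TEST = Path("/path/to/spack-stack/archive")    # hypothetical

for ctrl_file in sorted(CONTROL.glob("*.grib2")):
    test_file = TEST / ctrl_file.name
    # grib_compare exits nonzero and reports the differing keys/values
    # when any field differs between the two files.
    result = subprocess.run(
        ["grib_compare", str(ctrl_file), str(test_file)],
        capture_output=True, text=True,
    )
    if result.returncode != 0:
        print(f"{ctrl_file.name}:\n{result.stdout}")
```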

The first place where differences become obvious is during the GDAS and GFS analyses on the first full cycle. Initial radiance penalties differ at the 12th decimal place. This is consistent with the regression test results seen previously in NOAA-EMC/GSI#589, which were tracked to CRTM optimization differences between spack-stack (compiled with RelWithDebInfo; -O2) and hpc-stack (Release; -O3). The differences in initial radiance values cause cascading differences between the two runs, so no further file comparisons were performed except to check that all files present in develop's $COMROT existed and were non-zero-sized in the spack-stack test, which they were.
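
The completeness check itself is easy to reproduce; here is a minimal sketch, again with placeholder paths for the two $COMROT directories:

```python
"""Verify every file under the develop $COMROT exists and is non-empty
under the spack-stack $COMROT (paths are placeholders)."""
import os
import sys

CONTROL = "/path/to/develop/COMROT"     # hypothetical
TEST = "/path/to/spack-stack/COMROT"    # hypothetical

missing, empty = [], []
for root, _, files in os.walk(CONTROL):
    for name in files:
        rel = os.path.relpath(os.path.join(root, name), CONTROL)
        candidate = os.path.join(TEST, rel)
        if not os.path.isfile(candidate):
            missing.append(rel)
        elif os.path.getsize(candidate) == 0:
            empty.append(rel)

print(f"{len(missing)} missing, {len(empty)} zero-sized")
sys.exit(1 if (missing or empty) else 0)
```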

Log files were then checked for errors/warnings and compared against develop. All warnings/errors were identical, with the exception of the half-cycle gdasarch job, which reported different errors when attempting to access HPSS. HPSS was down for maintenance yesterday, which explains these differences.

I will now move on to a C384/C192 test case.

@DavidHuber-NOAA

Also, I compared runtimes between develop and spack-stack. Almost every spack-stack job runs a little slower than its develop counterpart. I believe this can likely be traced to the optimization flags used in the libraries. Results are shown below.

| Task | Duration, spack-stack (s) | Duration, develop (s) |
| --- | --- | --- |
| gdasprep | 263.0 | 257.0 |
| gdasanal | 1127.0 | 1117.0 |
| gdassfcanl | 39.0 | 32.0 |
| gdasanalcalc | 65.0 | 58.0 |
| gdasanaldiag | 165.0 | 149.0 |
| gdasfcst | 296.0 | 290.0 |
| gdaspost_anl | 94.0 | 77.0 |
| gdaspost_f000 | 74.0 | 66.0 |
| gdaspost_f001 | 79.0 | 74.0 |
| gdaspost_f002 | 79.0 | 74.0 |
| gdaspost_f003 | 81.0 | 69.0 |
| gdaspost_f004 | 81.0 | 71.0 |
| gdaspost_f005 | 85.0 | 69.0 |
| gdaspost_f006 | 82.0 | 74.0 |
| gdaspost_f007 | 82.0 | 71.0 |
| gdaspost_f008 | 81.0 | 70.0 |
| gdaspost_f009 | 80.0 | 70.0 |
| gdasvrfy | 539.0 | 447.0 |
| gdasarch | 40.0 | 33.0 |
| enkfgdaseobs | 469.0 | 464.0 |
| enkfgdaseupd | 169.0 | 168.0 |
| enkfgdasechgres | 27.0 | 22.0 |
| enkfgdasediag | 131.0 | 116.0 |
| enkfgdasecen000 | 48.0 | 41.0 |
| enkfgdasecen001 | 45.0 | 37.0 |
| enkfgdasecen002 | 46.0 | 37.0 |
| enkfgdasesfc | 94.0 | 67.0 |
| enkfgdasefcs01 | 215.0 | 206.0 |
| enkfgdasepos000 | 37.0 | 25.0 |
| enkfgdasepos001 | 36.0 | 24.0 |
| enkfgdasepos002 | 41.0 | 24.0 |
| enkfgdasepos003 | 36.0 | 27.0 |
| enkfgdasepos004 | 37.0 | 23.0 |
| enkfgdasepos005 | 38.0 | 23.0 |
| enkfgdasepos006 | 37.0 | 23.0 |
| enkfgdasearc00 | 64.0 | 24.0 |
| gfsprep | 248.0 | 231.0 |
| gfsanal | 608.0 | 611.0 |
| gfssfcanl | 40.0 | 32.0 |
| gfsanalcalc | 71.0 | 58.0 |
| gfsfcst | 456.0 | 442.0 |
| gfspost_anl | 101.0 | 91.0 |
| gfspost_f000 | 74.0 | 62.0 |
| gfspost_f006 | 84.0 | 73.0 |
| gfspost_f012 | 81.0 | 70.0 |
| gfspost_f018 | 82.0 | 74.0 |
| gfspost_f024 | 82.0 | 73.0 |
| gfsvrfy | 168.0 | 157.0 |
| gfspostsnd | 118.0 | 70.0 |
| gfsgempak | 66.0 | 62.0 |
| gfsarch | 38.0 | 37.0 |
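
To summarize the table quickly, the task/spack/develop rows can be pasted into a plain-text file and reduced with a short script like this sketch (the file name is a placeholder):

```python
"""Summarize per-task timings: count slower tasks and the total difference."""

rows = []
with open("timings.txt") as fh:  # hypothetical file holding the table rows
    for line in fh:
        task, spack, develop = line.split()
        rows.append((task, float(spack), float(develop)))

slower = [task for task, s, d in rows if s > d]
total_spack = sum(s for _, s, _ in rows)
total_dev = sum(d for _, _, d in rows)
print(f"{len(slower)}/{len(rows)} tasks slower under spack-stack")
print(f"total: {total_spack:.0f} vs {total_dev:.0f} "
      f"(+{100 * (total_spack - total_dev) / total_dev:.1f}%)")
```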

@DavidHuber-NOAA

Two C384/C192 test cases were run: one with 2 members out to 1.5 cycles and another with 80 members out to 3.5 cycles. A control was also run with 2 members out to 1.5 cycles.

For the 2-member test, all jobs completed successfully. Log files were compared against the control and no new errors/warnings were generated. Additionally, file counts between the test and control archived products were identical.

For the 3.5-cycle test, one job initially failed (enkfgdaseupd on the 2nd full cycle) but then completed without intervention on the second attempt. enkf.x crashed with a segmentation fault (address not mapped to object at address 0x2effd2d3ef21) while attempting to create an empty increment netCDF file in gridio_gfs.f90. Since this only occurred once out of 6 calls to enkf.x (including the C96/C48 case), I believe this was a fluke.

Timing differences seemed to improve at C384, with the analyses and forecasts coming in about the same or a little faster in the spack-stack case (see below). The post jobs are quite a bit slower, but they are also relatively cheap jobs to begin with. This may improve if we switch to spack-stack/1.5.0, which compiled many libraries in Release mode (i.e., -O3 instead of -O2, and with fewer debug options).

| Task | Time, spack-stack (s) | Time, develop (s) |
| --- | --- | --- |
| gdasprep | 305.0 | 317.0 |
| gdasanal | 1398.0 | 1394.0 |
| gdassfcanl | 61.0 | 55.0 |
| gdasanalcalc | 201.0 | 172.0 |
| gdasanaldiag | 178.0 | 185.0 |
| gdasfcst | 769.0 | 776.0 |
| gdaspost_anl | 229.0 | 199.0 |
| gdaspost_f000 | 96.0 | 83.0 |
| gdaspost_f001 | 106.0 | 89.0 |
| gdaspost_f002 | 99.0 | 87.0 |
| gdaspost_f003 | 105.0 | 88.0 |
| gdaspost_f004 | 102.0 | 85.0 |
| gdaspost_f005 | 103.0 | 92.0 |
| gdaspost_f006 | 100.0 | 90.0 |
| gdaspost_f007 | 103.0 | 85.0 |
| gdaspost_f008 | 103.0 | 87.0 |
| gdaspost_f009 | 105.0 | 86.0 |
| gdasvrfy | 543.0 | 463.0 |
| gdasfit2obs | 11.0 | 8.0 |
| gdasarch | 26.0 | 19.0 |
| enkfgdaseobs | 465.0 | 466.0 |
| enkfgdaseupd | 370.0 | 366.0 |
| enkfgdasechgres | 85.0 | 83.0 |
| enkfgdasediag | 185.0 | 168.0 |
| enkfgdasecen000 | 152.0 | 148.0 |
| enkfgdasecen001 | 165.0 | 145.0 |
| enkfgdasecen002 | 170.0 | 153.0 |
| enkfgdasesfc | 110.0 | 102.0 |
| enkfgdasefcs01 | 1060.0 | 1078.0 |
| enkfgdasepos000 | 163.0 | 142.0 |
| enkfgdasepos001 | 167.0 | 145.0 |
| enkfgdasepos002 | 164.0 | 142.0 |
| enkfgdasepos003 | 159.0 | 146.0 |
| enkfgdasepos004 | 155.0 | 144.0 |
| enkfgdasepos005 | 158.0 | 145.0 |
| enkfgdasepos006 | 158.0 | 143.0 |
| enkfgdasearc00 | 28.0 | 20.0 |
| gfsprep | 291.0 | 299.0 |
| gfsanal | 859.0 | 871.0 |
| gfssfcanl | 60.0 | 54.0 |
| gfsanalcalc | 189.0 | 162.0 |
| gfsfcst | 4096.0 | 4087.0 |
| gfspost_anl | 140.0 | 128.0 |
| gfspost_f000 | 92.0 | 88.0 |
| gfspost_f006 | 104.0 | 98.0 |
| gfspost_f012 | 99.0 | 88.0 |
| gfspost_f018 | 104.0 | 90.0 |
| gfspost_f024 | 103.0 | 91.0 |
| gfspost_f030 | 105.0 | 92.0 |
| gfspost_f036 | 104.0 | 89.0 |
| gfspost_f042 | 102.0 | 96.0 |
| gfspost_f048 | 105.0 | 103.0 |
| gfspost_f054 | 113.0 | 93.0 |
| gfspost_f060 | 103.0 | 91.0 |
| gfspost_f066 | 101.0 | 89.0 |
| gfspost_f072 | 104.0 | 90.0 |
| gfspost_f078 | 103.0 | 89.0 |
| gfspost_f084 | 108.0 | 92.0 |
| gfspost_f090 | 108.0 | 99.0 |
| gfspost_f096 | 103.0 | 89.0 |
| gfspost_f102 | 99.0 | 91.0 |
| gfspost_f108 | 106.0 | 97.0 |
| gfspost_f114 | 103.0 | 93.0 |
| gfspost_f120 | 103.0 | 93.0 |
| gfsvrfy | 834.0 | 789.0 |
| gfspostsnd | 265.0 | 225.0 |
| gfsgempak | 173.0 | 181.0 |
| gfsarch | 191.0 | 187.0 |

DavidHuber-NOAA added a commit to DavidHuber-NOAA/global-workflow that referenced this issue Nov 16, 2023
DavidHuber-NOAA added a commit to DavidHuber-NOAA/global-workflow that referenced this issue Nov 16, 2023
@DavidHuber-NOAA

I successfully ran 2 cycles on Hera, with the exception of the metplus and awips jobs. For awips, I opened issue NOAA-EMC/gfs-utils#33. For metplus, the initial failures were due to the undefined variable METPLUS_PATH and an old path for HOMEMET. After correcting these to point to the spack-stack installs of MET and METplus, the jobs failed with some cryptic Python errors:

11/16 18:07:42.025 metplus (met_util.py:109) INFO: Log file: /scratch1/NCEPDEV/stmp2/David.Huber/RUNDIRS/ss_151/metpg2g1.267696/grid2grid_step1/metplus_output/logs/ss_151/master_metplus_grid2grid_step1_pres_gatherbyVSDB_for20221109_runon20231116180742.log
11/16 18:07:42.054 metplus.StatAnalysis (met_util.py:215) ERROR: Fatal error occurred
Traceback (most recent call last):
  File "/scratch1/NCEPDEV/nems/role.epic/spack-stack/spack-stack-1.5.1/envs/unified-env/install/intel/2021.5.0/python-3.10.8-oqvn6sa/lib/python3.10/pkgutil.py", line 417, in get_importer
    importer = sys.path_importer_cache[path_item]
KeyError: PosixPath('/scratch1/NCEPDEV/nems/role.epic/spack-stack/spack-stack-1.5.1/envs/gsi-addon/install/intel/2021.5.0/metplus-3.1.1-uu37v6c/metplus/wrappers')

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/scratch1/NCEPDEV/nems/role.epic/spack-stack/spack-stack-1.5.1/envs/gsi-addon/install/intel/2021.5.0/metplus-3.1.1-uu37v6c/metplus/util/met_util.py", line 162, in run_metplus
    module = import_module(package_name)
  File "/scratch1/NCEPDEV/nems/role.epic/spack-stack/spack-stack-1.5.1/envs/unified-env/install/intel/2021.5.0/python-3.10.8-oqvn6sa/lib/python3.10/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1050, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
  File "<frozen importlib._bootstrap>", line 992, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "<frozen importlib._bootstrap>", line 1050, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 688, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 883, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "/scratch1/NCEPDEV/nems/role.epic/spack-stack/spack-stack-1.5.1/envs/gsi-addon/install/intel/2021.5.0/metplus-3.1.1-uu37v6c/metplus/wrappers/__init__.py", line 30, in <module>
    for (_, module_name, _) in iter_modules([package_dir]):
  File "/scratch1/NCEPDEV/nems/role.epic/spack-stack/spack-stack-1.5.1/envs/unified-env/install/intel/2021.5.0/python-3.10.8-oqvn6sa/lib/python3.10/pkgutil.py", line 129, in iter_modules
    for i in importers:
  File "/scratch1/NCEPDEV/nems/role.epic/spack-stack/spack-stack-1.5.1/envs/unified-env/install/intel/2021.5.0/python-3.10.8-oqvn6sa/lib/python3.10/pkgutil.py", line 421, in get_importer
    importer = path_hook(path_item)
  File "<frozen importlib._bootstrap_external>", line 1632, in path_hook_for_FileFinder
  File "<frozen importlib._bootstrap_external>", line 1504, in __init__
  File "<frozen importlib._bootstrap_external>", line 182, in _path_isabs
AttributeError: 'PosixPath' object has no attribute 'startswith'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/scratch1/NCEPDEV/nems/role.epic/spack-stack/spack-stack-1.5.1/envs/gsi-addon/install/intel/2021.5.0/metplus-3.1.1-uu37v6c/metplus/util/met_util.py", line 171, in run_metplus
    raise NameError("There was a problem loading %s wrapper." % item)
NameError: There was a problem loading StatAnalysis wrapper.

The log continues from there, repeating the same traceback. This suggests to me that this version of METplus (3.1.1) is not compatible with the spack-stack Python version (3.10.8) and/or one of the Python packages installed with spack-stack. So, unfortunately, I don't think I will be able to get verif-global to work with spack-stack unless @malloryprow has an idea on how to fix this. In the meantime, I am going to set `DO_METP="NO"` in config.base.emc.dyn by default, and users will then need to run verif-global offline.
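
For what it's worth, here is a minimal repro consistent with the traceback (an assumption on my part; I have not verified it against METplus itself): METplus 3.1.1 appears to pass pathlib.PosixPath objects into pkgutil.iter_modules(), and Python 3.10's FileFinder validates path entries with str.startswith(), which PosixPath does not implement.

```python
"""Minimal repro, assuming this mirrors what metplus/wrappers/__init__.py does."""
import pkgutil
from pathlib import Path

# Reportedly fine under Python 3.8; under 3.10 this raises
# AttributeError: 'PosixPath' object has no attribute 'startswith'
# because FileFinder validates the path entry with str operations.
list(pkgutil.iter_modules([Path("/tmp")]))
```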

@malloryprow commented Nov 16, 2023

On first look, I can't say I know how to fix it. Could there be some difference in the Python versions? It looks like this is using python-3.10.8, while the Python version I'm using for EMC_verif-global on WCOSS2 is 3.8.6.

DavidHuber-NOAA added a commit to DavidHuber-NOAA/global-workflow that referenced this issue Nov 17, 2023
DavidHuber-NOAA added a commit to DavidHuber-NOAA/global-workflow that referenced this issue Nov 17, 2023
DavidHuber-NOAA added a commit to DavidHuber-NOAA/global-workflow that referenced this issue Nov 20, 2023
DavidHuber-NOAA added a commit to DavidHuber-NOAA/global-workflow that referenced this issue Nov 20, 2023
DavidHuber-NOAA added a commit to DavidHuber-NOAA/global-workflow that referenced this issue Nov 21, 2023
DavidHuber-NOAA added a commit to DavidHuber-NOAA/global-workflow that referenced this issue Nov 21, 2023
This points to the head of develop on NOAA-EMC which supports spack-stack.
DavidHuber-NOAA added a commit to DavidHuber-NOAA/global-workflow that referenced this issue Nov 21, 2023
@malloryprow

@WalterKolczynski-NOAA It seems the required METplus version for EMC_verif-global is not compatible with the spack-stack python version.

On WCOSS2, EMC_verif-global is using 3.8.6 with no issues. It looks like spack-stack is using 3.10.8.

@WalterKolczynski-NOAA

Okay, we're going to need to figure out the MET issue, likely in a separate issue.

I did a little poking around, but we're going to need more info before figuring out how to proceed:

  • Do MET and MET+ have ctests?
    • Did they pass when they were installed in spack-stack?
  • How difficult would it be to install a more recent version of MET in the stack?
  • How much change would be needed to the verification scripts to move to a more recent version of MET?

The first step after collecting this info may be asking the MET team if they can point us in the direction of what may be wrong. I'm sure if there are changes needed in MET, they won't want to be working on something two versions old. That said, I don't know that Python has had any changes that would break PosixPath or imports. I suspect some configuration error.

At any rate, if the MET and MET+ in the stack aren't working, they shouldn't stay in the stack broken. Whether that involves fixing them and/or replacing them with different versions is the question.

@malloryprow

The issue is coming from METplus.

EMC_verif-global is using these older versions of MET and METplus; honestly, I'd call EMC_verif-global pretty much frozen code, with EVS nearing code hand-off to NCO. I don't think it makes sense to upgrade to newer versions, as that would be a big overhaul.

@malloryprow

I'll note it looks like the latest METplus version is using Python 3.10.4.
https://github.com/dtcenter/METplus/blob/main_v5.1/environment.yml

@malloryprow

I don't know anything about spack-stack, but maybe it is a configuration error like Walter mentioned?

Looking at this part

Traceback (most recent call last):

File "/scratch1/NCEPDEV/nems/role.epic/spack-stack/spack-stack-1.5.1/envs/unified-env/install/intel/2021.5.0/python-3.10.8-oqvn6sa/lib/python3.10/pkgutil.py", line 417, in get_importer
importer = sys.path_importer_cache[path_item]
KeyError: PosixPath('/scratch1/NCEPDEV/nems/role.epic/spack-stack/spack-stack-1.5.1/envs/gsi-addon/install/intel/2021.5.0/metplus-3.1.1-uu37v6c/metplus/wrappers')

The first line has /scratch1/NCEPDEV/nems/role.epic/spack-stack/spack-stack-1.5.1/envs/unified-env, but the second line has /scratch1/NCEPDEV/nems/role.epic/spack-stack/spack-stack-1.5.1/envs/gsi-addon.

@DavidHuber-NOAA

@malloryprow The gsi-addon environment is built on top of the unified-env environment, so when the gsi-addon environment is loaded, you get access to both. There are only a few packages in the gsi-addon environment, primarily met/9.1.3, metplus/3.1.1, bufr/11.7.0, and gsi-ncdiag/1.1.2. There could still be a configuration issue elsewhere, though.

@malloryprow

Ah got it, okay! Thanks @DavidHuber-NOAA!

@malloryprow commented Nov 28, 2023

I did a test of METplus (outside of EMC_verif-global) with /scratch1/NCEPDEV/nems/role.epic/spack-stack/spack-stack-1.5.1/envs/gsi-addon/install/intel/2021.5.0/metplus-3.1.1-uu37v6c vs. /scratch1/NCEPDEV/nems/role.epic/spack-stack/spack-stack-1.5.1/envs/unified-env/install/intel/2021.5.0/metplus-5.1.0-n3vysib.

/scratch1/NCEPDEV/nems/role.epic/spack-stack/spack-stack-1.5.1/envs/gsi-addon/install/intel/2021.5.0/metplus-3.1.1-uu37v6c threw errors similar to those described above.

/scratch1/NCEPDEV/nems/role.epic/spack-stack/spack-stack-1.5.1/envs/unified-env/install/intel/2021.5.0/metplus-5.1.0-n3vysib worked, though.

@DavidHuber-NOAA

I opened issue #2091 to continue tracking the spack-stack/metplus issue.

@malloryprow

Great! Let me know if there is anything I can help with. My development for EVS is all done.

DavidHuber-NOAA added a commit to DavidHuber-NOAA/global-workflow that referenced this issue Nov 30, 2023
@RussTreadon-NOAA commented Nov 30, 2023

@DavidHuber-NOAA , GDASApp PR #774 adds a Hercules build capability to GDASApp (UFS, aka JEDI, DA). GDASApp has been built on Hercules. All test_gdasapp ctests pass. We should update the GDASApp hash after #774 enters GDASApp develop.

I realize that this issue is spack-stack-specific, not Hercules-specific. GDASApp develop already builds with spack-stack 1.5.1 on Hera and Orion. This change entered GDASApp develop at c92597b. GDASApp PR #774 will extend the GDASApp spack-stack 1.5.1 build to Hercules.

@DavidHuber-NOAA

Sounds good, thanks for the news @RussTreadon-NOAA!

DavidHuber-NOAA added a commit to DavidHuber-NOAA/global-workflow that referenced this issue Nov 30, 2023
@DavidHuber-NOAA

During WCOSS2 testing, it became apparent that the hacks for the efcs, fcst, and post jobs are still needed on WCOSS2, at least until gsi-ncdiag/1.1.2 is installed in /apps/test/hpc-stack/i-19.1.3.304__m-8.1.12__h-1.14.0__n-4.9.2__p-2.5.10__e-8.4.2, the stack used by the UFS. Once it is installed, the GSI, GSI-Utils, GSI-Monitor, UFS_Utils, and GFS-Utils repositories will need to be upgraded on that system.

DavidHuber-NOAA added a commit to DavidHuber-NOAA/global-workflow that referenced this issue Nov 30, 2023
DavidHuber-NOAA added a commit to DavidHuber-NOAA/global-workflow that referenced this issue Nov 30, 2023