WCOSS2 Migration and Porting #398
Comments
@Hang-Lei-NOAA @junwang-noaa @MichaelLueken-NOAA @HelinWei-NOAA @YaliMao-NOAA @JiayiPeng-NOAA @jack-woollen @ShelleyMelchior-NOAA

All,

This is an epic issue in global-workflow to document the port of the GFS components to WCOSS2. If not already done, please open WCOSS2 port issues in your respective repositories and reference this issue. If your authoritative repo is still on VLab, please comment in this issue with info on your VLab issues. I will add links to the relevant issues in the table above.

If you are listed in the table above you are a contact for your component, but we understand you may not necessarily be the person doing the work. Please let me know if a different contact should be listed, thanks!

I will be working on the workflow side of the port and will be looking for component updates so we can test the full GFS systems (both GFSv16 and GFSv17+) on WCOSS2. We will need both the production versions used by GFSv16.1.2 (soon-to-be v16.1.3) and the develop/master branches ported to WCOSS2. As able, tests of ported copies should also be performed on tier 1 NOAA platforms (WCOSS-Dell, Hera, Orion, Jet) to check that R&D support is not broken. EIB can assist with testing obsproc on non-WCOSS platforms.

Thank you all! |
Hi, All,
The hpc-stack v1.2.0 has been installed on WCOSS2 Acorn for testing, and some users have already tested their code against it.
Please load it as:
module load PrgEnv-intel/8.1.0
module load intel/19.1.3.304
module load craype/2.7.8
module load cray-mpich/8.1.7
module use
/lfs/h1/emc/nceplibs/noscrub/hpc-stack/libs/hpc-stack/modulefiles/stack
module load hpc/1.2.0
module load hpc-intel/19.1.3.304
module load hpc-cray-mpich/8.1.7
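After running the loads above, a quick sanity check that the environment resolved to the EMC hpc-stack install is sketched below; the grep patterns are illustrative only and are not part of Hang's instructions.

```bash
# Confirm the hpc-stack metamodules and MPI stack are active.
# (Lmod writes module output to stderr, hence the 2>&1.)
module list 2>&1 | grep -E 'hpc/1.2.0|hpc-intel/19.1.3.304|hpc-cray-mpich/8.1.7'

# Spot-check that a library module resolves to the EMC hpc-stack tree
# rather than some other installation.
module show netcdf/4.7.4 2>&1 | grep -i 'noscrub/hpc-stack'
```

`module avail` then shows the following tree on Acorn: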
```
------------------------------------------------------------------------
 /lfs/h1/emc/nceplibs/noscrub/hpc-stack/libs/hpc-stack/modulefiles/mpi/intel/19.1.3.304/cray-mpich/8.1.7
------------------------------------------------------------------------
   atlas/ecmwf-0.24.1   cdo/1.9.8                         eckit/ecmwf-1.16.0
   esmf/8_1_1           esmf/8_2_0_beta_snapshot_14 (D)   fckit/ecmwf-0.9.2
   fms/2020.04.03       fms/2021.03 (D)                   hdf5/1.10.6 (D)
   madis/4.3            mapl/v2.7.3                       nccmp/1.8.9.0
   ncio/1.0.0           nemsio/2.5.2 (D)                  nemsiogfs/2.5.3 (D)
   netcdf/4.7.4 (D)     pio/2.5.2 (D)                     pio/2.5.3
   upp/10.0.8           upp/10.0.9 (D)                    w3emc/2.7.3
   w3emc/2.9.0 (D)      wgrib2/2.0.8 (D)                  wrf_io/1.2.0 (D)

------------------------------------------------------------------------
 /lfs/h1/emc/nceplibs/noscrub/hpc-stack/libs/hpc-stack/modulefiles/compiler/intel/19.1.3.304
------------------------------------------------------------------------
   bacio/2.4.1 (D)      bufr/11.5.0 (D)              crtm/2.3.0 (D)
   g2/3.4.2 (D)         g2/3.4.3                     g2c/1.6.4 (D)
   g2tmpl/1.10.0 (D)    gfsio/1.4.1 (D)              gftl-shared/v1.3.0
   grib_util/1.2.2 (D)  hpc-cray-mpich/8.1.7 (L)     ip/3.3.3 (D)
   ip2/1.1.2 (D)        jasper/2.0.22 (D)            jasper/2.0.25
   jpeg/9.1.0           landsfcutil/2.4.1 (D)        png/1.6.35
   prod_util/1.2.2 (D)  sfcio/1.4.1 (D)              sigio/2.3.2 (D)
   sp/2.3.3 (D)         szip/2.1.1                   udunits/2.2.28 (D)
   w3nco/2.4.1 (D)      wgrib2/2.0.8ip               wgrib2/2.0.8ip2
   yafyaml/v0.5.1       zlib/1.2.11 (D)

------------------------------------------------------------------------
 /lfs/h1/emc/nceplibs/noscrub/hpc-stack/libs/hpc-stack/modulefiles/core
------------------------------------------------------------------------
   boost-headers/1.68.0   cmakemodules/v1.2.0           ecbuild/ecmwf-3.6.1
   eigen/3.3.7            esma_cmake/v3.4.3             hpc-intel/19.1.3.304 (L)
   hpc-python/3.8.6       json-schema-validator/2.1.0   json/3.9.1
   pybind11/2.5.0

------------------------------------------------------------------------
 /lfs/h1/emc/nceplibs/noscrub/hpc-stack/libs/hpc-stack/modulefiles/stack
------------------------------------------------------------------------
   hpc/1.2.0 (L)
```
Please note: this installation on Acorn is only for validating model builds on WCOSS2 machines. EMC is not allowed to install on Cactus and Dogwood; the NCO installation, based on hpc-stack and system-installed libraries, will be the only official one on those machines. In the meantime, we need to validate the models through this installation on Acorn.
Thanks,
Hang
|
Hi Hang,
Do we have the hpc-stack on Cactus?
If not, when can we expect it?
Thanks
Moorthi
|
EMC cannot install libraries on Cactus; NCO will install the official library software there, so please wait.
EMC has installed hpc-stack v1.2.0 on Acorn; please log in there and test.
|
@Hang-Lei-NOAA I don't have access to Acorn. Can the hpc-stack be installed on Cactus under someone's home directory? |
Hi George, this is not allowed by NCO/GDIT. We cannot do it until they agree to it.
|
@Hang-Lei-NOAA On Acorn, the environment path to w3emc is:
/lfs/h1/emc/nceplibs/noscrub/hpc-stack/libs/hpc-stack/intel/19.1.3.304/cray-mpich/8.1.4/w3emc/2.7.3/lib/libw3emc_4.a
The actual path to w3emc is:
/lfs/h1/emc/nceplibs/noscrub/hpc-stack/libs/hpc-stack/intel-19.1.3.304/cray-mpich-8.1.7/w3emc/2.7.3/lib/libw3emc_4.a
Please correct one of these paths so that they correspond with one another. |
@MichaelLueken-NOAA I tried to reproduce this and found that the module you loaded is NCO's copy, which has the wrong setting. You should block the NCO libs and use our hpc-stack installation:
```
> module show w3emc/2.7.3
---------------------------------------------------------------------------
   /apps/ops/prod/libs/modulefiles/mpi/intel/19.1.3.304/cray-mpich/8.1.4/w3emc/2.7.3.lua:
---------------------------------------------------------------------------
help([[]])
conflict("w3emc")
load("sigio","nemsio")
prereq("sigio","nemsio")
setenv("w3emc_ROOT","/lfs/h1/emc/nceplibs/noscrub/hpc-stack/libs/hpc-stack/intel/19.1.3.304/cray-mpich/8.1.4/w3emc/2.7.3")
setenv("w3emc_VERSION","2.7.3")
setenv("W3EMC_INC4","/lfs/h1/emc/nceplibs/noscrub/hpc-stack/libs/hpc-stack/intel/19.1.3.304/cray-mpich/8.1.4/w3emc/2.7.3/include_4")
setenv("W3EMC_INC8","/lfs/h1/emc/nceplibs/noscrub/hpc-stack/libs/hpc-stack/intel/19.1.3.304/cray-mpich/8.1.4/w3emc/2.7.3/include_8")
setenv("W3EMC_INCd","/lfs/h1/emc/nceplibs/noscrub/hpc-stack/libs/hpc-stack/intel/19.1.3.304/cray-mpich/8.1.4/w3emc/2.7.3/include_d")
setenv("W3EMC_LIB4","/lfs/h1/emc/nceplibs/noscrub/hpc-stack/libs/hpc-stack/intel/19.1.3.304/cray-mpich/8.1.4/w3emc/2.7.3/lib/libw3emc_4.a")
setenv("W3EMC_LIB8","/lfs/h1/emc/nceplibs/noscrub/hpc-stack/libs/hpc-stack/intel/19.1.3.304/cray-mpich/8.1.4/w3emc/2.7.3/lib/libw3emc_8.a")
setenv("W3EMC_LIBd","/lfs/h1/emc/nceplibs/noscrub/hpc-stack/libs/hpc-stack/intel/19.1.3.304/cray-mpich/8.1.4/w3emc/2.7.3/lib/libw3emc_d.a")
```
|
Thanks, Hang! I was able to compile the code using your hpc-stack v1.2.0. |
WCOSS2 transition spreadsheet shared by Steven Earle gives the GFS the following timeframe:
The noted "resource name" is @aerorahul |
@KateFriedman-NOAA I have created the new tag for the version that can be run on WCOSS2. |
@HelinWei-NOAA Great, thanks! Which WCOSS2 port does this tag support, the operational GFSv16 or the developmental GFSv17+? It looks like it could potentially support both, but we need to be careful about which stack is used: the production stack install for current ops systems, or the upcoming development hpc-stack install (not yet installed) for future ops versions. One question to help sort that out: does the GLDAS need parallel netcdf? If so, I believe you need to load the netcdf module that is unlocked by loading the hpc-cray-mpich module. |
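For reference, a sketch of what that load order looks like under the hpc-stack hierarchy listed earlier in this thread (versions taken from the Acorn listing; illustrative only, not a GLDAS-specific recipe):

```bash
module load hpc/1.2.0
module load hpc-intel/19.1.3.304
module load hpc-cray-mpich/8.1.7   # unlocks the MPI (cray-mpich) module tree
module load hdf5/1.10.6            # MPI-enabled hdf5 from that tree
module load netcdf/4.7.4           # parallel-enabled netcdf built against it
```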
To build GFSv16 (tag v16.0.6) run the following sequence of commands on Cactus:
Dusan |
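The command list referenced above is not reproduced here. As a purely illustrative sketch of building a GFSv16-era global-workflow tag on WCOSS2 (the repository URL and script names follow the usual global-workflow layout; the tag placeholder and the RUN_ENVIR/machine arguments are assumptions, not the commands Dusan posted):

```bash
# Hypothetical build walkthrough -- adjust the tag and arguments for your case.
git clone https://github.com/NOAA-EMC/global-workflow.git gfsv16
cd gfsv16
git checkout <gfs-v16-tag>      # the GFSv16 tag being ported

cd sorc
./checkout.sh                   # check out component repositories at their pinned versions
./build_all.sh                  # build all components with the WCOSS2 module files
./link_fv3gfs.sh nco wcoss2     # link fix files, executables, and parm files for WCOSS2
```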
@KateFriedman-NOAA The UPP WCOSS2 transition branch for GFSv16 is available; the GFSv16 offline post executable can be built with the script sorc/build_ncep_post.sh. |
@WenMeng-NOAA @DusanJovic-NOAA |
All, the parallel hdf5 and netcdf modules have been renamed on WCOSS2 to match the new hpc-stack naming convention found on other platforms. Please update your component builds to use the updated names, thanks! Before: Note: there are now two
The prior named module versions will be removed shortly. |
Added bufr sounding to the component list. @HuiyaChuang-NOAA @GuangPingLou-NOAA please provide a repo where the work will happen and an associated issue for tracking progress from this epic. Let me know if work is needed for codes that reside in global-workflow; I just started a PR with build changes for what we build in global-workflow for bufrsnd, see PR #444. In the meeting today I mentioned that we're currently building gfs_bufr and tocsbufr in global-workflow via build_gfs_bufrsnd.sh: https://github.com/KateFriedman-NOAA/global-workflow/blob/feature/ops-wcoss2/sorc/build_gfs_bufrsnd.sh |
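For anyone reproducing this locally, a minimal sketch of the invocation from a global-workflow clone (the script is expected to handle machine detection and module loading itself):

```bash
# Sketch: build the bufr sounding executables from a global-workflow clone.
cd global-workflow/sorc
./build_gfs_bufrsnd.sh        # builds gfs_bufr and tocsbufr, per the comment above
```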
Added GFS downstream AWIPS, bulletins, and GEMPAK to component table above. @BoiVuong-NOAA is the POC. |
Hi Kate,
I suppose I need to create a fork for working on the bufr sounding package,
right?
Is it correct that I should check out the branch of "feature/dev-wcoss2"?
Thanks,
Guang Ping
|
Finally got a fix from NCO for the problem I was experiencing. Tag verif_global_v2.9.0 from EMC_verif-global has support for WCOSS2!! |
Excellent! I'll test this new tag on WCOSS2 and update the relevant branches with it. |
The migration of the GFS/global-workflow to WCOSS2 is complete. Both the operational GFSv16 and the developmental GFSv17+ (develop) configurations have been ported. THANK YOU TO EVERYONE WHO HELPED WITH THESE PORTS! |
This epic will document the migration and porting of the global workflow and its components to WCOSS2.
The GFSv16 operational tags need to be ported to WCOSS2 along with the current state of the workflow (and its components).
Results are not expected to be bit-wise identical between the current WCOSS and WCOSS2, but the differences should be justified through a butterfly test.
Since GFSv16 did not use HPC-Stack, and WCOSS2 will use it, there may be differences in the versions of the libraries. It is acceptable to use the later (newer) versions of the libraries and utilities instead of those from the GFSv16 tag. These differences should be noted in the Issue description for the components.
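For the butterfly test mentioned above, one way to spot-check output differences between the WCOSS Dell P3 and WCOSS2 runs is to difference the history and product files directly. A sketch using the nccmp and wgrib2 utilities from the stack; the paths and file names below are placeholders, not the actual comparison procedure:

```bash
# Hypothetical comparison of one cycle's output from the two machines.
DELL_DIR=/path/to/dell_p3/gfs.20210801/00/atmos     # placeholder paths
W2_DIR=/path/to/wcoss2/gfs.20210801/00/atmos

# netCDF history files: compare data, force-continue on mismatch, report statistics
nccmp -d -f -S "$DELL_DIR/gfs.t00z.atmf024.nc" "$W2_DIR/gfs.t00z.atmf024.nc"

# GRIB2 products: dump per-record min/max/mean with the verbose inventory and diff them
wgrib2 "$DELL_DIR/gfs.t00z.pgrb2.0p25.f024" -V > dell.stats
wgrib2 "$W2_DIR/gfs.t00z.pgrb2.0p25.f024"   -V > wcoss2.stats
diff dell.stats wcoss2.stats | head
```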
Proposed here is a suggested approach for porting the GFSv16 tag to WCOSS2:

1. Port `develop`/`master` to WCOSS2 and perform any relevant testing.
2. Bring the changes from `develop`/`master` that are relevant to GFSv16 into the GFSv16 tag, then perform a butterfly test and compare output with WCOSS Dell P3.

List of issues to facilitate migration of global workflow and its components to WCOSS2.
Use `HDF5_ROOT` instead of the `HDF5` variable for HDF5 library paths during build. Only run at C192 so far (ok).