Updated Scripts to Add Meteorology Data set #29
Testing:
Test using new cbp exec framework. Successfully copies WDMs over.
Testing with single missing segment:
Testing with a full basin
Got up to date (@jdkleiner):
Update 6/2022
Document basic steps, with use examples, of the entire workflow here. This includes new iterations focused on a single land segment or a grouping (like sova, nova, ...), and is a condensed version of the more complete workflow given here: HARPgroup/HARParchive#62
All NLDAS2 scripts from download to WDM creation for a specific model met/prad scenario:
get_nldas_to_date
- Iterates through and retrieves all data available (see model_meteorology/sh/get_nldas_to_date):
cd /backup/meteorology/
get_nldas_to_date YYYY [ending jday=today]
get_nldas_to_date 2022
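Since the script takes one year at a time, a multi-year backfill can be wrapped in a simple loop; a minimal sketch (the year range here is illustrative):

```bash
# Backfill the full period of record one year at a time
cd /backup/meteorology/
for yr in $(seq 1984 2022); do
  get_nldas_to_date "$yr"
done
```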
get_nldas_data.bash
- (in model_meteorology/sh/get_nldas_data.bash ), run daily via /etc/cron.daily/deq-drought-model:
wget --load-cookies ~/.urs_cookies --auth-no-challenge=on --keep-session-cookies -np -r -NP -R "*.xml" -c -N --content-disposition https://hydro1.gesdisc.eosdis.nasa.gov/data/NLDAS/NLDAS_FORA0125_H.002/[YEAR]/[JULIAN DAY]
- Example, retrieving Julian day 001/ of 2002:
wget --load-cookies ~/.urs_cookies --auth-no-challenge=on --keep-session-cookies -np -r -NP -R "*.xml" -c -N --content-disposition https://hydro1.gesdisc.eosdis.nasa.gov/data/NLDAS/NLDAS_FORA0125_H.002/2002/001/
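The --load-cookies ~/.urs_cookies flag assumes NASA Earthdata credentials are already configured on the machine. The standard GES DISC one-time setup looks like this (YOUR_USER / YOUR_PASS are placeholders for your Earthdata login):

```bash
# One-time Earthdata auth setup assumed by the wget commands above
echo "machine urs.earthdata.nasa.gov login YOUR_USER password YOUR_PASS" >> ~/.netrc
chmod 0600 ~/.netrc        # netrc must not be world-readable
touch ~/.urs_cookies       # cookie jar that wget fills on first request
```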
bash /backup/meteorology/p5_g2a.bash 2020010100 2020123123 /backup/meteorology /backup/meteorology/out/grid_met_csv
bash /backup/meteorology/g2a_one.bash 2020010100 2020123123 /backup/meteorology /backup/meteorology/out/grid_met_csv x393y93
./p5_g2a_all
@alexwlowe - this streamlines p5_g2a.bash, using some logic to eliminate the duplication for the first year: just a single loop that doesn't care about the time frame - it can handle it.
./p5_g2a_all 19840101 20201231 /backup/meteorology /backup/meteorology/out/grid_met_csv
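A minimal sketch of the single-loop pattern described above (the loop body is a placeholder, and the date arithmetic assumes GNU date):

```bash
# Walk day by day from start to end with no first-year special case
start=19840101
end=20201231
d="$start"
while [ "$d" -le "$end" ]; do
  # ... convert this day's GRIB files to ASCII here ...
  d=$(date -d "$d + 1 day" +%Y%m%d)
done
```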
NLDAS2_GRIB_to_ASCII
- Runs once per year instead of multiple times per year.
grid2land.sh 1985010100 2020123123 /backup/meteorology /backup/meteorology/out/grid_met_csv A51031
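Covering multiple land segments is just a matter of repeating the call per segment; a sketch (the segment list is illustrative):

```bash
# Extract gridded met data onto several land segments
for seg in A51031 A51035 A51175; do
  grid2land.sh 1985010100 2020123123 /backup/meteorology /backup/meteorology/out/grid_met_csv "$seg"
done
```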
southern_a2l_timeframe.bash
a2l_one (see NLDAS2 bash scripts HARParchive#156):
a2l_one startYYYYMMDDHH endYYYYMMDDHH input_dir output_dir land_segment
/backup/meteorology/a2l_one 2020010100 2020123123 /backup/meteorology/out/grid_met_csv /backup/meteorology/out/lseg_csv A51035
LongTermAvgRNMax landseg_csv_file_path rnmax_file_output_path num_segs lseg1 lseg2 lseg3...
LongTermAvgRNMax /opt/model/p53/p532_alex/input/unformatted/nldas2/harp2021/1984010100-2020123123 /opt/model/p53/p532_alex/input/unformatted/nldas2/harp2021/RNMax 1 A51175
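Per the usage line, num_segs is followed by that many land segment IDs, so a multi-segment call would look like this (the three-segment list is illustrative):

```bash
# num_segs=3, followed by the three segment IDs
LongTermAvgRNMax /opt/model/p53/p532_alex/input/unformatted/nldas2/harp2021/1984010100-2020123123 \
  /opt/model/p53/p532_alex/input/unformatted/nldas2/harp2021/RNMax 3 A51031 A51035 A51175
```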
/backup/meteorology/wdm_generation_allLsegs.bash
wdm_generation_p5.bash
wdm_generation_p6.bash
wdm_pm_one
wdm_pm_one land_segment YYYYMMDDHH YYYYMMDDHH source version
wdm_pm_one A51031 1984010100 2020123123 nldas1221 harp2021
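The allLsegs / per-model wrappers above presumably drive wdm_pm_one over a list of segments; a minimal sketch of that pattern (lseg_list.txt, one segment ID per line, is an assumed input file):

```bash
# Generate WDMs for every segment listed in lseg_list.txt (hypothetical file)
while read -r seg; do
  wdm_pm_one "$seg" 1984010100 2020123123 nldas1221 harp2021
done < lseg_list.txt
```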
wdm_insert_ALL
- Expects a directory of text files for each met parameter to live in input/unformatted/[data_source]/[version], so wdm_pm_one copies the files from the met source into there (and creates those directories if they don't already exist).
- Writes WDMs to /backup/meteorology/out/lseg_wdm/, rather than the sub-directory of a code directory in p532_alex as wdm_pm_one above does.
- Copy WDMs to a model scenario with make_met_scenario.sh, which places them in /input/scenario/climate/met/[met scenario name]
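As a sketch, the staging step described above amounts to something like the following (the source glob and the data_source/version values are assumptions borrowed from the wdm_pm_one example):

```bash
# Stage met source files where wdm_insert_ALL expects to find them
data_source=nldas1221
version=harp2021
mkdir -p "input/unformatted/${data_source}/${version}"
cp /backup/meteorology/out/lseg_csv/* "input/unformatted/${data_source}/${version}/"
```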
wdm_flow_csv:
wdm_flow_csv [scenario] [riverseg] [start year] [end year]
cbp wdm_flow_csv CFBASE30Y20180615_vadeq JL1_6770_6850 1984 2020
Rscript $CBP_ROOT/run/export/wdm_export_flow.R [scenario] [landseg] [syear] [eyear] [CBP_EXPORT_DIR] [CBP_ROOT]
Rscript $CBP_ROOT/run/export/wdm_export_flow.R CFBASE30Y20180615_vadeq N51003 1984 2020 /media/model/p6 /opt/model/p6/gb604b
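Both exports batch naturally over segments; a sketch for the river-segment export (the second segment ID is purely illustrative):

```bash
# Export flow CSVs for a set of river segments
for rseg in JL1_6770_6850 JL2_6240_6520; do
  cbp wdm_flow_csv CFBASE30Y20180615_vadeq "$rseg" 1984 2020
done
```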
wdm_export_land_flow()
- Exports separate files for each flow component (111, 211, 411) for each land use in a land segment:
filename="/media/model/p6/out/land/$scenario/eos/${landseg}_0111-0211-0411.csv"
$i CBASE1808L55CY55R45P50R45P50Y
Met Scenario Creation Script for HSPF/CBP model
No longer relevant - this has been supplanted by wdm_pm_one:
/opt/model/p53/p532_alex/bin/make_met_scenario.sh
make_met_scenario.sh start end met_name prad_name nldas_dir model_dir
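For reference, an invocation matching the usage line would look like this (the argument values are assumptions borrowed from earlier examples):

```bash
# start end met_name prad_name nldas_dir model_dir
/opt/model/p53/p532_alex/bin/make_met_scenario.sh 1984010100 2020123123 \
  nldas1221 harp2021 /backup/meteorology /opt/model/p53/p532_alex
```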
Script Prototype
How to use nohup command in Linux:
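Any of the long-running steps above can be detached from the terminal with nohup so they survive logout; for example (the log file name is arbitrary):

```bash
# Run WDM generation immune to hangups, capturing all output to a log
nohup bash /backup/meteorology/wdm_generation_allLsegs.bash > wdm_gen.log 2>&1 &
tail -f wdm_gen.log   # Ctrl-C stops tail, not the background job
```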
note on the data models & code section of the project