Create an issue for the problem handling timestamps in rhdf5, and migrate the relevant comments from last week's issues so as to preserve (and review) our work. See Handling timestamps in .h5 files with rhdf5 #241.
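For background on the timestamp problem: HDF5 has no native datetime type, so writers typically store timestamps as raw integers, which readers like rhdf5 then hand back unconverted. A minimal Python sketch of the conversion, assuming timestamps are stored as int64 nanoseconds since the Unix epoch (the convention pandas/PyTables uses; we should verify this against our actual .h5 files):

```python
from datetime import datetime, timedelta, timezone

EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)

def ns_to_datetime(ns):
    """Convert int64 nanoseconds since the Unix epoch to a datetime.

    Integer arithmetic (rather than dividing by 1e9) avoids float
    rounding on large nanosecond counts.
    """
    return EPOCH + timedelta(microseconds=ns // 1000)

# Example: 2020-01-01 00:00:00 UTC expressed as nanoseconds
ts = 1577836800 * 10**9
print(ns_to_datetime(ts))  # 2020-01-01 00:00:00+00:00
```

The same arithmetic applies on the R side (e.g. `as.POSIXct(ns / 1e9, origin = "1970-01-01", tz = "UTC")`), whatever the final rhdf5 fix looks like.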
Introduce the next goal for data mining: LZS (we may have to add this to the hsp2 simulation as a tracked state variable; we shall see).
Review basic models in vahydro -- primarily Intake and channel/impoundment.
Mapping in R: @jdkleiner, when you think this is ready for prime time, let's schedule a 30+ minute tutorial. Ideally, the three of us will come up with a goal for a "fact sheet" type Rmd that does a map, a hydro analysis, and a withdrawal summary table (using components from VWP where applicable).
Running demo of hsp2: I already have 2 demo datasets: landseg and riverseg. These demos use the basic HSP2 command line tools and are simple and straightforward. However, the install is not yet globally usable; I still have to enable that.
Project Management/Medium-Term Challenges
Ultimately, we will run hsp2 models via the same command line tools as the cbp hspf models, not the standalone hsp2 commands (demoed above), since we will enable the cbp tools to use either hsp2 or hspf. However, because we have yet to integrate hsp2 into the cbp commands, and because these demos are good background, I think this is a useful pursuit.
Accessing data from the hdf5 database format: this is the central goal of the summer. There are 2 ways: R and cmd line tools (h5dump). R is easier, but there is the potential timestamp problem above; h5dump seems to work fine, but I don't know how to control its output formatting, so some development will be needed to make it usable.
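Until we find the right h5dump formatting flags, one stopgap is to post-process its default text output. A rough Python sketch of the idea -- the SAMPLE string mimics h5dump's default `DATA { ... }` layout, and that exact layout is an assumption to check against real output from our files:

```python
import re

# Hypothetical snippet in the shape of h5dump's default output
SAMPLE = """\
HDF5 "demo.h5" {
DATASET "/RESULTS/flow" {
   DATATYPE  H5T_IEEE_F64LE
   DATASPACE  SIMPLE { ( 4 ) / ( 4 ) }
   DATA {
   (0): 1.5, 2.25, 3.75,
   (3): 4.125
   }
}
}
"""

def parse_h5dump_data(text):
    """Pull the numeric values out of an h5dump DATA { ... } block."""
    m = re.search(r"DATA\s*\{(.*?)\}", text, re.S)
    if not m:
        return []
    values = []
    for line in m.group(1).splitlines():
        # each data line looks like "(0): 1.5, 2.25, 3.75,"
        line = re.sub(r"^\s*\(\d+\):", "", line).strip().rstrip(",")
        if line:
            values.extend(float(v) for v in line.split(","))
    return values

print(parse_h5dump_data(SAMPLE))  # [1.5, 2.25, 3.75, 4.125]
```

If h5dump's `-y` / `-w` style options turn out to give clean enough output on their own, this kind of scraping may be unnecessary.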
File sizes: there already seem to be some immediate drawbacks to HSP2. The HDF5 data format, as used by HSP2, is absolutely enormous: one land segment/land-use element generates a gigabyte of data for the 35 year model run period. Note: there are between 35 and 50 land-use/land segment elements for every river segment in the model, so obviously this can't work as-is. We will need to look at different ways to address this -- maybe seeing if gzip can be used to compress, or maybe being very aggressive about cleanup after model runs -- that is, we export only those data components that we know we will need from the HDF5 file, then delete it to save space. Nothing to figure out today, but we need to keep this in mind so we can come up with an ideal solution.
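Worth noting: HDF5 also supports per-dataset gzip compression internally (both h5py and rhdf5 expose it), which may beat compressing whole files after the fact. For the whole-file route, a quick stdlib Python sketch of the idea -- the file here is a throwaway stand-in, not real HSP2 output:

```python
import gzip
import os
import shutil
import tempfile

# Throwaway file standing in for a large HDF5 output; repetitive
# numeric content compresses well, as model time series often do.
tmpdir = tempfile.mkdtemp()
raw = os.path.join(tmpdir, "landseg.h5")
with open(raw, "wb") as f:
    f.write(b"0.000000\n" * 100_000)

# gzip the file stream-to-stream, as `gzip landseg.h5` would
gz = raw + ".gz"
with open(raw, "rb") as fin, gzip.open(gz, "wb") as fout:
    shutil.copyfileobj(fin, fout)

raw_size = os.path.getsize(raw)
gz_size = os.path.getsize(gz)
print(raw_size, gz_size)  # compressed copy is far smaller

shutil.rmtree(tmpdir)  # cleanup
```

Real HSP2 output is binary floating-point data, so the ratio will be far less dramatic than this toy case; measuring it on an actual run output is the next step.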