Raster Workflow #82
```
create temp table tmp_prism_daily_01665500_precip_clipped as
\lo_export 10133143 '/tmp/prism_01665500_daily_clipped.tiff'
```
@rburghol @COBrogan This is my take on a workflow for our raster methods. At the moment, these steps download our three datasets, clip them to the buffered extent of the nldas2 raster, do our conversion to hourly data, resample nldas2 and prism to daymet's resolution, and then clip them to the actual watershed extent. I'm not sure how to feel about these rasters; obviously we can't just create more nldas2 data, but I thought resampling would give it at least the same extent as daymet, yet all we have is two squares. I will say, the pro of this method is that we have the full extent of daymet and prism, which last week we did not.
Hey @mwdunlap2004 thanks for working through this. I would say that your workflow is really close, and as to your final raster, this just suggests that we need to tweak something in the final steps, as it is clearly not getting the resample right. In other words, our proposed workflow will result in a daymet-style raster, but we've just got something a little out of order here.
By the way, that spatial variation in the daymet raster looks really stark in contrast to the other rasters, indicating the potential benefit -- we just have to figure out what went wrong in the processing to end up at 2 blocks. I'm going to step through your process now and get back to you with suggestions. This is timely as I am going to try to put this into the meta_model steps today.
@mwdunlap2004 these look really good - I should say that from my vantage point the code works as desired for the PRISM and daymet, successfully creating a 24-hour time series clipped to the boundary. My sense is that the issue you are having with the resampled NLDAS2 is due to clipping the nldas2 dataset while it is still at nldas2 resolution. In other words, you're starting with only 2 cells from the NLDAS2 data, and thus the only daymet info you'll get is from cells that overlap those two cells, so it is bound to look like 2 blocks (even if there is variation in sub-cells within those 2 large blocks). I am thinking that you will need to resample before you clip, or buffer the extent of the watershed before you clip (as @COBrogan suggested earlier when we were having a similar issue with PRISM). The same sequence tweak will also be required to ensure that the PRISM captures the full data extent. One thing that will help us understand where things are happening is topic headers before each of your code blocks, so that we know what you are trying to accomplish at each step. Note: given our growing understanding of the memory requirements needed to store the 24-hour daymet data, these resampling steps will need to be moved to the final export phase. If you want to explore the code needed to compute
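The "2 cells" point above can be sketched in a few lines of plain Python (purely illustrative, with made-up values -- not the project's actual GRASS/PostGIS code): once the clip has reduced NLDAS2 to two coarse cells, no amount of resampling can introduce new values.

```python
def resample_nearest(grid, factor):
    """Upsample a 2-D grid by an integer factor using nearest neighbour."""
    return [[grid[i // factor][j // factor]
             for j in range(len(grid[0]) * factor)]
            for i in range(len(grid) * factor)]

# Suppose the clip leaves only two coarse NLDAS2 cells (values are made up):
clipped_nldas2 = [[1.7, 1.2]]

# Resample to a 4x finer ("daymet-like") resolution:
fine = resample_nearest(clipped_nldas2, 4)

# The fine grid now has 4 x 8 cells, but still only two distinct values --
# i.e. the "two blocks" seen in the output raster.
distinct = sorted({v for row in fine for v in row})
print(distinct)  # -> [1.2, 1.7]
```

Resampling before clipping (or clipping with a buffered extent) keeps more coarse cells in play, so the fine grid can carry more than two values.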
@rburghol I resampled both prism and nldas2 to daymet's resolution in the step before clipping. Are you saying it might be better to try to do the resample and clipping steps together? Or are you saying resample should be one of our first steps instead of the last one?
@mwdunlap2004 I think you have options; either could work:
Since both of these should work, we just have to choose which is more efficient for us. But first we need to figure out why it's not yet working -- or determine why the theory is in error :)
@rburghol The output raster looks the same... As in, we still only have two squares, and the values for those two squares match what they did in the previous rasters. Not what we would have liked or expected to see. The colors look a little weird, but that's just because there are only two values; our previous rasters of hour 15 had around 1.7 as their value.
Thanks for adding in the headers! I think that makes this a bit easier to understand at a glance. I also agree with Rob. Something is clearly going wrong at the resampling stage. I think focusing in on the
Yearly example of our workflow:
Downloads our full nldas2 data for 1 year
Downloads our full prism data for 1 year
Downloads our full daymet data for 1 year
Reclasses our rasters, changing our value of -9999 to NULL for all of our datasets
Clips our nldas2 to our larger extent
Creates our daily precip for nldas2
Clips our daymet to the nldas2 extent
Clips our prism to the nldas2 extent
This creates our nldas2 fraction raster that we use for multiplication later
Multiplies our nldas2 fraction by our prism data
Multiplies daymet by our nldas2 fraction
Resamples prism and nldas2 to be at daymet's resolution
Clips all of our datasets to the correct watershed limits.
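As a sanity check on the fraction-raster steps above, the per-cell arithmetic can be sketched like this (plain Python with made-up values and illustrative names; the real workflow does this on rasters in the database):

```python
# One cell, one day: NLDAS2 hourly fractions redistribute a daily
# PRISM (or daymet) total through the 24 hours.
nldas2_hourly = [0.0] * 12 + [0.2, 0.5, 0.3] + [0.0] * 9  # 24 hourly values
prism_daily = 2.0                                          # daily total (mm)

daily_total = sum(nldas2_hourly)
if daily_total == 0:
    fractions = [0.0] * 24   # no NLDAS2 rain: guard against divide-by-zero
else:
    fractions = [h / daily_total for h in nldas2_hourly]

hourly_precip = [f * prism_daily for f in fractions]

# Water balance check: the hourly series sums back to the daily total.
print(sum(hourly_precip))  # -> 2.0
```

The divide-by-zero guard is the same choice being made in the `tmp_fraction_raster` step, which is why the NULL/zero handling there deserves a close look.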
@rburghol Overall our yearly workflow seems pretty good, but when I run the tmp_fraction_raster step (the one that creates our nldas2 fraction raster) I get a bunch of notices about null values. Of course, I set our null value to 0 for that step since we can't divide by zero, so that could be why it's telling me that. But I just wanted to mention this error so it's on our minds when we meet later today. The rasters I looked at during the process looked exactly like we would have expected! So assuming those errors are just informing us that there were numerous zeros in our process, then I'm not concerned, and I think this is a good method.
@mwdunlap2004 this is great news! I still don't have a clear sense of the best way to handle NULL in the fraction raster step, so that will be one area where we need to create a set of excellent case studies to validate the math and the water balance. I think maybe zooming in on a small watershed like the one you've been targeting would be a good plan.
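One illustrative example of why the NULL handling matters for the water balance (made-up single-cell numbers, not project code): if the fraction defaults to 0 wherever NLDAS2 is dry, any rain that PRISM sees on those days is silently dropped.

```python
# One cell, one day where the two datasets disagree (values are made up):
prism_daily = 3.0    # PRISM says 3 mm fell
nldas2_daily = 0.0   # NLDAS2 saw no rain, so every hourly fraction is 0

# With the divide-by-zero guard setting the fractions to 0:
fraction_sum = 0.0 if nldas2_daily == 0 else 1.0
recovered = prism_daily * fraction_sum

lost_water = prism_daily - recovered
print(lost_water)  # -> 3.0 (mm of PRISM rain missing from the water balance)
```

A case study on a small watershed would catch exactly this kind of discrepancy by comparing the summed hourly output against the daily totals.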
Just an FYI for when we get you the gage watershed geometries as WKT, you can read them in and plot them with:

```r
library(sf)
library(raster)

# Read in your well known text (WKT) using read.delim, read.csv, etc.
wktWatershed <- read.delim("poundCreek.txt", header = FALSE)

# Turn the WKT column into an sfc object with the appropriate coordinate system.
# I like to think of these as "raw" spatial files that don't have info besides geometries
watershed <- sf::st_as_sfc(wktWatershed$V1, crs = 4326)

# Turn the sfc into sf, giving it attributes and stuff we can reference easily in R
watershedSF <- sf::st_as_sf(watershed)

# Read in an arbitrary raster file:
nldas <- raster("http://deq1.bse.vt.edu:81/met/PRISM/2015/004/PRISM_ppt_stable_4kmD2_20150104_bil.bil_CBP.gtiff")

# Crop the raster to speed up plotting
watershedbbox <- st_as_sfc(sf::st_bbox(watershedSF))
watershedBuffer <- st_buffer(watershedbbox, 15000)
cropNLDAS <- raster::crop(nldas, st_as_sf(watershedBuffer))

# Create a plot, setting all margins to 2
par(mar = c(2, 2, 2, 2))
plot(cropNLDAS, axes = TRUE)
plot(watershedSF, add = TRUE, lwd = 2)
```
@COBrogan - we can pull the WKT in via REST as well; that may be the easiest way when showing an individual segment like you are demo'ing here.
Goals
- `wdm` workflow. #86
- `daymet_mod_daily`
- `daymet` and `PRISM`, based on the ratings file for that watershed and best fit method. See Raster Mashups #55
- `raster_templates` table for source

This is the order of steps for our workflow: this downloads all of our data, clips them to our new size, does our calculations on them, resamples to daymet's resolution, and then clips them to the correct watershed size.
Downloads our full nldas2 data for one day
Downloads our full prism data for one day
Downloads our full daymet data for one day
Reclasses our rasters, changing our value of -9999 to NULL for all of our datasets
Clips our nldas2 to our larger extent
Creates our daily precip for nldas2
Clips our daymet to the nldas2 extent
Clips our prism to the nldas2 extent
This creates our nldas2 fraction raster that we use for multiplication later
Multiplies our nldas2 fraction by our prism data
Multiplies daymet by our nldas2 fraction
Resamples prism and nldas2 to be at daymet's resolution
Clips all of our datasets to the correct watershed limits.