LAI explosion? #844

Open
rgknox opened this issue Mar 10, 2022 · 9 comments
rgknox (Contributor) commented Mar 10, 2022

It's possible that the issue we are seeing in PR #800 is related to how SLA changes during demotion. This is also relevant to #828.

To summarize the issue, I suspect that when a large plant with lots of leaf area is demoted from the canopy to the understory (which is possible with both stochastic and rank-ordered demotion), we will encounter an "LAI explosion". Note that in our default parameter file, several of our sla_max values are 10x larger than sla_top. Our algorithm assumes that when a plant is demoted to a lower layer, it is suddenly "under" all of those other leaves, which pushes its mean SLA from sla_top toward sla_max, a many-fold increase in SLA.
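For illustration, here is a minimal Python sketch of the effect (not the FATES tree_lai() code): it assumes SLA rises exponentially with the leaf area above the cohort and is capped at sla_max, and recomputes leaf area for the same leaf carbon at two canopy positions. Parameter values and function names are made up for the example.

```python
# Hypothetical sketch (not the FATES tree_lai() code): illustrate how a
# depth-dependent SLA profile can inflate leaf area when a cohort is demoted.
# Assumes SLA increases exponentially with the leaf area index above the
# cohort (lai_above) and saturates at sla_max; parameter values are made up.
import math

def mean_sla(lai_above, sla_top=0.012, sla_max=0.12, k=0.5):
    """Mean SLA (m2/gC) for a cohort sitting under `lai_above` of other leaves."""
    return min(sla_max, sla_top * math.exp(k * lai_above))

leaf_carbon_per_crown_area = 400.0  # gC per m2 of crown area, hypothetical

# Canopy position: nothing above, so the mean SLA stays near sla_top.
lai_canopy = leaf_carbon_per_crown_area * mean_sla(lai_above=0.0)

# After demotion: suddenly "under" ~5 units of LAI, SLA jumps toward sla_max.
lai_understory = leaf_carbon_per_crown_area * mean_sla(lai_above=5.0)

print(lai_canopy, lai_understory)  # ~4.8 vs ~48: the "LAI explosion"
```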

Perhaps we should modify our scheme so that plants are not influenced by canopy position, and SLA depth scaling is instead based more on height?

@ckoven @glemieux @rosiealice @aswann @kovenock

Apologies if I'm re-hashing an old issue we've debated before!

jkshuman (Contributor):
@JoshuaRady this sounds similar to the issue you were describing to me.

rgknox (Contributor, Author) commented Mar 10, 2022

I was thinking the same thing.

JoshuaRady:
Yes, this seems like the issue I'm currently working with. When my trees get large (approaching 50 cm) and one gets demoted, the per-tree LAI goes from a reasonable value of ~5 to close to 30, with VAI exceeding 30. This causes this error to trip. I'm seeing demotion both from filling of the upper canopy layer and from fates_comp_excln-type demotion.

I have been having trouble understanding what is happening in tree_lai() and had interpreted the LAI jump as being due to a crown area contraction, which didn't really make sense given that the spread factor was already maxed out, or nearly so. I had not considered the SLA profile.

For me this is happening in particular under RCP 8.5 late-century forcing. I suspect this is because the foliage is not trimming much under this forcing and the trees are getting bigger faster. Moreover, my PFT's allometry has increasing foliage density (leaf biomass per crown area) with size, which exacerbates the problem.

I was looking at changing fates_vai_width_increase_factor to make room for more VAI, since this is rare, but I am not sure about sensible limits on this value.
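For reference, a small sketch of how a width-increase factor stretches the leaf-level grid, under the assumption that each successive bin is wider than the previous one by that factor. The values below are illustrative, not the FATES defaults.

```python
# Hypothetical sketch of how a width-increase factor stretches the VAI grid.
# Assumes the first bin has width `dinc_vai` and each subsequent bin is wider
# by `width_increase_factor`; values are illustrative, not the FATES defaults.
def total_vai_covered(nlevleaf=30, dinc_vai=1.0, width_increase_factor=1.0):
    widths = [dinc_vai * width_increase_factor**i for i in range(nlevleaf)]
    return sum(widths)

print(total_vai_covered(width_increase_factor=1.0))   # 30.0  (uniform bins)
print(total_vai_covered(width_increase_factor=1.05))  # ~66.4 (room for more VAI)
```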

rgknox (Contributor, Author) commented Mar 10, 2022

One solution would be to let the cohorts maintain their own lai_above tendency. Instead of inheriting the LAI above instantaneously from the plant's canopy position and using that to drive the SLA calculations, the lai_above term would be slowly pushed in the direction of the LAI in the canopy layer above.

JoshuaRady:
That makes some sense. The lack of any subdominant canopy position in the current implementation is on display here. Something that would give demoted cohorts time to respond to their change in position would be good.

rgknox (Contributor, Author) commented Mar 10, 2022

This might be a nice use of the new running mean feature. The cohort-level tracked leaf area above could follow an exponential moving average, and we could vary the window length (days) to test sensitivity.
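A minimal sketch of the idea (hypothetical helper names, not the FATES running-mean module): the remembered lai_above is relaxed toward the instantaneous value with an exponential moving average whose effective window length is a tunable number of days.

```python
# Minimal sketch (not the FATES running-mean module): relax the cohort's
# remembered lai_above toward the instantaneous value with an exponential
# moving average whose window length (days) is adjustable.
def update_lai_above_memory(lai_above_mem, lai_above_now, dt_days=1.0, window_days=30.0):
    """One EMA step; a shorter window forgets the old canopy position faster."""
    alpha = min(1.0, dt_days / window_days)
    return (1.0 - alpha) * lai_above_mem + alpha * lai_above_now

# e.g. a freshly demoted cohort: memory drifts from 0 toward 5 over the window
mem = 0.0
for day in range(30):
    mem = update_lai_above_memory(mem, lai_above_now=5.0)
print(round(mem, 2))  # ~3.19 after 30 days with a 30-day window
```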

ckoven (Contributor) commented Mar 10, 2022

In principle, I don't think it should matter very much if we were to add a memory feature like you describe, because demoted cohorts tend to get absorbed very quickly into whatever cohort of approximately the same size is already there in the understory canopy position, which will therefore have trimmed its leaves to be in carbon balance at that position. So I feel like this is most likely mainly a model stability problem of what to do in these weird transient edge cases?

But even still, that this is happening at all points to potential parameter issues -- if a plant has an LAI ~ 40, then that probably means some combination of the plant's crown area being too small, its leaf biomass allocation being too large, and/or its SLA being too large. If I remember correctly, I think we decided to leave this as a crasher in the code (rather than just ignoring all leaf area greater than some maximum valid value) specifically because if it is happening, then it is probably symptomatic of other problems in a simulation.

@JoshuaRady in your case, particularly since you are using a larger exponent on leaf biomass than on crown area, it is less surprising that this might happen. So some combination of either changing the parameters (particularly decreasing slamax) or else adding more LAI bins, widening the LAI bin width, or using an exponential grid might be needed to handle the high LAIs if they are intended.

rgknox (Contributor, Author) commented Mar 10, 2022

I think it's a win-win to have this memory. First, it seems more consistent with what would happen in nature: a tree is a manifestation of its environment, and if its environment changes drastically, it doesn't immediately forget its past; it can't snap its fingers (twigs?) and change its SLA. The other win is that if this reduces our maximum LAIs, it simultaneously makes the model more robust and allows us to use the array space to capture LAI with higher granularity.

mpaiao (Contributor) commented Mar 11, 2022

@rgknox I like the memory idea. Presumably this could be linked with the leaf turnover rates, so new leaves would be produced with the new SLA, but existing leaves would keep the pre-demotion values.
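A sketch of that suggestion under simplifying assumptions (hypothetical helper, not FATES code): each day a fraction of the leaf pool turns over and is rebuilt with the SLA of the new canopy position, so the cohort-mean SLA relaxes toward the post-demotion value at roughly the leaf turnover rate.

```python
# Sketch under simplifying assumptions: each day a fraction of leaf carbon
# turns over and is replaced by leaves built with the SLA appropriate to the
# new canopy position, so the cohort-mean SLA relaxes toward the
# post-demotion target at roughly the leaf turnover rate.
def relax_mean_sla(sla_mean, sla_target, leaf_turnover_per_day):
    """Turnover-weighted update of the cohort-mean SLA (hypothetical helper)."""
    return (1.0 - leaf_turnover_per_day) * sla_mean + leaf_turnover_per_day * sla_target

sla = 0.012                       # pre-demotion mean SLA (sla_top-like value)
for day in range(365):
    sla = relax_mean_sla(sla, sla_target=0.12, leaf_turnover_per_day=1.0 / 365.0)
print(round(sla, 3))              # ~0.08 after one year with a 1/yr turnover rate
```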
