Error in calculating T_r and w #1097
An attempt to fix the T_r issue is now on the xbands branch. There one corrects for the fact that the radiation temperature is not estimated correctly, by solving for the T that would give the observed mean frequency. w is not corrected for the fact that j is also lower than expected. (Note that the xbands branch is up to date with dev.) Note that if you have a cell which has a high T and the frequency boundaries are not set to capture the peak of the distribution, then one can be very sensitive to photon statistics, since one only samples the Rayleigh-Jeans tail.
Not sure whether I've fully understood, but would this give the same answer for these two cases? a) a single power law with a single photon-generation band from 1e18 to 1e20; b) the same power law and the same generated photons, but with the band boundaries arbitrarily extended down to 1e16, so that no photons are actually generated between 1e16 and 1e18.
Almost certainly not, because if you are modelling a power law as a BB, the entire approach is wrong. But this statement applies both to the "old" approach and the "new" one.
I should add that I have now run the regression tests of "xbands" vs "dev". They do produce "statistical" changes in all of the spectra, though there is nothing major in terms of the overall appearance.
Well, yes, in terms of ion populations, but I'd like us to think about this a bit - we use t_r in other ways. For example, if the answer to my question is no, it means the simple ion level populations would change for an identical spectrum depending on a frequency boundary choice, and that seems like the wrong approach to me.
Yes ... "new" or "old", that would be true.
Maybe I do not understand the example. Are you saying that in both cases photons are actually only generated between 1e18 and 1e20, that no photons are actually generated between 1e16 and 1e18, and that the only difference between the two cases is that I have arbitrarily set the band boundaries to different values? If that is what you mean, then the two cases will produce a different result in the "new" approach, but the same result in the "old" approach, because the old approach only looks at the photons that were generated. I had assumed you were asking about the situation where in one case one was producing photons between 1e18 and 1e20 and in the other case between 1e16 and 1e20.
So I missed this comment, replying here instead of on the pull request. So yes, my thought-experiment example is what you say.
I really think the t_r should be the same regardless of the band boundaries in this specific case, because the photons generated are the same; otherwise the code is making assumptions about how to extrapolate / what is being modelled. I realise it is sometimes useful to have this new mode though, so could we have it as an option or something?
With the changes, t_r will be the same regardless of the band boundaries for a thermal spectrum, so I am not sure why you draw the conclusion that you do. If you use the old approach, you will not get to LTE for T_r when you should, because if you use the mean frequency without considering the band boundaries, you will overestimate T_r (typically) and underestimate w. This was in fact what drove me to make this change. I set up conditions that should lead to an intensity that should be in LTE (both according to my estimate and according to Cloudy), and I found that for these conditions I was getting a higher T_r than I expected and a lower w. With the correction, Python and Cloudy are consistent in that sense. To put it another way, if you want t_r to be constant regardless of the band boundaries, you have to consider the band boundaries, because the band boundaries determine the mean frequency of the photons that are generated.
Please note that, because I didn't want to change the calculation of t_r just before release, I've currently made the t_r calculation depend on a variable BAND_CORRECTED_TRAD in estimators_simple.c. Setting this to TRUE will turn on the band-corrected version Knox has implemented, but this should probably either depend on the ionization mode or on a runtime flag. See PR #1104 for the actual merge.
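For illustration, the switch described here might look something like the sketch below. Apart from BAND_CORRECTED_TRAD itself, every name is a placeholder (the two helper estimators are sketched after the issue text further down), not the actual estimators_simple.c code:

```c
#define TRUE 1
#define FALSE 0

/* Quoted from the comment above; set TRUE to enable the band-corrected T_r. */
#define BAND_CORRECTED_TRAD FALSE

/* Hypothetical helpers standing in for the two estimator paths. */
double trad_uncorrected (double nu_bar);
double trad_band_corrected (double nu_bar, double nu_min, double nu_max);

double
estimate_trad (double nu_bar, double nu_min, double nu_max)
{
  if (BAND_CORRECTED_TRAD)
    return trad_band_corrected (nu_bar, nu_min, nu_max);   /* new, band-aware path */
  return trad_uncorrected (nu_bar);                        /* old, full-Planck path */
}
```

As the comment says, a runtime flag or a dependence on the ionization mode would be the more flexible alternative to a compile-time constant.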
We are making an error in the calculation of weights and T_r in estimators_simple, because we do not take into account the fact that we do not generate photons over all wavelengths. Here is what we do:
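For concreteness, a minimal sketch of what the uncorrected estimators presumably amount to (the standard approach: invert the mean photon frequency assuming a full Planck spectrum, then get w from J and T_r). The names and constants are illustrative, not quotes from estimators_simple.c:

```c
#include <math.h>

#define H 6.6262e-27                /* Planck constant (cgs) */
#define BOLTZMANN 1.38062e-16       /* Boltzmann constant (cgs) */
#define STEFAN_BOLTZMANN 5.6696e-5  /* sigma (cgs) */
#define PI 3.14159265358979

/* For a full blackbody B_nu, <nu> = 3.832 kT/h, so invert the observed
 * mean frequency nu_bar to get the radiation temperature. */
double
trad_uncorrected (double nu_bar)
{
  return H * nu_bar / (3.832 * BOLTZMANN);
}

/* Dilution factor: J = w * sigma * T_r^4 / pi  =>  w = pi * J / (sigma * T_r^4). */
double
dilution_factor (double j_bar, double t_r)
{
  return PI * j_bar / (STEFAN_BOLTZMANN * pow (t_r, 4.));
}
```

The error described in this issue is that nu_bar is accumulated only over the photon-generation bands, while the 3.832 kT/h relation assumes the whole Planck function was sampled.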
We should make corrections to account for the frequency interval over which we are actually generating photons during the ionization cycles:
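A minimal sketch of the correction, along the lines of what the xbands comment describes: solve for the T whose band-limited Planck mean frequency matches the observed mean frequency. The midpoint-rule integration and bisection here are assumptions for illustration, not the actual implementation:

```c
#include <math.h>

#define H 6.6262e-27            /* Planck constant (cgs) */
#define BOLTZMANN 1.38062e-16   /* Boltzmann constant (cgs) */

/* Planck shape nu^3 / (exp(h nu / kT) - 1); constant prefactors cancel
 * in the mean-frequency ratio below. */
static double
bnu_shape (double nu, double t)
{
  return nu * nu * nu / (exp (H * nu / (BOLTZMANN * t)) - 1.0);
}

/* Band-limited mean frequency <nu> = int nu B_nu dnu / int B_nu dnu over
 * [nu1, nu2], evaluated on a uniform midpoint grid (dnu cancels in the ratio). */
static double
mean_freq_band (double t, double nu1, double nu2)
{
  const int n = 1000;
  double dnu = (nu2 - nu1) / n, num = 0.0, den = 0.0;
  for (int i = 0; i < n; i++)
    {
      double nu = nu1 + (i + 0.5) * dnu;
      double b = bnu_shape (nu, t);
      num += nu * b;
      den += b;
    }
  return num / den;
}

/* Bisect for the T whose band-limited mean frequency equals nu_bar_obs;
 * the band-limited mean frequency increases monotonically with T. */
double
trad_band_corrected (double nu_bar_obs, double nu1, double nu2)
{
  double tlo = 100.0, thi = 1.0e9, tmid = 0.0;
  for (int i = 0; i < 100; i++)
    {
      tmid = 0.5 * (tlo + thi);
      if (mean_freq_band (tmid, nu1, nu2) < nu_bar_obs)
        tlo = tmid;
      else
        thi = tmid;
    }
  return tmid;
}
```

For a band that safely brackets the Planck peak this reduces to the uncorrected answer, since the band-limited mean frequency then approaches 3.832 kT/h.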
As an example, for a very thin shell just above a 10^4 K star using the CV banding, we obtain a radiation temperature of 1.05e4 K, which, because the radiation temperature is raised to the 4th power, yields a w of 0.34 instead of the expected 0.5.
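To make the arithmetic explicit (assuming w is obtained from pi J / (sigma T_r^4), as in the sketch above), the 5% overestimate of T_r alone gives

```math
\frac{w_{\mathrm{meas}}}{w_{\mathrm{true}}} = \left(\frac{T_{\mathrm{true}}}{T_{\mathrm{meas}}}\right)^{4} = \left(\frac{1.00}{1.05}\right)^{4} \approx 0.82,
```

i.e. w ≈ 0.41 rather than 0.5; the further drop to the quoted 0.34 would be consistent with the band-limited J also coming out low, as noted in the comment above that j is likewise lower than expected.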
That might not matter too much, unless you have a banded range set up for radiation temperatures of 1e4 but end up with spectra which are actually characterized by a lower temperature. In these cases, the "measured" radiation temperature will remain fairly high, but the w's will drop below what they would be if the correct radiation temperature were calculated.
The solution evidently would be to account for the fact that we are only producing photons over a limited spectral range.
(Note that it is unclear whether this is important to correct; but it certainly arises in comparisons to routines like Cloudy, which tend to calculate thermal spectra over a wider range than we do in Python. One can also imagine that the bias can go in the other direction from that indicated in the example above: if our spectra do not go to high enough frequencies, we can underestimate the radiation T and overestimate w.)