Fix LI L2 accumulated products 'with_area_definition': False 1-d coordinates computation #2804
Conversation
LGTM, just one question.
```diff
@@ -371,6 +371,9 @@ def inverse_projection(self, azimuth, elevation, proj_dict):
         azimuth = azimuth.values * point_height
         elevation = elevation.values * point_height
 
+        # In the MTG world, azimuth is defined as positive towards west, while proj expects it positive towards east
+        azimuth *= -1
```
I was going to comment that in-place multiplication with Dask arrays is dangerous, but it seems the array is already computed (`azimuth.values` is accessed). Is that necessary for some further computations?
That `self.inverse_projection` method is called inside a `da.map_blocks`:
satpy/satpy/readers/li_base_nc.py
Lines 339 to 344 in d29268c
```python
# Daskify inverse projection computation:
lon, lat = da.map_blocks(self.inverse_projection, azimuth, elevation, proj_dict,
                         chunks=(2, azimuth.shape[0]),
                         meta=np.array((), dtype=azimuth.dtype),
                         dtype=azimuth.dtype,
                         )
```
so I believe it should remain daskified. IIRC, we needed to get the `.values` inside there as the following `Proj` call accepts only computed values.
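For context, a minimal self-contained sketch of that pattern (illustrative only, not the satpy implementation): a per-block function converts scanning angles to metres on the geos projection plane and runs the inverse `Proj` transform on concrete values, while `da.map_blocks` keeps the whole computation lazy. The projection parameters and chunking below are made up.

```python
# Illustrative sketch only -- not the satpy implementation.
import dask.array as da
import numpy as np
from pyproj import Proj

def inverse_projection_block(azimuth, elevation, proj_dict):
    point_height = proj_dict["h"]
    # Scanning angles scaled to metres on the intermediate geos plane
    x = azimuth * point_height
    y = elevation * point_height
    lon, lat = Proj(proj_dict)(x, y, inverse=True)
    return np.stack([lon, lat])  # shape (2, block_length)

proj_dict = {"proj": "geos", "h": 35786400.0, "lon_0": 0.0,
             "a": 6378137.0, "rf": 298.257223563}
azimuth = da.from_array(np.linspace(-0.1, 0.1, 1000), chunks=250)
elevation = da.from_array(np.linspace(-0.05, 0.05, 1000), chunks=250)

lonlat = da.map_blocks(inverse_projection_block, azimuth, elevation, proj_dict,
                       chunks=(2, 250),  # each block yields a (2, 250) array
                       meta=np.array((), dtype=azimuth.dtype),
                       dtype=azimuth.dtype)
lon, lat = lonlat[0], lonlat[1]  # still lazy; compute() when needed
```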
I would have thought `da.map_blocks()` would pass Numpy arrays to the method being used 🤔 Haven't used it in a while so might remember incorrectly. And if it works, then it must be correct because Numpy arrays wouldn't have a `.values` attribute, which would cause a crash 😅
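For what it's worth, the plain dask-array case is easy to verify with a tiny sketch (illustrative, not from this PR); whether the same holds for the xarray-backed inputs used in the reader is the open question above.

```python
# Tiny check (not from the PR): with plain dask-array inputs, map_blocks hands
# the wrapped function NumPy blocks, which indeed have no `.values` attribute.
import dask.array as da
import numpy as np

def flip_sign(block):
    assert isinstance(block, np.ndarray)  # holds for plain dask-array inputs
    return block * -1

print(da.map_blocks(flip_sign, da.arange(6, chunks=3)).compute())
# [ 0 -1 -2 -3 -4 -5]
```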
Codecov Report
All modified and coverable lines are covered by tests ✅

@@           Coverage Diff           @@
##             main    #2804   +/-   ##
=======================================
  Coverage   95.95%   95.95%
=======================================
  Files         379      379
  Lines       53888    53908    +20
=======================================
+ Hits        51708    51728    +20
  Misses       2180     2180

Flags with carried forward coverage won't be shown.
☔ View full report in Codecov by Sentry.
Pull Request Test Coverage Report for Build 9189835061
💛 - Coveralls
This PR fixes a bug in the `li_l2_nc` reader when requesting accumulated products as 1-d arrays (by using `reader_kwargs={'with_area_definition': False}`). Note that it is nevertheless recommended to request accumulated products as 2-d arrays using `reader_kwargs={'with_area_definition': True}` (as discussed in #2789).

The bug was due to a missing `*= -1` on the azimuth geos angle coordinate (used inside the 1-d coordinates calculation), needed to account for a convention difference between MTG data and Proj. Without this fix, the coordinates (particularly the longitude) are computed with fully wrong values.

This PR fixes the issue and adds a test checking the consistency between the 1-d computed coordinates and the corresponding coordinates extracted from the 2-d arrays built with the `AreaDefinition`. Before this fix, that test would have failed. It also updates the test file contents for more realistic tests.
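As a rough numeric illustration of that convention difference (made-up values, not LI data or the reader code): the geos projection's x axis grows towards the east, while the MTG azimuth is positive towards the west, so without the sign flip a point ends up on the wrong side of the sub-satellite longitude.

```python
# Illustrative arithmetic only (made-up values, not LI data or the reader code).
import numpy as np
from pyproj import Proj

proj = Proj(proj="geos", h=35786400.0, lon_0=0.0, a=6378137.0, rf=298.257223563)
point_height = 35786400.0

azimuth = np.array([0.05])   # positive towards west in the MTG convention (radians)
elevation = np.array([0.0])

# Without the sign flip the point lands east of lon_0...
lon_wrong, _ = proj(azimuth * point_height, elevation * point_height, inverse=True)
# ...with the `*= -1` fix it lands west of lon_0, as it should
lon_fixed, _ = proj(-azimuth * point_height, elevation * point_height, inverse=True)
```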
Checking the reader on a real-life case, comparing the resampled 1-d arrays and the 2-d arrays:
Before this PR, the lightning data is at fully wrong locations (left: resampled 1-d arrays, right: 2-d):
After this PR, the results are comparable (left: resampled 1-d arrays, right: 2-d):
Corresponding code:
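(The original snippet is not reproduced here; below is a rough sketch of how such a comparison could be set up. The file path, dataset name, and target area are placeholders/assumptions, not taken from the PR.)

```python
# Rough sketch with assumed names -- not the snippet from this PR.
from satpy import Scene

files = ["<path to LI L2 accumulated product file>"]  # placeholder

# 1-d arrays: the (now fixed) lon/lat coordinates allow resampling
scn_1d = Scene(filenames=files, reader="li_l2_nc",
               reader_kwargs={"with_area_definition": False})
scn_1d.load(["accumulated_flash_area"])              # dataset name is an assumption
resampled_1d = scn_1d.resample("mtg_fci_fdss_2km")   # target area is an assumption

# 2-d arrays: delivered on the LI grid through the AreaDefinition
scn_2d = Scene(filenames=files, reader="li_l2_nc",
               reader_kwargs={"with_area_definition": True})
scn_2d.load(["accumulated_flash_area"])

# Visual comparison of the two variants
resampled_1d.show("accumulated_flash_area")
scn_2d.show("accumulated_flash_area")
```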