
Fix LI L2 accumulated products 'with_area_definition': False 1-d coordinates computation #2804

Merged
merged 7 commits into pytroll:main on Jun 4, 2024

Conversation

ameraner
Member

This PR fixes a bug in the li_l2_nc reader when requesting accumulated products as 1-d arrays (by using reader_kwargs={'with_area_definition': False}). Note that it is in any case recommended to request accumulated products as 2-d arrays using reader_kwargs={'with_area_definition': True} (as discussed in #2789).

The bug was due to a missing *= -1 on the azimuth geos angle coordinate (used inside the 1-d coordinates calculation), needed to account for a convention difference between MTG data and Proj: MTG defines azimuth as positive towards west, while Proj expects it positive towards east. Without this fix, the coordinates (particularly the longitude) are computed with completely wrong values.
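As a rough illustration of this convention difference (a standalone sketch, not the reader code; the angle values, point height and projection parameters below are made-up placeholders):

import numpy as np
from pyproj import Proj

# Placeholder azimuth/elevation scanning angles in radians (not real LI data)
azimuth = np.array([0.02, -0.01])
elevation = np.array([0.05, 0.04])
point_height = 35786400.0  # assumed geostationary point height in metres

# Scanning angles scaled by the point height give geos projection coordinates in metres
x = azimuth * point_height
y = elevation * point_height

# MTG defines azimuth as positive towards west, while Proj expects it positive towards east,
# hence the sign flip added by this PR
x *= -1

# Invert the geos projection to obtain longitudes/latitudes (projection parameters assumed)
geos_proj = Proj(proj="geos", h=point_height, lon_0=0.0, sweep="y", ellps="WGS84")
lon, lat = geos_proj(x, y, inverse=True)
print(lon, lat)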

This PR fixes this issue and adds a test checking the consistency between the computed 1-d coordinates and the corresponding coordinates extracted from the 2-d arrays with AreaDefinition. Before this fix, this test would have failed.
It also updates the test file contents for more realistic tests.

  • Tests added

Checking the reader on a real-life case, comparing the resampled 1-d arrays and the 2-d arrays:

Before this PR, the lightning data is at completely wrong locations (left: resampled 1-d arrays, right: 2-d):
image

After this PR, the results are comparable (left: resampled 1-d arrays, right: 2-d):
image

Corresponding code:

import os
from glob import glob

import matplotlib
import matplotlib.pyplot as plt
import numpy as np

from satpy import Scene
from satpy.utils import debug_on
from satpy.writers import get_enhanced_image

debug_on()
matplotlib.use("TkAgg")

ll_bbox = [33, 13, 60, 27]

# initialise SEVIRI Scene as background imagery
sevscn = Scene(
    filenames=['/tcenas/fbf/MSG/in/MSG3/OPE3/SEVI-MSG15/MSG3-SEVI-MSG15-0100-NA-20240501121242.333000000Z-NA.nat'],
    reader='seviri_l1b_native')
sevscn.load(['natural_color'])
sevscn_r = sevscn.resample('mtg_fci_fdss_2km', resampler='nearest')
sevscn_r = sevscn_r.crop(ll_bbox=ll_bbox)
seviri_img = np.moveaxis(get_enhanced_image(sevscn_r['natural_color']).data.values, 0, -1)

# initialise LI L2 acc product scene as 1-d array output
path_to_testdata = '/tcenas/scratch/andream/lil2a/'
scn = Scene(filenames=glob(os.path.join(path_to_testdata, '*AF*---*T_0073_0001.nc')), reader='li_l2_nc',
            reader_kwargs={'with_area_definition': False})
scn.load(['flash_accumulation'])
# resample 1-d array onto grid
scn_r = scn.resample('mtg_fci_fdss_2km', resampler='bucket_avg')
scn_r_c = scn_r.crop(ll_bbox=ll_bbox)

# plot resampled 1-d arrays on top of SEVIRI
crs = scn_r_c["flash_accumulation"].attrs['area'].to_cartopy_crs()
fig, axs = plt.subplots(1, 2, subplot_kw={'projection': crs})
ax = axs[0]
ax.coastlines(resolution='10m')
ax.imshow(seviri_img, transform=crs, extent=crs.bounds, origin='upper',
          interpolation='none')
ax.imshow(scn_r_c["flash_accumulation"], transform=crs, extent=crs.bounds, origin='upper',
          interpolation='none')

# initialise LI L2 acc product scene as 2-d array output (recommended, does not require resampling)
scn_adef = Scene(filenames=glob(os.path.join(path_to_testdata, '*AF*---*T_0073_0001.nc')), reader='li_l2_nc',
                 reader_kwargs={'with_area_definition': True})
scn_adef.load(['flash_accumulation'], upper_right_corner='NE')
scn_adef_c = scn_adef.crop(ll_bbox=ll_bbox)

# plot 2-d array on top of SEVIRI
crs = scn_adef_c["flash_accumulation"].attrs['area'].to_cartopy_crs()
ax = axs[1]
ax.coastlines(resolution='10m')
ax.imshow(seviri_img, transform=crs, extent=crs.bounds, origin='upper',
          interpolation='none')
ax.imshow(scn_adef_c["flash_accumulation"], transform=crs, extent=crs.bounds, origin='upper',
          interpolation='none')

plt.show(block=False)

@pnuu pnuu (Member) left a comment


LGTM, just one question.

@@ -371,6 +371,9 @@ def inverse_projection(self, azimuth, elevation, proj_dict):
        azimuth = azimuth.values * point_height
        elevation = elevation.values * point_height

        # In the MTG world, azimuth is defined as positive towards west, while proj expects it positive towards east
        azimuth *= -1
pnuu (Member) commented:

I was going to comment that in-place multiplication with Dask arrays is dangerous, but it seems the array is already computed (azimuth.values is accessed). Is that necessary for some further computations?
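(As a general illustration of the in-place concern, a standalone sketch not taken from the reader: a function mapped over dask chunks is safer when it returns a new array instead of mutating the input block; whether that matters here depends on whether the input is already an eagerly computed copy, as discussed below.)

import dask.array as da
import numpy as np

def negate_inplace(block):
    # Mutating the input chunk in place is risky: dask may reuse that buffer elsewhere
    block *= -1
    return block

def negate_copy(block):
    # Returning a new array leaves the original chunk untouched
    return block * -1

arr = da.from_array(np.arange(4.0), chunks=2)
print(da.map_blocks(negate_copy, arr).compute())  # [-0. -1. -2. -3.]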

ameraner (Member, Author) replied:

That self.inverse_projection method is called inside a da.map_blocks:

# Daskify inverse projection computation:
lon, lat = da.map_blocks(self.inverse_projection, azimuth, elevation, proj_dict,
                         chunks=(2, azimuth.shape[0]),
                         meta=np.array((), dtype=azimuth.dtype),
                         dtype=azimuth.dtype,
                         )

so I believe it should remain daskified. IIRC, we needed to get the .values inside there because the following Proj call accepts only computed values.

pnuu (Member) replied:

I would have thought da.map_blocks() would pass Numpy arrays to the method being used 🤔 Haven't used it in a while, so I might remember incorrectly. And if it works, then it must be correct, because Numpy arrays wouldn't have a .values attribute and that would cause a crash 😅
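(For reference, a quick standalone check of that recollection, not from the reader code: for dask-array arguments, da.map_blocks does pass plain NumPy blocks to the mapped function.)

import dask.array as da
import numpy as np

def report_block_type(block):
    # Each chunk arrives as a plain numpy.ndarray, which has no .values attribute
    assert isinstance(block, np.ndarray)
    return block

da.map_blocks(report_block_type, da.zeros((4,), chunks=2)).compute()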


codecov bot commented May 22, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 95.95%. Comparing base (3b9c04e) to head (d29268c).
Report is 502 commits behind head on main.

Additional details and impacted files
@@           Coverage Diff           @@
##             main    #2804   +/-   ##
=======================================
  Coverage   95.95%   95.95%           
=======================================
  Files         379      379           
  Lines       53888    53908   +20     
=======================================
+ Hits        51708    51728   +20     
  Misses       2180     2180           
Flag            Coverage Δ
behaviourtests  4.09% <0.00%> (-0.01%) ⬇️
unittests       96.05% <100.00%> (+<0.01%) ⬆️

Flags with carried forward coverage won't be shown.


@coveralls

Pull Request Test Coverage Report for Build 9189835061

Details

  • 21 of 21 (100.0%) changed or added relevant lines in 4 files are covered.
  • No unchanged relevant lines lost coverage.
  • Overall coverage increased (+0.002%) to 96.034%

Totals Coverage Status
Change from base Build 9157044280: 0.002%
Covered Lines: 51600
Relevant Lines: 53731

💛 - Coveralls

@ameraner ameraner added this to the v0.49.0 milestone May 22, 2024
@mraspaud mraspaud merged commit be628a4 into pytroll:main Jun 4, 2024
18 of 19 checks passed