Add GOES-R GLM L2 Gridded product reader and small ABI L1b changes #854

Merged: 18 commits, Dec 11, 2019
1 change: 1 addition & 0 deletions AUTHORS.md
@@ -11,6 +11,7 @@ The following people have made contributions to this project:
- [Suyash Behera (Suyash458)](https://github.com/Suyash458)
- [Andrew Brooks (howff)](https://github.com/howff)
- Guido della Bruna - meteoswiss
- [Eric Bruning (deeplycloudy)](https://github.com/deeplycloudy)
- [Lorenzo Clementi (loreclem)](https://github.com/loreclem)
- [Colin Duff (ColinDuff)](https://github.com/ColinDuff)
- [Radar, Satellite and Nowcasting Division (meteoswiss-mdr)](https://github.com/meteoswiss-mdr)
3 changes: 3 additions & 0 deletions doc/source/index.rst
@@ -221,6 +221,9 @@ the base Satpy installation.
* - GEO-KOMPSAT-2 AMI L1B data in NetCDF4 format
- `ami_l1b`
- Beta
* - GOES-R GLM Gridded Level 2 in NetCDF4 format
- `glm_l2`
- Beta


Indices and tables
8 changes: 8 additions & 0 deletions satpy/etc/composites/glm.yaml
@@ -0,0 +1,8 @@
sensor_name: visir/glm
composites:
C14_flash_extent_density:
compositor: !!python/name:satpy.composites.BackgroundCompositor
standard_name: c14_flash_extent_density
prerequisites:
- flash_extent_density
- C14
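The composite above layers GLM flash extent density on top of ABI channel 14 brightness temperatures via the BackgroundCompositor. A minimal usage sketch, assuming one ABI L1b C14 file and one glmtools gridded GLM file are on disk; both filenames below are placeholders patterned after the examples in this PR:

```python
from satpy import Scene

# Placeholder filenames patterned after ABI L1b and gridded GLM products.
scn = Scene(filenames={
    'abi_l1b': ['OR_ABI-L1b-RadC-M6C14_G16_s20191920000000_e20191920001000_c20191920001300.nc'],
    'glm_l2': ['OR_GLM-L2-GLMC-M3_G16_s20191920000000_e20191920001000_c20191920001380.nc'],
})
scn.load(['C14_flash_extent_density'])
# Resample both inputs to a shared grid so the composite can be generated.
new_scn = scn.resample(scn.min_area())
new_scn.save_datasets()
```

The BackgroundCompositor draws the first prerequisite (flash_extent_density) over the second (C14), so colorized lightning pixels sit on top of the infrared background.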
19 changes: 19 additions & 0 deletions satpy/etc/enhancements/glm.yaml
@@ -0,0 +1,19 @@
enhancements:
flash_extent_density:
name: flash_extent_density
operations:
- name: colorize
method: !!python/name:satpy.enhancements.colorize
kwargs:
palettes:
- {colors: ylorrd, min_value: 0, max_value: 20}
# Requires C14 from ABI
c14_flash_extent_density:
standard_name: c14_flash_extent_density
operations:
- name: stretch
method: !!python/name:satpy.enhancements.stretch
kwargs:
stretch: crude
min_stretch: [0, 0, 0]
max_stretch: [1, 1, 1]
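Enhancements like the ones above are applied when a dataset is written out as an image, matched by the dataset's name or standard_name. A hedged sketch exercising the colorize recipe, using the example filename from this PR:

```python
from satpy import Scene

scn = Scene(reader='glm_l2',
            filenames=['OR_GLM-L2-GLMC-M3_G16_s20191920000000_e20191920001000_c20191920001380.nc'])
scn.load(['flash_extent_density'])
# Saving as an image triggers the YAML enhancement: values from 0 to 20
# are colorized with the 'ylorrd' palette.
scn.save_dataset('flash_extent_density', filename='flash_extent_density.png',
                 writer='simple_image')
```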
76 changes: 38 additions & 38 deletions satpy/etc/readers/abi_l2_nc.yaml

Large diffs are not rendered by default.

49 changes: 49 additions & 0 deletions satpy/etc/readers/glm_l2.yaml
@@ -0,0 +1,49 @@
reader:
name: glm_l2
short_name: GLM Level 2
long_name: GOES-R GLM Level 2
description: >
NetCDF4 reader for GOES-R series GLM data. Currently only gridded L2 files
output from `glmtools <https://github.com/deeplycloudy/glmtools>`_ are
supported.
sensors: [glm]
reader: !!python/name:satpy.readers.yaml_reader.FileYAMLReader
# file pattern keys to sort files by with 'satpy.readers.group_files'
group_keys: ['start_time', 'platform_shortname', 'scene_abbr']

# Typical filenames from Unidata THREDDS server:
# http://thredds.unidata.ucar.edu/thredds/catalog/satellite/goes/east/
# products/GeostationaryLightningMapper/CONUS/current/catalog.html
# OR_GLM-L2-GLMC-M3_G16_s20191920000000_e20191920001000_c20191920001380.nc

file_types:
glm_l2_imagery:
file_reader: !!python/name:satpy.readers.glm_l2.NCGriddedGLML2
file_patterns: ['{system_environment:2s}_{mission_id:3s}-L2-GLM{scene_abbr:s}-{scan_mode:2s}_{platform_shortname:3s}_s{start_time:%Y%j%H%M%S%f}_e{end_time:%Y%j%H%M%S%f}_c{creation_time:%Y%j%H%M%S%f}.nc']
# glm_l2_lcfa: support for ungridded LCFA files may be added later via glmtools

datasets:
flash_extent_density:
name: flash_extent_density
file_type: glm_l2_imagery
group_extent_density:
name: group_extent_density
file_type: glm_l2_imagery
flash_centroid_density:
name: flash_centroid_density
file_type: glm_l2_imagery
group_centroid_density:
name: group_centroid_density
file_type: glm_l2_imagery
average_flash_area:
name: average_flash_area
file_type: glm_l2_imagery
minimum_flash_area:
name: minimum_flash_area
file_type: glm_l2_imagery
average_group_area:
name: average_group_area
file_type: glm_l2_imagery
total_energy:
name: total_energy
file_type: glm_l2_imagery
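The group_keys entry at the top of this file feeds the group_files helper touched elsewhere in this PR. A sketch of grouping a day of gridded CONUS files into per-time-step batches; the glob pattern is a placeholder:

```python
from glob import glob
from satpy.readers import group_files

files = glob('OR_GLM-L2-GLMC-M3_G16_s2019192*.nc')  # placeholder pattern
# Files whose start_time (within the threshold, in seconds), platform_shortname,
# and scene_abbr all match end up in the same group.
for group in group_files(files, reader='glm_l2', time_threshold=60):
    print(group['glm_l2'])  # filenames belonging to one time step
```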
28 changes: 15 additions & 13 deletions satpy/readers/__init__.py
@@ -49,6 +49,8 @@


class TooManyResults(KeyError):
"""Special exception when one key maps to multiple items in the container."""

pass


@@ -258,17 +260,14 @@ def get_key(key, key_container, num_results=1, best=True,


class DatasetDict(dict):

"""Special dictionary object that can handle dict operations based on
dataset name, wavelength, or DatasetID.
"""Special dictionary object that can handle dict operations based on dataset name, wavelength, or DatasetID.

Note: Internal dictionary keys are `DatasetID` objects.
"""

def __init__(self, *args, **kwargs):
super(DatasetDict, self).__init__(*args, **kwargs)
"""

def keys(self, names=False, wavelengths=False):
"""Give currently contained keys."""
# sort keys so things are a little more deterministic (.keys() is not)
keys = sorted(super(DatasetDict, self).keys())
if names:
@@ -302,6 +301,7 @@ def __getitem__(self, item):
return super(DatasetDict, self).__getitem__(item)

def __getitem__(self, item):
"""Get item from container."""
try:
# short circuit - try to get the object without more work
return super(DatasetDict, self).__getitem__(item)
@@ -318,8 +318,7 @@ def get(self, key, default=None):
return super(DatasetDict, self).get(key, default)

def __setitem__(self, key, value):
"""Support assigning 'Dataset' objects or dictionaries of metadata.
"""
"""Support assigning 'Dataset' objects or dictionaries of metadata."""
d = value
if hasattr(value, 'attrs'):
# xarray.DataArray objects
@@ -369,13 +368,15 @@ def __contains__(self, item):
return super(DatasetDict, self).__contains__(item)

def __contains__(self, item):
"""Check if item exists in container."""
try:
key = self.get_key(item)
except KeyError:
return False
return super(DatasetDict, self).__contains__(key)

def __delitem__(self, key):
"""Delete item from container."""
try:
# short circuit - try to get the object without more work
return super(DatasetDict, self).__delitem__(key)
@@ -447,7 +448,7 @@ def group_files(files_to_sort, reader=None, time_threshold=10,
if group_keys is None:
group_keys = reader_instance.info.get('group_keys', ('start_time',))
file_keys = []
for filetype, filetype_info in reader_instance.sorted_filetype_items():
for _, filetype_info in reader_instance.sorted_filetype_items():
for f, file_info in reader_instance.filename_items_for_filetype(files_to_sort, filetype_info):
group_key = tuple(file_info.get(k) for k in group_keys)
file_keys.append((group_key, f))
@@ -484,7 +485,6 @@

def read_reader_config(config_files, loader=UnsafeLoader):
"""Read the reader `config_files` and return the info extracted."""

conf = {}
LOG.debug('Reading %s', str(config_files))
for config_file in config_files:
@@ -509,7 +509,7 @@ def load_reader(reader_configs, **reader_kwargs):


def configs_for_reader(reader=None, ppp_config_dir=None):
"""Generator of reader configuration files for one or more readers
"""Generate reader configuration files for one or more readers.

Args:
reader (Optional[str]): Yield configs only for this reader
@@ -719,8 +719,10 @@ def load_readers(filenames=None, reader=None, reader_kwargs=None,
LOG.debug(str(err))
continue

if readers_files:
loadables = reader_instance.select_files_from_pathnames(readers_files)
if not readers_files:
# we weren't given any files for this reader
continue
loadables = reader_instance.select_files_from_pathnames(readers_files)
if loadables:
reader_instance.create_filehandlers(loadables, fh_kwargs=reader_kwargs_without_filter)
reader_instances[reader_instance.name] = reader_instance
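Most of the edits above are docstring cleanups on DatasetDict; its flexible key matching is the behavior those docstrings describe. A small sketch, assuming DatasetID is importable from the satpy top level as it was at the time:

```python
import xarray as xr
from satpy import DatasetID
from satpy.readers import DatasetDict

d = DatasetDict()
arr = xr.DataArray([[250.0]],
                   attrs={'name': 'C14', 'wavelength': (10.8, 11.2, 11.6)})
d[DatasetID(name='C14', wavelength=11.2)] = arr
print(d['C14'].attrs['wavelength'])  # lookup by name
print(d[11.2].attrs['name'])         # lookup by wavelength in micrometers
```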
20 changes: 18 additions & 2 deletions satpy/readers/abi_base.py
@@ -53,10 +53,10 @@ def __init__(self, filename, filename_info, filetype_info):
mask_and_scale=False,
chunks={'lon': CHUNK_SIZE, 'lat': CHUNK_SIZE}, )

self.nc = self.nc.rename({'t': 'time'})
if 't' in self.nc.dims:
self.nc = self.nc.rename({'t': 'time'})
platform_shortname = filename_info['platform_shortname']
self.platform_name = PLATFORM_NAMES.get(platform_shortname)
self.sensor = 'abi'

if 'goes_imager_projection' in self.nc:
self.nlines = self.nc['y'].size
@@ -68,6 +68,11 @@ def __init__(self, filename, filename_info, filetype_info):

self.coords = {}

@property
def sensor(self):
"""Get sensor name for current file handler."""
return 'abi'

def __getitem__(self, item):
"""Wrap `self.nc[item]` for better floating point precision.

@@ -245,6 +250,17 @@ def end_time(self):
"""End time of the current file's observations."""
return datetime.strptime(self.nc.attrs['time_coverage_end'], '%Y-%m-%dT%H:%M:%S.%fZ')

def spatial_resolution_to_number(self):
"""Convert the 'spatial_resolution' global attribute to meters."""
res = self.nc.attrs['spatial_resolution'].split(' ')[0]
if res.endswith('km'):
res = int(float(res[:-2]) * 1000)
elif res.endswith('m'):
res = int(res[:-1])
else:
raise ValueError("Unexpected 'spatial_resolution' attribute '{}'".format(res))
return res

def __del__(self):
"""Close the NetCDF file that may still be open."""
try:
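spatial_resolution_to_number moved from the ABI L2 reader into this shared base class so the GLM reader can reuse it. A standalone sketch of the same parsing logic and the GOES-R style attribute values it expects:

```python
def res_to_meters(attr_value):
    """Parse a GOES-R style 'spatial_resolution' attribute into meters."""
    res = attr_value.split(' ')[0]
    if res.endswith('km'):
        return int(float(res[:-2]) * 1000)
    if res.endswith('m'):
        return int(res[:-1])
    raise ValueError("Unexpected 'spatial_resolution' attribute '{}'".format(res))

assert res_to_meters('2km at nadir') == 2000
assert res_to_meters('0.5km at nadir') == 500
assert res_to_meters('750m at nadir') == 750
```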
5 changes: 1 addition & 4 deletions satpy/readers/abi_l1b.py
@@ -57,10 +57,7 @@ def get_dataset(self, key, info):
res.attrs['units'] = '%'

res.attrs.update({'platform_name': self.platform_name,
'sensor': self.sensor,
'satellite_latitude': float(self['nominal_satellite_subpoint_lat']),
'satellite_longitude': float(self['nominal_satellite_subpoint_lon']),
'satellite_altitude': float(self['nominal_satellite_height']) * 1000.})
'sensor': self.sensor})

# Add orbital parameters
projection = self.nc["goes_imager_projection"]
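After this change, consumers should read satellite position from the orbital_parameters dictionary rather than the removed flat satellite_latitude/longitude/altitude attributes. A hedged sketch; the glob pattern is a placeholder:

```python
from glob import glob
from satpy import Scene

abi_files = glob('OR_ABI-L1b-RadC-M6C14_G16_*.nc')  # placeholder pattern
scn = Scene(reader='abi_l1b', filenames=abi_files)
scn.load(['C14'])
orb = scn['C14'].attrs['orbital_parameters']
print(orb['projection_longitude'], orb['satellite_nominal_longitude'])
```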
11 changes: 0 additions & 11 deletions satpy/readers/abi_l2_nc.py
@@ -73,17 +73,6 @@ def get_dataset(self, key, info):

return variable

def spatial_resolution_to_number(self):
"""Convert the 'spatial_resolution' global attribute to meters."""
res = self.nc.attrs['spatial_resolution'].split(' ')[0]
if res.endswith('km'):
res = int(float(res[:-2]) * 1000)
elif res.endswith('m'):
res = int(res[:-1])
else:
raise ValueError("Unexpected 'spatial_resolution' attribute '{}'".format(res))
return res

def available_datasets(self, configured_datasets=None):
"""Add resolution to configured datasets."""
for is_avail, ds_info in (configured_datasets or []):
117 changes: 117 additions & 0 deletions satpy/readers/glm_l2.py
@@ -0,0 +1,117 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# Copyright (c) 2019 Satpy developers
#
# This file is part of satpy.
#
# satpy is free software: you can redistribute it and/or modify it under the
# terms of the GNU General Public License as published by the Free Software
# Foundation, either version 3 of the License, or (at your option) any later
# version.
#
# satpy is distributed in the hope that it will be useful, but WITHOUT ANY
# WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR
# A PARTICULAR PURPOSE. See the GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License along with
# satpy. If not, see <http://www.gnu.org/licenses/>.
"""Geostationary Lightning Mapper reader for the Level 2 format from glmtools.

More information about `glmtools` and the files it produces can be found on
the project's GitHub repository:

https://github.com/deeplycloudy/glmtools

"""
import logging
from datetime import datetime

from satpy.readers.abi_base import NC_ABI_BASE

logger = logging.getLogger(__name__)

PLATFORM_NAMES = {
'G16': 'GOES-16',
'G17': 'GOES-17',
}

# TODO: add an NC_GLM_L2_LCFA(BaseFileHandler) for ungridded LCFA files via glmtools


class NCGriddedGLML2(NC_ABI_BASE):
"""File reader for individual GLM L2 NetCDF4 files."""

@property
def sensor(self):
"""Get sensor name for current file handler."""
return 'glm'

@property
def start_time(self):
"""Start time of the current file's observations."""
return datetime.strptime(self.nc.attrs['time_coverage_start'], '%Y-%m-%dT%H:%M:%SZ')

@property
def end_time(self):
"""End time of the current file's observations."""
return datetime.strptime(self.nc.attrs['time_coverage_end'], '%Y-%m-%dT%H:%M:%SZ')

def get_dataset(self, key, info):
"""Load a dataset."""
logger.debug('Reading in get_dataset %s.', key.name)
res = self[key.name]
res.attrs.update({'platform_name': self.platform_name,
'sensor': self.sensor})
res.attrs.update(self.filename_info)

# Add orbital parameters
projection = self.nc["goes_imager_projection"]
res.attrs['orbital_parameters'] = {
'projection_longitude': float(projection.attrs['longitude_of_projection_origin']),
'projection_latitude': float(projection.attrs['latitude_of_projection_origin']),
'projection_altitude': float(projection.attrs['perspective_point_height']),
'satellite_nominal_latitude': float(self['nominal_satellite_subpoint_lat']),
'satellite_nominal_longitude': float(self['nominal_satellite_subpoint_lon']),
# 'satellite_nominal_altitude': float(self['nominal_satellite_height']),
}

res.attrs.update(key.to_dict())
# remove attributes that could be confusing later
res.attrs.pop('_FillValue', None)
res.attrs.pop('scale_factor', None)
res.attrs.pop('add_offset', None)
res.attrs.pop('_Unsigned', None)
res.attrs.pop('ancillary_variables', None) # Can't currently load DQF
# add in information from the filename that may be useful to the user
# for key in ('observation_type', 'scene_abbr', 'scan_mode', 'platform_shortname'):
for attr in ('scene_abbr', 'scan_mode', 'platform_shortname'):
res.attrs[attr] = self.filename_info[attr]
# copy global attributes to metadata
for attr in ('scene_id', 'orbital_slot', 'instrument_ID', 'production_site', 'timeline_ID'):
res.attrs[attr] = self.nc.attrs.get(attr)
return res

def available_datasets(self, configured_datasets=None):
"""Check actual Add information to configured datasets."""
# we know the actual resolution
res = self.spatial_resolution_to_number()

# update previously configured datasets
for is_avail, ds_info in (configured_datasets or []):
# some other file handler knows how to load this
# don't override what they've done
if is_avail is not None:
yield is_avail, ds_info

matches = self.file_type_matches(ds_info['file_type'])
if matches and ds_info.get('resolution') != res:
# we are meant to handle this dataset (file type matches)
# and the information we can provide isn't available yet
new_info = ds_info.copy()
new_info['resolution'] = res
exists = ds_info['name'] in self.nc
yield exists, new_info
elif is_avail is None:
# we don't know what to do with this
# see if another future file handler does
yield is_avail, ds_info
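Taken together, this hook means a Scene only advertises the variables actually present in the given gridded file, each carrying the resolution parsed from the file's 'spatial_resolution' attribute. A sketch using the example filename from the reader YAML:

```python
from satpy import Scene

scn = Scene(reader='glm_l2',
            filenames=['OR_GLM-L2-GLMC-M3_G16_s20191920000000_e20191920001000_c20191920001380.nc'])
print(scn.available_dataset_names())  # only variables present in this file
scn.load(['flash_extent_density'])
print(scn['flash_extent_density'].attrs['sensor'])  # 'glm'
```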