Add fire blending to GOCART #2883

Open
wants to merge 91 commits into develop from feature/fire_blending
Commits
c8e6344
add bones to the fire blending in the prep_emissions job
bbakernoaa Sep 13, 2024
4d81cb8
pycodestyle
bbakernoaa Sep 13, 2024
5f86abd
pycodestyle fixes
bbakernoaa Sep 13, 2024
62dbf31
fix prep_emission jjob
bbakernoaa Sep 13, 2024
ae3f599
Merge branch 'develop' into feature/fire_blending
bbakernoaa Sep 13, 2024
16e21bd
Merge branch 'develop' into feature/fire_blending
aerorahul Sep 14, 2024
b22d7bc
Cleanup job for GEFS (#2919)
AntonMFernando-NOAA Sep 14, 2024
9965857
Update config.resources for bufr sounding job postsnd (#2917)
BoCui-NOAA Sep 16, 2024
5b57604
Update global atmos upp job to use COMIN/COMOUT (#2867)
mingshichen-noaa Sep 16, 2024
1f95b2b
Update to obsproc/v1.2.0 and prepobs/v1.1.0 (#2903)
KateFriedman-NOAA Sep 18, 2024
eb8bc78
Merge remote-tracking branch 'remotes/upstream/develop' into feature/…
bbakernoaa Sep 19, 2024
9a9a874
Merge branch 'develop' into feature/fire_blending
aerorahul Sep 25, 2024
3b87649
Update parm/prep/aero_emissions.yaml
bbakernoaa Oct 7, 2024
21d7bad
Update parm/prep/aero_emissions.yaml
bbakernoaa Oct 7, 2024
58e7280
Update parm/prep/aero_emissions.yaml
bbakernoaa Oct 7, 2024
f74f471
Update parm/prep/aero_emissions.yaml
bbakernoaa Oct 7, 2024
945508f
Update ush/python/pygfs/task/aero_emissions.py
bbakernoaa Oct 15, 2024
eb53925
add coarsen_scale to yaml and use in aero_emission.py
bbakernoaa Oct 15, 2024
55d7ada
remove debugging print statements
bbakernoaa Oct 15, 2024
4aec698
Update parm/prep/aero_emissions.yaml
bbakernoaa Oct 15, 2024
2d8155d
add init doc block
bbakernoaa Oct 15, 2024
389f9da
remove comment
bbakernoaa Oct 15, 2024
2875b6e
move syncing to initialize
bbakernoaa Oct 15, 2024
404131d
Merge remote-tracking branch 'refs/remotes/origin/feature/fire_blendi…
bbakernoaa Oct 15, 2024
ca87ed9
add loger info for this and error trapping
bbakernoaa Oct 15, 2024
9ff202f
add more changes
bbakernoaa Oct 15, 2024
55d5892
more changes, docstrings, error checking etc
bbakernoaa Oct 16, 2024
f853e4f
pycodestyle changes
bbakernoaa Oct 16, 2024
4dfba87
add blended emission option
bbakernoaa Oct 16, 2024
dc5c8bc
copy blending from COMOUT if AERO_EMIS_FIRE == blending
bbakernoaa Oct 16, 2024
dbc63e8
Update ush/forecast_postdet.sh
bbakernoaa Oct 17, 2024
a43e0c6
Update ush/forecast_postdet.sh
bbakernoaa Oct 17, 2024
6836bfc
Update ush/python/pygfs/task/aero_emissions.py
bbakernoaa Oct 17, 2024
4e30517
Update ush/python/pygfs/task/aero_emissions.py
bbakernoaa Oct 17, 2024
d0b20b4
changing resources
bbakernoaa Oct 17, 2024
1fa7df1
remove starthour
bbakernoaa Oct 17, 2024
d1dad35
Merge branch 'develop' into feature/fire_blending
bbakernoaa Oct 17, 2024
4a696b3
fix pycodestyle
bbakernoaa Oct 17, 2024
8d9a124
complete passing vars from yaml
bbakernoaa Oct 17, 2024
42da2da
pycodestyle again
bbakernoaa Oct 17, 2024
9753ddc
fix passing matching vars arrays from yaml file
bbakernoaa Oct 17, 2024
84edf69
more changes to fix issues
bbakernoaa Oct 17, 2024
9583817
Update ush/python/pygfs/task/aero_emissions.py
bbakernoaa Oct 17, 2024
ec4f462
Update ush/python/pygfs/task/aero_emissions.py
bbakernoaa Oct 17, 2024
65572fa
Update ush/python/pygfs/task/aero_emissions.py
bbakernoaa Oct 17, 2024
ffb8664
add changes to copy the data for each fire case and add a method that…
bbakernoaa Oct 17, 2024
7a1efd1
Update ush/python/pygfs/task/aero_emissions.py
bbakernoaa Oct 17, 2024
b628f29
last suggetions?
bbakernoaa Oct 17, 2024
3c170db
Merge branch 'develop' into feature/fire_blending
bbakernoaa Oct 17, 2024
d44c5b9
add with command when opening files
bbakernoaa Oct 18, 2024
5d2f493
Merge remote-tracking branch 'refs/remotes/origin/feature/fire_blendi…
bbakernoaa Oct 18, 2024
f204898
Update scripts/exglobal_prep_emissions.py
bbakernoaa Oct 18, 2024
e7bf48a
Update ush/python/pygfs/task/aero_emissions.py
bbakernoaa Oct 18, 2024
eab4554
Update ush/python/pygfs/task/aero_emissions.py
bbakernoaa Oct 18, 2024
71a0547
Merge branch 'develop' into feature/fire_blending
bbakernoaa Oct 22, 2024
d6ce0ab
Update aero_emissions.py
bbakernoaa Oct 22, 2024
313cb62
Update ush/python/pygfs/task/aero_emissions.py
bbakernoaa Oct 30, 2024
438c421
Update ush/python/pygfs/task/aero_emissions.py
bbakernoaa Oct 30, 2024
8a471fa
Update ush/python/pygfs/task/aero_emissions.py
bbakernoaa Oct 30, 2024
1df78dd
Merge branch 'develop' into feature/fire_blending
bbakernoaa Oct 31, 2024
4e660a5
addressing many issues
bbakernoaa Oct 31, 2024
f4a31bd
Adding n_persist to the yaml file and populating that through
bbakernoaa Oct 31, 2024
c379b5c
Merge branch 'develop' into feature/fire_blending
bbakernoaa Oct 31, 2024
82a2805
Merge branch 'develop' into feature/fire_blending
bbakernoaa Nov 4, 2024
6d12fb3
Update ush/python/pygfs/task/aero_emissions.py
bbakernoaa Nov 12, 2024
934c95f
Update ush/python/pygfs/task/aero_emissions.py
bbakernoaa Nov 12, 2024
46838fd
Update ush/python/pygfs/task/aero_emissions.py
bbakernoaa Nov 12, 2024
d6e8fc1
Update ush/python/pygfs/task/aero_emissions.py
bbakernoaa Nov 12, 2024
5039957
Update ush/python/pygfs/task/aero_emissions.py
bbakernoaa Nov 12, 2024
5124844
Use current_cycle
zmoon Nov 21, 2024
9b86ea7
Seems like this should be underscore
zmoon Nov 21, 2024
5cdd6a6
`config_dict`
zmoon Nov 21, 2024
4ced2d2
Break long line
zmoon Nov 21, 2024
d8b6d29
Just use the args
zmoon Nov 21, 2024
511629c
Add missing inits
zmoon Nov 21, 2024
6ecf5c6
Remove some unused imports and sort
zmoon Nov 21, 2024
98ab6e6
Address unused vars
zmoon Nov 21, 2024
7aae5e9
Seems like these are really assuming lists of str
zmoon Nov 21, 2024
1b56d63
Rename not needed
zmoon Nov 21, 2024
7ea3b68
Replace input/output var lists with mapping
zmoon Nov 21, 2024
5ce4047
Some spelling/typing/docstring fixes
zmoon Nov 21, 2024
0670aae
Merge remote-tracking branch 'origin/develop' into feature/fire_blending
zmoon Nov 21, 2024
8b3718f
Does nothing
zmoon Nov 25, 2024
3227263
Log before raise
zmoon Nov 25, 2024
991b348
Fix HFED guard in init
zmoon Nov 26, 2024
ca3bc1a
Updates following convo with Barry
zmoon Nov 26, 2024
7df5b5e
Dict of files from yaml
zmoon Nov 26, 2024
bc9e8f9
Use new files dict
zmoon Nov 26, 2024
d047882
FATAL ERROR
zmoon Dec 3, 2024
8fabb61
Merge remote-tracking branch 'origin/develop' into feature/fire_blending
zmoon Dec 3, 2024
46fdc0b
`f` -> `filepath`
zmoon Dec 3, 2024
8 changes: 4 additions & 4 deletions parm/prep/aero_emissions.yaml
@@ -1,11 +1,11 @@
aero_emissions:
config:
debug: False
ratio: 0.95 # weighting ratio
emistype: 'QFED' # EMission Type, Valid answers: 'QFED, GBBEPx, HFED'
ratio: 0.95 # weighting ratio for obs
emistype: 'QFED' # emission type {'QFED', 'GBBEPx', 'HFED'}
climfile_str: 'GBBEPx-all01GRID_v4r0_climMean' # climate file base string used for glob later
GBBEPx_version: 'v4r0' # gbbepx version
qfed_version: '006' # qfed version
gbbepx_version: 'v4r0' # GBBEPx version
qfed_version: '006' # QFED version
species: ['so2', 'oc', 'bc'] # species to be used
historical: False # set to true to just use true data for the given day
coarsen_scale: 150 # scale for coarsen function to generate weights
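
For orientation, a minimal standalone sketch of reading these settings (PyYAML is used purely for illustration; the workflow's own config tooling differs, and the path assumes the repository root as the working directory):

import yaml  # assumption: PyYAML, only for this standalone illustration

with open("parm/prep/aero_emissions.yaml") as f:
    cfg = yaml.safe_load(f)["aero_emissions"]["config"]

emistype = cfg["emistype"]            # one of 'QFED', 'GBBEPx', 'HFED'
ratio = cfg["ratio"]                  # observation weight used when blending with climatology
species = cfg["species"]              # e.g. ['so2', 'oc', 'bc']
coarsen_scale = cfg["coarsen_scale"]  # coarsening factor used to generate blending weights

if emistype not in ("QFED", "GBBEPx", "HFED"):
    raise ValueError(f"unsupported emistype: {emistype}")
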
4 changes: 2 additions & 2 deletions ush/forecast_postdet.sh
@@ -689,8 +689,8 @@ GOCART_rc() {
[[ ${status} -ne 0 ]] && exit "${status}"
fi

# Link blending emissions if AERO_EMIS_FIRE == blending
if [[ "${AERO_EMIS_FIRE}" == "blending" && "${RUN}" == "gefs" ]]; then
# Link blended emissions if AERO_EMIS_FIRE is 'blended'
if [[ "${AERO_EMIS_FIRE}" == "blended" && "${RUN}" == "gefs" ]]; then
${NCP} "${COMOUT_CHEM_HISTORY}/${RUN}.${current_cycle:0:8}.${RUN}_blended_emissions.nc" "${DATA}"
fi
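
For example, with RUN=gefs and current_cycle=2024091500 (an illustrative cycle), this stages ${COMOUT_CHEM_HISTORY}/gefs.20240915.gefs_blended_emissions.nc into the forecast run directory (${DATA}).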

204 changes: 70 additions & 134 deletions ush/python/pygfs/task/aero_emissions.py
@@ -53,10 +53,12 @@ def __init__(self, config: Dict[str, Any]) -> None:
logger.debug(f"aero_emission_yaml:\n{pformat(self.task_config.aero_emission_yaml)}")

config = self.task_config.aero_emission_yaml['aero_emissions']['config']
qfedfiles = [os.path.basename(fname[0]) for fname in config['data_in']['qfed']['copy']]
hfedfiles = [os.path.basename(fname[0]) for fname in config['data_in']['hfed']['copy']]
gbbepxfiles = [os.path.basename(fname[0]) for fname in config['data_in']['gbbepx']['copy']]
climofiles = [os.path.basename(fname[0]) for fname in config['data_in']['climo']['copy']]
emission_files = {}
for emission_type in ['qfed', 'hfed', 'gbbepx', 'climo']:
emission_files[emission_type] = [
os.path.basename(src)
for src, _ in config['data_in'][emission_type]['copy']
]
n_persist = config['n_persist']

localdict = AttrDict(
@@ -67,14 +69,13 @@ def __init__(self, config: Dict[str, Any]) -> None:
"current_date": self.task_config.PDY,
'config': config,
'emistype': config['emistype'],
'climofiles': climofiles,
'qfedfiles': qfedfiles,
'hfedfiles': hfedfiles,
'gbbepxfiles': gbbepxfiles,
'emisfiles': emission_files,
'n_persist': n_persist
}
)

# TODO: if AERO_EMIS_FIRE is not 'blended' we don't want to do anything!
zmoon Dec 3, 2024

@WalterKolczynski-NOAA could you share some insights on how we should accomplish this? Is there a way to get the value of AERO_EMIS_FIRE (from parm/config/{gefs,gfs}/config.aero) within this Task? Or should the Task just be skipped somewhere further up the workflow scripting chain?

cc: @bbakernoaa

Contributor

If the job isn't needed at all, then it should be skipped at the workflow level (in gfs_tasks.py/gefs_tasks.py).

Contributor

Sorry, it should be removed in workflow/applications/gfs_cycled.py etc., which may also require changes to gfs_tasks.py, etc.
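
For illustration only (not part of this diff), a hypothetical sketch of the kind of workflow-level skip suggested above; the function name, task list, and config accessor are stand-ins, since the real change would live in workflow/applications/*.py and gfs_tasks.py/gefs_tasks.py:

# Hypothetical sketch -- illustrative names only, not actual global-workflow code
def get_task_names(run_config: dict) -> list:
    tasks = ["stage_ic", "fcst", "atmos_products"]
    # Register the emissions prep job only when blended fire emissions are requested
    if run_config.get("AERO_EMIS_FIRE") == "blended":
        tasks.insert(0, "prep_emissions")
    return tasks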


# Extend task_config with localdict
self.task_config = AttrDict(**self.task_config, **localdict)

@@ -97,14 +98,13 @@ def initialize(self) -> None:
emistype = self.task_config.emistype

# Copy climatology files to run directory except for HFED
# HFED is already smoothed, monthly data
# QFED or GBBBEPx will be blended with climatology
if emistype.lower() != 'hfed':
logger.info(
f"Copy HFED '{data_in.hfed}' data to run directory"
)
logger.info("Copy climatology data to run directory")
FileHandler(data_in.climo).sync()
logger.info(f"Copy {emistype} data to run directory")
FileHandler(data_in[emistype.lower()]).sync()
logger.info("Copy climatology data to run directory")
FileHandler(data_in.climo).sync()
logger.info(f"Copy {emistype} data to run directory")
FileHandler(data_in[emistype.lower()]).sync()

@logit(logger)
def run(self) -> None:
@@ -122,30 +122,21 @@ def run(self) -> None:
config_dict = self.task_config['config']
emistype = self.task_config['emistype']
ratio = config_dict['ratio']
climfiles = self.task_config['climofiles']
climfiles = self.task_config['emisfiles']['climo']
coarsen_scale = config_dict['coarsen_scale']
out_var_dict = config_dict['output_var_map']
current_date = self.task_config['current_date']
n_persist = config_dict['n_persist']

emission_map = {'qfed': self.task_config['qfedfiles'],
'gbbepx': self.task_config['gbbepxfiles'],
'hfed': self.task_config['hfedfiles']}
try:
basefile = self.task_config['emisfiles'][emistype.lower()]
except KeyError as e:
logger.exception(f"{emistype.lower()} is not a supported emission type")
raise Exception(
f"FATAL ERROR: {emistype.lower()} is not a supported emission type, ABORT!"
) from e

if emistype.lower() != 'blended':
try:
basefile = emission_map[emistype.lower()]
except KeyError as err:
raise KeyError(f"FATAL ERROR: {emistype.lower()} is not a supported emission type, ABORT!") from err

if emistype.lower() == 'hfed':
AerosolEmissions.process_hfed(
files=basefile,
out_name=config_dict.data_out['copy'][0][0],
out_var_dict=out_var_dict)
else:
if emistype.lower() != 'hfed':
dset = AerosolEmissions.make_fire_emission(
d=current_date,
climos=climfiles,
ratio=ratio,
scale_climo=True,
@@ -156,54 +147,12 @@ def open_qfed(files: List[str], out_var_dict: Dict[str, str] = None) -> xr.Datas

AerosolEmissions.write_ncf(dset, config_dict.data_out['copy'][0][0])

@staticmethod
@logit(logger)
def process_hfed(files: List[str], out_name: str, out_var_dict: Dict[str, str] = None) -> None:
"""
Process HFED files to generate fire emissions data.

Parameters
----------
files : list
List of HFED files to process.
out_name : str
Name of the output file to save the processed data.
out_var_dict : dict, optional
Mapping of input variable name to desired (output) variable name.

Returns
-------
None
"""
if out_var_dict is None:
raise Exception("FATAL ERROR: No output variable mapping provided")

if len(files) == 0:
raise Exception("FATAL ERROR: Received empty list of HFED files")

found_species = []
dset_dict = {}
for f in sorted(files):
logger.info(f"Opening HFED file: {f}")
_, input_var = os.path.basename(f).split(".")[1].split("_")
found_species.append(input_var)
try:
with xr.open_dataset(f, decode_cf=False).biomass as da:
da.name = out_var_dict[input_var]
dset_dict[da.name] = da
except Exception as ee:
logger.exception(f"FATAL ERROR: unable to read dataset {ee}")
raise Exception("FATAL ERROR: Unable to read dataset, ABORT!")

dset = xr.Dataset(dset_dict)

AerosolEmissions.write_ncf(dset, out_name)

@staticmethod
@logit(logger)
def open_qfed(files: List[str], out_var_dict: Dict[str, str] = None) -> xr.Dataset:
"""
Open QFED2 fire emissions data and renames variables to a standard (using the GBBEPx names to start with).
Open QFED2 fire emissions data combining files into one Dataset
and renaming variables to standard (GBBEPx) names.

Parameters
----------
@@ -218,9 +167,11 @@ def open_qfed(files: List[str], out_var_dict: Dict[str, str] = None) -> xr.Datas
Dataset containing the fire emissions data
"""
if out_var_dict is None:
logger.info("No output variable mapping provided")
raise Exception("FATAL ERROR: No output variable mapping provided")

if len(files) == 0:
logger.info("No files provided")
raise Exception("FATAL ERROR: Received empty list of QFED files")

found_species = []
Expand All @@ -233,9 +184,9 @@ def open_qfed(files: List[str], out_var_dict: Dict[str, str] = None) -> xr.Datas
with xr.open_dataset(f, decode_cf=False).biomass as da:
da.name = out_var_dict[input_var]
dset_dict[da.name] = da
except Exception as ee:
logger.exception(f"FATAL ERROR: unable to read dataset {ee}")
raise Exception("FATAL ERROR: Unable to read dataset, ABORT!")
except Exception as e:
logger.exception(f"Unable to read dataset: {f}")
raise Exception("FATAL ERROR: Unable to read dataset, ABORT!") from e

dset = xr.Dataset(dset_dict)

@@ -260,14 +211,14 @@ def open_climatology(files: List[str]) -> xr.Dataset:
das = []

logger.info("Process Climatology Files")
for filename in sorted(files):
logger.info(f" Opening Climatology File: {filename}")
for f in sorted(files):
logger.info(f"Opening Climatology File: {f}")
try:
with xr.open_dataset(filename, engine="netcdf4") as da:
with xr.open_dataset(f, engine="netcdf4") as da:
das.append(da)
except Exception as ee:
logger.exception("Encountered an error reading climatology file, {error}".format(error=ee))
raise Exception("FATAL ERROR: Unable to read file, ABORT!")
except Exception as e:
logger.exception(f"Encountered an error reading climatology file: {f}")
raise Exception("FATAL ERROR: Unable to read file, ABORT!") from e

return xr.concat(das, dim="time")

@@ -301,28 +252,28 @@ def write_ncf(dset: xr.Dataset, outfile: str) -> None:
encoding["time"] = dict(dtype="i4")
try:
dset.load().to_netcdf(outfile, encoding=encoding)
except Exception as ee:
logger.exception("Encountered an exception in writing dataset, {}".format(ee))
raise Exception("FATAL ERROR: Unable to write dataset, ABORT!")
except Exception as e:
logger.exception("Encountered an error writing dataset")
raise Exception("FATAL ERROR: Unable to write dataset, ABORT!") from e

@staticmethod
@logit(logger)
def create_climatology(
emissions: xr.DataArray, climatology: xr.DataArray, lat_coarse: int = 50, lon_coarse: int = 50
) -> xr.Dataset:
"""
Create scaled climatology data based on emission data.
Create scaled daily climatology data based on observed emission data.

Parameters
----------
emissions : xarray.DataArray
Emission data.
Emission data. Just one time step. Same grid as the climatology.
climatology : xarray.Dataset
Input climatology data.
Input climatology data. Multiple days of daily data.
lat_coarse : int, optional
Coarsening factor for latitude. Defaults to 50.
Coarsening factor for latitude. Defaults to 50 (0.1 deg -> 5 deg).
lon_coarse : int, optional
Coarsening factor for longitude. Defaults to 50.
Coarsening factor for longitude. Defaults to 50 (0.1 deg -> 5 deg).

Returns
-------
@@ -332,36 +283,27 @@ def create_climatology(
# Create a copy of the climatology
clim = climatology.copy()

# Coarsen the climatology
clim_coarse = climatology.coarsen(
lat=lat_coarse, lon=lon_coarse, boundary="trim"
).sum()

# Calculate the ratio of emissions to climatology and handle NaN values
ratio = (emissions.squeeze().data / clim_coarse.where(clim_coarse > 0)).fillna(
0
)

# Interpolate the ratio to match the coordinates of the climatology
ratio_interp = ratio.sel(lat=clim.lat, lon=clim.lon, method="nearest")
# We coarsen for regional scaling, to avoid small differences in fire locations
coarsen_kws = dict(lat=lat_coarse, lon=lon_coarse, boundary="trim")
clim_coarse = climatology.coarsen(**coarsen_kws).sum()
obs_coarse = emissions.squeeze().coarsen(**coarsen_kws).sum()

# Loop through each time slice and scale the climatology
for index in range(0, len(clim.time)):
# Get the current time slice of the climatology
clim_slice = clim.data[index, :, :]
# Calculate the coarse ratio of emissions to climatology
# Where climatology is not positive, the ratio will be 0
ratio_coarse = (obs_coarse.data / clim_coarse.where(clim_coarse > 0)).fillna(0)

# Scale the current time slice
scaled_slice = clim_slice * ratio_interp[index, :, :]
# Interpolate (uncoarsen) the ratio to match the coordinates of the climatology
# (this should be the same grid as the QFED/GBBEPx emissions)
ratio = ratio_coarse.sel(lat=clim.lat, lon=clim.lon, method="nearest")

# Update the climatology with the scaled time slice
clim.data[index, :, :] = scaled_slice.squeeze().data
# Scale data
clim.data = clim.data * ratio.data

return clim.compute()
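
As a usage illustration of the coarsen-and-scale approach above (synthetic data; the small grid, variable name, and coarsening factor are assumptions, and the import assumes this branch's ush/python is on PYTHONPATH):

import numpy as np
import xarray as xr

from pygfs.task.aero_emissions import AerosolEmissions  # assumes ush/python on PYTHONPATH

# Small synthetic 0.1-degree grids: one observed day and a 5-day daily climatology
lat = np.arange(0.05, 10.0, 0.1)
lon = np.arange(0.05, 10.0, 0.1)
obs = xr.DataArray(
    np.random.rand(1, lat.size, lon.size),
    dims=("time", "lat", "lon"),
    coords={"time": [0.0], "lat": lat, "lon": lon},
    name="oc",
)
climo = xr.DataArray(
    np.random.rand(5, lat.size, lon.size),
    dims=("time", "lat", "lon"),
    coords={"time": 24.0 * np.arange(5), "lat": lat, "lon": lon},
    name="oc",
)

# Rescale the daily climatology so its coarse-cell (here 5x5-degree) totals track the observed day
scaled = AerosolEmissions.create_climatology(obs, climo, lat_coarse=50, lon_coarse=50)
assert scaled.shape == climo.shape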

@staticmethod
@logit(logger)
def make_fire_emission(
d: str,
climos: List[str],
ratio: float,
scale_climo: bool,
@@ -375,8 +317,6 @@ def make_fire_emission(

Parameters
----------
d : str or pd.Timestamp
The date for which fire emissions are generated.
climos : list
List of pre-calculated climatology data files for scaling.
ratio : float
@@ -397,53 +337,49 @@ def make_fire_emission(
xr.Dataset
xarray Dataset object representing fire emissions data for each forecast day.
"""
# open fire emission
# Open fire emissions
if isinstance(obsfile, (str, bytes)):
obsfile = [obsfile]
if "QFED".lower() in obsfile[0].lower():
ObsEmis = AerosolEmissions.open_qfed(obsfile, out_var_dict=out_var_dict)
if "qfed" in obsfile[0].lower():
obs = AerosolEmissions.open_qfed(obsfile, out_var_dict=out_var_dict)
else:
ObsEmis = xr.open_mfdataset(obsfile, decode_cf=False)
# GBBEPx, already combined and with correct names
obs = xr.open_mfdataset(obsfile, decode_cf=False)

# open climatology
# Open climatology
climo = AerosolEmissions.open_climatology(climos)
climo = climo.sel(lat=ObsEmis["lat"], lon=ObsEmis["lon"], method="nearest")

# make weighted climo
ObsEmisC = ObsEmis.coarsen(lat=coarsen_scale, lon=coarsen_scale, boundary="trim").sum()
climo = climo.sel(lat=obs["lat"], lon=obs["lon"], method="nearest")

# Blend
dsets = []
climo_scaled = {}
for tslice in range(len(climos)):
# make copy of original data
if tslice == 0:
dset = ObsEmis.copy()
dset = obs.copy()
else:
dset = dsets[tslice - 1].copy()
dset.update({"time": [float(tslice * 24)]})
dset.time.attrs = ObsEmis.time.attrs
dset.time.attrs = obs.time.attrs

for v in ObsEmis.data_vars:
for v in obs.data_vars:
if not scale_climo:
if tslice > n_persist:
dset[v].data = (
ratio * dset[v] + (1 - ratio) * climo[v].data[tslice, :, :]
)
else:
if tslice == 0:

climo_scaled[v] = AerosolEmissions.create_climatology(
ObsEmisC[v], climo[v], lon_coarse=150, lat_coarse=150
obs[v], climo[v], lon_coarse=coarsen_scale, lat_coarse=coarsen_scale
)
else:
if tslice > n_persist:
dset[v].data = (
ratio * dset[v] + (1 - ratio) * climo_scaled[v].data[tslice, :, :]
)
else:
dset[v] = dset[v]

dsets.append(dset)

return xr.concat(dsets, dim="time")
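
In effect, the loop above builds each forecast day recursively from the previous one: once the day index exceeds n_persist, each species is updated as

    emis[t] = ratio * emis[t-1] + (1 - ratio) * climo[t]

where climo is either the raw or the regionally scaled climatology depending on scale_climo. With the configured ratio of 0.95, the observed fire signal decays toward climatology at about 5% per day, while days up to and including n_persist simply persist the observed emissions.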

@logit(logger)