tropo_pyaps3: add --custom-height for testing purpose (#431)
+ view: auto-update vlim after masking for multi-subplots, for better display when the non-zero values are far from zero, as is the case for the absolute tropospheric delay in the ERA5.h5 file.

+ tropo_pyaps3.py: add --custom-height for testing purpose

+ test_smallbaselineApp.py: take codacy suggestion

+ docs/README: one line for the badges

+ docs/dask.md: fix improper display of the dask performance figure on readthedocs
yunjunz authored Aug 28, 2020
1 parent dbd117d commit e2e673f
Showing 8 changed files with 41 additions and 25 deletions.
8 changes: 4 additions & 4 deletions docs/FAQs.md
@@ -1,12 +1,12 @@
## Frequently Asked Questions

### 1. What's the sign convention of the line-of-sight data?
#### 1. What's the sign convention of the line-of-sight data?

For line-of-sight (LOS) phase in the unit of radians, positive value represents motion away from the satellite. We assume the "date1_date2" format for the interferogram with "date1" being the earlier acquisition.
For line-of-sight (LOS) phase in the unit of radians, i.e. 'unwrapPhase' dataset in `ifgramStack.h5` file, positive value represents motion away from the satellite. We assume the "date1_date2" format for the interferogram with "date1" being the earlier acquisition.

For LOS displacement in the unit of meters, positive value represents motion toward the satellite (uplift for pure vertical motion).
For LOS displacement in the unit of meters, i.e. 'timeseries' dataset in `timeseries.h5` file, positive value represents motion toward the satellite (uplift for pure vertical motion).

### 2. How to prepare the input for MintPy if I am using InSAR software rather than ISCE stack processors and ARIA-tools?
#### 2. How to prepare the input for MintPy if I am using InSAR software rather than ISCE stack processors and ARIA-tools?



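To make the two sign conventions above concrete, here is a minimal sketch (not MintPy's own code) of converting LOS phase to LOS displacement; the wavelength value is an assumption (roughly C-band) and in practice would come from the file metadata.

```python
import numpy as np

# positive LOS phase  = motion away from the satellite
# positive LOS displacement = motion toward the satellite
wavelength = 0.0555  # [m], assumed value; read from file metadata in practice

phase = np.array([2.0, -1.5, 0.0])                  # unwrapped LOS phase [radian]
displacement = phase * wavelength / (-4.0 * np.pi)  # LOS displacement [m]
print(displacement)  # positive phase -> negative displacement, and vice versa
```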
2 changes: 1 addition & 1 deletion docs/README.md
@@ -1,7 +1,7 @@
[![Language](https://img.shields.io/badge/python-3.5%2B-blue.svg)](https://www.python.org/)
[![Docs Status](https://readthedocs.org/projects/mintpy/badge/?version=latest)](https://mintpy.readthedocs.io/?badge=latest)
[![CircleCI](https://img.shields.io/circleci/build/github/insarlab/MintPy.svg?color=green&logo=circleci)](https://circleci.com/gh/insarlab/MintPy)
[![Latest version](https://img.shields.io/badge/latest%20version-v1.2.3-yellowgreen.svg)](https://github.com/insarlab/MintPy/releases)
[![Version](https://img.shields.io/badge/version-1.2.3-yellowgreen.svg)](https://github.com/insarlab/MintPy/releases)
[![License](https://img.shields.io/badge/license-GPLv3-yellow.svg)](https://github.com/insarlab/MintPy/blob/main/LICENSE)
[![Forum](https://img.shields.io/badge/forum-Google%20Group-orange.svg)](https://groups.google.com/forum/#!forum/mintpy)
[![Citation](https://img.shields.io/badge/doi-10.1016%2Fj.cageo.2019.104331-blue)](https://doi.org/10.1016/j.cageo.2019.104331)
2 changes: 1 addition & 1 deletion docs/dask.md
@@ -55,7 +55,7 @@ A typical run time without local cluster is 30 secs and with 8 workers 11.4 secs

To show the run time improvement, we test three datasets (South Isabela, Fernandina, and Kilauea) with different numbers of cores and the same amount of allocated memory (4 GB) on a compute node in the [Stampede2 cluster's skx-normal queue](https://portal.tacc.utexas.edu/user-guides/stampede2#overview-skxcomputenodes). Results are shown below:

![Dask LocalCluster Performance](https://github.com/insarlab/MintPy-tutorial/blob/main/docs/dask_local_cluster_performance.png)
![Dask LocalCluster Performance](https://yunjunzhang.files.wordpress.com/2020/08/dask_local_cluster_performance.png)

#### 1.5 Known problems ####

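For context, a minimal sketch of a dask LocalCluster setup comparable to the timing test above (not MintPy's own configuration); how the 4 GB allocation is split across workers is an assumption.

```python
from dask.distributed import Client, LocalCluster

# start a local cluster with 8 single-threaded workers and a per-worker memory limit
cluster = LocalCluster(n_workers=8, threads_per_worker=1, memory_limit='4GB')
client = Client(cluster)
print(client)

# ... submit the parallel work here ...

client.close()
cluster.close()
```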
2 changes: 1 addition & 1 deletion mintpy/ifgram_inversion.py
@@ -82,7 +82,7 @@ def create_parser():
help='Output file name. (default: %(default)s).')
parser.add_argument('--ref-date', dest='ref_date', help='Reference date, first date by default.')
parser.add_argument('--skip-reference', dest='skip_ref', action='store_true',
help='Skip checking reference pixel value, for simulation testing.')
help='[for offset and testing] do not apply spatial referencing.')

# solver
solver = parser.add_argument_group('solver', 'solver for the network inversion problem')
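The reworded help text above refers to spatial referencing: subtracting the value at the reference pixel from every pixel of each interferogram. A minimal standalone sketch of the idea (not ifgram_inversion.py's implementation), with REF_Y / REF_X as the assumed metadata keys:

```python
import numpy as np

def spatial_reference(unw_stack, ref_y, ref_x):
    """Subtract the reference-pixel phase from every interferogram.

    unw_stack : 3D array of unwrapped phase (num_ifgram, length, width)
    ref_y/x   : row/column of the reference pixel (e.g. from REF_Y / REF_X metadata)
    """
    ref_phase = unw_stack[:, ref_y, ref_x].reshape(-1, 1, 1)
    return unw_stack - ref_phase

# with --skip-reference, this subtraction is simply not applied
```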
10 changes: 9 additions & 1 deletion mintpy/tropo_pyaps3.py
@@ -128,6 +128,9 @@ def create_parser():
'e.g.: '+WEATHER_DIR_DEMO)
delay.add_argument('-g','--geomtry', dest='geom_file', type=str,
help='geometry file including height, incidenceAngle and/or latitude and longitude')
delay.add_argument('--custom-height', dest='custom_height', type=float,
help='[for testing] specify a custom height value for delay calculation.')

delay.add_argument('--tropo-file', dest='tropo_file', type=str,
help='tropospheric delay file name')
delay.add_argument('--verbose', dest='verbose', action='store_true', help='Verbose message.')
@@ -601,8 +604,13 @@ def get_dataset_size(fname):
# prepare geometry data
geom_obj = geometry(inps.geom_file)
geom_obj.open()
inps.dem = geom_obj.read(datasetName='height')
inps.inc = geom_obj.read(datasetName='incidenceAngle')
inps.dem = geom_obj.read(datasetName='height')

# for testing
if inps.custom_height:
print('use input custom height of {} m for vertical integration'.format(inps.custom_height))
inps.dem[:] = inps.custom_height

if 'latitude' in geom_obj.datasetNames:
# for dataset in geo OR radar coord with lookup table in radar-coord (isce, doris)
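The new `--custom-height` option simply overwrites the height grid with a constant before the vertical integration of the delay. A minimal standalone sketch of the same behavior (the grid size in the example is an assumption):

```python
import numpy as np

def apply_custom_height(dem, custom_height=None):
    """Overwrite the height grid with a constant value [m], for testing only."""
    if custom_height:
        print('use input custom height of {} m for vertical integration'.format(custom_height))
        dem[:] = custom_height   # in-place fill keeps the shape/dtype of the original grid
    return dem

# example: force a flat 2000 m surface on an assumed 100 x 200 grid
dem = np.random.uniform(0, 4000, size=(100, 200)).astype(np.float32)
dem = apply_custom_height(dem, custom_height=2000)
print(dem.min(), dem.max())   # 2000.0 2000.0
```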
4 changes: 2 additions & 2 deletions mintpy/utils/readfile.py
@@ -397,7 +397,7 @@ def read_binary_file(fname, datasetName=None, box=None, xstep=1, ystep=1):

# data structure - file specific based on file extension
data_type = 'float32'
num_band = 1
num_band = 1

if fext in ['.unw', '.cor', '.hgt', '.msk']:
num_band = 2
@@ -1245,7 +1245,7 @@ def read_binary(fname, shape, box=None, data_type='float32', byte_order='l',
else:
raise ValueError('unrecognized complex band:', cpx_band)

# skipping/multilooking
# skipping/multilooking
if xstep * ystep > 1:
data = data[int(ystep/2)::ystep,
int(xstep/2)::xstep]
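The skipping/multilooking step above keeps roughly the center pixel of each ystep-by-xstep window via strided slicing. A small standalone sketch of that pattern:

```python
import numpy as np

def downsample_by_skipping(data, xstep=1, ystep=1):
    """Keep roughly the center pixel of each (ystep, xstep) window via strided slicing."""
    if xstep * ystep > 1:
        data = data[int(ystep / 2)::ystep,
                    int(xstep / 2)::xstep]
    return data

data = np.arange(36).reshape(6, 6)
print(downsample_by_skipping(data, xstep=3, ystep=3))   # 2 x 2 array: [[7 10] [25 28]]
```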
36 changes: 22 additions & 14 deletions mintpy/view.py
@@ -347,7 +347,7 @@ def update_inps_with_file_metadata(inps, metadata):

##################################################################################################
def update_data_with_plot_inps(data, metadata, inps):
# Seed Point
# 1. spatial referencing with respect to the seed point
if inps.ref_yx: # and inps.ref_yx != [int(metadata['REF_Y']), int(metadata['REF_X'])]:
try:
ref_y = inps.ref_yx[0] - inps.pix_box[1]
@@ -363,7 +363,7 @@ def update_data_with_plot_inps(data, metadata, inps):
else:
inps.ref_yx = None

# Convert data to display unit and wrap
# 2. scale data based on the display unit and re-wrap
(data,
inps.disp_unit,
inps.disp_scale,
@@ -376,7 +376,7 @@ def update_data_with_plot_inps(data, metadata, inps):
if inps.wrap:
inps.vlim = inps.wrap_range

# 1.6 Min / Max - Data/Display
# 3. update display min/max
inps.dlim = [np.nanmin(data), np.nanmax(data)]
if not inps.vlim: # and data.ndim < 3:
inps.vlim = [np.nanmin(data), np.nanmax(data)]
@@ -969,27 +969,24 @@ def read_data4figure(i_start, i_end, inps, metadata):
ystep=inps.multilook_num)[0]
data -= ref_data

# v/dlim, adjust data if all subplots share the same unit
# This could be:
# check if all subplots share the same data unit, they could have/be:
# 1) the same type OR
# 2) velocity or timeseries OR
# 3) horizontal/vertical output from asc_desc2horz_vert.py
# 4) data/model output from load_gbis.py OR
# 5) binary files with multiple undefined datasets, as band1, band2, etc.
if (len(inps.dsetFamilyList) == 1
if (len(inps.dsetFamilyList) == 1
or inps.key in ['velocity', 'timeseries', 'inversion']
or all(d in inps.dsetFamilyList for d in ['horizontal', 'vertical'])
or inps.dsetFamilyList == ['data','model','residual']
or inps.dsetFamilyList == ['band{}'.format(i+1) for i in range(len(inps.dsetFamilyList))]):
data, inps = update_data_with_plot_inps(data, metadata, inps)
same_unit4all_subplots = True
else:
same_unit4all_subplots = False

if (not inps.vlim
and not (inps.dsetFamilyList[0].startswith('unwrap') and not inps.file_ref_yx)
and inps.dsetFamilyList[0] not in ['bperp']):
data_mli = multilook_data(data, 10, 10)
inps.vlim = [np.nanmin(data_mli), np.nanmax(data_mli)]
del data_mli
inps.dlim = [np.nanmin(data), np.nanmax(data)]
# adjust data due to spatial referencing and unit related scaling
if same_unit4all_subplots:
data, inps = update_data_with_plot_inps(data, metadata, inps)

# mask
if inps.msk is not None:
@@ -1000,6 +997,17 @@ def read_data4figure(i_start, i_end, inps, metadata):
if inps.zero_mask:
vprint('masking pixels with zero value')
data = np.ma.masked_where(data == 0., data)

# update display min/max
if (same_unit4all_subplots
and all(arg not in sys.argv for arg in ['-v', '--vlim'])
and not (inps.dsetFamilyList[0].startswith('unwrap') and not inps.file_ref_yx)
and inps.dsetFamilyList[0] not in ['bperp']):
data_mli = multilook_data(data, 10, 10)
inps.vlim = [np.nanmin(data_mli), np.nanmax(data_mli)]
del data_mli
inps.dlim = [np.nanmin(data), np.nanmax(data)]

return data


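The core of the view.py change is to compute vlim only after masking, from a coarsely multilooked copy of the data, so that masked-out zero values no longer drag the color limits toward zero. A minimal standalone sketch of that order of operations (simple skipping stands in for multilook_data, and the delay value in the example is made up):

```python
import numpy as np

def estimate_vlim(data, mask=None, step=10):
    """Estimate display limits from masked data, using a coarse subsample for speed."""
    if mask is not None:
        data = np.ma.masked_where(~mask, data)
    data_mli = data[step // 2::step, step // 2::step]   # stand-in for multilook_data(data, 10, 10)
    return [np.nanmin(data_mli), np.nanmax(data_mli)]

# example: absolute tropospheric delay around -2.3 m, with zero-filled no-data pixels
data = np.full((200, 300), -2.3, dtype=np.float32)
mask = np.ones(data.shape, dtype=bool)
mask[:, :50] = False          # no-data area
data[~mask] = 0.0
print(estimate_vlim(data, mask=mask))   # ~[-2.3, -2.3] rather than [-2.3, 0.0]
```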
2 changes: 1 addition & 1 deletion test/test_smallbaselineApp.py
@@ -132,7 +132,7 @@ def test_dataset(dset_name, test_dir, fresh_start=True, test_pyaps=False):
cmd = 'smallbaselineApp.py {}'.format(template_file)
print(cmd)
status = subprocess.Popen(cmd, shell=True).wait()
if status is not 0:
if status != 0:
raise RuntimeError('Test failed for example dataset {}'.format(dset_name))

# custom plot of velocity map
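The codacy fix above replaces an identity check with an equality check: `is not` compares object identity rather than value, and Python 3.8+ emits a SyntaxWarning for `is`/`is not` with a literal. A tiny illustration (the 0.0 status is a contrived stand-in):

```python
status = 0.0             # a status that equals zero but is not the int object 0
print(status != 0)       # False - value comparison, the intended check
print(status is not 0)   # True  - identity comparison, silently gives the wrong answer here
```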
