Merge pull request #226 from ASFHyP3/develop
Release 0.7.0
Showing 10 changed files with 733 additions and 23 deletions.
```diff
@@ -29,26 +29,7 @@ jobs:
         python -m build
     - name: upload to PyPI.org
-      uses: pypa/[email protected]
+      uses: pypa/[email protected]
       with:
         user: __token__
         password: ${{ secrets.TOOLS_PYPI_PAK }}
-
-  verify-distribution:
-    runs-on: ubuntu-latest
-    needs:
-      - call-version-info-workflow
-      - distribute
-    defaults:
-      run:
-        shell: bash -l {0}
-    steps:
-      - uses: actions/checkout@v4
-
-      - uses: mamba-org/setup-micromamba@v1
-        with:
-          environment-file: environment.yml
-
-      - name: Ensure asf_tools v${{ needs.call-version-info-workflow.outputs.version }} is pip installable
-        run: |
-          python -m pip install asf_tools==${{ needs.call-version-info-workflow.outputs.version_tag }}
```
These scripts create a global (or regional) water mask dataset based on OpenStreetMap data, optionally augmented by ESA WorldCover.

To replicate our OSM water mask dataset, follow these steps:

1. Download the "Latest Weekly Planet PBF File" from https://planet.openstreetmap.org/.
2. Download the WGS84 water polygons shapefile from https://osmdata.openstreetmap.de/data/water-polygons.html.
3. Unzip the downloads; you should end up with something like `planet.osm.pbf` or `planet.pbf` and `water_polygons.shp` (along with the support files for `water_polygons.shp`).
4. Run ```generate_osm_dataset --planet-file-path [path-to-planet.pbf] --ocean-polygons-path [path-to-water-polygons.shp] --lat-begin -85 --lat-end 85 --lon-begin -180 --lon-end 180 --tile-width 5 --tile-height 5```
5. Run ```fill_missing_tiles --fill-value 0 --lat-begin -90 --lat-end -85 --lon-begin -180 --lon-end 180 --tile-width 5 --tile-height 5```
6. Run ```fill_missing_tiles --fill-value 1 --lat-begin 85 --lat-end 90 --lon-begin -180 --lon-end 180 --tile-width 5 --tile-height 5```

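The three commands above split the globe into latitude bands: OSM data covers -85° to 85°, everything south of -85° is filled with 0 (Antarctica), and everything north of 85° is filled with 1 (the Arctic Ocean), which suggests water is coded as 1. A minimal sketch (plain Python, assuming tiles are indexed by their lower-left corner, matching the `--lat-begin`/`--lon-begin` conventions) shows the three runs tile the full globe with no gaps or overlaps:

```python
# Reconstruct the 5-degree tile grids produced by the three commands above.
# Assumption (not stated in the README): each tile is identified by its
# lower-left (lat, lon) corner.

def tile_corners(lat_begin, lat_end, lon_begin, lon_end, width=5, height=5):
    """Lower-left (lat, lon) corner of every tile in the given extent."""
    return [(lat, lon)
            for lat in range(lat_begin, lat_end, height)
            for lon in range(lon_begin, lon_end, width)]

osm_tiles = tile_corners(-85, 85, -180, 180)    # generate_osm_dataset
south_fill = tile_corners(-90, -85, -180, 180)  # fill_missing_tiles --fill-value 0
north_fill = tile_corners(85, 90, -180, 180)    # fill_missing_tiles --fill-value 1

# Together the three runs cover the whole globe exactly once:
# 36 latitude bands x 72 longitude bands = 2592 tiles total.
all_tiles = osm_tiles + south_fill + north_fill
assert len(all_tiles) == len(set(all_tiles)) == (180 // 5) * (360 // 5)
```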
For the WorldCover water mask dataset, follow these steps:

1. Download the portions of the dataset covering your areas of interest from https://worldcover2020.esa.int/downloader.
2. Extract the contents into a folder. If you download multiple portions of the dataset, extract them all into the same folder.
3. Run ```generate_worldcover_dataset --worldcover-tiles-dir [path-to-worldcover-data] --lat-begin 55 --lat-end 80 --lon-begin -180 --lon-end 180 --tile-width 5 --tile-height 5```

Note that we only use WorldCover data over Alaska, Canada, and Russia for our dataset.
fill_missing_tiles.py (new file):
```python
import argparse
import os
import subprocess

import numpy as np
from osgeo import gdal, osr

from asf_tools.watermasking.utils import lat_lon_to_tile_string


gdal.UseExceptions()


def main():

    parser = argparse.ArgumentParser(
        prog='fill_missing_tiles.py',
        description='Script for creating filled tifs in areas with missing tiles.'
    )

    parser.add_argument('--fill-value', help='The value to fill the data array with.', default=0)
    parser.add_argument('--lat-begin', help='The minimum latitude of the dataset in EPSG:4326.', default=-85)
    parser.add_argument('--lat-end', help='The maximum latitude of the dataset in EPSG:4326.', default=85)
    parser.add_argument('--lon-begin', help='The minimum longitude of the dataset in EPSG:4326.', default=-180)
    parser.add_argument('--lon-end', help='The maximum longitude of the dataset in EPSG:4326.', default=180)
    parser.add_argument('--tile-width', help='The desired width of the tile in degrees.', default=5)
    parser.add_argument('--tile-height', help='The desired height of the tile in degrees.', default=5)

    args = parser.parse_args()

    fill_value = int(args.fill_value)
    lat_begin = int(args.lat_begin)
    lat_end = int(args.lat_end)
    lon_begin = int(args.lon_begin)
    lon_end = int(args.lon_end)
    tile_width = int(args.tile_width)
    tile_height = int(args.tile_height)

    lat_range = range(lat_begin, lat_end, tile_height)
    lon_range = range(lon_begin, lon_end, tile_width)

    for lat in lat_range:
        for lon in lon_range:

            tile = lat_lon_to_tile_string(lat, lon, is_worldcover=False, postfix='')
            tile_tif = 'tiles/' + tile + '.tif'
            tile_cog = 'tiles/cogs/' + tile + '.tif'

            print(f'Processing: {tile}')

            xmin, ymin = lon, lat
            pixel_size_x = 0.00009009009
            pixel_size_y = 0.00009009009

            # All images in the dataset should be this size.
            data = np.empty((55500, 55500))
            data.fill(fill_value)

            driver = gdal.GetDriverByName('GTiff')
            dst_ds = driver.Create(tile_tif, xsize=data.shape[0], ysize=data.shape[1], bands=1, eType=gdal.GDT_Byte)
            dst_ds.SetGeoTransform([xmin, pixel_size_x, 0, ymin, 0, pixel_size_y])
            srs = osr.SpatialReference()
            srs.ImportFromEPSG(4326)
            dst_ds.SetProjection(srs.ExportToWkt())
            dst_band = dst_ds.GetRasterBand(1)
            dst_band.WriteArray(data)
            del dst_ds

            command = f'gdal_translate -of COG -co NUM_THREADS=all_cpus {tile_tif} {tile_cog}'.split(' ')
            subprocess.run(command)
            os.remove(tile_tif)


if __name__ == '__main__':
    main()
```
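As a sanity check on the constants in this script: each tile spans 5 degrees and is written as a 55500 x 55500 raster, which is where the hard-coded pixel size of 0.00009009009 comes from (5 / 55500, roughly 10 m at the equator). A quick check:

```python
# Verify that the hard-coded pixel size in fill_missing_tiles matches the
# tile extent (5 degrees) and raster size (55500 pixels per side).
TILE_DEGREES = 5
PIXELS_PER_SIDE = 55500
PIXEL_SIZE = 0.00009009009  # value hard-coded in the script

derived = TILE_DEGREES / PIXELS_PER_SIDE
assert abs(derived - PIXEL_SIZE) < 1e-10

# One degree of latitude is ~111.32 km, so each pixel is ~10 m across.
print(f'{derived:.11f} deg/pixel ~= {derived * 111_320:.2f} m at the equator')
```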