Merge pull request #226 from ASFHyP3/develop
Release 0.7.0
AndrewPlayer3 authored Mar 1, 2024
2 parents 078a5ca + b030f1e commit bc8a572
Showing 10 changed files with 733 additions and 23 deletions.
21 changes: 1 addition & 20 deletions .github/workflows/distribute.yml
@@ -29,26 +29,7 @@ jobs:
          python -m build
      - name: upload to PyPI.org
        uses: pypa/gh-action-pypi-publish@v1.8.10
        uses: pypa/gh-action-pypi-publish@v1.8.12
        with:
          user: __token__
          password: ${{ secrets.TOOLS_PYPI_PAK }}

  verify-distribution:
    runs-on: ubuntu-latest
    needs:
      - call-version-info-workflow
      - distribute
    defaults:
      run:
        shell: bash -l {0}
    steps:
      - uses: actions/checkout@v4

      - uses: mamba-org/setup-micromamba@v1
        with:
          environment-file: environment.yml

      - name: Ensure asf_tools v${{ needs.call-version-info-workflow.outputs.version }}} is pip installable
        run: |
          python -m pip install asf_tools==${{ needs.call-version-info-workflow.outputs.version_tag }}
2 changes: 1 addition & 1 deletion .github/workflows/static-analysis.yml
@@ -8,7 +8,7 @@ jobs:
    steps:
      - uses: actions/checkout@v4

      - uses: actions/setup-python@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.10"

5 changes: 5 additions & 0 deletions CHANGELOG.md
@@ -6,6 +6,11 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [PEP 440](https://www.python.org/dev/peps/pep-0440/)
and uses [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

## [0.7.0]

## Added
* Scripts and entrypoints for generating our global watermasking dataset added to `watermasking`.


## [0.6.0]

3 changes: 3 additions & 0 deletions environment.yml
@@ -23,7 +23,10 @@ dependencies:
  - boto3
  - fiona
  - gdal>=3.7
  - geopandas
  - numpy
  - osmium-tool
  - pyogrio
  - pysheds>=0.3
  - rasterio
  - scikit-fuzzy
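
The new `geopandas`, `pyogrio`, and `osmium-tool` dependencies support the vector processing in the new `watermasking` scripts (pyogrio acts as a fast GDAL-backed I/O engine for geopandas). As a rough illustration only, not code from this change, loading a window of the OSM water polygons referenced in the watermasking README might look like this; the file path and bounding box are assumptions:

```python
# Hypothetical sketch: read a 5x5 degree window of the OSM-derived water polygons
# with geopandas, using the pyogrio engine added to the environment above.
import geopandas as gpd

water = gpd.read_file(
    'water_polygons.shp',               # assumed path; see src/asf_tools/watermasking/README.MD
    engine='pyogrio',
    bbox=(-155.0, 60.0, -150.0, 65.0),  # one 5x5 degree tile over Alaska
)
print(f'{len(water)} polygons, bounds: {water.total_bounds}')
```
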
8 changes: 6 additions & 2 deletions pyproject.toml
@@ -25,14 +25,15 @@ dependencies = [
    "astropy",
    "fiona",
    "gdal>=3.3",
    "geopandas",
    "numpy",
    "pyogrio",
    "pysheds>=0.3",
    "rasterio",
    "scikit-fuzzy",
    "scikit-image",
    "scipy",
    "shapely",
    "tqdm",
    "shapely"
]
dynamic = ["version"]

@@ -41,6 +42,9 @@ make_composite = "asf_tools.composite:main"
water_map = "asf_tools.hydrosar.water_map:main"
calculate_hand = "asf_tools.hydrosar.hand.calculate:main"
flood_map = "asf_tools.hydrosar.flood_map:main"
generate_osm_dataset = "asf_tools.watermasking.generate_osm_tiles:main"
generate_worldcover_dataset = "asf_tools.watermasking.generate_worldcover_tiles:main"
fill_missing_tiles = "asf_tools.watermasking.fill_missing_tiles:main"

[project.entry-points.hyp3]
water_map = "asf_tools.hydrosar.water_map:hyp3"
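
The three `watermasking` console scripts registered above each resolve to a module-level `main()` function. As a minimal, hypothetical post-install check (not part of this change), the standard-library metadata API can confirm the entry points are importable (Python 3.10+):

```python
# Hypothetical check that the new console scripts declared in [project.scripts]
# resolve to importable callables after `pip install asf_tools`.
from importlib.metadata import entry_points

EXPECTED = ('generate_osm_dataset', 'generate_worldcover_dataset', 'fill_missing_tiles')


def check_watermasking_scripts() -> None:
    installed = {ep.name: ep for ep in entry_points(group='console_scripts')}
    for name in EXPECTED:
        ep = installed.get(name)
        if ep is None:
            print(f'{name}: not found (is asf_tools installed?)')
        else:
            print(f'{name} -> {ep.value}: callable={callable(ep.load())}')


if __name__ == '__main__':
    check_watermasking_scripts()
```
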
18 changes: 18 additions & 0 deletions src/asf_tools/watermasking/README.MD
@@ -0,0 +1,18 @@
These scripts create a global (or regional) water mask dataset based on OpenStreetMap data, optionally augmented with ESA WorldCover data.

To replicate our OSM water mask dataset, follow these steps (a short sketch of the resulting tile grid follows the list):

1. Download the "Latest Weekly Planet PBF File" from https://planet.openstreetmap.org/.
2. Download the WGS84 water polygons shapefile from https://osmdata.openstreetmap.de/data/water-polygons.html.
3. Unzip the downloads; you should have something like `planet.osm.pbf` (or `planet.pbf`) and `water_polygons.shp`, along with the supporting files for `water_polygons.shp`.
4. Run ```generate_osm_dataset --planet-file-path [path-to-planet.pbf] --ocean-polygons-path [path-to-water-polygons.shp] --lat-begin -85 --lat-end 85 --lon-begin -180 --lon-end 180 --tile-width 5 --tile-height 5```
5. Run ```fill_missing_tiles --fill-value 0 --lat-begin -90 --lat-end -85 --lon-begin -180 --lon-end 180 --tile-width 5 --tile-height 5```
6. Run ```fill_missing_tiles --fill-value 1 --lat-begin 85 --lat-end 90 --lon-begin -180 --lon-end 180 --tile-width 5 --tile-height 5```
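
Together, the three commands tile the whole globe at 5x5 degrees: step 4 generates tiles from the planet file between 85S and 85N, while steps 5 and 6 cap the poles with constant tiles (we read fill value 0 as land over Antarctica and 1 as open ocean over the Arctic; that interpretation of the fill values is our assumption). A small sketch of the resulting tile grid, mirroring the half-open ranges the scripts use:

```python
# Sketch only: enumerate the 5x5 degree tile origins produced by steps 4-6 above.
def tile_origins(lat_begin, lat_end, lon_begin, lon_end, tile_width=5, tile_height=5):
    return [(lat, lon)
            for lat in range(lat_begin, lat_end, tile_height)
            for lon in range(lon_begin, lon_end, tile_width)]


osm_tiles = tile_origins(-85, 85, -180, 180)   # step 4: generated from the planet file
south_cap = tile_origins(-90, -85, -180, 180)  # step 5: filled with 0
north_cap = tile_origins(85, 90, -180, 180)    # step 6: filled with 1

print(len(osm_tiles), len(south_cap), len(north_cap))  # 2448 72 72
```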

For the WorldCover water mask dataset, follow these steps:

1. Download the portions of the dataset covering the areas you would like to include from https://worldcover2020.esa.int/downloader.
2. Extract the contents into a single folder; if you download multiple portions of the dataset, extract them all into the same folder.
3. Run ```generate_worldcover_dataset --worldcover-tiles-dir [path-to-worldcover-data] --lat-begin 55 --lat-end 80 --lon-begin -180 --lon-end 180 --tile-width 5 --tile-height 5```

Note that we use WorldCover data only over Alaska, Canada, and Russia for our dataset. A rough way to sanity-check the generated tiles is sketched below.
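
Assuming the GDAL Python bindings this package already requires, such a check might look like the following; the tile path is only a placeholder, since actual names come from `lat_lon_to_tile_string`:

```python
# Hypothetical sanity check: print the size and georeferencing of one output tile.
from osgeo import gdal

gdal.UseExceptions()


def describe_tile(path: str) -> None:
    ds = gdal.Open(path)
    origin_x, pixel_width, _, origin_y, _, pixel_height = ds.GetGeoTransform()
    print(f'{path}: {ds.RasterXSize} x {ds.RasterYSize} pixels')
    print(f'origin: ({origin_x}, {origin_y}), pixel size: ({pixel_width}, {pixel_height})')
    print(f'projection: {ds.GetProjection()[:60]}...')


# Placeholder path; substitute one of the COG tiles produced by the steps above.
describe_tile('tiles/cogs/example_tile.tif')
```
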
75 changes: 75 additions & 0 deletions src/asf_tools/watermasking/fill_missing_tiles.py
@@ -0,0 +1,75 @@
import argparse
import os
import subprocess

import numpy as np
from osgeo import gdal, osr

from asf_tools.watermasking.utils import lat_lon_to_tile_string


gdal.UseExceptions()


def main():

    parser = argparse.ArgumentParser(
        prog='fill_missing_tiles.py',
        description='Script for creating filled tifs in areas with missing tiles.'
    )

    parser.add_argument('--fill-value', help='The value to fill the data array with.', default=0)
    parser.add_argument('--lat-begin', help='The minimum latitude of the dataset in EPSG:4326.', default=-85)
    parser.add_argument('--lat-end', help='The maximum latitude of the dataset in EPSG:4326.', default=85)
    parser.add_argument('--lon-begin', help='The minimum longitude of the dataset in EPSG:4326.', default=-180)
    parser.add_argument('--lon-end', help='The maximum longitude of the dataset in EPSG:4326.', default=180)
    parser.add_argument('--tile-width', help='The desired width of the tile in degrees.', default=5)
    parser.add_argument('--tile-height', help='The desired height of the tile in degrees.', default=5)

    args = parser.parse_args()

    fill_value = int(args.fill_value)
    lat_begin = int(args.lat_begin)
    lat_end = int(args.lat_end)
    lon_begin = int(args.lon_begin)
    lon_end = int(args.lon_end)
    tile_width = int(args.tile_width)
    tile_height = int(args.tile_height)

    # One tile per (lat, lon) origin; the ranges are half-open, so lat_end and lon_end are exclusive.
    lat_range = range(lat_begin, lat_end, tile_height)
    lon_range = range(lon_begin, lon_end, tile_width)

    for lat in lat_range:
        for lon in lon_range:

            tile = lat_lon_to_tile_string(lat, lon, is_worldcover=False, postfix='')
            tile_tif = 'tiles/' + tile + '.tif'
            tile_cog = 'tiles/cogs/' + tile + '.tif'

            print(f'Processing: {tile}')

            # Each 5-degree tile spans 55500 pixels: 5 / 55500 is about 0.00009009009 degrees per pixel.
            xmin, ymin = lon, lat
            pixel_size_x = 0.00009009009
            pixel_size_y = 0.00009009009

            # All images in the dataset should be this size.
            data = np.empty((55500, 55500))
            data.fill(fill_value)

            # Write the filled array as a single-band byte GeoTIFF georeferenced in EPSG:4326.
            driver = gdal.GetDriverByName('GTiff')
            dst_ds = driver.Create(tile_tif, xsize=data.shape[0], ysize=data.shape[1], bands=1, eType=gdal.GDT_Byte)
            dst_ds.SetGeoTransform([xmin, pixel_size_x, 0, ymin, 0, pixel_size_y])
            srs = osr.SpatialReference()
            srs.ImportFromEPSG(4326)
            dst_ds.SetProjection(srs.ExportToWkt())
            dst_band = dst_ds.GetRasterBand(1)
            dst_band.WriteArray(data)
            del dst_ds

            # Convert the GeoTIFF to a Cloud-Optimized GeoTIFF and remove the intermediate file.
            command = f'gdal_translate -of COG -co NUM_THREADS=all_cpus {tile_tif} {tile_cog}'.split(' ')
            subprocess.run(command)
            os.remove(tile_tif)


if __name__ == '__main__':
    main()