Remote Sensing
A compiled list of remote sensing resources.
lidR: an R package for airborne LiDAR data manipulation and visualization for forestry applications.
A set of efficient tools for working with point cloud data. The group has a licence available to group members on request.
PDAL is a C++ BSD-licensed library for translating and manipulating point cloud data. It is very much like the GDAL library, which handles raster and vector data.
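As a taste of how PDAL is driven, a minimal pipeline sketch is shown below: read a LAS file, keep only ground-classified returns, and write the result. The filenames are hypothetical; the pipeline would be run with `pdal pipeline pipeline.json` once PDAL is installed.

```python
import json

# Minimal PDAL pipeline sketch (filenames are placeholders).
# Stages run top to bottom; the reader is inferred from the .las extension.
pipeline = {
    "pipeline": [
        "input.las",                                                  # reader (inferred)
        {"type": "filters.range", "limits": "Classification[2:2]"},   # keep ground returns only
        {"type": "writers.las", "filename": "ground.las"},            # write filtered cloud
    ]
}

# Serialise the pipeline so the `pdal` CLI can execute it.
with open("pipeline.json", "w") as f:
    json.dump(pipeline, f, indent=2)
```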
Global Ecosystem Dynamics Investigation (GEDI) is a NASA mission to measure how deforestation has contributed to atmospheric CO₂ concentrations. A full-waveform LiDAR was attached to the International Space Station to provide the first global, high-resolution observations of forest vertical structure.
Python tutorials: https://git.earthdata.nasa.gov/projects/LPDUR/repos/gedi-tutorials/browse
R Package rGEDI: https://github.com/carlos-alberto-silva/rGEDI
To drastically reduce the amount of data that needs to be downloaded (through server-side spatial filtering), follow these steps.
- Using the GEDI Finder API, generate a list of the files to download.
- Go to https://search.earthdata.nasa.gov/search and type "GEDI" in the search bar.
- Pick the "GEDI L2A Elevation and Height Metrics Data Global Footprint Level V001" option.
- Copy in the comma-separated list of files to download (the list created in steps 0A-0B, where the gedifinder API was used to work out which files correspond to our pixels). If that optional step is skipped, the gedifinder API can instead be queried with the UK's geographical bounding-box coordinates; the resulting file list is only slightly larger (around 15-20%), with the benefit that no time is spent on the extra step.
- Click "Download files" and select "Customize". Then insert the UK's bounding-box coordinates, so that only the points we need are selected from each individual file.
- Wait for the orders to complete and check email for the link to the data.
- Click on the HTML link and download the zip files one by one.
- Place all zip files in suitably named directories and run the code from the Jupyter notebook.
- Follow the instructions in the code comments and run the code. (The code may have to be rerun a few times as per the instructions, with slight modifications each time: once per order. A loop could perhaps be worked out to do this, but so far that has been hard.)
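The gedifinder query in the first step might be sketched as below. The endpoint URL and parameter names are assumptions based on LP DAAC's GEDI Finder tutorials, and the UK bounding-box values are illustrative only; check that the service is still live before relying on it (NASA has at times superseded it with CMR/Earthdata search).

```python
import json
import urllib.parse
import urllib.request

# Endpoint as documented in LP DAAC's GEDI Finder tutorials (assumption:
# verify it is still operational before use).
GEDI_FINDER = "https://lpdaacsvc.cr.usgs.gov/services/gedifinder"

def build_gedifinder_url(product, version, bbox):
    """Build a gedifinder query URL. bbox = (ul_lat, ul_lon, lr_lat, lr_lon)."""
    params = {
        "product": product,
        "version": version,
        "bbox": "[{},{},{},{}]".format(*bbox),
        "output": "json",
    }
    return GEDI_FINDER + "?" + urllib.parse.urlencode(params)

# Rough UK bounding box (hypothetical values, for illustration only).
uk_bbox = (61.0, -8.0, 49.9, 2.0)
url = build_gedifinder_url("GEDI02_A", "001", uk_bbox)

def fetch_granule_urls(query_url):
    """Fetch the list of matching granule URLs (network call)."""
    with urllib.request.urlopen(query_url) as resp:
        return json.load(resp).get("data", [])

# granules = fetch_granule_urls(url)  # uncomment to actually query the service
```

The returned list can then be joined with commas and pasted into the Earthdata Search file-list box as described above.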
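The last two steps (one directory per order, one notebook run per order) could be partly automated with a small standard-library loop; the directory layout here is hypothetical:

```python
import zipfile
from pathlib import Path

def extract_orders(download_dir, out_root):
    """Unpack each downloaded order zip into its own directory,
    yielding the directories so the notebook code can be run
    once per order inside a loop."""
    download_dir, out_root = Path(download_dir), Path(out_root)
    for zip_path in sorted(download_dir.glob("*.zip")):
        target = out_root / zip_path.stem  # one directory per order
        target.mkdir(parents=True, exist_ok=True)
        with zipfile.ZipFile(zip_path) as zf:
            zf.extractall(target)
        yield target

# Usage sketch:
# for order_dir in extract_orders("downloads", "orders"):
#     process_order(order_dir)  # hypothetical per-order notebook logic
```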