
Code repository for the paper Tree species classification from airborne hyperspectral and LiDAR data using 3D convolutional neural networks


mayrajeo/tree-detection-evo


Tree species classification from airborne hyperspectral and LiDAR data using 3D convolutional neural networks

Table of Contents

- [About](#about)
- [Getting started](#getting-started)
- [Installation](#installation)
- [Data](#data)
- [Workflow](#workflow)
- [Authors](#authors)

About

This is a code repository for our paper Tree species classification from airborne hyperspectral and LiDAR data using 3D convolutional neural networks

During the last two decades, forest monitoring and inventory systems have moved from field surveys to remote sensing-based methods. These methods tend to focus on economically significant components of forests, thus leaving out many factors vital for forest biodiversity, such as the occurrence of species with low economical but high ecological values. Airborne hyperspectral imagery has shown significant potential for tree species classification, but the most common analysis methods, such as random forest and support vector machines, require manual feature engineering in order to utilize both spatial and spectral features, whereas deep learning methods are able to extract these features from the raw data.

Our research focused on the classification of the major tree species Scots pine, Norway spruce and birch, together with an ecologically valuable keystone species, European aspen, which has a sparse and scattered occurrence in boreal forests. We compared the performance of three-dimensional convolutional neural networks (3D-CNNs) with the support vector machine, random forest, gradient boosting machine and artificial neural network in individual tree species classification from hyperspectral data with high spatial and spectral resolution. We collected hyperspectral and LiDAR data along with extensive ground reference data measurements of tree species from the 83 km² study area located in the southern boreal zone in Finland. A LiDAR-derived canopy height model was used to match ground reference data to aerial imagery. The best performing 3D-CNN, utilizing 4 m image patches, was able to achieve an F1-score of 0.91 for aspen, an overall F1-score of 0.86 and an overall accuracy of 87% while the lowest performing 3D-CNN utilizing 10 m image patches achieved an F1-score of 0.83 and an accuracy of 85%. In comparison, the support-vector machine achieved an F1-score of 0.82 and an accuracy of 82.4% and the artificial neural network achieved an F1-score of 0.82 and an accuracy of 81.7%. Compared to the reference models, 3D-CNNs were more efficient in distinguishing coniferous species from each other, with a concurrent high accuracy for aspen classification.
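As a quick reminder of how the reported metrics relate, below is a minimal plain-Python sketch of per-class F1 and overall accuracy. The macro-averaging used here is an assumption for illustration; the paper may aggregate per-class F1-scores differently.

```python
def f1_and_accuracy(y_true, y_pred, labels):
    """Compute macro-averaged F1 and overall accuracy for a multiclass task."""
    f1s = []
    for c in labels:
        # Per-class true positives, false positives, false negatives
        tp = sum(t == c and p == c for t, p in zip(y_true, y_pred))
        fp = sum(t != c and p == c for t, p in zip(y_true, y_pred))
        fn = sum(t == c and p != c for t, p in zip(y_true, y_pred))
        prec = tp / (tp + fp) if tp + fp else 0.0
        rec = tp / (tp + fn) if tp + fn else 0.0
        f1s.append(2 * prec * rec / (prec + rec) if prec + rec else 0.0)
    acc = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)
    return sum(f1s) / len(f1s), acc
```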

Deep neural networks, being black box models, hide the information about how they reach their decision. We used both occlusion and saliency maps to interpret our models. Finally, we used the best performing 3D-CNN to produce a wall-to-wall tree species map for the full study area that can later be used as a reference prediction in, for instance, tree species mapping from multispectral satellite images. The improved tree species classification demonstrated by our study can benefit both sustainable forestry and biodiversity conservation.
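Occlusion mapping follows a simple idea: hide part of the input and measure how much the predicted class score drops. The sketch below illustrates this under assumed interfaces; `predict` is any callable returning a scalar class score, and the actual models and patch sizes used in the paper are in the interpretation notebook.

```python
import numpy as np

def occlusion_map(image, predict, patch=4, fill=0.0):
    """Slide a square occlusion patch over the image and record how much
    the model's score drops when that region is hidden."""
    base = predict(image)
    h, w = image.shape[:2]
    heat = np.zeros((h // patch, w // patch))
    for i in range(0, h - patch + 1, patch):
        for j in range(0, w - patch + 1, patch):
            occluded = image.copy()
            occluded[i:i + patch, j:j + patch] = fill
            # Large drop => occluded region was important for the prediction
            heat[i // patch, j // patch] = base - predict(occluded)
    return heat
```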

Results

Main tree species aggregated to a 10 m grid. Each grid cell is labeled with the most common tree species based on the number of treetops. The smaller figure shows a 100 m × 100 m patch with a larger aspen stand.
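The grid aggregation described above can be sketched as a majority vote per cell. The `(x, y, species)` treetop format here is a hypothetical simplification of the actual per-tree prediction output.

```python
from collections import Counter, defaultdict

def majority_species_grid(treetops, cell=10.0):
    """Aggregate treetop points (x, y, species) to a grid, labeling each
    cell with its most common species."""
    cells = defaultdict(Counter)
    for x, y, species in treetops:
        # Bin each treetop into its grid cell by integer division
        cells[(int(x // cell), int(y // cell))][species] += 1
    return {c: counts.most_common(1)[0][0] for c, counts in cells.items()}
```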

Getting started

Project members can access a preinstalled conda environment by running `source conda_activate.sh`. This environment has fastai2 v0.0.17 and pytorch 1.3.0. The project should also work with the latest versions (at the time of writing, fastai2==0.0.25 and pytorch==1.6.0), but this hasn't been tested yet.

Do not use any CSC modules with this conda environment when running Python-based scripts and notebooks.

For R files, use `module load r-env`.

Installation

fastai2 was officially released and renamed to fastai on 21 August 2020. The work was done with a prerelease version, but it should work with the version available on pip and conda after renaming the imports by running:

```shell
shopt -s globstar
perl -pi -e 's/fastai2/fastai/g' **/*
```
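If perl is unavailable, the same rename can be done with a short Python script. Unlike the perl one-liner, this version skips files that cannot be decoded as text; `rename_fastai2` is a hypothetical helper name.

```python
from pathlib import Path

def rename_fastai2(root="."):
    """Replace 'fastai2' with 'fastai' in every text file under root."""
    for p in Path(root).rglob("*"):
        if not p.is_file():
            continue
        try:
            text = p.read_text()
        except (UnicodeDecodeError, OSError):
            continue  # skip binary or unreadable files
        if "fastai2" in text:
            p.write_text(text.replace("fastai2", "fastai"))
```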

Then run `conda env create -f environment.yml` and `pip install fastai`.

The steps below show how to replicate the development environment.

Run `conda env create -f environment.yml`, and then use an editable install of fastai2 and fastcore:

```shell
git clone https://github.com/fastai/fastcore
cd fastcore
pip install -e ".[dev]"

cd ..
git clone --recurse-submodules https://github.com/fastai/fastai2
cd fastai2
pip install -e ".[dev]"
```

NOTE: Since PyTorch version 1.6, the default save format has changed from pickle-based to zip-file-based. `torch.load` should, however, work with checkpoints saved by older versions.
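To check which format a given checkpoint file uses, it is enough to test for the zip magic bytes with the standard library alone (no torch required); `checkpoint_format` is a hypothetical helper name.

```python
import zipfile

def checkpoint_format(path):
    """Return 'zip' for the PyTorch >= 1.6 zip-based checkpoint format,
    'pickle' for the older pickle-based format."""
    return "zip" if zipfile.is_zipfile(path) else "pickle"
```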

Data

Unfortunately, the data is not (yet) publicly available. Further questions can be sent to the authors.

Workflow

All steps in our work are presented either in Jupyter Notebooks or in individual scripts.

Preprocessing

Preprocessing is described in notebook Data exploration and preprocessing.

Individual tree detection and matching field data to detected tree crowns

The process and its steps are described in the notebook Individual tree detection, segmentation and matching to field data.

Training and validation data generation

The process is described in the notebook Training and validation data generation.

Model training

Reference methods

The training and validation process is presented in the notebook Comparison methods.

3D CNNs

An example of the training process is presented in the notebook Tree species classification with fastai-v2.

Inference and interpretation

Inference results and comparisons between the different CNN models are presented in the notebook Full tile predictions.

Interpretation based on saliency and occlusion maps is presented in the notebook Saliences.

Authors

Janne Mäyrä (corresponding author), Sarita Keski-Saari, Sonja Kivinen, Topi Tanhuanpää, Pekka Hurskainen, Peter Kullberg, Laura Poikolainen, Arto Viinikka, Sakari Tuominen, Timo Kumpula, Petteri Vihervaara

