multimodallearning/deep_staple

DeepSTAPLE: Learning to predict multimodal registration quality for unsupervised domain adaptation

Estimating registration noise with semantic segmentation models.

keywords: domain adaptation, multi-atlas registration, label noise, consensus, curriculum learning

Main contribution

This code uses data parameters (https://github.com/apple/ml-data-parameters) to weight noisy atlas samples as a simple but effective extension of semantic segmentation models. During training, the data parameters (scalar values assigned to each instance of a registered label) estimate the label trustworthiness globally across all multi-atlas candidates of all images. We adapted the base method to the particular characteristics of semantic segmentation tasks.
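The weighting idea can be illustrated with a minimal sketch: one learnable scalar per noisy atlas label, passed through a sigmoid and multiplied into the per-sample loss. All names, shapes, and loss values below are illustrative assumptions, not the repository's actual implementation.

```python
import torch

# Hypothetical sketch: one learnable "data parameter" per registered
# atlas label; its sigmoid acts as a trustworthiness weight in the loss.
num_samples = 4
bare_weights = torch.nn.Parameter(torch.zeros(num_samples))  # one per atlas label

per_sample_loss = torch.tensor([0.9, 0.2, 1.5, 0.4])  # assumed per-sample losses
weights = torch.sigmoid(bare_weights)                 # trustworthiness estimates
weighted_loss = (per_sample_loss * weights).sum()

# Optimizing bare_weights jointly with the network drives the weights of
# low-quality registrations toward zero.
weighted_loss.backward()
```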

Noisy labels

Setup

Install Poetry from https://python-poetry.org/. Change into the directory containing the pyproject.toml file and install a virtual environment with:

poetry lock
poetry install

If you do not want to use Poetry, a list of dependencies can be found in the pyproject.toml file. To use the logging capabilities, create an account on wandb.org

Dataset

The dataset used can be found at: https://wiki.cancerimagingarchive.net/pages/viewpage.action?pageId=70229053. The CrossMoDa challenge website, which used this dataset, can be found at: https://crossmoda-challenge.ml/

We rebuilt the CrossMoDa dataset following the instructions at: https://github.com/KCL-BMEIS/VS_Seg

During preprocessing, a Docker container is deployed which runs a Slicer.org script - make sure Docker is installed and you have sufficient permissions. Execute all cells in ./deep_staple/preprocessing/fetch_dataset.ipynb to fetch the dataset from TCIA and convert it to the file structure required by the dataloader.

Data artifacts

Pre-registered (noisy) labels for training can be downloaded with data_artifacts/download_artifacts.sh

Training

Either run main_deep_staple.py or use the notebook main_deep_staple.ipynb.

Settings can be changed inside the config_dict.

Label consensus creation

After network training, a .pth data file is written to ./data/output/<run_name>/train_label_snapshot.pth. Open ./deep_staple/postprocessing/consensus/consensus.ipynb to create consensus labels.

Mapping of paper contents and code

Drawing data parameters

weight = torch.sigmoid(bare_weight)

Data parameter loss

dp_loss = (dp_loss*weight).sum() + risk_regularization.sum()

Risk regularization

risk_regularization = -weight*p_pred_num/(dp_logits.shape[-3]*dp_logits.shape[-2]*dp_logits.shape[-1])

Fixed weighting

weight = weight/fixed_weighting[b_idxs_dataset]
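The snippets above can be assembled into a minimal end-to-end sketch of the loss computation. Tensor shapes, the per-sample loss, the definition of p_pred_num, and the fixed-weighting values are assumptions for illustration only.

```python
import torch

# Assumed shapes: batch of B label maps with spatial size D x H x W.
B, D, H, W = 2, 4, 8, 8
dp_logits = torch.randn(B, D, H, W, requires_grad=True)  # per-voxel logits
per_sample_loss = torch.rand(B)                          # assumed per-sample loss
bare_weight = torch.nn.Parameter(torch.zeros(B))
fixed_weighting = torch.tensor([1.0, 2.0])               # assumed per-sample scaling
b_idxs_dataset = torch.tensor([0, 1])                    # batch -> dataset indices

# Drawing data parameters, then applying the fixed weighting.
weight = torch.sigmoid(bare_weight)
weight = weight / fixed_weighting[b_idxs_dataset]

# Risk regularization: proportional to the predicted foreground mass,
# normalized by the spatial volume of the logit map.
p_pred_num = torch.sigmoid(dp_logits).sum(dim=(-3, -2, -1))
risk_regularization = -weight * p_pred_num / (
    dp_logits.shape[-3] * dp_logits.shape[-2] * dp_logits.shape[-1]
)

# Data parameter loss with risk regularization.
dp_loss = (per_sample_loss * weight).sum() + risk_regularization.sum()
dp_loss.backward()
```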

Out-of-line backpropagation process

if config.use_ool_dp_loss:

Consensus generation via weighted voting

def calc_dp_consensus(lbl_list, weighting_list):
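The function signature above comes from the consensus notebook; the body below is an illustrative assumption of weighted voting on binary label maps, not the repository's implementation: each atlas candidate votes with its learned data-parameter weight, and a voxel enters the consensus when the weighted vote exceeds half the total weight.

```python
import torch

def calc_dp_consensus(lbl_list, weighting_list):
    # Hypothetical weighted majority vote over binary label maps.
    weights = torch.tensor(weighting_list)
    stacked = torch.stack(lbl_list).float()               # (num_atlases, H, W)
    vote = (weights.view(-1, 1, 1) * stacked).sum(dim=0)  # weighted vote per voxel
    return (vote > weights.sum() / 2).long()              # majority by total weight

# Usage with three toy 2x2 atlas candidates and their data-parameter weights:
labels = [torch.tensor([[1, 0], [1, 1]]),
          torch.tensor([[1, 0], [0, 1]]),
          torch.tensor([[0, 0], [0, 1]])]
consensus = calc_dp_consensus(labels, [0.9, 0.5, 0.1])
```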

Citation

DeepSTAPLE: Learning to predict multimodal registration quality for unsupervised domain adaptation. By Christian Weihsbach, Alexander Bigalke, Christian N. Kruse, Hellena Hempe, Mattias P. Heinrich. WBIR 2022.

Contact:

For any problems or questions, please open an issue.
