
ZS-SSL: Zero-Shot Self-Supervised Learning

🚩 PyTorch: This is the PyTorch implementation of ZS-SSL from the ICLR paper Zero-Shot Self-Supervised Learning for MRI Reconstruction.

🚩 TensorFlow: For the original TensorFlow implementation, please visit (zs-ssl-tensorflow-implementation).

🚩 If you find our work helpful, please star this repo and cite our paper.

ZS-SSL Overview

ZS-SSL enables physics-guided deep learning MRI reconstruction using only a single slice/sample (paper). Succinctly, ZS-SSL partitions the available measurements from a single scan into three disjoint sets. Two of these sets are used to enforce data consistency and to define the loss during self-supervised training, while the third set is used for self-validation, establishing an early stopping criterion. In the presence of models pre-trained on a database with different image characteristics, ZS-SSL can be combined with transfer learning (TL) for faster convergence and reduced computational complexity.


An overview of the proposed zero-shot self-supervised learning approach. a) Acquired measurements for the single scan are partitioned into three sets: a training mask (Θ) and a loss mask (Λ) for self-supervision, and a self-validation mask (Γ) for automated early stopping. b) The parameters θ of the unrolled MRI reconstruction network are updated using Θ in the data-consistency (DC) units of the unrolled network and Λ for defining the loss. c) Concurrently, a k-space validation procedure establishes the stopping criterion by using Ω\Γ in the DC units and Γ to measure a validation loss. d) Once network training has been stopped due to an increasing trend in the k-space validation loss, the final reconstruction is performed using the relevant learned network parameters and all acquired measurements in the DC unit.
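
For intuition, a minimal NumPy sketch of the partitioning in a) is given below. This is not the repository's implementation; the function name, split ratios, and uniform-random selection are illustrative assumptions.

import numpy as np

def partition_mask(omega, rho_val=0.2, rho_loss=0.4, seed=0):
    # Illustrative split of the acquired mask Omega into a validation mask (Gamma),
    # a loss mask (Lambda) and a training/DC mask (Theta). Ratios are assumptions.
    rng = np.random.default_rng(seed)
    acquired = np.flatnonzero(omega)                  # flat indices of acquired k-space points
    rng.shuffle(acquired)

    n_val = int(rho_val * acquired.size)              # points held out for self-validation
    n_loss = int(rho_loss * (acquired.size - n_val))  # points held out for the training loss

    gamma = np.zeros_like(omega)
    gamma.flat[acquired[:n_val]] = 1                  # Gamma: k-space self-validation mask
    lam = np.zeros_like(omega)
    lam.flat[acquired[n_val:n_val + n_loss]] = 1      # Lambda: loss mask
    theta = omega - gamma - lam                       # Theta: data-consistency (training) mask
    return theta, lam, gamma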

Installation

Dependencies are given in environment.yml. A new conda environment can be created with

conda env create -f environment.yml

Datasets

We have used the fastMRI dataset in our experiments.

How to use

ZS-SSL training and reconstruction can be performed by running the zs_ssl_recon.ipynb notebook. Prior to running it, hyperparameters such as the number of unrolled blocks and the split ratios for the validation, training and loss masks can be adjusted in parser_ops.py (an illustrative example is sketched below). Once ZS-SSL training has been completed, zs_ssl_inference.ipynb can be used directly for reconstruction.
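
As an illustration only, the notebook hyperparameters could be overridden along the following lines. This sketch assumes parser_ops.py exposes a get_parser() helper; all flag names except --stop_training are hypothetical placeholders, so check parser_ops.py for the actual argument names and defaults.

import parser_ops

parser = parser_ops.get_parser()      # assumption: parser_ops builds and returns the argparse parser
args = parser.parse_args([
    '--nb_unroll_blocks', '10',       # hypothetical flag: number of unrolled blocks
    '--rho_val', '0.2',               # hypothetical flag: validation split ratio
    '--stop_training', '25',          # early-stopping patience (see Early Automated Stopping below)
])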

We highly recommend setting the outer k-space regions with no signal to 1 in the training mask to ensure consistency with the acquired measurements. Please refer to our SSDU repository for further details.
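
A minimal sketch of this recommendation, assuming a binary training mask and a coil-combined k-space array in NumPy; the near-zero threshold used to detect the no-signal regions is an illustrative assumption.

import numpy as np

def include_no_signal_regions(trn_mask, kspace, eps=1e-12):
    # Set k-space locations carrying (numerically) zero signal to 1 in the training mask,
    # so the DC step stays consistent with the acquired measurements. Threshold is illustrative.
    trn_mask = trn_mask.copy()
    trn_mask[np.abs(kspace) < eps] = 1
    return trn_mask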

Early Automated Stopping

In parser_ops.py, we have also defined a parameter (--stop_training) to automatically stop the training process. It denotes the number of consecutive epochs allowed without achieving a lower validation loss before training is stopped (to disable automated early stopping, set --stop_training to the total number of epochs).
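
Conceptually, --stop_training acts as a patience counter over the k-space validation loss. A minimal, self-contained sketch of that logic (not the notebook's actual code):

def should_stop(val_losses, patience):
    # Return True once the validation loss has not improved for `patience`
    # consecutive epochs, which is the role of --stop_training.
    best, since_best = float('inf'), 0
    for loss in val_losses:
        if loss < best:
            best, since_best = loss, 0
        else:
            since_best += 1
        if since_best >= patience:
            return True
    return False

# Loss improves, then fails to improve for 3 consecutive epochs -> stop with patience 3
print(should_stop([0.9, 0.7, 0.6, 0.61, 0.62, 0.63], patience=3))  # True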

Citation

If you find the code useful in your research, please cite:

@inproceedings{yaman2022zeroshot,
  title={Zero-Shot Self-Supervised Learning for {MRI} Reconstruction},
  author={Burhaneddin Yaman and Seyed Amir Hossein Hosseini and Mehmet Akcakaya},
  booktitle={International Conference on Learning Representations},
  year={2022},
  url={https://openreview.net/forum?id=085y6YPaYjP}
}

Copyright & License Notice

© 2021 Regents of the University of Minnesota

ZS-SSL is copyrighted by the Regents of the University of Minnesota and covered by US 17/075,411. The Regents of the University of Minnesota will license the use of ZS-SSL solely for educational and research purposes by non-profit institutions and US government agencies only. For other proposed uses, contact [email protected]. The software may not be sold or redistributed without prior approval. One may make copies of the software for their own use provided that the copies are not sold or distributed and are used under the same terms and conditions. As unestablished research software, this code is provided on an "as is" basis without warranty of any kind, either expressed or implied. Downloading or executing any part of this software constitutes an implicit agreement to these terms. These terms and conditions are subject to change at any time without prior notice.

Questions

If you have questions or issues, please open an issue or reach out to me at yaman013 at umn.edu.
