ZhAnGToNG1/transfer_learning_cspt

Consecutive Pre-training: A Knowledge Transfer Learning Strategy with Relevant Unlabeled Data for the Remote Sensing Domain

This repository contains the PyTorch implementation and pretrained models of CSPT.

Give it a star ⭐️ if this project helped you.

Pretrained models:

The pre-trained models, based on ViT-B, are released in the Model Zoo (code: dspt).
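To use one of the released ViT-B checkpoints, the weights typically need their keys cleaned up before they fit a bare backbone. The sketch below is an assumption, not part of this repository's documented API: it shows one common preprocessing step, stripping the `module.` prefix that `DistributedDataParallel` adds when a checkpoint is saved from a multi-GPU run. The helper name `strip_prefix` and the example keys are hypothetical.

```python
# Hypothetical helper (not from this repo): normalize checkpoint keys so
# they match a bare ViT-B backbone's state_dict.

def strip_prefix(state_dict, prefix="module."):
    """Remove a wrapper prefix (e.g. added by DistributedDataParallel)
    from every key that carries it; leave other keys unchanged."""
    return {
        (k[len(prefix):] if k.startswith(prefix) else k): v
        for k, v in state_dict.items()
    }

# Example keys as they might appear in a DDP-saved checkpoint.
ckpt = {
    "module.patch_embed.proj.weight": 0,
    "module.blocks.0.norm1.weight": 1,
}
backbone_ready = strip_prefix(ckpt)
# backbone_ready now uses bare keys such as "patch_embed.proj.weight".
```

With PyTorch installed, the cleaned dictionary would then be loaded via `model.load_state_dict(backbone_ready, strict=False)` after `torch.load(path, map_location="cpu")`; `strict=False` tolerates heads that differ between pre-training and the downstream task.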

Updates🌟 :

  • May 7, 2022: All pretrained models for the various remote sensing downstream tasks are released publicly.
  • August 1, 2022: Updated the pre-training and fine-tuning code.

Installation🚀:

Please refer to install.md for installation.

Getting Started🚀:

Please refer to get_started.md for the basic usage.

Acknowledgement

The code is built on the MAE, MMDetection, and BEiT repositories.

Citation

@article{zhang2022consecutive,
  title={Consecutive pre-training: A knowledge transfer learning strategy with relevant unlabeled data for remote sensing domain},
  author={Zhang, Tong and Gao, Peng and Dong, Hao and Zhuang, Yin and Wang, Guanqun and Zhang, Wei and Chen, He},
  journal={Remote Sensing},
  volume={14},
  number={22},
  pages={5675},
  year={2022},
  publisher={MDPI}
}
