ConMAE: Contour Guided MAE for Unsupervised Vehicle Re-Identification

This is a PyTorch/GPU implementation of the paper "ConMAE: Contour Guided MAE for Unsupervised Vehicle Re-Identification" (CCDC 2023).

This implementation is a secondary development built on Masked Autoencoders Are Scalable Vision Learners (MAE) and Cluster Contrast for Unsupervised Person Re-Identification; we thank the authors of both works.

When using this code, first pre-train with the code in "\ConMAE\upstream", then train the model obtained from that step for unsupervised vehicle re-identification with the code in "\ConMAE\downstream", as sketched below.
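
A minimal sketch of this two-stage workflow, assuming both scripts expose a standard command-line interface (the --help calls below are placeholders for the full argument lists given in the upstream and downstream sections):

    # Hypothetical workflow sketch; the directory and script names follow the text above,
    # and --help stands in for the full argument lists shown later in this README.
    cd ConMAE/upstream
    python main_pretrain.py --help   # stage 1: contour-guided MAE pre-training
    cd ../downstream
    python main.py --help            # stage 2: unsupervised vehicle re-identification training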

Prepare ImageNet Pre-trained Models for MAE

When training with the MAE backbone, you first need to download the ImageNet-pretrained ViT-Base model from this link.

upstream

To pre-train ViT-Base with main_pretrain.py, we use 2 GTX-2080TI GPUs and the following arguments:

    --height 224 --width 224 \
    --batch_size 64 \
    --model mae_vit_base_patch16 \
    --norm_pix_loss \
    --mask_ratio 0.75 \
    --epochs 200 \
    --blr 1.5e-4 --weight_decay 0.05 \
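
For reference, a minimal sketch of a full 2-GPU launch command, assuming main_pretrain.py keeps the distributed entry point of the original MAE repository; the torchrun launcher and the omission of dataset/output-path arguments are assumptions, so check main_pretrain.py for its exact interface:

    # Hypothetical launch sketch: torchrun with 2 processes is an assumption, and the
    # dataset/output-path arguments expected by main_pretrain.py still need to be appended.
    torchrun --nproc_per_node=2 main_pretrain.py \
        --height 224 --width 224 \
        --batch_size 64 \
        --model mae_vit_base_patch16 \
        --norm_pix_loss \
        --mask_ratio 0.75 \
        --epochs 200 \
        --blr 1.5e-4 --weight_decay 0.05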

downstream

To train the ViT obtained from the upstream stage with main.py, we use 2 GTX-2080TI GPUs and the following arguments:

    --height 224 --width 224 \
    --batch_size 64 \
    --model MAE_base \
    --epochs 60 \
    --iters 400 \
    --momentum 0.1 \
    --eps 0.7 \
    --num-instances 4 \
    --lr 1.5e-5 \
    --weight_decay 5e-4 \
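
Similarly, a minimal sketch of a downstream launch command, assuming a single-node run where main.py uses both GPUs via CUDA_VISIBLE_DEVICES; this environment-variable usage and the omission of dataset/pretrained-checkpoint arguments are assumptions, so check main.py for its exact interface:

    # Hypothetical launch sketch: CUDA_VISIBLE_DEVICES for the 2-GPU setup is an
    # assumption, and the dataset and upstream-checkpoint arguments expected by
    # main.py still need to be appended.
    CUDA_VISIBLE_DEVICES=0,1 python main.py \
        --height 224 --width 224 \
        --batch_size 64 \
        --model MAE_base \
        --epochs 60 \
        --iters 400 \
        --momentum 0.1 \
        --eps 0.7 \
        --num-instances 4 \
        --lr 1.5e-5 \
        --weight_decay 5e-4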

If you have any questions, please email "[email protected]".

If you are interested in this work and want to use the code, please cite our paper as:

@inproceedings{ConMAE-REID,
  author    = {Jing Yang and
               Jianwu Fang and
               Hongke Xu},
  title     = {ConMAE: Contour Guided {MAE} for Unsupervised Vehicle Re-Identification},
  booktitle = {The 35th Chinese Control and Decision Conference (CCDC)},
  year      = {2023},
}
