RepQ-ViT: Scale Reparameterization for Post-Training Quantization of Vision Transformers

This repository contains the official PyTorch implementation of the ICCV 2023 paper "RepQ-ViT: Scale Reparameterization for Post-Training Quantization of Vision Transformers". RepQ-ViT decouples the quantization and inference processes and applies scale reparameterization to address the extreme activation distributions in vision transformers, namely:

  • Post-LayerNorm activations with severe inter-channel variation (see the reparameterization sketch after this list).

  • Post-Softmax activations with power-law distributions (see the log2 quantizer sketch after this list).
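
The post-LayerNorm issue is handled by calibrating with channel-wise quantization parameters and then reparameterizing them into layer-wise ones, folding the per-channel variation into the LayerNorm affine factors and the next layer's weights. The snippet below is a minimal sketch of that idea, assuming the channel means as the layer-wise targets; reparameterize_layernorm is a hypothetical helper for illustration, not part of this repository's API:

    import torch

    def reparameterize_layernorm(ln, next_linear, s, z):
        """Fold channel-wise quantization params (s, z) of a post-LayerNorm
        activation into layer-wise ones by adjusting the LayerNorm affine
        factors and the next linear layer's weights and bias.

        ln          : torch.nn.LayerNorm producing the activation X
        next_linear : torch.nn.Linear consuming X
        s, z        : per-channel scale and zero-point, float tensors of shape [C]
        """
        # Layer-wise targets; using the channel means is an assumption of this
        # sketch, the paper's exact choice may differ.
        s_tilde = s.mean()
        z_tilde = z.mean()

        # Variation factors between channel-wise and layer-wise parameters.
        r1 = s / s_tilde          # shape [C]
        r2 = z_tilde - z          # shape [C]

        with torch.no_grad():
            # Rescale the LayerNorm affine so it directly emits the
            # reparameterized activation X~ = (X + r2 * s) / r1.
            ln.bias.copy_((ln.bias + r2 * s) / r1)
            ln.weight.copy_(ln.weight / r1)

            # Compensate in the next linear layer so its output is unchanged:
            # X = X~ * r1 - r2 * s, hence b~ = b - W @ (r2 * s) and
            # W~ = W * r1 (scaled per input channel). Update the bias first,
            # since the correction uses the original weights.
            next_linear.bias.sub_(next_linear.weight @ (r2 * s))
            next_linear.weight.mul_(r1)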

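For the post-Softmax activations, calibration uses a log√2 quantizer, which is then reparameterized into a bit-shift-friendly log2 quantizer for inference. Below is a minimal sketch of the two quantizers only (function names, the eps clamp, and the clipping range are assumptions, not the repository's code); the scale reparameterization that converts the calibrated log√2 parameters into log2 ones is described in the paper:

    import torch

    def log_sqrt2_quant(x, n_bits=4, eps=1e-8):
        # Calibration-time quantizer for post-Softmax activations: levels are
        # powers of sqrt(2), giving finer resolution for the power-law values.
        q = torch.round(-2.0 * torch.log2(x.clamp(min=eps)))   # -log_sqrt(2)(x)
        q = q.clamp(0, 2 ** n_bits - 1)
        return 2.0 ** (-q / 2.0)                                # de-quantized value

    def log2_quant(x, n_bits=4, eps=1e-8):
        # Hardware-friendly inference-time quantizer: levels are powers of 2,
        # so de-quantization reduces to bit shifts.
        q = torch.round(-torch.log2(x.clamp(min=eps)))
        q = q.clamp(0, 2 ** n_bits - 1)
        return 2.0 ** (-q)

    # e.g. attn = scores.softmax(dim=-1); attn_q = log_sqrt2_quant(attn, n_bits=4)
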
Installation

  • timm version 0.4.12 is recommended.

  • To get the code and develop locally:

    git clone https://github.com/zkkli/RepQ-ViT.git
    cd RepQ-ViT

Quantization

Please see the classification README for instructions to reproduce the classification results on ImageNet, and the detection README for instructions to reproduce the detection results on COCO.

Citation

If you find this implementation useful for your work, please cite the following paper:

@inproceedings{li2023repq,
  title={{RepQ-ViT}: Scale Reparameterization for Post-Training Quantization of Vision Transformers},
  author={Li, Zhikai and Xiao, Junrui and Yang, Lianwei and Gu, Qingyi},
  booktitle={Proceedings of the IEEE/CVF International Conference on Computer Vision},
  pages={17227--17236},
  year={2023}
}
