Training Script for TTVOS

This is the code implementation of the TinyML 2021 Research Symposium paper: TTVOS: Lightweight Video Object Segmentation with Adaptive Template Attention Module and Temporal Consistency Loss.

We further improve accuracy by using a better attention clue. A paper describing the new method will be uploaded soon.

Sample results (ours): a hard sequence and an easy sequence.

Dependencies:

python (>= 3.6)
numpy
pytorch (>= 1.8 )
torchvision
pillow
tqdm
imgaug
opencv
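
A quick way to confirm the environment is set up (a minimal sketch, assuming only the version floors listed above; the imports fail fast if a package is missing):

import numpy, PIL, tqdm, imgaug, cv2   # pillow imports as PIL, opencv as cv2
import torch, torchvision

# the version floor (>= 1.8) comes from the dependency list above
assert tuple(map(int, torch.__version__.split('.')[:2])) >= (1, 8), torch.__version__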

Datasets utilized:

DAVIS: download here

/path/DAVIS
|-- Annotations/
|-- ImageSets/
|-- JPEGImages/

YouTubeVOS: download here

/path/ytvos2018
|-- train/
|-- train_all_frames/
|-- valid/
`-- valid_all_frames/

Saliency dataset: download here

/path/Saliency
|-- ECSSD/
|-- HKU-IS/
|-- MSRA10K/
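
A small sanity check that the dataset roots match the layouts above (a sketch; the '/path/...' roots are placeholders to replace with your local paths):

import os

# placeholder roots mirroring the trees above; replace with your own paths
EXPECTED = {
    '/path/DAVIS': ['Annotations', 'ImageSets', 'JPEGImages'],
    '/path/ytvos2018': ['train', 'train_all_frames', 'valid', 'valid_all_frames'],
    '/path/Saliency': ['ECSSD', 'HKU-IS', 'MSRA10K'],
}

for root, subdirs in EXPECTED.items():
    missing = [d for d in subdirs if not os.path.isdir(os.path.join(root, d))]
    if missing:
        print(f'{root} is missing: {missing}')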

Release

DAVIS

Backbone      FLOPs (G)   Params (M)   J&F 2016   J&F 2017   FPS    Link
HRNet         10.61       1.61         81.1       62.1       78.3   Google Drive
RN18          55.23       12.5         82.2       66.3       54.5   Google Drive
RN50          83.86       14.8         83.1       69.5       37.7   Google Drive
MobileNetV3   7.56        3.66         79.8       62.9       74.6   Google Drive

Run

Train

  1. Download the pre-trained HRNet weights here and rename the file to 'hrnet_w18_small_model_v1.pth'.
  2. Put the weights at the path ['./nnWeight'] (see the sketch after this list).
  3. Run train.py.
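
Steps 1 and 2 as a script (a sketch; 'downloaded_hrnet.pth' is a hypothetical name for whatever the downloaded file is actually called):

import os, shutil

os.makedirs('./nnWeight', exist_ok=True)
# 'downloaded_hrnet.pth' stands in for the file obtained in step 1
shutil.move('downloaded_hrnet.pth', './nnWeight/hrnet_w18_small_model_v1.pth')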

Test

  1. Put the best checkpoint's directory [save_dir] and file name [pth] in the config settings.
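
An illustrative config sketch (the values are hypothetical; only the field names save_dir and pth come from the step above):

# hypothetical values for the two config fields named above
save_dir = './nnWeight'        # directory holding the best checkpoint
pth = 'TTVOS_best.pth'         # checkpoint file name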

Acknowledgement

This codebase borrows code and structure from the official A-GAME repository. We are grateful to Facebook Inc. for valuable discussions.

Reference

This codebase is built on the following work:

@article{park2020ttvos,
  title={TTVOS: Lightweight Video Object Segmentation with Adaptive Template Attention Module and Temporal Consistency Loss},
  author={Park, Hyojin and Venkatesh, Ganesh and Kwak, Nojun},
  journal={arXiv preprint arXiv:2011.04445},
  year={2020}
}
