Joint-task Self-supervised Learning for Temporal Correspondence
Xueting Li*, Sifei Liu*, Shalini De Mello, Xiaolong Wang, Jan Kautz, Ming-Hsuan Yang.
(* equal contributions)
In Neural Information Processing Systems (NeurIPS), 2019.
If you use our code in your research, please cite it using the following BibTeX:
@inproceedings{uvc_2019,
  Author = {Xueting Li and Sifei Liu and Shalini De Mello and Xiaolong Wang and Jan Kautz and Ming-Hsuan Yang},
  Title = {Joint-task Self-supervised Learning for Temporal Correspondence},
  Booktitle = {NeurIPS},
  Year = {2019},
}
Results on the DAVIS 2017 validation set:

Method | J_mean | J_recall | J_decay | F_mean | F_recall | F_decay |
---|---|---|---|---|---|---|
Ours | 0.563 | 0.650 | 0.289 | 0.592 | 0.641 | 0.354 |
Ours (with tracking) | 0.577 | 0.683 | 0.263 | 0.613 | 0.698 | 0.324 |
The code is tested in the following environment:
- Ubuntu 16.04
- PyTorch 1.1.0, tqdm, scipy 1.2.1
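If you need to install the dependencies from scratch, one possible way is via pip (a minimal sketch; the torchvision version is an assumption paired with PyTorch 1.1.0 and is not listed above):

pip install torch==1.1.0 torchvision==0.3.0 tqdm scipy==1.2.1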
To test on DAVIS2017 for instance segmentation mask propagation, please run:
python test.py -d /workspace/DAVIS/ -s 480
Important parameters:
- `-c`: checkpoint path.
- `-o`: results path.
- `-d`: DAVIS 2017 dataset path.
- `-s`: test resolution; all results in the paper are tested on 480p images, i.e. `-s 480`.

Please check the `test.py` file for other parameters.
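For example, to point the script at a specific checkpoint and results directory (the output path here is only illustrative; the checkpoint filename follows the one used in the VIP command below), one could run:

python test.py -d /workspace/DAVIS/ -s 480 -c weights/checkpoint_latest.pth.tar -o results/DAVIS/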
To test on DAVIS2017 by tracking & propagation, please run:
python test_with_track.py -d /workspace/DAVIS/ -s 480
The parameters are similar to those of `test.py`; please see `test_with_track.py` for details.
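Assuming `test_with_track.py` accepts the same `-c` and `-o` flags as `test.py` (an assumption; please verify in the file), a full invocation might look like:

python test_with_track.py -d /workspace/DAVIS/ -s 480 -c weights/checkpoint_latest.pth.tar -o results/DAVIS_track/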
To test on VIP, please run the following command with your own VIP path:
python test_mask_vip.py -o results/VIP/category/ --scale_size 560 560 --pre_num 1 -d /DATA/VIP/VIP_Fine/Images/ --val_txt /DATA/VIP/VIP_Fine/lists/val_videos.txt -c weights/checkpoint_latest.pth.tar
and then:
python eval_vip.py -g /DATA/VIP/VIP_Fine/Annotations/Category_ids/ -p results/VIP/category/
To train the model, please check out this branch; the training code is borrowed from TimeCycle.
We use the Kinetics dataset for training. To start training, run:
python track_match_v1.py --wepoch 10 --nepoch 30 -c match_track_switch --batchsize 40 --coord_switch 0 --lc 0.3
- This code is based on TPN and TimeCycle.
- For any issues, please contact [email protected] or [email protected].