I ran the experiment of Table 2 using this repo, but the final all-class accuracies on CIFAR10 and CIFAR100 are only 60.28 and 38.60 respectively, which are much lower than the reported results.
To reproduce the paper, I ran the following three scripts sequentially: "contrastive_train.sh", "extract_features.sh", and "k_means.sh".
Following the implementation details in the original paper, I used the ViT-B/16 backbone with DINO pre-trained weights (downloaded from this repo: https://github.com/facebookresearch/dino) and fine-tuned only the final transformer block.
The total number of clusters for k-means is set to the total number of classes.
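For reference, here is a minimal sketch of that clustering step, assuming scikit-learn's k-means; the random features are hypothetical placeholders standing in for the extracted ViT features:

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
features = rng.normal(size=(500, 768))  # placeholder for extracted ViT-B/16 features

# Number of clusters = total number of classes (10 for CIFAR10, 100 for CIFAR100)
num_classes = 10
preds = KMeans(n_clusters=num_classes, n_init=10, random_state=0).fit_predict(features)
print(preds.shape)  # → (500,) — one cluster assignment per sample
```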
The default settings of this repo seem to target Stanford Cars, so could you provide the detailed parameters for CIFAR10/CIFAR100?