Train SOTA Architectures on benchmark data-sets of images with PyTorch

UmbertoTomasini/diffeo-sota

 
 


Dependencies (other than common ones):

The list of parameters includes:

  • Dataset (see below); ptr sets the train-set size.
  • Architecture (see below)
  • Optimizer (sgd, adam)
  • Learning rate and scheduler (cosineannealing, none)
  • Loss function (crossentropy for multi-class, hinge for one class)
  • Training regime: feature or lazy via the alpha-trick (set featlazy to 1 and vary alpha)
  • ...

Example:

    python main.py --epochs 200 --save_best_net 1 --save_dynamics 0 --diffeo 0 --batch_size 32 --net:str 'EfficientNetB0' --dataset:str 'cifar10' --seed_init 0 --ptr 1024
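The hinge option listed above trains a single output with a margin loss. A minimal sketch of its assumed form, mean(max(0, 1 - y * f(x))) with targets in {-1, +1} (the repository's exact implementation may differ):

```python
import torch


def hinge_loss(output: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
    """Binary hinge loss mean(max(0, 1 - y * f(x))), targets in {-1, +1}.
    Illustrative sketch of the 'hinge' option, not the repo's code."""
    return torch.relu(1.0 - target * output.squeeze(-1)).mean()
```

A correctly classified point beyond the margin (y * f(x) >= 1) contributes zero loss; points inside the margin or misclassified contribute linearly.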
Datasets:

  • mnist
  • fashionmnist
  • cifar10
  • svhn
  • tiny-imagenet

Model implementations are based on github.com/kuangliu/pytorch-cifar. The list includes:

  • Fully Connected
  • LeNet
  • AlexNet
  • VGG16
  • ResNet18
  • ResNet50
  • ResNet101
  • RegNetX_200MF
  • RegNetY_400MF
  • MobileNetV2
  • ResNeXt29(32x4d)
  • ResNeXt29(2x64d)
  • SimpleDLA
  • DenseNet121
  • PreActResNet18
  • DPN92
  • DLA
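Selecting an architecture by the --net string suggests a name-to-constructor registry. A sketch of that pattern (the `MODELS` dict and `build_net` helper are hypothetical; the real constructors come from the pytorch-cifar models, with a stand-in fully connected network shown here):

```python
import torch
import torch.nn as nn

# Hypothetical registry mapping the --net string to a constructor.
# Only a stand-in "Fully Connected" entry is shown; the repository's
# actual models come from github.com/kuangliu/pytorch-cifar.
MODELS = {
    "FC": lambda: nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 10)),
}


def build_net(name: str) -> nn.Module:
    """Instantiate the architecture named by the --net flag."""
    try:
        return MODELS[name]()
    except KeyError:
        raise ValueError(f"unknown architecture: {name!r}")
```

Each entry is a zero-argument constructor rather than a module instance, so every call to `build_net` returns a freshly initialized network.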
