ImageNet pretrained weights with no padding in conv1x1 layers
Pre-release
These weights were finetuned from the previously trained weights, with one change: the last two conv1x1 layers no longer use padding. Removing the padding makes inference faster, and I have been working on recovering the accuracy; the models below are the results so far.
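To make the change concrete, here is a small PyTorch sketch (channel and spatial sizes are made up for illustration, not taken from these models) showing how padding on a 1x1 conv inflates the output and adds border work that a 1x1 kernel does not need:

```python
import torch
import torch.nn as nn

x = torch.randn(1, 64, 56, 56)

# 1x1 conv WITH padding: the output grows to 58x58, so the layer
# computes extra border pixels that carry no information from x.
padded = nn.Conv2d(64, 128, kernel_size=1, padding=1)
print(padded(x).shape)  # torch.Size([1, 128, 58, 58])

# 1x1 conv with NO padding: the output stays 56x56 and the wasted
# border computation disappears, which is where the speedup comes from.
nopad = nn.Conv2d(64, 128, kernel_size=1, padding=0)
print(nopad(x).shape)   # torch.Size([1, 128, 56, 56])
```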
The sample quantized weights are statically quantized; to reach the same accuracy as the full-precision weights, they must be finetuned with quantization-aware training (QAT).
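For reference, a minimal eager-mode QAT loop in PyTorch might look like the sketch below. The model, data, and hyperparameters are placeholders, not the actual setup used to train these weights; in practice you would load the released checkpoint instead of the toy model.

```python
import torch
import torch.nn as nn
import torch.ao.quantization as tq

# Placeholder float model; load the released checkpoint here instead.
model = nn.Sequential(
    tq.QuantStub(),                               # tensors enter the quantized region
    nn.Conv2d(3, 16, kernel_size=1, padding=0),   # no-pad conv1x1, as in this release
    nn.ReLU(),
    tq.DeQuantStub(),                             # tensors leave the quantized region
)

model.train()
model.qconfig = tq.get_default_qat_qconfig("fbgemm")  # x86 backend config
tq.prepare_qat(model, inplace=True)                   # insert fake-quant observers

# Short finetuning loop on dummy data; swap in the real dataset,
# loss, and optimizer settings.
optimizer = torch.optim.SGD(model.parameters(), lr=1e-4)
criterion = nn.MSELoss()
for _ in range(10):
    x = torch.randn(8, 3, 32, 32)
    target = torch.randn(8, 16, 32, 32)
    optimizer.zero_grad()
    loss = criterion(model(x), target)
    loss.backward()
    optimizer.step()

# Convert the fake-quantized model to a real int8 model for inference.
model.eval()
quantized_model = tq.convert(model)
```

Finetuning with the fake-quant observers in place is what lets the weights adapt to quantization error, which static post-training quantization alone cannot do.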