
imagenet pretrained weights with nopad in conv1x1 layers

Pre-release
@Coderx7 Coderx7 released this 07 Apr 09:47
· 11 commits to master since this release

These weights were fine-tuned from the previously trained weights, with one change: the last two conv1x1 layers now have no padding, which allows for faster inference. I have also been working on improving accuracy; these are the models so far.
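To illustrate the padding change, here is a minimal PyTorch sketch (layer sizes are placeholders, not the channel counts of the released network): a 1x1 convolution with padding enlarges the feature map with border pixels that every later layer must then process, while the no-padding version keeps the spatial size unchanged.

```python
import torch
import torch.nn as nn

# Hypothetical channel counts, for illustration only.
padded = nn.Conv2d(256, 512, kernel_size=1, padding=1)
nopad = nn.Conv2d(256, 512, kernel_size=1, padding=0)

x = torch.randn(1, 256, 7, 7)
print(padded(x).shape)  # torch.Size([1, 512, 9, 9]) -- padding grows the map
print(nopad(x).shape)   # torch.Size([1, 512, 7, 7]) -- spatial size preserved
```

The extra 9x9 map from the padded variant is pure overhead for a 1x1 kernel, which is why dropping the padding speeds up inference.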
The sample quantized weights are statically quantized; to reach the same accuracy as the full-precision weights, they must be fine-tuned with QAT (quantization-aware training).
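The static-quantization vs. QAT distinction can be sketched with PyTorch eager-mode quantization. The model below is a toy placeholder, not the released network; it only shows the two workflows.

```python
import torch
import torch.nn as nn

class TinyNet(nn.Module):
    """Toy stand-in for the real model, with quant/dequant stubs."""
    def __init__(self):
        super().__init__()
        self.quant = torch.ao.quantization.QuantStub()
        self.conv = nn.Conv2d(3, 8, kernel_size=1, padding=0)
        self.relu = nn.ReLU()
        self.dequant = torch.ao.quantization.DeQuantStub()

    def forward(self, x):
        x = self.quant(x)
        x = self.relu(self.conv(x))
        return self.dequant(x)

# Post-training static quantization: calibrate with a few batches, convert.
m = TinyNet().eval()
m.qconfig = torch.ao.quantization.get_default_qconfig("fbgemm")
torch.ao.quantization.prepare(m, inplace=True)
m(torch.randn(1, 3, 7, 7))  # calibration pass to collect activation stats
torch.ao.quantization.convert(m, inplace=True)

# QAT instead trains with fake-quantization so the weights can adapt,
# which is what recovers the full-precision accuracy:
qat = TinyNet().train()
qat.qconfig = torch.ao.quantization.get_default_qat_qconfig("fbgemm")
torch.ao.quantization.prepare_qat(qat, inplace=True)
# ... run a normal training loop here, then convert as above ...
```

Static quantization only observes activations during calibration, so accuracy can drop; QAT simulates quantization during training, letting the weights compensate.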