Why Slayer BatchNorm layer has the parameters for only one channel though there are multiple channels for cuba.Conv block #328
Replies: 1 comment
It is resolved by running a forward pass of the model with dummy inputs before loading the weights. Closing the thread.
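The fix works because lazily-initialized normalization layers only materialize their per-channel parameters once they have seen an input, so a state dict with 16-channel buffers can only be loaded after a dummy forward pass. A minimal sketch of the same pattern using plain PyTorch's `nn.LazyBatchNorm2d` (an analogy only; this is not lava-dl's actual `WgtScaleBatchNorm` implementation):

```python
import torch
import torch.nn as nn

# Channel count is not fixed at construction time.
bn = nn.LazyBatchNorm2d()

# A dummy forward pass with the real input shape materializes the buffers.
dummy = torch.zeros(1, 16, 8, 8)  # batch=1, 16 channels, 8x8 spatial
bn(dummy)

# Now there is one running_mean / running_var entry per channel.
assert bn.running_mean.shape == (16,)
assert bn.running_var.shape == (16,)

# Only at this point can weights exported from another framework be loaded.
src = nn.BatchNorm2d(16)
bn.load_state_dict(src.state_dict())
```

Attempting `load_state_dict` before the dummy forward fails, because the uninitialized layer's buffers do not yet match the 16-channel shapes in the checkpoint.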
Hi.
In the following code I set the batch norm as a neuron parameter and use it in a cuba.Conv block, expecting there to be 16 sets of batch-norm parameters so that each channel has its own. But there seems to be only one running_mean and running_var for the whole set of channels. Because of this I am having trouble mapping spiking models trained in other frameworks (such as SpikingJelly) to SLAYER. Is there a way to set the number of channels for SLAYER's batch-norm layer?
```python
neuron_kwargs = {
    'threshold'     : 1.0,
    'current_decay' : 0.2,
    'voltage_decay' : 0.03,
    'tau_grad'      : 0.03,
    'scale_grad'    : 3,
    'requires_grad' : False,
    'norm'          : slayer.neuron.norm.WgtScaleBatchNorm,
}
slayer.block.cuba.Conv(neuron_kwargs, 1, 16, kernel_size=3, stride=1, padding=1),
```
Thanks and Rgds,
Udayanga