
Exponent Sharing in LeNet

The HLS implementation of LeNet is taken from here. It is then modified to share exponents in a layerwise fashion. Every layer ultimately performs a General Matrix Multiplication (GEMM) between its input and weights. Here, the weights are stored as proposed in [1]. The implementation of an independent GEMM is shared here.
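To illustrate the idea, the following is a minimal C++ sketch of a layerwise exponent-sharing scheme: every weight in a layer keeps its own signed mantissa, while a single exponent is stored once for the whole layer, and the GEMM reconstructs each weight as mantissa × 2^shared_exp on the fly. All names (`SharedExpLayer`, `gemm`) and the exact storage layout are illustrative assumptions, not code from this repository or from [1].

```cpp
#include <cmath>
#include <cstdint>
#include <vector>

// Hypothetical storage for one layer: per-weight integer mantissas
// plus one exponent shared across the whole layer (layerwise sharing).
struct SharedExpLayer {
    int shared_exp;                 // single exponent for all weights
    std::vector<int8_t> mantissas;  // signed mantissas, row-major (rows x cols)
    int rows, cols;
};

// GEMM with on-the-fly weight reconstruction:
// out[i] = sum_j in[j] * (mantissas[i][j] * 2^shared_exp)
std::vector<float> gemm(const SharedExpLayer& L, const std::vector<float>& in) {
    float scale = std::ldexp(1.0f, L.shared_exp);  // 2^shared_exp
    std::vector<float> out(L.rows, 0.0f);
    for (int i = 0; i < L.rows; ++i)
        for (int j = 0; j < L.cols; ++j)
            out[i] += in[j] * static_cast<float>(L.mantissas[i * L.cols + j]) * scale;
    return out;
}
```

Because only the narrow mantissas are stored per weight, the layer's weight memory shrinks compared to storing full floating-point values, at the cost of one extra multiply by the shared scale inside the GEMM.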

References

[1] P. Kashikar, S. Sinha and A. K. Verma, "Exploiting Weight Statistics for Compressed Neural Network Implementation on Hardware," 2021 IEEE 3rd International Conference on Artificial Intelligence Circuits and Systems (AICAS), 2021, pp. 1-4, doi: 10.1109/AICAS51828.2021.9458581.
