Awesome-binary-networks

Currently, there are two main directions for improving the performance of binary networks.

  1. The first direction focuses on modifying or adjusting architectures to fit binary weights. Mainly, the goal is to increase model capacity by adding more parameters, though there are other tricks too. Since linear and convolutional layers account for most of the computation, research mainly focuses on binarizing weights and activations only for these operations, while the other parts remain real-valued. This approach reduces the number of floating-point operations but requires mixed-precision computation, so it is unclear whether such networks can be executed efficiently without dedicated software and hardware support.

  2. The second direction focuses on the optimization procedure. Since the weights are binarized, we need a way to backpropagate through the binarization function, usually the sign function.
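To make the second point concrete: the derivative of sign is zero almost everywhere, so training typically substitutes a surrogate gradient. A minimal NumPy sketch of the commonly used clipped straight-through estimator (illustrative only, not taken from any specific paper; the clipping threshold of 1 is the usual hard-tanh choice):

```python
import numpy as np

def binarize(x):
    # Forward pass: sign(x), mapping 0 to +1 so outputs are strictly +/-1.
    return np.where(x >= 0, 1.0, -1.0)

def binarize_grad_ste(x, grad_out):
    # Backward pass with the clipped straight-through estimator:
    # the true derivative of sign (zero almost everywhere) is replaced
    # by the derivative of hard tanh, i.e. 1 where |x| <= 1, else 0.
    return grad_out * (np.abs(x) <= 1.0)
```

The clipping keeps gradients from flowing through pre-activations that are already far into the saturated region, which in practice stabilizes training.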

Unsurprisingly, approaches based on architecture modification show marginally better results.
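To see why binarizing the linear and convolutional layers pays off at inference time, note that a dot product between two {-1, +1} vectors reduces to an XNOR followed by a popcount. A minimal bit-packed sketch in pure Python (the packing convention here is an assumption for illustration):

```python
def binary_dot(a_bits, b_bits, n):
    """Dot product of two length-n {-1, +1} vectors packed into ints.

    Bit i is 1 where element i is +1 and 0 where it is -1.
    """
    mask = (1 << n) - 1
    # XNOR marks the positions where the two vectors agree.
    agree = ~(a_bits ^ b_bits) & mask
    matches = bin(agree).count("1")  # popcount
    # Each agreement contributes +1, each disagreement -1.
    return 2 * matches - n
```

For example, a = (+1, -1, +1) packs to 0b101 and b = (+1, +1, -1) to 0b011; `binary_dot(0b101, 0b011, 3)` gives -1, matching the real-valued dot product. On hardware, 64 such multiply-accumulates collapse into one XNOR and one popcount instruction, which is the source of the claimed speedups in papers like XNOR-Net.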

Papers are split into two parts: introduction papers, which I recommend reading first to get familiar with the problem, and other, no less interesting, papers.

INTRODUCTION PAPERS

2016
Binarized Neural Networks: Training Deep Neural Networks with Weights and Activations Constrained to +1 or -1
https://arxiv.org/abs/1602.02830

2016
XNOR-Net: ImageNet Classification Using Binary Convolutional Neural Networks
https://arxiv.org/abs/1603.05279

2017
The High-Dimensional Geometry of Binary Neural Networks
https://arxiv.org/pdf/1705.07199.pdf

2020
Binary neural networks: A survey
https://www.sciencedirect.com/science/article/pii/S0031320320300856


KEEP READING ...

2017
Towards Accurate Binary Convolutional Neural Network
https://arxiv.org/abs/1711.11294

2018
Bi-Real Net: Enhancing the Performance of 1-bit CNNs With Improved Representational Capability and Advanced Training Algorithm
https://arxiv.org/abs/1808.00278

2018
Binary Ensemble Neural Network: More Bits per Network or More Networks per Bit?
https://arxiv.org/pdf/1806.07550.pdf

2019
Structured Binary Neural Networks for Image Recognition
https://arxiv.org/abs/1909.09934

2019
Circulant Binary Convolutional Networks: Enhancing the Performance of 1-bit DCNNs with Circulant Back Propagation
https://arxiv.org/pdf/1910.10853.pdf

2020
Widening and Squeezing: Towards Accurate and Efficient QNNs
https://arxiv.org/abs/2002.00555

2020
ReActNet: Towards Precise Binary Neural Network with Generalized Activation Functions
https://link.springer.com/chapter/10.1007/978-3-030-58568-6_9

2020
Training Binary Neural Networks using the Bayesian Learning Rule
https://arxiv.org/abs/2002.10778

2021
Multi-Prize Lottery Ticket Hypothesis: Finding Accurate Binary Neural Networks by Pruning A Randomly Weighted Network
https://arxiv.org/abs/2103.09377

2021
FracBNN: Accurate and FPGA-Efficient Binary Neural Networks with Fractional Activations
https://dl.acm.org/doi/abs/10.1145/3431920.3439296

2021
Finding Everything within Random Binary Networks
https://arxiv.org/abs/2110.08996

2022
Self-distribution binary neural networks
https://link.springer.com/article/10.1007/s10489-022-03348-z

2019
An Empirical Study of Binary Neural Networks' Optimisation
https://openreview.net/forum?id=rJfUCoR5KX

2022
Block Walsh-Hadamard Transform Based Binary Layers in Deep Neural Networks
https://arxiv.org/abs/2201.02711

2020
Binarizing MobileNet via Evolution-based Searching
https://arxiv.org/abs/2005.06305

2020
ShiftAddNet: A Hardware-Inspired Deep Network
https://proceedings.neurips.cc/paper/2020/hash/1cf44d7975e6c86cffa70cae95b5fbb2-Abstract.html

2020
Least squares binary quantization of neural networks
https://arxiv.org/abs/2001.02786

2020
BinaryDuo: Reducing Gradient Mismatch in Binary Activation Network by Coupling Binary Activations
https://arxiv.org/pdf/2002.06517.pdf

2019
Back to Simplicity: How to Train Accurate BNNs from Scratch?
https://arxiv.org/pdf/1906.08637.pdf
