A pure NumPy implementation of layers, learning algorithms, and objective functions for neural networks.
Layers:
- Linear
- Convolution (in-progress)
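With convolution still in progress, the Linear layer is the core building block. A minimal sketch of what a pure-NumPy fully connected layer typically looks like (class and method names here are illustrative assumptions, not necessarily this repository's API):

```python
import numpy as np

class Linear:
    """Fully connected layer: y = x @ W + b."""

    def __init__(self, in_features, out_features, rng=None):
        rng = rng or np.random.default_rng(0)
        # Scale initial weights by fan-in to keep activations well-conditioned
        self.W = rng.standard_normal((in_features, out_features)) / np.sqrt(in_features)
        self.b = np.zeros(out_features)

    def forward(self, x):
        self.x = x  # cache the input for the backward pass
        return x @ self.W + self.b

    def backward(self, grad_out):
        # Gradients w.r.t. parameters, then pass the gradient to the previous layer
        self.dW = self.x.T @ grad_out
        self.db = grad_out.sum(axis=0)
        return grad_out @ self.W.T
```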
Activation Functions:
- Sigmoid
- Softmax
- ReLU
- Tanh (in-progress)
- LeakyReLU
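The activations above have standard NumPy definitions. The sketch below shows generic versions (tanh is marked in-progress in the list; these are textbook formulas, not this repository's code):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def softmax(x):
    # Subtract the row max before exponentiating for numerical stability
    z = np.exp(x - x.max(axis=-1, keepdims=True))
    return z / z.sum(axis=-1, keepdims=True)

def relu(x):
    return np.maximum(0.0, x)

def tanh(x):
    return np.tanh(x)

def leaky_relu(x, alpha=0.01):
    # Small negative slope keeps gradients flowing for x < 0
    return np.where(x > 0, x, alpha * x)
```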
Learning Algorithms:
- Stochastic Mini-Batch Gradient Descent
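Stochastic mini-batch gradient descent shuffles the training set each epoch and updates parameters from the gradient of each small batch. A minimal NumPy sketch (function names are illustrative, not this repository's API), fitting y = 2x with a single weight:

```python
import numpy as np

def sgd_step(w, grad, lr=0.01):
    # In-place parameter update: w <- w - lr * grad
    w -= lr * grad
    return w

def iterate_minibatches(X, y, batch_size, rng):
    # Shuffle once per epoch, then yield successive mini-batches
    idx = rng.permutation(len(X))
    for start in range(0, len(X), batch_size):
        batch = idx[start:start + batch_size]
        yield X[batch], y[batch]

# Toy usage: linear regression toward the true weight 2.0
rng = np.random.default_rng(0)
X = rng.standard_normal((256, 1))
y = 2.0 * X
w = np.zeros((1, 1))
for epoch in range(20):
    for xb, yb in iterate_minibatches(X, y, batch_size=32, rng=rng):
        grad = 2.0 * xb.T @ (xb @ w - yb) / len(xb)  # gradient of MSE
        sgd_step(w, grad, lr=0.1)
```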
Objective Functions:
- Area Under the Curve (AUC)
- Mean Squared Error
- Categorical Cross Entropy
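The objectives and the AUC metric listed above can be sketched in a few lines of NumPy (generic definitions, not this repository's implementations; the rank-based AUC here breaks score ties arbitrarily rather than averaging them):

```python
import numpy as np

def mean_squared_error(y_true, y_pred):
    return np.mean((y_true - y_pred) ** 2)

def categorical_cross_entropy(y_true, y_pred, eps=1e-12):
    # y_true: one-hot targets; y_pred: predicted probabilities per class
    y_pred = np.clip(y_pred, eps, 1.0)
    return -np.mean(np.sum(y_true * np.log(y_pred), axis=-1))

def auc(y_true, scores):
    # Rank-based AUC: probability a random positive outranks a random negative
    order = np.argsort(scores)
    ranks = np.empty(len(scores))
    ranks[order] = np.arange(1, len(scores) + 1)
    n_pos = y_true.sum()
    n_neg = len(y_true) - n_pos
    return (ranks[y_true == 1].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)
```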