# PyNN

A pure NumPy implementation of layers, learning algorithms, and objective functions for neural networks.

## Layers and Activation Functions

Layers:

  1. Linear
  2. Convolution (in progress)
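As a sketch, a pure-NumPy linear layer pairs a forward pass `y = x @ W + b` with a backward pass that produces parameter and input gradients (the class name, initialization, and caching scheme below are illustrative, not necessarily this repository's actual API):

```python
import numpy as np

class Linear:
    """Fully connected layer: y = x @ W + b (illustrative sketch)."""

    def __init__(self, in_features, out_features, rng=None):
        rng = rng or np.random.default_rng(0)
        # Small random weights; bias starts at zero.
        self.W = rng.normal(0.0, 0.1, size=(in_features, out_features))
        self.b = np.zeros(out_features)

    def forward(self, x):
        # x: (batch, in_features) -> (batch, out_features)
        self.x = x  # cache the input for the backward pass
        return x @ self.W + self.b

    def backward(self, grad_out):
        # grad_out: gradient of the loss w.r.t. this layer's output.
        self.dW = self.x.T @ grad_out          # (in, out)
        self.db = grad_out.sum(axis=0)         # (out,)
        return grad_out @ self.W.T             # gradient w.r.t. the input
```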

Activation Functions:

  1. Sigmoid
  2. Softmax
  3. ReLU
  4. Tanh (in progress)
  5. LeakyReLU
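The activations above are all elementwise (softmax excepted, which normalizes over the class axis) and reduce to one-liners in NumPy. A minimal sketch, with the usual max-subtraction trick for a numerically stable softmax:

```python
import numpy as np

def sigmoid(x):
    # Squashes inputs into (0, 1).
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    # Zeroes out negative inputs.
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Like ReLU, but leaks a small slope for negative inputs.
    return np.where(x > 0, x, alpha * x)

def softmax(x):
    # Subtract the row max so exp() cannot overflow; result sums to 1 per row.
    z = x - x.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)
```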

## Learning Algorithms

  1. Stochastic Mini-Batch Gradient Descent
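Mini-batch SGD shuffles the data each epoch, slices it into batches, and takes a gradient step per batch. A hedged sketch of that loop (function and parameter names are illustrative, and `grad_fn` stands in for whatever computes the batch gradient):

```python
import numpy as np

def minibatch_sgd(X, y, w, grad_fn, lr=0.1, batch_size=32, epochs=10, rng=None):
    """Illustrative mini-batch SGD: shuffle, slice, step along the gradient."""
    rng = rng or np.random.default_rng(0)
    n = X.shape[0]
    for _ in range(epochs):
        order = rng.permutation(n)  # reshuffle every epoch
        for start in range(0, n, batch_size):
            idx = order[start:start + batch_size]
            w = w - lr * grad_fn(w, X[idx], y[idx])
    return w
```

For example, with the mean-squared-error gradient of a linear model, `grad_fn` would be `lambda w, Xb, yb: 2 * Xb.T @ (Xb @ w - yb) / len(yb)`.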

## Metrics

  1. Area Under the Curve (AUC)
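ROC AUC can be computed directly from score ranks via the Mann-Whitney U statistic, with no explicit ROC curve. A simple sketch that assumes binary labels and ignores tie handling (the function name is illustrative):

```python
import numpy as np

def auc(y_true, scores):
    """ROC AUC via the rank-sum (Mann-Whitney U) formulation.

    Illustrative sketch: assumes y_true contains 0/1 labels and that
    scores have no ties (tied scores would need averaged ranks).
    """
    order = np.argsort(scores)
    ranks = np.empty(len(scores))
    ranks[order] = np.arange(1, len(scores) + 1)  # 1-based ascending ranks
    pos = y_true == 1
    n_pos = pos.sum()
    n_neg = len(y_true) - n_pos
    # U statistic for the positive class, normalized to [0, 1].
    return (ranks[pos].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)
```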

## Objective Functions

  1. Mean Squared Error
  2. Categorical Cross Entropy
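Both objectives are short reductions in NumPy. A minimal sketch, where categorical cross entropy expects one-hot targets and predicted class probabilities, clipped away from zero so the log never overflows (function names and the `eps` default are illustrative):

```python
import numpy as np

def mean_squared_error(y_true, y_pred):
    # Average squared difference over all elements.
    return np.mean((y_true - y_pred) ** 2)

def categorical_cross_entropy(y_true, y_pred, eps=1e-12):
    # y_true: one-hot targets; y_pred: predicted probabilities per class.
    p = np.clip(y_pred, eps, 1.0)  # avoid log(0)
    return -np.mean(np.sum(y_true * np.log(p), axis=-1))
```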