Activation function

A neural network without an activation function is essentially just a linear regression model. The activation function applies a non-linear transformation to the input, enabling the network to learn and perform more complex tasks. Purely linear transformations could never accomplish this, since a stack of linear layers collapses into a single linear map (see the sketch below).
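As a rough illustration (not part of the original note), here is a minimal NumPy sketch showing that two linear layers without an activation reduce to one linear map, whereas inserting a non-linearity such as ReLU between them does not. All shapes and variable names are made up for the example.

```python
import numpy as np

# Two common activation functions.
def relu(x):
    return np.maximum(0.0, x)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 3))                    # batch of 4 inputs, 3 features
W1, b1 = rng.normal(size=(3, 5)), np.zeros(5)  # layer 1
W2, b2 = rng.normal(size=(5, 2)), np.zeros(2)  # layer 2

# Without an activation, the two layers collapse into one linear map:
# (x @ W1 + b1) @ W2 + b2  ==  x @ (W1 @ W2) + (b1 @ W2 + b2)
linear_out = (x @ W1 + b1) @ W2 + b2

# With a non-linear activation between the layers, the composition is no
# longer expressible as a single linear transformation.
nonlinear_out = relu(x @ W1 + b1) @ W2 + b2
```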

Activation functions also make back-propagation possible, since their gradients are propagated along with the error signal to update the weights and biases. Without a differentiable non-linear function, this would not work.
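To show that chain-rule step concretely, here is a small sketch (an assumption-laden illustration, not from the original note) of a single sigmoid neuron updated with a squared-error signal; the learning rate, input, and target values are arbitrary.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_grad(x):
    s = sigmoid(x)
    return s * (1.0 - s)       # derivative of the sigmoid, used in back-propagation

# Toy single-neuron update: the activation's gradient is chained with
# the error to update the weight and bias.
x, target = 0.5, 1.0
w, b, lr = 0.1, 0.0, 0.5

z = w * x + b                  # pre-activation
y = sigmoid(z)                 # prediction
error = y - target             # error signal (gradient of squared error, up to a factor)
dz = error * sigmoid_grad(z)   # chain rule through the differentiable activation
w -= lr * dz * x               # gradient step on the weight
b -= lr * dz                   # gradient step on the bias
```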

Reference

Fundamentals of Deep Learning – Activation Functions and When to Use Them?