A neural network built from scratch to classify MNIST handwritten digits, achieving 98.02% accuracy without using any ML library (only NumPy). This project demonstrates core neural network concepts, including forward propagation, backpropagation, and gradient descent.

MNIST Neural Network (NumPy Only)

Project Overview

This project implements a neural network from scratch to classify MNIST handwritten digits, reaching 98.02% accuracy. It is built entirely with NumPy, without machine learning libraries such as TensorFlow or PyTorch. The network consists of input, hidden, and output layers, and weights are adjusted by backpropagation over 100 training epochs.
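As a rough illustration of the architecture described above, the sketch below initializes a 784-128-10 network. The variable names and the initialization scheme are assumptions made for illustration, not necessarily those used in the repository.

```python
import numpy as np

# Illustrative initialization of the 3-layer network: 784 inputs (28x28 pixels),
# one hidden layer of 128 units, and 10 output classes.
rng = np.random.default_rng(0)
W1 = rng.normal(0.0, 0.1, size=(784, 128))  # input -> hidden weights
b1 = np.zeros(128)                          # hidden-layer biases
W2 = rng.normal(0.0, 0.1, size=(128, 10))   # hidden -> output weights
b2 = np.zeros(10)                           # output-layer biases
```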

Techniques Used

  • Data Preprocessing: Flattened each 28×28 image into a 784-dimensional vector and normalized the pixel values for stable training.
  • Neural Network Layers: Implemented a three-layer network with a single hidden layer of 128 units.
  • Activation Function: Applied the sigmoid activation to both the hidden and output layers.
  • Backpropagation: Computed error gradients and propagated them back through the network to update the weights and reduce the loss.
  • Gradient Descent: Updated the parameters with a fixed learning rate of 0.1; sketches of the forward pass and of one training step follow this list.
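A minimal sketch of the preprocessing and forward pass described above, assuming the raw images arrive as uint8 arrays of shape (N, 28, 28) and the labels as integer class indices. The helper names, the division by 255, and the one-hot encoding are illustrative assumptions; the repository only states that images are flattened and pixel values normalized.

```python
import numpy as np

def preprocess(images, labels):
    """Flatten 28x28 images, scale pixels to [0, 1], and one-hot encode labels."""
    X = images.reshape(images.shape[0], -1).astype(np.float64) / 255.0
    Y = np.eye(10)[labels]          # one-hot targets, shape (N, 10)
    return X, Y

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(X, W1, b1, W2, b2):
    """Forward propagation: sigmoid hidden layer, sigmoid output layer."""
    Z1 = X @ W1 + b1                # hidden pre-activations, shape (N, 128)
    A1 = sigmoid(Z1)
    Z2 = A1 @ W2 + b2               # output pre-activations, shape (N, 10)
    A2 = sigmoid(Z2)
    return A1, A2
```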
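Likewise, a sketch of one backpropagation and gradient-descent update, building on the forward pass above. The repository does not state the exact loss function, so a mean squared error over the sigmoid outputs is assumed here; the learning rate of 0.1 matches the value listed above.

```python
def train_step(X, Y, W1, b1, W2, b2, lr=0.1):
    """One backpropagation pass and gradient-descent update (MSE loss assumed)."""
    N = X.shape[0]
    A1, A2 = forward(X, W1, b1, W2, b2)

    # Backpropagation: error at the output layer, through the output sigmoid.
    dZ2 = (A2 - Y) * A2 * (1.0 - A2) / N
    dW2 = A1.T @ dZ2
    db2 = dZ2.sum(axis=0)

    # Propagate the error back through the hidden layer.
    dA1 = dZ2 @ W2.T
    dZ1 = dA1 * A1 * (1.0 - A1)
    dW1 = X.T @ dZ1
    db1 = dZ1.sum(axis=0)

    # Gradient-descent update with the fixed learning rate.
    W1 -= lr * dW1
    b1 -= lr * db1
    W2 -= lr * dW2
    b2 -= lr * db2

    return 0.5 * np.mean(np.sum((A2 - Y) ** 2, axis=1))  # loss for monitoring
```

Repeating this step over the training data for 100 epochs, and taking `np.argmax(A2, axis=1)` as the predicted digit, mirrors the training procedure described above; the exact batching scheme is not specified in the repository.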

Results

  • Training Accuracy: 98.02%
  • Loss: Converges to 0.0007 after 40 epochs
