
AutoEncoders for Classification 🤖


This project leverages AutoEncoders in PyTorch for feature extraction and classification on the MNIST dataset, demonstrating how unsupervised learning can enhance supervised tasks.
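As a rough illustration of the approach, here is a minimal sketch of an autoencoder paired with a classifier head in PyTorch. The layer sizes, latent dimension, and class names (`AutoEncoder`, `Classifier`) are illustrative assumptions, not the repository's exact architecture.

```python
import torch
import torch.nn as nn

class AutoEncoder(nn.Module):
    """Compresses 28x28 MNIST images to a small latent code and back.
    Layer sizes here are illustrative, not the repository's exact choices."""
    def __init__(self, latent_dim=32):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Flatten(),
            nn.Linear(28 * 28, 128), nn.ReLU(),
            nn.Linear(128, latent_dim),
        )
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 128), nn.ReLU(),
            nn.Linear(128, 28 * 28), nn.Sigmoid(),
        )

    def forward(self, x):
        z = self.encoder(x)                      # learned representation
        recon = self.decoder(z).view(-1, 1, 28, 28)  # reconstructed image
        return recon, z

class Classifier(nn.Module):
    """Categorizes digits from the learned latent representation."""
    def __init__(self, latent_dim=32, num_classes=10):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(latent_dim, 64), nn.ReLU(),
            nn.Linear(64, num_classes),
        )

    def forward(self, z):
        return self.net(z)
```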

Features 🌟

  • Implements an AutoEncoder for dimensionality reduction and feature extraction.
  • Uses a neural network classifier to categorize images based on learned representations.
  • Trains and evaluates on the MNIST dataset, providing insights into model performance.
  • Includes data visualization of the training process and prediction results.

Setup and Installation 🛠️

  1. Clone the GitHub repository.
  2. Ensure Python 3.x and PyTorch are installed.
  3. Install the additional dependencies listed in requirements.txt (example commands below).
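For example, assuming a standard pip-based setup (adjust for your environment):

```bash
git clone https://github.com/MJAHMADEE/AutoEncoders_for_Classification.git
cd AutoEncoders_for_Classification
pip install -r requirements.txt
```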

Data 📁

The project uses the MNIST dataset, a collection of 70,000 grayscale images of handwritten digits, to train the models and evaluate their performance.
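A typical way to load the dataset, assuming the project follows the standard torchvision pattern (the data directory and batch size below are placeholders):

```python
import torch
from torchvision import datasets, transforms

# ToTensor scales pixel values to [0, 1], which pairs well with a
# Sigmoid-output decoder when using a reconstruction (MSE) loss.
transform = transforms.ToTensor()

train_set = datasets.MNIST("./data", train=True, download=True, transform=transform)
test_set = datasets.MNIST("./data", train=False, download=True, transform=transform)

train_loader = torch.utils.data.DataLoader(train_set, batch_size=64, shuffle=True)
test_loader = torch.utils.data.DataLoader(test_set, batch_size=64, shuffle=False)
```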

Usage 🚀

  • Run the training script to build and train the AutoEncoder and classifier models (a condensed sketch of the training and evaluation loop follows this list).
  • Evaluate the model using the test script, which reports accuracy, precision, recall, and F1 score.
  • Visualize the training process through loss and accuracy plots, and interpret the model's decisions with a confusion matrix.
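The sketch below reuses the illustrative `AutoEncoder` and `Classifier` classes and the data loaders from earlier sections. The joint loss weighting, optimizer settings, and use of scikit-learn for the metrics are assumptions, not necessarily what the repository's scripts do.

```python
import torch
import torch.nn as nn
from sklearn.metrics import precision_recall_fscore_support

device = "cuda" if torch.cuda.is_available() else "cpu"
autoencoder, classifier = AutoEncoder().to(device), Classifier().to(device)
optimizer = torch.optim.Adam(
    list(autoencoder.parameters()) + list(classifier.parameters()), lr=1e-3
)
recon_loss, cls_loss = nn.MSELoss(), nn.CrossEntropyLoss()

# Joint training: reconstruct each image and classify its latent code.
for epoch in range(5):
    for images, labels in train_loader:
        images, labels = images.to(device), labels.to(device)
        recon, z = autoencoder(images)
        logits = classifier(z)
        loss = recon_loss(recon, images) + cls_loss(logits, labels)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

# Evaluation: accuracy plus macro-averaged precision/recall/F1.
autoencoder.eval()
classifier.eval()
preds, targets = [], []
with torch.no_grad():
    for images, labels in test_loader:
        _, z = autoencoder(images.to(device))
        preds.extend(classifier(z).argmax(dim=1).cpu().tolist())
        targets.extend(labels.tolist())

accuracy = sum(p == t for p, t in zip(preds, targets)) / len(targets)
precision, recall, f1, _ = precision_recall_fscore_support(
    targets, preds, average="macro"
)
print(f"accuracy={accuracy:.4f} precision={precision:.4f} "
      f"recall={recall:.4f} f1={f1:.4f}")
```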

Results 📊

Results from training, including key performance metrics and visualizations such as loss curves and confusion matrices, are reported in the project repository.

Contributing 🤝

Contributions to the project are welcome. Follow the standard fork, branch, and pull request workflow to propose changes.

License 📜

This project is released under the MIT License. See the LICENSE file for more details.

Acknowledgements 🙌

  • The PyTorch team for an excellent deep learning framework.
  • The MNIST dataset maintainers for providing a reliable dataset used widely in machine learning research.

For more details, please refer to the project repository.

