Multiclass Classification

Overview

This project implements one-vs-all (OvA) classification with logistic regression, plus prediction with a pre-trained neural network, to recognize handwritten digits (0-9) from a 5,000-example subset of the MNIST dataset (20x20 grayscale images).

Algorithm

One-vs-All Logistic Regression

For K classes, K separate binary classifiers are trained, each distinguishing one class from all others. At prediction time, every classifier scores the input, and the class with the highest predicted probability is selected.
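The project's training and prediction steps live in oneVsAll.m and predictOneVsAll.m (Octave/MATLAB, optimized with fmincg). As an illustrative sketch only, the same idea in NumPy with plain gradient descent (the learning rate, iteration count, and regularization strength below are arbitrary choices, not the project's settings):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_one_vs_all(X, y, num_classes, lr=0.5, iters=500, lam=0.1):
    """Train K binary logistic regressors, one per class, via gradient descent."""
    m, n = X.shape
    all_theta = np.zeros((num_classes, n))
    for k in range(num_classes):
        theta = np.zeros(n)
        yk = (y == k).astype(float)          # one class vs. the rest
        for _ in range(iters):
            h = sigmoid(X @ theta)
            grad = (X.T @ (h - yk)) / m
            grad[1:] += (lam / m) * theta[1:]  # do not regularize the bias term
            theta -= lr * grad
        all_theta[k] = theta
    return all_theta

def predict_one_vs_all(all_theta, X):
    """Pick the class whose classifier outputs the highest probability."""
    return np.argmax(sigmoid(X @ all_theta.T), axis=1)
```

The argmax over the K probability columns is what turns K binary classifiers into one multiclass predictor.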

Pre-trained Neural Network

A two-layer neural network with pre-loaded weights is used for forward propagation (prediction only). The network architecture is:

  • Input layer: 400 units (20x20 pixel images)
  • Hidden layer: 25 units with sigmoid activation
  • Output layer: 10 units (digits 0-9)
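The forward pass through this architecture is implemented in predict.m. A minimal NumPy sketch, assuming the course's convention of prepending a bias unit at each layer (so Theta1 is 25x401 and Theta2 is 10x26):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def predict(Theta1, Theta2, X):
    """Forward propagation through a 400-25-10 sigmoid network.

    Theta1: (25, 401) hidden-layer weights; Theta2: (10, 26) output weights.
    X: (m, 400) matrix of flattened 20x20 images.
    """
    m = X.shape[0]
    a1 = np.hstack([np.ones((m, 1)), X])   # add bias unit -> (m, 401)
    a2 = sigmoid(a1 @ Theta1.T)            # hidden activations -> (m, 25)
    a2 = np.hstack([np.ones((m, 1)), a2])  # add bias unit -> (m, 26)
    a3 = sigmoid(a2 @ Theta2.T)            # output probabilities -> (m, 10)
    return np.argmax(a3, axis=1)           # index of the most probable class
```

Note that in the original exercise the output indices are 1-based (digit 0 is mapped to label 10); the sketch above simply returns the 0-based argmax.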

Files

  • sample3.m: Main script for one-vs-all logistic regression
  • sample3_nn.m: Main script for neural network prediction with pre-trained weights
  • lrCostFunction.m: Regularized logistic regression cost function
  • oneVsAll.m: Trains K binary classifiers
  • predictOneVsAll.m: Predicts using the one-vs-all classifiers
  • predict.m: Neural network forward propagation
  • displayData.m: Displays digit images in a grid
  • fmincg.m: Conjugate gradient optimization
  • ex3data1.mat: Handwritten digit dataset (MNIST subset)
  • ex3weights.mat: Pre-trained neural network weights
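The cost computed by lrCostFunction.m is the regularized cross-entropy loss, with the bias parameter excluded from the penalty. A hedged NumPy sketch of that computation (the small `eps` guard against log(0) is an implementation choice of this sketch, not necessarily of the original file):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lr_cost_function(theta, X, y, lam):
    """Regularized logistic regression cost and gradient.

    The bias term theta[0] is excluded from the regularization,
    matching the convention in the course exercise.
    """
    m = len(y)
    h = sigmoid(X @ theta)
    eps = 1e-12                                        # guard against log(0)
    J = -(y @ np.log(h + eps) + (1 - y) @ np.log(1 - h + eps)) / m
    J += (lam / (2 * m)) * np.sum(theta[1:] ** 2)      # regularization term
    grad = (X.T @ (h - y)) / m
    grad[1:] += (lam / m) * theta[1:]
    return J, grad
```

This cost/gradient pair is exactly what an optimizer such as fmincg consumes when training each of the K classifiers.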

Key Results

  • One-vs-All: Training set accuracy of ~95.0%
  • Neural Network: Training set accuracy of ~97.5%

Visualization

[Figure: multiclass classification visualization]

Top-left: Sample handwritten digits. Top-right: Per-class accuracy. Bottom-left: Confusion matrix. Bottom-right: Learned weights for each digit class.

Credit

Exercises from Andrew Ng's Machine Learning course on Coursera, completed by Keivan Hassani Monfared.