Optimizers - A comparison

A simple comparison of the Adam, Adadelta, and SGD optimizers, each used to train the same CNN-based MNIST classifier.
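The training script itself is not reproduced here, but the following is a minimal sketch of how such a comparison can be set up with `tf.keras`: the same small CNN is trained once per optimizer and the training histories are kept for plotting. The architecture, epoch count, and batch size below are illustrative assumptions, not necessarily the repository's exact configuration.

```python
# Sketch (assumed setup): train one CNN per optimizer and keep the histories.
from tensorflow import keras


def build_cnn():
    # A small CNN for 28x28 grayscale MNIST digits (architecture is an assumption).
    return keras.Sequential([
        keras.Input(shape=(28, 28, 1)),
        keras.layers.Conv2D(32, (3, 3), activation="relu"),
        keras.layers.MaxPooling2D((2, 2)),
        keras.layers.Conv2D(64, (3, 3), activation="relu"),
        keras.layers.MaxPooling2D((2, 2)),
        keras.layers.Flatten(),
        keras.layers.Dense(128, activation="relu"),
        keras.layers.Dense(10, activation="softmax"),
    ])


# Load and normalize MNIST.
(x_train, y_train), (x_test, y_test) = keras.datasets.mnist.load_data()
x_train = x_train[..., None].astype("float32") / 255.0
x_test = x_test[..., None].astype("float32") / 255.0

# The three optimizers being compared, with their default settings.
optimizers = {
    "sgd": keras.optimizers.SGD(),
    "adam": keras.optimizers.Adam(),
    "adadelta": keras.optimizers.Adadelta(),
}

histories = {}
for name, opt in optimizers.items():
    model = build_cnn()
    model.compile(optimizer=opt,
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    # Epochs and batch size are placeholders; see the tuning note below the plots.
    histories[name] = model.fit(x_train, y_train,
                                epochs=5, batch_size=128,
                                validation_data=(x_test, y_test),
                                verbose=2)
```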

Visualizing Accuracy and Loss\*:

1. SGD optimizer (accuracy and loss plots)
2. Adam optimizer (accuracy and loss plots)
3. Adadelta optimizer (accuracy and loss plots)

\* Tuning the hyperparameters and adjusting the number of epochs and batch size could improve these results considerably.
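As a rough illustration of how plots like the ones above could be produced, the snippet below draws validation accuracy and loss curves from the `histories` dictionary built in the earlier training sketch. Both that dictionary and the metric key names are assumptions based on `tf.keras` defaults, not the repository's plotting code.

```python
# Sketch of plotting per-optimizer curves (assumes the `histories` dict from
# the training sketch above; metric keys follow tf.keras defaults).
import matplotlib.pyplot as plt

fig, (ax_acc, ax_loss) = plt.subplots(1, 2, figsize=(12, 4))
for name, history in histories.items():
    ax_acc.plot(history.history["val_accuracy"], label=name)
    ax_loss.plot(history.history["val_loss"], label=name)

ax_acc.set_title("Validation accuracy per epoch")
ax_acc.set_xlabel("epoch")
ax_acc.set_ylabel("accuracy")
ax_acc.legend()

ax_loss.set_title("Validation loss per epoch")
ax_loss.set_xlabel("epoch")
ax_loss.set_ylabel("loss")
ax_loss.legend()

plt.tight_layout()
plt.show()
```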
