Write your own code for a neural network (NN), including:
- Back-propagation
- Feedforward
- Mini-batch training
- Gradient descent with an adaptive learning rate
- Gradient descent with the momentum, RMSprop, and Adam optimization methods
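The pieces above can be sketched as a single NumPy script. This is a minimal illustration, not a reference solution: the one-hidden-layer architecture, tanh activation, mean-squared-error loss, and the toy sin-regression task are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def init_params(n_in, n_hidden, n_out):
    # Small random weights; biases start at zero.
    return {"W1": rng.normal(0, 0.5, (n_in, n_hidden)), "b1": np.zeros(n_hidden),
            "W2": rng.normal(0, 0.5, (n_hidden, n_out)), "b2": np.zeros(n_out)}

def feedforward(params, X):
    # Forward pass: tanh hidden layer, linear output.
    a1 = np.tanh(X @ params["W1"] + params["b1"])
    y = a1 @ params["W2"] + params["b2"]
    return y, (X, a1)

def backprop(params, cache, y_pred, y_true):
    # Gradients of the mean-squared-error loss w.r.t. all parameters.
    X, a1 = cache
    m = X.shape[0]
    dy = 2 * (y_pred - y_true) / m
    grads = {"W2": a1.T @ dy, "b2": dy.sum(axis=0)}
    dz1 = (dy @ params["W2"].T) * (1 - a1 ** 2)  # tanh derivative
    grads["W1"] = X.T @ dz1
    grads["b1"] = dz1.sum(axis=0)
    return grads

def make_optimizer(method, lr=0.01, beta1=0.9, beta2=0.999, eps=1e-8):
    # Returns an update function; `state` holds per-parameter moments.
    state = {"t": 0, "m": {}, "v": {}}
    def step(params, grads):
        state["t"] += 1
        for k, g in grads.items():
            if method == "sgd":
                params[k] -= lr * g
            elif method == "momentum":
                m = state["m"].setdefault(k, np.zeros_like(g))
                state["m"][k] = beta1 * m - lr * g
                params[k] += state["m"][k]
            elif method == "rmsprop":
                v = state["v"].setdefault(k, np.zeros_like(g))
                state["v"][k] = beta2 * v + (1 - beta2) * g ** 2
                params[k] -= lr * g / (np.sqrt(state["v"][k]) + eps)
            elif method == "adam":
                m = state["m"].setdefault(k, np.zeros_like(g))
                v = state["v"].setdefault(k, np.zeros_like(g))
                state["m"][k] = beta1 * m + (1 - beta1) * g
                state["v"][k] = beta2 * v + (1 - beta2) * g ** 2
                m_hat = state["m"][k] / (1 - beta1 ** state["t"])
                v_hat = state["v"][k] / (1 - beta2 ** state["t"])
                params[k] -= lr * m_hat / (np.sqrt(v_hat) + eps)
    return step

def train(X, Y, method="adam", lr=0.01, batch_size=16, epochs=200):
    params = init_params(X.shape[1], 8, Y.shape[1])
    step = make_optimizer(method, lr=lr)
    n = X.shape[0]
    for _ in range(epochs):
        idx = rng.permutation(n)             # shuffle each epoch
        for start in range(0, n, batch_size):  # mini-batches
            b = idx[start:start + batch_size]
            y_pred, cache = feedforward(params, X[b])
            step(params, backprop(params, cache, y_pred, Y[b]))
    return params

# Toy regression: learn y = sin(x) on [-2, 2].
X = np.linspace(-2, 2, 128).reshape(-1, 1)
Y = np.sin(X)
params = train(X, Y, method="adam", lr=0.01)
loss = np.mean((feedforward(params, X)[0] - Y) ** 2)
```

Swapping `method` between `"sgd"`, `"momentum"`, `"rmsprop"`, and `"adam"` exercises all four optimizers against the same forward/backward code, which is the comparison the assignment asks for.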
Try different learning rates and different optimization techniques, and observe how they affect the training.
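The effect of the learning rate can be seen even on a one-dimensional quadratic, before running the full network. This sketch (the objective f(w) = w²/2 and the specific rates are illustrative assumptions) shows the three regimes: too small barely moves, moderate converges, too large diverges.

```python
def run_gd(lr, steps=50, momentum=0.0):
    # Minimize f(w) = 0.5 * w**2 (gradient is w), starting from w = 1.0.
    w, v = 1.0, 0.0
    for _ in range(steps):
        v = momentum * v - lr * w   # momentum = 0.0 gives plain gradient descent
        w = w + v
    return abs(w)

# lr = 0.001 barely moves, lr = 0.1 converges, lr = 2.5 diverges
# (on this quadratic any lr > 2 overshoots more each step).
results = {lr: run_gd(lr) for lr in (0.001, 0.1, 2.5)}
```

Repeating the sweep with `momentum=0.9` shows how momentum changes which learning rates are stable, which is a useful warm-up before sweeping the NN itself.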
Try different batch sizes and see how that affects the training.
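One way to see why batch size matters is to measure how noisy the mini-batch gradient estimate is. A sketch, assuming a simple linear-regression setup (the data-generating line y = 3x and the chosen batch sizes are illustrative): the spread of the estimate shrinks roughly as 1/sqrt(batch size), so small batches take cheap but noisy steps and large batches take stable but expensive ones.

```python
import numpy as np

rng = np.random.default_rng(1)

# Linear data y = 3x + noise; we probe the MSE gradient w.r.t. w at w = 0.
X = rng.normal(size=1000)
y = 3 * X + rng.normal(scale=0.5, size=1000)

def minibatch_grad(batch_size, w=0.0):
    # Gradient of mean((w*x - y)**2) over one random mini-batch.
    b = rng.choice(1000, size=batch_size, replace=False)
    return np.mean(2 * (w * X[b] - y[b]) * X[b])

# Standard deviation of the gradient estimate for each batch size,
# measured over 500 independent mini-batches.
spread = {bs: np.std([minibatch_grad(bs) for _ in range(500)])
          for bs in (4, 64, 512)}
```

Plotting training curves from the NN code for the same batch sizes connects this gradient noise to the loss behavior the assignment asks you to observe.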