
Dropout: A regularization technique

Katharina Breininger

27.07.2023

Dropout for NNs

Materials and further reading

Selection of influential papers (partly referenced in the lecture):

  • [1] Zhang et al.: Understanding deep learning requires rethinking generalization, Proc. ICLR 2017
  • [2] Hinton et al.: Improving neural networks by preventing co-adaptation of feature detectors, arXiv 2012
  • [3] Srivastava et al.: Dropout: A Simple Way to Prevent Neural Networks from Overfitting, JMLR 2014
  • [4] Wan et al.: Regularization of Neural Networks using DropConnect, PMLR 2013
  • [5] Tompson et al.: Efficient Object Localization Using Convolutional Networks, Proc. CVPR 2015
  • [6] Ghiasi et al.: DropBlock: A regularization method for convolutional networks, Proc. NeurIPS 2018
  • [7] Wang et al.: Fast dropout training, PMLR 2013
  • Follow-up work by Kingma et al.: Variational Dropout and the Local Reparameterization Trick, Proc. NeurIPS 2015
  • [8] Gal and Ghahramani: Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning, PMLR 2016
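To make the core idea from [2, 3] concrete, the following is a minimal sketch of an inverted-dropout forward pass in NumPy. The function name `dropout_forward` and its parameters are illustrative, not taken from any of the papers above: during training, each unit is zeroed with probability `p_drop` and the survivors are rescaled by `1/(1 - p_drop)` so the expected activation is unchanged; at test time the layer is simply the identity.

```python
import numpy as np

def dropout_forward(x, p_drop=0.5, training=True, rng=None):
    """Inverted dropout (illustrative sketch).

    During training, zero each unit independently with probability
    p_drop and scale the remaining units by 1 / (1 - p_drop), so that
    E[output] == input. At test time, return x unchanged.
    """
    if not training or p_drop == 0.0:
        return x
    rng = np.random.default_rng() if rng is None else rng
    # Binary keep-mask, folded together with the rescaling factor.
    mask = (rng.random(x.shape) >= p_drop) / (1.0 - p_drop)
    return x * mask

# Usage: with p_drop = 0.5, surviving activations are doubled,
# so the mean activation stays close to the input's mean.
rng = np.random.default_rng(0)
out = dropout_forward(np.ones(1000), p_drop=0.5, training=True, rng=rng)
```

Scaling at training time (rather than scaling down at test time, as in the original formulation of [3]) keeps the test-time path a plain identity, which is how most modern frameworks implement it.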