
ekbreininger/dropout_regularization

Folders and files

NameName
Last commit message
Last commit date

Latest commit

 

History

13 Commits
 
 
 
 
 
 
 
 
 
 
 
 
 
 

Repository files navigation

Dropout: A regularization technique

Katharina Breininger

27.07.2023

Dropout for NNs

Materials and further reading

A selection of influential papers (partly referenced in the lecture):

  • [1] Zhang et al.: Understanding deep learning requires rethinking generalization, Proc. ICLR 2017
  • [2] Hinton et al.: Improving neural networks by preventing co-adaptation of feature detectors, arXiv 2012
  • [3] Srivastava et al.: Dropout: A Simple Way to Prevent Neural Networks from Overfitting, PMLR 2014
  • [4] Wan et al.: Regularization of Neural Networks using DropConnect, PMLR 2013
  • [5] Tompson et al.: Efficient Object Localization Using Convolutional Networks, Proc. CVPR 2015
  • [6] Ghiasi et al.: DropBlock: A regularization method for convolutional networks, Proc. NeurIPS 2018
  • [7] Wang and Manning: Fast dropout training, Proc. ICML 2013
  • Follow-up work by Kingma et al.: Variational Dropout and the Local Reparameterization Trick, Proc. NeurIPS 2015
  • [8] Gal and Ghahramani: Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning, PMLR 2016
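To complement the reading list, here is a minimal sketch of the "inverted dropout" formulation popularized by Srivastava et al. [3]: during training, each unit is zeroed independently with probability p and the survivors are rescaled by 1/(1-p), so that the expected activation matches inference time and no rescaling is needed at test time. The function name and NumPy implementation are illustrative, not taken from this repository.

```python
import numpy as np

def dropout(x, p=0.5, training=True, rng=None):
    """Inverted dropout (illustrative sketch, cf. Srivastava et al. [3]).

    During training, each element of x is zeroed with probability p and
    the surviving elements are scaled by 1/(1-p), keeping E[output] = x.
    At inference time (training=False) the input passes through unchanged.
    """
    if not training or p == 0.0:
        return x
    rng = rng if rng is not None else np.random.default_rng()
    mask = rng.random(x.shape) >= p  # keep each unit with probability 1-p
    return x * mask / (1.0 - p)
```

Because of the 1/(1-p) rescaling during training, inference is a plain identity: the ensemble-averaging effect discussed in [3] is absorbed into the training-time scaling.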
