Katharina Breininger
27.07.2023
- Lecture Slides
- Chapter 7: Regularization for Deep Learning (Goodfellow, Bengio, Courville: Deep Learning, MIT Press, 2016)
- 3.2 Bias-Variance Decomposition (Bishop: Pattern Recognition & Machine Learning, Springer, 2006, p. 147ff)
- Video explanation by Andrew Ng (DeepLearning.AI), Part 1 and Part 2
- [1] Zhang et al.: Understanding deep learning requires rethinking generalization, Proc. ICLR 2017
- [2] Hinton et al.: Improving neural networks by preventing co-adaptation of feature detectors, arXiv 2012
- [3] Srivastava et al.: Dropout: A Simple Way to Prevent Neural Networks from Overfitting, JMLR 2014
- [4] Wan et al.: Regularization of Neural Networks using DropConnect, Proc. ICML 2013
- [5] Tompson et al.: Efficient Object Localization Using Convolutional Networks, Proc. CVPR 2015
- [6] Ghiasi et al.: DropBlock: A regularization method for convolutional networks, Proc. NeurIPS 2018
- [7] Wang et al.: Fast dropout training, Proc. ICML 2013
- Follow-up work by Kingma et al.: Variational Dropout and the Local Reparameterization Trick, Proc. NeurIPS 2015
- [8] Gal and Ghahramani: Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning, Proc. ICML 2016
