The Implicit and Explicit Regularization Effects of Dropout
By Colin Wei et al.
Published on June 10, 2020
Table of Contents
1. Introduction
2. The Implicit and Explicit Regularization Effects of Dropout
3. Disentangling Explicit and Implicit Regularization in Dropout
4. Characterizing the Dropout Regularizers
Summary
The document discusses the regularization effects of dropout in neural networks. Dropout introduces two distinct effects: an explicit regularization effect, which modifies the expected training objective, and an implicit regularization effect, which arises from the stochasticity of the dropout noise in the gradient updates. The paper disentangles these two effects through controlled experiments and characterizes the explicit regularizer in terms of derivatives of the model and the loss, interpreting it as a form of data-dependent stability regularization. These results help explain why dropout is effective in training and show why understanding both effects matters in practice.
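As a minimal illustration of the mechanism the summary describes (this is a sketch, not the paper's experimental setup), inverted dropout keeps each unit with probability 1 - p and rescales survivors by 1/(1 - p). The rescaling makes each forward pass unbiased in expectation; the fluctuation around that expectation is the stochastic noise responsible for the implicit regularization effect, while the change to the expected objective itself gives rise to the explicit regularizer.

```python
import numpy as np

def inverted_dropout(x, p, rng):
    """Zero each unit with probability p; rescale survivors by 1/(1-p)."""
    mask = rng.random(x.shape) >= p
    return x * mask / (1.0 - p)

rng = np.random.default_rng(0)
x = np.ones(100_000)
out = inverted_dropout(x, p=0.5, rng=rng)

# Per-unit expectation equals x (unbiased), so the empirical mean is near 1.0.
# The per-unit variance around that mean is the dropout noise whose effect on
# gradient updates the paper analyzes as implicit regularization.
print(round(out.mean(), 2))
```

Because the estimator is unbiased, the averaged output is close to 1.0 even though roughly half the individual units are zeroed on any given pass.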