Dropout: A Simple Way to Prevent Neural Networks from Overfitting

By Nitish Srivastava et al.
Published on June 10, 2014

Table of Contents

Abstract
Keywords
1. Introduction
2. Motivation
3. Related Work
4. Model Description
5. Learning Dropout Nets

Summary

The paper introduces dropout, a technique for preventing overfitting in neural networks. During training, units (along with their connections) are randomly dropped from the network, which keeps units from co-adapting too much and improves generalization to held-out data. The authors present experimental results showing that dropout improves performance on benchmark tasks in domains including vision, speech recognition, and document classification. The paper also covers the motivation behind dropout, related work on regularization techniques, and a detailed description of the dropout neural network model and its training procedure.