Deep Convolutional Neural Network Design Patterns

By Leslie N. Smith and Nicholay Topin

Table of Contents

1. Introduction
2. Related Work
3. Design Patterns
3.1 High-level Architecture Design
3.2 Detailed Architecture Design
3.2.1 Joining Branches: Concatenation, Summation/Mean, Maxout
4. Experiments
4.1 Architectural Innovations

Summary

This document summarizes deep convolutional neural network design patterns, with a focus on bridging the gap for inexperienced deep learning practitioners. It describes the paper's architectural innovations, including the Fractal of FractalNet network, Stagewise Boosting Networks, and Taylor Series Networks. The paper presents 14 original design patterns for neural network architectures, intended to provide guidance on design principles. It also reviews recent research on network architectures, particularly Residual Networks, and introduces design patterns such as Proliferate Paths, Strive for Simplicity, Increase Symmetry, Pyramid Shape, Over-train, Cover the Problem Space, Incremental Feature Construction, Normalize Layer Inputs, Input Transition, Available Resources Guide Layer Widths, Summation Joining, Down-sampling Transition, and Maxout for Competition. Finally, it discusses the importance of design choices and trade-offs, and the significance of generalization in deep learning.
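The branch-joining mechanisms named above and in Section 3.2.1 (concatenation, summation/mean, and maxout) can be sketched with plain NumPy. This is a minimal illustration, not the paper's implementation; the array shapes are arbitrary assumptions:

```python
import numpy as np

# Outputs of two parallel branches with identical shape:
# (batch, channels, height, width) — shapes chosen for illustration.
a = np.random.randn(2, 4, 8, 8)
b = np.random.randn(2, 4, 8, 8)

# Concatenation joins along the channel axis, doubling the channel count.
concat = np.concatenate([a, b], axis=1)   # shape (2, 8, 8, 8)

# Summation (as in Residual Networks) adds the branches element-wise;
# the mean variant simply rescales the sum.
summed = a + b
mean = (a + b) / 2.0

# Maxout keeps the larger activation at each position, so the
# branches compete rather than cooperate.
maxout = np.maximum(a, b)

print(concat.shape, summed.shape, maxout.shape)
```

Note that summation, mean, and maxout preserve the branch shape, while concatenation grows the channel dimension, which changes the input width of the following layer.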