Squeeze-and-Excitation Networks

By J. Hu et al.

Table of Contents

Abstract
Introduction
Related Work
Squeeze-and-Excitation Blocks
Instantiations
Model and Computational Complexity

Summary

The document discusses Squeeze-and-Excitation (SE) networks, which recalibrate channel-wise feature responses in convolutional neural networks. It introduces the SE block, a novel architectural unit that adaptively recalibrates features by explicitly modeling interdependencies between channels. SE blocks are shown to improve CNN performance at a slight additional computational cost, and they can be dropped into existing architectures, yielding variants such as SE-Inception and SE-ResNet. An evaluation of SE-ResNet-50's computational complexity shows only a minor increase in FLOPs while improving accuracy relative to ResNet-50.
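The squeeze-excitation-scale pipeline summarized above can be sketched in a few lines. The following is a minimal NumPy illustration, not the paper's implementation: the function name `se_block`, the explicit weight arguments, and the reduction ratio `r` are assumptions made for clarity. In the paper, the two fully connected layers form a bottleneck whose hidden width is `C // r`.

```python
import numpy as np

def se_block(x, w1, b1, w2, b2):
    """Recalibrate a feature map with a Squeeze-and-Excitation gate.

    x:      feature map of shape (C, H, W)
    w1, b1: reduction FC layer, w1 has shape (C // r, C)
    w2, b2: expansion FC layer, w2 has shape (C, C // r)
    """
    # Squeeze: global average pooling collapses each channel to a scalar.
    z = x.mean(axis=(1, 2))                      # shape (C,)
    # Excitation: bottleneck MLP with ReLU, then a sigmoid gate per channel.
    h = np.maximum(w1 @ z + b1, 0.0)             # shape (C // r,)
    s = 1.0 / (1.0 + np.exp(-(w2 @ h + b2)))     # shape (C,), values in (0, 1)
    # Scale: reweight each channel of the input by its learned gate.
    return x * s[:, None, None]
```

Because the gate values lie in (0, 1), the block can only attenuate channels, letting the network emphasize informative channels by suppressing the rest; in a real SE-ResNet the gated output is then added back through the residual connection.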