What Is the State of Neural Network Pruning?

By Davis Blalock et al.
Published on March 6, 2020

Table of Contents

ABSTRACT
INTRODUCTION
OVERVIEW OF PRUNING
Definitions
High-Level Algorithm
Differences Between Pruning Methods
Evaluating Pruning
LESSONS FROM THE LITERATURE
Papers Used in Our Analysis
How Effective Is Pruning?
Pruning vs Architecture Changes
MISSING CONTROLLED COMPARISONS
Omission of Comparison
Ignoring Pre-2010s Methods
Ignoring Recent Methods

Summary

Neural network pruning reduces the size of a network by removing parameters. A survey of the pruning literature reveals a lack of standardized benchmarks and metrics, which hinders accurate comparisons between techniques. The paper examines how pruning methods differ, how effectively they compress models without loss of accuracy, and why comparing different approaches is difficult. It emphasizes the need for controlled comparisons and greater experimental standardization in pruning research.
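As a rough illustration of the operation the paper studies, the sketch below shows unstructured magnitude pruning: zeroing out the smallest-magnitude weights until a target fraction of parameters is removed. The function name `magnitude_prune` and the 90% sparsity target are assumptions chosen for illustration, not details taken from the paper.

import numpy as np

def magnitude_prune(weights, sparsity):
    """Zero out the smallest-magnitude entries of `weights` so that
    roughly `sparsity` fraction of parameters is removed (unstructured
    magnitude pruning; a minimal sketch, not the paper's method)."""
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)
    if k == 0:
        return weights.copy()
    # Threshold = k-th smallest magnitude; ties may prune slightly more.
    threshold = np.partition(flat, k - 1)[k - 1]
    mask = np.abs(weights) > threshold
    return weights * mask

# Example: prune 90% of a random weight matrix.
w = np.random.randn(256, 256)
w_pruned = magnitude_prune(w, sparsity=0.9)
print(f"Nonzero fraction: {np.count_nonzero(w_pruned) / w.size:.2f}")

In practice, pruning methods differ in exactly the choices this sketch glosses over: what to score (magnitude, gradients, etc.), whether to remove individual weights or whole structures, and how to schedule pruning and fine-tuning.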