Neural Architecture Search: Insights From 1000 Papers

By Colin White et al.
Published on Jan. 25, 2023

Table of Contents

1. Introduction
1.1 A Brief History of NAS and Relation to Other Fields
1.2 Background and Definitions
2. Search Spaces
2.1 Terminology
2.2 Macro Search Spaces
2.3 Chain-Structured Search Spaces

Summary

In the past decade, advances in deep learning have led to breakthroughs in a variety of areas, including computer vision, natural language understanding, speech recognition, and reinforcement learning. Specialized, high-performing neural architectures are crucial to the success of deep learning in these areas. Neural architecture search (NAS), the process of automating the design of neural architectures for a given task, is an inevitable next step in automating machine learning, and NAS-discovered architectures have already outperformed the best human-designed ones on many tasks. In the past few years, research in NAS has been progressing rapidly, with over 1,000 papers released since 2020. This survey provides an organized and comprehensive guide to neural architecture search, giving a taxonomy of search spaces, algorithms, and speedup techniques. It also discusses resources such as benchmarks, best practices, other surveys, and open-source libraries.