Table of Contents
1. Introduction
2. Differentiable Architecture Search
3. Search Space
4. Continuous Relaxation and Optimization
5. Approximate Architecture Gradient
6. Deriving Discrete Architectures
7. Experiments and Results
Summary
This paper addresses the scalability challenge of architecture search by formulating the task in a differentiable manner. Unlike conventional approaches that apply evolution or reinforcement learning over a discrete, non-differentiable search space, our method is based on a continuous relaxation of the architecture representation, allowing efficient search over architectures with gradient descent. Extensive experiments on CIFAR-10, ImageNet, Penn Treebank and WikiText-2 show that our algorithm excels at discovering high-performance convolutional architectures for image classification and recurrent architectures for language modeling, while being orders of magnitude faster than state-of-the-art non-differentiable techniques. Our implementation has been made publicly available to facilitate further research on efficient architecture search algorithms.
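To make the continuous relaxation concrete, here is a minimal, self-contained PyTorch sketch of the idea, not the paper's released implementation. The candidate operation list, the `MixedOp` name, the learning rates, tensor shapes, and the toy regression loss are all illustrative assumptions. Each edge computes a softmax-weighted mixture over candidate operations; network weights are updated on training data and the architecture parameters alpha on validation data, in the spirit of the paper's first-order approximation of the architecture gradient.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Illustrative candidate operations for one edge (assumption: the actual
# DARTS search space uses separable and dilated convolutions, pooling,
# skip connections, and a "zero" op).
def make_candidates(C):
    return [
        nn.Conv2d(C, C, 3, padding=1, bias=False),
        nn.Conv2d(C, C, 5, padding=2, bias=False),
        nn.AvgPool2d(3, stride=1, padding=1),
        nn.Identity(),  # skip connection
    ]

class MixedOp(nn.Module):
    """Continuous relaxation of a single edge: instead of choosing one
    discrete operation, output a softmax-weighted sum of all candidates."""
    def __init__(self, C):
        super().__init__()
        self.ops = nn.ModuleList(make_candidates(C))

    def forward(self, x, alpha):
        weights = F.softmax(alpha, dim=-1)  # map architecture logits to a simplex
        return sum(w * op(x) for w, op in zip(weights, self.ops))

# Toy first-order search loop: alternate a weight step on training data
# with an architecture step on validation data. Shapes, batch sizes, and
# the regression loss below are placeholder assumptions.
C = 16
edge = MixedOp(C)
alpha = nn.Parameter(1e-3 * torch.randn(len(edge.ops)))

w_opt = torch.optim.SGD(edge.parameters(), lr=0.025, momentum=0.9)
a_opt = torch.optim.Adam([alpha], lr=3e-4)

for _ in range(10):
    x_tr, y_tr = torch.randn(4, C, 8, 8), torch.randn(4, C, 8, 8)
    x_va, y_va = torch.randn(4, C, 8, 8), torch.randn(4, C, 8, 8)

    w_opt.zero_grad()
    F.mse_loss(edge(x_tr, alpha), y_tr).backward()  # step on network weights w
    w_opt.step()

    a_opt.zero_grad()
    F.mse_loss(edge(x_va, alpha), y_va).backward()  # step on architecture alpha
    a_opt.step()

# Discretization: keep only the strongest candidate on this edge.
print("selected op index:", alpha.argmax().item())
```

After search, a discrete architecture is derived by retaining the highest-weighted operation on each edge, as covered under "Deriving Discrete Architectures" above.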