Evolutionary Neural AutoML for Deep Learning

By Jason Liang et al.
Published on June 10, 2019

Table of Contents

1. INTRODUCTION
2. BACKGROUND AND RELATED WORK
3. LEAF OVERVIEW
3.1 Algorithm Layer
3.2 System Layer
3.3 Problem-Domain Layer

Summary

Deep neural networks (DNNs) have achieved state-of-the-art results in many domains, but optimizing their architectures and hyperparameters remains difficult. This paper introduces LEAF, an evolutionary AutoML framework that goes beyond hyperparameter optimization to also optimize network architecture and size. By combining evolutionary algorithms with distributed computing, LEAF achieves strong performance on medical image classification and natural language analysis tasks, helping to democratize AI by simplifying the configuration of DNNs. At its core, the algorithm layer runs CoDeepNEAT, a cooperative coevolutionary algorithm that evolves DNN architectures and hyperparameters. The system layer parallelizes candidate training on cloud infrastructure, while the problem-domain layer applies the other two layers to tasks such as hyperparameter tuning and network complexity minimization. Together, these layers provide a foundation for advancing AutoML and its practical applications.
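
To make the cooperative coevolution idea concrete, the following is a minimal, self-contained Python sketch of a CoDeepNEAT-style loop: one population of modules (small layer sequences) and one population of blueprints (references to module slots) are evolved together, with each assembled network's fitness credited back to both the blueprint and the modules it used. All names here (LAYER_TYPES, evaluate, mutate_module, the population size) are illustrative stand-ins, not LEAF's actual API, and the fitness function is a placeholder for the validation metric of a partially trained DNN.

```python
import random

# Hypothetical search space; the real LEAF/CoDeepNEAT space is far richer
# (layer types, connectivity graphs, global hyperparameters, etc.).
LAYER_TYPES = ["conv3x3", "conv5x5", "dense", "dropout"]
POP_SIZE = 10

def random_module():
    """A module: a short sequence of layers with simple hyperparameters."""
    return [(random.choice(LAYER_TYPES), random.choice([16, 32, 64]))
            for _ in range(random.randint(1, 3))]

def random_blueprint(n_slots=3):
    """A blueprint: indices pointing at slots in the module population."""
    return [random.randrange(POP_SIZE) for _ in range(n_slots)]

def assemble(blueprint, modules):
    """Assemble a candidate network by substituting modules into the blueprint."""
    return [layer for idx in blueprint for layer in modules[idx]]

def evaluate(network):
    """Stand-in fitness. In LEAF this would be the validation metric of a
    partially trained DNN, computed in parallel on cloud workers."""
    return sum(width for _, width in network) / (1 + len(network))

def mutate_module(module):
    """Replace one randomly chosen layer of a copied module."""
    module = list(module)
    i = random.randrange(len(module))
    module[i] = (random.choice(LAYER_TYPES), random.choice([16, 32, 64]))
    return module

modules = [random_module() for _ in range(POP_SIZE)]
blueprints = [random_blueprint() for _ in range(POP_SIZE)]

for generation in range(5):
    # Evaluate assembled networks; credit fitness back to both populations.
    module_fitness = [0.0] * POP_SIZE
    blueprint_fitness = []
    for bp in blueprints:
        fit = evaluate(assemble(bp, modules))
        blueprint_fitness.append(fit)
        for idx in bp:
            module_fitness[idx] = max(module_fitness[idx], fit)

    # Keep the better half of each population and refill by mutation/resampling.
    keep = POP_SIZE // 2
    best_bp = [bp for _, bp in sorted(zip(blueprint_fitness, blueprints),
                                      key=lambda p: p[0], reverse=True)][:keep]
    best_mod = [m for _, m in sorted(zip(module_fitness, modules),
                                     key=lambda p: p[0], reverse=True)][:keep]
    blueprints = best_bp + [random_blueprint() for _ in range(POP_SIZE - keep)]
    modules = best_mod + [mutate_module(random.choice(best_mod))
                          for _ in range(POP_SIZE - keep)]

    print(f"gen {generation}: best fitness {max(blueprint_fitness):.2f}")
```

The key design point this sketch illustrates is credit assignment: a module's fitness comes from the networks it appears in, so useful building blocks survive even when some blueprints that contain them perform poorly.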