Knowledge Distillation: A Survey

By J. Gou et al.
Published on May 20, 2021

Table of Contents

Abstract
Introduction
1. Knowledge Distillation
2. Knowledge Types
3. Distillation Strategies
4. Teacher-Student Architectures
5. Latest Knowledge Distillation Approaches
6. Performance Comparison
7. Applications of Knowledge Distillation
8. Challenges and Future Directions

Summary

This paper provides a comprehensive survey of knowledge distillation, organized around knowledge categories, training schemes, teacher-student architectures, distillation algorithms, performance comparisons, and real-world applications. It also discusses open challenges in knowledge distillation and offers directions for future research. The authors present a detailed overview of the field and its recent progress, highlighting its importance for deep learning and model compression.
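As context for the survey's scope, below is a minimal sketch of the classic soft-target distillation loss (Hinton et al., 2015) that much of the surveyed work builds on: the student is trained to match the teacher's temperature-softened output distribution in addition to the ground-truth labels. The temperature `T`, blending weight `alpha`, and function name are illustrative choices, not the survey's notation.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Blend KL divergence on softened teacher/student distributions
    with standard cross-entropy on the hard labels.
    T and alpha are illustrative hyperparameters."""
    # Temperature T > 1 softens the teacher's distribution, exposing
    # inter-class similarities ("dark knowledge") to the student.
    soft_targets = F.softmax(teacher_logits / T, dim=-1)
    log_student = F.log_softmax(student_logits / T, dim=-1)
    # Scaling by T^2 keeps the soft-target gradients comparable in
    # magnitude as the temperature changes.
    kd_term = F.kl_div(log_student, soft_targets, reduction="batchmean") * (T * T)
    ce_term = F.cross_entropy(student_logits, labels)
    return alpha * kd_term + (1.0 - alpha) * ce_term
```

The surveyed methods generalize this recipe along the axes listed in the table of contents: what is transferred (logits, intermediate features, relations), when transfer happens (offline, online, self-distillation), and between which architectures.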