Summary
This paper provides a comprehensive survey of knowledge distillation, organized around knowledge categories, training schemes, teacher-student architectures, distillation algorithms, performance comparisons, and real-world applications. It also discusses open challenges in knowledge distillation and offers insights for future research. The authors present a detailed overview of the field and its recent progress, highlighting its importance for deep learning and model compression.
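To make the core idea concrete, the most common form of distillation covered in such surveys is response-based: the student is trained to match the teacher's temperature-softened output distribution in addition to the ground-truth labels. Below is a minimal sketch of that loss, assuming PyTorch; the function and variable names are illustrative, not the survey's reference implementation.

```python
# Minimal sketch of a response-based distillation loss (soft targets +
# hard labels), assuming PyTorch. Names are illustrative only.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.5):
    """Blend a soft-target KL term (teacher -> student) with hard-label CE."""
    # Soften both distributions with the temperature T.
    soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    log_soft_student = F.log_softmax(student_logits / temperature, dim=-1)
    # The KL term is scaled by T^2 so its gradients stay comparable to CE.
    kd_term = F.kl_div(log_soft_student, soft_teacher,
                       reduction="batchmean") * temperature ** 2
    ce_term = F.cross_entropy(student_logits, labels)
    return alpha * kd_term + (1.0 - alpha) * ce_term

# Illustrative usage with random logits for a 10-class problem.
student_logits = torch.randn(8, 10, requires_grad=True)
teacher_logits = torch.randn(8, 10)
labels = torch.randint(0, 10, (8,))
loss = distillation_loss(student_logits, teacher_logits, labels)
loss.backward()
```

Other knowledge categories discussed in the survey (feature-based and relation-based knowledge) replace or augment the soft-target term with losses on intermediate representations or on pairwise relations between examples.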