BookKD: A novel knowledge distillation for reducing distillation costs by decoupling knowledge generation and learning
Knowledge distillation guides the training of student networks and enhances their performance by drawing on high-capacity teacher networks. Along with these performance gains, however, knowledge distillation carries a heavy computational burden, sometimes tens or even hundreds of times that of conventional training. This paper therefore proposes book-based knowledge distillation (BookKD) to minimize the costs of knowledge distillation while improving performance. First, a decoupling-based knowledge distillation framework is designed: by splitting the traditional distillation process into two independent sub-processes, book-making and book-learning, distillation can be completed with little resource consumption. Second, a book-making method based on knowledge ensemble and knowledge regularization is developed, which builds books by organizing and processing the knowledge generated by the teachers; these books can then stand in for the teachers and supply sufficient knowledge at little distillation cost. Finally, a book-learning method based on dynamic entropy adjustment and label smoothing is designed. The dynamic entropy adjustment shapes the training loss and eases the difficulty student networks face in learning from books, while label smoothing alleviates the student network's over-confidence in ground-truth labels and so directs more of its attention to the class-similarity knowledge in the books. BookKD is evaluated on three image classification datasets, CIFAR-100, ImageNet, and ImageNet100, and on the PASCAL VOC 2007 object detection dataset. The experimental results demonstrate the advantages of BookKD in both reducing distillation costs and improving distillation performance.
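For concreteness, the sketch below illustrates the two decoupled sub-processes described in the abstract. It is a minimal sketch under stated assumptions, not the paper's implementation: the names make_book and book_learning_loss are invented, knowledge regularization is modeled here as uniform label smoothing of the ensemble output, and the dynamic entropy adjustment is modeled as a per-sample reweighting by the entropy of each book entry; the paper's exact formulations may differ.

```python
# Hypothetical sketch of BookKD's two decoupled phases (PyTorch).
# All function names and loss forms are illustrative assumptions.
import torch
import torch.nn.functional as F

@torch.no_grad()
def make_book(teachers, loader, num_classes, T=4.0, smooth=0.1, device="cpu"):
    """Phase 1 (book-making): run the teacher ensemble once, average their
    temperature-softened predictions, regularize them, and cache the result
    as a 'book'. The teachers are never needed again after this phase."""
    probs, labels = [], []
    for x, y in loader:
        x = x.to(device)
        # Knowledge ensemble: mean of the teachers' softened outputs.
        p = torch.stack([F.softmax(t(x) / T, dim=1) for t in teachers]).mean(0)
        # Knowledge regularization (assumed here to be uniform smoothing).
        p = (1.0 - smooth) * p + smooth / num_classes
        probs.append(p.cpu())
        labels.append(y)
    return {"probs": torch.cat(probs), "labels": torch.cat(labels)}

def book_learning_loss(student_logits, book_probs, targets,
                       num_classes, T=4.0, eps=0.1):
    """Phase 2 (book-learning): train the student from the cached book alone.
    Label smoothing curbs over-confidence in the hard labels; as an assumed
    form of the dynamic entropy adjustment, each sample's distillation term
    is down-weighted when its book entry is high-entropy (ambiguous)."""
    # KL divergence between the student's softened output and the book entry.
    log_q = F.log_softmax(student_logits / T, dim=1)
    kl = F.kl_div(log_q, book_probs, reduction="none").sum(1) * (T * T)
    # Entropy-based per-sample weight in [0, 1].
    ent = -(book_probs * book_probs.clamp_min(1e-8).log()).sum(1)
    w = 1.0 - ent / torch.log(torch.tensor(float(num_classes)))
    # Cross-entropy with label smoothing against the ground-truth labels.
    ce = F.cross_entropy(student_logits, targets, label_smoothing=eps)
    return (w * kl).mean() + ce
```

The cost saving in this reading comes from the decoupling itself: the teacher ensemble is evaluated exactly once to produce the book, after which any number of students can be trained from the cached probabilities without ever re-running a teacher.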
|Funding||National Natural Science Foundation of China|
|WOS Research Area||Computer Science, Artificial Intelligence|
|Cited Times [WOS]||0|
|Document Type||Journal Article|
|Department||Research Institute of Trustworthy Autonomous Systems|
1. Key Laboratory of Intelligent Perception and Image Understanding of Ministry of Education, School of Artificial Intelligence, Xidian University, Xi'an, Shaanxi Province, 710071, China
2. The Research Institute of Trustworthy Autonomous Systems, Southern University of Science and Technology, Shenzhen, 518055, China
3. The Institute of Medical Artificial Intelligence, The Second Affiliated Hospital of Xi'an Jiaotong University, Xi'an, 710004, China
Zhu Songling, Shang Ronghua, Tang Ke, et al. BookKD: A novel knowledge distillation for reducing distillation costs by decoupling knowledge generation and learning[J]. Knowledge-Based Systems, 2023, 279.
Zhu, S., Shang, R., Tang, K., Xu, S., & Li, Y. (2023). BookKD: A novel knowledge distillation for reducing distillation costs by decoupling knowledge generation and learning. Knowledge-Based Systems, 279.
Zhu, Songling, et al. "BookKD: A novel knowledge distillation for reducing distillation costs by decoupling knowledge generation and learning." Knowledge-Based Systems 279 (2023).