Ensemble diverse hypotheses and knowledge distillation for unsupervised cross-subject adaptation
Human intent prediction (HIP) and human activity recognition (HAR) are important for human–robot interaction. However, human–robot interface signals are user-dependent: a classifier trained on labeled source subjects performs poorly on unlabeled target subjects. Moreover, previous methods used a single learner, which may learn only a subset of features and thus degrade performance on target subjects. Finally, HIP and HAR require real-time computing on edge devices, whose computational capabilities limit the model size. To address these issues, this paper designs an ensemble diverse hypotheses (EDH) and knowledge distillation (EDHKD) method. EDH mitigates cross-subject divergence by training feature generators to minimize an upper bound of the classification discrepancy among multiple classifiers. EDH also maximizes the discrepancy among multiple feature generators to learn diverse and complete features. After training EDH, a lightweight student network (EDHKD) distills the knowledge from EDH into a single feature generator and classifier, which greatly reduces the model size while remaining accurate. The performance of EDHKD is theoretically demonstrated and experimentally validated. Results show that EDH can learn diverse features and adapt well to unknown target subjects. Using only the soft labels provided by EDH, the student network (EDHKD) inherits the knowledge learned by EDH and classifies unlabeled target data of a 2D moon dataset and two human locomotion datasets with accuracies of 96.9%, 94.4%, and 97.4%, respectively, within 1 ms. Compared with the benchmark method, EDHKD improves target-domain classification accuracy by 1.3% and 7.1% on the two human locomotion datasets, and it also stabilizes the learning curves. Therefore, EDHKD significantly improves the generalization ability and efficiency of HIP and HAR.
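The abstract describes two trainable objectives (a classification discrepancy that the generators minimize, and a generator diversity that is maximized) plus a distillation step. The following is a minimal sketch of how such losses could be wired up; it assumes PyTorch, and the module names (`generators`, `classifiers`, `student`), the L1 pairwise discrepancy measure, and the temperature `T` are illustrative assumptions, not the paper's exact formulation.

```python
# Minimal sketch of EDH-style losses and the EDHKD distillation step.
# Assumes PyTorch; module names, the L1 discrepancy measure, and the
# temperature T are illustrative, not the paper's exact formulation.
import torch.nn.functional as F

def mean_pairwise_discrepancy(outputs):
    """Average L1 distance over all pairs of ensemble outputs."""
    total, pairs = 0.0, 0
    for i in range(len(outputs)):
        for j in range(i + 1, len(outputs)):
            total = total + (outputs[i] - outputs[j]).abs().mean()
            pairs += 1
    return total / max(pairs, 1)

def edh_losses(generators, classifiers, x_src, y_src, x_tgt):
    # Supervised loss on labeled source subjects, averaged over the ensemble.
    src_feats = [g(x_src) for g in generators]
    cls_loss = sum(F.cross_entropy(c(f), y_src)
                   for f, c in zip(src_feats, classifiers)) / len(classifiers)

    # Classification discrepancy on unlabeled target subjects: the classifiers
    # maximize it, while the generators minimize (an upper bound of) it.
    tgt_feats = [g(x_tgt) for g in generators]
    tgt_probs = [F.softmax(c(f), dim=1) for f, c in zip(tgt_feats, classifiers)]
    discrepancy = mean_pairwise_discrepancy(tgt_probs)

    # Feature diversity: maximized so the generators learn complementary
    # features instead of one shared subset.
    diversity = mean_pairwise_discrepancy(
        [F.normalize(f, dim=1) for f in tgt_feats])
    return cls_loss, discrepancy, diversity

def edhkd_loss(student, teacher_probs, x_tgt, T=2.0):
    # Distill the ensemble's averaged soft labels into a single lightweight
    # student; teacher_probs is assumed to be the mean of the EDH
    # classifiers' softmax outputs on x_tgt.
    log_p = F.log_softmax(student(x_tgt) / T, dim=1)
    return F.kl_div(log_p, teacher_probs, reduction="batchmean") * (T * T)
```

In adversarial discrepancy methods of this kind, the classifier and generator objectives are typically optimized in alternating steps; the weighting of the terms and the exact upper bound minimized by the generators are given in the paper itself.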
|Cited Times [WOS]||0|
|Document Type||Journal Article|
|Department||Department of Mechanical and Energy Engineering|
1. Shenzhen Key Laboratory of Biomimetic Robotics and Intelligent Systems, Department of Mechanical and Energy Engineering, Southern University of Science and Technology, Shenzhen, 518055, China
2. Guangdong Provincial Key Laboratory of Human-Augmentation and Rehabilitation Robotics in Universities, Southern University of Science and Technology, Shenzhen, 518055, China
3. Department of Mechanical Engineering, The University of British Columbia, Vancouver, Canada
4. Department of Electrical and Computer Engineering, The University of British Columbia, Vancouver, Canada
|First Author Affiliation||Department of Mechanical and Energy Engineering; Southern University of Science and Technology|
|Corresponding Author Affiliation||Department of Mechanical and Energy Engineering; Southern University of Science and Technology|
|First Author's First Affiliation||Department of Mechanical and Energy Engineering|
Zhang, Kuangen, Chen, Jiahong, Wang, Jing, et al. Ensemble diverse hypotheses and knowledge distillation for unsupervised cross-subject adaptation[J]. Information Fusion, 2023, 93: 268-281.
Zhang, Kuangen, Chen, Jiahong, Wang, Jing, Chen, Xinxing, Leng, Yuquan, ... & Fu, Chenglong. (2023). Ensemble diverse hypotheses and knowledge distillation for unsupervised cross-subject adaptation. Information Fusion, 93, 268-281.
Zhang, Kuangen, et al. "Ensemble diverse hypotheses and knowledge distillation for unsupervised cross-subject adaptation". Information Fusion 93 (2023): 268-281.
|Files in This Item:||There are no files associated with this item.|
|Similar articles in Google Scholar|
|Similar articles in Baidu Scholar|
|Similar articles in Bing Scholar|
Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.