Title

Ensemble diverse hypotheses and knowledge distillation for unsupervised cross-subject adaptation

Author
Zhang, Kuangen; Chen, Jiahong; Wang, Jing; et al.
Corresponding Author
Fu, Chenglong
Publication Date
2023-05-01
Source Title
Information Fusion
ISSN
1566-2535
EISSN
1872-6305
Volume
93
Pages
268-281
Abstract
Human intent prediction (HIP) and human activity recognition (HAR) are important for human–robot interaction. However, human–robot interface signals are user-dependent: a classifier trained on labeled source subjects performs poorly on unlabeled target subjects. Moreover, previous methods used a single learner, which may learn only a subset of features and thus degrade performance on target subjects. Finally, HIP and HAR require real-time computing on edge devices, whose computational capabilities limit the model size. To address these issues, this paper designs an ensemble diverse hypotheses (EDH) and knowledge distillation (EDHKD) method. EDH mitigates cross-subject divergence by training feature generators to minimize the upper bound of the classification discrepancy among multiple classifiers. EDH also maximizes the discrepancy among multiple feature generators to learn diverse and complete features. After training EDH, a lightweight student network (EDHKD) distills the knowledge from EDH into a single feature generator and classifier, which greatly reduces the model size while remaining accurate. The performance of EDHKD is demonstrated theoretically and validated experimentally. Results show that EDH can learn diverse features and adapt well to unknown target subjects. With only the soft labels provided by EDH, the student network (EDHKD) inherits the knowledge learned by EDH and classifies unlabeled target data of a 2D moon dataset and two human locomotion datasets with accuracies of 96.9%, 94.4%, and 97.4%, respectively, within 1 millisecond. Compared with the benchmark method, EDHKD improves the target-domain classification accuracy by 1.3% and 7.1% on the two human locomotion datasets and also stabilizes the learning curves. Therefore, EDHKD significantly increases the generalization ability and efficiency of HIP and HAR.
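The abstract's training recipe has two technical pieces: adversarial-style training that reduces the classification discrepancy among classifiers with respect to the feature generators, and distillation of the ensemble's soft labels into a single lightweight student. The minimal Python/PyTorch sketch below illustrates these two pieces in isolation; the layer sizes, the names StudentNet, classifier_discrepancy, and distill_step, the temperature value, and the L1 form of the discrepancy are illustrative assumptions (the discrepancy form is borrowed from the MCD family of methods), not the authors' actual implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

def classifier_discrepancy(p1, p2):
    # L1 discrepancy between two classifiers' softmax outputs; EDH's
    # generators are trained to reduce (an upper bound of) this quantity
    # on target data. The exact L1 form is an assumption borrowed from
    # the MCD family of methods.
    return (p1 - p2).abs().mean()

class StudentNet(nn.Module):
    # Single feature generator + classifier; all layer sizes are
    # hypothetical placeholders.
    def __init__(self, in_dim=64, n_classes=5):
        super().__init__()
        self.generator = nn.Sequential(nn.Linear(in_dim, 32), nn.ReLU())
        self.classifier = nn.Linear(32, n_classes)

    def forward(self, x):
        return self.classifier(self.generator(x))

def distill_step(student, optimizer, x, teacher_soft_labels, T=2.0):
    # One distillation update: match the student's tempered predictive
    # distribution to the teacher ensemble's averaged soft labels via
    # KL divergence (temperature T is an assumed hyperparameter).
    optimizer.zero_grad()
    log_p_student = F.log_softmax(student(x) / T, dim=1)
    loss = F.kl_div(log_p_student, teacher_soft_labels, reduction="batchmean")
    loss.backward()
    optimizer.step()
    return loss.item()

# Toy usage with random stand-in data; the real inputs would be unlabeled
# target-subject signals and EDH's averaged classifier outputs.
student = StudentNet()
opt = torch.optim.Adam(student.parameters(), lr=1e-3)
x = torch.randn(16, 64)
teacher_soft = F.softmax(torch.randn(16, 5) / 2.0, dim=1)
print(distill_step(student, opt, x, teacher_soft))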
Language
English
SUSTech Authorship
First; Corresponding
Funding Project
National Natural Science Foundation of China [U1913205, 62103180, 51805237, 52175272]; Guangdong Basic and Applied Basic Research Foundation [2020B1515120098]; Stable Support Plan Program of Shenzhen Natural Science Fund [20200925174640002]; Science, Technology, and Innovation Commission of Shenzhen Municipality [ZDSYS-20200811143601004]; China Postdoctoral Science Foundation [2021M701577]
WOS Research Area
Computer Science
WOS Subject
Computer Science, Artificial Intelligence ; Computer Science, Theory & Methods
WOS Accession No
WOS:001012832400001
Scopus EID
2-s2.0-85146055105
Data Source
Scopus
Citation statistics
Cited Times [WOS]: 0
Document Type
Journal Article
Identifier
http://kc.sustech.edu.cn/handle/2SGJ60CL/442568
Department
Department of Mechanical and Energy Engineering
Affiliation
1. Shenzhen Key Laboratory of Biomimetic Robotics and Intelligent Systems, Department of Mechanical and Energy Engineering, Southern University of Science and Technology, Shenzhen, 518055, China
2. Guangdong Provincial Key Laboratory of Human-Augmentation and Rehabilitation Robotics in Universities, Southern University of Science and Technology, Shenzhen, 518055, China
3. Department of Mechanical Engineering, The University of British Columbia, Vancouver, Canada
4. Department of Electrical and Computer Engineering, The University of British Columbia, Vancouver, Canada
First Author Affiliation
Department of Mechanical and Energy Engineering; Southern University of Science and Technology
Corresponding Author Affiliation
Department of Mechanical and Energy Engineering; Southern University of Science and Technology
First Author's First Affiliation
Department of Mechanical and Energy Engineering
Recommended Citation
GB/T 7714
Zhang, Kuangen, Chen, Jiahong, Wang, Jing, et al. Ensemble diverse hypotheses and knowledge distillation for unsupervised cross-subject adaptation[J]. Information Fusion, 2023, 93: 268-281.
APA
Zhang, Kuangen., Chen, Jiahong., Wang, Jing., Chen, Xinxing., Leng, Yuquan., ... & Fu, Chenglong. (2023). Ensemble diverse hypotheses and knowledge distillation for unsupervised cross-subject adaptation. Information Fusion, 93, 268-281.
MLA
Zhang, Kuangen, et al. "Ensemble diverse hypotheses and knowledge distillation for unsupervised cross-subject adaptation". Information Fusion 93 (2023): 268-281.
Files in This Item:
There are no files associated with this item.

Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.