Title | Learning Dual-Fused Modality-Aware Representations for RGBD Tracking |
Author | |
Corresponding Author | Feng Zheng |
DOI | |
Publication Years | 2022-11-15 |
Conference Name | European Conference on Computer Vision 2022 |
Conference Date | 2022/10/23-2022/10/27 |
Conference Place | Tel Aviv |
Abstract | Object tracking aims to localize an arbitrary object in a video sequence, given only the object description in the first frame. It has numerous applications in video surveillance, autonomous driving [18,35,23], and robotics [19]. Recent years have witnessed the development of RGBD (RGB+Depth) object tracking, thanks to affordable and accurate depth cameras. RGBD tracking aims to track objects more robustly and accurately with the help of depth information, even in scenarios where color fails, e.g., target occlusion and dark scenes. Compared to conventional RGB-based tracking, the major difficulty of RGBD |
Keywords | |
SUSTech Authorship | First; Corresponding |
Language | English |
Data Source | Manual submission |
Publication Status | Published online |
Citation Statistics | Cited Times [WOS]: 0 |
Document Type | Conference paper |
Identifier | http://kc.sustech.edu.cn/handle/2SGJ60CL/415606 |
Department | Department of Computer Science and Engineering |
Affiliation | 1. Department of Computer Science and Engineering, Southern University of Science and Technology, Shenzhen, China 2. University of Birmingham, Birmingham, United Kingdom 3. University of Electronic Science and Technology of China, Chengdu, China |
First Author Affiliation | Department of Computer Science and Engineering |
Corresponding Author Affiliation | Department of Computer Science and Engineering |
First Author's First Affiliation | Department of Computer Science and Engineering |
Recommended Citation GB/T 7714 | Shang Gao, Jinyu Yang, Zhe Li, et al. Learning Dual-Fused Modality-Aware Representations for RGBD Tracking[C], 2022. |
Files in This Item: | There are no files associated with this item. |
|
Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.