Title | Computational approaches for the reconstruction of optic nerve fibers along the visual pathway from medical images: a comprehensive review |
Author | |
Corresponding Author | Jin,Richu; Liu,Jiang |
Publication Years | 2023 |
DOI | |
Source Title | Frontiers in Neuroscience |
ISSN | 1662-4548 |
EISSN | 1662-453X |
Volume | 17 |
Abstract | Optic nerve fibers in the visual pathway play significant roles in vision formation. Damage to optic nerve fibers is a biomarker for the diagnosis of various ophthalmological and neurological diseases; in addition, optic nerve fibers must be protected from damage during neurosurgery and radiation therapy. Reconstruction of optic nerve fibers from medical images can facilitate all of these clinical applications. Although many computational methods have been developed for the reconstruction of optic nerve fibers, a comprehensive review of these methods is still lacking. This paper describes the two strategies for optic nerve fiber reconstruction applied in existing studies, i.e., image segmentation and fiber tracking. Compared with image segmentation, fiber tracking can delineate more detailed structures of optic nerve fibers. For each strategy, both conventional and AI-based approaches are introduced; the latter usually demonstrate better performance than the former. From the review, we conclude that AI-based methods are the trend for optic nerve fiber reconstruction and that new techniques such as generative AI can help address the current challenges in optic nerve fiber reconstruction. |
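For orientation, the "fiber tracking" strategy highlighted in the abstract is commonly realized as streamline tractography: starting from seed points, a curve is grown step by step along the local principal diffusion direction estimated from diffusion MRI. The sketch below illustrates that idea only; the synthetic direction field, the `track_streamline` name, and the step and angle thresholds are assumptions for demonstration, not methods taken from the paper.

```python
# A minimal sketch of deterministic streamline fiber tracking, the "fiber
# tracking" strategy named in the abstract. Everything here is illustrative:
# the synthetic direction field, the function name, and the step/angle
# thresholds are assumptions, not the methods surveyed in the review.
import numpy as np

def track_streamline(direction_field, seed, step=0.5, max_steps=200):
    """Follow the local principal diffusion direction from a seed point.

    direction_field: (X, Y, Z, 3) array of per-voxel unit vectors
                     (e.g., principal eigenvectors from diffusion MRI).
    seed: starting coordinate in voxel space.
    """
    points = [np.asarray(seed, dtype=float)]
    prev_dir = None
    for _ in range(max_steps):
        p = points[-1]
        idx = tuple(np.round(p).astype(int))
        # Stop when the streamline leaves the image volume.
        if any(i < 0 or i >= s for i, s in zip(idx, direction_field.shape[:3])):
            break
        d = direction_field[idx]
        if np.linalg.norm(d) < 1e-6:  # no coherent fiber direction here
            break
        d = d / np.linalg.norm(d)
        # Fiber directions are sign-ambiguous; keep propagation consistent.
        if prev_dir is not None and np.dot(d, prev_dir) < 0:
            d = -d
        # Stop on implausibly sharp turns (angle > ~60 degrees).
        if prev_dir is not None and np.dot(d, prev_dir) < 0.5:
            break
        points.append(p + step * d)
        prev_dir = d
    return np.array(points)

# Toy usage: a field pointing uniformly along the x-axis.
field = np.zeros((20, 20, 20, 3))
field[..., 0] = 1.0
print(track_streamline(field, seed=(2.0, 10.0, 10.0)).shape)  # (N, 3)
```

Image segmentation, by contrast, would label the voxels belonging to visual-pathway structures rather than trace individual trajectories, which is why the abstract notes that fiber tracking recovers more detailed fiber geometry.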
Keywords | |
URL | [Source Record] |
Indexed By | |
Language | English |
SUSTech Authorship | First; Corresponding |
Funding Project | National Natural Science Foundation of China[62101236, 82102189]; General Program of National Natural Science Foundation of China[82272086]; Guangdong Provincial Department of Education[2020ZDZX3043]; Guangdong Provincial Key Laboratory[2020B121201001]; Shenzhen Natural Science Fund[JCYJ20200109140820699]; Stable Support Plan Program[20200925174052004] |
WOS Research Area | Neurosciences & Neurology |
WOS Subject | Neurosciences |
WOS Accession No | WOS:001003123100001 |
Publisher | Frontiers Media S.A. |
Scopus EID | 2-s2.0-85161427109 |
Data Source | Scopus |
Citation statistics | Cited Times [WOS]: 0 |
Document Type | Journal Article |
Identifier | http://kc.sustech.edu.cn/handle/2SGJ60CL/560290 |
Department | Research Institute of Trustworthy Autonomous Systems; College of Engineering, Department of Computer Science and Engineering |
Affiliation | 1. Research Institute of Trustworthy Autonomous Systems, Southern University of Science and Technology, Shenzhen, China; 2. Department of Computer Science and Engineering, Southern University of Science and Technology, Shenzhen, China; 3. Guangdong Provincial Key Laboratory of Brain-inspired Intelligent Computation, Department of Computer Science and Engineering, Southern University of Science and Technology, Shenzhen, China |
First Author Affiliation | Research Institute of Trustworthy Autonomous Systems; Department of Computer Science and Engineering |
Corresponding Author Affiliation | Research Institute of Trustworthy Autonomous Systems; Department of Computer Science and Engineering |
First Author's First Affiliation | Research Institute of Trustworthy Autonomous Systems |
Recommended Citation GB/T 7714 | Jin, Richu, Cai, Yongning, Zhang, Shiyang, et al. Computational approaches for the reconstruction of optic nerve fibers along the visual pathway from medical images: a comprehensive review[J]. Frontiers in Neuroscience, 2023, 17. |
APA | Jin, Richu., Cai, Yongning., Zhang, Shiyang., Yang, Ting., Feng, Haibo., ... & Liu, Jiang. (2023). Computational approaches for the reconstruction of optic nerve fibers along the visual pathway from medical images: a comprehensive review. Frontiers in Neuroscience, 17. |
MLA | Jin, Richu, et al. "Computational approaches for the reconstruction of optic nerve fibers along the visual pathway from medical images: a comprehensive review". Frontiers in Neuroscience 17 (2023). |
Files in This Item: | There are no files associated with this item. |