Title

Lightweight single image super-resolution with attentive residual refinement network

Author
Qin, Jinghui; Zhang, Rumin (Corresponding Author)
Publication Years
2022-08-21
DOI
Source Title
NEUROCOMPUTING
ISSN
0925-2312
EISSN
1872-8286
Volume
500
Abstract
In recent years, deep convolutional neural network (CNN) based single image super-resolution (SISR) methods have demonstrated impressive performance in terms of quantitative metrics and visual effects. Most CNN-based SISR methods can learn the complex non-linear mapping between low-resolution (LR) images and their corresponding high-resolution (HR) images thanks to the powerful representation capabilities of deep convolutional neural networks. However, as the depth and width of SISR networks increase, their parameter counts grow dramatically, leading to huge computational cost and large memory consumption and making them impractical in real-world applications. To address these issues, we propose an accurate and lightweight deep convolutional neural network, named Attentive Residual Refinement Network (ARRFN), to recover the high-resolution image directly from the original low-resolution image for SISR. Our proposed ARRFN consists of three parts: a feature extraction block, a stack of attentive residual refinement blocks (ARRFB), and a multi-scale separable upsampling module (MSSU). Specifically, each ARRFB consists of two branches: a regular residual learning branch and an attentive residual refinement branch. The former conducts regular residual learning with two residual blocks, while the latter refines the residual information from those two residual blocks with an attentive residual mechanism to further enhance the representation capabilities of the network. Furthermore, the multi-scale separable upsampling module (MSSU) is proposed to replace the regular upsampling operation for better SR results. Extensive experiments on several standard benchmarks show that the proposed method outperforms state-of-the-art SR methods in terms of quantitative metrics, visual quality, memory footprint, and inference time. (c) 2022 Elsevier B.V. All rights reserved.
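The attentive residual refinement described in the abstract can be illustrated with a minimal NumPy sketch. This is a hypothetical reconstruction, not the paper's actual ARRFB: it assumes a squeeze-and-excitation-style channel-attention gate applied to the summed outputs of the two residual blocks before they are added back to the input features; the paper's exact attention mechanism and block design may differ.

```python
import numpy as np

# Feature maps use shape (channels, height, width).

def channel_attention(x):
    """Channel-attention gate (assumed design): global average pool per
    channel, then a sigmoid to produce per-channel weights in (0, 1)."""
    pooled = x.mean(axis=(1, 2))             # (C,)
    weights = 1.0 / (1.0 + np.exp(-pooled))  # sigmoid gate, (C,)
    return weights[:, None, None]            # broadcastable to (C, H, W)

def attentive_residual_refine(x, residual1, residual2):
    """Refine the residual information coming from two residual blocks
    with the attention gate, then add it back to the input features."""
    combined = residual1 + residual2
    refined = channel_attention(combined) * combined
    return x + refined

rng = np.random.default_rng(0)
x = rng.standard_normal((8, 4, 4))   # input features
r1 = rng.standard_normal((8, 4, 4))  # output of first residual block
r2 = rng.standard_normal((8, 4, 4))  # output of second residual block
out = attentive_residual_refine(x, r1, r2)
print(out.shape)  # (8, 4, 4)
```

The gate's per-channel weights lie strictly in (0, 1), so the refinement scales each channel's residual rather than replacing it, which is consistent with the abstract's description of enhancing representation while keeping a residual structure.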
Keywords
URL[Source Record]
Indexed By
SCI ; EI
Language
English
SUSTech Authorship
Corresponding
Funding Project
National Key R&D Program of the Ministry of Science and Technology[2021YFE0204000] ; National Natural Science Foundation of Guangdong Province[2022A1515011835] ; China Postdoctoral Science Foundation[2021M703687]
WOS Research Area
Computer Science
WOS Subject
Computer Science, Artificial Intelligence
WOS Accession No
WOS:000822674600001
Publisher
ESI Research Field
COMPUTER SCIENCE
Data Source
Web of Science
Citation statistics
Cited Times [WOS]: 3
Document Type
Journal Article
Identifier
http://kc.sustech.edu.cn/handle/2SGJ60CL/356191
Department
SUSTech Institute of Microelectronics
Affiliation
1.Sun Yat sen Univ, Sch Comp Sci & Engn, Guangzhou 510006, Peoples R China
2.Southern Univ Sci & Technol, Sch Microelect, Shenzhen 518055, Peoples R China
Corresponding Author Affiliation
SUSTech Institute of Microelectronics
Recommended Citation
GB/T 7714
Qin, Jinghui, Zhang, Rumin. Lightweight single image super-resolution with attentive residual refinement network[J]. NEUROCOMPUTING, 2022, 500.
APA
Qin, Jinghui, & Zhang, Rumin. (2022). Lightweight single image super-resolution with attentive residual refinement network. NEUROCOMPUTING, 500.
MLA
Qin, Jinghui, et al. "Lightweight single image super-resolution with attentive residual refinement network." NEUROCOMPUTING 500 (2022).
Files in This Item:
There are no files associated with this item.

Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.