Title

An Energy-Efficient Mixed-Bitwidth Systolic Accelerator for NAS-Optimized Deep Neural Networks

Author
Publication Year
2022
DOI
Source Title
IEEE Transactions on Very Large Scale Integration (VLSI) Systems
ISSN
1063-8210
EISSN
1557-9999
Volume: PP, Issue: 99, Pages: 1-13
Abstract
Optimized deep neural network (DNN) models and energy-efficient hardware designs are of great importance in edge-computing applications. Neural architecture search (NAS) methods are employed to optimize DNN models with mixed-bitwidth networks. To satisfy the resulting computation requirements, mixed-bitwidth convolution accelerators are highly desirable for low-power and high-throughput performance. Several methods exist to support mixed-bitwidth multiply-accumulate (MAC) operations in DNN accelerator designs. The low-bitwidth-combination (LBC) method improves low-bitwidth throughput but at a large hardware cost. The high-bitwidth-split (HBS) method minimizes the additional logic gates needed for configuration; however, its throughput in the low-bitwidth mode is poor. In this work, a bit-split-and-combination (BSC) systolic accelerator is proposed. The BSC-based MAC unit is designed to support mixed-bitwidth operations with the best overall performance. In addition, the inter-processing-element (PE) systolic and intra-PE parallel dataflow not only improves throughput in mixed-bitwidth modes but also reduces the power consumed by data transmission. The proposed design is synthesized in a 28-nm process. The BSC MAC unit achieves up to 2.08× and 1.75× energy-efficiency improvement over the HBS and LBC units, respectively. Compared with state-of-the-art accelerators, the proposed work also achieves excellent energy efficiency of 20.02, 23.55, and 30.17 TOPS/W on mixed-bitwidth VGG-16, ResNet-18, and LeNet-5 benchmarks at 0.6 V, respectively.
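The abstract contrasts split-and-combine styles of mixed-bitwidth MAC units. The following minimal Python sketch is illustrative only, not the paper's actual BSC circuit; the function names mul8_from_4bit and mac4_parallel are hypothetical. It shows the underlying arithmetic idea assumed by such designs: an 8-bit by 8-bit product can be assembled from four 4-bit by 4-bit partial products, while in a low-bitwidth mode the same four small multipliers could instead produce four independent products per cycle, which is the throughput trade-off the abstract describes.

```python
# Behavioural sketch of split-and-combine mixed-bitwidth multiplication.
# Illustrative only; not the BSC hardware design from the paper.

def mul8_from_4bit(a: int, b: int) -> int:
    """Multiply two unsigned 8-bit values using only 4-bit x 4-bit multiplies."""
    a_hi, a_lo = a >> 4, a & 0xF          # split each operand into 4-bit nibbles
    b_hi, b_lo = b >> 4, b & 0xF
    # Four partial products, each computable by a 4-bit multiplier.
    pp_hh = a_hi * b_hi
    pp_hl = a_hi * b_lo
    pp_lh = a_lo * b_hi
    pp_ll = a_lo * b_lo
    # Shift-and-add combination of the partial products into the 16-bit result.
    return (pp_hh << 8) + ((pp_hl + pp_lh) << 4) + pp_ll

def mac4_parallel(pairs):
    """Low-bitwidth mode: the same four 4-bit multipliers instead produce four
    independent 4-bit products per cycle and accumulate them (higher throughput)."""
    return sum(a * b for a, b in pairs)

if __name__ == "__main__":
    assert mul8_from_4bit(200, 123) == 200 * 123   # 8-bit mode check
    print(mac4_parallel([(3, 5), (7, 2), (15, 15), (1, 9)]))  # 4-bit mode example
```

Under this assumption, the hardware cost question the abstract raises reduces to how the partial-product shifters and adders are shared between the two modes.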
Keywords
URL [Source Record]
Language
English
SUSTech Authorship
First
ESI Research Field
ENGINEERING
Scopus EID
2-s2.0-85140753011
Data Source
Scopus
PDF URL: https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=9920733
Citation statistics
Cited Times [WOS]: 0
Document Type: Journal Article
Identifier: http://kc.sustech.edu.cn/handle/2SGJ60CL/407146
Department: SUSTech Institute of Microelectronics
Affiliation
1. Ministry of Education, School of Microelectronics and the Engineering Research Center of Integrated Circuits for Next-Generation Communications, Southern University of Science and Technology, Shenzhen, China
2. Department of Computer Science and Engineering, University of California at Merced, Merced, CA, USA
3. Department of Communications and Computer Engineering, Kyoto University, Kyoto, Japan
4. School of Microelectronics, Fudan University, Shanghai, China
5. Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen, China
First Author Affiliation: SUSTech Institute of Microelectronics
First Author's First Affiliation: SUSTech Institute of Microelectronics
Recommended Citation
GB/T 7714
Mao, Wei, Dai, Liuyao, Li, Kai, et al. An Energy-Efficient Mixed-Bitwidth Systolic Accelerator for NAS-Optimized Deep Neural Networks[J]. IEEE TRANSACTIONS ON VERY LARGE SCALE INTEGRATION (VLSI) SYSTEMS, 2022, PP(99): 1-13.
APA
Mao, Wei., Dai, Liuyao., Li, Kai., Cheng, Quan., Wang, Yuhang., ... & Yu, Hao. (2022). An Energy-Efficient Mixed-Bitwidth Systolic Accelerator for NAS-Optimized Deep Neural Networks. IEEE TRANSACTIONS ON VERY LARGE SCALE INTEGRATION (VLSI) SYSTEMS, PP(99), 1-13.
MLA
Mao, Wei, et al. "An Energy-Efficient Mixed-Bitwidth Systolic Accelerator for NAS-Optimized Deep Neural Networks". IEEE TRANSACTIONS ON VERY LARGE SCALE INTEGRATION (VLSI) SYSTEMS PP.99 (2022): 1-13.
Files in This Item:
There are no files associated with this item.
Google Scholar
Similar articles in Google Scholar
[Mao,Wei]'s Articles
[Dai,Liuyao]'s Articles
[Li,Kai]'s Articles
Baidu Scholar
Similar articles in Baidu Scholar
[Mao,Wei]'s Articles
[Dai,Liuyao]'s Articles
[Li,Kai]'s Articles
Bing Scholar
Similar articles in Bing Scholar
[Mao,Wei]'s Articles
[Dai,Liuyao]'s Articles
[Li,Kai]'s Articles

Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.