Title

SOTER: Guarding Black-box Inference for General Neural Networks at the Edge

Author
Corresponding author: Jianyu Jiang
Joint first authors: Tianxiang Shen; Ji Qi
Publication Date
2022-07-11
Conference Name
2022 USENIX Annual Technical Conference
Conference Date
July 11–13, 2022
Conference Place
Carlsbad, CA, USA
Abstract

The prosperity of AI and edge computing has pushed more and more well-trained DNN models to be deployed on third-party edge devices to compose mission-critical applications. This necessitates protecting model confidentiality on untrusted devices while using a co-located accelerator (e.g., a GPU) to speed up model inference locally. Recently, the community has sought to improve security with CPU trusted execution environments (TEEs). However, existing solutions either run an entire model in the TEE, suffering from extremely high inference latency, or take a partition-based approach that handcrafts a partial model via parameter obfuscation techniques to run on an untrusted GPU, achieving lower inference latency at the expense of both the integrity of the partitioned computations outside the TEE and the accuracy of the obfuscated parameters.

We propose SOTER, the first system that achieves model confidentiality, integrity, low inference latency, and high accuracy in the partition-based approach. Our key observation is that there is often an associativity property among many inference operators in DNN models. SOTER therefore automatically transforms a major fraction of associative operators into parameter-morphed, and thus confidentiality-preserved, operators that execute on the untrusted GPU, and fully restores the execution results to accurate values in the TEE using associativity. On top of these steps, SOTER further designs an oblivious fingerprinting technique to safely detect integrity breaches of morphed operators outside the TEE, ensuring correct execution of inferences. Experimental results on six prevalent models in the three most popular categories show that, even with stronger model protection, SOTER achieves performance comparable to partition-based baselines while retaining the same high accuracy as insecure inference.
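To make the associativity idea concrete, here is a minimal, hypothetical sketch of parameter morphing for a single linear operator, assuming a secret scalar morphing coefficient and NumPy standing in for the untrusted GPU. All names are illustrative; the paper's actual morphing schemes are more general.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal((1, 4))   # inference input
W = rng.standard_normal((4, 3))   # confidential model weights

# Inside the TEE: morph the parameters with a secret scalar k before
# shipping them to the untrusted accelerator.
k = float(rng.uniform(1.0, 10.0))
W_morphed = k * W                 # the GPU never sees the true weights

# On the untrusted GPU: run the operator on the morphed parameters.
y_morphed = x @ W_morphed

# Back in the TEE: associativity of scalar multiplication,
# x @ (k * W) == k * (x @ W), lets the TEE restore the exact result.
y = y_morphed / k

assert np.allclose(y, x @ W)
```

Because the restore step is exact rather than approximate, this style of morphing preserves inference accuracy, unlike obfuscation schemes that perturb parameters irreversibly.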

SUSTech Authorship
Others
Language
English
URL: [Source Record]
Data Source
Manual submission
PDF URL: https://www.usenix.org/system/files/atc22-shen.pdf
Document Type: Conference paper
Identifier: http://kc.sustech.edu.cn/handle/2SGJ60CL/416078
Department: Department of Computer Science and Engineering
Affiliation
1.The University of Hong Kong
2.Huawei Technologies Co., Ltd
3.The Hong Kong Polytechnic University
4.Southern University of Science and Technology
Recommended Citation
GB/T 7714
Tianxiang Shen, Ji Qi, Jianyu Jiang, et al. SOTER: Guarding Black-box Inference for General Neural Networks at the Edge[C], 2022.
Files in This Item:
File Name/Size: 2022 Soter.pdf (1543 KB)
Access: Restricted Access

Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.