Title | Optimization-Derived Learning with Essential Convergence Analysis of Training and Hyper-training |
Author | |
Corresponding Author | Zhang, Jin |
Publication Years | 2022 |
Conference Name | 39th International Conference on Machine Learning (ICML) |
ISSN | 2640-3498 |
Source Title | |
Conference Date | JUL 17-23, 2022 |
Conference Place | Baltimore, MD |
Publication Place | 1269 LAW ST, SAN DIEGO, CA, UNITED STATES |
Publisher | |
Abstract | Recently, Optimization-Derived Learning (ODL), which designs learning models from the perspective of optimization, has attracted attention from the learning and vision communities. However, previous ODL approaches regard the training and hyper-training procedures as two separate stages, meaning that the hyper-training variables have to be fixed during the training process, and thus it is impossible to obtain the convergence of training and hyper-training variables simultaneously. In this work, we design a Generalized Krasnoselskii-Mann (GKM) scheme based on fixed-point iterations as our fundamental ODL module, which unifies existing ODL methods as special cases. Under the GKM scheme, a Bilevel Meta Optimization (BMO) algorithmic framework is constructed to solve for the optimal training and hyper-training variables together. We rigorously prove the essential joint convergence of the fixed-point iteration for training and the process of optimizing hyper-parameters for hyper-training, both in terms of approximation quality and stationarity analysis. Experiments demonstrate the efficiency of BMO with competitive performance on sparse coding and real-world applications such as image deconvolution and rain streak removal. (An illustrative sketch of the classical Krasnoselskii-Mann iteration follows this record.) |
SUSTech Authorship | Corresponding |
Language | English |
URL | [Source Record] |
Indexed By | |
Funding Project | National Natural Science Foundation of China [61922019, 61733002, 62027826, 11971220]; National Key R&D Program of China [2020YFB1313503]; Major Key Project of PCL [PCL2021A12]; Shenzhen Science and Technology Program [RCYX20200714114700072]; Guangdong Basic and Applied Basic Research Foundation [2022B1515020082] |
WOS Research Area | Computer Science |
WOS Subject | Computer Science, Artificial Intelligence |
WOS Accession No | WOS:000900064903044 |
Data Source | Web of Science |
Citation Statistics | Cited Times [WOS]: 0 |
Document Type | Conference paper |
Identifier | http://kc.sustech.edu.cn/handle/2SGJ60CL/536096 |
Department | Department of Mathematics |
Affiliation | 1.Dalian Univ Technol, DUT RU Int Sch Informat Sci & Engn, Dalian, Liaoning, Peoples R China 2.Key Lab Ubiquitous Network & Serv Software Liaoni, Dalian, Liaoning, Peoples R China 3.Peng Cheng Lab, Shenzhen, Guangdong, Peoples R China 4.Univ Victoria, Dept Math & Stat, Victoria, BC, Canada 5.Southern Univ Sci & Technol, SUSTech Int Ctr Math, Dept Math, Shenzhen, Guangdong, Peoples R China 6.Natl Ctr Appl Math Shenzhen, Shenzhen, Guangdong, Peoples R China |
Corresponding Author Affiliation | Department of Mathematics |
First Author's First Affiliation | Department of Mathematics |
Recommended Citation GB/T 7714 | Liu, Risheng, Liu, Xuan, Zeng, Shangzhi, et al. Optimization-Derived Learning with Essential Convergence Analysis of Training and Hyper-training[C]. 1269 LAW ST, SAN DIEGO, CA, UNITED STATES: JMLR-JOURNAL MACHINE LEARNING RESEARCH, 2022. |
Files in This Item: | There are no files associated with this item. |
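For context on the classical fixed-point machinery that the abstract's GKM scheme generalizes, the following is a minimal sketch, not the paper's GKM or BMO algorithm: a plain Krasnoselskii-Mann update x_{k+1} = (1 - alpha) * x_k + alpha * T(x_k) applied to the forward-backward (proximal-gradient) operator of a LASSO-type sparse coding problem, one of the applications mentioned in the abstract. All function names, problem data, and parameter values here (soft_threshold, forward_backward_operator, km_iteration, lam, alpha) are illustrative assumptions, not taken from the paper.

```python
# Minimal illustrative sketch (not the paper's GKM/BMO method):
# Krasnoselskii-Mann iteration on the forward-backward operator of
#   min_x 0.5*||A x - b||^2 + lam*||x||_1   (LASSO-type sparse coding).
# All problem data and parameter values below are assumptions.
import numpy as np

def soft_threshold(v, thresh):
    """Proximal operator of thresh * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - thresh, 0.0)

def forward_backward_operator(x, A, b, lam, tau):
    """T(x) = prox_{tau*lam*||.||_1}(x - tau * A^T (A x - b)).
    T is averaged (hence nonexpansive) when tau <= 1 / ||A^T A||."""
    grad = A.T @ (A @ x - b)
    return soft_threshold(x - tau * grad, tau * lam)

def km_iteration(A, b, lam, alpha=0.5, num_iters=500):
    """Krasnoselskii-Mann iteration x_{k+1} = (1-alpha)*x_k + alpha*T(x_k)."""
    tau = 1.0 / np.linalg.norm(A, ord=2) ** 2   # safe step size 1/||A^T A||
    x = np.zeros(A.shape[1])
    for _ in range(num_iters):
        Tx = forward_backward_operator(x, A, b, lam, tau)
        x = (1.0 - alpha) * x + alpha * Tx      # averaged fixed-point update
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((60, 100))
    x_true = np.zeros(100)
    x_true[rng.choice(100, size=5, replace=False)] = rng.standard_normal(5)
    b = A @ x_true + 0.01 * rng.standard_normal(60)
    x_hat = km_iteration(A, b, lam=0.1)
    print("recovered support size:", np.count_nonzero(np.abs(x_hat) > 1e-3))
```

In the bilevel setting described in the abstract, hyper-training variables such as the regularization weight (lam above) would be optimized jointly with the fixed-point iteration for training, rather than held fixed as in this sketch.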
Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.