Title | Causal-Based Supervision of Attention in Graph Neural Network: A Better and Simpler Choice towards Powerful Attention |
Author | Wang,Hongjun; Chen,Jiyuan; Du,Lun; et al. |
Corresponding Author | Chen,Jiyuan; Song,Xuan |
Publication Years | 2023 |
ISSN | 1045-0823 |
Source Title | |
Volume | 2023-August |
Pages | 2315-2323 |
Abstract | Recent years have witnessed the great potential of the attention mechanism in graph representation learning. However, while variants of attention-based GNNs are setting new benchmarks on numerous real-world datasets, recent works have pointed out that the attentions they induce are less robust and generalizable on noisy graphs due to the lack of direct supervision. In this paper, we present a new framework that utilizes the tool of causality to provide a powerful supervision signal for the learning process of attention functions. Specifically, we estimate the direct causal effect of attention on the final prediction, and then maximize this effect to guide attention toward more meaningful neighbors. Our method can serve as a plug-and-play module for any canonical attention-based GNN in an end-to-end fashion. Extensive experiments on a wide range of benchmark datasets illustrate that, by directly supervising attention functions, the model converges faster with a clearer decision boundary, and thus yields better performance. |
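The abstract outlines the core idea: estimate the direct causal effect of attention on the final prediction and maximize it as an auxiliary supervision signal. The snippet below is a minimal, illustrative sketch of what such an objective could look like, assuming a toy PyTorch single-head graph attention layer over a dense adjacency with self-loops. The uniform-attention intervention, the loss form, and all names (TinyGraphAttention, causal_supervision_loss, effect_weight) are assumptions made for this sketch, not the paper's actual formulation or released code.

```python
# Illustrative sketch only, NOT the paper's implementation.
# Assumed interpretation: the "direct causal effect" of attention is read as the
# gap between the factual prediction (learned attention) and a counterfactual
# prediction under an intervention do(attention = uniform); the objective then
# maximizes that gap alongside the usual task loss.
import torch
import torch.nn as nn
import torch.nn.functional as F


class TinyGraphAttention(nn.Module):
    """Toy single-head graph attention over a dense adjacency with self-loops."""

    def __init__(self, in_dim: int, hid_dim: int, n_classes: int):
        super().__init__()
        self.proj = nn.Linear(in_dim, hid_dim)
        self.attn = nn.Linear(2 * hid_dim, 1)
        self.clf = nn.Linear(hid_dim, n_classes)

    def attention(self, h: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        n = h.size(0)
        pairs = torch.cat([h.unsqueeze(1).expand(n, n, -1),
                           h.unsqueeze(0).expand(n, n, -1)], dim=-1)
        scores = self.attn(pairs).squeeze(-1)
        scores = scores.masked_fill(adj == 0, float("-inf"))
        return torch.softmax(scores, dim=-1)            # each row sums to 1

    def forward(self, x, adj, alpha=None):
        h = self.proj(x)
        if alpha is None:                               # factual: learned attention
            alpha = self.attention(h, adj)
        return self.clf(alpha @ h), alpha


def causal_supervision_loss(model, x, adj, y, effect_weight=0.5):
    """Task loss minus a (hypothetical) direct-effect term for the attention."""
    logits, _ = model(x, adj)                           # prediction with attention
    uniform = adj / adj.sum(dim=-1, keepdim=True)       # do(attention = uniform)
    logits_cf, _ = model(x, adj, alpha=uniform)         # prediction without it
    task_loss = F.cross_entropy(logits, y)
    effect = F.cross_entropy(logits_cf, y) - task_loss  # how much attention helps
    return task_loss - effect_weight * effect           # maximize the effect
```

In practice the effect term would likely be bounded or estimated differently; the sketch is only meant to show the plug-and-play structure suggested by the abstract: the same model is queried twice, once with its learned attention and once under an intervention, and the gap between the two predictions supervises the attention function directly.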
SUSTech Authorship | First; Corresponding |
Language | English |
URL | [Source Record] |
Scopus EID | 2-s2.0-85170383707 |
Data Source | Scopus |
Document Type | Conference paper |
Identifier | http://kc.sustech.edu.cn/handle/2SGJ60CL/560047 |
Affiliation | 1. Southern University of Science and Technology, Shenzhen, China; 2. Microsoft Research Asia, Beijing, China |
First Author Affiliation | Southern University of Science and Technology |
Corresponding Author Affiliation | Southern University of Science and Technology |
First Author's First Affiliation | Southern University of Science and Technology |
Recommended Citation GB/T 7714 | Wang,Hongjun,Chen,Jiyuan,Du,Lun,et al. Causal-Based Supervision of Attention in Graph Neural Network: A Better and Simpler Choice towards Powerful Attention[C],2023:2315-2323. |
Files in This Item: | There are no files associated with this item. |