TY - GEN
T1 - Supervised Attention Using Homophily in Graph Neural Networks
AU - Chatzianastasis, Michail
AU - Nikolentzos, Giannis
AU - Vazirgiannis, Michalis
N1 - Publisher Copyright:
© 2023, The Author(s), under exclusive license to Springer Nature Switzerland AG.
PY - 2023/1/1
Y1 - 2023/1/1
AB - Graph neural networks have become the standard approach for learning problems on graphs. Among the different variants of graph neural networks, graph attention networks (GATs) have been applied with great success to a variety of tasks. In the GAT model, each node assigns an importance score to each of its neighbors using an attention mechanism. However, like other graph neural networks, GATs aggregate messages from nodes that belong to different classes, and therefore produce node representations that are not well separated with respect to the different classes, which might hurt their performance. In this work, to alleviate this problem, we propose a new technique that can be incorporated into any graph attention model to encourage higher attention scores between nodes that share the same class label. We evaluate the proposed method on several node classification datasets, demonstrating increased performance over standard baseline models.
KW - Graph Attention Networks
KW - Graph Neural Networks
KW - Supervised Attention
DO - 10.1007/978-3-031-44216-2_47
M3 - Conference contribution
AN - SCOPUS:85174604831
SN - 9783031442155
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 576
EP - 586
BT - Artificial Neural Networks and Machine Learning – ICANN 2023 – 32nd International Conference on Artificial Neural Networks, Proceedings
A2 - Iliadis, Lazaros
A2 - Papaleonidas, Antonios
A2 - Angelov, Plamen
A2 - Jayne, Chrisina
PB - Springer Science and Business Media Deutschland GmbH
T2 - 32nd International Conference on Artificial Neural Networks, ICANN 2023
Y2 - 26 September 2023 through 29 September 2023
ER -