Supervised Attention Using Homophily in Graph Neural Networks

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Graph neural networks have become the standard approach for dealing with learning problems on graphs. Among the different variants of graph neural networks, graph attention networks (GATs) have been applied with great success to different tasks. In the GAT model, each node assigns an importance score to its neighbors using an attention mechanism. However, similar to other graph neural networks, GATs aggregate messages from nodes that belong to different classes, and therefore produce node representations that are not well separated with respect to the different classes, which might hurt their performance. In this work, to alleviate this problem, we propose a new technique that can be incorporated into any graph attention model to encourage higher attention scores between nodes that share the same class label. We evaluate the proposed method on several node classification datasets, demonstrating increased performance over standard baseline models.
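The core idea described in the abstract — encouraging higher attention scores between same-class neighbors — can be illustrated with a small sketch. The abstract does not specify the exact loss, so the auxiliary term below (negative log of the attention mass placed on same-class neighbors) is an assumption for illustration only, not the paper's actual formulation; the function name and signature are hypothetical.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_supervision_loss(scores, neighbor_labels, center_label):
    """Hypothetical auxiliary loss for one node: penalize attention
    mass that falls on neighbors with a different class label.

    scores          : raw (pre-softmax) attention scores over neighbors
    neighbor_labels : class label of each neighbor
    center_label    : class label of the center node
    """
    alpha = softmax(scores)                   # attention distribution over neighbors
    same = neighbor_labels == center_label    # neighbors sharing the center's class
    if not same.any():
        return 0.0                            # no same-class neighbor: nothing to supervise
    # -log of total attention placed on same-class neighbors;
    # minimized when all attention goes to same-class neighbors
    return float(-np.log(alpha[same].sum()))

# Toy example: three neighbors, two of which share the center node's class (1)
scores = np.array([2.0, 0.5, -1.0])
labels = np.array([1, 0, 1])
loss = attention_supervision_loss(scores, labels, center_label=1)
```

During training, a term like this would be added to the usual node classification loss, using the (label-supervised) training nodes only, so that the attention mechanism itself receives a gradient signal toward homophilous neighbors.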

Original language: English
Title of host publication: Artificial Neural Networks and Machine Learning – ICANN 2023 - 32nd International Conference on Artificial Neural Networks, Proceedings
Editors: Lazaros Iliadis, Antonios Papaleonidas, Plamen Angelov, Chrisina Jayne
Publisher: Springer Science and Business Media Deutschland GmbH
Pages: 576-586
Number of pages: 11
ISBN (Print): 9783031442155
DOIs
Publication status: Published - 1 Jan 2023
Event: 32nd International Conference on Artificial Neural Networks, ICANN 2023 - Heraklion, Greece
Duration: 26 Sept 2023 – 29 Sept 2023

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 14257 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 32nd International Conference on Artificial Neural Networks, ICANN 2023
Country/Territory: Greece
City: Heraklion
Period: 26/09/23 – 29/09/23

Keywords

  • Graph Attention Networks
  • Graph Neural Networks
  • Supervised Attention
