TY - GEN
T1 - Kernel graph convolutional neural networks
AU - Nikolentzos, Giannis
AU - Meladianos, Polykarpos
AU - Tixier, Antoine Jean Pierre
AU - Skianis, Konstantinos
AU - Vazirgiannis, Michalis
N1 - Publisher Copyright:
© Springer Nature Switzerland AG 2018.
PY - 2018/1/1
Y1 - 2018/1/1
N2 - Graph kernels have been successfully applied to many graph classification problems. Typically, a kernel is first designed, and then an SVM classifier is trained based on the features defined implicitly by this kernel. This two-stage approach decouples data representation from learning, which is suboptimal. On the other hand, Convolutional Neural Networks (CNNs) have the capability to learn their own features directly from the raw data during training. Unfortunately, they cannot handle irregular data such as graphs. We address this challenge by using graph kernels to embed meaningful local neighborhoods of the graphs in a continuous vector space. A set of filters is then convolved with these patches, pooled, and the output is then passed to a feedforward network. With limited parameter tuning, our approach outperforms strong baselines on 7 out of 10 benchmark datasets. Code and data are publicly available (https://github.com/giannisnik/cnn-graph-classification).
UR - https://www.scopus.com/pages/publications/85054793692
U2 - 10.1007/978-3-030-01418-6_3
DO - 10.1007/978-3-030-01418-6_3
M3 - Conference contribution
AN - SCOPUS:85054793692
SN - 9783030014179
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 22
EP - 32
BT - Artificial Neural Networks and Machine Learning – ICANN 2018 - 27th International Conference on Artificial Neural Networks, 2018, Proceedings
A2 - Kurkova, Vera
A2 - Hammer, Barbara
A2 - Manolopoulos, Yannis
A2 - Iliadis, Lazaros
A2 - Maglogiannis, Ilias
PB - Springer Verlag
T2 - 27th International Conference on Artificial Neural Networks, ICANN 2018
Y2 - 4 October 2018 through 7 October 2018
ER -