On the Expressive Power of Spectral Invariant Graph Neural Networks

Bohang Zhang, Lingxiao Zhao, Haggai Maron

Research output: Contribution to journal › Conference article › peer-review

Abstract

Incorporating spectral information to enhance Graph Neural Networks (GNNs) has shown promising results but raises a fundamental challenge due to the inherent ambiguity of eigenvectors. Various architectures have been proposed to address this ambiguity, referred to as spectral invariant architectures. Notable examples include GNNs and Graph Transformers that use spectral distances, spectral projection matrices, or other invariant spectral features. However, the potential expressive power of these spectral invariant architectures remains largely unclear. The goal of this work is to gain a deep theoretical understanding of the expressive power obtainable when using spectral features. We first introduce a unified message-passing framework for designing spectral invariant GNNs, called Eigenspace Projection GNN (EPNN). A comprehensive analysis shows that EPNN essentially unifies all prior spectral invariant architectures, in that they are either strictly less expressive than or equivalent to EPNN. A fine-grained expressiveness hierarchy among different architectures is also established. On the other hand, we prove that EPNN itself is bounded by a recently proposed class of Subgraph GNNs, implying that all these spectral invariant architectures are strictly less expressive than 3-WL. Finally, we discuss whether using spectral features can gain additional expressiveness when combined with more expressive GNNs.
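To make the eigenvector ambiguity concrete, here is a minimal numpy sketch (not the paper's implementation; the graph and tolerance are illustrative choices). Eigenvectors of a graph Laplacian are only defined up to sign, and up to an arbitrary orthogonal basis within a repeated eigenvalue's eigenspace. The eigenspace projection matrices P = V_k V_k^T that spectral invariant architectures such as EPNN build on are unchanged under these transformations:

```python
# Illustrative sketch: eigenspace projection matrices are invariant to the
# sign/basis ambiguity of eigenvectors. Graph choice (a 4-cycle) is ours.
import numpy as np

# Adjacency matrix of a 4-cycle; its Laplacian has a repeated eigenvalue (2),
# so the eigenbasis within that eigenspace is genuinely ambiguous.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A

eigvals, V = np.linalg.eigh(L)

def eigenspace_projections(eigvals, V, tol=1e-8):
    """Group eigenvectors by (numerically) equal eigenvalues and return the
    projection matrix V_k V_k^T onto each eigenspace."""
    projections = []
    start = 0
    for i in range(1, len(eigvals) + 1):
        if i == len(eigvals) or eigvals[i] - eigvals[start] > tol:
            Vk = V[:, start:i]
            projections.append(Vk @ Vk.T)
            start = i
    return projections

P = eigenspace_projections(eigvals, V)

# Flip the sign of every eigenvector: an equally valid eigendecomposition.
P_flipped = eigenspace_projections(eigvals, -V)

# The projections coincide, so features built from them are well defined
# regardless of which eigenbasis the solver happens to return.
for p, q in zip(P, P_flipped):
    assert np.allclose(p, q)
```

The projections also resolve to a partition of the identity (their sum is I), which is why message passing over entries of these matrices, as in the EPNN framework, depends only on the graph and not on the arbitrary eigenvector choice.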

Original language: English
Pages (from-to): 60496-60526
Number of pages: 31
Journal: Proceedings of Machine Learning Research
Volume: 235
State: Published - 2024
Externally published: Yes
Event: 41st International Conference on Machine Learning, ICML 2024 - Vienna, Austria
Duration: 21 Jul 2024 - 27 Jul 2024

All Science Journal Classification (ASJC) codes

  • Artificial Intelligence
  • Software
  • Control and Systems Engineering
  • Statistics and Probability
