Invariant and equivariant graph networks

Haggai Maron, Heli Ben-Hamu, Nadav Shamir, Yaron Lipman

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Invariant and equivariant networks have been successfully used for learning images, sets, point clouds, and graphs. A basic challenge in developing such networks is finding the maximal collection of invariant and equivariant linear layers. Although this question is answered for the first three examples (for popular transformations, at least), a full characterization of invariant and equivariant linear layers for graphs is not known. In this paper we provide a characterization of all permutation invariant and equivariant linear layers for (hyper-)graph data, and show that their dimension, in the case of edge-value graph data, is 2 and 15, respectively. More generally, for graph data defined on k-tuples of nodes, the dimension is the k-th and 2k-th Bell numbers. Orthogonal bases for the layers are computed, including a generalization to multigraph data. The constant number of basis elements and their characteristics allow the networks to be applied successfully to graphs of different sizes. From the theoretical point of view, our results generalize and unify recent advances in equivariant deep learning. In particular, we show that our model is capable of approximating any message passing neural network. Applying these new linear layers in a simple deep neural network framework is shown to achieve results comparable to the state of the art and to have better expressivity than previous invariant and equivariant bases.
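
The dimension counts quoted in the abstract can be checked numerically: the space of permutation-invariant (respectively equivariant) linear layers on edge-value data in R^{n×n} has dimension equal to the number of equality patterns of 2-index (respectively 4-index) tuples, which for n large enough is the Bell number Bell(2) = 2 (respectively Bell(4) = 15). The following Python sketch, which is not taken from the paper's code and whose function names (bell, num_index_patterns) are purely illustrative, counts these patterns directly and compares them to the Bell numbers.

    # Minimal sketch: verify that the number of orbits of S_n acting
    # diagonally on index tuples [n]^k equals Bell(k) for n >= k.
    # k = 2 corresponds to invariant layers on R^{n x n} (dimension 2),
    # k = 4 to equivariant layers R^{n x n} -> R^{n x n} (dimension 15).
    from itertools import product
    from math import comb


    def bell(k: int) -> int:
        """k-th Bell number via the binomial recurrence B_{m+1} = sum_j C(m,j) B_j."""
        b = [1]
        for m in range(k):
            b.append(sum(comb(m, j) * b[j] for j in range(m + 1)))
        return b[k]


    def num_index_patterns(n: int, k: int) -> int:
        """Count distinct equality patterns of k-tuples over {0,...,n-1},
        i.e. orbits of the simultaneous-relabeling action of S_n."""
        patterns = set()
        for tup in product(range(n), repeat=k):
            relabel = {}
            canon = tuple(relabel.setdefault(v, len(relabel)) for v in tup)
            patterns.add(canon)
        return len(patterns)


    if __name__ == "__main__":
        n = 5  # any n >= 4 already realizes all patterns
        print(num_index_patterns(n, 2), bell(2))  # -> 2 2
        print(num_index_patterns(n, 4), bell(4))  # -> 15 15

Each equality pattern corresponds to a partition of the k index positions, which is why the count stabilizes at the k-th Bell number once n ≥ k; this is the combinatorial fact behind the constant (n-independent) number of basis elements mentioned in the abstract.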

Original language: English
Title of host publication: 7th International Conference on Learning Representations, ICLR 2019
State: Published - 2019
Externally published: Yes
Event: 7th International Conference on Learning Representations, ICLR 2019 - New Orleans, United States
Duration: 6 May 2019 – 9 May 2019

Conference

Conference: 7th International Conference on Learning Representations, ICLR 2019
Country/Territory: United States
City: New Orleans
Period: 6/05/19 – 9/05/19

All Science Journal Classification (ASJC) codes

  • Education
  • Computer Science Applications
  • Linguistics and Language
  • Language and Linguistics
