Expressive Sign Equivariant Networks for Spectral Geometric Learning

Derek Lim, Joshua Robinson, Stefanie Jegelka, Yaron Lipman, Haggai Maron

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Recent work has shown the utility of developing machine learning models that respect the symmetries of eigenvectors.
These works promote sign invariance, since for any eigenvector v the negation −v is also an eigenvector.
In this work, we demonstrate that sign equivariance is useful for applications such as building orthogonally equivariant models and link prediction. To obtain these benefits, we develop novel sign equivariant neural network architectures. These models are based on our analytic characterization of the sign equivariant polynomials and thus inherit provable expressiveness properties.
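As an illustration of the symmetry the abstract describes (a minimal NumPy sketch, not the paper's architecture): eigenvectors of a symmetric matrix are only defined up to sign, a sign-invariant function satisfies f(−v) = f(v), and a sign-equivariant function satisfies g(−v) = −g(v). The functions `f_inv` and `g_eqv` below are hypothetical examples chosen for simplicity.

```python
import numpy as np

# Symmetric matrix with distinct eigenvalues.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
eigvals, eigvecs = np.linalg.eigh(A)
v = eigvecs[:, 0]  # eigenvector for the smallest eigenvalue

# Sign ambiguity: both v and -v are eigenvectors for the same eigenvalue.
assert np.allclose(A @ v, eigvals[0] * v)
assert np.allclose(A @ (-v), eigvals[0] * (-v))

# Sign-invariant map: f(-v) == f(v).
def f_inv(x):
    return np.abs(x)

assert np.allclose(f_inv(-v), f_inv(v))

# Sign-equivariant map: g(-v) == -g(v).
# Any odd function works; here a simple cubic-type example.
def g_eqv(x):
    return x * np.sum(x ** 2)

assert np.allclose(g_eqv(-v), -g_eqv(v))
```

The equivariant case is the harder one to build expressive neural architectures for, which is the gap the paper addresses.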
Original language: English
Title of host publication: ICLR 2023 Workshop on Physics for Machine Learning
State: Published - 2023
Externally published: Yes
Event: ICLR 2023 Workshop on Physics for Machine Learning - Hybrid, Kigali, Rwanda
Duration: 4 May 2023 → 4 May 2023
https://physics4ml.github.io/

Conference

Conference: ICLR 2023 Workshop on Physics for Machine Learning
Country/Territory: Rwanda
City: Kigali
Period: 4/05/23 → 4/05/23
