Gurugubelli, S and Chepuri, SP (2024) SaNN: Simple yet Powerful Simplicial-Aware Neural Networks. In: 12th International Conference on Learning Representations, ICLR 2024, 7 May 2024 through 11 May 2024, Hybrid, Vienna.
Abstract
Simplicial neural networks (SNNs) are deep models for higher-order graph representation learning. SNNs learn low-dimensional embeddings of simplices in a simplicial complex by aggregating features of their upper, lower, boundary, and coboundary adjacent simplices. In SNNs, this aggregation is carried out during training. Since the number of simplices of various orders in a simplicial complex is very large, the memory and training-time requirements of SNNs are substantial. In this work, we propose a scalable simplicial-aware neural network (SaNN) model with constant run-time and memory requirements, independent of the size of the simplicial complex and the density of interactions in it. SaNN feeds pre-aggregated simplicial-aware features as inputs to a neural network, so it has a strong simplicial-structural inductive bias. We provide theoretical conditions under which SaNN is provably more powerful than the Weisfeiler-Lehman (WL) graph isomorphism test and as powerful as the simplicial Weisfeiler-Lehman (SWL) test. We also show that SaNN is permutation and orientation equivariant and satisfies simplicial-awareness of the highest order in a simplicial complex. We demonstrate via numerical experiments that despite being computationally economical, the proposed model achieves state-of-the-art performance in predicting trajectories, predicting simplicial closures, and classifying graphs. © 2024 12th International Conference on Learning Representations, ICLR 2024. All rights reserved.
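The core idea of the abstract — aggregating simplicial features once, before training, rather than inside every training step — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the toy boundary matrix, the hop count, and the function name `preaggregate_features` are all assumptions made for the example; the actual SaNN aggregation scheme is defined in the paper.

```python
import numpy as np

def preaggregate_features(X, A, num_hops=2):
    """Pre-compute multi-hop aggregated features [X, A@X, A^2@X, ...].

    This is done once as a preprocessing step, so the neural network that
    consumes the result never touches the (possibly huge) adjacency structure
    during training.
    """
    feats = [X]
    cur = X
    for _ in range(num_hops):
        cur = A @ cur
        feats.append(cur)
    return np.concatenate(feats, axis=1)

# Toy simplicial complex: 4 nodes and 4 oriented edges forming a cycle,
# encoded by a node-to-edge boundary matrix B1 with entries in {-1, 0, +1}.
B1 = np.array([
    [-1,  0,  0,  1],
    [ 1, -1,  0,  0],
    [ 0,  1, -1,  0],
    [ 0,  0,  1, -1],
], dtype=float)

# Standard Hodge-style adjacencies: edges are lower-adjacent when they share
# a boundary node; nodes are upper-adjacent when they share a coboundary edge.
A_edges_lower = B1.T @ B1   # edge-to-edge adjacency via common nodes
A_nodes_upper = B1 @ B1.T   # node-to-node adjacency via common edges

rng = np.random.default_rng(0)
X_edges = rng.normal(size=(4, 3))  # random 3-dimensional edge features
Z = preaggregate_features(X_edges, A_edges_lower, num_hops=2)
print(Z.shape)  # (4, 9): original features concatenated with two aggregation hops
```

The tensor `Z` would then be fed to an ordinary feed-forward network; since `Z` is precomputed, each training step costs the same regardless of how many simplices the complex contains, which is the constant run-time property the abstract claims.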
| Item Type: | Conference Paper |
|---|---|
| Publication: | 12th International Conference on Learning Representations, ICLR 2024 |
| Publisher: | International Conference on Learning Representations, ICLR |
| Additional Information: | The copyright for this article belongs to International Conference on Learning Representations, ICLR. |
| Keywords: | Graph representation; High-order; Higher-order; Learn+; Low dimensional embedding; Neural-networks; Order graph; Simple++; Simplicial complex; Time requirements |
| Department/Centre: | Division of Electrical Sciences > Electrical Communication Engineering |
| Date Deposited: | 06 Sep 2024 12:05 |
| Last Modified: | 06 Sep 2024 12:05 |
| URI: | http://eprints.iisc.ac.in/id/eprint/86035 |