🐼PANDA: Expanded Width-Aware Message Passing Beyond Rewiring, ICML 2024
[ICML 2024] Structure-Aware E(3)-Invariant Molecular Conformer Aggregation Networks.
Master's thesis: JAT (Jraph Attention Networks), a deep learning architecture that predicts the potential energy and forces of molecules. It adapts Graph Attention Networks (GATv2), within the Message Passing Neural Networks framework, to computational chemistry in JAX.
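To make the idea behind such architectures concrete, here is a minimal NumPy sketch of one round of GATv2-style attention message passing. This is not the thesis code (which uses JAX/Jraph); the function name, shapes, and weights are illustrative assumptions only.

```python
import numpy as np

def leaky_relu(x, slope=0.2):
    return np.where(x > 0, x, slope * x)

def gatv2_message_passing(h, adj, W, a):
    """One round of GATv2-style attention message passing (illustrative sketch).

    h:   (N, F)  node features
    adj: (N, N)  binary adjacency; adj[i, j] = 1 if j is a neighbor of i
    W:   (F, F') shared linear transform
    a:   (2*F',) attention vector
    """
    z = h @ W                                   # project features: (N, F')
    n = h.shape[0]
    # Build all pairwise concatenations [z_i || z_j].
    zi = np.repeat(z[:, None, :], n, axis=1)    # (N, N, F')
    zj = np.repeat(z[None, :, :], n, axis=0)    # (N, N, F')
    # GATv2 applies the nonlinearity *before* the attention vector a.
    e = leaky_relu(np.concatenate([zi, zj], axis=-1)) @ a  # (N, N) scores
    e = np.where(adj > 0, e, -np.inf)           # mask non-edges
    # Softmax over each node's neighborhood (rows), numerically stabilized.
    alpha = np.exp(e - e.max(axis=1, keepdims=True))
    alpha = alpha / alpha.sum(axis=1, keepdims=True)
    return alpha @ z                            # attention-weighted aggregation
```

A quick usage example on a fully connected 4-node graph (self-loops included, so every row of the softmax is well defined):

```python
rng = np.random.default_rng(0)
h = rng.normal(size=(4, 3))
W = rng.normal(size=(3, 3))
a = rng.normal(size=(6,))
out = gatv2_message_passing(h, np.ones((4, 4)), W, a)  # shape (4, 3)
```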
Efficient Subgraph GNNs by Learning Effective Selection Policies (ICLR 2024)
Official repository for Self-Attention Message Passing for Contrastive Few-Shot Learning
Lorentz group equivariant autoencoders based on Lorentz Group Network
Measuring generalization properties of graph neural networks
Message Passing Neural Networks for Simplicial and Cell Complexes
Equivariant Subgraph Aggregation Networks (ICLR 2022 Spotlight)
Graph neural network autoencoders for jets in HEP
Understanding and Extending Subgraph GNNs by Rethinking their Symmetries (NeurIPS 2022 Oral)
GGPM - GraphNN Generation of Organic Photovoltaic Molecules
Graph Neural Network creation module, implemented in TensorFlow 2, with examples using the module and the iGNNition library for fast GNN prototyping.