Schedule for: 22w5125 - Deep Exploration of non-Euclidean Data with Geometric and Topological Representation Learning

Beginning on Sunday, July 10 and ending Friday July 15, 2022

All times in UBC Okanagan, Canada time, PDT (UTC-7).

Sunday, July 10
16:00 - 23:00 Check-in begins at 16:00 on Sunday and is open 24 hours (Nechako Residence)
Monday, July 11
08:00 - 08:45 Breakfast (Sunshine Café)
08:45 - 09:00 Introduction and Welcome by the UBCO BIRS Staff (Arts Building room 386)
09:00 - 09:45 Ira Ktena: How Fair Is Your Graph? Exploring Fairness Concerns in Neuroimaging Studies (Arts Building room 386)
09:45 - 10:30 Stephan Günnemann: Graph Neural Networks for Molecular Systems (Arts Building room 386)
10:30 - 11:10 Coffee Break (ASC 310)
11:10 - 11:45 Bastian Rieck: Geometrical-Topological Loss Terms for Shape Analysis (Arts Building room 386)
12:30 - 13:45 Lunch (Sunshine Café)
14:00 - 14:45 Fernando Gama: Local Transferability of Graph Neural Networks (Arts Building room 386)
14:45 - 15:15 Yizhou Sun: Graph Neural ODEs for Dynamical System Modeling (Arts Building room 386)
15:15 - 15:30 Coffee Break (ASC 310)
15:30 - 16:00 Santiago Segarra: Principled Simplicial Neural Networks (Arts Building room 386)
16:00 - 16:30 Michael Perlmutter: Geometric Scattering: Theory and Applications (Arts Building room 386)
16:30 - 18:00 Free Discussion (Arts Building room 386)
18:00 - 19:00 Dinner (Sunshine Café)
Tuesday, July 12
08:00 - 09:00 Breakfast (Sunshine Café)
09:00 - 10:00 Elizabeth Munch: Crafting Topological Features (Arts Building room 386)
10:00 - 10:30 Ningyuan Huang: Graph Symmetry and Graph Spectra (Arts Building room 386)
10:30 - 11:00 Coffee Break (ASC 310)
11:00 - 11:45 Ron Levie: A New Generalization Bound for Message Passing Neural Networks (Arts Building room 386)
11:45 - 12:30 Edward De Brouwer: Topological Graph Neural Networks (Arts Building room 386)
12:30 - 12:45 Group photos (meet at Arts Building room 386)
12:45 - 13:45 Lunch (Sunshine Café)
13:45 - 14:30 Renjie Liao: On the Generalization of Graph Neural Networks
Graph neural networks (GNNs) are increasingly popular for handling graph-structured data. Although substantial progress has been made on the expressiveness and capacity of GNNs, little is known about their generalization ability from the perspective of statistical learning theory. In this talk, I will introduce our recent work that derives generalization bounds for two prominent classes of GNNs, namely graph convolutional networks (GCNs) and message passing GNNs (MPGNNs), via a PAC-Bayesian approach. Our result reveals that the maximum node degree and the spectral norms of the weights govern the generalization bounds of both models (a schematic of the bound's shape follows this entry). For MPGNNs, our PAC-Bayes bound improves over the Rademacher-complexity-based bound, showing a tighter dependency on the maximum node degree and the maximum hidden dimension.
(Arts Building room 386)
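As a rough guide to the result above (a schematic of the bound's overall shape only, not the paper's precise statement): for an L-layer network with weight matrices W_1, ..., W_L, maximum node degree d, classification margin gamma, and m training graphs, bounds in this family scale roughly as

    \text{generalization gap} \;=\; \tilde{\mathcal{O}}\!\left( \frac{d^{\,L-1}\,\prod_{l=1}^{L}\lVert W_l\rVert_2}{\gamma\,\sqrt{m}} \right),

so the bound tightens as the maximum degree, the spectral norms of the weights, or the margin-normalized capacity shrink.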
14:30 - 15:15 Sarah McGuire: A Simplicial Pooling Layer (Arts Building room 386)
15:15 - 15:45 Coffee Break (ASC 310)
15:45 - 16:30 Mikhail Galkin: Inductive Graph Reasoning Without Node Features (Arts Building room 386)
16:30 - 18:00 Discussion Panel (Arts Building room 386)
18:00 - 19:00 Dinner (Sunshine Café)
Wednesday, July 13
08:00 - 09:00 Breakfast (Sunshine Café)
09:00 - 09:45 Christopher Morris: Towards Understanding the Expressive Power of Graph Networks (Arts Building room 386)
09:45 - 10:30 Oluwadamilola Fasina: Diffusion Curvature for Estimating Local Curvature of High-Dimensional Systems (Arts Building room 386)
10:30 - 11:00 Coffee Break (ASC 310)
11:00 - 12:00 Smita Krishnaswamy: Flows and Dynamics on Manifolds with Neural ODEs (Arts Building room 386)
12:00 - 12:45 Frederik Wenkel: Solving Graph Learning Tasks via Hybrid Scattering Networks
Graph neural networks (GNNs) have attracted much attention due to their ability to leverage the intrinsic geometry of the underlying data for graph learning tasks such as graph or node classification (or regression). However, many popular models essentially rely on low-pass filtering and are subject to so-called oversmoothing, where subsequent GNN layers increasingly smooth node representations until they become indistinguishable. We propose hybrid scattering networks that combine ideas from graph convolutional networks (GCNs) and the geometric scattering transform to overcome these shortcomings (a sketch follows this entry). While GCN filters capture low-frequency information and have relatively small receptive fields, scattering filters are sensitive to band-pass frequency information and have larger receptive fields. We theoretically demonstrate that scattering channels can learn more powerful structure-aware node representations than GCNs, and we empirically demonstrate the efficacy of the proposed hybrid approaches on various graph learning tasks. In more recent work, we explore the power of hybrid scattering networks to approximate solutions of combinatorial problems such as the NP-hard maximum clique problem.
(Arts Building room 386)
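A minimal numerical sketch of the hybrid idea above, combining one low-pass diffusion channel with band-pass diffusion-wavelet channels (the function name, the lazy-walk normalization, and the untrained mixing weights are illustrative assumptions; the paper's layer uses learned filters and more elaborate channel combinations):

    import numpy as np

    def hybrid_scattering_layer(A, X, J=3, seed=0):
        # A: (n, n) symmetric adjacency matrix, X: (n, f) node features.
        rng = np.random.default_rng(seed)
        n, f = X.shape
        d = np.maximum(A.sum(axis=1), 1.0)
        P = 0.5 * (np.eye(n) + A / d)        # lazy random-walk operator (I + A D^-1) / 2
        channels = [P @ X]                   # low-pass (GCN-like) channel: smoothed features
        Pk = P                               # holds P^(2^(j-1))
        for _ in range(J):
            P2k = Pk @ Pk                    # P^(2^j)
            Psi = Pk - P2k                   # diffusion wavelet Psi_j = P^(2^(j-1)) - P^(2^j)
            channels.append(np.abs(Psi @ X)) # band-pass channel with scattering modulus
            Pk = P2k
        H = np.concatenate(channels, axis=1) # (n, (J+1)*f) hybrid node representation
        W = rng.standard_normal((H.shape[1], f)) / np.sqrt(H.shape[1])
        return H @ W                         # random linear mixing, stand-in for learned weights

The band-pass channels respond to differences between diffusion scales, which is what lets the hybrid representation retain structure that pure low-pass smoothing would wash out.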
12:45 - 18:00 Lunch and free afternoon (Sunshine Café)
Thursday, July 14
08:00 - 09:00 Breakfast (Sunshine Café)
09:00 - 09:45 Cristian Bodnar: Deep Learning on Topological Spaces (Arts Building room 386 + ZOOM)
09:45 - 10:30 Ameya Velingker: Affinity-Aware Graph Networks
Abstract: "Graph Neural Networks (GNNs) have emerged as a powerful technique for learning on relational data. Owing to the relatively limited number of message passing steps they perform -- and hence a smaller receptive field -- there has been significant interest in improving their expressivity by incorporating structural aspects of the underlying graph. In this paper, we explore the use of affinity measures as features in graph neural networks, in particular measures arising from random walks, including effective resistance, hitting and commute times. We propose message passing networks based on these features and evaluate their performance on a variety of node and graph property prediction tasks. Our architecture has lower computational complexity, while our features are invariant to the permutations of the underlying graph. The measures we compute allow the network to exploit the connectivity properties of the graph, thereby allowing us to outperform relevant benchmarks for a wide variety of tasks, often with significantly fewer message passing steps. On one of the largest publicly available graph regression datasets, OGB-LSC-PCQM4Mv1, we obtain the best known single-model validation MAE at the time of writing.
(Arts Building room 386)
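For concreteness, the random-walk affinities named above can all be derived from the pseudoinverse of the graph Laplacian; a minimal sketch (the function name is illustrative, and how the resulting features are fed to the message-passing network is a separate design choice, e.g., as extra edge features):

    import numpy as np

    def affinity_features(A):
        # A: (n, n) symmetric adjacency matrix of a connected undirected graph.
        L = np.diag(A.sum(axis=1)) - A            # combinatorial Laplacian
        Lp = np.linalg.pinv(L)                    # Moore-Penrose pseudoinverse
        diag = np.diag(Lp)
        # Effective resistance: R[u, v] = L+[u, u] + L+[v, v] - 2 * L+[u, v]
        R = diag[:, None] + diag[None, :] - 2 * Lp
        # Commute time: C[u, v] = 2 * |E| * R[u, v]; A.sum() equals 2|E| here
        C = A.sum() * R
        return R, C

Both quantities are permutation-invariant in the sense the abstract requires: relabeling the nodes permutes the rows and columns of R and C in exactly the same way.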
10:30 - 11:00 Coffee Break (ASC 310)
11:00 - 11:45 Yan Leng: Learning to Infer Structures of Network Games
Strategic interactions between a group of individuals or organisations can be modelled as games played on networks, where a player's payoff depends not only on their own actions but also on those of their neighbours. Inferring the network structure from observed game outcomes (equilibrium actions) is an important problem with numerous potential applications in economics and the social sciences. Existing methods mostly require knowledge of the utility function associated with the game, which is often unrealistic to obtain in real-world scenarios. We adopt a transformer-like architecture that correctly accounts for the symmetries of the problem and learns a mapping from the equilibrium actions to the network structure of the game without explicit knowledge of the utility function (an example forward model follows this entry). We test our method on three different types of network games, using both synthetic and real-world data, and demonstrate its effectiveness in network structure inference and its superior performance over existing methods.
(Arts Building room 386)
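As one concrete, hypothetical instance of the forward model being inverted here, consider the classic linear-quadratic network game, whose equilibrium actions have a closed form; a sketch of how synthetic training pairs (actions, structure) could be generated:

    import numpy as np

    def lq_equilibrium(G, b, beta=0.1):
        # Linear-quadratic network game: player i's payoff is
        #   u_i = b[i] * a[i] - a[i]**2 / 2 + beta * a[i] * sum_j G[i, j] * a[j].
        # The first-order conditions give a = (I - beta * G)^(-1) b,
        # well defined whenever beta < 1 / spectral_radius(G).
        n = len(b)
        return np.linalg.solve(np.eye(n) - beta * G, b)

The inference network then learns the inverse map from equilibrium actions a back to the structure G, without ever being told the payoff u_i.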
11:45 - 12:30 Ladislav Rampasek: Recipe for a General, Powerful, Scalable Graph Transformer
In this talk I will present our recent recipe for building a general, powerful, scalable (GPS) graph Transformer with linear complexity and state-of-the-art results on a diverse set of benchmarks. Graph Transformers (GTs) have gained popularity in the field of graph representation learning through a variety of recent publications, but they lack a common foundation about what constitutes a good positional or structural encoding and what differentiates one from another. We summarize the different types of encodings with a clearer definition and categorize them as local, global, or relative. Further, GTs remain constrained to small graphs with a few hundred nodes; we propose the first architecture with complexity linear in the number of nodes and edges, O(N+E), obtained by decoupling the local real-edge aggregation from the fully-connected Transformer. We argue that this decoupling does not negatively affect expressivity, our architecture being a universal function approximator for graphs. Our GPS recipe consists of choosing three main ingredients: (i) positional/structural encoding, (ii) local message-passing mechanism, and (iii) global attention mechanism (a one-layer sketch follows this entry). We build and open-source GraphGPS, a modular framework that supports multiple types of encodings and provides efficiency and scalability on both small and large graphs. Finally, I will present a new set of benchmarking datasets, the Long Range Graph Benchmark (LRGB). LRGB consists of 5 datasets that arguably require the model to capture long-range interactions to achieve strong performance on each task, making it suitable for benchmarking and exploring MPNN and Graph Transformer architectures that aim to go beyond local smoothing.
(Arts Building room 386)
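A minimal sketch of one GPS-style layer as described above (names and shapes are illustrative; the actual framework adds residual connections, normalization, and per-channel MLPs, and swaps the quadratic attention below for a linear-attention module such as Performer to reach O(N+E)):

    import numpy as np

    def softmax(x):
        e = np.exp(x - x.max(axis=-1, keepdims=True))
        return e / e.sum(axis=-1, keepdims=True)

    def gps_layer(A, X, Wm, Wq, Wk, Wv):
        # Local channel: mean aggregation over real graph edges, then a linear map.
        deg = np.maximum(A.sum(axis=1, keepdims=True), 1.0)
        local = ((A @ X) / deg) @ Wm
        # Global channel: self-attention over all node pairs, ignoring edges.
        # This is O(N^2); the GPS recipe reaches O(N+E) with linear attention.
        Q, K, V = X @ Wq, X @ Wk, X @ Wv
        glob = softmax(Q @ K.T / np.sqrt(K.shape[1])) @ V
        # The two channels run in parallel and are combined by summation.
        return local + glob

Positional/structural encodings (ingredient (i)) are assumed to have been concatenated into the node features X before the layer is applied.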
12:30 - 13:45 Lunch (Sunshine Café)
13:45 - 14:45 Soledad Villar: Exact Units Equivariance in Machine Learning (Arts Building room 386)
14:45 - 15:30 Guy Wolf: Multiscale Exploration of Single Cell Data with Geometric Harmonic Analysis (Arts Building room 386)
15:30 - 16:00 Coffee Break (ASC 310)
16:30 - 18:00 Discussion Panel (Arts Building room 386)
18:00 - 19:00 Dinner (Sunshine Café)
Friday, July 15
08:00 - 09:00 Breakfast (Sunshine Café)
09:00 - 09:45 Hannes Stärk: EquiBind: Geometric Deep Learning for Drug Binding Structure Prediction (Arts Building room 386 + ZOOM)
09:45 - 10:30 Alex Tong: Multiscale Earth Mover’s Distances (Arts Building room 386 + ZOOM)
10:30 - 11:00 Check-out (Nechako Residence)