
Speaker: Arash A. Amini (Associate Professor, Dept. of Statistics and Data Science, University of California, Los Angeles)
Title: "Polynomial Graph Neural Networks: Theoretical Limits and Graph Noise Impact"
Abstract: This talk examines the theoretical foundations of Graph Neural Networks (GNNs), focusing on polynomial GNNs (Poly-GNNs). We begin with empirical evidence challenging the need for complex GNN architectures in semi-supervised node classification, showing that simpler methods often perform comparably.
We then analyze Poly-GNNs within a contextual stochastic block model, addressing a key question: Does increasing GNN depth improve class separation in node representations? Our results show that for large graphs, the rate of class separation remains constant regardless of network depth. We demonstrate how “graph noise” can overpower other signals in deeper networks, negating the benefits of additional feature aggregation.
Our analysis employs techniques from random matrix theory, leading to the combinatorics of walks and walk sequences on graphs. To obtain sharp bounds on the rate of separation, we characterize the leading term in the graph noise; these insights pave the way for deriving more precise limit theorems for Poly-GNNs.
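As background for the abstract, the setting can be illustrated with a minimal sketch (an assumption for illustration, not the speaker's code or model specification): a two-block stochastic block model with class-dependent node features, and a depth-K polynomial GNN that aggregates features through powers of the adjacency matrix. Class separation is measured as the distance between the class-mean representations. All parameter values below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Toy contextual stochastic block model (hypothetical parameters) ---
n, d, K = 200, 5, 3          # nodes, feature dimension, polynomial depth
z = rng.integers(0, 2, n)    # latent block (class) labels
p_in, p_out = 0.10, 0.02     # within- / between-block edge probabilities
P = np.where(z[:, None] == z[None, :], p_in, p_out)
A = (rng.random((n, n)) < P).astype(float)
A = np.triu(A, 1)
A = A + A.T                  # symmetric adjacency, no self-loops

# Node features: class-dependent mean plus Gaussian noise
mu = np.array([1.0, -1.0])
X = mu[z, None] + rng.normal(size=(n, d))

# --- Depth-K Poly-GNN sketch: H = sum_{k=0}^{K} A^k X ---
# (one simple linear-aggregation form; deeper K mixes in longer walks,
#  and hence more "graph noise" from the random adjacency matrix)
H = X.copy()
Ak = np.eye(n)
for _ in range(K):
    Ak = Ak @ A
    H = H + Ak @ X

# Class separation: distance between the two class-mean representations
sep = np.linalg.norm(H[z == 0].mean(axis=0) - H[z == 1].mean(axis=0))
```

The talk's question, in this toy language, is how `sep` scales with `K` as the graph grows; the abstract's result is that for large graphs the rate of separation does not improve with depth, because the noise carried by the powers of A grows alongside the aggregated signal.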
Bio: Arash A. Amini is an Associate Professor of Statistics and Data Science at the University of California, Los Angeles. He received his Ph.D. in electrical engineering from the University of California, Berkeley in 2011, and completed a postdoctoral fellowship at the University of Michigan. His research spans high-dimensional statistics, functional and nonparametric estimation, network data analysis, optimization, and graphical models, with recent work shedding light on the performance limits of graph neural networks. At UCLA, he teaches courses ranging from introductory machine learning to advanced theoretical statistics, mentoring the next generation of statisticians and data scientists.
Faculty website: http://www.stat.ucla.edu/~arashamini/