Fast and Effective GNN Training through Sequences of Random Path Graphs

Fabio Vitale
Francesco Paolo Nerini
Andre Panisson
Francesco Bonchi
2025

Abstract

We present a novel scalable framework for training GNNs on node classification tasks, based on effective resistance, a standard tool in spectral graph theory. Unlike other spectral and graph-modification approaches to GNN training, our method progressively refines the GNN weights on a sequence of random spanning trees suitably transformed into path graphs which, despite their simplicity, are shown to retain essential topological and node information of the original input graph. The sparse nature of these path graphs substantially lightens the computational burden of GNN training, which not only enhances scalability but also improves accuracy in subsequent test phases. In particular, we focus on small training set regimes, which are of great practical importance since in many real-world scenarios labels may be challenging to obtain. We show that our framework yields strong empirical results because it effectively counters the training deterioration caused by overfitting when the training set is small. Moreover, we successfully address common issues such as over-squashing and over-smoothing while simultaneously avoiding under-reaching phenomena. Although our framework is flexible and can be deployed with several types of GNNs, in this paper we focus on graph convolutional networks and carry out an extensive experimental investigation on a number of real-world graph benchmarks, achieving simultaneous improvements in training speed and test accuracy over a wide pool of representative baselines.
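To make the core construction concrete, the following is a minimal sketch of the two graph operations the abstract describes: sampling a uniform random spanning tree and turning it into a path graph. The abstract does not specify the paper's exact algorithms, so this sketch makes two labeled assumptions: the spanning tree is sampled with the Aldous-Broder random-walk method, and the tree-to-path transformation is a DFS-preorder linearization (consecutive nodes in the traversal become path edges). A GNN would then be trained on each such path in sequence.

```python
import random
from collections import defaultdict

def random_spanning_tree(adj, seed=None):
    # Aldous-Broder: walk the (connected) graph at random; the edge used to
    # enter each node for the first time yields a uniform random spanning tree.
    rng = random.Random(seed)
    nodes = list(adj)
    current = rng.choice(nodes)
    visited = {current}
    tree = defaultdict(set)          # undirected adjacency of the sampled tree
    while len(visited) < len(nodes):
        nxt = rng.choice(adj[current])
        if nxt not in visited:
            visited.add(nxt)
            tree[current].add(nxt)
            tree[nxt].add(current)
        current = nxt
    return tree

def tree_to_path(tree, root):
    # Assumed linearization (not from the abstract): DFS preorder over the
    # tree; consecutive visited nodes become the edges of a path graph.
    order, stack, seen = [], [root], set()
    while stack:
        v = stack.pop()
        if v in seen:
            continue
        seen.add(v)
        order.append(v)
        stack.extend(sorted(tree[v], reverse=True))
    return list(zip(order, order[1:]))

# Toy 4-node graph as an adjacency-list dict.
adj = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1, 3], 3: [1, 2]}
tree = random_spanning_tree(adj, seed=7)
path_edges = tree_to_path(tree, root=0)   # n - 1 edges; every node has degree <= 2
```

Each epoch would redraw a fresh tree (and hence a fresh path), so over the training sequence the paths collectively cover many of the original graph's edges while each individual step stays extremely sparse.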