SWIFT: Enabling Large-Scale Temporal Graph Learning on a Single Machine

Research output: Contribution to journal › Journal article › peer-review

Abstract

Temporal graph neural networks (T-GNNs) are crucial for modeling dynamic graphs, capturing evolving structures and interactions to address complex temporal properties in applications like event prediction, dynamic social network analysis, and temporal knowledge graph reasoning. However, training T-GNNs is hampered by the massive scale of graphs and complex temporal dynamics, leading to significant runtime and memory efficiency challenges. To tackle these challenges, this paper proposes SWIFT, the first secondary memory-based T-GNN training system for large-scale temporal graph learning on a single machine. SWIFT employs a novel bucket-based pipeline parallelism strategy to efficiently manage data flows across GPU, main, and secondary memories, addressing the computation and memory bottlenecks that hinder scaling for large-scale temporal graphs. Remarkably, SWIFT surpasses its main memory-based counterparts in runtime efficiency while requiring significantly less main memory. Extensive experiments demonstrate that SWIFT achieves up to a 4.3× speedup and a 7.9× reduction in main memory usage compared to state-of-the-art baselines on large temporal graphs.
Original language: English
Article number: 266
Pages (from-to): 1-27
Number of pages: 27
Journal: Proceedings of the ACM on Management of Data
Volume: 3
Issue number: 4
DOIs
Publication status: Published - 23 Sept 2025

