Towards Deepening Graph Neural Networks: A GNTK-based Optimization Perspective

Wei Huang*, Yayong Li, Weitao Du, Jie Yin, Richard Yi Da Xu, Ling Chen, Miao Zhang

*Corresponding author for this work

Research output: Chapter in book/report/conference proceeding › Conference proceeding › peer-review

Abstract

Graph convolutional networks (GCNs) and their variants have achieved great success in dealing with graph-structured data. Nevertheless, it is well known that deep GCNs suffer from the over-smoothing problem, where node representations tend to become indistinguishable as more layers are stacked. The theoretical research to date on deep GCNs has focused primarily on expressive power rather than trainability, an optimization perspective. Compared to expressivity, trainability addresses a more fundamental question: given a sufficiently expressive space of models, can we successfully find a good solution via gradient descent-based optimizers? This work fills this gap by exploiting the Graph Neural Tangent Kernel (GNTK), which governs the optimization trajectory of wide GCNs under gradient descent. We formulate the asymptotic behavior of the GNTK in the large-depth limit, which enables us to reveal that the trainability of wide and deep GCNs decays at an exponential rate during optimization. Additionally, we extend our theoretical framework to analyze residual connection-based techniques, which are found to only mildly mitigate the exponential decay of trainability. Inspired by our theoretical insights on trainability, we propose Critical DropEdge, a connectivity-aware and graph-adaptive sampling method, to alleviate the exponential decay problem more fundamentally. Experimental evaluation consistently confirms that our proposed method achieves better results than relevant counterparts in both the infinite-width and finite-width settings.
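For context, the gradient-descent dynamics that the abstract says the GNTK governs can be stated in the standard NTK form (this is textbook NTK theory, not necessarily the paper's exact formulation): under gradient flow on a squared loss with learning rate \eta, the outputs f_t(X) of a sufficiently wide network on training inputs X with labels y evolve as

    \frac{d f_t(X)}{dt} = -\eta \, \Theta(X, X) \left( f_t(X) - y \right),

where \Theta(X, X) is the (graph) neural tangent kernel matrix. Trainability degrades as \Theta becomes near-degenerate, which is the large-depth exponential-decay phenomenon the abstract describes.

The abstract does not spell out how Critical DropEdge samples edges, so the sketch below only illustrates the generic DropEdge mechanism it builds on: independently resampling the edge set each training epoch, with an optional per-edge keep probability standing in for the connectivity-aware, graph-adaptive rule. All names here (drop_edges, edge_index, keep_prob) are illustrative, not from the paper.

import numpy as np

def drop_edges(edge_index, keep_prob, rng=None):
    """Independently keep each edge of a graph with some probability.

    edge_index: (2, E) integer array of directed edges (row 0: sources,
                row 1: targets), as commonly used in GNN libraries.
    keep_prob:  scalar in (0, 1], or a length-E array giving per-edge
                keep probabilities (a stand-in for a connectivity-aware,
                graph-adaptive rule such as Critical DropEdge's).
    Returns the subsampled (2, E') edge array.
    """
    rng = np.random.default_rng() if rng is None else rng
    num_edges = edge_index.shape[1]
    mask = rng.random(num_edges) < np.broadcast_to(keep_prob, (num_edges,))
    return edge_index[:, mask]

# Example: resample the edge set once per training epoch.
edges = np.array([[0, 1, 2, 2],
                  [1, 2, 0, 3]])
sparser_edges = drop_edges(edges, keep_prob=0.8)
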
Original language: English
Title of host publication: Proceedings of Tenth International Conference on Learning Representations, ICLR 2022
Publisher: International Conference on Learning Representations
Pages: 1-26
Number of pages: 26
Publication status: Published - 25 Apr 2022
Event: The Tenth International Conference on Learning Representations, ICLR 2022 - Virtual
Duration: 25 Apr 2022 – 29 Apr 2022
https://iclr.cc/Conferences/2022
https://openreview.net/group?id=ICLR.cc/2022/Conference

Conference

Conference: The Tenth International Conference on Learning Representations, ICLR 2022
Period: 25/04/22 – 29/04/22

