On the Overlooked Structure of Stochastic Gradients

Zeke Xie*, Qian Yuan Tang*, Mingming Sun, Ping Li

*Corresponding author for this work

Research output: Chapter in book/report/conference proceeding › Conference proceeding › peer-review

2 Citations (Scopus)

Abstract

Stochastic gradients are closely related to both the optimization and the generalization of deep neural networks (DNNs). Some works attempted to explain the success of stochastic optimization for deep learning by the arguably heavy-tailed properties of gradient noise, while other works presented theoretical and empirical evidence against the heavy-tail hypothesis on gradient noise. Unfortunately, formal statistical tests for analyzing the structure and heavy tails of stochastic gradients in deep learning are still under-explored. In this paper, we make two main contributions. First, we conduct formal statistical tests on the distribution of stochastic gradients and gradient noise across both parameters and iterations. Our statistical tests reveal that dimension-wise gradients usually exhibit power-law heavy tails, while iteration-wise gradients and the stochastic gradient noise caused by minibatch training usually do not. Second, we further discover that the covariance spectra of stochastic gradients have power-law structures overlooked by previous studies, and we present their theoretical implications for the training of DNNs. While previous studies believed that the anisotropic structure of stochastic gradients matters to deep learning, they did not anticipate that the gradient covariance could have such an elegant mathematical structure. Our work challenges this existing belief and provides novel insights into the structure of stochastic gradients in deep learning.
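The power-law structure of a covariance spectrum described above can be illustrated with a toy log-log fit. This is a minimal sketch, not from the paper: the function name and the least-squares regression are illustrative assumptions, and the paper's formal statistical tests (goodness-of-fit procedures) are not replaced by this simple fit. It only shows what "eigenvalues decaying as a power law in their rank" means.

```python
import numpy as np

def powerlaw_exponent(eigvals):
    """Estimate the exponent s of a spectrum assumed to decay as
    lambda_k ~ k^(-s), by fitting a line to log(lambda_k) vs log(k).
    (Illustrative sketch only; not the paper's statistical test.)"""
    eigvals = np.sort(np.asarray(eigvals, dtype=float))[::-1]  # descending
    k = np.arange(1, len(eigvals) + 1)
    slope, _intercept = np.polyfit(np.log(k), np.log(eigvals), 1)
    return -slope  # s > 0 for a decaying spectrum

# Synthetic check: a spectrum with exact k^(-2) decay recovers s = 2.
spectrum = np.arange(1, 101, dtype=float) ** -2.0
print(round(powerlaw_exponent(spectrum), 3))  # -> 2.0
```

In practice one would compute `eigvals` from the empirical covariance matrix of minibatch gradients; here a synthetic spectrum is used so the expected exponent is known exactly.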

Original language: English
Title of host publication: 37th Conference on Neural Information Processing Systems, NeurIPS 2023
Editors: A. Oh, T. Naumann, A. Globerson, K. Saenko, M. Hardt, S. Levine
Publisher: Neural Information Processing Systems Foundation
Number of pages: 20
ISBN (Print): 9781713899921
Publication status: Published - Dec 2023
Event: 37th Conference on Neural Information Processing Systems, NeurIPS 2023 - Ernest N. Morial Convention Center, New Orleans, United States
Duration: 10 Dec 2023 - 16 Dec 2023
https://proceedings.neurips.cc/paper_files/paper/2023 (conference paper search)
https://openreview.net/group?id=NeurIPS.cc/2023/Conference#tab-accept-oral (conference paper search)
https://neurips.cc/Conferences/2023 (conference website)

Publication series

Name: Advances in Neural Information Processing Systems
Volume: 36
ISSN (Print): 1049-5258
Name: NeurIPS Proceedings

Conference

Conference: 37th Conference on Neural Information Processing Systems, NeurIPS 2023
Country/Territory: United States
City: New Orleans
Period: 10/12/23 - 16/12/23

Scopus Subject Areas

  • Computer Networks and Communications
  • Information Systems
  • Signal Processing
