Adversarial Robustness through the Lens of Causality

Yonggang Zhang, Mingming Gong, Tongliang Liu, Gang Niu, Xinmei Tian, Bo Han*, Bernhard Schölkopf, Kun Zhang

*Corresponding author for this work

Research output: Contribution to conference › Paper › peer-review

Abstract

The adversarial vulnerability of deep neural networks (DNNs) has attracted significant attention in machine learning. Because causal reasoning is naturally suited to modeling distribution change, it is essential to incorporate causality into the analysis of the specific type of distribution change induced by adversarial attacks. However, causal formulations of the intuition behind adversarial attacks, and of the development of robust DNNs, are still lacking in the literature. To bridge this gap, we construct a causal graph to model the generation process of adversarial examples and define the adversarial distribution to formalize the intuition behind adversarial attacks. From the causal perspective, we study the distinction between the natural and the adversarial distribution and conclude that the origin of adversarial vulnerability is models' reliance on spurious correlations. Inspired by this causal understanding, we propose the Causal-inspired Adversarial distribution alignment method, CausalAdv, which eliminates the difference between the natural and adversarial distributions by taking spurious correlations into account. Extensive experiments demonstrate the efficacy of the proposed method. Our work is the first attempt to use causality to understand and mitigate adversarial vulnerability.
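To make the distribution change the abstract refers to concrete, the sketch below generates an adversarial example with the standard fast gradient sign method (FGSM), which is not the paper's CausalAdv method but the kind of attack whose induced distribution shift the paper analyzes. It uses a toy logistic-regression model; the weights, input, and epsilon are illustrative assumptions, not values from the paper.

```python
import numpy as np

def fgsm_attack(x, y, w, eps):
    """One-step FGSM on a linear logistic model (illustrative sketch).

    Loss: L(x) = log(1 + exp(-y * w.x)), with label y in {-1, +1}.
    Gradient wrt the input: dL/dx = -y * sigmoid(-y * w.x) * w.
    FGSM perturbs x by eps along the sign of that gradient,
    i.e. in the direction that increases the loss fastest.
    """
    margin = y * w.dot(x)
    grad = -y * (1.0 / (1.0 + np.exp(margin))) * w
    return x + eps * np.sign(grad)

# Toy setup (assumed values): the clean input is correctly classified.
w = np.array([2.0, -1.0])
x = np.array([1.0, 1.0])   # w.dot(x) = 1.0 > 0, so label +1 is predicted
y = 1.0
x_adv = fgsm_attack(x, y, w, eps=0.6)
print(w.dot(x), w.dot(x_adv))  # the adversarial margin shrinks below zero
```

A small sign-aligned perturbation flips the model's decision while leaving the input visually similar, which is exactly the shift from the natural to the adversarial distribution that the causal graph in the paper is built to model.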
Original language: English
Number of pages: 20
Publication status: Published - 25 Apr 2022
Event: ICLR 2022 - Tenth International Conference on Learning Representations (Virtual)
Duration: 25 Apr 2022 - 29 Apr 2022
https://iclr.cc/Conferences/2022


User-Defined Keywords

  • Adversarial examples
  • Causality
