Learning without Isolation: Pathway Protection for Continual Learning

  • Zhikang Chen
  • Abudukelimu Wuerkaixi
  • Sen Cui
  • Haoxuan Li
  • Ding Li
  • Jingfeng Zhang
  • Bo Han
  • Gang Niu
  • Houfang Liu
  • Yi Yang*
  • Sifan Yang
  • Changshui Zhang*
  • Tianling Ren*

*Corresponding author for this work

Research output: Chapter in book/report/conference proceeding › Conference proceeding › peer-review

Abstract

Deep networks are prone to catastrophic forgetting during sequential task learning, i.e., they lose knowledge of old tasks upon learning new ones. Continual learning (CL) has emerged to address this, and existing methods focus mostly on regularizing or protecting the parameters associated with previous tasks. However, parameter protection is often impractical: the number of parameters needed to store old-task knowledge grows linearly with the number of tasks, and otherwise it is hard to preserve the parameters tied to old-task knowledge. In this work, we bring a dual view from neuroscience and physics to CL: across the whole network, the pathways matter more than the parameters where knowledge acquired from old tasks is concerned. Following this view, we propose a novel CL framework, learning without isolation (LwI), in which model fusion is formulated as graph matching and the pathways occupied by old tasks are protected without being isolated. Thanks to the sparsity of activation channels in a deep network, LwI can adaptively allocate available pathways to a new task, realizing pathway protection and addressing catastrophic forgetting in a parameter-efficient manner. Experiments on popular benchmark datasets demonstrate the superiority of the proposed LwI.
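To make the "model fusion as graph matching" idea concrete, here is a minimal, hypothetical sketch (not the authors' code): two task-specific copies of a layer are fused by first matching their output channels, treating channels as graph nodes and solving a maximum-weight bipartite matching on a cosine-similarity cost, so that aligned pathways are merged rather than overwritten. The function name `match_and_fuse` and the simple averaging step are illustrative assumptions.

```python
# Hypothetical sketch of channel matching for model fusion, in the spirit of
# formulating fusion as graph matching. Not the LwI implementation.
import numpy as np
from scipy.optimize import linear_sum_assignment

def match_and_fuse(W_old, W_new, alpha=0.5):
    """Align the channels of W_new to W_old, then average the aligned weights.

    W_old, W_new: (out_channels, in_features) weights of the same layer trained
    on the old and new task. alpha weights the old-task parameters.
    """
    # Cosine similarity between every pair of output channels.
    a = W_old / (np.linalg.norm(W_old, axis=1, keepdims=True) + 1e-12)
    b = W_new / (np.linalg.norm(W_new, axis=1, keepdims=True) + 1e-12)
    sim = a @ b.T
    # Maximum-weight bipartite matching over channels (negate for a min-cost
    # solver); this is the graph-matching step.
    _, cols = linear_sum_assignment(-sim)
    W_aligned = W_new[cols]  # permute new-task channels onto old-task ones
    return alpha * W_old + (1 - alpha) * W_aligned

# Demo: if the new weights are a permuted, slightly perturbed copy of the old
# ones, matching recovers the permutation before fusing.
rng = np.random.default_rng(0)
W_old = rng.normal(size=(4, 8))
W_new = W_old[rng.permutation(4)] + 0.01 * rng.normal(size=(4, 8))
fused = match_and_fuse(W_old, W_new)
print(np.allclose(fused, W_old, atol=0.1))
```

Averaging aligned weights is the simplest possible fusion rule; the point of the sketch is only the matching step, which lets old-task pathways keep their roles instead of being isolated in dedicated parameters.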

Original language: English
Title of host publication: Proceedings of the 42nd International Conference on Machine Learning, ICML 2025
Publisher: ML Research Press
Pages: 9377-9399
Number of pages: 23
Publication status: Published - Jul 2025
Event: 42nd International Conference on Machine Learning, ICML 2025 - Vancouver Convention Center, Vancouver, Canada
Duration: 13 Jul 2025 - 19 Jul 2025
https://icml.cc/Conferences/2025 (Conference Website)
https://icml.cc/virtual/2025/calendar (Conference Calendar)
https://proceedings.mlr.press/v267/ (Conference Proceedings)

Publication series

Name: Proceedings of Machine Learning Research
Publisher: ML Research Press
Volume: 267

Conference

Conference: 42nd International Conference on Machine Learning, ICML 2025
Country/Territory: Canada
City: Vancouver
Period: 13/07/25 - 19/07/25

UN SDGs

This output contributes to the following UN Sustainable Development Goals (SDGs)

  1. SDG 4 - Quality Education
