Inducer-tuning: Connecting Prefix-tuning and Adapter-tuning

Yifan Chen, Devamanyu Hazarika, Mahdi Namazifar, Yang Liu, Di Jin*, Dilek Hakkani-Tur

*Corresponding author for this work

Research output: Chapter in book/report/conference proceeding › Conference proceeding › peer-review

3 Citations (Scopus)

Abstract

Prefix-tuning, or more generally continuous prompt tuning, has become an essential paradigm of parameter-efficient transfer learning. Using a large pre-trained language model (PLM), prefix-tuning can obtain strong performance by training only a small portion of parameters. In this paper, we propose to understand and further develop prefix-tuning through the kernel lens. Specifically, we draw an analogy between prefixes and inducing variables in kernel methods and hypothesize that prefixes that serve as inducing variables would improve the overall mechanism. From the kernel-estimator perspective, we suggest a new variant of prefix-tuning, inducer-tuning, which shares the exact mechanism with prefix-tuning while leveraging the residual form found in adapter-tuning; this mitigates the initialization issue in prefix-tuning. Through comprehensive empirical experiments on natural language understanding and generation tasks, we demonstrate that inducer-tuning can close the performance gap between prefix-tuning and fine-tuning.
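
The abstract describes the mechanism only at a high level. The PyTorch sketch below is one illustrative reading of it, not the authors' published code: prefix-tuning prepends trainable key/value vectors to each attention layer, while the hypothetical Inducer module derives query-dependent inducing variables through an adapter-style residual bottleneck (zero-initialized, so each inducer starts at its query rather than at a random vector). All names, shapes, the bottleneck width, and the choice to reuse the inducers as both extra keys and values are assumptions.

```python
# Illustrative sketch only: contrasts vanilla prefix-tuning with a residual,
# query-dependent "inducer" variant suggested by the abstract. The Inducer
# module and its details are assumptions, not the paper's implementation.
import torch
import torch.nn.functional as F


def attention(q, k, v):
    """Scaled dot-product attention: softmax(q k^T / sqrt(d)) v."""
    d = q.size(-1)
    return F.softmax(q @ k.transpose(-2, -1) / d ** 0.5, dim=-1) @ v


def prefix_attention(q, k, v, p_k, p_v):
    """Prefix-tuning: trainable prefixes (m x d) act as extra key/value
    pairs, i.e. as inducing variables every query can attend to."""
    return attention(q, torch.cat([p_k, k]), torch.cat([p_v, v]))


class Inducer(torch.nn.Module):
    """Hypothetical inducer: query + low-rank residual, mirroring the
    adapter-style residual form the abstract credits with easing the
    initialization issue of randomly initialized prefixes."""

    def __init__(self, d, r=8):
        super().__init__()
        self.down = torch.nn.Linear(d, r, bias=False)
        self.up = torch.nn.Linear(r, d, bias=False)
        torch.nn.init.zeros_(self.up.weight)  # residual is 0 at init

    def forward(self, q):
        return q + self.up(torch.relu(self.down(q)))


def inducer_attention(q, k, v, inducer):
    """Inducer-tuning (sketch): same attention mechanism as prefix-tuning,
    but the extra key/value pairs are computed from the queries themselves."""
    z = inducer(q)  # one inducing variable per query token
    return attention(q, torch.cat([z, k]), torch.cat([z, v]))


if __name__ == "__main__":
    d, n, m = 16, 5, 4
    q, k, v = (torch.randn(n, d) for _ in range(3))
    p_k, p_v = torch.randn(m, d), torch.randn(m, d)
    print(prefix_attention(q, k, v, p_k, p_v).shape)     # torch.Size([5, 16])
    print(inducer_attention(q, k, v, Inducer(d)).shape)  # torch.Size([5, 16])
```

Under these assumptions, both variants leave the frozen attention weights untouched and train only the small set of extra parameters; the inducer variant simply replaces fixed prefixes with query-adjacent ones.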

Original language: English
Title of host publication: Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, EMNLP 2022
Publisher: Association for Computational Linguistics (ACL)
Pages: 793-808
Number of pages: 16
ISBN (Print): 9781959429401
Publication status: Published - Dec 2022
Event: 2022 Conference on Empirical Methods in Natural Language Processing, EMNLP 2022 - Abu Dhabi, United Arab Emirates
Duration: 7 Dec 2022 – 11 Dec 2022
https://2022.emnlp.org/ (Conference website)
https://aclanthology.org/events/emnlp-2022/#2022emnlp-main (Conference proceeding)

Conference

Conference: 2022 Conference on Empirical Methods in Natural Language Processing, EMNLP 2022
Country/Territory: United Arab Emirates
City: Abu Dhabi
Period: 7/12/22 – 11/12/22

Scopus Subject Areas

  • Computational Theory and Mathematics
  • Computer Science Applications
  • Information Systems

