NAS-LID: Efficient Neural Architecture Search with Local Intrinsic Dimension

Xin He, Jiangchao Yao, Yuxin Wang, Zhenheng Tang, Ka Chun Cheung, Simon See, Bo Han, Xiaowen Chu*

*Corresponding author for this work

Research output: Conference proceeding (peer-reviewed)

7 Citations (Scopus)

Abstract

One-shot neural architecture search (NAS) substantially improves the search efficiency by training one supernet to estimate the performance of every possible child architecture (i.e., subnet). However, the inconsistency of characteristics among subnets incurs serious interference in the optimization, resulting in poor performance ranking correlation of subnets. Subsequent explorations decompose supernet weights via a particular criterion, e.g., gradient matching, to reduce the interference; yet they suffer from huge computational cost and low space separability. In this work, we propose a lightweight and effective local intrinsic dimension (LID)-based method, NAS-LID. NAS-LID evaluates the geometrical properties of architectures by calculating the low-cost LID features layer-by-layer, and the similarity characterized by LID enjoys better separability compared with gradients, which thus effectively reduces the interference among subnets. Extensive experiments on NASBench-201 indicate that NAS-LID achieves superior performance with better efficiency. Specifically, compared to the gradient-driven method, NAS-LID can save up to 86% of GPU memory overhead when searching on NASBench-201. We also demonstrate the effectiveness of NAS-LID on ProxylessNAS and OFA spaces. Source code: https://github.com/marsggbo/NAS-LID.
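The layer-wise LID features mentioned in the abstract are typically obtained with a nearest-neighbor estimator applied to a layer's activations. As a rough illustration only (not the paper's exact procedure), the sketch below implements the classical maximum-likelihood LID estimator of Levina and Bickel, which the NAS-LID authors build on; the function name and the choice of `k` are hypothetical.

```python
import numpy as np

def lid_mle(x, reference, k=20):
    """Maximum-likelihood LID estimate for a single point x,
    computed from its k nearest neighbors in `reference`.

    Illustrative sketch: the paper's layer-wise procedure may
    differ in how activations are sampled and aggregated.
    """
    # Euclidean distances from x to every reference point
    dists = np.linalg.norm(reference - x, axis=1)
    dists = np.sort(dists)
    # Drop zero distances (x itself, exact duplicates), keep k nearest
    dists = dists[dists > 0][:k]
    r_k = dists[-1]  # distance to the k-th nearest neighbor
    # LID_hat = -( (1/k) * sum_i log(r_i / r_k) )^(-1)
    return -1.0 / np.mean(np.log(dists / r_k))
```

Applied to the flattened activations that a batch of inputs produces at each layer, this yields one low-cost scalar per layer, so an architecture is summarized by a short LID vector rather than by full gradients.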
Original language: English
Title of host publication: Proceedings of the 37th AAAI Conference on Artificial Intelligence
Editors: Brian Williams, Yiling Chen, Jennifer Neville
Place of publication: Washington, DC
Publisher: AAAI Press
Pages: 7839-7847
Number of pages: 9
Edition: 1st
ISBN (Electronic): 9781577358800
DOIs
Publication status: Published - 27 Jun 2023
Event: 37th AAAI Conference on Artificial Intelligence, AAAI 2023 - Washington, United States
Duration: 7 Feb 2023 - 14 Feb 2023
https://ojs.aaai.org/index.php/AAAI/issue/view/553
https://aaai-23.aaai.org/

Publication series

Name: Proceedings of the AAAI Conference on Artificial Intelligence
Publisher: AAAI Press
Number: 6
Volume: 37
ISSN (Print): 2159-5399
ISSN (Electronic): 2374-3468

Conference

Conference: 37th AAAI Conference on Artificial Intelligence, AAAI 2023
Country/Territory: United States
City: Washington
Period: 7/02/23 - 14/02/23

