Abstract
The task of transfer learning using pretrained convolutional neural networks is considered. We propose a convolution-SVD layer to analyze the convolution operators with a singular value decomposition computed in the Fourier domain. Singular vectors extracted from the source domain are transferred to the target domain, whereas the singular values are fine-tuned with a target data set. In this way, dimension reduction is achieved to avoid overfitting, while some flexibility to fine-tune the convolution kernels is maintained. We extend an existing convolution kernel reconstruction algorithm to allow for a reconstruction from an arbitrary set of learned singular values. A generalization bound for a single convolution-SVD layer is devised to show the consistency between training and testing errors. We further introduce a notion of transfer learning gap. We prove that the testing error for a single convolution-SVD layer is bounded in terms of the gap, which motivates us to develop a regularization model with the gap as the regularizer. Numerical experiments are conducted to demonstrate the superiority of the proposed model on classification problems and to examine the influence of various parameters. In particular, the regularization is shown to yield a significantly higher prediction accuracy.
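The Fourier-domain SVD the abstract refers to can be illustrated with the standard per-frequency construction for circular convolutions: the singular values of the full convolution operator are the union, over all spatial frequencies, of the singular values of the small channel-transfer matrices obtained by a 2D FFT of the zero-padded kernel. The sketch below is a minimal NumPy illustration of that construction under these assumptions; the function name and shapes are illustrative and not taken from the paper.

```python
import numpy as np

def conv_singular_values(kernel, n):
    """Singular values of a circular 2D convolution (illustrative sketch).

    kernel: array of shape (c_out, c_in, k, k)
    n:      spatial size of the (n x n) input feature maps
    Returns an array of shape (n, n, min(c_out, c_in)): for each spatial
    frequency, the singular values of the c_out x c_in transfer matrix
    given by the 2D FFT of the zero-padded kernel.
    """
    c_out, c_in, k, k2 = kernel.shape
    assert k == k2 and k <= n
    padded = np.zeros((c_out, c_in, n, n), dtype=kernel.dtype)
    padded[:, :, :k, :k] = kernel
    # FFT over the spatial axes: one c_out x c_in matrix per frequency.
    transfer = np.fft.fft2(padded, axes=(2, 3))   # (c_out, c_in, n, n)
    transfer = transfer.transpose(2, 3, 0, 1)     # (n, n, c_out, c_in)
    # Batched SVD; the union of these values over all frequencies is the
    # spectrum of the full convolution operator.
    return np.linalg.svd(transfer, compute_uv=False)

# Example: a random 3x3 kernel mapping 16 -> 32 channels on 8x8 maps.
sv = conv_singular_values(np.random.randn(32, 16, 3, 3), n=8)
print(sv.shape, sv.max())  # (8, 8, 16); max is the layer's spectral norm
```

In the transfer-learning setting described above, the per-frequency singular vectors (`compute_uv=True` would return them) would be frozen from the source model, while only the singular values `sv` are treated as trainable parameters on the target data set.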
| Original language | English |
| --- | --- |
| Pages (from-to) | 1678–1712 |
| Number of pages | 35 |
| Journal | Neural Computation |
| Volume | 35 |
| Issue number | 10 |
| Early online date | 8 Sept 2023 |
| DOIs | |
| Publication status | Published - Oct 2023 |
Scopus Subject Areas
- Arts and Humanities (miscellaneous)
- Cognitive Neuroscience