Abstract
This paper provides a general result on controlling local Rademacher complexities, which relates, in an elegant form, complexities defined with constraints on expected norms to the corresponding ones defined with constraints on empirical norms. This result is convenient to apply and yields refined local Rademacher complexity bounds for function classes satisfying general entropy conditions. We demonstrate the power of these complexity bounds by applying them to simplify the derivation of effective generalization error bounds.
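To make the central quantity concrete, the sketch below gives a Monte Carlo estimate of the empirical Rademacher complexity of a finite function class on a fixed sample. This is an illustrative aid, not code from the paper; the function name `empirical_rademacher` and the toy data are assumptions for the example.

```python
import numpy as np

def empirical_rademacher(F_values, n_draws=2000, seed=0):
    """Monte Carlo estimate of the empirical Rademacher complexity
    R_hat(F) = E_sigma[ sup_{f in F} (1/n) * sum_i sigma_i * f(x_i) ],
    where F_values is a (num_functions, n) array holding each function's
    values f(x_1), ..., f(x_n) on a fixed sample."""
    rng = np.random.default_rng(seed)
    num_f, n = F_values.shape
    total = 0.0
    for _ in range(n_draws):
        sigma = rng.choice([-1.0, 1.0], size=n)  # i.i.d. Rademacher signs
        total += np.max(F_values @ sigma) / n    # sup over the finite class
    return total / n_draws

# Toy example: two functions evaluated on a sample of size 4
F = np.array([[1.0, -1.0, 1.0, -1.0],
              [0.5,  0.5, 0.5,  0.5]])
print(empirical_rademacher(F))
```

Constraining the class by an empirical norm (e.g. keeping only rows of `F_values` with small sample variance) before taking the supremum gives the "localized" variant that the paper's result controls.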
| Original language | English |
|---|---|
| Pages (from-to) | 320-330 |
| Number of pages | 11 |
| Journal | Neurocomputing |
| Volume | 218 |
| Publication status | Published - 19 Dec 2016 |
Scopus Subject Areas
- Computer Science Applications
- Cognitive Neuroscience
- Artificial Intelligence
User-Defined Keywords
- Covering numbers
- Learning theory
- Local Rademacher complexity