Abstract
Methods for computing the ridge parameter have been studied for more than four decades, yet there is still no way to compute its optimal value. Nevertheless, many methods have been proposed that empirically yield ridge regression estimators with smaller mean squared errors than the least squares estimator. This paper compares the mean squared errors of 26 existing methods for ridge regression across different scenarios. A new approach is also proposed, which minimizes the empirical mean squared error iteratively. The existing methods fall into two groups: those that are better than the least squares method in many cases, but only slightly so, and those that are much better than the least squares method in some cases but can be (sometimes much) worse in many others. The new method, though not uniformly the best, clearly outperforms the least squares method in many cases and underperforms it only slightly in a few.
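As a rough illustration of the kind of comparison described above, the sketch below simulates a collinear design and contrasts the empirical mean squared error of the least squares estimator with a ridge estimator whose parameter is chosen by the classical Hoerl-Kennard-Baldwin rule, one well-known rule from this literature. The sample size, coefficient vector, and noise level are assumed here for illustration and are not taken from the paper.

```python
import numpy as np

# Monte Carlo sketch: empirical MSE of OLS vs. ridge regression under
# multicollinearity, with the ridge parameter k set by the classical
# Hoerl-Kennard-Baldwin (HKB) rule. All settings are illustrative.

rng = np.random.default_rng(0)
n, p, sigma = 50, 5, 1.0
beta = np.ones(p)  # true coefficient vector (assumed for this simulation)

# Collinear design: a shared component makes the columns highly correlated.
base = rng.standard_normal((n, 1))
X = 0.9 * base + 0.1 * rng.standard_normal((n, p))

reps = 2000
mse_ols, mse_ridge = 0.0, 0.0
for _ in range(reps):
    y = X @ beta + sigma * rng.standard_normal(n)
    XtX = X.T @ X
    b_ols = np.linalg.solve(XtX, X.T @ y)
    # HKB ridge parameter: k = p * sigma_hat^2 / (b_ols' b_ols)
    resid = y - X @ b_ols
    sigma2_hat = resid @ resid / (n - p)
    k = p * sigma2_hat / (b_ols @ b_ols)
    b_ridge = np.linalg.solve(XtX + k * np.eye(p), X.T @ y)
    mse_ols += np.sum((b_ols - beta) ** 2)
    mse_ridge += np.sum((b_ridge - beta) ** 2)

print(f"OLS   empirical MSE: {mse_ols / reps:.4f}")
print(f"Ridge empirical MSE: {mse_ridge / reps:.4f}")
```

Under a strongly collinear design like this one, the ridge estimator typically shows a visibly smaller empirical MSE than least squares, which is the kind of gap the paper's comparisons quantify across scenarios.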
| Original language | English |
|---|---|
| Pages (from-to) | 625-639 |
| Number of pages | 15 |
| Journal | Computational Statistics |
| Volume | 30 |
| Issue number | 2 |
| Early online date | 31 Jan 2015 |
| DOIs | |
| Publication status | Published - Jun 2015 |
Scopus Subject Areas
- Statistics and Probability
- Statistics, Probability and Uncertainty
- Computational Mathematics
User-Defined Keywords
- Least squares
- Multicollinearity
- Optimal ridge parameter