TY - JOUR

T1 - Accelerating the quadratic lower-bound algorithm via optimizing the shrinkage parameter

AU - Tian, Guo Liang

AU - Tang, Man Lai

AU - Liu, Chunling

N1 - Funding Information:
The authors would like to thank the Editor, an Associate Editor and one referee for their helpful comments and suggestions. GL Tian’s research was supported in part by a grant (Project Code: 2009-1115-9042) from the HKU Seed Funding Program for Basic Research and in part by a grant (HKU 779210M) from the Research Grants Council of the Hong Kong Special Administrative Region.

PY - 2012/2/1

Y1 - 2012/2/1

N2 - When the Newton-Raphson algorithm or the Fisher scoring algorithm does not work and EM-type algorithms are not available, the quadratic lower-bound (QLB) algorithm may be a useful optimization tool. However, like all EM-type algorithms, the QLB algorithm may suffer from slow convergence, which can be viewed as the cost of having the ascent property. This paper proposes a novel 'shrinkage parameter' approach to accelerate the QLB algorithm while maintaining its simplicity and stability (i.e., monotonic increase of the log-likelihood). The strategy is first to construct a class of quadratic surrogate functions Q_r(θ|θ^(t)) that induces a class of QLB algorithms indexed by a 'shrinkage parameter' r (r ∈ R), and then to optimize r over R under some criterion of convergence. For three commonly used criteria (i.e., the smallest eigenvalue, the trace and the determinant), we derive a uniformly optimal shrinkage parameter and find an optimal QLB algorithm. Some theoretical justifications are also presented. Next, we generalize the optimal QLB algorithm to problems with a penalizing function and investigate the associated convergence properties. The optimal QLB algorithm is applied to fit a logistic regression model and a Cox proportional hazards model. Two real datasets are analyzed to illustrate the proposed methods.

AB - When the Newton-Raphson algorithm or the Fisher scoring algorithm does not work and EM-type algorithms are not available, the quadratic lower-bound (QLB) algorithm may be a useful optimization tool. However, like all EM-type algorithms, the QLB algorithm may suffer from slow convergence, which can be viewed as the cost of having the ascent property. This paper proposes a novel 'shrinkage parameter' approach to accelerate the QLB algorithm while maintaining its simplicity and stability (i.e., monotonic increase of the log-likelihood). The strategy is first to construct a class of quadratic surrogate functions Q_r(θ|θ^(t)) that induces a class of QLB algorithms indexed by a 'shrinkage parameter' r (r ∈ R), and then to optimize r over R under some criterion of convergence. For three commonly used criteria (i.e., the smallest eigenvalue, the trace and the determinant), we derive a uniformly optimal shrinkage parameter and find an optimal QLB algorithm. Some theoretical justifications are also presented. Next, we generalize the optimal QLB algorithm to problems with a penalizing function and investigate the associated convergence properties. The optimal QLB algorithm is applied to fit a logistic regression model and a Cox proportional hazards model. Two real datasets are analyzed to illustrate the proposed methods.

KW - Cox proportional hazards model

KW - EM-type algorithms

KW - Logistic regression

KW - Newton-Raphson algorithm

KW - Optimal QLB algorithm

KW - QLB algorithm

UR - http://www.scopus.com/inward/record.url?scp=80053280614&partnerID=8YFLogxK

U2 - 10.1016/j.csda.2011.07.013

DO - 10.1016/j.csda.2011.07.013

M3 - Journal article

AN - SCOPUS:80053280614

SN - 0167-9473

VL - 56

SP - 255

EP - 265

JO - Computational Statistics and Data Analysis

JF - Computational Statistics and Data Analysis

IS - 2

ER -