TY - JOUR
T1 - Efficient neural networks for solving variational inequalities
AU - Jiang, Suoliang
AU - Han, Deren
AU - Yuan, Xiaoming
N1 - Copyright:
Copyright 2012 Elsevier B.V., All rights reserved.
PY - 2012/6/1
Y1 - 2012/6/1
N2 - In this paper, we propose efficient neural network models for solving a class of variational inequality problems. Our first model can be viewed as a generalization of the basic projection neural network proposed by Friesz et al. [3]. Like the basic projection neural network, it requires only function evaluations and projections onto the constraint set, which makes the model very easy to implement, especially when the constraint set has a special structure such as a box or a ball. Under the condition that the underlying mapping F is pseudo-monotone with respect to a solution, a condition much weaker than those required by the basic projection neural network, we prove the global convergence of the proposed neural network. If F is strongly pseudo-monotone, we prove its global exponential stability. To improve the efficiency of the neural network, we then modify it by choosing a new direction that is bounded away from zero. Under the condition that the underlying mapping F is co-coercive, a condition slightly stronger than pseudo-monotonicity but still weaker than those required by the basic projection neural network, we prove the exponential stability and global convergence of the improved model. We also report computational results illustrating that the new method is more efficient than that of Friesz et al. [3].
AB - In this paper, we propose efficient neural network models for solving a class of variational inequality problems. Our first model can be viewed as a generalization of the basic projection neural network proposed by Friesz et al. [3]. Like the basic projection neural network, it requires only function evaluations and projections onto the constraint set, which makes the model very easy to implement, especially when the constraint set has a special structure such as a box or a ball. Under the condition that the underlying mapping F is pseudo-monotone with respect to a solution, a condition much weaker than those required by the basic projection neural network, we prove the global convergence of the proposed neural network. If F is strongly pseudo-monotone, we prove its global exponential stability. To improve the efficiency of the neural network, we then modify it by choosing a new direction that is bounded away from zero. Under the condition that the underlying mapping F is co-coercive, a condition slightly stronger than pseudo-monotonicity but still weaker than those required by the basic projection neural network, we prove the exponential stability and global convergence of the improved model. We also report computational results illustrating that the new method is more efficient than that of Friesz et al. [3].
KW - Co-coercive mappings
KW - Exponential stability
KW - Projection neural networks
KW - Pseudo-monotone mapping
KW - Variational inequalities
UR - http://www.scopus.com/inward/record.url?scp=84863413533&partnerID=8YFLogxK
U2 - 10.1016/j.neucom.2012.01.020
DO - 10.1016/j.neucom.2012.01.020
M3 - Journal article
AN - SCOPUS:84863413533
SN - 0925-2312
VL - 86
SP - 97
EP - 106
JO - Neurocomputing
JF - Neurocomputing
ER -