TY - GEN
T1 - Improved Lagrange nonlinear programming neural networks for inequality constraints
AU - Huang, Yuancan
PY - 2006
Y1 - 2006
AB - By redefining the multiplier associated with each inequality constraint as a positive definite function of the originally defined multiplier, say u_i^2, i = 1, 2, …, m, the nonnegativity constraints imposed on the inequality-constraint multipliers in the Karush-Kuhn-Tucker necessary conditions are removed completely. Hence it is no longer necessary to convert inequality constraints into equality constraints via slack variables in order to reuse results that concern only equality constraints. Using this technique, improved Lagrange nonlinear programming neural networks are devised that handle inequality constraints directly, without adding slack variables. The local stability of the proposed Lagrange neural networks is then analyzed rigorously with Liapunov's first approximation principle, and their convergence is examined in depth with LaSalle's invariance principle. Finally, an illustrative example shows that the proposed neural networks can effectively solve nonlinear programming problems.
KW - Convergence
KW - Lagrange neural network
KW - Nonlinear programming
KW - Stability
UR - http://www.scopus.com/inward/record.url?scp=34547500033&partnerID=8YFLogxK
DO - 10.1109/ISDA.2006.174
M3 - Conference contribution
AN - SCOPUS:34547500033
SN - 0769525288
SN - 9780769525280
T3 - Proceedings - ISDA 2006: Sixth International Conference on Intelligent Systems Design and Applications
SP - 158
EP - 166
BT - Proceedings - ISDA 2006
T2 - ISDA 2006: Sixth International Conference on Intelligent Systems Design and Applications
Y2 - 16 October 2006 through 18 October 2006
ER -