A new sequential optimality condition for constrained optimization and algorithmic consequences. (English) Zbl 1217.90148
The authors provide improved Karush-Kuhn-Tucker (KKT) conditions in limit (sequential) form for the nonlinear optimization problem \[ \text{Minimize}\quad f(x)\text{ subject to }h(x)= 0,\;g(x)\leq 0,\tag{P} \] where \(f:\mathbb{R}^n\to \mathbb{R}\), \(h: \mathbb{R}^n\to \mathbb{R}^m\) and \(g: \mathbb{R}^n\to \mathbb{R}^p\) have continuous first derivatives.
In detail, a feasible point \(x^*\) fulfills the “complementary approximate Karush-Kuhn-Tucker conditions” (CAKKT) if there exist sequences \(\{x^k\}\subset\mathbb{R}^n\), \(\{\lambda^k\}\subset\mathbb{R}^m\) and \(\{\mu^k\}\subset \mathbb{R}^p_+\) such that \[ \lim_{k\to \infty} x^k= x^*, \]
\[ \lim_{k\to \infty}\|\nabla f(x^k)+ \nabla h(x^k)\lambda^k+\nabla g(x^k)\mu^k\|= 0, \]
\[ \begin{aligned} \lim_{k\to \infty}\mu^k_i g_i(x^k)= 0\quad &\text{for all }i= 1,\dots, p,\\ \lim_{k\to\infty} \lambda^k_i h_i(x^k)= 0\quad &\text{for all }i= 1,\dots, m.\end{aligned} \] Obviously, CAKKT are satisfied whenever the classical KKT conditions hold (take the constant sequences \(x^k=x^*\), \(\lambda^k=\lambda\), \(\mu^k=\mu\)). In the main theorems of the paper it is proved that
1. a minimizer of (P) fulfills CAKKT without any constraint qualification (see the worked example after this list),
2. if a constraint qualification is satisfied, then CAKKT are sufficient for KKT,
3. CAKKT are stronger than other known approximate KKT conditions,
4. in the convex-affine case CAKKT are sufficient for minimality.
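A standard one-dimensional illustration of statement 1 (not taken from the paper itself): minimize \(f(x)=x\) subject to \(g(x)=x^2\leq 0\). The only feasible point, hence the minimizer, is \(x^*=0\); since \(\nabla f(x^*)=1\) while \(\nabla g(x^*)=0\), no KKT multiplier exists. Nevertheless, with \(x^k=-1/k\) and \(\mu^k=k/2\) one gets \[ \nabla f(x^k)+\nabla g(x^k)\mu^k=1+\Bigl(-\frac{2}{k}\Bigr)\frac{k}{2}=0,\qquad \mu^k g(x^k)=\frac{k}{2}\cdot\frac{1}{k^2}=\frac{1}{2k}\to 0, \] so \(x^*\) fulfills CAKKT although the classical KKT conditions fail.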
It is pointed out that optimization methods (e.g. the augmented Lagrangian method, under suitable assumptions) which produce CAKKT sequences are more efficient.
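To make this remark concrete, the following minimal sketch (a toy illustration with assumed problem data, penalty schedule and tolerances, not the implementation analysed in the paper) runs a Powell-Hestenes-Rockafellar augmented Lagrangian loop on the example above and prints the two CAKKT residuals along the iterates.

```python
# A minimal numerical sketch (illustrative assumptions throughout, not the
# algorithm analysed in the paper): a Powell-Hestenes-Rockafellar
# augmented Lagrangian loop on the toy problem
#     minimize x   subject to   x^2 <= 0,
# printing the two CAKKT residuals along the iterates.
from scipy.optimize import minimize_scalar

f  = lambda x: x            # objective
g  = lambda x: x ** 2       # inequality constraint, g(x) <= 0
df = lambda x: 1.0          # f'(x)
dg = lambda x: 2.0 * x      # g'(x)

mu, rho = 1.0, 10.0         # initial multiplier and penalty parameter (assumed)
for k in range(10):
    # PHR augmented Lagrangian subproblem in x for fixed (mu, rho)
    L = lambda x: f(x) + (max(0.0, mu + rho * g(x)) ** 2 - mu ** 2) / (2.0 * rho)
    x = minimize_scalar(L, bounds=(-5.0, 5.0), method='bounded',
                        options={'xatol': 1e-12}).x
    mu = max(0.0, mu + rho * g(x))          # first-order multiplier update
    grad_res = abs(df(x) + mu * dg(x))      # |f'(x) + mu g'(x)|
    comp_res = abs(mu * g(x))               # |mu g(x)|
    print(f"k={k}: x={x: .3e}  mu={mu:.3e}  "
          f"grad_res={grad_res:.1e}  comp_res={comp_res:.1e}")
    rho *= 10.0                             # increase the penalty parameter
# Both residuals tend to zero while x -> 0, i.e. the iterates form a CAKKT
# sequence, even though no (bounded) KKT multiplier exists at x* = 0.
```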

MSC:
90C30 Nonlinear programming
49K99 Optimality conditions
65K05 Numerical mathematical programming methods