Directional second derivative of the regularized function that smoothes the min-max problem. (English) Zbl 0812.90135

Gomez, Susana (ed.) et al., Advances in optimization and numerical analysis. Proceedings of the 6th workshop on optimization and numerical analysis, Oaxaca, Mexico, January 1992. Dordrecht: Kluwer Academic Publishers. Math. Appl., Dordr. 275, 273-285 (1994).
In order to solve the non-differentiable min-max optimization problem \(\min_x \max_i f_i(x)\), \(i = 1,\dots,m\), or its equivalent form \(\min_x \sup_{u\in U} u^T f(x)\), where \(U\) is the convex set \[ U = \left\{ u\in \mathbb{R}^m \;\Big|\; \sum^m_{i=1} u_i = 1,\quad u_i \geq 0,\; i = 1,\dots,m \right\}, \] we have recently developed a new method, called the Regularization Method [see the authors, SIAM J. Numer. Anal. 27, No. 6, 1621-1634 (1990; Zbl 0717.49009), and in: Advances in Numer. Partial Diff. Equ. and Optim., Proc. 5th Mex.-US Workshop, Merida/Mex. 1989, 320-331 (1991; Zbl 0738.90068)], using differentiable approximations to the max function. The regularized function is defined as \[ \varphi_v(x) = \sup_{u\in U} \left( u^T f(x) - \tfrac{1}{2} \| u - v \|^2 \right). \] The proposed method generates a sequence \(\{x^k, v^k\}\) that converges to a stationary pair \((x^*, v^*)\) of the original problem. To prove convergence to a minimum of the original problem, second-order information about the regularized function is needed, but second derivatives of \(\varphi_v\) do not exist at every point.
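[Reviewer's illustration.] The regularized function can be evaluated in closed form: maximizing \(u^T f(x) - \tfrac{1}{2}\|u - v\|^2\) over \(U\) is, up to a constant, minimizing \(\|u - (v + f(x))\|^2\), so the maximizer is the Euclidean projection of \(v + f(x)\) onto the simplex \(U\). A minimal NumPy sketch of this evaluation (the helper `project_simplex` is a standard sort-based projection, not code from the paper under review):

```python
import numpy as np

def project_simplex(y):
    """Euclidean projection of y onto the unit simplex {u >= 0, sum(u) = 1}."""
    n = y.size
    s = np.sort(y)[::-1]            # sort in decreasing order
    css = np.cumsum(s)
    # largest index k (0-based) with s[k] + (1 - css[k]) / (k + 1) > 0
    k = np.nonzero(s + (1.0 - css) / np.arange(1, n + 1) > 0)[0][-1]
    theta = (1.0 - css[k]) / (k + 1.0)
    return np.maximum(y + theta, 0.0)

def phi(f_x, v):
    """Regularized max: sup_{u in U} (u^T f(x) - 0.5 * ||u - v||^2).

    The supremum is attained at u* = proj_U(v + f(x)); returns (value, u*).
    """
    u = project_simplex(v + f_x)
    return u @ f_x - 0.5 * np.dot(u - v, u - v), u
```

For \(f(x) = (1, 2)\) and \(v = (0.5, 0.5)\), the maximizer is \(u^* = (0, 1)\) and \(\varphi_v(x) = 2 - \tfrac{1}{2}\cdot\tfrac{1}{2} = 1.75\), which matches a direct one-dimensional maximization over the simplex.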
In this paper we show that the regularized function has second directional derivatives in all directions and derive their explicit expression. In a paper in preparation, this allows us to introduce optimality conditions for the regularized function and a penalty parameter that guarantees convergence to a minimum of the original problem.
For the entire collection see [Zbl 0791.00028].


90C30 Nonlinear programming
49J35 Existence of solutions for minimax problems