zbMATH — the first resource for mathematics

The directed subdifferential of DC functions. (English) Zbl 1222.49020
Leizarowitz, Arie (ed.) et al., Nonlinear analysis and optimization II. Optimization. A conference in celebration of Alex Ioffe’s 70th and Simeon Reich’s 60th birthdays, June 18–24, 2008, Haifa, Israel. Providence, RI: American Mathematical Society (AMS); Ramat-Gan: Bar-Ilan University (ISBN 978-0-8218-4835-7/pbk). Contemporary Mathematics 514; Israel Mathematical Conference Proceedings, 27-43 (2010).
Summary: The space of directed sets is a Banach space in which the convex compact subsets of \(\mathbb R^n\) are embedded. Each directed set is visualized as a (nonconvex) subset of \(\mathbb R^n\) consisting of a convex part, a concave part and a mixed-type part.
Following an idea of A. Rubinov, the directed subdifferential of a difference of convex (DC) functions is defined as the directed difference of the corresponding embedded convex subdifferentials. Its visualization is called the Rubinov subdifferential. The latter contains the Dini-Hadamard subdifferential as its convex part and the Dini-Hadamard superdifferential as its concave part, and its convex hull equals the Michel-Penot subdifferential. Hence, the Rubinov subdifferential in general yields fewer critical points than the Michel-Penot subdifferential, while the sharp necessary and sufficient optimality conditions in terms of the Dini-Hadamard subdifferential are recovered by the convex part of the directed subdifferential.
Furthermore, the directed subdifferential can distinguish between points that are candidates for a maximum and those that are candidates for a minimum. It also allows one to detect ascent and descent directions easily from its visualization. Seven of the eight axioms that A. Ioffe demanded of a subdifferential are satisfied, as is the sum rule with equality.
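The construction can be sketched in a simple one-dimensional case (the function and the point below are chosen here for illustration and are not taken from the paper; in \(\mathbb R^1\) the directed difference can be read off from support-function values at \(l = \pm 1\)):

```latex
% Illustrative DC example: f = g - h with g, h convex on R.
\[
  f = g - h, \qquad g(x) = |x|, \qquad h(x) = |x-1|,
\]
\[
  \partial g(0) = [-1,1], \qquad \partial h(0) = \{-1\}.
\]
% The directed difference subtracts support functions
% \delta^*(l,A) = \sup_{a \in A} \, l \cdot a componentwise:
\[
  \delta^*(1,\partial g(0)) - \delta^*(1,\partial h(0)) = 1 - (-1) = 2,
\]
\[
  \delta^*(-1,\partial g(0)) - \delta^*(-1,\partial h(0)) = 1 - 1 = 0.
\]
% These support values describe the interval [0,2], so the visualization
% (here purely convex) is [0,2]. This agrees with the Dini-Hadamard
% subdifferential of f at 0, since near 0 one has
% f(x) = 2x - 1 for x >= 0 and f(x) = -1 for x <= 0.
```

In higher dimensions the directed difference need not embed a convex set, and the visualization then splits into the convex, concave and mixed-type parts described above.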
For the entire collection see [Zbl 1193.00062].

49J52 Nonsmooth analysis
90C26 Nonconvex programming, global optimization
90C46 Optimality conditions and duality in mathematical programming
49J50 Fréchet and Gateaux differentiability in optimization