A preconditioning proximal Newton method for nondifferentiable convex optimization. (English) Zbl 0871.90065
Summary: We propose a proximal Newton method for solving nondifferentiable convex optimization problems. This method combines the generalized Newton method with Rockafellar’s proximal point algorithm. At each step, the proximal point is found approximately, and the regularization matrix is preconditioned to overcome the inexactness of this approximation. We show that such preconditioning is possible within some accuracy and that the second-order differentiability properties of the Moreau-Yosida regularization are invariant with respect to this preconditioning. Based on these results, superlinear convergence is established under a semismoothness condition.
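For orientation, here is a brief sketch of the standard construction behind such methods (textbook background and one reading of the summary, not material taken from the paper itself). The Moreau-Yosida regularization of a proper closed convex function \(f\) with parameter \(\lambda > 0\) is
\[
F_\lambda(x) \;=\; \min_{y} \Big\{ f(y) + \tfrac{1}{2\lambda}\,\|y - x\|^2 \Big\},
\]
whose unique minimizer \(p_\lambda(x)\) is the proximal point of \(x\). The function \(F_\lambda\) is convex and continuously differentiable, has the same minimizers as \(f\), and satisfies \(\nabla F_\lambda(x) = \lambda^{-1}\big(x - p_\lambda(x)\big)\). A proximal Newton iteration of the kind described above applies a generalized (semismooth) Newton step to the smooth equation \(\nabla F_\lambda(x) = 0\),
\[
x^{k+1} \;=\; x^{k} - V_k^{-1}\,\nabla F_\lambda(x^{k}), \qquad V_k \in \partial_B \nabla F_\lambda(x^{k}),
\]
where \(\partial_B\) denotes the B-subdifferential (generalized Jacobian). In practice \(p_\lambda(x^{k})\) can only be computed approximately; in the setting of the summary, the scalar \(1/\lambda\) is presumably replaced by a regularization matrix, and it is this matrix that is preconditioned to cope with the inexact proximal computation.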
MSC:
90C25 Convex programming
49J52 Nonsmooth analysis (other weak concepts of optimality)
References:
[1] X. Chen, L. Qi and R. Womersley, Newton’s method for quadratic stochastic programs with recourse, Journal of Computational and Applied Mathematics 60 (1995) 29–46. · Zbl 0836.65078 · doi:10.1016/0377-0427(94)00082-C
[2] F.H. Clarke, Optimization and Nonsmooth Analysis (John Wiley, New York, 1983).
[3] J.E. Dennis, Jr. and R.B. Schnabel, Numerical Methods for Unconstrained Optimization and Nonlinear Equations (Prentice-Hall, Englewood Cliffs, NJ, 1983).
[4] F. Facchinei, Minimization of SC1 functions and the Maratos effect, Operations Research Letters 17 (1995) 131–137. · Zbl 0843.90108 · doi:10.1016/0167-6377(94)00059-F
[5] M. Fukushima, A descent algorithm for nonsmooth convex optimization, Mathematical Programming 30 (1984) 163–175. · Zbl 0545.90082 · doi:10.1007/BF02591883
[6] M. Fukushima and L. Qi, A globally and superlinearly convergent algorithm for nonsmooth convex minimization, SIAM Journal on Optimization 6 (1996) 1106–1120. · Zbl 0868.90109 · doi:10.1137/S1052623494278839
[7] J. Hiriart-Urruty and C. Lemaréchal, Convex Analysis and Minimization Algorithms (Springer, Berlin, 1993).
[8] C. Lemaréchal and C. Sagastizábal, Practical aspects of the Moreau-Yosida regularization: theoretical preliminaries, SIAM Journal on Optimization, to appear.
[9] R. Mifflin, A quasi-second-order proximal bundle algorithm, Mathematical Programming 73 (1996) 51–72.
[10] J.S. Pang and L. Qi, A globally convergent Newton method for convex SC1 minimization problems, Journal of Optimization Theory and Applications 85 (1995) 633–648. · Zbl 0831.90095 · doi:10.1007/BF02193060
[11] R. Poliquin and R.T. Rockafellar, Second-order nonsmooth analysis, in: D. Du, L. Qi and R. Womersley, eds., Recent Advances in Nonsmooth Optimization (World Scientific, Singapore, 1995) 322–350.
[12] R. Poliquin and R.T. Rockafellar, Generalized Hessian properties of regularized nonsmooth functions, SIAM Journal on Optimization, to appear.
[13] L. Qi, Convergence analysis of some algorithms for solving nonsmooth equations, Mathematics of Operations Research 18 (1993) 227–244. · Zbl 0776.65037 · doi:10.1287/moor.18.1.227
[14] L. Qi, Superlinearly convergent approximate Newton methods for LC1 optimization problems, Mathematical Programming 64 (1994) 277–294. · Zbl 0820.90102 · doi:10.1007/BF01582577
[15] L. Qi, Second-order analysis of the Moreau-Yosida regularization of a convex function, revised version, Applied Mathematics Report, Department of Applied Mathematics, The University of New South Wales (Sydney, 1995).
[16] L. Qi and J. Sun, A nonsmooth version of Newton’s method, Mathematical Programming 58 (1993) 353–368. · Zbl 0780.90090 · doi:10.1007/BF01581275
[17] L. Qi and R. Womersley, An SQP algorithm for extended linear-quadratic problems in stochastic programming, Annals of Operations Research 56 (1995) 251–285. · Zbl 0835.90058 · doi:10.1007/BF02031711
[18] R.T. Rockafellar, Convex Analysis (Princeton University Press, Princeton, NJ, 1970).
[19] R.T. Rockafellar, Augmented Lagrangians and applications of the proximal point algorithm in convex programming, Mathematics of Operations Research 1 (1976) 97–116. · Zbl 0402.90076 · doi:10.1287/moor.1.2.97
[20] R.T. Rockafellar, Maximal monotone relations and the second derivatives of nonsmooth functions, Ann. Inst. H. Poincaré: Analyse Non Linéaire 2 (1985) 167–184.