Strong convergence of projected subgradient methods for nonsmooth and nonstrictly convex minimization. (English) Zbl 1156.90426
Summary: We establish a strong convergence theorem regarding a regularized variant of the projected subgradient method for nonsmooth, nonstrictly convex minimization in real Hilbert spaces. Only one projection step is needed per iteration, and the stepsizes involved are controlled so that the algorithm is of practical interest. To this end, we develop new techniques of analysis that can be adapted to many other non-Fejérian methods.
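
As a rough illustration only (not the authors' exact scheme), the iteration described in the summary can be sketched as a regularized projected subgradient step x_{k+1} = P_C(x_k - alpha_k (g_k + beta_k x_k)), where g_k is a subgradient of the objective at x_k and P_C is the metric projection onto the feasible set. The stepsizes alpha_k, the regularization weights beta_k, and the helper names subgrad and project below are assumptions made for the sketch.

    import numpy as np

    def projected_subgradient(subgrad, project, x0, steps=5000,
                              alpha=lambda k: 1.0 / (k + 1),         # hypothetical diminishing stepsize
                              beta=lambda k: 1.0 / np.sqrt(k + 1)):  # hypothetical regularization weight
        """Sketch of a regularized projected subgradient iteration.

        subgrad(x) returns one subgradient of the (possibly nonsmooth) objective at x;
        project(x) is the metric projection onto the closed convex feasible set C.
        Only a single projection is performed per iteration.
        """
        x = np.asarray(x0, dtype=float)
        for k in range(steps):
            g = subgrad(x)
            # The Tikhonov-type term beta_k * x regularizes the iteration; the paper's
            # control of alpha_k and beta_k is what yields strong (norm) convergence.
            x = project(x - alpha(k) * (g + beta(k) * x))
        return x

    # Toy usage: minimize the l1 norm (nonsmooth, not strictly convex) over the unit ball.
    subgrad = lambda x: np.sign(x)                       # a subgradient of ||x||_1
    project = lambda x: x / max(1.0, np.linalg.norm(x))  # projection onto the Euclidean unit ball
    x_star = projected_subgradient(subgrad, project, x0=np.array([2.0, -3.0, 1.0]))
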
MSC:
90C25 Convex programming
90C30 Nonlinear programming
References:
[1]Alber, Y.I.: Recurrence relations and variational inequalities. Soviet Mathematics, Doklady 27, 511–517 (1983)
[2]Alber, Y.I., Iusem, A.N.: Extension of subgradient techniques for nonsmooth optimization in a Banach space. Set-Valued Anal. 9, 315–335 (2001) · Zbl 1049.90123 · doi:10.1023/A:1012665832688
[3]Alber, Y.I., Iusem, A.N., Solodov, M.V.: On the projected subgradient method for nonsmooth convex optimization in a Hilbert space. Math. Program. 81, 23–35 (1998)
[4]Bauschke, H.H., Combettes, P.L.: A weak-to-strong convergence principle for Fejér-monotone methods in Hilbert spaces. Math. Oper. Res. 26, 248–264 (2001) · Zbl 1082.65058 · doi:10.1287/moor.26.2.248.10558
[5]Bello, L., Raydan, M.: Preconditioned spectral projected-gradient method on convex sets. J. Comput. Math. 23, 225–232 (2005)
[6]Bertsekas, D.P., Gafni, E.M.: Projection methods for variational inequalities with applications to the traffic assignment problem. Math. Program. Stud. 17, 139–159 (1982)
[7]Bertsekas, D.P.: On the Goldstein–Levitin–Polyak gradient projection method. IEEE Trans. Automat. Contr. AC-21(2), 174–184 (1976) · Zbl 0326.49025 · doi:10.1109/TAC.1976.1101194
[8]Byrne, C.L.: A unified treatment of some iterative algorithms in signal processing and image reconstruction. Inverse Problems 18, 441–453 (2004) · Zbl 0996.65048 · doi:10.1088/0266-5611/18/2/310
[9]Clarke, F.H.: Optimization and Nonsmooth Analysis. SIAM Publications, Philadelphia (1983)
[10]Correa, R., Lemaréchal, C.: Convergence of some algorithms for convex minimization. Math. Program. 62, 261–275 (1993) · Zbl 0805.90083 · doi:10.1007/BF01585170
[11]Ekeland, I., Temam, R.: Convex Analysis and Variational Problems. Classics in Applied Mathematics, vol. 28. SIAM, Philadelphia (1999)
[12]Ermoliev, Y.M.: Methods for solving nonlinear extremal problems. Cybernet. 2, 1–17 (1966) · doi:10.1007/BF01071403
[13]Hager, W.W., Park, S.: The gradient projection method with exact line search. J. Glob. Optim. 30, 103–118 (2004) · Zbl 1136.90513 · doi:10.1023/B:JOGO.0000049118.13265.9b
[14]Halpern, B.: Fixed points of nonexpanding maps. Bull. Amer. Math. Soc. 73, 957–961 (1967) · Zbl 0177.19101 · doi:10.1090/S0002-9904-1967-11864-0
[15]Iusem, A.N.: On the convergence properties of the projected gradient method for convex optimization. Comput. Appl. Math. 22(1), 37–52 (2003)
[16]Khobotov, E.N.: A modification of the extragradient method for the solution of variational inequalities and some optimization problems. Zh. Vychisl. Mat. Mat. Fiz. 27, 1462–1473 (1987)
[17]Korpelevich, G.M.: The extragradient method for finding saddle points and other problems. Matecon 12, 747–756 (1976)
[18]Marcotte, P.: Applications of Khobotov’s algorithm to variational and network equilibrium problems. Inform. Syst. Oper. Res. 29, 258–270 (1991)
[19]Maingé, P.E., Moudafi, A.: Strong convergence of an iterative method for hierarchical fixed-point problems. Pacific J. Optim. 3(3), 529–538 (2007)
[20]Moudafi, A.: Viscosity approximation methods for fixed point problems. J. Math. Anal. Appl. 241, 46–55 (2000) · Zbl 0957.47039 · doi:10.1006/jmaa.1999.6615
[21]Nadezhkina, N., Takahashi, W.: Strong convergence theorem by a hybrid method for nonexpansive mappings and Lipschitz continuous monotone mappings. SIAM J. Optim. 16(4), 1230–1241 (2006) · Zbl 1143.47047 · doi:10.1137/050624315
[22]Solodov, M.V., Tseng, P.: Modified projection methods for monotone variational inequalities. SIAM J. Control Optim. 34(5), 1814–1834 (1996) · Zbl 0866.49018 · doi:10.1137/S0363012994268655
[23]Solodov, M.V., Zavriev, S.K.: Error stability properties of generalized gradient-type algorithms. J. Optim. Theory Appl. 98, 663–680 (1998) · Zbl 0913.90245 · doi:10.1023/A:1022680114518
[24]Solodov, M.V.: A new projection method for variational inequality problems. SIAM J. Control Optim. 37(3), 756–776 (1999) · Zbl 0959.49007 · doi:10.1137/S0363012997317475
[25]Xiu, N., Wang, C., Kong, L.: A note on the gradient projection method with exact stepsize rule. J. Comput. Math. 25(2), 221–230 (2007)
[26]Yamada, I., Ogura, N.: Hybrid steepest descent method for the variational inequality problem over the fixed point set of certain quasi-nonexpansive mappings. Numer. Funct. Anal. Optim. 25(7–8), 619–655 (2004) · Zbl 1095.47049 · doi:10.1081/NFA-200045815
[27]Zeng, L.C., Yao, J.C.: Strong convergence theorem by an extragradient method for fixed point problems and variational inequality problems. Taiwan. J. Math. 10(5), 1293–1303 (2006)