zbMATH — the first resource for mathematics

Computation of sparse low degree interpolating polynomials and their application to derivative-free optimization. (English) Zbl 1254.65072

Summary: Interpolation-based trust-region methods are an important class of algorithms for derivative-free optimization which rely on locally approximating an objective function by quadratic polynomial interpolation models, frequently built from fewer points than there are basis components. Often, in practical applications, the contribution of the problem variables to the objective function is such that many pairwise correlations between variables are negligible, implying, in the smooth case, a sparse structure in the Hessian matrix. To be able to exploit Hessian sparsity, existing optimization approaches require knowledge of the sparsity structure.
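
For illustration, a quadratic interpolation model in two variables can be built from sample points by solving a linear system in the monomial basis. This minimal sketch (the sample set and test function are hypothetical, not from the paper) uses NumPy's least-squares solver, which also returns the minimum-norm solution when fewer points than basis components are available:

```python
import numpy as np

def quad_basis(x):
    # Monomial basis of a 2-variable quadratic model
    # m(x) = c + g.x + 0.5 x'Hx, with H parameterised by (h11, h12, h22).
    return np.array([1.0, x[0], x[1], 0.5 * x[0]**2, x[0] * x[1], 0.5 * x[1]**2])

# Six sample points (as many as basis components -> determined system).
Y = np.array([[0., 0.], [1., 0.], [0., 1.], [1., 1.], [-1., 0.], [0., -1.]])
f = Y[:, 0]**2 + 3.0 * Y[:, 1]          # f(x) = x1^2 + 3*x2, itself a quadratic

M = np.array([quad_basis(y) for y in Y])            # interpolation matrix
coef, *_ = np.linalg.lstsq(M, f, rcond=None)        # [c, g1, g2, h11, h12, h22]
print("model coefficients:", coef)
```

Since the test function is itself a quadratic, the model reproduces it exactly: the recovered coefficients are c = 0, g = (0, 3), and Hessian entries (2, 0, 0).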

The goal of this paper is to develop and analyze a method where the sparse models are constructed automatically. The sparse recovery theory developed recently in the field of compressed sensing characterizes conditions under which a sparse vector can be accurately recovered from few random measurements. Such a recovery is achieved by minimizing the ℓ1-norm of a vector subject to the measurement constraints. We suggest an approach for building sparse quadratic polynomial interpolation models by minimizing the ℓ1-norm of the entries of the model Hessian subject to the interpolation conditions. We show that this procedure recovers accurate models when the function Hessian is sparse, using relatively few randomly selected sample points.
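
The ℓ1-minimization above can be cast as a linear program by splitting each Hessian entry against an auxiliary bound variable. The sketch below is a toy instance (hypothetical objective f(x) = x1², whose Hessian has a single nonzero entry, and a hypothetical 5-point sample set, i.e. fewer points than the 6 quadratic basis components) solved with SciPy's linprog; for simplicity it counts each distinct entry h11, h12, h22 once in the ℓ1-norm, which may differ from the paper's exact formulation:

```python
import numpy as np
from scipy.optimize import linprog

# Five sample points in R^2 (fewer than the 6 quadratic basis components).
Y = np.array([[0., 0.], [1., 0.], [0., 1.], [1., 1.], [-1., 0.]])
f = Y[:, 0] ** 2            # f(x) = x1^2, sparse Hessian H = [[2, 0], [0, 0]]

def quad_basis(x):
    # phi(x) for m(x) = c + g.x + 0.5 x'Hx, H parameterised by (h11, h12, h22)
    return np.array([1.0, x[0], x[1], 0.5 * x[0]**2, x[0] * x[1], 0.5 * x[1]**2])

M = np.array([quad_basis(y) for y in Y])            # 5 x 6 interpolation matrix

# LP variables z = [c, g1, g2, h11, h12, h22, t11, t12, t22]:
# minimize t11 + t12 + t22  subject to  |h_k| <= t_k  and  M z[:6] = f.
n_mod, n_t = 6, 3
c_obj = np.concatenate([np.zeros(n_mod), np.ones(n_t)])
A_eq = np.hstack([M, np.zeros((len(Y), n_t))])
A_ub = np.zeros((2 * n_t, n_mod + n_t))
for k in range(n_t):
    A_ub[2 * k, 3 + k], A_ub[2 * k, 6 + k] = 1.0, -1.0        #  h_k - t_k <= 0
    A_ub[2 * k + 1, 3 + k], A_ub[2 * k + 1, 6 + k] = -1.0, -1.0  # -h_k - t_k <= 0
bounds = [(None, None)] * n_mod + [(0, None)] * n_t           # h free, t >= 0

res = linprog(c_obj, A_ub=A_ub, b_ub=np.zeros(2 * n_t),
              A_eq=A_eq, b_eq=f, bounds=bounds, method="highs")
h = res.x[3:6]
H = np.array([[h[0], h[1]], [h[1], h[2]]])
print("recovered Hessian:\n", H)
```

Although the interpolation system is underdetermined, the ℓ1 objective selects the sparse Hessian [[2, 0], [0, 0]] among all models matching the five function values.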

Motivated by this result, we developed a practical interpolation-based trust-region method using deterministic sample sets and minimum ℓ1-norm quadratic models. Our computational results show that the new approach exhibits promising numerical performance both in the general case and in the sparse one.
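
Each iteration of such a trust-region method minimizes the current model within a ball of radius Δ around the iterate. A standard, inexpensive approximation to that subproblem is the Cauchy point, the model minimizer along the steepest-descent direction clipped to the trust region; the sketch below is a generic building block of trust-region methods, not the paper's actual subproblem solver:

```python
import numpy as np

def cauchy_point(g, H, delta):
    """Minimize the model m(s) = g.s + 0.5 s'Hs along -g within ||s|| <= delta."""
    gnorm = np.linalg.norm(g)
    if gnorm == 0.0:
        return np.zeros_like(g)                 # model is stationary at s = 0
    t_max = delta / gnorm                       # step length to the boundary
    gHg = g @ H @ g
    if gHg <= 0.0:
        t = t_max                               # nonpositive curvature: go to the boundary
    else:
        t = min(gnorm ** 2 / gHg, t_max)        # 1-D minimizer, clipped to the region
    return -t * g

# Convex model: interior minimizer along -g.
s1 = cauchy_point(np.array([1.0, 0.0]), np.eye(2), delta=2.0)
# Negative curvature: the step runs to the trust-region boundary.
s2 = cauchy_point(np.array([1.0, 0.0]), -np.eye(2), delta=2.0)
print(s1, s2)
```

For the convex model the step stops at the unconstrained 1-D minimizer s = (-1, 0); with negative curvature it extends to the boundary, s = (-2, 0).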

MSC:
65K05 Mathematical programming (numerical methods)
90C51 Interior-point methods
90C30 Nonlinear programming
90C56 Derivative-free methods; methods using generalized derivatives