# zbMATH — the first resource for mathematics

Empirical risk minimization as parameter choice rule for general linear regularization methods. (English. French summary) Zbl 1439.62096
Summary: We consider the statistical inverse problem of recovering $$f$$ from noisy measurements $$Y=Tf+\sigma \xi$$, where $$\xi$$ is Gaussian white noise and $$T$$ is a compact operator between Hilbert spaces. For general reconstruction methods of the form $$\hat{f}_{\alpha}=q_{\alpha}(T^*T)T^*Y$$ with an ordered filter $$q_{\alpha}$$, we investigate the choice of the regularization parameter $$\alpha$$ by minimizing an unbiased estimate of the predictive risk $$\mathbb{E}[\Vert Tf-T\hat{f}_{\alpha}\Vert^2]$$. The corresponding parameter $$\alpha_{\text{pred}}$$ and its usage are well known in the literature, but oracle inequalities and optimality results in this general setting have been lacking. We prove a (generalized) oracle inequality relating the direct risk $$\mathbb{E}[\Vert f-\hat{f}_{\alpha_{\text{pred}}}\Vert^2]$$ to the oracle prediction risk $$\inf_{\alpha >0}\mathbb{E}[\Vert Tf-T\hat{f}_{\alpha}\Vert^2]$$. From this oracle inequality we conclude that the investigated parameter choice rule is of optimal order in the minimax sense.
Finally, we present numerical simulations that support the order optimality of the method and the quality of the parameter choice in finite-sample situations.
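To make the parameter choice rule concrete, here is a minimal numerical sketch in a diagonal (SVD) model, where $$T$$ acts by multiplication with singular values $$b_k$$ and the Tikhonov filter $$q_{\alpha}(\lambda)=1/(\lambda+\alpha)$$ is used. All concrete values (singular-value decay, signal, noise level, grid) are illustrative assumptions, not taken from the paper; the unbiased estimate of the prediction risk is the standard Mallows-type criterion $$\Vert Y-T\hat{f}_{\alpha}\Vert^2 + 2\sigma^2\operatorname{tr}(Tq_{\alpha}(T^*T)T^*) - n\sigma^2$$.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
b = np.arange(1, n + 1) ** -1.0        # assumed singular values of T (mildly ill-posed)
f = np.arange(1, n + 1) ** -1.5        # assumed smooth truth in SVD coordinates
sigma = 0.01                           # assumed noise level
y = b * f + sigma * rng.standard_normal(n)  # Y = Tf + sigma*xi, diagonalized

alphas = np.logspace(-8, 0, 200)       # candidate regularization parameters

def ure(alpha):
    """Unbiased estimate of the prediction risk E||Tf - T f_hat_alpha||^2."""
    w = b**2 / (b**2 + alpha)          # filter factors b_k^2 q_alpha(b_k^2)
    resid = np.sum(((1.0 - w) * y) ** 2)          # ||Y - T f_hat_alpha||^2
    return resid + 2.0 * sigma**2 * np.sum(w) - n * sigma**2

scores = np.array([ure(a) for a in alphas])
alpha_pred = alphas[np.argmin(scores)]            # empirical risk minimizer
f_hat = (b / (b**2 + alpha_pred)) * y             # q_alpha(T*T) T* Y in SVD coordinates
print("alpha_pred =", alpha_pred)
```

Minimizing `ure` over the grid mimics the rule analyzed in the paper: the criterion is unbiased for the prediction risk, and the oracle inequality transfers its near-optimality in prediction risk to the direct risk $$\mathbb{E}[\Vert f-\hat{f}_{\alpha_{\text{pred}}}\Vert^2]$$.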

##### MSC:
- 62G05 Nonparametric estimation
- 62G20 Asymptotic properties of nonparametric inference
- 65J22 Numerical solution to inverse problems in abstract spaces
- 65J20 Numerical solutions of ill-posed problems in abstract spaces; regularization