zbMATH — the first resource for mathematics

Needles and straw in haystacks: Empirical Bayes estimates of possibly sparse sequences. (English) Zbl 1047.62008
Summary: An empirical Bayes approach to the estimation of possibly sparse sequences observed in Gaussian white noise is set out and investigated. The prior considered is a mixture of an atom of probability at zero and a heavy-tailed density $$\gamma$$, with the mixing weight chosen by marginal maximum likelihood, in the hope of adapting between sparse and dense sequences. If estimation is then carried out using the posterior median, this is a random thresholding procedure. Other thresholding rules employing the same threshold can also be used. Probability bounds on the threshold chosen by the marginal maximum likelihood approach lead to overall risk bounds over classes of signal sequences of length $$n$$, allowing for sparsity of various kinds and degrees.
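The prior and the marginal maximum likelihood step described above can be sketched numerically. The sketch below takes $$\gamma$$ to be a Laplace$$(a)$$ density, one concrete heavy-tailed choice compatible with the method; the function names, the fixed scale $$a=0.5$$, and the unconstrained optimization over the weight are illustrative assumptions, not the paper's implementation.

```python
# Sketch of marginal maximum likelihood (MML) selection of the mixing weight w
# in the prior (1-w)*delta_0 + w*gamma, for observations x_i ~ N(mu_i, 1).
# gamma is taken to be Laplace(a) here; names and constants are illustrative.
import numpy as np
from scipy.stats import norm
from scipy.special import logsumexp
from scipy.optimize import minimize_scalar

def log_conv_laplace_normal(x, a):
    """log of g = gamma * phi: Laplace(a) prior convolved with N(0,1) noise."""
    x = np.asarray(x, dtype=float)
    # Stable evaluation of (a/2) e^{a^2/2} [e^{-ax} Phi(x-a) + e^{ax} Phi(-x-a)]
    terms = np.stack([-a * x + norm.logcdf(x - a),
                       a * x + norm.logcdf(-x - a)])
    return np.log(a / 2.0) + a**2 / 2.0 + logsumexp(terms, axis=0)

def mml_weight(x, a=0.5):
    """Maximize the marginal log-likelihood sum_i log[(1-w) phi(x_i) + w g(x_i)]."""
    log_phi = norm.logpdf(x)
    log_g = log_conv_laplace_normal(x, a)
    def neg_loglik(w):
        return -np.sum(logsumexp(
            np.stack([np.log1p(-w) + log_phi, np.log(w) + log_g]), axis=0))
    res = minimize_scalar(neg_loglik, bounds=(1e-6, 1 - 1e-6), method="bounded")
    return res.x
```

On a sparse sequence (a few large means among many zeros) the fitted weight comes out small, which is the mechanism by which the procedure adapts between sparse and dense signals.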
The signal classes considered are “nearly black” sequences where only a proportion $$\eta$$ is allowed to be nonzero, and sequences with normalized $$\ell_p$$ norm bounded by $$\eta$$, for $$\eta>0$$ and $$0<p\leq 2$$. Estimation error is measured by mean $$q$$th power loss, for $$0<q\leq 2$$. For all the classes considered, and for all $$q \in (0,2]$$, the method achieves the optimal estimation rate as $$n\to\infty$$ and $$\eta\to 0$$ at various rates, and in this sense adapts automatically to the sparseness or otherwise of the underlying signal. In addition, the risk is uniformly bounded over all signals. If the posterior mean is used as the estimator, the results still hold for $$q>1$$. Simulations show excellent performance. For appropriately chosen functions $$\gamma$$, the method is computationally tractable and software is available. The extension to a modified thresholding method relevant to the estimation of very sparse sequences is also considered.
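The thresholding character of the posterior median can be illustrated with a small numerical sketch. Again a Laplace$$(a)$$ density stands in for $$\gamma$$, and the brute-force grid evaluation of the posterior CDF is an illustrative device, not the paper's algorithm: the point is only that the estimate returns exactly zero whenever the atom at zero straddles the posterior half-mass.

```python
# Sketch: posterior median of mu given x ~ N(mu, 1) under the spike-and-Laplace
# prior (1-w)*delta_0 + w*Laplace(a). Grid-based and illustrative only.
import numpy as np
from scipy.stats import norm

def posterior_median(x, w, a=0.5):
    # Marginal density of the nonzero component: Laplace(a) convolved with N(0,1).
    g = (a / 2) * np.exp(a**2 / 2) * (np.exp(-a * x) * norm.cdf(x - a)
                                      + np.exp(a * x) * norm.cdf(-x - a))
    w1 = w * g / ((1 - w) * norm.pdf(x) + w * g)   # posterior P(mu != 0 | x)
    # CDF of the nonzero posterior component, tabulated on a fine grid.
    grid = np.linspace(-20.0, 20.0, 40001)         # step 0.001, includes 0
    dens = np.exp(-a * np.abs(grid)) * norm.pdf(x - grid)
    cdf1 = np.cumsum(dens)
    cdf1 /= cdf1[-1]
    # The full posterior has an atom of mass (1 - w1) at 0; the median is
    # exactly 0 when that atom straddles 1/2 -- the thresholding behaviour.
    F_left_of_0 = w1 * cdf1[grid < 0][-1]
    if F_left_of_0 < 0.5 <= F_left_of_0 + (1 - w1):
        return 0.0
    F = w1 * cdf1 + (1 - w1) * (grid >= 0)
    return float(grid[np.searchsorted(F, 0.5)])
```

With a small weight, a moderate observation such as $$x=1$$ is thresholded to exactly zero, while a large observation is shrunk only slightly (for the Laplace tail, roughly by the scale $$a$$), which is the bounded-shrinkage property that makes heavy-tailed choices of $$\gamma$$ attractive.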

MSC:
 62C12 Empirical decision procedures; empirical Bayes procedures
 62G05 Nonparametric estimation
 62G08 Nonparametric regression and quantile regression
Software:
EbayesThresh
References:
 [1] Abramovich, F., Benjamini, Y., Donoho, D. L. and Johnstone, I. M. (2000). Adapting to unknown sparsity by controlling the false discovery rate. Technical Report 2000-19, Dept. Statistics, Stanford Univ. · Zbl 1092.62005
 [2] Abramovich, F. and Silverman, B. W. (1998). Wavelet decomposition approaches to statistical inverse problems. Biometrika 85 115–129. · Zbl 0908.62095 · doi:10.1093/biomet/85.1.115
 [3] Benjamini, Y. and Hochberg, Y. (1995). Controlling the false discovery rate: A practical and powerful approach to multiple testing. J. Roy. Statist. Soc. Ser. B 57 289–300. · Zbl 0809.62014
 [4] Brown, L. D., Johnstone, I. M. and MacGibbon, K. B. (1981). Variation diminishing transformations: A direct approach to total positivity and its statistical applications. J. Amer. Statist. Assoc. 76 824–832. · Zbl 0481.62021 · doi:10.2307/2287577
 [5] Bruce, A. and Gao, H.-Y. (1996). Applied Wavelet Analysis with S-PLUS. Springer, New York. · Zbl 0857.65147
 [6] Cai, T. T. (2002). On block thresholding in wavelet regression: Adaptivity, block size, and threshold level. Statist. Sinica 12 1241–1273. · Zbl 1004.62036
 [7] Cai, T. T. and Silverman, B. W. (2001). Incorporating information on neighboring coefficients into wavelet estimation. Sankhyā Ser. B 63 127–148. · Zbl 1192.42020
 [8] Donoho, D. L. and Johnstone, I. M. (1994). Minimax risk over $$\ell_p$$-balls for $$\ell_q$$-error. Probab. Theory Related Fields 99 277–303. · Zbl 0802.62006 · doi:10.1007/BF01199026
 [9] Donoho, D. L. and Johnstone, I. M. (1995). Adapting to unknown smoothness via wavelet shrinkage. J. Amer. Statist. Assoc. 90 1200–1224. · Zbl 0869.62024 · doi:10.2307/2291512
 [10] Donoho, D. L., Johnstone, I. M., Hoch, J. C. and Stern, A. S. (1992). Maximum entropy and the nearly black object (with discussion). J. Roy. Statist. Soc. Ser. B 54 41–81. · Zbl 0788.62103
 [11] George, E. I. and Foster, D. P. (1998). Empirical Bayes variable selection. In Proc. Workshop on Model Selection, Special Issue of Rassegna di Metodi Statistici ed Applicazioni (W. Racugno, ed.) 79–108. Pitagora Editrice, Bologna.
 [12] George, E. I. and Foster, D. P. (2000). Calibration and empirical Bayes variable selection. Biometrika 87 731–747. · Zbl 1029.62008 · doi:10.1093/biomet/87.4.731
 [13] Johnstone, I. M. and Silverman, B. W. (2003). EbayesThresh: R and S-PLUS software for empirical Bayes thresholding. Available at www.stats.ox.ac.uk/~silverma/ebayesthresh.
 [14] Johnstone, I. M. and Silverman, B. W. (2004). Empirical Bayes selection of wavelet thresholds. Ann. Statist. · Zbl 1078.62005 · doi:10.1214/009053605000000345
 [15] Karlin, S. (1968). Total Positivity 1. Stanford Univ. Press, Stanford, CA. · Zbl 0219.47030
 [16] Pollard, D. (1984). Convergence of Stochastic Processes. Springer, New York. · Zbl 0544.60045
 [17] Stein, C. (1981). Estimation of the mean of a multivariate normal distribution. Ann. Statist. 9 1135–1151. · Zbl 0476.62035 · doi:10.1214/aos/1176345632
 [18] Zhang, C.-H. (2004). General empirical Bayes wavelet methods and exactly adaptive minimax estimation. Ann. Statist. · Zbl 1064.62009 · doi:10.1214/009053604000000995
This reference list is based on information provided by the publisher or from digital mathematics libraries. Its items are heuristically matched to zbMATH identifiers and may contain data conversion errors. It attempts to reflect the references listed in the original paper as accurately as possible without claiming the completeness or perfect precision of the matching.