zbMATH — the first resource for mathematics

Robust estimation of derivatives using locally weighted least absolute deviation regression. (English) Zbl 07064040
Summary: In nonparametric regression, derivative estimation has attracted much attention in recent years because of its wide range of applications. In this paper, we propose a new method for derivative estimation based on locally weighted least absolute deviation regression. Unlike local polynomial regression, the proposed method does not require a finite variance for the error term and is therefore robust to heavy-tailed errors. At the same time, in contrast to local median regression, it requires neither a zero median nor a positive density at zero for the error term. We further show that the proposed estimator with random differences is asymptotically equivalent to the (infinitely) composite quantile regression estimator; in other words, running one regression is equivalent to combining infinitely many quantile regressions. In addition, the proposed method is extended to estimate derivatives at the boundaries and to estimate higher-order derivatives. For the equidistant design, we derive theoretical results for the proposed estimators, including the asymptotic bias and variance, consistency, and asymptotic normality. Finally, simulation studies demonstrate that the proposed method outperforms existing methods in the presence of outliers and heavy-tailed errors, and an analysis of Chinese house price data from the past ten years illustrates its usefulness.
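The central idea in the summary, combining symmetric difference quotients by a least-absolute-deviation fit rather than weighted least squares, can be illustrated with a minimal sketch. Since a weighted LAD fit of a constant is minimized by the weighted median, the local estimate reduces to a weighted median of difference quotients. The function names, the j² weights, and the toy data below are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def weighted_median(values, weights):
    """Weighted median: a minimizer of sum_j w_j * |v_j - theta|."""
    order = np.argsort(values)
    v, w = values[order], weights[order]
    cum = np.cumsum(w)
    # first sorted value whose cumulative weight reaches half the total
    idx = np.searchsorted(cum, 0.5 * cum[-1])
    return v[idx]

def lowlad_first_derivative(x, y, i, k):
    """Estimate m'(x[i]) at an interior point i of an equidistant grid.

    Each symmetric difference quotient (y[i+j] - y[i-j]) / (x[i+j] - x[i-j])
    is a noisy proxy for m'(x[i]); combining them by a weighted median is
    the LAD analogue of a weighted least-squares combination.
    """
    j = np.arange(1, k + 1)
    quotients = (y[i + j] - y[i - j]) / (x[i + j] - x[i - j])
    weights = j.astype(float) ** 2  # assumed weights; larger lags damp the noise more
    return weighted_median(quotients, weights)

# Toy example: m(x) = sin(x) with heavy-tailed (Cauchy) noise,
# for which a least-squares combination has no finite variance.
rng = np.random.default_rng(0)
n = 501
x = np.linspace(0.0, 2.0 * np.pi, n)
y = np.sin(x) + 0.1 * rng.standard_cauchy(n)
i = n // 2  # x[i] = pi; the true derivative there is cos(pi) = -1
est = lowlad_first_derivative(x, y, i, k=30)
```

Because the median discards the magnitude of extreme quotients, a handful of gross outliers leaves the estimate essentially unchanged, which is the robustness property the summary emphasizes.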

MSC:
68T05 Learning and adaptive systems in artificial intelligence
Software:
L1pack; pspline; locpol
References:
[1] G. Boente and D. Rodriguez. Robust estimators of high order derivatives of regression functions. Statistics & Probability Letters, 76(13):1335-1344, 2006. · Zbl 1094.62049
[2] L.D. Brown and M. Levine. Variance estimation in nonparametric regression via the difference sequence method. The Annals of Statistics, 35(5):2219-2232, 2007. · Zbl 1126.62024
[3] J.L.O. Cabrera. locpol: Kernel local polynomial regression. R package version 0.6-0, 2012. URL http://mirrors.ustc.edu.cn/CRAN/web/packages/locpol/index.html.
[4] R. Charnigo, M. Francoeur, M.P. Mengüç, A. Brock, M. Leichter, and C. Srinivasan. Derivatives of scattering profiles: tools for nanoparticle characterization. Journal of the Optical Society of America, 24(9):2578-2589, 2007.
[5] R. Charnigo, M. Francoeur, M.P. Mengüç, B. Hall, and C. Srinivasan. Estimating quantitative features of nanoparticles using multiple derivatives of scattering profiles. Journal of Quantitative Spectroscopy and Radiative Transfer, 112(8):1369-1382, 2011a.
[6] R. Charnigo, B. Hall, and C. Srinivasan. A generalized Cp criterion for derivative estimation. Technometrics, 53(3):238-253, 2011b.
[7] P. Chaudhuri and J.S. Marron. SiZer for exploration of structures in curves. Journal of the American Statistical Association, 94(447):807-823, 1999. · Zbl 1072.62556
[8] W.S. Cleveland. Robust locally weighted regression and smoothing scatterplots. Journal of the American Statistical Association, 74(368):829-836, 1979. · Zbl 0423.62029
[9] K. De Brabanter, J. De Brabanter, B. De Moor, and I. Gijbels. Derivative estimation with local polynomial fitting. Journal of Machine Learning Research, 14(1):281-301, 2013. · Zbl 1320.62088
[10] M. Delecroix and A.C. Rosa. Nonparametric estimation of a regression function and its derivatives under an ergodic hypothesis. Journal of Nonparametric Statistics, 6(4):367-382, 1996. · Zbl 0879.62034
[11] N.R. Draper and H. Smith. Applied Regression Analysis. Wiley and Sons, New York, 2nd edition, 1981. · Zbl 0548.62046
[12] J. Fan and I. Gijbels. Local Polynomial Modelling and Its Applications. Chapman & Hall, London, 1996. · Zbl 0873.62037
[13] J. Fan and P. Hall. On curve estimation by minimizing mean absolute deviation and its implications. The Annals of Statistics, 22(2):867-885, 1994. · Zbl 0806.62030
[14] S. Ghosal, A. Sen, and A.W. van der Vaart. Testing monotonicity of regression. The Annals of Statistics, 28(4):1054-1082, 2000. · Zbl 1105.62337
[15] I. Gijbels and A.C. Goderniaux. Data-driven discontinuity detection in derivatives of a regression function. Communications in Statistics - Theory and Methods, 33(4):851-871, 2005. · Zbl 1218.62031
[16] B. Hall. Nonparametric Estimation of Derivatives with Applications. PhD thesis, University of Kentucky, Lexington, Kentucky, 2010.
[17] W. Härdle. Applied Nonparametric Regression. Cambridge University Press, Cambridge, 1990. · Zbl 0714.62030
[18] W. Härdle and T. Gasser. Robust non-parametric function fitting. Journal of the Royal Statistical Society, Series B, 46(1):42-51, 1984. · Zbl 0543.62034
[19] W. Härdle and T. Gasser. On robust kernel estimation of derivatives of regression functions. Scandinavian Journal of Statistics, 12(3):233-240, 1985. · Zbl 0568.62041
[20] P.J. Huber and E.M. Ronchetti. Robust Statistics. John Wiley & Sons, Inc., 2009. · Zbl 1276.62022
[21] B. Kai, R. Li, and H. Zou. Local composite quantile regression smoothing: an efficient and safe alternative to local polynomial regression. Journal of the Royal Statistical Society, Series B, 72(1):49-69, 2010. · Zbl 1411.62101
[22] G.D. Knott. Interpolating Cubic Splines. Springer, 1st edition, 2000. · Zbl 1057.41001
[23] R. Koenker. A note on L-estimation for linear models. Statistics & Probability Letters, 2(6):323-325, 1984. · Zbl 0626.62029
[24] R. Koenker. Quantile Regression. Cambridge University Press, New York, 2005. · Zbl 1111.62037
[25] R. Koenker and G. Bassett. Regression quantiles. Econometrica, 46(1):33-50, 1978. · Zbl 0373.62038
[26] X.R. Li and V.P. Jilkov. Survey of maneuvering target tracking. Part I: Dynamic models. IEEE Transactions on Aerospace and Electronic Systems, 39(4):1333-1364, 2003.
[27] X.R. Li and V.P. Jilkov. Survey of maneuvering target tracking. Part II: Motion models of ballistic and space targets. IEEE Transactions on Aerospace and Electronic Systems, 46(1):96-119, 2010.
[28] Y. Liu and K. De Brabanter. Derivative estimation in random design. 32nd Conference on Neural Information Processing Systems (NeurIPS 2018), Montréal, Canada, 2018.
[29] I. Matyasovszky. Detecting abrupt climate changes on different time scales. Theoretical and Applied Climatology, 105(3-4):445-454, 2011.
[30] H.G. Müller. Nonparametric Regression Analysis of Longitudinal Data. Springer, New York, 1988.
[31] H.G. Müller, U. Stadtmüller, and T. Schmitt. Bandwidth choice and confidence intervals for derivatives of noisy data. Biometrika, 74(4):743-749, 1987.
[32] J. Newell, J. Einbeck, N. Madden, and K. McMillan. Model free endurance markers based on the second derivative of blood lactate curves. In Proceedings of the 20th International Workshop on Statistical Modelling, pages 357-364, Sydney, 2005.
[33] F. Osorio. L1pack: Routines for L1 estimation. R package version 0.3, 2015. URL http://www.ies.ucv.cl/l1pack/.
[34] A. Pakes and D. Pollard. Simulation and the asymptotics of optimization estimators. Econometrica, 57(5):1027-1057, 1989. · Zbl 0698.62031
[35] C. Park and K.H. Kang. SiZer analysis for the comparison of regression curves. Computational Statistics & Data Analysis, 52(8):3954-3970, 2008. · Zbl 1452.62291
[36] J. Porter and P. Yu. Regression discontinuity designs with unknown discontinuity points: Testing and estimation. Journal of Econometrics, 189(1):132-147, 2015. · Zbl 1337.62082
[37] J. Ramsay and B. Ripley. pspline: Penalized smoothing splines. R package version 1.0-16, 2013. URL http://mirrors.ustc.edu.cn/CRAN/web/packages/pspline/index.html.
[38] J.O. Ramsay and B.W. Silverman. Applied Functional Data Analysis: Methods and Case Studies. Springer, New York, 2002. · Zbl 1011.62002
[39] D. Ruppert and M.P. Wand. Multivariate locally weighted least squares regression. The Annals of Statistics, 22(3):1346-1370, 1994. · Zbl 0821.62020
[40] C.J. Stone. Additive regression and other nonparametric models. The Annals of Statistics, 13(2):689-705, 1985. · Zbl 0605.62065
[41] P.S. Swain, K. Stevenson, A. Leary, L.F. Montano-Gutierrez, I.B.N. Clark, J. Vogel, and T. Pilizota. Inferring time derivatives including cell growth rates using Gaussian processes. Nature Communications, 7:13766, 2016.
[42] G. Wahba and Y. Wang. When is the optimal regularization parameter insensitive to the choice of the loss function? Communications in Statistics - Theory and Methods, 19(5):1685-1700, 1990. · Zbl 0724.62044
[43] F.T. Wang and D.W. Scott. The L1 method for robust nonparametric regression. Journal of the American Statistical Association, 89(425):65-76, 1994. · Zbl 0791.62044
[44] W.W. Wang and L. Lin. Derivative estimation based on difference sequence via locally weighted least squares regression. Journal of Machine Learning Research, 16:2617-2641, 2015. · Zbl 1351.62095
[45] W.W. Wang and P. Yu. Asymptotically optimal differenced estimators of error variance in nonparametric regression. Computational Statistics & Data Analysis, 105:125-143, 2017. · Zbl 1466.62212
[46] A.H. Welsh. Robust estimation of smooth regression and spread functions and their derivatives. Statistica Sinica, 6(2):347-366, 1996. · Zbl 0884.62047
[47] Z. Zhao and Z. Xiao. Efficient regression via optimally combining quantile information. Econometric Theory, 30(6):1272-1314, 2014. · Zbl 1314.62151
[48] S. Zhou and D.A. Wolfe. On derivative estimation in spline regression. Statistica Sinica, 10(1):93-108, 2000. · Zbl 0970.62024
[49] H. Zou and M. Yuan. Composite quantile regression and the oracle model selection theory. The Annals of Statistics, 36(3):1108-1126, 2008. · Zbl 1360.62394
This reference list is based on information provided by the publisher or from digital mathematics libraries. Its items are heuristically matched to zbMATH identifiers and may contain data conversion errors. It attempts to reflect the references listed in the original paper as accurately as possible without claiming the completeness or perfect precision of the matching.