## Nonparametric density estimation for linear processes with infinite variance. (English) Zbl 1332.62123

Summary: We consider nonparametric estimation of the marginal density functions of linear processes by kernel density estimators. We assume that the innovation processes are i.i.d. and have infinite variance. We present the asymptotic distributions of the kernel density estimators with the order of the bandwidths fixed as $$h = cn^{-1/5}$$, where $$n$$ is the sample size. The asymptotic distributions depend on both the coefficients of the linear processes and the tail behavior of the innovations. In some cases the kernel estimators have the same asymptotic distributions as for i.i.d. observations; in other cases the normalized kernel density estimators converge in distribution to stable distributions. A simulation study is also carried out to examine small-sample properties.
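The setting above can be illustrated numerically. The sketch below is not the paper's construction: the Pareto-type innovations (a stand-in for a stable law, chosen so the tail index $\alpha < 2$ gives infinite variance), the MA coefficients, and the constant $c$ are all illustrative assumptions. Only the bandwidth rate $h = cn^{-1/5}$ is taken from the summary.

```python
import numpy as np

rng = np.random.default_rng(0)

def heavy_tailed_innovations(n, alpha=1.5):
    # Symmetric Pareto-type noise with tail index alpha < 2,
    # so Var is infinite (illustrative stand-in for stable innovations).
    u = rng.uniform(size=n)
    signs = rng.choice([-1.0, 1.0], size=n)
    return signs * u ** (-1.0 / alpha)

def linear_process(n, coeffs, alpha=1.5):
    # X_t = sum_j a_j * eps_{t-j}, a finite-order moving average
    # as a simple instance of a linear process.
    eps = heavy_tailed_innovations(n + len(coeffs), alpha)
    return np.convolve(eps, coeffs, mode="valid")[:n]

def kde(x_eval, data, c=1.0):
    # Gaussian kernel density estimator with bandwidth h = c * n^(-1/5),
    # the bandwidth order considered in the summary.
    n = len(data)
    h = c * n ** (-0.2)
    z = (x_eval[:, None] - data[None, :]) / h
    return np.exp(-0.5 * z**2).sum(axis=1) / (n * h * np.sqrt(2.0 * np.pi))

# Estimate the marginal density of an MA(2)-type linear process.
x = linear_process(5000, coeffs=np.array([1.0, 0.5, 0.25]))
grid = np.linspace(-3.0, 3.0, 61)
fhat = kde(grid, x)
```

Repeating this over many replications and comparing the normalized estimates against a Gaussian or stable limit is how a simulation study of the kind mentioned in the summary could proceed.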

### MSC:

- 62G07 Density estimation
- 60F05 Central limit and other weak theorems
- 60E07 Infinitely divisible distributions; stable distributions
- 62G20 Asymptotic properties of nonparametric inference
- 62M10 Time series, auto-correlation, regression, etc. in statistics (GARCH)

### Software:

RMetrics; fBasics

### References:

[1] Bryk A., Mielniczuk J. (2005). Asymptotic properties of density estimates for linear processes: application of projection method. Journal of Nonparametric Statistics 17, 121–133 · Zbl 1055.62031
[2] Cheng B., Robinson P.M. (1991). Density estimation in strongly dependent non-linear time series. Statistica Sinica 1, 335–359 · Zbl 0823.62031
[3] Chow Y.S., Teicher H. (1988). Probability theory (2nd ed.). New York, Springer · Zbl 0652.60001
[4] Csörgő S., Mielniczuk J. (1995). Density estimation under long-range dependence. The Annals of Statistics 28, 990–999 · Zbl 0843.62037
[5] Doukhan P. (1994). Mixing: properties and examples. Lecture Notes in Statistics, Vol. 85. New York, Springer · Zbl 0801.60027
[6] Fan J., Yao Q. (2003). Nonlinear time series. New York, Springer · Zbl 1014.62103
[7] Giraitis L., Koul H.L., Surgailis D. (1996). Asymptotic normality of regression estimators with long memory errors. Statistics & Probability Letters 29, 317–335 · Zbl 0903.62022
[8] Giraitis L., Surgailis D. (1986). Multivariate Appell polynomials and the central limit theorem. In: Eberlein E., Taqqu M.S. (eds). Dependence in probability and statistics. Boston, Birkhäuser, pp. 21–71 · Zbl 0605.60031
[9] Hall P., Hart J.D. (1990). Convergence rates in density estimation for data from infinite-order moving average process. Probability Theory and Related Fields 87, 253–274 · Zbl 0695.60043
[10] Hallin M., Tran L.T. (1996). Kernel density estimation for linear processes: asymptotic normality and optimal bandwidth derivation. The Annals of the Institute of Statistical Mathematics 48, 429–449 · Zbl 0886.62042
[11] Hidalgo J. (1997). Non-parametric estimation with strongly dependent multivariate time series. Journal of Time Series Analysis 18, 95–122 · Zbl 0923.62045
[12] Ho H.-C. (1996). On central and non-central limit theorems in density estimation for sequences of long-range dependence. Stochastic Processes and their Applications 63, 153–174 · Zbl 0902.62046
[13] Ho H.-C., Hsing T. (1996). On the asymptotic expansion of the empirical process of long-memory moving averages. The Annals of Statistics 24, 992–1024 · Zbl 0862.60026
[14] Ho H.-C., Hsing T. (1997). Limit theorems for functionals of moving averages. The Annals of Probability 25, 1636–1669 · Zbl 0903.60018
[15] Honda T. (2000). Nonparametric density estimation for a long-range dependent linear process. The Annals of the Institute of Statistical Mathematics 52, 599–611 · Zbl 0978.62029
[16] Hsing T. (1999). On the asymptotic distribution of partial sum of functionals of infinite-variance moving averages. The Annals of Probability 27, 1579–1599 · Zbl 0961.60038
[17] Koul H.L., Surgailis D. (2001). Asymptotics of empirical processes of long memory moving averages with infinite variance. Stochastic Processes and their Applications 91, 309–336 · Zbl 1046.62089
[18] Koul H.L., Surgailis D. (2002). Asymptotic expansion of the empirical process of long memory moving averages. In: Dehling H., Mikosch T., Sørensen M. (eds). Empirical process techniques for dependent data. Boston, Birkhäuser, pp. 213–239 · Zbl 1021.62072
[19] Peng L., Yao Q. (2004). Nonparametric regression under dependent errors with infinite variance. The Annals of the Institute of Statistical Mathematics 56, 73–86 · Zbl 1050.62047
[20] Pipiras V., Taqqu M.S. (2003). Central limit theorems for partial sums of bounded functionals of infinite-variance moving averages. Bernoulli 9, 833–855 · Zbl 1053.60017
[21] Samorodnitsky G., Taqqu M.S. (1994). Stable non-Gaussian processes: stochastic models with infinite variance. London, Chapman & Hall · Zbl 0925.60027
[22] Schick A., Wefelmeyer W. (2006). Pointwise convergence rates for kernel density estimators in linear processes. Statistics & Probability Letters 76, 1756–1760 · Zbl 1098.62112
[23] Surgailis D. (2002). Stable limits of empirical processes of moving averages with infinite variance. Stochastic Processes and their Applications 100, 255–274 · Zbl 1059.60044
[24] Surgailis D. (2004). Stable limits of sums of bounded functions of long-memory moving averages with finite variance. Bernoulli 10, 327–355 · Zbl 1076.62017
[25] Taniguchi M., Kakizawa Y. (2000). Asymptotic theory of statistical inference for time series. New York, Springer · Zbl 0955.62088
[26] Wu W.B., Mielniczuk J. (2002). Kernel density estimation for linear processes. The Annals of Statistics 30, 1441–1459 · Zbl 1015.62034
[27] Wuertz D. and many others (2006). fBasics: Rmetrics – Markets and Basic Statistics. R package version 221.10065. http://www.rmetrics.org
This reference list is based on information provided by the publisher or from digital mathematics libraries. Its items are heuristically matched to zbMATH identifiers and may contain data conversion errors. It attempts to reflect the references listed in the original paper as accurately as possible without claiming the completeness or perfect precision of the matching.