## Estimation of the density of a distribution from data with an admixture. (Ukrainian, English) Zbl 1114.62042

Teor. Jmovirn. Mat. Stat. 73, 88-96 (2005); translation in Theory Probab. Math. Stat. 73, 99-108 (2006).
The authors consider the problem of estimating the density functions of the components of a two-component mixture with varying concentrations. Let $$\xi_{1:N},\dots,\xi_{N:N}$$ be a sample of independent random variables from a mixture with varying concentrations, with distribution functions $P\{\xi_{j:N}<x\}=\omega_{j:N}H_1(x)+(1-\omega_{j:N})H_2(x),$ where $$H_1(x)$$ is the distribution function of the basic component, $$H_2(x)$$ is the distribution function of the admixture component, and $$\omega_{j:N}$$ is the concentration of the basic component at the moment of the $$j$$-th observation. It is assumed that the distribution of the first component is unknown, while a parametric model is (perhaps) available for the second component.
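The sampling scheme above can be sketched as follows; the concrete concentrations $$\omega_{j:N}=j/N$$ and the Beta distributions standing in for $$H_1$$ and $$H_2$$ are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 5000
# Hypothetical varying concentrations of the basic component: omega_j = j/N.
omega = np.arange(1, N + 1) / N
# Hypothetical components on [0, 1]: H1 ~ Beta(2, 5), H2 ~ Beta(5, 2).
labels = rng.random(N) < omega          # True -> drawn from the basic component
xi = np.where(labels, rng.beta(2, 5, N), rng.beta(5, 2, N))
```

Each $$\xi_{j:N}$$ is drawn from $$H_1$$ with probability $$\omega_{j:N}$$ and from $$H_2$$ otherwise, which reproduces the mixture distribution function above.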
Applying the sieve maximum likelihood method, the authors construct histogram-type estimators $$\tilde{h}_i^N(x)$$ for the densities $$h_i(x)$$, $$i=1,2$$, of the components, within the class of piecewise constant functions $$g(x)=\sum_{k=1}^{K_N}g_k{\mathbb I}\{x\in A_k\}$$, where $$K_N$$ is the number of subintervals of the partition $A_k=[t_{k-1},t_k), \quad k=1,2,\dots,K_N-1,\qquad A_{K_N}= [t_{K_N-1},t_{K_N}], \quad t_k=k/K_N,$ and $$g_k$$ are positive numbers such that $$K_N^{-1}\sum_{k=1}^{K_N}g_k=1$$. They prove consistency of the estimators and obtain bounds for the rate of convergence.
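A minimal sketch of such a histogram-type estimator follows, assuming data on $$[0,1]$$ and using an EM iteration as one plausible way to maximize the likelihood over piecewise constant densities; the EM scheme and all function names are illustrative assumptions, not the authors' algorithm.

```python
import numpy as np

def em_histogram_mixture(xi, omega, K, n_iter=200):
    """EM sketch for histogram estimates of the component densities on [0, 1].

    Candidate densities are piecewise constant on A_k = [(k-1)/K, k/K);
    the constraint K^{-1} * sum_k g_k = 1 makes each estimate integrate to one.
    """
    bins = np.minimum((xi * K).astype(int), K - 1)   # bin index of each xi
    g1 = np.ones(K)                                   # flat initial densities
    g2 = np.ones(K)
    for _ in range(n_iter):
        # E-step: posterior probability that xi_j came from the basic component.
        num = omega * g1[bins]
        tau = num / (num + (1.0 - omega) * g2[bins])
        # M-step: posterior-weighted histograms, rescaled to integrate to one.
        w1 = np.bincount(bins, weights=tau, minlength=K)
        w2 = np.bincount(bins, weights=1.0 - tau, minlength=K)
        g1 = K * w1 / w1.sum()
        g2 = K * w2 / w2.sum()
    return g1, g2
```

Here `g1[k]` and `g2[k]` play the role of the coefficients $$g_k$$ for the two components; `g1.sum() / K` equals one by construction, matching the normalization $$K_N^{-1}\sum_k g_k=1$$.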

### MSC:

62G07 Density estimation
62G20 Asymptotic properties of nonparametric inference