The author investigates a general method for approximating linear functionals on spaces of multivariate functions, namely the so-called moving least-squares method. Given a data set $\{L_i(f)\}_{i=1}^l$ (for example, $f(x_i)$ for sample points $x_i\in \mathbb{R}^d$), an unknown functional $L(f)$ (for example, $f(x)$ for another point $x$) is approximated by $\sum_{i=1}^l a_iL_i(f)$, where the coefficients are obtained by minimizing the quadratic form $\sum_{i=1}^l w(L,L_i)a_i^2$ under the linear constraints $\sum_{i=1}^l a_iL_i(p_j)=L(p_j)$, $j=1,\dots,J$. This means that the elements $p_j$ (typically polynomials of low degree) are reproduced exactly, while the nonnegative weight $w(L,L_i)$ is a separation measure between the functionals $L_i$ and $L$ (such as a radial function of the distance of the sample point $x_i$ from $x$). It is shown how this constrained minimization can be solved in a matrix formulation. For the multivariate interpolation problem and for specific choices of the weights, $C^\infty$-smoothness in $x$ and the approximation order on suitable data sets are established. The paper concludes with several numerical examples, including some in a bi- and trivariate setting as well as with data-dependent separation measures.
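The matrix formulation of the constrained minimization can be sketched as follows. By Lagrange multipliers, minimizing $\sum_i w_i a_i^2$ subject to $Pa = b$ (with $P_{ji} = p_j(x_i)$, $b_j = p_j(x)$, $W = \operatorname{diag}(w_i)$) gives $a = W^{-1}P^T(PW^{-1}P^T)^{-1}b$. The function name and weight choice below are illustrative, not taken from the paper; this is a minimal sketch assuming point-evaluation functionals and a weight that grows with the distance $|x - x_i|$:

```python
import numpy as np

def mls_coefficients(x, xs, weights, basis):
    """Hypothetical sketch of the moving least-squares coefficient solve:
    minimize sum_i w_i * a_i^2  subject to  sum_i a_i p_j(x_i) = p_j(x).

    x       : evaluation point, shape (d,)
    xs      : sample points, shape (l, d)
    weights : positive separation weights w(L, L_i), shape (l,)
    basis   : list of callables p_j : R^d -> R

    Closed form via Lagrange multipliers:
        a = W^{-1} P^T (P W^{-1} P^T)^{-1} p(x),
    with W = diag(weights) and P[j, i] = p_j(x_i).
    """
    P = np.array([[p(xi) for xi in xs] for p in basis])   # (J, l)
    b = np.array([p(x) for p in basis])                   # (J,)
    Winv_Pt = P.T / weights[:, None]                      # W^{-1} P^T, shape (l, J)
    lam = np.linalg.solve(P @ Winv_Pt, b)                 # multipliers, shape (J,)
    return Winv_Pt @ lam                                  # coefficients a, shape (l,)

# Usage: with a linear polynomial basis, any linear f is reproduced exactly,
# since the constraints force sum_i a_i p_j(x_i) = p_j(x) for each basis element.
x = np.array([0.3, 0.7])
rng = np.random.default_rng(0)
xs = rng.random((10, 2))
# Illustrative separation measure: grows with squared distance from x.
weights = 1.0 + np.sum((xs - x) ** 2, axis=1)
basis = [lambda y: 1.0, lambda y: y[0], lambda y: y[1]]
a = mls_coefficients(x, xs, weights, basis)
f = lambda y: 2.0 + 3.0 * y[0] - y[1]                     # a linear test function
approx = a @ np.array([f(xi) for xi in xs])               # approx equals f(x)
```

The reproduction constraints guarantee exactness on the span of the $p_j$; the weight only decides how the remaining degrees of freedom are distributed among the samples.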