# zbMATH — the first resource for mathematics

Generalized moment problem in vector lattices. (English) Zbl 0948.44003
The classical theorem on the moment problem, going back to F. Hausdorff [Math. Z. 16, 220-248 (1923; JFM 49.0193.01)], states that a given real sequence $$a_{k}\in\mathbb{R}$$ is the sequence of moments of some nondecreasing function $$g\in V(0,1)$$ of bounded variation, i.e., $$a_{k}=\int_{0}^{1}t^{k}\,dg(t)$$, if and only if the sequence $$a_{k}$$ is completely monotone, i.e., $$\Delta ^{n}a_{k}:=\sum_{j=0}^{n}(-1)^{j}\binom{n}{j}a_{k+j}\geq 0$$ for all $$n,k\geq 0$$. The authors generalise this result to the case where $$a_{k}$$ is a sequence of elements of an ordered vector space $$V$$, more precisely of a $$\sigma$$-complete weakly $$\sigma$$-distributive vector lattice $$V$$ satisfying two conditions: every interval in $$V$$ is sequentially order-compact, and every chain in $$V$$ is at most countable. This is similar to, but distinct from, the results of H. H. Schaefer [Math. Ann. 146, 325-330 (1962; Zbl 0102.09905)]: neither result contains the other. From this result, the authors derive an integral representation theorem for positive linear operators $$L:C(0,1)\rightarrow V$$, analogous to the real case.
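As a purely illustrative sketch (not part of the review), the classical real-valued criterion can be checked numerically: take $$g(t)=t$$, so that $$a_{k}=\int_{0}^{1}t^{k}\,dt=1/(k+1)$$, and verify that all finite differences $$\Delta^{n}a_{k}$$ are nonnegative. Exact rationals avoid floating-point cancellation in the alternating sum.

```python
from fractions import Fraction
from math import comb

# Moments of the nondecreasing function g(t) = t on [0, 1]:
# a_k = ∫₀¹ t^k dg(t) = ∫₀¹ t^k dt = 1/(k + 1), as exact rationals.
a = [Fraction(1, k + 1) for k in range(20)]

def finite_difference(a, n, k):
    """n-th forward difference: Δⁿ a_k = Σ_{j=0}^{n} (-1)^j C(n, j) a_{k+j}."""
    return sum((-1) ** j * comb(n, j) * a[k + j] for j in range(n + 1))

# Hausdorff's criterion: (a_k) is a moment sequence iff every Δⁿ a_k ≥ 0.
print(all(finite_difference(a, n, k) >= 0
          for n in range(10) for k in range(10)))  # True
```

Here the differences are in fact strictly positive, since $$\Delta^{n}a_{k}=\int_{0}^{1}t^{k}(1-t)^{n}\,dt>0$$ for this choice of $$g$$.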

##### MSC:
 44A60 Moment problems
 46A40 Ordered topological linear spaces, vector lattices