Multiplicative updates for nonnegative quadratic programming. (English) Zbl 1161.90456

Summary: Many problems in neural computation and statistical learning involve optimizations with nonnegativity constraints. In this article, we study convex problems in quadratic programming where the optimization is confined to an axis-aligned region in the nonnegative orthant. For these problems, we derive multiplicative updates that improve the value of the objective function at each iteration and converge monotonically to the global minimum. The updates have a simple closed form and do not involve any heuristics or free parameters that must be tuned to ensure convergence. Despite their simplicity, they differ strikingly in form from other multiplicative updates used in machine learning. We provide complete proofs of convergence for these updates and describe their application to problems in signal processing and pattern recognition.
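The closed-form update the summary refers to can be sketched as follows. For the problem of minimizing \(\tfrac12 v^\top A v + b^\top v\) over the nonnegative orthant, the matrix \(A\) is split into its positive and negative parts, \(A = A^+ - A^-\), and each coordinate of \(v\) is rescaled by a nonnegative factor. The function names and the small test problem below are illustrative choices, not taken from the paper; this is a minimal sketch assuming a symmetric positive definite \(A\) with strictly positive diagonal and a strictly positive starting point, so that the denominator never vanishes.

```python
import numpy as np

def multiplicative_update(A, b, v):
    """One multiplicative update for min_{v >= 0} 0.5 v^T A v + b^T v.

    Split A into its positive and negative parts, A = A_plus - A_minus,
    then rescale each coordinate of v by a nonnegative closed-form factor.
    """
    A_plus = np.maximum(A, 0.0)
    A_minus = np.maximum(-A, 0.0)
    a = A_plus @ v    # (A^+ v)_i
    c = A_minus @ v   # (A^- v)_i
    # The factor is nonnegative, so the iterate stays in the nonnegative
    # orthant; no step size or projection is needed.
    return v * (-b + np.sqrt(b * b + 4.0 * a * c)) / (2.0 * a)

# Illustrative convex example (not from the paper): the unconstrained
# minimizer is v* = (0.6, 0.8), which already satisfies v >= 0.
A = np.array([[3.0, -1.0], [-1.0, 2.0]])
b = np.array([-1.0, -1.0])
v = np.ones(2)
for _ in range(500):
    v = multiplicative_update(A, b, v)
```

On this example the iterates decrease the objective monotonically toward its minimum value of \(-0.7\), consistent with the convergence guarantee described in the summary.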


90C20 Quadratic programming
90C25 Convex programming
Full Text: DOI

