The author presents an algorithm for controller design achieving globally optimal performance in the sense of $l_1$-norm minimization [{\it M. A. Dahleh} and {\it J. B. Pearson jun.}, IEEE Trans. Autom. Control AC-32, 314-322 (1987; Zbl 0622.93041)] for discrete-time linear time-invariant systems with structured time-varying or nonlinear uncertainties bounded in the induced $l_\infty$ norm. The algorithm is based on the equivalence of the $l_1$ optimization problem to an infinite-dimensional linear programming problem, which can be solved approximately by the FMV (finitely many variables) or the FME (finitely many equations) method. The main difference between the linear programs arising from the standard $l_1$ problem and those proposed in the paper is that the latter depend on a scaling parameter which is itself an optimization variable. The author's approach is to use postoptimal sensitivity analysis to reduce the problem to a finite set of solvable linear programs, followed by minimization over all values of the scaling parameter. The algorithm is simple and leads to the global optimum of the approximated problem. Unfortunately, the example presented in the paper is rubbish: the nominal plant considered by the author is a first-order lag compensator, and applying such a sophisticated algorithm to ensure robust performance even in the presence of severe uncertainties for such a plant hardly seems justified.
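The FMV idea described above can be illustrated with a minimal sketch: the $l_1$ model-matching problem $\min_q \|h - u * q\|_1$ is truncated to a finite-length $q$ and recast as an ordinary linear program, and the outer minimization over the scaling parameter is carried out on a grid. All data, the function name `l1_fmv`, and the way the scaling parameter $\gamma$ enters (here it simply scales the channel $u$) are illustrative assumptions, not taken from the paper under review.

```python
# Hypothetical sketch of the FMV (finitely many variables) approximation:
# minimize ||h - u * q||_1 over FIR q of length N, written as an LP.
# The data h, u and the role of gamma are illustrative only.
import numpy as np
from scipy.optimize import linprog

def l1_fmv(h, u, N):
    """Minimize ||h - u * q||_1 over FIR q of length N ('*' = convolution)."""
    L = len(h)
    # Lower-triangular Toeplitz matrix T with (u * q)[k] = (T @ q)[k].
    T = np.zeros((L, N))
    for k in range(L):
        for j in range(N):
            if 0 <= k - j < len(u):
                T[k, j] = u[k - j]
    # Standard l1-to-LP trick: introduce t >= |h - T q| elementwise,
    # minimize sum(t).  Decision variables: [q (N entries), t (L entries)].
    c = np.concatenate([np.zeros(N), np.ones(L)])
    A_ub = np.block([[ T, -np.eye(L)],   #  T q - t <=  h
                     [-T, -np.eye(L)]])  # -T q - t <= -h
    b_ub = np.concatenate([h, -h])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] * (N + L))
    return res.fun

# Outer minimization over the scaling parameter on a grid: each fixed
# gamma yields one finite, solvable LP.
h = np.array([1.0, 0.5, 0.25, 0.125])
u = np.array([1.0, -0.5])
best = min((l1_fmv(h, gamma * u, N=3), gamma)
           for gamma in np.linspace(0.5, 2.0, 16))
```

The paper replaces the brute-force grid over $\gamma$ with postoptimal sensitivity analysis, so that only finitely many such LPs need to be solved; the sketch above only shows the structure of each inner problem.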