2021

Boris Polyak has been selected as the winner of the 2021 INFORMS Optimization Society Khachiyan Prize

Citation

Boris Polyak (Institute for Control Science, Russian Academy of Sciences) occupies a unique place in both Optimization and Automatic Control. Over the last 60 years he has made fundamental contributions that have left a lasting impact on both fields.

In the early 1960s, when the field of Optimization was born, Boris’s work on the gradient, projected gradient, heavy ball, and conjugate gradient methods, as well as Newton’s method for unconstrained minimization and for minimization with nonlinear equality constraints, made a decisive impact on the young field and secured his leading role in Optimization. Important results in non-smooth optimization and methods for solving ill-posed, large-scale linear and quadratic programming problems were just a few of Boris’s further contributions in this decade, a remarkable one for Optimization. In the 1960s he not only developed and analyzed several methods for constrained and unconstrained optimization, but also established new exposition standards in which, along with the existence and convergence properties of an algorithm, an analysis of its rate of convergence became a must. It should be stressed that the theory of gradient-type first-order methods for large-scale convex optimization, with its numerous extensions and its wide spectrum of current applications to Signal Processing and Machine Learning, originates in the pioneering papers of N. Shor and B. Polyak. To give a single example: in 1962 he discovered and analyzed the Heavy Ball method, a precursor to the Fast Gradient algorithms widely used in today’s Machine Learning and the first implementation of a “gradient method with momentum,” now the workhorse of Deep Neural Net optimization. It is amazing that now, almost 60 years later, Boris continues to make important new contributions to the field of Optimization.
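For context, the heavy-ball iteration can be written (in the notation standard today, rather than that of the original 1964 paper; here f is the objective, α a step size, and β a momentum parameter) as

\[
x_{k+1} \;=\; x_k \;-\; \alpha \,\nabla f(x_k) \;+\; \beta\,(x_k - x_{k-1}),
\]

so that setting β = 0 recovers the plain gradient method, while β > 0 adds the “momentum” term that modern accelerated and deep-learning optimizers inherit.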

Boris is always looking for new problems with serious mathematical content and practical importance, and at 85 he remains as active and nearly as productive as he was 40 years ago. In the past few years he has worked productively on robust principal component analysis, Monte-Carlo techniques in optimization, the conditional gradient algorithm, adaptive Newton methods, sparse optimal control, and mathematical aspects of search engines, to name just a few.

Prize committee

Alex Shapiro (chair), Dimitris Bertsimas, Gérard P. Cornuéjols, Stephen J. Wright