Academic Activities

New perspectives for increasing efficiency of optimization schemes [Academic Talk]

Published: August 29, 2017

  Title: New perspectives for increasing efficiency of optimization schemes

  Speaker: Yurii Nesterov (CORE, UCL, Belgium)

  Time: Thursday, August 31, 2017, 14:30–15:30

  Venue: Room 103, Graduate Building, Zhongguancun Campus, Beijing Institute of Technology

  Abstract:

  In this talk we prove a new complexity bound for a variant of the Accelerated Coordinate Descent Method. We show that this method always outperforms the standard Fast Gradient Method on many optimization problems with dense data. As application examples, we consider the unconstrained convex quadratic minimization problem and problems arising in the Smoothing Technique. On some special problem instances, the provable acceleration factor can reach the square root of the number of variables. Joint work with S. Stich (EPFL).
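  To fix ideas, the basic (non-accelerated) randomized coordinate descent update can be sketched as below. This is only a minimal illustration on a convex quadratic, not the accelerated variant analyzed in the talk; the matrix, step counts, and random seed are arbitrary choices for the example.

```python
import numpy as np

# Minimal sketch of randomized coordinate descent minimizing
# f(x) = 0.5 * x^T A x - b^T x for symmetric positive definite A.
# At each step, one randomly chosen coordinate is minimized exactly.
def coordinate_descent(A, b, steps, seed=0):
    n = len(b)
    x = np.zeros(n)
    rng = np.random.default_rng(seed)
    for _ in range(steps):
        i = rng.integers(n)            # pick a random coordinate
        grad_i = A[i] @ x - b[i]       # i-th partial derivative of f
        x[i] -= grad_i / A[i, i]       # exact minimization along coordinate i
    return x

rng = np.random.default_rng(1)
Q = rng.standard_normal((20, 20))
A = Q.T @ Q + np.eye(20)               # SPD, so f is strongly convex
b = rng.standard_normal(20)
x = coordinate_descent(A, b, 10_000)
print(np.linalg.norm(A @ x - b))       # residual should shrink toward 0
```

  Each iteration touches only one row of `A`, which is why coordinate methods can be attractive for dense, high-dimensional problems.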

  Biography:

  He received the Dantzig Prize in 2000, the John von Neumann Theory Prize in 2009, and the EURO Gold Medal in 2016.

  Yurii Nesterov is a Russian mathematician and an internationally recognized expert in convex optimization, especially in the development of efficient algorithms and the complexity analysis of numerical optimization methods. He is currently a professor at the Université catholique de Louvain (UCL).

  In 1977, Yurii Nesterov graduated in applied mathematics at Moscow State University. From 1977 to 1992 he was a researcher at the Central Economic-Mathematical Institute of the Russian Academy of Sciences. Since 1993, he has been working at UCL, specifically in the Department of Mathematical Engineering of the Louvain Polytechnic School and the Center for Operations Research and Econometrics (CORE).

  Nesterov is best known for his work in convex optimization, including his 2004 book, considered a canonical reference on the subject. His most notable contribution is an accelerated version of gradient descent that converges considerably faster than ordinary gradient descent.
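  The speedup can be seen on a simple ill-conditioned quadratic. The sketch below (an illustrative implementation with an arbitrary test matrix, not code from the speaker) compares plain gradient descent against a standard accelerated-gradient momentum schedule:

```python
import numpy as np

# Plain gradient descent on f(x) = 0.5 * x^T A x.
def grad_descent(A, x0, steps, lr):
    x = x0.copy()
    for _ in range(steps):
        x = x - lr * (A @ x)               # gradient of f is A x
    return x

# Accelerated gradient: take the gradient step at a look-ahead point,
# then extrapolate with a momentum coefficient that grows over time.
def nesterov(A, x0, steps, lr):
    x, y, t = x0.copy(), x0.copy(), 1.0
    for _ in range(steps):
        x_next = y - lr * (A @ y)
        t_next = (1 + np.sqrt(1 + 4 * t * t)) / 2
        y = x_next + ((t - 1) / t_next) * (x_next - x)
        x, t = x_next, t_next
    return x

rng = np.random.default_rng(0)
Q = rng.standard_normal((50, 50))
A = Q.T @ Q + 0.1 * np.eye(50)             # ill-conditioned SPD matrix
x0 = rng.standard_normal(50)
lr = 1.0 / np.linalg.eigvalsh(A).max()     # step size 1/L

f = lambda x: 0.5 * x @ A @ x
x_gd = grad_descent(A, x0, 200, lr)
x_nag = nesterov(A, x0, 200, lr)
print(f(x_gd), f(x_nag))                   # accelerated value is much smaller
```

  With the same step size and iteration budget, the accelerated iterate reaches a far lower objective value, reflecting the improved O(1/k²) convergence rate.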

  His 1994 book with Arkadi Nemirovski was the first to show that interior-point methods can solve general convex optimization problems, and the first to make a systematic study of semidefinite programming (SDP). In that book they also introduced self-concordant functions, which are useful in the analysis of Newton's method.