[24th Anniversary Invited Seminar] Accelerated Gradient Methods for Large-scale Convex Optimization

When:
May 23, 2018 @ 14:00 – 15:00
Where:
Seminar Room 1, Institute of New Media and Communications, Seoul National University (Bldg. 132, Room 104)
Speaker:
Dr. Donghwan Kim (Research Instructor, Dept. of Mathematics at Dartmouth College)

Abstract

Many modern applications, such as machine learning, inverse problems, and control, require solving large-dimensional optimization problems. First-order methods such as the gradient method are widely used for such large-scale problems, since their per-iteration computational cost depends only mildly on the problem dimension. However, they suffer from slower convergence rates than second-order methods such as Newton’s method. Accelerating gradient methods has therefore received great interest in the optimization community, leading to the development of the conjugate gradient method, the heavy-ball method, and Nesterov’s fast gradient method, which we review in this talk. The talk will then present newly proposed accelerated gradient methods, named the optimized gradient method (OGM) and OGM-G, which have the best known worst-case convergence rates for smooth convex optimization among accelerated gradient methods.
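Since the abstract names these methods without detail, the following is a minimal NumPy sketch (an illustration, not material from the talk) contrasting plain gradient descent, Nesterov’s fast gradient method (FGM), and OGM on a smooth convex least-squares problem. The problem instance, step size 1/L, and iteration count are illustrative assumptions; the OGM iteration follows the form published by Kim and Fessler, whose momentum-parameter update matches Nesterov’s except at the final step.

import numpy as np

# Illustrative least-squares problem: f(x) = 0.5 * ||A x - b||^2 (assumed data).
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 50))
b = rng.standard_normal(200)
L = np.linalg.norm(A, 2) ** 2  # Lipschitz constant of the gradient: ||A||_2^2

def f(x):
    return 0.5 * np.linalg.norm(A @ x - b) ** 2

def grad(x):
    return A.T @ (A @ x - b)

N = 100
x0 = np.zeros(A.shape[1])

# Plain gradient descent with step 1/L: worst-case rate O(1/N).
x = x0.copy()
for _ in range(N):
    x = x - grad(x) / L
print("GD :", f(x))

# Nesterov's fast gradient method: worst-case rate O(1/N^2).
x, y, t = x0.copy(), x0.copy(), 1.0
for _ in range(N):
    y_new = x - grad(x) / L                      # gradient step
    t_new = (1 + np.sqrt(1 + 4 * t ** 2)) / 2    # momentum parameter update
    x = y_new + ((t - 1) / t_new) * (y_new - y)  # extrapolation
    y, t = y_new, t_new
print("FGM:", f(y))

# OGM: same O(1/N^2) order with a worst-case constant about half of FGM's;
# note the extra momentum term and the modified theta update at the last step.
x, y, th = x0.copy(), x0.copy(), 1.0
for k in range(N):
    y_new = x - grad(x) / L
    th_new = (1 + np.sqrt(1 + (8 if k == N - 1 else 4) * th ** 2)) / 2
    x = (y_new
         + ((th - 1) / th_new) * (y_new - y)  # Nesterov-style momentum
         + (th / th_new) * (y_new - x))       # additional OGM momentum term
    y, th = y_new, th_new
print("OGM:", f(x))

On a run like this, the two accelerated methods typically reach a much smaller objective value than plain gradient descent after the same number of iterations, reflecting their O(1/N^2) versus O(1/N) worst-case rates.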

Biography

Dr. Kim is a research instructor in the Department of Mathematics at Dartmouth College.

He is working on image reconstruction algorithms for synthetic aperture radar (SAR) and ultrasound imaging.

B.S.: Dept. of EE, Seoul National University, 2009.

Ph.D.: Dept. of EE, University of Michigan, 2014.