Principled Algorithms for Finding Local Minima

Duke Computer Science/Mathematics Colloquium
Speaker: Oliver Hinder
Lunch served at 11:45 am.

Convex optimization is the cornerstone of continuous optimization, but many real-world problems are nonconvex: deep learning, drinking water network optimization, etc. This two-part talk explores my work developing efficient algorithms for finding local minima of nonconvex functions.

The first part discusses my work with Yair Carmon, John Duchi, and Aaron Sidford characterizing the computational time required to find approximate local minima. The complexity of minimizing unconstrained convex functions was first characterized by Yudin and Nemirovski, who proved lower bounds, and by Nesterov, who proved matching upper bounds. Our work is the analog for finding local minima: we give the first 'accelerated' gradient-based methods and nearly tight lower bounds for finding approximate local minima.
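To make the notion of an "approximate local minimum" concrete: a standard first-order relaxation is an eps-approximate stationary point, a point x with ||grad f(x)|| <= eps. The sketch below finds such a point with plain (non-accelerated) gradient descent on a toy nonconvex function; it is only an illustration of the solution concept, not the accelerated method from the work described above, and the step size and toy function are my own choices.

```python
import numpy as np

def grad_descent_stationary(grad, x0, eps=1e-4, step=0.1, max_iter=100_000):
    """Plain gradient descent, stopped once ||grad f(x)|| <= eps,
    i.e., once x is an eps-approximate (first-order) stationary point."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) <= eps:
            return x  # approximate stationary point found
        x = x - step * g
    return x

# Toy nonconvex function f(x, y) = x^4 - 2x^2 + y^2,
# with two local minima at (±1, 0) and a saddle at the origin.
grad_f = lambda v: np.array([4 * v[0]**3 - 4 * v[0], 2 * v[1]])
x_star = grad_descent_stationary(grad_f, x0=[0.5, 1.0])
```

Starting from x = 0.5, the iterates slide into the local minimum near (1, 0); a nonconvex landscape like this is exactly where the choice of method and its iteration complexity matter.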

The second part of this talk focuses on my work with my advisor Yinyu Ye developing an interior point method (IPM) for optimization problems with nonconvex constraints. Through careful initialization and updates of the slack variables, our IPM is guaranteed to find a first-order certificate of local infeasibility, local optimality, or unboundedness of the (shifted) feasible region. It has the desirable property of avoiding the typical two-phase or penalty approach. Comparisons on standard test sets against a state-of-the-art IPM (IPOPT) indicate similar overall performance but much better infeasibility detection.
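The core interior-point idea is to handle an inequality constraint with a log-barrier term whose weight mu is driven to zero, keeping iterates strictly feasible throughout. The one-variable sketch below illustrates only this barrier mechanism on a toy convex problem (minimize (x - 2)^2 subject to x <= 1); it is not the slack-variable IPM for nonconvex constraints described above, and the problem, schedule, and damping rule are my own illustrative choices.

```python
def barrier_minimize(x=0.0, mu=1.0, shrink=0.5, tol=1e-8, newton_iters=50):
    """Log-barrier method for: minimize (x - 2)^2 subject to x <= 1.
    For each barrier weight mu, Newton's method minimizes
    phi(x) = (x - 2)^2 - mu * log(1 - x); then mu is shrunk."""
    while mu > tol:
        for _ in range(newton_iters):
            g = 2 * (x - 2) + mu / (1 - x)        # phi'(x)
            h = 2 + mu / (1 - x) ** 2             # phi''(x) > 0
            step = g / h
            # Damp the step so iterates stay strictly inside x < 1.
            while x - step >= 1:
                step *= 0.5
            x -= step
            if abs(g) < 1e-10:
                break
        mu *= shrink  # tighten the barrier and warm-start from current x
    return x
```

As mu shrinks, the barrier minimizer approaches the constrained optimum x = 1 from the interior (roughly x ≈ 1 - mu/2 here), which is the "central path" behavior an IPM follows.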

Short Biography

Oliver is a Ph.D. candidate in the Department of Management Science and Engineering at Stanford, advised by Professor Yinyu Ye. He develops efficient algorithms for finding local optima of continuous nonconvex optimization problems. Roughly speaking, his research splits into interior point methods and first-order methods: the former target operations research problems suited to second-order methods (e.g., optimal power flow, drinking water network optimization), while the latter are motivated by large-scale machine learning problems (e.g., deep learning). His interests outside of nonconvex optimization include market design, integer programming, machine scheduling, and machine learning.

Xiaobai Sun & Jianfeng Lu