Events

IFML Seminar

IFML Seminar: 09/12/25 - Accelerating Nonconvex Optimization via Online Learning

Aryan Mokhtari, Associate Professor, ECE Department, UT Austin, and Visiting Faculty Researcher at Google Research


The University of Texas at Austin
Gates Dell Complex (GDC 6.302)
2317 Speedway
Austin, TX 78712
United States

Event Registration

Abstract: A fundamental problem in optimization is finding an ε-first-order stationary point of a smooth function using only gradient information. The best-known gradient query complexity for this task, assuming both the gradient and Hessian of the objective function are Lipschitz continuous, is O(ε^−7/4). In this talk, I present a method with a gradient complexity of O(d^1/4 ε^−13/8), where d is the problem dimension—yielding improved complexity when d = O(ε^−1/2). The proposed method builds on quasi-Newton ideas and operates by solving two online learning problems under the hood. This talk is based on the following STOC paper: https://dl.acm.org/doi/pdf/10.1145/3717823.3718308
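For intuition, the dimension threshold quoted in the abstract follows from comparing the two rates directly (a quick sanity check, not part of the talk itself):

```latex
% When does the new bound beat the classical one?
d^{1/4}\,\varepsilon^{-13/8} \;\le\; \varepsilon^{-7/4}
\;\iff\; d^{1/4} \;\le\; \varepsilon^{-7/4 + 13/8} = \varepsilon^{-1/8}
\;\iff\; d \;\le\; \varepsilon^{-1/2}.
```

That is, the O(d^1/4 ε^−13/8) complexity improves on O(ε^−7/4) exactly in the regime d = O(ε^−1/2) stated above.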

Bio: Aryan Mokhtari is an Associate Professor in the ECE Department of UT Austin, where he holds the William W. Hagerty Fellowship. He is also currently a Visiting Faculty Researcher at Google Research. Before joining UT Austin, he was a Postdoctoral Associate in the Laboratory for Information and Decision Systems (LIDS) at MIT. Prior to that, he was a Research Fellow at the Simons Institute. He received his Ph.D. in Electrical and Systems Engineering from the University of Pennsylvania (Penn). He has received multiple awards, including the NSF CAREER Award, the Google Research Scholar Award, the Army Research Office (ARO) Early Career Program Award, the Simons-Berkeley Research Fellowship, the UT Austin ECE Junior Faculty Excellence in Teaching Award, and Penn's Joseph and Rosaline Wolf Award for Best Doctoral Dissertation.
