Foundational Research Seminar
Three Vignettes from Optimization and Sampling
The talk will cover two complexity-theoretic results from optimization. We will review work on an accelerated algorithm (à la Nesterov) with an optimal (up to poly-logarithmic factors) convergence rate for optimizing functions with smooth higher-order derivatives. We will also discuss the parallel complexity of minimizing non-smooth functions, where K parallel queries to a gradient oracle may be submitted in each round. We will present a parallel algorithm that improves upon the state of the art, along with a lower bound characterizing when a parallel oracle cannot yield a performance gain over its fully sequential counterpart (i.e., K = 1).
Finally, for a slight change of scenery toward more practical algorithms, we will discuss recent work on a Langevin-based sampling algorithm reminiscent of the mirror descent algorithm from optimization, which allows one to adapt to the arbitrary geometry of the potential function one wishes to sample from.
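For intuition, here is a minimal sketch of one common mirror-Langevin discretization: take a gradient (plus noise) step in the dual space defined by a mirror map, then map back. All function names below are my own illustration, not necessarily the exact scheme presented in the talk; with the Euclidean mirror map the update reduces to the unadjusted Langevin algorithm.

```python
import numpy as np

def mirror_langevin(grad_V, grad_phi, grad_phi_star, hess_phi_sqrt,
                    x0, eta, n_steps, rng):
    """One standard discretization of mirror-Langevin dynamics (1-D sketch):
       y_{k+1} = grad_phi(x_k) - eta * grad_V(x_k)
                 + sqrt(2 * eta) * hess_phi_sqrt(x_k) * xi_k,
       x_{k+1} = grad_phi_star(y_{k+1}),
    where phi is the mirror map, phi* its convex conjugate, and xi_k is
    standard Gaussian noise."""
    x = x0
    samples = []
    for _ in range(n_steps):
        xi = rng.standard_normal()
        y = grad_phi(x) - eta * grad_V(x) + np.sqrt(2 * eta) * hess_phi_sqrt(x) * xi
        x = grad_phi_star(y)
        samples.append(x)
    return np.array(samples)

# Euclidean mirror map phi(x) = x^2 / 2: the scheme recovers plain
# (unadjusted) Langevin sampling of the standard Gaussian N(0, 1).
rng = np.random.default_rng(0)
samples = mirror_langevin(
    grad_V=lambda x: x,            # potential V(x) = x^2 / 2
    grad_phi=lambda x: x,          # identity mirror map
    grad_phi_star=lambda y: y,
    hess_phi_sqrt=lambda x: 1.0,
    x0=0.0, eta=0.01, n_steps=50_000, rng=rng,
)
print(samples[1000:].mean(), samples[1000:].var())  # close to 0 and 1
```

Choosing a non-Euclidean mirror map (e.g., one whose Hessian blows up near the boundary of a constrained domain) is what lets the dynamics adapt to the geometry of the potential.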