Event № 377
Classical primal-dual methods have recently attracted renewed interest for solving structured convex minimization problems arising in signal/image processing and machine learning. In this talk, we focus on the interplay between optimization problems, their saddle point representations, and the global rate-of-convergence analysis of these methods. In particular, we introduce a new algorithm for a class of saddle point problems that makes it possible to efficiently address an important class of convex models, and we prove its global rate of convergence. Numerical examples illustrate its relevance and performance.

This is joint work with Yoel Drori and Marc Teboulle.
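For readers unfamiliar with the setting, the following is a minimal sketch of a classical primal-dual scheme of the kind the abstract refers to (the primal-dual hybrid gradient method of Chambolle and Pock), not the speaker's new algorithm. It solves a toy least-absolute-deviations problem through its saddle point representation; the problem instance, step sizes, and iteration count are all assumptions made for illustration.

```python
import numpy as np

# Toy problem:  min_x  ||A x - b||_1 + (mu/2) ||x||^2
# Saddle point form:  min_x max_{||y||_inf <= 1} <A x - b, y> + (mu/2) ||x||^2
# (Illustrative instance; not from the talk.)
rng = np.random.default_rng(0)
m, n, mu = 30, 10, 0.1
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)

L = np.linalg.norm(A, 2)      # operator norm of A
tau = sigma = 0.9 / L         # step sizes satisfying tau * sigma * L**2 < 1

x = np.zeros(n)
y = np.zeros(m)
x_bar = x.copy()

for _ in range(2000):
    # Dual step: prox of g*(y) = <b, y> + indicator(||y||_inf <= 1)
    y = np.clip(y + sigma * (A @ x_bar - b), -1.0, 1.0)
    # Primal step: prox of f(x) = (mu/2) ||x||^2
    x_new = (x - tau * (A.T @ y)) / (1.0 + tau * mu)
    # Extrapolation step, key to the method's global O(1/k) ergodic rate
    x_bar = 2.0 * x_new - x
    x = x_new

obj = np.abs(A @ x - b).sum() + 0.5 * mu * x @ x
```

The two proximal maps here are cheap and closed-form, which is what makes primal-dual splitting attractive for structured models: the nonsmooth term is handled entirely through its conjugate in the dual update.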