Event № 77
The alternating direction method of multipliers (ADMM) is one of the most powerful and successful methods for solving convex and nonconvex composite problems that arise in image and signal processing and in machine learning. In the convex setting, numerous convergence results have been established for the ADMM as well as its variants. However, much less is known about the convergence properties of the ADMM in the nonconvex framework. In this talk we study the Bregman modification of ADMM (BADMM), which includes the conventional ADMM as a special case and can significantly improve the performance of the algorithm. Under suitable assumptions, we show that the iterative sequence generated by the BADMM converges to a stationary point of the associated augmented Lagrangian function. The obtained results support the use of the ADMM in nonconvex applications. [This is joint work with Fenghui Wang and Zongben Xu.]
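For orientation, a standard formulation of the Bregman ADMM updates is sketched below; this is a common textbook form for the two-block problem min f(x) + g(y) subject to Ax + By = b, and the precise scheme analyzed in the talk may differ in details.

```latex
% Augmented Lagrangian for  min_{x,y} f(x) + g(y)  s.t.  Ax + By = b:
%   L_\rho(x,y,\lambda) = f(x) + g(y) + \langle \lambda,\, Ax + By - b \rangle
%                         + \tfrac{\rho}{2}\|Ax + By - b\|^2.
% One BADMM iteration adds Bregman proximal terms to each subproblem:
\begin{align*}
x^{k+1} &\in \arg\min_{x}\; L_\rho(x, y^k, \lambda^k) + \Delta_{\varphi}(x, x^k),\\
y^{k+1} &\in \arg\min_{y}\; L_\rho(x^{k+1}, y, \lambda^k) + \Delta_{\psi}(y, y^k),\\
\lambda^{k+1} &= \lambda^k + \rho\,(A x^{k+1} + B y^{k+1} - b),
\end{align*}
% where \Delta_{\varphi}(u,v) = \varphi(u) - \varphi(v)
%   - \langle \nabla\varphi(v),\, u - v \rangle
% is the Bregman distance induced by a differentiable convex function \varphi.
% Taking \varphi = \psi = 0 recovers the conventional ADMM, consistent with
% the abstract's remark that BADMM includes ADMM as a special case.
```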
Please note the unusual day, time and place!