Event № 800
Advisors: Dan Garber and Shoham Sabach
Abstract: Composite convex optimization problems that include a low-rank-promoting term have important applications in data and imaging sciences. However, such problems are highly challenging to solve at large scale: the low-rank-promoting term prohibits efficient implementations of proximal-based methods, and even simple subgradient methods are of very limited use. On the other hand, methods tailored for low-rank optimization, such as conditional gradient-type methods, are usually slow. Motivated by these drawbacks, we present new algorithms and complexity results for several optimization problems in this class. At the heart of our results is the idea of using low-rank SVD computations in every iteration. This talk is based on joint works with Dan Garber and Shoham Sabach.
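To give a concrete feel for the idea of using only low-rank SVD computations per iteration, here is a minimal illustrative sketch, not the authors' method: a conditional-gradient (Frank-Wolfe) loop over the nuclear-norm ball, where each step requires only the leading singular pair of the gradient (computed by power iteration) rather than a full SVD. The toy objective, step-size rule, and all function names are assumptions for illustration.

```python
import numpy as np

def top_singular_pair(A, iters=50, seed=0):
    # Power iteration on A^T A: returns an approximate leading singular pair (u, v) of A.
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(A.shape[1])
    for _ in range(iters):
        v = A.T @ (A @ v)
        v /= np.linalg.norm(v)
    u = A @ v
    u /= np.linalg.norm(u)
    return u, v

def frank_wolfe_nuclear(grad, x0, tau, steps=200):
    # Conditional gradient over the nuclear-norm ball {X : ||X||_* <= tau}.
    # Each iteration needs only a rank-1 SVD of the gradient, not a full SVD.
    X = x0.copy()
    for t in range(steps):
        G = grad(X)
        u, v = top_singular_pair(G)
        S = -tau * np.outer(u, v)   # linear minimization oracle: rank-1 vertex
        eta = 2.0 / (t + 2.0)       # standard Frank-Wolfe step size
        X = (1 - eta) * X + eta * S
    return X

# Toy instance: f(X) = 0.5 * ||X - M||_F^2 with a low-rank target M.
rng = np.random.default_rng(1)
M = rng.standard_normal((20, 3)) @ rng.standard_normal((3, 15))
grad = lambda X: X - M
tau = np.linalg.svd(M, compute_uv=False).sum()  # radius chosen so M is feasible
X = frank_wolfe_nuclear(grad, np.zeros_like(M), tau)
final_obj = 0.5 * np.linalg.norm(X - M) ** 2
```

Each iterate is a convex combination of rank-1 matrices, so the per-iteration cost is dominated by a single leading-singular-pair computation; the abstract's point is that such methods, while cheap per step, tend to converge slowly, motivating the new algorithms presented in the talk.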