Speaker: John Esser

Institution: UCLA Department of Mathematics

Time: Monday, September 28, 2009 - 4:00pm

Location: RH 306

In this talk, based on joint work with Xiaoqun Zhang and Tony
Chan, I will discuss some generalizations and extensions of the
primal-dual hybrid gradient (PDHG) algorithm proposed by Zhu and Chan. The
PDHG method applied to a saddle point formulation of a convex minimization
problem proceeds by alternating proximal steps that maximize and minimize
penalized forms of the saddle function. This can be useful for producing
explicit algorithms for large non-differentiable convex problems, and a
slight modification to the method can be made to guarantee convergence.
I will mainly focus on the connections to related algorithms including
proximal forward backward splitting, split Bregman and split inexact Uzawa
methods. For the problem of minimizing sums of convex functionals
composed with linear operators, I will show how operator splitting
techniques allow the modified PDHG method to be applied effectively.
Specific applications to constrained TV deblurring and compressive sensing
problems will be presented.
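To make the alternation concrete, here is a minimal sketch (not the speaker's exact formulation) of a modified PDHG iteration for one-dimensional total variation denoising, min_x ||Dx||_1 + (lam/2)||x - b||^2, written against its saddle point form min_x max_{|y|<=1} <Dx, y> + (lam/2)||x - b||^2. The function name, step sizes, and the extrapolation step standing in for the convergence-guaranteeing modification are illustrative assumptions:

```python
import numpy as np

def tv_denoise_pdhg(b, lam=1.0, n_iter=500, tau=0.25, sigma=0.25):
    """Illustrative modified-PDHG sketch for
    min_x ||Dx||_1 + (lam/2)||x - b||^2,
    with D the 1D forward-difference operator (||D||^2 <= 4,
    so tau * sigma = 1/16 satisfies the usual step-size bound)."""
    x = b.copy()
    x_bar = x.copy()
    y = np.zeros(b.size - 1)  # dual variable, one entry per difference
    for _ in range(n_iter):
        # Dual proximal (ascent) step: project onto the l-infinity ball,
        # the constraint set coming from the ||.||_1 term.
        y = np.clip(y + sigma * np.diff(x_bar), -1.0, 1.0)
        # D^T y for forward differences: (D^T y)_j = y_{j-1} - y_j,
        # with zero boundary terms.
        dty = np.concatenate(([-y[0]], y[:-1] - y[1:], [y[-1]]))
        # Primal proximal (descent) step: closed form for the
        # penalized quadratic data-fidelity term.
        x_new = (x - tau * dty + tau * lam * b) / (1.0 + tau * lam)
        # Extrapolation step: a simple modification of plain PDHG
        # of the kind used to guarantee convergence.
        x_bar = 2.0 * x_new - x
        x = x_new
    return x
```

Each iteration touches the nonsmooth term only through an explicit projection, which is what makes this style of splitting attractive for large non-differentiable problems such as TV deblurring.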