Article

A family of variable metric proximal methods

J. Frédéric Bonnans, Jean Charles Gilbert, Claude Lemaréchal, and Claudia A. Sagastizábal.
Mathematical Programming, 68(1): 15–47 (1995)
DOI: 10.1007/BF01585756

Abstract

We consider conceptual optimization methods combining two ideas: the Moreau–Yosida regularization in convex analysis, and quasi-Newton approximations of smooth functions. We outline several approaches based on this combination, and establish their global convergence. Then we study theoretically the local convergence properties of one of these approaches, which uses quasi-Newton updates of the objective function itself. We also obtain a globally and superlinearly convergent BFGS proximal method. At each step of our study, we single out the assumptions needed to derive the result concerned.
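
For context, the Moreau–Yosida regularization of a convex function f with parameter \lambda > 0 is the envelope

    F_\lambda(x) = \min_y \Big\{ f(y) + \tfrac{1}{2\lambda}\,\|y - x\|^2 \Big\},
    \qquad
    \nabla F_\lambda(x) = \frac{x - p_\lambda(x)}{\lambda},

where p_\lambda(x) is the unique minimizer (the proximal point of x). F_\lambda is differentiable with Lipschitz gradient and has the same minimizers as f, so quasi-Newton iterations can be run on F_\lambda even when f itself is nonsmooth. The following Python sketch illustrates this idea on f = \|\cdot\|_1, whose proximal point is soft-thresholding and thus available in closed form. It is a minimal sketch only: the helper names (prox_l1, envelope, bfgs_proximal), the unit regularization parameter, and the plain Armijo backtracking are illustrative assumptions, not the algorithms analyzed in the paper.

    import numpy as np

    def prox_l1(x, lam):
        # Proximal point of lam*||.||_1 at x (soft-thresholding); one of
        # the few cases where the proximal map has a closed form.
        return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

    def envelope(x, lam):
        # Moreau-Yosida envelope F_lam(x) and its gradient (x - p(x))/lam.
        p = prox_l1(x, lam)
        F = np.sum(np.abs(p)) + np.dot(p - x, p - x) / (2.0 * lam)
        return F, (x - p) / lam

    def bfgs_proximal(x, lam=1.0, tol=1e-8, max_iter=200):
        # Illustrative BFGS iteration applied to the smooth envelope F_lam;
        # NOT the paper's algorithms, which study several update variants.
        n = x.size
        H = np.eye(n)                      # inverse-Hessian approximation
        F, g = envelope(x, lam)
        for _ in range(max_iter):
            if np.linalg.norm(g) < tol:
                break
            d = -H @ g                     # quasi-Newton direction on F_lam
            t, slope = 1.0, g @ d
            while True:                    # Armijo backtracking line search
                F_new, g_new = envelope(x + t * d, lam)
                if F_new <= F + 1e-4 * t * slope or t < 1e-12:
                    break
                t *= 0.5
            s, y = t * d, g_new - g
            sy = s @ y
            if sy > 1e-12:                 # curvature safeguard before update
                rho = 1.0 / sy
                V = np.eye(n) - rho * np.outer(s, y)
                H = V @ H @ V.T + rho * np.outer(s, s)
            x, F, g = x + t * d, F_new, g_new
        return x

For example, bfgs_proximal(np.array([3.0, -2.0, 0.5])) drives the iterate toward the origin, the minimizer of the l1-norm; the line search matters here because the envelope's curvature is only piecewise constant, which can mislead unit-step quasi-Newton directions.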
