Accelerated Bregman Proximal Gradient Methods

Established: September 1, 2018

A Python package of accelerated first-order algorithms for solving relatively smooth convex optimization problems of the form

minimize { f(x) + P(x) | x in C }

with a reference function h(x), where C is a closed convex set and

  • h(x) is convex and essentially smooth on C;
  • f(x) is convex and differentiable, and L-smooth relative to h(x), that is, L*h(x) - f(x) is convex;
  • P(x) is convex and closed (lower semi-continuous).
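
For intuition, one basic (non-accelerated) Bregman proximal gradient step solves

    x+ = argmin_{x in C} { <grad f(xk), x> + P(x) + L * D_h(x, xk) },

where D_h(x, y) = h(x) - h(y) - <grad h(y), x - y> is the Bregman divergence of h. The Python sketch below is only an illustration of this step, not the package's API: it assumes the specific choice h(x) = sum_i x_i*log(x_i) (negative entropy), C the probability simplex, and P the indicator of C, for which the step has a closed-form exponentiated-gradient update; the least-squares instance (A, b) is made-up toy data.

    import numpy as np

    def bregman_prox_grad_step(x, grad, L):
        """One BPG step with h = negative entropy, C = probability simplex, P = indicator of C.

        Minimizing <grad, y> + L * D_h(y, x) over the simplex gives the closed-form
        exponentiated-gradient update y_i proportional to x_i * exp(-grad_i / L).
        """
        y = x * np.exp(-grad / L)
        return y / y.sum()

    # Toy instance (assumed data): minimize f(x) = 0.5*||A x - b||^2 over the simplex.
    # On the simplex the Hessian of h is diag(1/x_i) >= I, so L*h - f is convex
    # (i.e., f is L-smooth relative to h) whenever L >= largest eigenvalue of A^T A.
    rng = np.random.default_rng(0)
    A = rng.standard_normal((50, 20))
    b = rng.standard_normal(50)
    L = np.linalg.norm(A, 2) ** 2

    x = np.full(20, 1.0 / 20)            # start at the center of the simplex
    for k in range(500):
        grad = A.T @ (A @ x - b)         # gradient of f at the current iterate
        x = bregman_prox_grad_step(x, grad, L)

    print("final objective:", 0.5 * np.linalg.norm(A @ x - b) ** 2)

The package itself implements the accelerated variants of this iteration analyzed in the paper below.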


Publication:

https://arxiv.org/abs/1808.03045

GitHub repository:

https://github.com/Microsoft/accbpg