Description

Presentation slides: "Convergence Rates of Inexact Proximal-Gradient Methods for Convex Optimization" by Mark Schmidt, Nicolas Le Roux, Francis Bach. INRIA - SIERRA Project-Team, Laboratoire d'Informatique de l'École Normale Supérieure (CNRS/ENS/UMR 8548), December 2011. Sections: Motivation and Overview of Contribution; Related work on Inexact Algorithms; Convergence Rates for Convex Optimization; Numerical Experiments on a Structured Sparsity Problem.

Convergence Rates of Inexact Proximal-Gradient Methods for Convex Optimization

Mark Schmidt, Nicolas Le Roux, Francis Bach

INRIA - SIERRA Project-Team
Laboratoire d'Informatique de l'École Normale Supérieure (CNRS/ENS/UMR 8548)

December 2011
Outline

1. Motivation and Overview of Contribution
2. Related work on Inexact Algorithms
3. Convergence Rates for Convex Optimization
4. Numerical Experiments on a Structured Sparsity Problem
Composite Convex Optimization Problems

We consider composite optimization problems

    \min_{x \in \mathbb{R}^d} f(x) := g(x) + h(x),

where g and h are convex but h is non-smooth. Typically, g is a data-fitting term and h is a regularizer,

    \min_{x \in \mathbb{R}^d} \sum_{i=1}^{N} l_i(x) + \lambda r(x).

The most well-studied example is \ell_1-regularized least squares,

    \min_{x \in \mathbb{R}^d} \|Ax - b\|^2 + \lambda \|x\|_1.
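As a concrete illustration (not from the slides), a minimal proximal-gradient (ISTA) sketch for the \ell_1-regularized least-squares problem above: the gradient step on the smooth term g is followed by the proximal operator of \lambda\|\cdot\|_1, which is elementwise soft-thresholding. The function names and the constant step size 1/L are our own choices.

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of t * ||.||_1: elementwise soft-thresholding."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def proximal_gradient_l1(A, b, lam, n_iters=500):
    """Proximal-gradient (ISTA) for min_x ||Ax - b||^2 + lam * ||x||_1.

    Uses the fixed step size 1/L, where L = 2 * sigma_max(A)^2 is a
    Lipschitz constant of the gradient of g(x) = ||Ax - b||^2.
    """
    L = 2.0 * np.linalg.norm(A, 2) ** 2   # spectral norm gives sigma_max(A)
    x = np.zeros(A.shape[1])
    for _ in range(n_iters):
        grad = 2.0 * A.T @ (A @ x - b)            # gradient of the smooth term
        x = soft_threshold(x - grad / L, lam / L)  # prox step on the l1 term
    return x
```

With this step size the iterates decrease the composite objective monotonically, matching the O(1/k) rate discussed below for the convex case.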
Fast Convergence Rates of Proximal-Gradient Methods

We consider composite optimization problems

    \min_{x \in \mathbb{R}^d} f(x) := g(x) + h(x),

where g and h are convex but h is non-smooth. Convergence rates of methods for composite optimization:

    Algorithm                       Convex          Strongly Convex
    Sub-Gradient                    O(1/\sqrt{k})   O(1/k)
    Proximal-Gradient               O(1/k)          O((1 - \gamma)^k)
    Accelerated Proximal-Gradient   O(1/k^2)        O((1 - \sqrt{\gamma})^k)

Proximal-gradient methods have the same convergence rates as [accelerated] gradient methods for smooth optimization [Nesterov, 2007; Beck & Teboulle, 2009].
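The accelerated O(1/k^2) row of the table is attained by FISTA [Beck & Teboulle, 2009]. A minimal sketch for the \ell_1-regularized least-squares example, with our own naming, under the standard constant-step assumption (step 1/L, L a Lipschitz constant of the gradient of g):

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of t * ||.||_1: elementwise soft-thresholding."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def fista_l1(A, b, lam, n_iters=500):
    """Accelerated proximal-gradient (FISTA) for min_x ||Ax - b||^2 + lam*||x||_1.

    Differs from plain proximal-gradient only by the momentum extrapolation
    on the auxiliary point y, which lifts the rate from O(1/k) to O(1/k^2).
    """
    L = 2.0 * np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    y = x.copy()
    t = 1.0
    for _ in range(n_iters):
        grad = 2.0 * A.T @ (A @ y - b)                 # gradient at extrapolated point
        x_new = soft_threshold(y - grad / L, lam / L)  # proximal-gradient step
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)  # momentum extrapolation
        x, t = x_new, t_new
    return x
```

Unlike the basic method, FISTA is not monotone in the objective, but its worst-case guarantee is the O(1/k^2) rate shown in the table.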