Optimizing Costly Functions with Simple Constraints: A Limited-Memory Projected Quasi-Newton Algorithm
Mark Schmidt, Ewout van den Berg, Michael P. Friedlander, and Kevin Murphy
Department of Computer Science University of British Columbia
April 18, 2009
Outline
1. Introduction
   • Motivating Problem
   • Our Contribution
2. PQN Algorithm
3. Experiments
4. Discussion
Motivating Problem: Structure Learning in Discrete MRFs
Motivating Problem: Structure Learning in Discrete MRFs
We want to fit a Markov random field to discrete data y, but we do not know the graph structure.
We can learn a sparse structure by using ℓ1-regularization of the edge parameters [Wainwright et al. 2006, Lee et al. 2006]. Since each edge has multiple parameters, we use group ℓ1-regularization [Bach et al. 2004, Turlach et al. 2005, Yuan & Lin 2006]:

    minimize_w  −log p(y|w)   subject to   Σ_e ||w_e||_2 ≤ τ
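To make the constraint concrete, here is a minimal numpy sketch of evaluating the group ℓ1 constraint Σ_e ||w_e||_2 ≤ τ (illustrative only: the names and the group layout are assumptions, not from the talk):

import numpy as np

def group_l1_norm(w, groups):
    # Sum of per-edge Euclidean norms, sum_e ||w_e||_2, where each
    # entry of `groups` holds the indices of one edge's parameters.
    return sum(np.linalg.norm(w[idx]) for idx in groups)

# Hypothetical example: two edges with four parameters each.
w = np.random.randn(8)
groups = [np.arange(0, 4), np.arange(4, 8)]
tau = 1.0
print(group_l1_norm(w, groups) <= tau)  # is w feasible?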
Optimization Problem Challenges
Solving this optimization problem has three complicating factors:
1. the number of parameters is large
2. evaluating the objective is expensive
3. the parameters have constraints
So how should we solve it?
• Interior-point methods: the number of parameters is too large
• Projected gradient: evaluating the objective is too expensive
• Quasi-Newton methods (L-BFGS): we have constraints
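For reference, the projected-gradient iteration mentioned above can be sketched as follows (hypothetical names: `grad` and `project` stand in for the problem's gradient and projection operator). Each iteration requires a fresh gradient evaluation, which is exactly what makes the method slow when the objective is expensive:

def projected_gradient_step(w, grad, project, alpha):
    # One iteration: take a gradient step, then project the result
    # back onto the feasible set; cost is dominated by grad(w).
    return project(w - alpha * grad(w))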
Extending the L-BFGS Algorithm
Quasi-Newton methods that use L-BFGS updates achieve state-of-the-art performance for unconstrained differentiable optimization [Nocedal 1980, Liu & Nocedal 1989].
L-BFGS updates have also been used for more general problems:
• L-BFGS-B: state-of-the-art performance for bound-constrained optimization [Byrd et al. 1995]
• OWL-QN: state-of-the-art performance for ℓ1-regularized optimization [Andrew & Gao 2007]
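For context, the L-BFGS updates mentioned above compute a quasi-Newton direction from the m most recent iterate and gradient differences via the two-loop recursion [Nocedal 1980]. A standard textbook sketch in numpy (not code from the talk):

import numpy as np

def lbfgs_direction(g, s_list, y_list):
    # Two-loop recursion: returns d ≈ -H g, where H approximates the
    # inverse Hessian from pairs s_k = w_{k+1} - w_k, y_k = g_{k+1} - g_k
    # (lists ordered oldest to newest).
    q = g.copy()
    rhos = [1.0 / y.dot(s) for s, y in zip(s_list, y_list)]
    alphas = []
    for s, y, rho in reversed(list(zip(s_list, y_list, rhos))):
        a = rho * s.dot(q)
        alphas.append(a)
        q -= a * y
    # Standard initial Hessian scaling gamma = s'y / y'y.
    gamma = (s_list[-1].dot(y_list[-1]) / y_list[-1].dot(y_list[-1])
             if s_list else 1.0)
    r = gamma * q
    for (s, y, rho), a in zip(zip(s_list, y_list, rhos), reversed(alphas)):
        b = rho * y.dot(r)
        r += (a - b) * s
    return -r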
The above don’t apply since our constraints are not separable
However, the constraints are stillsimple: we can compute the projection inO(n)
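To make "simple" concrete: projection onto the ℓ1 ball needs only the sorted magnitudes [Duchi et al. 2008], and projection onto the group ℓ1 ball reduces to projecting the vector of group norms and rescaling each group. The sketch below uses the sort-based O(n log n) construction (the O(n) expected-time variant replaces the sort with randomized selection); names and the group layout are illustrative, and τ > 0 is assumed:

import numpy as np

def project_l1_ball(v, tau):
    # Euclidean projection of v onto {x : ||x||_1 <= tau}.
    if np.abs(v).sum() <= tau:
        return v.copy()
    u = np.sort(np.abs(v))[::-1]          # magnitudes, descending
    css = np.cumsum(u)
    k = np.nonzero(u * np.arange(1, len(v) + 1) > css - tau)[0][-1]
    theta = (css[k] - tau) / (k + 1.0)    # soft-threshold level
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

def project_group_l1(w, groups, tau):
    # Projection onto {w : sum_e ||w_e||_2 <= tau}: project the vector
    # of group norms onto the l1 ball, then rescale each group.
    norms = np.array([np.linalg.norm(w[idx]) for idx in groups])
    scaled = project_l1_ball(norms, tau)
    out = w.copy()
    for idx, n, p in zip(groups, norms, scaled):
        out[idx] = w[idx] * (p / n if n > 0 else 0.0)
    return out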