Projected quasi-Newton methods for constrained and non-smooth optimization
25 pages
English


Description

We consider projected Newton-type methods for solving large-scale optimization problems arising in machine learning and related fields. We first introduce an algorithmic framework for projected Newton-type methods by reviewing a canonical projected (quasi-)Newton method. This method, while conceptually pleasing, has a high computational cost per iteration. Thus, we discuss two variants that are more scalable, namely, two-metric projection and inexact projection methods. Finally, we show how to apply the Newton-type framework to handle non-smooth objectives. Examples are provided throughout the chapter to illustrate machine learning applications of our framework.

Subjects

  • projected Newton methods
  • limited-memory Broyden-Fletcher-Goldfarb-Shanno (L-BFGS)
  • Hessian
  • Davidon-Fletcher-Powell
  • Newton method
  • Powell-symmetric-Broyden

Extract

1 Projected Newton-type Methods in Machine Learning

Mark Schmidt (schmidtmarkw@gmail.com)
University of British Columbia, Vancouver, BC, V6T 1Z4

Dongmin Kim (dmkim@cs.utexas.edu)
University of Texas at Austin, Austin, Texas 78712

Suvrit Sra (suvrit@tuebingen.mpg.de)
Max Planck Institute for Biological Cybernetics, 72076 Tübingen, Germany

1.1 Introduction

We study Newton-type methods for solving the optimization problem

    min_x  f(x) + r(x),   subject to  x ∈ Ω,        (1.1)
where f : R^n → R is twice continuously differentiable and convex; r : R^n → R is continuous and convex but not necessarily differentiable everywhere; while Ω is a "simple" convex constraint set. This formulation is general and captures numerous problems in machine learning, especially where f corresponds to a loss, and r to a regularizer. Let us, however, defer concrete examples of (1.1) until we have developed some theoretical background.

We propose to solve (1.1) via Newton-type methods, a certain class of second-order methods that are known to often work well for unconstrained problems. For constrained problems too, we may consider Newton-type methods that, akin to their unconstrained versions, iteratively minimize a quadratic approximation to the objective, this time subject to constraints. This idea dates back to Levitin and Polyak (1966, §7), and it is referred to as a projected Newton method. Projected Newton methods for optimization over convex sets share many of the appealing properties of their unconstrained counterparts. For example, their iterations are guaranteed to improve the objective function for a small enough stepsize; global convergence can be shown under a variant of the Armijo condition; and rapid local convergence rates can be shown around local minima satisfying strong convexity (Bertsekas, 1999).

In a similar vein, we may consider projected quasi-Newton methods, where we interpolate differences in parameter and gradient values to approximate the Hessian matrix. The resulting Newton-type methods are the subject of this chapter, and we will particularly focus on the limited-memory Broyden-Fletcher-Goldfarb-Shanno (L-BFGS) quasi-Newton approximation. The main appeal of the L-BFGS approximation is its linear-time iteration complexity, and its strong empirical performance on a variety of problems.

Summary of remainder. We first restrict ourselves to smooth optimization, where r(x) = 0. For this setting, we describe projected Newton-type methods (§1.2), covering basic implementation issues such as Hessian approximation and line search. Then, we describe two-metric projection methods (§1.3), followed by inexact (or truncated) projected Newton methods (§1.4). Finally, we discuss the nonsmooth setting, where r(x) ≠ 0, for which we describe two Newton-type methods (§1.5).

1.2 Projected Newton-type Methods

Projected Newton-type methods optimize their objective iteratively. At iteration k, they first approximate the objective function around the current …
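
To make the iteration concrete, the following Python fragment is a minimal sketch of one projected Newton-type step with an Armijo-style backtracking line search. It simply projects the Newton step onto Ω in the Euclidean metric, which is closer in spirit to the two-metric idea of §1.3 than to the canonical method mentioned in the description (minimizing the quadratic model subject to the constraints); the non-negative orthant stands in for the "simple" set Ω, and all names and parameter values are illustrative assumptions, not the chapter's notation.

    import numpy as np

    def project(x):
        # Illustrative "simple" set Omega: the non-negative orthant,
        # whose Euclidean projection is a coordinate-wise maximum.
        return np.maximum(x, 0.0)

    def projected_newton_step(f, grad, hess, x, sigma=1e-4, beta=0.5):
        # Newton-type direction: solve H d = -g; a quasi-Newton method
        # would replace hess(x) by an approximation such as L-BFGS.
        g = grad(x)
        d = np.linalg.solve(hess(x), -g)
        alpha, fx = 1.0, f(x)
        while True:
            x_new = project(x + alpha * d)
            # Armijo-style sufficient decrease, measured against the
            # step actually taken after projection.
            if f(x_new) <= fx + sigma * g.dot(x_new - x) or alpha < 1e-12:
                return x_new
            alpha *= beta

    # Hypothetical usage: an instance of (1.1) with r = 0 and
    # Omega = {x : x >= 0}, i.e., non-negative least squares.
    A = np.array([[3.0, 1.0], [1.0, 2.0]])
    b = np.array([1.0, -1.0])
    f = lambda x: 0.5 * np.sum((A @ x - b) ** 2)
    grad = lambda x: A.T @ (A @ x - b)
    hess = lambda x: A.T @ A
    x = np.zeros(2)
    for _ in range(50):
        x = projected_newton_step(f, grad, hess, x)

Note that naively pairing a Newton direction with a Euclidean projection is not guaranteed to yield descent in general; addressing this mismatch of metrics is precisely what the two-metric projection methods of §1.3 are designed for.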
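
The linear-time iteration complexity attributed to L-BFGS above comes from its two-loop recursion, which applies the inverse-Hessian approximation to a vector using only the m most recent parameter and gradient differences. Below is a minimal, self-contained sketch under assumed conventions: s_hist and y_hist store s_i = x_{i+1} - x_i and y_i = grad f(x_{i+1}) - grad f(x_i), oldest first, and the function name is illustrative.

    import numpy as np

    def lbfgs_direction(g, s_hist, y_hist):
        # Two-loop recursion: returns the quasi-Newton direction -H_k g,
        # where H_k is the L-BFGS inverse-Hessian approximation built
        # from the stored parameter and gradient differences.
        q = g.copy()
        rhos = [1.0 / y.dot(s) for s, y in zip(s_hist, y_hist)]
        alphas = []
        # First loop: newest pair to oldest.
        for s, y, rho in zip(reversed(s_hist), reversed(y_hist), reversed(rhos)):
            a = rho * s.dot(q)
            alphas.append(a)
            q -= a * y
        # Standard initial scaling H_0 = gamma * I with gamma = s'y / y'y;
        # with no history this reduces to a steepest-descent direction.
        gamma = (s_hist[-1].dot(y_hist[-1]) / y_hist[-1].dot(y_hist[-1])
                 if s_hist else 1.0)
        r = gamma * q
        # Second loop: oldest pair to newest, pairing each with its alpha.
        for (s, y, rho), a in zip(zip(s_hist, y_hist, rhos), reversed(alphas)):
            b = rho * y.dot(r)
            r += (a - b) * s
        return -r

Each loop visits every stored pair once, so computing the direction costs O(mn) time for memory size m and dimension n, i.e., linear in the problem size, as the chapter notes.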