An Introduction to Conditional Random Fields for Relational Learning
48 pages
English


Description

Level: Higher education, Doctorate, Bac+8
An Introduction to Conditional Random Fields for Relational Learning

Charles Sutton, Department of Computer Science, University of Massachusetts, USA
Andrew McCallum, Department of Computer Science, University of Massachusetts, USA

1.1 Introduction

Relational data has two characteristics: first, statistical dependencies exist between the entities we wish to model, and second, each entity often has a rich set of features that can aid classification. For example, when classifying Web documents, the page's text provides much information about the class label, but hyperlinks define a relationship between pages that can improve classification [Taskar et al., 2002]. Graphical models are a natural formalism for exploiting the dependence structure among entities. Traditionally, graphical models have been used to represent the joint probability distribution p(y, x), where the variables y represent the attributes of the entities that we wish to predict, and the input variables x represent our observed knowledge about the entities. But modeling the joint distribution can lead to difficulties when using the rich local features that can occur in relational data, because it requires modeling the distribution p(x), which can include complex dependencies. Modeling these dependencies among inputs can lead to intractable models, but ignoring them can lead to reduced performance. A solution to this problem is to directly model the conditional distribution p(y|x), which is sufficient for classification.
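As a hedged illustration of the conditional approach described above (the symbols f_k, θ_k, and Z(x) follow common CRF notation and are not quoted from this page), a linear-chain conditional random field models the label sequence y given the input sequence x as

\[
p(y \mid x) \;=\; \frac{1}{Z(x)} \prod_{t=1}^{T} \exp\!\Big( \sum_{k} \theta_k \, f_k(y_t, y_{t-1}, x_t) \Big),
\qquad
Z(x) \;=\; \sum_{y'} \prod_{t=1}^{T} \exp\!\Big( \sum_{k} \theta_k \, f_k(y'_t, y'_{t-1}, x_t) \Big),
\]

where the f_k are feature functions over adjacent labels and the observed input, and the θ_k are learned weights. Only the normalizer Z(x), a sum over label sequences, needs to be computed, so no model of the input distribution p(x) is required.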

Subjects

  • distribution over
  • unlike linear-chain
  • variable node
  • entity labels
  • underlying entity
  • input can
  • output variable
  • chain CRF
  • conditional random fields

Information

Published by
Number of reads: 62
Language: English

