Learning Dynamic Bayesian Networks

Zoubin Ghahramani
Department of Computer Science
University of Toronto
Toronto, ON M5S 3H5, Canada
http://www.cs.toronto.edu/~zoubin    zoubin@cs.toronto.edu

October 1997

To appear in C.L. Giles and M. Gori (eds.), Adaptive Processing of Sequences and Data Structures, Lecture Notes in Artificial Intelligence, Springer-Verlag.

Abstract. Bayesian networks are directed acyclic graphs that represent dependencies between variables in a probabilistic model. Many time series models, including the hidden Markov models (HMMs) used in speech recognition and the Kalman filter models used in filtering and control applications, can be viewed as examples of dynamic Bayesian networks. We first provide a brief tutorial on learning and Bayesian networks. We then present some dynamic Bayesian networks that can capture much richer structure than HMMs and Kalman filters, including spatial and temporal multiresolution structure, distributed hidden state representations, and multiple switching linear regimes. While exact probabilistic inference is intractable in these networks, one can obtain tractable variational approximations which call as subroutines the forward-backward and Kalman filter recursions. These approximations can be used to learn the model parameters by maximizing a lower bound on the likelihood.

Table of Contents

1 Introduction
2 Bayesian networks
3 Dynamic Bayesian networks
3.1 Example 1: State-space models
3.2 Example 2: Hidden Markov models
4 Learning and Inference
4.1 ML Estimation with Complete Data
4.2 ML Estimation with Hidden Variables: The EM Algorithm
4.3 Example 1: Learning state-space models
4.4 Example 2: Learning hidden Markov models
5 Beyond Tractable Models
5.1 Example 3: Factorial HMMs
5.2 Example 4: Tree structured HMMs
5.3 Example 5: Switching State-space models
6 Inference and Intractability
6.1 Gibbs sampling
6.2 Variational Methods
6.3 Example: Mean field for factorial HMMs
6.4 Example: Structured approximation for factorial HMMs
6.5 Convex duality
7 Conclusion

1 Introduction

Suppose we wish to build a model of data from a finite sequence of ordered observations, {Y_1, Y_2, ..., Y_t}. In most realistic scenarios, from modeling stock prices to physiological data, the observations are not related deterministically. Furthermore, there is added uncertainty resulting from the limited size of the data set and from any mismatch between our model and the true process that generated the data. Probability theory provides a powerful tool for expressing both randomness and uncertainty in our model [23]. We can express the uncertainty in our prediction of a future outcome Y_{t+1} via a probability density P(Y_{t+1} | Y_1, ..., Y_t).
Such a probability density can be used to make point predictions, to place error bars on those predictions, and to make decisions that minimize the expected value of some loss function.

This chapter presents probabilistic models of time series data in the formalism of Bayesian networks (also known as belief networks or probabilistic graphical models), a marriage of probability theory and graph theory in which dependencies between variables are expressed graphically. The graphs not only allow the user to understand which variables affect which other ones, but also serve as the backbone for efficiently computing the marginal and conditional probabilities that may be required for inference and learning. The following section provides a brief tutorial on Bayesian networks. Section 3 describes their use in time series modeling, including models such as the Kalman filter and hidden Markov models. Section 4 focuses on the problem of learning the parameters of a Bayesian network using the Expectation-Maximization (EM) algorithm. Section 5 describes richer models, suitable for nonlinear and multiresolution time series, in which exact inference may be computationally intractable. Section 6 presents tractable approximations, which form the basis of variational methods for inference and learning.

2 Bayesian networks

A Bayesian network is simply a graphical model for representing conditional dependencies between a set of random variables. Consider four random variables, W, X, Y, and Z. From basic probability theory we know that we can factor the joint probability as a product of conditional probabilities:

    P(W, X, Y, Z) = P(W) P(X | W) P(Y | W, X) P(Z | W, X, Y).

This factorization does not tell us anything useful about the joint probability distribution: each variable can potentially depend on every other variable.
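A quick numerical aside (not part of the original text): the chain-rule factorization above holds for any joint distribution, which is easy to verify in a few lines of Python. The sketch below, assuming NumPy and an arbitrary random joint table over four binary variables, rebuilds the joint from the four conditional factors.

```python
import numpy as np

rng = np.random.default_rng(0)

# Arbitrary joint distribution over binary W, X, Y, Z (axes in that order).
joint = rng.random((2, 2, 2, 2))
joint /= joint.sum()

# Chain-rule factors: P(W), P(X|W), P(Y|W,X), P(Z|W,X,Y).
p_w = joint.sum(axis=(1, 2, 3))                 # P(W)
p_wx = joint.sum(axis=(2, 3))                   # P(W,X)
p_x_w = p_wx / p_w[:, None]                     # P(X|W)
p_wxy = joint.sum(axis=3)                       # P(W,X,Y)
p_y_wx = p_wxy / p_wx[:, :, None]               # P(Y|W,X)
p_z_wxy = joint / p_wxy[:, :, :, None]          # P(Z|W,X,Y)

# Rebuild the joint from the factors; the chain rule guarantees equality.
rebuilt = (p_w[:, None, None, None]
           * p_x_w[:, :, None, None]
           * p_y_wx[:, :, :, None]
           * p_z_wxy)
assert np.allclose(rebuilt, joint)
```

Because no structure was assumed, the assertion passes for any random joint table; the factorization only becomes informative once some factors drop their dependence on earlier variables, as in the next paragraph.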
However, consider the following factorization:

    P(W, X, Y, Z) = P(W) P(X) P(Y | W) P(Z | X, Y).   (1)

The above factorization implies a set of conditional independence relations. A variable A is conditionally independent from B given C if P(A | B, C) = P(A | C) for all A, B and C such that P(C) is not zero. From factorization (1) we can show that, given the values of X and Y, the variables Z and W are independent:

    P(Z, W | X, Y) = P(W, X, Y, Z) / P(X, Y)
                   = P(W) P(X) P(Y | W) P(Z | X, Y) / [ Int Int P(W) P(X) P(Y | W) P(Z | X, Y) dW dZ ]
                   = P(W) P(Y | W) P(Z | X, Y) / P(Y)
                   = P(W | Y) P(Z | X, Y),

which is a product of a term depending only on W and a term depending only on Z (given X and Y); hence W and Z are conditionally independent given X and Y.

A Bayesian network is a graphical representation of such a factorization: each variable corresponds to a node in a directed graph, and an arc is drawn from node A to node B if B is conditioned on A in the factorization. The network representing factorization (1) is shown in Fig. 1.

Fig. 1. A directed acyclic graph (DAG) consistent with the conditional independence relations in P(W, X, Y, Z): nodes W, X, Y, Z with arcs W -> Y, X -> Z, and Y -> Z.

Some definitions from graph theory will be needed. The parents of a node A are the nodes with arcs pointing to A; A is then a child of each of its parents. A directed path from A to B is a sequence of nodes starting at A and ending at B such that each node in the sequence is a parent of the following node. An undirected path from A to B is a sequence of nodes starting at A and ending at B such that each node in the sequence is either a parent or a child of the following node. The descendants of A are the nodes that can be reached from A along a directed path; all other nodes are its non-descendants.

The basic semantics of Bayesian networks can be summarized as follows: each node is conditionally independent of its non-descendants given its parents. More generally, since there is a one-to-one correspondence between nodes and variables, we will talk about independence relations between nodes, meaning the conditional independence relations between the variables associated with those nodes. The full set of conditional independence relations implied by a network can be read off the graph using the notion of d-separation: for disjoint sets of nodes A, B and C, the sets A and B are conditionally independent given C if C d-separates A and B, that is, if along every undirected path between a node in A and a node in B there is a node D such that either (1) D has converging arrows and neither D nor its descendants are in C, or (2) D does not have converging arrows and D is in C [41]. A graph G is said to be an independence map (I-map) for a distribution P if every d-separation displayed by G corresponds to a valid conditional independence relation in P; G is a minimal I-map if no arc can be deleted from G without removing the I-map property. From visual inspection of the graphical model it is therefore easy to infer many independence relations without explicitly grinding through Bayes rule. For example, W is conditionally independent from X given the set C = {Y, Z}, since Y, which is in C, lies along the only undirected path between W and X and does not have converging arrows along it.
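The independence derived above can also be checked numerically. The following sketch (again an illustration, not from the paper) fills the factors of (1) with random conditional probability tables and verifies that P(W, Z | X, Y) = P(W | X, Y) P(Z | X, Y); all table values are made up.

```python
import numpy as np

rng = np.random.default_rng(1)

def normalize(a, axis):
    return a / a.sum(axis=axis, keepdims=True)

# Random tables realizing factorization (1) for binary variables.
p_w = normalize(rng.random(2), 0)              # P(W)
p_x = normalize(rng.random(2), 0)              # P(X)
p_y_w = normalize(rng.random((2, 2)), 1)       # P(Y|W), indexed [W, Y]
p_z_xy = normalize(rng.random((2, 2, 2)), 2)   # P(Z|X,Y), indexed [X, Y, Z]

# Joint P(W,X,Y,Z) with axes ordered (W, X, Y, Z).
joint = (p_w[:, None, None, None] * p_x[None, :, None, None]
         * p_y_w[:, None, :, None] * p_z_xy[None, :, :, :])

p_xy = joint.sum(axis=(0, 3))                       # P(X,Y)
p_wz_xy = joint / p_xy[None, :, :, None]            # P(W,Z|X,Y)
p_w_xy = joint.sum(axis=3) / p_xy[None, :, :]       # P(W|X,Y)
p_z_given_xy = joint.sum(axis=0) / p_xy[:, :, None] # P(Z|X,Y)

product = p_w_xy[:, :, :, None] * p_z_given_xy[None, :, :, :]
assert np.allclose(p_wz_xy, product)   # W and Z independent given X, Y
```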
However, we cannot infer from the graph that W is conditionally independent from X given Z alone: along the path W-Y-Z-X, the node Z has converging arrows and is in the conditioning set, so the path is not blocked. Notice that since each factorization implies a strict ordering of the variables, the graph constructed in this manner is always a directed acyclic graph. Furthermore, there are many ways to factorize a joint distribution, and consequently there are many Bayesian networks consistent with a particular joint distribution.

The conditional independence relations encoded by a Bayesian network can be exploited to obtain efficient algorithms for computing marginal and conditional probabilities. For singly connected networks, in which the underlying undirected graph has no loops, there exists a general algorithm called belief propagation [31, 41]. For multiply connected networks, in which there can be more than one undirected path between two nodes, there exists a more general algorithm known as the junction tree algorithm [33, 25]. We convey here only the essence of belief propagation, since the methods used throughout this paper are based on it, and refer the reader to the relevant literature for details [41, 24, 19].

Assume we observe some evidence e: the values of some variables in the network. The goal of belief propagation is to update the marginal probabilities of all the variables in the network to incorporate this new evidence. This is achieved by local message passing: each node n sends messages to its parents and to its children. Since the graph is singly connected, n separates the graph, and therefore the evidence, into two mutually exclusive sets: e+(n), the evidence obtained from the parents of n, {p_1, ..., p_k}, and from the nodes connected to n through its parents, together with n itself; and e-(n), the evidence obtained from the children of n, {c_1, ..., c_l}, and from the nodes connected to n through its children (Figure 2).

Fig. 2. Separation of evidence in singly connected graphs: the parents p1, p2, p3 of node n and everything above them lie in e+(n); the children c1, c2, c3 and everything below them lie in e-(n).

The message from n to each of its children is the probability of each setting of n given the evidence in e+(n); it is computed from the messages pi(p_i) that n receives from its parents:

    pi(n) = P(n | e+(n))  proportional to  Sum_{p_1, ..., p_k} P(n | p_1, ..., p_k) Prod_{i=1}^{k} pi(p_i).   (2)

The message from n to each of its parents summarizes the evidence in e-(n); it is assembled from the messages lambda_{c_j}(n) that n receives from its children:

    lambda(n) = P(e-(n) | n) = Prod_{j=1}^{l} lambda_{c_j}(n).

The probability of node n given all the evidence is proportional to the product of the message obtained from its parents and the message obtained from its children, P(n | e) proportional to pi(n) lambda(n). The summation in (2) (or, more generally, the integral for continuous-valued variables) extends over all joint settings of the parents {p_1, ..., p_k}.
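Before turning to the worked example below, here is a minimal numerical illustration of the pi/lambda message-passing idea, using a hypothetical three-node chain A -> B -> C (not the network of Fig. 1) with evidence at C. The posterior of the middle node computed from its two messages is checked against brute-force enumeration; this is a sketch of the recursions only, not an implementation from the paper.

```python
import numpy as np

rng = np.random.default_rng(6)

def norm_rows(a):
    return a / a.sum(axis=-1, keepdims=True)

# Chain A -> B -> C over binary variables; observe evidence C = 1.
p_a = norm_rows(rng.random(2))            # P(A)
p_b_a = norm_rows(rng.random((2, 2)))     # P(B|A), indexed [A, B]
p_c_b = norm_rows(rng.random((2, 2)))     # P(C|B), indexed [B, C]
c_obs = 1

pi_b = p_b_a.T @ p_a          # pi(B) = sum_a P(B|a) P(a), evidence above B
lam_b = p_c_b[:, c_obs]       # lambda(B) = P(C=1|B), evidence below B

posterior = pi_b * lam_b      # P(B|e) proportional to pi(B) * lambda(B)
posterior /= posterior.sum()

# Brute-force check by enumerating the joint P(A,B,C).
joint = p_a[:, None, None] * p_b_a[:, :, None] * p_c_b[None, :, :]
brute = joint[:, :, c_obs].sum(axis=0)
brute /= brute.sum()
assert np.allclose(posterior, brute)
```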
For example, given the evidence e = {X = x, Z = z}, the computation of P(Y | X = x, Z = z) reduces to combining two local messages:

    P(Y | X = x, Z = z)  proportional to  [ Int P(Y | W) P(W) dW ] P(Z = z, X = x | Y)   (3)
                         proportional to  P(Y) P(Z = z | X = x, Y) P(X = x),   (4)

where Int P(Y | W) P(W) dW is the message passed from W to Y (P(W) itself is the message sent by W, since e+(W) is empty), and P(Z = z, X = x | Y) is the message passed from Z to Y.

Variables in the evidence set are referred to as observable variables, while those not in the evidence set are referred to as hidden variables. Often a Bayesian network is constructed by combining a priori knowledge about the conditional independences between the variables, perhaps from an expert in a particular domain, and a set of observed data. A natural way in which this a priori knowledge can be elicited from the expert is by asking questions regarding causality: a variable that has a direct causal effect on another variable will be its parent in the network. Since temporal order specifies the direction of causality, this notion plays an important role in the design of dynamic Bayesian networks.

3 Dynamic Bayesian networks

In time series modeling, we observe the values of certain variables at different points in time. The assumption that an event can cause another event in the future, but not vice-versa, simplifies the design of Bayesian networks for time series: directed arcs should flow forward in time. Assigning a time index t to each variable, one of the simplest causal models for a sequence of observations {Y_1, ..., Y_T} is a first-order Markov model, in which each variable depends directly only on the previous variable (Fig. 3):

    P(Y_1, ..., Y_T) = P(Y_1) P(Y_2 | Y_1) ... P(Y_T | Y_{T-1}).

Fig. 3. A Bayesian network representing a first-order Markov process: Y_1 -> Y_2 -> Y_3 -> ... -> Y_T.

These models do not directly represent dependencies between observables over more than one time step: having observed {Y_1, ..., Y_t}, predictions of Y_{t+1} depend only on the value of Y_t. One way of extending Markov models is to allow higher order interactions between variables; for example, a kth-order Markov model allows arcs from Y_{t-k}, ..., Y_{t-1} to Y_t. Another way to extend Markov models is to posit that the observations are dependent on a hidden variable, which we will call the state, and that the sequence of states defines a Markov process (Figure 4). A classic model of this kind is the linear-Gaussian state-space model, also known as the Kalman filter.
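As an illustration of the first-order Markov factorization above, the short sketch below samples a discrete sequence and evaluates log P(Y_1, ..., Y_T) = log P(Y_1) + Sum_t log P(Y_t | Y_{t-1}). The initial distribution and transition table are random placeholders, not parameters from the text.

```python
import numpy as np

rng = np.random.default_rng(5)
M, T = 4, 8   # number of discrete values, sequence length

def norm_rows(a):
    return a / a.sum(axis=-1, keepdims=True)

p1 = norm_rows(rng.random(M))            # P(Y_1)
trans = norm_rows(rng.random((M, M)))    # P(Y_t | Y_{t-1})

# Sample a sequence from the first-order Markov model.
seq = [rng.choice(M, p=p1)]
for t in range(1, T):
    seq.append(rng.choice(M, p=trans[seq[-1]]))

# Evaluate its log probability under the chain factorization.
logp = np.log(p1[seq[0]]) + sum(
    np.log(trans[seq[t - 1], seq[t]]) for t in range(1, T))
print(seq, logp)
```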
Fig. 4. A Bayesian network specifying the conditional independence relations for a state-space model: a chain of hidden states X_1 -> X_2 -> ... -> X_T, with each X_t the parent of the corresponding observation Y_t.

3.1 Example 1: State-space models

In state-space models, a sequence of D-dimensional real-valued observation vectors, {Y_1, ..., Y_T}, is modeled by assuming that at each time step Y_t was generated from a K-dimensional real-valued hidden state variable X_t, and that the sequence of X_t's defines a first-order Markov process. Using the shorthand notation {Y_t} to denote a sequence from t = 1 to t = T, the joint probability factors as

    P({X_t, Y_t}) = P(X_1) P(Y_1 | X_1) Prod_{t=2}^{T} P(X_t | X_{t-1}) P(Y_t | X_t).   (5)

The state transition probability P(X_t | X_{t-1}) can be decomposed into deterministic and stochastic components,

    X_t = f_t(X_{t-1}) + w_t,

where f_t is the deterministic transition function determining the mean of X_t given X_{t-1}, and w_t is a zero-mean random noise vector. Similarly, the observation probability P(Y_t | X_t) can be decomposed as

    Y_t = g_t(X_t) + v_t.

If both the transition and output functions are linear and time-invariant, and the distributions of the state and observation noise variables are Gaussian, the model becomes a linear-Gaussian state-space model:

    X_t = A X_{t-1} + w_t   (6)
    Y_t = C X_t + v_t,   (7)

where A is the state transition matrix and C is the observation matrix.

Often, the observation sequence can be divided into a set of input (or predictor) variables and a set of output (or response) variables. Again assuming linearity and Gaussian noise, we can write the state transition function as

    X_t = A X_{t-1} + B U_t + w_t,   (8)

where U_t is the input observation vector and B is the input matrix. The Bayesian network corresponding to this model would include a sequence of nodes {U_t}, each of which is a parent of the corresponding X_t. Linear-Gaussian state-space models are used extensively in all areas of control and signal processing [28, 32, 36].
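The generative process in equations (6) and (7) is straightforward to simulate, which can be a useful sanity check when implementing these models. In the sketch below every parameter value (the matrices A and C and the noise covariances) is invented purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)
K, D, T = 2, 3, 100   # state dimension, observation dimension, length

A = np.array([[0.99, -0.10],
              [0.10,  0.99]])        # K x K state transition matrix
C = rng.standard_normal((D, K))      # D x K observation matrix
Q = 0.01 * np.eye(K)                 # covariance of the state noise w_t
R = 0.10 * np.eye(D)                 # covariance of the observation noise v_t

x = np.zeros((T, K))
y = np.zeros((T, D))
x[0] = rng.multivariate_normal(np.zeros(K), np.eye(K))  # initial state X_1
for t in range(T):
    if t > 0:
        # Equation (6): X_t = A X_{t-1} + w_t
        x[t] = A @ x[t - 1] + rng.multivariate_normal(np.zeros(K), Q)
    # Equation (7): Y_t = C X_t + v_t
    y[t] = C @ x[t] + rng.multivariate_normal(np.zeros(D), R)
```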
3.2 Example 2: Hidden Markov models

In a hidden Markov model (HMM), the sequence of observations {Y_t} is modeled by assuming that each observation depends on a discrete hidden state S_t, and that the sequences of hidden states are distributed according to a Markov process. The joint probability for the sequences of states and observations can be factored in exactly the same manner as equation (5), with S_t taking the place of X_t:

    P({S_t, Y_t}) = P(S_1) P(Y_1 | S_1) Prod_{t=2}^{T} P(S_t | S_{t-1}) P(Y_t | S_t).   (9)

Consequently, the conditional independences in an HMM can also be expressed graphically using the Bayesian network shown in Figure 4. The state is represented by a single multinomial variable that can take one of K discrete values, S_t in {1, ..., K}. The state transition probabilities, P(S_t | S_{t-1}), are specified by a K x K transition matrix. If the observables are discrete symbols taking on one of L values, the observation probabilities P(Y_t | S_t) can be fully specified by a K x L emission matrix. For a real-valued observation vector, P(Y_t | S_t) can be modeled in many different forms, such as a Gaussian, a mixture of Gaussians, or a neural network. Like state-space models, HMMs can be augmented with a sequence of input variables; the model then specifies the conditional distribution of a sequence of output observations given a sequence of inputs. HMMs have been applied extensively to problems in speech recognition [e.g., 48], computational biology [e.g., 2], and fault detection.
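Equation (9) likewise translates directly into a sampler and a joint log-probability computation. The sketch below uses random placeholder values for the initial state distribution, the K x K transition matrix, and the K x L emission matrix.

```python
import numpy as np

rng = np.random.default_rng(2)
K, L, T = 3, 5, 10   # hidden states, output symbols, sequence length

def norm_rows(a):
    return a / a.sum(axis=-1, keepdims=True)

pi0 = norm_rows(rng.random(K))        # P(S_1)
A = norm_rows(rng.random((K, K)))     # P(S_t | S_{t-1}), K x K
E = norm_rows(rng.random((K, L)))     # P(Y_t | S_t),     K x L

# Sample state and observation sequences from the generative model.
s = [rng.choice(K, p=pi0)]
for t in range(1, T):
    s.append(rng.choice(K, p=A[s[-1]]))
y = [rng.choice(L, p=E[st]) for st in s]

# Joint log probability of states and observations, equation (9).
logp = np.log(pi0[s[0]]) + np.log(E[s[0], y[0]])
for t in range(1, T):
    logp += np.log(A[s[t - 1], s[t]]) + np.log(E[s[t], y[t]])
print(logp)
```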
4 Learning and Inference

A Bayesian approach to learning starts out with some a priori knowledge about the model structure (the set of arcs in the Bayesian network) and the model parameters. This initial knowledge is represented in the form of a prior probability distribution over model structures and parameters, and is updated using the data to obtain a posterior probability distribution over models and parameters. More formally, assuming a prior distribution over model structures P(M) and a prior distribution over parameters for each model structure P(theta | M), a data set D is used to form a posterior distribution over models using Bayes rule,

    P(M | D) = [ Int P(D | theta, M) P(theta | M) dtheta ] P(M) / P(D),

which integrates out the uncertainty in the parameters. For a given model structure, we can also compute the posterior distribution over the parameters:

    P(theta | M, D) = P(D | theta, M) P(theta | M) / P(D | M).

If the data set is some sequence of observations D = {Y_1, ..., Y_T} and we wish to predict the next observation, Y_{T+1}, based on our data and models, then the Bayesian prediction

    P(Y_{T+1} | D) = Int Int P(Y_{T+1} | theta, M, D) P(theta | M, D) P(M | D) dtheta dM

integrates out the uncertainty in both the model structure and the parameters. Although the Bayesian approach makes use of all the available prior knowledge, in practice a full-fledged Bayesian analysis is often impractical; approximate Bayesian methods have, however, been applied successfully, for example, to neural network models [35, 38]. We focus in this paper on the problem of estimating the parameters of a model given its structure. If the prior over the parameters is uninformative (e.g. uniform) and, as in the limit of a large data set, the posterior P(theta | M, D) is sharply peaked around its maximum, then predictions based on the single most probable parameter setting will be similar to those obtained by Bayesian integration over the parameters. Assuming a single model structure M and estimating the parameters theta-hat_M that maximize the likelihood P(D | theta, M), we therefore obtain a somewhat impoverished but nonetheless useful limiting case of the Bayesian approach to learning.

4.1 ML Estimation with Complete Data

Assume a data set of independent and identically distributed observations, D = {Y^(1), ..., Y^(N)}, each of which can be a vector or a time series of vectors. The likelihood of the data set is

    P(D | theta, M) = Prod_{i=1}^{N} P(Y^(i) | theta, M).

For notational convenience we henceforth drop the implicit conditioning on the model structure, M. The maximum likelihood (ML) parameters are obtained by maximizing the likelihood, or equivalently the log likelihood,

    L(theta) = Sum_{i=1}^{N} log P(Y^(i) | theta).   (10)

If each observation vector includes all the variables in the Bayesian network, then each term in the log likelihood further factors as

    log P(Y^(i) | theta) = log Prod_j P(Y_j^(i) | Y_pa(j)^(i), theta_j) = Sum_j log P(Y_j^(i) | Y_pa(j)^(i), theta_j),   (11)

where j indexes the nodes in the Bayesian network, pa(j) is the set of parents of node j, and theta_j are the parameters that define the conditional probability of Y_j given its parents. The likelihood therefore decouples into local terms involving each node and its parents, simplifying the ML estimation problem. For example, if the Y variables are discrete and theta_j is the conditional probability table for Y_j given its parents, then the ML estimate of theta_j is a normalized table containing the counts of each setting of Y_j given each setting of its parents in the data set.

4.2 ML Estimation with Hidden Variables: The EM Algorithm

With hidden variables, the log likelihood cannot be decomposed as in (11). Rather,

    L(theta) = log P(Y | theta) = log Sum_X P(X, Y | theta),   (12)

where X is the set of hidden variables and Sum_X is the sum (or integral) over all settings of X required to obtain the marginal probability of the observed data. (We have dropped the superscript (i) and the sum over the N observations for notational convenience.) Using any distribution Q over the hidden variables, we can obtain a lower bound on L:

    log Sum_X P(X, Y | theta) = log Sum_X Q(X) [ P(X, Y | theta) / Q(X) ]   (13)
                             >= Sum_X Q(X) log [ P(X, Y | theta) / Q(X) ]   (14)
                              = Sum_X Q(X) log P(X, Y | theta) - Sum_X Q(X) log Q(X)   (15)
                              = F(Q, theta),   (16)

where the inequality follows from Jensen's inequality and the concavity of the logarithm. The EM algorithm alternates between maximizing this lower bound F with respect to the distribution Q, holding the parameters fixed (the E step), and with respect to the parameters theta, holding Q fixed (the M step).
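The bound in (13)-(16) is easy to verify numerically on a toy model with a single hidden variable: F(Q, theta) never exceeds log P(Y | theta), and the bound is tight when Q is the exact posterior P(X | Y, theta). Everything in this sketch (the model sizes, tables, and observed value) is invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

def normalize(a):
    return a / a.sum()

# Toy model: hidden X with 3 states, observed Y with 4 states.
p_x = normalize(rng.random(3))                      # P(X)
p_y_x = rng.random((3, 4))
p_y_x /= p_y_x.sum(axis=1, keepdims=True)           # P(Y|X)

y = 2                                # an arbitrary observed value of Y
joint = p_x * p_y_x[:, y]            # P(X, Y=y | theta)
log_lik = np.log(joint.sum())        # log P(Y=y | theta), equation (12)

def F(q):
    # F(Q, theta), equations (15)-(16)
    return np.sum(q * np.log(joint)) - np.sum(q * np.log(q))

q_random = normalize(rng.random(3))  # an arbitrary distribution over X
posterior = normalize(joint)         # P(X | Y=y, theta)

assert F(q_random) <= log_lik + 1e-12     # (14): F is a lower bound
assert np.isclose(F(posterior), log_lik)  # bound is tight at the posterior
```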