2009 IEEE Information Theory Workshop
Estimating the Partition Function of 2-D Fields and the Capacity of Constrained Noiseless 2-D Channels Using Tree-Based Gibbs Sampling
Hans-Andrea Loeliger and Mehdi Molkaraie
ETH Zurich
Dept. of Information Technology & Electrical Engineering
8092 Zürich, Switzerland
Email: {loeliger, molkarai}@isi.ee.ethz.ch
Abstract—Tree-based Gibbs sampling (proposed by Hamze and de Freitas) is used to compute a Monte-Carlo estimate of the partition function of factor graphs with cycles. The proposed method can be used, in particular, to compute the capacity of noiseless constrained 2-D channels.

I. INTRODUCTION

Let $\mathcal{X}_1, \mathcal{X}_2, \ldots, \mathcal{X}_N$ be finite sets, let $\mathcal{X}$ be the Cartesian product $\mathcal{X} \triangleq \mathcal{X}_1 \times \mathcal{X}_2 \times \cdots \times \mathcal{X}_N$, and let $f$ be a nonnegative function $f : \mathcal{X} \to \mathbb{R}$. We are interested in computing (exactly or approximately) the quantity

$$Z \triangleq \sum_{x \in \mathcal{X}} f(x) \tag{1}$$

(or, equivalently, $\frac{1}{N}\log Z$) for cases where $\mathcal{X}_1, \ldots, \mathcal{X}_N$ are "small" sets (e.g., $|\mathcal{X}_1| = |\mathcal{X}_2| = \cdots = 2$), $N$ is large, and $f$ has a "useful" factorization (as will be detailed below).

Note that
$$p(x) \triangleq \frac{1}{Z} f(x) \tag{2}$$
is a probability mass function on $\mathcal{X}$. We will also need the set
$$\mathcal{X}_f^+ \triangleq \{x \in \mathcal{X} : f(x) \neq 0\}. \tag{3}$$

The quantity (1) is known as the "partition function" in statistical physics (where it is considered as a function of a "temperature" parameter that is of no concern to us here). The computation of (1) is also the key to computing information rates of source/channel models with memory [1]–[3].

If $f$ has a cycle-free factor graph with not too many states, then the sum (1) can be computed by sum-product message passing [1], [4]. In this paper, however, we consider the case where no such cycle-free factor graph exists. In particular, we are interested in examples of the following type.

Example 1 (Simple 2-D Constraint). Consider a grid of $N = M \times M$ binary (i.e., $\{0,1\}$-valued) variables with the constraint that no two (horizontally or vertically) adjacent variables both have the value 1. Let $f$ be the indicator function of this constraint, which can be factored into factors of the form
$$\kappa(x_k, x_\ell) \triangleq \begin{cases} 0, & \text{if } x_k = x_\ell = 1 \\ 1, & \text{else,} \end{cases} \tag{4}$$
with one such factor for each adjacent pair $(x_k, x_\ell)$. The corresponding Forney-style factor graph of $f$ is shown in Fig. 1, where the boxes labeled "=" are equality constraints [5]. (Fig. 1 may also be viewed as a factor graph as in [4] where the boxes labeled "=" are the variable nodes.) This example is known as the 2-D $(1,\infty)$ constrained channel [6].

Fig. 1. Forney-style factor graph for Example 1. The unlabeled boxes represent factors as in (4).

Note that, in this example, $Z = |\mathcal{X}_f^+|$. For this particular example, $\lim_{M \to \infty} \frac{1}{M^2}\log_2(Z) \approx 0.5879$ is known to nine decimal digits [7], [8]. However, the method proposed in this paper works also for various generalizations of this example for which this limit is not known to any useful accuracy [6].

A number of Monte-Carlo methods to estimate $Z$ have been proposed, see [9], [10]. However, these methods assume that $f$ is strictly positive, which excludes applications as in Example 1; more about this will be said in Section II.

In this paper, we propose a new Monte-Carlo method for the computation of $Z$ that works also for Example 1. In contrast to the method of [3], the proposed method converges to the exact value of $\frac{1}{N}\log(Z)$ in the limit of infinitely many samples.

II. ESTIMATING 1/Z USING GIBBS SAMPLING

One method to estimate $1/Z$ (and thus $Z$ itself) goes as follows.

1) Draw samples $x^{(1)}, x^{(2)}, \ldots, x^{(K)}$ from $\mathcal{X}_f^+$ according to $p(x)$ defined in (2).
2) Compute
$$\Gamma \triangleq \frac{1}{K} \sum_{k=1}^{K} \frac{1}{|\mathcal{X}_f^+| \, f(x^{(k)})}. \tag{5}$$

It is easily verified that $\mathrm{E}[\Gamma] = 1/Z$. This method was proposed in [11], see also [9].

However, there are two major issues with this method. First, it is usually assumed (as in [9], [11]) that $f$ is strictly positive. In this case, $\mathcal{X}_f^+ = \mathcal{X}$ and $|\mathcal{X}_f^+| = |\mathcal{X}|$ is known. However, this assumption excludes applications as in Example 1. (Indeed, in Example 1, we would have $f(x^{(k)}) = 1$ for all samples $x^{(k)}$, and $|\mathcal{X}_f^+| = Z$ is the desired unknown quantity.) We will see how this issue is resolved by an idea from [12].

Second, there is the problem of generating the samples $x^{(1)}, x^{(2)}, \ldots, x^{(K)}$ according to $p(x)$. A standard general method is Gibbs sampling [9], [13], which, however, produces strongly dependent samples. In consequence, the required number of samples $K$ is likely to exceed the limits of practicality. We will see how this issue is eased by tree-based Gibbs sampling as proposed by Hamze and de Freitas [14].

Fig. 2. Partition of Fig. 1 into two cycle-free parts (one part inside the two ovals, the other part outside the ovals).

III. PROPOSED NEW METHOD

The proposed method combines tree-based Gibbs sampling from [14] with an idea from [12].

Let $(A, B)$ be a partition of the index set $\{1, \ldots, N\}$ such that, (i) for fixed $x_A$, the factor graph of $f(x) = f(x_A, x_B)$ is a tree and (ii) for fixed $x_B$, the factor graph of $f(x) = f(x_A, x_B)$ is also a tree. An example of such a partition is shown in Fig. 2.

A. Tree-Based Gibbs Sampling [14]

Starting from some initial configuration $x^{(0)} = (x_A^{(0)}, x_B^{(0)})$, the samples $x^{(k)} = (x_A^{(k)}, x_B^{(k)})$, $k = 1, 2, \ldots$, are created as follows. First, $x_A^{(k)}$ is sampled according to
$$p(x_A \,|\, x_B = x_B^{(k-1)}) \propto f(x_A, x_B^{(k-1)}); \tag{6}$$
then $x_B^{(k)}$ is sampled according to
$$p(x_B \,|\, x_A = x_A^{(k)}) \propto f(x_A^{(k)}, x_B). \tag{7}$$

The point is that the sampling can be done very efficiently in both cases since the corresponding factor graphs are cycle-free; see the appendix for details. Tree-based Gibbs sampling mixes much faster than naive Gibbs sampling [14].

B. Tree-Based Estimation of 1/Z [12]

Let
$$f_A(x_A) \triangleq \sum_{x_B} f(x_A, x_B) \tag{8}$$
and
$$f_B(x_B) \triangleq \sum_{x_A} f(x_A, x_B). \tag{9}$$
Since
$$\sum_{x_A} f_A(x_A) = \sum_{x_B} f_B(x_B) = \sum_{x} f(x) = Z, \tag{10}$$
we can estimate $Z$ by applying the algorithm of Section II to $f_A$ or to $f_B$ (as noted in [12]). Specifically, an estimate $\Gamma_A$ of $1/Z$ is formed as follows:

1) Draw samples $x_A^{(1)}, x_A^{(2)}, \ldots, x_A^{(K)}$ from $(\mathcal{X}_A)_{f_A}^+$ according to $p(x_A) = \sum_{x_B} p(x_A, x_B) = f_A(x_A)/Z$.
2) Compute
$$\Gamma_A \triangleq \frac{1}{K} \sum_{k=1}^{K} \frac{1}{|(\mathcal{X}_A)_{f_A}^+| \, f_A(x_A^{(k)})} \tag{11}$$
where
$$(\mathcal{X}_A)_{f_A}^+ \triangleq \{x_A : f_A(x_A) \neq 0\}. \tag{12}$$

By symmetry, we also have an analogous estimate $\Gamma_B$. The computation of
$$f_A(x_A^{(k)}) = \sum_{x_B} f(x_A^{(k)}, x_B), \tag{13}$$
which is required in (11), is easy since the corresponding factor graph is a tree.
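The unbiasedness claim $\mathrm{E}[\Gamma] = 1/Z$ behind (5) and (11) is easy to verify numerically on a toy model. The following sketch uses a hypothetical strictly positive $f$ on three binary variables (not the paper's 2-D model) and computes $\mathrm{E}[\Gamma]$ exactly by summing over all configurations:

```python
import itertools
import math

# Toy strictly positive f on X = {0,1}^3 (a hypothetical illustration;
# in Example 1, f is indicator-valued and this simple estimator degenerates).
def f(x):
    # nearest-neighbour interactions along a chain of three binary variables
    return math.exp(sum(0.5 if x[k] == x[k + 1] else -0.5 for k in range(2)))

X = list(itertools.product([0, 1], repeat=3))
Z = sum(f(x) for x in X)  # partition function, eq. (1)

# Since f > 0, X_f^+ = X.  E[Gamma] = sum_x p(x) / (|X| f(x)) with p(x) = f(x)/Z;
# each term equals 1/(|X| Z), so the sum collapses to 1/Z: the estimator (5)
# is unbiased.
E_Gamma = sum((f(x) / Z) / (len(X) * f(x)) for x in X)

print(abs(E_Gamma - 1.0 / Z) < 1e-12)  # True
```

The same cancellation explains why the estimator needs $|\mathcal{X}_f^+|$: it is the one quantity that does not cancel when $f$ vanishes somewhere.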
Fig. 3. Estimated capacity (in bits per symbol) vs. the number of samples $K$ for a $10 \times 10$ grid with a $(1,\infty)$ constraint (Example 1). The plot shows 10 independent sample paths, each with two estimates, one from $\Gamma_A$ and one from $\Gamma_B$.
The quantity $|(\mathcal{X}_A)_{f_A}^+|$ in (12) may be easy to determine even if $f$ is not strictly positive. This applies, in particular, to Example 1 (and many similar examples), where
$$(\mathcal{X}_A)_{f_A}^+ = \{x_A : f(x_A, 0) \neq 0\}. \tag{14}$$
In this case, $|(\mathcal{X}_A)_{f_A}^+| = \sum_{x_A} f(x_A, 0)$ is easily computed by sum-product message passing in the (cycle-free) factor graph of $f(x_A, 0)$.

C. A Happy Combination

It is now obvious to create the required samples $x_A^{(1)}, x_A^{(2)}, \ldots, x_A^{(K)}$ in (11) by means of tree-based Gibbs sampling as in Section III-A. The marginals (13) may then be obtained as a by-product of the tree-based sampling (see the appendix). We thus obtain two estimates, $\Gamma_A$ and $\Gamma_B$, as a by-product of tree-based Gibbs sampling with virtually no extra computations.

IV. NUMERICAL EXPERIMENTS

Some experimental results with the proposed method are shown in Figures 3 through 6. All figures refer to $f$ as in Example 1 and show the quantity (the "capacity") $\frac{1}{N}\log_2(Z)$. Figures 3 and 4 use a factor graph partition as in Fig. 2. In Fig. 3, we have $N = 10 \times 10$ and the estimated capacity is about 0.6082. In Fig. 4, we have $N = 60 \times 60$; for this size of grid, there are issues with slow convergence.

To improve the convergence and to speed up the mixing, we can partition the factor graph (the extension of Fig. 1 to $N = 60 \times 60$) into "thicker" vertical strips. Such thick strips have cycles, but exact sum-product computation is still possible, e.g., by converting the strip into an equivalent cycle-free factor graph. The computation time is exponential in the thickness of the strip, but the faster mixing (as shown in Figures 5 and 6) results in a substantial reduction of total computation time for strips of moderate width. From Fig. 6, the estimated capacity is about 0.5914.
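The kind of cycle-free counting used for (14) can be illustrated in one dimension: the number of binary strings of length $n$ with no two adjacent ones (the 1-D analogue of Example 1) is obtained by passing sum-product messages along a chain. A minimal sketch (our own illustration, not the paper's 2-D computation):

```python
# Sum-product (transfer-matrix) counting on a chain: number of binary strings
# of length n with no two adjacent ones -- the 1-D analogue of Example 1 and
# of the cycle-free computation used for (14).
def count_no_adjacent_ones(n):
    # msg[v] = number of admissible completions given the current variable is v
    msg = [1, 1]  # boundary message at the end of the chain
    for _ in range(n - 1):
        msg = [msg[0] + msg[1],  # current value 0: next value may be 0 or 1
               msg[0]]           # current value 1: next value must be 0
    return msg[0] + msg[1]

print([count_no_adjacent_ones(n) for n in range(1, 8)])
# [2, 3, 5, 8, 13, 21, 34]  (Fibonacci numbers)
```

For the strip-wise partitions above, the same message passing runs with one state per strip configuration, which is why the per-step cost grows exponentially in the strip width.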
Fig. 4. Estimated capacity (in bits per symbol) vs. the number of samples $K$ for a $60 \times 60$ grid with a $(1,\infty)$ constraint (Example 1). The plot shows 10 independent sample paths, each with two estimates, one from $\Gamma_A$ and one from $\Gamma_B$.
Fig. 5. Same conditions as in Fig. 4, but with strips of width two.
Fig. 6. Same conditions as in Fig. 4, but with strips of width three.
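The per-iteration sampling steps (6) and (7) behind these experiments reduce, on each cycle-free part, to drawing exact samples from a chain-structured distribution; the appendix reviews the backward-filtering forward-sampling procedure that accomplishes this. A minimal sketch for a plain chain of the form (16), with illustrative factor tables (the function name and table format are ours, not the paper's):

```python
import random

def bffs_sample(g, num_states, rng):
    """Draw one exact sample from p(x) proportional to prod_k g[k][x_k][x_{k+1}],
    a chain of the form (16), via backward filtering and forward sampling."""
    n = len(g) + 1
    # backward messages, cf. (17); the message at the last variable is all-ones
    mu = [[1.0] * num_states for _ in range(n)]
    for k in range(n - 2, -1, -1):
        for a in range(num_states):
            mu[k][a] = sum(g[k][a][b] * mu[k + 1][b] for b in range(num_states))
    # note: sum(mu[0]) recovers the total weight of the chain, cf. (24)
    # forward sampling: x_1 proportional to mu[0], cf. (20); then (21)
    x = [rng.choices(range(num_states), weights=mu[0])[0]]
    for k in range(n - 1):
        w = [g[k][x[-1]][b] * mu[k + 1][b] for b in range(num_states)]
        x.append(rng.choices(range(num_states), weights=w)[0])
    return x

# Chain of 4 binary variables with the no-two-adjacent-ones factor (4):
g = [[[1, 1], [1, 0]]] * 3
rng = random.Random(1)
samples = [bffs_sample(g, 2, rng) for _ in range(1000)]
assert all(s[i] + s[i + 1] < 2 for s in samples for i in range(3))
```

With indicator factors as here, $p(x)$ is uniform over the valid configurations, so the sampler draws uniformly from the eight admissible length-4 strings.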
Also shown in the figures (as a horizontal dotted line) is the infinite-grid limit $\lim_{M \to \infty} \frac{1}{M^2}\log_2(Z) \approx 0.5879$, which is known for this simple example (see Section I). All figures show the estimates from $\Gamma_A$ and from $\Gamma_B$ for several independent experiments.

V. BOUNDS FOR INFINITE GRID

Let $C_M \triangleq \frac{1}{M^2}\log_2(Z)$ be the capacity of a constraint as in Example 1 for an $M \times M$ grid. It is clear (from tiling the whole plane with $M \times M$ squares) that $C \leq C_M$ for any finite $M$. On the other hand, by tiling the plane with $M \times M$ squares separated by all-zero guard rows and all-zero guard columns, we obtain $C \geq C_M \left(\frac{M}{M+1}\right)^2$. In the example of Figures 4–6 (with $M = 60$), we thus obtain $0.5721 \leq C \leq 0.5914$.

VI. CONCLUDING REMARKS

We have shown that tree-based Gibbs sampling (as proposed by Hamze and de Freitas) can be used to compute an estimate of the partition function with virtually no extra computational cost. The proposed method can be used, in particular, to compute (a Monte Carlo estimate of) the capacity of noiseless constrained 2-D channels. Our preliminary numerical experiments are encouraging.

APPENDIX: SAMPLING FROM MARKOV CHAINS

We recall some pertinent facts about the simulation of Markov chains and cycle-free factor graphs. Let $p(x) = p(x_1, \ldots, x_n)$ be the probability mass function of a Markov chain. If $p(x)$ is given in the form
$$p(x) = p(x_1) \prod_{k=2}^{n} p(x_k \,|\, x_{k-1}), \tag{15}$$
then it is obvious how to create i.i.d. samples according to $p(x)$. Now consider the case where $p(x)$ is not given in the form (15), but in the more general form
$$p(x) \propto \prod_{k=2}^{n} g_k(x_{k-1}, x_k) \tag{16}$$
with general factors $g_k$. It is then still easy to create i.i.d. samples according to $p(x)$, which may be seen as follows. First, a probability mass function of the form (16) can be rewritten in the form (15) (which allows efficient simulation). Second, this reparameterization of $p(x)$ may be efficiently carried out by backward sum-product message passing, as will be detailed below. The resulting algorithm is known as "backward-filtering forward-sampling" (or, in a time-reversed version, as "forward-filtering backward-sampling") [15].

Specifically, let $\overleftarrow{\mu}_{X_k}$ be the backward sum-product message along the edge $X_k$ in the factor graph of (16), as is illustrated in Fig. 7 (cf. [5]).

Fig. 7. Forney-style factor graph of (16) with messages $\overleftarrow{\mu}_{X_k}$ (17).

We then have $\overleftarrow{\mu}_{X_n}(x_n) = 1$ and
$$\overleftarrow{\mu}_{X_k}(x_k) \triangleq \sum_{x_{k+1}} g_{k+1}(x_k, x_{k+1}) \, \overleftarrow{\mu}_{X_{k+1}}(x_{k+1}) \tag{17}$$
$$= \sum_{x_{k+1},\ldots,x_n} \prod_{m=k+1}^{n} g_m(x_{m-1}, x_m) \tag{18}$$
for $k = n-1, n-2, \ldots, 1$. Then
$$p(x_1) = \sum_{x_2,\ldots,x_n} p(x_1, \ldots, x_n) \tag{19}$$
$$\propto \overleftarrow{\mu}_{X_1}(x_1) \tag{20}$$
and
$$p(x_k \,|\, x_{k-1}) = \frac{g_k(x_{k-1}, x_k) \, \overleftarrow{\mu}_{X_k}(x_k)}{\overleftarrow{\mu}_{X_{k-1}}(x_{k-1})} \tag{21}$$
for $k = 2, \ldots, n$. The proof of (21) follows from noting that
$$p(x_{k-1}) = \gamma \, \overrightarrow{\mu}_{X_{k-1}}(x_{k-1}) \, \overleftarrow{\mu}_{X_{k-1}}(x_{k-1}) \tag{22}$$
and
$$p(x_{k-1}, x_k) = \gamma \, \overrightarrow{\mu}_{X_{k-1}}(x_{k-1}) \, g_k(x_{k-1}, x_k) \, \overleftarrow{\mu}_{X_k}(x_k) \tag{23}$$
where $\overrightarrow{\mu}_{X_{k-1}}$ is the forward sum-product message along the edge $X_{k-1}$ and where $\gamma$ is the missing scale factor in (16). We also note that
$$\sum_{x_1} \overleftarrow{\mu}_{X_1}(x_1) = \sum_{x} g(x) \tag{24}$$
where $g(x)$ is defined as the right-hand side of (16). In this paper, this fact is used to compute the marginals (13) as a by-product of the sampling.

The generalization of all this to arbitrary factor graphs without cycles is straightforward.

ACKNOWLEDGEMENT

The authors wish to thank David MacKay and Iain Murray for pointing out to us the work by Hamze and de Freitas [14].

REFERENCES

[1] D. Arnold, H.-A. Loeliger, P. O. Vontobel, A. Kavčić, and W. Zeng, "Simulation-based computation of information rates for channels with memory," IEEE Trans. Inform. Theory, vol. 52, no. 8, pp. 3498–3508, August 2006.
[2] P. H. Siegel, "Information-theoretic limits of two-dimensional optical recording channels," in Optical Data Storage 2006 (Proc. of SPIE, vol. 6282, Eds. Ryuichi Katayama and Tuviah E. Schlesinger), Montreal, Quebec, Canada, April 23–26, 2006, pp. 62820W-1 – 62820W-13.
[3] O. Shental, N. Shental, S. Shamai (Shitz), I. Kanter, A. J. Weiss, and Y. Weiss, "Discrete-input two-dimensional Gaussian channels with memory: estimation and information rates via graphical models and statistical mechanics," IEEE Trans. Inform. Theory, vol. 54, pp. 1500–1513, April 2008.
[4] F. R. Kschischang, B. J. Frey, and H.-A. Loeliger, "Factor graphs and the sum-product algorithm," IEEE Trans. Inform. Theory, vol. 47, pp. 498–519, Feb. 2001.
[5] H.-A. Loeliger, "An introduction to factor graphs," IEEE Signal Proc. Mag., Jan. 2004, pp. 28–41.
[6] K. Kato and K. Zeger, "On the capacity of two-dimensional run-length constrained channels," IEEE Trans. Inform. Theory, vol. 45, pp. 1527–1540, July 1999.
[7] N. J. Calkin and H. S. Wilf, "The number of independent sets in a grid graph," SIAM J. Discr. Math., vol. 11, pp. 54–60, Feb. 1998.
[8] W. Weeks and R. E. Blahut, "The capacity and coding gain of certain checkerboard codes," IEEE Trans. Inform. Theory, vol. 44, pp. 1193–1203, May 1998.
[9] G. Potamianos and J. Goutsias, "Stochastic approximation algorithms for partition function estimation of Gibbs random fields," IEEE Trans. Inform. Theory, vol. 43, pp. 1948–1965, Nov. 1997.
[10] F. Huang and Y. Ogata, "Comparison of two methods for calculating the partition functions of various spatial statistical models," Australian & New Zealand Journal of Statistics, vol. 43 (1), pp. 47–65, 2001.
[11] Y. Ogata and M. Tanemura, "Estimation of interaction potentials of spatial point patterns through the maximum likelihood procedure," Ann. Inst. Statist. Math., vol. 33, pp. 315–338, 1981.
[12] H.-A. Loeliger and M. Molkaraie, "Simulation-based estimation of the partition function and the information rate of two-dimensional models," Proc. 2008 IEEE Int. Symp. on Information Theory, Toronto, Canada, July 6–11, 2008, pp. 1113–1117.
[13] D. J. C. MacKay, "Introduction to Monte Carlo methods," in Learning in Graphical Models, M. I. Jordan, ed., Kluwer Academic Press, 1998, pp. 175–204.
[14] F. Hamze and N. de Freitas, "From fields to trees," Proc. Conf. on Uncertainty in Artificial Intelligence, Banff, July 2004.
[15] D. Gamerman and H. F. Lopes, Markov Chain Monte Carlo: Stochastic Simulation for Bayesian Inference, 2nd ed., CRC Press, 2006.