ENS 20112012
Structures et algorithmes aléatoires
18 November 2011
Contents

1 Events and probability
  1.1 Outcomes and events
  1.2 Probability and independence
  1.3 Conditional probability

2 Random variables
  2.1 Distribution and expectation
  2.2 Famous discrete probability distributions
  2.3 Conditional expectation and martingales

3 Generating functions
  3.1 Definition, examples and main properties
  3.2 A computational tool
  3.3 The branching process

4 Inequalities and bounds
  4.1 Jensen's inequality
  4.2 Markov's inequality and Chernoff's bounds
  4.3 Poisson bounding of multinomial events
  4.4 Hoeffding's inequality

5 The strong law of large numbers
  5.1 Almost-sure convergence
  5.2 Limits of expectations, expectation of limits
  5.3 Kolmogorov's strong law of large numbers

6 The probabilistic method
  6.1 Existence proofs
  6.2 Lovász's local lemma
  6.3 First and second moment methods
  6.4 Random algorithms

7 Growth phenomena in random graphs
  7.1 Percolation on the infinite grid
Chapter 1

Events and probability

1.1 Outcomes and events
The study of random phenomena requires a clear and precise language. That of probability theory features familiar mathematical objects, such as points, sets and functions, which however receive a particular interpretation: points are outcomes (of an experiment), sets are events, functions are random numbers. The meaning of these terms will be given just after we recall the notation concerning operations on sets: union, intersection, and complementation.
Sample space and events
If $A$ and $B$ are subsets of some set $\Omega$, $A \cup B$ denotes their union and $A \cap B$ their intersection. In this book we shall denote by $\bar{A}$ the complement of $A$ in $\Omega$. The notation $A + B$ (the sum of $A$ and $B$) implies by convention that $A$ and $B$ are disjoint, in which case it represents the union $A \cup B$. Similarly, the notation $\sum_{k=1}^{\infty} A_k$ is used for $\bigcup_{k=1}^{\infty} A_k$ only when the $A_k$'s are pairwise disjoint. The notation $A - B$ is used only if $B \subseteq A$, and it stands for $A \cap \bar{B}$. In particular, if $B \subseteq A$, then $A = B + (A - B)$. The indicator function of the subset $A \subseteq \Omega$ is the function $1_A : \Omega \to \{0, 1\}$ defined by

$$1_A(\omega) = \begin{cases} 1 & \text{if } \omega \in A, \\ 0 & \text{if } \omega \notin A. \end{cases}$$
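As a quick check, not in the original text but immediate from the definition, the indicator function turns set operations into ordinary arithmetic:

$$1_{\bar{A}} = 1 - 1_A, \qquad 1_{A \cap B} = 1_A \, 1_B, \qquad 1_{A + B} = 1_A + 1_B \quad \text{(for disjoint } A \text{ and } B\text{)}.$$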
Random phenomena are observed by means of experiments (performed either by man or nature). Each experiment results in an outcome. The collection of all possible outcomes
$\omega$ is called the sample space $\Omega$. Any subset $A$ of the sample space $\Omega$ can be regarded as a representation of some event.
Example 1.1.1: Tossing a die, take 1. The experiment consists in tossing a die once. The possible outcomes are $\omega = 1, 2, \ldots, 6$ and the sample space is the set $\Omega = \{1, 2, 3, 4, 5, 6\}$. The subset $A = \{1, 3, 5\}$ is the event "result is odd."
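The die model translates directly into a few lines of code; the sketch below is an illustration added here (not part of the original text), representing outcomes as integers and events as Python sets.

# Sample space and an event for the die model of Example 1.1.1 (illustrative sketch)
sample_space = {1, 2, 3, 4, 5, 6}      # the sample space Omega
A = {1, 3, 5}                          # the event "result is odd"

omega = 1                              # a particular outcome
print(omega in A)                      # True: omega realizes A
print(sample_space - A)                # {2, 4, 6}: the complement of A in Omega

Set membership plays the role of "realizes", and set difference gives the complement.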
Example 1.1.2: Throwing a dart. The experiment consists in throwing a dart at a wall. The sample space can be chosen to be the plane $\mathbb{R}^2$. An outcome is the position $\omega = (x, y)$ hit by the dart. The subset $A = \{(x, y)\,;\; x^2 + y^2 > 1\}$ is an event that could be named "you missed the dartboard" if the dartboard is a disk centered at $0$ and of radius $1$.
Example 1.1.3: Heads and tails, take 1. The experiment is an infinite succession of coin tosses. One can take for sample space the collection of all sequences $\omega = \{x_n\}_{n \geq 1}$, where $x_n = 1$ or $0$, depending on whether the $n$-th toss results in heads or tails. The subset $A = \{\omega\,;\; x_k = 1 \text{ for } k = 1 \text{ to } 1{,}000\}$ is a lucky event for anyone betting on heads!
The language of probabilists

The probabilists have their own dialect. They say that outcome $\omega$ realizes event $A$ if $\omega \in A$. For instance, in the die model of Example 1.1.1, the outcome $\omega = 1$ realizes the event "result is odd", since $1 \in A = \{1, 3, 5\}$. Obviously, if $\omega$ does not realize $A$, it realizes $\bar{A}$. Event $A \cap B$ is realized by outcome $\omega$ if and only if $\omega$ realizes both $A$ and $B$. Similarly, $A \cup B$ is realized by $\omega$ if and only if at least one event among $A$ and $B$ is realized (both can be realized). Two events $A$ and $B$ are called incompatible when $A \cap B = \emptyset$. In other words, event $A \cap B$ is impossible: no outcome $\omega$ can realize both $A$ and $B$. For this reason one refers to the empty set $\emptyset$ as the impossible event. Naturally, $\Omega$ is called the certain event.

Recall now that the notation $\sum_{k=1}^{\infty} A_k$ is used for $\bigcup_{k=1}^{\infty} A_k$ only when the subsets $A_k$ are pairwise disjoint. In the terminology of sets, the sets $A_1, A_2, \ldots$ form a partition of $\Omega$ if
$$\sum_{k=1}^{\infty} A_k = \Omega.$$
The probabilists then say that the events $A_1, A_2, \ldots$ are mutually exclusive and exhaustive. They are exhaustive in the sense that any outcome $\omega$ realizes at least one among them. They are mutually exclusive in the sense that any two distinct events among them are incompatible. Therefore, any $\omega$ realizes one and only one of the events $A_1, A_2, \ldots$
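For instance (an illustration added here), in the die model of Example 1.1.1 the singletons are mutually exclusive and exhaustive:

$$\{1\} + \{2\} + \{3\} + \{4\} + \{5\} + \{6\} = \Omega,$$

and every outcome $\omega$ realizes exactly one of them.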
If $B \subseteq A$, event $B$ is said to imply event $A$, because $\omega$ realizes $A$ whenever it realizes $B$.
The sigmafield of events Probability theory assigns to each event a number, theprobabilityTheof the said event. collectionFof events to which a probability is assigned is not always identical to the collection of all subsets of Ω. The requirement onFis that it should be a sigmafield:
Definition 1.1.1 Let $\mathcal{F}$ be a collection of subsets of $\Omega$, such that

1. the certain event $\Omega$ is in $\mathcal{F}$,

2. if $A$ belongs to $\mathcal{F}$, then so does its complement $\bar{A}$, and

3. if $A_1, A_2, \ldots$ belong to $\mathcal{F}$, then so does their union $\bigcup_{k=1}^{\infty} A_k$.

One then calls $\mathcal{F}$ a sigma-field on $\Omega$, here the sigma-field of events.
Note that the impossible event $\emptyset$, being the complement of the certain event $\Omega$, is in $\mathcal{F}$. Note also that if $A_1, A_2, \ldots$ belong to $\mathcal{F}$, then so does their intersection $\bigcap_{k=1}^{\infty} A_k$ (Exercise ??).
Example 1.1.4: Trivial sigma-field, gross sigma-field. These are respectively the collection $\mathcal{P}(\Omega)$ of all subsets of $\Omega$, and the sigma-field with only two members: $\{\Omega, \emptyset\}$.
If the sample space $\Omega$ is finite or countable, one usually (but not always and not necessarily) considers any subset of $\Omega$ to be an event, that is, $\mathcal{F} = \mathcal{P}(\Omega)$.
Example 1.1.5: Borel sigma-field. The Borel sigma-field on $\mathbb{R}^n$, denoted by $\mathcal{B}^n$, is by definition the smallest sigma-field on $\mathbb{R}^n$ that contains all rectangles, that is, all sets of the form $\prod_{j=1}^{n} I_j$, where the $I_j$'s are arbitrary intervals of $\mathbb{R}$. The sets in this sigma-field are called Borel sets. The above definition is not constructive and therefore one may wonder if there exist sets that are not Borel sets. The theory tells us that there are indeed such sets, but they are in a sense "pathological": the proof of their existence involves the "axiom of choice". In practice, it is enough to know that all the sets for which you had been able to compute the volume in your earlier life are Borel sets.
Example 1.1.6: Heads and tails, take 2. (Example 1.1.3 ct'd.) Take in the quoted example $\mathcal{F}$ to be the smallest sigma-field that contains all the sets $\{\omega\,;\; x_k = 1\}$, $k \geq 1$. This sigma-field also contains the sets $\{\omega\,;\; x_k = 0\}$, $k \geq 1$ (pass to the complements), and therefore (take intersections) all the sets of the form $\{\omega\,;\; x_1 = a_1, \ldots, x_n = a_n\}$ for all $n \geq 1$ and all $a_1, \ldots, a_n \in \{0, 1\}$.
1.2 Probability and independence
The axioms of probability

The probability $P(A)$ of an event $A \in \mathcal{F}$ measures the likeliness of its occurrence. As a function defined on $\mathcal{F}$, the probability $P$ is required to satisfy a few properties, called the axioms of probability.
Definition 1.2.1 A probability on $(\Omega, \mathcal{F})$ is a mapping $P : \mathcal{F} \to \mathbb{R}$ such that

1. $0 \leq P(A) \leq 1$,

2. $P(\Omega) = 1$, and

3. $P\left(\sum_{k=1}^{\infty} A_k\right) = \sum_{k=1}^{\infty} P(A_k)$.
Property 3 is called sigma-additivity. The triple $(\Omega, \mathcal{F}, P)$ is called a probability space, or probability model.
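A remark worth recording here (added as an illustration; it is a standard consequence of axiom 3): when $\Omega$ is finite or countable and $\mathcal{F} = \mathcal{P}(\Omega)$, a probability is completely determined by its values on singletons, since any event is the disjoint union of the singletons it contains:

$$P(A) = \sum_{\omega \in A} P(\{\omega\}).$$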
Example 1.2.1: Tossing a die, take 2. (Example 1.1.1 ct'd.) An event $A$ is a subset of $\Omega = \{1, 2, 3, 4, 5, 6\}$. The formula

$$P(A) = \frac{|A|}{6},$$

where $|A|$ is the cardinal of $A$ (the number of elements in $A$), defines a probability $P$.
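As a sanity check of sigma-additivity for this $P$ (a worked line added for illustration): the event "result is odd" splits into disjoint singletons, and both ways of computing its probability agree:

$$P(\{1, 3, 5\}) = \frac{3}{6} = \frac{1}{2} = \frac{1}{6} + \frac{1}{6} + \frac{1}{6} = P(\{1\}) + P(\{3\}) + P(\{5\}).$$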
Example 1.2.2: Heads and tails, take 3. (Examples 1.1.3 and 1.1.6 ct'd.) Choose the probability $P$ such that for any event of the form $A = \{x_1 = a_1, \ldots, x_n = a_n\}$, where $a_1, \ldots, a_n$ are arbitrary in $\{0, 1\}$,

$$P(A) = \frac{1}{2^n}.$$

Note that this does not define the probability of every event of $\mathcal{F}$. But the theory tells us that there does exist a probability satisfying the above requirement and that this probability is unique.
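To see how this requirement pins down the probabilities of shorter events of the same type (a small worked computation added here), split $\{x_1 = 1\}$ according to the value of $x_2$ and use additivity:

$$P(\{x_1 = 1\}) = P(\{x_1 = 1, x_2 = 0\}) + P(\{x_1 = 1, x_2 = 1\}) = \frac{1}{4} + \frac{1}{4} = \frac{1}{2}.$$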
Example 1.2.3: Random point on the square, take 1. The following is a possible model of a random point inside the unit square $[0, 1]^2 = [0, 1] \times [0, 1]$: $\Omega = [0, 1]^2$, and $\mathcal{F}$ is the collection of sets in the Borel sigma-field $\mathcal{B}^2$ that are contained in $[0, 1]^2$. The theory tells us that there exists indeed one and only one probability $P$ satisfying the requirement

$$P([a, b] \times [c, d]) = (b - a) \times (d - c),$$

called the Lebesgue measure on $[0, 1]^2$, which formalizes the intuitive notion of "area".
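The Lebesgue model also suggests a first random algorithm, sketched below as an illustration (not part of the original text, and anticipating the law of large numbers mentioned further on): drawing points uniformly in the unit square and counting how many fall inside a set $B \subseteq [0, 1]^2$ estimates its area $P(B)$. The helper name estimate_area and the predicate inside are mine, chosen for the example.

import random

def estimate_area(inside, n=100_000):
    # Monte Carlo estimate of the area of a subset of the unit square,
    # given a membership test inside(x, y).
    hits = 0
    for _ in range(n):
        x, y = random.random(), random.random()   # a uniform point of [0, 1]^2
        if inside(x, y):
            hits += 1
    return hits / n                               # empirical frequency of the event

# Quarter disk {(x, y) : x^2 + y^2 <= 1}; exact area pi/4, about 0.785
print(estimate_area(lambda x, y: x * x + y * y <= 1.0))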
The probability of Example 1.2.1 suggests an unbiased die, where each outcome 1, 2, 3, 4, 5, or 6 has the same probability. As we shall soon see, the probability $P$ of Example 1.2.2 implies an unbiased coin and independent tosses (the emphasized terms will be defined later).
The axioms of probability are motivated by the heuristic interpretation of $P(A)$ as the empirical frequency of occurrence of event $A$. If $n$ "independent" experiments are performed, among which $n_A$ result in the realization of $A$, then the empirical frequency

$$F(A) = \frac{n_A}{n}$$

should be close to $P(A)$ if $n$ is "sufficiently large." (This statement has to be made precise. It is in fact a loose expression of the law of large numbers that will be given later on.) Clearly, the empirical frequency function $F$ satisfies the axioms.
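A minimal simulation, added here as an illustration (the "independent" experiments being supplied by a pseudo-random generator), shows the empirical frequency of the event "result is odd" settling near $P(A) = 1/2$ as $n$ grows:

import random

A = {1, 3, 5}                                   # event "result is odd" (Example 1.2.1)
for n in (100, 10_000, 1_000_000):
    n_A = sum(1 for _ in range(n) if random.randint(1, 6) in A)
    print(n, n_A / n)                           # empirical frequency F(A); close to 0.5 for large n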
Basic formulas of probability

We shall now list the properties of probability that follow directly from the axioms.

Theorem 1.2.1 For any event $A \in \mathcal{F}$,

$$P(\bar{A}) = 1 - P(A), \tag{1.1}$$

and

$$P(\emptyset) = 0. \tag{1.2}$$

Proof. For a proof of (1.1), use additivity:

$$1 = P(\Omega) = P(A + \bar{A}) = P(A) + P(\bar{A}).$$

Applying (1.1) with $A = \Omega$ gives (1.2).

Theorem 1.2.2 Probability is monotone, that is,

$$A \subseteq B \implies P(A) \leq P(B). \tag{1.3}$$

(Recall the interpretation of the set inclusion $A \subseteq B$: event $A$ implies event $B$.)

Proof. For a proof, observe that when $A \subseteq B$, $B = A + (B - A)$, and therefore

$$P(B) = P(A) + P(B - A) \geq P(A).$$
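A typical use of (1.1), added here as a worked example in the coin-tossing model of Example 1.2.2: the probability that at least one of the first $1{,}000$ tosses shows heads is

$$1 - P(\{x_1 = 0, \ldots, x_{1000} = 0\}) = 1 - 2^{-1000}.$$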
Theorem 1.2.3 We have the sub-sigma-additivity property:

$$P\left(\bigcup_{k=1}^{\infty} A_k\right) \leq \sum_{k=1}^{\infty} P(A_k). \tag{1.4}$$

Proof. Observe that

$$\bigcup_{k=1}^{\infty} A_k = \sum_{k=1}^{\infty} A'_k,$$

where

$$A'_k = A_k \cap \overline{\bigcup_{i=1}^{k-1} A_i}.$$

Therefore,

$$P\left(\bigcup_{k=1}^{\infty} A_k\right) = P\left(\sum_{k=1}^{\infty} A'_k\right) = \sum_{k=1}^{\infty} P(A'_k).$$

But $A'_k \subseteq A_k$, and therefore $P(A'_k) \leq P(A_k)$.
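Sub-sigma-additivity, often called the union bound, is a workhorse in the analysis of random algorithms. A typical illustrative use (not in the original text): if each of $n$ "bad" events $A_1, \ldots, A_n$ has probability at most $\varepsilon$, then

$$P\left(\bigcup_{k=1}^{n} A_k\right) \leq \sum_{k=1}^{n} P(A_k) \leq n\varepsilon,$$

so no bad event occurs with probability at least $1 - n\varepsilon$.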
Sequential continuity

This property comes close to being a tautology and is of great use.

Theorem 1.2.4 Let $\{A_n\}_{n \geq 1}$ be a nondecreasing sequence of events, that is, for all $n \geq 1$, $A_{n+1} \supseteq A_n$. Then

$$P\left(\bigcup_{n=1}^{\infty} A_n\right) = \lim_{n \uparrow \infty} P(A_n). \tag{1.5}$$

Proof. Write

$$A_n = A_1 + (A_2 - A_1) + \cdots + (A_n - A_{n-1})$$

and

$$\bigcup_{k=1}^{\infty} A_k = A_1 + (A_2 - A_1) + (A_3 - A_2) + \cdots.$$

Therefore,

$$P\left(\bigcup_{k=1}^{\infty} A_k\right) = P(A_1) + \sum_{j=2}^{\infty} P(A_j - A_{j-1}) = \lim_{n \uparrow \infty} \left( P(A_1) + \sum_{j=2}^{n} P(A_j - A_{j-1}) \right) = \lim_{n \uparrow \infty} P(A_n).$$
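As an illustration of (1.5) in the coin-tossing model (a worked example added here, combining (1.1) with Example 1.2.2): let $A_n$ be the event "at least one of the first $n$ tosses shows heads". The sequence $\{A_n\}_{n \geq 1}$ is nondecreasing and $P(A_n) = 1 - 2^{-n}$, so

$$P\left(\bigcup_{n=1}^{\infty} A_n\right) = \lim_{n \uparrow \infty} \left(1 - 2^{-n}\right) = 1,$$

that is, heads occurs at least once with probability $1$.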