
ELEG3410 Random Process & DSP



Tutorial # 4

Random Variable:
Definition
Probability Density Function and Probability Distribution Function
Moment
Transformation of Variables
Central Limit Theorem
Two Dimensional Distribution

Stochastic Process / Random Process

The outcome of a trial (throwing a die) takes only one value (1, 2, …, 6). In
a random process, the outcome is a function of time t. For example, if the
random process is defined as the noise voltages of telephone lines in EWB,
then each telephone line produces a noise voltage output that depends on
time t.


For the case above, “statistically determined” means that, given any
time t and noise voltage level x, we know the probability that the noise
voltage of a telephone line is lower than x.
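This idea can be checked numerically. The sketch below is an illustration only: the noise model (Gaussian voltages whose spread grows with t) is an assumption, since the tutorial does not specify one. It estimates P(x(t) < x) as the fraction of simulated lines below level x.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical ensemble: 10_000 "telephone lines", each producing a
# Gaussian noise voltage whose spread grows with time t (an assumed
# model for illustration only).
n_lines = 10_000
t = 0.5
noise = rng.normal(loc=0.0, scale=1.0 + t, size=n_lines)

# "Statistically determined": for a given t and level x, estimate
# P(x(t) < x) as the fraction of lines whose noise is below x.
x = 1.0
p_below = np.mean(noise < x)
print(f"P(x({t}) < {x}) ~= {p_below:.3f}")
```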



1st order p.d.f.: p(x; t)

2nd order joint p.d.f.: p(x1, x2; t1, t2)

nth order joint p.d.f.: p(x1, x2, …, xn; t1, t2, …, tn)


1st Moment:

$\mu(t) = E[x(t)] = \int_{-\infty}^{\infty} x\, p(x,t)\, dx$

Autocorrelation:

$R(t_1,t_2) = E[x(t_1)x(t_2)] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} x_1 x_2\, p(x_1,x_2;t_1,t_2)\, dx_1\, dx_2$

Autocovariance:

$C(t_1,t_2) = E[(x(t_1)-\mu(t_1))(x(t_2)-\mu(t_2))] = R(t_1,t_2) - \mu(t_1)\mu(t_2)$

Variance:

$\sigma^2(t) = \int_{-\infty}^{\infty} (x-\mu(t))^2\, p(x,t)\, dx = C(t,t)$
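The identities above can be verified by Monte Carlo on a toy ensemble. The process X(t) = A + Bt with A, B independent standard normals is an assumption chosen only for illustration; for it, μ(t) = 0, R(t1,t2) = 1 + t1·t2, and σ²(t) = 1 + t².

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy ensemble (assumed for illustration): X(t) = A + B*t with
# A, B independent standard normal random variables.
n = 200_000
A = rng.standard_normal(n)
B = rng.standard_normal(n)

def X(t):
    return A + B * t

t1, t2 = 0.5, 2.0

mu1, mu2 = X(t1).mean(), X(t2).mean()          # 1st moment mu(t) = E[x(t)]
R12 = np.mean(X(t1) * X(t2))                   # autocorrelation R(t1,t2)
C12 = np.mean((X(t1) - mu1) * (X(t2) - mu2))   # autocovariance C(t1,t2)
var1 = np.mean((X(t1) - mu1) ** 2)             # variance sigma^2(t1) = C(t1,t1)

# Check the identities from the definitions above:
print(abs(C12 - (R12 - mu1 * mu2)))  # C = R - mu*mu (should be ~0)
print(abs(var1 - (1 + t1**2)))       # sigma^2(t) = 1 + t^2 for this model
```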

E.g.: Let $X(t) = A\cos(2\pi t)$, where A is some random variable.
Find the mean, autocorrelation, and autocovariance of X(t).

Mean:

$\mu(t) = E[A\cos(2\pi t)] = E[A]\cos(2\pi t)$

Autocorrelation:

$R(t_1,t_2) = E[X(t_1)X(t_2)] = E[A\cos(2\pi t_1)\, A\cos(2\pi t_2)] = E[A^2]\cos(2\pi t_1)\cos(2\pi t_2)$

Autocovariance:

$C(t_1,t_2) = R(t_1,t_2) - \mu(t_1)\mu(t_2)$
$= E[A^2]\cos(2\pi t_1)\cos(2\pi t_2) - E[A]\cos(2\pi t_1)\, E[A]\cos(2\pi t_2)$
$= (E[A^2] - E[A]^2)\cos(2\pi t_1)\cos(2\pi t_2)$
$= \sigma_A^2 \cos(2\pi t_1)\cos(2\pi t_2)$
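The example's three results can be confirmed numerically. The choice A ~ Uniform[0, 1] is an assumption for illustration (the tutorial leaves A unspecified); it gives E[A] = 1/2, E[A²] = 1/3, and σ_A² = 1/12.

```python
import numpy as np

rng = np.random.default_rng(2)

# Monte Carlo check of X(t) = A*cos(2*pi*t) with A random.
# Assumed for illustration: A ~ Uniform[0, 1].
A = rng.uniform(0.0, 1.0, size=500_000)

t1, t2 = 0.1, 0.3
c1, c2 = np.cos(2 * np.pi * t1), np.cos(2 * np.pi * t2)

mu1 = np.mean(A * c1)        # should approach E[A]*cos(2*pi*t1) = c1/2
mu2 = np.mean(A * c2)        # should approach E[A]*cos(2*pi*t2) = c2/2
R12 = np.mean(A * c1 * A * c2)   # should approach E[A^2]*c1*c2 = c1*c2/3
C12 = R12 - mu1 * mu2            # should approach sigma_A^2*c1*c2 = c1*c2/12

print(mu1, 0.5 * c1)
print(R12, c1 * c2 / 3)
print(C12, c1 * c2 / 12)
```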

Strict-sense Stationary:

What is a stationary process?
An everyday example of a stationary process is the daily temperature during
the summer.

A random process is said to be Stationary in the Strict Sense (SSS) if its
statistics (i.e. pdfs) are invariant to a shift in the time origin. That is, the
processes f(t) and f(t + ε) have the same pdfs for any arbitrary ε.
⇒ the nth-order pdf must satisfy:

$p(x_1, x_2, \ldots, x_n;\, t_1, t_2, \ldots, t_n) = p(x_1, x_2, \ldots, x_n;\, t_1+\varepsilon, t_2+\varepsilon, \ldots, t_n+\varepsilon)$


Let's see the 1st order pdf:

$p(x;t) = p(x;t+\varepsilon)$

Since ε is arbitrary, we can see that the 1st order pdf is the same for any time
t; that is, it is independent of time t.

Let's see the 2nd order pdf:

$p(x_1, x_2;\, t_1, t_2) = p(x_1, x_2;\, t_1+\varepsilon, t_2+\varepsilon) = p(x_1, x_2;\, t_1 - t_2)$

We can see that the 2nd order pdf is independent of t1, t2, ε but depends on t1 − t2.

Stationary of order k:
A random process x(t) is said to be stationary of order k iff

$p(x_1, x_2, \ldots, x_n;\, t_1, t_2, \ldots, t_n) = p(x_1, x_2, \ldots, x_n;\, t_1+\varepsilon, t_2+\varepsilon, \ldots, t_n+\varepsilon)$ for n ≤ k

Wide-sense Stationary:
A random process, x(t) is said to be Wide-Sense Stationary (WSS) iff:
(1) Its mean is constant ⇒ its mean is not time-varying. (This does not imply
stationarity of order 1.)
(2) Its autocorrelation R(τ) depends only on $\tau = t_1 - t_2$. (This does not imply
stationarity of order 2.)

Note:
(1) x(t) being stationary of order 2 implies that x(t) is WSS. The reverse DOES
NOT hold.
(2) x(t) being Gaussian and WSS implies that x(t) is Gaussian and SSS. Here
the reverse DOES hold.
(3) x(t) being SSS implies that x(t) is WSS. The reverse DOES NOT hold.
(See Figure 2)
(4) x(t) being SSS implies that x(t) is stationary of order k (for every k). The
reverse DOES NOT hold.


Figure 1. Example of stationary and non-stationary processes

Figure 2. A set structure of SSS, WSS and non-stationary processes: within
the universe of random processes, SSS ⊂ WSS.

Example 1) Is $x(t) = a\sin(\omega t + \theta)$, where θ is a random phase angle
uniformly distributed in [0, 2π] and a, ω are constants, a WSS process?

Example 2) Is $x(t) = a\sin(\omega t + \theta)$, where ω is a random variable uniformly
distributed in [0, ω₀] and a, θ are constants, a WSS process?



1) Solution:

$E[x(t)] = aE[\sin(\omega t + \theta)] = a\int_0^{2\pi} \sin(\omega t + \theta)\, p(\theta)\, d\theta = 0$

$R(t_1,t_2) = E[x(t_1)x(t_2)] = -\frac{1}{2}a^2 E[\cos(\omega t_1 + \omega t_2 + 2\theta) - \cos(\omega t_1 - \omega t_2)]$
$= -\frac{1}{2}a^2 \int_0^{2\pi} [\cos(\omega t_1 + \omega t_2 + 2\theta) - \cos(\omega t_1 - \omega t_2)]\, p(\theta)\, d\theta$
$= \frac{1}{2}a^2 \cos\omega\tau$, where $\tau = t_1 - t_2$.

This is a WSS process.
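A quick Monte Carlo check of this solution; the constants a = 2, ω = 3 and the sample times are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

a, w = 2.0, 3.0                       # arbitrary constants for the check
theta = rng.uniform(0.0, 2 * np.pi, size=500_000)

def x(t):
    # x(t) = a*sin(w*t + theta), theta ~ Uniform[0, 2*pi]
    return a * np.sin(w * t + theta)

t1, t2 = 0.4, 1.1
tau = t1 - t2

mean1 = np.mean(x(t1))                # should be ~0 (constant mean)
R12 = np.mean(x(t1) * x(t2))          # should match (a^2/2)*cos(w*tau)

print(mean1)
print(R12, a**2 / 2 * np.cos(w * tau))
```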


2) Solution:

$E[x(t)] = aE[\sin(\omega t + \theta)] = a\int_0^{\omega_0} \sin(\omega t + \theta)\, p(\omega)\, d\omega = \frac{a}{\omega_0 t}\left(\cos\theta - \cos(\omega_0 t + \theta)\right)$

The mean depends on time t, so this is not a WSS process.
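The non-stationarity can also be seen numerically: the simulated mean tracks the closed-form expression above and changes with t. The constants a = 1, θ = 0, ω₀ = 2 are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)

a, theta, w0 = 1.0, 0.0, 2.0          # arbitrary constants for the check
w = rng.uniform(0.0, w0, size=500_000)

def x(t):
    # x(t) = a*sin(w*t + theta), w ~ Uniform[0, w0]
    return a * np.sin(w * t + theta)

def mean_exact(t):
    # a/(w0*t) * (cos(theta) - cos(w0*t + theta)), from the solution above
    return a / (w0 * t) * (np.cos(theta) - np.cos(w0 * t + theta))

for t in (0.5, 1.0, 2.0):
    print(t, np.mean(x(t)), mean_exact(t))   # mean varies with t -> not WSS
```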

