Important Classes of Random Processes
Having characterized the random process by the joint distribution (density) functions and joint moments, we define the following two important classes of random processes.
(a) Independent and Identically Distributed Process
Consider a discrete-time random process $\{X_n\}$. For any finite choice of time instants $n_1, n_2, \ldots, n_N$, if the random variables $X_{n_1}, X_{n_2}, \ldots, X_{n_N}$ are jointly independent with a common distribution, then $\{X_n\}$ is called an independent and identically distributed (iid) random process. Thus, for an iid random process $\{X_n\}$,
$$F_{X_{n_1}, X_{n_2}, \ldots, X_{n_N}}(x_1, x_2, \ldots, x_N) = F_X(x_1)\, F_X(x_2) \cdots F_X(x_N)$$
and equivalently
$$p_{X_{n_1}, X_{n_2}, \ldots, X_{n_N}}(x_1, x_2, \ldots, x_N) = p_X(x_1)\, p_X(x_2) \cdots p_X(x_N)$$
Moments of the IID process
It is easy to verify that for an iid process $\{X_n\}$:
- Mean: $EX_n = \mu_X = \text{constant}$
- Variance: $\operatorname{var}(X_n) = E(X_n - \mu_X)^2 = \sigma_X^2 = \text{constant}$
- Autocovariance:
$$C_X(n, m) = E(X_n - \mu_X)(X_m - \mu_X) = \begin{cases} E(X_n - \mu_X)\,E(X_m - \mu_X) = 0 & \text{for } n \neq m \\ \sigma_X^2 & \text{otherwise} \end{cases}$$
so that $C_X(n, m) = \sigma_X^2\, \delta[n, m]$, where $\delta[n, m] = 1$ for $n = m$ and $0$ otherwise.
- Autocorrelation: $R_X(n, m) = C_X(n, m) + \mu_X^2 = \sigma_X^2\, \delta[n, m] + \mu_X^2$
Example Bernoulli process: Consider the Bernoulli process $\{X_n\}$ with
$$p_X(1) = p \quad \text{and} \quad p_X(0) = 1 - p$$
This process is an iid process. Using the iid property, we can obtain the joint probability mass functions of any order in terms of $p$. For example,
$$p_{X_1, X_2}(1, 0) = p(1 - p)$$
$$p_{X_1, X_2, X_3}(0, 0, 1) = (1 - p)^2\, p$$
and so on.
Similarly, the mean, the variance and the autocorrelation function are given by
$$\mu_X = EX_n = p$$
$$\operatorname{var}(X_n) = p(1 - p)$$
and, for $n_1 \neq n_2$,
$$R_X(n_1, n_2) = EX_{n_1} X_{n_2} = EX_{n_1}\, EX_{n_2} = p^2$$
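The Bernoulli moments above are easy to verify numerically. The sketch below (the value of $p$ and the sample count are illustrative choices, not from the notes) estimates the mean and variance from simulated samples:

```python
import random

# Empirical check of the Bernoulli-process moments: EX_n = p and
# var(X_n) = p(1 - p). Parameter values are illustrative.
random.seed(1)
p = 0.3
n_samples = 200_000

samples = [1 if random.random() < p else 0 for _ in range(n_samples)]

mean = sum(samples) / n_samples                           # close to p = 0.3
var = sum((x - mean) ** 2 for x in samples) / n_samples   # close to p(1-p) = 0.21

print(mean, var)
```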
(b) Independent Increment Process
A random process $\{X(t)\}$ is called an independent increment process if for any $n \geq 1$ and $t_1 < t_2 < \ldots < t_n$, the set of $n$ random variables $X(t_1),\, X(t_2) - X(t_1),\, \ldots,\, X(t_n) - X(t_{n-1})$ are jointly independent random variables.
If the probability distribution of $X(t + r) - X(t' + r)$ is the same as that of $X(t) - X(t')$ for any choice of $t$, $t'$ and $r$, then $\{X(t)\}$ is called a stationary increment process.
- The above definitions of the independent increment process and the stationary increment process can be easily extended to discrete-time random processes.
- The independent increment property simplifies the calculation of joint probability distribution, density and mass functions from the corresponding first-order quantities. As an example, for $t_1 < t_2$ and $x_1 < x_2$,
$$\begin{aligned}
F_{X(t_1), X(t_2)}(x_1, x_2) &= P(\{X(t_1) \leq x_1, X(t_2) \leq x_2\}) \\
&= P(\{X(t_1) \leq x_1\})\, P(\{X(t_2) \leq x_2\} \mid \{X(t_1) \leq x_1\}) \\
&= P(\{X(t_1) \leq x_1\})\, P(\{X(t_2) - X(t_1) \leq x_2 - x_1\} \mid \{X(t_1) \leq x_1\}) \\
&= P(\{X(t_1) \leq x_1\})\, P(\{X(t_2) - X(t_1) \leq x_2 - x_1\}) \\
&= F_{X(t_1)}(x_1)\, F_{X(t_2) - X(t_1)}(x_2 - x_1)
\end{aligned}$$
- The independent increment property also simplifies the computation of the autocovariance function.
For $t_1 < t_2$, the autocorrelation function of $X(t)$ is given by
$$\begin{aligned}
R_X(t_1, t_2) &= EX(t_1) X(t_2) \\
&= EX(t_1)\big(X(t_1) + X(t_2) - X(t_1)\big) \\
&= EX^2(t_1) + EX(t_1)\, E\big(X(t_2) - X(t_1)\big) \\
&= EX^2(t_1) + EX(t_1)\, EX(t_2) - (EX(t_1))^2 \\
&= \operatorname{var}(X(t_1)) + EX(t_1)\, EX(t_2)
\end{aligned}$$
so that
$$C_X(t_1, t_2) = EX(t_1) X(t_2) - EX(t_1)\, EX(t_2) = \operatorname{var}(X(t_1))$$
Similarly, for $t_1 > t_2$,
$$C_X(t_1, t_2) = \operatorname{var}(X(t_2))$$
Therefore
$$C_X(t_1, t_2) = \operatorname{var}(X(\min(t_1, t_2)))$$
Example: Two continuous-time independent increment processes are widely studied. They are:
(a) the Wiener process, with the increments following the Gaussian distribution, and
(b) the Poisson process, with the increments following the Poisson distribution.
We shall discuss these processes shortly.
Random Walk process
Consider an iid process $\{Z_n\}$ having two states $Z_n = 1$ and $Z_n = -1$ with the probability mass functions
$$p_Z(1) = p \quad \text{and} \quad p_Z(-1) = q = 1 - p.$$
Then the sum process $\{X_n\}$ given by
$$X_n = \sum_{i=1}^{n} Z_i = X_{n-1} + Z_n$$
with $X_0 = 0$ is called a random walk process.
- This process is one of the most widely studied random processes.
- It is an independent increment process. This follows from the fact that $X_n = X_{n-1} + Z_n$ and $\{Z_n\}$ is an iid process.
- If we call $Z_n = 1$ a success and $Z_n = -1$ a failure, then $X_n = \sum_{i=1}^{n} Z_i$ represents the difference between the number of successes and the number of failures in $n$ independent trials.
- If $p = \frac{1}{2}$, $\{X_n\}$ is called a symmetric random walk process.
Probability mass function of the Random Walk Process
At an instant $n$, $X_n$ can take integer values from $-n$ to $n$.
Suppose $X_n = k$. Clearly
$$k = n_1 - n_{-1}$$
where $n_1$ = number of successes and $n_{-1}$ = number of failures in $n$ trials of $Z_n$, such that $n_1 + n_{-1} = n$. Therefore
$$n_1 = \frac{n + k}{2} \quad \text{and} \quad n_{-1} = \frac{n - k}{2}$$
Also, $n_1$ and $n_{-1}$ are necessarily non-negative integers. Hence
$$p_{X_n}(k) = \begin{cases} \binom{n}{\frac{n+k}{2}}\, p^{\frac{n+k}{2}} (1 - p)^{\frac{n-k}{2}} & \text{if } \frac{n+k}{2} \text{ and } \frac{n-k}{2} \text{ are non-negative integers} \\ 0 & \text{otherwise} \end{cases}$$
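The case analysis above translates directly into code. A minimal sketch (the helper name `random_walk_pmf` is ours) computes $p_{X_n}(k)$ and confirms that the probabilities over $k = -n, \ldots, n$ sum to one:

```python
from math import comb

def random_walk_pmf(n, k, p):
    """P(X_n = k) for the random walk with steps +1 (prob p) and -1 (prob 1-p)."""
    # Nonzero only when (n+k)/2 and (n-k)/2 are non-negative integers.
    if abs(k) > n or (n + k) % 2 != 0:
        return 0.0
    n1 = (n + k) // 2                  # number of successes (+1 steps)
    return comb(n, n1) * p**n1 * (1 - p)**(n - n1)

# The pmf sums to 1 over k = -n, ..., n  (n and p are illustrative values)
n, p = 10, 0.4
total = sum(random_walk_pmf(n, k, p) for k in range(-n, n + 1))
print(total)  # 1.0 up to floating-point rounding
```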
Mean, Variance and Covariance of a Random Walk process
Note that
$$EZ_n = 1 \cdot p + (-1)(1 - p) = 2p - 1$$
$$EZ_n^2 = 1 \cdot p + 1 \cdot (1 - p) = 1$$
and
$$\operatorname{var}(Z_n) = EZ_n^2 - (EZ_n)^2 = 1 - 4p^2 + 4p - 1 = 4pq$$
Therefore
$$EX_n = \sum_{i=1}^{n} EZ_i = n(2p - 1)$$
and, since the $Z_i$'s are independent random variables,
$$\operatorname{var}(X_n) = \sum_{i=1}^{n} \operatorname{var}(Z_i) = 4npq$$
Since the random walk process $\{X_n\}$ is an independent increment process, the autocovariance function is given by
$$C_X(n_1, n_2) = 4pq \min(n_1, n_2)$$
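The autocovariance formula $C_X(n_1, n_2) = 4pq \min(n_1, n_2)$ can be cross-checked by Monte-Carlo simulation. The sketch below (all parameter values are illustrative) estimates the covariance of $X_{n_1}$ and $X_{n_2}$ from many simulated walks:

```python
import random

# Monte-Carlo estimate of C_X(n1, n2) for the random walk; the theory
# predicts 4*p*q*min(n1, n2). Parameter values are illustrative.
random.seed(7)
p, q = 0.6, 0.4
n1, n2 = 20, 50
trials = 100_000

acc = 0.0
for _ in range(trials):
    x = x1 = 0
    for step in range(1, n2 + 1):
        x += 1 if random.random() < p else -1
        if step == n1:
            x1 = x                     # record X_{n1} along the way
    # subtract the means EX_n = n(2p - 1) before averaging the product
    acc += (x1 - n1 * (2 * p - 1)) * (x - n2 * (2 * p - 1))

cov = acc / trials
print(cov, 4 * p * q * min(n1, n2))   # estimate vs. theoretical 19.2
```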
Three realizations of a random walk process are shown in the figure below:
Remark If the increments $Z_n$ of the random walk process take the values $s$ and $-s$, then
$$EX_n = \sum_{i=1}^{n} EZ_i = n(2p - 1)s$$
and
$$\operatorname{var}(X_n) = \sum_{i=1}^{n} \operatorname{var}(Z_i) = 4npqs^2$$
(c) Markov process
A process $\{X(t)\}$ is called a Markov process if for any sequence of times $t_1 < t_2 < \ldots < t_n$,
$$P(\{X(t_n) \leq x \mid X(t_1) = x_1, X(t_2) = x_2, \ldots, X(t_{n-1}) = x_{n-1}\}) = P(\{X(t_n) \leq x \mid X(t_{n-1}) = x_{n-1}\})$$
- Thus for a Markov process, "the future of the process, given the present, is independent of the past."
- A discrete-state Markov process is called a Markov chain. If $\{X_n\}$ is a discrete-time, discrete-state random process, the process is Markov if
$$P(\{X_n = x_n \mid X_0 = x_0, X_1 = x_1, \ldots, X_{n-1} = x_{n-1}\}) = P(\{X_n = x_n \mid X_{n-1} = x_{n-1}\})$$
- An iid random process is a Markov process.
- Many practical signals with strong correlation between neighbouring samples are modelled as Markov processes.
Example Show that the random walk process $\{X_n\}$ is Markov.
Here,
$$\begin{aligned}
P(\{X_n = x_n \mid X_0 = 0, X_1 = x_1, \ldots, X_{n-1} = x_{n-1}\}) &= P(\{X_{n-1} + Z_n = x_n \mid X_0 = 0, X_1 = x_1, \ldots, X_{n-1} = x_{n-1}\}) \\
&= P(\{Z_n = x_n - x_{n-1}\}) \\
&= P(\{X_n = x_n \mid X_{n-1} = x_{n-1}\})
\end{aligned}$$
Wiener Process
Consider a symmetric random walk process $\{X_n\}$ given by
$$X_n = X(n\Delta)$$
where the discrete instants on the time axis are separated by $\Delta$, so that $t = n\Delta$, as shown in the figure below. Assume $\Delta$ to be infinitesimally small.
Clearly,
$$EX_n = 0$$
$$\operatorname{var}(X_n) = 4pqns^2 = 4 \cdot \frac{1}{2} \cdot \frac{1}{2} \cdot ns^2 = ns^2$$
For large $n$, the distribution of $X_n$ approaches the normal with mean $0$ and variance
$$ns^2 = \frac{t}{\Delta} s^2 = \alpha t, \quad \text{where } \alpha = \frac{s^2}{\Delta}$$
As $\Delta \to 0$ and $n \to \infty$, $X_n$ becomes the continuous-time process $X(t)$ with the pdf
$$f_{X(t)}(x) = \frac{1}{\sqrt{2\pi \alpha t}}\, e^{-\frac{x^2}{2\alpha t}}$$
This process $\{X(t)\}$ is called the Wiener process.
A random process $\{X(t)\}$ is called a Wiener process or a Brownian motion process if it satisfies the following conditions:
(1) $X(0) = 0$
(2) $\{X(t)\}$ is an independent increment process.
(3) For each $s \geq 0$ and $t > 0$, $X(s + t) - X(s)$ has the normal distribution with mean $0$ and variance $\alpha t$:
$$f_{X(s+t) - X(s)}(x) = \frac{1}{\sqrt{2\pi \alpha t}}\, e^{-\frac{x^2}{2\alpha t}}$$
- The Wiener process was used to model Brownian motion: microscopic particles suspended in a fluid are subject to continuous molecular impacts, resulting in the zigzag motion of the particles, named Brownian motion after the British botanist Robert Brown.
- The Wiener process is the integral of the white noise process.
A realization of the Wiener process is shown in the figure below:
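A Wiener sample path is easy to generate numerically as the cumulative sum of independent $N(0, \alpha\Delta)$ increments. The sketch below (all parameter values are illustrative choices) also checks that $\operatorname{var}(X(t)) \approx \alpha t$ over many simulated paths:

```python
import math
import random

# Generate Wiener-process sample paths as cumulative sums of independent
# N(0, alpha*dt) increments, then check var(X(t)) ~ alpha*t at t = 1.
# alpha, dt and the number of paths are illustrative choices.
random.seed(0)
alpha, dt, steps, n_paths = 2.0, 0.01, 100, 20_000   # t = steps * dt = 1.0

endpoints = []
for _ in range(n_paths):
    x = 0.0                                          # X(0) = 0
    for _ in range(steps):
        x += random.gauss(0.0, math.sqrt(alpha * dt))
    endpoints.append(x)

var_t = sum(x * x for x in endpoints) / n_paths      # mean is 0, so E[X^2] = var
print(var_t)                                         # close to alpha * t = 2.0
```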
Assuming $t_1 < t_2$,
$$\begin{aligned}
R_X(t_1, t_2) &= EX(t_1) X(t_2) \\
&= EX(t_1)\big(X(t_2) - X(t_1) + X(t_1)\big) \\
&= EX(t_1)\, E\big(X(t_2) - X(t_1)\big) + EX^2(t_1) \\
&= EX^2(t_1) \\
&= \alpha t_1
\end{aligned}$$
Similarly, if $t_1 > t_2$,
$$R_X(t_1, t_2) = \alpha t_2$$
Therefore
$$R_X(t_1, t_2) = \alpha \min(t_1, t_2)$$
Remark
- $C_X(t_1, t_2) = \alpha \min(t_1, t_2)$
- $X(t)$ is a Gaussian process with first-order pdf $f_{X(t)}(x) = \frac{1}{\sqrt{2\pi \alpha t}}\, e^{-\frac{x^2}{2\alpha t}}$.