Jointly distributed random variables
We may define two or more random variables on the same sample space. Let $X$ and $Y$ be two real random variables defined on the same probability space $(S, \mathcal{F}, P)$. The mapping $S \to \mathbb{R}^2$ such that for $s \in S$, $(X(s), Y(s)) \in \mathbb{R}^2$, is called a joint random variable.

Figure: Joint Random Variable
Remark
- The above figure illustrates the mapping corresponding to a joint random variable. The joint random variable in the above case is denoted by $(X, Y)$.
- We may represent a joint random variable as a two-dimensional vector $\mathbf{X} = [X \; Y]$.
- We can extend the above definition to joint random variables of any dimension. The mapping $S \to \mathbb{R}^n$ such that for $s \in S$, $(X_1(s), X_2(s), \ldots, X_n(s)) \in \mathbb{R}^n$, is called an $n$-dimensional random variable and is denoted by the vector $\mathbf{X} = [X_1 \; X_2 \; \ldots \; X_n]$.
Example 1: Suppose we are interested in studying the height and weight of the students in a class. We can define the joint RV $(X, Y)$, where $X$ represents the height and $Y$ represents the weight.
Example 2: Suppose in a communication system $X$ is the transmitted signal and $Y$ is the corresponding noisy received signal. Then $(X, Y)$ is a joint random variable.
Joint Probability Distribution Function
Recall the definition of the distribution of a single random variable. The event $\{X \le x\}$ was used to define the probability distribution function $F_X(x)$. Given $F_X(x)$, we can find the probability of any event involving the random variable. Similarly, for two random variables $X$ and $Y$, the event $\{X \le x, Y \le y\} = \{X \le x\} \cap \{Y \le y\}$ is considered as the representative event.
The probability $P\{X \le x, Y \le y\}$, $(x, y) \in \mathbb{R}^2$, is called the joint distribution function of the random variables $X$ and $Y$ and is denoted by $F_{X,Y}(x, y)$.
Figure: the event $\{X \le x, Y \le y\}$ shown as a region of the $X$-$Y$ plane with corner point $(x, y)$
$F_{X,Y}(x, y)$ satisfies the following properties:
- $F_{X,Y}(x_1, y_1) \le F_{X,Y}(x_2, y_2)$ if $x_1 \le x_2$ and $y_1 \le y_2$.
  If $x_1 \le x_2$ and $y_1 \le y_2$, then
  $\{X \le x_1, Y \le y_1\} \subseteq \{X \le x_2, Y \le y_2\}$
  $\Rightarrow P\{X \le x_1, Y \le y_1\} \le P\{X \le x_2, Y \le y_2\}$
  $\Rightarrow F_{X,Y}(x_1, y_1) \le F_{X,Y}(x_2, y_2)$
- $F_{X,Y}(-\infty, y) = F_{X,Y}(x, -\infty) = 0.$
  Note that $\{X \le -\infty, Y \le y\} \subseteq \{X \le -\infty\}$, which has probability zero.
- $F_{X,Y}(\infty, \infty) = 1.$
- $F_{X,Y}(x, y)$ is right continuous in both the variables.
- If $x_1 < x_2$ and $y_1 < y_2$,
  $P\{x_1 < X \le x_2, y_1 < Y \le y_2\} = F_{X,Y}(x_2, y_2) - F_{X,Y}(x_1, y_2) - F_{X,Y}(x_2, y_1) + F_{X,Y}(x_1, y_1) \ge 0.$

Figure: the rectangle $\{x_1 < X \le x_2, y_1 < Y \le y_2\}$ with corners $(x_1, y_1)$ and $(x_2, y_2)$ in the $X$-$Y$ plane
Given $F_{X,Y}(x, y)$, $-\infty < x < \infty$, $-\infty < y < \infty$, we have a complete description of the random variables $X$ and $Y$.
- $F_X(x) = F_{X,Y}(x, \infty)$. To prove this,
  $\{X \le x\} = \{X \le x\} \cap \{Y \le \infty\}$
  $\Rightarrow F_X(x) = P\{X \le x\} = P\{X \le x, Y \le \infty\} = F_{X,Y}(x, \infty)$
  Similarly, $F_Y(y) = F_{X,Y}(\infty, y)$.
- Given $F_{X,Y}(x, y)$, $-\infty < x < \infty$, $-\infty < y < \infty$, each of $F_X(x)$ and $F_Y(y)$ is called a marginal distribution function.
Example: Consider two jointly distributed random variables $X$ and $Y$ with the joint CDF
$$F_{X,Y}(x, y) = \begin{cases} (1 - e^{-2x})(1 - e^{-y}) & x \ge 0,\ y \ge 0 \\ 0 & \text{otherwise} \end{cases}$$
(a) Find the marginal CDFs.
(b) Find the probability $P\{1 < X \le 2,\ 1 < Y \le 2\}$.
(a)
$$F_X(x) = \lim_{y \to \infty} F_{X,Y}(x, y) = \begin{cases} 1 - e^{-2x} & x \ge 0 \\ 0 & \text{elsewhere} \end{cases}$$
$$F_Y(y) = \lim_{x \to \infty} F_{X,Y}(x, y) = \begin{cases} 1 - e^{-y} & y \ge 0 \\ 0 & \text{elsewhere} \end{cases}$$
(b)
$$P\{1 < X \le 2,\ 1 < Y \le 2\} = F_{X,Y}(2, 2) + F_{X,Y}(1, 1) - F_{X,Y}(1, 2) - F_{X,Y}(2, 1)$$
$$= (1 - e^{-4})(1 - e^{-2}) + (1 - e^{-2})(1 - e^{-1}) - (1 - e^{-2})(1 - e^{-2}) - (1 - e^{-4})(1 - e^{-1}) = 0.0272$$
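As a quick numerical sanity check of part (b), here is a minimal Python sketch (standard library only; the helper names F_XY, F_X and F_Y are illustrative) that evaluates the rectangle probability directly from the joint CDF of this example.

```python
import math

def F_XY(x, y):
    """Joint CDF of the example: (1 - e^{-2x})(1 - e^{-y}) for x, y >= 0, else 0."""
    if x < 0 or y < 0:
        return 0.0
    return (1 - math.exp(-2 * x)) * (1 - math.exp(-y))

# Marginal CDFs obtained by letting the other variable go to infinity.
F_X = lambda x: F_XY(x, float("inf"))   # = 1 - e^{-2x} for x >= 0
F_Y = lambda y: F_XY(float("inf"), y)   # = 1 - e^{-y}  for y >= 0
print(F_X(1.0), F_Y(1.0))               # 0.8646..., 0.6321...

# Rectangle probability P{1 < X <= 2, 1 < Y <= 2} from the CDF difference formula.
p = F_XY(2, 2) + F_XY(1, 1) - F_XY(1, 2) - F_XY(2, 1)
print(round(p, 4))                      # 0.0272
```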
Jointly distributed discrete random variables
If $X$ and $Y$ are two discrete random variables defined on the same probability space $(S, \mathcal{F}, P)$ such that $X$ takes values from the countable subset $R_X$ and $Y$ takes values from the countable subset $R_Y$, then the joint random variable $(X, Y)$ takes values from the countable set $R_X \times R_Y$. The joint random variable $(X, Y)$ is completely specified by the joint probability mass function
$$p_{X,Y}(x, y) = P\{s \mid X(s) = x, Y(s) = y\}, \quad (x, y) \in R_X \times R_Y.$$
Given $p_{X,Y}(x, y)$, we can determine other probabilities involving the random variables $X$ and $Y$.
Remark
- $p_{X,Y}(x, y) \ge 0$ for $(x, y) \in R_X \times R_Y$
- $\sum_{(x, y) \in R_X \times R_Y} p_{X,Y}(x, y) = 1$
  This is because
  $$\sum_{(x, y) \in R_X \times R_Y} p_{X,Y}(x, y) = P\Big(\bigcup_{(x, y) \in R_X \times R_Y} \{X = x, Y = y\}\Big) = P\{(X, Y) \in R_X \times R_Y\} = P\{s \mid (X(s), Y(s)) \in R_X \times R_Y\} = P(S) = 1$$
Marginal Probability Mass Functions
The probability mass functions $p_X(x)$ and $p_Y(y)$ are obtained from the joint probability mass function as follows:
$$p_X(x) = P\{X = x\} = \sum_{y \in R_Y} p_{X,Y}(x, y)$$
and similarly
$$p_Y(y) = \sum_{x \in R_X} p_{X,Y}(x, y)$$
These probability mass functions $p_X(x)$ and $p_Y(y)$, obtained from the joint probability mass function, are called marginal probability mass functions.
Example: Consider the random variables $X$ and $Y$ with the joint probability mass function as tabulated below. The marginal probabilities are shown in the last column and the last row.

X \ Y      0       1       p_X(x)
0          0.25    0.14    0.39
1          0.10    0.35    0.45
2          0.15    0.01    0.16
p_Y(y)     0.50    0.50    1
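The marginal pmfs in this table are just the row and column sums of the joint pmf, so they are easy to verify programmatically. Below is a minimal sketch (assuming numpy is available; the table above is hard-coded as an array):

```python
import numpy as np

# Joint pmf p_{X,Y}(x, y); rows correspond to X = 0, 1, 2 and columns to Y = 0, 1.
p_XY = np.array([[0.25, 0.14],
                 [0.10, 0.35],
                 [0.15, 0.01]])

p_X = p_XY.sum(axis=1)   # sum over y -> marginal pmf of X: [0.39 0.45 0.16]
p_Y = p_XY.sum(axis=0)   # sum over x -> marginal pmf of Y: [0.5 0.5]

print(p_X, p_Y, p_XY.sum())   # the joint pmf sums to 1
```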
Joint Probability Density Function
If $X$ and $Y$ are two continuous random variables and their joint distribution function is continuous in both $x$ and $y$, then we can define the joint probability density function $f_{X,Y}(x, y)$ by
$$f_{X,Y}(x, y) = \frac{\partial^2}{\partial x \, \partial y} F_{X,Y}(x, y), \quad \text{provided it exists.}$$
Clearly,
$$F_{X,Y}(x, y) = \int_{-\infty}^{x} \int_{-\infty}^{y} f_{X,Y}(u, v)\, dv\, du$$
Properties of Joint Probability Density Function
- $f_{X,Y}(x, y)$ is always a non-negative quantity. That is,
  $$f_{X,Y}(x, y) \ge 0 \quad \forall (x, y) \in \mathbb{R}^2$$
- $\int_{-\infty}^{\infty} \int_{-\infty}^{\infty} f_{X,Y}(x, y)\, dx\, dy = 1$
- The probability of any Borel set $B$ can be obtained by
  $$P(B) = \iint_{(x, y) \in B} f_{X,Y}(x, y)\, dx\, dy$$
  (a numerical illustration is sketched below).
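To illustrate the last property, the following sketch approximates $P(B)$ by a Riemann sum for an assumed joint pdf, namely the uniform density on the unit square (so $f_{X,Y} = 1$ there), and the Borel set $B = \{(x, y) : x^2 + y^2 \le 1\}$. This example is not from the text; the exact answer is $\pi/4 \approx 0.785$.

```python
import numpy as np

def f_XY(x, y):
    # Assumed example pdf: X, Y jointly uniform on [0,1]^2, so f_{X,Y}(x,y) = 1 there.
    return np.where((0 <= x) & (x <= 1) & (0 <= y) & (y <= 1), 1.0, 0.0)

n = 1000
xs = (np.arange(n) + 0.5) / n               # midpoints of n cells covering [0, 1]
x, y = np.meshgrid(xs, xs)
inside_B = x**2 + y**2 <= 1                 # Borel set B: quarter disc
dA = (1.0 / n) ** 2                         # area of each grid cell
print(np.sum(f_XY(x, y) * inside_B) * dA)   # ~ 0.785 = pi/4
```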
Marginal density functions
The marginal density functions $f_X(x)$ and $f_Y(y)$ of two joint RVs $X$ and $Y$ are given by the derivatives of the corresponding marginal distribution functions. Thus
$$f_X(x) = \frac{d}{dx} F_X(x) = \frac{d}{dx} F_{X,Y}(x, \infty) = \frac{d}{dx} \int_{-\infty}^{x} \left( \int_{-\infty}^{\infty} f_{X,Y}(u, y)\, dy \right) du = \int_{-\infty}^{\infty} f_{X,Y}(x, y)\, dy$$
and similarly
$$f_Y(y) = \int_{-\infty}^{\infty} f_{X,Y}(x, y)\, dx$$
Remark
- The marginal CDF and pdf are the same as the CDF and pdf of the concerned single random variable. The term marginal simply indicates that they are derived from the corresponding joint distribution or density function of two or more jointly distributed random variables.
- With the help of the two-dimensional Dirac delta function, we can define the joint pdf of two jointly distributed discrete random variables. Thus, for discrete jointly distributed random variables $X$ and $Y$,
  $$f_{X,Y}(x, y) = \sum_{(x_i, y_j) \in R_X \times R_Y} p_{X,Y}(x_i, y_j)\, \delta(x - x_i, y - y_j).$$
Example: The joint density function $f_{X,Y}(x, y)$ corresponding to the joint CDF of the earlier example is
$$f_{X,Y}(x, y) = \frac{\partial^2}{\partial x \, \partial y} F_{X,Y}(x, y) = \frac{\partial^2}{\partial x \, \partial y} \left[ (1 - e^{-2x})(1 - e^{-y}) \right] = 2 e^{-2x} e^{-y}, \quad x \ge 0,\ y \ge 0$$
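As a quick numerical check of this density (assuming numpy and scipy are available): the pdf should integrate to 1 over the first quadrant, and integrating out $y$ at a fixed $x$ should reproduce the marginal $f_X(x) = 2e^{-2x}$.

```python
import numpy as np
from scipy import integrate

# Joint pdf from the example: f_{X,Y}(x, y) = 2 e^{-2x} e^{-y} for x, y >= 0.
f_XY = lambda x, y: 2 * np.exp(-2 * x) * np.exp(-y)

# Total probability: dblquad integrates func(y, x) over x in [0, inf), y in [0, inf).
total, _ = integrate.dblquad(lambda y, x: f_XY(x, y), 0, np.inf, 0, np.inf)
print(total)   # ~ 1.0

# Marginal f_X(x) at x = 0.5 by integrating out y; analytically 2 e^{-1} = 0.7357...
fx, _ = integrate.quad(lambda y: f_XY(0.5, y), 0, np.inf)
print(fx, 2 * np.exp(-1.0))
```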
Example: The joint pdf of two random variables $X$ and $Y$ is given by
$$f_{X,Y}(x, y) = \begin{cases} cxy & 0 \le x \le 2,\ 0 \le y \le 2 \\ 0 & \text{otherwise} \end{cases}$$
(i) Find $c$.
(ii) Find $F_{X,Y}(x, y)$.
(iii) Find $f_X(x)$ and $f_Y(y)$.
(iv) What is the probability $P(0 < X \le 1,\ 0 < Y \le 1)$?

(i) $\int_{-\infty}^{\infty} \int_{-\infty}^{\infty} f_{X,Y}(x, y)\, dy\, dx = c \int_0^2 \int_0^2 xy\, dy\, dx = 4c = 1$, so that $c = \dfrac{1}{4}$.
(ii) $F_{X,Y}(x, y) = \dfrac{1}{4} \int_0^y \int_0^x uv\, du\, dv = \dfrac{x^2 y^2}{16}$ for $0 \le x \le 2,\ 0 \le y \le 2$.
(iii) $f_X(x) = \int_0^2 \dfrac{xy}{4}\, dy = \dfrac{x}{2}$ for $0 \le x \le 2$, and similarly $f_Y(y) = \dfrac{y}{2}$ for $0 \le y \le 2$.
(iv)
$$P(0 < X \le 1,\ 0 < Y \le 1) = F_{X,Y}(1, 1) + F_{X,Y}(0, 0) - F_{X,Y}(0, 1) - F_{X,Y}(1, 0) = \frac{1}{16} + 0 - 0 - 0 = \frac{1}{16}$$
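As a check on parts (i)-(iv), here is a short SymPy sketch (assuming sympy is installed) that redoes the integrals symbolically.

```python
import sympy as sp

x, y, u, v, c = sp.symbols('x y u v c', nonnegative=True)

# (i) Normalization: c * integral of x*y over the square [0,2]x[0,2] must equal 1.
c_val = sp.solve(sp.Eq(c * sp.integrate(x * y, (x, 0, 2), (y, 0, 2)), 1), c)[0]
print(c_val)                           # 1/4

f_XY = c_val * x * y                   # joint pdf on 0 <= x, y <= 2

# (ii) Joint CDF F_{X,Y}(x, y) for 0 <= x <= 2, 0 <= y <= 2.
F_XY = sp.integrate(f_XY.subs({x: u, y: v}), (u, 0, x), (v, 0, y))
print(sp.simplify(F_XY))               # x**2*y**2/16

# (iii) Marginal pdfs on their supports.
print(sp.integrate(f_XY, (y, 0, 2)))   # x/2
print(sp.integrate(f_XY, (x, 0, 2)))   # y/2

# (iv) P(0 < X <= 1, 0 < Y <= 1) from the CDF.
print(F_XY.subs({x: 1, y: 1}))         # 1/16
```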