Some Additional Topics
Distributions of Functions of Random Variables
Gamma distribution, χ² distribution, exponential distribution
Theorem
Let X and Y denote independent random variables having gamma distributions with parameters (λ, α₁) and (λ, α₂), respectively. Then W = X + Y has a gamma distribution with parameters (λ, α₁ + α₂).
Proof:
$$m_X(t) = \left(\frac{\lambda}{\lambda - t}\right)^{\alpha_1} \quad \text{and} \quad m_Y(t) = \left(\frac{\lambda}{\lambda - t}\right)^{\alpha_2}$$
Therefore
$$m_{X+Y}(t) = m_X(t)\, m_Y(t) = \left(\frac{\lambda}{\lambda - t}\right)^{\alpha_1} \left(\frac{\lambda}{\lambda - t}\right)^{\alpha_2} = \left(\frac{\lambda}{\lambda - t}\right)^{\alpha_1 + \alpha_2}$$
Recognizing that this is the moment generating function of the gamma distribution with parameters (λ, α₁ + α₂), we conclude that W = X + Y has a gamma distribution with parameters (λ, α₁ + α₂).
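To see the theorem numerically, here is a minimal simulation sketch (numpy and scipy assumed; the values λ = 2, α₁ = 3, α₂ = 5 are chosen only for illustration) comparing X + Y with the Gamma(λ, α₁ + α₂) distribution:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
lam, a1, a2 = 2.0, 3.0, 5.0        # illustrative parameter values

# numpy parameterizes the gamma by (shape, scale) with scale = 1/lambda
x = rng.gamma(shape=a1, scale=1/lam, size=100_000)
y = rng.gamma(shape=a2, scale=1/lam, size=100_000)

# Kolmogorov-Smirnov test of X + Y against Gamma(lambda, a1 + a2)
d, p = stats.kstest(x + y, stats.gamma(a=a1 + a2, scale=1/lam).cdf)
print(f"KS statistic: {d:.4f}, p-value: {p:.3f}")   # a large p-value is consistent
```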
Theorem (extension to n RVs)
Let X₁, X₂, …, Xₙ denote n independent random variables, each having a gamma distribution with parameters (λ, αᵢ), i = 1, 2, …, n. Then W = X₁ + X₂ + … + Xₙ has a gamma distribution with parameters (λ, α₁ + α₂ + … + αₙ).
Proof:
$$m_{X_i}(t) = \left(\frac{\lambda}{\lambda - t}\right)^{\alpha_i}, \quad i = 1, 2, \ldots, n$$
Therefore
$$m_{X_1 + X_2 + \cdots + X_n}(t) = m_{X_1}(t)\, m_{X_2}(t) \cdots m_{X_n}(t) = \left(\frac{\lambda}{\lambda - t}\right)^{\alpha_1} \left(\frac{\lambda}{\lambda - t}\right)^{\alpha_2} \cdots \left(\frac{\lambda}{\lambda - t}\right)^{\alpha_n} = \left(\frac{\lambda}{\lambda - t}\right)^{\alpha_1 + \alpha_2 + \cdots + \alpha_n}$$
Recognizing that this is the moment generating function of the gamma distribution with parameters (λ, α₁ + α₂ + … + αₙ), we conclude that W = X₁ + X₂ + … + Xₙ has a gamma distribution with parameters (λ, α₁ + α₂ + … + αₙ).
Theorem
Suppose that X is a random variable having a gamma distribution with parameters (λ, α). Then W = aX has a gamma distribution with parameters (λ/a, α).
Proof:
$$m_X(t) = \left(\frac{\lambda}{\lambda - t}\right)^{\alpha}$$
then
$$m_{aX}(t) = m_X(at) = \left(\frac{\lambda}{\lambda - at}\right)^{\alpha} = \left(\frac{\lambda/a}{\lambda/a - t}\right)^{\alpha}$$
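The same kind of check works for the scaling theorem (again only a sketch; numpy/scipy assumed, parameter values illustrative): aX should match a gamma distribution with rate λ/a.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
lam, alpha, a = 2.0, 3.0, 5.0      # illustrative parameter values

w = a * rng.gamma(shape=alpha, scale=1/lam, size=100_000)

# Theory: W = aX ~ Gamma(lambda/a, alpha), i.e. scale = a/lambda
d, p = stats.kstest(w, stats.gamma(a=alpha, scale=a/lam).cdf)
print(f"KS statistic: {d:.4f}, p-value: {p:.3f}")
```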
Special Cases
1. Let X and Y be independent random variables having an exponential distribution with parameter λ. Then X + Y has a gamma distribution with α = 2 and λ.
2. Let X₁, X₂, …, Xₙ be independent random variables having an exponential distribution with parameter λ. Then S = X₁ + X₂ + … + Xₙ has a gamma distribution with α = n and λ.
3. Let X₁, X₂, …, Xₙ be independent random variables having an exponential distribution with parameter λ. Then
$$\bar{x} = \frac{S}{n} = \frac{X_1 + \cdots + X_n}{n}$$
has a gamma distribution with α = n and nλ.
Distribution of x̄ (population: exponential distribution)
[Figure: the population density ("pop'n") together with the density of x̄ for n = 4, 10, 15, and 20, plotted over the range 0 to 20.]
Another illustration of the central limit theorem
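Special case 3 is what the figure illustrates. A brief simulation sketch (numpy/scipy assumed; λ = ¼ and n = 10 are illustrative) confirming that the sample mean of n exponentials is Gamma(nλ, n):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
lam, n = 0.25, 10                  # illustrative rate and sample size

# 50,000 sample means, each of n i.i.d. Exponential(lambda) draws
xbar = rng.exponential(scale=1/lam, size=(50_000, n)).mean(axis=1)

# Theory: xbar ~ Gamma(alpha = n, rate = n*lambda), so scale = 1/(n*lambda)
d, p = stats.kstest(xbar, stats.gamma(a=n, scale=1/(n*lam)).cdf)
print(f"KS statistic: {d:.4f}, p-value: {p:.3f}")
```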
Special Cases (continued)
4. Let X and Y be independent random variables having χ² distributions with ν₁ and ν₂ degrees of freedom, respectively. Then X + Y has a χ² distribution with ν₁ + ν₂ degrees of freedom.
5. Let X₁, X₂, …, Xₙ be independent random variables having χ² distributions with ν₁, ν₂, …, νₙ degrees of freedom, respectively. Then X₁ + X₂ + … + Xₙ has a χ² distribution with ν₁ + … + νₙ degrees of freedom.
Both of these properties follow from the fact that a χ² random variable with ν degrees of freedom is a gamma random variable with λ = ½ and α = ν/2.
Recall
If z has a standard normal distribution, then z² has a χ² distribution with 1 degree of freedom. Thus if z₁, z₂, …, zₙ are independent random variables, each having a standard normal distribution, then
$$U = z_1^2 + z_2^2 + \cdots + z_n^2$$
has a χ² distribution with n degrees of freedom.
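A quick simulation sketch of this fact (numpy/scipy assumed; n = 5 is illustrative):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n = 5                              # illustrative degrees of freedom

# Sum of n squared standard normals, 100,000 replicates
u = (rng.standard_normal(size=(100_000, n)) ** 2).sum(axis=1)

# Compare with the chi-square distribution with n degrees of freedom
d, p = stats.kstest(u, stats.chi2(df=n).cdf)
print(f"KS statistic: {d:.4f}, p-value: {p:.3f}")
```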
Theorem
Suppose that U₁ and U₂ are independent random variables and that U = U₁ + U₂. Suppose that U₁ and U have χ² distributions with ν₁ and ν degrees of freedom, respectively (ν₁ < ν). Then U₂ has a χ² distribution with ν₂ = ν − ν₁ degrees of freedom.
Proof:
Now
$$m_{U_1}(t) = \left(\frac{1/2}{1/2 - t}\right)^{\nu_1/2} \quad \text{and} \quad m_U(t) = \left(\frac{1/2}{1/2 - t}\right)^{\nu/2}$$
Also
$$m_U(t) = m_{U_1}(t)\, m_{U_2}(t)$$
Hence
$$m_{U_2}(t) = \frac{m_U(t)}{m_{U_1}(t)} = \left(\frac{1/2}{1/2 - t}\right)^{(\nu - \nu_1)/2}$$
which is the moment generating function of the χ² distribution with ν₂ = ν − ν₁ degrees of freedom. Q.E.D.
Bivariate Distributions
Discrete Random Variables
The joint probability function:
$$p(x, y) = P[X = x, Y = y]$$
1. $0 \le p(x, y) \le 1$
2. $\sum_x \sum_y p(x, y) = 1$
3. $P[(X, Y) \in A] = \sum_{(x, y) \in A} p(x, y)$
Marginal distributions
$$p_X(x) = P[X = x] = \sum_y p(x, y)$$
$$p_Y(y) = P[Y = y] = \sum_x p(x, y)$$
Conditional distributions
p  x, y 
p X Y  x y   P  X  x Y  y  
pY  y 
p  x, y 
pY X  y x   P Y  y X  x  
pX  x 
The product rule for discrete distributions
$$p(x, y) = p_Y(y)\, p_{X|Y}(x \mid y) = p_X(x)\, p_{Y|X}(y \mid x)$$
Independence
p  x, y   pX  x  pY  y 
Bayes rule for discrete distributions
$$p_{X|Y}(x \mid y) = \frac{p_X(x)\, p_{Y|X}(y \mid x)}{\sum_u p_X(u)\, p_{Y|X}(y \mid u)}$$
Proof:
$$p_{X|Y}(x \mid y) = \frac{p(x, y)}{p_Y(y)} = \frac{p(x, y)}{\sum_u p(u, y)} = \frac{p_X(x)\, p_{Y|X}(y \mid x)}{\sum_u p_X(u)\, p_{Y|X}(y \mid u)}$$
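As a concrete illustration of the discrete Bayes rule, here is a small sketch in pure Python; the two-coin setup and all of its numbers are invented for this example:

```python
from math import comb

# Hypothetical setup: X is which of two coins was picked,
# Y is the number of heads in two tosses of that coin.
p_X = {"fair": 0.5, "biased": 0.5}        # prior p_X(x)
p_head = {"fair": 0.5, "biased": 0.8}     # per-toss probability of heads

def p_Y_given_X(y, x):
    """p_{Y|X}(y|x): Binomial(2, p_head[x]) probability of y heads."""
    p = p_head[x]
    return comb(2, y) * p**y * (1 - p) ** (2 - y)

def p_X_given_Y(x, y):
    """Bayes rule: p_X(x) p_{Y|X}(y|x) / sum_u p_X(u) p_{Y|X}(y|u)."""
    denom = sum(p_X[u] * p_Y_given_X(y, u) for u in p_X)
    return p_X[x] * p_Y_given_X(y, x) / denom

print(p_X_given_Y("biased", 2))   # posterior probability the coin is biased
```

Note that the sum over u in the denominator is exactly the marginal p_Y(y).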
Continuous Random Variables
Definition: Two random variables are said to have joint probability density function f(x, y) if
1. $f(x, y) \ge 0$
2. $\int_{-\infty}^{\infty} \int_{-\infty}^{\infty} f(x, y)\, dx\, dy = 1$
3. $P[(X, Y) \in A] = \iint_A f(x, y)\, dx\, dy$
Marginal distributions
$$f_X(x) = \int_{-\infty}^{\infty} f(x, y)\, dy$$
$$f_Y(y) = \int_{-\infty}^{\infty} f(x, y)\, dx$$
Conditional distributions
$$f_{Y|X}(y \mid x) = \frac{f(x, y)}{f_X(x)} \qquad f_{X|Y}(x \mid y) = \frac{f(x, y)}{f_Y(y)}$$
The product rule for continuous distributions
$$f(x, y) = f_Y(y)\, f_{X|Y}(x \mid y) = f_X(x)\, f_{Y|X}(y \mid x)$$
Independence
f  x, y   f X  x  fY  y 
Bayes rule for continuous distributions
$$f_{X|Y}(x \mid y) = \frac{f_X(x)\, f_{Y|X}(y \mid x)}{\int_{-\infty}^{\infty} f_X(u)\, f_{Y|X}(y \mid u)\, du}$$
Proof:
$$f_{X|Y}(x \mid y) = \frac{f(x, y)}{f_Y(y)} = \frac{f(x, y)}{\int_{-\infty}^{\infty} f(u, y)\, du} = \frac{f_X(x)\, f_{Y|X}(y \mid x)}{\int_{-\infty}^{\infty} f_X(u)\, f_{Y|X}(y \mid u)\, du}$$
Example
• Suppose that to perform a task we must first recognize the task, then perform it.
• Suppose that the time to recognize the task, X, has an exponential distribution with λ = ¼ (i.e., mean μ = 1/λ = 4).
• Once the task is recognized, the time to perform the task, Y, is uniform from x/2 to 2x.
1. Find the joint density of X and Y.
2. Find the conditional density of X given Y = y.
Now
$$f_X(x) = \begin{cases} \frac{1}{4} e^{-x/4} & x \ge 0 \\ 0 & x < 0 \end{cases}$$
and
$$f_{Y|X}(y \mid x) = \begin{cases} \dfrac{1}{2x - \frac{x}{2}} = \dfrac{2}{3x} & \dfrac{x}{2} \le y \le 2x \\ 0 & y < \dfrac{x}{2} \text{ or } y > 2x \end{cases}$$
Thus
$$f(x, y) = f_X(x)\, f_{Y|X}(y \mid x) = \begin{cases} \dfrac{1}{4} e^{-x/4} \cdot \dfrac{2}{3x} = \dfrac{1}{6x} e^{-x/4} & x \ge 0,\ \dfrac{x}{2} \le y \le 2x \\ 0 & \text{otherwise} \end{cases}$$
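A small numeric sanity check of this joint density (scipy assumed; this verification sketch is not part of the original derivation): integrating f(x, y) over the region x ≥ 0, x/2 ≤ y ≤ 2x should give 1.

```python
import numpy as np
from scipy import integrate

# Joint density on the region x/2 <= y <= 2x (zero elsewhere)
f = lambda y, x: np.exp(-x / 4) / (6 * x)

# Inner integral over y in [x/2, 2x], outer integral over x in (0, infinity)
total, _ = integrate.dblquad(f, 0, np.inf, lambda x: x / 2, lambda x: 2 * x)
print(total)   # should be very close to 1.0
```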
Graph of the non-zero region of f(x, y): the wedge in the (x, y) plane between the lines y = x/2 and y = 2x, for x ≥ 0.
Bayes rule for continuous distributions
$$f_{X|Y}(x \mid y) = \frac{f_X(x)\, f_{Y|X}(y \mid x)}{\int_{-\infty}^{\infty} f_X(u)\, f_{Y|X}(y \mid u)\, du} = \frac{\frac{1}{6x} e^{-x/4}}{\int_{y/2}^{2y} \frac{1}{6u} e^{-u/4}\, du} = \frac{\frac{1}{x} e^{-x/4}}{\int_{y/2}^{2y} \frac{1}{u} e^{-u/4}\, du} \quad \text{for } \frac{y}{2} \le x \le 2y,\ y > 0$$
(The condition x/2 ≤ y ≤ 2x is equivalent to y/2 ≤ x ≤ 2y, which gives both the support in x and the limits of integration.)
Conditional Expectation
Let U = g(X, Y) denote any function of X and Y. Then
$$E[U \mid x] = E[g(X, Y) \mid x] = \int_{-\infty}^{\infty} g(x, y)\, f_{Y|X}(y \mid x)\, dy = h(x)$$
is called the conditional expectation of U = g(X, Y) given X = x.
Conditional Expectation and Variance
More specifically,
$$\mu_{Y|x} = E[Y \mid x] = \int_{-\infty}^{\infty} y\, f_{Y|X}(y \mid x)\, dy$$
is called the conditional expectation of Y given X = x, and
$$\sigma_{Y|x}^2 = E\big[(Y - \mu_{Y|x})^2 \mid x\big] = \int_{-\infty}^{\infty} (y - \mu_{Y|x})^2\, f_{Y|X}(y \mid x)\, dy$$
is called the conditional variance of Y given X = x.
An Important Rule
$$E[U] = E[g(X, Y)] = E_X\big[E[U \mid x]\big]$$
and
$$\mathrm{Var}[U] = E_X\big[\mathrm{Var}[U \mid x]\big] + \mathrm{Var}_X\big[E[U \mid x]\big]$$
where $E_X$ and $\mathrm{Var}_X$ denote the mean and variance with respect to the marginal distribution of X, $f_X(x)$.
Proof
Let U = g(X, Y) denote any function of X and Y. Then
$$E[U \mid x] = E[g(X, Y) \mid x] = \int_{-\infty}^{\infty} g(x, y)\, f_{Y|X}(y \mid x)\, dy = h(x)$$
so
$$E_X\big[E[U \mid x]\big] = E_X[h(X)] = \int_{-\infty}^{\infty} h(x)\, f_X(x)\, dx = \int_{-\infty}^{\infty} \left[\int_{-\infty}^{\infty} g(x, y)\, f_{Y|X}(y \mid x)\, dy\right] f_X(x)\, dx$$
$$= \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} g(x, y)\, f_{Y|X}(y \mid x)\, f_X(x)\, dx\, dy = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} g(x, y)\, f(x, y)\, dx\, dy = E[g(X, Y)] = E[U]$$
Now
$$\mathrm{Var}[U] = E[U^2] - \big(E[U]\big)^2 = E_X\big[E[U^2 \mid x]\big] - \Big(E_X\big[E[U \mid x]\big]\Big)^2$$
$$= E_X\Big[\mathrm{Var}[U \mid x] + \big(E[U \mid x]\big)^2\Big] - \Big(E_X\big[E[U \mid x]\big]\Big)^2$$
$$= E_X\big[\mathrm{Var}[U \mid x]\big] + E_X\Big[\big(E[U \mid x]\big)^2\Big] - \Big(E_X\big[E[U \mid x]\big]\Big)^2 = E_X\big[\mathrm{Var}[U \mid x]\big] + \mathrm{Var}_X\big[E[U \mid x]\big]$$
Example
• Suppose that to perform a task we must first recognize the task, then perform it.
• Suppose that the time to recognize the task, X, has an exponential distribution with λ = ¼ (i.e., mean μ = 1/λ = 4).
• Once the task is recognized, the time to perform the task, Y, is uniform from x/2 to 2x.
1. Find E[XY].
2. Find Var[XY].
Solution
$$E[XY \mid x] = x\, E[Y \mid x] = x \left(\frac{\frac{x}{2} + 2x}{2}\right) = \tfrac{5}{4} x^2$$
so
$$E[XY] = E_X\big[E[XY \mid x]\big] = \tfrac{5}{4}\, E_X[X^2]$$
For the exponential distribution with λ = ¼,
$$E_X[X^2] = \mu_2' = \frac{2}{\lambda^2} = 2(4^2) = 32$$
Thus $E[XY] = \tfrac{5}{4}(32) = 40$.
Next,
$$\mathrm{Var}[XY \mid x] = x^2\, \mathrm{Var}[Y \mid x] = x^2\, \frac{\left(2x - \frac{x}{2}\right)^2}{12} = x^2 \cdot \frac{9x^2/4}{12} = \tfrac{3}{16}\, x^4$$
Hence
$$E_X\big[\mathrm{Var}[XY \mid x]\big] = \tfrac{3}{16}\, E_X[X^4] = \tfrac{3}{16} \cdot \frac{4!}{\lambda^4} = \tfrac{3}{16}(24)(4^4) = 1152$$
Also
$$\mathrm{Var}_X\big[E[XY \mid x]\big] = \mathrm{Var}_X\big[\tfrac{5}{4} X^2\big] = \tfrac{25}{16}\, \mathrm{Var}_X[X^2]$$
where
$$\mathrm{Var}_X[X^2] = E_X[X^4] - \big(E_X[X^2]\big)^2 = \frac{4!}{\lambda^4} - \left(\frac{2}{\lambda^2}\right)^2 = (24 - 4)(4^4) = 20(4^4) = 5120$$
so $\mathrm{Var}_X\big[E[XY \mid x]\big] = \tfrac{25}{16}(5120) = 8000$. Finally,
$$\mathrm{Var}[XY] = E_X\big[\mathrm{Var}[XY \mid x]\big] + \mathrm{Var}_X\big[E[XY \mid x]\big] = 1152 + 8000 = 9152$$
Conditional Expectation: k (> 2) Random Variables
Definition
Let X₁, X₂, …, X_q, X_{q+1}, …, X_k denote k continuous random variables with joint probability density function f(x₁, x₂, …, x_q, x_{q+1}, …, x_k). Then the conditional joint probability function of X₁, X₂, …, X_q given X_{q+1} = x_{q+1}, …, X_k = x_k is
$$f_{1 \cdots q \mid q+1 \cdots k}\left(x_1, \ldots, x_q \mid x_{q+1}, \ldots, x_k\right) = \frac{f\left(x_1, \ldots, x_k\right)}{f_{q+1 \cdots k}\left(x_{q+1}, \ldots, x_k\right)}$$
Definition
Let U = h(X₁, X₂, …, X_q, X_{q+1}, …, X_k). Then the conditional expectation of U given X_{q+1} = x_{q+1}, …, X_k = x_k is
$$E\left[U \mid x_{q+1}, \ldots, x_k\right] = \int \cdots \int h\left(x_1, \ldots, x_k\right) f_{1 \cdots q \mid q+1 \cdots k}\left(x_1, \ldots, x_q \mid x_{q+1}, \ldots, x_k\right) dx_1 \cdots dx_q$$
Note that this will be a function of x_{q+1}, …, x_k.
Example
Let X, Y, Z denote three jointly distributed random variables with joint density function
$$f(x, y, z) = \begin{cases} \frac{12}{7}\left(x^2 + yz\right) & 0 \le x \le 1,\ 0 \le y \le 1,\ 0 \le z \le 1 \\ 0 & \text{otherwise} \end{cases}$$
Determine the conditional expectation of U = X² + Y + Z given X = x, Y = y.
The marginal distribution of X, Y:
$$f_{12}(x, y) = \frac{12}{7}\left(x^2 + \frac{y}{2}\right) \quad \text{for } 0 \le x \le 1,\ 0 \le y \le 1$$
Thus the conditional distribution of Z given X = x, Y = y is
$$\frac{f(x, y, z)}{f_{12}(x, y)} = \frac{\frac{12}{7}\left(x^2 + yz\right)}{\frac{12}{7}\left(x^2 + \frac{y}{2}\right)} = \frac{x^2 + yz}{x^2 + \frac{y}{2}} \quad \text{for } 0 \le z \le 1$$
The conditional expectation of U = X² + Y + Z given X = x, Y = y:
$$E[U \mid x, y] = \int_0^1 \left(x^2 + y + z\right) \frac{x^2 + yz}{x^2 + \frac{y}{2}}\, dz = \frac{1}{x^2 + \frac{y}{2}} \int_0^1 \left(x^2 + y + z\right)\left(x^2 + yz\right) dz$$
$$= \frac{1}{x^2 + \frac{y}{2}} \int_0^1 \left[y z^2 + \left(y\left(x^2 + y\right) + x^2\right) z + x^2\left(x^2 + y\right)\right] dz$$
$$= \frac{1}{x^2 + \frac{y}{2}} \left[y\, \frac{z^3}{3} + \left(y\left(x^2 + y\right) + x^2\right) \frac{z^2}{2} + x^2\left(x^2 + y\right) z\right]_{z=0}^{z=1}$$
$$= \frac{1}{x^2 + \frac{y}{2}} \left[\frac{y}{3} + \frac{y\left(x^2 + y\right) + x^2}{2} + x^2\left(x^2 + y\right)\right]$$
Thus the conditional expectation of U = X² + Y + Z given X = x, Y = y simplifies to
$$E[U \mid x, y] = \left(x^2 + y\right) + \frac{\frac{x^2}{2} + \frac{y}{3}}{x^2 + \frac{y}{2}}$$
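A brief numeric sketch (scipy assumed; the test point (x, y) = (0.5, 0.5) is arbitrary) checking the simplified closed form against direct numerical integration:

```python
from scipy import integrate

def cond_E_numeric(x, y):
    """E[X^2 + Y + Z | x, y] by integrating against the conditional density of Z."""
    dens = lambda z: (x**2 + y * z) / (x**2 + y / 2)
    val, _ = integrate.quad(lambda z: (x**2 + y + z) * dens(z), 0, 1)
    return val

def cond_E_closed(x, y):
    """The simplified closed form derived above."""
    return (x**2 + y) + (x**2 / 2 + y / 3) / (x**2 + y / 2)

x, y = 0.5, 0.5    # arbitrary point in the unit square
print(cond_E_numeric(x, y), cond_E_closed(x, y))   # the two should agree
```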
The rule for Conditional Expectation
Let (x₁, x₂, …, x_q, y₁, y₂, …, y_m) = (x, y) denote q + m random variables, and let
$$U = g\left(x_1, \ldots, x_q, y_1, \ldots, y_m\right) = g(\mathbf{x}, \mathbf{y})$$
Then
$$E[U] = E_{\mathbf{y}}\big[E[U \mid \mathbf{y}]\big]$$
$$\mathrm{Var}[U] = E_{\mathbf{y}}\big[\mathrm{Var}[U \mid \mathbf{y}]\big] + \mathrm{Var}_{\mathbf{y}}\big[E[U \mid \mathbf{y}]\big]$$
Proof (in the simple case of two variables X and Y)
Here U = g(X, Y), so
$$E[U] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} g(x, y)\, f(x, y)\, dx\, dy$$
and
$$E[U \mid y] = E[g(X, Y) \mid y] = \int_{-\infty}^{\infty} g(x, y)\, f_{X|Y}(x \mid y)\, dx = \int_{-\infty}^{\infty} g(x, y)\, \frac{f(x, y)}{f_Y(y)}\, dx$$
hence
$$E_Y\big[E[U \mid y]\big] = \int_{-\infty}^{\infty} E[U \mid y]\, f_Y(y)\, dy = \int_{-\infty}^{\infty} \left[\int_{-\infty}^{\infty} g(x, y)\, \frac{f(x, y)}{f_Y(y)}\, dx\right] f_Y(y)\, dy$$
$$= \int_{-\infty}^{\infty} \left[\int_{-\infty}^{\infty} g(x, y)\, f(x, y)\, dx\right] dy = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} g(x, y)\, f(x, y)\, dx\, dy = E[U]$$
Now
$$\mathrm{Var}[U] = E[U^2] - \big(E[U]\big)^2 = E_Y\big[E[U^2 \mid y]\big] - \Big(E_Y\big[E[U \mid y]\big]\Big)^2$$
$$= E_Y\Big[\mathrm{Var}[U \mid y] + \big(E[U \mid y]\big)^2\Big] - \Big(E_Y\big[E[U \mid y]\big]\Big)^2$$
$$= E_Y\big[\mathrm{Var}[U \mid y]\big] + E_Y\Big[\big(E[U \mid y]\big)^2\Big] - \Big(E_Y\big[E[U \mid y]\big]\Big)^2 = E_Y\big[\mathrm{Var}[U \mid y]\big] + \mathrm{Var}_Y\big[E[U \mid y]\big]$$
The probability of a gambler's ruin
• Suppose a gambler is playing a game in which he wins $1 with probability p and loses $1 with probability q.
• Note the game is fair if p = q = ½.
• Suppose also that he starts with an initial fortune of $i and plays the game until he reaches a fortune of $n or he loses all his money (his fortune reaches $0).
• What is the probability that he achieves his goal? What is the probability that he loses his fortune?
Let Pᵢ = the probability that he achieves his goal, and Qᵢ = 1 − Pᵢ = the probability that he loses his fortune. Let X = the amount that he has won after finishing the game.
If the game is fair, then
$$E[X] = (n - i) P_i + (-i) Q_i = (n - i) P_i + (-i)(1 - P_i) = 0$$
or
$$(n - i) P_i = i (1 - P_i), \quad \text{so} \quad (n - i + i) P_i = i$$
Thus
$$P_i = \frac{i}{n} \quad \text{and} \quad Q_i = 1 - \frac{i}{n} = \frac{n - i}{n}$$
If the game is not fair, then
$$P_i = q P_{i-1} + p P_{i+1}$$
or, since p + q = 1,
$$(p + q) P_i = q P_{i-1} + p P_{i+1}$$
Thus
$$p \left(P_{i+1} - P_i\right) = q \left(P_i - P_{i-1}\right) \quad \text{or} \quad P_{i+1} - P_i = \frac{q}{p} \left(P_i - P_{i-1}\right)$$
Note that P₀ = 0 and Pₙ = 1. Also
$$P_2 - P_1 = \frac{q}{p}\left(P_1 - P_0\right) = \frac{q}{p} P_1$$
$$P_3 - P_2 = \frac{q}{p}\left(P_2 - P_1\right) = \left(\frac{q}{p}\right)^2 P_1$$
$$P_4 - P_3 = \frac{q}{p}\left(P_3 - P_2\right) = \left(\frac{q}{p}\right)^3 P_1$$
and in general
$$P_i - P_{i-1} = \left(\frac{q}{p}\right)^{i-1} P_1$$
Hence
$$P_i = P_1 + \left(P_2 - P_1\right) + \left(P_3 - P_2\right) + \cdots + \left(P_i - P_{i-1}\right) = P_1 + \frac{q}{p} P_1 + \left(\frac{q}{p}\right)^2 P_1 + \cdots + \left(\frac{q}{p}\right)^{i-1} P_1$$
or
$$P_i = P_1 \left(1 + r + r^2 + \cdots + r^{i-1}\right) = P_1\, \frac{r^i - 1}{r - 1} \quad \text{where } r = \frac{q}{p}$$
Note that
$$P_n = P_1\, \frac{r^n - 1}{r - 1} = 1, \quad \text{thus} \quad P_1 = \frac{r - 1}{r^n - 1}$$
and therefore
$$P_i = P_1\, \frac{r^i - 1}{r - 1} = \frac{r^i - 1}{r^n - 1} = \frac{\left(q/p\right)^i - 1}{\left(q/p\right)^n - 1}$$
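A short sketch of this formula in pure Python, which can be used to reproduce the table below:

```python
def success_prob(i: int, n: int, p: float) -> float:
    """P_i: probability of reaching a fortune of n before 0, starting from i."""
    q = 1 - p
    if p == q:                     # fair game: P_i = i / n
        return i / n
    r = q / p
    return (r**i - 1) / (r**n - 1)

# One block of the table: i = 90, n = 100
for p in (0.50, 0.48, 0.45, 0.40):
    Pi = success_prob(90, 100, p)
    print(f"p = {p:.2f}: P_i = {Pi:.3f}, Q_i = {1 - Pi:.3f}")
```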
Table of Pᵢ and Qᵢ for various values of i, n, p, and q:

  i      n     p     q      Pi      Qi
  9     10   0.50  0.50   0.900   0.100
  9     10   0.48  0.52   0.860   0.140
  9     10   0.45  0.55   0.790   0.210
  9     10   0.40  0.60   0.661   0.339
 90    100   0.50  0.50   0.900   0.100
 90    100   0.48  0.52   0.449   0.551
 90    100   0.45  0.55   0.134   0.866
 90    100   0.40  0.60   0.017   0.983
900   1000   0.50  0.50   0.900   0.100
900   1000   0.48  0.52   0.000   1.000
900   1000   0.45  0.55   0.000   1.000
900   1000   0.40  0.60   0.000   1.000
A waiting time paradox
• Suppose that each person in a restaurant is served in an "equal" time.
• That is, in a group of n people the probability that any one particular person took the longest time is the same for each person, namely 1/n.
• Suppose that a person starts asking people as they leave: "How long did it take you to be served?"
• He continues until he finds someone who took longer than himself.
Let X = the number of people that he has to ask. Then E[X] = ∞.
Proof:
$$P[X > x] = \frac{1}{x + 1}$$
This is the probability that, in the group of the first x people together with himself, he took the longest. Hence
$$p(x) = P[X = x] = P[X > x - 1] - P[X > x] = \frac{1}{x} - \frac{1}{x + 1} = \frac{1}{x(x + 1)}$$
Thus
$$E[X] = \sum_{x=1}^{\infty} x\, p(x) = \sum_{x=1}^{\infty} \frac{x}{x(x + 1)} = \sum_{x=1}^{\infty} \frac{1}{x + 1} = \frac{1}{2} + \frac{1}{3} + \frac{1}{4} + \frac{1}{5} + \cdots = \infty$$
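A simulation sketch of the paradox (numpy assumed). Service times are modeled here as i.i.d. uniform draws, which is harmless since only their ranks matter; given the diner's own time m, the count X is geometric with success probability 1 − m. The sample mean of X keeps growing with the number of replicates instead of settling near a finite value:

```python
import numpy as np

rng = np.random.default_rng(5)

def sample_X(size: int) -> np.ndarray:
    """X = number of people asked until one took longer than our diner."""
    m = rng.random(size)           # the diner's own (uniform) service time
    return rng.geometric(1 - m)    # each askee exceeds m with probability 1 - m

for reps in (1_000, 10_000, 100_000, 1_000_000):
    print(f"{reps:>9} replicates: mean X = {sample_X(reps).mean():.1f}")
```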
The harmonic series:
$$\frac{1}{2} + \frac{1}{3} + \frac{1}{4} + \frac{1}{5} + \frac{1}{6} + \frac{1}{7} + \frac{1}{8} + \cdots$$
$$= \frac{1}{2} + \left(\frac{1}{3} + \frac{1}{4}\right) + \left(\frac{1}{5} + \frac{1}{6} + \frac{1}{7} + \frac{1}{8}\right) + \cdots$$
$$\ge \frac{1}{2} + \left(\frac{1}{4} + \frac{1}{4}\right) + \left(\frac{1}{8} + \frac{1}{8} + \frac{1}{8} + \frac{1}{8}\right) + \cdots = \frac{1}{2} + \frac{1}{2} + \frac{1}{2} + \cdots = \infty$$
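A tiny numeric sketch (pure Python) of how slowly these partial sums diverge, growing roughly like ln(n):

```python
import math

partial = 0.0
checkpoints = {10, 100, 1_000, 10_000, 100_000, 1_000_000}
for x in range(1, 1_000_001):
    partial += 1 / (x + 1)         # the terms 1/2 + 1/3 + ... of E[X]
    if x in checkpoints:
        print(f"n = {x:>7}: sum = {partial:.3f}, ln(n) = {math.log(x):.3f}")
```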