STAT 111
Chapter Eight
Expectation of Discrete Random Variable
One of the important concepts in probability theory is that of the expectation of a random variable. The expected value of a random variable X, denoted by E(X) or μ_x, measures where the probability distribution is centered.

Definition:
Let X be a discrete random variable having a probability mass function f(x). If

\sum_x |x| f(x) < \infty

then the expected value (or mean) of X exists and is defined as

E(X) = \sum_x x f(x)
In words, the expected value of X is the weighted average of the possible values that X can take on, each value being weighted by the probability that X assumes it.
Example
The probability mass function of the random variable X is given by

 x    : 1    2    3
 f(x) : 1/2  1/3  1/6

Find the expected value of X.
Solution:

 x      : 1    2    3    sum
 f(x)   : 1/2  1/3  1/6  1
 x f(x) : 1/2  2/3  3/6  10/6

Then,

E(X) = \sum_x x f(x) = 10/6
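As a quick numerical check of this kind of calculation, the following is a minimal Python sketch (illustrative only; the variable names are ours) that computes E(X) = Σ x f(x) exactly with fractions for the pmf above.

```python
from fractions import Fraction

# pmf of X from the example: f(1) = 1/2, f(2) = 1/3, f(3) = 1/6
pmf = {1: Fraction(1, 2), 2: Fraction(1, 3), 3: Fraction(1, 6)}

assert sum(pmf.values()) == 1               # a valid pmf sums to 1
mean = sum(x * p for x, p in pmf.items())   # E(X) = sum of x * f(x)
print(mean)                                 # 5/3, i.e. 10/6
```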
Example
The probability distribution of the discrete random variable Y is

f(y) = \binom{3}{y} \left(\frac{1}{4}\right)^{y} \left(\frac{3}{4}\right)^{3-y}, \quad y = 0, 1, 2, 3

Find the mean of Y.

Solution:
Get the values of f(y), such as:

When y = 0,
f(0) = \binom{3}{0} \left(\frac{1}{4}\right)^{0} \left(\frac{3}{4}\right)^{3} = 27/64

When y = 1,
f(1) = \binom{3}{1} \left(\frac{1}{4}\right)^{1} \left(\frac{3}{4}\right)^{2} = 27/64

and so on.
Example (continued)
Then we can form the following table:

 y      : 0      1      2      3     sum
 f(y)   : 27/64  27/64  9/64   1/64  1
 y f(y) : 0      27/64  18/64  3/64  48/64

So, E(Y) = 48/64 = 3/4
Example
A pair of fair dice is tossed. Let X assign to each point (a,b) in S the maximum of its numbers, i.e. X(a,b) = max(a,b). Find the probability mass function of X, and the mean of X.

Solution: When we toss a pair of fair dice,
S = {(1,1), (1,2), (1,3), ..., (6,5), (6,6)}, with 36 equally likely points, so f(x) = P(X = x) = (2x - 1)/36 for x = 1, 2, ..., 6.

 x      : 1     2     3      4      5      6      sum
 f(x)   : 1/36  3/36  5/36   7/36   9/36   11/36  1
 x f(x) : 1/36  6/36  15/36  28/36  45/36  66/36  161/36

E(X) = 161/36
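A small Python sketch (illustrative; the names are ours) that enumerates the 36 outcomes, builds this pmf, and computes the mean:

```python
from fractions import Fraction
from collections import Counter

# enumerate the 36 equally likely outcomes (a, b) and record max(a, b)
counts = Counter(max(a, b) for a in range(1, 7) for b in range(1, 7))
pmf = {x: Fraction(c, 36) for x, c in counts.items()}

print(pmf)    # {1: 1/36, 2: 1/12, 3: 5/36, ...} (fractions are reduced)
mean = sum(x * p for x, p in pmf.items())
print(mean)   # 161/36
```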
Example
Find the expected number of chemists on a committee of 3 selected at random from 4 chemists and 3 biologists.

Solution:
Let X = the number of chemists on the committee. We form the table of the function as follows.
1- Get the values of X: x = 0, 1, 2, 3.
2- Get the values of the mass function f(x), for example:

x = 0:  f(0) = P(X = 0) = \frac{\binom{4}{0}\binom{3}{3}}{\binom{7}{3}} = \frac{1}{35}

x = 2:  f(2) = P(X = 2) = \frac{\binom{4}{2}\binom{3}{1}}{\binom{7}{3}} = \frac{18}{35}
Example (continued)

 x      : 0     1      2      3      sum
 f(x)   : 1/35  12/35  18/35  4/35   1
 x f(x) : 0     12/35  36/35  12/35  60/35

E(X) = 60/35 ≈ 1.7

Note:
E(X) need not be an integer.
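This is a hypergeometric setting, so the same table can be generated from the formula f(x) = C(4,x)C(3,3-x)/C(7,3); a short Python sketch (illustrative only):

```python
from fractions import Fraction
from math import comb

chemists, biologists, n = 4, 3, 3            # 4 chemists, 3 biologists, committee of 3
total = comb(chemists + biologists, n)       # C(7,3) = 35

# f(x) = C(4, x) * C(3, 3 - x) / C(7, 3) for x = 0..3
pmf = {x: Fraction(comb(chemists, x) * comb(biologists, n - x), total)
       for x in range(n + 1)}
print(pmf)                                   # {0: 1/35, 1: 12/35, 2: 18/35, 3: 4/35}

mean = sum(x * p for x, p in pmf.items())
print(mean)                                  # 12/7, i.e. 60/35
print(Fraction(n * chemists, chemists + biologists))   # nM/N gives the same value
```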
Example
Let X have the following probability mass function:

f(x) = \begin{cases} \dfrac{x}{6} & x = 1, 2, 3 \\ 0 & \text{elsewhere} \end{cases}

Find E(X³).

Solution:

E(X^3) = \sum_x x^3 f(x) = 1/6 + 16/6 + 81/6 = 98/6
Expected values (means) of some distributions

 Distribution          E(X) = mean
 Binomial dist.        E(X) = np
 Hypergeometric dist.  E(X) = nM/N
 Geometric dist.       E(X) = 1/p
 Poisson dist.         E(X) = λ
 Uniform dist.         E(X) = (N+1)/2
Examples
Example 1:
A fair die is tossed 1620 times. Find the expected number of times the face 6 occurs.
Solution:
X = # of times the face {6} occurs
X ~ Bin(1620, 1/6), then
E(X) = np = 1620 × 1/6 = 270

Example 2:
If the probability of engine malfunction during any 1-hour period is p = 0.02 and X is the number of 1-hour intervals until the first malfunction, find the mean of X.
Solution:
X ~ g(0.02), then
E(X) = 1/p = 1/0.02 = 50
Example 3:
A coin is biased so that a head is three times as likely to occur as a tail. Find the expected number of tails when this coin is tossed twice.
Solution:
Since the coin is biased, P(H) = 3P(T), and since P(H) + P(T) = 1,
3P(T) + P(T) = 1
4P(T) = 1
P(T) = 1/4
X = # of tails (T)
X ~ Bin(2, 1/4), then
E(X) = np = 2 × 1/4 = 1/2
Example 4:
If X has a Poisson distribution with mean 3, find the expected value of X.
Solution:
X ~ Poisson(3), then
E(X) = λ = 3
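These closed-form means can also be checked by simulation. A rough Monte Carlo sketch (illustrative only; the seed, sample sizes and helper name are ours) for Examples 1 and 2:

```python
import random

random.seed(0)

# Example 1: number of sixes in 1620 tosses of a fair die, X ~ Bin(1620, 1/6)
reps = [sum(1 for _ in range(1620) if random.randint(1, 6) == 6) for _ in range(200)]
print(sum(reps) / len(reps))     # should be close to np = 270

# Example 2: 1-hour intervals until the first malfunction, X ~ geometric(0.02)
def geometric(p):
    k = 1
    while random.random() >= p:  # keep counting intervals until the first "success"
        k += 1
    return k

waits = [geometric(0.02) for _ in range(10_000)]
print(sum(waits) / len(waits))   # should be close to 1/p = 50
```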
Properties of Expectation:
1. If X is a random variable with probability distribution f(x), the mean or expected value of the random variable g(X) is

E(g(X)) = \sum_x g(x) f(x)

(the law of the unconscious statistician)

2. If a and b are constants, then
(I) E(a) = a
(II) E(aX) = a E(X)
(III) E(aX ± b) = E(aX) ± E(b) = a E(X) ± b
(IV) E(X^2) = \sum_x x^2 f(x)
Example
If X is the number of points rolled with a balanced die, find the expected value of the random variable g(X) = 2X² + 1.
Solution:
S = {1, 2, 3, 4, 5, 6}, each with probability 1/6.

 x       : 1    2    3    4     5     6     sum
 f(x)    : 1/6  1/6  1/6  1/6   1/6   1/6   1
 x f(x)  : 1/6  2/6  3/6  4/6   5/6   6/6   21/6
 x² f(x) : 1/6  4/6  9/6  16/6  25/6  36/6  91/6

E(g(X)) = E(2X² + 1) = 2E(X²) + E(1) = 2 × 91/6 + 1 = 188/6 ≈ 31.3
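A minimal Python sketch of this "law of the unconscious statistician" calculation (illustrative; the names are ours): E(g(X)) is obtained by summing g(x) f(x) directly, without first deriving the distribution of g(X).

```python
from fractions import Fraction

pmf = {x: Fraction(1, 6) for x in range(1, 7)}   # balanced die

def g(x):
    return 2 * x**2 + 1                          # g(X) = 2X^2 + 1

e_g = sum(g(x) * p for x, p in pmf.items())      # E(g(X)) = sum of g(x) * f(x)
print(e_g)                                       # 94/3, i.e. 188/6

e_x2 = sum(x**2 * p for x, p in pmf.items())     # E(X^2) = 91/6
print(2 * e_x2 + 1)                              # same value via 2 E(X^2) + 1
```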
Expectation and moments for Bivariate Distributions
We shall now extend our concept of mathematical expectation to the case of n random variables X1, X2, ..., Xn with joint probability distribution f(x1, x2, ..., xn).

Definition:
Let X1, X2, ..., Xn be a discrete random vector with joint probability distribution f(x1, x2, ..., xn) and let g be a real-valued function. Then the random variable Z = g(X1, X2, ..., Xn) has finite expectation if and only if

\sum_{x_1, x_2, ..., x_n} |g(x_1, x_2, ..., x_n)| \, f(x_1, x_2, ..., x_n) < \infty

and in this case the expected value of Z is

E(g(X_1, X_2, ..., X_n)) = \sum_{x_1, x_2, ..., x_n} g(x_1, x_2, ..., x_n) \, f(x_1, x_2, ..., x_n)
Example:
Let X and Y be random variables with the following joint probability function:

 y \ x : 0     1     2
 0     : 3/28  9/28  3/28
 1     : 3/14  3/14  0
 2     : 1/28  0     0

Find the expected value of g(X, Y) = XY.

Solution:

E(XY) = \sum_{x=0}^{2} \sum_{y=0}^{2} x y f(x, y)
= 0·0·f(0,0) + 0·1·f(0,1) + ... + 1·1·f(1,1) + ... + 2·0·f(2,0)
= f(1,1) = 3/14
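The double sum can be evaluated mechanically from the joint table; a small Python sketch (illustrative only):

```python
from fractions import Fraction

F = Fraction
# joint[(x, y)] = f(x, y), taken from the table above
joint = {(0, 0): F(3, 28), (1, 0): F(9, 28), (2, 0): F(3, 28),
         (0, 1): F(3, 14), (1, 1): F(3, 14), (2, 1): F(0),
         (0, 2): F(1, 28), (1, 2): F(0),     (2, 2): F(0)}

assert sum(joint.values()) == 1                       # the table is a valid joint pmf
e_xy = sum(x * y * p for (x, y), p in joint.items())
print(e_xy)                                           # 3/14
```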
Theorem 1:
The expected value of the sum or difference of two or more functions of the random variables X, Y is the sum or difference of the expected values of the functions. That is,

E[g(X, Y) ± h(X, Y)] = E[g(X, Y)] ± E[h(X, Y)]

The generalization of the above theorem to n random variables is straightforward.

Corollary:
Setting g(x,y) = g(x) and h(x,y) = h(y), we see that

E[g(X) ± h(Y)] = E[g(X)] ± E[h(Y)]

Corollary:
Setting g(x,y) = x and h(x,y) = y, we see that

E[X ± Y] = E[X] ± E[Y]

and in general

E\left(\sum_{i=1}^{n} X_i\right) = \sum_{i=1}^{n} E(X_i)
Theorem 2: (Independence)
If X and Y are two independent random variables having finite expectations, then XY has finite expectation and

E(XY) = E(X) E(Y)

Note: the opposite is not true; E(XY) = E(X)E(Y) does not imply that X and Y are independent.

In general, if X1, X2, ..., Xn are n independent random variables such that each expectation E(Xi) exists (i = 1, 2, ..., n), then

E\left(\prod_{i=1}^{n} X_i\right) = \prod_{i=1}^{n} E(X_i)
Example 1:
Let (X, Y) assume the values (1,0), (0,1), (-1,0), (0,-1) with equal probabilities (prob. = 1/n = 1/4). Show that the equation E(XY) = E(X)E(Y) is satisfied. However, the random variables X and Y are not independent.

Solution:
The joint probability function, together with the marginals f(x) and f(y), is:

 y \ x : -1   0    1    f(y)
 -1    : 0    1/4  0    1/4
 0     : 1/4  0    1/4  2/4
 1     : 0    1/4  0    1/4
 f(x)  : 1/4  2/4  1/4  1

Then,
E(X) = (-1)(1/4) + (0)(2/4) + (1)(1/4) = -1/4 + 0 + 1/4 = 0
E(Y) = (-1)(1/4) + (0)(2/4) + (1)(1/4) = -1/4 + 0 + 1/4 = 0
E(X) E(Y) = 0
Now,
E(XY) = (-1)(-1)(0) + (-1)(0)(1/4) + ... + (1)(0)(1/4) + (1)(1)(0)
= 0 + 0 + ... + 0 + 0 = 0
Then, E(XY) = E(X)E(Y), that is, 0 = 0 (the equation is satisfied).
However, X and Y are not independent, since

f(0, 0) ≠ f_X(0) f_Y(0), that is, 0 ≠ (2/4)(2/4)
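A short Python sketch (illustrative only) confirming both facts numerically: E(XY) equals E(X)E(Y), yet the joint pmf does not factor into the product of the marginals.

```python
from fractions import Fraction
from itertools import product

F = Fraction
points = [(1, 0), (0, 1), (-1, 0), (0, -1)]
joint = {pt: F(1, 4) for pt in points}           # each point has probability 1/4

fx = {x: sum(p for (a, _), p in joint.items() if a == x) for x in (-1, 0, 1)}
fy = {y: sum(p for (_, b), p in joint.items() if b == y) for y in (-1, 0, 1)}

e_x = sum(x * p for x, p in fx.items())
e_y = sum(y * p for y, p in fy.items())
e_xy = sum(x * y * p for (x, y), p in joint.items())
print(e_xy == e_x * e_y)                         # True: 0 == 0

# independence would require f(x, y) = f_X(x) f_Y(y) for every pair (x, y)
indep = all(joint.get((x, y), F(0)) == fx[x] * fy[y] for x, y in product(fx, fy))
print(indep)                                     # False: f(0,0) = 0 but f_X(0) f_Y(0) = 1/4
```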
Example
Suppose that X1, X2 and X3 are independent random variables such that E(Xi) = 0 and E(Xi²) = 1 for i = 1, 2, 3.
Find E(X1²(X2 - 4X3)²).
Solution:
Since X1, X2 and X3 are independent, X1² and (X2 - 4X3)² are also independent.
(Remember: if X, Y are independent, then E(XY) = E(X)E(Y).)
E(X1²(X2 - 4X3)²) = E(X1²) E((X2 - 4X3)²)
= 1 × E(X2² - 8X2X3 + 16X3²)
= E(X2²) - 8E(X2X3) + 16E(X3²)
= 1 - 8E(X2)E(X3) + 16 × 1
= 1 - (8 × 0 × 0) + 16 = 17
Conditional Expectation
Definition: Let X and Y be two random variables with joint probability distribution f(x, y). The conditional expectation of X, given Y = y, is defined as

E(X \mid Y = y) = \sum_x x f(x \mid y)

where f(x|y) is the conditional distribution of X given Y = y,

f(x \mid y) = \frac{f(x, y)}{f_Y(y)}
Example
The joint probability distribution function of X and Y is shown in the following table:

 y \ x : -1   1    sum
 -1    : 1/8  1/2  5/8
 0     : 0    1/4  1/4
 1     : 1/8  0    1/8
 sum   : 2/8  3/4  1

Find:
1. The conditional distribution of X given Y = -1, that is, f(x | y = -1) for every x.

f(x \mid y = -1) = \frac{f(x, y)}{f_Y(-1)} = \frac{f(x, -1)}{5/8}

When X = -1,
f(-1 \mid y = -1) = \frac{f(-1, -1)}{5/8} = \frac{1/8}{5/8} = 1/5

When X = 1,
f(1 \mid y = -1) = \frac{f(1, -1)}{5/8} = \frac{1/2}{5/8} = 4/5

 x          : -1   1
 f(x|y=-1)  : 1/5  4/5
Example (continued)
2. The conditional mean of X given Y = -1:

E(X \mid Y = -1) = \sum_x x f(x \mid y = -1) = 3/5

 x            : -1    1    sum
 f(x|y=-1)    : 1/5   4/5  1
 x f(x|y=-1)  : -1/5  4/5  3/5
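A minimal Python sketch (illustrative; the names are ours) deriving f(x | y = -1) and E(X | Y = -1) from the joint table:

```python
from fractions import Fraction

F = Fraction
# joint[(x, y)] = f(x, y), from the table above (x in {-1, 1}, y in {-1, 0, 1})
joint = {(-1, -1): F(1, 8), (1, -1): F(1, 2),
         (-1,  0): F(0),    (1,  0): F(1, 4),
         (-1,  1): F(1, 8), (1,  1): F(0)}

y0 = -1
fy = sum(p for (_, y), p in joint.items() if y == y0)   # f_Y(-1) = 5/8
cond = {x: joint[(x, y0)] / fy for x in (-1, 1)}        # f(x | y = -1)
print(cond)                                             # {-1: 1/5, 1: 4/5}

cond_mean = sum(x * p for x, p in cond.items())         # E(X | Y = -1)
print(cond_mean)                                        # 3/5
```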
Variance
The variance measures the degree to which a distribution is concentrated around its mean. Such a measure is called the variance (or dispersion).

Definition:
The variance of a random variable X, denoted by Var(X) or σ_x², is

Var(X) = \sigma_x^2 = E(X - \mu_x)^2 = E(X - E(X))^2

In other words,

Var(X) = \sigma_x^2 = E(X^2) - \mu_x^2 = E(X^2) - (E(X))^2

Since the variance is the expected value of the nonnegative random variable (X - μ_x)², it has the following properties.

Properties of variance:
1. Var(X) ≥ 0.
2. \sigma_x = \sqrt{Var(X)} is called the standard deviation of X.
3. The variance of a distribution provides a measure of the spread or dispersion of the distribution around its mean μ_x.
4. If a, b are constants, then
(i) Var(a) = 0
(ii) Var(aX) = a² Var(X)
(iii) Var(aX ± b) = Var(aX) + Var(b) = a² Var(X)
Variances of some distributions:

 Distribution     Variance
 Binomial dist.   Var(X) = npq
 Geometric dist.  Var(X) = q/p²
 Poisson dist.    Var(X) = λ
Example
Let X be a random variable which takes each of the five values -2, 0, 1, 3 and 4 with equal probabilities. Find the standard deviation of Y = 4X - 7.
Solution:
Equal probabilities: each value has probability 1/5. The standard deviation of Y is √Var(Y).

 x       : -2    0    1    3    4     sum
 f(x)    : 1/5   1/5  1/5  1/5  1/5   1
 x f(x)  : -2/5  0    1/5  3/5  4/5   6/5
 x² f(x) : 4/5   0    1/5  9/5  16/5  30/5

E(X) = 6/5, E(X²) = 30/5
Var(X) = E(X²) - [E(X)]² = 30/5 - (6/5)² = 4.56
Var(Y) = Var(4X - 7) = Var(4X) + Var(7) = 4² Var(X) + 0 = 16 Var(X) = 16 × 4.56 = 72.96
Standard deviation of Y = √Var(Y) = √72.96 ≈ 8.542
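A small Python sketch (illustrative only) computing Var(X), Var(Y) and the standard deviation of Y = 4X - 7, both by the shortcut formula and directly from the pmf of Y:

```python
from fractions import Fraction
from math import sqrt

pmf = {x: Fraction(1, 5) for x in (-2, 0, 1, 3, 4)}

e_x  = sum(x * p for x, p in pmf.items())        # 6/5
e_x2 = sum(x * x * p for x, p in pmf.items())    # 30/5
var_x = e_x2 - e_x**2                            # 114/25 = 4.56

var_y = 16 * var_x                               # Var(4X - 7) = 4^2 Var(X)
print(float(var_y), sqrt(var_y))                 # 72.96  8.5417...

# direct check: the pmf of Y = 4x - 7 gives the same variance
pmf_y = {4 * x - 7: p for x, p in pmf.items()}
e_y  = sum(y * p for y, p in pmf_y.items())
e_y2 = sum(y * y * p for y, p in pmf_y.items())
print(float(e_y2 - e_y**2))                      # 72.96 again
```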
Example
If E(X) = 2 and Var(X) = 5, find
1. E(2 + X)²
2. Var(4 + 3X)
Solution:
1. E(2 + X)² = E(4 + 4X + X²) = 4 + 4E(X) + E(X²)
To get the value of E(X²) we use Var(X) = 5:
Var(X) = E(X²) - [E(X)]²
5 = E(X²) - 2²
E(X²) = 5 + 4 = 9
So, E(2 + X)² = E(4 + 4X + X²) = 4 + 4E(X) + E(X²) = 4 + (4 × 2) + 9 = 4 + 8 + 9 = 21
2. Var(4 + 3X) = Var(4) + 3² Var(X) = 0 + (9 × 5) = 45
Variance of the sum:
Let X and Y be two random variables, each having finite second moments. Then X + Y has finite second moments and hence finite variance. Now,

Var(X + Y) = Var(X) + Var(Y) + 2 E[(X - E(X))(Y - E(Y))]

Thus, unlike the mean, the variance of a sum of two random variables is in general not the sum of the variances. The quantity

E[(X - E(X))(Y - E(Y))]

is called the covariance of X and Y, written Cov(X, Y). Thus, we have the formula

Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y)

Note that:
Cov(X, Y) = E[(X - E(X))(Y - E(Y))]
= E[XY - Y E(X) - X E(Y) + E(X)E(Y)]
= E(XY) - E(X)E(Y)

Corollary:
If X and Y are independent, then Cov(X, Y) = 0, and therefore
Var(X + Y) = Var(X) + Var(Y)

In general, if X1, X2, ..., Xn are independent random variables each having a finite second moment, then

Var\left(\sum_{i=1}^{n} X_i\right) = \sum_{i=1}^{n} Var(X_i)

Properties of Cov(X, Y):
Let X and Y be two random variables; Cov(X, Y) has the following properties:
1. Symmetry, i.e. Cov(X, Y) = Cov(Y, X)
2. Cov(a1X1 + a2X2, b1Y1 + b2Y2) = a1b1 Cov(X1, Y1) + a1b2 Cov(X1, Y2) + a2b1 Cov(X2, Y1) + a2b2 Cov(X2, Y2)
3. If X and Y are independent, then Cov(X, Y) = 0
4. Cov\left(\sum_{i=1}^{n} a_i X_i, \sum_{j=1}^{n} b_j Y_j\right) = \sum_{i=1}^{n} \sum_{j=1}^{n} a_i b_j Cov(X_i, Y_j)
5. Cov(a, X) = 0, where a is a constant.

Note that:
Var(aX + bY) = a² Var(X) + b² Var(Y) + 2ab Cov(X, Y)

In general, if X1, X2, ..., Xn are random variables and Y = a1X1 + a2X2 + ... + anXn, where a1, a2, ..., an are constants, then

Var(Y) = \sum_{i=1}^{n} a_i^2 Var(X_i) + 2 \sum_{i<j} a_i a_j Cov(X_i, X_j)

where the double sum extends over all values of i and j, from 1 to n, for which i < j.
Example
The random variables X, Y, Z have means, respectively, 2, -3, 4, variances 1, 5, 2, and Cov(X,Y) = -2, Cov(X,Z) = -1, Cov(Y,Z) = 1.
Find the mean and the variance of W = 3X - Y + 2Z.
Solution:
E(W) = E(3X - Y + 2Z) = 3E(X) - E(Y) + 2E(Z)
= (3 × 2) - (-3) + (2 × 4) = 6 + 3 + 8 = 17
Var(W) = Var(3X - Y + 2Z)
= Var(3X) + Var(Y) + Var(2Z) + 2Cov(3X, -Y) + 2Cov(3X, 2Z) + 2Cov(-Y, 2Z)
= 9Var(X) + Var(Y) + 4Var(Z) + (2 × 3 × -1)Cov(X,Y) + (2 × 3 × 2)Cov(X,Z) + (2 × -1 × 2)Cov(Y,Z)
= (9 × 1) + 5 + (4 × 2) + (-6 × -2) + (12 × -1) + (-4 × 1)
= 9 + 5 + 8 + 12 - 12 - 4 = 18
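The same answer can be obtained as the quadratic form a'Σa, where Σ is the covariance matrix of (X, Y, Z) and a the vector of coefficients; a small Python sketch (illustrative only):

```python
# coefficients of W = 3X - Y + 2Z
a = [3, -1, 2]
means = [2, -3, 4]

# covariance matrix of (X, Y, Z): variances 1, 5, 2 on the diagonal,
# Cov(X,Y) = -2, Cov(X,Z) = -1, Cov(Y,Z) = 1 off the diagonal
cov = [[ 1, -2, -1],
       [-2,  5,  1],
       [-1,  1,  2]]

e_w = sum(ai * mi for ai, mi in zip(a, means))
var_w = sum(a[i] * a[j] * cov[i][j] for i in range(3) for j in range(3))
print(e_w, var_w)   # 17 18
```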
Example
Let X and Y be two independent random variables having finite second moments. Compute the mean and variance of 2X + 3Y in terms of those of X and Y.
Solution:
Remember: if X, Y are independent, then Cov(X, Y) = 0. So
E(2X + 3Y) = 2E(X) + 3E(Y)
Var(2X + 3Y) = 4Var(X) + 9Var(Y)
Example
If X and Y are random variables with variances 2 and 4 respectively, and Cov(X,Y) = -2, find the variance of the random variable Z = 3X - 4Y + 8.
Solution:
Var(Z) = Var(3X - 4Y + 8) = 9Var(X) + 16Var(Y) + Var(8) + 2Cov(3X, -4Y)
= (9 × 2) + (16 × 4) + 0 + (2 × 3 × -4 × -2)
= 18 + 64 + 48 = 130
Example
If X and Y are independent random variables with variances 5 and 7 respectively, find:
1- The variance of T = X - 2Y
Var(T) = Var(X - 2Y) = Var(X) + 4Var(Y) = 5 + (4 × 7) = 33
2- The variance of Z = -2X + 3Y
Var(Z) = Var(-2X + 3Y) = 4Var(X) + 9Var(Y) = 20 + 63 = 83
3- The Cov(T, Z)
Note: Cov(X, X) = Var(X) and Cov(Y, Y) = Var(Y).
Cov(T, Z) = Cov(X - 2Y, -2X + 3Y)
= Cov(X, -2X) + Cov(X, 3Y) + Cov(-2Y, -2X) + Cov(-2Y, 3Y)
= -2Cov(X, X) + 3Cov(X, Y) + (-2 × -2)Cov(Y, X) + (-2 × 3)Cov(Y, Y)
= -2Var(X) + (3 × 0) + (4 × 0) - 6Var(Y)
= (-2 × 5) + 0 + 0 - (6 × 7) = -10 - 42 = -52
Correlation Coefficient:
Let X and Y be two random variables having finite variances. One measure of the degree of dependence between the two random variables is the correlation coefficient ρ(X, Y), defined by

\rho = \rho(X, Y) = \frac{Cov(X, Y)}{\sqrt{Var(X) Var(Y)}}

These random variables are said to be uncorrelated if ρ = 0 (since then Cov(X, Y) = 0).
If X and Y are independent, we see at once that independent random variables are uncorrelated (the converse is not always true), i.e. it is possible for dependent random variables to be uncorrelated.

Theorem:
If Y = a + bX, then

\rho(X, Y) = \begin{cases} 1 & b > 0 \\ 0 & b = 0 \\ -1 & b < 0 \end{cases}
Example:
Let X and Y be random variables with the following joint probability function:

 X \ Y : -3   2    4    sum
 1     : 0.1  0.2  0.2  0.5
 3     : 0.3  0.1  0.1  0.5
 sum   : 0.4  0.3  0.3  1

Find:
1- E(XY) = ?

E(XY) = \sum_x \sum_y x y f(x, y)
= (1 × -3 × 0.1) + (1 × 2 × 0.2) + (1 × 4 × 0.2) + (3 × -3 × 0.3) + (3 × 2 × 0.1) + (3 × 4 × 0.1)
= -0.3 + 0.4 + 0.8 - 2.7 + 0.6 + 1.2 = 0
Example (continued)
From the marginal tables:

 x       : 1    3    sum
 f(x)    : 0.5  0.5  1
 x f(x)  : 0.5  1.5  2
 x² f(x) : 0.5  4.5  5

 y       : -3    2    4    sum
 f(y)    : 0.4   0.3  0.3  1
 y f(y)  : -1.2  0.6  1.2  0.6
 y² f(y) : 3.6   1.2  4.8  9.6

2- E(X) = 2, E(X²) = 5
Var(X) = E(X²) - [E(X)]² = 5 - 4 = 1
3- E(Y) = 0.6, E(Y²) = 9.6
Var(Y) = E(Y²) - [E(Y)]² = 9.6 - 0.36 = 9.24
4- E(X + Y) = E(X) + E(Y) = 2 + 0.6 = 2.6
5- Cov(X, Y) = E(XY) - E(X)E(Y) = 0 - (2 × 0.6) = -1.2
Example (continued)
6- Var(X + Y) = Var(X) + Var(Y) + 2Cov(X, Y)
= 1 + 9.24 + (2 × -1.2) = 7.84
7- Find the correlation coefficient ρ:

\rho = \rho(X, Y) = \frac{Cov(X, Y)}{\sqrt{Var(X) Var(Y)}} = \frac{-1.2}{\sqrt{1 \times 9.24}} = -0.39477

8- Are X and Y independent?
No, since Cov(X, Y) = -1.2 ≠ 0.
Or: no, since ρ ≠ 0, X and Y are correlated and hence not independent.
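A short Python sketch (illustrative only) that reproduces items 1-7 directly from the joint table:

```python
from math import sqrt

# joint[(x, y)] = f(x, y), from the table above
joint = {(1, -3): 0.1, (1, 2): 0.2, (1, 4): 0.2,
         (3, -3): 0.3, (3, 2): 0.1, (3, 4): 0.1}

e_x  = sum(x * p for (x, y), p in joint.items())                 # 2.0
e_y  = sum(y * p for (x, y), p in joint.items())                 # 0.6
e_xy = sum(x * y * p for (x, y), p in joint.items())             # 0.0
var_x = sum(x * x * p for (x, y), p in joint.items()) - e_x**2   # 1.0
var_y = sum(y * y * p for (x, y), p in joint.items()) - e_y**2   # 9.24

cov = e_xy - e_x * e_y                                           # -1.2
rho = cov / sqrt(var_x * var_y)                                  # about -0.3948
print(cov, var_x + var_y + 2 * cov, rho)                         # -1.2  7.84  -0.39...
```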
Moment Generating Function
In the following we concentrate on applications of moment generating functions. The obvious purpose of the moment generating function is in determining moments of distributions. However, the most important contribution is to establish distributions of functions of random variables.

Definition:
The moment generating function of the random variable X is given by E(e^{tX}) and denoted by M_X(t). Hence

M_X(t) = E(e^{tX}) = \sum_x e^{tx} f(x)

Example:
Given the probability distribution

f(x) = \frac{x + 2}{25}, \quad x = 1, 2, 3, 4, 5

find the moment generating function of this random variable.

M_X(t) = E(e^{tX}) = \sum_x e^{tx} f(x) = [3e^{t} + 4e^{2t} + 5e^{3t} + 6e^{4t} + 7e^{5t}]/25
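The moments follow from derivatives of the MGF at t = 0 (E(X) = M'_X(0), E(X²) = M''_X(0)). A symbolic sketch of this for the example above, assuming the third-party sympy package is available:

```python
import sympy as sp

t = sp.symbols('t')
# MGF of the example pmf f(x) = (x + 2)/25, x = 1..5
M = sum((x + 2) * sp.exp(x * t) for x in range(1, 6)) / 25

mean = sp.diff(M, t).subs(t, 0)          # E(X)   = M'(0)
second = sp.diff(M, t, 2).subs(t, 0)     # E(X^2) = M''(0)
print(mean, second - mean**2)            # 17/5 and the variance 46/25
```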
Some properties of moment generating functions:
1. M_{aX}(t) = M_X(at)
2. M_{X+b}(t) = e^{bt} M_X(t)
3. M_{aX+b}(t) = e^{bt} M_X(at)
where a, b are constants.
Moment generating functions M_X(t) of some distributions:

 Distribution          Mean             Var(X)          M_X(t)
 Binomial dist.        E(X) = np        Var(X) = npq    M_X(t) = (q + pe^t)^n
 Geometric dist.       E(X) = 1/p       Var(X) = q/p²   M_X(t) = pe^t / (1 - qe^t)
 Poisson dist.         E(X) = λ         Var(X) = λ      M_X(t) = e^{λ(e^t - 1)}
 Hypergeometric dist.  E(X) = nM/N      --              --
 Uniform dist.         E(X) = (N+1)/2   --              --
Example
For each of the following moment generating functions, find the mean and the variance of X.

1- M_X(t) = (0.4e^t + 0.6)^4
The distribution is binomial with n = 4, p = 0.4:
E(X) = np = 4 × 0.4 = 1.6
Var(X) = npq = 4 × 0.4 × 0.6 = 0.96

2- M_X(t) = e^{6(e^t - 1)}
The distribution is Poisson with λ = 6:
E(X) = λ = 6
Var(X) = λ = 6
Example (continued)
3- M_X(t) = \frac{0.2 e^t}{1 - 0.8 e^t}
The distribution is geometric with p = 0.2:
E(X) = 1/p = 1/0.2 = 5
Var(X) = q/p² = 0.8/(0.2)² = 20
P(X = 1) = pq^{x-1} = pq^0 = 0.2 × (0.8)^0 = 0.2
Example
The moment generating functions of the random variables X and Y are

M_X(t) = e^{2(e^t - 1)}
M_Y(t) = (0.75e^t + 0.25)^{10}

If X and Y are independent, find
1- E(XY)
2- Var(X + Y)
3- Cov(X + 2, Y - 3)

Solution:
X has a Poisson distribution with λ = 2, so E(X) = Var(X) = λ = 2.
Y has a binomial distribution with n = 10, p = 0.75, so
E(Y) = 10 × 0.75 = 7.5 and Var(Y) = 10 × 0.75 × 0.25 = 1.875
Example (continued)
Since X and Y are independent:
1- E(XY) = E(X)E(Y) = 2 × 7.5 = 15
2- Var(X + Y) = Var(X) + Var(Y) = 2 + 1.875 = 3.875
3- Cov(X + 2, Y - 3) = Cov(X, Y) + Cov(X, -3) + Cov(2, Y) + Cov(2, -3) = 0 + 0 + 0 + 0 = 0