Function of Random Variables
7.1 Introduction
How do we find the distribution of a random variable Y that is a function of several random variables X1, X2, …, Xn that have a joint probability distribution?
Functions of Random Variables: y = u(x1, x2, …, xn)

Methods for finding the distribution of a function of one or more random variables:
1. Distribution Function Technique
2. Transformation Technique
3. Moment Generating Function Technique
7.2 Distribution Function Technique

To find the probability density function of Y = u(X1, X2, …, Xn) from a given joint probability density, first find the cumulative probability, or distribution, function

F(y) = P(Y ≤ y) = P(u(X1, X2, …, Xn) ≤ y)

and then differentiate it to get the p.d.f.:

g(y) = dF(y)/dy

Example: Let X ~ U(0,1) and Y = X^n; find the p.d.f. of Y.

G(y) = P(Y ≤ y) = P(X^n ≤ y) = P(X ≤ y^(1/n)) = F(y^(1/n)) = y^(1/n)

g(y) = (1/n) y^(1/n − 1), for 0 ≤ y ≤ 1; 0, elsewhere.

Example: Let X have the following p.d.f.,

f(x) = 6x(1 − x), for 0 < x < 1; 0, elsewhere;

find the p.d.f. of Y = X^3.

G(y) = P(Y ≤ y) = P(X^3 ≤ y) = P(X ≤ y^(1/3)) = ∫_0^(y^(1/3)) 6x(1 − x) dx = 3y^(2/3) − 2y, for 0 < y < 1

g(y) = 2(y^(−1/3) − 1), for 0 < y < 1; 0, elsewhere.

Example: Let X have a p.d.f. f(x) and Y = X^2; find the p.d.f. of Y.

G(y) = P(Y ≤ y) = P(X^2 ≤ y) = P(−y^(1/2) ≤ X ≤ y^(1/2)) = F(y^(1/2)) − F(−y^(1/2))

g(y) = [f(y^(1/2)) + f(−y^(1/2))] / (2√y), for y > 0.
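The Y = X^n result can be spot-checked numerically: the derived p.d.f. should equal the derivative of the derived c.d.f. A minimal Python sketch (the names G and g and the choice n = 3 are illustrative, not from the slides):

```python
# Hedged sketch: check Y = X**n with X ~ U(0,1).  The derived CDF is
# G(y) = y**(1/n) and the derived pdf is g(y) = (1/n)*y**(1/n - 1) on (0,1).

n = 3

def G(y):
    return y ** (1.0 / n)                     # CDF of Y = X**n

def g(y):
    return (1.0 / n) * y ** (1.0 / n - 1.0)   # derived pdf

# a central finite difference of the CDF should match the pdf
h = 1e-6
for y in (0.2, 0.5, 0.9):
    numeric = (G(y + h) - G(y - h)) / (2 * h)
    assert abs(numeric - g(y)) < 1e-6
```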
Example: Let X have a p.d.f. f(x) and Y = |X|; find the p.d.f. of Y when X ~ N(0,1).

G(y) = P(Y ≤ y) = P(|X| ≤ y) = P(−y ≤ X ≤ y) = F(y) − F(−y)

g(y) = f(y) + f(−y), for y > 0.

When X ~ N(0,1), g(y) = 2f(y) = (2/√(2π)) e^(−y²/2), for y > 0.

Example: Let X have the p.d.f. f(x) = (1/θ) e^(−x/θ), for x > 0; find the p.d.f. of Y = ln X.

G(y) = P(Y ≤ y) = P(ln X ≤ y) = P(X ≤ e^y) = ∫_0^(e^y) (1/θ) e^(−x/θ) dx = [−e^(−x/θ)]_0^(e^y) = 1 − e^(−e^y/θ)

g(y) = G′(y) = (1/θ) e^y e^(−e^y/θ), for −∞ < y < ∞.

Example: If X1 and X2 are independent random variables having U(0,1), find the distribution function of Y = X1 + X2.

f1(x1) = 1 = f2(x2), so f(x1, x2) = 1, for 0 < x1 < 1, 0 < x2 < 1.

F(y) = P(Y ≤ y) = P(X1 + X2 ≤ y) is the area below the line x1 + x2 = y inside the unit square:

F(y) = 0, if y ≤ 0
F(y) = y²/2, if 0 < y ≤ 1
F(y) = 1 − (2 − y)²/2, if 1 < y ≤ 2
F(y) = 1, if y > 2.

Example: If the joint density of X1 and X2 is given by

f(x1, x2) = 6e^(−3x1 − 2x2), for x1 > 0, x2 > 0; 0, elsewhere;

find the distribution function and p.d.f. of Y = X1 + X2.

F(y) = ∫_0^y ∫_0^(y − x2) 6e^(−3x1 − 2x2) dx1 dx2 = 1 + 2e^(−3y) − 3e^(−2y), for y > 0

f(y) = F′(y) = 6(e^(−2y) − e^(−3y)), for y > 0; f(y) = 0, elsewhere.
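The last distribution function can be verified numerically by integrating the joint density over the triangle x1 + x2 ≤ y. A Python sketch (function names and step count are illustrative):

```python
import math

# Verify F(y) = 1 + 2*exp(-3y) - 3*exp(-2y) for the joint density
# 6*exp(-3x1 - 2x2) on x1 > 0, x2 > 0.  The inner integral over x1 is
# done in closed form, the outer one by a midpoint Riemann sum.

def F_claimed(y):
    return 1 + 2 * math.exp(-3 * y) - 3 * math.exp(-2 * y)

def F_numeric(y, steps=20000):
    total, dx2 = 0.0, y / steps
    for i in range(steps):
        x2 = (i + 0.5) * dx2
        # closed form of the inner integral: int_0^{y-x2} 6 e^{-3x1-2x2} dx1
        total += 2 * math.exp(-2 * x2) * (1 - math.exp(-3 * (y - x2))) * dx2
    return total

for y in (0.5, 1.0, 2.0):
    assert abs(F_numeric(y) - F_claimed(y)) < 1e-4
```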
7.3 Transformation Technique: One Variable

For a discrete random variable, whether or not Y = u(X) is one-to-one, finding the distribution of Y is a straightforward substitution.

Example: Let X be the number of heads in tossing a balanced coin three times; find the probability distribution of Y = 1/(1 + X). (One-to-one function)

x:     0    1    2    3
f(x): 1/8  3/8  3/8  1/8

y:     1    1/2  1/3  1/4
g(y): 1/8  3/8  3/8  1/8

Example: Let X be the number of heads in tossing a balanced coin three times; find the probability distribution of Y = (1 − X)². (Not one-to-one function)

x:     0    1    2    3
f(x): 1/8  3/8  3/8  1/8

Substituting y* = (1 − x)² for each x gives

y*:     1    0    1    4
g(y*): 1/8  3/8  3/8  1/8

and combining the values of y* that coincide gives

y:     0    1    4
g(y): 3/8  4/8  1/8
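The discrete substitution above is mechanical: push each pmf value through y = (1 − x)² and sum probabilities of x-values that collide. A short sketch (names are illustrative):

```python
from fractions import Fraction

# pmf of X = number of heads in three tosses of a balanced coin
f = {0: Fraction(1, 8), 1: Fraction(3, 8),
     2: Fraction(3, 8), 3: Fraction(1, 8)}

# push the pmf through the non-one-to-one map y = (1 - x)**2,
# accumulating probabilities of x-values that map to the same y
g = {}
for x, p in f.items():
    y = (1 - x) ** 2
    g[y] = g.get(y, Fraction(0)) + p

assert g == {0: Fraction(3, 8), 1: Fraction(4, 8), 4: Fraction(1, 8)}
```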
Inverse Function Theorem: For functions of a single variable, if u is a continuously differentiable function with nonzero derivative at the point x, then u is invertible in a neighborhood of x, the inverse is continuously differentiable, and

D_y u⁻¹(y) = 1 / D_x u(x), where y = u(x).

Theorem 7.1 (Univariate Transformation Theorem): Let f(x) be the probability density of the continuous random variable X at x. If the function given by y = u(x) is differentiable and either (monotone) increasing or decreasing for all values within the range of X that have nonzero density, then the equation y = u(x) is one-to-one with inverse x = w(y), and the probability density of Y = u(X) is given by

g(y) = f(w(y)) · |w′(y)|, for u′(x) ≠ 0; 0, elsewhere,

where (u⁻¹(y))′ = w′(y) = dx/dy = 1/u′(x).
Another version of the formula, with u⁻¹(y) = w(y):

g(y) = f(u⁻¹(y)) · |d u⁻¹(y)/dy|, for u′(x) ≠ 0; 0, elsewhere.

Example: X ~ U(0, 1) and Y = 2X. Here u(x) is an increasing function, so P(u(X) ≤ y) = P(X ≤ w(y)); an interval a < y < b maps back to w(a) < x < w(b), and g(y) = 1/2, for 0 < y < 2.
Example: X ~ U(0, 1) and Y = −2X. Here u(x) is a decreasing function, so P(u(X) ≤ y) = P(X ≥ w(y)), and g(y) = 1/2, for −2 < y < 0.

In general, let u(x) be a strictly decreasing function in the range of X, and let w be the inverse function of u, i.e., u⁻¹. Then

G(y) = P(Y ≤ y) = P(u(X) ≤ y) = P(X ≥ w(y)) = 1 − P(X < w(y)) = 1 − F(w(y))
g(y) = G′(y) = −f(w(y)) w′(y).

Since u(x) is a decreasing function, w′(y) < 0. If u(x) is an increasing function in the range of X, then

G(y) = P(Y ≤ y) = P(u(X) ≤ y) = P(X ≤ w(y)) = F(w(y))
g(y) = G′(y) = f(w(y)) w′(y), with w′(y) > 0.

In either case,

g(y) = f(w(y)) · |w′(y)|, for u′(x) ≠ 0; 0, elsewhere.
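The absolute value in the formula is what makes the decreasing case work. A numeric sketch for Y = −2X with X ~ U(0,1), where w(y) = −y/2, w′(y) = −1/2, so g(y) = 1 · |−1/2| = 1/2 on (−2, 0) (the name G is illustrative):

```python
# CDF of Y = -2X for X ~ U(0,1) on (-2, 0):
# G(y) = P(X >= -y/2) = 1 + y/2
def G(y):
    return 1 + y / 2

# a numerical derivative of G should match f(w(y)) * |w'(y)| = 1/2
h = 1e-6
for y in (-1.5, -1.0, -0.25):
    numeric = (G(y + h) - G(y - h)) / (2 * h)
    assert abs(numeric - 0.5) < 1e-9
```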
Example: Let X have the exponential distribution with p.d.f. f(x) given by

f(x) = e^(−x), for x > 0; 0, elsewhere;

find the p.d.f. of the random variable Y = √X.

Sol: For y > 0, y = √x ⟺ x = y².

w(y) = y², w′(y) = 2y

g(y) = f(y²) · |2y| = e^(−y²) |2y|

g(y) = 2y e^(−y²), for y > 0; 0, elsewhere. (Weibull Distribution)

Example: Let X be the random variable that takes the distance from 0 to the point on the x-axis that a double arrow points to when it is spun, where x = a·tan θ. The random variable Θ is the angle, with uniform density

f(θ) = 1/π, for −π/2 < θ < π/2; 0, elsewhere;

find the p.d.f. of the random variable X.
Sol: x = a·tan θ ⟺ θ = arctan(x/a), so

dθ/dx = a / (a² + x²)

g(x) = (1/π) · a / (a² + x²), for −∞ < x < ∞.

Example: If F(x) is the distribution function of the continuous random variable X, find the p.d.f. of Y = F(X).

Sol: Let y = F(x). Then dy/dx = F′(x) = f(x), so

|dx/dy| = 1 / (dy/dx) = 1 / f(x), for f(x) ≠ 0

g(y) = f(x) · (1/f(x)) = 1, for 0 < y < 1.

* This gives a distribution function technique for random number generation using a U(0,1) random number generator.

Distribution Function Method for Random Numbers:
1. Generate a U(0, 1) random number u.
2. Set this random number equal to F(x) and solve for x.
3. The value x is then a random number from the distribution that has distribution function F(x).

Example: Generate a random number from the exponential distribution with parameter θ using a U(0,1) random number.

F(x) = 1 − e^(−x/θ) = u, where u is a random number from U(0, 1).

The random number from the exponential distribution is

x = −θ ln(1 − u).
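The three steps above can be sketched in Python (the function name `exp_random` is illustrative):

```python
import math, random

def exp_random(theta, u=None):
    """Inverse-CDF sampling of Exp(theta): solve F(x) = 1 - e^{-x/theta} = u."""
    if u is None:
        u = random.random()              # step 1: U(0,1) random number
    return -theta * math.log(1 - u)      # steps 2-3: x = -theta*ln(1-u)

# spot check with a fixed u: u = 0.5 gives the median, theta * ln 2
assert abs(exp_random(2.0, u=0.5) - 2.0 * math.log(2)) < 1e-12
```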
Example: If X has the standard normal distribution, find the probability density of Z = X².

Sol: z = x² is not one-to-one. First let Y = |X|; then Z = Y² = X².

p.d.f. of Y: g(y) = 2n(y; 0, 1) = (2/√(2π)) e^(−y²/2), for y > 0

z = y², w(z) = y = √z, w′(z) = (1/2) z^(−1/2)

p.d.f. of Z: h(z) = g(√z) · |w′(z)| = (2/√(2π)) e^(−z/2) · (1/2) z^(−1/2) = (1/√(2π)) z^(−1/2) e^(−z/2), for z > 0.

(Chi-square distribution with 1 degree of freedom.)
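This density can be checked against the c.d.f. of Z: for X ~ N(0,1), P(X² ≤ z) = P(|X| ≤ √z) = erf(√(z/2)), and its derivative should match h(z). A sketch (names are illustrative):

```python
import math

def cdf(z):
    # P(X**2 <= z) for standard normal X, via the error function
    return math.erf(math.sqrt(z / 2))

def h(z):
    # derived chi-square(1) density
    return z ** -0.5 * math.exp(-z / 2) / math.sqrt(2 * math.pi)

eps = 1e-6
for z in (0.5, 1.0, 3.0):
    numeric = (cdf(z + eps) - cdf(z - eps)) / (2 * eps)
    assert abs(numeric - h(z)) < 1e-6
```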
Example (revisited): Let X ~ U(0,1) and Y = X^n; find the p.d.f. of Y.

Answer: g(y) = (1/n) y^(1/n − 1), for 0 ≤ y ≤ 1; 0, elsewhere.

7.4 Transformation Method: Several Variables

For a random variable Y = u(X1, X2) where the joint distribution or density of X1 and X2 is given, one can find the joint distribution or density of Y and X2 (or of X1 and Y) by holding the other variable fixed, if possible, and then find the marginal distribution or density function of Y.

In the continuous case, one can first use the transformation technique with the formula, holding x1 or x2 fixed:

g(y, x2) = f(x1, x2) · |∂x1/∂y|   or   g(x1, y) = f(x1, x2) · |∂x2/∂y|

Example: If X1 and X2 are independent random variables having Poisson distributions with the parameters λ1 and λ2, find the probability distribution of the random variable Y = X1 + X2.

Sol: Since X1 and X2 are independent, the joint density is

f(x1, x2) = (e^(−λ1) λ1^x1 / x1!) · (e^(−λ2) λ2^x2 / x2!) = e^(−(λ1+λ2)) λ1^x1 λ2^x2 / (x1! x2!)

for x1 = 0, 1, 2, …, and x2 = 0, 1, 2, … .

Since y = x1 + x2, then x1 = y − x2, and the joint distribution of Y and X2 is

g(y, x2) = e^(−(λ1+λ2)) λ1^(y−x2) λ2^x2 / ((y − x2)! x2!)

for y = 0, 1, 2, …, and x2 = 0, 1, 2, …, y.

The marginal of Y is

h(y) = Σ_{x2=0}^y e^(−(λ1+λ2)) λ1^(y−x2) λ2^x2 / ((y − x2)! x2!)
     = (e^(−(λ1+λ2)) / y!) Σ_{x2=0}^y [y! / ((y − x2)! x2!)] λ1^(y−x2) λ2^x2
     = e^(−(λ1+λ2)) (λ1 + λ2)^y / y!, for y = 0, 1, 2, … .

The sum of two independent Poisson random variables with parameters λ1 and λ2 is a Poisson random variable with parameter λ1 + λ2.

Example: Let random variables X1 and X2 have the joint p.d.f.

f(x1, x2) = e^(−(x1+x2)), for x1 > 0, x2 > 0; 0, elsewhere;

find the p.d.f. of the random variable Y = X1 / (X1 + X2).

Sol: Since y decreases as x2 increases with x1 held constant, we can find a joint density of X1 and Y and then use the transformation technique to find the density of Y.

y = x1 / (x1 + x2) ⟹ x2 = x1 · (1 − y)/y, for 0 < y < 1

⟹ ∂x2/∂y = −x1/y²

g(x1, y) = f(x1, x2) · |∂x2/∂y| = e^(−x1/y) · x1/y², for x1 > 0, 0 < y < 1

(note that x1 + x2 = x1/y). Then the marginal density of Y is

h(y) = ∫_0^∞ (x1/y²) e^(−x1/y) dx1.

Let u = x1/y, so that du = (1/y) dx1:

h(y) = ∫_0^∞ u e^(−u) du = Γ(2) = 1, for 0 < y < 1.

It is a U(0, 1)!!!
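The Poisson summation above can be verified numerically: summing the joint pmf over x2 should reproduce the Poisson(λ1 + λ2) pmf. A sketch (the parameter values 1.3 and 2.1 are illustrative):

```python
import math

l1, l2 = 1.3, 2.1

def poisson(lam, k):
    # Poisson pmf e^{-lam} lam^k / k!
    return math.exp(-lam) * lam ** k / math.factorial(k)

for y in range(10):
    # h(y) = sum over x2 of the joint pmf of (Y, X2)
    h = sum(poisson(l1, y - x2) * poisson(l2, x2) for x2 in range(y + 1))
    assert abs(h - poisson(l1 + l2, y)) < 1e-12
```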
Example: Let random variables X and Y have the joint p.d.f.

f(x, y) = 2, for x > 0, y > 0, x + y < 1; 0, elsewhere;

find the joint p.d.f. of X and Z = X + Y, and the marginal p.d.f. of Z.

z = x + y ⟹ y = z − x, with 0 < x < z (from 0 < z − x) and 0 < z < 1

g(x, z) = f(x, y) · |dy/dz| = 2 · 1 = 2, for 0 < x < z, 0 < z < 1

h(z) = ∫_0^z 2 dx = [2x]_0^z = 2z, for 0 < z < 1.
Theorem 7.2 (Generalization of Theorem 7.1): Let f(x1, x2) be the joint probability density of the continuous random variables X1 and X2. If the functions given by y1 = u1(x1, x2) and y2 = u2(x1, x2) are partially differentiable w.r.t. both x1 and x2 and are one-to-one for all x1, x2 in the space of X1 and X2 for which f(x1, x2) ≠ 0, and the inverse functions x1 = w1(y1, y2) and x2 = w2(y1, y2) can be uniquely determined (by solving for x1 and x2), then the joint p.d.f. of Y1 = u1(X1, X2) and Y2 = u2(X1, X2) is

g(y1, y2) = f(w1(y1, y2), w2(y1, y2)) · |J|

where J, the Jacobian of the transformation, is the determinant

J = | ∂x1/∂y1  ∂x1/∂y2 |
    | ∂x2/∂y1  ∂x2/∂y2 |

Example: Let random variables X1 and X2 have the joint p.d.f.

f(x1, x2) = e^(−(x1+x2)), for x1 > 0, x2 > 0; 0, elsewhere.

a) Find the joint p.d.f. of Y1 = X1 + X2 and Y2 = X1 / (X1 + X2).

y1 = x1 + x2 and y2 = x1 / (x1 + x2) ⟹ x1 = y1 y2 and x2 = y1(1 − y2), with 0 < y1 and 0 < y2 < 1

J = | y2      y1  | = −y1 y2 − y1(1 − y2) = −y1
    | 1 − y2  −y1 |

g(y1, y2) = e^(−y1) · |−y1| = y1 e^(−y1), for 0 < y1 and 0 < y2 < 1, and 0 elsewhere.

b) Find the marginal density of Y2.

h(y2) = ∫_0^∞ g(y1, y2) dy1 = ∫_0^∞ y1 e^(−y1) dy1 = Γ(2) = 1, for 0 < y2 < 1, and 0 elsewhere.
Example: Let random variables X1 and X2 have the joint p.d.f.

f(x1, x2) = 1, for 0 < x1 < 1, 0 < x2 < 1; 0, elsewhere.

a) Find the joint p.d.f. of Y = X1 + X2 and Z = X2.

y = x1 + x2 and z = x2 ⟹ x1 = y − z and x2 = z, with z < y < z + 1, 0 < z < 1

J = | ∂x1/∂y  ∂x1/∂z | = | 1  −1 | = 1
    | ∂x2/∂y  ∂x2/∂z |   | 0   1 |

f(y, z) = 1 · |1| = 1, for z < y < z + 1, 0 < z < 1, and 0 elsewhere.

b) Find the marginal density of Y.

h(y) = 0, for y ≤ 0
h(y) = ∫_0^y 1 dz = y, for 0 < y < 1
h(y) = ∫_(y−1)^1 1 dz = 2 − y, for 1 < y < 2
h(y) = 0, for y ≥ 2.
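The triangular marginal h(y) can be cross-checked against the area of {x1 + x2 ≤ t} inside the unit square, computed by a simple grid. A sketch (names and grid size are illustrative):

```python
def cdf_from_h(t):
    # integrate the derived triangular density h(y) up to t, 0 < t < 2
    if t <= 1:
        return t * t / 2
    return 1 - (2 - t) ** 2 / 2

def cdf_by_area(t, m=2000):
    # area of {x1 + x2 <= t} in the unit square, by a midpoint rule in x1
    area, step = 0.0, 1.0 / m
    for i in range(m):
        x1 = (i + 0.5) * step
        # length of the admissible x2-interval (0, min(t - x1, 1))
        area += min(max(t - x1, 0.0), 1.0) * step
    return area

for t in (0.5, 1.0, 1.5):
    assert abs(cdf_from_h(t) - cdf_by_area(t)) < 1e-3
```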
Example: Let random variables X and Y have the joint p.d.f.

f(x, y) = 2, for x > 0, y > 0, x + y < 1; 0, elsewhere.

a) Find the joint p.d.f. of Z = X + Y and W = X − Y.

z = x + y and w = x − y ⟹ x = (z + w)/2 and y = (z − w)/2, with z + w > 0, z − w > 0, and 0 < z < 1

J = | 1/2   1/2 | = −1/2
    | 1/2  −1/2 |

g(z, w) = 2 · |−1/2| = 1, for 0 < z < 1, z > −w, and z > w.

b) Find the marginal p.d.f. of Z = X + Y.

g(z) = ∫_(−z)^z g(z, w) dw = ∫_(−z)^z 1 dw = [w]_(−z)^z = 2z, for 0 < z < 1.
Previous Example, using g(x, z) = f(x, y) · |∂y/∂z|: Let random variables X and Y have the joint p.d.f.

f(x, y) = 2, for x > 0, y > 0, x + y < 1; 0, elsewhere;

find the marginal p.d.f. of Z = X + Y.

z = x + y ⟹ y = z − x, for 0 < z < 1 and 0 < x < z

g(x, z) = f(x, y) · |dy/dz| = 2 · 1 = 2

g(z) = ∫_0^z g(x, z) dx = ∫_0^z 2 dx = [2x]_0^z = 2z, for 0 < z < 1.

Example: Let random variables X1 and X2 have the joint p.d.f.

f(x1, x2) = 4x1x2, for 0 < x1 < 1, 0 < x2 < 1; 0, elsewhere.

a) Find the joint p.d.f. of Y1 = X1² and Y2 = X1X2.

y1 = x1² and y2 = x1x2 ⟹ x1 = √y1 and x2 = y2/√y1, with 0 < y1 < 1 and 0 < y2 < √y1 (i.e., y2² < y1)

J = | 1/(2√y1)           0     | = 1/(2y1)
    | −(y2/2) y1^(−3/2)  1/√y1 |

g(y1, y2) = 4√y1 · (y2/√y1) · |J| = 4y2/(2y1) = 2y2/y1, for 0 < y2 < √y1.

Generalized Version: If X1, X2, …, Xn ~ f(x1, x2, …, xn) and

y1 = u1(x1, x2, …, xn)
y2 = u2(x1, x2, …, xn)
…
yn = un(x1, x2, …, xn)

then g(y1, y2, …, yn) = f(x1, x2, …, xn) |J|, where

J = | ∂x1/∂y1  ⋯  ∂x1/∂yn |
    |    ⋮     ⋱     ⋮    |
    | ∂xn/∂y1  ⋯  ∂xn/∂yn |

7.5 Moment-Generating Function Technique

Theorem 7.3: If X1, X2, …, Xn are independent random variables with m.g.f.'s M_Xi(t), i = 1, 2, …, n, then the m.g.f. of Y = Σ_{i=1}^n a_i X_i is

M_Y(t) = Π_{i=1}^n M_Xi(a_i t).
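Theorem 7.3 can be illustrated with a small discrete case: for two independent fair dice and Y = a1·X1 + a2·X2, the m.g.f. of Y computed from the joint pmf equals the product of the marginal m.g.f.'s. A sketch (the coefficients 2 and −1 are illustrative):

```python
import math
from itertools import product

faces = range(1, 7)
a1, a2 = 2.0, -1.0

def mgf_die(t):
    # m.g.f. of one fair die: E[e^{tX}]
    return sum(math.exp(t * x) / 6 for x in faces)

def mgf_sum(t):
    # m.g.f. of Y = a1*X1 + a2*X2 computed directly from the joint pmf
    return sum(math.exp(t * (a1 * x1 + a2 * x2)) / 36
               for x1, x2 in product(faces, faces))

for t in (-0.3, 0.1, 0.5):
    assert abs(mgf_sum(t) - mgf_die(a1 * t) * mgf_die(a2 * t)) < 1e-9
```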
Example: Find the probability distribution of the sum of n independent random variables X1, X2, …, Xn that have Poisson distributions with parameters λ1, λ2, …, λn, respectively.

The m.g.f. of the Poisson distribution is M_Xi(t) = e^(λi(e^t − 1)).

So, for Y = X1 + X2 + … + Xn, the m.g.f. is

M_Y(t) = Π_{i=1}^n e^(λi(e^t − 1)) = e^((λ1 + λ2 + … + λn)(e^t − 1))

which is the m.g.f. of a Poisson distribution with parameter λ = λ1 + λ2 + … + λn; therefore, Y has a Poisson distribution with λ = λ1 + λ2 + … + λn.
Example: If X1, X2, …, Xn are mutually independent random variables from normal distributions with means μ1, μ2, …, μn and variances σ1², σ2², …, σn², then the linear function

Y = Σ_{i=1}^n c_i X_i

has the normal distribution N(Σ c_i μ_i, Σ c_i² σ_i²).

Sol:

M_Y(t) = Π_{i=1}^n M_Xi(c_i t) = Π_{i=1}^n e^(μ_i c_i t + σ_i² c_i² t²/2)
       = e^((Σ_{i=1}^n c_i μ_i) t + (Σ_{i=1}^n c_i² σ_i²) t²/2)

which is the m.g.f. of N(Σ_{i=1}^n c_i μ_i, Σ_{i=1}^n c_i² σ_i²).

Sample Mean: Y = X̄ if c_i = 1/n.
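The normal m.g.f. identity above can be checked at a few points: the product of the individual m.g.f.'s e^(μct + σ²c²t²/2) should equal the single normal m.g.f. with the combined parameters. A sketch (all parameter values are illustrative):

```python
import math

mus    = [1.0, -2.0, 0.5]
sigmas = [1.0, 2.0, 0.5]
cs     = [0.3, 0.5, -1.0]

def normal_mgf(mu, var, t):
    # m.g.f. of N(mu, var): exp(mu*t + var*t^2/2)
    return math.exp(mu * t + var * t * t / 2)

mu_y  = sum(c * m for c, m in zip(cs, mus))           # sum c_i mu_i
var_y = sum(c * c * s * s for c, s in zip(cs, sigmas))  # sum c_i^2 sigma_i^2

for t in (-1.0, 0.2, 0.7):
    prod = 1.0
    for mu, s, c in zip(mus, sigmas, cs):
        prod *= normal_mgf(mu, s * s, c * t)
    assert abs(prod - normal_mgf(mu_y, var_y, t)) < 1e-9 * prod
```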
Distribution of X̄

If X1, X2, …, Xn are observations of a random sample of size n from the normal distribution N(μ, σ²), then the distribution of the sample mean X̄ = (1/n) Σ_{i=1}^n X_i is N(μ, σ²/n):

μ_X̄ = μ and σ_X̄ = σ/√n.
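A seeded simulation illustrates the σ/√n effect: sample means of n = 16 draws from N(5, 2²) should have spread near 2/√16 = 0.5. A sketch (seed, sample sizes, and tolerances are illustrative):

```python
import math, random

random.seed(12345)
mu, sigma, n, reps = 5.0, 2.0, 16, 4000

# simulate `reps` sample means, each from n normal draws
means = [sum(random.gauss(mu, sigma) for _ in range(n)) / n
         for _ in range(reps)]

grand_mean = sum(means) / reps
spread = math.sqrt(sum((m - grand_mean) ** 2 for m in means) / reps)

assert abs(grand_mean - mu) < 0.05                # mean of X-bar is mu
assert abs(spread - sigma / math.sqrt(n)) < 0.05  # sd of X-bar is sigma/sqrt(n)
```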