Moment Generating Function and Probability Generating Function
Definition. For any random variable X, the Moment Generating Function (MGF) and the Probability Generating Function (PGF) are defined as follows:

M_X(t) = E[e^{tX}]   (MGF)

P_X(z) = E[z^X]   (PGF)

Note. M_X(t) = P_X(e^t).
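To see the note in action, here is a minimal sympy sketch using a Poisson(λ) random variable as a test case (the choice of distribution is illustrative, not from the notes); it builds the PGF and MGF from the pmf and checks that M_X(t) = P_X(e^t).

```python
import sympy as sp

t, z, lam = sp.symbols('t z lam', positive=True)
k = sp.symbols('k', integer=True, nonnegative=True)

# Poisson(lam) pmf: P(X = k) = exp(-lam) * lam^k / k!
pmf = sp.exp(-lam) * lam**k / sp.factorial(k)

# PGF: P_X(z) = E[z^X], and MGF: M_X(t) = E[e^{tX}]
pgf = sp.simplify(sp.summation(z**k * pmf, (k, 0, sp.oo)))          # exp(lam*(z - 1))
mgf = sp.simplify(sp.summation(sp.exp(t*k) * pmf, (k, 0, sp.oo)))   # exp(lam*(exp(t) - 1))

# The note: M_X(t) = P_X(e^t)
print(sp.simplify(mgf - pgf.subs(z, sp.exp(t))))   # 0
```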
Definition. The k-th raw moment of a random variable X with density function f(x) is

\mu'_k := E(X^k) = \int_{-\infty}^{\infty} x^k f(x)\,dx   if X is continuous,

\mu'_k := E(X^k) = \sum_j x_j^k\, P(X = x_j)   if X is discrete.
Theorem. Let X be a random variable (continuous or discrete). If the moment generating function M = M_X exists on a neighborhood of t = 0, then the raw moments of X can be calculated through the derivatives M^{(k)}(0):

E(X^k) = M^{(k)}(0)

Proof.

M^{(n)}(t) = \frac{d^n}{dt^n} E(e^{tX}) = E\left[\frac{d^n}{dt^n} e^{tX}\right] = E\left[X^n e^{tX}\right]

\Rightarrow\quad M^{(n)}(0) = E[X^n]
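As a sanity check of this theorem, the following sketch (my own example, assuming sympy and an Exponential(λ) density, which is not one of the examples below) computes the raw moments both directly from the definition and via M^{(k)}(0).

```python
import sympy as sp

t, x, lam = sp.symbols('t x lam', positive=True)

# Exponential(lam) density on (0, oo): f(x) = lam * exp(-lam*x)
f = lam * sp.exp(-lam * x)

# MGF by direct integration; it exists for t < lam (conds='none' drops the condition)
M = sp.integrate(sp.exp(t * x) * f, (x, 0, sp.oo), conds='none')    # lam / (lam - t)

for k in range(1, 4):
    raw_moment = sp.integrate(x**k * f, (x, 0, sp.oo))   # definition: E(X^k)
    via_mgf = sp.diff(M, t, k).subs(t, 0)                # theorem:    M^(k)(0)
    print(k, sp.simplify(raw_moment - via_mgf))          # 0 for each k
```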
Definition of the Gamma function:

\Gamma(x) = \int_0^{\infty} t^{x-1} e^{-t}\,dt, \qquad 0 < x < \infty

Example. \Gamma(1) = \int_0^{\infty} e^{-t}\,dt = \left[-e^{-t}\right]_{t=0}^{t=\infty} = 1

\Gamma(1) = 1
The domain of this function is (0, \infty). Now if x > 1, so that both x and x - 1 are in the domain of the Gamma function, then we have

\Gamma(x) = (x-1)\,\Gamma(x-1)

Here is how:

\Gamma(x) = \int_0^{\infty} t^{x-1} e^{-t}\,dt = \int_0^{\infty} t^{x-1}\,d\left(-e^{-t}\right) = \left[-t^{x-1} e^{-t}\right]_{t=0}^{t=\infty} + \int_0^{\infty} e^{-t}\,d\left(t^{x-1}\right)

= 0 + (x-1)\int_0^{\infty} t^{x-2} e^{-t}\,dt = (x-1)\,\Gamma(x-1) \quad\checkmark
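A quick numerical spot-check of this recursion, as a sketch assuming scipy is available (the test points are arbitrary):

```python
from scipy.special import gamma

# Check Gamma(x) = (x - 1) * Gamma(x - 1) at a few points with x > 1
for x in (1.5, 2.0, 3.7, 10.0):
    lhs = gamma(x)
    rhs = (x - 1) * gamma(x - 1)
    print(x, lhs, rhs, abs(lhs - rhs) < 1e-10 * lhs)
```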
Example. For k a positive integer,

\Gamma(k) = (k-1)\,\Gamma(k-1) = (k-1)(k-2)\cdots(1)\,\Gamma(1) = (k-1)!

\Gamma(k) = (k-1)! \qquad (k \text{ a positive integer})

We use this equality to define x! for x > -1: if x is any real number with x > -1, then we define

x! = \Gamma(x+1)

Note. In the integral \Gamma(\alpha) = \int_0^{\infty} t^{\alpha-1} e^{-t}\,dt, if we make the change of variable t = x/\theta, the integral takes the new shape

\int_0^{\infty} x^{\alpha-1} e^{-x/\theta}\,dx = \theta^{\alpha}\,\Gamma(\alpha)
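The rescaled integral is easy to confirm numerically. Below is a small sketch (assuming scipy; the values of α and θ are arbitrary test values):

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import gamma

alpha, theta = 2.5, 1.7   # arbitrary test values

# Left side: integral of x^(alpha-1) * exp(-x/theta) over (0, infinity)
lhs, _ = quad(lambda x: x**(alpha - 1) * np.exp(-x / theta), 0, np.inf)

# Right side: theta^alpha * Gamma(alpha)
rhs = theta**alpha * gamma(alpha)

print(lhs, rhs)   # the two values agree to numerical precision
```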
Example. The Weibull(\theta, \tau) distribution has the density function

f(x) = \frac{\tau}{x}\left(\frac{x}{\theta}\right)^{\tau} e^{-(x/\theta)^{\tau}} = \frac{\tau\, x^{\tau-1}\, e^{-(x/\theta)^{\tau}}}{\theta^{\tau}}, \qquad x > 0,\ \theta > 0,\ \tau > 0

Calculate its raw moments.
Solution.

E(X^n) = \int_0^{\infty} x^n\, \frac{\tau\, x^{\tau-1} e^{-(x/\theta)^{\tau}}}{\theta^{\tau}}\,dx = \int_0^{\infty} \frac{\tau\, x^{n+\tau-1} e^{-(x/\theta)^{\tau}}}{\theta^{\tau}}\,dx

Now make the change of variable y = x^{\tau}. Then

\tau x^{\tau-1}\,dx = dy \quad\Rightarrow\quad \tau x^{n+\tau-1}\,dx = x^n\,dy = y^{n/\tau}\,dy.

Continuing the above calculation:

= \frac{1}{\theta^{\tau}} \int_0^{\infty} y^{n/\tau}\, e^{-y/\theta^{\tau}}\,dy = \frac{1}{\theta^{\tau}} \int_0^{\infty} y^{\alpha-1}\, e^{-y/\theta^{\tau}}\,dy \qquad \text{where } \alpha = \frac{n}{\tau} + 1

= \frac{1}{\theta^{\tau}} \left(\theta^{\tau}\right)^{\frac{n}{\tau}+1} \Gamma\!\left(\frac{n}{\tau} + 1\right) = \theta^{n}\, \Gamma\!\left(\frac{n}{\tau} + 1\right)
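A numerical check of the Weibull result E(X^n) = θ^n Γ(n/τ + 1), as a sketch assuming scipy (the parameter values are arbitrary):

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import gamma

theta, tau, n = 2.0, 1.5, 3   # arbitrary test values

# Weibull(theta, tau) density as given above
def f(x):
    return tau * x**(tau - 1) * np.exp(-(x / theta)**tau) / theta**tau

numeric, _ = quad(lambda x: x**n * f(x), 0, np.inf)   # E(X^n) by integration
closed_form = theta**n * gamma(n / tau + 1)           # theta^n * Gamma(n/tau + 1)
print(numeric, closed_form)
```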
Example. The Gamma(\alpha, \theta) distribution has the density function

f(x) = \frac{1}{\Gamma(\alpha)\,\theta^{\alpha}}\, x^{\alpha-1} e^{-x/\theta}, \qquad x > 0,\ \theta > 0,\ \alpha > 0

Calculate the MGF and the raw moments of the Gamma distribution.

Solution.

M(t) = E(e^{tX}) = \int_0^{\infty} e^{tx} f(x)\,dx = \frac{1}{\Gamma(\alpha)\theta^{\alpha}} \int_0^{\infty} e^{tx}\, x^{\alpha-1} e^{-x/\theta}\,dx

= \frac{1}{\Gamma(\alpha)\theta^{\alpha}} \int_0^{\infty} x^{\alpha-1} e^{-\left(\frac{1}{\theta}-t\right)x}\,dx = \frac{1}{\Gamma(\alpha)\theta^{\alpha}} \int_0^{\infty} x^{\alpha-1} e^{-\left(\frac{1-\theta t}{\theta}\right)x}\,dx

= \frac{1}{\Gamma(\alpha)\theta^{\alpha}}\, \Gamma(\alpha) \left(\frac{\theta}{1-\theta t}\right)^{\alpha} = \frac{1}{(1-\theta t)^{\alpha}} \qquad (t < 1/\theta) \quad\checkmark
To find the raw moments:

M(t) = (1-\theta t)^{-\alpha}

\Rightarrow\quad M^{(n)}(t) = (-\theta)^n(-\alpha)(-\alpha-1)\cdots(-\alpha-n+1)\,(1-\theta t)^{-\alpha-n} = \theta^n\,\alpha(\alpha+1)\cdots(\alpha+n-1)\,(1-\theta t)^{-\alpha-n}

\Rightarrow\quad E(X^n) = M^{(n)}(0) = \theta^n\,\alpha(\alpha+1)\cdots(\alpha+n-1) \quad\checkmark
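To double-check this differentiation, here is a sympy sketch with the same α and θ; sp.rf(alpha, n) is the rising factorial α(α+1)···(α+n−1).

```python
import sympy as sp

t = sp.symbols('t')
alpha, theta = sp.symbols('alpha theta', positive=True)

M = (1 - theta * t)**(-alpha)   # MGF of Gamma(alpha, theta), for t < 1/theta

for n in range(1, 5):
    via_mgf = sp.diff(M, t, n).subs(t, 0)         # M^(n)(0)
    closed_form = theta**n * sp.rf(alpha, n)      # theta^n * alpha*(alpha+1)*...*(alpha+n-1)
    print(n, sp.simplify(via_mgf - closed_form))  # 0 for each n
```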
Example. The Pareto(\alpha, \theta) distribution has the density function

f(x) = \frac{\alpha\,\theta^{\alpha}}{(x+\theta)^{\alpha+1}}, \qquad x > 0

Find the raw moment E(X^n) for n < \alpha.
Solution.

E_{\alpha}(X^n) = \int_0^{\infty} \frac{x^n\, \alpha\theta^{\alpha}}{(x+\theta)^{\alpha+1}}\,dx = \alpha\theta^{\alpha} \int_0^{\infty} x^n (x+\theta)^{-\alpha-1}\,dx

= \alpha\theta^{\alpha} \int_0^{\infty} x^n\, d\!\left(-\frac{1}{\alpha}(x+\theta)^{-\alpha}\right)

= \alpha\theta^{\alpha} \left[-\frac{1}{\alpha}\, x^n (x+\theta)^{-\alpha}\right]_0^{\infty} + \alpha\theta^{\alpha} \int_0^{\infty} \frac{1}{\alpha}(x+\theta)^{-\alpha}\, d(x^n)

= 0 + \alpha\theta^{\alpha}\, \frac{n}{\alpha} \int_0^{\infty} \frac{x^{n-1}}{(x+\theta)^{\alpha}}\,dx

= \frac{n\theta}{\alpha-1} \int_0^{\infty} \frac{x^{n-1}\,(\alpha-1)\theta^{\alpha-1}}{(x+\theta)^{\alpha}}\,dx = \frac{n\theta}{\alpha-1}\, E_{\alpha-1}(X^{n-1})

So E_{\alpha}(X^n) = \frac{n\theta}{\alpha-1}\, E_{\alpha-1}(X^{n-1}). By repeated application of this equality, we get

E_{\alpha}(X^n) = \cdots = \left(\frac{n\theta}{\alpha-1}\right)\left(\frac{(n-1)\theta}{\alpha-2}\right)\cdots\left(\frac{2\theta}{\alpha-n+1}\right)\left(\frac{\theta}{\alpha-n}\right) E_{\alpha-n}(X^0)

where (note that \alpha - n > 0 since n < \alpha)

E_{\alpha-n}(X^0) = \int_0^{\infty} \frac{(\alpha-n)\theta^{\alpha-n}}{(x+\theta)^{\alpha-n+1}}\,dx = (\alpha-n)\theta^{\alpha-n} \int_0^{\infty} (x+\theta)^{-\alpha+n-1}\,dx

= (\alpha-n)\theta^{\alpha-n} \left[\frac{(x+\theta)^{-\alpha+n}}{-\alpha+n}\right]_0^{\infty} = (\alpha-n)\theta^{\alpha-n}\, \frac{\theta^{-\alpha+n}}{\alpha-n} = 1

Therefore

E_{\alpha}(X^n) = \left(\frac{n\theta}{\alpha-1}\right)\left(\frac{(n-1)\theta}{\alpha-2}\right)\cdots\left(\frac{\theta}{\alpha-n}\right) = \frac{\theta^n\, n!}{(\alpha-1)(\alpha-2)\cdots(\alpha-n)}
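A numerical check of the Pareto result, as a sketch assuming scipy (test values chosen so that n < α):

```python
import numpy as np
from scipy.integrate import quad
from math import factorial

alpha, theta, n = 6.0, 3.0, 2   # arbitrary test values with n < alpha

# Pareto(alpha, theta) density as given above
def f(x):
    return alpha * theta**alpha / (x + theta)**(alpha + 1)

numeric, _ = quad(lambda x: x**n * f(x), 0, np.inf)          # E(X^n) by integration
denominator = np.prod([alpha - k for k in range(1, n + 1)])  # (alpha-1)*...*(alpha-n)
closed_form = theta**n * factorial(n) / denominator
print(numeric, closed_form)   # both should equal 0.9 here
```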
Example. Calculate the PGF of the Uniform(a, b) distribution.

Solution.

P_X(z) = E(z^X) = \int_a^b z^x\, \frac{1}{b-a}\,dx = \frac{1}{b-a} \left[\frac{z^x}{\ln z}\right]_{x=a}^{x=b} = \frac{z^b - z^a}{(b-a)\ln z}
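A quick numerical check of the Uniform(a, b) PGF (a sketch assuming scipy; a, b and z are arbitrary, with z > 0 and z ≠ 1 so that ln z is defined and nonzero):

```python
import numpy as np
from scipy.integrate import quad

a, b, z = 1.0, 4.0, 2.5   # arbitrary test values, z > 0 and z != 1

numeric, _ = quad(lambda x: z**x / (b - a), a, b)    # E[z^X] by integration
closed_form = (z**b - z**a) / ((b - a) * np.log(z))  # (z^b - z^a) / ((b-a) ln z)
print(numeric, closed_form)
```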
Example. Calculate the PGF of a discrete uniform random variable with support \left\{\frac{1}{n}, \frac{2}{n}, \ldots, \frac{n}{n}\right\}:

P\!\left(X = \frac{k}{n}\right) = \frac{1}{n}, \qquad k = 1, \ldots, n

Solution.

Before anything, we recall that for a geometric progression with common ratio r we have

a + ar + ar^2 + \cdots + ar^{n-1} = \frac{\text{first term}\,\left(1 - r^{\text{number of terms}}\right)}{1-r} = \frac{a(1-r^n)}{1-r}

P_X(z) = E\left[z^X\right] = \sum_{k=1}^{n} z^{k/n}\, P\!\left(X = \frac{k}{n}\right) = \frac{1}{n}\sum_{k=1}^{n} z^{k/n} = \frac{1}{n}\sum_{k=1}^{n} \left(z^{1/n}\right)^k

= \frac{1}{n}\, \frac{z^{1/n}\left(1 - (z^{1/n})^n\right)}{1 - z^{1/n}} = \frac{1}{n}\, \frac{z^{1/n}(1-z)}{1 - z^{1/n}}
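A direct numerical check of the discrete-uniform PGF (plain Python; n and z are arbitrary test values with z > 0 and z ≠ 1):

```python
n, z = 7, 0.6   # arbitrary test values, z > 0 and z != 1

# P_X(z) = (1/n) * sum_{k=1}^{n} z^(k/n)
direct = sum(z**(k / n) for k in range(1, n + 1)) / n

# Closed form from the geometric-series calculation above
closed_form = z**(1 / n) * (1 - z) / (n * (1 - z**(1 / n)))

print(direct, closed_form)
```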
Uniqueness Theorem in terms of the MGF. Let X and Y be two random variables whose MGFs exist on a neighborhood of the origin. If for all t in some neighborhood t \in (-h, h) of the origin we have M_X(t) = M_Y(t), then X and Y are identically distributed, i.e. they have identical distributions.
Uniqueness Theorem in terms of the PGF. Let X and Y be two random variables whose PGFs exist on a neighborhood of the point 1. If for all t in some neighborhood t \in (1-h, 1+h) of 1 we have P_X(t) = P_Y(t), then X and Y are identically distributed, i.e. they have identical distributions.