UNIT 6 : PROBABILITY FUNCTIONS
Gabriel Asare Okyere (PhD)
March 23, 2016
Outline of Presentation
1. Moment Generating Functions
2. Special Probability Distributions
3. Useful Continuous Distributions
4. Distributions of Functions of Random Variables
Moment Generating Functions
The Moment Generating Function (MGF) is a function that
generates moments. It is defined as follows.
Mathematical Definition of MGF
The Moment Generating Function (MGF) of a random variable X, denoted by M_X(t), is defined by

M_X(t) = E(e^{tX}) = \sum_{x=0}^{\infty} e^{tx} f(x)   if X is discrete

M_X(t) = E(e^{tX}) = \int_{-\infty}^{\infty} e^{tx} f(x) dx   if X is continuous
Example
A discrete random variable Y has probability mass function given by

P(Y = y) = \binom{12}{y} (0.6)^y (0.4)^{12-y},   y = 0, 1, ..., 12.

Find the moment generating function of Y.
Solution
M_Y(t) = E(e^{tY})
= \sum_{y=0}^{12} e^{ty} \binom{12}{y} (0.6)^y (0.4)^{12-y}
= \sum_{y=0}^{12} \binom{12}{y} (0.6e^t)^y (0.4)^{12-y}

Recognizing the sum as the binomial expansion of (0.6e^t + 0.4)^{12}, we obtain

M_Y(t) = (0.6e^t + 0.4)^{12}
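As a quick numerical sanity check (not part of the original notes), the closed form can be compared with the defining sum at an arbitrary value of t; the short Python sketch below assumes only the pmf given above.

import math

t = 0.7  # any test value of t
mgf_from_sum = sum(math.exp(t * y) * math.comb(12, y) * 0.6**y * 0.4**(12 - y)
                   for y in range(13))
mgf_closed_form = (0.6 * math.exp(t) + 0.4)**12
print(mgf_from_sum, mgf_closed_form)  # the two values agree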
Example
A continuous random variable Y has a p.d.f. given by

f(y) = 4e^{-4y},  y > 0;   f(y) = 0 elsewhere.

Find the moment generating function of Y.
Solution
M_Y(t) = E(e^{tY}) = \int_{-\infty}^{\infty} e^{ty} f(y) dy
= \int_0^{\infty} e^{ty} 4e^{-4y} dy
= \int_0^{\infty} 4e^{-(4-t)y} dy
= \left[ \frac{-4}{4-t} e^{-(4-t)y} \right]_0^{\infty}
= \frac{4}{4-t},   t < 4
Theorem
If M_X(t) exists, then for any positive integer k,

\frac{d^k M_X(t)}{dt^k}\Big|_{t=0} = M_X^{(k)}(0) = E(X^k),

where \frac{d^k M_X(t)}{dt^k} = M_X^{(k)}(t) is the k-th derivative of M_X(t) with respect to t.
Proof
M_X(t) = E(e^{tX})
M_X'(t) = E(X e^{tX})  ⇒  M_X'(0) = E(X)
M_X''(t) = E(X^2 e^{tX})  ⇒  M_X''(0) = E(X^2)
⋮
M_X^{(n)}(t) = E(X^n e^{tX})  ⇒  M_X^{(n)}(0) = E(X^n)
Example
A random variable Y has moment generating function M_Y(t) = (0.6e^t + 0.4)^{12}. Find E(Y) and Var(Y).
Solution
M_Y(t) = (0.6e^t + 0.4)^{12}
M_Y'(t) = 12(0.6e^t + 0.4)^{11}(0.6e^t)
E(Y) = M_Y'(0) = 12(0.6e^0 + 0.4)^{11}(0.6e^0) = 7.2
M_Y''(t) = 12 · 11 (0.6e^t + 0.4)^{10}(0.6e^t)^2 + 12(0.6e^t + 0.4)^{11}(0.6e^t)
E(Y^2) = M_Y''(0) = 12 · 11 · 0.6^2 + 12 · 0.6 = 54.72
Var(Y) = E(Y^2) - (E(Y))^2 = 54.72 - 7.2^2 = 2.88
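The same moments can be recovered numerically by differentiating M_Y(t) at t = 0 with central differences; this is only an illustrative check of the theorem, not part of the original derivation.

import math

def M(t):
    return (0.6 * math.exp(t) + 0.4) ** 12   # the MGF from the example

h = 1e-4
EY  = (M(h) - M(-h)) / (2 * h)               # approximates M'(0)  = E(Y)   = 7.2
EY2 = (M(h) - 2 * M(0) + M(-h)) / h**2       # approximates M''(0) = E(Y^2) = 54.72
print(EY, EY2, EY2 - EY**2)                  # variance ≈ 2.88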
Example
A continuous random variable Y has moment generating function given by

M_Y(t) = 3(3 - t)^{-1}

Find E(Y) and Var(Y).
Solution
M_Y(t) = 3(3 - t)^{-1}
M_Y'(t) = 3(3 - t)^{-2}
M_Y''(t) = 6(3 - t)^{-3}
E(Y) = M_Y'(0) = 3(3)^{-2} = \frac{1}{3}
E(Y^2) = M_Y''(0) = 6(3)^{-3} = \frac{2}{9}
Var(Y) = E(Y^2) - (E(Y))^2 = \frac{2}{9} - \left(\frac{1}{3}\right)^2 = \frac{1}{9}
Theorem
A moment generating function always exists at t = 0 and equals 1.
Proof
M_X(0) = E(e^{0·X}) = E(1) = 1
Theorem
Let X be a random variable with moment generating function M_X(t). If Y = aX, where a is a constant, then M_Y(t) = M_X(at).
Proof
M_Y(t) = E(e^{tY}) = E(e^{taX}) = E(e^{(at)X}) = M_X(at)
Example
Given that X has the moment generating function M_X(t) = \frac{4}{4-t} for t < 4, find the moment generating function of Y = 2X.
Solution
M_X(t) = \frac{4}{4-t}

M_Y(t) = M_{2X}(t) = M_X(2t) = \frac{4}{4 - 2t},   t < 2
Theorem
If Y = a + bX and M_X(t) exists, then M_Y(t) = e^{at} M_X(bt).
Proof
M_Y(t) = E(e^{t(a+bX)}) = E(e^{at} e^{btX}) = e^{at} E(e^{btX}) = e^{at} M_X(bt)
Example
Given that X has the moment generating function M_X(t) = \frac{4}{4-t} for t < 4, find the moment generating function of Y = 4 + 3X.
Solution
M_X(t) = \frac{4}{4-t}

M_Y(t) = M_{4+3X}(t) = e^{4t} M_X(3t) = e^{4t} · \frac{4}{4 - 3t},   t < \frac{4}{3}
The Uniqueness theorem
Corresponding to each moment generating function M(t), there is
a unique distribution function having that M(t) as moment
generating function.
Independence
Let X_1, X_2, ..., X_n be independent random variables with moment generating functions M_{X_i}(t), i = 1, 2, ..., n. If Y = X_1 + X_2 + ... + X_n, then M_Y(t) = M_{X_1}(t) M_{X_2}(t) ... M_{X_n}(t).
Example
The random variables X and Y are independent with respective moment generating functions M_X(t) = (pe^t + q)^n and M_Y(t) = (pe^t + q)^m, where p + q = 1. Find the moment generating function of X + Y.
Solution
Since X and Y are independent,
M_{X+Y}(t) = M_X(t) M_Y(t) = (pe^t + q)^n (pe^t + q)^m = (pe^t + q)^{n+m}
Special Probability Distributions
Some probability distributions are used so extensively in statistical
analysis that special formulae and/or tables have been developed
for computing the probabilities associated with them.
DISCRETE DISTRIBUTIONS: Discrete Uniform Distribution
The simplest discrete random variable is one that assumes only a finite number of possible values, each with equal probability. A random variable X that assumes each of the values x_1, x_2, ..., x_n with equal probability 1/n is frequently of interest.
Definition
A random variable X has a discrete uniform distribution if each of the n values in its range, say x_1, x_2, ..., x_n, has equal probability. Then

f(x_i) = \frac{1}{n},   i = 1, 2, ..., n
Discrete Uniform Distribution
Mean, Variance & MGF of a Discrete Uniform Distribution
Suppose X is a discrete uniform random variable on the consecutive integers a, a + 1, a + 2, ..., b, for a ≤ b.
The mean of X is

E[X] = µ = \frac{b + a}{2}

The variance of X is

Var(X) = σ^2 = \frac{(b - a + 1)^2 - 1}{12}

The moment generating function is

M_X(t) = \frac{e^{at}(e^{Nt} - 1)}{N(e^t - 1)},   where N = b - a + 1
Discrete Uniform Distribution
Example
Suppose that X has a discrete uniform distribution on the integers
0 through 9. Determine the mean, variance, and standard
deviation of the random variable Y = 5X
Discrete Uniform Distribution
Solution
S = {0, 1, 2, 3, 4, 5, 6, 7, 8, 9}
X has a discrete uniform distribution with probability 0.1 for each value in the sample space. That is,

f(x) = \frac{1}{10} = 0.1
Discrete Uniform Distribution
E[Y] = E[5X] = 5E[X]
E[X] = (0 + 9)/2 = 4.5
E[Y] = 5(4.5) = 22.5
Var(Y) = Var(5X) = 5^2 Var(X) = 25 Var(X)
Var(X) = \frac{(9 - 0 + 1)^2 - 1}{12} = 8.25
Var(Y) = 25(8.25) = 206.25
SD(Y) = \sqrt{Var(Y)} = \sqrt{206.25} = 14.3614
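Because the range is small, the same answers can be checked by direct enumeration; the short Python sketch below assumes nothing beyond the ten equally likely values.

values_y = [5 * x for x in range(10)]                             # Y = 5X for X = 0,...,9
mean_y = sum(values_y) / len(values_y)                            # 22.5
var_y = sum((y - mean_y)**2 for y in values_y) / len(values_y)    # 206.25
print(mean_y, var_y, var_y**0.5)                                  # sd ≈ 14.3614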
DISCRETE DISTRIBUTIONS: Bernoulli Process
A single trial of an experiment may result in one of two mutually exclusive outcomes, such as head or tail in a toss of a coin, yes or no in an election, dead or alive as a person comes out of surgery, male or female in a childbirth, etc. Such a trial is called a Bernoulli trial, and a sequence of these trials forms a Bernoulli process.
Bernoulli Process
Conditions of a Bernoulli Process
Every Bernoulli process should satisfy the following conditions:
1. Each trial results in one of two mutually exclusive outcomes, success or failure.
2. The probability of a success, p, remains constant from trial to trial. The probability of failure, denoted q = 1 - p, also remains the same.
3. The trials are independent. That is, the outcome of any particular trial is not affected by the outcome of any other trial.
4. The random variable of this experiment is binary: it assumes the values 0 and 1.
Bernoulli Process
Definition
A random variable X is said to have a Bernoulli distribution if it assumes only the values 0 and 1 for the two outcomes. Its probability distribution is defined by

p(x) = p if x = 1;   p(x) = q if x = 0

or, equivalently,

p(x) = p^x (1 - p)^{1-x},   x = 0, 1,  where 0 < p < 1 and q = 1 - p
Bernoulli Process
Mean, Variance, X’tic Function & MGF of a Bernoulli Process
E[X] = µ = p
Var(X) = σ^2 = p(1 - p)
M_X(t) = pe^t + q
φ_X(t) = pe^{it} + q
Bernoulli Process
Proof
E(X) = 0 × (1 - p) + 1 × p = p
E(X^2) = 0^2 × (1 - p) + 1^2 × p = p
Var(X) = E(X^2) - [E(X)]^2 = p - p^2 = p(1 - p)
M_X(t) = E(e^{tX}) = pe^{1·t} + (1 - p)e^{0·t} = pe^t + (1 - p) = pe^t + q
φ_X(t) = M_X(it) = pe^{it} + q
Bernoulli Process
Example
Suppose that a fair die is tossed. Let

W = 1 if a 2 occurs;  W = 0 otherwise.

a. Find the distribution of W.
b. Find E[W] and Var[W].
Bernoulli Process
Solution
When a die is tossed, a 2 occurs or a 2 does not occur. If the die is fair, the probability of a 2 is \frac{1}{6}. The outcomes from different trials are independent. W therefore has the Bernoulli distribution with probability mass function

f(w) = \left(\frac{1}{6}\right)^w \left(\frac{5}{6}\right)^{1-w},   w = 0, 1

E(W) = p = \frac{1}{6}

Var(W) = p(1 - p) = \frac{1}{6} × \frac{5}{6} = \frac{5}{36}
DISCRETE DISTRIBUTIONS: Binomial Distribution
The Binomial distribution is used to model experiments consisting of observations of identical and independent trials, each of which results in one of two outcomes. It generalizes a sequence of Bernoulli trials. Some examples are tossing a coin n times and observing the number of successes (heads or tails), or firing a sequence of n shots and observing the number of hits or misses.
Binomial Distribution
Conditions of a Binomial Distribution
1. The experiment consists of independent and identical trials.
2. Each trial results in one of two outcomes, called success or failure.
3. The probability of success in a single trial is p and remains the same from trial to trial. The probability of failure in a single trial is q = 1 - p.
4. The random variable of interest, X, is the number of successes observed during the n trials.
Binomial Distribution
Definition
A random variable X has the binomial distribution with number of trials n and probability of success p if

P(X = x) = \binom{n}{x} p^x (1 - p)^{n-x},   x = 0, 1, 2, ..., n
Binomial Distribution
Mean, Variance, X’tic Function & MGF of a Binomial Distribution
E[X] = µ = np
Var(X) = σ^2 = np(1 - p)
M_X(t) = (pe^t + q)^n
φ_X(t) = (pe^{it} + q)^n
Binomial Distribution
Example
Four fair coins are flipped. If their outcomes are assumed
independent, what is the probability that two heads and two tails
are obtained? Calculate the mean and variance.
Binomial Distribution
Solution
Letting X equal the number of heads ("successes") that appear, X is a binomial random variable with parameters n = 4 and p = \frac{1}{2}. Hence

P(X = 2) = \binom{4}{2} \left(\frac{1}{2}\right)^2 \left(\frac{1}{2}\right)^2 = \frac{3}{8}

E(X) = np = 4 \left(\frac{1}{2}\right) = 2

Var(X) = np(1 - p) = 4 \left(\frac{1}{2}\right)\left(\frac{1}{2}\right) = 1
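For reference, the same quantities are available from scipy.stats.binom; this is an optional cross-check, assuming SciPy is installed.

from scipy.stats import binom

print(binom.pmf(2, 4, 0.5))                   # P(X = 2) = 0.375 = 3/8
print(binom.mean(4, 0.5), binom.var(4, 0.5))  # E(X) = 2.0, Var(X) = 1.0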
Binomial Distribution
Example
It is known that all items produced by a certain machine will be
defective with probability 0.1, independently of each other. What
is the probability that in a sample of 3 items, at most one will be
defective.
Binomial Distribution
Solution
If X is the number of defective items in the sample, then X is a binomial random variable with parameters (3, 0.1). Hence the desired probability is given by

P(X ≤ 1) = P(X = 0) + P(X = 1)
= \binom{3}{0} (0.1)^0 (0.9)^3 + \binom{3}{1} (0.1)^1 (0.9)^2
= 0.972
Binomial Distribution
Example
The random variable X has the moment generating function given by

M_X(t) = \frac{(2 + 3e^t)^6}{5^6}

a. What is the distribution of X?
b. Find P(X ≥ 1).
Binomial Distribution
Solution
M_X(t) = \frac{(2 + 3e^t)^6}{5^6} = \left(\frac{2 + 3e^t}{5}\right)^6 = \left(\frac{2}{5} + \frac{3}{5}e^t\right)^6

Comparing with M_X(t) = (pe^t + q)^n gives p = \frac{3}{5}, q = \frac{2}{5} and n = 6, so X has the binomial distribution with parameters n = 6 and p = \frac{3}{5}:

P(X = x) = \binom{6}{x} \left(\frac{3}{5}\right)^x \left(\frac{2}{5}\right)^{6-x}
Binomial Distribution
P(X ≥ 1) = 1 - P(X < 1) = 1 - P(X = 0)
= 1 - \binom{6}{0} \left(\frac{3}{5}\right)^0 \left(\frac{2}{5}\right)^6
= 1 - 0.004096 = 0.9959
DISCRETE DISTRIBUTIONS: Multinomial Distribution
Multinomial distributions can be seen in many real-life situations, like rating a manufactured product as excellent, very good, good, average or inferior, or persons being interviewed in an opinion poll indicating whether they are for, against or undecided about a candidate. Such situations conform to the multinomial experiment, a generalization of the binomial in which each trial has k > 2 possible outcomes.
Multinomial Distribution
Conditions of a Multinomial Distribution
1. The experiment consists of n independent and identical trials.
2. A trial of the experiment results in any one of k mutually exclusive possible outcomes with respective probabilities p_1, p_2, p_3, ..., p_k such that \sum_{i=1}^{k} p_i = 1.
3. The probability of an outcome in a single trial remains the same from trial to trial.
4. The random variables y_1, y_2, y_3, ..., y_k count the number of successes in each class of outcomes, where \sum_{i=1}^{k} y_i = y_1 + y_2 + y_3 + ... + y_k = n.
Multinomial Distribution
Definition
The random variables y_1, y_2, y_3, ..., y_k have a multinomial distribution with probabilities p_1, p_2, p_3, ..., p_k if

P(y_1, y_2, y_3, ..., y_k) = \frac{n!}{y_1! y_2! y_3! \cdots y_k!} p_1^{y_1} p_2^{y_2} p_3^{y_3} \cdots p_k^{y_k},

where y_i = 0, 1, 2, 3, ..., n and \sum_{i=1}^{k} p_i = 1.
Multinomial Distribution
Mean and Variance of a Multinomial Distribution
E[y_i] = np_i
Var(y_i) = np_i(1 - p_i)
Multinomial Distribution
Example
Items in a large lot under inspection are subject to two defects. It
is judged that 60% of the items are defect free whereas 30% have
a type X defect and 10% have type Y defect. If 10 of these items
are randomly selected from the lot, find the probability that 5 have
no defects, 2 have type X defect and 3 have type Y defect. Find
their expected values and variance.
Multinomial Distribution
Solution
Assume the outcomes are independent from the selection of one item to the next. Letting y_1, y_2, and y_3 be the numbers of defect-free items, type X defects and type Y defects, with probabilities 0.60, 0.30 and 0.10 respectively, then

P(y_1 = 5, y_2 = 2, y_3 = 3) = \frac{10!}{5! 2! 3!} (0.6)^5 (0.3)^2 (0.1)^3 = 0.0176
Multinomial Distribution
Solution
E(y_1) = np_1 = 10(0.6) = 6
Var(y_1) = np_1(1 - p_1) = 10(0.6)(0.4) = 2.4
E(y_2) = np_2 = 10(0.3) = 3
Var(y_2) = np_2(1 - p_2) = 10(0.3)(0.7) = 2.1
E(y_3) = np_3 = 10(0.1) = 1
Var(y_3) = np_3(1 - p_3) = 10(0.1)(0.9) = 0.9
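A hedged cross-check with scipy.stats.multinomial (assuming SciPy is available) reproduces the probability and the expected counts np_i:

from scipy.stats import multinomial

probs = [0.6, 0.3, 0.1]
print(multinomial.pmf([5, 2, 3], n=10, p=probs))   # ≈ 0.0176
print([10 * p for p in probs])                     # expected counts: 6.0, 3.0, 1.0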
DISCRETE DISTRIBUTIONS: Hypergeometric Distribution
A hypergeometric random variable is often found in many fields, with uses in acceptance sampling, electronic testing and quality assurance, where testing is done at the expense of the items being tested (that is, the items are destroyed and cannot be replaced in the sample).
Hypergeometric Distribution
Suppose that we have a set of N balls of which k are red and
(N − k) are blue. We choose n of these balls without replacement,
and define X to be the number of red balls in our sample. The
distribution of X is called the hypergeometric distribution.
NOTE CAREFULLY
The hypergeometric random variable arises in a situation quite similar to that of the binomial random variable. The main distinction between the two is that the trials of the hypergeometric are not independent (sampling without replacement), while those of the binomial are independent (sampling with replacement).
Hypergeometric Distribution
Definition
The probability distribution of the hypergeometric random variable X, the number of successes in a random sample of size n selected from N items of which k are successes and (N - k) are failures, is

p(x) = \frac{\binom{k}{x} \binom{N-k}{n-x}}{\binom{N}{n}},   x = 0, 1, 2, ..., min(k, n)
Hypergeometric Distribution
Mean and Variance of the Hypergeometric Distribution
If X has the hypergeometric distribution with parameters n, k and N, then

E[X] = np

Var(X) = np(1 - p) \frac{N - n}{N - 1},   where p = \frac{k}{N}
Hypergeometric Distribution
The Binomial Approximation to the Hypergeometric Distribution
Let X have the hypergeometric distribution with parameters n, k and N. If n is small relative to a large N, then

P(X = x) ≈ \binom{n}{x} p^x (1 - p)^{n-x},   where p = \frac{k}{N}

This approximation holds when n ≤ 0.05N.
Hypergeometric Distribution
Example
Suppose that a car dealer has 30 cars available for immediate sale, of which 10 are classified as compact cars. What is the probability that, of the next five purchases from these cars available for immediate sale,
a. one will be a compact car?
b. at least one will be a compact car?
c. all five will be compact cars?
What is the average number of compact cars the car dealer would expect to obtain in the next five purchases?
Hypergeometric Distribution
Solution
Given that N = 30, n = 5 and k = 10,

P(X = 1) = \frac{\binom{10}{1}\binom{20}{4}}{\binom{30}{5}} = 0.34

P(X ≥ 1) = 1 - P(X = 0) = 1 - \frac{\binom{10}{0}\binom{20}{5}}{\binom{30}{5}} = 0.8912

P(X = 5) = \frac{\binom{10}{5}\binom{20}{0}}{\binom{30}{5}} = 0.0018
Hypergeometric Distribution
Solution
p = \frac{k}{N} = \frac{10}{30} = \frac{1}{3}

E(X) = np = 5 \left(\frac{1}{3}\right) = \frac{5}{3} = 1.67

Var(X) = np(1 - p)\frac{N - n}{N - 1} = 5 \left(\frac{1}{3}\right)\left(1 - \frac{1}{3}\right)\left(\frac{30 - 5}{30 - 1}\right) = 5 \left(\frac{1}{3}\right)\left(\frac{2}{3}\right)\left(\frac{25}{29}\right) = 0.96
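The same example can be checked with scipy.stats.hypergeom, whose arguments are (population size, successes in population, sample size); this sketch is only a numerical confirmation of the hand calculations above.

from scipy.stats import hypergeom

N, k, n = 30, 10, 5                      # population, compacts, sample size
print(hypergeom.pmf(1, N, k, n))         # P(X = 1)  ≈ 0.340
print(hypergeom.sf(0, N, k, n))          # P(X >= 1) ≈ 0.8912
print(hypergeom.pmf(5, N, k, n))         # P(X = 5)  ≈ 0.0018
print(hypergeom.mean(N, k, n), hypergeom.var(N, k, n))  # ≈ 1.667 and ≈ 0.958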
DISCRETE DISTRIBUTIONS: Geometric Distribution
The Geometric distribution arises from a series of independent Bernoulli trials with constant probability p of success. A geometric random variable can be found in situations like the number of pregnancies required before the first boy-child is born, the number of oil wells that need to be drilled until the first successful oil well is hit, and the number of shots fired until the first target is hit.
Geometric Distribution
We can see from the examples given above that a geometric
random variable can be defined in two ways:
Number of trials until first success
(Geometric on {1, 2, . . . }).
Number of failures before first success
(Geometric on {0, 1, 2, . . . }).
Geometric Distribution
Definition 1
In a series of independent Bernoulli trials with constant probability p of success, let the random variable X denote the number of trials until the first success. Then X has the geometric distribution with parameter p and

P(X = x) = p(1 - p)^{x-1},   x = 1, 2, 3, ...
Geometric Distribution
Definition 2
In a series of independent Bernoulli trials with constant probability p of success, let the random variable X denote the number of failures before the first success. Then X has the geometric distribution with parameter p and

P(X = x) = p(1 - p)^x,   x = 0, 1, 2, 3, ...
Geometric Distribution
Mean, Variance, MGF & X’tic Func. of Geometric Distribution
(Number of trials until first success)
For x = 1, 2, 3, ...:

E(X) = \frac{1}{p}

Var(X) = \frac{1 - p}{p^2}

M_X(t) = \frac{pe^t}{1 - qe^t},   t < -ln q

φ_X(t) = \frac{pe^{it}}{1 - qe^{it}}
Geometric Distribution
Mean, Variance, MGF & X’tic Func. of Geometric Distribution
(Number of failures before first success)
For x = 0, 1, 2, 3, ...:

E(X) = \frac{1}{p} - 1 = \frac{1 - p}{p}

Var(X) = \frac{1 - p}{p^2}

M_X(t) = \frac{p}{1 - qe^t},   t < -ln q

φ_X(t) = \frac{p}{1 - qe^{it}}
Geometric Distribution
Theorem
Let X have the geometric distribution with probability mass function h(x; p) = p(1 - p)^{x-1}, x = 1, 2, .... Then

P(X > x) = q^x,   where q = 1 - p

Proof
P(X > x) = P(X = x + 1) + P(X = x + 2) + P(X = x + 3) + ...
= pq^x + pq^{x+1} + pq^{x+2} + pq^{x+3} + ...
= pq^x (1 + q + q^2 + ...) = pq^x · \frac{1}{1 - q} = \frac{pq^x}{p} = q^x
Theorem
Let Y have the geometric distribution with probability mass function h(y; p) = p(1 - p)^y, y = 0, 1, 2, .... Then

P(Y ≥ y) = q^y,   where q = 1 - p
Geometric Distribution
Lack of Memory Property
If X has the geometric distribution, then
P(X > s + t|X > s) = P(X > t), for all positive integers s and t.
Proof
P(X > s + t | X > s) = \frac{P(X > s + t ∩ X > s)}{P(X > s)} = \frac{P(X > s + t)}{P(X > s)} = \frac{q^{s+t}}{q^s}   (by the theorem above)
= q^t = P(X > t)
Remark
The lack of memory property means that the count of the number
of trials until the next success, can be started at any trial without
changing the probability distribution of the random variable. For
example, if a six occurred 2 times in 15 rolls of a die, the
probability that a 6 will occur during the next n rolls of the die
does not depend on the number of times it occurred in the 15 rolls
of the die.
Geometric Distribution
Example
A large consignment of items contains 10% that are defective. Items are drawn until a defective item is found. Find the probability that fewer than 4 draws are required.
Geometric Distribution
Solution
Let X be the number of items drawn until a defective item is
found. Then X has the geometric distribution with parameter
p =0.1. Thus
P(X < 4) = P(X = 1) + P(X = 2) + P(X = 3)
= p(1 - p)^{1-1} + p(1 - p)^{2-1} + p(1 - p)^{3-1}
= 0.1(0.9)^0 + 0.1(0.9)^1 + 0.1(0.9)^2
= 0.271

Alternatively, let Y be the number of non-defective items drawn until a defective item is found. Then P(Y = y) = 0.1(0.9)^y.
P(Y < 4) = P(Y ≤ 3) = P(Y = 0) + P(Y = 1) + P(Y = 2)
= p(1 - p)^0 + p(1 - p)^1 + p(1 - p)^2
= 0.1(0.9)^0 + 0.1(0.9)^1 + 0.1(0.9)^2
= 0.271
Geometric Distribution
Example
You throw a die repeatedly until you get a 6. What is the probability that you need to throw more than 20 times to get a 6?
Geometric Distribution
Solution
If you use the number of trials X as the geometric random variable, then with p = \frac{1}{6},

P(X ≥ n) = (1 - p)^{n-1}

P(X > 20) = P(X ≥ 21) = (1 - p)^{21-1} = \left(1 - \frac{1}{6}\right)^{20} = 2.6%
Geometric Distribution
Solution Continued
If you use the number of failures Y as the geometric random
variable, then you have:
P(Y ≥ k) = (1 - p)^k

Throwing a die at least 21 times to get a 6 is the same as having at least 20 failures before a success.

P(Y ≥ 20) = (1 - p)^{20} = \left(1 - \frac{1}{6}\right)^{20} = 2.6%
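Both forms of the answer can be confirmed numerically; scipy.stats.geom uses the number-of-trials convention, so its survival function gives P(X > 20) directly (an optional check, assuming SciPy is installed).

from scipy.stats import geom

p = 1 / 6
print(geom.sf(20, p))      # P(X > 20) with X = number of trials, ≈ 0.026
print((1 - p)**20)         # the same value computed directly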
DISCRETE DISTRIBUTIONS: Negative Binomial Distribution
The Negative Binomial distribution arises from a series of independent Bernoulli trials with constant probability p of success. A negative binomial random variable can be found in situations like the number of pregnancies required before the third boy-child is born, the number of oil wells that need to be drilled until the k-th successful oil well is hit, the number of shots fired before the tenth target is hit, and the number of applicants interviewed until the k-th suitable applicant is found.
Negative Binomial Distribution
The negative binomial random variable can be defined in two ways:
Number of trials until k th success
Number of failures before k th success
Negative Binomial Distribution
Definition 1
In a series of independent Bernoulli trials with constant probability p of success, let the random variable X denote the number of trials until k successes occur. Then X has the negative binomial distribution with parameters p and k = 1, 2, 3, ..., and

P(X = x) = \binom{x - 1}{k - 1} p^k (1 - p)^{x-k},   x = k, k + 1, k + 2, ...
Definition 2
In a series of independent Bernoulli trials with constant probability p of success, let the random variable X denote the number of failures before k successes occur. Then X has the negative binomial distribution with parameters p and k, and

P(X = x) = \binom{x + k - 1}{k - 1} p^k (1 - p)^x,   x = 0, 1, 2, ...
Negative Binomial Distribution
Mean, Variance, MGF & X'tic Func. of Negative Binomial (Number of trials until k-th success)

E(X) = \frac{k}{p}

Var(X) = \frac{k(1 - p)}{p^2}

M_X(t) = \left(\frac{pe^t}{1 - qe^t}\right)^k

φ_X(t) = \left(\frac{pe^{it}}{1 - qe^{it}}\right)^k
Negative Binomial Distribution
Mean, Variance, MGF & X'tic Func. of Negative Binomial (Number of failures before k-th success)

E(X) = \frac{k(1 - p)}{p}

Var(X) = \frac{k(1 - p)}{p^2}

M_X(t) = \left(\frac{p}{1 - qe^t}\right)^k

φ_X(t) = \left(\frac{p}{1 - qe^{it}}\right)^k
Negative Binomial Distribution
Example
A geological study indicates that an exploratory oil well drilled in a
certain part of a state strikes oil with probability of 0.30. Find the
probability that the fourth strike of oil comes on the eighth well
drilled. Calculate the mean and variance of the number of wells
that must be drilled if the company wants to set up five producing
wells.
Negative Binomial Distribution
Solution
The probability of the fourth strike of oil coming on the eighth well drilled, with p = 0.3, uses

P(X = x) = \binom{x - 1}{k - 1} p^k (1 - p)^{x-k}

P(X = 8) = \binom{8 - 1}{4 - 1} (0.3)^4 (0.7)^{8-4} = 0.068

For five producing wells (k = 5), the number of wells X that must be drilled has

E(X) = \frac{k}{p} = \frac{5}{0.3} = 16.67

Var(X) = \frac{k(1 - p)}{p^2} = \frac{5(0.7)}{0.3^2} = 38.89
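scipy.stats.nbinom uses the number-of-failures convention, so the trial count has to be shifted by k; the sketch below is an assumed cross-check, not part of the original notes.

from scipy.stats import nbinom

p = 0.3
print(nbinom.pmf(8 - 4, 4, p))     # P(4th strike on the 8th well) ≈ 0.068
k = 5
print(nbinom.mean(k, p) + k)       # mean number of wells drilled ≈ 16.67
print(nbinom.var(k, p))            # variance (same for trials and failures) ≈ 38.89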
Poisson Distribution
The Poisson distribution is used as a model for describing the number of times some random event occurs in an interval of time or space. Some examples are the number of claims processed by a certain insurance company in a given month, the number of road traffic accidents in an area during a given time interval, the number of errors a typist makes in typing a page of text, and the number of admissions to a clinic in a given time interval.
In all these examples, µ is the average number of times the event occurs in the given interval.
Poisson Distribution
Definition
The Poisson distribution for the random variable X, representing the number of occurrences of an event in a given interval of time, space or volume, is defined by

P(X = x) = \frac{µ^x e^{-µ}}{x!},   x = 0, 1, 2, ...  and  µ > 0
Poisson Distribution
Poisson Approximation to Binomial
When the number of trials n in a Binomial process is large, the computation of the binomial probabilities may be too tedious. The Poisson distribution can be used as an alternative to approximate the Binomial distribution. This is based on the convergence of the Binomial distribution to the Poisson as n becomes large (n → ∞). The approximation works when n is large and p is small.
Theorem
Let X be a binomial random variable with parameters n and p. If n approaches infinity and p approaches 0 in such a way that np remains constant at some value µ > 0, then

\lim_{n→∞} \binom{n}{x} p^x (1 - p)^{n-x} = \frac{µ^x e^{-µ}}{x!},   where p = \frac{µ}{n}
Poisson Distribution
Mean, Variance, MGF & X’tic Func. of Poisson Distribution
E(X) = Var(X) = µ

M_X(t) = e^{µ(e^t - 1)}

φ_X(t) = e^{µ(e^{it} - 1)}
Poisson Distribution
Example
The number of power surges in an electric grid has a Poisson distribution with a mean of one power surge every twelve hours.
a. What is the probability that there will be no more than one power surge in a 24-hour period?
b. What is the probability that there will be more than three power surges in a 24-hour period?
Poisson Distribution
Solution
In 12 hours, the mean number of power surges is 1. In 24 hours, the mean number is therefore 2.

P(X ≤ 1) = P(X = 0) + P(X = 1)
= \frac{2^0 e^{-2}}{0!} + \frac{2^1 e^{-2}}{1!}
= e^{-2} + 2e^{-2}
= 3e^{-2}
= 0.406
Solution
P(X > 3) = 1 - P(X ≤ 3)
= 1 - [P(X = 0) + P(X = 1) + P(X = 2) + P(X = 3)]
= 1 - \left[\frac{2^0 e^{-2}}{0!} + \frac{2^1 e^{-2}}{1!} + \frac{2^2 e^{-2}}{2!} + \frac{2^3 e^{-2}}{3!}\right]
= 1 - \left[e^{-2} + 2e^{-2} + 2e^{-2} + \frac{4e^{-2}}{3}\right]
= 1 - 0.857 = 0.143
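With scipy.stats.poisson the two answers follow from the cdf and survival function (a minimal sketch, assuming SciPy is available):

from scipy.stats import poisson

mu = 2                       # mean number of surges in 24 hours
print(poisson.cdf(1, mu))    # P(X <= 1) ≈ 0.406
print(poisson.sf(3, mu))     # P(X > 3)  ≈ 0.143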
Poisson Distribution
Example
Suppose that on average, 1 person in 1000 makes a numerical
error in preparing his or her income tax return. If 9000 forms are
selected at random and examined, find the probability that less
than 3 of the forms contain an error.
Poisson Distribution
Solution
If X of the 9000 forms contain an error, then X has the binomial distribution with parameters n = 9000 and p = 0.001.

P(X < 3) = \sum_{x=0}^{2} \binom{9000}{x} (0.001)^x (0.999)^{9000-x}

Since n = 9000 is large and p = 0.001 is small,

P(X = x) ≈ \frac{µ^x e^{-µ}}{x!},   x = 0, 1, 2, ...

where µ = 9000 × 0.001 = 9. Thus
P(X < 3) ≈ \sum_{x=0}^{2} \frac{9^x e^{-9}}{x!}
= \frac{9^0 e^{-9}}{0!} + \frac{9^1 e^{-9}}{1!} + \frac{9^2 e^{-9}}{2!}
= 0.0062
Useful Continuous Distributions
CONTINUOUS DISTRIBUTIONS: The Uniform Distribution
The uniform distribution provides a simple probability model to
describe a continuous random variable that can randomly assume
any value between two points a and b (a < b) on a line. It
therefore provides a good model for a continuous random variable
whose values are uniformly distributed over an interval. For
example, if buses arrive at a given bus stop over 20 minutes and
you arrive at the bus stop at a random time, the time you must
wait for the next bus to arrive could be described by the uniform
distribution over the interval from 0 to 20 or [0, 20].
Continuous Uniform Distribution
Graphical Representation
The probability density function is a horizontal line segment
between a and b at 1/(b − a).
Continuous Uniform Distribution
Definition
A random variable X has a continuous uniform distribution over the interval (a, b) if its p.d.f. is given by

f(x) = \frac{1}{b - a},  a ≤ x ≤ b;   f(x) = 0 elsewhere

Its c.d.f. is

F(x) = \frac{x - a}{b - a},   a ≤ x ≤ b
Continuous Uniform Distribution
Mean, Variance, MGF & X’tic Fxn of a Continuous Uniform Distr.
If X has a continuous uniform distribution over the interval (a, b), then

E(X) = \frac{b + a}{2}

Var(X) = \frac{(b - a)^2}{12}

M_X(t) = \frac{e^{tb} - e^{ta}}{(b - a)t},   t ≠ 0

φ_X(t) = \frac{e^{itb} - e^{ita}}{(b - a)it},   t ≠ 0
Continuous Uniform Distribution
Example
A bus arrives every 20 minutes at a bus stop. It is assumed that
the waiting time for a particular individual is a random variable
with continuous uniform distribution.
(a) Compute the mean and standard deviation of an individual’s
waiting time.
(b) Find the probability that an individual waits more than 9
minutes.
(c) Find the probability that an individual waits between 2 and 10
minutes.
Continuous Uniform Distribution
Solution
The probability density function for the waiting time X is

f(x) = \frac{1}{20},  0 ≤ x ≤ 20;   f(x) = 0 elsewhere

The mean and variance of X are

E(X) = \frac{b + a}{2} = \frac{0 + 20}{2} = 10

Var(X) = \frac{(b - a)^2}{12} = \frac{(20 - 0)^2}{12} = 33.333
Continuous Uniform Distribution
SD(X) = \sqrt{\frac{(20 - 0)^2}{12}} = 5.77

P(X > 9) = \int_9^{20} \frac{1}{20} dx = \left[\frac{x}{20}\right]_9^{20} = \frac{20}{20} - \frac{9}{20} = \frac{11}{20}

P(2 < X < 10) = \int_2^{10} \frac{1}{20} dx = \left[\frac{x}{20}\right]_2^{10} = \frac{10}{20} - \frac{2}{20} = \frac{8}{20}
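A minimal sketch with scipy.stats.uniform (parameterized by loc and scale, so scale = 20 here) confirms the waiting-time calculations; it is an optional check, not part of the original notes.

from scipy.stats import uniform

X = uniform(loc=0, scale=20)         # uniform on [0, 20]
print(X.mean(), X.std())             # 10.0 and ≈ 5.77
print(X.sf(9))                       # P(X > 9)      = 11/20
print(X.cdf(10) - X.cdf(2))          # P(2 < X < 10) = 8/20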
CONTINUOUS DISTRIBUTIONS: Exponential Distribution
The length of time until the occurrence of an event in a Poisson process results in a random variable with an Exponential distribution. Examples are the length of time between arrivals at a car wash, the length of time until a machine or a component of it fails, the length of time between successive filings of claims in an insurance office, and the waiting time for service in a line or queue.
The Exponential distribution models situations in which the
random variable represents waiting time or measurement of length
of time between successive occurrences of an event.
Exponential Distribution
Definition
X is an exponential random variable with mean θ if

F(x) = 1 - e^{-x/θ},   x > 0

Often λ = 1/θ is called the rate of X, so that

F(x) = 1 - e^{-x/θ} = 1 - e^{-λx}

f(x) = \frac{d}{dx} F(x) = \frac{1}{θ} e^{-x/θ} = λe^{-λx},   x > 0
Exponential Distribution
Mean, Variance, X’tic Function & MGF of Exponential Distribution
E[X] = θ = \frac{1}{λ}

Var(X) = θ^2 = \frac{1}{λ^2}

M_X(t) = \frac{1}{1 - θt} = \frac{λ}{λ - t},   t < \frac{1}{θ} = λ
Exponential Distribution
Graphical Representation
The probability density function is skewed to the right. The tail of the distribution is heavier for smaller values of λ (that is, for larger mean θ).
Exponential Distribution
Lack of Memory Property
If X has the exponential distribution, then for any positive
numbers x and a,
P(X > x + a|X > x) = P(X > a)
Proof
For any positive number x,

P(X > x) = λ \int_x^{∞} e^{-λt} dt = e^{-λx}.
P(X > x + a | X > x) = \frac{P(X > x + a, X > x)}{P(X > x)} = \frac{P(X > x + a)}{P(X > x)} = \frac{e^{-λ(x+a)}}{e^{-λx}} = e^{-λa} = P(X > a)
Exponential Distribution
Remark
This means that given that X > a, X − a has the same distribution
as the original variable X . For example, if the time between buses
is exponential with mean 15 minutes, the amount of time I need to
wait (X − a) is an exponential with mean 15 minutes no matter
how long it has been (a minutes) since the last bus.
Another example is that the remaining life of a device does not
depend on how long it has been used. The device is therefore as
good as new.
Exponential Distribution
Example
Suppose X has the exponential distribution with mean 10. Determine the following:
(a) P(X > 10)
(b) P(X > 30)
(c) the value of x such that P(X < x) = 0.95
Exponential Distribution
Solution
f(x) = \frac{1}{10} e^{-x/10},   x ≥ 0

P(X > 10) = \int_{10}^{∞} f(t) dt = \int_{10}^{∞} \frac{1}{10} e^{-t/10} dt = \left[-e^{-t/10}\right]_{10}^{∞} = e^{-1} = 0.3678

P(X > 30) = 1 - P(X ≤ 30) = 1 - F(30) = 1 - (1 - e^{-30/10}) = e^{-3} = 0.0498
Exponential Distribution
P(X < x) = 0.95
F(x) = 0.95
1 - e^{-x/10} = 0.95
e^{-x/10} = 0.05
x = -10 ln 0.05
x = 29.96
Exponential Distribution
Example
A random variable Y has the moment generating function given by M_Y(t) = 4(4 - t)^{-1}. Find P\left(Y < \frac{1}{4}\right).
Exponential Distribution
Solution
M_Y(t) = 4(4 - t)^{-1} = \frac{4}{4 - t} = \frac{1}{1 - \frac{1}{4}t} = \left(1 - \frac{1}{4}t\right)^{-1}

This is the moment generating function of the exponential distribution with mean \frac{1}{4}. The p.d.f. of Y is therefore given by

f(y) = 4e^{-4y},  y ≥ 0;   f(y) = 0 elsewhere

P\left(Y < \frac{1}{4}\right) = \int_0^{1/4} 4e^{-4y} dy = \left[-e^{-4y}\right]_0^{1/4} = 1 - e^{-1} = 0.6321
CONTINUOUS DISTRIBUTIONS: Normal Distribution
We now consider the most important distribution in statistics - the
normal distribution. The formula for this distribution was first
published by Abraham De Moivre in 1733. Many other
mathematicians figure prominently in the history of the normal
distribution, including Carl Friedrich Gauss.
Definition: The normal distribution
A random variable X has the normal distribution with mean µ and
variance σ^2 if its p.d.f. is given by

f(x) = \frac{1}{σ\sqrt{2π}} \exp\left\{-\frac{(x - µ)^2}{2σ^2}\right\},   -∞ < x < ∞

The notation X is N(µ, σ^2) means that the random variable X has the normal distribution with mean µ and variance σ^2.
Properties of the normal distribution
1. The p.d.f. is bell-shaped and is symmetrical about a vertical axis through the mean µ.
2. The mode, which is the point on the horizontal axis where the curve is a maximum, occurs at x = µ. The mean, the median and the mode are equal.
3. The normal curve approaches the horizontal axis asymptotically as we proceed in either direction away from the mean.
4. The total area between the normal p.d.f. and the horizontal axis is equal to 1.
Graphical Representation of the Normal Distribution
The figure below shows two normal curves with different means
and standard deviations.
The standard normal distribution
The distribution of a normal random variable with µ = 0 and σ = 1 is called the standard normal distribution. The standard normal random variable is denoted by Z.
Example
If Z is N(0, 1), find:
a. P(Z ≤ 1.5)
b. P(Z > 1.86)
c. P(−1.97 < Z < 1.32)
Solution
a. P(Z ≤ 1.5) is the area under the standard normal curve to the left of 1.5. Reading this probability from the standard normal table gives P(Z ≤ 1.5) = 0.9332.
Solution
b. P(Z > 1.86) = 1 − P(Z ≤ 1.86) = 1 − 0.9686 = 0.0314, using the standard normal table.
c. P(−1.97 < Z < 1.32) = P(Z < 1.32) − P(Z < −1.97)
P(Z < 1.32) = 0.9066, using the standard normal table.
But P(Z < −1.97) = 1 − P(Z < 1.97) = 0.0244
∴ P(−1.97 < Z < 1.32) = 0.9066 − 0.0244 = 0.8822
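These table lookups can be reproduced with Python's standard library. The sketch below is mine, not part of the slides; statistics.NormalDist models N(µ, σ) and its cdf method gives P(Z ≤ z).

from statistics import NormalDist

Z = NormalDist()  # standard normal: mu = 0, sigma = 1

print(Z.cdf(1.5))                   # (a) P(Z <= 1.5), approximately 0.9332
print(1 - Z.cdf(1.86))              # (b) P(Z > 1.86), approximately 0.0314
print(Z.cdf(1.32) - Z.cdf(-1.97))   # (c) P(-1.97 < Z < 1.32), approximately 0.8822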
Theorem
If X is N(µ, σ²), then Z = \frac{X - \mu}{\sigma} is N(0, 1).

From the theorem above, it can be seen that, if X is N(µ, σ²), then

P(X \le x) = P\left( \frac{X - \mu}{\sigma} \le \frac{x - \mu}{\sigma} \right) = P\left( Z \le \frac{x - \mu}{\sigma} \right),

where Z is N(0, 1).
Example
If X is N(2, 16), find:
(a) P(X < 3)
(b) P(1 < X < 4)
(c) P(X = 2)
Solution
(a) P(X < 3) = P\left( \frac{X - 2}{4} < \frac{3 - 2}{4} \right) = P(Z < 0.25) = 0.5987, where Z is N(0, 1).

(b) P(1 < X < 4) = P\left( \frac{1 - 2}{4} < \frac{X - 2}{4} < \frac{4 - 2}{4} \right) = P(-0.25 < Z < 0.5) = P(Z < 0.5) - P(Z < -0.25) = 0.6915 - 0.4013 = 0.2902

(c) P(X = 2) = 0, since X is a continuous random variable.
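The same standardisation can be done programmatically. The sketch below is my own, not the author's; it uses statistics.NormalDist with σ = 4, since the variance 16 corresponds to a standard deviation of 4.

from statistics import NormalDist

X = NormalDist(mu=2, sigma=4)   # N(2, 16): standard deviation is sqrt(16) = 4

print(X.cdf(3))                 # (a) P(X < 3), approximately 0.5987
print(X.cdf(4) - X.cdf(1))      # (b) P(1 < X < 4), approximately 0.2902
# (c) P(X = 2) = 0 for any continuous random variable.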
Example
If Y is N(140, 625), find the value of a such that:
P(Y < a) = 0.8849
Solution
P(Y < a) = P\left( \frac{Y - 140}{25} < \frac{a - 140}{25} \right) = P\left( Z < \frac{a - 140}{25} \right), where Z is N(0, 1).

Hence a must satisfy the equation P\left( Z < \frac{a - 140}{25} \right) = 0.8849.

\frac{a - 140}{25} = 1.20 (using the standard normal table)

∴ a = 140 + 25 × 1.20 = 170
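Inverting the normal c.d.f. is exactly what statistics.NormalDist.inv_cdf does. The sketch below is not from the slides; it reproduces a ≈ 170 both directly and via the table value z = 1.20.

from statistics import NormalDist

Y = NormalDist(mu=140, sigma=25)   # N(140, 625)

a = Y.inv_cdf(0.8849)              # value with P(Y < a) = 0.8849
print(a)                           # approximately 170

z = NormalDist().inv_cdf(0.8849)   # the z-value 1.20 read from the table
print(140 + 25 * z)                # same answer by standardising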
Applications of the normal distribution
The normal distribution arises in the study of numerous basic physical phenomena. An example of such a phenomenon is given next.
Example
The ages of students of a certain school are normally distributed
with a mean of 12 years and standard deviation of 4 years. What
percentage of the students are less than 10 years old?
Solution
Let X years denote the age of a student chosen at random from
the school. Then, X is N(12, 16).
P(X < 10) = P\left( \frac{X - 12}{4} < \frac{10 - 12}{4} \right) = P(Z < -0.5) = 0.3085.

Hence, 30.85% of the students are less than 10 years old.
The normal approximation to the Poisson distribution
The normal distribution can be used to approximate the Poisson
distribution with mean λ, when λ is large. The following theorem
gives the result.
Theorem
Let X be a Poisson random variable with mean λ. Then, for large values of λ,

Z = \frac{X - \lambda}{\sqrt{\lambda}}

is approximately N(0, 1).

Thus, if X has the Poisson distribution with mean λ, then for large values of λ,

P(X \le x) \approx P\left( Z \le \frac{x - \lambda}{\sqrt{\lambda}} \right),

where Z is N(0, 1). The approximation is good for λ > 5.
Continuity correction
The normal approximation to the Poisson distribution may be
improved by using the continuity correction, a device that makes
an adjustment for the fact that a discrete distribution is being
approximated by a continuous distribution. We do this by
representing each integer k by the interval from k − 0.5 to k + 0.5.
For instance, 3 is represented by the interval from 2.5 to 3.5, 10 is
represented by the interval from 9.5 to 10.5. It can be seen that a
good approximation of the event X = k is the event
k − 0.5 ≤ X ≤ k + 0.5, a good approximation of the event
a ≤ X ≤ b is the event a − 0.5 ≤ X ≤ b + 0.5, and a good
approximation of the event X ≤ b is the event X ≤ b + 0.5.
Example
A certain insurance company offers life, fire and automobile
coverage. The numbers of claims on any day on these types of policy are independent Poisson random variables with means equal to 30, 20, and 50, respectively. What is the probability that, on a
given day, the company will receive claims on more than 120
policies of all three types?
Solution
Let X1 , X2 , and X3 be the number of life, fire, and automobile
claims, respectively, and let Y = X1 + X2 + X3 . Then Y has the
Poisson distribution with mean λ = 30 + 20 + 50 = 100. We are
required to find P(Y > 120). Now,
P(Y > 120) = \sum_{x=121}^{\infty} \frac{100^{x} e^{-100}}{x!}

Since this sum is computationally awkward, we use the normal approximation to the Poisson distribution together with the continuity correction to obtain
P(Y > 120) = P(Y \ge 121) \approx P\left( \frac{Y - \lambda}{\sqrt{\lambda}} \ge \frac{121 - 0.5 - \lambda}{\sqrt{\lambda}} \right) = P\left( Z \ge \frac{121 - 0.5 - 100}{10} \right)

P(Y > 120) \approx P(Z \ge 2.05) = 1 - P(Z < 2.05) = 1 - 0.9798 = 0.0202, where Z is N(0, 1). The exact probability is 0.022669329. It can be seen that the normal approximation is very close.
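The comparison can be reproduced in a few lines. The sketch below is not part of the original slides; it sums the Poisson p.m.f. for the exact tail and applies the continuity-corrected normal approximation.

import math
from statistics import NormalDist

lam = 100.0

# Exact: P(Y > 120) = 1 - sum_{x=0}^{120} lam^x e^(-lam) / x!
exact = 1.0 - sum(math.exp(-lam) * lam**x / math.factorial(x) for x in range(121))

# Normal approximation with continuity correction: P(Y >= 121) ~ P(Z >= (120.5 - lam)/sqrt(lam))
z = (121 - 0.5 - lam) / math.sqrt(lam)
approx = 1.0 - NormalDist().cdf(z)

print(exact)    # approximately 0.0227
print(approx)   # approximately 0.0202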
Moments and other properties
If Z is N(0, 1), then the moment generating function of Z is given by M_Z(t) = e^{\frac{1}{2}t^2}, and the characteristic function is given by \varphi_Z(t) = e^{-\frac{1}{2}t^2}.

Theorem
If X is N(µ, σ²), then the moment generating function of X is given by M_X(t) = \exp\left( \mu t + \tfrac{1}{2}\sigma^2 t^2 \right), and the characteristic function of X is given by \varphi_X(t) = \exp\left( \mu i t - \tfrac{1}{2}\sigma^2 t^2 \right).
The distribution of a linear combination of independent normally
distributed random variables
Theorem
Let X_i be N(µ_i, σ_i²), i = 1, 2, . . . , n. If X_1, X_2, . . . , X_n are independent and c_1, c_2, . . . , c_n are constants, then

Y = \sum_{i=1}^{n} c_i X_i \ \text{ is } \ N\left( \sum_{i=1}^{n} c_i \mu_i, \; \sum_{i=1}^{n} c_i^2 \sigma_i^2 \right)
Example
X_1 is N(2, 4) and X_2 is N(1, 5). If X_1 and X_2 are independent and Y = 2X_1 − X_2, find P(Y < 5).
Solution
E(Y) = 2E(X_1) − E(X_2) = 2 × 2 − 1 = 3
V(Y) = 2²V(X_1) + (−1)²V(X_2) = 4 × 4 + 1 × 5 = 21.
Hence, Y is N(3, 21).

P(Y < 5) = P\left( \frac{Y - 3}{\sqrt{21}} < \frac{5 - 3}{\sqrt{21}} \right) = P\left( Z < \frac{2}{\sqrt{21}} \right), where Z is N(0, 1).

∴ P(Y < 5) = P(Z < 0.4364) = 0.67
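A quick Monte Carlo check of this result (my sketch, not the author's): simulate X_1 and X_2, form Y = 2X_1 − X_2, and compare the empirical P(Y < 5) with the value obtained from N(3, 21).

import math
import random
from statistics import NormalDist

random.seed(0)
N = 200_000
count = 0
for _ in range(N):
    x1 = random.gauss(2, 2)              # N(2, 4): sigma = 2
    x2 = random.gauss(1, math.sqrt(5))   # N(1, 5): sigma = sqrt(5)
    if 2 * x1 - x2 < 5:
        count += 1

print(count / N)                             # empirical P(Y < 5)
print(NormalDist(3, math.sqrt(21)).cdf(5))   # theoretical, approximately 0.67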
Theorem
Let X_1, X_2, . . . , X_n be a sample of size n from a population which is N(µ, σ²) and let \bar{X} = \frac{1}{n}\sum_{i=1}^{n} X_i. Then \bar{X} is N\left( \mu, \frac{\sigma^2}{n} \right).
Example
An electrical firm manufactures light bulbs that have a length of
life that is approximately normally distributed, with mean 800
hours and standard deviation 40 hours. Find the probability that a random sample of 16 bulbs will have an average life of less than 775 hours.
Solution
Let \bar{X} denote the mean life of a random sample of 16 bulbs. We are required to find P(\bar{X} < 775).

Now, \frac{\bar{X} - 800}{40/\sqrt{16}} is N(0, 1). Therefore,

P(\bar{X} < 775) = P\left( \frac{\bar{X} - 800}{40/4} < \frac{775 - 800}{40/4} \right) = P(Z < -25/10) = P(Z < -2.5) = 0.0062,

where Z is N(0, 1).
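The sketch below (not in the slides) computes the same probability directly from the sampling distribution of the mean, N(800, 40²/16).

import math
from statistics import NormalDist

mu, sigma, n = 800, 40, 16
xbar_dist = NormalDist(mu, sigma / math.sqrt(n))   # X-bar is N(800, 40^2/16) = N(800, 100)

print(xbar_dist.cdf(775))   # approximately 0.0062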
Distributions of Functions of Random Variables
A problem often encountered in statistics is the following.
We have a random variable X and we know its distribution. We
are interested, though, in a random variable Y = g (X ), where
g (X ) is a real-valued function of X . In particular, we want to
determine the distribution of Y .
Example
Let X be a discrete random variable with probability mass function
f(x) = 1/6, x = 1, 2, . . . , 6,
and let Y = (X − 3)². Find the probability mass function of Y.

Solution
The following table gives values of X and the corresponding values of Y.

x : 1  2  3  4  5  6
y : 4  1  0  1  4  9
Solution
The random variable Y takes the values 0, 1, 4 and 9.
P(Y = 0) = P(X = 3) = 1/6
P(Y = 1) = P(X = 2) + P(X = 4) = 1/6 + 1/6 = 2/6
P(Y = 4) = P(X = 1) + P(X = 5) = 1/6 + 1/6 = 2/6
P(Y = 9) = P(X = 6) = 1/6
The possible values of Y and the corresponding probabilities are given in the following table.

y        : 0    1    4    9
P(Y = y) : 1/6  2/6  2/6  1/6
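For a discrete transformation like this, the p.m.f. of Y can be built mechanically by accumulating the probabilities of all x-values that map to the same y. A minimal sketch, not from the slides:

from collections import defaultdict
from fractions import Fraction

pmf_x = {x: Fraction(1, 6) for x in range(1, 7)}   # f(x) = 1/6, x = 1, ..., 6

pmf_y = defaultdict(Fraction)
for x, p in pmf_x.items():
    pmf_y[(x - 3) ** 2] += p                       # Y = (X - 3)^2

# Probabilities 1/6, 1/3 (= 2/6), 1/3 (= 2/6), 1/6 for y = 0, 1, 4, 9
print(dict(sorted(pmf_y.items())))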
Example
Let X be a continuous random variable with p.d.f.

f(x) = \begin{cases} 2x, & 0 < x < 1, \\ 0, & \text{elsewhere} \end{cases}

(a) Find the distribution function of X.
(b) Find the distribution function and the p.d.f. of Y = 8X³.
Solution
a. The distribution function of X is given by

F(x) = \int_{-\infty}^{x} f(t)\,dt

If x < 0, then F(x) = \int_{-\infty}^{x} 0\,dt = 0.
If 0 \le x \le 1, then F(x) = \int_{-\infty}^{0} 0\,dt + \int_{0}^{x} 2t\,dt = x^2.
If x > 1, then F(x) = \int_{-\infty}^{0} 0\,dt + \int_{0}^{1} 2t\,dt + \int_{1}^{x} 0\,dt = 1.

Thus,

F(x) = \begin{cases} 0, & x < 0, \\ x^2, & 0 \le x \le 1, \\ 1, & x > 1. \end{cases}
Solution
b. The distribution function of Y = 8X³ is given by

F_Y(y) = P(Y \le y) = P(8X^3 \le y) = P\left( X \le \tfrac{1}{2} y^{1/3} \right) = F_X\left( \tfrac{1}{2} y^{1/3} \right)

= \begin{cases} 0, & \tfrac{1}{2} y^{1/3} < 0, \\ \tfrac{1}{4} y^{2/3}, & 0 \le \tfrac{1}{2} y^{1/3} \le 1, \\ 1, & \tfrac{1}{2} y^{1/3} > 1. \end{cases}
Equivalently,

F_Y(y) = \begin{cases} 0, & y < 0, \\ \tfrac{1}{4} y^{2/3}, & 0 \le y \le 8, \\ 1, & y > 8. \end{cases}

The p.d.f. of Y is given by

f(y) = \frac{d}{dy} F_Y(y) = \begin{cases} \tfrac{1}{6} y^{-1/3}, & 0 \le y \le 8, \\ 0, & \text{elsewhere.} \end{cases}
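The derived distribution can be checked by simulation. In the sketch below (my own, not from the slides), X with density 2x on (0, 1) is generated by the inverse transform X = √U with U uniform, and the empirical c.d.f. of Y = 8X³ is compared with F_Y(y) = ¼ y^{2/3}.

import random

random.seed(42)
N = 200_000
ys = []
for _ in range(N):
    u = random.random()
    x = u ** 0.5            # inverse transform: F_X(x) = x^2, so X = sqrt(U)
    ys.append(8 * x ** 3)   # Y = 8 X^3

for y in (1.0, 4.0, 8.0):
    empirical = sum(v <= y for v in ys) / N
    theoretical = 0.25 * y ** (2 / 3)
    print(y, round(empirical, 3), round(theoretical, 3))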
Theorem
Let X be a continuous random variable with p.d.f. f_X(x) and let Y = g(X), where y = g(x) is a one-to-one differentiable function with inverse x = g^{-1}(y). Then, the p.d.f. of Y is given by

f_Y(y) = f_X\left( g^{-1}(y) \right) \left| \frac{d}{dy} g^{-1}(y) \right|
Example
Let X be a continuous random variable with p.d.f.

f(x) = \begin{cases} 2x, & 0 < x < 1, \\ 0, & \text{elsewhere} \end{cases}

Find the p.d.f. of Y = 8X³.
Solution
We first express X in terms of Y and then determine the range of Y.

y = 8x³ → x = \tfrac{1}{2} y^{1/3}, and 0 ≤ x ≤ 1 → 0 ≤ y ≤ 8.

Moreover, y = 8x³ is a monotone (increasing) function of x. Hence, by the theorem above, the p.d.f. of Y is given by

f_Y(y) = f_X\left( \tfrac{1}{2} y^{1/3} \right) \left| \frac{d}{dy}\, \tfrac{1}{2} y^{1/3} \right| = 2\left( \tfrac{1}{2} y^{1/3} \right) \times \tfrac{1}{3} \times \tfrac{1}{2} y^{-2/3} = \tfrac{1}{6} y^{-1/3}, \quad 0 \le y \le 8.
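The change-of-variable formula can also be carried out symbolically. The sketch below is my own and assumes the third-party sympy package is available; it reproduces f_Y(y) = (1/6) y^{-1/3}.

from sympy import symbols, diff, simplify, Rational

y = symbols('y', positive=True)

g_inv = y**Rational(1, 3) / 2      # x = g^{-1}(y) = (1/2) y^(1/3)
f_X = lambda t: 2 * t              # p.d.f. of X on (0, 1)

# f_X(g^{-1}(y)) * d/dy g^{-1}(y); the derivative is positive here, so |.| can be dropped
f_Y = simplify(f_X(g_inv) * diff(g_inv, y))
print(f_Y)                         # (1/6) y^(-1/3), valid for 0 <= y <= 8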
Thank you and all the best!