Math/Stat 370: Engineering Statistics,
Washington State University
Haijun Li
lih@math.wsu.edu
Department of Mathematics
Washington State University
Week 4
Haijun Li
Math/Stat 370: Engineering Statistics, Washington State University
Week 4
1 / 19
Outline
1. Section 3-7: Discrete Random Variables
2. Section 3-8: Binomial Distribution
3. Section 3-9: Poisson Distribution
4. Section 3-10: Normal Approximation
Example
Toss a fair coin three times. The sample space =
{HHH, HHT, HTH, THH, TTH, THT, HTT, TTT}.
Let N denote the number of heads in three tosses.
P(N = 0) = 1/8, P(N = 1) = 3/8,
P(N = 2) = 3/8, P(N = 3) = 1/8.

Table: Probability Masses

N = x      0    1    2    3
P(N = x)  1/8  3/8  3/8  1/8
Discrete Random Variables
The distribution of a discrete random variable X is
described by the probability mass function (PMF)
f(xi) = P(X = xi), for all the possible values xi of X.
The CDF F(x) = P(X ≤ x) = Σ_{xi ≤ x} f(xi).
µ = E(X) = Σ xi f(xi), and
σ² = V(X) = Σ (xi − µ)² f(xi) = Σ xi² f(xi) − µ².
Example: Toss a fair coin three times. Let N be the number
of heads in three tosses. Find E(N) and V(N).
Solution:
E(N) = 0 × 1/8 + 1 × 3/8 + 2 × 3/8 + 3 × 1/8 = 12/8 = 3/2, and
V(N) = 0² × 1/8 + 1² × 3/8 + 2² × 3/8 + 3² × 1/8 − (3/2)² = 3 − 9/4 = 3/4.
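The mean and variance formulas above are easy to verify numerically. The following sketch (Python, not part of the original slides) computes E(N) and V(N) directly from the coin-tossing PMF table:

```python
# PMF of N, the number of heads in three tosses of a fair coin.
pmf = {0: 1/8, 1: 3/8, 2: 3/8, 3: 1/8}

# E(N) = sum of x * f(x); V(N) = sum of x^2 * f(x) minus E(N)^2.
mean = sum(x * p for x, p in pmf.items())
var = sum(x**2 * p for x, p in pmf.items()) - mean**2

print(mean, var)  # 1.5 0.75
```

This reproduces E(N) = 3/2 and V(N) = 3/4 from the solution above.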
Binomial Experiment
Bernoulli trial: A trial with only two possible outcomes
(labeled “0” and “1”).
Bernoulli random variable X:
P(X = 1) = p, P(X = 0) = 1 − p.
E(X) = 0 × (1 − p) + 1 × p = p,
V(X) = 0² × (1 − p) + 1² × p − p² = p(1 − p).
Binomial experiment:
1. The trials are independent.
2. Each trial results in one of the two outcomes, labeled
“success or 1” and “failure or 0”.
3. The probability p of “success” on each trial remains
constant.
Binomial Distribution
Binomial random variable Y: the number of successes
during the n trials.
Binomial coefficient: C(n, y) = n! / (y!(n − y)!) = number of
ways to select y numbers from 1, . . . , n.
P(Y = y) = C(n, y) × P(sequence of y ones and n − y zeros).
The PMF of Y:
f(y) = C(n, y) p^y (1 − p)^{n−y},  y = 0, 1, . . . , n.

Table: The PMF of N in the Coin Tossing Example

N = x      0    1    2    3
P(N = x)  1/8  3/8  3/8  1/8
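As a quick check (a Python sketch, not part of the original slides), the binomial PMF with n = 3 and p = 1/2 reproduces the coin-tossing table above:

```python
from math import comb

def binom_pmf(y, n, p):
    """f(y) = C(n, y) * p^y * (1 - p)^(n - y)."""
    return comb(n, y) * p**y * (1 - p)**(n - y)

# Three tosses of a fair coin: Y ~ Binomial(n = 3, p = 1/2).
print([binom_pmf(y, 3, 0.5) for y in range(4)])  # [0.125, 0.375, 0.375, 0.125]
```

The four values match 1/8, 3/8, 3/8, 1/8 from the table.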
Binomial Mean and Variance
E(Y) = np, V(Y) = np(1 − p).
Proof: Write Y = Σ_{i=1}^n Yi, where Yi = the outcome (0 or 1)
of the i-th Bernoulli trial. We have
E(Y) = Σ_{i=1}^n E(Yi) = nE(Y1) = np.
Since the trials are independent,
V(Y) = Σ_{i=1}^n V(Yi) = np(1 − p).
Example: Process Control
Samples of 20 parts from a metal punching process are
selected every hour. Typically, 1% of the parts require rework.
Let X denote the number of parts in the sample that require
rework. A process problem is suggested if X exceeds its mean
by more than three standard deviations.
Note that n = 20, p = 0.01, and thus E(X) = 0.2 and
V(X) = σ² = 0.198.
What is the probability that X exceeds its mean by more
than three standard deviations?
Solution: Since σ ≈ 0.45,
P(X > 0.2 + 3σ) = P(X > 1.55) = P(X ≥ 2) = 1 − P(X = 0)
− P(X = 1) = 1 − (0.99)^20 − 20(0.99)^19(0.01) = 0.017.
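The tail probability above can be computed directly (a Python sketch, not part of the original slides):

```python
from math import comb, sqrt

n, p = 20, 0.01
mu = n * p
sigma = sqrt(n * p * (1 - p))  # sqrt(0.198), about 0.45

def binom_pmf(y, n, p):
    return comb(n, y) * p**y * (1 - p)**(n - y)

# X exceeds mu + 3*sigma (about 1.55) exactly when X >= 2.
prob = 1 - binom_pmf(0, n, p) - binom_pmf(1, n, p)
print(round(prob, 3))  # 0.017
```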
Example: Process Control (cont’d)
If the rework percentage increases to 4%, what is the
probability that X exceeds 1?
Solution: Note that
σ = √(np(1 − p)) = √(20(0.04)(0.96)) = 0.88.
P(X > 1) = P(X ≥ 2) = 1 − P(X = 0) − P(X = 1) =
1 − (0.96)^20 − 20(0.96)^19(0.04) = 0.19.
If the rework percentage increases to 4%, what is the
probability that X exceeds 1 in at least one of the next 5
hours of samples?
Solution: Since P(X > 1 in one hour) = 0.19,
P(X ≤ 1 in one hour) = 0.81. Thus
P(X > 1 in at least one of five hours)
= 1 − P(X ≤ 1 in all five hours)
= 1 − [P(X ≤ 1 in one hour)]^5 = 1 − 0.81^5 = 0.65.
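Both steps of this calculation can be checked in a few lines (a Python sketch, not part of the original slides):

```python
from math import comb

n, p = 20, 0.04
# One hour: P(X > 1) = 1 - P(X = 0) - P(X = 1).
p_exceed = 1 - comb(n, 0) * (1 - p)**n - comb(n, 1) * p * (1 - p)**(n - 1)
# Five independent hours: complement of "X <= 1 in every hour".
p_five = 1 - (1 - p_exceed)**5
print(round(p_exceed, 2), round(p_five, 2))  # 0.19 0.65
```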
Poisson distribution
A discrete random variable X with PMF
f(x) = e^{−λ} λ^x / x!,  x = 0, 1, 2, . . . ,
is said to have a Poisson distribution with parameter λ.
E(X) = λ (≈ np), V(X) = λ.
Law of Rare Events
Suppose n is sufficiently large, and p is sufficiently small,
and λ = np is fixed.
lim_{n→∞} C(n, x) p^x (1 − p)^{n−x} = e^{−λ} λ^x / x!, for any x = 0, 1, 2, . . . .
Since p is small, the event of “success” is considered a
rare event (extreme event). Examples include defects,
failures, ...
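The limit can be watched numerically: holding λ = np fixed and letting n grow, the binomial probability approaches the Poisson value (a Python sketch, not part of the original slides):

```python
from math import comb, exp, factorial

lam, x = 0.5, 1
poisson = exp(-lam) * lam**x / factorial(x)        # Poisson limit P(X = 1)
for n in (10, 100, 10000):
    p = lam / n                                    # keep lam = n * p fixed
    binom = comb(n, x) * p**x * (1 - p)**(n - x)   # binomial P(X = 1)
    print(n, round(binom, 4), "->", round(poisson, 4))
```

As n increases from 10 to 10000, the binomial value converges to the Poisson value.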
Limit of Binomial Distributions
Figure: Green = Poisson with λ = 0.5, red = binomial with
n = 10, p = 0.05, blue = binomial with n = 20, p = 0.025.
Poisson Counting Process of Random Events
The time interval [0, t] can be partitioned into n subintervals
of small length.
The probability of more than one event in a subinterval is
negligible (zero in the limit).
The probability p of one event in a subinterval is the same
for all subintervals.
The probability of one event in a subinterval is proportional
to the length of the subinterval.
The events in different subintervals are independent.
The total count Xt of events by time t is called a Poisson
process:
P(Xt = x) = e^{−λt} (λt)^x / x!,  x = 0, 1, 2, . . . ,
where E(Xt) = λt ≈ np.
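The construction above can be simulated directly: one Bernoulli trial per subinterval with success probability p = λt/n, so the average count over many runs should be close to E(Xt) = λt (a Python sketch, not part of the original slides):

```python
import random
random.seed(1)

lam, t = 2.0, 3.0            # rate per unit time, time horizon
n = 1000                     # number of small subintervals of [0, t]
p = lam * t / n              # one-event probability per subinterval

def count_events():
    # One run of the construction: a Bernoulli trial in each subinterval.
    return sum(random.random() < p for _ in range(n))

trials = 2000
avg = sum(count_events() for _ in range(trials)) / trials
print(avg)                   # close to E(X_t) = lam * t = 6
```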
Exponential Distribution and Poisson Process
The total count Xt of events by time t:
P(Xt = x) = e^{−λt} (λt)^x / x!,  x = 0, 1, 2, . . . ,
where E(Xt) = λt ≈ np.
Let T denote the (time) length from the starting point to the
first event. For any t ≥ 0,
P(T > t) = P(Xt = 0) = e^{−λt} (λt)^0 / 0! = e^{−λt}.
That is, T has the exponential distribution with rate λ.
In fact, the (random) time length between any two consecutive
events has the exponential distribution with rate λ.
Example
The time T between the arrivals of electronic messages at your
computer is exponentially distributed with a mean of two hours.
1. What is the probability that you do not receive a message
during a two-hour period?
Solution: Since λ = 1/2, P(T > 2) = e^{−1} ≈ 0.368.
2. If you have not had a message in the last four hours, what is
the probability that you do not receive a message in the
next two hours?
Solution: By memorylessness, still P(T > 2) = e^{−1}.
3. What is the expected time between your fifth and sixth
messages?
Solution: Mean = 2 hours.
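The memorylessness used in part 2 is just P(T > 6 | T > 4) = P(T > 6)/P(T > 4) = P(T > 2), which is easy to verify (a Python sketch, not part of the original slides):

```python
from math import exp

lam = 1 / 2                       # rate: mean inter-arrival time is 2 hours
p_no_msg_2h = exp(-lam * 2)       # P(T > 2) = e^{-1}
# Memorylessness: P(T > 6 | T > 4) = P(T > 6) / P(T > 4).
p_cond = exp(-lam * 6) / exp(-lam * 4)
print(round(p_no_msg_2h, 3), round(p_cond, 3))  # 0.368 0.368
```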
Example
For the case of the thin copper wire, suppose that the number X
of flaws follows a Poisson distribution with a mean of 2.3 per
millimeter.
1. Find the probability of exactly 2 flaws in 1 millimeter of wire.
Solution: Since λ = 2.3, P(X = 2) = e^{−2.3}(2.3)² / 2! = 0.265.
2. Find the probability of exactly 10 flaws in 5 millimeters of wire.
Solution: Since λ = 2.3 × 5 = 11.5 for the 5 millimeter wire,
P(X = 10) = e^{−11.5}(11.5)^10 / 10! = 0.113.
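Both answers follow from the Poisson PMF, scaling λ by the length of wire (a Python sketch, not part of the original slides):

```python
from math import exp, factorial

def poisson_pmf(x, lam):
    """P(X = x) = e^{-lam} * lam^x / x!."""
    return exp(-lam) * lam**x / factorial(x)

print(round(poisson_pmf(2, 2.3), 3))       # 0.265  (1 mm, lam = 2.3)
print(round(poisson_pmf(10, 5 * 2.3), 3))  # 0.113  (5 mm, lam = 11.5)
```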
Normal Approximation
A Fundamental Scheme of Normal Approximation
Let X denote the sum of n independent and identically
distributed random variables with finite variance. Then
(X − E(X)) / √V(X)
has approximately the standard normal distribution as n → ∞.
If X has a binomial distribution with parameters n and p, then
Z = (X − np) / √(np(1 − p))
has approximately the standard normal distribution as n → ∞.
If X has a Poisson distribution with parameter λ, then
Z = (X − λ) / √λ
has approximately the standard normal distribution as λ → ∞.
Continuity Correction Factor
P(2 ≤ X < 4) = P(1.5 ≤ X ≤ 3.5).
P(2 < X ≤ 4) = P(2.5 ≤ X ≤ 4.5).

Figure: making adjustments at the ends of the interval
Example
The manufacturing of semiconductor chips produces 2%
defective chips. Assume that chips are independent and a lot
contains 1000 chips. Approximate the probability that between
20 and 30 chips are defective.
Solution: Let X denote the number of defectives. Since
n = 1000 and p = 0.02, then E(X) = np = 20 and
V(X) = np(1 − p) = 19.6.
1. (Without Continuity Correction)
P(20 < X ≤ 30) ≈ P((20 − 20)/√19.6 ≤ (X − np)/√(np(1 − p)) ≤ (30 − 20)/√19.6)
= Φ(2.26) − Φ(0) = 0.988 − 0.5 = 0.488.
2. (With Continuity Correction)
P(20 < X ≤ 30) ≈ P((20.5 − 20)/√19.6 ≤ (X − np)/√(np(1 − p)) ≤ (30.5 − 20)/√19.6)
= Φ(2.37) − Φ(0.1129) = 0.9911 − 0.5438 = 0.4473.
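The two approximations can be compared against the exact binomial sum; the continuity-corrected answer lands closer to the exact value (a Python sketch, not part of the original slides; `Phi` is a helper for the standard normal CDF):

```python
from math import comb, erf, sqrt

n, p = 1000, 0.02
mu, sigma = n * p, sqrt(n * p * (1 - p))   # 20 and sqrt(19.6)

def Phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + erf(z / sqrt(2)))

# Exact P(20 < X <= 30): sum the binomial PMF over x = 21, ..., 30.
exact = sum(comb(n, x) * p**x * (1 - p)**(n - x) for x in range(21, 31))
no_cc = Phi((30 - mu) / sigma) - Phi((20 - mu) / sigma)
with_cc = Phi((30.5 - mu) / sigma) - Phi((20.5 - mu) / sigma)
print(round(exact, 4), round(no_cc, 4), round(with_cc, 4))
```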