MATH 425
HOMEWORK 8
Winter, 2009
8a. We are given the joint probability density function of X and Y:

    f(x, y) = c(y^2 - x^2) e^{-y}  if 0 < y < \infty, -y \le x \le y, and f(x, y) = 0 otherwise.

Since f is a joint density function, choose c to satisfy

    1 = \int_0^\infty \int_{-y}^{y} c(y^2 - x^2) e^{-y} \, dx \, dy
      = c \int_0^\infty e^{-y} \int_{-y}^{y} (y^2 - x^2) \, dx \, dy
      = c \int_0^\infty \frac{4}{3} y^3 e^{-y} \, dy
      = 8c \int_0^\infty \frac{e^{-y} y^3}{3!} \, dy
      = 8c.

So, c = 1/8. Note that the function inside the last integral,

    \frac{e^{-y} y^3}{3!},

is the density of a gamma distribution with parameters (k = 4, λ = 1), so the integral is 1.
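The normalization above can be double-checked numerically. The sketch below (my addition, not part of the assigned solution) applies composite Simpson's rule to the outer integral, using the closed-form inner integral (4/3) y^3 e^{-y}; the `simpson` helper and the cutoff 60 are my own choices, the cutoff picked so the truncated tail is negligible.

```python
import math

def simpson(f, a, b, n=10_000):
    """Composite Simpson's rule with n (even) subintervals on [a, b]."""
    h = (b - a) / n
    s = f(a) + f(b)
    s += 4 * sum(f(a + (2 * i - 1) * h) for i in range(1, n // 2 + 1))
    s += 2 * sum(f(a + 2 * i * h) for i in range(1, n // 2))
    return s * h / 3

# Outer integrand after integrating out x: (4/3) y^3 e^{-y}.
total = simpson(lambda y: (4 / 3) * y**3 * math.exp(-y), 0.0, 60.0)
print(total)  # close to 8, consistent with 1 = 8c, i.e. c = 1/8
```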
8b. The marginal density of X is

    f_X(a) = \int_{-\infty}^{\infty} f(a, y) \, dy
           = \int_{|a|}^{\infty} \frac{1}{8} (y^2 - a^2) e^{-y} \, dy
           = \frac{1}{4} (|a| + 1) e^{-|a|}.

The second equality holds because f(a, y) = 0 when y < 0 or when a \notin (-y, y), so the integrand is nonzero only for y > |a|.
The marginal density of Y is

    f_Y(b) = \int_{-\infty}^{\infty} f(x, b) \, dx
           = \int_{-b}^{b} \frac{1}{8} (b^2 - x^2) e^{-b} \, dx
           = \frac{1}{6} b^3 e^{-b}

when 0 < b < \infty, and f_Y(b) = 0 otherwise.
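As a quick numerical check (my addition), both marginals should integrate to 1. The `simpson` helper and the truncation point 60 are my own choices; f_X is even, so its total mass is twice the integral over the positive axis.

```python
import math

def simpson(f, a, b, n=10_000):
    """Composite Simpson's rule with n (even) subintervals on [a, b]."""
    h = (b - a) / n
    s = f(a) + f(b)
    s += 4 * sum(f(a + (2 * i - 1) * h) for i in range(1, n // 2 + 1))
    s += 2 * sum(f(a + 2 * i * h) for i in range(1, n // 2))
    return s * h / 3

# f_X(a) = (1/4)(|a| + 1)e^{-|a|} is even, so integrate twice over [0, 60].
mass_x = 2 * simpson(lambda a: 0.25 * (a + 1) * math.exp(-a), 0.0, 60.0)
# f_Y(b) = (1/6) b^3 e^{-b} on (0, infinity).
mass_y = simpson(lambda b: b**3 * math.exp(-b) / 6, 0.0, 60.0)
print(mass_x, mass_y)  # both close to 1
```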
8c. The expectation of X can be computed from the marginal density found in (8b):

    E[X] = \int_{-\infty}^{\infty} x \cdot \frac{1}{4} (|x| + 1) e^{-|x|} \, dx
         = \int_{-\infty}^{0} \frac{1}{4} x (|x| + 1) e^{-|x|} \, dx + \int_0^{\infty} \frac{1}{4} x (|x| + 1) e^{-|x|} \, dx
         = -\int_0^{\infty} \frac{1}{4} x (x + 1) e^{-x} \, dx + \int_0^{\infty} \frac{1}{4} x (x + 1) e^{-x} \, dx
         = 0.

The third equality follows by substituting u = -x in the first integral.
9a. The joint probability density function of X and Y is given by

    f(x, y) = \frac{6}{7} \left( x^2 + \frac{xy}{2} \right)  if 0 < x < 1, 0 < y < 2, and f(x, y) = 0 otherwise.

This is indeed a density function: (a) f(x, y) \ge 0 for all x and y, and (b)

    \int_0^2 \int_0^1 \frac{6}{7} \left( x^2 + \frac{xy}{2} \right) dx \, dy
    = \int_0^2 \left( \frac{2}{7} + \frac{3y}{14} \right) dy = 1.
9b. The density function for X is

    f_X(a) = \int_0^2 \frac{6}{7} \left( a^2 + \frac{ay}{2} \right) dy
           = \frac{12}{7} a^2 + \frac{6}{7} a = \frac{6}{7} (2a^2 + a)

when 0 < a < 1, and f_X(a) = 0 otherwise.
9c. Compute:

    P{X > Y} = \int_0^1 \int_0^x \frac{6}{7} \left( x^2 + \frac{xy}{2} \right) dy \, dx
             = \int_0^1 \left( \frac{6}{7} x^3 + \frac{3}{14} x^3 \right) dx
             = \frac{15}{56}.

Alternatively, we can compute the probability by reversing the order of integration. To do this, note that X can be at most 1, so we need only integrate y up to 1:

    P{X > Y} = \int_0^1 \int_y^1 \frac{6}{7} \left( x^2 + \frac{xy}{2} \right) dx \, dy
             = \int_0^1 \left( \frac{2}{7} (1 - y^3) + \frac{3}{14} (y - y^3) \right) dy
             = \frac{15}{56}.
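Every integrand in this problem is a polynomial, so the hand arithmetic can be replayed exactly with rational numbers. A small sketch (my addition; the `poly_integral` helper is hypothetical, not from the text or any library):

```python
from fractions import Fraction as F

def poly_integral(coeffs, a, b):
    """Exactly integrate sum(coeffs[k] * x**k for k) over [a, b] with rationals."""
    return sum(c * (b**(k + 1) - a**(k + 1)) / (k + 1) for k, c in enumerate(coeffs))

# First order of integration: the inner y-integral leaves (6/7 + 3/14) x^3.
p_first = poly_integral([0, 0, 0, F(6, 7) + F(3, 14)], F(0), F(1))

# Reversed order: (2/7)(1 - y^3) + (3/14)(y - y^3)
#   = 2/7 + (3/14) y - (2/7 + 3/14) y^3.
p_second = poly_integral([F(2, 7), F(3, 14), 0, -(F(6, 7) + F(3, 14)) / 2 - F(0)], F(0), F(1)) if False else \
           poly_integral([F(2, 7), F(3, 14), 0, -(F(2, 7) + F(3, 14))], F(0), F(1))

print(p_first, p_second)  # 15/56 both times
```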
9d. By the definition of conditional probability,

    P(Y > 1/2 | X < 1/2) = \frac{P\{X < 1/2, Y > 1/2\}}{P\{X < 1/2\}}.

Compute each probability:

    P{X < 1/2, Y > 1/2} = \int_{1/2}^{2} \int_0^{1/2} \frac{6}{7} \left( x^2 + \frac{xy}{2} \right) dx \, dy
                        = \int_{1/2}^{2} \left( \frac{1}{28} + \frac{3y}{56} \right) dy
                        = \frac{69}{448}.

Use the density for X from (9b):

    P{X < 1/2} = \int_0^{1/2} \frac{6}{7} (2x^2 + x) \, dx = \frac{5}{28}.

So, plugging back into the first equation,

    P(Y > 1/2 | X < 1/2) = \frac{69/448}{5/28} = \frac{69}{80} = 0.8625.
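The two probabilities and their ratio can be replayed exactly with rational arithmetic. A sketch (my addition; `poly_integral` is a hypothetical helper):

```python
from fractions import Fraction as F

def poly_integral(coeffs, a, b):
    """Exactly integrate sum(coeffs[k] * x**k for k) over [a, b] with rationals."""
    return sum(c * (b**(k + 1) - a**(k + 1)) / (k + 1) for k, c in enumerate(coeffs))

# Numerator: integrate 1/28 + (3/56) y over y in [1/2, 2].
numerator = poly_integral([F(1, 28), F(3, 56)], F(1, 2), F(2))
# Denominator: integrate (6/7)(2x^2 + x) = (6/7)x + (12/7)x^2 over x in [0, 1/2].
denominator = poly_integral([0, F(6, 7), F(12, 7)], F(0), F(1, 2))

conditional = numerator / denominator
print(numerator, denominator, conditional)  # 69/448, 5/28, 69/80
```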
9e. Compute using the density from (9b):

    E[X] = \int_0^1 x \cdot \frac{6}{7} (2x^2 + x) \, dx = \frac{5}{7}.
9f. We will need the density function for Y:

    f_Y(b) = \int_0^1 \frac{6}{7} \left( x^2 + \frac{xb}{2} \right) dx
           = \frac{2}{7} + \frac{3b}{14}

when 0 < b < 2, and f_Y(b) = 0 otherwise. Compute:

    E[Y] = \int_0^2 y \left( \frac{2}{7} + \frac{3y}{14} \right) dy = \frac{8}{7}.
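Both expectations in (9e) and (9f) are integrals of polynomials, so they too can be checked exactly (my addition; `poly_integral` is a hypothetical helper):

```python
from fractions import Fraction as F

def poly_integral(coeffs, a, b):
    """Exactly integrate sum(coeffs[k] * x**k for k) over [a, b] with rationals."""
    return sum(c * (b**(k + 1) - a**(k + 1)) / (k + 1) for k, c in enumerate(coeffs))

# E[X] = integral of x * (6/7)(2x^2 + x) = (6/7)x^2 + (12/7)x^3 over [0, 1].
ex = poly_integral([0, 0, F(6, 7), F(12, 7)], F(0), F(1))
# E[Y] = integral of y * (2/7 + (3/14)y) = (2/7)y + (3/14)y^2 over [0, 2].
ey = poly_integral([0, F(2, 7), F(3, 14)], F(0), F(2))
print(ex, ey)  # 5/7, 8/7
```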
10a. Compute:

    P{X < Y} = \int_0^\infty \int_0^y e^{-(x+y)} \, dx \, dy
             = \int_0^\infty \left( e^{-y} - e^{-2y} \right) dy
             = 1 - \frac{1}{2} = \frac{1}{2}.

Alternatively,

    P{X < Y} = \int_0^\infty \int_x^\infty e^{-(x+y)} \, dy \, dx
             = \int_0^\infty e^{-2x} \, dx
             = \frac{1}{2}.
10b. When a < 0, P{X < a} = 0. When a \ge 0,

    P{X < a} = \int_0^a \int_0^\infty e^{-(x+y)} \, dy \, dx
             = \int_0^a e^{-x} \, dx
             = 1 - e^{-a}.

Alternatively, notice that X and Y are independent and have the same distribution, since the joint density factors as

    e^{-(x+y)} = e^{-x} \cdot e^{-y},

which is the exponential distribution with λ = 1. The cumulative distribution function for X is then

    P{X < a} = 1 - e^{-a}.
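The exponential computations in 10a and 10b can be spot-checked numerically (my addition; the `simpson` helper and the truncation point 40 are my own choices, the known value 1 - e^{-1} ≈ 0.6321 is used for the CDF at a = 1):

```python
import math

def simpson(f, a, b, n=10_000):
    """Composite Simpson's rule with n (even) subintervals on [a, b]."""
    h = (b - a) / n
    s = f(a) + f(b)
    s += 4 * sum(f(a + (2 * i - 1) * h) for i in range(1, n // 2 + 1))
    s += 2 * sum(f(a + 2 * i * h) for i in range(1, n // 2))
    return s * h / 3

# P{X < Y} = integral of e^{-2x} over (0, infinity); truncate at 40.
p_less = simpson(lambda x: math.exp(-2 * x), 0.0, 40.0)
# CDF at a = 1: integral of e^{-x} over [0, 1], which should be 1 - e^{-1}.
cdf_at_1 = simpson(lambda x: math.exp(-x), 0.0, 1.0)
print(p_less, cdf_at_1)  # close to 0.5 and to 1 - e^{-1}
```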
12. We assume that men and women are equally likely to enter the drugstore. Let M and W be random variables giving the number of men and women, respectively, entering the store during the hour. From Example 6.2b, M and W are independent and are Poisson distributed with parameter λ = 10 · (1/2) = 5. We want to compute

    P(M \le 3 | W = 10) = \frac{P\{M \le 3, W = 10\}}{P\{W = 10\}} = P\{M \le 3\}.

The last equality holds because M and W are independent. So, the probability that no more than 3 men enter the store is

    P{M \le 3} = e^{-5} + 5 e^{-5} + \frac{5^2}{2!} e^{-5} + \frac{5^3}{3!} e^{-5}
               = \frac{118}{3} e^{-5}
               \approx 0.265.
13a. Let X be the time the man arrives and Y the time the woman arrives (both as fractions of 1 hour). We want the probability P{|X - Y| \le 1/12}. The probability densities for each variable are

    f_X(t) = 2  if 1/4 \le t \le 3/4, and f_X(t) = 0 otherwise;
    f_Y(t) = 1  if 0 \le t \le 1, and f_Y(t) = 0 otherwise.

The probability is given by

    P{|X - Y| \le 1/12} = \int_{1/4}^{3/4} \int_{t - 1/12}^{t + 1/12} 2 \, ds \, dt
                        = \int_{1/4}^{3/4} \frac{1}{3} \, dt
                        = \frac{1}{6}.
13b. We want to compute the probability P{X < Y}:

    P{X < Y} = \int_{1/4}^{3/4} \int_t^1 2 \, ds \, dt
             = \int_{1/4}^{3/4} (2 - 2t) \, dt
             = \frac{1}{2}.
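Both probabilities in 13a and 13b reduce to one-dimensional polynomial integrals, which can be replayed exactly (my addition; `poly_integral` is a hypothetical helper):

```python
from fractions import Fraction as F

def poly_integral(coeffs, a, b):
    """Exactly integrate sum(coeffs[k] * t**k for k) over [a, b] with rationals."""
    return sum(c * (b**(k + 1) - a**(k + 1)) / (k + 1) for k, c in enumerate(coeffs))

# 13a: the inner s-integral has length 1/6 and density 2, so the outer
# integrand is the constant 2 * (1/6) = 1/3 on [1/4, 3/4].
p_a = poly_integral([F(1, 3)], F(1, 4), F(3, 4))

# 13b: the inner s-integral over [t, 1] gives 2(1 - t) = 2 - 2t.
p_b = poly_integral([2, -2], F(1, 4), F(3, 4))

print(p_a, p_b)  # 1/6, 1/2
```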
14. Let X be the location of the ambulance (at the time of the accident) and Y the location of the accident. These random variables are independent and are uniformly distributed on the interval [0, L] (the endpoints are irrelevant). Their common density is given by

    f(x) = \frac{1}{L}  if 0 \le x \le L, and f(x) = 0 otherwise.

We need to determine the probabilities P{|X - Y| \le a} for 0 \le a \le L. There are three regions: {0 \le X \le a}, {a \le X \le L - a}, and {L - a \le X \le L}. On each region, integrate the joint density 1/L^2 over the allowed values of Y:

    \int_0^a \int_0^{x+a} \frac{1}{L^2} \, dy \, dx = \frac{3a^2}{2L^2},
    \int_a^{L-a} \int_{x-a}^{x+a} \frac{1}{L^2} \, dy \, dx = \frac{2a}{L} - \frac{4a^2}{L^2},
    \int_{L-a}^{L} \int_{x-a}^{L} \frac{1}{L^2} \, dy \, dx = \frac{3a^2}{2L^2}.

Adding the three pieces,

    P{|X - Y| \le a} = \frac{3a^2}{2L^2} + \frac{2a}{L} - \frac{4a^2}{L^2} + \frac{3a^2}{2L^2}
                     = \frac{2aL - a^2}{L^2}.
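The three-region decomposition can be replayed symbolically in the bounds and checked against the closed form for a couple of (L, a) pairs (my addition; `poly_integral` and `prob_within` are hypothetical helpers, and the test values L = 1, a = 1/4 and L = 5, a = 2 are arbitrary choices with a \le L/2):

```python
from fractions import Fraction as F

def poly_integral(coeffs, a, b):
    """Exactly integrate sum(coeffs[k] * x**k for k) over [a, b] with rationals."""
    return sum(c * (b**(k + 1) - a**(k + 1)) / (k + 1) for k, c in enumerate(coeffs))

def prob_within(L, a):
    """P{|X - Y| <= a} summed over the three X-regions."""
    c = 1 / (L * L)                                      # joint density 1/L^2
    left = poly_integral([a * c, c], F(0), a)            # Y ranges over [0, x + a]
    middle = (L - 2 * a) * (2 * a) * c                   # Y ranges over [x - a, x + a]
    right = poly_integral([(L + a) * c, -c], L - a, L)   # Y ranges over [x - a, L]
    return left + middle + right

for L, a in [(F(1), F(1, 4)), (F(5), F(2))]:
    assert prob_within(L, a) == (2 * a * L - a * a) / (L * L)
print("formula (2aL - a^2)/L^2 confirmed")
```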
18. X and Y have similar densities, each uniform over an interval of length L/2:

    f_X(x) = \frac{2}{L}  if 0 \le x \le L/2, and f_X(x) = 0 otherwise,

and f_Y is the same except that f_Y(y) = \frac{2}{L} for L/2 \le y \le L.

There are two cases in computing P{Y - X > L/3}: when X < L/6 (so Y can take any value in its range) and when L/6 < X < L/2 (so Y must be at least X + L/3):

    \int_0^{L/6} \int_{L/2}^{L} \frac{4}{L^2} \, dy \, dx = \frac{1}{3},
    \int_{L/6}^{L/2} \int_{x + L/3}^{L} \frac{4}{L^2} \, dy \, dx = \frac{4}{9}.

So

    P{Y - X > L/3} = \frac{1}{3} + \frac{4}{9} = \frac{7}{9}.
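As the answer 7/9 should not depend on L, the two-case computation can be checked for several values of L with exact arithmetic (my addition; `poly_integral` and `prob_gap` are hypothetical helpers):

```python
from fractions import Fraction as F

def poly_integral(coeffs, a, b):
    """Exactly integrate sum(coeffs[k] * x**k for k) over [a, b] with rationals."""
    return sum(c * (b**(k + 1) - a**(k + 1)) / (k + 1) for k, c in enumerate(coeffs))

def prob_gap(L):
    """P{Y - X > L/3} for X ~ U[0, L/2], Y ~ U[L/2, L] (joint density 4/L^2)."""
    c = 4 / (L * L)
    # Case X < L/6: every Y in [L/2, L] works, an interval of length L/2.
    p1 = c * (L / 2) * (L / 6)
    # Case L/6 < X < L/2: need Y > X + L/3, an interval of length 2L/3 - x.
    p2 = poly_integral([c * 2 * L / 3, -c], L / 6, L / 2)
    return p1 + p2

for L in (F(1), F(2), F(10)):
    assert prob_gap(L) == F(7, 9)
print("P{Y - X > L/3} = 7/9 for every L")
```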
19a. The joint density is given as

    f(x, y) = \frac{1}{x}  if 0 < y < x < 1, and f(x, y) = 0 otherwise.

The marginal density of Y when 0 < y < 1 is

    f_Y(y) = \int_y^1 \frac{1}{x} \, dx = \ln 1 - \ln y = -\ln y.

So,

    f_Y(y) = -\ln y  if 0 < y < 1, and f_Y(y) = 0 otherwise.
19b. The marginal density of X when 0 < x < 1 is

    f_X(x) = \int_0^x \frac{1}{x} \, dy = 1.

So,

    f_X(x) = 1  if 0 < x < 1, and f_X(x) = 0 otherwise.

19c. The expectation of X is

    E[X] = \int_{-\infty}^{\infty} x f_X(x) \, dx = \int_0^1 x \, dx = \frac{1}{2}.

19d. The expectation of Y is

    E[Y] = \int_{-\infty}^{\infty} y f_Y(y) \, dy
         = \int_0^1 -y \ln y \, dy
         = \left[ -\frac{y^2 \ln y}{2} \right]_{y=0}^{1} + \int_0^1 \frac{y}{2} \, dy    (by parts: u = \ln y, dv = -y \, dy)
         = 0 + \frac{1}{4} = \frac{1}{4}.

For the third equality, use l'Hôpital's rule:

    \lim_{y \to 0^+} -y^2 \ln y = \lim_{y \to 0^+} \frac{-\ln y}{1/y^2}
                                = \lim_{y \to 0^+} \frac{1/y}{2/y^3}
                                = \lim_{y \to 0^+} \frac{y^2}{2} = 0.
20a. The joint density of X and Y is given by

    f(x, y) = x e^{-(x+y)}  if 0 < x, y, and f(x, y) = 0 otherwise.

X and Y are independent. The quickest way to see this is that f(x, y) is the product of a function of x alone and a function of y alone: when 0 < x, y,

    f(x, y) = x e^{-x} \cdot e^{-y}.

When x < 0 or y < 0, the equation still holds (taking each factor to be 0 off the positive axis). So independence follows from Proposition 2.1.
20b. The joint density of X and Y is given by

    f(x, y) = 2  if 0 < x < y < 1, and f(x, y) = 0 otherwise.

X and Y are not independent. To see this, compute the marginal density functions for 0 < x < 1 and 0 < y < 1:

    f_X(x) = \int_x^1 2 \, dy = 2(1 - x),
    f_Y(y) = \int_0^y 2 \, dx = 2y.

However, f(x, y) = 2 does not equal f_X(x) \cdot f_Y(y) = 4y(1 - x) throughout 0 < x < y < 1; for instance, at (x, y) = (1/4, 1/2),

    f(x, y) = 2 \ne \frac{3}{2} = f_X(x) \cdot f_Y(y).
23a. The joint density of X and Y is

    f(x, y) = 12xy(1 - x)  if 0 < x, y < 1, and f(x, y) = 0 otherwise.

X and Y are independent. The quickest way to see this is that f(x, y) is the product of a function of x alone and a function of y alone: when 0 < x, y < 1,

    f(x, y) = 12x(1 - x) \cdot y.

When one of x or y is not in the interval (0, 1), the equation is still true. So independence follows from Proposition 2.1.

Since we will need the marginal densities, we can also verify independence by computing them. When 0 < x, y < 1,

    f_X(x) = \int_0^1 12xy(1 - x) \, dy = 6x(1 - x) y^2 \Big|_{y=0}^{1} = 6x(1 - x),
    f_Y(y) = \int_0^1 12xy(1 - x) \, dx = \left( 6x^2 y - 4x^3 y \right) \Big|_{x=0}^{1} = 2y;

both marginal densities are 0 otherwise. Note that when 0 < x, y < 1,

    f(x, y) = 12xy(1 - x) = f_X(x) \cdot f_Y(y),

verifying independence.
23b. Compute the expectation of X:

    E[X] = \int_{-\infty}^{\infty} x f_X(x) \, dx = \int_0^1 6x^2 (1 - x) \, dx = \frac{1}{2}.
23c. Compute the expectation of Y:

    E[Y] = \int_{-\infty}^{\infty} y f_Y(y) \, dy = \int_0^1 2y^2 \, dy = \frac{2}{3}.
23d. Compute the variance of X. From (23b), E[X] = 1/2, and

    E[X^2] = \int_{-\infty}^{\infty} x^2 f_X(x) \, dx = \int_0^1 6x^3 (1 - x) \, dx = \frac{3}{10},

so

    Var(X) = E[X^2] - (E[X])^2 = \frac{3}{10} - \frac{1}{4} = \frac{1}{20}.
23e. Compute the variance of Y. From (23c), E[Y] = 2/3, and

    E[Y^2] = \int_{-\infty}^{\infty} y^2 f_Y(y) \, dy = \int_0^1 2y^3 \, dy = \frac{1}{2},

so

    Var(Y) = E[Y^2] - (E[Y])^2 = \frac{1}{2} - \frac{4}{9} = \frac{1}{18}.
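The moment computations for problem 23 are all polynomial integrals, so they can be replayed exactly (my addition; `poly_integral` is a hypothetical helper):

```python
from fractions import Fraction as F

def poly_integral(coeffs, a, b):
    """Exactly integrate sum(coeffs[k] * x**k for k) over [a, b] with rationals."""
    return sum(c * (b**(k + 1) - a**(k + 1)) / (k + 1) for k, c in enumerate(coeffs))

# X has density 6x(1 - x) = 6x - 6x^2, Y has density 2y, both on (0, 1).
ex = poly_integral([0, 0, 6, -6], F(0), F(1))       # E[X]   = 1/2
ex2 = poly_integral([0, 0, 0, 6, -6], F(0), F(1))   # E[X^2] = 3/10
ey = poly_integral([0, 0, 2], F(0), F(1))           # E[Y]   = 2/3
ey2 = poly_integral([0, 0, 0, 2], F(0), F(1))       # E[Y^2] = 1/2

var_x = ex2 - ex**2
var_y = ey2 - ey**2
print(var_x, var_y)  # 1/20, 1/18
```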