Lecture Notes 3
Two Random Variables
EE278
Prof. B. Prabhakar
Statistical Signal Processing
Autumn 02-03
• Joint, Marginal, and Conditional pmfs
• Joint, Marginal, and Conditional cdfs, pdfs
Copyright © 2000–2002 Abbas El Gamal
EE 278: Two Random Variables
Joint, Marginal, and Conditional pmfs
• Let X and Y be two discrete random variables defined over the same
probability space
• They are completely specified by their joint pmf
pX,Y (x, y) = P{X = x, Y = y}, for all x ∈ X , y ∈ Y
Clearly, Σx∈X Σy∈Y pX,Y (x, y) = 1
• Example: Consider the following p(x, y)

                  x
   p(x, y)    0     1    2.5
        −3    0    1/4   1/8
   y    −1   1/8    0    1/4
         2   1/8   1/8    0
• To find pX (x), the marginal pmf of X, we use the law of total probability

   pX (x) = Σy∈Y p(x, y), for x ∈ X
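The marginalization step can be checked numerically. The sketch below encodes the example table above with exact fractions and sums over y to recover pX (the variable names are mine, not from the notes):

```python
from fractions import Fraction as F

# Joint pmf p(x, y) from the example table (rows: y, columns: x).
p = {
    (0, -3): F(0),    (1, -3): F(1, 4), (2.5, -3): F(1, 8),
    (0, -1): F(1, 8), (1, -1): F(0),    (2.5, -1): F(1, 4),
    (0, 2):  F(1, 8), (1, 2):  F(1, 8), (2.5, 2):  F(0),
}
xs = [0, 1, 2.5]
ys = [-3, -1, 2]

# Law of total probability: sum the joint pmf over y to get the marginal of X.
pX = {x: sum(p[(x, y)] for y in ys) for x in xs}
pY = {y: sum(p[(x, y)] for x in xs) for y in ys}

print(pX)                    # marginal pmf of X
print(sum(p.values()) == 1)  # the joint pmf sums to 1
```

With this table, pX (0) = 1/4 and pX (1) = pX (2.5) = 3/8.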
• The conditional pmf of X given Y = y is defined as

   pX|Y (x|y) = pX,Y (x, y) / pY (y), for pY (y) ≠ 0 and x ∈ X
(check that it is a pmf for X)
• Bayes rule for pmfs

   pX|Y (x|y) = pY |X (y|x) pX (x) / pY (y)

              = pY |X (y|x) pX (x) / Σx′∈X pX,Y (x′, y)

              = pY |X (y|x) pX (x) / Σx′∈X pY |X (y|x′) pX (x′)
• X and Y are independent iff
pX,Y (x, y) = pX (x)pY (y), for all x ∈ X and y ∈ Y,
which implies that pX|Y (x|y) = pX (x), for all x ∈ X and y ∈ Y
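Bayes rule can be verified on the example table: computing pX|Y via the Bayes expansion must agree with the direct definition pX,Y (x, y)/pY (y). A minimal sketch (helper names are mine):

```python
from fractions import Fraction as F

# Joint pmf from the earlier example table (rows: y, columns: x).
p = {
    (0, -3): F(0),    (1, -3): F(1, 4), (2.5, -3): F(1, 8),
    (0, -1): F(1, 8), (1, -1): F(0),    (2.5, -1): F(1, 4),
    (0, 2):  F(1, 8), (1, 2):  F(1, 8), (2.5, 2):  F(0),
}
xs, ys = [0, 1, 2.5], [-3, -1, 2]
pX = {x: sum(p[(x, y)] for y in ys) for x in xs}
pY = {y: sum(p[(x, y)] for x in xs) for y in ys}

def pY_given_X(y, x):
    return p[(x, y)] / pX[x]

def pX_given_Y_bayes(x, y):
    # Bayes rule: p_{Y|X}(y|x) p_X(x) over the total-probability denominator
    num = pY_given_X(y, x) * pX[x]
    den = sum(pY_given_X(y, xp) * pX[xp] for xp in xs)
    return num / den

# Agrees with the direct definition p_{X,Y}(x, y) / p_Y(y):
for x in xs:
    for y in ys:
        assert pX_given_Y_bayes(x, y) == p[(x, y)] / pY[y]
print(pX_given_Y_bayes(1, -3))  # 2/3
```

Note that pX|Y (1|−3) = 2/3 ≠ pX (1) = 3/8, so X and Y are not independent in this example.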
Example: Binary Symmetric Channel
Consider the following binary communication channel
[Diagram: the bit X ∈ {0, 1} enters the channel, the noise Z ∈ {0, 1} is added mod 2, and Y ∈ {0, 1} is received]
where the bit sent X ∼ Br(p), the noise Z ∼ Br(ε), the bit received
Y = (X + Z) mod 2 = X ⊕ Z, and X and Z are independent. Find
1. pX|Y (x|y),
2. pY (y), and
3. the probability of error P{X ≠ Y }
Solution:
1. To find pX|Y (x|y) we use Bayes rule

   pX|Y (x|y) = pY |X (y|x) pX (x) / Σx′∈X pY |X (y|x′) pX (x′)
We know pX (x)
To find pY |X , note that

   pY |X (y|x) = P{Y = y | X = x}
              = P{X ⊕ Z = y | X = x}
              = P{Z = y ⊕ x | X = x}
              = P{Z = y ⊕ x}, since Z and X are independent
              = pZ (y ⊕ x)
So we have

   pY |X (0|0) = pZ (0 ⊕ 0) = pZ (0) = 1 − ε
   pY |X (0|1) = pZ (0 ⊕ 1) = pZ (1) = ε
   pY |X (1|0) = pZ (1 ⊕ 0) = pZ (1) = ε
   pY |X (1|1) = pZ (1 ⊕ 1) = pZ (0) = 1 − ε
Plugging in the Bayes rule equation, we get
   pX|Y (0|0) = pY |X (0|0) pX (0) / [pY |X (0|0) pX (0) + pY |X (0|1) pX (1)]

   pX|Y (0|1) = pY |X (1|0) pX (0) / [pY |X (1|0) pX (0) + pY |X (1|1) pX (1)]

   pX|Y (1|0) = pY |X (0|1) pX (1) / [pY |X (0|0) pX (0) + pY |X (0|1) pX (1)]

   pX|Y (1|1) = pY |X (1|1) pX (1) / [pY |X (1|0) pX (0) + pY |X (1|1) pX (1)]
2. We already found pY (y) as
pY (y) = pY |X (y|0)pX (0) + pY |X (y|1)pX (1)
3. Now to find the probability of error P{X ≠ Y }, consider

   P{X ≠ Y } = pX,Y (0, 1) + pX,Y (1, 0)
             = pY |X (1|0) pX (0) + pY |X (0|1) pX (1)
             = ε(1 − p) + εp
             = ε
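A quick Monte Carlo check of this result, simulating the channel for two (hypothetical) choices of p; the estimated error probability should be close to ε in both cases:

```python
import random

def simulate_bsc(p, eps, n=200_000, seed=1):
    """Simulate the binary symmetric channel Y = X xor Z and
    estimate P{X != Y} from n independent channel uses."""
    rng = random.Random(seed)
    errors = 0
    for _ in range(n):
        x = 1 if rng.random() < p else 0    # bit sent, X ~ Br(p)
        z = 1 if rng.random() < eps else 0  # noise, Z ~ Br(eps), independent of X
        y = x ^ z                           # bit received, Y = X xor Z
        errors += (x != y)
    return errors / n

# P{X != Y} = eps, independent of p:
print(simulate_bsc(p=0.3, eps=0.1))  # ~0.1
print(simulate_bsc(p=0.8, eps=0.1))  # ~0.1
```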
An interesting special case is when ε = 1/2. Here, P{X ≠ Y } = 1/2, which
is the worst possible (no information is sent), and

   pY (0) = (1/2) p + (1/2)(1 − p) = 1/2 = pY (1),

i.e., Y ∼ Br(1/2), independent of the value of p !
Also in this case, the bit sent X and the bit received Y are independent
(check this)
Joint and Marginal cdf and pdf
• Any two random variables are specified by their joint cdf
FX,Y (x, y) = P{X ≤ x, Y ≤ y}, for x, y ∈ R
[Figure: the point (x, y) in the plane; FX,Y (x, y) is the probability of the quadrant below and to the left of it]
• Properties of the cdf:
1. FX,Y (x, y) ≥ 0
2. FX,Y (x1, y1) ≤ FX,Y (x2, y2), whenever x1 ≤ x2 and y1 ≤ y2
3. limx,y→∞ FX,Y (x, y) = 1
4. limy→−∞ FX,Y (x, y) = 0 and limx→−∞ F (x, y) = 0
5. limy→∞ FX,Y (x, y) = FX (x), the marginal cdf of X, and
   limx→∞ FX,Y (x, y) = FY (y)
6. The probability of any set can be determined from the
joint cdf, e.g.,
[Figure: the rectangle (a, b] × (c, d] in the plane]
P{a < X ≤ b, c < Y ≤ d} = F (b, d) − F (a, d) − F (b, c) + F (a, c)
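The inclusion–exclusion identity above can be sanity-checked numerically. The sketch assumes X and Y are independent Uniform(0, 1) variables (my choice of example, not from the notes), so the rectangle probability should equal (b − a)(d − c):

```python
def F(x, y):
    # Joint cdf of two independent Uniform(0,1) random variables:
    # F(x, y) = F_X(x) F_Y(y), each marginal cdf clipped to [0, 1].
    clip = lambda t: max(0.0, min(1.0, t))
    return clip(x) * clip(y)

def rect_prob(a, b, c, d):
    # P{a < X <= b, c < Y <= d} via the identity above
    return F(b, d) - F(a, d) - F(b, c) + F(a, c)

# Inside the unit square this equals (b - a)(d - c):
print(rect_prob(0.2, 0.7, 0.1, 0.4))  # ~0.15 = (0.7 - 0.2)(0.4 - 0.1)
```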
• X and Y are independent iff
FX,Y (x, y) = FX (x)FY (y), for all x, y
• X and Y are continuous random variables if their joint cdf is continuous
in both x and y
In this case, we can define their joint pdf, provided that it exists, as the
function fX,Y (x, y) such that
   FX,Y (x, y) = ∫−∞..x ∫−∞..y fX,Y (u, v) dv du, for x, y ∈ R
• If FX,Y (x, y) is differentiable in x and y, then
   fX,Y (x, y) = ∂²F (x, y)/∂x∂y
             = lim∆x,∆y→0 P{x < X ≤ x + ∆x, y < Y ≤ y + ∆y} / (∆x ∆y)
• Properties of fX,Y (x, y):
1. fX,Y (x, y) ≥ 0
2. ∫−∞..∞ ∫−∞..∞ fX,Y (x, y) dx dy = 1
3. The probability of any set A ⊂ R² can be calculated by integrating
   the joint pdf over the set, i.e.,

   P{(X, Y ) ∈ A} = ∫∫(x,y)∈A fX,Y (x, y) dx dy
• The marginal pdf of X can be obtained from the joint via the law of
total probability
   fX (x) = ∫−∞..∞ fX,Y (x, y) dy
• X and Y are independent iff
fX,Y (x, y) = fX (x)fY (y), for all x, y
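The continuous marginalization step can also be checked numerically. The sketch below assumes the joint pdf f(x, y) = x + y on the unit square (my example, not from the notes), whose marginal is fX (x) = x + 1/2 analytically:

```python
import numpy as np

def f(x, y):
    # Assumed example joint pdf: f(x, y) = x + y on [0,1]^2, 0 elsewhere.
    return np.where((0 <= x) & (x <= 1) & (0 <= y) & (y <= 1), x + y, 0.0)

ys = np.linspace(0.0, 1.0, 100_001)
dy = ys[1] - ys[0]

def fX(x):
    # Marginalize numerically: f_X(x) = integral of f(x, y) dy
    # (trapezoidal rule over the grid ys)
    vals = f(x, ys)
    return float(np.sum((vals[:-1] + vals[1:]) / 2) * dy)

# Analytically f_X(x) = x + 1/2 on [0, 1]:
print(fX(0.3))  # ~0.8
```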