SOLUTIONS FOR CHAPTER 1
Q.1.1 DMS with source probabilities: {0.30, 0.25, 0.20, 0.15, 0.10}

Entropy H(X) = ∑_i p_i log (1/p_i)
             = 0.30 log (1/0.30) + 0.25 log (1/0.25) + … + 0.10 log (1/0.10)
             = 2.228 bits
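Note (added illustration, not part of the original solution): the entropy of this distribution can be checked numerically; the following is a minimal sketch using only the Python standard library.

```python
from math import log2

# Source probabilities of the DMS in Q.1.1
p = [0.30, 0.25, 0.20, 0.15, 0.10]

# H(X) = sum over i of p_i * log2(1/p_i)
H = sum(pi * log2(1 / pi) for pi in p)
print(f"H(X) = {H:.3f} bits")  # expected: about 2.228 bits
```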
Q.1.2 Define D(p||q) = ∑_i p_i log (p_i/q_i)                                        (1)
where {p_i} and {q_i} are probability distributions of the discrete source X. Then

−D(p||q) = ∑_i p_i log (q_i/p_i) ≤ ∑_i p_i (q_i/p_i − 1)     [using the identity ln x ≤ x − 1]
         = ∑_i (q_i − p_i) = 0
∴ D(p||q) ≥ 0

Put q_i = 1/n in (1), where n is the cardinality of the discrete source:
D(p||q) = ∑_i p_i log p_i + ∑_i p_i log n = ∑_i p_i log p_i + log n = −H(X) + log n ≥ 0
⇒ H(X) ≤ log n

H(X) = log n for the uniform probability distribution. Hence it is proved that the entropy of a
discrete source is maximum when the output symbols are equally probable. The quantity
D(p||q) is called the Kullback-Leibler distance.
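As an added numerical illustration (not from the original text), the Kullback-Leibler distance and the bound H(X) ≤ log n can be checked for the distribution of Q.1.1 against a uniform q:

```python
from math import log2

p = [0.30, 0.25, 0.20, 0.15, 0.10]   # distribution from Q.1.1
n = len(p)
q = [1.0 / n] * n                    # uniform distribution

# D(p||q) = sum_i p_i * log2(p_i / q_i)
D = sum(pi * log2(pi / qi) for pi, qi in zip(p, q))
H = sum(pi * log2(1 / pi) for pi in p)

print(f"D(p||q) = {D:.4f} >= 0")
print(f"H(X) = {H:.4f} <= log2(n) = {log2(n):.4f}")
print(f"H(X) + D(p||q) = {H + D:.4f}")  # equals log2(n) when q is uniform
```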
Q. 1.3 The plots are given below:
[Plot: y = x − 1 and y = ln(x) for 0 < x ≤ 3.5, showing that ln x ≤ x − 1 with equality only at x = 1.]
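A minimal plotting sketch (an addition, assuming numpy and matplotlib are available) that reproduces curves of this form:

```python
import numpy as np
import matplotlib.pyplot as plt

x = np.linspace(0.01, 3.5, 500)      # avoid x = 0, where ln(x) diverges

plt.plot(x, x - 1, label="y = x - 1")
plt.plot(x, np.log(x), label="y = ln(x)")
plt.axhline(0, color="gray", linewidth=0.5)
plt.ylim(-3, 3)
plt.xlabel("x")
plt.legend()
plt.title("ln(x) <= x - 1, with equality at x = 1")
plt.show()
```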
Q 1.4 Consider two probability distributions {p_0, p_1, …, p_{K−1}} and {q_0, q_1, …, q_{K−1}}.

We have ∑_{k=0}^{K−1} p_k log2 (q_k/p_k) = (1/ln 2) ∑_{k=0}^{K−1} p_k ln (q_k/p_k). Using ln x ≤ x − 1,

∑_{k=0}^{K−1} p_k log2 (q_k/p_k) ≤ (1/ln 2) ∑_{k=0}^{K−1} p_k (q_k/p_k − 1)
                                 = (1/ln 2) ∑_{k=0}^{K−1} (q_k − p_k)
                                 = (1/ln 2) (∑_{k=0}^{K−1} q_k − ∑_{k=0}^{K−1} p_k) = 0

Thus, ∑_{k=0}^{K−1} p_k log2 (q_k/p_k) ≤ 0.                                         (1)

Now, I(X; Y) = ∑_{i=1}^{n} ∑_{j=1}^{m} P(x_i, y_j) log [P(x_i, y_j) / (P(x_i) P(y_j))]          (2)

From (1) and (2) we can conclude that I(X; Y) ≥ 0: identifying p_k with P(x_i, y_j) and q_k with
P(x_i) P(y_j) (which also sums to one) in (1) gives −I(X; Y) ≤ 0. The equality holds if and only if
P(x_i, y_j) = P(x_i) P(y_j), i.e., when the input and output symbols of the channel are
statistically independent.
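As an added check (not part of the book's solution), the sketch below computes I(X; Y) for a small hypothetical joint distribution and confirms that it is non-negative, and zero when the joint pmf factorizes into its marginals:

```python
from math import log2

def mutual_information(joint):
    """I(X;Y) in bits for a joint pmf given as a 2-D list joint[i][j]."""
    px = [sum(row) for row in joint]
    py = [sum(col) for col in zip(*joint)]
    return sum(
        pij * log2(pij / (px[i] * py[j]))
        for i, row in enumerate(joint)
        for j, pij in enumerate(row)
        if pij > 0
    )

# Hypothetical dependent channel: I(X;Y) > 0
dependent = [[0.4, 0.1],
             [0.1, 0.4]]
# Independent case: joint equals the product of marginals, so I(X;Y) = 0
independent = [[0.25, 0.25],
               [0.25, 0.25]]

print(mutual_information(dependent))    # positive (about 0.278 bits)
print(mutual_information(independent))  # 0.0
```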
Q.1.5 Source X has an infinitely large set of outputs with P(x_i) = 2^(−i), i = 1, 2, 3, …

H(X) = ∑_{i=1}^{∞} p(x_i) log (1/p(x_i)) = ∑_{i=1}^{∞} 2^(−i) log 2^i = ∑_{i=1}^{∞} i·2^(−i) = 2 bits
Q.1.6 Given: P(x_i) = p(1−p)^(i−1), i = 1, 2, 3, …

H(X) = −∑_i p(1−p)^(i−1) log {p(1−p)^(i−1)}
     = −∑_i p(1−p)^(i−1) {log p + (i−1) log(1−p)}
     = −p log p ∑_i (1−p)^(i−1) − p log(1−p) ∑_i (i−1)(1−p)^(i−1)
     = −p log p × (1/p) − p log(1−p) × ((1−p)/p²)
     = −log p − ((1−p)/p) log(1−p)
     = [−p log p − (1−p) log(1−p)] / p = (1/p) H(p) bits,

where H(p) = −p log p − (1−p) log(1−p) is the binary entropy function.
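As an added verification (using an illustrative value p = 0.3, not one from the text), the closed form H(X) = H(p)/p can be compared with a direct numerical summation of the geometric pmf:

```python
from math import log2

p = 0.3  # illustrative value, not from the text

# Direct summation of -sum_i P(x_i) log2 P(x_i) for P(x_i) = p(1-p)^(i-1)
H_direct = -sum(
    p * (1 - p) ** (i - 1) * log2(p * (1 - p) ** (i - 1))
    for i in range(1, 500)
)

# Closed form: H(X) = H(p)/p, with H(p) the binary entropy function
H_binary = -p * log2(p) - (1 - p) * log2(1 - p)
print(H_direct, H_binary / p)  # both about 2.94 bits
```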
Q 1.7 Hint: Same approach as the previous two problems.
Q 1.8 Yes, it is a uniquely decodable code, because each symbol is coded uniquely.
Q 1.9 The relative entropy or Kullback-Leibler distance between two probability mass
functions p(x) and q(x) is defined as

D(p||q) = ∑_{x∈X} p(x) log (p(x)/q(x)).                                             (1.76)

(i) Show that D(p||q) is non-negative.

Solution: −D(p||q) = −∑_{x∈X} p(x) log (p(x)/q(x)) = ∑_{x∈X} p(x) log (q(x)/p(x))
          ≤ log ∑_{x∈X} p(x) (q(x)/p(x))    (from Jensen's inequality, E[f(X)] ≤ f(E[X]) for the concave function f = log)
          = log ∑_{x∈X} q(x) = log(1) = 0.

Thus, −D(p||q) ≤ 0, or D(p||q) ≥ 0.
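An added numerical illustration of the Jensen step E[log Y] ≤ log E[Y] (using arbitrary example distributions, not ones from the text):

```python
from math import log2

# Example pmfs p and q on a 4-symbol alphabet (illustrative values only)
p = [0.5, 0.25, 0.125, 0.125]
q = [0.25, 0.25, 0.25, 0.25]

# Jensen step: E_p[log2(q/p)] <= log2(E_p[q/p]) = log2(sum of q) = 0
lhs = sum(pi * log2(qi / pi) for pi, qi in zip(p, q))
rhs = log2(sum(pi * (qi / pi) for pi, qi in zip(p, q)))
print(lhs, "<=", rhs)            # lhs = -D(p||q), rhs = 0

D = -lhs
print("D(p||q) =", D, ">= 0")    # relative entropy is non-negative
```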