TEST BANK FOR Information, Theory, Coding and Cryptography 3rd Edition By Ranjan Bose (Solution Manual)

"Copyrighted Material" - "Additional resource material supplied with the book Information, Theory, Coding and Cryptography, Second Edition written by Ranjan Bose & published by McGraw-Hill Education (India) Pvt. Ltd. This resource material is for Instructor's use only."

SOLUTIONS FOR CHAPTER 1

Q.1.1 DMS with source probabilities: {0.30, 0.25, 0.20, 0.15, 0.10}

Entropy H(X) = Σ_i p_i log(1/p_i)
= 0.30 log(1/0.30) + 0.25 log(1/0.25) + … + 0.10 log(1/0.10)
= 2.228 bits

Q.1.2 Define D(p||q) = Σ_i p_i log(p_i/q_i),   (1)

where p_i and q_i are probability distributions of the discrete source X. Then

−D(p||q) = Σ_i p_i log(q_i/p_i) ≤ Σ_i p_i (q_i/p_i − 1)   [using the identity ln x ≤ x − 1]
= Σ_i (q_i − p_i) = 0

∴ D(p||q) ≥ 0.

Put q_i = 1/n in (1), where n is the cardinality of the discrete source:

D(p||q) = Σ_i p_i log p_i + Σ_i p_i log n = −H(X) + log n ≥ 0,

i.e., H(X) ≤ log n, with H(X) = log n for the uniform probability distribution. Hence it is proved that the entropy of a discrete source is maximum when the output symbols are equally probable. The quantity D(p||q) is called the Kullback-Leibler distance.

Q.1.3 The plots are given below.

[Figure: y = ln(x) and y = x − 1 plotted for 0 < x ≤ 3.5; the curve y = ln(x) lies below the line y = x − 1, touching it only at x = 1.]

Q.1.4 Consider two probability distributions {p_0, p_1, …, p_{K−1}} and {q_0, q_1, …, q_{K−1}}. We have

Σ_{k=0}^{K−1} p_k log₂(q_k/p_k) = (1/ln 2) Σ_{k=0}^{K−1} p_k ln(q_k/p_k).

Using ln x ≤ x − 1,

Σ_{k=0}^{K−1} p_k log₂(q_k/p_k) ≤ (1/ln 2) Σ_{k=0}^{K−1} p_k (q_k/p_k − 1)
= (1/ln 2) Σ_{k=0}^{K−1} (q_k − p_k)
= (1/ln 2) (Σ_k q_k − Σ_k p_k) = 0.

Thus Σ_{k=0}^{K−1} p_k log₂(q_k/p_k) ≤ 0.   (1)

Now,

I(X; Y) = Σ_{i=1}^{n} Σ_{j=1}^{m} P(x_i, y_j) log [ P(x_i, y_j) / (P(x_i) P(y_j)) ].   (2)

From (1) and (2) we can conclude (after basic manipulations) that I(X; Y) ≥ 0. The equality holds if and only if P(x_i, y_j) = P(x_i) P(y_j), i.e., when the input and output symbols of the channel are statistically independent.

Q.1.5 Source X has an infinitely large set of outputs with P(x_i) = 2^(−i), i = 1, 2, 3, …

H(X) = Σ_{i=1}^{∞} p(x_i) log(1/p(x_i)) = Σ_{i=1}^{∞} i · 2^(−i) = 2 bits

Q.1.6 Given: P(x_i) = p(1 − p)^(i−1), i = 1, 2, 3, …

H(X) = −Σ_i p(1 − p)^(i−1) log{ p(1 − p)^(i−1) }
= −Σ_i p(1 − p)^(i−1) { log p + (i − 1) log(1 − p) }
= −p log p Σ_i (1 − p)^(i−1) − p log(1 − p) Σ_i (i − 1)(1 − p)^(i−1)
= −p log p · (1/p) − p log(1 − p) · (1 − p)/p²
= −log p − ((1 − p)/p) log(1 − p)
= [ −p log p − (1 − p) log(1 − p) ] / p
= H(p)/p bits

Q.1.7 Hint: Same approach as the previous two problems.

Q.1.8 Yes, it is a uniquely decodable code because each symbol is coded uniquely.
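As a quick numerical cross-check of Q.1.1 and Q.1.2 (not part of the original solution manual), the short sketch below computes H(X) for the given source and verifies that, for a uniform q, D(p||q) = log₂ n − H(X) ≥ 0. The function names are illustrative only.

```python
import math

def entropy(probs):
    """H(X) = sum_i p_i * log2(1/p_i), in bits."""
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

def kl_divergence(p, q):
    """D(p||q) = sum_i p_i * log2(p_i/q_i); non-negative, zero iff p == q."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Q.1.1: DMS with probabilities {0.30, 0.25, 0.20, 0.15, 0.10}
p = [0.30, 0.25, 0.20, 0.15, 0.10]
print(round(entropy(p), 3))           # 2.228 bits

# Q.1.2: with q uniform, D(p||q) = log2(n) - H(X) >= 0
n = len(p)
q = [1.0 / n] * n
print(round(kl_divergence(p, q), 3))  # log2(5) - 2.228, about 0.094
```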
Q.1.9 The relative entropy or Kullback-Leibler distance between two probability mass functions p(x) and q(x) is defined as

D(p||q) = Σ_{x∈X} p(x) log( p(x)/q(x) ).   (1.76)

(i) Show that D(p||q) is non-negative.

Solution:

−D(p||q) = −Σ_{x∈X} p(x) log( p(x)/q(x) ) = Σ_{x∈X} p(x) log( q(x)/p(x) )
≤ log Σ_{x∈X} p(x) · ( q(x)/p(x) )   (from Jensen's inequality applied to the concave function log: E[f(X)] ≤ f(E[X]))
= log Σ_{x∈X} q(x) = log(1) = 0.

Thus −D(p||q) ≤ 0, or D(p||q) ≥ 0.

(ii) D(p||q) = Σ_x p(x) log( p(x)/q(x) ).

Symmetry property: is D(p||q) = D(q||p)? In general

Σ_x p(x) log p(x) − Σ_x p(x) log q(x) ≠ Σ_x q(x) log q(x) − Σ_x q(x) log p(x),

therefore the Kullback-Leibler distance does not satisfy the symmetry property.

Triangle inequality: is D(p||q) + D(q||r) ≥ D(p||r)? This would require

Σ_x p(x) log( p(x)/q(x) ) + Σ_x q(x) log( q(x)/r(x) ) ≥ Σ_x p(x) log( p(x)/r(x) ).

On simplifying, this reduces to

Σ_x ( q(x) − p(x) ) log( q(x)/r(x) ) ≥ 0,

which does not hold in general (the terms can be negative wherever p(x) > q(x)). Therefore the Kullback-Leibler distance does not satisfy the triangle inequality.

(iii) I(X; Y) = Σ_{x,y} p(x, y) I(x; y) = Σ_{x,y} p(x, y) log [ p(x, y) / ( p(x) p(y) ) ] = D( p(x, y) || p(x) p(y) ).

Q.1.10 [Solution not preserved in this extract.]

Q.1.11 The codeword lengths are possible if and only if they satisfy the Kraft-McMillan inequality, which in this case is

(1/2)³ + (1/2)³ + (1/2)³ + d·(1/2)⁸ ≤ 1
d/256 ≤ 5/8
d ≤ 160

Q.1.12 First note that it is a discrete random variable with a valid probability distribution, since

Σ_{n=2}^{∞} P_n = Σ_{n=2}^{∞} 1/(A n log² n) = 1   (A is the normalizing constant).

However, H(X) = −Σ_n P_n log P_n = +∞ (the series does not converge!). Thus the entropy of a discrete random variable can also be infinite.

Q.1.13
p(x) = 1/a for 0 ≤ x ≤ a, and 0 otherwise.

Differential entropy = −∫₀ᵃ p(x) log p(x) dx = −∫₀ᵃ (1/a) log₂(1/a) dx = log₂ a

[Figure: plot of H(X) = log₂ a versus a.] Note that the differential entropy can be negative (it is negative for a < 1).

Q.1.14 DMS with source probabilities {0.35, 0.25, 0.20, 0.15, 0.05}.

(i) Huffman code. [Tree diagram not preserved in this extract; the resulting codeword lengths are 2, 2, 2, 3 and 3 bits.]

(ii) R̄ = Σ_i p_i l_i = 0.35 × 2 + 0.25 × 2 + 0.20 × 2 + 0.15 × 3 + 0.05 × 3 = 2.2 bits.

(iii) H(X) = Σ_i p_i log(1/p_i) = 2.121 bits

η = H(X)/R̄ = 2.121/2.2 = 0.964 = 96.4%.

Q.1.15 (i) Ternary Huffman code (from the tree in the original figure):

Symbol   Code
S1       2
S2       3
S3       11
S4       12
S5       13

(ii) R̄ = Σ_i p_i l_i = 0.35 × 1 + 0.25 × 1 + 0.20 × 2 + 0.15 × 2 + 0.05 × 2 = 1.40 ternary digits/symbol.
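The figures quoted in Q.1.14 can be reproduced with a short binary Huffman routine. The sketch below is an illustrative check only (the helper huffman_code_lengths is my own, not a routine from the book); it returns the codeword lengths, from which the average length and efficiency follow.

```python
import heapq
import math

def huffman_code_lengths(probs):
    """Return binary Huffman codeword lengths for the given probabilities."""
    # Heap entries: (probability, unique id, symbol indices contained in this subtree).
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    uid = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:              # every merge adds one bit to all symbols below it
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, uid, s1 + s2))
        uid += 1
    return lengths

# Q.1.14: DMS with probabilities {0.35, 0.25, 0.20, 0.15, 0.05}
p = [0.35, 0.25, 0.20, 0.15, 0.05]
lengths = huffman_code_lengths(p)
R = sum(pi * li for pi, li in zip(p, lengths))
H = sum(pi * math.log2(1.0 / pi) for pi in p)
print(lengths)                                    # [2, 2, 2, 3, 3]
print(round(R, 3), round(H, 3), f"{H / R:.1%}")   # 2.2, 2.121, 96.4%
```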
Q.1.16 (i) DMS with source probabilities and an efficient fixed-length code:

Symbol   Probability   Fixed-length code
S1       0.20          000
S2       0.20          001
S3       0.15          010
S4       0.15          011
S5       0.10          100
S6       0.10          101
S7       0.05          110
S8       0.05          111

Average code length R̄ = Σ_{k=1}^{L} n(x_k) P(x_k) = 3 bits.

(ii) Huffman code: R̄ = Σ_{k=1}^{L} n(x_k) P(x_k) = 2.9 bits.

(iii) The entropy H(X) = −Σ_{i=1}^{n} P(x_i) log P(x_i) = 2.8464 bits.

The Huffman code gives the shorter code length (η = 2.8464/2.9 = 98.15%).

Q.1.17 (i)

Symbol   Probability   P log(1/P)   Code
x1       0.5           0.5          0
x2       0.4           0.528        10
x3       0.1           0.332        11

H(X) = 1.36 bits/symbol, R̄₁ = 1.5 bits/symbol, η = 90.66%.

(ii) Coding pairs of symbols (the Huffman tree for the pairs is not preserved in this extract):

H = 2.72 bits/symbol pair, R̄₂ = 2.78 bits/symbol pair, η = 97.84%.

(iii) Coding triples of symbols:

Symbol triple(s)                                   Probability (each)   P log(1/P) (each)
x1x1x1                                             0.125                0.375
x1x1x2, x1x2x1, x2x1x1                             0.1                  0.3322
x1x2x2, x2x1x2, x2x2x1                             0.08                 0.2915
x2x2x2                                             0.064                0.2538
x1x1x3, x1x3x1, x3x1x1                             0.025                0.1330
x1x2x3, x1x3x2, x2x3x1, x2x1x3, x3x1x2, x3x2x1     0.02                 0.11287
x2x2x3, x2x3x2, x3x2x2                             0.016                0.09545
x1x3x3, x3x1x3, x3x3x1                             0.005                0.0382
x2x3x3, x3x2x3, x3x3x2                             0.004                0.03186
x3x3x3                                             0.001                0.00996

With the Huffman code over the 27 triples (e.g., x1x1x1 → 000):

H = 4.0826 bits/symbol triple, R̄₃ = 4.118 bits/symbol triple, η = H(X)/R̄₃ = 99.14%.

Q.1.18 For a B-symbol block x₁x₂…x_B,

H(x₁x₂…x_B) = −Σ_{j₁=1}^{n} Σ_{j₂=1}^{n} … Σ_{j_B=1}^{n} p(x_{j₁} x_{j₂} … x_{j_B}) log p(x_{j₁} x_{j₂} … x_{j_B})

P(x₁x₂…x_B) = p(x₁) p(x₂|x₁) p(x₃|x₁x₂) … p(x_B|x₁x₂…x_{B−1})

Assuming the B random variables to be statistically independent,

∴ H(x₁x₂…x_B) = H(x₁) + H(x₂) + … + H(x_B) = B·H(X)

Q.1.19 Hint: Apply equation 1.27.

Q.1.20 The Lempel-Ziv code for the parsing 0, 1, 00, 11, 111, 001, 01, 000, 0010, 10, 101, 100, 110 gives the dictionary:

Dictionary location   Contents   Codeword
1  (0001)             0          00000
2  (0010)             1          00001
3  (0011)             00         00010
4  (0100)             11         00101
5  (0101)             111        01001
6  (0110)             001        00111
7  (0111)             01         00011
8  (1000)             000        00110
9  (1001)             0010       01100
10 (1010)             10         00100
11 (1011)             101        10101
12 (1100)             100        10100
13 (1101)             110        01000

The encoded stream is the concatenation of the codewords; each codeword is the 4-bit dictionary location of the longest previously parsed prefix followed by the single new bit.

Q.1.21 (i) The Lempel-Ziv code for the parsing 13, 30, 02, 021, 11, 300, 00, 22, 12, 223 gives a dictionary (location, contents, codeword) constructed in the same way as in Q.1.20. [The dictionary table itself is not preserved in this extract.]
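The dictionary of Q.1.20 can be reproduced with a small incremental-parsing encoder. The sketch below is my own illustration (not code from the book); it uses 4-bit dictionary addresses as in the table, with address 0000 reserved for the empty phrase.

```python
def lz_encode(bits, index_bits=4):
    """Parse the bit stream into phrases, each phrase = a previously seen phrase + one new bit.
    The codeword for a phrase is the index of that prefix phrase (index_bits bits) then the new bit."""
    dictionary = {"": 0}        # phrase -> dictionary location (0 = empty phrase)
    phrases, codewords = [], []
    current = ""
    for b in bits:
        if current + b in dictionary:
            current += b        # keep extending the match
            continue
        phrase = current + b
        codewords.append(format(dictionary[current], f"0{index_bits}b") + b)
        dictionary[phrase] = len(dictionary)    # next free location: 1, 2, 3, ...
        phrases.append(phrase)
        current = ""
    return phrases, codewords   # assumes the stream ends exactly on a phrase boundary

# Bit stream whose parsing is 0,1,00,11,111,001,01,000,0010,10,101,100,110 (Q.1.20)
stream = "0" "1" "00" "11" "111" "001" "01" "000" "0010" "10" "101" "100" "110"
phrases, codewords = lz_encode(stream)
print(phrases)    # ['0', '1', '00', '11', '111', '001', '01', '000', '0010', '10', '101', '100', '110']
print(codewords)  # ['00000', '00001', '00010', '00101', '01001', '00111', '00011', ...]
```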
Q.1.22 (i) For a run-length code we encode each run in terms of [how many] and [what]. Therefore, to encode a run of up to n bits we require ⌈log₂ n⌉ + 1 bits, and the run-length code for successive runs of lengths n₁, n₂, n₃, … uses ⌈log₂ n₁⌉ + 1, ⌈log₂ n₂⌉ + 1, ⌈log₂ n₃⌉ + 1, … bits.

Compression it will provide: every run requires ⌈log₂ n⌉ + 1 bits, while the average length of a run is

1·(1/2) + 2·(1/2²) + 3·(1/2³) + … + (n − 1)·(1/2^(n−1)) + n·(1/2^(n−1)) = (2^n − 1)/2^(n−1),

so the average compression is

(⌈log₂ n⌉ + 1) / [ (2^n − 1)/2^(n−1) ] = 2^(n−1)(⌈log₂ n⌉ + 1)/(2^n − 1).

(ii) Huffman coding.

[Figure: Huffman tree over the run lengths, with leaf probabilities 1/2, 1/2², 1/2³, 1/2⁴, …, 1/2^(n−1), 1/2^(n−1) and intermediate node probabilities 3/4, 15/16, 3/16, … .]

Q.1.23 For example, let us assume a run of 2¹⁴ 1's. For the first level of run-length coding we require 14 + 1 = 15 bits. For the second level of run-length coding we need 4 + 1 = 5 bits. This multi-layer run-length coding is useful when we have large runs. To encode a run of n 1's, the maximum possible compression can be calculated as follows:

Bits required for the first level = ⌈log₂ n⌉ + 1
Bits required for the second level = ⌈log₂(⌈log₂ n⌉ + 1)⌉ + 1

So for a run of n 1's the maximum possible compression is obtained by iterating the same step: ⌈log₂(⌈log₂(⌈log₂ n⌉ + 1)⌉ + 1)⌉ + 1, and so on.

SOLUTIONS FOR CHAPTER 2

Q.2.1
p(y = 0) = p(y = 0 | x = 0) p(x = 0) + p(y = 0 | x = 1) p(x = 1) = (1 − p) p0 + q p1 = (1 − p) p0 + q(1 − p0)

p(y = 1) = p(y = 1 | x = 0) p(x = 0) + p(y = 1 | x = 1) p(x = 1) = p p0 + (1 − q)(1 − p0)

p(x = 0 | y = 0) = p(y = 0 | x = 0) p(x = 0) / p(y = 0) = (1 − p) p0 / [ (1 − p) p0 + q(1 − p0) ]

p(x = 1 | y = 1) = p(y = 1 | x = 1) p(x = 1) / p(y = 1) = (1 − q)(1 − p0) / [ p p0 + (1 − q)(1 − p0) ]

Q.2.2
[Channel diagram: input 0 → output 0 with probability 1 − p and output e with probability p; input 1 → output 1 with probability 1 − q and output e with probability q; input probabilities p0 and p1 = 1 − p0.]

p(y = 0) = (1 − p) p0
p(y = e) = p p0 + q(1 − p0)
p(y = 1) = (1 − q)(1 − p0)

I(x; y) = Σ_{i=1}^{2} Σ_{j=1}^{3} P(x_i) P(y_j | x_i) log [ P(y_j | x_i) / P(y_j) ]
= (1 − p) p0 log(1/p0) + (1 − q)(1 − p0) log(1/(1 − p0)) + p p0 log[ p / (p p0 + q(1 − p0)) ] + q(1 − p0) log[ q / (p p0 + q(1 − p0)) ]

Hint: For the capacity, find the value of p0 from dI(X; Y)/dp0 = 0 and then substitute it back into the above expression.

Q.2.3 (a) C_A = max_{p(x)} I(x; y)

I(x; y) = Σ_{i=1}^{n} Σ_{j=1}^{m} P(x_i, y_j) log [ P(x_i, y_j) / (P(x_i) P(y_j)) ]

For channel A,

I(x, y) = p log(1/p) + (1 − p) log(1/(1 − p)) + … = −p log p − (1 − p) log(1 − p) + …

dI(x, y)/dp = −log p − 1 + log(1 − p) + 1 = 0 ⇒ …
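The posterior probabilities of Q.2.1 are easy to check numerically. The sketch below (my own illustration, not from the solution manual) evaluates them for an arbitrary choice of p0, p and q.

```python
def posteriors(p0, p, q):
    """Posterior probabilities for the binary channel of Q.2.1:
    P(y=0|x=0) = 1-p, P(y=1|x=0) = p, P(y=0|x=1) = q, P(y=1|x=1) = 1-q."""
    p1 = 1.0 - p0
    py0 = (1 - p) * p0 + q * p1          # p(y = 0)
    py1 = p * p0 + (1 - q) * p1          # p(y = 1)
    px0_given_y0 = (1 - p) * p0 / py0    # Bayes' rule
    px1_given_y1 = (1 - q) * p1 / py1
    return px0_given_y0, px1_given_y1

print(posteriors(p0=0.5, p=0.1, q=0.2))  # approximately (0.818, 0.889)
```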

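Likewise, the mutual information expression in Q.2.2 can be evaluated and maximised numerically over p0, which is a crude but convenient way to follow the capacity hint. This is an illustrative sketch only; the example values p = q = 0.1 are arbitrary.

```python
import math

def mutual_information(p0, p, q):
    """I(X;Y) for the channel of Q.2.2:
    x=0 -> y=0 w.p. 1-p, y=e w.p. p;  x=1 -> y=e w.p. q, y=1 w.p. 1-q."""
    p1 = 1.0 - p0
    py = {"0": (1 - p) * p0, "e": p * p0 + q * p1, "1": (1 - q) * p1}    # output probabilities
    trans = {(0, "0"): 1 - p, (0, "e"): p, (1, "e"): q, (1, "1"): 1 - q}
    px = {0: p0, 1: p1}
    total = 0.0
    for (x, y), pyx in trans.items():
        if pyx > 0 and px[x] > 0:
            total += px[x] * pyx * math.log2(pyx / py[y])
    return total

# Sweep the input distribution to approximate the capacity for p = q = 0.1.
p, q = 0.1, 0.1
best = max((mutual_information(k / 1000, p, q), k / 1000) for k in range(1, 1000))
print(f"capacity ~ {best[0]:.4f} bits at p0 ~ {best[1]:.3f}")   # ~0.9 bits at p0 = 0.5 (by symmetry)
```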