Correct Answer - Option 2 : (a), (d), (c), (b)
Concept:
Consider a discrete random variable X that represents the set of possible symbols to be transmitted at a particular time, taking one of M possible values x1, x2, …, xM with respective probabilities pX(x1), pX(x2), …, pX(xM).
The entropy H(X) of X is the expected (mean, or average) value of the information gained by learning the outcome of X.
Mathematically, this is calculated as:
\(H = \mathop \sum \limits_{i = 1}^M {p_i}{\log _2}\frac{1}{{{p_i}}}\)
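The formula is straightforward to evaluate numerically. The following Python sketch (an illustration only; the helper name entropy is ours, not part of the question) computes H for any list of symbol probabilities:

```python
import math

def entropy(probs):
    """Shannon entropy in bits/symbol: H = sum of p * log2(1/p) over all symbols."""
    # Symbols with p = 0 contribute nothing, since p * log2(1/p) -> 0 as p -> 0.
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

# Quick check with four equally likely symbols (Source 1 below):
print(entropy([1/4, 1/4, 1/4, 1/4]))  # 2.0
```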
Calculation:
For Source 1:
p1 = p2 = p3 = p4 = 1/4
\(H = \left( {\frac{1}{4}} \right){\log _2}\left( 4 \right) + \frac{1}{4}{\log _2}\left( 4 \right) + \frac{1}{4}{\log _2}\left( 4 \right) + \frac{1}{4}{\log _2}\left( 4 \right)\)
\(= \frac{1}{4}\left( 2 \right) + \frac{1}{4}\left( 2 \right) + \frac{1}{4}\left( 2 \right) + \frac{1}{4}\left( 2 \right)\)
\(H = \frac{8}{4} = 2\;bits/symbol\)
For Source 2:
p1 = 1/2, p2 = 1/4, p3 = 1/8, p4 = 1/8
\(H = \frac{1}{2}{\log _2}2 + \frac{1}{4}{\log _2}4 + \frac{1}{8}{\log _2}8 + \frac{1}{8}{\log _2}8\)
\(= \frac{1}{2}\left( 1 \right) + \frac{1}{4}\left( 2 \right) + \frac{1}{8}\left( 3 \right) + \frac{1}{8}\left( 3 \right)\)
\(H = \frac{1}{2} + \frac{1}{2} + \frac{3}{8} + \frac{3}{8}\)
\(H = 1.75\;bits/symbol\)
For Source 3:
p1 = 1/2, p2 = 1/4, p3 = 1/8, p4 = 1/16, p5 = 1/16
\(H = \frac{1}{2}{\log _2}\left( 2 \right) + \frac{1}{4}{\log _2}\left( 4 \right) + \frac{1}{8}{\log _2}\left( 8 \right) + \frac{1}{16}{\log _2}\left( {16} \right) + \frac{1}{16}{\log _2}\left( {16} \right)\)
\(H = \frac{1}{2} + \frac{1}{2} + \frac{3}{8} + \frac{1}{4} + \frac{1}{4}\)
\(H = \frac{{15}}{8} = 1.875\;bits/symbol\)
For Source 4:
p1 = 1/2, p2 = 1/4, p3 = 1/4, p4 = 1/8
\(H = \frac{1}{2}{\log _2}\left( 2 \right) + \frac{1}{4}{\log _2}\left( 4 \right) + \frac{1}{4}{\log _2}\left( 4 \right) + \frac{1}{8}{\log _2}\left( 8 \right)\)
\(H = \frac{1}{2} + \frac{2}{4} + \frac{2}{4} + \frac{3}{8}\)
\(H = \frac{{15}}{8} = 1.875\;bits/symbol\)
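As a cross-check of the four hand calculations, the same kind of helper can re-evaluate every source from the probability lists used above:

```python
import math

def entropy(probs):
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

# Probability lists as used in the worked calculations above.
sources = {
    "S1": [1/4, 1/4, 1/4, 1/4],
    "S2": [1/2, 1/4, 1/8, 1/8],
    "S3": [1/2, 1/4, 1/8, 1/16, 1/16],
    "S4": [1/2, 1/4, 1/4, 1/8],
}
for name, probs in sources.items():
    print(f"{name}: {entropy(probs)} bits/symbol")
# S1: 2.0, S2: 1.75, S3: 1.875, S4: 1.875
```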
Arranging the sources in decreasing order of entropy, we can write:
H(S1) > H(S4) ≥ H(S3) > H(S2)
Option (2) is therefore correct.
Note: Entropy is always maximum when the symbols are equally likely, in which case \(H = {\log _2}M\) bits/symbol, as the sketch below illustrates.
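The note can be verified numerically: an M-symbol source has entropy at most log2 M bits, the bound is reached only when every probability equals 1/M, and any skew in the probabilities lowers H. A minimal sketch:

```python
import math

def entropy(probs):
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

M = 4
print(math.log2(M))                   # 2.0  -> upper bound for a 4-symbol source
print(entropy([1/M] * M))             # 2.0  -> attained by equally likely symbols (Source 1)
print(entropy([1/2, 1/4, 1/8, 1/8]))  # 1.75 -> a non-uniform source falls below the bound
```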