
Four sources are generating information as given below

(a) Source1: \({p_1} = \frac{1}{4},{p_2} = \frac{1}{4},{p_3} = \frac{1}{4},{p_4} = \frac{1}{4}\)

(b) Source2: \({p_1} = \frac{1}{2},{p_2} = \frac{1}{4},{p_3} = \frac{1}{8},{p_4} = \frac{1}{8}\)

(c) Source3: \({p_1} = \frac{1}{2},{p_2} = \frac{1}{2},{p_3} = \frac{1}{4},{p_4} = \frac{1}{8}\)

(d) Source4: \({p_1} = \frac{1}{2},{p_2} = \frac{1}{4},{p_3} = \frac{1}{4},{p_4} = \frac{1}{8}\)

Arrange these sources in descending order of their entropy (H).
1. (c), (d), (a), (b)
2. (a), (d), (c), (b)
3. (d), (c), (a), (b)
4. (b), (a), (c), (d)

1 Answer

Best answer
Correct Answer - Option 2: (a), (d), (c), (b)

Concept:

Consider a discrete random variable X that represents the set of M possible symbols to be transmitted at a particular time, taking values x1, x2, …, xM with respective probabilities p1, p2, …, pM.

The entropy H(X) of X is the expected (mean or average) value of the information obtained by learning the outcome of X.

Mathematically, this is calculated as:

\(H = \mathop \sum \limits_{i = 1}^M {p_i}{\log _2}\frac{1}{{{p_i}}}\)
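As a quick sanity check, this formula is easy to evaluate directly. Below is a minimal Python sketch (the `entropy` helper and its name are illustrative choices, not from the source):

```python
import math

def entropy(probs):
    """Shannon entropy in bits/symbol: H = sum of p_i * log2(1/p_i)."""
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

# Example: four equally likely symbols give log2(4) = 2 bits/symbol.
print(entropy([1/4, 1/4, 1/4, 1/4]))  # 2.0
```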

Calculation:

For Source 1:

p1 = p2 = p3 = p4 = 1/4

\(H = \left( {\frac{1}{4}} \right){\log _2}\left( 4 \right) + \frac{1}{4}{\log _2}\left( 4 \right) + \frac{1}{4}{\log _2}\left( 4 \right) + \frac{1}{4}{\log _2}\left( 4 \right)\)

\(= \frac{1}{4}\left( 2 \right) + \frac{1}{4}\left( 2 \right) + \frac{1}{4}\left( 2 \right) + \frac{1}{4}\left( 2 \right)\)

\(H = \frac{8}{4} = 2\;bits/symbol\)

For Source 2:

p1 = 1/2, p2 = 1/4, p3 = 1/8, p4 = 1/8

\(H = \frac{1}{2}{\log _2}2 + \frac{1}{4}{\log _2}4 + \frac{1}{8}{\log _2}8 + \frac{1}{8}{\log _2}8\)

\(= \frac{1}{2}\left( 1 \right) + \frac{1}{4}\left( 2 \right) + \frac{1}{8}\left( 3 \right) + \frac{1}{8}\left( 3 \right)\)

\(H = \frac{1}{2} + \frac{1}{2} + \frac{3}{8} + \frac{3}{8}\)

\(H = 1.75\;bits/symbol\)

For Source 3:

p1 = 1/2, p2 = 1/2, p3 = 1/4, p4 = 1/8

\(H = \frac{1}{2}{\log _2}\left( 2 \right) + \frac{1}{2}{\log _2}\left( 2 \right) + \frac{1}{4}{\log _2}\left( 4 \right) + \frac{1}{8}{\log _2}\left( 8 \right)\)

\(H = \frac{1}{2} + \frac{1}{2} + \frac{2}{4} + \frac{3}{8}\)

\(H = \frac{{15}}{8} = 1.875\;bits/symbol\)

For Source 4:

p1 = 1/2, p2 = 1/4, p3 = 1/4, p4 = 1/8

\(H = \frac{1}{2}{\log _2}\left( 2 \right) + \frac{1}{4}{\log _2}\left( 4 \right) + \frac{1}{4}{\log _2}\left( 4 \right) + \frac{1}{8}{\log _2}\left( 8 \right)\)

\(H = \frac{1}{2} + \frac{2}{4} + \frac{2}{4} + \frac{3}{8}\)

\(H = \frac{{15}}{8} = 1.875\;bits/symbol\)

Arranging the sources in decreasing order of their entropy:

H(S1) > H(S4) = H(S3) > H(S2)

With the values computed above, sources 3 and 4 are tied at 1.875 bits/symbol. Only option (2) places (a) first and (b) last, so option (2) is the correct choice.
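As a cross-check, all four entropies can be computed in one pass (a minimal Python sketch; the source labels and dictionary layout are my own, not from the question):

```python
import math

def entropy(probs):
    """Shannon entropy in bits/symbol: H = sum of p_i * log2(1/p_i)."""
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

sources = {
    "a": [1/4, 1/4, 1/4, 1/4],   # Source 1
    "b": [1/2, 1/4, 1/8, 1/8],   # Source 2
    "c": [1/2, 1/2, 1/4, 1/8],   # Source 3, as stated in the question
    "d": [1/2, 1/4, 1/4, 1/8],   # Source 4, as stated in the question
}

for name, probs in sources.items():
    print(f"H({name}) = {entropy(probs):.3f} bits/symbol")
# H(a) = 2.000
# H(b) = 1.750
# H(c) = 1.875
# H(d) = 1.875
```

The printed values reproduce the hand calculation above, including the tie between sources (c) and (d).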

Note: Entropy is maximum when all symbols are equally likely.
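For M equally likely symbols, this maximum is \({H_{\max }} = {\log _2}M\); with M = 4, \({H_{\max }} = {\log _2}4 = 2\;bits/symbol\), which is exactly the entropy of Source 1.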
