Correct Answer - Option 2 : 4500 bits/sec
Concept:
The information associated with an event is inversely related to its probability of occurrence: the less probable the event, the more information it conveys, \(I_i = \log_2\left(\frac{1}{P_i}\right)\) bits.
Entropy: The average information per symbol is called the entropy.
\(H = \sum\limits_i P_i \log_2\left(\frac{1}{P_i}\right)\ \text{bits/symbol}\)
Rate of information: \(R = r \times H\) bits/sec, where r is the symbol rate in symbols/sec.
Calculation:
Given:
Three symbols with probabilities 0.25, 0.25, and 0.50, transmitted at a rate of 3000 symbols per second.
The entropy is:
\(H = 0.25\log_2\left(\frac{1}{0.25}\right) + 0.25\log_2\left(\frac{1}{0.25}\right) + 0.5\log_2\left(\frac{1}{0.5}\right)\)
\(= 0.25 \times 2 + 0.25 \times 2 + 0.5 \times 1\)
= 1.5 bits/symbol
Rate of information: \(R = r \times H\)
= 3000 × 1.5
= 4500 bits/sec
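As a quick numerical cross-check, here is a minimal Python sketch (the `entropy` helper and variable names are illustrative, not part of the original solution) that reproduces H = 1.5 bits/symbol and R = 4500 bits/sec from the given probabilities and symbol rate:

```python
import math

def entropy(probs):
    """Average information per symbol: H = sum(P * log2(1/P)) bits/symbol."""
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

# Values given in the problem statement
probs = [0.25, 0.25, 0.50]   # symbol probabilities
symbol_rate = 3000           # symbols per second

H = entropy(probs)           # 1.5 bits/symbol
R = symbol_rate * H          # 4500 bits/sec

print(f"H = {H} bits/symbol")
print(f"R = {R} bits/sec")
```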