3D Visualization of Entropy
- Physics/Thermodynamics: A 3D representation of particles dispersing over time (see the sketch after this list).
- Information Theory: A dynamic graph showing increasing randomness in a data system.
- Abstract/Artistic: A colorful, chaotic structure that morphs and evolves.
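The physics-style picture in the first bullet can be prototyped in a few lines. The following is a minimal sketch (assuming NumPy and Matplotlib are available; all parameter values are illustrative) that scatters particles undergoing a 3D random walk at three different times, so the cloud visibly disperses:

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
n_particles, n_steps, step_size = 500, 300, 0.05

# Brownian-like motion: cumulative sum of small Gaussian steps, starting at the origin.
steps = rng.normal(scale=step_size, size=(n_steps, n_particles, 3))
positions = np.cumsum(steps, axis=0)

fig = plt.figure(figsize=(12, 4))
for k, t in enumerate((10, 100, 299), start=1):
    ax = fig.add_subplot(1, 3, k, projection="3d")
    ax.scatter(*positions[t].T, s=2)          # unpack x, y, z coordinates
    ax.set_title(f"step {t}")
    ax.set_xlim(-3, 3); ax.set_ylim(-3, 3); ax.set_zlim(-3, 3)
plt.tight_layout()
plt.show()
```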
Shannon Entropy Proof
Definition of Entropy
For a discrete random variable \( X \) with possible outcomes \( x_1, \dots, x_n \), entropy is defined as:
\[ H(X) = - \sum_{i=1}^{n} P(x_i) \log_b P(x_i) \]
where \( P(x_i) \) is the probability of outcome \( x_i \) and \( b \) is the base of the logarithm (\( b = 2 \) gives entropy in bits).
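As a quick numerical companion to this definition, here is a minimal sketch (assuming NumPy; the probability vectors are illustrative) of computing \( H(X) \) in bits:

```python
import numpy as np

def shannon_entropy(p, base=2):
    """H(X) = -sum_i p_i * log_b(p_i), skipping zero-probability terms."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                      # treat 0 * log(0) as 0
    return -np.sum(p * np.log(p)) / np.log(base)

print(shannon_entropy([0.5, 0.5]))    # fair coin: exactly 1 bit
print(shannon_entropy([0.9, 0.1]))    # biased coin: ~0.469 bits
```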
Entropy of Equally Likely Outcomes
For a uniform distribution where each outcome has probability \( P(x_i) = \frac{1}{n} \):
\[ H(X) = - \sum_{i=1}^{n} \frac{1}{n} \log_b \frac{1}{n} \]
Since the sum has \( n \) identical terms, this simplifies to:
\[ H(X) = - n \cdot \frac{1}{n} \cdot \log_b \frac{1}{n} = - \log_b \frac{1}{n} = \log_b n \]
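A short numerical check of this result (a sketch assuming NumPy; the values of \( n \) are arbitrary) confirms that the uniform distribution over \( n \) outcomes has entropy \( \log_2 n \) bits:

```python
import numpy as np

# Uniform distribution over n outcomes: H(X) should equal log2(n) bits.
for n in (2, 4, 8, 100):
    p = np.full(n, 1.0 / n)
    entropy_bits = -np.sum(p * np.log2(p))
    print(f"n={n}: H={entropy_bits:.4f}, log2(n)={np.log2(n):.4f}")
```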
Entropy of Independent Random Variables
If \( X \) and \( Y \) are independent, then their joint probability distribution satisfies:
\[ P(x_i, y_j) = P(x_i) P(y_j) \]
Thus, the entropy of their joint distribution is:
\[ H(X, Y) = - \sum_{i,j} P(x_i, y_j) \log_b P(x_i, y_j) \]
Expanding using independence:
\[ H(X, Y) = - \sum_{i,j} P(x_i) P(y_j) \log_b (P(x_i) P(y_j)) \]
Using the logarithm property \( \log_b(uv) = \log_b u + \log_b v \):
\[ H(X, Y) = - \sum_{i,j} P(x_i) P(y_j) (\log_b P(x_i) + \log_b P(y_j)) \]
Separating the sums:
\[ H(X, Y) = - \left( \sum_{j} P(y_j) \right) \sum_{i} P(x_i) \log_b P(x_i) - \left( \sum_{i} P(x_i) \right) \sum_{j} P(y_j) \log_b P(y_j) \]
Since \( \sum_{i} P(x_i) = \sum_{j} P(y_j) = 1 \), each bracketed factor equals 1, leaving:
\[ H(X, Y) = H(X) + H(Y) \]
Thus, entropy is additive for independent random variables.
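The additivity property can also be checked numerically. The sketch below (assuming NumPy; the marginal distributions are made up for illustration) builds the joint distribution of two independent variables as an outer product and compares \( H(X, Y) \) with \( H(X) + H(Y) \):

```python
import numpy as np

def entropy_bits(p):
    """H = -sum p log2 p, ignoring zero entries."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Two arbitrary (illustrative) marginal distributions.
px = np.array([0.2, 0.5, 0.3])
py = np.array([0.6, 0.4])

# Independence: the joint distribution is the outer product P(x_i) P(y_j).
pxy = np.outer(px, py)

print(entropy_bits(pxy))                      # H(X, Y)
print(entropy_bits(px) + entropy_bits(py))    # H(X) + H(Y) -- the same value
```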

