Chain Rule of Joint Entropy | Information Theory 5 | Cover-Thomas Section 2.2

First published at 01:58 UTC on June 11th, 2021.

H(X, Y) = H(X) + H(Y | X). In other words, the entropy (that is, the uncertainty) of two variables is the entropy of one plus the conditional entropy of the other given the first. In particular, if the variables are independent, then H(Y | X) = H(Y), so the uncertainty of two independent variables is simply the sum of their individual uncertainties: H(X, Y) = H(X) + H(Y).
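The chain rule is easy to check numerically. Below is a minimal sketch in Python that verifies it on a small joint distribution; the distribution pxy and the helper entropy() are illustrative examples, not taken from the video.

import numpy as np

def entropy(p):
    # Shannon entropy in bits of a probability vector; 0 log 0 is taken as 0.
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# An assumed example joint distribution p(x, y): rows index X, columns index Y.
pxy = np.array([[1/4, 1/4],
                [1/2, 0.0]])

px = pxy.sum(axis=1)           # marginal distribution of X
H_xy = entropy(pxy.ravel())    # joint entropy H(X, Y)
H_x = entropy(px)              # H(X)

# Conditional entropy H(Y | X) = sum over x of p(x) * H(Y | X = x).
H_y_given_x = sum(px[i] * entropy(pxy[i] / px[i])
                  for i in range(len(px)) if px[i] > 0)

print(H_xy)                # 1.5 bits
print(H_x + H_y_given_x)   # 1.5 bits, matching H(X) + H(Y | X)

For an independent pair, e.g. pxy = np.outer([1/2, 1/2], [1/2, 1/2]), the same script gives H(X, Y) = H(X) + H(Y) = 2 bits, illustrating the special case above.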
