bn:03171158n
Noun Concept
Categories: Information theory, Entropy and information
mutual information · average mutual information · Coefficient of constraint · Coefficient of uncertainty · Mutual-information
Definitions
In probability theory and information theory, the mutual information of two random variables is a measure of the mutual dependence between the two variables. Wikipedia
The intersection of multiple information sets. Wikipedia Disambiguation
Measure of dependence between two variables. Wikidata
A measure of the entropic (informational) correlation between two random variables. Wiktionary
Measure of the entropic correlation. Wiktionary (translation)
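For context (a recap of standard identities, ours rather than part of any gloss above), the quantity these definitions describe can be written in several equivalent forms:

```latex
\begin{aligned}
I(X;Y) &= H(X) - H(X \mid Y) = H(Y) - H(Y \mid X) \\
       &= H(X) + H(Y) - H(X,Y) \\
       &= \sum_{x}\sum_{y} p_{X,Y}(x,y)\,\log_b \frac{p_{X,Y}(x,y)}{p_X(x)\,p_Y(y)}
\end{aligned}
```

The last form is the Kullback-Leibler divergence of the joint distribution p_{X,Y} from the product of the marginals p_X p_Y; it is zero exactly when X and Y are independent.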
Examples
Mutual information $I(X;Y)$ between two random variables $X$ and $Y$ is what is left of their joint entropy $H(X,Y)$ after subtracting the conditional entropies $H(Y \mid X)$ and $H(X \mid Y)$. It can be given by the formula
$$ I(X;Y) = -\sum_{x}\sum_{y} p_{X,Y}(x,y)\,\log_b \frac{p_{X,Y}(x,y)}{p_{X|Y}(x \mid y)\,p_{Y|X}(y \mid x)}. $$
Wiktionary
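As an illustration (this sketch is ours, not part of the Wiktionary entry; the toy joint distribution is made up for the example), the formula can be evaluated directly with Python/NumPy and cross-checked against the equivalent Kullback-Leibler form:

```python
import numpy as np

# Hypothetical joint distribution p(x, y) of two binary variables;
# rows index x, columns index y. Values are illustrative only.
p_xy = np.array([[0.30, 0.10],
                 [0.15, 0.45]])
assert np.isclose(p_xy.sum(), 1.0)

p_x = p_xy.sum(axis=1)             # marginal p(x)
p_y = p_xy.sum(axis=0)             # marginal p(y)
p_x_given_y = p_xy / p_y           # p(x|y): each column divided by p(y)
p_y_given_x = p_xy / p_x[:, None]  # p(y|x): each row divided by p(x)

# Wiktionary form: I(X;Y) = -sum_{x,y} p(x,y) log_b[ p(x,y) / (p(x|y) p(y|x)) ]
mi_wiktionary = -(p_xy * np.log2(p_xy / (p_x_given_y * p_y_given_x))).sum()

# Standard form: I(X;Y) = sum_{x,y} p(x,y) log_b[ p(x,y) / (p(x) p(y)) ]
mi_standard = (p_xy * np.log2(p_xy / np.outer(p_x, p_y))).sum()

print(f"I(X;Y), Wiktionary formula: {mi_wiktionary:.6f} bits")
print(f"I(X;Y), standard formula:   {mi_standard:.6f} bits")
assert np.isclose(mi_wiktionary, mi_standard)
```

With base-2 logarithms (b = 2) the result is in bits; np.log would give nats. The two formulas agree because p(x|y) p(y|x) = p(x,y)^2 / (p(x) p(y)).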