bn:03171158n
Noun Concept
Categories: Entropy and information, Information theory
EN: mutual information, average mutual information, Coefficient of constraint, Coefficient of uncertainty, Mutual-information
Definitions
EN
In probability theory and information theory, the mutual information of two random variables is a measure of the mutual dependence between the two variables. Wikipedia
The intersection of multiple information sets Wikipedia Disambiguation
Measure of dependence between two variables Wikidata
A measure of the entropic (informational) correlation between two random variables. Wiktionary
Measure of the entropic correlation. Wiktionary (translation)
Examples
EN
Mutual information $I(X;Y)$ between two random variables $X$ and $Y$ is what is left over when their mutual conditional entropies $H(Y|X)$ and $H(X|Y)$ are subtracted from their joint entropy $H(X,Y)$. It can be given by the formula $I(X;Y) = -\sum_x \sum_y p_{X,Y}(x,y) \log_b \frac{p_{X,Y}(x,y)}{p_{X|Y}(x|y)\, p_{Y|X}(y|x)}$. Wiktionary