bn:03801108n
Noun Concept
Categories: Entropy and information, Information theory
EN
conditional entropy, average conditional information content, conditional information, equivocation, mean conditional information content
Definitions
In information theory, the conditional entropy quantifies the amount of information needed to describe the outcome of a random variable Y given that the value of another random variable X is known. Wikipedia
Measure of relative information in probability theory and information theory. Wikidata
The portion of a random variable's own Shannon entropy which is independent from another, given, random variable. Wiktionary
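The glosses above paraphrase the standard definition; as a sketch in LaTeX notation (a standard identity, not quoted verbatim from any of the sources above), for discrete random variables X and Y with joint distribution p(x, y):

H(Y \mid X) = -\sum_{x \in \mathcal{X}} \sum_{y \in \mathcal{Y}} p(x, y) \log p(y \mid x) = \sum_{x \in \mathcal{X}} p(x) \, H(Y \mid X = x)

Equivalently, it is the entropy remaining in Y once X is observed, averaged over the values of X.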
Examples
The conditional entropy of a random variable Y given X (i.e., conditioned on X), denoted H(Y|X), is equal to H(Y) − I(Y;X), where I(Y;X) is the mutual information between Y and X. Wiktionary
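A minimal numerical sketch of this identity in Python (NumPy assumed; the joint distribution and function names are illustrative, not drawn from the sources above):

import numpy as np

def entropy(p):
    # Shannon entropy in bits, with the convention 0 * log2(0) = 0
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def conditional_entropy(joint):
    # H(Y|X) = sum_x p(x) * H(Y | X = x); rows of `joint` index x, columns index y
    px = joint.sum(axis=1)
    return float(sum(p_x * entropy(row / p_x)
                     for row, p_x in zip(joint, px) if p_x > 0))

# Hypothetical joint distribution p(x, y) chosen for illustration
joint = np.array([[0.4, 0.1],
                  [0.1, 0.4]])

h_y   = entropy(joint.sum(axis=0))   # H(Y)   = 1.0000 bits
h_y_x = conditional_entropy(joint)   # H(Y|X) ~ 0.7219 bits
print(h_y - h_y_x)                   # I(Y;X) = H(Y) - H(Y|X) ~ 0.2781 bits

In this toy example, observing X removes about 0.28 of the one bit of uncertainty in Y; that reduction is exactly the mutual information I(Y;X).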