bn:03803694n
Noun Concept
Categories: Entropy and information
EN
joint entropy
In information theory, joint entropy is a measure of the uncertainty associated with a set of variables. (Wikipedia)
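The definition above can be sketched numerically: for discrete variables, the joint entropy is H(X, Y) = -Σ p(x, y) log₂ p(x, y), summed over all observed pairs. The function and sample data below are an illustrative assumption, not part of the original entry.

```python
# Minimal sketch: estimating the joint entropy H(X, Y) of two discrete
# variables from a list of observed (x, y) pairs.
import math
from collections import Counter

def joint_entropy(pairs):
    """H(X, Y) = -sum over (x, y) of p(x, y) * log2 p(x, y)."""
    counts = Counter(pairs)
    n = len(pairs)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Example: four equally likely (x, y) pairs, so each has p = 1/4 and
# H(X, Y) = log2(4) = 2 bits.
samples = [(0, 0), (0, 1), (1, 0), (1, 1)]
print(joint_entropy(samples))  # 2.0
```

If X and Y always take the same value together, the joint distribution collapses to fewer outcomes and the joint entropy drops accordingly.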
Definitions
Measure of information in probability and information theory. (Wikidata)
Sources
Wikipedia
Wikidata