bn:00975913n
Noun Concept
Categories: Loss functions, Entropy and information
EN
cross-entropy · cross entropy · Cross entropy loss · Crossentropy · Minxent
In information theory, the cross-entropy between two probability distributions p and q over the same underlying set of events measures the average number of bits needed to identify an event drawn from the set if the coding scheme used for the set is optimized for an estimated probability distribution q, rather than the true distribution p. Wikipedia
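The definition above can be sketched numerically. A minimal illustration, assuming discrete distributions given as probability lists, computing H(p, q) = -Σ p(x) log₂ q(x) in bits (the distribution names p and q follow the definition; the function name is just for illustration):

```python
import math

def cross_entropy(p, q):
    """H(p, q) = -sum_x p(x) * log2(q(x)), in bits.

    Terms with p(x) == 0 contribute nothing and are skipped.
    """
    return -sum(pi * math.log2(qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.25, 0.25]

# Coding optimized for the true distribution: H(p, p) equals the entropy of p.
print(cross_entropy(p, p))  # 1.5 bits

# Coding optimized for a mismatched estimate q costs extra bits on average,
# so H(p, q) >= H(p, p).
q = [0.25, 0.25, 0.5]
print(cross_entropy(p, q))  # 1.75 bits
```

The gap H(p, q) − H(p, p) is the Kullback–Leibler divergence, i.e. the average number of extra bits paid for using the wrong code.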
Definitions
Relations
Sources
Method for comparing probability distributions Wikidata
STUDIED BY