bn:00975913n
Noun Concept
EN
In information theory, the cross-entropy between two probability distributions p and q over the same underlying set of events measures the average number of bits needed to identify an event drawn from the set if the coding scheme used for the set is optimized for an estimated probability distribution q, rather than the true distribution p. (Wikipedia)
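As a concrete illustration (not part of the BabelNet entry), the cross-entropy is H(p, q) = -sum_x p(x) log2 q(x) when measured in bits. A minimal Python sketch, assuming the two discrete distributions are given as aligned lists of probabilities over the same events:

import math

def cross_entropy(p, q):
    # H(p, q) = -sum_x p(x) * log2 q(x), in bits.
    # p: true distribution; q: estimated distribution used for coding.
    return -sum(pi * math.log2(qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.25, 0.25]
q = [0.25, 0.25, 0.5]
print(cross_entropy(p, p))  # 1.5 bits: entropy of p (code matched to p)
print(cross_entropy(p, q))  # 1.75 bits: extra cost of coding with q instead of p

Coding with the mismatched distribution q costs more bits on average than coding with the true distribution p; the gap is the Kullback-Leibler divergence D(p || q).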