In information theory, the conditional entropy quantifies the amount of information needed to describe the outcome of a random variable $Y$ given that the value of another random variable $X$ is known.
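As a minimal illustration of this definition, the sketch below computes the conditional entropy $H(Y\mid X) = -\sum_{x,y} p(x,y)\log_2 p(y\mid x)$ from a joint distribution given as a dict; the function name and input format are my own choices, not part of the original text.

```python
import math

def conditional_entropy(joint):
    """H(Y|X) in bits, where `joint` maps (x, y) pairs to
    probabilities p(x, y) that sum to 1 (an illustrative format)."""
    # Marginal distribution p(x)
    px = {}
    for (x, _), p in joint.items():
        px[x] = px.get(x, 0.0) + p
    # H(Y|X) = -sum over (x, y) of p(x, y) * log2(p(y|x)),
    # using p(y|x) = p(x, y) / p(x)
    h = 0.0
    for (x, _), p in joint.items():
        if p > 0:
            h -= p * math.log2(p / px[x])
    return h

# Y fully determined by X: knowing X leaves nothing to describe
print(conditional_entropy({(0, 0): 0.5, (1, 1): 0.5}))  # 0.0

# Y independent of X and uniform on {0, 1}: one full bit remains
uniform = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}
print(conditional_entropy(uniform))  # 1.0
```

The two extremes bracket the general case: $H(Y\mid X)$ is 0 when $Y$ is a function of $X$ and equals $H(Y)$ when the variables are independent.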