
Wiktionary
conditional entropy

n. (information theory) The portion of a random variable's own Shannon entropy that is independent of another, given random variable.

Wikipedia
Conditional entropy

In information theory, the conditional entropy (or equivocation) quantifies the amount of information needed to describe the outcome of a random variable Y given that the value of another random variable X is known. Here, information is measured in shannons, nats, or hartleys. The entropy of Y conditioned on X is written as H(Y|X).
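For discrete random variables, the standard definition expands as a sum over the joint distribution:

$$
H(Y \mid X) = -\sum_{x \in \mathcal{X}} \sum_{y \in \mathcal{Y}} p(x, y) \log p(y \mid x)
$$

A minimal Python sketch of this formula, measuring the result in shannons (bits) via base-2 logarithms; the joint probability tables `p` and `q` below are hypothetical examples, not taken from the source:

```python
import numpy as np

def conditional_entropy(p_xy):
    """H(Y|X) = -sum over x,y of p(x,y) * log2 p(y|x), in bits."""
    p_x = p_xy.sum(axis=1, keepdims=True)  # marginal p(x)
    # p(y|x) = p(x,y) / p(x); leave zeros where p(x) = 0
    p_y_given_x = np.divide(p_xy, p_x, out=np.zeros_like(p_xy), where=p_x > 0)
    mask = p_xy > 0                        # treat 0 * log 0 as 0
    terms = np.zeros_like(p_xy)
    terms[mask] = p_xy[mask] * np.log2(p_y_given_x[mask])
    return -terms.sum()

# Y fully determined by X: knowing X leaves no uncertainty about Y
p = np.array([[0.5, 0.0],
              [0.0, 0.5]])
print(conditional_entropy(p))  # 0.0

# X and Y independent fair coins: knowing X tells nothing about Y
q = np.full((2, 2), 0.25)
print(conditional_entropy(q))  # 1.0
```

As the two examples illustrate, H(Y|X) ranges from 0 (when Y is a function of X) up to H(Y) (when X and Y are independent).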