Wiktionary
joint entropy

n. (information theory) The Shannon entropy of a "script" whose "characters" are elements of the Cartesian product of the sets of characters of the component scripts.
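
Concretely, for two discrete random variables X and Y with joint probability mass function p(x, y), this definition reduces to the standard formula (a sketch in conventional notation; the symbols X, Y, and p(x, y) are not part of the entry itself):

```latex
% Joint entropy in bits; the double sum runs over the
% Cartesian product of the two alphabets \mathcal{X} \times \mathcal{Y}.
H(X, Y) = -\sum_{x \in \mathcal{X}} \sum_{y \in \mathcal{Y}} p(x, y) \log_2 p(x, y)
```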

Wikipedia
Joint entropy

In information theory, joint entropy is a measure of the uncertainty associated with a set of random variables.
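
A minimal sketch of how the quantity can be computed, assuming the joint distribution is given as a table mapping outcome pairs to probabilities (the function name joint_entropy and the coin example are illustrative, not from either entry):

```python
import math

def joint_entropy(joint_probs):
    """Shannon joint entropy H(X, Y) in bits, for a joint
    distribution given as a dict {(x, y): p(x, y)}."""
    h = 0.0
    for p in joint_probs.values():
        if p > 0:  # by convention, 0 * log(0) is taken to be 0
            h -= p * math.log2(p)
    return h

# Example: two fair coins flipped independently. Each of the
# four (x, y) outcomes has probability 1/4, so H(X, Y) = 2 bits.
coins = {(x, y): 0.25 for x in "HT" for y in "HT"}
print(joint_entropy(coins))  # 2.0
```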