Kullback–Leibler divergence (Wikipedia): a non-symmetric difference between two distributions:

$$D_{\mathrm{KL}}(P \,\|\, Q) = \sum_x P(x) \log \frac{P(x)}{Q(x)}$$
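A minimal sketch of the definition in Python (the distributions `p` and `q` are made-up examples, given as aligned probability lists):

```python
from math import log2

def kl_divergence(p, q):
    """D_KL(P || Q) = sum over x of P(x) * log(P(x) / Q(x)), in bits.
    Terms with P(x) = 0 contribute nothing by convention."""
    return sum(pi * log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.5]
q = [0.9, 0.1]
# Non-symmetric: swapping the arguments changes the value.
d_pq = kl_divergence(p, q)   # ≈ 0.737 bits
d_qp = kl_divergence(q, p)   # ≈ 0.531 bits
```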

Conditional Entropy:

$$H(X \mid Y) = -\sum_{x,y} p(x, y) \log p(x \mid y)$$

Joint Entropy:

$$H(X, Y) = -\sum_{x,y} p(x, y) \log p(x, y)$$
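Both quantities, plus the chain rule connecting them, can be sketched from a joint distribution (the dict `joint` is a made-up example):

```python
from math import log2

def H(probs):
    """Shannon entropy in bits; zero-probability outcomes are skipped."""
    return -sum(p * log2(p) for p in probs if p > 0)

# Made-up joint distribution p(x, y) over X in {a, b}, Y in {0, 1}.
joint = {('a', 0): 0.25, ('a', 1): 0.25, ('b', 0): 0.5}

H_xy = H(joint.values())          # joint entropy H(X, Y) = 1.5 bits
px = {}
for (x, _), p in joint.items():
    px[x] = px.get(x, 0) + p      # marginal p(x)
H_x = H(px.values())              # H(X) = 1 bit
# Chain rule: H(X, Y) = H(X) + H(Y | X), hence:
H_y_given_x = H_xy - H_x          # H(Y | X) = 0.5 bits
```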

A non-symmetric measure of association (“uncertainty coefficient”?) between $X$ and $Y$ measures the percentage of entropy reduced from $X$ if $Y$ is given:

$$U(X \mid Y) = \frac{H(X) - H(X \mid Y)}{H(X)} = \frac{I(X; Y)}{H(X)}$$

Or the percentage of entropy reduced from $Y$ if $X$ is given:

$$U(Y \mid X) = \frac{H(Y) - H(Y \mid X)}{H(Y)} = \frac{I(X; Y)}{H(Y)}$$

Where $I(X; Y) = H(X) - H(X \mid Y) = H(Y) - H(Y \mid X) = H(X) + H(Y) - H(X, Y)$, the *mutual information* of $X$ and $Y$, is non-negative and symmetric.

Anyway, $U(X \mid Y)$ equals 0 if there is no association, and 1 if knowing $Y$ fully predicts $X$ (i.e. $X$ is a function of $Y$).
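A sketch of the coefficient at its two extremes — independence and exact functional dependence. The joint distributions are made-up examples, given as `{(x, y): probability}` dicts:

```python
from math import log2

def entropy(probs):
    return -sum(p * log2(p) for p in probs if p > 0)

def u_coefficient(joint):
    """U(X | Y) = (H(X) - H(X | Y)) / H(X) = I(X; Y) / H(X),
    for a joint distribution given as {(x, y): probability}."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0) + p
        py[y] = py.get(y, 0) + p
    hx = entropy(px.values())
    hy = entropy(py.values())
    mi = hx + hy - entropy(joint.values())   # I(X;Y) = H(X) + H(Y) - H(X,Y)
    return mi / hx

# No association: X and Y independent -> U(X | Y) = 0.
indep = {(x, y): 0.25 for x in 'ab' for y in (0, 1)}
# X is a function of Y (x = 'a' exactly when y = 0) -> U(X | Y) = 1.
functional = {('a', 0): 0.5, ('b', 1): 0.5}
```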

A *symmetric* measure can be made as *a weighted average* of $U(X \mid Y)$ and $U(Y \mid X)$:

$$U(X, Y) = \frac{H(X)\, U(X \mid Y) + H(Y)\, U(Y \mid X)}{H(X) + H(Y)} = \frac{2\, I(X; Y)}{H(X) + H(Y)}$$
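The weights cancel in the average, which is easy to check numerically (a sketch; the joint-distribution format is the same made-up `{(x, y): p}` convention as above):

```python
from math import log2

def entropy(probs):
    return -sum(p * log2(p) for p in probs if p > 0)

def symmetric_u(joint):
    """Weighted average of U(X|Y) and U(Y|X), with weights H(X) and H(Y);
    algebraically this reduces to 2*I(X;Y) / (H(X) + H(Y))."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0) + p
        py[y] = py.get(y, 0) + p
    hx, hy = entropy(px.values()), entropy(py.values())
    mi = hx + hy - entropy(joint.values())
    # Weighted-average form...
    weighted = (hx * (mi / hx) + hy * (mi / hy)) / (hx + hy)
    # ...equals the simplified form:
    simplified = 2 * mi / (hx + hy)
    assert abs(weighted - simplified) < 1e-12
    return simplified
```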

Relation to $\chi^2$ or measures like Cramer’s V etc.: no obvious relation. Generated some 2×2 contingency tables and plotted their $U(X, Y)$ vs Cramer’s V:
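The experiment can be reproduced with a sketch like the following (the random tables here are freshly generated, not the original ones, and the `matplotlib` scatter call is left out):

```python
import random
from math import log2, sqrt

def entropy(probs):
    return -sum(p * log2(p) for p in probs if p > 0)

def symmetric_u(table):
    """2*I(X;Y) / (H(X) + H(Y)) for a 2x2 table of counts."""
    n = sum(sum(row) for row in table)
    joint = [c / n for row in table for c in row]
    px = [sum(row) / n for row in table]
    py = [sum(col) / n for col in zip(*table)]
    mi = entropy(px) + entropy(py) - entropy(joint)
    return 2 * mi / (entropy(px) + entropy(py))

def cramers_v(table):
    """sqrt(chi2 / (n * (min(r, c) - 1))); for a 2x2 table this is sqrt(chi2 / n)."""
    n = sum(sum(row) for row in table)
    rows = [sum(row) for row in table]
    cols = [sum(col) for col in zip(*table)]
    chi2 = sum((table[i][j] - rows[i] * cols[j] / n) ** 2 / (rows[i] * cols[j] / n)
               for i in range(2) for j in range(2))
    return sqrt(chi2 / n)

random.seed(0)
pairs = []
for _ in range(100):
    t = [[random.randint(1, 50), random.randint(1, 50)],
         [random.randint(1, 50), random.randint(1, 50)]]
    pairs.append((symmetric_u(t), cramers_v(t)))
# pairs can then be scattered with matplotlib to compare the two measures.
```

Both measures live in $[0, 1]$ and agree at the extremes (e.g. a perfectly associated table gives 1 for both), but they need not agree in between.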