Entropy — Information-Theoretic Measures
Shannon entropy, cross-entropy, KL divergence, mutual information, normalized entropy, and conditional entropy. Supports bits, nats, and hartleys. Accepts probabilities or raw counts. Pure Rust via ne
1 tool: entropy
entropy — Compute information-theoretic entropy measures. Modes: shannon (H), cross (H_cross), kl (D_KL), mutual (MI), normalized (H_norm), conditional (H_cond).
Parameters: mode, distribution_p
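As a rough illustration of what the shannon and kl modes compute, here is a minimal Rust sketch (not this tool's actual API — function names, signatures, and the log-base handling are assumptions for the example). It normalizes raw counts into probabilities, then evaluates H(p) = -Σ p_i log p_i and D_KL(p‖q) = Σ p_i log(p_i / q_i) in a selectable base (2 for bits, e for nats, 10 for hartleys):

```rust
// Hypothetical sketch, not the tool's real implementation.

/// Logarithm in an arbitrary base, so one function covers bits/nats/hartleys.
fn log_base(x: f64, base: f64) -> f64 {
    x.ln() / base.ln()
}

/// Normalize raw counts into a probability distribution (sums to 1).
fn to_probs(counts: &[f64]) -> Vec<f64> {
    let total: f64 = counts.iter().sum();
    counts.iter().map(|c| c / total).collect()
}

/// Shannon entropy H(p) = -sum_i p_i * log(p_i); zero-probability terms
/// contribute nothing and are skipped to avoid log(0).
fn shannon(p: &[f64], base: f64) -> f64 {
    p.iter()
        .filter(|&&pi| pi > 0.0)
        .map(|&pi| -pi * log_base(pi, base))
        .sum()
}

/// KL divergence D_KL(p || q) = sum_i p_i * log(p_i / q_i).
fn kl(p: &[f64], q: &[f64], base: f64) -> f64 {
    p.iter()
        .zip(q)
        .filter(|(&pi, _)| pi > 0.0)
        .map(|(&pi, &qi)| pi * log_base(pi / qi, base))
        .sum()
}

fn main() {
    // Raw counts are accepted and normalized first, as the blurb describes.
    let p = to_probs(&[1.0, 1.0, 1.0, 1.0]); // uniform over 4 outcomes
    println!("H(p)       = {} bits", shannon(&p, 2.0)); // ≈ 2 bits
    let q = [0.5, 0.25, 0.125, 0.125];
    println!("D_KL(p||q) = {} bits", kl(&p, &q, 2.0)); // ≈ 0.25 bits
}
```

A uniform distribution over four outcomes gives the maximum entropy of 2 bits, which also makes the normalized-entropy mode easy to picture: H divided by log(n) of the support size, yielding 1.0 here.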