Shannon Entropy
Shannon entropy, a cornerstone of Information Theory, quantifies the average uncertainty or "surprise" inherent in the possible outcomes of a random variable. By Shannon's source coding theorem, it gives the minimum average number of bits per symbol needed to losslessly encode messages from a source, reflecting the source's inherent unpredictability and setting the theoretical limit for Data Compression.
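For a discrete random variable $X$ with outcome probabilities $p(x)$, the entropy in bits is

$$H(X) = -\sum_{x} p(x)\,\log_2 p(x)$$

with the convention that terms where $p(x) = 0$ contribute nothing. As a minimal sketch of this definition, the following Python snippet estimates the entropy of a string from its symbol frequencies; the function name shannon_entropy and the sample message are illustrative choices, not from the source:

```python
import math
from collections import Counter

def shannon_entropy(probabilities):
    """Entropy in bits of a discrete distribution given as probabilities."""
    # Skip zero-probability outcomes: lim p->0 of p*log2(p) is 0.
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Illustrative example: empirical symbol frequencies of a short message.
message = "abracadabra"
counts = Counter(message)
probs = [count / len(message) for count in counts.values()]
print(f"{shannon_entropy(probs):.3f} bits per symbol")
```

A uniform distribution over $n$ outcomes maximizes the entropy at $\log_2 n$ bits, while a distribution concentrated on a single certain outcome has entropy zero; skewed frequencies, as in the example above, fall in between, which is exactly the slack a compressor can exploit.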