>[!abstract]
>In information theory, redundancy measures the fractional difference between the entropy $H(X)$ of an ensemble $X$ and its maximum possible value $\log(|A_X|)$. Informally, it is the amount of wasted "space" used to transmit certain data. Data compression reduces or eliminates unwanted redundancy, while forward error correction adds desired redundancy for error detection and correction when communicating over a noisy channel of limited capacity (Wikipedia, 2025).

>[!related]
>- **North** (upstream): [[Information theory]]
>- **West** (similar): —
>- **East** (different): [[Kolmogorov complexity]]
>- **South** (downstream): —
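The definition above can be sketched numerically. A minimal Python sketch, assuming base-2 logarithms and taking the symbols observed in a string as the alphabet $A_X$ (both are modelling choices, not part of the definition itself):

```python
import math
from collections import Counter

def redundancy(text: str) -> float:
    """Fractional redundancy R = 1 - H(X) / log2(|A_X|),
    estimated from symbol frequencies in `text`.
    Requires at least two distinct symbols so that log2(|A_X|) > 0."""
    counts = Counter(text)
    n = len(text)
    # Empirical entropy H(X) in bits per symbol
    h = -sum((c / n) * math.log2(c / n) for c in counts.values())
    # Maximum possible entropy over the observed alphabet
    h_max = math.log2(len(counts))
    return 1 - h / h_max

# A string with uniform symbol frequencies has zero redundancy:
print(redundancy("abcd"))  # → 0.0
# A skewed distribution wastes "space" and is redundant:
print(redundancy("aaab"))
```

A highly compressible string (skewed symbol frequencies) yields a redundancy close to 1, which is exactly the fraction of transmission capacity a compressor could reclaim.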