This commit is contained in:
eneller
2025-11-26 19:24:53 +01:00
parent 25f725049e
commit e015e816bd
3 changed files with 27 additions and 10 deletions


@@ -348,7 +348,7 @@ The capacity of the binary symmetric channel is given by:
 where $H_2(p) = -p \log_2(p) - (1-p)\log_2(1-p)$ is the binary entropy function.
 As $p$ increases, uncertainty grows and channel capacity declines.
 When $p = 0.5$, output bits are completely random and no information can be transmitted ($C = 0$).
-As already shown in \autoref{fig:graph-entropy}, an error rate over $p > 0.5$ is equivalent to $ 1-p < 0.5$,
+As illustrated in \autoref{fig:graph-entropy}, an error rate $p > 0.5$ is equivalent to one of $1-p < 0.5$,
 though not relevant in practice.
 Shannon's theorem is not constructive as it does not provide an explicit method for constructing such efficient codes,
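As a quick numerical check of the capacity discussion in this hunk, the sketch below (plain Python, not part of the patched document; the function names `h2` and `bsc_capacity` are illustrative) assumes the standard form $C = 1 - H_2(p)$ for the binary symmetric channel, which is consistent with $C = 0$ at $p = 0.5$ as stated above:

```python
import math

def h2(p: float) -> float:
    """Binary entropy H_2(p) = -p log2(p) - (1-p) log2(1-p)."""
    if p in (0.0, 1.0):
        return 0.0  # limit value: 0 log 0 is taken as 0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p: float) -> float:
    """BSC capacity, assuming the standard form C = 1 - H_2(p)."""
    return 1.0 - h2(p)

# At p = 0.5 the output is independent of the input, so C = 0.
print(bsc_capacity(0.5))  # 0.0
# Capacity is symmetric under p -> 1 - p (relabel the output bits),
# matching the equivalence of p > 0.5 and 1 - p < 0.5 noted above.
print(abs(bsc_capacity(0.1) - bsc_capacity(0.9)) < 1e-12)  # True
```

The symmetry check mirrors the point made in the changed line: a channel with error rate $p > 0.5$ carries as much information as one with rate $1-p$, since the receiver can simply invert every bit.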