The capacity of the binary symmetric channel is given by:
\[
C = 1 - H_2(p),
\]
where $H_2(p) = -p \log_2(p) - (1-p)\log_2(1-p)$ is the binary entropy function.
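
For completeness, this expression follows from maximizing the mutual information over the input distribution (sketched briefly here):
\[
C = \max_{p_X} I(X;Y) = \max_{p_X} \bigl( H(Y) - H(Y \mid X) \bigr) = 1 - H_2(p),
\]
since $H(Y \mid X) = H_2(p)$ regardless of the input distribution, while $H(Y) \le 1$ with equality for uniform inputs.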

As $p$ increases toward $0.5$, uncertainty about the transmitted bit grows and the channel capacity declines.
When $p = 0.5$, output bits are completely random and no information can be transmitted ($C = 0$).
As illustrated in \autoref{fig:graph-entropy}, an error rate of $p > 0.5$ is equivalent to one of $1 - p < 0.5$, since the receiver can simply invert every received bit; this regime is therefore not relevant in practice.
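
To make the formula concrete, the following minimal Python sketch (the helper names \texttt{binary\_entropy} and \texttt{bsc\_capacity} are illustrative, not taken from any library) evaluates $C = 1 - H_2(p)$ for a few error rates:

\begin{verbatim}
import math

def binary_entropy(p: float) -> float:
    # H_2(p) = -p log2(p) - (1 - p) log2(1 - p), with H_2(0) = H_2(1) = 0
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p: float) -> float:
    # Capacity of the binary symmetric channel: C = 1 - H_2(p)
    return 1.0 - binary_entropy(p)

for p in (0.0, 0.11, 0.25, 0.5, 0.75):
    print(f"p = {p:.2f}  H_2(p) = {binary_entropy(p):.4f}"
          f"  C = {bsc_capacity(p):.4f}")
\end{verbatim}

Note that the value for $p = 0.75$ matches the one for $p = 0.25$, reflecting the symmetry $H_2(p) = H_2(1-p)$ discussed above, and that $C = 0$ at $p = 0.5$.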

Shannon’s theorem is not constructive: it guarantees that such efficient codes exist, but it does not provide an explicit method for constructing them.