@@ -1,7 +1,66 @@
@article{shannon1948mathematical,
  title = {A mathematical theory of communication},
  author = {Shannon, Claude E.},
  journal = {The Bell System Technical Journal},
  volume = {27},
  number = {3},
  pages = {379--423},
  year = {1948},
  publisher = {Nokia Bell Labs}
}

@misc{enwiki:shannon-source-coding,
  author = {{Wikipedia contributors}},
  title = {Shannon's source coding theorem --- {Wikipedia}{,} The Free Encyclopedia},
  year = {2025},
  url = {https://en.wikipedia.org/w/index.php?title=Shannon%27s_source_coding_theorem&oldid=1301398440},
  note = {[Online; accessed 25-November-2025]}
}

@misc{enwiki:shannon-fano,
  author = {{Wikipedia contributors}},
  title = {Shannon–Fano coding --- {Wikipedia}{,} The Free Encyclopedia},
  year = {2025},
  url = {https://en.wikipedia.org/w/index.php?title=Shannon%E2%80%93Fano_coding&oldid=1315776380},
  note = {[Online; accessed 26-November-2025]}
}

@misc{enwiki:huffman-code,
  author = {{Wikipedia contributors}},
  title = {Huffman coding --- {Wikipedia}{,} The Free Encyclopedia},
  year = {2025},
  url = {https://en.wikipedia.org/w/index.php?title=Huffman_coding&oldid=1321991625},
  note = {[Online; accessed 26-November-2025]}
}

@misc{enwiki:lzw,
  author = {{Wikipedia contributors}},
  title = {Lempel–Ziv–Welch --- {Wikipedia}{,} The Free Encyclopedia},
  year = {2025},
  url = {https://en.wikipedia.org/w/index.php?title=Lempel%E2%80%93Ziv%E2%80%93Welch&oldid=1307959679},
  note = {[Online; accessed 26-November-2025]}
}

@misc{enwiki:arithmetic-code,
  author = {{Wikipedia contributors}},
  title = {Arithmetic coding --- {Wikipedia}{,} The Free Encyclopedia},
  year = {2025},
  url = {https://en.wikipedia.org/w/index.php?title=Arithmetic_coding&oldid=1320999535},
  note = {[Online; accessed 26-November-2025]}
}

@misc{dewiki:shannon-fano,
  author = {Wikipedia},
  title = {Shannon-Fano-Kodierung --- Wikipedia{,} die freie Enzyklopädie},
  year = {2024},
  url = {https://de.wikipedia.org/w/index.php?title=Shannon-Fano-Kodierung&oldid=246624798},
  note = {[Online; accessed 26-November-2025]}
}

@misc{dewiki:huffman-code,
  author = {Wikipedia},
  title = {Huffman-Kodierung --- Wikipedia{,} die freie Enzyklopädie},
  year = {2025},
  url = {https://de.wikipedia.org/w/index.php?title=Huffman-Kodierung&oldid=254369306},
  note = {[Online; accessed 26-November-2025]}
}

@misc{dewiki:lzw,
  author = {Wikipedia},
  title = {Lempel-Ziv-Welch-Algorithmus --- Wikipedia{,} die freie Enzyklopädie},
  year = {2025},
  url = {https://de.wikipedia.org/w/index.php?title=Lempel-Ziv-Welch-Algorithmus&oldid=251943809},
  note = {[Online; accessed 26-November-2025]}
}

@@ -6,6 +6,8 @@
\usepackage{subcaption}
\usepackage{parskip} % don't indent after paragraphs, figures
\usepackage{xcolor}
\usepackage{algorithm}
\usepackage{algpseudocodex}
%\usepackage{csquotes} % Recommended for biblatex
\usepackage{tikz}
\usepackage{pgfplots}
@@ -40,22 +42,58 @@ The concept of entropy is closely related to the design of efficient codes.
\end{equation}
Understanding entropy as the expected information content $E(I)$ of a message provides the intuition that,
for a source with a given entropy (in bits), no code can have a lower average word length (in bits)
\begin{equation}
E(l) = \sum_i p_i l_i
\end{equation}
than this entropy without losing information.
This is the content of Shannon's source coding theorem,
introduced in \citeyear{shannon1948mathematical} \cite{enwiki:shannon-source-coding}.
In his paper, \citeauthor{shannon1948mathematical} proposed two principal ideas to minimize the average length of a code.
The first is to use short codes for symbols with higher probability.
This is an intuitive approach, as more frequent symbols have a greater impact on the average code length.

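As a small example with assumed probabilities, consider a source emitting three symbols with probabilities $(1/2, 1/4, 1/4)$.
Its entropy is
\begin{equation}
H = \tfrac{1}{2} \log_2 2 + \tfrac{1}{4} \log_2 4 + \tfrac{1}{4} \log_2 4 = 1.5 \text{ bits},
\end{equation}
and the prefix code $\{0, 10, 11\}$ attains exactly this average word length,
$E(l) = \tfrac{1}{2} \cdot 1 + \tfrac{1}{4} \cdot 2 + \tfrac{1}{4} \cdot 2 = 1.5$ bits,
so no uniquely decodable code for this source can do better.
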
% https://en.wikipedia.org/wiki/Shannon%27s_source_coding_theorem
\section{Kraft-McMillan inequality}
% https://de.wikipedia.org/wiki/Kraft-Ungleichung
% https://en.wikipedia.org/wiki/Kraft%E2%80%93McMillan_inequality
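The Kraft-McMillan inequality makes precise which codeword lengths are achievable:
a binary prefix code with codeword lengths $l_1, \dots, l_n$ exists if and only if
\begin{equation}
\sum_{i=1}^{n} 2^{-l_i} \le 1,
\end{equation}
and by McMillan's extension the same bound holds for every uniquely decodable code.
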
\section{Shannon-Fano}
% https://de.wikipedia.org/wiki/Shannon-Fano-Kodierung
Shannon-Fano coding is one of the earliest methods for constructing prefix codes.
It divides symbols into groups based on their probabilities, recursively partitioning them to assign shorter codewords
to more frequent symbols.
While intuitive, Shannon-Fano coding does not always achieve optimal compression,
paving the way for more advanced techniques like Huffman coding.

\begin{algorithm}
\begin{algorithmic}[1]
\Procedure{ShannonFano}{symbols sorted by decreasing probability}
\If{only one symbol remains} \State \Return \EndIf
\State split the symbols into two groups whose total probabilities are as equal as possible
\State append a 0 bit to the codewords of the first group and a 1 bit to those of the second
\State \Call{ShannonFano}{first group}
\State \Call{ShannonFano}{second group}
\EndProcedure
\end{algorithmic}
\caption{Shannon-Fano compression algorithm}
\label{alg:shannon-fano}
\end{algorithm}

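For example, with assumed probabilities $(0.4, 0.25, 0.2, 0.15)$, the most balanced first split
separates the first symbol (probability $0.4$) from the remaining three (total probability $0.6$);
recursing on the second group yields the codewords $0$, $10$, $110$, $111$ with average length
\begin{equation}
E(l) = 0.4 \cdot 1 + 0.25 \cdot 2 + 0.2 \cdot 3 + 0.15 \cdot 3 = 1.95 \text{ bits},
\end{equation}
slightly above the source entropy of about $1.90$ bits.
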
\section{Huffman Coding}
% https://de.wikipedia.org/wiki/Huffman-Kodierung
Huffman coding is an optimal prefix coding algorithm that minimizes the expected codeword length
for a given set of symbol probabilities.
It constructs a binary tree in which the most frequent symbols are assigned the shortest codewords.
The resulting average codeword length lies within one bit of the source entropy
and reaches the entropy exactly when all symbol probabilities are negative powers of two.
Its efficiency and simplicity have made Huffman coding a cornerstone of lossless data compression.

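A minimal sketch of the tree construction, assuming the symbol probabilities are known in advance:

\begin{algorithm}
\begin{algorithmic}[1]
\State create a leaf node for each symbol, weighted by its probability
\While{more than one node remains}
\State remove the two nodes with the smallest weights
\State attach them as children of a new node whose weight is the sum of their weights
\State insert the new node back into the set of nodes
\EndWhile
\State read off each codeword from the path to its leaf: a left edge contributes a 0, a right edge a 1
\end{algorithmic}
\caption{Huffman tree construction (sketch)}
\label{alg:huffman}
\end{algorithm}
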
\section{Arithmetic Coding}
% https://en.wikipedia.org/wiki/Arithmetic_coding
Arithmetic coding is a modern compression technique that encodes an entire message as a single subinterval
of the range $[0, 1)$.
By iteratively refining this interval based on the probabilities of the successive symbols in the message,
arithmetic coding can achieve compression rates that approach the entropy of the source.
Its ability to spend a non-integer number of bits per symbol makes it particularly powerful
for applications requiring high compression efficiency.

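As a small worked example with assumed probabilities $p(a) = 0.6$ and $p(b) = 0.4$,
encoding the message $ab$ first narrows $[0, 1)$ to the subinterval $[0, 0.6)$ for $a$,
and then to the $b$-part of that subinterval,
\begin{equation}
[\,0 + 0.6 \cdot 0.6,\; 0 + 0.6 \cdot 1.0\,) = [\,0.36,\; 0.6\,),
\end{equation}
so any number in $[0.36, 0.6)$, for instance $0.5$, identifies the message once its length is known.
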
\section{LZW Algorithm}
% https://de.wikipedia.org/wiki/Lempel-Ziv-Welch-Algorithmus
The Lempel-Ziv-Welch (LZW) algorithm is a dictionary-based compression method that dynamically builds a dictionary
of recurring patterns in the data.
Unlike entropy-based methods, LZW does not require prior knowledge of symbol probabilities,
making it highly adaptable and efficient for a wide range of applications,
including image and text compression \cite{dewiki:lzw}.

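A minimal sketch of the encoder, assuming the dictionary is pre-filled with all single-symbol strings:

\begin{algorithm}
\begin{algorithmic}[1]
\State initialize the dictionary with all single-symbol strings
\State $w \gets$ empty string
\ForAll{symbols $s$ in the input}
\If{the string $ws$ is in the dictionary}
\State $w \gets ws$
\Else
\State output the dictionary index of $w$
\State add $ws$ to the dictionary
\State $w \gets s$
\EndIf
\EndFor
\State output the dictionary index of $w$
\end{algorithmic}
\caption{LZW encoding (sketch)}
\label{alg:lzw}
\end{algorithm}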

\printbibliography
\end{document}