begin compression
.latexmkrc (Normal file, 5 lines)
@@ -0,0 +1,5 @@
$latex = 'latex %O --shell-escape %S';        # DVI route; shell escape lets packages run external programs
$pdflatex = 'pdflatex %O --shell-escape %S';  # PDF route, same flags
$pdf_mode = 1;                                # build a PDF via pdflatex by default
$clean_ext = "lol nav snm";                   # extra extensions for latexmk's cleanup to remove
$bibtex_use = 2;                              # always allow bibtex/biber runs; .bbl counts as regenerable
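# Note (added for clarity; assumes the usual latexmk workflow): with this file in the
# project root, a bare `latexmk` builds the PDF, and `latexmk -C` also deletes the
# regenerable .bbl.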

compression.bib (Normal file, 7 lines)
@@ -0,0 +1,7 @@
@misc{enwiki:shannon-source-coding,
  author = "{Wikipedia contributors}",
  title = "Shannon's source coding theorem --- {Wikipedia}{,} The Free Encyclopedia",
  year = "2025",
  url = "https://en.wikipedia.org/w/index.php?title=Shannon%27s_source_coding_theorem&oldid=1301398440",
  note = "[Online; accessed 25-November-2025]"
}

@@ -1,2 +1,61 @@
\documentclass{article}
\usepackage[utf8]{inputenc} % utf8 rather than the deprecated utf8x, which clashes with modern packages
\usepackage[margin=1in]{geometry} % adjust margins
\usepackage{caption}
\usepackage{wrapfig}
\usepackage{subcaption}
\usepackage{parskip} % don't indent after paragraphs, figures
\usepackage{xcolor}
%\usepackage{csquotes} % recommended for biblatex
\usepackage{tikz}
\usepackage{pgfplots}
\pgfplotsset{compat=1.18} % pin the compat level to silence the pgfplots default-compat warning
\usetikzlibrary{positioning}
%\usegdlibrary{trees}
\usepackage{float}
\usepackage{amsmath}
\PassOptionsToPackage{hyphens}{url}
\usepackage{hyperref} % allows URLs to follow line breaks in the text
\usepackage[style=ieee, backend=biber, maxnames=1, minnames=1]{biblatex}
\addbibresource{compression.bib}

-Compression codes (theorems: Kraft, Shannon-Fano, Huffman, arithmetic and LZW code)

\title{Compression}
\author{Erik Neller}
\date{\today}

\begin{document}
\maketitle
\section{Introduction}
As the volume of data around the world grows exponentially, compression only gains in importance across disciplines.
Not only does it enable the storage of the large amounts of information needed for research in scientific domains
such as DNA sequencing and analysis, it also plays a vital role in keeping stored data accessible by
facilitating cataloging, search, and retrieval.
The concept of entropy is closely related to the design of efficient codes.

\begin{equation}
    H = E(I) = - \sum_i p_i \log_2(p_i)
    \label{eq:entropy-information}
\end{equation}
Here $p_i$ denotes the probability of the $i$-th source symbol.
Understanding entropy as the expected information $E(I)$ of a message provides the intuition that,
given a source with a certain entropy (in bits), no code can have a lower average word length (in bits)
than this entropy without losing information.
This is the content of Shannon's source coding theorem \cite{enwiki:shannon-source-coding}.
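For a concrete illustration (a standard textbook example, added here for intuition): a source that emits
$A$ with probability $\tfrac{1}{2}$ and $B$, $C$ with probability $\tfrac{1}{4}$ each has entropy
\begin{equation*}
    H = -\tfrac{1}{2}\log_2\tfrac{1}{2} - 2 \cdot \tfrac{1}{4}\log_2\tfrac{1}{4} = 1.5 \text{ bits},
\end{equation*}
and the prefix code $A \mapsto 0$, $B \mapsto 10$, $C \mapsto 11$ reaches an average word length of
$\tfrac{1}{2} \cdot 1 + \tfrac{1}{4} \cdot 2 + \tfrac{1}{4} \cdot 2 = 1.5$ bits per symbol:
the bound is met exactly because every probability is a power of $\tfrac{1}{2}$.
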
% https://en.wikipedia.org/wiki/Shannon%27s_source_coding_theorem

\section{Kraft-McMillan inequality}
% https://de.wikipedia.org/wiki/Kraft-Ungleichung
% https://en.wikipedia.org/wiki/Kraft%E2%80%93McMillan_inequality
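In its standard form (stated here without proof): codeword lengths $l_1, \dots, l_n$ are achievable
by a uniquely decodable binary code if and only if
\begin{equation}
    \sum_{i=1}^{n} 2^{-l_i} \leq 1 .
    \label{eq:kraft-mcmillan}
\end{equation}
Kraft proved the statement for prefix codes; McMillan extended it to all uniquely decodable codes,
so nothing is lost by restricting attention to prefix codes.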

\section{Shannon-Fano}
% https://de.wikipedia.org/wiki/Shannon-Fano-Kodierung

\section{Huffman Coding}
% https://de.wikipedia.org/wiki/Huffman-Kodierung

\section{LZW Algorithm}
% https://de.wikipedia.org/wiki/Lempel-Ziv-Welch-Algorithmus

\section{Arithmetic Coding}
% https://en.wikipedia.org/wiki/Arithmetic_coding

\printbibliography
\end{document}

@@ -26,7 +26,7 @@

 \begin{document}
 \maketitle
-\section{What is entropy?}
+\section{Introduction}
 Across disciplines, entropy is a measure of uncertainty or randomness.
 Originating in classical thermodynamics,
 over time it has been applied in different sciences such as chemistry and information theory.
@@ -355,7 +355,6 @@ Shannon’s theorem is not constructive as it does not provide an explicit method
 but it guarantees their existence.
 In practice, structured codes such as Hamming and Reed–Solomon codes are employed to approach channel capacity.

-
 \section{Conclusion}
 Entropy provides a fundamental measure of uncertainty and information,
 bridging concepts from thermodynamics to modern communication theory.