\documentclass{article}
\usepackage[utf8]{inputenc}
\usepackage[margin=1in]{geometry} % Adjust margins
\usepackage{caption}
\usepackage{subcaption}
\usepackage{parskip} % don't indent after paragraphs, figures
\usepackage{xcolor}
\usepackage{tikz}
\usepackage{float}
\usepackage{amsmath}
\PassOptionsToPackage{hyphens}{url}
\usepackage{hyperref} % allows URLs to break across lines
\title{Entropy as a measure of information}
\author{Erik Neller}
\date{\today}
\begin{document}
\maketitle
\section{What is entropy?}
Across disciplines, entropy is a measure of uncertainty or randomness.
Originating in classical thermodynamics,
the concept has since been applied in other sciences such as chemistry and information theory.
%As the informal concept of entropy gains popularity, its specific meaning can feel far-fetched and ambiguous.
The name `entropy' was coined by the German physicist \textit{Rudolf Clausius} in 1865
while postulating the second law of thermodynamics.
The laws of thermodynamics are based on universal observations regarding heat and energy conversion.
Specifically, the second law states that not all thermal energy can be converted into work in a cyclic process.
Equivalently, the entropy of an isolated system cannot decrease,
since isolated systems always tend toward a state of thermodynamic equilibrium, where entropy is highest for a given internal energy.
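In symbols, this statement of the second law is commonly written for an isolated system as
\begin{equation}
\Delta S \geq 0,
\end{equation}
with equality holding only for idealised reversible processes.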
Another consequence of the second law is the irreversibility of natural processes, also referred to as the \textit{arrow of time}.
Even though the first law (conservation of energy) allows a cup to fall off a table and break,
as well as the reverse process of the shards reassembling themselves and jumping back onto the table,
the second law permits only the former and forbids the latter,
requiring the state with higher entropy to occur later in time.

Only 10 years later, in 1875, \textit{Ludwig Boltzmann} and \textit{Willard Gibbs} derived the formal definition
that is still in use in information theory today.
\begin{equation}
S = -k_B \sum_i p_i \ln(p_i)
\end{equation}
It gives statistical meaning to the macroscopic phenomenon by defining the entropy $S$ of a macrostate
in terms of the probabilities $p_i$ of its constituent microstates, with $k_B$ denoting the Boltzmann constant.
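As a simple worked example, consider a system with two equally likely microstates, $p_1 = p_2 = \tfrac{1}{2}$:
\begin{equation}
S = -k_B \left( \tfrac{1}{2}\ln\tfrac{1}{2} + \tfrac{1}{2}\ln\tfrac{1}{2} \right) = k_B \ln 2.
\end{equation}
Any bias toward one of the two microstates lowers $S$, so the uniform distribution maximises the entropy.
Up to the constant $k_B$ and the choice of logarithm base, this is the same expression Shannon later adopted as his measure of information.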
\section{Definitions across disciplines}
%- conditional entropy
%- redundancy
%- source entropy
\section{Shannon Axioms}
\section{Coding of an information source and communication channel}
\end{document}