Compare commits
1 commit: f8013bcee5...33ce8dc03d

entropy.tex (27 changed lines)
@@ -8,7 +8,8 @@
 \usepackage{tikz}
 \usepackage{float}
 \usepackage{amsmath}
-\PassOptionsToPackage{hyphens}{url}\usepackage{hyperref} % allows urls to follow line breaks of text
+\PassOptionsToPackage{hyphens}{url}
+\usepackage{hyperref} % allows urls to follow line breaks of text
@@ -20,6 +21,30 @@
\begin{document}
\maketitle
\section{What is entropy?}
Across disciplines, entropy is a measure of uncertainty or randomness.
Originating in classical thermodynamics,
it has over time been applied in other sciences such as chemistry and information theory.
%As the informal concept of entropy gains popularity, its specific meaning can feel far-fetched and ambiguous.
The name 'entropy' was coined by the German physicist \textit{Rudolf Clausius} in 1865
while postulating the second law of thermodynamics.
The laws of thermodynamics are based on universal observations regarding heat and energy conversion.
Specifically, the second law states that not all thermal energy can be converted into work in a cyclic process.
Or, in other words, that the entropy of an isolated system cannot decrease,
as such systems always tend toward a state of thermodynamic equilibrium, where entropy is highest for a given internal energy.
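In symbols, this statement of the second law is commonly written as the inequality below, where $\Delta S$ denotes the change in entropy of an isolated system between two states:
\begin{equation}
\Delta S \geq 0
\end{equation}
with equality holding only in the idealized case of a reversible process.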
Another result of this observation is the irreversibility of natural processes, also referred to as the \textit{arrow of time}.
Even though the first law (conservation of energy) allows both for a cup falling off a table and breaking
and for the reverse process, in which the cup reassembles itself and jumps back onto the table,
the second law permits only the former and forbids the latter,
requiring the state with higher entropy to occur later in time.

Only ten years later, in 1875, \textit{Ludwig Boltzmann} and \textit{Willard Gibbs} derived the formal definition
that is still in use in information theory today.
\begin{equation}
S = -k_B \sum_i p_i \ln(p_i)
\end{equation}
It gives statistical meaning to the macroscopic phenomenon by defining the entropy $S$ of a macrostate
in terms of the probabilities $p_i$ of all its constituent microstates.

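As a quick check of this definition, consider the special case of $W$ equally likely microstates, so that $p_i = 1/W$ for every $i$; the sum then collapses to Boltzmann's earlier formula:
\begin{equation}
S = -k_B \sum_{i=1}^{W} \frac{1}{W} \ln\frac{1}{W} = k_B \ln W
\end{equation}
the form engraved on Boltzmann's tombstone as $S = k \log W$.
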
\section{Definitions across disciplines}
%- conditional entropy
%- redundancy