One of the important approaches to statistical physics is provided by information theory, founded by Claude Shannon in the late 1940s.
Information theory and generalized statistics
The author would also like to thank his colleagues in the Claude Shannon Institute: Carl Bracken, Eimear Byrne, Marcus Greferath, Russell Higgs, Nadya Markin, Gary McGuire and Alexey Zaytsev, for stimulating discussions and advice.
Recursive Code Construction for Random Networks
Biologists rely heavily on the language of information, coding, and transmission that is commonplace in the field of information theory as developed by Claude Shannon, but there is open debate about whether such language is anything more than facile metaphor.
The transmission sense of information
As a consequence, philosophers have concluded that the mathematical theory of communication pioneered by Claude Shannon in 1948 (hereafter “Shannon theory”) is inadequate to ground the notion of information in genetics and evolutionary biology.
The transmission sense of information
This fact is enormously important to the function of the biological code — not as a matter of the semiotic classifications that fascinated Charles Peirce, but rather to solve the sort of decision problem that motivated Claude Shannon.
The transmission sense of information
***