Library · book

Information: The New Language of Science

Hans Christian von Baeyer
2003 · Harvard University Press

Source: https://archive.org/details/informationnewla0000vonb

Von Baeyer, a physicist at the College of William and Mary, writes a broad popular history of information as a scientific concept — from Boltzmann's statistical mechanics and the entropy connection, through Shannon's mathematical theory, to quantum information and the holographic principle. The book's strength is its range: it treats thermodynamics, genetics, neuroscience, and quantum computing as chapters in a single story about how information became the unifying concept of modern science. Von Baeyer is careful to distinguish between the colloquial sense of "information" and its technical meanings in different fields, a distinction many popular accounts blur. The writing is accessible without sacrificing rigour, making it a reliable introduction for readers coming to information theory from any direction. It occupies a useful middle ground between Shannon's original papers and the more specialised treatments that followed.

information-theory · philosophy · complexity · history