A Mathematical Theory of Communication
Source: https://people.math.harvard.edu/~ctm/home/text/others/shannon/entropy/entropy.pdf
Full text: author's page
Shannon's 1948 paper is the founding document of information theory and one of the most consequential scientific publications of the twentieth century. It demonstrated that information could be quantified in bits, measured independently of meaning, and transmitted reliably over noisy channels through proper encoding. The paper drew on thermodynamics, probability theory, and Boolean algebra to establish a rigorous mathematical framework for communication systems. Its implications reached far beyond telephony: Shannon's entropy became central to cryptography, linguistics, genetics, and eventually computer science itself. The work is remarkable for its clarity and completeness — the entire field emerged essentially whole from a single paper.
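The quantification the paper introduces is the entropy of a discrete source, H = -Σ pᵢ log₂ pᵢ, measured in bits. As a minimal sketch (the function name and example distributions here are illustrative, not from the paper):

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over nonzero p."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))    # fair coin: 1.0 bit
print(entropy([0.9, 0.1]))    # biased coin: less than 1 bit
print(entropy([0.25] * 4))    # uniform over 4 symbols: 2.0 bits
```

A fair coin flip carries exactly one bit, while a biased coin carries less: predictable sources convey less information, which is what makes compression possible.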