James Gleick's The Information: A History, a Theory, a Flood was my favorite nonfiction book of 2011, a tour-de-force history of and introduction to information theory. It's out in paperback today. Here's some of my review of the hardcover:
I've been fascinated with information theory since a friend of a friend explained "Shannon limits" to me in the late 1990s. I remember the conversation, mostly because the description was tantalizingly frustrating and incomplete, this being a hallmark of really interesting ideas. This friend of a friend explained that there were theoretical limits to how much information any channel could carry, and that these limits included rigorous definitions for "channel" and "information." I've read up on Claude Shannon rather a lot since (I've got a short story called Shannon's Law in an upcoming Borderlands book, about a hacker named Shannon Klod who tries to violate the barrier between faerie and the human realm by routing a single packet using TCP-over-magic) and every time I do, it's a revelation, because some new facet of information theory reveals itself to me.
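The "Shannon limits" that friend of a friend described are given by the Shannon–Hartley theorem: a channel's maximum error-free bit rate depends only on its bandwidth and its signal-to-noise ratio. A minimal sketch of the formula (the phone-line numbers below are my own illustrative assumption, not from the book):

```python
import math

def shannon_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley theorem: C = B * log2(1 + S/N), the maximum
    error-free bit rate of a noisy channel of bandwidth B (in Hz)
    with linear signal-to-noise ratio S/N."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Example: a ~3 kHz voice telephone line with a 30 dB SNR (linear ratio 1000)
cap = shannon_capacity(3000, 1000)
print(f"Capacity: {cap:.0f} bits/s")
```

No coding scheme, however clever, can push more bits through that channel without errors; that hard ceiling, with rigorous definitions behind every term, is what made the idea so tantalizing.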
But nothing has presented these ideas half so well as The Information, and that's a tribute to Gleick's storytelling mastery, his ability to pick out the threads of history that trace back and forward from the discipline's central thesis. Gleick begins with early lexicographers, the primitive dictionaries, the phrasebooks that translated between the talking drum and western speech. He moves on to Babbage and Lovelace (and presents an account of their invention, rivalries, victories and failings that is as heartbreaking as it is informative), and then into telegraphy.
Telegraphy leads to codes, and codes to compression, and compression to logic, and logic to the first inklings of theories, and now you've got Einstein and Gödel and Shannon and Turing meeting, debating, fighting and rubbishing each other in learned journals, arguing furiously with Margaret Mead at interdisciplinary conferences, a pell-mell debate in full swing. On Gleick marches, to the double helix and Dawkins and memes, to a section on randomness that is so transcendently exciting that I couldn't put the book down and read it while walking, so distracted that I got lost twice within blocks of my office.
Gleick takes us through Wikipedia and the meaning of information, the debates about it, the helplessness of information overload, the collisions in namespaces, even through his beloved chaos math, until he has spun out his skeins so that they wrap around the world and the universe, information theory at the heart of legal debates over trademark, physics feuds over Hawking radiation, epistemology and cryptography, even fights over Pokémon characters and their disambiguation.