
Book Review: The Information: a History, a Theory, a Flood 44

Posted by samzenpus
from the read-all-about-it dept.
eldavojohn writes "The Information: A History, a Theory, a Flood by James Gleick has a rather nebulous title, and the subtitle doesn't really help one understand what this book hopes to be about. The extensive citations are welcome, as the author barely scratches the surface of any theory of information. It also cherry-picks odd and interesting facets of the history of information but presents them in a chronologically challenged order. This book is, however, a flood, and as a result it could best be described as a rambling, romantic love note to Information: eloquently written and at times wondrously inspiring, but at the same time imparting very little actual knowledge or few tools to the reader. If I were half my age, this book would be the perfect fit for me (just as Chaos was), but knowing all the punchlines and how the story ends ahead of time rather ruined it for me. While wandering through interesting anecdotes, Gleick shields the reader from most of the gory details." Read on for the rest of eldavojohn's review.
The Information: A History, a Theory, a Flood
author James Gleick
pages 544
publisher Pantheon
rating 5/10
reviewer eldavojohn
ISBN 978-0375423727
summary A wandering well-written historical who's who of Information Theory salted with references to hot topics.
The book starts out with an introduction to the hero of The Information: Claude Shannon. It also introduces the hero's sidekick: Alan Turing. Aside from our initial introduction to Shannon's work at Bell Labs and his monumental 1948 paper, the author drops many names, a foreshadowing of what is to come in the book: George Campbell, George Boole, Norbert Wiener, Vannevar Bush, John Archibald Wheeler, Richard Dawkins and many, many more. This sets the tone for the rest of the book, as each chapter jumps around in time and grabs many quotations and excerpts to provide a gem-studded narration by Gleick.

Chapter one provided me a piece of anecdotal information that I had actually never come across: the talking drums of Africa, an apparently ill-documented form of communication. Rather, I had heard of the talking drums but never considered them in the context of information theory. They appear to be one of the earliest forms of long-distance communication, predating all telegraphs. A drummer in one village would drum out the syllables and nuances of a lengthy sentence and often repeat it a few times. Drummers in distant villages would hear this and try to parse out what the drums were saying. As a result, they wouldn't just say 'moon'; they would say something like 'the shiny white face that rises in the night' or something lengthier to ensure that the message was interpreted correctly. An ingenious method of communicating, yet the chapter oddly never mentions parity bits or error detection, the two concepts I immediately equated with those redundant extra words. It does, of course, return to our hero Shannon, who would later investigate the redundancy of the English language.
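To make that parity/error-detection comparison concrete, here is a minimal sketch of my own (nothing like it appears in the book): a simple repetition code, which works on the same principle as the drummers' redundant phrasing. Say everything several times so the listener can out-vote the noise.

```python
def encode(bits, n=3):
    # Repeat each bit n times, like a drummer repeating a phrase.
    return [b for b in bits for _ in range(n)]

def decode(coded, n=3):
    # Majority vote over each group of n received symbols.
    out = []
    for i in range(0, len(coded), n):
        group = coded[i:i + n]
        out.append(1 if sum(group) > n // 2 else 0)
    return out

msg = [1, 0, 1, 1]
sent = encode(msg)
sent[2] = 0                  # a "mis-heard drumbeat": flip one symbol
assert decode(sent) == msg   # the redundancy recovers the original
```

A single corrupted symbol per group is out-voted by its two intact copies, which is exactly why 'the shiny white face that rises in the night' survives a few garbled drumbeats where a bare 'moon' would not.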

The next chapter concerns Walter J. Ong and his work on the persistence of information. Gleick discusses the find at Uruk and the subsequent deciphering of the cuneiform tablets. What was interesting about these tablets, however, is that they recorded mundane things like bills and recipes. Yet when Donald Knuth saw one at a museum, he called what he read 'an algorithm.' The third chapter jumps to 1604 and the publishing of the very first dictionaries. Although amusing, this chapter merely illustrates how difficult it was for us to codify our language (and it remains nigh impossible). At the end, Gleick carries this effort over to cyberspace and its similar problems.

The next chapter introduces Charles Babbage and his difference engine. To keep it interesting, Gleick includes excerpts from Charles Dickens, Edgar Allan Poe, Oliver Wendell Holmes and Lord Byron. Oddly enough, there was a mentor relationship between Charles Babbage and Augusta Ada Byron King, Countess of Lovelace. Gleick calls Ada 'first his acolyte and then his muse'; for some reason this odd relationship is preserved in The Information. Lady Lovelace had many intuitions into how symbolic logic and algorithms would work in the future, but I found much of this chapter to concern relationships and excerpts from letters. To give you an example of what I'm talking about, I learned that Ada died many years before Babbage, of cancer of the womb, and that she took laudanum and cannabis to ease the pain. What does this have to do with The Information? You also learn that Babbage told a friend before his death that he would gladly give up whatever time he had left if he could spend three days five centuries in the future. This is just one of the many stories of foolishly optimistic hope this book sells to the reader.

The next chapter involves the evolution of the telegraph, and the bulk of it concentrates on a telegraph that was quite unknown to me: the French telegraph, or rather a system of signs atop high buildings that could relay messages by signaling from village to village. Aside from being an extrapolation of the binary signals of ages past, like fires lit on elevated land or smoke signals, I didn't really understand why the politics and problems of these devices were explored in such depth. When we finally get to the electric telegraph, we get some odd (albeit interesting) details about it instead of the theory. From the abbreviation of common sentences down to codewords, to the fight over patenting the signaling mechanism, Gleick again avoids any sort of real numerical or even technical analysis of how humans were progressing from one bandwidth level to another. Cost per letter drove some odd advancements, like acronyms and the investigation of how words could be encoded into fewer symbols. The chapter ends with a reference to George Boole and logic, as these symbolic representations led the way for words to be replaced and turned into equations.

The book moves on to Claude Shannon and briefly touches on his work on signal noise. It jumps around to Russell and Whitehead's Principia Mathematica and Gödel's subsequent destruction of any dreams of representing everything with symbols by way of his famous Incompleteness Theorem. It goes on to talk about Weyl, Nyquist, Hartley, etc., continuing the veritable who's who while providing very little actual knowledge of their work. Who could mention Gödel without also talking about Nazis? Certainly not Gleick. The politics of the time and the references back to Lovelace and Babbage dominate this chapter, leaving very little room for any actual Information Theory. On page 201 you'll find H = n log s, though you won't find more than a paragraph of explanation, nor any extrapolation of that formula. This chapter did yield something interesting: a piece of paper showing Shannon's estimates of data storage on a logarithmic scale. While some estimates are close, others are very far off, but he was already thinking of DNA as information storage. The anecdotes and quotations from peers of the time are impressively researched and cross-referenced, but at what cost?
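For the curious, that formula is Hartley's measure: H = n log s is the information carried by a message of n symbols drawn from an alphabet of s equally likely symbols. A quick sketch of my own (not the book's) of what that single paragraph on page 201 leaves unexplored:

```python
import math

def hartley_information(n, s, base=2):
    # H = n log s: information (in bits, for base 2) carried by
    # n symbols drawn from an alphabet of s equally likely symbols.
    return n * math.log(s, base)

# One binary digit carries exactly one bit:
one_bit = hartley_information(1, 2)          # 1.0
# An 8-character message over a 26-letter alphabet:
message = hartley_information(8, 26)         # about 37.6 bits
```

The logarithm is what makes information additive: doubling the message length doubles H, while squaring the alphabet size merely doubles the per-symbol contribution.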

The next chapter concentrates on the enemy: Norbert Wiener from MIT. He comes across as a cigar-smoking, condescending, self-involved, snobby professor whose primary contribution is a now-defunct 'science' once called Cybernetics. He's quick to identify others' work as derivative of his own and is presented as the antithesis of Claude Shannon, who is portrayed as modest, cautious and well-spoken. On top of that, not only is Shannon's work not defunct, it is the basis of so much of what is useful today. Gleick portrays Wiener so negatively that I almost wondered if the condescending label 'wiener' was somehow related to Norbert. This chapter delves into conferences once held and the interactions between the participants. While this made for great humor in the Shannon/Wiener interactions, I don't understand why they were relayed to the reader. Shannon's rat and its demonstration resulted in interesting remarks, but I don't understand why the reader is given so much insight into these proceedings of Cybernetics when the field turned out to be little more than buzzwords. An interesting note, however, is how some of the members would let the media run away with phrases that the scientists had never actually said. They would do this almost strategically, both to validate the new field and to drum up interest from universities and funding sources ... but should anyone corner them and ask for clarifications, they could always truthfully say that they never said that verbatim. I wonder how often this happens today?

This next chapter, on Maxwell's demon and entropy, was actually a little enlightening in that it provided a fairly clear discussion of entropy (physics) and entropy (information). In addition to this correlation, it discusses why it's often negentropy, or negative entropy. Leo Szilárd's work is discussed, as well as the concept that 'information is not free.' Although Maxwell's demon is simply an exercise in the philosophy of physics, this chapter begins what will be finished later: an English explanation of how information is fundamentally tied to matter and the universe.
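Since the book never actually writes down the information-theoretic entropy that this chapter dances around, here is a short sketch of my own of Shannon's formula, the counterpart of the thermodynamic quantity:

```python
import math

def shannon_entropy(probs, base=2):
    # H = -sum(p_i * log p_i), in bits for base 2; zero-probability
    # outcomes contribute nothing, so they are skipped.
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin carries exactly 1 bit per toss;
# a heavily biased coin carries much less, because
# its outcome is more predictable (less "surprising").
fair = shannon_entropy([0.5, 0.5])
biased = shannon_entropy([0.9, 0.1])
certain = shannon_entropy([1.0])   # no uncertainty, no information
```

The parallel to physics is that both entropies count the same thing, the number of microstates (or messages) consistent with what you know, which is why the demon must pay an information cost to sort molecules.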

Gleick now reaches biological information: DNA. He spends a chapter on the origins of DNA and how contemporaries of information theory approached it upon its discovery. Of course Dawkins and Gould had interesting things to say in this chapter, but Hofstadter and Gamow had perhaps the most interesting things to add: that DNA is essentially a number, and that number represents a machine that can replicate and say things about itself. One thing this book does well is build this sort of interesting relationship between information and humans. This chapter takes a stab at establishing that we are all, at our cores, just information in the universe. As biological beings we are feeding off of negative entropy.

The book now takes a bizarre twist into memes. That's right: chain letters and lolcats, and how they replicate and infect our brains despite being nothing more than information. I found this chapter obvious and boring, worthy of complete removal from the text. This interjection is entirely out of place, and I'm still scratching my head wondering what merit it had in this book. Since the book is already such an odd assortment and arrangement of the history of information, this chapter can simply be skipped by the reader.

The chapter on randomness opens with an individual I've never heard of before: Gregory Chaitin. Gleick seems to imply that Incompleteness and Quantum Physics are somehow tied together by way of Turing's uncomputability proof, or so Chaitin (once?) thought. Both, I guess, were related to entropy (the word, at least), and the connection was randomness. I didn't understand why this was in here, if not to mislead the reader. What follows are some of the giants' work and quotes about randomness and random numbers. While mildly interesting, there's not a whole lot to be gleaned from this chapter. I did appreciate the references to Andrei Nikolaevich Kolmogorov, who did some original and even parallel work on information theory behind the Iron Curtain. Of course the text is rife with political situations and anecdotes (e.g. Kolmogorov's run-in with one of Stalin's favorite pseudo-scientists). Oh, and what book on information would be complete without G. H. Hardy visiting Srinivasa Ramanujan and remarking on the boring number of his taxi? The oft-repeated story of the number 1,729. This anecdote feels out of place, but Gleick uses it to push the reader deeper into what randomness really means. Throw in Bach's Well-Tempered Clavier and I almost wondered if Gleick had re-read Gödel, Escher, Bach before writing this chapter.

The next chapter did actually touch on work that ties information to physics, in the very basic sense that information cannot be destroyed in our universe. The famous Preskill-Hawking wager is discussed, as well as the thermodynamics of computation and the resulting implications for quantum mechanics. The chapter wanders from quantum cryptography (feeling a bit out of place) to qubits to RSA to ... well, it all (as it does throughout the book) comes back to Shannon. The chapter does end with an interesting quote from John Wheeler, who apparently advocated translating the quantum versions of string theory and Einstein's geometrodynamics 'from the language of the continuum to the language of bit.' Sounds pretty interesting, right? Too bad all you get is the quote.

Was that chapter too technical for you? Don't worry, the text moves back to Wikipedia (shouldn't this have been addressed in the early chapters on dictionaries?) and actually talks about deletionism versus inclusionism and the Wikipedia debates on Pokemon articles. Of course, our old friends Babbage, Turing, Shannon, et al. are brought back to somehow comment on this modern encyclopedia, with quotes from Gleick like 'The universe is computing its own destiny' (for added drama, that sentence is its own paragraph on page 377). Strangely enough, there is no reference to Edward Fredkin anywhere in this book. Gleick jumps to domain-name saturation on the internet and hits up 'the cloud' at the very end. I almost marvel at how many bases he can touch in one chapter. The penultimate chapter covers our inundation with news every single day of our lives, probably from now to eternity. Unsurprisingly, Gleick conjures up quotes from ages long past (almost to the dark ages) of people complaining about the printing press or telegraph or newspaper or internet ruining their lives by assaulting them with information and news. It turns out 'Information Overload' is not a new concept. A chapter devoted to people complaining about too much information, in a book on information, seems to give them too much credit, in my opinion.

The book really fizzles out as it tries to wrap up. Far from finalizing anything, the reader is given the concept of 'the Library of Babel' alongside the famous six degrees of separation. We are now more interconnected than ever before thanks to ... information!

Luckily this book has almost fifty pages of references to other books that contain far more complete and far more organized thoughts on information. I would not recommend this book to any of my colleagues unless they never went to college and never once picked up another book on Information. That said, I felt it was very well written and will no doubt continue to be sold en masse in bookstores. If anyone else read this book and came away with some very deep and profound understanding of the subject matter, I would love to hear it. Right now, the audience for this book is very small in my mind. It might best be given to a young engineer who has yet to go to college but has the vim and vigor to track down the real sources of The Information.

You can purchase The Information: A History, a Theory, a Flood from []. Slashdot welcomes readers' book reviews -- to see your own review here, read the book review guidelines, then visit the submission page.


  • I got a copy of this for about $8, the other month, when Borders was closing down. Added bonus, it came with a flattened fiddler spider on the dedication page. Information is deadly. /-o-\ I've only made it through the first chapter, which wasn't a bad read. I'll post back here, if I ever find time to really read it.
    • by physburn (1095481)
      Don't know how you can creep Claude Shannon's theory, are you confusing it with the Mass Destruction song by Faithless?



      • by pntkl (2187764)
        I'd post a picture, if that was allowed. When I bought the book, found a spider pressed into the dedication page. It's approximately an inch above the 'T' in 'CYNTHIA'. That's why I posted 'Creepy Book'. /-o-\
  • "the redundancy in the English language. "

    Yeah, that's as far as I got before being induced to TLDR and post.

    The redundancy in the English language, while possibly a form of self-correcting code, often is a source of error itself.

    Hence the massive proportion of internet bandwidth given up to grammar flames.

    When your error detection system is capable of rat-holing your entire discussion, maybe it's better to rely on improving the S/N ratio of your lower layers and forgo sending the redundant bits...

    • by pev (2186)

      Does English really have that much redundancy? Most of the language has evolved by assimilating new words as required, so I would have thought that this naturally eschews redundancy. A lot of what could be considered redundant could be genuinely (subtly) different meanings? I'm sure that some people consider the differences small enough to be redundant whilst others consider them significant - this could perhaps be one of the most succinct ways to define the difference between computer programmers and poets.

      • by Anonymous Coward

        No, there are error correction structures built into the language itself. Think in terms of how verbs and pronouns agree. Also, read 'Godel, Escher, Bach', you're probably missing a lot of context to this discussion. The reviewed book really wants to be a mix of GEB and 'A Short History of Nearly Everything'.

  • by Anonymous Coward

    The book entertains at some depth (as do physicists today) how entropy was properly the domain of thermodynamics before Shannon gave it a newfangled interpretation: a measure of the *quantity* of information (provided the symbol stream was generated as a stationary stochastic source). Proofs of this have now been reduced to a few lines and a convex inequality.

    The far more important result of Shannon's paper was the channel coding theorem, which was counterintuitive (and hence, remarkable) at the time, at wh

  • by iliketrash (624051) on Monday October 17, 2011 @05:27PM (#37744248)

    Nonetheless, Gleick's treatment of the likes of Shannon, Babbage, and Ada Lovelace are fair and fairly detailed, and will vastly enlighten the non-technical reader, which is, after all, the intended audience for this book.

  • by Anonymous Coward

    Not only is the history of information theory interesting, but Gleick touches on truly fundamental relationships between classical information theory and the evolution of all self replicating patterns.

    The most interesting quote in the book for me was this:

    'It sometimes seems as if curbing entropy is our quixotic purpose in this universe." (p. 282). Pantheon. Kindle Edition.

    While the book doesn't come to any startling conclusions, and has to deal with the historical confusion around the word 'entropy' itsel

  • by ideonexus (1257332) on Monday October 17, 2011 @05:35PM (#37744314) Homepage Journal
    If the content of this book intrigues you, I highly recommend this Lecture Series [] from a UC Berkeley online course (Spring 2011, Prof. Geoffrey D. Nunberg). I listened to them a few years ago when the quality was terrible, but they were still fascinating, and they have since been re-recorded in much better quality and with slides. The course starts way back with the spoken word, moves to the written word and signs, and continues on up through history. Great series.
  • I'm still confused. Is this the kind of book that has at least some equations and algorithms (I get that it's not exclusively this), or is it the kind of book that mostly rampages on about Turing's love life and how the crude savages of the era screwed him over? I'm just trying to figure out how soft -n- fluffy it is.

    • by robotkid (681905) <> on Monday October 17, 2011 @06:59PM (#37744998)

      I'm still confused. Is this the kind of book that has at least some equations and algorithms (I get that its not exclusively this) or is it the kind of book that mostly rampages on about Turing's love life and how the crude savages of the era screwed him over? I'm just trying to figure out how soft -n- fluffy it is.

      Neither, and therein lies the weakness of this book. This review is spot-on. The beginning chapters are all these interesting historical anecdotes that do a pretty good job of contextualizing the disjointed and awkward methods of transmitting and thinking about information in the pre-Shannon era. As a series of lesser-known historical anecdotes, it's quite fascinating to learn that Babbage liked to crack codes as a hobby and that Shannon and Turing directly influenced each other's work through regular lunchtime discussions at Bell Labs. That interesting thread is what got me to read the book; it felt like a great set-up to a really interesting and accessible primer on information theory.

      But then, once Shannon is introduced, the author seems at a loss to explain what information theory is actually used for, other than a vague sentiment that it's "useful everywhere, like in the internets and satellites and stuff". In fact, the narrative seems to fall into the same trap that it describes, wherein a bunch of non-mathematically inclined "visionaries", from psychologists to linguists to architects, all jump on an ill-fated "information theory can explain everything" bandwagon without really understanding what it is that information theory can and can't do, leading to quasi-celebrity status for a (very bewildered) Shannon. This then devolves into an extended discussion of memes, from the early work of Dawkins, which met a similar fate (the Journal of Memetics was short-lived due to a complete inability of its founders to agree on exactly what belonged in it). The treatment of biological information is amazingly scant beyond some re-hashing of Dawkins and Gould, given how fundamental information theory is to the modern field of bioinformatics and the like. It then wraps up with the obligatory creation stories of Wikipedia and Google and discussions of information glut, the likes of which a Slashdot audience would already know by heart and therefore find unenlightening.

      The actual information theory examples explained in the book do not go beyond the toy examples from Shannon's paper, which is itself very well written and eminently accessible if you have a little statistics and math background. So if you are looking for that, go straight to the source instead of reading this book. If you are looking for some neat historical anecdotes about what people used to do to save money on telegraph messages, and the dreams Ada Lovelace had about seeing a new world in her head where algorithm developers would someday rule the world, by all means enjoy the first five chapters, but the remainder is quite forgettable, I'm afraid.

      Link to Shannon's 1948 paper. []

      • by bmacs27 (1314285)
        I'm a little confused by your comment about the ill-fated information theory. It still does dominate many fields. I know that psychophysics has benefitted greatly from it, and the people doing it are plenty "mathematically inclined."
        • by robotkid (681905)

          I'm a little confused by your comment about the ill-fated information theory. It still does dominate many fields. I know that psychophysics has benefitted greatly from it, and the people doing it are plenty "mathematically inclined."

          I did not mean that information theory was ill-fated, but right after its publication there was an irrational jubilation that all of science was going to be solved in an "information theory" framework, which led to failed journals, societies, and hundreds of really poorly thought out papers all entitled "An information theory approach to ____ (insert longstanding scientific problem here)". Generally these papers took the log of some important measurement, calculated an "effective bandwidth", and maintained

          • by bmacs27 (1314285)
            Okay, but around that time there was some important work tying information theory to perception that was relatively groundbreaking work. It's still cited today, and modeling of visual cortex as "noisy channels" is still fairly widespread practice. However, maybe that makes sense because most of the common tools used in psychophysics historically came from Signal Detection Theory, and other radio operator related math.
            • by robotkid (681905)

              Okay, but around that time there was some important work tying information theory to perception that was relatively groundbreaking work. It's still cited today, and modeling of visual cortex as "noisy channels" is still fairly widespread practice. However, maybe that makes sense because most of the common tools used in psychophysics historically came from Signal Detection Theory, and other radio operator related math.

              Agreed. IANASH (I am not a science historian) but the impression I get from the book is that it was this initial success that spawned all the other fields to try and have a "me too" moment which led to the bubble. So although I'm pretty sure the book does mention this as an event that happened, the science itself was certainly not presented with the same clarity and poignancy that the author details early developments in the mathematics of logarithms and wacky precursors to the telegraph. If you can recomme

  • Is this the first time that a book review got something other than an 8?

    • by ThorGod (456163)

      Seems like it. Unless the reviewer has a book on information theory out there, I'm more inclined to believe the criticism (after all the awkwardly positive book reviews I've read on /.)

  • by Anonymous Coward

    I mean, it takes a lot of talent to take something as interesting as information theory and write about it so badly and in so muddled a way that I had a hard time reading it. What the book needs is a good editor to cut some of the crap and make the author rewrite some bits.

  • Gleick has done some highly regarded work. I waded through some material on his web site many years ago, and felt a strong respect.

    From my old notes, here's an audio interview about a previous book. A Miracle Made Lyrical: Jim Gleick's Isaac Newton []

    Also high praise for Chaos from I Missed the Complexity Revolution []

    I don't understand how this reviewer has never heard of Chaitin, but finds this book vastly too elementary. Oddly, I mentioned Chaitin in an earlier post this very day. Perhaps reviewer should t

  • A well written review from the poster. My shorter one: If you were to drop this book into a black hole, the information content of the universe would not change.

  • We want... Information. Information! INFORMATION!!

  • As someone who classifies himself as a "geek", albeit one terribly bad at math and logic, I thought it was a pretty good read. I do wish I had some more "hello, this is information theory presented in an engaging manner" books, though.

  • One wonders if the book points out that Richard Cox derived what amounts to information theory several years prior to Shannon...

  • by BenSnyder (253224) on Tuesday October 18, 2011 @02:27AM (#37747388) Homepage

    I'm reading this book and am about 60% through it... up to the part about entropy.

    I get that the reviewer was looking for equations. But I found the history of everything to be wonderfully helpful in understanding the general concepts. I'm confused as to why he's confused that Gleick is giving a history of Information Theory and not a discourse on it.

    I give the book 4 out of 5 and the reviewer 2 out of 5.
