Everyday uses of the word "information" prompt one to think of it as something like a string of characters that takes up space, e.g. in a book or in a computer's memory. These uses, however, depend on context, since one person's information is another's meaningless gibberish. In fact the idea of information as a collection of isolated objects misses important physical aspects of the concept. It has, for instance, distracted discussions of idea-evolution with questions about "particulate storage" in the brain, even though the conversion of idea codes to digital storage is clearly taking place outside of our brains.
Even in its earliest communication-theory applications, information was an element of structure (like the string of characters delivered at the far end of a communication line) that was correlated with another element of structure (like the string of characters typed in for sending at the near end). This correlation value is delocalized, i.e. it depends on the existence of structures at both ends of the comparison. Thus the information value of a map of the sky can be taken away either by destroying the map or by randomizing the positions of the stars in the heavens.
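To make that delocalization concrete, here is a minimal sketch in Python, using a tiny two-outcome joint-probability table of my own choosing (the tables and function names are illustrative, not anything from the original discussion). The mutual information between what is typed at the near end and what arrives at the far end vanishes as soon as the two ends stop tracking one another, even though neither string is destroyed by itself.

```python
import numpy as np

def mutual_information_bits(joint):
    """Mutual information I(X;Y) in bits from a joint probability table."""
    joint = np.asarray(joint, dtype=float)
    px = joint.sum(axis=1, keepdims=True)   # marginal of the near-end string
    py = joint.sum(axis=0, keepdims=True)   # marginal of the far-end string
    mask = joint > 0
    return float(np.sum(joint[mask] * np.log2(joint[mask] / (px @ py)[mask])))

# A noiseless one-bit channel: what is delivered matches what was typed.
correlated = [[0.5, 0.0],
              [0.0, 0.5]]

# "Randomizing the stars": the far end no longer depends on the near end.
scrambled = [[0.25, 0.25],
             [0.25, 0.25]]

print(mutual_information_bits(correlated))  # 1.0 bit
print(mutual_information_bits(scrambled))   # 0.0 bits
```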
All quantitative applications of the information concept, like the "mutual information" between the two correlated structures described above, may be seen as special cases of a correlation measure known as KL-divergence. This is a kind of net surprisal, whose units are proportional to the logarithm of a multiplicity, as for example in #choices = 2^#bits.
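As a minimal numerical sketch (assuming the standard discrete form of KL-divergence, taken with log base 2 so the units come out in bits), the net surprisal of learning which of eight equally likely choices is the actual one works out to three bits, matching #choices = 2^#bits:

```python
import numpy as np

def kl_divergence_bits(p, q):
    """Net surprisal D_KL(p||q) in bits: the expected extra surprisal of
    holding belief q when the outcomes are really distributed as p."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

# Learning which of 8 equally likely choices is the actual one:
n_choices = 8
prior = np.full(n_choices, 1.0 / n_choices)  # before: all choices equally likely
posterior = np.zeros(n_choices)              # after: one choice is certain
posterior[0] = 1.0

bits = kl_divergence_bits(posterior, prior)
print(bits)       # 3.0 bits
print(2 ** bits)  # 8.0 choices, i.e. #choices = 2^#bits
```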
A particular advantage of this definition (with roots in Bayesian inference) is that it works whether or not the elements of structure take the form of replicable code strings. In the language of the introductory note to Shannon and Weaver's 1949 book, this broadened definition can deal with more complex questions of correlation, meaning, truthfulness, and fidelity than the "Level A" problems associated with the communication of replicable codes.
Thus for example it applies to the evolution of complexity (like the accretion of planets in the solar nebula) before the elements of correlation-storage associated with it "go digital", e.g. in the form of nucleic-acid or ASCII strings. Moreover this definition works for quantum systems, e.g. qubits, and "it has second law teeth". As a result, acquired knowledge in general, and even a blank storage device on which to write, exacts a minimum price in thermodynamic availability.
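As one illustration of that minimum price, the short calculation below sketches the kT ln 2 per bit lower bound (Landauer's limit) that is usually cited in this second-law context; the room temperature and the 1-terabyte device are hypothetical numbers chosen here for scale, not figures from the discussion above.

```python
import math

# Lower bound on the availability (free energy) cost of preparing or
# erasing one bit of storage at temperature T: k_B * T * ln(2).
k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # room temperature, K

cost_per_bit = k_B * T * math.log(2)            # joules per bit
print(f"{cost_per_bit:.3e} J per bit")          # ~2.9e-21 J

# Blanking a hypothetical 1-terabyte device at this limit:
bits = 8 * 1e12
print(f"{cost_per_bit * bits:.3e} J for 1 TB")  # ~2.3e-8 J
```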