
A Minimally-biased Philosophy of Life: Information-theory perspective

"Information theory" is a continuous concept as outlined in Claude Shannon's Mathematical Theory of Communication. It describes messages as sent between encoders and decoders, and describes the informational content of the messages (or of the continuous flow of message-carrying signals) in terms related to probability.

"Information theory" is not a theory in the colloquial sense, as a "theory of something," because there is no alternative. Information theory is as mathematicallyl fundamental as thermodynamics (a subject which Einstein deemed the most durable in physics), and in many cases is composed of exactly the same equations.

The most crucial information-theoretic concepts for understanding brains are memory, amplification, and bandwidth. Memory is the persistent storage medium, specially designed to prevent the smooth decay over time which bedevils every other pattern in the material world. Because brains must solve a four-dimensional problem, controlling a 3-D body over time based on input from 3-D space, for efficiency they must have a four-dimensional storage medium (which, conveniently, also serves as the computation medium).
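
A toy sketch of the decay point (the refresh mechanism below is the DRAM-style engineering analogue, chosen purely for illustration, not a claim about how brains actually do it): a stored pattern fades steadily unless something actively restores it.

    # A pattern held in any physical medium degrades a little at every time step
    # unless it is actively re-amplified back to full strength.
    retention_per_step = 0.99  # fraction surviving each step (made-up value)

    def remaining_strength(steps, refresh_every=None):
        strength = 1.0
        for t in range(1, steps + 1):
            strength *= retention_per_step
            if refresh_every and t % refresh_every == 0:
                strength = 1.0  # active restoration to full strength
        return strength

    print(f"after 500 steps, no refresh:       {remaining_strength(500):.4f}")
    print(f"after 500 steps, refresh every 50: {remaining_strength(500, refresh_every=50):.4f}")
    # Without maintenance the pattern decays toward zero; with periodic
    # re-amplification it persists indefinitely.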

Amplification is what makes possible the selective propagation of certain patterns over others. In an ordinary digital computer "duplication" would be the better word, but we need a continuous concept because brains use continuous media. As with life itself, amplification is the essential non-linearity that makes adaptive behavior possible. It is a requirement for "intelligence" and "consciousness" (whatever they are), but far simpler.
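
A minimal sketch of that non-linearity, using a softmax-style gain picked purely for illustration (the essay does not commit to any particular mechanism): linear amplification preserves the ratios between competing patterns, while a non-linear gain lets a slightly stronger pattern crowd out the rest.

    import math

    # Three competing "patterns", one of them marginally stronger than the others.
    signals = [1.00, 1.05, 0.95]

    def linear(xs, gain=2.0):
        # Linear amplification scales everything equally: relative differences
        # never grow, so nothing gets selected.
        return [gain * x for x in xs]

    def nonlinear(xs, sharpness=100.0):
        # A softmax-style non-linearity: a small advantage is amplified into
        # near-total dominance, i.e. selective propagation of one pattern.
        exps = [math.exp(sharpness * x) for x in xs]
        total = sum(exps)
        return [e / total for e in exps]

    print("linear:   ", [round(v, 2) for v in linear(signals)])     # [2.0, 2.1, 1.9]
    print("nonlinear:", [round(v, 2) for v in nonlinear(signals)])  # roughly [0.01, 0.99, 0.0]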

Bandwidth is the rate of information flow over time, analogous to the flow of electric current or water. But electrons and photons are single unambiguous elements, whereas bits must be determined by reference to something else. Without a particular reference, even the "noise" in a signal carries information, if only about long-past thermal fluctuations. (In fact, the most efficiently compressed signals in technology seem like noise before decoding.)
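
A small sketch of that last point using Python's standard-library zlib (the text and vocabulary below are invented for the example): a redundant signal has low entropy per byte, while its compressed form sits near the 8-bits-per-byte ceiling and is statistically hard to tell apart from noise.

    import math
    import random
    import zlib
    from collections import Counter

    def bits_per_byte(data: bytes) -> float:
        """Empirical (zeroth-order) entropy of a byte string, in bits per byte."""
        n = len(data)
        return -sum(c / n * math.log2(c / n) for c in Counter(data).values())

    # A redundant "signal": a long stream of words drawn from a tiny vocabulary.
    random.seed(0)
    words = ["signal", "noise", "channel", "message", "encode", "decode"]
    raw = " ".join(random.choice(words) for _ in range(5000)).encode()
    packed = zlib.compress(raw, level=9)

    print(f"raw:        {len(raw):6d} bytes, {bits_per_byte(raw):.2f} bits/byte")
    print(f"compressed: {len(packed):6d} bytes, {bits_per_byte(packed):.2f} bits/byte")
    # The compressed stream is far shorter, and its byte statistics approach the
    # 8 bits/byte of pure noise: without the decoder it looks like static.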

We know that each electrical device demands its own flow of electricity and each faucet its own flow of water, and physics can even express such limits as ironclad laws like "conservation of charge" and "conservation of mass." But since information flow is a much more recent concept than electricity or mass, we don't yet have the words to express the same sort of inviolable limits.

But such limits nonetheless exist. No transmitter or computer can create more information about the message source than it received. Yet brains seem to violate this principle, displaying before our eyes an apparently infinite-resolution view of a glitch-free world based on less-than-broadband input. This would be a technological miracle, if we even knew how it worked. Below is a graph of the bandwidths and input quality (another information-theoretic term) for various human activities, all estimated for uncompressed signals so that what humans perceive on screens can be compared with direct human perception.

[Graph: estimated bandwidths and input quality for various human activities]
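
The limit mentioned above, that no stage can create more information about the source than it received, is known in the textbooks as the data-processing inequality: for a chain X -> Y -> Z, the mutual information I(X;Z) can never exceed I(X;Y). A toy numeric sketch, with the channels and flip probabilities invented for the example:

    import math
    from itertools import product

    def mutual_information(joint):
        """I(A;B) in bits, from a joint distribution {(a, b): probability}."""
        pa, pb = {}, {}
        for (a, b), p in joint.items():
            pa[a] = pa.get(a, 0.0) + p
            pb[b] = pb.get(b, 0.0) + p
        return sum(p * math.log2(p / (pa[a] * pb[b]))
                   for (a, b), p in joint.items() if p > 0)

    def bsc(p_flip):
        """Binary symmetric channel: P(output | input) as a dict."""
        return {(x, y): (1 - p_flip) if x == y else p_flip
                for x, y in product((0, 1), repeat=2)}

    # X: a fair coin at the source.  Y: a noisy reading of X.  Z: Y passed
    # through a second stage of processing/noise.
    px = {0: 0.5, 1: 0.5}
    ch_xy, ch_yz = bsc(0.1), bsc(0.2)

    joint_xy = {(x, y): px[x] * ch_xy[(x, y)] for x, y in product((0, 1), repeat=2)}
    joint_xz = {}
    for x, y, z in product((0, 1), repeat=3):
        joint_xz[(x, z)] = joint_xz.get((x, z), 0.0) + px[x] * ch_xy[(x, y)] * ch_yz[(y, z)]

    print(f"I(X;Y) = {mutual_information(joint_xy):.3f} bits")  # about 0.531
    print(f"I(X;Z) = {mutual_information(joint_xz):.3f} bits")  # about 0.173, never more than I(X;Y)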