Hans Christian von Baeyer
Information: The New Language of Science
'Information', 'deformation', 'conformation', 'transformation', and 'reformation' obviously derive from 'formation', which, in turn, comes from 'form'.
Information is therefore the infusion of form on some previously unformed entity, just as de-, con-, trans-, and re-formation refer to the undoing, copying, changing, and renewing of forms. Information refers to moulding or shaping a formless heap - imposing a form onto something. So the question of its meaning reverts to the more fundamental one: What is form? The word 'form' entered Western philosophy as a translation of Plato's word eidos, the root of the words 'idea' and 'ideal'. Plato paints a picture of a world in which every object and attribute is but a pale, imperfect copy of a perfect, abstract ideal, a form, or archetype, which resides somewhere in an imaginary heaven. Thus a horse is but a copy of the form of horseness, the horse of horses, the Urhorse, the ideal horse that has shed all material properties. Similarly, if you are good or beautiful, you are not really good or beautiful in a profound, ideal sense; you merely have some characteristics that reflect, in a crude manner accessible to our senses, the forms of goodness and beauty.
...The theory that would vindicate Morse's rough-and-ready method a century later was devised by the American mathematician Claude Elwood Shannon. Shannon, who was born in Michigan, earned his PhD at MIT and worked at the AT&T Bell Telephone Laboratories in New Jersey for fifteen years before returning to MIT to teach. He died in February 2001 at the age of eighty-four, laden with honours and revered as the legendary founding father of the cyber age.
Shannon's theory grew out of his investigation of the efficiency of communications channels. Its success was largely due to the care with which he defined and delimited the problem. Figure 1 of his monograph sets the stage: five boxes in a row are connected by arrows from left to right, and labelled, in succession, 'Information Source', 'Transmitter', 'Channel', 'Receiver' and 'Destination'. (There is also a sixth box off to the side, ominously marked 'Noise' and connected to the Channel, but I'll come back to that later.) This simple picture inspired my own questions in chapter 1: 'What mediates between the atom and the brain? What agency originates in the atom, or, for that matter, anywhere in the material world, and ends up shaping our understanding of it?' My answer, 'information', also figures in Shannon's diagram.
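Shannon's five boxes can be sketched as a toy pipeline. This is a minimal illustration, not anything from Shannon's paper: the encoding (8-bit characters), the flip probability, and all the function names are assumptions made for the sketch.

```python
import random

def source():                        # 'Information Source': produces the message
    return "SOS"

def transmitter(message):            # 'Transmitter': encode each character as 8 bits
    return [int(b) for ch in message for b in format(ord(ch), "08b")]

def channel(bits, flip_prob, rng):   # 'Channel' plus 'Noise': each bit may flip
    return [b ^ (rng.random() < flip_prob) for b in bits]

def receiver(bits):                  # 'Receiver': decode 8-bit groups back to characters
    groups = ["".join(map(str, bits[i:i + 8])) for i in range(0, len(bits), 8)]
    return "".join(chr(int(g, 2)) for g in groups)

rng = random.Random(1)
sent = source()
# 'Destination' gets the message; with flip_prob = 0.0 the channel is noiseless.
received = receiver(channel(transmitter(sent), 0.0, rng))
print(received)  # SOS
```

Raising `flip_prob` above zero lets the 'Noise' box corrupt bits in transit, which is exactly the complication Shannon's sixth box introduces.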
The opening of the second paragraph of his seminal 1948 paper 'A Mathematical Theory of Communication' - a work variously likened to the Magna Carta, Newton's laws of motion and the explosion of a bomb - is crucial, and worth recalling:
The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point. Frequently the messages have meaning; that is, they refer to or are correlated according to some system with certain physical or conceptual entities. These semantic aspects of communication are irrelevant to the engineering problem.
By blithely ignoring the meaning of information, Shannon succeeded in constructing a complete mathematical theory.
Information Theory: Shannon
Fire in the Mind
...Just as nature abhors a vacuum, the mind abhors randomness. Automatically we see pictures in the stars above us; we hear voices in the white noise of a river, and music in the wind. As naturally as beavers build dams and spiders spin webs, people draw maps, in the sky and in the sand.
...(they) speculate about this most basic of human drives: the obsession to find and impose order. Whether the orders we invent are geographic, religious, or scientific, inevitably, it seems, we come to identify the map with the territory, to insist that the lines we draw are real...
...Pushed up against this edge, science often retreats into Platonism. Here on earth there may be no such thing as a perfect circle, but we recognise the rough approximations because we somehow have access to the perfect circle, the pure idea existing in a separate ectoplasmic realm. And so we are left with a duality between mind and matter, ideas and things.
Some followers of the information physics being pursued in Los Alamos, Santa Fe, and elsewhere suggest a way of bridging the divide: the laws of the universe are not ethereal, they say, but physical - made from this stuff called information, the 1s and 0s of binary code.
In building a tower of abstraction, one must start with a foundation, those things that are taken as given: mass, energy, space, time. Everything else can then be defined in terms of these fundamentals. But gradually, over the last half-century, some scientists have come to believe that another basic ingredient is necessary: information.
Pan Books 1999
...Life keeps going by using information. The relationship between information and entropy was first discovered by Claude Shannon, an engineer who worked at Bell Laboratories in New Jersey during the 1940s. Shannon was investigating what prevents information from being transmitted across a channel, such as a telephone line. He found that the fault lay with a hard-to-define quantity that seemed to increase whenever information was lost. In all the experiments he performed, Shannon never witnessed a decrease in this quantity. He called this slippery quantity entropy.
Life is all about ensuring that information is passed on, or transmitted, while at the same time preventing entropy from corrupting the message. Life has found a way to let entropy keep increasing, but not at the expense of its own survival or the integrity of the information it wants to transmit.