2.2. Information, entropy, and data

In this section, I will propose that information is not data; rather, information is the feedstock from which data are made. Further, I will propose that ‘information content’, or entropy, as defined by Shannon and Weaver tells us nothing about information as defined in the last section, i.e., about information as the shape, form, or structure of something. First, however, let us put the question: Can information be measured?

Information, as I said, is the shape, the form, the structure of a thing or phenomenon. A form can be measured: You can determine the length, width, height, or diameter of a solid body or of a cavity. You can determine the wavelength and amplitude of a wave. You can compute the size of areas and volumes and the course of a curve. Obviously, some aspects of information can be measured or computed, but the shape of a tree, for example, would be poorly described by its height, trunk circumference, and crown diameter alone. Even the number of its branches, leaves, or cells tells us little about the shape of a tree.

Some confusion arose from communication theory and the term ‘information content’ established by Claude E. Shannon and Warren Weaver [1]. In the 1940s, they worked on the problem of lossless transmission of information, that is, of signs or code. The ‘information content’ of a single sign is a measure of the improbability or uncertainty with which the occurrence of that sign is expected within a defined set of signs. The more uncertain the occurrence of a sign, the greater its information content according to Shannon. This concept of information abstracts not only from the meaning of a sign, but also – and that is crucial in our context – from its specific form, that is, from what makes the sign different from other signs and recognizable to us.

An example may illustrate this: When rolling a fair die, the probability (or uncertainty) for each number of pips (1–6) to occur is equal, namely 1/6. Therefore, the information content of every throw is the same, although the numbers from one to six are quite different, and moreover it is usually not a matter of indifference to a gambler whether he has rolled a one or a six. This little example shows: this ‘information content’ has nothing to do with what is usually meant by the word information. It is rather about calculating the amount of data necessary for the transfer of any arbitrary information of a given kind, e.g., text, images, or music. Warren Weaver wrote:

“The word information, in this theory, is used in a special sense that must not be confused with its ordinary usage. In particular, information must not be confused with meaning. In fact, two messages, one of which is heavily loaded with meaning and the other of which is pure nonsense, can be exactly equivalent, from the present viewpoint, as regards information. It is this, undoubtedly, that Shannon means when he says that ‘the semantic aspects of communication are irrelevant to the engineering aspects.’ But this does not mean that the engineering aspects are necessarily irrelevant to the semantic aspects.” [2]
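Returning to the die: Shannon’s per-sign information content can be computed directly. In the following sketch, the function name `surprisal` is my own label for that quantity; the numbers are not taken from the text:

```python
import math

def surprisal(p):
    """Shannon's 'information content' of a sign that occurs with probability p, in bits."""
    return -math.log2(p)

# A fair die: every face occurs with probability 1/6,
# so every throw carries the same information content.
p = 1 / 6
per_face = [surprisal(p) for _ in range(6)]
print(per_face[0])  # log2(6) ≈ 2.585 bits, identical for all six faces

# The entropy of the source is the probability-weighted average surprisal;
# with equal probabilities it coincides with the per-face value.
entropy = sum(p * surprisal(p) for _ in range(6))
print(entropy)
```

That every face yields the same value, whatever its meaning for the gambler, is exactly the abstraction from form and meaning discussed above.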

Shannon and Weaver never claimed that their theory is of substantial relevance for areas other than communications engineering. Nevertheless, it has enormously influenced science and philosophy. That may mainly be due to the fact that Shannon’s equation for information content is formally analogous to an equation from statistical mechanics and thermodynamics, namely the equation for entropy – a property that can be interpreted as a measure of randomness or chaos in a system [3]. The identification of information content with entropy was confusing insofar as it seemingly meant that, not only in data transfer but also in the ‘real world’, the more chaotic a system is, the more information it contains.
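The formal analogy can be made explicit. The two equations below are the standard textbook forms, not quotations from Shannon’s paper:

```latex
% Shannon's entropy of a sign source with probabilities p_1, ..., p_n (in bits):
H = -\sum_{i=1}^{n} p_i \log_2 p_i

% Gibbs entropy of a system whose microstates have probabilities p_i
% (k_B is Boltzmann's constant):
S = -k_B \sum_{i} p_i \ln p_i
```

Apart from the constant k_B and the base of the logarithm, the two expressions are identical in form.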

Others, by contrast, claimed that it must be just the reverse: Information is a measure of order or complexity in a system. Starting from the concept of entropy in thermodynamics, Erwin Schrödinger and Léon Brillouin concluded that living beings accumulate negative entropy, or (abbreviated) negentropy, by building up their structure and internal organization. Likewise, Norbert Wiener, the father of cybernetics, regarded information as a measure of organization in a system [4]. After Shannon’s theory was published, it thus appeared unclear whether information should be construed as negentropy or as entropy. Carl Friedrich von Weizsäcker wrote:

“Information is the measure of an amount of form. We will also say: Information is a measure of the fullness (or the richness) of shape (Gestaltfülle).” [5]

One may, however, only guess what he meant by “fullness of shape” – a measure of order or of complexity? Those are not the same. The following figure may illustrate this. A: Information content after Shannon, i.e., entropy, is greatest in chaos, because entropy is a measure of uncertainty or unpredictability. B: A high degree of order with low complexity has the lowest entropy, because the distribution of dots is easily predictable. Since negentropy has the same numerical value as entropy but with a minus sign, the simple order has the greatest negentropy. Therefore, neither entropy nor negentropy is suited as a measure of complexity or fullness of shape. C: Both the entropy and the negentropy of a more complex order range between the values of A and B.

[Figure: three dot patterns illustrating A: chaos, B: simple order, C: complexity.]

Regarding the amount of data required for encoding, we can say it is greatest for A and smallest for B. A requires the greatest number of bits because the position of every single dot must be encoded. In the case of B, only the gap between the dots in a row and the distance between the rows need to be specified to encode the structure. Here, too, the more complex structure C ranges between A and B. Again, the amount of data, the number of bits needed for the quantitative description of a form or structure, is not a measure of complexity or Gestaltfülle.
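The differing encoding costs of A, B, and C can be illustrated computationally. The sketch below uses the compressed size of a dot grid as a rough stand-in for the number of bits its description requires; the grid size, the deviation rate, and the use of `zlib` are my own arbitrary choices, not part of the argument above:

```python
import random
import zlib

random.seed(0)
SIZE = 64 * 64  # cells of a 64x64 grid; 1 = dot, 0 = blank

# A: chaos - every cell is random, so every cell must be encoded individually.
chaos = bytes(random.randint(0, 1) for _ in range(SIZE))

# B: simple order - a strictly alternating pattern, describable by a short rule.
order = bytes(i % 2 for i in range(SIZE))

# C: complexity - mostly regular, but with occasional random deviations.
complexity = bytes(i % 2 if random.random() < 0.95 else random.randint(0, 1)
                   for i in range(SIZE))

sizes = {name: len(zlib.compress(grid))
         for name, grid in [("A chaos", chaos),
                            ("B order", order),
                            ("C complexity", complexity)]}
print(sizes)  # compressed size: A greatest, B smallest, C in between
```

Compressed size tracks predictability, not Gestaltfülle: the chaotic grid costs the most bits precisely because it has no recognizable shape at all.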

What kind of relationship exists between information and data? In the last section, I characterized information as the form or structure of something. Data are generated when we quantitatively describe a form or structure by counting or measuring. The smallest unit of data is a bit, every bit being a yes/no decision (on the premise that both options are equally probable). A bit can be coded in binary as a one or a zero. The simplest question about a thing or phenomenon is: Does it exist or does it not?
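The premise in parentheses matters: a yes/no decision carries a full bit only if both answers are equally probable. A minimal check (the function name is my own):

```python
import math

def yes_no_entropy(p_yes):
    """Entropy of a yes/no decision with P(yes) = p_yes, in bits."""
    return sum(-p * math.log2(p) for p in (p_yes, 1 - p_yes) if p > 0)

print(yes_no_entropy(0.5))  # 1.0: exactly one bit when both options are equally probable
print(yes_no_entropy(0.9))  # about 0.469: a largely predictable answer carries less
```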

In measuring, we want to learn more about a thing than merely whether it exists. However, a measurement process, too, depends on our decisions: whether and when we measure, and which method and which scale we apply. The length of a board is as it is – it is a property of the form, a part of the information. The measured number, however, depends on the decision of which scale I use, inches or centimetres.

Thus we can say: Information is not data, but the source from which data can be obtained. We produce data by determining that something is or is not the case, that is, by making decisions. Why we are able to do so is the theme of a later chapter.

Let us put the reverse question: Are data information? They are not, because data are the content of information. Data, too, can only exist in a physical form or structure. They can, for example, be printed as numbers in a book, they can take the form of a curve in a diagram, or they can be transferred as a temporal structure of electrical impulses. It is just the same with data as with any other content of information: In a book, for example, measured data can be printed, but so can a poem. Both have a meaning for a reader able to understand the content. The role of understanding will be the theme of the next section.

My assertion that data are not information may be astonishing to some readers. However, it is very important not to equate information and data, as that equation has caused much confusion in the philosophy of mind. In fact, living beings, and especially their brains, process information, but that does not mean that they process data. Data are not the material that is processed here, but the final product. Computers, by contrast, do process data (with which we feed them), and their final product is information: forms and structures on a screen, text, images, or sounds, which we can perceive and understand.

Let us go back to our initial question: Is information (as such) measurable? Imagine you want to quantitatively determine the structure of a simple object, e.g., a pencil, in all its aspects right down to the atoms and elementary particles. That seems impossible, at least in practice. We can measure only aspects of the form and structure of the pencil, but not its information in total.

Does that mean the nature of information is not quantitative but qualitative? No, it does not. Qualities and contrasts are produced by our perception. In the physical world outside our mind, there are only differences, i.e., there is only a more-or-less [6]. Big and small do not exist there, only differences of size. Cold and warm do not exist there, only differences of temperature. There is no red and no green, only differences of wavelength, and so forth. It is our perception, it is our mind, that generates qualities and contrasts. Why it does so is the theme of a later chapter; in any case, it happens on the basis of information, that is, on the basis of all the differences existing in the world around us.




  1. Claude E. Shannon, Warren Weaver (1948, 1972). The Mathematical Theory of Communication. Urbana: University of Illinois Press. I put ‘information content’ between inverted commas in order to point out that the term information is used there in a very special sense, different from the usual meaning of the word.  [⇑]
  2. Ibidem, p. 8.  [⇑]
  3. Shannon reported how the choice of the term entropy came about:
      “My greatest concern was what to call it. I thought of calling it 'information', but the word was overly used, so I decided to call it 'uncertainty'. When I discussed it with John Von Neumann, he had a better idea. Von Neumann told me, you should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, nobody knows what entropy really is, so in a debate you will always have the advantage.”  (M. Tribus & E. C. McIrvine: Energy and information, Scientific American, 224, September 1971)
      John von Neumann was quite right that the two mathematical terms are analogous; thus the name entropy for Shannon’s ‘uncertainty’ is correct. The name ‘information content’, however, is problematic.  [⇑]
  4. Initially, Schrödinger and Brillouin did not yet think of information but of free energy when writing about negative entropy.
    Erwin Schrödinger: Was ist Leben? – Die lebende Zelle mit den Augen des Physikers betrachtet. München: Piper 1989;
    Léon Brillouin (1953). The Negentropy Principle of Information, Journal of Applied Physics, 24, 1152–1163;
    Léon Brillouin, La science et la théorie de l'information, Masson, 1959.
    Norbert Wiener (1948, 1961). Cybernetics: or Control and Communication in the Animal and the Machine. Paris: Hermann & Cie; Cambridge, MA: MIT Press.  [⇑]
  5. C. F. v. Weizsäcker: Aufbau der Physik. München 1985, p.167.
      Weizsäcker believed that whether entropy appears as a measure of the fullness of shape or as a measure of chaos depends only on knowledge:
      “The lot of written and printed papers stacked on my desk is an enormous fullness of shape if I know what is where on those papers. If I don’t know it (or if the charwoman doesn’t know it), then it is a chaos.” (ibidem, p. 168)
      The example is not appropriate, because entropy has nothing to do with knowledge in a semantic sense. Weizsäcker’s concern was to reconcile the second law of thermodynamics with the fact of biological evolution. So he wrote:
      “As mentioned above, it is common to regard entropy as a measure of chaos and, by that, to understand thermodynamic irreversibility as a growth of chaos. Evolution, by contrast, is understood as the growth of fullness of shape [Gestaltfülle] and insofar as a growth of order. On these premises, evolution must be perceived as a process contrary to thermodynamic irreversibility.” (ibidem, p. 169)
      It is not only perceived so; the two are in fact contrary to each other. But there is a simple explanation: Entropy grows with time – but only in a closed (isolated) system. Clearly, our biosphere is anything but a closed system. It incessantly receives energy from the sun, which enables evolution as well as the reduction of entropy. If sun and earth together are regarded as one system, then the entropy in this system grows because of the energy that the sun radiates into space. So we can call the earth a ‘negentropic island’.  [⇑]
  6. Exceptions are perhaps the four fundamental forces (gravity, electromagnetism, strong and weak interaction), as well as diverse elementary particles, which can be regarded as (physically) different in quality. However, that has nothing (or next to nothing) to do with our sensory qualities. We do not feel gravity as such, but our body weight or the weight of things we lift or carry. And we do not feel electricity as such, but a tingle or pain.  [⇑]
