Proceedings of the Eighth International Conference on Conceptions of Library and Information Science, Copenhagen, Denmark, 19-22 August, 2013
"Deep down things": in what ways is information physical, and why does it matter for information science?
David Bawden and Lyn Robinson
City University London, Information Science, Northampton Square, London, EC1V 0HB, UK
Introduction
And for all this, nature is never spent; / There lives the dearest freshness deep down things (Gerard Manley Hopkins)
Do the truly fundamental laws of nature concern – not waves and particles – but information? (Gilles Brassard, 2012, 1)
Information is information, not matter or energy (Norbert Wiener, 1961, 132)
The purpose of this paper is to consider some of the properties of information in its most fundamental guise: ‘deep down things’ in the information spectrum, in Hopkins’ phrase (used also by Bruce Schumm (2004) in the title of his excellent account of the standard model of physics). It considers whether, and how, information is physical, and what significance these questions have for library and information science (LIS).
Information is physical
The idea that the concept of information might play some part in the understanding of the physical universe has been proposed since the nineteenth century, through scholars such as Boltzmann, Gibbs and Szilard (Robinson and Bawden 2013, Carroll 2011). It was set out explicitly by Rolf Landauer, the German-American physicist who was for many years manager of IBM’s research laboratories (von Baeyer 2004, Bennett and Fowler 2009). His original article, entitled “Information is physical” (Landauer 1991), was followed up over the next decade with a series of articles and presentations reiterating and embellishing the theme, with titles such as “The physical nature of information”, “Information is a physical entity”, and “Information is inevitably physical”. The main point was put consistently and straightforwardly: “Information is not an abstract entity but exists only through a physical representation, thus tying it to all the restrictions and possibilities of our real physical universe … information is inevitably inscribed in a physical medium” (Landauer 1999, 63 and 64).
Since then the idea of information as a physical constituent of the universe has been widely adopted within the scientific community: see, for example, Vedral (2010) and Davies and Gregersen (2010) for recent overviews, Karnani, Pääkkönen and Annila (2009) for an elaboration of Landauer’s position, and Robinson and Bawden (2013) for more references. This has led to clear endorsements of Landauer’s view, such as:
- “the states of physical systems should … be thought of as catalogues of information” (Vedral 2012, 222)
- “starting from its very earliest moments, every piece of the universe was processing information” (Lloyd 2010, 96)
- “the physical world is a multiverse, and its structure is determined by how information flows in it” (Deutsch 2011, 304)
The Foundational Questions Institute, a non-profit physics organization, announcing a grant program to support research on the physics of information, commented that “interest in research on the meeting point of physics and information has exploded in recent years” (Foundational Questions Institute 2013).
The idea has also entered popular awareness through magazine articles, television programmes, and even fiction (Egan 1994, Bear 2008), so that the public becomes acquainted with ideas such as:
- “everything that goes on in [the universe] can be explained in terms of information processing” (Brooks 2012, 41)
- “information is woven profoundly into the fabric of reality” (Al-Khalili 2012).
One consequence is that the general public, or at least that section of it with an interest in scientific matters, is becoming aware of the importance of the concept of information in a very different sense from its previous usage; this may well have an impact, albeit indirect, on how the information disciplines and professions are viewed. More directly, the acceptance of the information concept in the domain of the physical sciences may be seen as particularly significant for viewpoints which explicitly link information in different domains, including the physical, most notably the viewpoints of Stonier (1990, 1992, 1997), of Bates (2005, 2006) and of Floridi (2010): other examples are given by Robinson and Bawden (2013).
But to assess the significance of these developments, we need a little more clarity as to what they entail.
How is information physical?
It is all very well to speak of information as physical, or as a fundamental constituent of the universe; but in what sense, exactly? Landauer’s point, as we noted above, was that whenever we find information, we find it inscribed or encoded somehow in a physical medium of whatever kind. Quantitative relations between information and other physical concepts have been recognized since Szilard first showed that the minimum amount of work needed to store one bit of information is kT ln 2, where k is Boltzmann’s constant and T is the temperature of the storage medium. Landauer himself showed that erasure of one bit of information generates heat, and increases physical entropy by k ln 2 (Landauer 1961), this being known as Landauer’s Principle. For an analysis of such inter-relations, and on the role of temperature in relating information and energy, see Bennett (2003) and Duncan and Semura (2004). (Information in such calculations is invariably treated as digital, the unit being the bit, and hence subject to analysis by Shannon’s information theory concepts.) Recent experimental studies have verified this relation, measuring the heat dissipated when one bit of information is erased (Bérut et al. 2012).
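To give a sense of the scale involved, the following back-of-the-envelope calculation (our own illustrative sketch, not taken from the works cited) evaluates the Landauer bound kT ln 2 at room temperature:

```python
import math

# Landauer bound: minimum heat dissipated when one bit is erased, E = k T ln 2
k = 1.380649e-23   # Boltzmann's constant, joules per kelvin
T = 300.0          # room temperature, kelvin

energy_per_bit = k * T * math.log(2)
print(f"{energy_per_bit:.3e} J per bit")   # roughly 2.9e-21 J

# Erasing a gigabyte (8e9 bits) at room temperature therefore dissipates,
# at an absolute minimum, only about 2e-11 J -- far below the waste heat
# of any present-day memory device.
```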
Understood in this way, information as a physical quantity is conserved, and can neither be created nor destroyed; though it can be discarded, or erased from any particular physical medium (Chiribella, D’Ariano and Perinotti 2012). Indeed, the concern that information might be lost when objects fall into black holes was the motivation for studies over several decades leading to the solution of Hawking’s ‘black hole information paradox’ (von Baeyer 2004, Davies 2010). This seems at variance with the kind of information concepts of relevance to LIS (Cornelius 2002, Capurro and Hjørland 2003, Bawden and Robinson 2012 chapter 4), where we are familiar with the idea that information can be lost; perhaps most obviously in the dramatic example of the burning of the only existing copy of a book. But even in this extreme scenario, information in the ‘deep down’ physical sense is not lost. As Carroll (2011, 274) puts it:
“Imagine that … you throw your copy of this book onto an open fire. Later, you worry that you might have been a bit hasty, and you want to get the book back. Too bad, it’s already been burnt into ashes. But the laws of physics tell us that all the information contained in the book is still available in principle, no matter how hard it might be to reconstruct in practice. The burning book evolved into a very particular arrangement of ashes and light and heat; if we could exactly capture the complete microstate of the universe after the fire, we could theoretically run the clock backwards and figure out whether the book that had burned was this one or [another]. That’s very theoretical, because the entropy increased by a large amount along the way, but in principle it could happen.”
So, in this viewpoint, information in its digital form is seen as always instantiated in a physical medium, in the same way as other physical quantities such as energy and entropy; it obeys similar general laws, and is, in a sense, interchangeable with them. But why is this important for physical science? At the risk of over-simplification, the literature suggests that there are three main answers to this question: the relation between information and physical entropy; the informational nature of quantum theory; and the recasting of established physical laws in terms of information.
Information and entropy
An explicit link between information and the physical world was first realized in the studies of the thermodynamic concept of entropy, with the insights of Ludwig Boltzmann. As von Baeyer (2003, 98) rather over-dramatically puts it, “by identifying entropy with missing information, Boltzmann hurled the concept of information into the realm of physics”. Entropy was initially understood as a purely physical, thermodynamic, concept, a measure of the ‘lost’ or ‘useless’ energy in a system (Atkins 2007, 2010), and this meaning is still relevant in physical science. Subsequent studies, particularly by Boltzmann, by Gibbs and by Szilard, showed that it could also be understood as a measure of the disorder of a physical system, and as related to the information that we have about it, though the concept of information was not specifically used in these analyses (Robinson and Bawden 2013, Carroll 2011, von Baeyer 2003). The explicit link between entropy and information, using Shannon’s concept of objective quantifiable information, was formalized by Zurek (1989), and this may be seen as the irrefutable confirmation of information as a physical entity.
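The idea of ‘entropy as missing information’ can be made concrete with a toy calculation (our own illustrative sketch, assuming a simple model system rather than anything drawn from the works cited): if W microstates are compatible with what we know of a system’s macrostate, then the Boltzmann entropy k ln W corresponds to log2 W bits of missing information, each bit contributing k ln 2 to the physical entropy.

```python
import math

k = 1.380649e-23          # Boltzmann's constant, J/K

# Toy system: 20 two-state particles whose individual states we do not know,
# so W = 2**20 microstates are compatible with the macrostate we observe.
W = 2 ** 20

boltzmann_entropy = k * math.log(W)   # S = k ln W, in joules per kelvin
missing_bits = math.log2(W)           # Shannon-style missing information, in bits

print(f"S = {boltzmann_entropy:.3e} J/K")
print(f"missing information = {missing_bits:.0f} bits")

# The two quantities differ only by the conversion factor k ln 2 per bit:
assert math.isclose(boltzmann_entropy, missing_bits * k * math.log(2))
```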
Information and quantum theory
Quantum theory became established in the 1920s as our most fundamental approach to understanding nature at small scales. It has the reputation of being counter-intuitive and difficult to comprehend as anything other than a highly effective mathematical formalism; for popular but scientifically sound overviews, see Al-Khalili (2004) and Cox and Forshaw (2011). It has been clear, from its earliest years, that information concepts lie at the heart of quantum theory. Indeed, one interpretation, pioneered by von Neumann, suggests that human consciousness and knowledge dictate events at the quantum level; although this viewpoint is no longer mainstream, information remains central, particularly since the states of physical systems are described by what Schrödinger, one of the founders of quantum theory, termed ‘catalogues of information’ (Vedral 2012). Two examples are the uncertainty principle, which governs our possible knowledge of quantum states, and quantum entanglement, which describes the extent to which quantum objects may share information. The currently popular Everettian 'many worlds' interpretation of quantum mechanics is inextricably linked with information concepts (Byrne 2010, Wallace 2012). There are numerous other examples, such that Vedral is quoted by Brooks (2012, 43) as saying that “Quantum physics is almost always phrased in terms of information processing … It’s suggestive that you will find information processing at the root of everything.” Chiribella, D’Ariano and Perinotti (2012) note several explorations of the idea that information is at the core of quantum theory; see also Brassard (2005) and Vedral (2010), particularly for the development of quantum computing and quantum information science, which extend the idea in a practical way.
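Schrödinger’s image of a state as a ‘catalogue of information’ can be illustrated with a minimal numerical sketch (our own, using the numpy library; it is not drawn from the works cited). An entangled Bell state of two qubits yields measurement outcomes that are perfectly correlated, even though each qubit considered alone is maximally uncertain; the information the state carries lies in the correlations.

```python
import numpy as np

# Bell state |Phi+> = (|00> + |11>) / sqrt(2), in the basis {|00>, |01>, |10>, |11>}
phi_plus = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)

# Born rule: outcome probabilities are the squared amplitudes
joint = np.abs(phi_plus) ** 2            # [P(00), P(01), P(10), P(11)] = [0.5, 0, 0, 0.5]

# Marginal distribution of the first qubit alone: sum over the second qubit
marginal_first = joint.reshape(2, 2).sum(axis=1)   # [0.5, 0.5] -- a fair coin

for outcome, p in zip(["00", "01", "10", "11"], joint):
    print(f"P({outcome}) = {p:.2f}")
print("first qubit alone:", marginal_first)

# Each qubit by itself is maximally uncertain, yet the joint outcomes are
# perfectly correlated (00 or 11, never 01 or 10): the state's information
# is held in the correlations -- a literal 'catalogue' of what measurements
# on the pair can reveal.
```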
Informational laws of nature
Perhaps the most dramatic extension of the idea of information as a physical quantity is the suggestion that physical laws might be recast in information terms. This was first propounded by the American physicist John Wheeler (1990, 5), who suggested that “all physical things are information-theoretic in origin”, and that information should therefore be the single most fundamental notion in understanding the whole of physics. More recently, studies have shown that much of quantum theory can be derived from informational principles (Chiribella, D’Ariano and Perinotti 2011, 2012).
What does it mean to say that physical laws may be expressed in information terms? Davies (2010, 75) states it bluntly (his italics): “information is regarded as the primary entity from which physical reality is built … the laws of physics are informational statements”. Chiribella, D’Ariano and Perinotti (2012, 1879) elaborate: “The mathematical framework of the theory can be expressed by using only concepts and statements that have an informational significance, such as the concept of signaling, of distinguishability of states, or of encoding/decoding … The informational concepts used … are connected to the more traditional language of physics by viewing the possible physical processes as information processing events. For example, a scattering process can be viewed as an event – the interaction – that transforms the input information encoded in the momenta of the incoming particles into the output information encoded in the momenta of the scattered particles”. It is worth noting that the ‘information’ in such analyses is regarded in “a very basic, primitive sense”: although certainly objective, it is not necessarily related to any particular measure, such as that of Shannon.
In principle, every physical process can be described in terms of interactions between particles that produce binary answers: yes or no, here or there, and so on. Thus natural laws, governing the interactions and rearrangements of the constituents of the physical universe, are perceived as the flipping of binary digits, as in a digital computer. Treating physical phenomena in terms of information processing has led to the notion that “the universe computes”, and that the universe may indeed be regarded as a kind of (quantum) digital computer; see, for example, Vedral (2010) and Lloyd (2006, 2010). Finally, Vedral (2012) holds out the intriguing and ambitious possibility that finding an informational basis to the laws of both general relativity and quantum mechanics could bridge the gap between these two, currently irreconcilable, physical theories.
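As a toy illustration of this picture (again our own sketch, not taken from the works cited), a reversible logic gate such as the controlled-NOT rearranges bits without ever mapping two distinct states to the same output, so the ‘computation’ can always be run backwards and, as discussed above, no information is lost:

```python
# Physical dynamics as reversible bit-flipping: the CNOT ("controlled-NOT") rule
# flips the second bit exactly when the first bit is 1. Because the rule is a
# bijection on states, no information is ever created or destroyed.

def cnot(state):
    """Map a pair of bits (control, target) to (control, target XOR control)."""
    control, target = state
    return (control, target ^ control)

states = [(0, 0), (0, 1), (1, 0), (1, 1)]
evolved = [cnot(s) for s in states]

print(list(zip(states, evolved)))
# Distinct inputs stay distinct, so the evolution is reversible:
assert len(set(evolved)) == len(states)
# Applying the rule twice undoes it -- 'running the clock backwards':
assert [cnot(cnot(s)) for s in states] == states
```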
Limitations and opposing views
We have set out a view, gaining currency within the physical sciences, that information may be seen as a constituent of physical reality: objective, quantifiable, digital, and quantum in nature. It is fair to recognise that many scholars, from a variety of fields, reject these ideas, in part or in whole. Before going on to consider what the significance of these ideas for LIS may be, we need to examine some of these objections.
In terms of rejection in part, an example is given by Floridi (2011, chapter 14), who has argued against a digital ontology, while agreeing that the fundamental objects of reality are indeed informational.
More fundamentally, some scientists take strong exception to the whole idea of information as a scientific concept, let alone as the main element in physical laws. Müller (2007), for example, objects to the association of information with physical entropy. Wallace (2012, 30) writes (his italics), of the idea that “the world itself is just information”, that “I confess that insofar as they mean it literally and not metaphorically, I do not understand what they are talking about. It seems to be in the nature of ‘information’ that it is information about something; how can it make sense to regard the world itself as just information?”. This has resonance with the disquiet of Hjørland (2007, 2009) about objective conceptions of information in relation to LIS. In similar vein, Timpson (2010) argues that ‘quantum information theory’ is simply the theory of information in a quantum universe, not the theory of some new entity, ‘quantum information’. As Wallace (2012, 30) suggests: “‘information is physical’ does not have to mean ‘the physical is merely information’; it could equally mean ‘information, too, is physical’”.
Furthermore, many scholars, both from the physical sciences and from philosophy, have argued for the reality and importance of abstract knowledge, and of disembodied information. Drawing inspiration typically from Plato’s forms, from Frege’s ‘third realm’, and from Popper’s World 3, they argue that ‘abstract objects’, which must surely be composed of (presumably abstract) information, are as real as physical entities (Gideon 2012); see also Deutsch (2011) for examples of the reality and effect of abstract concepts, specifically in science. It is also notable that one of the questions posed in the Foundational Questions Institute’s (2013) Physics of Information grant program is “Can information exist without matter, and vice-versa, or are they two sides of the same coin?”. The question is still a valid one, even in the physical context.
So, in summary, we may say that, although there is currently a strong interest in, and increasing support for, the ‘information physics’ concept of information as a constituent of the physical world, this is by no means uncontroversial. The validity of this approach, in part and in whole, is questioned. In particular, it is argued that the fact that information is invariably found in physical settings, which may be analysed in informational terms, does not mean that this is all that is to be said about information. Specifically, it does not imply that abstract, non-physical, forms of information, or of entities related to information, cannot exist.
Significance for library and information science
We now consider what, if any, significance these considerations of the physical nature of information may have for library and information science. We have argued before (Robinson and Bawden 2013) that it is worthwhile to consider the relations between ‘information’ as it is understood in different domains, following the lead of scholars such as Stonier and Bates; as noted above, there are opposing views (see, for example, Hjørland 2007, 2009).
If there are indeed genuine relations, beyond shallow analogies, of this kind, then the implications for library and information science may be considerable. We will focus here on three of them: the nature of the simple conceptual models commonly used as a basis for theory-building in information science; the nature of emergent properties of information; and prospects for a broader information-focused discipline. These are implications at the conceptual level, but if they, and others like them, prove fruitful, then we may expect pragmatic implications for system design to follow.
There are two main simple conceptual models used to show the place and nature of information for the library and information sciences (Bawden and Robinson 2012, chapter 4, Ma 2012). The first regards information and knowledge as the same kind of entity, with knowledge viewed as 'refined' information, set into some form of larger structure. This is typically presented as a linear progression, or a pyramid, from 'data', or 'capta' – data in which we are interested – through 'information' to 'knowledge', perhaps with 'wisdom' or 'action' at the far end of the spectrum, or at the apex of the pyramid; this is often termed the DIKW (data-information-knowledge-wisdom) model. The second, based on Karl Popper's 'objective epistemology', uses 'knowledge' to denote Popper's 'World 2', the subjective knowledge within an individual person's mind. 'Information' is used to denote communicable knowledge, recorded, or directly exchanged between people; this is Popper's 'World 3' of objective knowledge, necessarily encoded in a 'World 1' document, or physical communication. Information, in this model, is 'knowledge in transit'.
It seems clear that the DIKW model is best suited to represent a natural progression from the objective scientific understanding which we have discussed here, and it has been used in this way in ‘big picture’ accounts of information, such as that of Floridi (2010). However, the problem here is that such a model may be understood as implying that knowledge is ‘nothing but’ a form of information, and information ‘nothing but’ a form of data. This reductionist approach is inappropriate, since it ignores our next theme, the emergent properties of information.
It is clear, when we examine the subject matter of the library and information sciences, that there are very real and important concepts which are not captured in the objective view of physical information. They include, but are certainly not limited to, knowledge, meaning, understanding and relevance. These are emergent properties, seen at the level of communicable human information, but not at the physical level. Exactly where they appear is an interesting point of debate: it has been argued that meaning emerges in the domain of biology (see, for example, Karnani, Pääkkönen and Annila 2009), while consciousness has been held to be an emergent property of integrated information (Tononi 2008). Knowledge is a particularly interesting example. It is clearly related to information, and yet, while (physical) information is conserved, knowledge is not: on the contrary, it may grow indefinitely, and – though abstract – may have dramatic effects on the physical world (Deutsch 2011; see also several chapters in Davies and Gregersen 2010).
This is not to argue that we should expect to explain such LIS concepts in terms of physical information; lessons from other sciences warn us against this. There are emergent properties and phenomena in chemistry which are not reducible to physical principles; similarly, there are those in biology which are not reducible to chemistry, and in the social sciences which are not reducible to biology. But surely no one would doubt that each science is richer for an appreciation of the properties of those which adjoin it, and from which lessons may be learnt. And, as Deutsch (2011) points out, emergent properties do not form a hierarchy, with the lower levels being more fundamental. Properties at any level may be fundamental, and this applies to information-related properties as much as to any other.
This leads us to our third consideration. If we take the arguments presented in this paper at all seriously, they point to the possibility of a much larger science of information, interested in all of its manifestations. Information physics should be on the horizon of LIS, even though our discipline will not directly contribute to its development. For who can say that ‘our’ emergent properties may not prove fundamental, with lessons for other disciplines that take information as an important concept?
Conclusions
Information is now becoming accepted as a fundamental constituent of the physical universe. The relations between this understanding of information and the understanding of the concept in LIS are worth studying. Emergent properties of information may be fundamental at any level, and lessons may be learnt between levels in any direction.