Mark Buchanan
Ubiquity
The Science of History... or Why the World is Simpler than We Think
Weidenfeld & Nicolson, 2000


pg 9

Keywords: the organisation of networks (system) - a small shock to trigger a response out of all proportion to itself. It is as if these systems had been poised on some knife‑edge of instability, merely waiting to be set off - instability – disaster – upheaval – The global ecosystem is occasionally visited by abrupt episodes of collapse - a profound similarity not between moving objects, but between the upheavals that affect our lives, and the ways in which the complicated networks in which they occur - economies, political systems, ecosystems and so on - are naturally organised - these events, and the workings of the systems in which they occur, may reflect the tenor of just a few simple and ubiquitous underlying processes.

Sand-pile game - Per Bak, Chao Tang, Kurt Wiesenfeld - what is the typical size of an avalanche? How big, that is, should you expect the very next avalanche to be? The result? Well… there was no result, for there simply was no 'typical' avalanche - The hypersensitive state to which the computer sand pile organises itself is known as the critical state -

critical state - Could the special organisation of the critical state explain why the world at large seems so susceptible to unpredictable upheavals? - the peculiar and exceptionally unstable organisation of the critical state does indeed seem to be ubiquitous in our world – At the heart of our story, then, lies the discovery that networks of things of all kinds ‑ atoms, molecules, species, people, and even ideas ‑ have a marked tendency to organise themselves along similar lines. On the basis of this insight, scientists are finally beginning to fathom what lies behind tumultuous events of all sorts, and to see patterns at work where they have never seen them before.

Catastrophe theory - catastrophe theory, despite its provocative name, has very little to say about the workings of anything like the earth's crust, an economy, or an ecosystem. In these things, where thousands or millions of elements interact, what is important is the overall collective organisation and behaviour. To understand things of this sort, one needs a theory that applies generally to networks of interacting things.

Chaos theory - chaos by itself cannot explain why a butterfly can cause a thunderstorm. Chaos may indeed explain why a tiny cause can quickly make the future different in its details (the positions of many molecules) from what it might have been. But to explain why tiny causes can ultimately lead to great upheavals, we need something else. We may say that although chaos can explain simple unpredictability, it cannot explain upheavability.

complexity: networks of interacting things: For centuries, physicists have sought the fundamental laws of the universe in timeless and unchanging equations – equilibrium - the air in the atmosphere is very much out of equilibrium, since it is continually being stirred and agitated and energised by the influx of light from the sun, and here we have a clue to the origin of upheavals: it lies in the distinction between what happens in equilibrium, and what happens away from it. If things in equilibrium are fairly simple, things out of equilibrium can be decidedly complex.

non-equilibrium physics, or, to use the currently fashionable language, the physics of complex systems - the relationship between the critical state and complexity is really quite simple: the ubiquity of the critical state may well be considered the first really solid discovery of complexity theory

history: In coming to consider complex systems, physicists seem to have gained a new appreciation of a simple fact: in the immediate world around us, history is important - out of equilibrium, history does matter. One can only make sense of the infinitely detailed shape of a snowflake by following the history of its growth by slow freezing from the thin air. These are all problems in non‑equilibrium physics, the physics of complex systems, or, to coin a new term, historical physics. If the laws of physics are ultimately simple, why is the world so complex? Why don't ecosystems and economies reveal the same simplicity as Newton's laws? The answer, in a word, is history.

dynamics of history - For things out of equilibrium, one cannot proceed by solving timeless equations, and so physicists have turned to another approach ‑ replacing equations with games -

explore the basics of crystal growth – frozen accidents - If the laws of physics did not allow frozen accidents, the world would be in equilibrium, and everything would be like the gas in a balloon, resting forever in the same uniform and unchanging condition. But the laws of physics do allow events to have consequences that can become locked in place, and so alter the playing field on which the future unfolds. The laws of physics allow history to exist. The discovery of the ubiquity of the critical state, then, is not only the first solid discovery of complexity theory, but also the first deep discovery concerning the typical character of things historic.





...The roots of war are to be sought in politics and history, those of earthquakes in geophysics, of forest fires in patterns of weather and in the natural ecology, and those of market crashes in the principles of finance, economics, and the psychology of human behaviour. Beyond the labels 'disaster' and 'upheaval', each of these events erupted from the soil of its own peculiar setting. Still, there is an intriguing similarity.

In each case, it seems, the organisation of the system - the web of international relations, the fabric of the forests or of the earth's crust, or the network of linked expectations and trading perspectives of investors - made it possible for a small shock to trigger a response out of all proportion to itself. It is as if these systems had been poised on some knife‑edge of instability, merely waiting to be set off.

In the history of life, we find a similar pattern. The fossil record reveals that the number of species on our planet has ‑ roughly speaking ‑ grown steadily over the past 600 million years. Yet on at least five separate occasions, sudden and terrible mass extinctions nearly wiped out every living thing. What happened? Many scientists point to precipitous changes in the earth's climate, caused perhaps by the impact of large asteroids or comets. Others suggest that the extinction of just a single species can, on occasion, trigger others, which in turn cause still others, leading to an avalanche of extinctions that can consume large fractions of entire ecosystems. The mass extinctions continue to mystify biologists and geologists, and yet one thing is clear: if the fabric of life seems resilient and largely in balance with itself, the truth is rather more unsettling. The global ecosystem is occasionally visited by abrupt episodes of collapse.

When I was in grade school, one of the dreaded tasks assigned by the geometry teacher was to determine if two triangles were similar. Here is a big triangle, she would say, and here is another much smaller triangle, oriented in a different way. Are they, aside from the irrelevant details of overall size and orientation, the same triangle? Put otherwise: if you can shrink or expand either triangle at will, turn them over and rotate them in any way you like, can you make the one fit precisely over the other? If so, then the triangles are similar ‑ if you understand the essential logic of one, its angles and the ratios of the lengths of its sides, then you also understand the other.

Three centuries ago, Isaac Newton sparked a scientific revolution by noticing another kind of similarity. His contemporaries must have been at first disbelieving, and later stunned, when he told them that an apple falls to the ground in precisely the same way as the earth moves round the sun. Newton saw that both earth and apple fall into the single category of things moving under the force of gravity. Before Newton, happenings on earth and in the heavens were utterly incomparable. Afterward, the motions of an apple or an arrow, a satellite, or even an entire galaxy were seen as deeply similar ‑ as mere instances of a single, deeper process.

'The art of being wise,' the American philosopher and psychologist William James once wrote, 'is the art of knowing what to overlook', and this book is about a terrific step along the scientific road of learning what to overlook. It is about the discovery of a profound similarity not between triangles or moving objects, but between the upheavals that affect our lives, and the ways in which the complicated networks in which they occur ‑ economies, political systems, ecosystems and so on ‑ are naturally organised. We might add to our list dramatic changes in fashion or musical taste, episodes of social unrest, technological change, even great scientific revolutions. As we shall see, all these events, and the workings of the systems in which they occur, may reflect the tenor of just a few simple and ubiquitous underlying processes. Even more remarkably, it may be possible to understand those workings in terms of a few rudimentary mathematical games.

In 1987, three physicists began playing a strange little game in an office at Brookhaven National Laboratory, in New York State. Theoretical physicists, one might expect, would be pondering the origins of the universe, or untangling the latest puzzles of nuclear or particle physics. But Per Bak, Chao Tang and Kurt Wiesenfeld were occupying themselves in another way: quite simply, they were trying to imagine what would happen if someone were to sprinkle grains of sand one at a time onto a table top.

Physicists enjoy posing seemingly trivial questions which, after a bit of thinking, turn out not to be so trivial. In this respect, the sand‑pile game was a real winner. As grains pile up, it seems clear that a broad mountain of sand should edge slowly skywards, and yet things obviously cannot continue in this way. As the pile grows, its sides become steeper, and it becomes more likely that the next falling grain will trigger an avalanche. Sand would then slide downhill to some flatter region below, making the mountain smaller, not bigger. As a result, the mountain should alternately grow and shrink, its jagged silhouette forever fluctuating.

Bak, Tang and Wiesenfeld wanted to understand those fluctuations: what is the typical rhythm of the growing and shrinking sand pile? A trivial question, one would think. Dropping sand one grain at a time, however, is a delicate and laborious business. So in seeking some answers, Bak and his colleagues turned to the computer. They instructed it to drop imaginary 'grains' onto an imaginary 'table', with simple rules dictating how grains would topple downhill as the pile grew steeper. It was not quite the same as a real sand pile, and yet the computer had one spectacular advantage ‑ a pile would grow in seconds rather than days. It was so easy to play the game that the three physicists soon became glued to their computer screens, obsessed with dropping grains, and watching the results. And they began to see some curious things.

The first big surprise came as the answer to a simple question: what is the typical size of an avalanche? How big, that is, should you expect the very next avalanche to be? The researchers ran a huge number of tests, counting the grains in millions of avalanches in thousands of sand piles, looking for the typical number involved. The result? Well… there was no result, for there simply was no 'typical' avalanche. Some involved a single grain; others ten, a hundred, or a thousand. Still others were pile‑wide cataclysms involving millions that brought nearly the whole mountain tumbling down. At any time, literally anything, it seemed, might be just about to happen.
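Buchanan describes the toppling rules only in words, but the standard Bak‑Tang‑Wiesenfeld model behind the game fits in a few lines of code. The sketch below is a minimal Python illustration, not the researchers' original program; it assumes the usual textbook rule that any site reaching four grains topples, sending one grain to each of its four neighbours, with grains pushed past the edge falling off the table. Counting topplings per dropped grain reproduces the finding described above: no 'typical' avalanche.

```python
import random

SIZE = 15        # side length of the square "table"
THRESHOLD = 4    # a site with this many grains topples

def topple(grid):
    """Relax the pile until every site is stable; return the
    avalanche size (total number of topplings triggered)."""
    avalanche = 0
    unstable = True
    while unstable:
        unstable = False
        for x in range(SIZE):
            for y in range(SIZE):
                if grid[x][y] >= THRESHOLD:
                    unstable = True
                    avalanche += 1
                    grid[x][y] -= 4
                    # one grain goes to each neighbour; grains pushed
                    # past the edge fall off the table and are lost
                    for nx, ny in ((x-1, y), (x+1, y), (x, y-1), (x, y+1)):
                        if 0 <= nx < SIZE and 0 <= ny < SIZE:
                            grid[nx][ny] += 1
    return avalanche

random.seed(1)
grid = [[0] * SIZE for _ in range(SIZE)]
sizes = []
for _ in range(10000):
    # sprinkle one grain at a random spot, then let the pile relax
    x, y = random.randrange(SIZE), random.randrange(SIZE)
    grid[x][y] += 1
    sizes.append(topple(grid))

# No 'typical' avalanche: most drops do nothing at all, while a few
# set off cascades hundreds of topplings wide
print("largest avalanche:", max(sizes))
print("drops causing no avalanche:", sum(1 for s in sizes if s == 0))
```

Tallying `sizes` in logarithmic bins shows a scale-free, power-law spread of avalanche sizes ‑ the mathematical fingerprint of the critical state discussed below.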

Imagine wandering into the street, anticipating how tall the next person might be. If people's heights worked like these avalanches, then the next person might be less than a centimetre tall, or over a kilometre high. You might crush the next person like an insect before seeing them. Or imagine that the duration of your trips home from work went this way; you'd be unable to plan your life, since tomorrow evening's journey might take anything from a few seconds to a few years. This is a rather dramatic kind of unpredictability, to say the least.

To find out why it should show up in their sand‑pile game, Bak and colleagues next played a trick with their computer. Imagine peering down on the pile from above, and colouring it in according to its steepness. Where it is relatively flat and stable, colour it green; where steep and, in avalanche terms, 'ready to go', colour it red. What do you see? They found that at the outset, the pile looked mostly green, but that as it grew, the green became infiltrated with ever more red. With more grains, the scattering of red danger spots grew until a dense skeleton of instability ran through the pile. Here then was a clue to its peculiar behaviour: a grain falling on a red spot can, by domino‑like action, cause sliding at other nearby red spots. If the red network were sparse, and all trouble spots were well isolated one from the other, then a single grain could have only limited repercussions. But when the red spots come to riddle the pile, the consequences of the next grain become fiendishly unpredictable. It might trigger only a few tumblings, or it might instead set off a cataclysmic chain reaction involving millions.

This may seem like something that only a physicist could find interesting. But hang on. The hypersensitive state to which the computer sand pile organises itself is known as the critical state. The basic notion has been familiar to physicists for more than a century, yet has always been seen as a kind of theoretical freak and side‑show, a devilishly unstable and unusual condition that arises only under the most exceptional circumstances. In the sand pile, however, it seemed to arise naturally and inevitably through the mindless sprinkling of grains. This led Bak, Tang and Wiesenfeld to ponder a provocative possibility: if the critical state arises so easily and inevitably in a sand pile, might something like it also arise elsewhere? Could riddling lines of instability of a logically equivalent sort run through the earth's crust, for example, through forests and ecosystems, and perhaps even through the somewhat more abstract 'fabric' of our economies? Think of those first few crumbling rocks near Kobe, or that first insignificant dip in prices that triggered the stock market crash of 1987. Might these have been 'sand grains' acting at another level? Could the special organisation of the critical state explain why the world at large seems so susceptible to unpredictable upheavals?

A decade of research by hundreds of other physicists has explored this question and taken the initial idea much further. There are many subtleties and twists in the story to which we shall come later in this book, but the basic message, roughly speaking, is simple: the peculiar and exceptionally unstable organisation of the critical state does indeed seem to be ubiquitous in our world. Researchers in the past few years have found its mathematical fingerprints in the workings of all the upheavals I have mentioned so far, as well as in the spreading of epidemics, the flaring of traffic jams, the patterns by which instructions trickle down from managers to workers in an office, and in many other things.

At the heart of our story, then, lies the discovery that networks of things of all kinds ‑ atoms, molecules, species, people, and even ideas ‑ have a marked tendency to organise themselves along similar lines. On the basis of this insight, scientists are finally beginning to fathom what lies behind tumultuous events of all sorts, and to see patterns at work where they have never seen them before.


Catastrophe theory, Chaos, and Complexity

This is a discovery with implications; but before we come to these, it is crucial to be clear about what the critical state is, and what it is not. If the word critical begins with a 'c', then so do a host of other words or phrases that have in recent years been mentioned in connection with things such as financial markets or the weather. First there was catastrophe theory, then chaos, and more recently, complexity. What does the critical state have to do with these ideas?

If you push very gently on both ends of a drinking straw, as if to try to compress it and make it shorter, it will indeed become ever so slightly shorter. Push harder, however, and there will come a point when the straw will abruptly and suddenly bend.

In the 1970s, a mathematician named René Thom worked out a theory to make sense of sudden changes of this sort, changes that he referred to as 'catastrophes'. But Thom's catastrophe theory, despite its provocative name, has very little to say about the workings of anything like the earth's crust, an economy, or an ecosystem. In these things, where thousands or millions of elements interact, what is important is the overall collective organisation and behaviour. To understand things of this sort, one needs a theory that applies generally to networks of interacting things.

Chaos theory has its origins more than a century ago in the work of the great French physicist Henri Poincaré, but scientists only realised its true importance in the 1980s. If something is chaotic, then as with a pinball careering through a pinball machine, what happens in the future is extraordinarily sensitive to tiny influences along the way. Inside any ordinary balloon, for instance, the molecules move according to the law of chaos: give a tiny nudge to even a single molecule, and in much less than a minute every last one in the balloon will be affected. In the context of the earth's atmosphere, chaos brings us to the 'butterfly effect': the paradoxical conclusion that the flapping of a butterfly's wings in Portugal now might just lead to the formation of a severe thunderstorm over Moscow in a couple of weeks' time.

Because of this incredible sensitivity, predicting the future of any chaotic system is practically impossible; a chaotic process looks wildly erratic even if the underlying rules at work are actually quite simple. Researchers have discovered the mathematical signatures of chaos in the fluctuations of things ranging from lasers to rabbit populations, and perhaps even in the healthy rhythms of the human heart. And in the late 1980s and early 1990s, some scientists even hoped that chaos might finally help make sense of the wild ups and downs of financial markets. But it didn't ‑ for a very simple reason.
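This incredible sensitivity is easy to make concrete. The few lines below are my own illustration, not Buchanan's: they iterate the logistic map x → 4x(1−x), a textbook chaotic system, from two starting points a mere billionth apart, and track how far the trajectories drift from each other.

```python
# Two trajectories of the logistic map x -> r*x*(1-x) at r = 4,
# a standard chaotic system, started a billionth apart.
r = 4.0
x, y = 0.2, 0.2 + 1e-9   # nearly identical initial conditions
gap = 0.0
for step in range(60):
    x = r * x * (1 - x)
    y = r * y * (1 - y)
    gap = max(gap, abs(x - y))   # record the widest divergence seen

# After a few dozen steps the billionth-sized difference has grown
# to order one: the trajectories no longer resemble each other
print("largest divergence:", gap)
```

The tiny initial difference roughly doubles at every step, so within about thirty iterations the two futures disagree completely ‑ simple unpredictability, exactly as described above. But note that both trajectories stay forever confined between 0 and 1; nothing in the chaos itself produces an upheaval.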

The butterfly effect, it seems, is the conceptual handle that most non-scientists have attached to the notion of chaos. Unfortunately, the tale as usually told is somewhat misleading. Notice that while there is chaos in the movement of the molecules inside a balloon, not a lot happens in there. Have you ever seen a balloon with a thunderstorm inside? A butterfly in a balloon could flap its wings for eternity and never get much response. So chaos by itself cannot explain why a butterfly can cause a thunderstorm. Chaos may indeed explain why a tiny cause can quickly make the future different in its details (the positions of many molecules) from what it might have been. But to explain why tiny causes can ultimately lead to great upheavals, we need something else. We may say that although chaos can explain simple unpredictability, it cannot explain upheavability.

There is one more c‑word: complexity. For centuries, physicists have sought the fundamental laws of the universe in timeless and unchanging equations, such as those of quantum theory or relativity. Notice that there is also in the balloon a kind of timelessness, since the air lives there in equilibrium under unchanging conditions. In contrast, the air in the atmosphere is very much out of equilibrium, since it is continually being stirred and agitated and energised by the influx of light from the sun, and here we have a clue to the origin of upheavals: it lies in the distinction between what happens in equilibrium, and what happens away from it. If things in equilibrium are fairly simple, things out of equilibrium can be decidedly complex. The discovery that can make sense of upheavals, and the idea at the heart of this book, lies within the rapidly growing field of non-equilibrium physics, or, to use the currently fashionable language, the physics of complex systems.

By studying the natural kinds of patterns that evolve in networks of interacting things under non‑equilibrium conditions, we may be able to understand an immense range of natural phenomena, from our turbulent atmosphere to the human brain. The study of complex systems is all about things that are out of equilibrium, and on this task, of course, scientists are really just starting out. So the relationship between the critical state and complexity is really quite simple: the ubiquity of the critical state may well be considered the first really solid discovery of complexity theory.

And yet, there is another useful way to look at all this. In coming to consider complex systems, physicists seem to have gained a new appreciation of a simple fact: in the immediate world around us, history is important. For living things, which ultimately develop from a single cell, this is obvious. But one cannot even understand the hardness of a steel pipe, or the irregular surface of a fractured brick, without referring to the full history of its making.

History does not matter in the balloon, for in equilibrium, nothing changes. But out of equilibrium, history does matter. One can only make sense of the infinitely detailed shape of a snowflake by following the history of its growth by slow freezing from the thin air.

These are all problems in non‑equilibrium physics, the physics of complex systems, or, to coin a new term, historical physics. If the laws of physics are ultimately simple, why is the world so complex? Why don't ecosystems and economies reveal the same simplicity as Newton's laws? The answer, in a word, is history.

For things out of equilibrium, one cannot proceed by solving timeless equations, and so physicists have turned to another approach ‑ replacing equations with games. The physics research journals are now stuffed with papers about the workings of simple games: some are meant to explore the basics of crystal growth, others to mimic the formation of rough surfaces, and so on. There are hundreds, all slightly different, and yet all, like the sand‑pile game, concern non‑equilibrium systems and are therefore deeply historical in their nature, all sharing a susceptibility to what Francis Crick once termed 'frozen accidents'. In the sand‑pile game, a sand grain falls accidentally here or there. The pile then grows over that grain, 'freezing' it in, and then forever feels the influence of that grain being just where it is and not elsewhere. In this sense, what happens now can never be washed away, but affects the entire course of the future.

If the laws of physics did not allow frozen accidents, the world would be in equilibrium, and everything would be like the gas in a balloon, resting forever in the same uniform and unchanging condition. But the laws of physics do allow events to have consequences that can become locked in place, and so alter the playing field on which the future unfolds. The laws of physics allow history to exist. The discovery of the ubiquity of the critical state, then, is not only the first solid discovery of complexity theory, but also the first deep discovery concerning the typical character of things historic.

In principle, history could unfold far more predictably than it does. It need not, in principle, be subject to terrific cataclysms of all sorts. One of our tasks in this book is to examine why the character of human history is as it is, and not otherwise. The answer, I suggest, is to be found in the critical state and in the new non‑equilibrium science of games, which aims to study and categorise the kinds of historical processes that are possible.

If many historians have searched for gradual trends or cycles as a way of finding meaning and making sense of history, then they were using the wrong tools. These notions arise in equilibrium physics and astronomy. The proper tools are to be found in non-equilibrium physics, which is specifically tuned to understanding things in which history matters.

In the very same year that Bak, Tang and Wiesenfeld invented their game, the historian Paul Kennedy published a book entitled “The Rise and Fall of the Great Powers”. In it, he laid out the idea that the large-scale historical rhythm of our world is determined by the natural build-up and release of stress in the global network of politics and economics. His view of the dynamics of history leaves little room for the influence of 'great individuals' and is more in keeping with the words of John Kenneth Galbraith quoted at the beginning of this chapter. It sees individuals as products of their time, having limited freedom to respond in the face of powerful forces.

Kennedy's thesis, in essence, is this: the economic power of a nation naturally waxes and wanes. As times change, some nations are left clinging to power that their economic base can no longer support; others find new economic strength, and naturally seek greater influence. The inevitable result? Tension, which grows until something gives way. Usually, the stress finds its release through armed conflict, after which the influence of each nation is brought back into rough balance with its true economic strength.

If this sounds at all like the processes working in the earth's crust, where stresses build up slowly to be released in sudden earthquakes, or in the sand‑pile game, where the slopes grow higher and more unstable until levelled again in some avalanche, it may be no coincidence. We shall see later that wars actually occur with the same statistical pattern as do earthquakes or avalanches in the sand‑pile game. Kennedy could find strong support for his thesis ‑ as well as a more adequate language in which to describe it ‑ in this theoretical idea. He may have been struggling to express in words, and in an historical context, what the critical state expresses mathematically. Whatever lessons historians may be able to draw from all this, the meaning for the individual is more ambiguous.
For if the world is organised into a critical state, or something much like it, then even the smallest forces can have tremendous effects. In our social and cultural networks, there can be no isolated act, for our world is designed ‑ not by us, but by the forces of nature ‑ so that even the tiniest of acts will be amplified and registered by the larger world. The individual, then, has power, and yet the nature of that power reflects a kind of irreducible existential predicament. If every individual act may ultimately have great consequences, those consequences are almost entirely unforeseeable. Out there right now on some red square in the field of history a grain may be about to fall. Someone trying to bring warring parties to terms may succeed, or may instead spark a conflagration. Someone trying to stir up conflict may usher in a lengthy term of peace. In our world, beginnings bear little relationship to endings, and Albert Camus was right: 'All great deeds and all great thoughts have ridiculous beginnings.'

One of the inevitable themes of our story is that if one wants to learn about the rhythms of history (or, shall we say, its dis‑rhythms), one might just as well become familiar with the process by which, say, earthquakes happen. If the organisation of upheaval and hypersensitivity is everywhere, one need not look far to find it. So let us leave human history and the individual aside for the moment, and first look to the simpler world of inanimate things. Let us go underground, into the dark, gritty world beneath the earth's surface, and take a closer look at what goes on there. Surprisingly, in the underworld rumblings of our changeable planet, we shall encounter a schematic template for the workings of a thousand things.






