What is random? That which cannot be predicted with any confidence. But there is a weak and a strong sense of ‘unpredictable’. We might say that the motion of a leaf blown about by the wind is ‘random’ ― but that may simply be because we don’t know the exact speed and direction of the wind or the aerodynamic properties of this particular leaf. In classical mechanics there is no room for randomness, since all physical phenomena are fully determined and so could in principle be predicted if one had sufficient data. Indeed, the French astronomer Laplace claimed that a super-mind, aware of the current positions and momenta of all particles in existence, could predict the entire future of the universe from Newtonian principles.

In practice, of course, one never does know the initial conditions of any physical system perfectly. Whether this makes a substantial difference to the outcome hinges on how sensitively the system happens to depend on those initial conditions. Whether or not the flap of a butterfly’s wings in the bay of Tokyo could give rise to a hurricane in Barbados, as chaos theory claims, systems that are acutely sensitive to initial conditions undoubtedly exist, and this is, of course, what makes accurate weather forecasting so difficult. Gaming houses retire dice after a few hundred throws because of the inevitable imperfections that creep in, and a certain Jagger made a good deal of money because he noticed that certain numbers seemed to come up slightly more often than others on a particular roulette wheel and bet on them. Later he guessed that the cause was a slight scratch on this particular wheel, and there seems to have been something in this, for eventually the management thwarted him by changing the roulette wheels every night (Note 1). All sorts of other seemingly ‘random’ phenomena turn out, on close examination, to exhibit a definite bias or trend: for example, certain digits turn up as leading digits in miscellaneous lists of data more often than others (Benford’s Law), and the absence of this expected bias in fabricated figures has been used successfully to detect tax fraud.
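To make the Benford effect concrete, here is a minimal Python sketch; the invoice amounts are invented purely for illustration. It compares the leading-digit frequencies of a list of figures with the proportions Benford’s Law predicts, namely log10(1 + 1/d) for each digit d. Genuine accounting data tends to track the predicted proportions, whereas fabricated figures often do not.

```python
import math
from collections import Counter

def benford_expected():
    """Expected frequency of each leading digit d under Benford's Law: log10(1 + 1/d)."""
    return {d: math.log10(1 + 1 / d) for d in range(1, 10)}

def leading_digit_frequencies(values):
    """Observed frequency of the first significant digit in a list of positive figures."""
    digits = [int(str(abs(v)).lstrip("0.")[0]) for v in values if v != 0]
    counts = Counter(digits)
    return {d: counts[d] / len(digits) for d in range(1, 10)}

# Invented invoice amounts, purely for illustration.
invoices = [1203.50, 2450.00, 187.20, 1120.75, 9640.00, 312.40, 1784.10, 260.99]
expected = benford_expected()
observed = leading_digit_frequencies(invoices)
for d in range(1, 10):
    print(f"digit {d}: expected {expected[d]:.3f}, observed {observed[d]:.3f}")
```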

There is, however, something very unsatisfactory about the ‘unpredictable because of insufficient data’ definition of randomness: it certainly does not follow that there is an inherent randomness in Nature, nor does chaos theory imply that this is the case. Curiously, quantum mechanics, that monstrous but hugely successful creation of modern science, does maintain that there is an underlying randomness at the quantum level. The radioactive decay of a particular nucleus is held to be not only unforeseeable but actually ‘random’ in the strong sense of the word ― though the bulk behaviour of a collection of atoms can be predicted with confidence. Likewise, genetic mutation, the pace-setter of evolution, is regarded today as not just unpredictable but, in certain cases at least, truly ‘random’. Randomness seems to have made a strong and unexpected comeback since it is now a key player in the game or business of living ― a bizarre volte-face given that science had previously been thoroughly deterministic.

The ‘common sense’ meaning of randomness is the lack of any perceived regularity or repeating pattern in a sequence of events, and this will do for our present purposes (Note 2). Now, it is extremely difficult to generate a random sequence of events in the above sense, and in the recent past there was big money involved in inventing a really good random number generator. Strangely, most random number generators are not based on the behaviour of actual physical systems but depend on algorithms deliberately concocted by mathematicians. Why is this? Because, to slightly misquote Moshe, “complete randomness is a kind of perfection” (Note 3).
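As a rough illustration of what such an algorithmic generator looks like, here is a Python sketch of a linear congruential generator. The multiplier and increment are the widely cited ‘Numerical Recipes’ constants; this is a deliberately simple example rather than a recommendation, since the sequence it produces is entirely deterministic and merely looks random.

```python
class LCG:
    """Linear congruential generator: x_{n+1} = (a * x_n + c) mod m.
    Deterministic, so the output only *appears* random; constants are the
    commonly quoted 'Numerical Recipes' parameters."""

    def __init__(self, seed=42, a=1664525, c=1013904223, m=2**32):
        self.state = seed
        self.a, self.c, self.m = a, c, m

    def next_int(self):
        self.state = (self.a * self.state + self.c) % self.m
        return self.state

    def next_float(self):
        """Pseudo-random float in [0, 1)."""
        return self.next_int() / self.m

rng = LCG(seed=2024)
print([round(rng.next_float(), 4) for _ in range(5)])
```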

The more one thinks about the idea of randomness, the weirder the concept appears since a truly ‘random’ event does not have a causal precursor (though it usually does have a consequence). So, how on earth can it occur at all and where does it come from? It arrives, as common language puts it very well, ‘out of the blue’.

Broadly speaking there are two large-scale tendencies in the observable universe: firstly, the dissipation of order and the decline towards thermal equilibrium and mediocrity because of the ‘random’ collision of molecules; secondly, the spontaneous emergence of complex order from processes that appear to be, at least in part, ‘random’. The first principle is enshrined in the Second Law of Thermodynamics: the entropy (roughly, the extent of disorder) of a closed system always increases, or (just possibly) stays the same. Contemporary biologists have a big problem with the emergence of order and complexity in the universe since it smacks of creationism. But at this very moment the molecules of tenuous dispersed gases are clumping together to form stars, and the trend of life forms on earth is, and has been for some time, a movement from relative structural simplicity (bacteria, archaea &c.) to the unbelievable complexity of plants and mammals. Textbooks invariably trot out the caveat that any local ‘reversal of entropy’ must always be paid for by increased entropy elsewhere. This is, however, not a claim that has been, or ever could be, comprehensively tested on a large scale, nor is it at all ‘self-evident’ (Note 4). What we do know for sure is that highly organized structures can and do emerge from very unpromising beginnings, and this trend seems to be locally on the increase ― though it is conceivable that it might be reversed.
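The statistical drift towards equilibrium that the Second Law describes can be illustrated with a toy model. The Python sketch below uses the Ehrenfest urn model, with parameters chosen arbitrarily: every particle starts in one box and at each step a randomly chosen particle hops to the other box. However lopsided the start, the split drifts towards, and then fluctuates around, fifty-fifty ― an illustration of the tendency, not a proof of the Law.

```python
import random

def ehrenfest(n_particles=100, steps=2000, seed=1):
    """Ehrenfest urn model: at each step one particle, chosen at random,
    hops to the other box. The count in the left box drifts towards the
    even 50/50 split and then fluctuates around it."""
    random.seed(seed)
    left = n_particles              # start with every particle in the left box
    history = []
    for _ in range(steps):
        if random.randrange(n_particles) < left:
            left -= 1               # a left-box particle was picked; it hops right
        else:
            left += 1               # a right-box particle was picked; it hops left
        history.append(left)
    return history

counts = ehrenfest()
print("after 1 step:", counts[0], "— after 2000 steps:", counts[-1])
```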

For all that, it seems that there really are such things as truly random events and they keep on occurring. What can one conclude from this? That, seemingly, there is a powerful mechanism for spewing forth random, uncaused events, and that this procedure is, as it were, ‘hard-wired’ into the universe at a very deep level. But this makes the continued production of randomness just as mysterious as, or perhaps even more so than, the capacity of whatever was out there in the beginning to give rise to complex life!

The generation of random micro-events may in fact turn out to be just about the most basic and important physical process there is. For what do we need to actually produce a ‘world’? As far as I am concerned, there must be something going on, in other words we need ‘events’ and these events require a source of some sort. But this source is remote and we don’t need to attribute to it any properties except that of being a permanent store of proto-events. The existence of a source is not enough though. Nothing would happen without a mechanism to translate the potential into actuality, and the simplest and, in the long run, most efficient mechanism is to have streams of proto-events projected outwards from the source at random. Such a mechanism will, however, by itself not produce anything of much interest. To get order emerging from the primeval turmoil we require a second mechanism, contained within the first, which enables ephemeral random events to, at least occasionally, clump together, and eventually build up, simply by spatial proximity and repetition, coherent and quasi-permanent event structures (Note 5). One could argue that this possibility, namely the emergence of ‘order from chaos’, however remote, will eventually come up ― precisely because randomness in principle covers all realizable possibilities. A complex persistent event conglomeration may be termed a ‘universe’, and even though an incoherent or contradictory would-be ‘universe’ will presumably rapidly collapse into disorder, others may persist and maybe even spawn progeny.

So, which tendency is going to win out, the tendency towards increasing order or reversion to primeval chaos? It certainly looks as if a recurrent injection of randomness is necessary for the ‘health’ of the universe and especially for ourselves ― this is one of the messages of natural selection and it explains, up to a point, the extraordinarily tortuous process of meiosis (roughly, sexual reproduction) as against mitosis, in which a cell simply duplicates its DNA and splits in two (Note 6). But there is also the “nothing succeeds like success” syndrome. And, interestingly, the evolutionary biologist John Bonner argues that microorganisms “are more affected by randomness than large complex organisms” (Note 7). This and related phenomena might tip the balance in favour of order and complexity ― though specialization also makes the larger organisms more vulnerable to sudden environmental changes.

SH

 

Note 1 This anecdote is recounted and carefully analysed in The Drunkard’s Walk by Mlodinow.

Note 2 Alternative definitions of randomness abound. There is the frequency definition whereby, “If a procedure is repeated over and over again indefinitely and one particular outcome crops up as many times as any other possible outcome, the sequence is considered to be random” (adapted from Peirce). And Stephen Wolfram writes: “Define randomness so that something is considered random only if no short description whatsoever exists of it”.
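Wolfram’s ‘no short description’ criterion can be made loosely concrete by using a compressor as a crude stand-in for the shortest possible description, as in the Python sketch below; this is only an illustration of the idea, not the definition itself. A string with an obvious repeating pattern compresses to almost nothing, whereas a (pseudo-)random string of the same length barely compresses at all.

```python
import random
import zlib

def description_length(s: bytes) -> int:
    """Crude stand-in for 'shortest description': the zlib-compressed size in bytes."""
    return len(zlib.compress(s, 9))

random.seed(0)
regular = b"01" * 5000                                       # obvious repeating pattern
noisy = bytes(random.getrandbits(8) for _ in range(10000))   # pseudo-random bytes

print("regular pattern:", description_length(regular), "bytes")
print("noisy string:   ", description_length(noisy), "bytes")
```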

Note 3 Moshe actually wrote “Complete chaos is a kind of perfection”.

Note 4 “The vast majority of current physics textbooks imply that the Second Law is well established, though with surprising regularity they say that detailed arguments for it are beyond their scope. More specialized articles tend to admit that the origins of the Second Law remain mysterious” (Stephen Wolfram, A New Kind of Science, p. 1020).

Note 5 This is essentially the principle of ‘morphic resonance’ advanced by Rupert Sheldrake. Very roughly, the idea is that if a certain event, or cluster of events, has occurred once, it is slightly more likely to occur again, and so on. Habit thus eventually becomes physical law, or can do. At bottom the ‘Gambler’s Fallacy’ contains a grain of truth: I suspect that current events are never completely independent of previous similar occurrences, despite what statisticians say. Clearly, for the theory to work, there must be a very slow build-up and a tipping point after which a trend really takes off. We require in effect the equivalent of the Schrödinger equation to show how initial randomness evolves inexorably towards regularity and order.

Note 6 In meiosis not only does the offspring get genes from two individuals rather than one, but there is a great deal of ‘crossing over’ of segments of chromosomes, and this reinforces the mixing process.

Note 7 The reason given for this claim is that there are many more developmental steps in the construction of a complex organism and so “if an earlier step fails through a deleterious mutation, the result is simple: the death of the embryo”. On the other hand “being small means very few developmental steps, with little or no internal selection” and hence a far greater number of species, witness radiolaria (50,000) and diatoms (10,000). See the article ‘Evolution, by chance?’ in New Scientist, 20 July 2013, and Randomness in Evolution by John Bonner.