What is a random event? Most people call an event random if they cannot predict when and where it will occur, and this is good enough for most practical purposes.
But is the unpredictability of an event such as the outcome of a coin toss simply caused by lack of sufficient data and/or the non-availability of advanced technology? If it is, then more data and more refined computing would make the event predictable — and it would cease to be classed as random. After all, once the coin has left my hand, its eventual configuration on the floor, heads or tails, has already been decided — at any rate on strictly Newtonian principles. Considering the current pace of technological invention, it is not inconceivable that one day a machine attached to my finger will be able to calculate the outcome of each coin toss (Note 1).

[Image: from the painting Vivacity by June Mitchell]
In the nineteenth century the majority of scientists and thinking people believed that all physical events were, in principle if not yet in fact, predictable. Laplace, a French mathematician and astronomer who perfected Newton’s analysis of the motions of celestial bodies, claimed that a Super-mind, if it knew the instantaneous positions and states of motion of all existing particles of matter, could predict the entire future of the universe. And if everything, including ‘mental states’, essentially boiled down to matter and motion, then the behaviour of human beings was in principle completely predictable also — though at the time few people dared to put it quite so crudely because of the waning but still considerable power of the established Church. As late as the eighteenth century, La Mettrie, a French surgeon and author of a book entitled L’Homme Machine, had to flee to liberal Prussia to avoid prosecution and imprisonment.
Today, this sort of extreme deterministic view is no longer de rigueur because of Quantum Mechanics and Heisenberg’s Uncertainty Principle. Physicists believe that there is a limit to what we can know about what goes on at the subatomic level. In practice, though, this makes little difference, since so-called ‘statistical determinism’ is almost as constraining as individual determinism: it certainly does not give much support to those misguided souls who still believe in free will. Indeed, we have the curious situation where belief in the predictive power of scientific thinking is at a maximum even though physical theory includes a heavy dose of uncertainty. Quantum events, and this category includes genetic ‘random mutations’, are not just unpredictable for technical reasons but, so we are assured, truly indeterminate — though this is an unverifiable assumption which the ‘hidden variables’ school (de Broglie, Bohm et al.) rejects. Nonetheless, the future is generally considered to be ‘open’, though that does not necessarily mean that we can take advantage of its lack of definiteness: we are no better off with indeterminate elements that will remain out of our control than we were with total determinism.
The ‘frequency’ definition of randomness is somewhat different. If each one of a set of possible events (e.g. heads or tails with a coin toss) occurs just as frequently as any other over a substantial period of time, then the event is considered to be random. The difficulty with this definition is that, strictly speaking, we will never know whether a particular type of event was truly random, because we would have to carry on tossing coins for years. Worse still, in actual experiments the frequency of an individual event, say a digit from 1 to 9 coming up on a roulette wheel, turns out not to be exactly equal to the frequency of other events, even with an electronically generated ‘roulette wheel’. There seems almost always to be a slight bias in favour of certain outcomes, and the Rand Corporation, when it amassed a large collection of numbers based on electronic ‘noise’ for mathematical purposes, never managed to get rid of a certain bias (Note 2). Although the bias in numbers generated by mechanical or electronic procedures is attributed to slight manufacturing defects in the devices employed, one cannot see any obvious reason for a bias in numbers culled from miscellaneous data such as balance sheets or books of logarithms. Yet so-called Benford’s Law states that, rather than each digit appearing more or less as many times as any other, the early digits appear as leading digits much more frequently in numerical data than the later ones, 1 appearing about 30% of the time and 9 about 5%. Benford’s Law is taken seriously enough to be used on occasion by the Inland Revenue when trawling through bank statements on the lookout for evidence of fraud. It is somewhat paradoxical that true randomness is so rare that it has to be generated deliberately, and even then without a 100% success rate.
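As a purely illustrative aside (not part of the original argument): the figures usually quoted for Benford’s Law come from the formula P(d) = log10(1 + 1/d) for the leading digit d, and a few lines of Python are enough to reproduce the proportions mentioned above, roughly 30% for the digit 1 and under 5% for the digit 9.

    import math

    # Benford's Law: the expected proportion of numbers whose leading digit is d
    # is log10(1 + 1/d), so the early digits dominate.
    for d in range(1, 10):
        proportion = math.log10(1 + 1 / d)
        print(f"leading digit {d}: {proportion:.1%}")

    # Output runs from about 30.1% for the digit 1 down to about 4.6% for the digit 9.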
What is rather odd is that causality does not appear in either definition; indeed, causality is scarcely ever mentioned explicitly in physics, even though without its operation nothing could actually come about. Causality has become persona non grata, someone respectable physicists know perfectly well must exist but whom they don’t mention in polite company.
There are various reasons for this. The principal one is that causality is something of an embarrassment to a hard-nosed scientific rationalist. Causality cannot be quantified like mechanical stress and strain; there is no base unit for causality in the SI system. Worse still, causality is invisible and intangible: no one has ever claimed to see or touch ‘causality’. If the supposed causal connection between certain pairs of events (those pairs we call ‘cause’ and ‘effect’) were as evident as all that, we would not have the difficulties that we manifestly do have in distinguishing bona fide causal pairs from coincidences. Chanticleer in the fable believes that the sun rises because he crows — and if challenged would tell the objector to be present towards the end of the next night and check for himself. Hume, as far back as the eighteenth century, threw a spanner in the works of triumphant Newtonian Mechanics by pointing out that there was no foolproof way of ‘proving’ that a certain event ‘causes’ another. The very next time we perform some everyday action such as adding a pinch of salt to boiling water, something quite unexpected might result: the whole pan could burst into flames, for example. We can’t be sure of what is going to happen until we carry out the experiment, and once we’ve done the experiment what has happened is past history. No matter how many times we repeat a particular action, we can never be absolutely certain that event B will obligatorily follow event A: this is the problem of Incomplete Induction. If Hume’s arguments had been taken at all seriously, they would have scuppered the advance of science, for it is implicit in the scientific enterprise that identical conditions produce identical outcomes — or at any rate this was the case prior to the advent of Quantum Mechanics (Note 3). Nineteenth-century philosophers tried hard but never managed to refute Hume decisively; nineteenth-century scientists and engineers simply ignored him.
Another reason for the phasing out of causality from science is modern mathematics. Modern mathematics is essentially inert: it ‘maps’ one set of supposedly pre-existing variables to another set, and causality doesn’t come into it. In Calculus the independent and dependent variables, x and f(x), are casually inverted if this makes differentiation easier, even though this contradicts the very idea of dependence. A ‘function’, mathematically speaking, is a procedure that matches two sets of variables in such a way that each member of the first set is paired off with a single member of the second set (though it does not necessarily have to work the other way round). ‘Classical’ mathematics was an altogether more active business: it took an unspecified but perfectly well-defined quantity such as volume, mass, pressure and so forth, then did something to it, expanded it, contracted it, skewed it sideways, and matched the result to the starting value. There was a sort of implicit mathematical causality built into the system, which is why for a long time mathematics and mechanics went hand in hand, for if there is one thing that is undeniably causal, it is the movements of the different parts of a machine. It is not surprising that for at least two or three centuries no great distinction was made between ‘pure’ and ‘applied’ mathematics.
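To make the modern, inert notion of a ‘function’ concrete, here is a minimal sketch of my own (the sets and values are invented purely for illustration): every member of the first set is assigned exactly one member of the second, but nothing guarantees that the pairing can be run backwards.

    # A 'function' in the modern sense is just a pairing: each input gets exactly
    # one output, but two inputs may share an output, so the pairing need not reverse.
    f = {1: 'a', 2: 'b', 3: 'a'}   # toy example, invented for illustration

    print(f[1], f[2], f[3])        # every member of {1, 2, 3} is matched to one letter

    # Trying to invert it shows why it need not work the other way round:
    inverse = {}
    for x, y in f.items():
        inverse.setdefault(y, []).append(x)
    print(inverse)                 # {'a': [1, 3], 'b': [2]} -- 'a' has two pre-images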
I have never lost any sleep over Hume’s arguments about ‘Incomplete Induction’. My view is that “We know there is causality and there’s an end to it”, as Dr Johnson said about free will. An intelligent visitor from an alien planet would most likely be astonished that causality is not brought directly into science and classed as the force par excellence since, without causality, neither gravity nor electricity would ‘work’. As a matter of fact, on reflection I find that it is not true that the ‘force of causality’ — which in the science of Eventrics I prefer to call ‘Dominance’ — cannot be quantified. It should be possible to construct at least a rough scheme akin to that of the Richter scale for earthquakes or the Beaufort scale for wind velocity. A ‘normal’ event like a leaf falling to the ground has been brought about by a previous event such as the wind detaching the stem from the branch of the tree, and the fall of the leaf will momentarily depress the soil by a tiny amount. Call the fall of the leaf event B, its immediate antecedent event A and its immediate successor C. We now attribute Dominance values to the effect A has on B and to the effect B has on C. Dominance can be divided into ‘passive’ and ‘active’, so a single event in a causal event-chain has two associated Dominance values, depending on whether it is viewed from the standpoint of being acted upon or of acting on another event. Event B, the fall of the leaf, can be considered to have a ‘passive’ Dominance of, say, about 1 × 10⁻¹⁰ with respect to event A, and an ‘active’ Dominance of about 1 × 10⁻¹⁰ with respect to event C. Of course, each event is part of a general event-environment (and is itself made up of billions of ultimate events), but we neglect this for the moment to get the broader picture.

Note that the Dominance values in this example are symmetrical. This tends to be the case in ‘normal’ circumstances: indeed, it can be used as part of the definition of what counts as a ‘normal’ event. Specific natural occurrences such as the formation of a typhoon or the unleashing of an avalanche are exceedingly asymmetrical with respect to the antecedent and subsequent events. The ‘passive’ Dominance value of the final event that triggers a natural catastrophe is normally quite small, much the same as that of previous members of the chain (assuming that there is a gradual build-up of tension). But the consequences of a big earthquake are so vast that the ‘active’ Dominance of the event that triggers the catastrophe will have to be very high, say 10²⁰. These figures are arbitrary but could be made more precise with time, at least with respect to certain simple events. (Note 4)
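Purely as a sketch of how such a rough scale might be recorded (the numbers are the arbitrary ones used above, and ‘Dominance’ is a speculative notion from Eventrics, not an established physical quantity), one could tabulate passive and active Dominance values for each event and call an event ‘normal’ when the two are roughly symmetrical:

    import math

    # Illustrative only: toy records of events with the arbitrary Dominance values
    # discussed in the text (the falling leaf vs. the slip that triggers an earthquake).
    leaf_fall     = {"event": "fall of the leaf (B)",            "passive": 1e-10, "active": 1e-10}
    quake_trigger = {"event": "final slip triggering the quake", "passive": 1e-10, "active": 1e20}

    def is_normal(ev, tolerance=1.0):
        """A 'normal' event has roughly symmetrical passive and active Dominance,
        here taken to mean within about an order of magnitude of each other."""
        return abs(math.log10(ev["active"] / ev["passive"])) <= tolerance

    print(is_normal(leaf_fall))      # True  -- symmetrical, hence 'normal'
    print(is_normal(quake_trigger))  # False -- hugely asymmetrical: a catastrophe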

It is the presence of a particular ‘event-environment’ that makes the active Dominance value of a catastrophe so high, not generally any intrinsic intensity in the event itself. This is far from being a trivial point if we consider human events, historical or personal. It was the explosive 1914 European event-environment that made Princip’s assassination of Archduke Franz Ferdinand at Sarajevo have such an extreme impact. Princip himself was a young (19-year-old) high-school student who got talked into participating in a fairly amateurish assassination attempt masterminded by Serbian nationalists; in the normal run of things his name would not even be remembered today. As it happened, this act, which very nearly didn’t take place at all, led within six weeks to the outbreak of hostilities between the four Great Powers, Germany, Russia, France and Britain, with poor little Austria being the last to join in.
Nor is this all. The bloody conflict, the first truly international all-out war in history, led to Germany’s humiliation and the subsequent rise of Nazism. So this assassination must be attributed an incredibly high ‘active’ Dominance value while its ‘passive’ value, though not negligible like that of the fall of a leaf, is in comparison quite small. (This line of analysis will be taken further in posts devoted specifically to historical event-chains.)
Returning to science and its neglect of causal analysis, there is another reason apart from those already given. Newton and Galileo’s treatment of motion, which in turn affected their treatment of the forces that gave rise to motion (more particularly those that gave rise to accelerated motion), was essentially continuous: Newton especially was obsessed with the Heraclitean idea of ‘eternal flux’, which is why he called his version of the Calculus the ‘Method of Fluxions’. Because of the fateful influence of the mathematical notion of continuity, physicists tend to view causality as something which, if it exists at all, is spread out continuously over a large area (barring quantum indeterminacy). This means you can in effect assume it and say no more. But if you take the line, as I do, that both matter and motion are radically discontinuous and, moreover, believe that there are gaps between moments, causality has to be brought in from the very beginning. Why should one particular ultimate event be followed by another particular event, or, for that matter, by anything at all? Rather few Western thinkers have been bothered by this problem (Note 3). Descartes, however, one of the half-dozen persons most responsible for the new mechanical world-view, was extremely bothered by it, which is why he elaborated his Theory of Instantaneous Being, something rarely mentioned when discussing his scientific philosophy. Descartes’ answer was substantially this: that God sustained the universe at every single instant, for without His support the universe as a whole would be in a state of ‘unstable equilibrium’ (as we would put it), perpetually in danger of collapsing into bits or, worse still, disappearing.
It is no longer the fashion to introduce God to explain the workings of the physical world (as both Newton and Descartes did), and although I believe that there probably is something people once called ‘God’, He (or She or It) has no special connection with the functioning of the universe and humanity, and probably did not create it either. So I need some agency which fills the role Descartes attributed to God: it is what I call ‘Dominance’, a sort of overriding force which seems not to originate within the events themselves — they merely pass it on, somewhat in the way colliding objects pass on momentum in present-day physics. This invisible presence was there ‘in the beginning’ and will be there ‘in the end’ — or rather its decline and eventual disappearance would bring about the end of the universe as we know it (Note 5).
If you think all this is over the top, try the following thought experiment. Imagine that overnight the ‘regular connections’ between sequences of everyday events have somehow been tampered with. Although some activities proceed normally, the order of events of others is erratic, effects seem to precede causes, and one perfectly ordinary action has a completely unexpected consequence. One can no longer be sure that simply touching a wall will not make it collapse or burst into flames; one ends up hardly daring to draw a breath in case one is poisoned by the oxygen in it instead of invigorated. Chronic uncertainty about the behaviour of physical objects would soon make life scarcely liveable: people would commit suicide out of ‘event anxiety’.
In an article I wrote some time ago, I suggested, half seriously, that if an enemy state really wanted to bring a country to its knees, it could not do better than to derange the ‘causality’ people were used to: this would be far more devastating than attacking a power station or a central computer, since all electronic and mechanical devices would become, if not necessarily non-functioning, hopelessly unreliable.
This thought experiment shows to what extent we depend upon, and entirely take for granted, the causal processes and connections that underlie practically everything we do and see around us. We accept that machines have to be replaced because of wear and tear and can plan for this, but it would be impossible to plan for a future where we could, for example, no longer assume that fluids always move from a region of high pressure to a region of relatively lower pressure, or where identical weights put on the pans of scales would give completely different results from one moment to the next. If there were even a fairly slight malconnection of events in an everyday event-chain, this would quite likely propagate itself rapidly throughout the entire system, causing incalculable damage. Attacking the rationality of the (macroscopic) physical world would be far more destructive than attacking any of the objects that rely on this rationality.
Corollary: Interestingly, two days after I wrote and published this post, I came across a piece in the current week’s New Scientist, “Black Strings expose the naked singularity” by Lisa Grossman (NS 14 January 2012, p. 7), and realised at once, as I should have done before, that the scenario I imagined was what is known in physics as a ‘singularity’, i.e. a situation where the ordinary laws of physics no longer apply (Note 6).
“Near a naked singularity things would become bewildering as we would no longer be able to predict the fate of anything in its line of sight. ‘It might not do anything nasty, but we have lost predictive power,’ says Lehner [of the Perimeter Institute in Ontario]. ‘I couldn’t tell you if this glass would be sitting on the table tomorrow.’” (Note 7)


(1) It appears that someone, Michael Franklin, has “found a way to predict with an accuracy slightly better than chance, whether a ball will fall to red or black”. But Franklin is a serious scientist who has conducted a series of investigations into precognition: his results were reported in the Journal of Personality and Social Psychology (vol. 100, p. 407). See the article by Bob Holmes, “Feeling the Future” (New Scientist, 14 January 2012).

(2) From the point of view of Eventrics, the persistent slight bias of apparently identical events betrays an irregular distribution of ‘Dominance’: some types of events are seemingly able to attract others (or copies of themselves) a little more than other types.
(3) Dominance values can be used to express the difference between so-called first-order and second-order phase transitions in molecular physics.

(5) Belief in the gradual evaporation of the material universe plays a central role in the doctrines and practices of the mystical sect known as the Yther — see my SF novel The Web of Aoullnnia, some chapters of which are to be found on my general website (www.sebastianhayes.co.uk). The term Yther, from the language Lenwhil Katylin, means “ebbing away”. According to the Yther, the entire physical universe is a manifestation of a transcendent entity, Aoullnnia (‘The One’), and is subject to long cycles of increasing materialization and dematerialization; the decreasing phase, they hold, has already begun.

(6) The situation in my thought experiment would be even more extreme than an ordinary physical singularity, since it would constitute not just the breakdown of a particular set of physical ‘laws’, those that apply in our universe, but of any set of physical laws in any universe (or at any rate any universe capable of supporting life); if it continued without check it would return the ordered cosmos we live in to the Kaos from which, according to early Greek mythology, everything emerged.

(7) I find it remarkable that the article uses almost exactly the same words as I did, and that it appeared in my letter box the very next day. This is the contrary of a causal disconnection: a serendipitous association of events. Nor is it the first time this has happened while I have been in the process of writing these posts. One might call such a felicitous connection an ‘event-convergence’; many such cases are discussed in the interesting book Coincidence by Brian Inglis (Hutchinson, 1990).       S.H.






