There are two strategies for confronting the all-pervasive presence of chance in the structure of the universe. Both are extreme in their ambitions and equal in their zeal. According to the first strategy, chance explains everything; according to the second, chance contradicts explanation and should be eliminated as far as possible in order to make the universe understandable. Both of these strategies, however, are wrong. I suspect that their adherents share a misplaced idea of what chance really means.

As usual in philosophical matters, the story goes back to the Greeks. Aristotle believed that all responsible knowledge is based on causal explanation. The essence of chance consists in the fact that it breaks down the causal relationship between events. In this sense, chance contradicts explanation and lies beyond the domain of rationality. The influence of Aristotle’s teaching on our philosophical imagination is still powerful and, from this perspective, the dichotomy of “chance or Intelligent Design” appears particularly compelling. The other strategy, however, also owes a lot to Aristotle. If chance explains nothing, then chance itself does not require any explanation. If, therefore, we assume that chance is the source of everything, we remove the problem of explanation.

Aristotle’s doctrine on chance was one of his greatest errors. Chance is indeed a little stubborn, but it can be tamed. The history of the calculus of probability can be regarded as a tedious road towards the taming of chance. It was not by chance that the first steps along this road were driven by human greed. If you aim at winning a large sum in a game of chance, you must defeat chance. Pascal, when consulted by a gambler, wrote to Fermat, and it is from their exchange of letters that the first mathematical approach to chance and probability was born. The very name *probability* was the contribution of theology to this process. Jacob Bernoulli, the author of the first fully mathematised treatise on probability, entitled *Ars conjectandi*, was a pious Calvinist, but he also knew Catholic theology well and was aware of the prolonged dispute between Dominicans and Jesuits on moral matters. The problem concerned how to act if there are two sets of rules, contradicting each other, that are to be applied to a given situation, and both of them have only probable arguments on their behalf. In his *Ars conjectandi* Bernoulli proved the “first limit theorem” of probability (an early form of the law of large numbers) which, roughly speaking, states that in a sufficiently long series of random trials (e.g. throwing dice) the average of the results tends to a certain value (later called the expectation value). How could chance events be the subject matter of a mathematical theorem? As a Calvinist, Bernoulli believed in predestination: what to us seems a random or chance event is, for God, fixed once and for all. In this way, Bernoulli’s theology helped his mathematics. The London plague death lists and the annuity documents collected by Dutch bankers provided ample material on which to test reasoning based on probability. From then on, statistical calculations became an important factor in economic analysis.
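Bernoulli’s limit theorem is easy to watch at work. The sketch below (an illustration, not anything from Bernoulli’s text; the function name, seed and sample sizes are my own choices) throws a fair six-sided die many times and compares the running average with the expectation value of 3.5:

```python
# A minimal sketch of Bernoulli's "first limit theorem" (the weak law of
# large numbers): the average of fair-die throws tends to the expectation
# value, here 3.5. Seed and sample sizes are illustrative.
import random

def average_of_dice(n_throws, seed=0):
    """Average result of n_throws of a fair six-sided die."""
    rng = random.Random(seed)
    total = sum(rng.randint(1, 6) for _ in range(n_throws))
    return total / n_throws

expectation = sum(range(1, 7)) / 6  # = 3.5

for n in (10, 1_000, 100_000):
    print(f"{n:>7} throws: average = {average_of_dice(n):.3f}"
          f" (expectation {expectation})")
```

The point is that each individual throw is a chance event, yet the long-run average is governed by a provable mathematical regularity.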

In the first decades of the 20th century, powerful applications of probability in physics spurred rapid developments in mathematics that changed the calculus of probability from a set of useful rules and algorithms into a mature branch of modern mathematics. The process culminated in 1933, when Andrei Kolmogorov expressed probability theory in the form of an axiomatic system. Thanks to this achievement, probability entered into a network of interactions with other mathematical theories and initiated a chain of rapid progress.

Chance has finally been tamed. It is no longer a gap in causal explanations but a sensitive tool providing explanations where causal mechanisms fail. Chance can now be described as an event the (a priori) probability of which is less than one (where, by convention, “one” means certitude). There are two reasons why an event may have “probability less than one”. The first is our own ignorance, as when we guess whether it is the white or the black ball that is in the box. The second is when probabilistic behaviour is intrinsic to a given natural process, such as radioactive decay.
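The contrast between the two sources can be made concrete in a short sketch (the names, decay probability and atom count are illustrative assumptions of mine, not physical data):

```python
# Two senses of "probability less than one", sketched side by side.
import random

rng = random.Random(42)

# 1. Ignorance: which ball is in the box is already settled; our
#    probability of 0.5 only measures what we do not know.
ball = rng.choice(["white", "black"])  # fixed once, merely unknown to us
p_guess_white = 0.5                    # our epistemic probability

# 2. Intrinsic chance: radioactive decay. Each atom decays in a given
#    time step with the same probability, regardless of its history.
def decay_step(n_atoms, p_decay):
    """Return how many of n_atoms survive one time step."""
    return sum(1 for _ in range(n_atoms) if rng.random() > p_decay)

atoms = 10_000
for _ in range(3):
    atoms = decay_step(atoms, p_decay=0.1)
print(atoms)  # roughly 10_000 * 0.9**3, i.e. about 7,290
```

In the first case the probability would vanish if we simply looked inside the box; in the second, no amount of extra knowledge about an individual atom tells us when it will decay.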

Let us try to put a well-sharpened pencil in an upright position on a smooth surface. Because its equilibrium is unstable, it will certainly fall, but in which direction? That depends on many chance events: motions of the air, the tension of the muscles in our fingers, acoustic waves from a passing lorry, and so on. Many of these events could be the results of strictly causal processes; they are random only from the point of view of our pencil, making its hopeless effort to sustain an upright position. Without these chance events, the laws of physics could not act, and the pencil would not know how to fall down.

This is a generic situation. In the network of physical laws there are many “free places” in which chance events can act, and there are as many of them as are necessary for the entire system to work. Chance does not contradict the laws of physics but co-operates with them. Moreover, we know today that the laws of physics active in the processes leading to the growth of complexity and the generation of structure, including living systems, must be especially sensitive to input from random events.

Our most distant roots go back to the epoch, deep in the cosmic past, in which the first nuclei of chemical elements were born. This could not have happened before the universe was about one minute old. Before that moment, temperatures were too high to allow protons and neutrons to hold together to form stable nuclei. Before the universe was three minutes old, its “chemical composition” was established: 25 per cent of the total mass was helium, less than 1 per cent other light elements, and all the rest hydrogen. Here we are referring to the synthesis of the nuclei of chemical elements. Nuclei were able to capture electrons and become atoms only when the temperature dropped to about 4,000 kelvin. This happened 400,000 years after the Big Bang. At the same time, the era in which the universe was dominated by hot electromagnetic radiation ended, and the processes leading to the origin of stars and galaxies accelerated.

Heavier elements are synthesised in the interiors of massive stars and carbon is an element which is crucial for the origin of life. Stars evolve and explode and, from the ashes of dying stars, new stars are born. Three or four generations of stars are needed to produce carbon out of lighter nuclei. Around one such star our planetary system was formed and one of the planets of this system is now our home. The protons and neutrons of which our body is composed were once the building-blocks of stars. We really are children of the cosmos.

If in our attempt to balance a well-sharpened pencil so many chance events feature, what can we say about all the processes which have led to our existence? Without the laws of physics there would be no chance events; without chance events the laws of physics could not operate. Grand design contains both laws and chance.

We should stop thinking about such processes in terms of Aristotelian chance which destroys causality and order; they are fully mathematised factors participating in the creative dynamics of the universe.

Is the watchmaker blind? Perhaps he does not need eyes, since he has mathematics at his fingertips.
