Eddington 1927: Coincidences


This post presents Chapter IV (section 3) of the book THE NATURE OF THE PHYSICAL WORLD by A. S. EDDINGTON. The contents of this book are based on the lectures that Eddington delivered at the University of Edinburgh from January to March 1927.

The paragraphs of original material are accompanied by brief comments in color, based on present-day understanding. Feedback on these comments is appreciated.

The heading below links to the original materials.



There are such things as chance coincidences; that is to say, chance can deceive us by bringing about conditions which look very unlike chance. In particular chance might imitate organisation, whereas we have taken organisation to be the antithesis of chance or, as we have called it, the “random element”. This threat to our conclusions is, however, not very serious. There is safety in numbers.

In the double-slit experiment, electrons end up creating an interference pattern. This is an example of how what appears to be chance in an isolated instance may be part of a regulated pattern visible only across a large number of instances. It also suggests that electrons are not completely discrete, but maintain continuity with one another at some level. This supports the “continuum of substance” perspective. Here there must be a universal law that regulates the emergence of quantization from continuity as the field-substance increases in frequency. This law is yet to be discovered.

Suppose that you have a vessel divided by a partition into two halves, one compartment containing air and the other empty. You withdraw the partition. For the moment all the molecules of air are in one half of the vessel; a fraction of a second later they are spread over the whole vessel and remain so ever afterwards. The molecules will not return to one half of the vessel; the spreading cannot be undone—unless other material is introduced into the problem to serve as a scapegoat for the disorganisation and carry off the random element elsewhere. This occurrence can serve as a criterion to distinguish past and future time. If you observe first the molecules spread through the vessel and (as it seems to you) an instant later the molecules all in one half of it—then your consciousness is going backwards, and you had better consult a doctor.
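Eddington’s vessel can be caricatured in a few lines of code. The sketch below is a toy Ehrenfest urn model, not anything from the text itself: at each step one randomly chosen molecule hops to the other compartment. The count in the left half quickly settles near one half of the total and, for any appreciable number of molecules, never again reaches the all-in-one-half state. The molecule count and step count are illustrative assumptions.

```python
import random

def ehrenfest(n_molecules=50, steps=2000, seed=1):
    """Toy model of gas expansion (Ehrenfest urn model): each step,
    one molecule chosen at random switches compartments."""
    rng = random.Random(seed)
    left = n_molecules          # all molecules start in the left half
    history = []
    for _ in range(steps):
        # a left-side molecule is chosen with probability left/n
        if rng.random() < left / n_molecules:
            left -= 1           # it moves to the right half
        else:
            left += 1           # a right-side molecule moves left
        history.append(left)
    return history

history = ehrenfest()
# after the initial transient, the count fluctuates around n/2
print(min(history[500:]), max(history[500:]))
```

Even with only 50 molecules, a return to the initial segregated state has probability of order (½)⁵⁰ per step, so the simulation never exhibits it; with a quadrillion molecules the case is hopeless.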

The truth is that different conditions lead to different equilibrium organizations. By changing the conditions, we can change the equilibrium organization; the equilibrium organization will not change by itself if the conditions remain the same. For example, an absolute velocity exists because of its equilibrium with the inertia of the object. If we change the inertia of the object, the absolute velocity will change by itself to establish a new equilibrium. Thus, an irreversible process may be reversed by changing the conditions of the process. The second law of thermodynamics simply highlights the natural phenomenon of equilibrium under a given condition.

Now each molecule is wandering round the vessel with no preference for one part rather than the other. On the average it spends half its time in one compartment and half in the other. There is a faint possibility that at one moment all the molecules might in this way happen to be visiting the one half of the vessel. You will easily calculate that if n is the number of molecules (roughly a quadrillion) the chance of this happening is (½ )n. The reason why we ignore this chance may be seen by a rather classical illustration. If I let my fingers wander idly over the keys of a typewriter it might happen that my screed made an intelligible sentence. If an army of monkeys were strumming on typewriters they might write all the books in the British Museum. The chance of their doing so is decidedly more favourable than the chance of the molecules returning to one half of the vessel.
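Eddington’s two chances can be put side by side numerically. The sketch below takes “a quadrillion” in the British sense of his day, 10²⁴ (an assumption), and compares the logarithm of (½)ⁿ with the chance of one monkey typing out a particular book by accident; the book length and keyboard size are illustrative guesses.

```python
import math

# Eddington's n: "roughly a quadrillion" molecules; British usage
# of his day makes that 10**24 (assumption).
n = 10**24

# Chance that all n molecules occupy one half at once: (1/2)**n.
# Far too small for floats, so work with its base-10 logarithm.
log10_chance = -n * math.log10(2)
print(f"molecules: chance = 10^({log10_chance:.3e})")

# For comparison: one monkey typing a particular book of 5 million
# characters on a 50-key typewriter (illustrative figures):
log10_monkey = -5e6 * math.log10(50)
print(f"monkey:    chance = 10^({log10_monkey:.3e})")

# The monkey's exponent is ~10**7; the molecules' is ~10**23.
print("monkeys more favourable:", log10_monkey > log10_chance)
```

The monkeys’ chance, absurd as it is, exceeds the molecules’ chance by a factor whose exponent itself has seventeen more digits, which is Eddington’s point.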

When the condition brings about a certain equilibrium, the chance that a different organization will occur by itself is quite remote.

When numbers are large, chance is the best warrant for certainty. Happily in the study of molecules and energy and radiation in bulk we have to deal with a vast population, and we reach a certainty which does not always reward the expectations of those who court the fickle goddess.

When there is a large number of particles, they settle into an overall equilibrium organization as a bulk, while the individual particles may retain a range of motion.

In one sense the chance of the molecules returning to one half of the vessel is too absurdly small to think about. Yet in science we think about it a great deal, because it gives a measure of the irrevocable mischief we did when we casually removed the partition. Even if we had good reasons for wanting the gas to fill the vessel there was no need to waste the organisation; as we have mentioned, it is negotiable and might have been passed on somewhere where it was useful. (If the gas in expanding had been made to move a piston, the organisation would have passed into the motion of the piston.)  When the gas was released and began to spread across the vessel, say from left to right, there was no immediate increase of the random element. In order to spread from left to right, left-to-right velocities of the molecules must have preponderated, that is to say the motion was partly organised. Organisation of position was replaced by organisation of motion. A moment later the molecules struck the farther wall of the vessel and the random element began to increase. But, before it was destroyed, the left-to-right organisation of molecular velocities was the exact numerical equivalent of the lost organisation in space. By that we mean that the chance against the left-to-right preponderance of velocity occurring by accident is the same as the chance against segregation in one half of the vessel occurring by accident.
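The “negotiable” organisation Eddington describes can be given numbers. A minimal sketch, assuming n = 10²⁴ molecules and an illustrative temperature of 300 K: free expansion into double the volume wastes an entropy n·k·ln 2, whereas an isothermal expansion against a piston would have delivered the same organisation as work n·k·T·ln 2.

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
N = 10**24           # molecules (Eddington's rough count; assumption)
T = 300.0            # temperature in kelvin (illustrative)

# Entropy created when the gas doubles its volume by free expansion:
delta_S = N * k_B * math.log(2)

# Had the expansion pushed a piston isothermally instead, the same
# organisation would have been passed on as work:
W = N * k_B * T * math.log(2)

print(f"entropy increase: {delta_S:.2f} J/K")
print(f"recoverable work at {T:.0f} K: {W:.0f} J")
```

The preposterous exponent of the adverse chance thus corresponds to only about ten joules per kelvin of entropy: the logarithm tames it.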

With the partition in place there was one condition of equilibrium. With the partition removed there was another condition of equilibrium. It is not likely that the earlier organization will come about without the earlier condition. Conversion into mechanical work will depend on the change in conditions that brings about different equilibrium organizations. There is also an equilibrium between field-substance and material-substance.

The adverse chance here mentioned is a preposterous number which (written in the usual decimal notation) would fill all the books in the world many times over. We are not interested in it as a practical contingency; but we are interested in the fact that it is definite. It raises “organisation” from a vague descriptive epithet to one of the measurable quantities of exact science. We are confronted with many kinds of organisation. The uniform march of a regiment is not the only form of organised motion; the organised evolutions of a stage chorus have their natural analogue in sound waves. A common measure can now be applied to all forms of organisation. Any loss of organisation is equitably measured by the chance against its recovery by an accidental coincidence. The chance is absurd regarded as a contingency, but it is precise as a measure.
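Eddington’s claim that the adverse chance “would fill all the books in the world many times over” is easy to check. A rough sketch, assuming 10²⁴ molecules and an illustrative world library of 10⁸ books of 10⁶ characters each:

```python
import math

n = 10**24                                    # molecules (assumption)
digits = math.floor(n * math.log10(2)) + 1    # decimal digits of 2**n

# Illustrative library: 10**8 books of 10**6 characters each.
library_chars = 1e8 * 1e6

print(f"digits in the adverse chance: {digits:.3e}")
print(f"fills the library {digits / library_chars:.1e} times over")
```

The number 2ⁿ has about 3 × 10²³ decimal digits, billions of times the character capacity of any conceivable library, so the claim holds comfortably.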

There is a definite relationship between conditions and the resulting equilibrium organizations. As conditions change, the equilibrium organization changes also. So there is a “change” rather than a “loss” of organization. There are definite laws of force here. The only difference is the large number of particles and their equilibrium with the surrounding field-substance. We use statistical methods to reduce the complexity of the calculations.

The particles move according to the laws of inertia and quantization. If their velocities change through collisions, then their inertia must be interchanging with the quantization of the surrounding field-substance under the law of conservation of force (substance).

Under the “particles in void” perspective, the law of conservation of energy assumes material particles to be completely discrete. It thus ignores the surrounding field-substance and the equilibrium with it.

The practical measure of the random element which can increase in the universe but can never decrease is called entropy. Measuring by entropy is the same as measuring by the chance explained in the last paragraph, only the unmanageably large numbers are transformed (by a simple formula) into a more convenient scale of reckoning. Entropy continually increases. We can, by isolating parts of the world and postulating rather idealised conditions in our problems, arrest the increase, but we cannot turn it into a decrease. That would involve something much worse than a violation of an ordinary law of Nature, namely, an improbable coincidence. The law that entropy always increases—the second law of thermodynamics—holds, I think, the supreme position among the laws of Nature. If someone points out to you that your pet theory of the universe is in disagreement with Maxwell’s equations—then so much the worse for Maxwell’s equations. If it is found to be contradicted by observation—well, these experimentalists do bungle things sometimes. But if your theory is found to be against the second law of thermodynamics I can give you no hope; there is nothing for it but to collapse in deepest humiliation. This exaltation of the second law is not unreasonable. There are other laws which we have strong reason to believe in, and we feel that a hypothesis which violates them is highly improbable; but the improbability is vague and does not confront us as a paralysing array of figures, whereas the chance against a breach of the second law (i.e. against a decrease of the random element) can be stated in figures which are overwhelming.
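The “simple formula” Eddington alludes to is, historically, Boltzmann’s relation S = k ln W: entropy is the logarithm of the number of microscopic arrangements, rescaled by a constant. The sketch below shows how it turns the unmanageable chance (½)ⁿ into an ordinary laboratory quantity; n = 10²⁴ is again an assumption about Eddington’s “quadrillion”.

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K

# Boltzmann's relation S = k * ln W converts counts of arrangements
# into entropy.  The adverse chance (1/2)**n against re-segregation
# corresponds to a forbidden entropy *decrease* of n * k * ln 2:
n = 10**24
delta_S = n * k_B * math.log(2)
print(f"forbidden entropy decrease: {delta_S:.2f} J/K")
```

An exponent of some 10²³ digits collapses, on the entropy scale, to roughly ten joules per kelvin, which is what makes entropy a practical measure.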

I wish I could convey to you the amazing power of this conception of entropy in scientific research. From the property that entropy must always increase, practical methods of measuring it have been found. The chain of deductions from this simple law have been almost illimitable; and it has been equally successful in connection with the most recondite problems of theoretical physics and the practical tasks of the engineer. Its special feature is that the conclusions are independent of the nature of the microscopical processes that are going on. It is not concerned with the nature of the individual; it is interested in him only as a component of a crowd. Therefore the method is applicable in fields of research where our ignorance has scarcely begun to lift, and we have no hesitation in applying it to problems of the quantum theory, although the mechanism of the individual quantum process is unknown and at present unimaginable.

Entropy is a measure of equilibrium: it is maximized when equilibrium is achieved. A decrease in entropy would mean moving away from equilibrium, which does not happen naturally.


