Clockwork universe theory

The clockwork universe theory compares the operation of the physical universe to the inner workings of a mechanical clock. As a metaphor for the underlying order and predictability of nature, this idealized machine ticks along with precision gears governed by a few simple yet elegant laws of physics, making every aspect of the device completely predictable. After Sir Isaac Newton unified the description of terrestrial and heavenly motion, scientists had good reason to believe that all the constituent parts of the universe were mathematically predictable, at least in principle.
This was not incompatible with the religious view that God the Creator wound everything up in the first place, at the Big Bang, and that from there the laws of science took hold and have governed nearly everything since, as in Secondary Causation. But it did tend to undermine the notion that God's instant-by-instant attention is the sole motivation for the functioning of the universe, as expressed in the theory of Occasionalism.
The observation of regular order and consistency in a clockwork universe also tended to discredit the assumption of a Pantheon of Deities with ever-changing and contradictory moods and motivations, as opposed to the unity of purpose possible with a single God.
The clockwork universe was popular among Deists, who broke with traditional religious organizations during the Enlightenment, when scientists first demonstrated that Newton's laws of motion, together with his law of universal gravitation, could predict the behavior of falling objects on earth as well as the motion of the planets to within the limits of the observational accuracy of the day.
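As an illustration of that unification (a standard textbook calculation, not part of the original text), the same inverse-square law that gives the acceleration of falling bodies at the earth's surface also accounts for the Moon's orbit:

\[
F = \frac{G m_1 m_2}{r^2}, \qquad g = \frac{G M_\oplus}{R_\oplus^2} \approx 9.8\ \mathrm{m\,s^{-2}}.
\]

Since the Moon lies roughly 60 earth radii away, the law predicts a centripetal acceleration for the Moon of about \(g/60^2 \approx 2.7\times10^{-3}\ \mathrm{m\,s^{-2}}\), which is just what its observed orbital period of about 27.3 days requires.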
Art
In 2009 artist Tim Wetherell created a large wall piece for Questacon (the National Science and Technology Centre in Canberra, Australia) representing the concept of the clockwork universe. This steel artwork contains moving gears, a working clock, and a movie of the Moon's terminator in action.
Opposition
Arguments against the possibility of the universe being completely predictable via the laws of science include: the concept of free will acting through the agency of a soul not strictly governed by the laws of physics; the second law of thermodynamics, in which the total entropy of the universe tends to increase over time; the axiomatic foundation of the mathematics which underlies scientific inquiry; modern chaos theory; and finally quantum physics, with its probabilistic description of the wave function. The nuances of all such objections are fundamentally different in character.
Even though the clockwork universe theory was rooted in no small part in the momentous discoveries of Isaac Newton, he was neither the inventor of nor a proponent of the concept. Indeed, Newton warned against viewing the universe as a mere machine, specifically as something similar to a great clock.

"Gravity explains the motions of the planets, but it cannot explain who set the planets in motion. God governs all things and knows all that is or can be done.”

Throughout his life, Newton noted the inherent harmony between natural law and religion and claimed in his master work, the Principia:

“This most beautiful system of the sun, planets and comets, could only proceed from the counsel and dominion of an intelligent and powerful Being.”

Also, as Edward B. Davis notes in recounting the writings of Newton's supporter Samuel Clarke, Newton believed that the clockwork universe theory wrongly reduced God's role in the universe. Responding to Gottfried Leibniz, a prominent supporter of the clockwork proposal, Clarke wrote in the Leibniz–Clarke correspondence, echoing Newton's beliefs:

"The Notion of the World's being a great Machine, going on without the Interposition of God, as a Clock continues to go without the Assistance of a Clockmaker; is the Notion of Materialism and Fate, and tends, (under pretense of making God a Supra-mundane Intelligence,) to exclude Providence and God's Government in reality out of the World."

World-machine
A nearly identical concept was described earlier, in a 13th-century introduction to astronomy by Johannes de Sacrobosco: On the Sphere of the World. In this widely popular medieval text, Sacrobosco spoke of the universe as the machina mundi, the machine of the world, and waxed poetic in suggesting that a reported eclipse of the Sun at the crucifixion of Jesus might have been a disturbance in the order of that machine.
As in a clockwork mechanism, the machina mundi was a huge, regulated, and uniform machine that operated according to natural laws in absolute time, space, and motion. God was the master builder, who created the perfect machine and let it run. God was the Prime Mover, who brought the world into being in its lawfulness, regularity, and beauty. This concept was a core theological assumption in the Western tradition and remains the teaching of the Catholic Church as well as most Protestant denominations, which hold that a single God is the creator and conservator of natural law. Christians would, however, insist that both God and man have the potential to inject random non-physical perturbations into the world through the exercise of free will.
A variant of this view, in which God creates the universe but subsequently stands aside from his work and does not get involved with humanity even for the occasional miracle, is called Deism. In this philosophy, God and man still possess free will, but only man injects it into the natural order of things. Deism predates Newton and was accepted by many who supported the “new philosophy”.
While atheism readily embraces the determinism implicit in a mechanical world or a clockwork universe, it denies the existence of God the creator and thus requires the universe to be eternal, that is, without a beginning. In this sense, the universe must always have existed in some physical form. Atheists would also deny the existence of free will as well as of consciousness, claiming that both are nothing more than the predetermined and predictable chemical activity of the brain.
Objections Due to Free Will
Most arguments for free will are rooted in the sense that it is self-evident and a natural outgrowth of our consciousness or soul. Supporting evidence includes a common sensibility of right and wrong independent of culture and often contrary to individual interest (ethics), a common regard for altruistic acts including self-sacrifice for unrelated strangers (altruism), a common curiosity as to origins and purpose, and a human ability to conceptualize such varied things as absolute ideals and non-physical number sets, none of which is easily explained by the mechanisms of Darwinian selection pressures. These features do, however, suggest a spiritual component, which would be indeterministic by definition, since operating in a realm outside of nature and natural laws necessarily implies a lack of predictability based on those laws. The religious argument is thus that any conscious being, either God or man made in His image, has the potential to introduce random uncertainties into the physical clockwork universe.
Indeed, so firmly ensconced is the popular belief in an individual's freedom of choice as to generate an inordinate societal effort, in the development of both law and religion, to regulate behavior. In this view the unthinkable consequence would be that, since free will is the necessary precursor of responsibility, its absence would remove any real foundation for ubiquitous social norms, including rewards and punishment for children, the treatment of one's neighbors, and especially any criminal justice system, making all such efforts pointless at their core.
Another consideration is that science lacks the tools to investigate, much less explain, how a clockwork mechanism consisting of electrons bouncing off neurons could give rise to the self-awareness needed to host free will. The assertion is that the axioms of mathematics and science limit their applicability, in the most fundamental sense, to a generalized description of physical motion through the transfer of chemical energy, which cannot, even in principle, provide traction toward explaining the sense of being a free agent. Basically, science is limited to predictions of atomic locations and motions and can only correlate these in the most general way with physical sensations, which are not, strictly speaking, pertinent to the core issue of consciousness. Given these considerations, many assume the existence of a supernatural soul, not strictly governed by the laws of physics, as the simplest explanation for the common, and daunting, experience of being aware of one's own existence.
The primary argument against free will is then that, while advances in neuroscience at the macroscopic level would not be determinative, a simulation of a living brain at the microscopic level might conclusively demonstrate whether an individual's choice was predictable or not. Whether this pivotal calculation is even theoretically possible using Newtonian dynamics, given the likely evolution into chaotic complexity, is somewhat moot, since the resources required would almost certainly be beyond any means. That is not to say that model simulations cannot provide meaningful representations of, and valuable insights into, neural structures; but rather that a conclusive demonstration of predictability would require an intimate and precise comparison with the real-time chemistry of a living brain, which would be far more difficult.
At the most fundamental level, the question of whether the chemical activity of the brain is hidden behind the probabilistic veil of quantum mechanics continues to generate discussion as more and more biological processes that do exhibit quantum behavior are discovered. Generally, while most chemical reactions and diffusion processes active in biology occur at too large a scale to be more than slightly random in a quantum sense, a few do not. And while large cascading events and their timing often have smaller and more subtle triggers, most of the brain's functions seem to operate classically.
Objections Due to Entropy
In the mid and late 19th century the concept of entropy in thermodynamics was first described by both Rudolf Clausius and William Thomson (Lord Kelvin) and was given mathematical rigor by Ludwig Boltzmann, who had the pivotal equation inscribed on his gravestone. To the extent it is a correct formulation, it requires the disorder in the universe taken as a whole to continually increase, and thus demands that the universe, or any and all possible multiverses in total, have a beginning. The best scientific thinking is thus that the universe, or all multiverses if they are real, did not always exist.
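For reference, the equation on Boltzmann's gravestone relates, in modern notation, the entropy S of a macroscopic state to the number W of microscopic configurations consistent with it:

\[
S = k_B \ln W, \qquad k_B \approx 1.38\times 10^{-23}\ \mathrm{J/K},
\]

and the second law then asserts that for an isolated system, and for the universe taken as a whole, the total entropy never decreases: \(\Delta S \ge 0\).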
Because, under the assumptions of natural science, it is logically impossible for physical objects to spring into existence without a physical cause or basis of some sort, many invoke a supernatural being, or God, as the creator of the universe from nothing, or ex nihilo. As such, this argument does not strictly bear on the current functioning of the universe, which is well described by the laws of thermodynamics, but rather on the origin of those laws and thus the possibility of a higher order than can be described by physics alone.
That the universe must have had a beginning was logically developed by Thomas Aquinas, based on the earlier works of Augustine, Plato, and Aristotle, all of which is generally supported by the laws of thermodynamics. Interestingly, and at the other extreme, atheism, which rejects a creation event, demands a leap of faith in the eternal existence of the universe, necessarily denying currently accepted scientific theory and observations related to entropy.
Objections Due to Axiomatic Mathematics
All of geometry and mathematics is based on a few simple assumptions, or unprovable axioms, from which logical consequences are drawn to form a bewilderingly complex assembly of proofs and laws. While the initial assumptions of Euclid's Elements or the Peano Axioms have been chosen to be simple and "self-evident", there can be no assurance that they do not result in some subtle error far down the chain of reasoning.
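As a concrete example of such a foundation (listed here for illustration), the Peano Axioms for the natural numbers can be stated in a handful of lines, with S the successor function and P any property:

\[
\begin{aligned}
&(1)\quad 0 \in \mathbb{N},\\
&(2)\quad n \in \mathbb{N} \Rightarrow S(n) \in \mathbb{N},\\
&(3)\quad S(n) \neq 0,\\
&(4)\quad S(m) = S(n) \Rightarrow m = n,\\
&(5)\quad \bigl(P(0) \wedge \forall n\,(P(n) \Rightarrow P(S(n)))\bigr) \Rightarrow \forall n\, P(n).
\end{aligned}
\]

All of ordinary arithmetic is built on assumptions of this kind, which must themselves simply be accepted.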
Indeed, one consequence of any sufficiently powerful axiomatic system is Gödel's incompleteness theorem, which demonstrates that there must be mathematical statements which are true and yet impossible to prove, as well as statements which are false and yet impossible to refute. In this sense, any mathematics we could devise must always be incomplete. And as Galileo opined:
"Philosophy is written in this grand book — I mean the universe — which stands continually open to our gaze, but it cannot be understood unless one first learns to comprehend the language in which it is written. It is written in the language of mathematics, and its characters are triangles, circles, and other geometric figures, without which it is humanly impossible to understand a single word of it; without these, one is wandering about in a dark labyrinth."

None of these considerations imply that mathematics or science is wrong but rather that there is no way to be absolutely certain of their results. Simply speaking, we tend to prefer those theories that currently match all observations and are simplest.
As a case in point, a central underpinning of the clockwork universe was Newton's law of gravitation, which fostered the sense that nature was precisely predictable in the heavens as well as on earth. Newton's putative gravitational fields were apparently pervasive and conclusive. Trajectories of cannon balls and planetary orbits were predicted with spectacular success. It was only much later that a deviation from Newtonian dynamics of roughly 43 arc seconds per century was detected in the position of the planet Mercury, an amount discernible only with telescopes and precision instruments, and only by trusting generations of astronomers' unwavering attention to detail. But the consequences were profound.
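For scale, the general-relativistic correction (a standard calculation, included here for illustration) gives a perihelion advance per orbit of

\[
\Delta\phi = \frac{6\pi G M_\odot}{c^{2}\, a\, (1-e^{2})} \approx 5.0\times 10^{-7}\ \text{radians},
\]

using \(G M_\odot \approx 1.33\times 10^{20}\ \mathrm{m^3\,s^{-2}}\), Mercury's semi-major axis \(a \approx 5.79\times 10^{10}\ \mathrm{m}\), and eccentricity \(e \approx 0.206\). With roughly 415 Mercury orbits per century, this accumulates to about 43 arc seconds per century, matching the residual left unexplained by Newtonian dynamics.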
Newton's gravitational fields are now recognized to be a complete illusion. And at no little increase in complexity, we now prefer to believe instead that the inertia of moving bodies carries them along straight lines through a space warped by massive objects in an unseen dimension. Whether this reflects reality any more than Newton's inverse-square law is problematic, because Einstein's General Relativity is not compatible with quantum theory and is therefore known to be somehow fundamentally in error as well.
In addition, while a Theory of Everything may be possible, a true theory of everything is not logically possible. An oft-quoted demonstration, in addition to those given above, is that one might always ask “Why that particular theory and not another?”
In summary, as science has improved our knowledge of nature, so in like proportion has it sharpened our appreciation of the fundamental limits of logic and the inherent limits of science itself.
Objections Due to Chaos Theory
Since the late 1960s, with the rediscovery of Poincaré's work on the three-body problem, many non-periodic real-world systems have been shown to slip into a chaotic state. Remarkably, these systems were previously thought to be well described by Newton's classical laws of motion and gravitation. Instabilities in predictions generally arise from some non-linear aspect combined with an extreme sensitivity to initial conditions.
Another difficulty is that even for simple systems there seems to be a finite limit on any predictability. Even though an increasingly better knowledge of the initial state allows a prediction further into the future, the requirements on initial accuracy increase at a faster rate than the window of predictability improves. Thus even for classical systems, there is a finite window extending both into the past and the future beyond which no predictions are theoretically possible.
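This trade-off can be made quantitative with a standard dynamical-systems estimate (added here for illustration). If the initial measurement error is \(\Delta_0\) and errors grow exponentially at a rate set by the largest Lyapunov exponent \(\lambda\), a prediction remains within a tolerance \(\Delta_{\max}\) only for roughly

\[
t_{\text{horizon}} \approx \frac{1}{\lambda}\,\ln\frac{\Delta_{\max}}{\Delta_0}.
\]

Because the dependence on \(\Delta_0\) is only logarithmic, improving the initial accuracy a million-fold lengthens the horizon by merely \(\ln(10^{6})/\lambda \approx 14/\lambda\), and every further extension of the same length demands yet another million-fold improvement.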
That the evolution of a few seemingly simple differential equations in a Lorenz system model was inherently unpredictable in any possible computer simulation came as a real surprise in the early 1960s. But since then most real-world systems, having complex relationships far beyond the early models of Lorenz, have been shown to be chaotic. In 1969, Sir James Lighthill (1924-1998) was elected Lucasian Professor of Mathematics, succeeding the physics Nobel laureate P.A.M. Dirac in the chair at the University of Cambridge previously held by Sir Isaac Newton. Sir James, a firm believer in Newtonian mechanics, later offered a public apology that is enlightening to read:
“Here I have to pause, and to speak once again on behalf of the broad global fraternity of practitioners of mechanics. We are all deeply conscious today that the enthusiasm of our forebears for the marvelous achievements of Newtonian mechanics led them to make generalizations in this area of predictability which, indeed, we may have generally tended to believe before 1960, but which we now recognize were false. We collectively wish to apologize for having misled the general educated public by spreading ideas about the determinism of systems satisfying Newton’s laws of motion that, after 1960, were to be proved incorrect. In this lecture, I am trying to make belated amends by explaining both the very different picture that we now discern, and the reasons for it having been uncovered so late.”
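The divergence Lighthill describes is easy to reproduce numerically. The following minimal sketch (an illustrative example, not part of the original text) integrates the classic Lorenz system twice from starting points that differ by one part in a billion; after a few dozen time units the two trajectories bear no resemblance to each other, no matter how carefully the arithmetic is done.

# Minimal sketch: sensitive dependence on initial conditions in the Lorenz system,
# using the standard parameters sigma = 10, rho = 28, beta = 8/3.
import numpy as np

def lorenz(state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = state
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def integrate(state, dt=0.001, steps=40000):
    # Fourth-order Runge-Kutta integration over 40 dimensionless time units.
    for _ in range(steps):
        k1 = lorenz(state)
        k2 = lorenz(state + 0.5 * dt * k1)
        k3 = lorenz(state + 0.5 * dt * k2)
        k4 = lorenz(state + dt * k3)
        state = state + (dt / 6.0) * (k1 + 2.0 * k2 + 2.0 * k3 + k4)
    return state

a = integrate(np.array([1.0, 1.0, 1.0]))
b = integrate(np.array([1.0, 1.0, 1.0 + 1e-9]))  # perturb z by one part in a billion
print("unperturbed end state:", a)
print("perturbed end state:  ", b)
print("separation:", np.linalg.norm(a - b))      # comparable to the size of the attractor itself

Making the starting point more accurate only delays the divergence; it does not prevent it.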
The best current thinking seems to be that even for classical systems, the argument for a clockwork universe as a strict consequence of Newtonian dynamics is no longer logically valid. Since both complexity and errors accumulate over time, perhaps exponentially, we cannot be certain of predictability even for short times, even in principle, and even for classical systems. The claim of chaos theory is thus that nature draws a curtain over predictions of mechanical motion in a clockwork universe, a curtain forever beyond our ability to penetrate.
Objections Due to Quantum Mechanics
Quantum mechanics describes physical objects as wave functions whose amplitudes are smeared out to give only relative probabilities of being in different states rather than exact locations and velocities. In a sense this does not remove all determinism from physics because, if the initial state of a wave function could be known with absolute precision, its future evolution could be exactly predicted, at least in principle. But since the uncertainty principle declares the theoretical impossibility of a precise knowledge of initial conditions, this alone prevents absolute certainty as to a deterministic clockwork universe, even one consisting of wave functions.
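In standard notation (quoted here for reference), the wave function \(\psi\) evolves deterministically under the Schrödinger equation, while the Heisenberg uncertainty principle rules out the exact simultaneous knowledge of position and momentum that a clockwork prediction would require:

\[
i\hbar\,\frac{\partial \psi}{\partial t} = \hat{H}\,\psi, \qquad \Delta x\,\Delta p \ \ge\ \frac{\hbar}{2}.
\]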
The objection from quantum mechanics does not, however, depend on any lack of knowledge concerning positions or velocities but rather is fundamentally rooted in the mathematics itself. In principle, even if we knew the initial conditions perfectly, the wave function, by describing a particle as existing in many different states at the same time, removes causality from individual events, notwithstanding the satisfyingly precise predictions of their relative probabilities in aggregate. For instance, identical particles in identical states, at least to all outward appearances, are observed to decay into different fragments along many different paths. In the other direction, we cannot say that state Z was caused by state A, but rather that state Z could have been immediately preceded by state A, or by state B, or by state C, and so forth, each more or less often. Nor do the states A, B, C, and so on always result in state Z. It seems the best we can do to make sense of this apparent nonsense is to replace causation with correlation. We are forced to such descriptions because all deterministic formulations, including Newtonian dynamics, fail miserably in describing phenomena at the atomic level. They are, at best, collective approximations of the seething indeterminism of the wave function that apparently governs an underlying and more fundamental reality.
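Those aggregate probabilities are fixed by the Born rule (the standard postulate, stated here for reference): if a system is prepared in the state \(\psi\), the probability of observing outcome \(n\) is

\[
P(n) = \bigl|\langle n \mid \psi \rangle\bigr|^{2},
\]

which pins down the statistics of many repeated trials while saying nothing about the result of any single one.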
Quantum mechanics predicts, for instance, that when you leave for work in the morning, the now unobserved pendulum in your grandfather clock will suddenly assume multiple positions and velocities all at the same time, including jumping off its restraints to skewer your house cat, which in turn will not be killed outright but will exist uncomfortably in a state of being both alive and dead. And the result of this carnage will not be resolved until you return home in the evening to survey the damage. Fortunately for large tabbies like Schrödinger's cat, but not for atoms, common-sense behavior is by far the more probable outcome.
Indeed, the power of quantum mechanics is to start by assuming a lack of determinism and then to calculate the consequences successfully. To reiterate, not only is determinism not a required assumption; it is assumed not to be the case at all. Thus, to the extent that quantum mechanical equations reflect reality, rather than simply being a mathematical contrivance, all of nature is at its core unequivocally indeterministic.
Driven by a lifelong philosophical revulsion at this quantum mechanical violation of common-sense sensibilities, and in undoubtedly the most famous attempt, Einstein championed the old world order against the upstart ideas of this newfangled indeterminism with the well-known quip:
“God does not play dice with the universe.”
But more rigorously, even as the predictive success of quantum mechanics climbed to nearly unassailable heights, Einstein and a small cadre of supporters developed the now famous hidden-variable conjecture. The idea was that there might be heretofore unknown aspects of an object which, when measured, could perhaps provide a deterministic description of atoms without resort to pesky probability waves. Note that this did not reformulate or reinterpret quantum mechanics so much as criticize its “incompleteness” and suggest it should be replaced with something else entirely.
A few early interpretations of quantum mechanics attempted to wish away the violation of common-sense notions by manipulating the basic equations. But with theoretical advances including the Schrödinger equation, the demonstrated equivalence of the Heisenberg matrix formulation, the unification with special relativity under Dirac, the linear Hilbert space operators of von Neumann, and the quantum field theories of Feynman and many others, the mathematics describing this inherent lack of determinism has become rigorous and undeniable. To the extent this remains an uncomfortable fact, some, including Bohm, 't Hooft, and others, have suggested the possibility of replacing quantum mechanics with better, albeit as yet incomplete, theories or “re-interpretations” of microscopic phenomena. To date, none of these attempts founded entirely on deterministic principles is mathematically rigorous; none provides the means to calculate all or even most quantum behavior so far observed; and none has any experimental verification.
The problems in attempting to restore determinism to modern physics have been twofold, one practical and the other theoretical.

a) First and foremost, in literally millions of experiments and in many new venues, quantum mechanics with its intrinsic indeterminism has not had a single predictive failure.

b) The second, more profound reason to reject determinism is the theoretical insight by Bell, in Bell's theorem, that there are experiments which can actually measure whether all microscopic objects in the real world are indeterministic, as described by the wavefunction of quantum mechanics, or instead are deterministic, as described by any conceivable hidden-variable formulation, or indeed by any classical deterministic theory. The deciding criterion is Bell's inequality, which must be satisfied if reality follows the determinism of classical physics and must be violated if reality follows the indeterminism of the wavefunction.
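In the most commonly tested (CHSH) form of Bell's theorem, a correlation \(E\) is measured for two detector settings on each side, \(a, a'\) and \(b, b'\). Any local hidden-variable description must satisfy

\[
|S| = \bigl| E(a,b) - E(a,b') + E(a',b) + E(a',b') \bigr| \ \le\ 2,
\]

whereas quantum mechanics predicts values up to \(2\sqrt{2} \approx 2.83\) for suitably chosen settings, and it is the larger value that the experiments observe.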
The nature of Bell's insight was not in any way a new interpretation of existing theory but rather, surprisingly, a mathematical proof that experiments able to measure this most basic aspect of reality are actually possible. And the net result of every single one of the Bell test experiments to date indicates that the indeterminism of quantum particles is real, and consequently that “God does indeed play at dice”, as the formulae of quantum mechanics have always stipulated.
Indeed, perhaps the last possible loopholes providing any hope of determinism are limited to the following possibilities:

a) The glaring assumption of Bell's work is the validity of Einstein's well-tested special theory of relativity. Thus determinism might be rescued from the clutches of Bell by the future development of "non-local" theories, q.v. nonlocality, which would unfortunately require superluminal influences, i.e. faster than light, and which would also require physical effects to occur before their causes. In any event, and in the strictest sense, all Bell test results to date require that either determinism or special relativity be wrong. Because there is an overwhelming preponderance of evidence that traveling faster than light is not possible, and a great difficulty in explaining how effects might come before causes, indeterminism seems to be the indicated result.

b) Critics cite technical difficulties in the experiments measuring the quantities in Bell's inequality, such as not detecting enough events for statistical confidence. But as each new generation of Bell test experiments definitively eliminates more and more loopholes in more and more combinations, these objections are becoming less and less viable. And this is especially true as each and every test measurement has supported the finding of indeterminism, at whatever confidence level, and never the opposite.

c) Entirely new theories might be invented in the future which would supersede quantum mechanics, with its inherent indeterminism, based on novel assumptions or principles which we cannot now imagine. Unfortunately, we currently lack any such complete replacement formulation for quantum mechanics despite more than a century of effort.
 