UNCERTAINTY AND DETERMINISM

Introduced by Victor Suchar on 24 April 1998

(Note: Reference should be made to Victor Suchar: "Contemporary Issues in Physics and Philosophy", 4 April 1997, Vol.1 No. 2, page 19, of the Proceedings.)

The concept of determinism in classical mechanics stems from Newton’s Principia, elaborated by the three great works of dynamics: Lagrange’s Mécanique Analytique (1788), Hamilton’s On a General Method in Dynamics (1834), and Hertz’s Principles of Mechanics (1894). All three make use of a Variational Principle - the first two, of the Principle of Least Action, the last of the Principle of Least Curvature. The Principle of Least Action, published by Maupertuis in 1747, has a teleological character, meaning ‘shaped by a purpose’ or ‘directed toward an end’: ‘Among all possible motions, Nature reaches its goal with minimum expenditure of action’. Hamilton used this as an organizing principle, expressing the laws of Newtonian mechanics as solutions of minimum problems.
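In modern notation (a standard formulation, not anything quoted in the talk), Hamilton’s form of the principle states that the actual motion between two instants makes the action stationary:

    \delta S = \delta \int_{t_1}^{t_2} L(q, \dot{q}, t) \, dt = 0

where the Lagrangian L = T - V is the difference between kinetic and potential energy, and q stands for the generalized coordinates of the system.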
The Hamiltonian of a system is its energy expressed in terms of the positions and momenta of its particles. Given the Hamiltonian, equations of motion can be written and solved, giving the orbits of the particles in terms of a set of initial conditions. This is the most complete expression of the concept of determinism in classical mechanics, and it is vital to the theories of electromagnetism and relativity and to the development of quantum mechanics.
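For a single particle this takes the following concrete form (again in modern notation). Writing H(q, p) for the energy as a function of position q and momentum p, Hamilton’s equations

    \dot{q} = \partial H / \partial p, \qquad \dot{p} = - \partial H / \partial q

determine the entire future trajectory once the initial values of q and p are given - the precise sense in which classical mechanics is deterministic.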
Toward the end of the last century, the concepts of classical physics formed a seemingly perfect and magnificent edifice. The unification brought by the use of the minimum principle was, however, only a grouping of very different phenomena for treatment by the same mathematical formalism. Soon, contradictions began to appear:
The fundamental irreversibility imposed on the transformations of heat and work, summed up in the Second Law of Thermodynamics, did not agree with the time-reversible laws of Newtonian mechanics. The conflict was resolved by Boltzmann’s formulation of the entropy equation and the development of statistical mechanics.
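Boltzmann’s equation relates the entropy S of a macroscopic state to the number W of microscopic configurations compatible with it:

    S = k \log W

where k is Boltzmann’s constant. Irreversibility then appears not as a strict mechanical law but as an overwhelmingly probable evolution toward states of larger W.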
Maxwell’s electrodynamics did not agree with Thomson’s and Lorentz’s theories of the electron. The Michelson-Morley experiments failed to detect the existence of the ether. These contradictions were resolved by the synthesis achieved by the Special and then the General Theory of Relativity.
The Black Body experiments conducted by the German Bureau of Standards did not result in a single formula for the distribution of energy among emitted wavelengths. This led Planck to formulate in 1900 the discrete, quantum structure of energy - taken by him initially as a mathematical trick without physical meaning. In 1905 Einstein formulated the light quanta hypothesis: light consists of particles - photons - each having a definite energy and moving at the velocity of light. In turn this explained the photoelectric effect and led to the problem of wave-particle duality.
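The quantitative content of both hypotheses is the Planck-Einstein relation: a quantum of radiation of frequency \nu carries energy

    E = h \nu

where h is Planck’s constant. The photoelectric effect follows directly: an electron is ejected from a metal only if a single photon carries enough energy, h \nu > \phi (where \phi is the work function of the metal), regardless of the intensity of the light.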
The early planetary model of the atom proposed by Rutherford, with electrons orbiting a small, positively-charged, heavy nucleus, proved unworkable. When Maxwell’s theory of radiation was applied, the prediction was that the atom could not be stable: the orbiting electrons would lose energy continuously by radiation and spiral into the nucleus.
In 1913 Bohr produced a Quantum Theory of Line Spectra and formulated a fundamental postulate: an atomic system can exist permanently only in a discontinuous series of ‘stationary states’. In other words, electrons in an atom are stable only in certain of the classically possible orbits - a constraint unknown in classical mechanics. Between 1913 and 1920 he formulated the Correspondence Principle: the laws of Quantum Physics should approach those of Classical Physics in the limit of large quantum numbers. The work during the period 1919 to 1925 which led to quantum mechanics may be described as systematic guessing guided by the Correspondence Principle.
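Bohr’s postulates can be summarized in two formulas (standard textbook notation, not taken from the talk). The allowed energies of the hydrogen atom form a discrete series,

    E_n = - R / n^2, \qquad n = 1, 2, 3, \ldots

where R is about 13.6 electron-volts, and radiation is emitted or absorbed only when the atom jumps between stationary states, with frequency fixed by

    h \nu = E_m - E_n

The Correspondence Principle is visible here: for very large n, neighbouring levels crowd together and the emitted frequencies approach those of a classical orbiting charge.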
Contradictions between theory and the experimental data produced by the analysis of spectral lines led Heisenberg to reconsider the foundations of physics on a different basis. Physics should use only observables - in this case, initially, the frequencies, intensities and polarizations of the spectral lines emitted by atoms. No mention should be made of classical orbits, since no experiment can show their existence. In 1925 he obtained for the first time the relative intensities of spectral lines in agreement with experiment.
The Uncertainty Principle, first articulated in 1927, is the remarkable consequence of this work. The position and velocity of a particle cannot both be measured exactly at the same time, even in theory. The very concept of exact velocity and exact position together, in fact, has no meaning in nature. In more detail: the product of the error committed in determining the position of a particle and the error committed in determining its momentum can never fall below a quantity of the order of Planck’s constant. Since momentum is proportional to velocity, any increase in the precision of measurement of a particle’s position comes at the cost of a reduction in the precision of measurement of its velocity.
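In its modern form, with \Delta x and \Delta p the uncertainties in position and momentum, the principle reads

    \Delta x \, \Delta p \ge \hbar / 2

where \hbar = h / 2\pi is the reduced Planck constant. The bound is not a statement about clumsy instruments: it holds for any state the particle can be in.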
Quantum mechanics had its origin in two simultaneous but distinct research programmes: wave mechanics and matrix mechanics. Since both theories predicted the same frequencies and intensities for the spectra of atoms, it seemed that they must be closely related. Schroedinger’s demonstration of their mathematical equivalence and the subsequent axiomatic presentations by Dirac and von Neumann completed the formalism. But a fundamental question remains: what are the states of nature that the mathematical apparatus of the quantum theory is supposed to represent? This is the notorious interpretation problem of quantum mechanics.
When did physicists become indeterminists? The majority did so around 1925-27, because of the power of quantum mechanics to produce physically significant results. The first remarkable event in this direction was Pauli’s determination of the spectral lines of hydrogen in 1926. Probability and indeterminism came along with the success. But probabilistic methods and ideas had already penetrated classical physics through the kinetic theory, Brownian motion, molecular statistics, radioactivity and radiation in general, and they were familiar in the old quantum theory that preceded quantum mechanics in 1900-1925. The difference was expressed by von Neumann in his seminal book ‘Mathematical Foundations of Quantum Mechanics’ of 1932. In classical physics, every probability judgement stems from the incompleteness of our knowledge; probabilities in this case are nonphysical, epistemic additions to the physical structure - a ‘luxury’, according to von Neumann. Epistemic probability, then, is a matter of degree of ignorance, or of opinion. In contrast, the probabilities of quantum mechanics stem from the chancy nature of the microphysical world itself - they are fundamental.
The difference is also reflected in the manner in which we understand the mathematical formalism of each theory. In classical mechanics we start with intuitive concepts based on our attempts to describe and classify direct experience; we then refine these already given concepts, and make them consistent, by means of mathematical formalisms with which we must become familiar. In quantum mechanics, we have successfully developed the mathematical formalisms representing relations between observables, but are unable to figure out what kind of states of nature the accepted mathematical structure could be taken to represent.
This does not mean that there is an absolute contradiction between classical and quantum mechanics. Classical mechanics is a particular case of quantum mechanics - the case where the Planck constant can be neglected. It can then be said that classical physics is a relative knowledge of reality, of which quantum physics offers a more profound knowledge. We did not discover that classical mechanics, with its conceptions of causality and determinism, is false; we discovered the limits within which it is valid. We admit that the classical corpuscular conception is inadequate at the atomic level - the electron cannot be assumed to be a particle of classical mechanics. This could lead us to David Bohm’s conclusion: ‘Classical definiteness and quantum potentialities complement each other in providing a complete description of the system as a whole’.
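The sense in which the Planck constant ‘can be neglected’ is made precise by the canonical commutation relation,

    [\hat{x}, \hat{p}] = \hat{x}\hat{p} - \hat{p}\hat{x} = i \hbar

Position and momentum fail to commute by an amount proportional to \hbar; where the actions involved are enormous compared with \hbar, the right-hand side is effectively zero, the observables commute as in classical mechanics, and the deterministic description is recovered.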
The process of transition from the sub-atomic to the macroscopic world remains, however, unknown. Since the formulation of quantum mechanics, more than seventy years ago, physicists have attempted explanations, often in the form of thought experiments, and it is only recently that technology has permitted their actual physical conduct. The speaker had the opportunity to hear, at the ‘Colston Symposium’ held in April at the University of Bristol, a group of leading quantum physicists describe experiments of extraordinary complexity and precision. Possible explanation depends on the success of such experimental programmes, and there is a good deal of excitement and even some ground for optimism in this regard. In the interim, we lack the tools for interpretation.
Science may be unable to offer a Theory of Everything, but it does make a fundamental contribution to our view of the world. The concept of harmony informed ancient, medieval and Renaissance philosophy as well as the rationalism of Newtonian physics, and it greatly influenced the Enlightenment. The concept of uncertainty, in turn, has been a key to the philosophy of this century, often with strange results. The unintuitive and formal concepts of modern physics have also given rise to a great deal of pseudo-scientific interpretation.
Victor Suchar