HIGH ANXIETY: THE STRUGGLE TO INTERPRET MODERN PHYSICS

Victor Suchar, Member, on 25 April 2003

This paper describes how quantum theory evolved from classical physics, the problems of interpreting the quantum formalism, and the main proposed solutions. It is suggested that the understanding of the process of transition from the sub-atomic to the macro-physical world may be enhanced by a change in the methodology of research from a foundationist approach based on reduction (metaphorically, the tower) to a network approach based on multiple relations which implicate and partially explain each other (metaphorically, the sphere).

A. DETERMINISM, CORRESPONDENCE, UNCERTAINTY, EMERGENCE

Max Born defined determinism as a condition of general lawfulness postulating "that events at different times are connected by laws in such a way that predictions of unknown situations (past and future) can be made" (Born, 1949). In classical mechanics, this concept stems from Newton's Principia as elaborated by the three great works of dynamics: Lagrange's Mécanique Analytique (1788), Hamilton's On a General Method in Dynamics (1834), and Hertz's Principles of Mechanics (1894). All three make use of a variational principle - the former two, of the Principle of Least Action, the last of the Principle of Least Curvature. The Principle of Least Action published by Maupertuis in 1747 has a teleological character, meaning "shaped by a purpose" or "directed toward an end": "Among all possible motions, Nature reaches its goal with minimum expenditure of action". Hamilton used this as an organising principle in order to express all the laws of Newtonian mechanics as solutions of minimum problems.
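
In modern notation (a standard formulation, not spelled out in the original text), the Principle of Least Action states that the actual motion of a system makes the action integral stationary:

    S = \int_{t_1}^{t_2} L(q, \dot{q}, t)\, dt, \qquad \delta S = 0

where L is the Lagrangian of the system (for simple mechanical systems, kinetic energy minus potential energy) and q stands for the generalised coordinates.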

The Hamiltonian of a system is the energy expressed in terms of the position and momentum of a particle. Given the Hamiltonian, equations can be written and solved which give the orbits of the particles in terms of a set of initial conditions. This is the most complete expression of the concept of determinism in classical mechanics, and it is vital to the theories of electromagnetism and relativity and to the development of quantum mechanics. But let's step back for a moment.
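
For reference, the determinism just described can be stated in standard notation (mine, not the author's): given the Hamiltonian H(q, p), the orbits are the solutions of Hamilton's equations

    \dot{q} = \frac{\partial H}{\partial p}, \qquad \dot{p} = -\frac{\partial H}{\partial q}

so that a single set of initial conditions (q_0, p_0) fixes the entire past and future trajectory.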

The purpose of mechanics is to describe the time evolution of physical systems, and in classical mechanics this is done by describing the state of a physical system. The state of a physical system in classical mechanics is given by the knowledge of the positions AND velocities of the points at which the matter of the system is concentrated (we know where a classical particle is and how fast it is moving).

But what are the main tenets of classical mechanics?

1. The principle of continuity: continuous variation of physical quantities.
The state of an object at every instant of time is determined by describing its co-ordinates and velocities, which are continuous functions of time. This is what forms the basis of the concept of motion of objects along trajectories.

2. The principle of determinism:
If the state of an object (position and velocity) as well as all forces applied to it are known at some instant of time, we can precisely predict the state of the object (position AND velocity) at any subsequent instant. This is in agreement with the continuous nature of the change of physical quantities: for every instant of time we always have an answer to two questions: "what are the co-ordinates of an object?" and "how fast do they change?"

3. The principle of analysis: the analytical method of studying objects and phenomena.
Matter is treated as made up of different parts which, although they interact with each other, may be investigated individually. This means that, first, the object may be isolated from its environment and treated as an independent entity, and second, the object may be broken up, if necessary, into components and constituents whose analysis could lead to an understanding of the nature of the object. It means that classical physics reduces the question "what is an object like?" to "what is it made of?", i.e. in order to understand an apparatus we must "dismantle" it, at least in our imagination, into its constituents. The same is applicable to phenomena: in order to understand the idea behind a phenomenon, we have to express it as a function of time, i.e. to find out what follows what. The extent of this "destruction" of the object can be evaluated each time by taking into account the interactions between different parts and the relations between the time stages of a phenomenon. It may happen that the initially isolated object, or a part of it, changes considerably with time as a result of its interaction with the surroundings (or of interactions between parts of the object). However, since these changes are continuous, the individuality of the isolated object can be recovered over any period of time.

4. The principle of independence: the mutual independence of the object of observation and the measuring apparatus or observer (a consequence of the principle of analysis).

The observer, the instrument and the object of measurement can be considered separately, independently from each other. In classical mechanics, this interaction is simply ignored.

One should not forget the enormous range of applicability of classical mechanics: from the size of an atom, about 10^-10 m, to the size of a galaxy, about 10^20 m, and for speeds of up to 30,000 km/sec, or 10% of the speed of light.

Toward the end of the nineteenth century, the concepts of classical physics formed a seemingly perfect and magnificent edifice. The unification brought by the use of the minimum principle was, however, only a grouping of very different phenomena for treatment by the same mathematical formalism. Soon, contradictions began to appear:


The fundamental irreversibility imposed on the transformation of heat energy and work, summed up in the Second Law of Thermodynamics, did not agree with the laws of Newtonian mechanics. The conflict was resolved by Boltzmann's formulation of the entropy equation and the development of statistical mechanics.

The Michelson-Morley experiments did not ascertain the existence of the ether. This contradiction was resolved by the synthesis achieved by the Special and then the General Theory of Relativity.

The black-body experiments conducted by the German Bureau of Standards did not result in a single formula for the distribution of energy among the emitted wavelengths. This led Planck to formulate, in 1900, the discrete quantum structure of energy - taken by him initially as a mathematical trick without physical meaning.
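
The formula Planck arrived at, quoted here for reference in modern notation (not part of the original text), gives the spectral energy density of black-body radiation as

    u(\nu, T) = \frac{8\pi h \nu^3}{c^3} \cdot \frac{1}{e^{h\nu/kT} - 1}

with the quantum hypothesis entering through the assumption that energy is exchanged in discrete units E = h\nu.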

In 1905 Einstein formulated the light quanta hypothesis: light consists of particles - photons - each having an energy and moving at the velocity of light. In turn this led to the problem of the wave-particle duality of light.

The early planetary model of the atom proposed by Rutherford, with electrons located around a small, positively charged, heavy nucleus, proved unworkable. When Maxwell's theory of radiation was applied, the prediction was that the atom was not stable: continuously losing energy by radiation, the electrons would fall into the nucleus.

In 1913 Bohr produced a Quantum Theory of Line Spectra, and formulated a fundamental postulate - an atomic system can only exist permanently in a discontinuous series of "stationary states". In other words, electrons in an atom are allowed to be stable only in certain possible orbits, a constraint unknown in classical mechanics. Between 1913 and 1920 he formulated the Correspondence Principle: the laws of Quantum Physics should approach those of Classical Physics in the limit of large quantum numbers. The work during the period 1919 to 1925 that led to quantum mechanics may be described as systematic guessing guided by the Principle of Correspondence.
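
In symbols (a standard illustration, not in the original text): a transition between neighbouring stationary states emits radiation of frequency

    \nu = \frac{E_n - E_{n-1}}{h}

and for large quantum numbers n this frequency approaches the orbital frequency that classical electrodynamics would predict - which is the content of the Correspondence Principle.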

In 1924 de Broglie suggested that the duality should be extended not only to radiation, but also to micro-particles. He proposed to associate with every micro-particle corpuscular characteristics (energy and momentum) on one hand and wave characteristics (frequency and wavelength) on the other hand. De Broglie's ideas received confirmation in 1927 with the discovery of electron diffraction by Davisson and Germer.
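
De Broglie's association, in the standard notation:

    E = h\nu, \qquad p = \frac{h}{\lambda}

linking the corpuscular characteristics (energy E, momentum p) on the left to the wave characteristics (frequency \nu, wavelength \lambda) on the right.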

Contradictions between theory and the experimental data produced by the analysis of spectral lines led Heisenberg to reconsider the foundations of physics on a different basis. Physics should only use observables (an empiricist principle taken from Mach) - initially, in this case, the frequencies, intensities and polarisations of the spectral lines emitted by atoms. No mention should be made of classical orbits, since no experiment can show their existence. According to Heisenberg: "The real obstacle, which we suspected indeed at the time, but did not yet understand, was the fact that we were still talking of electron pathways, and were really compelled to do so; for electron tracks were certainly visible in the cloud-chamber, so there also had to be electron pathways in the interior of the atom" (Heisenberg, 1989). In 1925 he obtained for the first time the relative intensities of spectral lines in agreement with experiment.

The Uncertainty Principle, first articulated in 1927, is the remarkable consequence of this work. The position and velocity of a particle cannot both be measured exactly at the same time, even in theory. The very concept of an exact velocity and an exact position together has, in fact, no meaning in nature. In more detail, the product of the error committed in the determination of the position of a particle and the error committed in the determination of its momentum cannot in any case become smaller than a quantity of the order of Planck's constant. Since momentum is proportional to velocity, any increase in the precision of measurement of a particle's position comes at the cost of a reduction in the precision of measurement of its velocity. It was Heisenberg's decisive insight which led him to formulate the Uncertainty Principle as fundamental: "The answer was then, as you know, made possible by reversing the statement of the problem; the question was no longer to be - how do we represent the path of the electron in the cloud-chamber? Instead we had to ask - are there perhaps, in the observation of nature, only such experimental situations as can be represented in the mathematical formalism of quantum theory? Is it, in other words, correct, as Einstein once maintained against me, that theory first decides what can be observed? The answer could then be given in the form of the uncertainty relation." (Heisenberg, The Beginnings of Quantum Mechanics in Goettingen, in Heisenberg, 1989)
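
In its precise modern form (due to Kennard and Robertson rather than to Heisenberg's original paper), the relation reads

    \Delta x \, \Delta p \geq \frac{\hbar}{2}

where \Delta x and \Delta p are the standard deviations of position and momentum and \hbar = h/2\pi is the reduced Planck constant.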

Quantum mechanics had its origin in two simultaneous but distinct research programmes: wave mechanics and matrix mechanics. In 1926, Schroedinger gave a precise formulation of de Broglie's wave hypothesis. He set out to find a wave equation for matter that would give particle-like propagation when the wavelength is very small. Schroedinger soon realised that the wave function of a many-electron atom is not defined in ordinary physical three-dimensional space, as de Broglie would have it, but in a much more abstract configuration space described by the position coordinates of all the electrons (Hilbert space). The wave is therefore very different from a simple electromagnetic wave: it is a much more abstract kind of mathematical field, much more difficult to understand, since it is defined on a formal space and its values are complex. This feature was something yet unknown in physics. How to interpret the wave function, and what physical meaning should be given to it, became the immediate concern.
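
The equation Schroedinger arrived at, in its time-dependent form (standard notation, added here for reference):

    i\hbar \frac{\partial \psi}{\partial t} = \hat{H}\psi

where \psi is the wave function on configuration space and \hat{H} is the Hamiltonian operator. The factor i on the left is what forces \psi to take complex values - the feature discussed below.
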
Max Born's analysis of electron scattering from an obstacle led him to the conclusion that "the electron can be anywhere in a place where the wave function is different from zero", and that there is no way of saying exactly where it is, because this is a random effect. What one can derive from the wave function is the probability for an electron to be found in any small volume in the vicinity of a point x, the probability being given by the square of the wave function. Schroedinger originally supposed that the electron was spread out in space and that the density at a point x, y, z was given by the square of the wave function at that point. Born's proposal, namely that the square of the wave function gives the probability of finding the electron at x, y, z, is now the accepted interpretation. The distinction between the two proposals is important. If the square of the wave function is small at a particular position, then Schroedinger's interpretation implies that a small fraction of the electron will always be detected there - the square of the wave function is then a matter density. In Born's proposal, nothing will be detected there most of the time, but when something is observed, it will be the whole electron - the square of the wave function is then a probability density. The concept of the electron moving in a well-defined path around the nucleus is replaced in wave mechanics by clouds that describe the possible locations of the electron.

(Parenthetically, it is often a stumbling block that a physical quantity, the wave function, is represented by a complex quantity. It was to me. The reason is simply the analogy with the case of light waves: the type of expression which in other wave systems represents the energy density would in this case represent the particle density. Imaginary and complex numbers were introduced in the 18th century as a natural extension of the classes of numbers, from positive to negative, to rational (represented by fractions of integers), to irrational (which cannot be represented by fractions of integers), to imaginary, which represent square roots of negative numbers. The geometry of complex numbers is similar to the geometry of circular and harmonic oscillators, and so they were introduced by Euler in the 18th century into wave mechanics as a useful notation that permits generalisation and compressed expression of its equations. It is, in effect, this trend toward changes in notation permitting abstraction and powerful compression of equations into short expressions that has made possible the development of theoretical physics since Euler. The same kind of approach made possible the expression of the wave, taking into account the positions of many electrons, in a single equation, by using a configuration space - the Hilbert space, where each point is an n-dimensional vector - rather than a Euclidean 3-dimensional physical space. So the complex mathematical forms of either matrix or wave mechanics did not come out of the blue to Heisenberg, Schroedinger, Born or Dirac. Matrix calculus had existed since the 19th century; non-commutative algebra stems from Hamilton's introduction of quaternions around the mid-19th century; and the use of imaginary and complex numbers in classical wave mechanics, as I said before, started with Euler in the 18th century. The mathematical tools were there. For that matter, Courant and Hilbert's book on the methods of mathematical physics, published in 1924, was considered a godsend by physicists at that time - it included precisely the mathematical methods they needed for the new quantum theory.)
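
Born's proposal, in symbols (standard notation, added for reference): the probability of finding the whole electron in a small volume dV around the point x is

    P(x)\, dV = |\psi(x)|^2\, dV

where |\psi|^2 is the squared modulus of the complex wave function - what the text above calls "the square of the wave function".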

Since both theories, matrix and wave mechanics, predicted the same frequencies and intensities for the spectra of atoms, it appeared that they must be closely related. Schroedinger's demonstration of their mathematical equivalence and the subsequent axiomatic presentation by Dirac and von Neumann completed the formalism. But fundamental questions remain, and in particular: what are the states of nature that the mathematical apparatus of the quantum theory is supposed to represent? This is the notorious interpretation problem of quantum mechanics.

When did physicists become indeterminists? The majority around 1925-27, owing to the power of quantum mechanics to produce physically significant results. The first remarkable event in this direction was Pauli's determination of the spectral lines of hydrogen in 1926. Probability and indeterminism came along with the success. But probabilistic methods and ideas had already penetrated classical physics through the kinetic theory, Brownian motion, molecular statistics, radioactivity, and radiation in general, and they were familiar in the old quantum theory that preceded quantum mechanics in 1900-1925. The difference was expressed by von Neumann in his seminal book "Mathematical Foundations of Quantum Mechanics" of 1932. In classical physics, every probability judgement stems from the incompleteness of our knowledge. Probabilities, in this case, are non-physical, epistemic additions to the physical structure, a "luxury" according to von Neumann. Epistemic probability, then, is a matter of degree of ignorance, or of opinion. In contrast, quantum mechanics has probabilities which stem from the chancy nature of the microphysical world itself - they are fundamental.

The difference is also reflected in the manner in which we understand the mathematical formalism of each theory. In classical mechanics we start with some intuitive concepts based on our attempts to describe and classify our direct experience. We then refine these already given concepts and make them consistent by means of mathematical formalisms with which we must become familiar. In quantum mechanics, we have successfully developed the mathematical formalisms representing relations between observables, but we are unable to figure out what kind of states of nature the accepted mathematical structure could be taken to represent.

Bohr derived the correspondence principle from the observation that there is a close relation between the frequencies obtained from the classical theory of radiation and from his theory of line spectra when the quantum numbers are large. When Dirac formulated the fundamental equations of quantum mechanics in 1925-26, he relied on the correspondence principle and sought a solution for dealing with Heisenberg's non-commuting entities. The development of his equations stemmed from the fact that Hamiltonian dynamics can be formulated by means of the Poisson bracket algebra, which obeys the same formal laws as Heisenberg's non-commuting entities. From this point on, Dirac proceeded to formulate the fundamental laws of quantum mechanics by simply taking them over from classical mechanics in Poisson bracket formulation. The correspondence principle has been once and for all subsumed in this formulation. Dirac goes on to say: "The correspondence between quantum and classical theories lies not so much in the limiting agreement when quantum numbers are large, as in the fact that the mathematical operations of the two theories obey the same laws - the correspondence principle generalised".
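
Dirac's generalised correspondence can be written as follows (standard notation, not in the original text): the classical Poisson bracket is replaced by the commutator according to

    \{A, B\} \;\longrightarrow\; \frac{1}{i\hbar}[\hat{A}, \hat{B}] = \frac{1}{i\hbar}(\hat{A}\hat{B} - \hat{B}\hat{A})

so that the bracket operations of the two theories obey the same formal laws.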

There is no contradiction between classical and quantum mechanics. Classical mechanics is a particular case of quantum mechanics, the case where the Planck constant can be neglected. It can then be said that classical physics is a relative knowledge of reality, of which quantum physics offers a more profound knowledge. We did not discover that classical mechanics, with its conceptions of causality and determinism, is false - we discovered the limits within which it is valid. We admit that the classical corpuscular conception is inadequate at the atomic level - the electron cannot be assumed to be a particle of classical mechanics.

But there is a fundamental contradiction within the quantum theory itself. To use van Fraassen's description: "On the one hand, it is a truism that quantum physics describes an indeterministic world. On the other, the quantum theory of an isolated system describes its state as evolving deterministically. How can the two be reconciled?" (van Fraassen, 1991). There are four alternative proposals:

I. The Copenhagen Interpretation. One way is to deny the apparent indeterminism. In this case, measurement is never a process that can be characterised as the evolution of an isolated system: a measurement is an interaction incompletely described, by leaving out something or other. In its most radical form, this means that quantum theory was devised to describe only situations in which an observer (or at least the measuring environment) is involved, while leaving that part out of the equation. This is the Copenhagen Interpretation: "The right way to understand quantum mechanics is not as a true description of physical reality but rather as an instrument for predicting outcomes of laboratory experiments. There is no coherent interpretation of quantum mechanical formalism as describing an unobservable reality that is responsible for those experimental results. That reality is beyond our capabilities." Further, in Bohr's view, "the universe is basically an un-analysable whole, in which the notion of separateness of particle and environment is an abstraction that has no content, except as an approximation that may be applied within the limit of Heisenberg's principle" (Bohm and Hiley, in Barut et al, 1984).

The Copenhagen interpretation can be taken as the "majority" view within the research community, and its main theses are as follows (Omnes, 1994):

1. The theory is concerned with individual objects.

We must note here an essential difference between the quantum and classical formalisms. Within the framework of classical physics the laws describing the behaviour of large numbers of objects are of a statistical nature, while the laws relating to the behaviour of an individual object are dynamic. By considering the element of chance in the behaviour of a single object, quantum mechanics places itself in a special position - that of a statistical theory of an individual object.

2. Probabilities are primary.

The probabilities of quantum theory do not reflect some ignorance of the observer or the theorist, but are a characteristic feature of nature itself, or, if one prefers, of the relation between reality and its mathematical description by the theory. It is impossible to predict anything except probabilities, as far as circumstantial experiments are concerned; one can say that the theory is complete when it predicts these probabilities for all conceivable measurements.

3. The frontier separating the observed object and the means of observation is left to the choice of the observer.

According to Heisenberg, the physical world can be split in two parts, the observed object and the observing system. A frontier separates the modes of description that one should use for each of them - quantum and classical respectively. However, every choice of the level of the frontier will correspond to a different error, which decreases as one considers larger and larger observing systems. Bohr's point of view is more radical. According to him, quantum mechanics only refers to the classical properties of a macroscopic system and is expressed by statements relevant to that domain. It is impossible to reach by induction anything explicit concerning an atomic object from the knowledge of some classical data. There is no frontier as Heisenberg would have it; an experiment should be considered as an indivisible whole - atomic objects have no specific properties of their own and only factual phenomena exist. These concern the totality of the atomic object and the macroscopic apparatus altogether. The concept of "phenomenon" should be strictly reserved for what is directly perceived and should always be expressed by a statement in the language of classical physics.

4. The observational means must be described in terms of classical physics.

An experiment is an action, if only to set up the apparatus, and this can only be expressed in ordinary language - for instance, a user's manual - not by mathematical symbols.

5. The act of observation is irreversible and it creates a document.

For an experimental datum to be used some time after its occurrence, it must be recorded in some document. It may be written, shown on a dial, kept in a computer memory, or held in the brain of the observer. Whatever is done, the production of such a document is essentially an irreversible process.

6. The quantum jump taking place when a measurement is made is a transition from potentiality to actuality.

A measurement involves an action upon the measured object, and it is difficult to say how strongly and how deeply it perturbs the object. It is assumed in any case to provoke a reduction of the wave packet, which is in no manner a consequence of quantum dynamics but a jump that Bohr considered to be a new physical law. The jump is a random effect. Its probability follows from the theory, but its mechanism or its actuality is supposed to exceed the reach of theory.

7. Complementary properties cannot be observed simultaneously.

Bohr defined complementarity as a mode of thought admitting various concepts that are mutually exclusive while being nevertheless necessary for a full understanding of all experiments. The best example is provided by radiation, which can be considered as a wave or as a collection of particles according to the experiment. Bohr considered the choice of one mode of thought rather than another to be completely dictated by the experimental environment. Radiation does not have a unique conceptual framework: one or the other must be used in each case - waves when using a receiving antenna, particles when using a photomultiplier. These concepts have only a cognitive value in definite conditions.

8. Only the results of measurements can be taken as true.

Complementarity and truth are intimately related notions. Heisenberg stressed the idea that as long as one is restricted to talking only about phenomena, they become the only elements of truth at one's disposal. One cannot say, for instance, that a particle has a straight-line trajectory when it is not detected and the track is not seen, as happens in a vacuum. There is no meaning in saying that a photon goes along a definite arm of an interferometer as long as this is not confirmed by a phenomenon signalling it. Moreover, to say that something is "happening" or "occurring" has no meaning, except when this is a way of describing a phenomenon. One cannot legitimately say anything true about an atom between the time it is prepared and the time it is measured. An often-quoted example concerns a book in a closed room. After closing the door, one knows that the book is still in the same place, because this is a classical statement involving certainty and it is legitimate in the classical world (relying, by the way, on determinism). Nothing of this sort can apply to an atom except at the few times when it is directly observed. Is it still there or not? Heisenberg considered the question meaningless.

9. Pure quantum states are objective but not real.
Now here things get a little more difficult. The notion of objectivity is usually defined according to Kant: something is said to be objective when it exists outside of the mind, as an object independent of the mind. To elaborate: remember that in classical mechanics we had positions and velocities as basic notions, and Newton's laws told us how positions and velocities evolve with time. We also have probabilistic theories, in which the basic objects are probabilities, and we can set up laws dictating how these probabilities evolve with time. Quantum mechanics has basic objects called amplitudes - or probability amplitudes - which are complex rather than real numbers (remember the square of the wave function). The mathematical part of quantum mechanics dictates how the amplitudes evolve with time, the evolution equation being the Schroedinger equation mentioned earlier. Note again that the evolution of the amplitudes is deterministic. The mathematical part of quantum theory also contains objects called observables (technically, they are linear operators). Finally, given an observable - call it A - and a set of amplitudes, one can compute a number called the mean value of A, which we shall denote by <A>. Quantum mechanics tells us how to compute the time evolution of amplitudes, and then how to use these amplitudes to obtain the mean value <A> of an observable A.

How do you connect these mathematical concepts with physical reality? Let's be specific and suppose that you are a particle physicist: you like to accelerate particles to high energies, aim them at a target, and see if new patterns of interaction emerge. You have surrounded your target with a number of detectors, I, II, III, and so on, that will click when a particle of the right kind hits them at the right time. The "right kind" means the right charge, the right energy, etc. The "right time" means that, for instance, detector II is activated only if detector I has clicked, and then only for a definite interval of time. You decide to call event A the situation in which I and II click and III does not. (Event A is the signature of a particular kind of collision that you expect to see in your experiment.) You now consult your manuals of quantum mechanics, and they will tell you which observable corresponds to event A - events are thus viewed as a special kind of observable. The manuals will also tell you how to compute the amplitudes relevant to the experiment. Then you will be able to compute the mean value <A>. A fundamental tenet of quantum theory is that <A> is the probability that you will see event A. Specifically, if you repeat your experiment a large number of times, the proportion of cases in which all detectors click as required is <A>. This is the connection between the mathematics of quantum theory and operationally defined physical reality. Incidentally, one should say that not all chapters of the manuals of quantum mechanics have been written - we do not know many of the details of the interactions between particles, and this is why experiments are still being performed. So let me repeat the main features: a physical process is taking place - say a collision between particles; the totality of measurements represents an event; and quantum theory allows us to compute its probability. In this way we obtain a description of the world that is profoundly different from that of classical mechanics, but completely consistent.
If you want to say that quantum mechanics is deterministic, it is: the Schroedinger equation predicts unambiguously the time evolution of probability amplitudes. If you want to say that quantum mechanics is probabilistic, you may: the only predictions are about probabilities (the probabilities are occasionally 0 or 1, and then you have certainty, but usually this is not the case). But while quantum mechanics is probabilistic, it is not a probability theory in the usual sense. Specifically, when event "A" and event "B" are defined in an ordinary probabilistic theory, an event "A and B" is also defined (with the intuitive meaning that "A and B" occurs if "A" occurs and "B" occurs). In quantum mechanics, "A and B" is usually not defined - there is no entry for "A and B" in the manuals of quantum mechanics. Physically, what happens is that (in general) you cannot pick detectors to measure "A" and "B" simultaneously, i.e. check at the same time if "A" occurs and "B" occurs. You can try to measure first "A" then "B", or first "B" then "A", but you get different answers. This is often expressed by saying that the first measurement perturbs the second. This intuitive interpretation is not really wrong, but it is somewhat misleading: it suggests that the event "A and B" in fact makes sense, but that we are too clumsy to measure it. The mathematics of quantum theory is, however, unambiguous: "A and B" does not make sense. This has to do with what we talked about at the beginning - the observables A and B do not commute. So, is quantum mechanics objective? The answer is positive. When it asserts a property or reaches a conclusion, the theory only makes use of established facts. It has no more difficulty with objectivity than classical physics, though its background is less obvious. But, according to the Copenhagen interpretation, the right way to understand quantum mechanics is not as a true description of physical reality but rather as an instrument for predicting outcomes of laboratory experiments.
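
A minimal numerical sketch of this non-commutativity (my illustration in Python with NumPy, not part of the original text; it takes the standard Pauli spin matrices as the observables A and B):

    import numpy as np

    # Spin observables along x and along z (the Pauli matrices sigma_x and sigma_z)
    A = np.array([[0, 1], [1, 0]], dtype=complex)
    B = np.array([[1, 0], [0, -1]], dtype=complex)

    # "First A then B" and "first B then A" correspond to the products A@B and B@A;
    # their difference is the commutator [A, B] = AB - BA
    commutator = A @ B - B @ A

    print(commutator)                   # [[0, -2], [2, 0]] (complex): not the zero matrix
    print(np.allclose(commutator, 0))   # False: A and B do not commute

Since the commutator is not zero, there is no joint observable "A and B": the two spin components cannot be measured simultaneously, which is exactly the situation described above.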

II. de Broglie-Bohm's "Hidden Variable" or "Causal" Interpretation (Bohmian Mechanics; Bohm, 1951)

Bohm's basic criticism of the Copenhagen interpretation starts from the conviction that "there is a uniquely defined reality which can be grasped in thought, and is yet independent of thought. Without considering this reality, science is reduced to a set of formulas and recipes for predicting experiments" (Bohm and Hiley, in Barut et al, 1984). In his theory, the quantum world consists of particles that always have determinate positions, and a wave function interpreted as a "guiding field" - a kind of radar - that propagates in the configuration space of the particles according to the Schroedinger equation of motion. The particles change their relative positions over time according to a deterministic equation of motion which reflects the evolution of the wave function in configuration space. According to J. S. Bell, "No one can understand this theory unless he is willing to think of the wave function as a real objective field rather than just a probability amplitude. Even though it propagates not in 3-space but in 3N-space" (Bell, 1964). The field is just as "real" and "objective" as, say, the fields of Maxwell's theory - although its action on the particle is rather original. So the real change in the Bohmian quantum world is the change in the wave function as a field in configuration space and the change in the positions of the particles. The only real properties are the position properties of the particles, and these positions change as the particles are pushed and pulled in various ways by the evolving field. However, J. S. Bell showed (Bell, 1964) that no deterministic hidden variable theory can reproduce the predictions of quantum mechanics for certain composite systems without violating the principle of locality; in other words, no purely deterministic theory can explain the behaviour of subatomic particles unless it invokes some kind of "non-local" instantaneous connection. Bohm's interpretation requires superluminal connections: through the guiding wave, the motion of each particle depends instantaneously on the positions of distant particles. The de Broglie-Bohm mechanics has been very much a minority position, albeit with strong supporters. (Some years ago, James Cushing, a leading philosopher of science, said: "How do so many people know that the de Broglie-Bohm theory does not work, when so few people have studied it?") Here is what Bohm himself said about this: "For though perhaps unsatisfactory in many respects, they made possible, as explained, certain important insights into the meaning of quantum theory. In addition, they helped keep alive an interest in finding a deeper understanding of the quantum theory. I feel that a correct approach might have been to encourage such work as a purely provisional approach, but recognising that it was not likely in itself to be a fundamental theory, without further radically new ideas. The result of not doing this sort of thing was that, for the most part, fundamental physics was reduced to its present state of relying almost exclusively on formulae and recipes constituting algorithms for the prediction of experimental results, with only the vaguest notions of what these algorithms might mean physically" (Bohm and Hiley: de Broglie Pilot Wave Theory and the Further Development, in Barut, van der Merwe and Vigier, 1984).
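
The dynamics just described can be written compactly (the usual guidance equation for a single particle, added here for reference): writing the wave function in polar form \psi = R e^{iS/\hbar}, the particle's velocity is

    \frac{dQ}{dt} = \frac{\nabla S}{m} = \frac{\hbar}{m}\, \mathrm{Im}\frac{\nabla \psi}{\psi}

evaluated at the actual position Q, so that, given \psi and an initial position, the trajectory is fixed deterministically.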


III. Everett's "Many Worlds" Interpretation

Even more radical, but in a different direction, is Everett's "Many Worlds" Interpretation: indeterminism is an illusion, and it disappears if we describe all the worlds there are besides our own. "In 1957, in his Princeton doctoral dissertation, Hugh Everett III proposed a new interpretation of quantum mechanics that denies the existence of a separate classical realm and asserts that it makes sense to talk about the state vector for the whole universe. The state vector never collapses, and hence reality as a whole is rigorously deterministic. This reality, which is described jointly by the dynamical variables and the state vector, is not the reality we customarily think of, but is a reality composed of many worlds. By virtue of the temporal development of the dynamical variables, the state vector decomposes naturally into orthogonal vectors, reflecting a continual splitting of the universe into a multitude of mutually unobservable but equally real worlds, in each of which every good measurement has yielded a definite result and in most of which the statistical quantum laws hold. Looked at in another way, Everett's interpretation calls for a return to naive realism and the old-fashioned idea that there can be a correspondence between formalism and reality. Because physicists have become more sophisticated than this, and above all because the implications of this approach appear to them so bizarre, few have taken Everett seriously. Nevertheless his basic premises provide a stimulating framework for the discussion of the quantum theory of measurement" (De Witt and Graham, 1973).
IV. The Modal Interpretation

The fourth alternative, the Modal Interpretation (van Fraassen, 1991), is to deny neither the determinism of the total system's evolution nor the indeterminism of outcomes, but to say that the two are different aspects of the total situation. The quantum state delimits what can and cannot occur, and how likely it is - it delimits possibility, impossibility, and probability of occurrence - but it does not say what actually occurs. In other words, we can deny the identification of value attributions to observables with value attributions to states: the state can develop deterministically, with only statistical constraints on changes in the values of the observables. This may be a fruitful path to the construction of quantum logic.
In choosing how to interpret the quantum realm, we must either accept inherent uncertainty or believe that locality is an illusion.

Ultimately, however, it is the process of transition from the sub-atomic to the macroscopic world that is unknown and remains the fundamental problem. One approach to dealing with it is the application of environment-induced decoherence, the process that leads from superposition to mixture, currently a substantial research programme. In more specific terms, what remains unknown is the process that leads from mixture to definite outcome - the notorious measurement problem of quantum mechanics.

According to Zurek, "the key point of the decoherence approach is that there is a basic difference between the predictions of quantum theory for quantum systems which are closed (isolated) and open (interacting with their environments). In the case of a closed system, the Schroedinger equation and the superposition principle apply literally. By contrast, for an open system the superposition principle is not valid - there the relevant physics is different. The paramount implications of the interaction with the environment for the transition from quantum to classical were not appreciated until recently. Thus, decoherence (between the quantum system and the classical environment) results in a negative selection process which dynamically eliminates non-classical states" (Zurek, 1993). To describe this in physical terms: "in an ideal model of measurement, the coupling between a macroscopic apparatus ("meter") and a microscopic system ("atom") results in their entanglement and produces a quantum superposition of "meter + atom" states. Such a superposition is, however, never observed. Schroedinger vividly illustrated the problem, replacing the meter by a "cat" and considering the dramatic superposition of dead and alive animal states. Although such a striking image can only be a metaphor, superpositions involving "meter states" are often called "Schroedinger cats". Following von Neumann, it is postulated that an irreversible reduction process takes the quantum superposition into a statistical mixture in a "preferred" basis, corresponding to the eigenvalues of the observable measured by the meter. From then on, the information content of the system can be described classically. The nature of this reduction has been much debated, with recent theories stressing the role of quantum decoherence." (Brune et al, 1996)
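
Schematically (a textbook idealisation, not a quotation from Zurek or Brune et al): for a system prepared in a superposition \alpha|1\rangle + \beta|2\rangle and coupled to an environment, the reduced density matrix evolves as

    \rho(t) = \begin{pmatrix} |\alpha|^2 & \alpha\beta^* e^{-t/\tau_D} \\ \alpha^*\beta\, e^{-t/\tau_D} & |\beta|^2 \end{pmatrix}

where the off-diagonal (interference) terms decay on a decoherence time \tau_D that is extraordinarily short for macroscopic systems. This is the passage from superposition to mixture; how a single definite outcome is then selected from the mixture is precisely the measurement problem mentioned above.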

What is remarkable in comparing the four proposals is their divergence - there is no dominant narrative at this juncture.


B. SOME PHILOSOPHICAL IMPLICATIONS

The Tower and the Sphere: two metaphors in the methodology of science

Since the formulation of quantum mechanics, more than seventy-five years ago, physicists have attempted explanations, often in the form of thought experiments, and it is only recently that technology has permitted their actual physical conduct. A possible explanation depends on the success of such experimental programmes, but in the interim we are still in the realm of speculation about the bridging process.

This degree of complexity, and the large assumptions about transition, may lead to David Bohm's conclusion: "Quantum theory presupposes a classical level and the correctness of classical concepts at that level. Classical definiteness and quantum potentialities complement each other in describing the system as a whole." A new logic should start with the complementarity of the two formalisms, classical and quantum, as its universal principle.

Science may be unable to offer a Theory of Everything, but it does make a fundamental contribution to our view of the world. The concept of harmony informed ancient, medieval and renaissance philosophy, and the rationalism of Newtonian physics greatly influenced the Enlightenment. The concept of uncertainty, in turn, has been a key to the philosophy of the twentieth century, often with strange results, since in the absence of a dominant narrative we do not have a completely formulated epistemology associated with quantum mechanics, as Kant's is with Newtonian physics. The concept of complementarity of the two formalisms, together with a criterion of transition based on an increasingly predictable evolution of quantum states, may lead us to an acceptable formulation. But there is also a question of methodology which arises in this context: is it not appropriate at this juncture for physicists to turn this question on its head, as Heisenberg turned his, and to ask not how to reduce the formalism of classical mechanics to the more fundamental formalism of quantum theory, but rather how to link the two formalisms through a process of transition? In more general terms, would we gain more insight by changing from a foundationism based on reduction to a network-based relationism?

First of all, I would like to replace the notion of "fundamental level" with that of a "stable link" between the random processes that take place at the sub-atomic level, below 10^-10 m, and the determinate processes at the macro level above it. In other words, to say that on one hand, at the sub-atomic level, interactions are either random but local or determinate but non-local (the selection is yet to be decided), and on the other, at the macro level, interactions are determinate and local. The two kinds of processes that characterise the two domains are linked by stable relations, and the stability of that link is what can be called fundamental, since we know that it does exist. How these stable relations evolved, however, is at present unknown.

The first approach - foundationism - is to arrange science hierarchically, with the fundamental theory at the base and with theories at the higher levels explained by those below. The implied reductionism is then considered as another basic hypothesis. Metaphorically, this approach can be represented by a tower.

The second approach is to look at science as a network of theories which implicate and partially explain each other, but without a privileged foundational level. Metaphorically, this approach can be represented by a sphere, with groups of theories and practices linked to each other. But what can be gained by a move from foundationism to relationism? First, it allows the display of probable links between theories without a commitment to the reduction of one to another until such time as the bridging laws become entirely known. When one theory is reducible to another, the two are collapsed into one. Second, it charts the process of theory formation, the evolution of our knowledge over time, and this map is important to continuing research.

The understanding of the process of transition from the sub-atomic to the macroscopic world - in other words, the understanding of how the "stable link" is formed - should constitute the focus of research. What, in the first instance, are the grounds for the existence of this link? Is the evolution toward increasingly determinate physical states contingent on a process of self-organisation emerging at a level above the Planck scale?

Victor Suchar

BIBLIOGRAPHY

Achinstein, P. 1991: Particles and Waves. OUP, 1991

Albert, D.Z. 1994: Quantum Mechanics and Experience. Harvard, 1994

Albert, D.Z. 2000: Time and Chance. Harvard, 2000

Auyang, S.Y. 1995: How is Quantum Field Theory Possible? OUP, 1995

Bacciagaluppi, G. & Hemmo, M. 1996: Modal Interpretations, Decoherence, Measurement, in Studies in History and Philosophy of Modern Physics. Elsevier Science, Sept. 1996

Barbour, J. 1989: Absolute and Relative Motion. Vol I: The Discovery of Dynamics. CUP 1989.

Barbour, J. & Pfister, H. (Ed.) 1995.: Mach's Principle. Birkhauser, 1995

Ben-Menahem, Y. & Pitowsky, I. (Ed.) 2001: Foundations of Statistical Physics. Studies in History and Philosophy of Modern Physics, Special Number. Elsevier Science, Vol. 32B, No. 4, 2001

Barut, A.O., van der Merwe, A. & Vigier, J.-P. (Editors) 1984: The Quest Continues: Studies and Essays in Honour of Louis de Broglie, Paul Dirac and Eugene Wigner. CUP, 1984

Bitbol, M. & Darrigol, O. (Editors) 1989: Schroedinger, Philosophy and the Birth of Quantum Mechanics. Frontieres/Blanchard, 1989

Blokhintsev, D.I. 1968: The Philosophy of Quantum Mechanics. Reidel, 1968

Blokhintsev, D. I. 1973: Space and Time in the Microworld. Reidel, 1973

Bell, J.S. 1994: Speakable and Unspeakable in Quantum Mechanics. CUP,1994

Beller, M. 1999: Quantum Dialogue. The Making of a Revolution. Chicago, 1999

Bohm, D. & Hiley, B.J. 1993: The Undivided Universe. Routledge 1993

Bohm, D. 1984: Causality and Chance in Modern Physics. Routledge, 1984

Bohm, D. 1980: Wholeness and the Implicate Order. Routledge, 1980

Bohm, D. 1951: Quantum Theory. Prentice Hall, 1951

Bohr, N. 1987: Philosophical Writings. Ox Bow, 1987

Born, M. 1949: Natural Philosophy of Cause and Chance. OUP, 1949

Boltzmann, L. 1974: Theoretical Physics and Philosophical Problems. Reidel 1974

Born, M. 1969: Physics in my generation. Springer, 1969

Born, M. 1971: The Born- Einstein letters. Macmillan 1971

Brody, T. 1994: The Philosophy Behind Physics. Springer, 1994

Broglie, L. de 1930: Introduction à la mécanique ondulatoire. Hermann, 1930

Broglie, L. de 1943: De la mécanique ondulatoire à la théorie du noyau. Hermann, 1943-46

Brune, M., et al 1996: Observing the Progressive Decoherence of the "Meter" in a Quantum Measurement. Physical Review Letters 77 No. 24 (Dec. 1996 )

Bub, Jeffrey 1997: Interpreting the Quantum World, Cambridge, CUP, 1997

Butterfield, J. & Pagonis, C.(Ed.) 1999: From Physics to Philosophy. CUP 1999

Callender, C. & Huggett, N. (Ed.) 2001: Physics Meets Philosophy at the Planck Scale. CUP, 2001

Cartwright, N. 1994: Nature's Capacities and their Measurement. OUP, 1994

Cartwright, N. 1999: The Dappled World. A Study of the Boundaries of Science. CUP 1999

Cassidy, D.C. 1992: Uncertainty. The life and science of W. Heisenberg. Freeman, 1992

Cao, T. Y. 1997: Conceptual Developments of 20th Century Field Theories. CUP, 1997

Cao, T. Y. (Ed.) 1999: Conceptual Foundations of Quantum Field Theory. CUP, 1999

Cohen, R. S.&Seeger, R. J. 1970: Ernst Mach Physicist and Philosopher. Reidel, 1970

Cushing, J.T. & McMullin, E. (Ed.) 1989: Philosophical Consequences of Quantum Theory. Reflections on Bell's Theorem. Notre Dame, 1989

Cushing, J.T. 1994: Quantum Mechanics. Historical Contingency and the Copenhagen Hegemony. Chicago, 1994

D'Espagnat, B. 1990: Reality and the Physicist. CUP, 1990

De Witt, B. & Graham, N. (Ed.) 1973: The Many Worlds Interpretation of Quantum Mechanics. Princeton, 1973.

Dickson, W. M. 1998: Quantum Chance and Non- locality. CUP 1998

Dirac, P.A.M. 1935: The Principles of Quantum Mechanics. OUP, 1935 (2nd. Ed.)

Dirac, P.A.M. 1971: The Development of Quantum Theory. Gordon & Beach, 1971

Einstein, A. & Infeld, L. 1938: The Evolution of Physics. CUP, 1938

Ellis, J. & Amati, D. (Ed.) 2000: Quantum Reflections. CUP,2000

Enz, C.P. 2002: No Time to be Brief. A Scientific Biography of Wolfgang Pauli. OUP 2002

Fine, A. 1986: The Shaky Game. Einstein Realism and the Quantum Theory. Chicago, 1986

Feynman, R. P. & Weinberg, S. 1987: Elementary Particles and the Laws of Physics. The 1986 Dirac Memorial Lectures. CUP, 1987

Feynman, R. P. 1992: The Character of Physical Law. Penguin, 1992

Galison, P. 1987: How Experiments End. Chicago, 1987

Galison, P. 1997: Image & Logic. A Material Culture of Microphysics. Chicago, 1997

Gibbins, P. 1994: Particles and Paradoxes. The limits of quantum logic. CUP, 1994

Giere, R.N. 1999: Science without Laws. Chicago, 1999

Gillies, D. 1993: Philosophy of Science in the Twentieth Century. Four Central Themes. Blackwell, 1993

Giulini, D., Joos, E., Kiefer, C., Kupsch, J., Stamatescu, I.-O. & Zeh, H.D. 1996: Decoherence and the Appearance of a Classical World in Quantum Theory. Springer, 1996

Greenstein, G. & Zajonc, A. G. 1997: The Quantum Challenge. Modern Research in the Foundation of Quantum Mechanics. Jones & Bartlett, 1997

Haroche, S., Brune, M. & Raimond, J. M. 1997: Experiments with Single Atoms in a Cavity: Entanglement, Schroedinger Cats and Decoherence. Philosophical Transactions of the Royal Society, Apr. 1997

Healey, R. 1989: The Philosophy of Quantum Mechanics, CUP, 1989

Heisenberg, W. 1971: Physics and Beyond. Harper & Row, 1971

Heisenberg, W. N.D.: The Physical Principles of the Quantum Theory. Dover, N.D.

Heisenberg, W. 1958: The Physicists Conception of Nature. Hutchinson, 1958

Heisenberg, W. 1989: Encounters with Einstein and other Essays on People, Places and Particles, Princeton, 1989

Hertz, H 1899: The Principles of Mechanics. Macmillan, 1899

Hiley, B.J. & Peat, F.D. 1987: Quantum Implications. Essays in honour of David Bohm. Routledge, 1987

Holland, P.R. 1993: The Quantum Theory of Motion. An account of the Broglie- Bohm causal interpretation of quantum mechanics. CUP 1993

Hughes, R. 1989: The Structure and Interpretation of Quantum Mechanics. Harvard, 1989

Isham, C.J. 1995: Lectures on Quantum Theory. Imperial Coll., 1995

Jammer, M. 1974: The Philosophy of Quantum Mechanics. Wiley, 1974

Jammer, M. 1989: The Conceptual Development of Quantum Mechanics. AIP, 1989

Jeans, J.H. 1924: Report on Radiation and the Quantum Theory. Physical Soc., 1924

Kilmister, C. W. (Editor) 1989: Schroedinger. Centenary of a Polymath. CUP, 1989

Kragh, H.1992: Dirac. A Scientific Biography. CUP, 1992

Krips, H. 1987: The Metaphysics of Quantum Mechanics. OUP, 1987

Kuhn, T. 1978: Black Body Theory and Quantum Discontinuity 1894- 1912. OUP 1978

Kursunoglu, B. & Wigner, E.P. 1987: P.A.M. Dirac. CUP, 1987

Lindemann, F.A. 1932: The Physical Significance of Quantum Theory. OUP, 1932

Laurikainen, K. U. 1988: Beyond the Atom. The Philosophical Thought of Wolfgang Pauli. Springer, 1988

Mach, E. 1919: The Science of Mechanics. Open Court, 1919

Margenau, H. 1950: The Nature of Physical Reality. A Philosophy of Modern Physics. McGraw-Hill, 1950

Mehra, J. & Rechenberg, H. 1982:The Historical Development of Quantum Mechanics. Springer, 1982- 1987

Moore, W. 1989: Schroedinger : Life and Thought . CUP, 1989

Morrison, M. 2000: Unifying Scientific Theories. Physical Concepts and Mathematical Structures. Cambridge, 2000

Neurath, O., Carnap, R. & Morris, C. 1955: Foundations of the Unity of Science. Chicago, 1955

Omnes, R. 1994: The Interpretation of Quantum Mechanics. Princeton, 1994

Omnes, R. 1999: Understanding Quantum Mechanics. Princeton, 1999

Omnes, R. 1999: Quantum Philosophy. Princeton, 1999

Pais, A. 1991: Niels Bohr and his times. Oxford, 1991

Pais, A. 1982: Subtle is the Lord. The Science and Life of A. Einstein. OUP, 1982

Pauli, W. 1994: Writings on Physics and Philosophy. Springer, 1994

Planck, M. 1925: A Survey of Physics. Methuen, 1925

Redhead, M. 1987: Incompleteness, Non- locality, Realism. Prolegomena to the Philosophy of Quantum Mechanics. OUP 1987

Redhead, M. 1995: From Physics to Metaphysics. CUP 1995

Ridderbos, K. 2002: The Coarse Graining Approach to Statistical Mechanics, in Studies in History and Philosophy of Modern Physics. Elsevier Science, Vol. 33B, No. 1, March 2002

Salam, A. 1990: Unification of Fundamental forces. CUP, 1990

Scheibe, E. 1973: The Logical Analysis of Quantum Mechanics. Pergamon, 1973

Schilpp, P.A. 1951: Albert Einstein: Philosopher-Scientist. Open Court, 1951

Schroedinger, E. 1928: Collected Papers in Wave Mechanics. Blackie, 1928

Schroedinger, E. 1950: Space- Time Structure. CUP, 1950

Schroedinger, E.: The Interpretation of Quantum Mechanics. Ox Bow, 1987

Sklar, L. 1985: Philosophy and Spacetime Physics. California, 1985

Sklar, L 1995: Philosophy of Physics. OUP, 1995

Sklar. L. 2000: Theory and Truth. OUP 2000

Stapp, H.P. 1993: Mind, Matter and Quantum Mechanics. Springer, 1993

Taylor, J. G. (Editor) 1987: Tributes to Paul Dirac. Inst. of Physics/Hilger, 1987

Torretti, R. 1999: The Philosophy of Physics. CUP, 1999

Treiman, S. 1999: The Odd Quantum. Princeton, 1999

Van der Waerden, B.L. 1968: Sources of Quantum Mechanics. Dover, 1968

Van Fraassen, B.C. 1991: Quantum Mechanics: An Empiricist View. OUP, 1991

Von Neumann, J. 1955: Mathematical Foundations of Quantum Mechanics. Princeton, 1955

Weyl, H. 1931: The Theory of Groups and the Quantum Mechanics. Methuen, 1931

Wheaton, B.R. 1992: The Tiger and the Shark. Empirical roots of wave particle dualism. CUP, 1992

Wheeler, J.A. & Zurek, W. H. (Ed.) 1983: Quantum Theory and Measurement. Princeton, 1983

Wigner, E. 1979: Symmetries and Reflections. Ox Bow. 1979

Zurek, W. H. and J.P. Paz 1993: Decoherence, the Quantum and the Classical in Symposium on the Foundations of Modern Physics, World Scientific, 1993