Wave-particle duality: the conflict between 1st quantization (one wavefunction per onshell particle) and 2nd quantization (multiple wavefunctions per particle, with a sum over histories)
The Schroedinger 1st quantization wavefunction amplitude is exp(iS), where S is the action of the path, measured in units of h-bar. This is a simple solution to Schroedinger’s wave equation, which has a single wavefunction.
Feynman’s genius was explicitly updating this 1st quantization wavefunction to the multipath interference of 2nd quantization, where the uncertainty principle is no longer a mystery but a simple result of multipath (multiple wavefunctions) interference.
2nd quantization quantizes the field, and the field quanta then provide the stochastic or random interactions with charges that account for the non-classical behaviour of particles whose path actions are not large compared to h-bar.
1st quantization is based on mystery, and merely re-asserts Planck’s E = hf in the form of the uncertainty principle Et = h (remember f = 1/t), allowing no mechanism. Dirac proved the necessity for 2nd quantization in 1927, on the basis that the Hamiltonian energy as written in Schroedinger’s 1st quantization wave equation makes Schroedinger’s single-wavefunction equation (basically the whole of undergraduate quantum mechanics) non-relativistic and thus wrong. Dirac’s replacement, a relativistic spinor Hamiltonian, has negative-energy states, predicting pair production in the vacuum, so the vacuum’s field is quantized. Feynman recognised that the virtual particles of the quantized force field interact with charges in a manner partly analogous to real radiation, at least from the perspective of imparting force by interaction, i.e. virtual photon scattering by charges in Feynman diagrams. Summing all possible virtual photon interactions with a charge, each path of action S weighted by the amplitude exp(iS) (with S in units of h-bar), gives the path integral.
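The size of the non-relativistic error Dirac objected to is easy to see numerically. This is a minimal sketch, working in units where mc² = 1 with illustrative momentum values, comparing the Schroedinger kinetic term p²/(2m) with the relativistic kinetic energy:

```python
import numpy as np

# Compare the Schroedinger kinetic term p^2/(2m) with the relativistic
# kinetic energy sqrt((pc)^2 + (mc^2)^2) - mc^2. Units: mc^2 = 1, so pc
# is momentum in units of mc. Values chosen purely for illustration.
mc2 = 1.0
pc = np.array([0.01, 0.1, 0.5, 1.0, 2.0])

E_nonrel = pc**2 / (2 * mc2)
E_rel = np.sqrt(pc**2 + mc2**2) - mc2

# The fractional error of the non-relativistic formula grows with momentum.
for p, en, er in zip(pc, E_nonrel, E_rel):
    print(f"pc = {p:4.2f}  nonrel = {en:.5f}  rel = {er:.5f}  error = {(en - er) / er:.1%}")
```

The error is negligible at atomic speeds but of order tens of percent once the momentum approaches mc, which is why the single-wavefunction equation cannot survive into the relativistic regime.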
However, we live in an Orwellian “doublethink” world when it comes to 1st and 2nd quantization. For a pure mathematician, no equation that has any applicability is “wrong”. Take epicycles, the incorrect earth centred universe of Ptolemy’s highly popular Almagest (published 150 AD, some 400 years after Aristarchus of Samos had correctly postulated the solar system with spinning earth in 250 BC!). If you are a mathematician, it really doesn’t matter whether the sun orbits the earth or vice versa, so long as the equations are interesting. A mathematician will happily try to find dualities. Dr Lubos Motl, I recall, suggested to me that Ptolemy’s epicycle equations for planetary motions were not wrong in the sense that they were useful for predictions. (However, although Ptolemy could predict the positions of planets as seen in the sky, where only two degrees of freedom – latitude and longitude – are the variables, Ptolemy’s epicycle model is not a mathematical duality of the correct 3-dimensional motion of the planets, since it fails to properly model the variations in the distances of the planets from the earth.)
The point is, Maxwell, unifier of electricity and magnetism, tried to find a mechanism for the equations of electromagnetism using a complex model of space, filled with moving parts. When that model failed, more “pure” mathematicians and philosophers like Mach combined forces in a mathematical revolution in physics which, like all revolutions, is sustained by Orwellian “doublethink”. While it is the basis of the Standard Model that Feynman’s 2nd quantization gives fundamental particles uncertainty by random interactions with quantized (non-classical) field quanta, no efforts are made to deal with this by radiation transport Monte Carlo simulations of the vacuum dynamics. Instead, the physicist hides behind the mathematics of the path integral, just as medieval Ptolemaic believers hid behind the trigonometry of epicycles, for obfuscation. In addition, 1st quantization (single wavefunction per particle!) non-relativistic quantum mechanics continues to be taught because it is easier for students to apply to atomic energy levels than 2nd quantization. In this “doublethink”, the errors of 1st quantization persist as a hardened dogma, despite being overturned by relativistic 2nd quantization, where indeterminacy arises simply from field quanta interactions.
Einstein and Infeld in their book “Evolution of Physics” discuss the randomness of Brownian motion. When the random, indeterministic motion of fragments of pollen grains was first seen under a microscope, the water molecules bombarding the fragments were invisible, and Brown actually believed that the motion was intrinsic to small particles, an inherent indeterminacy on small scales in space and time! This error is precisely Bohr’s 1st quantization error. It is no wonder that Bohr was so ignorantly opposed to Feynman’s path integral:
“… My way of looking at things was completely new, and I could not deduce it from other known mathematical schemes … Bohr … said: “… one could not talk about the trajectory of an electron in the atom, because it was something not observable.” … Bohr thought that I didn’t know the uncertainty principle …”
This attitude of Bohr persists today with regard to the difference between 1st and 2nd quantization; the attitude is that because non-relativistic 1st quantization was discovered first, and is taught first in courses, it must somehow take precedence over the mechanism for indeterminacy in quantum field theory (2nd quantization). The doublethink of most textbooks omits this and glues on 2nd quantization as a supplement to 1st quantization, rather than as a replacement of it! Why not have doublethink, with two reasons for indeterminacy: intrinsic, unexplained, magical indeterminacy typified by the claim “nobody understands quantum mechanics (1st quantization)”, plus the mechanism that virtual particles in every field randomly deflect charges on small scales (like Brownian motion on dust)! Feynman’s answer of course is that 1st quantization is plain wrong, since it is non-relativistic, and Occam’s Razor tells us that 2nd quantization alone suffices, since it explains everything mechanically without needing a 1st quantization (intrinsic or magical) uncertainty principle:
“I would like to put the [1st quantization] uncertainty principle in its historical place: when the revolutionary ideas of quantum physics were first coming out, people still tried to understand them in terms of old-fashioned ideas … But at a certain point the old fashioned ideas would begin to fail, so a warning was developed that said, in effect, “Your old-fashioned ideas are no damn good when …”. If you get rid of all the old-fashioned ideas and instead use the ideas that I’m explaining in these lectures – adding arrows [wavefunction phase amplitudes] for all the ways an event can happen – there is no need for an [1st quantization] uncertainty principle! … on a small scale, such as inside an atom, the space is so small that there is no main path, no “orbit”; there are all sorts of ways the electron could go, each with an amplitude. The phenomenon of interference [by 2nd quantization field quanta] becomes very important …” – Richard P. Feynman, QED, Penguin, 1990, pp. 55-6, and 84.
This blog post is motivated by a kind email from Dr Mario Rabinowitz on wave-particle duality in the double-slit experiment, which was sent as a result of yet another “not even wrong” paper published in a journal which uses non-relativistic (single wavefunction!) 1st quantization quantum mechanics to analyze quantum indeterminacy in the double-slit experiment. Whenever you use the earth centred planetary theory of Ptolemy to try to get higher accuracy, you always “discover” more evidence for endless epicycles, so the dogma becomes a self-fulfilling cult, sucking in research funding and peddling science fantasy in place of fact. The facts don’t speak for themselves, because they aren’t as exciting as dogmatic indeterminism:
Thank you for emailing me your paper “Examination of wave-particle duality via two-slit interference”. In Section 5.1, at page 26, you state:
“In Bohm’s quantum mechanical theory, there is no wave-particle duality. [The Undivided Universe, 1993] For Bohm, the particles shot at the slit-plate have definite trajectories, and each particle goes through only one slit or the other. In this theory as excellently presented by Holland [Quantum Theory of Motion, 1993], the interference pattern results from the interaction of each particle with the quantum potential determined by its own wave function and the presence of the two slits. … 5.2 Prosser, and Wesley’s Poynting vector particle guidance In 1976 Prosser made a ground-breaking suggestion that, at least for the case of light, the underlying causal reality for the formation of interference and diffraction patterns is the energy flow given by the Poynting vector. [Intl. J. Theoretical Phys. 15, 169 (1976).]”
I don’t know what you mean by “wave-particle duality”, which is as vague as the word “God”. Feynman explains the double slit using path integrals, although he explains that the spatial extent of a photon transversely is a “small core of space” surrounding the classical path (the path of least action), in his 1985 book QED, stating:
“Light … uses a small core of nearby space. (In the same way, a mirror has to have enough size to reflect normally: if the mirror is too small for the core of nearby paths, the light scatters in many directions, no matter where you put the mirror.)” – R. P. Feynman, QED, Penguin Books, London, 1990, page 54.
The amplitude contribution of each path with action S to the path integral is exp(iS/[h bar]), which by Euler’s equation reduces to cos(S/[h bar]) relative to the path of least action. (We don’t need the complex exponent for that comparison: its additional information is merely the direction of the resultant, which is always parallel to the axis of the least-action contribution when the path integral is done by summing arrows on an Argand diagram.)
Therefore, with amplitude cos (S/[h bar]), only paths with actions within plus-or-minus h-bar around the path of least action contribute to the net amplitude significantly (the paths with larger actions cancel one another out). So it is indeed a very small core of space around the path of least action where the alternative paths of the path integral are significant and cause the double-slit phenomena.
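This cancellation can be sketched numerically. The following is a toy model, not any specific physical system: the action is assumed to grow quadratically with the deviation d from the least-action path, S(d) = S0 + k·d², and the arrows exp(iS/ħ) are simply summed:

```python
import numpy as np

hbar = 1.0  # work in units where h-bar = 1

# Toy family of paths indexed by a transverse deviation d from the
# classical (least-action) path; assume the action grows quadratically,
# S(d) = S0 + k*d**2 (a generic stationary-phase assumption, not a
# specific physical system).
S0, k = 0.0, 50.0
d = np.linspace(-2.0, 2.0, 4001)
S = S0 + k * d**2

# Sum the "arrows" exp(iS/hbar) over all paths, path-integral style.
arrows = np.exp(1j * S / hbar)
total = arrows.sum()

# Now restrict the sum to the "core": paths whose action lies within
# plus-or-minus h-bar of the least action.
core = np.abs(S - S0) <= hbar
core_total = arrows[core].sum()

# The near-classical core supplies essentially the whole resultant;
# the wilder paths largely cancel one another in pairs.
print(abs(core_total) / abs(total))
```

The ratio comes out close to 1: the small core of paths within about ±ħ of the least action carries the net amplitude, exactly as described above.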
The actual mechanism for the diffraction is very simple: the slits in the screen contain atoms with electromagnetic field quanta, which interact with a passing photon, diffracting it. When a photon travels through an electromagnetic field, it interacts with the virtual photons of the field, which is also why light is slowed down and refracted by glass. Because these field quanta are stochastic or random in timing and paths taken, an element of uncertainty is thereby introduced into the change of momentum of the passing photon. In addition, the “small core” of paths taken around the classical path means that if the slits are close enough together, some of the multiple paths taken by a “single” (sum over histories) photon will pass through each of the slits, before recombining on the other side. This causes the double slit interference pattern, seen with so-called “single” photons.
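The resulting fringe pattern can be reproduced with nothing more than two “arrows” per screen point, one for the path through each slit. The geometry below (500 nm light, 50 micron slit separation, 1 m to the screen) is illustrative, not taken from any particular experiment:

```python
import numpy as np

# Minimal two-path ("sum over histories") sketch: for each point y on the
# screen, add the amplitudes for the two paths, one through each slit.
wavelength = 500e-9          # 500 nm light
k = 2 * np.pi / wavelength   # wavenumber
slit_sep = 50e-6             # slit separation
L = 1.0                      # slit-plate to screen distance

y = np.linspace(-0.05, 0.05, 2001)          # screen positions
r1 = np.sqrt(L**2 + (y - slit_sep / 2)**2)  # path length via slit 1
r2 = np.sqrt(L**2 + (y + slit_sep / 2)**2)  # path length via slit 2

# Each path contributes an "arrow" exp(i*k*r); the detection probability
# is the squared magnitude of the summed amplitude.
amplitude = np.exp(1j * k * r1) + np.exp(1j * k * r2)
intensity = np.abs(amplitude)**2

# Bright fringes reach 4x one slit's intensity, dark fringes reach 0,
# and the screen average stays at 2x: energy is redistributed, not lost.
print(intensity.max(), intensity.min(), intensity.mean())
```

Note the energy bookkeeping in the last comment: the fringes redistribute the photon arrival probability between bright and dark bands, with the screen-averaged intensity equal to the sum of the two single-slit intensities.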
On page 27 you state:
“In 1984 Wesley [J. P. Wesley, Found. Phys. 14, 155 (1984)] independently formulated a similar theoretical concept of the role of the Poynting vector in two-slit interference. Wesley gave due credit to Prosser, and referenced his two papers. He pointed out that smaller slits with wider separation would more clearly show the flow needed to explain two-slit interference.”
On page 30, you state:
“5.4 Marmet’s relativistic waveless and photonless two-slit interference Marmet [Absurdities in Modern Physics: A Solution,1993] uses an original if not peculiar invocation of relativity theory to obtain interference without either waves or photons. He says, “The wave or photon interpretations are not only useless, they are not compatible with physical reality. Waves are simply the relativistically distorted appearance of relativistic coupling between two atoms exchanging energy.”
This is not a “peculiar” idea as far as I can see, it is QED, the standard model’s gauge theory of electrodynamics. All electric charges and magnets produce force fields by the exchange of offshell photons with one another. This “fills the vacuum” with offshell radiation which produces only fundamental forces. What I’ve never been able to understand is why it is still taboo to try to produce a Monte Carlo or simple geometric model of this exchange of virtual photons, as a duality to the usual mathematical technique of integrating exp(iS/[h bar]) over all paths.
This anti-mechanism taboo is a “doublethink” disease of the mathematical priesthood in physics. What happened when Maxwell’s mechanical aether failed was that mechanical models became taboo, and this taboo survives today. It is a lurch from one extreme to another. Really, QED is a theory of offshell radiation being exchanged between charges to produce fundamental forces, and the appearance of the real (onshell) photon is just an asymmetry in the normally unobservable exchange of virtual photons between charges (the asymmetry being caused by the acceleration of a charge).
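To illustrate the kind of duality being asked for, here is a toy Monte Carlo estimate of a path-integral-style arrow sum: paths are sampled at random and their arrows averaged, rather than integrated analytically. The quadratic action S(d) = k·d² is an assumption chosen only because the exact (Fresnel-integral) answer is then known for comparison; this is a sketch of the idea, not a radiation-transport simulation:

```python
import numpy as np

rng = np.random.default_rng(0)
hbar = 1.0

# Assumed toy action for a path with deviation d from the classical path.
k = 50.0
a = 2.0  # sample deviations uniformly from [-a, a]

# Monte Carlo: the average of exp(iS/hbar) over uniform samples, times the
# sampling volume 2a, estimates the integral of exp(iS/hbar) dd.
d = rng.uniform(-a, a, 200_000)
mc_estimate = 2 * a * np.mean(np.exp(1j * k * d**2 / hbar))

# Exact stationary-phase result for the infinite-range Fresnel integral:
# integral of exp(i*k*d^2) dd = sqrt(pi/k) * exp(i*pi/4).
exact = np.sqrt(np.pi / k) * np.exp(1j * np.pi / 4)

print(abs(mc_estimate - exact))  # small: random sampling reproduces the sum
```

Random sampling of paths reproduces the analytic resultant to within the usual 1/sqrt(N) Monte Carlo noise, which is the sense in which a stochastic simulation can be dual to the exp(iS/ħ) integral.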
“6. Conclusion … If a photon goes through both slits at the same time, there is little or no momentum transfer to the slit plate compared with a photon traversing only one slit. … It is extraordinary from a particle point of view that more photons reach the screen when one slit is closed than when both slits are open.”
What is the evidence that more photons reach the screen when one slit is closed than when both are open? If I make lots of pinholes in a screen, more light will definitely get through. Have you thought about the conservation of energy, taken over the pattern of light and dark fringes in the interference pattern?
What happens to photon energy if a single photon “lands” at a “dark interference fringe”?
Clearly, it would violate conservation of energy, since a photon’s energy doesn’t “disappear” from the universe when it travels through two slits and “interferes with itself”. What is the “cancellation” process? If I send two water waves in opposite directions towards one another from oscillators at opposite ends of a water tank, when the waves meet and pass through one another, for a brief period the water surface is completely calm. The wave amplitudes have temporarily cancelled out. However, the energy still exists, and is seen a moment later when both waves, having passed through one another, magically reappear and the calm water surface rears up into two waves travelling away from one another.
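The water-tank argument can be checked numerically with two equal-and-opposite pulses on a stretched string (the same superposition physics): at the instant of cancellation the displacement vanishes everywhere, yet the total energy, at that moment entirely kinetic, is unchanged. The Gaussian pulse shape and parameter values are illustrative:

```python
import numpy as np

# Two equal-and-opposite Gaussian pulses moving toward each other:
#   y(x, t) = f(x - c*t) - f(x + c*t)
# At t = 0 the displacement cancels everywhere, but the energy survives
# in the transverse velocity field.
c = 1.0          # wave speed
mu = 1.0         # linear mass density
T = mu * c**2    # tension, so that c = sqrt(T/mu)

x = np.linspace(-20, 20, 40001)
dx = x[1] - x[0]

f  = lambda u: np.exp(-u**2)            # pulse shape
fp = lambda u: -2 * u * np.exp(-u**2)   # its derivative

def energy(t):
    # potential + kinetic energy of the superposed waves at time t
    y_x = fp(x - c * t) - fp(x + c * t)           # slope dy/dx
    y_t = -c * fp(x - c * t) - c * fp(x + c * t)  # velocity dy/dt
    return np.sum(0.5 * T * y_x**2 + 0.5 * mu * y_t**2) * dx

y_at_zero = f(x) - f(x)          # displacement at the cancellation instant
print(np.abs(y_at_zero).max())   # 0.0: the surface is momentarily flat
print(energy(-10), energy(0), energy(10))  # the energy is unchanged
```

Before, during, and after the cancellation the total energy is identical; only its division between potential (displacement) and kinetic (velocity) form changes, which is why the waves “magically reappear”.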
Photons only arrive at the bright bands in the interference pattern. Nothing arrives at the dark bands in the interference pattern, which means that the usual explanation by Young is plain wrong: individual photons don’t arrive “out of phase” to form the dark bands. This fact is obfuscated by the usual diagrams based on Young’s analysis which show that light waves arrive out of phase at the dark bands. Are you aware of this?
—– Original Message —–
Thursday, June 07, 2012 1:36 AM
May be one of the most important experiments in the last two centuries on the Wave-Particle Duality
I hope all is going well for you.
In case you haven’t already heard, I thought you might like to know a little about what may be one of the most important experiments in the last two centuries on the Wave-Particle Duality.
A recent experiment by Menzel et al observes through which of two slits a photon passes, while still preserving the customary interference pattern of Young’s original 1802 experiment. This violates both the Uncertainty Principle and Bohr’s Complementarity Principle.
Interestingly I was the first to propose and analyze this experiment in my 1995 paper, Examination of Wave-Particle Duality Via Two-Slit Interference. It was published in Modern Physics Letters B 9 pp. 763 – 789 (1995), and appeared as ArXiv 0302062 in 2003. My description and analysis of this novel experiment that determines which slit the particle/photon goes through and still preserves the interference pattern is in Sec. 4 pp. 18 – 38, and illustrated in Figs. 1 & 2 of my ArXiv paper.
A copy of my ArXiv paper is attached, as is the Abstract of the Menzel et al paper, which is expected to be published in the Proceedings of the National Academy of Sciences. The Web Sites for these two papers are:
I recall that Thomas Young’s monumental paper was roundly criticized by some authors in the same 1802 issue of Philosophical Transactions in which his paper was published. Lucky for him and for posterity, they were not in a position to reject his paper from publication.