My domain, http://quantumfieldtheory.org/, as last updated on 28 October 2007, mentioned Dr Chris Oakley's criticism of mainstream (interaction picture) quantum field theory as follows:
For some discussion of quantum field theory equations formulated without the interaction picture, without vacuum polarization, and without renormalization of charges (a renormalization which, as argued below, has a physical basis in pair-production cutoffs at suitable energy scales), see Dr Chris Oakley's page http://www.cgoakley.demon.co.uk/qft/:
'… renormalization failed the "hand-waving" test dismally.
‘This is how it works. In the way that quantum field theory is done – even to this day – you get infinite answers for most physical quantities. Are we really saying that particle beams will interact infinitely strongly, producing an infinite number of secondary particles? Apparently not. We just apply some mathematical butchery to the integrals until we get the answer we want. As long as this butchery is systematic and consistent, whatever that means, then we can calculate regardless, and what do you know, we get fantastic agreement between theory and experiment for important measurable numbers (the anomalous magnetic moment of leptons and the Lamb shift in the Hydrogen atom), as well as all the simpler scattering amplitudes. …
'As long as I have known about it I have argued the case against renormalization. [But what about the physical mechanism of virtual fermion polarization in the vacuum, which makes the case for a renormalization of charge? The electric polarization of virtual fermion pairs produces a radial electric field that opposes, and hence shields, most of the core charge of the real particle. This shielding occurs wherever pairs of charges are free, and have space, to align against the core electric field: that is, in the shell of space around the particle core extending from a minimum radius equal to the grain size of the Dirac sea (the UV cutoff) out to a radius of about 1 fm, the range at which the electric field strength falls to Schwinger's threshold for pair production (the IR cutoff). This renormalization mechanism has physical evidence from experiment, e.g. Levine, Koltick et al., Physical Review Letters, v. 78, no. 3, p. 424, 1997, where the observable electric charge of leptons does indeed increase as you get closer to the core, as seen in higher-energy scattering experiments.] …
‘[Due to Haag’s theorem] it is not possible to construct a Hamiltonian operator that treats an interacting field like a free one. Haag’s theorem forbids us from applying the perturbation theory we learned in quantum mechanics to quantum field theory, a circumstance that very few are prepared to consider. Even now, the text-books on quantum field theory gleefully violate Haag’s theorem on the grounds that they dare not contemplate the consequences of accepting it.
'With regard to the first thing, I doubt if this has been done before in the way I have done it [3], but the conclusion is something that some may claim is obvious: namely that local field equations are a necessary result of fields commuting for spacelike intervals. Some call this causality, arguing that if fields did not behave in this way, then the order in which things happen would depend on one's (relativistic) frame of reference. It is certainly not too difficult to see the corollary: namely that if we start with local field equations, then the equal-time commutators are not inconsistent, whereas non-local field equations could well be. This seems fine, and the spin-statistics theorem is a useful consequence of the principle. But in fact this was not the answer I really wanted as local field equations lead to infinite amplitudes. It could be that local field equations with the terms put into normal order – which avoid these infinities – also solve the commutators, but if they do then there is probably a better argument to be found than the one I give in this paper. …
'With regard to the second thing, the matrix elements consist of transients plus contributions which survive for large time displacements. The latter turns out to be exactly that which would be obtained by Feynman graph analysis. I now know that – to some extent – I was just revisiting ground already explored by Källén and Stueckelberg [4].
'My third paper [published in Physica Scripta, v. 41, pp. 292-303, 1990] applies all of this to the specific case of quantum electrodynamics, replicating all scattering amplitudes up to tree level. …
‘Unfortunately for me, though, most practitioners in the field appear not to be bothered about the inconsistencies in quantum field theory, and regard my solitary campaign against infinite subtractions at best as a humdrum tidying-up exercise and at worst a direct and personal threat to their livelihood. I admit to being taken aback by some of the reactions I have had. In the vast majority of cases, the issue is not even up for discussion.
‘The explanation for this opposition is perhaps to be found on the physics Nobel prize web site. The five prizes awarded for quantum field theory are all for work that is heavily dependent on renormalization. …
‘Although by these awards the Swedish Academy is in my opinion endorsing shoddy science, I would say that, if anything, particle physicists have grown to accept renormalization more rather than less as the years have gone by. Not that they have solved the problem: it is just that they have given up trying. Some even seem to be proud of the fact, lauding the virtues of makeshift “effective” field theories that can be inserted into the infinitely-wide gap defined by infinity minus infinity. Nonetheless, almost all concede that things could be better, it is just that they consider that trying to improve the situation is ridiculously high-minded and idealistic. …
'The other area of uncertainty is, to my mind, the 'strong' nuclear force. The quark model works well as a classification tool. It also explains deep inelastic lepton-hadron scattering. The notion of quark "colour" further provides a possible explanation, inter alia, of the tendency for quarks to bunch together in groups of three, or in quark-antiquark pairs. It is clear that the force has to be strong to overcome electrostatic effects. Beyond that, it is less of an exact science. Quantum chromodynamics, the gauge theory of quark colour, is the candidate theory of the binding force, but we are limited by the fact that bound states cannot be done satisfactorily with quantum field theory. The analogy of calculating atomic energy levels with quantum electrodynamics would be to calculate hadron masses with quantum chromodynamics, but the only technique available for doing this – lattice gauge theory – despite decades of work by many talented people and truly phenomenal amounts of computer power being thrown at the problem, seems not to be there yet; and even if it were, many, including myself, would be asking whether we have gained much insight through cracking this particular nut with such a heavy hammer.'
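(As a quick aside on the Schwinger pair-production threshold mentioned in my bracketed comment above: here is a minimal sketch of the number, using standard SI constants. This is my own illustrative check, not anything from Dr Oakley's papers.)

```python
# Schwinger critical field for electron-positron pair production:
# E_c = m_e^2 c^3 / (e * hbar). Standard SI constants; illustrative check only.
M_E  = 9.109e-31    # electron mass, kg
C    = 2.998e8      # speed of light, m/s
Q_E  = 1.602e-19    # elementary charge, C
HBAR = 1.055e-34    # reduced Planck constant, J*s

E_c = M_E**2 * C**3 / (Q_E * HBAR)
print(f"Schwinger critical field: {E_c:.2e} V/m")  # prints about 1.32e+18 V/m
```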
Recently, Dr Oakley has dismissed as "noise" my effort to build quantum field theory on the interaction picture: gauge boson exchange radiation interacting to produce forces.
His basic starting point is Haag’s theorem (formulated by Haag in 1955 and proved by others later), which is supposed to discredit the interaction picture (Feynman diagram formulation) of quantum field theory:
http://en.wikipedia.org/wiki/Haag%27s_theorem
Rudolf Haag postulated [1] that the interaction picture does not exist in an interacting, relativistic quantum field theory, something now commonly known as Haag's theorem. The theorem was subsequently proved by a number of different authors. It is, however, inconvenient, because the interaction picture is used throughout the canonical development of perturbative quantum field theory (which includes quantum electrodynamics, cited as one of the great successes of modern science).
Citing the formulation used by Arageorgis [2]:
- If two pure ground states are not equal, then they generate unitarily inequivalent irreducible representations.
- If two local quantum fields are unitarily equivalent at any given time, then both fields are free if one of them is free.
References
- [1] Haag, R.: On quantum field theories, Matematisk-fysiske Meddelelser, 29, 12 (1955).
- [2] Arageorgis, A.: Fields, Particles, and Curvature: Foundations and Philosophical Aspects of Quantum Field Theory in Curved Spacetime, Ph.D. thesis, University of Pittsburgh, 1995.
Further reading
- Earman, J. and Fraser, D.: Haag's Theorem and Its Implications for the Foundations of Quantum Field Theory, Erkenntnis 64 (2006): 305-344, online at philsci-archive.
- Fraser, D.: Haag's Theorem and the Interpretation of Quantum Field Theories with Interactions, Ph.D. thesis, University of Pittsburgh, online.
(Note that the above extract from Wikipedia was edited by Dr Chris Oakley, amongst others.)
http://coraifeartaigh.wordpress.com/2008/08/15/hubble-puzzle/
Hi Chris,
I did two years of undergraduate physics before switching disciplines to IT and graduating in programming. I think this is pretty relevant to your comment.
With regard to your advice, I think in comment 4 above I make it clear that your argument that the Hubble constant is varying is misplaced. This isn’t electronic parrot talk.
I’ve read your own page, and while you are justifiably proud of your Oxford PhD in theoretical physics, I don’t see any falsifiable predictions in your papers. Could you point them out please, if there are any, so maybe we can discuss the merits of your research, since you find my argument to be so much noise?
Maybe I can make my key point from comment #4 above clearer to you as follows:
Hubble law: v/r = H
where H = 1/t,
Chris Oakley's argument: v/r = H = 1/t,
hence: v/r = 1/t
therefore (according to Chris Oakley in comment #3): v is constant and r is directly proportional to t. I.e., your argument is that both denominators in v/r = 1/t vary the same way.
This is wrong because r is not directly proportional to t (the age of the universe).
Instead, r is directly proportional only to the time past (the lookback time), T = r/c.
Meanwhile 1/t is constant on the right-hand side, because t is the age of the universe for the observer, not for the star being observed.
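To make the distinction concrete, here is a minimal numeric sketch (my own illustration with round numbers: t = 13.7 × 10^9 years for the observer, so H = 1/t). With H fixed, v grows in proportion to r, and the only time that scales with r is the lookback time T = r/c, never t:

```python
# Minimal sketch of the Hubble law v = H*r with H = 1/t fixed for the observer.
# Illustrative round numbers; units of lightyears and years, so c = 1 ly/yr.
t = 13.7e9        # age of the universe for the observer, years (fixed)
H = 1.0 / t       # Hubble parameter, 1/years
c = 1.0           # speed of light, lightyears per year

for r in (1e9, 2e9, 4e9):    # distances in lightyears
    v = H * r                # recession velocity as a fraction of c
    T = r / c                # lookback time in years: proportional to r
    print(f"r = {r:.0e} ly: v = {v:.3f}c, T = {T:.0e} yr, t = {t:.1e} yr (unchanged)")
```

H = v/r stays fixed while v itself varies with r; t never varies with r.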
“Instead of wasting everyone’s – and your own – time getting involved in discussions that are above your head you should spend time studying the subject properly instead.”
I’ve been studying physics in my all my spare time since as far back as I can remember. I don’t think that your comment is addressing the point I made, maybe you can do so? Have you actually studied modern cosmology, or just mathematical quantum field theory?
Comment by nigel cook | August 20, 2008
Dr Chris Oakley’s site includes an interesting partly completed brief textbook on quantum field theory or rather “relativistic quantum mechanics”: http://www.cgoakley.demon.co.uk/qft/RQM.pdf
It’s good to have this kind of mathematical material on the Poincare group and other topics for reference, and the criticisms of the existing treatment of quantum field theory on page 1 begin clearly enough:
“… We are therefore implying the behaviour of the more comprehensive quantum world from the far less general classical world, which clearly is the wrong way round. …”
Quantum field theory should reduce to classical physics for large numbers of quantum interactions. I agree that the traditional approach is the wrong way around. Unfortunately, Chris continues:
“2. The traditional approach makes extensive use of the interaction picture, which does not exist (Haag’s theorem).
“3. The result of subtracting infinity from infinity is indeterminate. If one ignores this fact one will produce theories of no scientific value.”
Well, Chris, if you use Haag's theorem to throw out the quantum interaction picture from quantum field theory, you're left with just a sea of mathematics unlinked to clear pictorial physical processes. I've always found that the less physical the mathematical model (the less tied down to real physical facts like interactions), the more likely people are to dig themselves into a hole and produce no checkable predictions. Equations that aren't linked directly to hard physical facts end up having a landscape of interpretations, which is surely the problem with string theory.
Renormalization of charge etc. makes quite a lot of sense physically: in strong fields the vacuum contains pairs of virtual fermions which polarize, absorbing field energy and weakening the field which creates them. This reduces the apparent electric charge of a fermion as seen from a long distance. The naive mathematical model for running couplings based on this variation of charge is a simple logarithmic expression which would make the bare core charge of the electron become infinite at zero distance, so a cutoff must be imposed on the variation of the charge to get a realistic bare core charge to use in predicting the magnetic moment of leptons and the Lamb shift. This is physically justifiable if leptons don’t have zero size. No vacuum pair production can occur within the physical size of a lepton, so it’s quite natural to have a cutoff on the running coupling corresponding to the smallest physically sensible distance.
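To illustrate that logarithmic running concretely, here is a minimal sketch of the standard one-loop, leading-log QED formula with a single lepton species, alpha(Q) = alpha/(1 - (2 alpha/3 pi) ln(Q/m_e)), with an arbitrary illustrative UV cutoff standing in for the physical small-distance cutoff argued for above. This is textbook QED, not Dr Oakley's formulation:

```python
import math

ALPHA0 = 1.0 / 137.036   # measured low-energy fine-structure constant
M_E    = 0.000511        # electron mass in GeV

def running_alpha(Q, cutoff=1e17):
    """One-loop leading-log QED running coupling, single lepton species.

    Q is the momentum scale in GeV. The scale is capped at `cutoff`
    (an arbitrary illustrative UV cutoff in GeV), so the coupling
    stops running at very small distances instead of growing without limit.
    """
    Q = min(Q, cutoff)  # impose the cutoff on the running
    log_term = (2.0 * ALPHA0 / (3.0 * math.pi)) * math.log(Q / M_E)
    return ALPHA0 / (1.0 - log_term)

for Q in (0.000511, 1.0, 91.0, 1e4):  # GeV: electron mass, 1 GeV, Z mass, 10 TeV
    print(f"Q = {Q:g} GeV: alpha = 1/{1.0 / running_alpha(Q):.1f}")
```

With only electron loops this gives about 1/134.5 at the Z mass; including all charged fermions brings it to the measured value of roughly 1/129, so the sketch understates the running but shows its logarithmic character.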
“Although it covers issues normally associated with quantum field theory, this treatise has the title ‘relativistic quantum mechanics’ on the grounds that the quantum field is not viewed as fundamental here, being derived instead from annihilation and creation operators, which in turn are defined as operators on Fock space. It will be shown that the presence of interactions does not invalidate this analysis. Interactions will be seen, in effect, to be just interference patterns between free field states.” (Emphasis added.)
That paragraph (again on page 1 of your book) indicates that you’re sure that the quantum field isn’t fundamental, and that quantum interactions are just interference patterns. You need to prove that assertion if you’re really claiming this. Your treatment just seems to show that it’s possible to treat interactions as interferences; i.e. at best you just have an alternative way of representing physics. My argument is the opposite of yours. Rather than moving away from Feynman diagram quantum interactions to greater mathematical abstraction, Feynman diagrams should be taken even more literally than they currently are. E.g., virtual radiations can be treated for some purposes like real radiations (after all they cause all known forces). If the boots fit, wear them. If you can get falsifiable predictions out of simple well established physics, why attack it? Having said that, I’ll read your new book draft carefully and update my domain accordingly ASAP.
Comment by nigel cook | August 20, 2008
The Wikipedia page on Haag’s theorem (which Dr Oakley has edited) links to Haag’s major paper: http://doc.cern.ch/yellowrep/1955/1955-008/p1.pdf
I will read Dr Haag’s paper when time permits in addition to Dr Oakley’s draft book, because it seems obvious to me that the point Dr Oakley makes about quantum field theories is that Haag’s theorem has negated the physical relevance of the interaction picture (so my work building entirely on quantum interactions sounds/is just unwelcome noise to his ears).
Does anyone know whether Feynman ever commented on Haag's theorem, since it is allegedly a disproof of the physical relevance of the interactions his diagrams represent?
Is Haag’s theorem supposed to debunk the quantum field interactions of the weak field gauge bosons discovered at CERN in 1983? Are those gauge bosons just mathematical “interferences in free field states”? Is this really a helpful physical way of looking at the different fundamental interactions?
***
To restate briefly Dr Oakley’s problem:
Hubble law: v/r = H
where H = 1/t,
Chris Oakley's argument: v/r = H = 1/t,
hence: v/r = 1/t
where Dr Oakley suggests that r is proportional to t, so that v doesn't vary: "(so it is Hubble's 'constant', not the speed of the galaxy that is changing)" (comment #3).
This is wrong because, as you look to bigger distances r, you are seeing smaller times after the big bang.
So r is definitely not proportional to the age of the universe, t. In fact, if the age of the universe is that of the observer's frame of reference, t is fixed at 13,700 million years. Hubble's point was that v/r = H = constant, regardless of how far away (and hence how far back in time) you look. This is why I feel that Dr Oakley's comments #2 and #3 above are in error.
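To put illustrative round numbers on this restatement: take a galaxy at r = 2 × 10^9 light years. Its lookback time is T = r/c = 2 × 10^9 years, while t stays at 13.7 × 10^9 years for the observer whichever galaxy is observed. Hence v = Hr = r/t = (2 × 10^9)/(13.7 × 10^9) c = 0.15c, and a galaxy at twice that distance recedes at 0.29c. H = 1/t stays constant; v varies with r.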
If an Oxford PhD/D.Phil in quantum field theory can make such an error in looking at the very basics of cosmology, and then come back with personal comments ignoring the science, you can see the problem in communicating the fact that there is an acceleration inherent in the Hubble law!
***
The problem with an abstract mathematical approach to a physical problem is that it's just mathematical model building. Depending on the assumptions, it may be tied closely to physical reality, or (like string theory) it may not. Theorems built and proved on the basis of mainstream ideas are not necessarily physical fact unless experimentally justified by falsifiable predictions. Suppose Ptolemy had proved that his epicycle theory of cosmology (which used many epicycles to model planetary positions accurately) was incompatible with elliptical orbits. That theorem might be true, and would then have been taken to disprove elliptical orbits. Unfortunately, the physical interpretation in this case would be wrong: epicycles could only accurately model apparent positions (the direction in which a planet is found), not the distance of planets from the Earth at any time. Galileo disproved epicycles, so all the extensive mathematical physics, theorems and proofs accompanying the epicycle (Earth-centred universe) system were void.
http://en.wikipedia.org/wiki/Geocentric#Copernican_system:
In 1543 the geocentric system met its first serious challenge with the publication of Copernicus‘s De revolutionibus orbium coelestium, which posited that the Earth and the other planets instead revolved around the Sun. The geocentric system was still held for many years afterwards, as at the time the Copernican system did not offer better predictions than the geocentric system, and it posed problems for both natural philosophy and scripture.
With the invention of the telescope in 1609, observations made primarily by Galileo Galilei (such as that Jupiter has moons) called into question some of the tenets of geocentrism but did not seriously threaten it.
In December 1610, Galileo Galilei used his telescope to observe that Venus showed all phases, just like the Moon. This observation was incompatible with the Ptolemaic system, but was a natural consequence of the heliocentric system. Ptolemy placed Venus's deferent and epicycle entirely inside the sphere of the Sun (between the Sun and Mercury), but this was arbitrary; he could just as easily have swapped Venus and Mercury and put them on the other side of the Sun, or made any other arrangement of Venus and Mercury, as long as they were always near a line running from the Earth through the Sun. In that case, if the Sun is the source of all the light, then under the Ptolemaic system Venus, lying always between the Earth and the Sun, could only ever show crescent or dark phases.
But Galileo saw Venus at first small and full, and later large and crescent.
Astronomers of the period regarded this result as unsalvageable for Ptolemaic cosmology, if the observations were accepted as true.
This is of course the risk for certain types of mathematical physics proofs. Infamously, in 1932 John von Neumann claimed to prove that "hidden variables" versions of quantum mechanics were vacuous, supporting Bohr and Heisenberg's condemnation of physical mechanisms in fundamental physics. Von Neumann based his proof on five postulates, the fifth of which eventually turned out to be less than entirely solid, and indeed apparently vacuous itself. This failure of postulates (remember Euclid's fifth?) is just one of many potential difficulties with trying to do physics in the rigorous way that pure mathematics is done, based on postulates. That's not good enough in physics, where you must not only have physically defensible assumptions (not just accurate mathematical modelling assumptions, such as the Earth-centred-universe epicycle assumptions!) but also make checkable predictions. Otherwise you're likely to end up stuck in a mathematical hole, digging yourself further into a string theory-like landscape of uncheckable speculation, with no solid physical facts to check that you're in the real world.