Dr Woit on modern physics challenges

Chapter 8 of Dr Woit’s 2006 book Not Even Wrong is entitled “Problems of the Standard Model”. After acknowledging that the Standard Model, with its 17 ad hoc parameters, precisely describes and predicts all low-energy, non-gravitational particle physics, Woit states that it suffers from the following problems:

1. It doesn’t explain the reason for SU(3) x SU(2) x U(1): “A truly fundamental theory should explain where this precise set of symmetry groups is coming from.”

2. It doesn’t predict the coupling parameters for the broken symmetries. E.g., it doesn’t predict the value of the “fine structure constant” alpha: the low energy asymptotic value of the electromagnetic coupling, i.e. the relative strength of electric charge. For low energy particle collisions, i.e. below about 1 MeV, alpha is fixed at roughly 1/137.036, but at higher energy alpha increases or “runs”, because higher energy particles penetrate partially through – and thus see less of – the vacuum polarization shielding around the electromagnetic charge as they approach that charge more closely. At 91 GeV, alpha (measured by high energy electron scattering in 1997 by Koltick, Levine and others) was only 1/128.5. In other words, the effective electric charge of an electron is about 6.6% higher when you collide particles at 91 GeV energy, seeing through part of the polarized cloud of virtual fermion pairs which surrounds the electron core. (By analogy, the intensity of sunlight is reduced by cloud: if an aircraft climbs higher inside a cloud, there is less cloud between it and the sun, so more sunlight reaches it.) Quantum field theory predicts the logarithmic way that alpha increases as a function of collision energy, but not the low energy asymptotic value of alpha itself (see the sketch below).
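To make that running concrete, here is a minimal leading-log sketch in Python. The one-loop formula and the fermion electric charges are standard; the “effective” quark masses are rough assumptions standing in for the hadronic contribution, which precision work takes from dispersion relations instead.

```python
import math

# Leading-log (one-loop) estimate of the running QED coupling alpha(Q).
# Every charged fermion lighter than Q thins the vacuum-polarization
# "cloud" logarithmically. The quark masses below are rough effective
# values (an assumption for illustration); precise work replaces the
# quark loops with dispersion-relation data.

ALPHA_0 = 1 / 137.036  # low energy asymptotic value

# (name, electric charge in units of e, colour factor, mass in GeV)
FERMIONS = [
    ("e",   -1,    1, 0.000511),
    ("mu",  -1,    1, 0.1057),
    ("tau", -1,    1, 1.777),
    ("u",    2/3,  3, 0.30),  # effective mass (assumption)
    ("d",   -1/3,  3, 0.30),  # effective mass (assumption)
    ("s",   -1/3,  3, 0.50),  # effective mass (assumption)
    ("c",    2/3,  3, 1.50),
    ("b",   -1/3,  3, 4.50),
]

def alpha(Q):
    """One-loop running coupling at collision energy Q (GeV)."""
    delta = sum(nc * q * q * math.log(Q / m)
                for _, q, nc, m in FERMIONS if m < Q)
    return ALPHA_0 / (1 - (2 * ALPHA_0 / (3 * math.pi)) * delta)

print(f"1/alpha at 1 MeV:  {1 / alpha(0.001):.1f}")   # ~137
print(f"1/alpha at 91 GeV: {1 / alpha(91.19):.1f}")   # ~128, near the measured 1/128.5
```

With these rough inputs the sketch reproduces both the low energy 1/137 figure and something close to the 1/128.5 measured at 91 GeV; what quantum field theory leaves unexplained is only the low energy asymptotic value itself.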

3. The U(1) weak hypercharge symmetry is not asymptotically free, “and as a result it may not be completely mathematically consistent.”

4. SU(3) x SU(2) x U(1) doesn’t explain why only certain mathematical “representations” of those symmetry groups produce quarks and leptons, e.g., why weak interactions only involve particles having left-handed spin. In other words, although the mathematical patterns given by SU(3) x SU(2) x U(1) include “representations” which are real physics, they also include other representations which have never been observed and so are not real. SU(3) x SU(2) x U(1) doesn’t tell you which representations are real and which aren’t: you have to select what you want by the ad hoc process of assigning suitable weak isospin charges to left-handed particles and zero weak isospin charge to right-handed particles, to make the predictions correct (these assignments are tabulated in the sketch below). In this sense, the physics is weak: we have to pick out the representations which fit the real world and ignore the others.
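For concreteness, here is the sort of by-hand charge assignment being described, for the first generation, in the common convention Q = T3 + Y/2 (other texts absorb the factor of two into Y; only the convention differs). Nothing in the group structure forces these numbers: they are chosen so that the computed electric charges, and the left-handedness of the weak force, match experiment.

```python
# Weak isospin (T3) and weak hypercharge (Y) assignments for the
# first-generation fermions, in the convention Q = T3 + Y/2.
# Left-handed fields are put in doublets with T3 = +/-1/2; right-handed
# fields are put in singlets with T3 = 0 -- a choice made to match
# experiment, not derived from SU(3) x SU(2) x U(1) itself.

FIELDS = [
    # (field,   T3,    Y)
    ("nu_e_L", +0.5, -1.0),
    ("e_L",    -0.5, -1.0),
    ("e_R",     0.0, -2.0),
    ("u_L",    +0.5, +1/3),
    ("d_L",    -0.5, +1/3),
    ("u_R",     0.0, +4/3),
    ("d_R",     0.0, -2/3),
]

for field, t3, y in FIELDS:
    q = t3 + y / 2  # observed electric charge, recovered by construction
    print(f"{field:7s} T3 = {t3:+.1f}  Y = {y:+.3f}  ->  Q = {q:+.3f}")
```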

5. SU(3) x SU(2) x U(1) comes with 3 generations of quarks and leptons, but doesn’t explain why there are specifically 3 generations: more evidence that it isn’t a real theory, just an ad hoc model.

6. SU(3) x SU(2) x U(1) is only speculatively assumed to be a unified theory at high energy. It fails at high energy, since the predicted couplings don’t unify there without additional ad hoc hypotheses like supersymmetry (invoked by string theory, for example). At low energy, it requires symmetry breaking mechanisms to explain why leptons don’t feel strong interactions (nobody has such a theory) and why electromagnetism is separated from weak interactions (supposedly due to a Higgs field which gives mass only to the weak gauge bosons at low energy, thereby breaking the electroweak symmetry):

“If the origin of [speculative electroweak symmetry breaking] really is a Higgs field, then at least two new parameters are needed to describe the size of the symmetry breaking and the strength of the Higgs interaction with itself. Why do these parameters have the value they do? One of these parameters is determined by the observed properties of the electroweak interactions, but the other is still undetermined by any experimental result. This is why the Standard Model predicts the existence of a Higgs particle, but does not predict its mass. In addition, the standard quantum field description of a Higgs field is not asymptotically free and, again, one worries about its mathematical consistency.

“What determines the masses and mixing angles of the quarks and leptons in the theory? These particles have a pretty random looking pattern of masses, giving nine numbers that the theory doesn’t predict and which have to be put in by hand. The mixing angles are four more parameters that determine precisely how the electroweak forces act on the particles. In the Standard Model, these thirteen parameters appear as the interaction strengths of the Higgs field with the quarks and leptons and are completely arbitrary. This problem is closely related to the previous one, since our inability to predict these parameters is probably due to not understanding the true nature of the electroweak gauge symmetry breaking of the vacuum. …

“One way of thinking about what is unsatisfactory about the Standard Model is that it leaves seventeen non-trivial numbers still to be explained, and it would be nice to know why the eighteenth one [the theta parameter for the QCD theory] is zero. Of the seventeen, fifteen show up in the Standard Model as parametrising the properties [due to particle masses] of the Higgs field. So most of our problem with the Standard Model is to find a way to either get rid of the Higgs field, or understand where it comes from.

“Glashow [Woit’s PhD adviser], whose earlier version of the electroweak theory was incomplete (unlike the later Weinberg-Salam model) because it lacked something like the Higgs to break the gauge symmetry, has been known to refer to the Higgs as ‘Weinberg’s toilet’.”

7. The original version of the Standard Model ascribed zero mass to neutrinos. Measurements during the 1980s and 1990s of neutrinos from nuclear fusion in the sun, using underground detectors such as tanks of dry cleaning fluid and water pools lined with photomultipliers, showed that only one third as many flavour-specific neutrinos are detected on earth as are produced in the sun. (The production rate of neutrinos can also be verified using a nuclear fission reactor, which was done back in the 1950s when neutrinos were experimentally discovered: you simply need a massive beta radiation source, so that the anti-neutrino flux is high enough for an inefficient detector to measure an accurate reading.) This shows that neutrinos oscillate between the three “flavours” (i.e., the three generations of the Standard Model) while travelling from sun to earth. They leave the sun as 100% one flavour, and by the time they arrive at the earth they are equally distributed between the three flavours (1/3 each of electron, muon and tauon neutrinos), only one of which is detected by the standard detector system used, thus accounting for the factor of 3 discrepancy in the detection rate. In order for this “oscillation” of neutrino flavours to occur while the neutrinos are travelling to earth, they must go slightly slower than the velocity of light (otherwise they would be “frozen” by relativity, unable to experience any time-dependent change), which implies that they must have some rest mass (with no rest mass, they would travel at the velocity of light). Hence, there is evidence that neutrinos have mass (see the sketch after the quotation below):

“It is relatively easy to extend the Standard Model in a simple way that allows for neutrino masses. This introduces a new set of seven parameters very much like the quark masses and mixing angles. The situation is slightly more complicated since the fact the neutrino has no charge allows two different sorts of mass terms. The exact mechanism responsible for these masses and mixing angles is just as mysterious in the neutrino case as it is for the quarks.”
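The mass argument can be made explicit with the standard two-flavour vacuum oscillation formula; this is a minimal sketch, not a full solar-neutrino treatment (which also involves matter effects inside the sun). The key point is that the oscillation phase is proportional to the difference of squared masses, so strictly massless neutrinos could never change flavour in flight; and in the idealized limit of complete mixing among three flavours, the long-distance average is 1/3 per flavour, the figure used above.

```python
import math

# Two-flavour vacuum oscillation (standard formula): the phase is
# proportional to dm2, the difference of the squared masses, so
# massless neutrinos (dm2 = 0) can never oscillate.

def transition_prob(theta, dm2_ev2, L_km, E_gev):
    """P(flavour a -> flavour b); L in km, E in GeV, dm2 in eV^2."""
    return math.sin(2 * theta)**2 * math.sin(1.267 * dm2_ev2 * L_km / E_gev)**2

def averaged_survival(fractions):
    """Survival probability averaged over many oscillation lengths:
    P = sum of f_i^2, where f_i is the admixture of each mass state."""
    return sum(f * f for f in fractions)

# Massless case over the sun-earth distance (~1.5e8 km, ~1 MeV energy):
print(transition_prob(math.pi / 4, 0.0, 1.5e8, 0.001))  # 0.0 -- no oscillation

# Fully mixed three-flavour limit: each mass-state admixture is 1/3,
# giving the averaged 1/3 survival invoked in the text.
print(averaged_survival([1/3, 1/3, 1/3]))  # 0.333...
```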

8. “There is one remaining important part of physics that is completely ignored by the Standard Model: the gravitational force. … the strength of its action on a particle is just proportional to the particle mass. … it is always attractive, never repulsive … [i.e. he is claiming falsely that the graviton is spin-2, see disproof linked here] What makes this problem especially hard is that one has no experimental guidance about where to look for a solution, and no way of checking the kinds of predictions that tend to come out of any conjectural quantum gravity theory.”

Dr Woit here ignores the gravitational repulsion observed as the cosmological acceleration of distant supernovae and clusters of galaxies. The incredibly large masses (galaxy clusters), isotropically located all around us, exchange gravitons with us, simply because there is no mechanism to prevent such an exchange. They have much bigger masses than, say, an apple or the earth, and thus much bigger gravitational charges than the objects near us, and the gravitons they exchange converge in towards us. The repulsive force from this inward convergence of gravitons is far larger than the local repulsion between an apple and the earth, since the apple and the earth are of insignificant gravitational charge (mass) compared to the clusters of galaxies all around us. Hence, you don’t need to be Einstein to work out which force is bigger: the apple gets pushed down to the earth by the massive galaxies above it far more forcefully than it gets repelled upwards by the earth. The net result is what you think of as “attraction”: a net repulsion downwards from the masses above us, which are bigger, and at such distances that the gravitons converge towards us, increasing the interaction strength so that it exceeds the earth’s upward repulsion.

In chapter 11, “String Theory: History”, Dr Woit writes that the quantum field theory of the Standard Model was not a mainstream project, because most of the mainstream in the 1950s and 1960s was concentrating on the alternative S-matrix bootstrap programme:

“In reality, the successful ideas described in detail here were often pursued only by a small minority of physicists, with the great majority of their colleagues following very different research programmes that were ultimately to fail.”

The scattering or S-matrix approach, which dominated particle physics until the birth of the Standard Model with ‘t Hooft’s proof of the renormalizability of gauge theories in the early 1970s, originated, according to Dr Woit, in the philosophy of the “Vienna school of Logical Positivism”. Logical positivism rejected anything, such as a quantum field, that was not directly observable, and this philosophical stance against quantum field theory was reinforced in the 1930s by the infinities problems of quantum field theory, which were only resolved by Schwinger’s and Feynman’s work on renormalization in the late 1940s. (E. C. G. Stueckelberg and Sin-Itiro Tomonaga also worked on the subject, in wartime Switzerland and Japan respectively, and were thus almost completely isolated from American research on it.)

Rejecting quantum field theory in the 1930s because of its problems with infinities at that time, John Wheeler developed the first version of S-matrix theory in 1937, and Heisenberg developed it further in 1943. Woit explains: “the idea was that one should express the theory purely in terms of the scattering matrix. The scattering matrix … tells one what happens if one has two particles that are initially far apart and one sends them towards each other. Do they scatter off each other, emerging from the collision intact but moving in a different direction? Do they annihilate each other, producing other particles? … A quantum field theory can be used to calculate the S-matrix, but it inherently contains the much more complicated structure of fields interacting with each other at every point in space and time. Unlike a quantum field theory, the S-matrix is something that has nothing to say about exactly what is going on as the two particles approach each other and their interaction evolves.

“Pauli was highly sceptical of Heisenberg’s S-matrix ideas, noting at a conference in 1946 that:

Heisenberg did not give any law or rule which determines mathematically the S-matrix in the region where the usual theory fails because of the well-known divergencies. Hence his proposal is at present still an empty scheme.

“Pauli’s point was that the S-matrix proposal did not actually solve any of the physical problems that had motivated it.”

This is identical to the problem with string theory today. String theory was originally hyped with one set of clear unification and prediction objectives, and having failed to accomplish any of them in a falsifiable way, it is simply applied to other problems like calculating the viscosity of a quark-gluon plasma or whatever. So there is a strong analogy between the failure of the S-matrix and the failure of string theory to go beyond the Standard Model today.

Woit extends the analogy by examining what eventually happened to S-matrix theory. Despite Pauli’s criticism, it continued to be the focus of most mainstream attention until the advent of the quantum field theory of QCD, which dealt with strong interactions more successfully than S-matrix theory could. Geoffrey Chew, the leading proponent of S-matrix theory, hyped it by talking about developing a “bootstrap” version, in which each particle’s properties would be determined by its interactions with all other particles, so that the theory would “pull itself up by its own bootstraps”. This “bootstrap” idea is actually a good one to apply to long range interactions in quantum field theory, where all particles are exchanging particles such as gravitons with one another and thus affecting each other, but it failed to predict or describe the quark structure of hadrons: “The bootstrap programme’s hopes that there would somehow be a unique consistent S-matrix were simply wishful thinking.”

At the end, S-matrix theory went the way of all physics failures: it literally became a religion, headed by the physicist and Eastern religious enthusiast Dr Fritjof Capra, who used Eastern religion to praise the philosophical virtues of bootstrap S-matrix theory and to denounce the symmetries of quantum field theory in his 1975 book The Tao of Physics:

The discovery of symmetric patterns in the particle world has led many physicists to believe that these patterns reflect the fundamental laws of nature. … The attitude of Eastern philosophy with regard to symmetry is in striking contrast … Like geometry, it is thought to be a construct of the mind, rather than a property of nature, and thus of no fundamental importance …

“Like lots of people, over the years I’ve been deluged with examples of what I’ll call “unconventional physics”, in a spectrum ranging from utter idiocy to serious but flawed work. Much of it shares the all-too-common feature of making grandiose claims for new understanding of fundamental physics, based on vague ideas that often use not much more than a few pieces of high-school level physics and mathematics. The beautiful and deep physical and mathematical ideas that go into the Standard Model are ignored or thrown out the window.”

– Dr Peter Woit, Expanding Crackpottery post on Not Even Wrong

Here I agree with Dr Woit. The Standard Model is established in the sense that it is based on observations and makes falsifiable predictions about relatively low energy physics (up to a few hundred GeV only, since there are falsifiability problems with the Higgs field used for breaking the electroweak symmetry). Any convincing presentation of quantum gravity must begin with the Standard Model, show how quantum gravity fits into it and thereby modifies it, and state what predictions are made in consequence. In fact, any convincing presentation needs to review – not ignore – the entire quantum field theory infrastructure, and to clarify and condense each part of it, making it more widely understood, less boringly abstract, and more straightforward and exciting to teach and to learn.

Not only does the Standard Model fail to break electroweak symmetry in a falsifiable way (the Higgs field is ugly and ad hoc), but it also fails to unify the strong force with the electroweak force: both are included in the symmetry product U(1) x SU(2) x SU(3), but there is no symmetry breaking mechanism to cut the strong interaction SU(3) off from the electroweak groups U(1) x SU(2). In order to make the theory work, therefore, a vast number of parameters must be supplied for the different fundamental field charges.

If the unification U(1) x SU(2) x SU(3) were true in the Standard Model, there wouldn’t be so many free parameters: there would be only one charge (the unified field strength), and symmetry breaking mechanisms would predict the deviation from that unified coupling for each of the “broken symmetries” we see at low energy (electromagnetism, weak interactions and strong interactions). The symmetry grouping U(1) x SU(2) x SU(3) is therefore physically empty. Nobody has properly unified the three gauge groups U(1), SU(2) and SU(3), because the symmetry breaking mechanisms are still at best speculative (electroweak) or unknown (strong), yet they are vital for connecting the low energy broken U(1), SU(2) and SU(3) symmetries to the presumed unified high energy U(1) x SU(2) x SU(3) symmetry. Each component therefore has to be individually supplied with constants in order to make predictions for the observed “broken symmetry” forces we see in everyday, low energy physics: electromagnetism, weak forces (beta radioactivity, or the stability of neutrons against decay into protons in nuclei) and strong forces (the binding of neutrons and protons in nuclei against the electromagnetic repulsion of protons). The failure of the three couplings to meet at a single energy is easy to illustrate numerically, as in the sketch below.
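The sketch runs the three inverse couplings linearly in the logarithm of energy, using the standard one-loop Standard Model coefficients (with the usual GUT normalization for U(1)) and rounded experimental starting values at the Z mass; the inputs are approximations for illustration only. The three pairwise crossing points come out spread over roughly four orders of magnitude in energy, rather than coinciding at one unification scale.

```python
import math

# One-loop running of the three Standard Model gauge couplings,
# showing that they do NOT meet at a single point without new physics.
# b-coefficients: standard one-loop SM values, GUT-normalized U(1).
# Starting values 1/alpha_i at the Z mass: rounded experimental numbers.

M_Z = 91.19  # GeV
B = {"U(1)": 41 / 10, "SU(2)": -19 / 6, "SU(3)": -7.0}
INV_ALPHA_MZ = {"U(1)": 59.0, "SU(2)": 29.6, "SU(3)": 8.45}

def inv_alpha(group, mu):
    """1/alpha for the given group at energy scale mu (GeV), one loop."""
    return INV_ALPHA_MZ[group] - B[group] / (2 * math.pi) * math.log(mu / M_Z)

def crossing(g1, g2):
    """Energy (GeV) at which two of the running couplings become equal."""
    t = (INV_ALPHA_MZ[g1] - INV_ALPHA_MZ[g2]) * 2 * math.pi / (B[g1] - B[g2])
    return M_Z * math.exp(t)

for g in B:
    print(f"1/alpha_{g} at 1e16 GeV: {inv_alpha(g, 1e16):.1f}")

for g1, g2 in [("U(1)", "SU(2)"), ("SU(2)", "SU(3)"), ("U(1)", "SU(3)")]:
    print(f"{g1} meets {g2} near {crossing(g1, g2):.1e} GeV")
# The crossings land near 1e13, 1e17 and 2e14 GeV respectively:
# three different "unification" scales, i.e. no actual unification.
```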

There are mathematical problems with the Standard Model too. The existing mathematical structure of quantum field theory necessarily utilizes Fock space, and Haag’s theorem disproves the possibility of rigorously representing quantum field interactions in Fock space! This is why I’m attracted to Feynman’s attempt, in his 1985 book QED, to simplify the path integral approach to quantum field theory, representing it as a very simple mechanism for physically correct second quantization calculations.
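In that spirit, here is a toy numerical version of Feynman’s “rotating arrow” picture from QED, with entirely arbitrary numbers: one unit arrow per path, its angle set by the path length in wavelengths, summed over reflection points on a mirror. Arrows from paths far from the least-time path spiral around and cancel, so a small central strip of the mirror supplies nearly the whole amplitude; this stationary-phase cancellation is the “very simple mechanism” referred to above.

```python
import cmath
import math

# Feynman's arrow-summing picture: the amplitude for light to go from
# a source to a detector via a mirror is the sum of one unit arrow
# exp(i * phase) per reflection point, with the phase set by the path
# length in wavelengths. All geometry and numbers here are arbitrary.

WAVELENGTH = 0.05        # arbitrary units
SOURCE = (-1.0, 1.0)     # (x, y); the mirror is the line y = 0
DETECTOR = (1.0, 1.0)

def path_length(x):
    """Source -> mirror point (x, 0) -> detector."""
    sx, sy = SOURCE
    dx, dy = DETECTOR
    return math.hypot(x - sx, sy) + math.hypot(dx - x, dy)

def amplitude(points):
    """Sum one unit 'arrow' per reflection point on the mirror."""
    return sum(cmath.exp(2j * math.pi * path_length(x) / WAVELENGTH)
               for x in points)

# Same point spacing (0.002) for the whole mirror and a central strip:
whole = amplitude([i / 500 - 1.0 for i in range(1001)])  # x in [-1, 1]
centre = amplitude([i / 500 - 0.1 for i in range(101)])  # x in [-0.1, 0.1]

# The strip covering 10% of the mirror, around the least-time path at
# x = 0, yields nearly the same amplitude as the whole mirror: arrows
# from the outer regions curl up and cancel.
print(f"|whole mirror|  = {abs(whole):.1f}")
print(f"|central strip| = {abs(centre):.1f}")
```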