Unification of particles and fundamental forces

“The charged massive vector intermediary and the massless photon were to be the gauge mesons. … We [Schwinger and Glashow in 1956] used the original SU(2) gauge interaction of Yang and Mills. Things had to be arranged so that the charged current, but not the neutral (electromagnetic) current, would violate parity and strangeness. Such a theory is technically possible to construct, but it is both ugly and experimentally false [H. Georgi and S. L. Glashow, Physical Review Letters, 28, 1494 (1972)]. We know now
that neutral currents do exist and that the electroweak gauge group must be larger than SU(2). … We come to my own work done in Copenhagen in 1960, and done independently by Salam and Ward. … I was led to SU(2) X U(1) by analogy with the appropriate isospin-hypercharge group which characterizes strong interactions. In this model there were two electrically neutral intermediaries: the massless photon and a massive neutral vector meson which I called B but which is now known as Z. The weak mixing angle determined to what linear combination of SU(2) X U(1) generators B would correspond.”

“Standard Model architect Sheldon Glashow explained in his 1979 Nobel Prize Lecture how in 1956 as the research student of Julian Schwinger, he had tried to construct an SU(2) electroweak theory by making the charged SU(2) vector bosons massive and keeping the neutral vector boson massless, but ran into trouble making the massive vector bosons left-handed and allowing for strange quark effects (quarks
had not yet been discovered, but mesons with 2nd generation strangeness-conserving quarks had been discovered)” – p36 of https://vixra.org/pdf/1111.0111v1.pdf

From the previous post here https://nige.wordpress.com/2023/10/10/the-final-theory/:

Can I just ask please if you’re aware of the following “anomalies” with the strange quark (electric charge -1/3) and the omega minus (a triplet of strange quarks), which are normally swept under the carpet but have enormous implications (I’ll try to keep this brief and clear).

  1. Fermi’s point theory of beta decay says that a muon decays into an electron, and a strange quark decays into an up quark.

But when the W- propagator was added to Fermi’s theory, an anomaly emerged: if a muon decays into an electron, it must briefly become a W- boson en route; the corresponding Feynman diagram for beta decay of a strange QUARK via a W- boson then turns the strange quark into an electron! There you have quark-lepton unification. (Diagram: Fig 34, middle of p44, in https://vixra.org/pdf/1111.0111v1.pdf.)

  2. Fractional quark “electric charges” are artifacts, emergent from the very large vacuum polarization shielding of a pair or triplet of quarks. This is proved by the Omega Minus, which should be viewed as the Rosetta Stone for understanding everything: it is a triplet of strange quarks with total electric charge -1, so the strange quarks are each -1/3. This makes it understandably simple!

Look at the maths: you have THREE similar electric charges in close proximity, which must physically produce an OVERLAPPING vacuum polarization (pairs of charged virtual fermions which align to shield the core charge within) THREE times stronger than a single charge would produce. It’s like wearing three pairs of sunglasses at once: you get three times as much filtering. Hence the strange quark’s hypothetical isolated electric charge is 3 x (-1/3) = -1, the same as the electron’s.
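The arithmetic of that overlap claim can be written out explicitly. This is a sketch of the post’s assumed linear shielding rule (the names BARE_CHARGE and N_OVERLAP are mine, for illustration only), not standard QED:

```python
# Sketch of the overlapping-shield arithmetic described above: three
# identical charges in proximity produce three times the vacuum-polarization
# shielding of a lone charge, so each hypothetical core charge of -1 is seen
# from outside as -1/3.  The linear rule is this post's assumption.
BARE_CHARGE = -1.0   # hypothetical unshielded charge per strange quark
N_OVERLAP = 3        # three quarks -> three overlapping polarization clouds

observed_per_quark = BARE_CHARGE / N_OVERLAP   # what experiments infer
observed_total = N_OVERLAP * observed_per_quark  # omega minus total charge

print(observed_per_quark)  # -0.333...
print(observed_total)      # -1.0, the measured omega minus charge
```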

So there’s your quark-lepton unification.

The missing “shielded” energy can be easily calculated: 2/3 of the -1 electric charge per quark is shielded, so you see an apparent total omega minus charge of -1. The virtual particles acquire this energy, extending their survival time beyond Heisenberg’s t = ℏ/E (virtual fermions only exist between the UV and IR cutoff energies, which translate into distances out to ~33 fm). This acquired electric field energy allows them to briefly behave like real particles, obeying Pauli’s exclusion principle and thus gaining a quasi nuclear shell structure (near the UV cutoff, where virtual quark pairs exist) and a quasi electron structure further out (nearer the IR cutoff, where electron-positron pair production occurs). Simple calculations show this predicts particle masses: Table 1 in https://vixra.org/pdf/1408.0151v1.pdf (which is merely based on an analogy to the nuclear shell model’s magic numbers). It should also be possible to make more detailed calculations, by statistically averaging the mass of the polarized vacuum particles using the easily deduced omega minus shielded electric field energy.

The bottom line is, if you are going to build a model of particles, beware that part (maybe all) of the existing problem of stagnation can be people trying to model the standard model quarks etc. with all their supposed features. If it turns out (as per above) that “anomalies” imply the quarks are “really” integer charged, and merely look fractionally charged from a distance because of strong vacuum polarization shielding due to existing in pairs/triplets rather than as singletons, you’re done for. Newlands made the mistake of assuming the chemists had discovered every element when he formulated his Law of Octaves. They had missed key elements, so his model was false.

THE ROAD AHEAD:

Contrast the above approach to unification with superstrings, i.e. SUSY unification, whose argument consists of running couplings supposedly converging at around 10^15 GeV.

For unification, just apply energy conservation. Ignore collision energy in GeV; just look at the distance from a particle core, what the couplings are there, and why they vary due to vacuum polarization mechanisms.

First, it’s clear that not all particles carry the strong charge: leptons have only electroweak interactions. If “unification” is true, there must be some way to transform a lepton into a quark.

Consider field energy conservation: quarks have less electric charge than leptons. The electric field extends to infinity because its vector bosons are massless, but for the strong and weak fields the vector bosons are massive with a short range.

Take the simplest example to analyse: the Omega minus, composed of three strange quarks, each with -1/3 the electric charge of a charged lepton (electron/muon/tauon).

Now consider the vacuum polarization: if strong interactions and the associated colour charge are an emergent property of having 2 or 3 particles in proximity, a simple way to understand this, for the Omega minus, is to imagine 2 or 3 particles with unit electric charge like a charged lepton: the only way they can exist in proximity without violating Pauli’s exclusion principle is to have an extra quantum number for colour charge, which means a gluon field that contains energy.

You can integrate the total gluon field energy per strange quark over radial distance. This shouldn’t be that difficult, because although the coupling gets bigger at much higher energy, that corresponds to very tiny distances with very small volumes, since volume is proportional to the cube of the radius: you’re multiplying a large, uncertain, coupling-controlled energy density for the field by an extremely small volume, which goes to zero as the coupling gets large near zero radius.
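One way to set up that kind of radial integral numerically is sketched below, for the analogous electric field case. Everything here is illustrative: the logarithmic form of the running coupling and both cutoff radii are my assumptions, and as written the result is dominated by the choice of UV cutoff radius. The same machinery would apply to a gluon field once its running coupling and cutoffs were specified:

```python
import numpy as np

HBAR_C = 197.327      # MeV*fm (hbar*c), for working in MeV and fm
ALPHA0 = 1 / 137.036  # low-energy fine-structure constant
R_IR = 33.0           # fm: the IR cutoff radius quoted in this post
R_UV = 1e-3           # fm: illustrative UV cutoff radius (my assumption)

def alpha(r):
    """Illustrative logarithmically running coupling inside the IR cutoff,
    frozen at the Coulomb-law value outside it."""
    return np.where(r < R_IR,
                    ALPHA0 * (1 + (2 * ALPHA0 / (3 * np.pi)) * np.log(R_IR / r)),
                    ALPHA0)

# Coulomb field energy density u(r) = alpha(r)*HBAR_C/(8*pi*r^4), so the
# energy in a thin shell is u * 4*pi*r^2 dr = alpha(r)*HBAR_C/(2*r^2) dr.
r = np.logspace(np.log10(R_UV), np.log10(R_IR), 20000)
integrand = alpha(r) * HBAR_C / (2 * r**2)
E_total = float(np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(r)))
print(E_total)  # total shell-integrated field energy in MeV
```

Note how strongly the answer depends on R_UV: the 1/r² integrand makes the innermost shells dominate, which is exactly why the choice of cutoffs (or a mechanism fixing them) is the physically decisive input.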

(The mainstream “unification” approach, whereby radii from a particle core are inverted and called “energy scale”, seriously obfuscates the whole issue, particularly as the volume gets smaller as the cube of the radius, and thus as the inverse cube of the energy scale!)

Therefore, if “unification” between quarks and leptons is real, that calculation should give the total gluon-mediated colour field strong interaction energy around a strange quark, in Joules. You then simply compare that result to the energy in the electric field of a charged lepton. If you can’t decide what small radius to use, don’t worry: simply assume the total electric charge field energy of an electron is its rest mass energy, 0.511 MeV.

If the former is 2/3 of the latter, this is evidence that the “unification” transformation between leptons and quarks is equivalent to electric charge field energy being converted into strong colour charge energy, thus explaining why quarks have fractional charges, i.e. a strange quark has 1/3 of the electron’s electric charge because 2/3 of the electric charge energy is turned into colour charge energy.
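The bookkeeping for that energy conversion is trivial to write down. The 2/3 conversion fraction is this post’s hypothesis, not an established result, and the electron’s field energy is set to its rest-mass energy as suggested above:

```python
# Bookkeeping for the proposed electric -> colour field energy conversion.
# The 2/3 fraction is the post's hypothesis, not established physics.
ELECTRON_REST_MEV = 0.511                    # electron rest-mass energy, MeV
observed_fraction = 1 / 3                    # strange quark charge / electron charge
converted_fraction = 1 - observed_fraction   # fraction proposed to become colour energy

colour_field_energy = converted_fraction * ELECTRON_REST_MEV
print(colour_field_energy)  # ~0.3407 MeV per strange quark, on this hypothesis
```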

There’s still plenty of fun here, e.g. calculating the energy in the weak (W/Z mediated) field of a particle, and looking at different quarks’ charges (beyond the strange quark).

Since the 1960s electroweak unification successes and the 1970s successes of the standard model incorporating strong interactions, quantum gravity has been pursued mostly through a theory called supersymmetry, which assumes (1) the simplest possible quantum gravitational interaction, depicted as an exchange of spin-2 quanta between two masses, together with (2) the assumption that the word unification implies equal coupling constants for all forces at very high energy scales, corresponding to early times in big bang cosmology. Both of these assumptions are fundamentally wrong: (1) gravity is not the simplest possible exchange of field quanta but a Casimir-type exclusion of radiation (dark energy is the cause of it), and (2) unification implies that polarized-vacuum volume-integrated field strength energy densities for different particles differ according to the proximity of 1, 2 or 3 lepton or quark cores in leptons, mesons or baryons, respectively: the strong force coupling for a lepton singleton is zero because none of its electromagnetic coupling is converted into a strong field, whereas this is not the case for quarks! This is a new direction for physics, where calculations can be checked against data immediately to make progress!

This post is a response to https://robwilson1.wordpress.com/2024/01/02/gluons-and-octonions/

I should add that in the model above, the short-ranged strong and weak forces are emergent effects due to pair production in the vacuum between the close-in UV cutoff (basically a radius corresponding to the collision energy needed to create the heaviest virtual particles) and the IR cutoff (1.022 MeV collision energy equivalent, or ~33 fm radius, depending on the particles being considered). Beyond the IR cutoff range, the electron’s charge stops obeying the logarithmic running coupling formula and instead just obeys the classical Coulomb law, because the effective collision energy corresponding to such distances of approach between colliding particles is then too low to create even the lightest virtual particle pair (electron + positron, a total of 2 x 0.511 = 1.022 MeV collision energy equivalent).
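This cutoff behaviour is easy to sketch: below 1.022 MeV the coupling is frozen at its Coulomb-law value, and above it the standard one-loop QED logarithmic running applies. A minimal sketch (I include only the electron loop; heavier virtual pairs are ignored, so the high-energy value undershoots the measured one):

```python
import math

ALPHA0 = 1 / 137.036  # low-energy fine-structure constant
E_IR = 1.022          # MeV, pair-production threshold (2 x 0.511 MeV)

def alpha_eff(q_mev):
    """One-loop QED running coupling, frozen below the IR cutoff.

    Below E_IR no e+e- pairs can be created, so the coupling stays at its
    Coulomb-law value; above it, the standard logarithmic running with a
    single electron loop is used (heavier virtual pairs are ignored here).
    """
    if q_mev <= E_IR:
        return ALPHA0
    return ALPHA0 / (1 - (ALPHA0 / (3 * math.pi)) * math.log((q_mev / E_IR) ** 2))

print(alpha_eff(0.5))      # below cutoff: the plain Coulomb-law value
print(alpha_eff(91187.6))  # near the Z mass: noticeably larger than ALPHA0
```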

Because the weak and strong nuclear forces are “emergent,” i.e. depend on the conversion of electromagnetic force field energy via its self-shielding by the pair-production polarization mechanism (which takes up EM field energy, producing virtual particles that mediate short-ranged interactions), they have a physical mechanism and so aren’t quite so “fundamental,” at least in the sense normally assumed by the simplistic bigots, sorry, mainstream physicists… What’s really going on is an emergent effect due to vacuum polarization mechanisms, so the completely fundamental forces are dark energy (which creates gravitation as a Casimir-type shielding of dark energy by mass, so even gravity is “emergent” from a mechanism!) and the hidden SU(2) Yang-Mills electromagnetic theory. The observed “Maxwell equations” version, in an apparently Abelian U(1) form, is another mechanism-produced “emergent” phenomenon: massless charged SU(2) field quanta can’t propagate, due to infinite magnetic field self-inductance, unless in a two-way exchange between particles, whereby incoming massless charged field quanta cancel the magnetic field curl vector of outgoing field quanta. This mechanism physically sets the Yang-Mills charge transfer term equal to zero for such phenomena, so you just see a set of phenomena that conforms to Maxwell’s U(1) equations. Not entirely the “beautiful” superstring hype dogma!

(Hubris, aka corruption, is rife in groupthink everywhere and consists of selectively censoring certain “news”, over-reporting other “news”, “fact checking” certain news, and not “fact checking” other “news”. We see this in political and also scientific corruption. Most of the wars and terrorism in the world are about “people trying to get a message across”, which tells you all you need to know about groupthink bigotry. What is going on here is religious fundamentalism, but with “superstring” or “socialism” replacing old-fashioned beliefs like “Christianity” or “slavery”. The “top” members of such belief systems are zealots, defending their agenda by fighting off genuine criticisms and better alternatives. Their social lives, careers, and entire futures are superglued to falsehood, so to them it really is a fight to the death to prevent innovation, progress, and the triumph of human decency.)

Above: anomalies in experimental data are just treated as a joke by https://resonaances.blogspot.com/2022/04/how-large-is-w-boson-anomaly.html

  • Discussion of some of this https://robwilson1.wordpress.com/2024/01/01/a-finite-version-of-su3/
  • Robert A. Wilson Says:
    January 3, 2024 at 5:02 pm | Reply
    I don’t know what to make of this at all. But there is something very odd going on in the mathematics, which makes it look as though the up/down/strange triplets that gave rise to the original 8-fold way should really be up/down/muon triplets. This doesn’t make much sense to me, but maybe it makes sense to you? Certainly the muon mass and the sum of the strange+down+down quark masses are equal to within experimental uncertainty, so maybe the strange quark is just what’s left when you try and subtract a couple of down quarks from a muon? Just something “virtual”, rather than a real particle? I don’t know.
    • Nige Cook Says:
      January 3, 2024 at 5:19 pm
      What it needs is someone to take a fresh look at the actual evidence for standard model particle charges (which depends on how you calculate them regarding vacuum polarization shielding), masses (does the polarized vacuum contribute most of the mass?), and what happens to the energy of fields that are screened by vacuum polarization (it becomes short-ranged field quanta, e.g. the nucleus is held together by virtual pions as in Yukawa’s model, in a very simple way, just as air pressure forces are down to bombardment by air molecules, so “quantum tunnelling” is just the stochastic nature of the field quanta).
    • Woit kindly sent me an email supplying his 1980s papers, but didn’t reply on this topic, about correcting interpretational errors in SM orthodoxy before trying to see what the real experimental data is (nobody has ever seen a -1/3 charged quark directly; its value is simply implied by a faulty analysis). Prof. Clifford Johnson (a string theorist) replied that he “hadn’t thought about” such mechanisms of the vacuum polarization. Distler tried to make complex numbers in the optical theorem of QFT an excuse to ignore physical mechanisms entirely (note my mass gap paper https://vixra.org/abs/1408.0151 uses the Laplace transform, not the complex Fourier transform). Warren Siegel (string theorist) replied on nuclear physics, but only tried to obfuscate the simple virtual-pion-mediated nuclear binding energy by arguing that there are loads of different virtual mesons etc. at higher energy (which are simply not relevant at the nuclear distance scale, where pions dominate), etc.
  • Robert A. Wilson Says:
    January 3, 2024 at 5:28 pm | Reply
    You could well be right about that – but I’m now out of my depth in physics and can’t contribute meaningfully to that part of the discussion.
    • Nige Cook Says:
      January 3, 2024 at 9:57 pm
      It’s up to you, but as a mathematician, you may have an advantage in not being brainwashed by data-alloyed-to-ad-hoc-theory, if you ever decide to investigate in the future. It’s not as if there are mountains of stuff out there anyway; the anomalies stick out like sore thumbs even when wallpapered over or covered up.

Update (5 Jan 2024): I’m under some (semi-voluntary) pressure to turn out an “Understanding quantum field theory mechanisms” (or such-like title) book draft, so the new innovation from Woit and the response to it from Wilson is very important and I feel has broken an impasse. All my stuff is based on facts, and results in careful calculations with checkable predictions, but it probably doesn’t come across that way: e.g. https://vixra.org/pdf/1111.0111v1.pdf was published on vixra in 2011 for all to read and check, but now we are in 2024 and there is no interest. https://vixra.org/pdf/1111.0111v1.pdf is deliberately nascent and abrasive in style: its title “U(1) x SU(2) x SU(3) quantum gravity successes”, for example, refers to a theory in which U(1) is dark energy (which provably produces Casimir-type emergent quantum gravity) whose charge is mass; SU(2) is electromagnetism, which produces weak interactions when it chirally mixes with U(1), giving mass to its left-handed gauge bosons (the massless unmixed charged SU(2) gauge bosons also exist, for electromagnetism, but naturally can’t propagate on a net one-way path to transfer charge because of the infinite self-inductance, unlike the massive versions, the weak force bosons); SU(3) is the regular SM strong force. There are lots of bits of the argument in that paper which support these, and extensions and clarifications of details of particle mass predictions in later papers. There are also shorter papers (e.g. just a single page) on key results. But the weak spot is the organization, and the faith that readers would read it all.

But the key problem is deciding the gravity-electroweak gauge group. As stated above, a re-interpretation of U(1) x SU(2) might be right, but you have to mix U(1), the simple dark energy model where the charge is mass, with massless SU(2) in such a way that you get chiral massive field quanta giving a left-handed weak force, while allowing other SU(2) field quanta to remain massless, thus eliminating the net charge-transfer term (pp 26-27 of https://vixra.org/abs/1111.0111):

So that’s one option, which leads to a different re-analysis of the mixing parameters (CKM parameters) of the SM (one early look at this is pp. 41-45 of https://vixra.org/abs/1111.0111, but that is very ad hoc, so I wouldn’t stake my life on that very preliminary discussion about interpreting parameters, unlike the situation for most other stuff in that paper, where solid theoretical predictions are made and tested, e.g. dark energy/cosmological acceleration was accurately predicted and published prior to Perlmutter’s discovery). The key issue has always been with SU(2); Schwinger and Glashow in 1956 used the original Yang-Mills SU(2) theory to try to construct electroweak interactions but discovered that: “Things had to be arranged so that the charged current, but not the neutral (electromagnetic) current, would violate parity and strangeness … I was [in 1960] led to SU(2) x U(1) by analogy with the appropriate isospin-hypercharge group … there are two electrically neutral intermediaries: the massless photon and a massive neutral vector boson [Z] … The weak mixing angle determined to what linear combination of SU(2) x U(1) generators [B] would correspond.” – Glashow, 1979 Nobel prize lecture.

The issues to be resolved are: “Does SU(2) x U(1) model electroweak & dark energy (whose charge is mass) correctly? What is the CKM matrix for that, and does it reveal any ‘simple’ symmetries that can point to understanding what is going on (see Fig 35 on p45 of https://vixra.org/pdf/1111.0111v1.pdf for one CKM ‘pattern’ which may – or may not – be relevant)? Or does the correct result involve Woit’s SO(4) = SU(2) x SU(2) ⊂ U(2) (chirally, Spin(4, 0) = SU(2)_L × SU(2)_R), or Wilson’s idea? What CKM matrices would they require, and again, do they appear to offer a clearer understanding than the conventional CKM matrix of the SM (which appears fairly random, possibly indicating a defective model)?” Woit’s model appeals (see p23 of https://vixra.org/pdf/1111.0111v1.pdf, second column) because: “the most trivial possible Clifford algebra representation of U(2) spinors in Euclidean spacetime yields the chiral electroweak isospin and hypercharge law. This proves the claim that left-handed helicity electrons have a hypercharge of -1, and right-handed electrons have a hypercharge of -2, because, in a left-handed electron, half of the hypercharge field energy appears as weak isospin charge, which doesn’t happen in right-handed electrons.” (Another interesting argument on that page: Dirac’s purely electrodynamic model of quanta is SU(2), long before the discovery of weak interactions. Maxwell always argued that magnetic fields were a manifestation of spinning electric charge, thus an SU(2) electromagnetic theory, but it’s heresy to say so because the SU(2) Yang-Mills equation has a charge-transfer term and the corresponding Maxwell electromagnetic field equation doesn’t. But we know why.)

Anyway, there’s a new comment at Woit’s blog: “Unlikely as you may think it is, if a lone researcher or a small group of privately-funded researchers came with a viable solution to the hep-th crisis, it would deal a big blow to the whole academic system. And perhaps with good reason.” (Er, the academia problem is nothing new in the world: it’s traditional Parkinson’s Law/groupthink corruption/political tea party elitism, trying to defend failure by ignoring criticisms, or by rejecting alternative nascent ideas Herod-style. The problem with assuming it will be “dealt a blow” is that a “ship” so huge and bloated that it is already grounded can’t actually sink, any more than creditor banks could bankrupt Trump in 1992 when his debt was too big to go bad, i.e. big enough to suck down the creditors in a vortex. Anyway, as in the situation of the Emperor’s New Clothes in the kids’ story, the ending is not a revolution, but the masses agreeing to imagine that there is no real anomaly and that the clothing isn’t in any way, shape or form defective. The kid who complains that he sees some kind of problem needs to have his eyes checked.)

Woit responds: “I’m writing more about this in the next posting, but my point about hep-th funding is that, with about 4000 hep-th arXiv submissions/year and order of magnitude 10,000 active hep-th researchers in the world, the problem with progress coming to a halt is not due to not enough funding. Yes, maybe some very different funding mechanisms would help, but no one is even discussing that, instead just asking for more of the same.”

The commenter responds: “More “non-mainstream” research could be needed after so much mainstream (at least until mainstream theory becomes non-mainstream, as it happens to be the case with holography now, according to some sources). This may suggest that the difference between mainstream and non-mainstream can start to become unclear after the debacle of string theory. So I would rather talk about promising lines of research -mainstream or not-, which better be publicly-funded.”

Again, I think this has been discussed before, but the whole problem with “Communism” is that what passes for it is dictatorial slavery. The greatest con of them all is the re-branding, by some kind of “elite”, of slavery as “good socialism” or (in extreme cases) “Communism”. (I won’t go on about this topic, but you can find out more if you read Reagan’s evil empire speech, or by googling Holodomor.) In a “democracy” you get corruption. In state-funded science, you get corruption. If privately-funded science keeps on failing, it either goes bankrupt or at least doesn’t do as much damage (if people want to privately fund failing projects until they are penniless, that is more or less their decision). State-funded science diverts taxpayer cash from everyone, and if it doesn’t supply the goods, it becomes a leech.

All that sustains me is the conviction that our Realism, our rejection (in the face of all temptations) of all silly nonsense and claptrap, must win in the end.

Update (6 Jan 2024): Again (copied section from previous post):

… Woit’s original 1988 paper, “Supersymmetric quantum mechanics, spinors and the standard model”, Nuclear Physics B303, pp. 329-342, which he kindly posted (after I emailed him, since I don’t currently have easy journal access outside academia) at https://www.math.columbia.edu/~woit/ssym-nuclphysb.pdf – you can see the issue at page 332:

U(2) = (SU(2) × U(1))/Z_2

Woit identifies SU(2) with weak isospin and U(1) with weak hypercharge, showing that the hypercharge is Y = -1 for doublets like the pairing of the left-handed electron and the left-handed neutrino, yet correctly Y = -2 for the right-handed electron, which doesn’t partake in weak interactions (simply because there is no right-handed neutrino for it to interact with: right-handed neutrinos simply don’t exist).
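As a sanity check on those hypercharge assignments, the standard electroweak relation Q = T3 + Y/2 can be verified directly. A minimal sketch; the particle table just lists the three states discussed above, in the Y = -1 doublet / Y = -2 singlet convention used here:

```python
# Consistency check of the quoted hypercharge assignments, using the
# standard relation Q = T3 + Y/2 (Gell-Mann-Nishijima, electroweak form).
particles = {
    # name: (weak isospin T3, hypercharge Y, electric charge Q)
    "left-handed electron":  (-0.5, -1.0, -1.0),
    "left-handed neutrino":  (+0.5, -1.0,  0.0),
    "right-handed electron": ( 0.0, -2.0, -1.0),  # weak-isospin singlet
}
for name, (t3, y, q) in particles.items():
    assert t3 + y / 2 == q, name
print("Y = -1 doublet and Y = -2 singlet assignments are consistent")
```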

So, using this 1988 argument of Woit’s, the corrected SM is

SU(3) × U(2) = SU(3) × (SU(2) × U(1))/Z_2

This needs very careful analysis, because it appears to accomplish a lot without needless complexity.
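To see the Z_2 quotient in that formula concretely: the pair (g, z) and the pair (-g, -z) produce the same U(2) matrix z·g, so the product map SU(2) × U(1) → U(2) is two-to-one. A small numerical sketch (the random-quaternion construction of an SU(2) element is my own illustration):

```python
import numpy as np

# Numerical illustration of U(2) = (SU(2) x U(1)) / Z_2.
rng = np.random.default_rng(0)

# Random SU(2) element built from a normalized quaternion (a+bi, c+di form).
a, b, c, d = rng.normal(size=4)
n = np.sqrt(a*a + b*b + c*c + d*d)
a, b, c, d = a/n, b/n, c/n, d/n
g = np.array([[a + 1j*b,  c + 1j*d],
              [-c + 1j*d, a - 1j*b]])      # det(g) = a^2+b^2+c^2+d^2 = 1

z = np.exp(1j * rng.uniform(0, 2*np.pi))   # random U(1) phase

assert np.isclose(np.linalg.det(g), 1.0)   # g really is in SU(2)
assert np.allclose(z * g, (-z) * (-g))     # (g, z) and (-g, -z) coincide in U(2)
print("(g, z) and (-g, -z) give the same U(2) matrix: the Z_2 quotient")
```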

Dr Wilson today has a new post up: “Welcome to my Humble Unified Theory”, https://robwilson1.wordpress.com/2024/01/06/welcome-to-my-humble-unified-theory/ where he points out that the three 2 x 2 Pauli matrices (in other words, three small tables of two columns and two rows each, containing traceless arrays of 0, i, -i, 1 and -1; they are Hermitian, each with determinant -1, and the group’s elements are obtained by exponentiating them, g = exp(iθ_a σ_a/2)) form 3 “walls” which have 8 “corners”, the 8 being the number of SU(3) Gell-Mann matrices:

“Have you noticed how the 8 corners (Gell-Mann matrices) are related to the 3 walls (Pauli matrices)? Isn’t it really simple and functional? Do you see how it makes the HUT stand up, instead of being a pile of bits and pieces, like the Standard Model, looking like an earthquake has hit it? Do you see how, instead of a direct product SU(2) x SU(3) as in the Standard Model, the frame and the walls act together to create a structure, a model HUT that one can live in?”

Above: “Quaternions as 2 x 2 matrices”, extracted from https://leandraphysics.nl/qqd9.html, which comments: “Pauli matrices are a kind of ‘square roots’ of quaternions.” I include this illustration because there are no pictures of patterns laid out like this in Wilson’s blog post. (I don’t guarantee that this illustration is 100% relevant/accurate for what we are interested in, but it at least breaks up the text.)
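That “square roots of quaternions” remark can be checked directly: with a suitable sign convention, ±i times the Pauli matrices reproduces the quaternion units i, j, k. A minimal numpy sketch (the -i sign convention is my choice; other conventions differ by ordering), which also verifies the Hermitian/traceless/determinant -1 properties:

```python
import numpy as np

# The three Pauli matrices.
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2, dtype=complex)

for s in (sx, sy, sz):
    assert np.allclose(s, s.conj().T)        # Hermitian
    assert np.isclose(np.trace(s), 0)        # traceless
    assert np.isclose(np.linalg.det(s), -1)  # determinant -1

# Quaternion units: with this sign convention, -i*sigma obeys
# I^2 = J^2 = K^2 = IJK = -1 and IJ = K, the defining quaternion relations.
I, J, K = -1j * sx, -1j * sy, -1j * sz
assert np.allclose(I @ I, -I2)
assert np.allclose(J @ J, -I2)
assert np.allclose(K @ K, -I2)
assert np.allclose(I @ J, K)
assert np.allclose(I @ J @ K, -I2)
print("Pauli matrices reproduce the quaternion algebra (up to factors of i)")
```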

Wilson continues with the house building analogy and then states: “The Pauli matrices act on the Gell-Mann matrices to create a quark-mixing matrix and a lepton-mixing matrix, and the (unnamed) matrices on the top act on everything so that three generations can live under one roof.

“For those who are technically minded, and want the full spec, this group of order 648 is the full unitary group of 3×3 unitary matrices written over the field of order 4. But the generators I have given you can also be interpreted as unitary matrices over the complex numbers, by mapping the four elements of the finite field to 0, 1 and the two primitive cube roots of 1. So now you can measure everything with real and complex numbers, work out the dimensions, cut everything to size, weigh everything, measure the angles, and build your own Standard Model to your own specifications.”
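The quoted order 648 checks out against the standard order formula for the finite general unitary group GU(n, q): here “unitary over the field of order 4” means unitary with respect to the order-2 field automorphism of GF(4), so n = 3 and q = 2 (the identification of Wilson’s group with GU(3, 2) is my reading of the quote):

```python
# Order of GU(n, q): q^(n(n-1)/2) * product over i = 1..n of (q^i - (-1)^i).
# For n = 3, q = 2 this gives 8 * (2+1)(4-1)(8+1) = 8 * 81 = 648.
n, q = 3, 2
order = q ** (n * (n - 1) // 2)
for i in range(1, n + 1):
    order *= q ** i - (-1) ** i
print(order)  # 648, matching the order Wilson quotes
```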

Wow. Thanks a lot. (This last quoted paragraph shows perfectly the problem with trying to communicate with mathematicians.) SU(2) gives 2^2 – 1 = 3 distinct charge options (including neutral), and thus 3 distinct vector bosons. SU(3) gives 3^2 – 1 = 8 gluons of the strong force. So are the 3 weak bosons from SU(2) related to the 3 colours in SU(3)?

Wilson stated in his earlier post on the SU(2) Pauli matrices and SU(3) Gell-Mann matrices: “You cannot mix an electron with a neutrino, they are fundamentally different distinct objects, and there is no continuous symmetry that converts one to the other. So you must use the Pauli matrices as discrete symmetries, not as generators for the continuous group SU(2). The same is true for Gell-Mann matrices, because the proton, neutron, lambda baryon, three sigma baryons and two xi baryons are eight distinct particles, not an 8-dimensional continuum of particles. So we have to have a finite group generated by Gell-Mann matrices, not just a Lie algebra and a Lie group. It so happens that mathematicians know how to do this, although physicists apparently do not.”

But I’m not convinced by the starting argument there: “You cannot mix an electron with a neutrino, they are fundamentally different distinct objects, and there is no continuous symmetry that converts one to the other.” The electron and neutrino form a pair of particles that emerge from the decay of a short-lived W boson, as you can see from the Feynman diagrams at the top of this blog post! If they are not related by a “continuous symmetry” then they must be related in some other way (e.g. a polarized vacuum physical mechanism which shields charge, takes up the energy in the polarization of virtual particles for a brief spell of time that’s longer than Heisenberg’s loan time, and thus converts one type of charge into effective mass, or whatever). So you have to be careful about attributing too much power over nature to Noether (brilliant though her idea is). Simple physical mechanisms can break Noether’s continuous symmetries: Goldstone’s theorem.

(Trinification, i.e. the 1984 attempt by Glashow, Georgi and de Rujula using three linked SU(3) groups, offers an example of the kind of calculations needed to check basic alternatives in detail; Babu, Bajc and Susič’s 2023 “Trinification from E6 symmetry breaking” finds: “In the context of E6 Grand Unified Theories (GUTs), an intriguing possibility for symmetry breaking to the Standard Model (SM) group involves an intermediate stage characterized by either SU(3) × SU(3) × SU(3) (trinification) or SU(6) × SU(2). The more common choices of SU(5) and SO(10) GUT symmetry groups do not offer such breaking chains.” The SU(6) × SU(2) option is interesting due to Cacciapaglia, Cai, Deandrea and Kushwaha’s “Composite Higgs and Dark Matter Model in SU(6)/SO(6)”, and Lunsford’s 6-dimensional unification via SO(6) > SO(3,3), which yields the Pauli-Lubanski vector. To put space and time fully on equal footings, you need equal numbers of space and time dimensions, even though we only observe one effective time dimension.)
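For orientation, the dimensions of the groups in these breaking chains are quick to tally (plain Lie-algebra dimension counting, nothing model-specific):

```python
# Dimension bookkeeping for the symmetry-breaking chains mentioned above,
# using dim SU(n) = n^2 - 1 and dim SO(n) = n(n-1)/2.
def dim_su(n):
    return n * n - 1

def dim_so(n):
    return n * (n - 1) // 2

dim_e6 = 78                          # dimension of the exceptional group E6
dim_trinification = 3 * dim_su(3)    # SU(3) x SU(3) x SU(3) -> 24
dim_su6_su2 = dim_su(6) + dim_su(2)  # SU(6) x SU(2)         -> 38
dim_sm = dim_su(3) + dim_su(2) + 1   # SU(3) x SU(2) x U(1)  -> 12

# SO(3,3) is a different real form of the same complex algebra as SO(6),
# so it has the same dimension.
assert dim_so(6) == 15
print(dim_e6, dim_trinification, dim_su6_su2, dim_sm)  # 78 24 38 12
```

Both intermediate groups (24 and 38 generators) fit comfortably inside E6’s 78, and both contain the Standard Model’s 12, which is the minimal consistency requirement for such breaking chains.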

Update (7 Jan 2024). Woit’s new post (dated 5 Jan; I must have missed it yesterday, or maybe it is a time-zone issue) is unsurprisingly brief and funny:

“I just finished watching the video here, which was released today. Since this was advertised as a panel discussion on the state of string theory, I thought earlier today that it might be a good opportunity to write something serious about the state of string theory and its implications more generally for the state of hep-th. But, I just can’t do that now, since I found the video beyond depressing. I’ve seen a lot of string theory hype over the years, but on some level, this is by far the worst example I’ve ever seen. I started my career in awe of Edward Witten and David Gross, marveling at what they had done and were doing, honored to be able to learn wonderful things from them. Seeing their behavior in this video leaves me broken-hearted. What they have done over the past few decades and are doing now has laid waste to the subject I’ve been in love with since my teenage years. Maybe someday this field will recover from this, but I’m not getting any younger, so dubious that I’ll be around to see it.

“Most shameful of the lot was Andy Strominger, who at one point graded string theory as “A+++”, at another point only “A+”. He did specify that very early on he had realized that actual string theory as an idea about unification was not going to work out. He now defines “string theory” as whatever he and others who used to do string theory are working on.

“David Gross was the best of the lot, giving string theory a B+. At two points (29:30 and 40:13), after explaining the string theory unification vision of 1984-5 he started to say “Didn’t work out that way…” and “Unfortunately…”, but in each case Brian Greene started talking over him telling him to stop.”

Brian Greene and Edward Witten, who says the world was not created for our convenience in understanding! Brief clip for critical review from the 5 January 2024 String Theory Hype Fest, starring superstring “theorists” (they still don’t have a theory for the universe we happen to inhabit, but they have theories for 10^500 others), from left to right: Brian Greene, Andy Strominger, Edward Witten, with David Gross (Witten’s mentor) at far right. For a brief humorous review see https://www.math.columbia.edu/~woit/wordpress/?p=13770 or for correction of errors see https://www.quantumfieldtheory.org/ Witten does acknowledge the problems, but then he becomes an assertive Creationist: “the universe was not created for our convenience in understanding it.” Who told him that dogma? This is a repeat of the same old story, e.g. Einstein’s issue with the 1st quantization dogma (imaginary single-wavefunction “collapse”, when in fact path integrals prove there is simply multipath interference), which was falsely “resolved” by Bohr making up a similar Creationist story. In fact, saying something is “indeterministic” when you aren’t measuring it is nothing new. My watch is indeterministic. I always have to look at my watch to collapse the “single wavefunction” and determine the time. It never collapses when I’m not looking… Duh.
Brian Greene and Edward Witten saying QM plus GR gives a MESS. Witten then adds some prattle about the unification of QM and SR being done successfully (by Dirac). But GR is a classical theory, while QM is quantized.
Brian Greene and Edward Witten on the infinities in QFT. The infinities are due to discontinuities arising from the physical mechanisms for cutoffs: the IR cutoff is the threshold for the lowest-energy, lightest quantum pair production (defining the boundary between classical electromagnetism and the running coupling of QFT/QED), while the UV cutoff corresponds to the largest possible masses of pair production in the vacuum, occurring closest to zero distance. See https://vixra.org/abs/1408.0151 for how to get mass gap predictions from this renormalization program.
Brian Greene and Andy Strominger asserting 1st quantization single-wavefunction indeterminacy crap, not proper QFT path integrals with multipath interference of an infinite number of wavefunctions! Proof of ignorant propaganda.

For anyone who loves lying propaganda from dictators being fawned over by gullible newspaper editors, here’s some more of this sort of hype to make you happy:

Update (9 Jan 2024): the superstring religious zealot “theory” hype is not only TV evangelism, it’s also still being promoted by (not censored out of) arXiv: see 3 Jan 2024 https://arxiv.org/pdf/2401.01939.pdf at page 24:

https://arxiv.org/abs/2401.01939 “The Standard Model from String Theory: What Have We Learned?” WHAT HAVE WE LEARNED?
“We would like to conclude this brief review by returning to the question posed in its title: What have we learned from Standard Model constructions in string theory?

  1. The SM can be embedded into string theory: The main ingredients of the SM – the gauge group, chiral matter and Yukawa interactions – follow relatively naturally from general principles of string theory. Reproducing exactly the SM including, for instance, its vectorlike matter content or the precise flavor structure, is more involved, but each property per se has been obtained or is within reach. The challenge is to find vacua that combine all these features at the same time, with all moduli stabilized and incorporating a realistic cosmology.
  2. Why the world is described by the SM remains mysterious: Why the SM gauge group, matter content (e.g. three chiral families) or couplings describe our world is to date not clear from a string theoretic point of view, beyond anthropic arguments not specific to string theory, and may not even have a good explanation. This is related to the vacuum selection problem which is yet to be deciphered in the future…

Compare this crap to the religious or communist-fascist hard sell: “you’ll get utopia in the future if you join our movement now.” Of course, you get the opposite, hell, because they just start a war they can’t win, for a utopia that will become hell in the course of the war, in which you will be used as cannon fodder. Since there are known defects in the mixings in the empirical Standard Model (see the diagram at the top of this post, and also our discussion of how vacuum polarization field energy conservation is traditionally ignored and “taboo” for the childish loons who are perceived as QFT topdogs), even if the SM could be embedded into superstring theory in a scientific way (quantitatively predicting stuff that is checkable), it would be wrong, like Ptolemy’s deduction of an earth-centred cosmology from data glued into epicycles. The fact that you can come up with empirical equations that, with some additional fiddles, can be forced to accurately fit some data doesn’t mean you’re on the right road. The physical dynamics must also be self-consistent, and that isn’t the case with the Standard Model. Fix that, then find the big picture.

What is curious and highly depressing is what happens when you focus instead on the mainstream’s “outsiders and critics”: folk like Dr Woit of the Columbia University maths department, author of Not Even Wrong (2006), and Dr Smolin of the Perimeter Institute for Theoretical Physics, Waterloo (don’t you love that place name?), Canada, author of The Trouble with Physics (2006). These people have alternative ideas but approach the subject in precisely the same erroneous way as superstring “theorists”, i.e. accepting the Standard Model as it stands, and seeking to find it intact as part of an overarching unification scheme. Furthermore, they tend to accept that spin-2 quantum gravity, or at least some spacetime structure that accommodates such a theory, is a proved fact, so their view is that the way forward is to find a theory that unifies two defective mainstream models! When they have trouble, Parkinson’s Law is the convenient way out: procrastination will permit the problems to fade away, in the way you “get used to” a bad smell and stop complaining if you are forced to live near an open sewer for long enough.

This situation is a simple fact of human nature, something we see in politically dictatorial slavery dressed up as exciting, sexy socialist utopia (Orwell’s Animal Farm, 1984, etc.). Whenever it really hits the rocks, the dictators at the top simply start a war to deflect internal dissent onto external enemies. The only radical revolutionaries to oppose the dictators effectively are driven underground, literally or metaphorically: either by an icepick through the skull in Mexico (e.g. Trotsky, author of The Revolution Betrayed, which upset Stalin), or more recently by Novichok (Sergei and Yulia Skripal had it sprayed on their front door handle on 4 March 2018 in Salisbury, UK, leading to the killing of Dawn Sturgess and contamination of others). If you are tough, the enemy makes an active effort to stop you; be weak and they just passively censor/no-platform/ignore you. In fact, in the latter case, weak opponents of dictatorship actually do the dictatorship a favor by acting as strawmen to be laughed off stage. There is no peaceful road to revolution.

“A good analogy perhaps to the situation we’ve been in since 1984 would be if theorists in 1920 had all decided to follow Einstein, and spent the next forty years not pursuing QM and QFT, but studying ever more elaborate classical generalizations of GR, very excited about how they unified EM and GR. These generalizations somehow would never manage to make testable predictions, with the excuse that the properties of atoms were determined by the solutions to non-linear equations involving geometry at the Planck scale. So, while untestable, the theory would still deserve an A+++.” – Dr Woit.

At present, Dr Woit’s approach to dealing with mainstream superstring hubris – while I like the way he gets chiral SU(2) and U(1) from U(2) plus key SM charges (which may well, hopefully, still be applicable after correcting the SM’s simplistic errors as highlighted in this blog post) – reminds me repulsively of Catt’s (Maxwell equations) Anomaly and also James (carbon hype fest) Delingpole’s failed attacks on mainstream hubris: in both cases, these people (Ivor Catt and James Delingpole) refused to push for the reality to replace the mainstream dogmatic hypefest. Catt proposed to solve alleged issues in electromagnetic theory by reverting the theory to the pre-electron (J. J. Thomson discovered the electron in 1897) Heaviside TEM wave theory of 1893! Delingpole proposed to flush mainstream climate drivel down the tubes as a politically-based socialist fraud (of the CND “disarm or be annihilated” type), without replacing it with reality. I argued with both Catt and Delingpole, but found them both obnoxious elitists, who just wanted to fight a war in a politically correct manner, without escalating fast enough to win, and whose goals seemed very vague. If you ever have to fight in a war, make sure your leader has a battle plan that can win. Woit is proposing an alternative theory, but very, very weakly and very, very slowly, against a torrent of hard propaganda from superstring. He’s also not being radical enough just yet.

I now have a lot of personal pressure to get things moving very fast here, and will have to make an effort to push forward on all fronts myself, if nobody else will do anything that seems to be required to win. (This tends to lead to inevitable mistakes in presentation, or even occasionally in calculations, that would be unnecessary if only the people who are supposedly mathematical experts would investigate the facts themselves. It also leads to inevitable bitterness if one person has to try to fire a whole battery of cannon at the enemy, only to be ignored or “criticised” for unprofessional behaviour by supposed opponents of the same superstring enemy. We have now seen how weak attacks on mainstream electromagnetic theory, without escalation to win small wars hands-down, simply encourage hubris and dictatorship. Only overwhelming escalation can break down the status quo.)

Further update on 9 Jan 2024: I may be able to resolve the remaining problems fast enough anyway.

(Groupthink is either hypocritical or blatant dictatorship, no matter whether token dissenters are used to camouflage it. But from the viewpoint of the opponent, blatant dictatorship offers the best target.)

Update (16 January 2024): Peter Woit has a new interview up on YouTube, https://www.youtube.com/watch?v=LcKx1xRW2sQ Fortunately, he is speaking more slowly than in the less relaxed, controversial interviews debunking string theory nonsense in videos from a decade or two ago. Woit begins by saying he first became interested in astrophysics in the 70s, then quantum mechanics, and finally mathematics. This does throw some light on the problem. He was indoctrinated early on in certain dogmatic mainstream physics ideas, and then moved on to the mathematics behind that kind of mainstream interpretation of physics dogma. He went to the Institute for Theoretical Physics on the top floor of the maths department of Stony Brook University, NY (5 min 10 sec):

“… I wasn’t so interested in what physicists were doing … one reason for ending up in mathematics departments was that I could actually get a reasonable job in a mathematics department, which was not so easy in physics! If the job situation were different, then maybe I would have stayed in a physics department.”

At 7 min 30 seconds, Woit praises the M-theory promoter Witten and also mathematician Sir Michael Atiyah (who was Master of Trinity when Catt confronted him at High Table over his alleged “anomaly” in Maxwell’s equations, and then printed Atiyah’s letter in “The Catt Anomaly” in 1996 to some “controversy”, something I tried to resolve when writing articles about this for Electronics World over 20 years ago; Atiyah more recently became involved in another scandal, claiming in 2018 to have solved the 160-year-old Riemann prime numbers hypothesis at age 88, before passing away in somewhat mysterious circumstances a year later). Woit has a dry sense of humor for obfuscating, double-thinking nonsense: “you want to spit in their face by voting for Trump. If so, you are quite right to feel the way you do. From a lifetime spent among such elites I can tell you that, yes, they do look down on you. Most people here in New York City probably do think you’re an ignorant racist. Your problem though is that Donald Trump is one of us. … American politics has become a reality TV show, with the plot line all about convincing people that a contestant is unethical and dislikable, and so should be voted off the island.” (Emphasis added. Yep, he really wrote that. Take it as you will. What is interesting is that in 2016, Clinton was the one mired in “unethical and dislikable” behavior, whereas now, ironically, the left is using the same tactics to publicise Trump. Yet it wasn’t Trump who provoked the proxy wars with Russia in 2022 and Hamas in 2023: those followed the isolationist 2021 withdrawal of peace-keeping forces from Afghanistan, done in a way designed to wave a green lamp or white flag at Putin and Hamas. Carter did this in the 70s by deferring the neutron bomb deployment in deference to WPC propaganda paid for by Brezhnev, who then used the ensuing world power vacuum to escalate the cold war by invading Afghanistan. Appeasement also occurred in the 1930s, when we disarmed until 1935 while the Nazis rearmed, then rearmed slower than the Nazis, to avoid a repeat of the 1914 “arms race” that UK Foreign Secretary Grey had falsely blamed for WWI.)

Update (17 Jan 2024): Wilson has a mathematical paper up, linked at the end of his fairy tale post https://robwilson1.wordpress.com/2024/01/17/fe-fi-fo-fum/ My comment (in moderation queue there, copied below in case “lost”):

Thanks for uploading the paper https://robwilson1.files.wordpress.com/2024/01/pgm2.pdf which is precisely what I want. 

“The full 4 × 4 quaternion matrix algebra can also be interpreted as a Clifford algebra for a 6-dimensional real space with signature (6, 0), (5, 1), (2, 4) or (1, 5).”

What about (3,3)? Also, can the 4 x 4 Dirac matrices be related to SU(4)? (There’s an old 1985 paper on this: https://inis.iaea.org/collection/NCLCollectionStore/_Public/20/080/20080752.pdf but it’s obscure.)

The SM linking groups of 1, 2, and 3 basic charges, with combinations given by 1, (2^2) – 1 = 3 and (3^2) – 1 = 8 matrices, respectively, doesn’t fit mathematically that well with Dirac’s original QED equation with its four 4×4 matrices of spin and antimatter: problems come when reducing Dirac 4×4 matrices to 2×2 Pauli/SU(2) matrices, e.g. Dirac versus Majorana. Maybe Dirac’s 4×4 matrices should be expanded to 6×6, or the weak SU(2) matrices should be expanded to 4×4 like the Dirac matrices, or to SU(4)? Part of the issue is that the importance of chirality was only proved experimentally in 1957, by parity violation. There are dynamical reasons in physics why parity violation may be covered up in electromagnetism (e.g. infinite self-inductance preventing phenomena which would indicate it). I don’t think the mixing angles and SU(3) colour charges are problems; they are explained as emergent properties due to pair production phenomena when constrained by energy conservation in the dynamics of vacuum polarization (this dynamical mechanism and vacuum polarization energy conservation are currently ignored by the mainstream).
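To make the matrix counting above concrete, here is a minimal numerical sketch (assuming numpy, and the standard Dirac representation) showing how the four 4×4 Dirac gamma matrices are built from the 2×2 Pauli/SU(2) matrices, and checking the Clifford algebra relation {γ^μ, γ^ν} = 2η^{μν}I together with the chirality operator γ^5 that parity violation forces into the weak interaction:

```python
import numpy as np

# 2x2 Pauli matrices: the SU(2) building blocks
I2 = np.eye(2)
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)
paulis = [sx, sy, sz]

# 4x4 Dirac gamma matrices (Dirac representation), built from Pauli blocks
Z2 = np.zeros((2, 2))
gamma0 = np.block([[I2, Z2], [Z2, -I2]])
gammas = [gamma0] + [np.block([[Z2, s], [-s, Z2]]) for s in paulis]

eta = np.diag([1, -1, -1, -1])  # Minkowski metric, signature (+,-,-,-)

# Clifford algebra check: {gamma^mu, gamma^nu} = 2 eta^{mu,nu} I
for mu in range(4):
    for nu in range(4):
        anticomm = gammas[mu] @ gammas[nu] + gammas[nu] @ gammas[mu]
        assert np.allclose(anticomm, 2 * eta[mu, nu] * np.eye(4))

# gamma^5 = i gamma^0 gamma^1 gamma^2 gamma^3, the chirality operator
gamma5 = 1j * gammas[0] @ gammas[1] @ gammas[2] @ gammas[3]
assert np.allclose(gamma5 @ gamma5, np.eye(4))      # (gamma^5)^2 = 1
for g in gammas:
    assert np.allclose(gamma5 @ g + g @ gamma5, 0)  # anticommutes with all

# Generator counts quoted in the text: U(1), SU(2), SU(3) have 1, 3, 8
print([1, 2**2 - 1, 3**2 - 1])
```

Note the counting mismatch the text complains about is visible here: the Dirac algebra is 4×4, while the SU(2) sector only uses 2×2 blocks; SU(4) would have 4^2 – 1 = 15 generators.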

ABOVE: Einstein’s argument against “quantum mechanics” in Physical Review on 15 May 1935 is fake news, because it addresses SINGLE-WAVEFUNCTION 1st quantization, such as Schroedinger’s and Dirac’s use of the wavefunction, NOT the correct postwar Feynman 2nd quantization or “quantum field theory”, which has MULTIPLE WAVEFUNCTIONS for multiple paths, which must be summed in a path integral. The entire “quantum entanglement” and “Bell inequality” tribe is off in a prewar parallel universe in which path integrals (multiple wavefunctions, one per possible path) don’t exist. This comes from not having a focus on mechanisms, dynamics, and what really works best mathematically to describe nature.

UPDATE (23 January 2024):

If you look at the brief 1-page paper here: https://vixra.org/abs/1305.0012, you see it uses quantum gravity (the full analysis is in earlier papers such as https://vixra.org/abs/1111.0111) to obtain tc^3 = Gm, an equation popularised as an “empirical” model (without derivation) by Louise Riofrio, a NASA researcher. The more detailed paper https://vixra.org/abs/1111.0111 (and various other later papers listed at https://vixra.org/author/nigel_b_cook, plus a lot of posts from over a decade ago on this blog and at https://quantumfieldtheory.org/) makes the case that Riofrio should not assume c is the variable. This is not because c was “defined” as a “constant” by some committee 60+ years ago; such a “committee of experts” also defined the number of planets as 9 when I was a student, before another committee revised it to 8, and groupthink has a political spin which must be suspect to all serious researchers. Rather, it was Edward Teller’s “no-go theorem” against varying G (Teller in 1948 claimed that if G varied as Dirac proposed, there would be no life on earth, because the sun would have been hot enough to boil off earth’s oceans in the Cambrian era!) which is still being used to bar progress, and Teller’s “no-go theorems” were disastrous for the early H-bomb program (Teller also came up with one against compression in the H-bomb, which led Stalin to take over Eastern Europe and nearly triggered WWIII).
In a nutshell, the fundamental-force QFT mechanisms are inter-related, so, contrary to Teller’s assumption in his constant-G “no-go theorem” of 1948, if one force like gravity varies, so does electromagnetism. Fusion rates in the sun depend on each force in opposing ways: gravity causes four protons to fuse into helium by compression (something Teller was still denying for the H-bomb in 1948, and only reversed in March 1951 in his secret paper with Ulam, LAMS-1225), but the electrostatic repulsion between those protons ALSO varies with time in the SAME way that G varies with time, so Teller’s simplistic calculations for the sun are as junk as his 1948 H-bomb “no compression” arguments. One thing Teller got right: Dirac was wrong about G decreasing with the age of the universe; it is directly proportional to time instead. This replaces “cosmic inflation”: the universe was initially “flat” simply because G was initially small, giving little curvature! Recent evidence from astronomy backs up this premise: instead of a very rapid inflationary expansion on the Planck time scale (postulated by mainstream cosmology), you get the same effects from rising G.
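A quick order-of-magnitude sketch of the relation tc^3 = Gm (the 13.7-billion-year age and the ~10^53 kg ballpark mass of the observable universe are the standard figures assumed here):

```python
# Order-of-magnitude check of t*c^3 = G*m, solving for the mass m.
G = 6.674e-11          # Newton's gravitational coupling, m^3 kg^-1 s^-2
c = 2.998e8            # speed of light, m/s
t = 13.7e9 * 3.156e7   # age of universe: 13.7 billion years, in seconds

m = t * c**3 / G       # mass implied by t*c^3 = G*m
print(f"t = {t:.2e} s, m = {m:.2e} kg")
# m comes out ~2e53 kg, the right ballpark for the observable universe's mass
```

The same relation read the other way, with m and c fixed, is what gives G proportional to t.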

Update 20 Feb 2024:

Peter Woit

 says:

February 14, 2024 at 6:52 pm

zzz,
I think “huh?” is a pretty universal reaction from physicists to that paper. The argument for a different relation between vector and spinor geometry is hard to comprehend if you haven’t thought a lot about the usual version of this. I tried to write something short and as comprehensible as possible, but don’t think that succeeded very well. Next there will be a much longer version, we’ll see if that helps.

In the paper I try and point out how this different relation opens up possibilities for a new way to do unification. To me, these seem very promising and something that no one has looked at before, but time will tell. It’s perfectly sensible for people to decide they want to ignore this until it’s further developed, but I’m still seeing no argument for why I shouldn’t keep trying to explain these ideas to anyone who will listen.

https://www.math.columbia.edu/~woit/wordpress/?p=13824#comment-245006
  1. Robert A. Wilson says:
     February 11, 2024 at 3:29 am
     Yes, indeed, my question is about real forms, specifically whether you are working in SL(8,R) or SL(4,H) to describe the chirality of twistors. I understand that you don’t yet have an answer to that, but that you are looking at real forms of SU(4), where the corresponding question is to distinguish between SL(4,R)=Spin(3,3) acting on a Majorana spinor, and SL(2,H)=Spin(5,1) acting on a Dirac spinor. But then I can’t reconcile this question with properties of twistors acted on by Spin(4,2)=SU(2,2), or with your embedding of U(1) x SU(3) which requires real form SU(4) or SU(3,1).
     So you will surely eventually have to address the question of SL(8,R) or SL(4,H), and when you do, it may be useful to note that SL(4,H) contains SU(2) x U(3), but SL(8,R) does not. – https://www.math.columbia.edu/~woit/wordpress/?p=13824#comment-245006

Woit’s world – https://robwilson1.wordpress.com/2024/02/20/woits-world/

February 20, 2024

Meanwhile, over in Woit’s world, he complains that his papers get only a “huh?” reaction. So I have tried to help him with the following comment:

Possibly one reason for the “huh?” reaction is that people cannot translate it into the familiar language of Dirac matrices. The standard Dirac matrices are built as a product of finite groups Z_4.Q_8.D_8, where Z_4.Q_8 are the Pauli matrices and D_8 are 2×2 real matrices. Here D_8 is used for gamma^0 and gamma^5 to implement electro-weak unification. What you are pointing out, if I understand correctly, is that the same group can also be factorised as Z_4.Q_8.Q_8, simply by multiplying gamma^0 and gamma^5 by i, so that by exponentiation of Z_4.Q_8 we get SL(2,C), and the other Q_8 gives SU(2)_L. So you now have SU(2)_R inside SL(2,C), and SO(4) = SU(2)_L\otimes SU(2)_R acting on “Euclidean spacetime”. Moreover, by exponentiating the whole of Q_8\otimes Q_8 you get SL(4,R) acting on this spacetime.
You therefore achieve two remarkable advances at once: first, you have put the whole of SU(2)_L into the Dirac algebra, instead of just U(1); and second, you have a model which is not just Lorentz covariant, but generally covariant.
Expressed in this way, I think “huh?” might turn into “wow!”. …Now if you want to understand how Q_8.Q_8.Q_8 works to unify physics, read the new version of my paper, posted today at https://arxiv.org/abs/2401.13000. …

My response to the above:

About the Dirac matrices: again, I’m glad you’re looking at the fundamentals rather than building speculations upon speculations like the string theorists. But as with Woit, I fear you’re too conservative in looking at the assumed “facts”, which are an alloy of hard data (which is reliable) and crass early interpretative guesswork (which is unreliable).

The so-called “matter-antimatter asymmetry”, whereby about 90% of the observable universe is hydrogen (1 proton per 1 electron), is resolved by a quark-electron unification in which the allegedly fractional charges of the quarks in protons (and other hadrons) are actually integer charges, shielded by vacuum polarization, during which energy is transferred to virtual particles as the short-ranged nuclear forces. If this isn’t a lie, the Dirac matrices (plus the vacuum polarization energy utilization mechanism) yield the truth.

Dr Wilson has kindly responded with more about his recently updated paper: https://arxiv.org/pdf/2401.13000.pdf I think there is actually some help to be had from reading Woit and Wilson, in the sense that it focusses your mind on the key problems that need to be overcome by alternative areas of research. If you step off the beaten track, it’s almost all quicksand that sucks you in, you disappear, and leave virtually no trace whatsoever (maybe a floating hat and an arxiv paper upload). The trick is that if you manage to find submerged golden nugget rocks way off in the quicksand, on which to safely tread, you need to document it all exactly, and make a convincing case for others to take it seriously. Then what? Others move in and get the gold you discovered? Or keep clean by going on ignoring it? I think you just have to do the whole thing yourself if you can and ignore the mainstream.

Woit’s https://www.math.columbia.edu/~woit/wordpress/?p=13830 “Two Items” is a classic. Having in his previous post https://www.math.columbia.edu/~woit/wordpress/?p=13824 popped the champagne to celebrate having his theory to explain the electroweak sector of the SM reported by New Scientist, he goes back to criticising New Scientist string hype for nonsense, this time by Conlon, who apparently tries to grease over difficulties by commenting: “I also enjoyed our lunch in Oxford. No personal quarrel; but I am as mystified by your views as you seem to be about mine. … I cannot really understand the hostility to string theory as a subject, especially when compared to the alternatives … I can’t really grok why you seem hostile to ideas if they have the label `string theory’ attached, but indifferent if the same ideas appear without that label. … I don’t recall any posts by you criticising cosmologists who write papers about this era as unfalsifiable, etc [yeah, but as Woit might say, he works in QFT, not general science] …  I can’t really understand intellectual hostility from anyone with a broad-minded interest in particle theory.” Woit responded: “The focus on string theory is because the way it has been pursued and hyped to the public and other scientists over the last 40 years has done a lot of damage to the kind of research I care about. This first drew almost all resources in the field to one speculative idea, which was bad enough. Now after decades of this idea not working out, it has discredited all research on unified theories.” Conlon rejoined: “… I try and be super-careful about any expression like ‘test of string theory’. As you know, if you talk to journalists (which generally scientists should do if asked) you don’t control the blurb and you don’t control the title: and likely something will be picked that you would never choose yourself.
So you either have to live with headlines you would rather were different, or not say anything at all: and I choose the former.”

There are some slight signs Woit sees the real light at the end of the tunnel, e.g.: “The problem isn’t so much writing down the Lagrangian, which will just be the usual one for the Standard Model + GR. The real question is whether the use of chiral variables opens up a way of constraining the usual SM, explaining something new. The main problems here are that to get something new you probably need a new idea about the geometry of the Higgs, and I’m trying various things there. On the GR side, you have the usual problem that you are doing Euclidean quantum gravity: how do you understand analytic continuation?”

Another: https://robwilson1.wordpress.com/2024/02/20/wimps/#comment-7491

Dark matter has a long and chequered history. The key graph alleged to prove it is now an antique: https://en.wikipedia.org/wiki/Galaxy_rotation_curve#/media/File:Rotation_curve_of_spiral_galaxy_Messier_33_(Triangulum).png

Newtonian gravitation predicts that the orbital velocity of stars in the spiral arms of rotating galaxies should decrease at great distances from the centre of the galaxy. Redshift and blueshift data from the stars show that the velocity instead remains constant or increases at large radii, and this was held up as evidence of dark (non-luminous) matter of some sort (initially assumed to be cold dust or cold gas clouds, but later more exotic stuff, when observations of light coming through from more distant galaxies showed no attenuation).

Milgrom’s alternative idea is to modify Newtonian gravity at very low accelerations, below 10^{-10} m/s^2.

If you take Hubble’s recession velocity law, v = HR, where the distance is R = cT, T is the time into the past at which the light was emitted, and H is Hubble’s “parameter”, then:

v = HR = HcT

Time since big bang, t (years) = (1.37 x 10^10) – T

Thus, T = (1.37 x 10^10) – t (years)

v = HcT = Hc[(1.37 x 10^10) – t (years)]

acceleration, a = |dv/dt| = Hc ~ 10^{-10} m/s^2

Now what’s really interesting is that if you take this acceleration and put it into Newton’s 2nd law, F = ma, you get the outward force of the big bang. Newton’s 3rd law then gives an equal inward force, like an implosion bomb. From Feynman’s rules for interactions, you get the simple graviton cross-section (it’s about 1 barn for the sum of all the particles of observable matter in the universe). Then simple geometry gives you the mechanism for gravitational forces, as depicted clearly in https://vixra.org/pdf/1305.0012v2.pdf
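The arithmetic above is easy to sketch numerically (assuming the conventional H of roughly 70 km/s/Mpc; Milgrom’s 1.2 x 10^{-10} m/s^2 threshold is quoted for comparison):

```python
# Check that a = H*c lands at the ~1e-10 m/s^2 scale quoted above.
H = 70e3 / 3.086e22      # Hubble parameter: 70 (km/s)/Mpc converted to 1/s
c = 2.998e8              # speed of light, m/s

a = H * c                # cosmological acceleration, magnitude of dv/dt
print(f"a = Hc = {a:.2e} m/s^2")   # ~7e-10 m/s^2

a0_milgrom = 1.2e-10     # Milgrom's MOND acceleration scale, m/s^2
print(f"a / a0 = {a / a0_milgrom:.1f}")   # same order of magnitude
```

So Hc and Milgrom’s a0 agree to within an order of magnitude, which is the coincidence the text is pointing at.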

My father was arguing this back in the 1980s: everything in the universe is very similar to the phenomena in a 10^55 megaton nuclear explosion. Dr Gamow, Dr Teller and later Dr Kuroda first encountered this in the 1950s, in the sense of using nuclear space-burst phenomenology to predict the cosmological background radiation and the abundances of the light elements due to an incomplete fusion burn (like Teller’s early H-bomb design, the big bang fireball expanded and cooled too fast to fuse all its hydrogen!), etc. Unfortunately, this is very unfashionable amongst the WIMPy physicists at the top. Big Bang cosmologist Weinberg tried to use Einstein’s classical curved space to argue against this simple explosion analogy, which remains taboo for almost religious reasons, as well as unfashionable. (You even get people trying to deny that explosions are possible in “space”, despite many successful nuclear tests in space, not to mention supernovas, etc.)

Dr Wilson replied:

Robert A. Wilson Says:
February 20, 2024 at 4:33 pm | Reply

Yes, I agree that there is a lot in common between nuclear bombs and explosions (e.g. of stars) in the larger universe. Where I find it difficult to follow is to extrapolate to the “Big Bang”. That is, I can imagine big bangs that take place in space and time, and may be many orders of magnitude bigger than exploding stars, and I can imagine that there was a bang 14 billion years ago that was so huge it wiped out almost all trace of anything that came before it, up to a huge distance from where we are in the universe. But I cannot imagine an explosion that was so big as to create the entire universe, because that explosion would have had to be infinitely big, and therefore physically impossible. …

Robert A. Wilson Says:
February 20, 2024 at 4:53 pm

It’s the hypothetical “inflation” that is the problem. The universe is not uniform enough for this to be a realistic scenario. …

  • Nige Cook Says:
    February 20, 2024 at 6:33 pm
    Not infinitely big: 4.2 x 10^70 Joules (10^55 megatons of TNT equivalent). Within 3 minutes of the big bang, the temperature was ~10^9 K, on the same order as that reached in a very efficient nuclear explosive, such as the isentropically compressed Ripple II, 30 October 1962 at Christmas Island. After that, it’s a straightforward nuclear fusion reaction. If you junk the GR-based classical cosmological epicycles model like the FRW metric (fiddled with ad hoc, unexplained dark energy and dark matter), you can get a simple implosion bomb analogy that explains dark energy and gravity quantitatively. I used to argue that this debunks string theory, because once you’ve got one fact-based mechanism that gives non-spin-2 gravitation and correctly predicted dark energy in advance, you don’t need string. However, string theorists don’t want falsification of their budgets. It’s more curious why Woit and people like him don’t care.
  • Nige Cook Says:
    February 20, 2024 at 6:50 pm
    “It’s the hypothetical “inflation” that is the problem. The universe is not uniform enough for this to be a realistic scenario.” – Dr Wilson
    Regarding “inflation”: this is needed to get extreme flatness in the FRW metric (GR epicycles) cosmology, which predicts that the compressed ~10^52 kg mass at early times has huge gravitational fields (curvature), causing far more predicted fluctuation in masses around the universe than is observed in the microwave background radiation originating ~300,000 years after the big bang. (This was first observed in the 70s using high-altitude U-2 spy planes with microwave sensors, then satellites like COBE were used from 1992.) NASA’s Louise Riofrio’s empirical equation for the universe, tc^3 = Gm, is actually derivable (see https://vixra.org/pdf/1305.0012v2.pdf). Riofrio claims, mistakenly I believe, that this shows c varies with the cube-root of time, without mechanism. I’ve gone into this and it’s wrong. The correct solution is that Newton’s coupling G is directly proportional to the age of the universe t. This gets rid of inflation by predicting the observed weak curvature at 300,000 years (background radiation decoupling time, when matter ceased being a radiation-absorbing ionized gas).
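Both figures quoted in the comments above can be sanity-checked in a few lines. This is a back-of-envelope sketch: the 13.8 Gyr age of the universe and the standard CODATA-style values of c and G are assumptions supplied here, not taken from the comments.

```python
import math

# First check: 10^55 megatons of TNT expressed in joules
# (1 megaton of TNT = 4.184e15 J by definition).
MEGATON_J = 4.184e15
E = 1e55 * MEGATON_J
print(f"{E:.2e} J")            # ~4.2e70 J, the figure quoted above

# Second check: the mass implied by Riofrio's empirical relation t*c^3 = G*m,
# assuming t = 13.8 Gyr for the age of the universe.
c = 2.998e8                    # speed of light, m/s
G = 6.674e-11                  # Newton's coupling, m^3/(kg s^2)
t = 13.8e9 * 365.25 * 24 * 3600   # ~4.35e17 s
m = t * c**3 / G
print(f"m = {m:.2e} kg")       # ~1.8e53 kg, within an order of magnitude
                               # of the ~10^52 kg quoted above
```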

Update (24 Feb 2024): Dr Woit has a link to a new interview with Prof. Lee Smolin, author of the 2006 book criticising string theory’s hubris and failures, “The trouble with physics”, and the more interesting 2013 book (which I’ll review below), “Time reborn”. As indicated in the video, Smolin has been diagnosed with Parkinson’s disease, supposedly a dopamine production problem (which also affects two of my uncles), but he was fitted with a deep brain stimulator last year. The problem here is that dopamine can’t cross from the blood to brain tissue, so simply taking dopamine has no effect on Parkinson’s; thus the treatments are indirect and relatively ineffective, compared to treatments for diseases where deficiencies can be directly remedied.

Quick review of Smolin’s 2013 book Time reborn (the appendices of which are available online at https://leesmolin.com/time-reborn-2/online-appendices/):

  1. Smolin remarks on p.xiv of his Introduction: “Laws are not timeless. Like everything else, they are features of the present, and they can evolve over time.” (I have reasons – see above – for agreeing 100% with this, but it’s not clear what Smolin’s are. He has no equations, no evidence to support it.)
  2. p. xxv-xxvi of Introduction: “I do not have such a theory, but what I can offer is a set of principles to guide the search for it. These are presented in chapter 10. … The central principle is that … physical laws must evolve in that real time. … The American philosopher Charles Sanders Peirce wrote in 1891: ‘… Now the only way of accounting for the laws of nature and for uniformity in general is to suppose them results of evolution.’ … Paul Dirac … speculated: ‘At the beginning of time, the laws of Nature were probably very different from what they are now. … we should consider the laws … as continually changing … instead of as holding uniformly throughout spacetime.’ [Quoted from Dirac’s paper “The relation between mathematics and physics,” Proc. Roy. Soc. v59, pp122-9, 1939.] … Feynman … mused in an interview: ‘The only field which has not admitted any evolutionary question is physics. Here are the laws … how did they get that way, in time? … So, it might turn out that they are now the same all the time …’ [Quoted from Feynman’s 1973 PBS TV interview program, “Take the world from another point of view.”]”
  3. p.xxvii of Introduction: “Leibniz formulated a principle to frame cosmological theories called the principle of sufficient reason, which states that there must be a rational reason for every apparent choice made in the construction of the universe.” Anaximander (610-546 BC) began to look for mechanisms for phenomena, rather than simply inventing new Gods of lightning and fire.
  4. p.xxviii: “Time must be a consequence of change; without alteration in the world, there can be no time.”
  5. p.xxx: “In the Standard Model … the properties of an electron, such as its mass, are dynamically determined by the interactions in which it participates … masses arise from their interactions with other particles … Most of the laws of nature once thought of as fundamental are now understood as emergent and approximate [e.g. the least action approximations in path integrals, whereby many interactions result in a “classical law”]. Temperature is just the average energy of atoms in random motion, so the laws of thermodynamics that refer to temperature are emergent and approximate.”
  6. p97: “It remains a great temptation to take a law or principle we can successfully apply to all the world’s subsystems and apply it to the universe as a whole. To do so is to commit a fallacy I will call the cosmological fallacy.”
  7. p100: “Mira’s father also believes in another law, which is that all children prefer chocolate … But suppose that Mira is the only child that exists. There will be no way to test whether … hypotheses are general laws or just observations. … in cosmology there is genuinely only one case. … Newton’s first law of motion asserts that all free particles move along straight lines. … But each test involves an approximation, because … Every particle in our universe feels a gravitational force from every other. If we wanted to check the law exactly, there would be no cases to apply it to. Newton’s first law can, at best, be an approximation …”
  8. p207: “If the fundamental laws are time-symmetric, then the whole burden of explaining why our universe is time-asymmetric falls on the choice of initial conditions. … This point has been emphasised by Roger Penrose, and he has proposed a principle to explain it, which he calls the Weyl curvature hypothesis. The Weyl curvature … is non-zero whenever there is gravitational radiation or black or white holes. Penrose’s principle is that at the initial singularity this quantity vanishes.”
  9. p208: “There’s another and much simpler option. We believe that our laws are approximations to some deeper laws. What if the deeper laws were time-asymmetric? … A time-asymmetric universe would no longer be improbable, it would be necessary. … The difference between a physics near the initial singularity and a physics late in the universe would be forced on us by a quantum theory of gravity, which in Penrose’s view should be a highly time-asymmetric theory. But a time-asymmetric theory is unnatural if time is emergent. If the fundamental theory contains no notion of time, we have no way to distinguish the past from the future.”
  10. p219: “We all ride flows of matter and energy – flows driven ultimately by the energy from the sun. … the earthly realm is kept from equilibrium by the flow of energy through it. … the law of increasing entropy does not apply to the biosphere, which is not an isolated system. Indeed, natural selection is a mechanism for self-organisation … Highly complex systems cannot be in equilibrium, because order is not random, so high entropy and high complexity cannot coexist. Describing a system as complex does not just mean that it has low entropy. A row of atoms sitting in a line has low entropy, but is hardly complex …”
  11. p221: “There is good reason to believe that the matter and radiation in the early universe was nearly in thermal equilibrium. The matter and radiation were in a hot state, with a remarkably uniform temperature … All the structure and complexity we see today formed after matter and radiation decoupled” [i.e. when atoms formed from a gas of electrons and ions at a temperature around 3,000K around 400,000 years after the big bang, with that red-shifted radiation observable today as the 2.7K microwave background radiation].
  12. p224: “Stars, solar systems, galaxies and black holes are all anti-thermodynamic. … Consider a planet in orbit around a star. If you put energy in, it will move to an orbit FURTHER from the star, where it moves SLOWER.” Smolin is pointing out here that, although such a planet has lower kinetic energy, the added energy appears as increased gravitational potential energy which thermodynamics ignores.
  13. pp231-3: “There’s a small scientific literature attempting to chart what will happen far in the universe’s future. It’s all speculative, because … you have to make some big assumptions. One is that the laws of nature must never change … And no undiscovered phenomena must exist that could change the course of the universe’s history. … What happens to the universe in the long term depends mostly on … dark energy … about 73 percent of the mass-energy of the universe. … The universe’s future will be very different depending on whether the density of dark energy remains constant or not.” [Dark energy is just exchanged field quanta radiation, which is redshifted by the expansion of the universe, but it’s complicated because dark energy also powers gravitation as a Casimir force, and the gravity coupling, G, increases as a function of time; is the gravitational cross-section of a particle constant? These are deep waters and Smolin is too superficial, looking just at Einstein’s GR cosmological constant, not the detailed and involved QFT dark energy mechanisms.]
  14. p237: “Roger Penrose … speculates [in his 2011 Cycles of Time] that after some point … only photons and other massless particles would be left. If so, there would be nothing to detect the infinite passage of eternity, because photons, since they travel at the speed of light, don’t experience time at all. To a photon, the eternity of the very late universe would be indistinguishable from the very early universe. The only difference would be temperature. … Penrose argues that a single scale doesn’t matter … all that matters is the comparisons, or ratios, between things that exist at that time; the overall scale cannot be detected. So the late universe … becomes indistinguishable from the hot gas of the same particles filling the early universe … the late universe is also the birth of another universe.”
  15. p240: “To make further progress in cosmology (and in fundamental physics as well) we need a new conception of a law of nature, valid on the cosmological scale, which avoids the fallacies, dilemmas, and paradoxes and answers the questions that the old framework cannot address.”
  16. pp259-260: “In a complex modern economy, with many goods made by many firms and bought by many consumers, there are a lot of ways to set the prices of goods so that supply and demand are in balance. [E.g., a cup of tea can have different prices on a plane, in a motorway cafe, or at the Ritz restaurant. Cars, computers, houses, bars of chocolate have different options, and different prices] … economist Brian Arthur … began to argue that economics was path-dependent [i.e. with lots of different stable prices for the same product]. His evidence for this was that the economic dictum known as the law of decreasing returns is not always correct. This law says that the more of something you produce, the less profit you make from each item you sell [because the market is saturated with goods, and people only need so many loaves of bread per household per day]. This is not necessarily true, for example, in the software business, where it costs almost nothing to make and distribute additional copies of a program, so all the costs are up front. Arthur’s work was treated as heresy …” [The point is, even the basic “mathematical laws” of economics are undermined by the underlying mechanisms, which are usually far more complex or subjective, than assumed.]

Update (25 Feb 2024): Woit’s basic propagator calculation is in his new lecture notes, https://www.math.columbia.edu/~woit/QFT/qftmath.pdf. The Fourier transform (or its equivalent without complex number games, the Laplace transform) is central to practical calculations because it converts a force field potential energy formula such as the Coulomb potential (energy as a function of distance) into what is called a “propagator” (energy as a function of particle momentum) of a vector boson which is exchanged between charges to actually cause a force. The fact that Woit sticks to Fourier transforms, whose integration requires the mathematical trick of adding an infinitesimal negative imaginary amount to the denominator just to do the integration, rather than using a simple-to-physically-understand Laplace transform in real space, is really significant, because it proves that what is important to him is symbolic orthodox dogma, which leads to confusion, not rapid progress in physics.
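To make the potential-to-propagator conversion concrete, here is a minimal numerical sketch (my own illustration, not taken from Woit’s notes): the 3D Fourier transform of a screened Coulomb (Yukawa) potential exp(−mr)/r reduces to a one-dimensional radial integral, and because the screening mass m is positive, no infinitesimal-imaginary (iε) trick is needed to make the integral converge.

```python
import math

# The 3D Fourier transform of exp(-m*r)/r reduces to the radial integral
#   (4*pi/p) * Int_0^inf exp(-m*r) * sin(p*r) dr,
# which should equal the familiar propagator 4*pi/(p^2 + m^2).
# Pure-stdlib midpoint quadrature; the m > 0 screening makes it convergent.
def propagator_numeric(m, p, r_max=80.0, n=400_000):
    dr = r_max / n
    s = 0.0
    for i in range(n):
        r = (i + 0.5) * dr                 # midpoint rule
        s += math.exp(-m * r) * math.sin(p * r) * dr
    return 4 * math.pi / p * s

m, p = 1.0, 0.7
numeric = propagator_numeric(m, p)
exact = 4 * math.pi / (p**2 + m**2)
print(numeric, exact)    # the two agree to several decimal places
```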

Precisely the same mathematical obfuscation was used to kill off progress in understanding electromagnetism and circuit transients by Oliver Heaviside and his acolytes. If you have a capacitor (C farads) and resistor (R ohms, from Ohm’s law V = IR) in a series circuit, the current flowing through them is proportional to exp[-t/(RC)], which follows from the fact that the capacitor’s current is set by the rate of change of its voltage, I = C dV/dt. Once the capacitor is fully charged up to V volts, the flow of current through it drops to zero, because the total charge it can store is limited to q = CV. Magnetic fields generated by a time-varying current will themselves induce further currents if there are loops or coils of conductors in circuits; magnetic inductance L being related to voltage and current by V = L dI/dt. So you can understand the macroscopic behaviour of a circuit enough to make some predictions, but then you’re not looking at the fundamentals of electron and field quanta mechanisms, just at the statistical averages on large scales, akin to the measurement concepts of “temperature” and “pressure”, rather than the actual distributions of air molecule speeds and impacts that actually cause the forces behind such gross measurement units.
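The RC transient just described can be sketched in a few lines (the component values here are arbitrary illustrations):

```python
import math

# Series RC circuit charging from a step voltage V: the charging current
# decays as I(t) = (V/R) * exp(-t/(R*C)), and the stored charge tends to
# the limit q = C*V, at which point the current has dropped to zero.
def rc_current(V, R, C, t):
    return (V / R) * math.exp(-t / (R * C))

V, R, C = 10.0, 1e3, 1e-6      # 10 V step, 1 kilohm, 1 microfarad: RC = 1 ms
for t in (0.0, 1e-3, 5e-3):
    print(f"t = {t*1e3:4.1f} ms   I = {rc_current(V, R, C, t)*1e3:.4f} mA")
# At t = RC the current has fallen to 1/e (~37%) of its initial V/R = 10 mA.
```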

Heaviside and others, for various practical short-term reasons, rather than analyzing what is really going on at the particle level, instead went further into mathematical abstraction for the purpose of making calculations. For example, instead of analyzing the gauge boson mechanism for magnetic self-inductance of current in a transmission line, they introduced complex numbers and frequencies into lumped inductance and capacitance, replacing L and C with Liω and 1/(Ciω), where i is the square root of -1, and ω is the frequency of the time-dependence of the current (complete oscillations per second). This was originated by Steinmetz as a convenient way to analyze A.C. circuits. Heaviside introduced other non-physical mathematical concepts to aid simplistic calculations, such as his “step-function”. However, such non-physical methods have led to unnecessary mathematical obfuscation, because they end up being taught as orthodoxy and covering-up the real simplicity, as occurs in abstruse QFT calculation methods:

“… Oliver Heaviside … attained his results in a half-intuitive manner, which opened up his methods to suspicion from the more orthodox mathematicians of his time. Heaviside’s obscure and disorderly writings are enlivened by sarcasm directed at these misguided rigorists, such as ‘Whether good mathematicians, when they die, go to Cambridge, I do not know’. However, Cambridge eventually returned good for evil by the hand of T. J. I’A. Bromwich [ref: “Normal Coordinates in Dynamical Systems”, Proc. Lond. Math. Soc. (2), v15, p401, 1916] who showed that Heaviside’s results really could be soundly based on mathematical reasoning. The same thing was done simultaneously by K. W. Wagner of Berlin [ref: “Über eine Formel von Heaviside zur Berechnung von Einschaltvorgängen”, Archiv für Elektrotechnik, v4, p159, 1916]…” [Whether a mathematical tool is sound or not is irrelevant to whether it exposes physical reality.]

  • G. W. Carter, The Simple Calculation of Electrical Transients, Cambridge Uni. Press, 1944, p2.

The point is, precisely the same kind of Fourier analysis with complex conjugates has been used for over a century in A.C. circuit analysis, to convert time-dependent field or electrical energy waveforms (energy as a function of time) into energy frequency spectra (energy as a function of frequency) for calculations, just as it is used in quantum field theory to convert a Lagrangian energy formula into a propagator that can be used to make QFT calculations via Feynman’s rules for perturbative expansions (where the propagator of the vector boson is multiplied by couplings that represent effective cross-section areas of charges, to get a path probability). The fact is, where complex conjugates occur they’re calculational tricks that have to be eliminated elsewhere in the calculation, so they’re simply not proof of any kind of real-world phenomena. What is being taught is not physics, but mathematical tools that cover it up.
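For concreteness, the Steinmetz substitution mentioned earlier (L becomes iωL, C becomes 1/(iωC)) reduces A.C. circuit analysis to complex arithmetic; a minimal sketch for a series RLC circuit follows (the component values are arbitrary):

```python
import cmath, math

# Steinmetz phasor method: replace L and C by complex impedances i*omega*L
# and 1/(i*omega*C), then treat the circuit with ordinary complex algebra.
def series_rlc_current(V, R, L, C, omega):
    Z = R + 1j * omega * L + 1 / (1j * omega * C)   # total complex impedance
    I = V / Z                                        # phasor current
    return abs(I), cmath.phase(I)                    # magnitude, phase angle

R, L, C = 50.0, 1e-3, 1e-6
omega_res = 1 / math.sqrt(L * C)    # at resonance the L and C terms cancel
mag, phase = series_rlc_current(10.0, R, L, C, omega_res)
print(mag, phase)   # 0.2 A at zero phase: the circuit looks purely resistive
```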

Very simple physical dynamics are submerged by irrelevant obfuscating and frankly misleading mathematical calculations, and the resulting fog of war is then used to defend continued ignorance by officialdom. “Look, our analysis has all these square roots of -1, therefore the world is beyond comprehension, there are no mechanisms, we need 10/11 supersymmetry and spin 2 gravitons, progress requires extremely complex mathematics since God is thus proved to be a pure mathematician.” Nah.

Update (27 Feb 2024): Woit’s current post https://www.math.columbia.edu/~woit/wordpress/?p=13830 refers to Padmanabhan‘s article https://arxiv.org/abs/1712.06605 “What happens to the anti-particles when you take the non-relativistic limit of QFT?”, which in equation 195 on page 53 states (basically) that Dirac’s path amplitude (after deleting bra-and-ket symbols and other math trivia to get to the machine) is,

path amplitude = exp[-it(p^2 + m^2)^{1/2}]

Now this is interesting because you’ve got relativity stuff – specifically Einstein’s Hamiltonian H = (p^2 + m^2)^{1/2} – in there; this expression, which also appears to be related to the modern QFT Feynman propagator, is Dirac’s c.1933 solution to his own 1929 equation. (Dirac, “The Lagrangian in QM”, Physikalische Zeitschrift der Sowjetunion, 3: 64–72, embedded below; but note that Dirac was too conservative and really stuck to a Hamiltonian, not a Lagrangian multipath interference treatment – notwithstanding the misleading title of the paper – since he got the equation wrong, by ignoring multipath interference between “virtual” paths, which is the whole basis for the non-classical mechanisms in QFT! Like the con he was – they all are – he similarly in 1929 “predicted” that the anti-electron was the massive proton. Enough said.)
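As a check on the non-relativistic limit that Padmanabhan’s title asks about, the phase exp[-it(p^2 + m^2)^{1/2}] reduces, for p much smaller than m, to a rest-mass phase exp(-imt) times the Schrödinger phase exp(-itp^2/2m). This is a sketch of my own (units with c = ħ = 1), not a calculation from the paper:

```python
import cmath

# Exact relativistic phase vs its non-relativistic expansion
# sqrt(p^2 + m^2) ~ m + p^2/(2m) for p << m.
def exact_phase(p, m, t):
    return cmath.exp(-1j * t * (p * p + m * m) ** 0.5)

def nonrel_phase(p, m, t):
    return cmath.exp(-1j * t * (m + p * p / (2 * m)))

p, m, t = 0.01, 1.0, 5.0
print(abs(exact_phase(p, m, t) - nonrel_phase(p, m, t)))  # tiny for p << m
```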

What Dirac did was to make QM relativistic in his 1929 equation that “predicts antimatter”. You can see how it predicts antimatter: the negative roots of the Hamiltonian “predict antimatter”, because squaring a negative number gives you positive energy just as if you had squared a positive number. Feynman’s spacetime path integral from 1948 is different:

The literature of QFT has a lot in common with Marxist or obfuscating religious literature: high in trivia and dogma, low in understanding to the point of actually denying the existence or possibility of understanding, in order to defend some vaguely specified “orthodoxy”.

Update (28 Feb 2024): on the subject of mathematical crap being substituted for physical understanding of actually observed phenomena, Sabine has a comment on Woit’s Not Even Wrong today, saying “I find it interesting how graphic designers can’t seem to resist the temptation to arrange the standard model particles in some sort of symmetrical configuration even if that doesn’t make any sense.” A bit rich coming from somebody who promotes herself as an active physicist, yet whose most recent paper she links to on Google Scholar is from 2013 and titled “Minimal Length Scale Scenarios for Quantum Gravity”, Living Rev. Relativity, 16, (2013), 2: “We review the question of whether the fundamental laws of nature limit our ability to probe arbitrarily short distances.” Philosophy, not science.

Update (leap day 2024): Woit finally responded to Sabine with a sensible picture of fermion relations:

This is key: if you follow our evidence for lepton-quark unification above, the SM “charges” are not fundamental because they are shielded by vacuum polarization; what’s fundamental is spinor chirality:

Woit’s analysis kinda fills in the gaps in the mechanism described above. Note that all the factors of 1/3 in the hypercharges (Y) which occur for quarks, are due to the shared (combined, multiplied) stronger vacuum polarization shielding mechanism that occurs for pairs or triplets of quarks, as opposed to the weak (single veil) vacuum polarization shielding for lone electrons.

Notice above that these “morphisms” or “transformations” are mathematical, like chemical reactions written down on paper (they don’t necessarily happen, due to energy considerations, and when they do, they may be dependent on factors like kinetic energy – i.e. temperature – or catalysts, to prevent a reverse reaction from undoing what you want to get). We’re talking about the relationships between different things, not saying you can transform them. A good example is beta decay of nucleons. A neutron is heavier than a proton, so the neutron can decay. The reverse reaction, proton decay, can be written down on a piece of paper as for SU(5) unification theory, but doesn’t happen under observation. So what we’re doing is trying to understand the basis for unification, rather than offering recipes for real world transformations of leptons into quarks, etc.

To get back to the Dirac equation matrices for matter-antimatter “unification”, and Wilson’s paper, what I think we need to see are Dirac-style matrices for the left and right handed spinors above, in other words, explaining all the left-handed and right-handed fermions of a generation of S.M. particles, yet not getting bogged down with factors that are simply vacuum polarization shielding mechanism effects (like the 1/3 factor in the charges of all quarks). Simple.

Update (7 March 2024): There’s some more interesting discussion in comments to a post on Woit’s blog where he asked “I’d be curious to know if anyone can point me to a good discussion of the path integral formalism for non-relativistic quantum field theory, which is something I haven’t found.” The replies in the comments section include https://arxiv.org/pdf/1209.1315.pdf and its earlier version https://cds.cern.ch/record/257342/files/P00020083.pdf as well as https://hagenkleinert.de/documents/pi/HagenKleinert_PathIntegrals.pdf which includes the following nice illustration:

Richard P. Feynman, QED, Penguin, 1990, pp. 55-6, and 84:

‘I would like to put the uncertainty principle in its historical place: when the revolutionary ideas of quantum physics were first coming out, people still tried to understand them in terms of old-fashioned ideas … But at a certain point the old fashioned ideas would begin to fail, so a warning was developed that said, in effect, “Your old-fashioned ideas are no damn good when …”. If you get rid of all the old-fashioned ideas and instead use the ideas that I’m explaining in these lectures – adding arrows [arrows = path phase amplitudes in the path integral, i.e. e^{iS(n)/ħ}] for all the ways an event can happen – there is no need for an uncertainty principle! … on a small scale, such as inside an atom, the space is so small that there is no main path, no “orbit”; there are all sorts of ways the electron could go, each with an amplitude. The phenomenon of interference [by field quanta] becomes very important …’
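Feynman’s “adding arrows” can be illustrated with a toy one-variable sum (my own sketch, not Feynman’s or Kleinert’s): arrows exp(iS/ħ) far from the stationary action spin round and cancel, leaving the region near the classical path to dominate, in agreement with the stationary-phase (Fresnel) result for a quadratic action.

```python
import cmath, math

# Toy "sum over paths": one coordinate x, action S(x) = (x - 0.5)^2 with
# stationary ("classical") point x = 0.5.  Summing the arrows exp(i*S/hbar)
# over all x, distant arrows oscillate and cancel; the stationary-phase
# region dominates.  (A toy model, not a full path integral.)
def sum_arrows(hbar, span=10.0, n=200_000):
    dx = 2 * span / n
    total = 0 + 0j
    for i in range(n):
        x = -span + (i + 0.5) * dx                # midpoint rule
        total += cmath.exp(1j * (x - 0.5) ** 2 / hbar) * dx
    return total

hbar = 0.05
total = sum_arrows(hbar)
# Stationary-phase (Fresnel) result for a pure quadratic action:
exact = math.sqrt(math.pi * hbar) * cmath.exp(1j * math.pi / 4)
print(abs(total - exact))   # small: the classical-path region dominates
```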

Updates (19 March 2024):

“I’ve always been an admitted elitist: in the face of a really hard problem, only a very talented person trained as well as possible and surrounded by the right intellectual environment is likely to be able to get somewhere.” – Woit, 20 years ago.

Have you read Irving Janis’s 1972 book, Victims of groupthink? There’s a danger that the leading alternative to one dictatorship, turns into another one, with its own elitist system of hubris.

“So, this is my candidate for the Holy Grail of Physics, together with a guess as to which direction to go looking for it. There is even a possible connection to the other Holy Grail, I’ll probably get around to writing about that some other time.” – Holy Grail of Physics, https://www.math.columbia.edu/~woit/wordpress/?p=3

Still awaiting the “possible connection”… just as still awaiting completion of Woit’s theory, and still awaiting the introduction of one iota of realism into superstring “theory”.

There’s also a bit of discussion with mathematician Dr Wilson on “Crackpots versus crackpots” https://robwilson1.wordpress.com/2024/03/15/crackpots-versus-crackpots/;

I suggest (typo corrected) that Woit’s and Wilson’s problems may be due to a mathematical failure in “relativity theory”, since “Einstein’s and the Klein-Gordon Hamiltonian, p^2 + m^2, leads to Dirac’s gamma matrices. By contrast, the Laplace transform of the Coulomb field, ∫ r exp[-r(m + p)] dr = 1/(m + p)^2, the correct Coulomb field propagator, suggests that m^2 + 2pm + p^2 is the true basis for finding gamma matrices, rather than p^2 + m^2 as Dirac assumed.” Dr Wilson’s response is to sniff that momentum is a vector and clarify his own approach, which is interesting but raises some questions:

“There may well be something important in these remarks. But I do get nervous when momentum is treated as a scalar rather than a vector, so that I don’t really know how to interpret a formula like (m+p)^2. Sometimes I implement m as a vector, which allows me to interpret (m+p).(m+p) as a scalar product, but scares the shit out of people because I am linking the three generations of electrons to three directions of the gravitational field. More often I combine scalars and vectors into quaternions, which avoids that particular problem but creates others.

“But I feel the problems with the Dirac equation go even deeper. Dirac starts from Einstein’s m = sqrt{E^2-p^2}, and re-arranges it as -m^2 = p^2-E^2. Then he wants to factorise the right-hand side as (p+E)(p-E), but this doesn’t make sense as it stands, so he introduces the gamma matrices so that it does make sense. But he still has anti-particles with negative energy. To avoid negative mass as well, he factorises the left-hand side as (im)^2, which actually makes matters worse because it makes the mass imaginary.

“On the other hand, if we follow Hamilton rather than Lagrange, we should write the equation as E^2=m^2+p^2, and write m+p as a quaternion q=m+ip_1+jp_2+kp_3, so that the right-hand side factorises as q.bar{q}. Now we want to factorise the equation into particles E=q and anti-particles E=qbar, so that anti-particles have positive energy and positive mass (as experiment tells us). Again we need to introduce more quaternions (i.e. the spinors) into the equations in order for them to make sense, but we do not need the gamma matrices.

“This makes everything much simpler than in Dirac’s version. Anti-particles have opposite parity to the particles, but they do not travel backwards in time. Local spacetime symmetries become Euclidean SO(4), as required by QM, rather than Lorentzian SO(3,1). Instead of requiring SL(2,C)=Spin(3,1) to describe spinors, or Spin(4) = SU(2) x SU(2), we can identify spinors as Euclidean spacetime vectors. Of course, spinors are not the same as Minkowski spacetime vectors, as everybody knows.

“Finally, this avoids the problem that there is no such thing as a generally covariant spinor, which is one of the major problems with combining particle physics with gravity. We do not need spinors, which is just as well, because they don’t exist in the real world.”
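Wilson’s quaternion factorisation quoted above can be verified directly: with q = m + ip_1 + jp_2 + kp_3, the Hamilton product q with its conjugate gives the scalar m^2 + p_1^2 + p_2^2 + p_3^2, i.e. Einstein’s E^2 = m^2 + p^2 with no gamma matrices. A minimal sketch (the Hamilton product rule is standard):

```python
# Quaternions represented as 4-tuples (w, x, y, z) = w + x*i + y*j + z*k.
def qmul(a, b):
    # Hamilton product of two quaternions
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def qconj(a):
    w, x, y, z = a
    return (w, -x, -y, -z)

m, p1, p2, p3 = 2.0, 0.3, 0.4, 0.5
q = (m, p1, p2, p3)
print(qmul(q, qconj(q)))   # (m^2 + p1^2 + p2^2 + p3^2, 0, 0, 0)
```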

I’m not happy about “antimatter” models until you look objectively at things like the alleged matter-antimatter imbalance. The hydrogen atom, as a basic building block, is four particles: two +2 charged up quarks, and two -1 charged down quarks. One of the latter, however, doesn’t feel nuclear confinement forces, so it is excluded from the nucleus and is called the electron. The three that are in the nucleus then have a net charge of 2(+2) - 1 = +3 units, which due to close proximity boosts the vacuum polarization (electric charge shielding veil) by a factor of 3 from what it would be for a single particle, thus giving an observed proton charge of +3/3 = +1. Not very sophisticated mathematically, just a neat mechanism! Now if you redefine “antimatter” to allow for this mechanism, maybe it will change the “Dirac spinor” interpretation, or whatever you wish to call it.
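The arithmetic of that shielding mechanism, in the units used above where an up quark carries +2 and a down quark -1 (a restatement of the paragraph's numbers, not an independent derivation):

```python
# Charge bookkeeping for the shielding mechanism described above,
# in units of e/3: up quark = +2, down quark = -1.
UP, DOWN = +2, -1
bare_nuclear_charge = 2 * UP + DOWN     # two ups + one down confined: +3 units
shielding_factor = 3                    # argued threefold-boosted shared
                                        # vacuum-polarization veil for a triplet
observed_proton_charge = bare_nuclear_charge / shielding_factor
print(observed_proton_charge)           # 1.0, i.e. +1 in electron-charge units
```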

Professor Srednicki (string theorist) showed what happens when you come up with a new idea or even a slightly different approach to textbook orthodoxy: https://www.math.columbia.edu/~woit/wordpress/?p=3#comment-39 This kind of hubris holds back progress.

“The arrogance of people in the particle theory community never ceases to amaze me. Assuming that anyone who dares to criticize what is going on in the subject must be ignorant is all too common behavior. … It’s hard to extract from the torrent of personal abuse what Srednicki’s criticisms are. The weird thing is he doesn’t deal with my views on string theory, which are controversial, but here gets very worked up about what are accurate and not at all controversial statements.” – The Holy Grail of physics, https://www.math.columbia.edu/~woit/wordpress/?p=3#comment-39

Scalars become vectors when you need to worry about direction, but I’m keeping to understanding interactions between two particles where energy is exchanged in the path of least action – a very mathematically straight line indeed between two points – so goodbye vectors. (This simplest Feynman diagram, corresponding to classical physics, is the biggest contribution in the perturbative expansion to the path integral, because more complex diagrams pick up additional 1/137 couplings, rapidly becoming very insignificant.) I’m interested in calculating force strengths and masses, both very much scalar. It helps to clean up a few million pages of mainstream QFT drivelmongering before thinking about mechanisms.
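As an aside, the Laplace-transform identity suggested to Wilson earlier, the integral of r exp[-r(m + p)] over r from 0 to infinity equalling 1/(m + p)^2, is easy to verify numerically (a pure-stdlib sketch with arbitrary values of m and p):

```python
import math

# Numerical check of Int_0^inf r * exp(-r*a) dr = 1/a^2 with a = m + p,
# whose expansion gives the m^2 + 2*p*m + p^2 denominator discussed above.
def laplace_r(a, r_max=60.0, n=500_000):
    dr = r_max / n
    return sum((i + 0.5) * dr * math.exp(-a * (i + 0.5) * dr) * dr
               for i in range(n))

m, p = 0.5, 1.2
numeric = laplace_r(m + p)
exact = 1 / (m + p) ** 2
print(numeric, exact)    # agree closely
```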

Update (20 March 2024): Dr Wilson has another comment explaining the Dirac spinor problems:

“There is another aspect to Dirac’s negation of m^2=E^2-p^2, which is that physicists define the Lie bracket (for the Lie algebra of spacetime derivatives) as [A,B]:=i.hbar.(AB-BA), rather than the mathematicians’ definition [A,B]:=AB-BA. The factor of hbar is just a distraction, used for quantising the Lie algebra, but Lie algebras are automatically quantised, so we might as well put hbar=1. This allows us to use Lie algebras for things other than angular momentum/spin, and put in the appropriate units afterwards.

“But the factor of i is insidious, because it only allows physicists to use complex Lie algebras, never real or quaternionic Lie algebras. The reason is that this factor of i converts mathematicians’ anti-Hermitian matrices into physicists’ Hermitian matrices, but this only works in the complex case, where there are n^2-1 independent matrices in both the Hermitian and anti-Hermitian cases. In the real case, there are n(n+1)/2 independent Hermitian (symmetric) matrices, but only n(n-1)/2 anti-Hermitian (anti-symmetric). In the quaternionic case, there are n(2n-1) Hermitian and n(2n+1) anti-Hermitian.

“In order to model three generations of electrons, it is necessary to use a quaternionic Lie algebra, which means extending the complex Pauli matrices to quaternionic. Now there are 6 independent Hermitian matrices, and 10 independent anti-Hermitian. The Hermitian ones do not form a Lie algebra, so do not model differential equations, and do not quantise anything. The anti-Hermitian ones do form a Lie algebra, which happens to be so(5), in case anyone is interested. You can find an isomorphic copy of it inside the Dirac algebra by taking generators igamma_0, gamma_1, gamma_2, gamma_3, igamma_5.

“Now we can ask, what does this Lie algebra quantise? Well, it’s basically the same as the Dirac equation, that is a quantisation of the Einstein mass equation, so what it quantises is mass. It needs re-arranging a bit, inside a quaternionic (rather than complex) Dirac algebra, in order to separate the electron generation from the direction of spin, but after that there are five fundamental masses quantised by this Lie algebra, namely the three generations of electron, plus the proton and neutron.

“If it quantises mass, then it quantises gravity. The 5-dimensional representation of so(5) is reminiscent of the “spin 2 graviton” that comes out of GR, and the 10-dimensional (adjoint) representation, that should contain the gauge bosons, matches the 10 dimensions of the Einstein field equations (i.e. the 10 dimensions of the stress-energy tensor and the Ricci tensor). But it doesn’t match properly, because Einstein uses so(3,1) instead of so(5). Kaluza-Klein gets closer, but still hasn’t got the quaternionic structure right. Dirac, of course, identifies so(3,1) with sl(2,C), which requires both the Hermitian and anti-Hermitian versions of the Pauli matrices, so we need to do the same in the quaternionic case, which gives us the Lie algebra sl(2,H) = so(5,1). Now we at least have a chance that it might work.”
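Wilson’s matrix counts can be checked mechanically. The sketch below is my own bookkeeping, not Wilson’s: it counts the real dimensions of Hermitian and anti-Hermitian n×n matrices over R, C and the quaternions H, using the fact that a diagonal entry must be fixed (or negated) by conjugation, while each of the n(n−1)/2 independent off-diagonal entries is a free element of the algebra. (In the complex case the count is n² both ways; the n²−1 in the quote is the traceless count, as for su(n).)

```python
# Real dimension of each division algebra, and of its subspaces fixed
# (x = conj(x)) or negated (x = -conj(x)) by conjugation.
ALGEBRAS = {
    "R": {"dim": 1, "fixed": 1, "negated": 0},  # conjugation is the identity
    "C": {"dim": 2, "fixed": 1, "negated": 1},  # real part / imaginary part
    "H": {"dim": 4, "fixed": 1, "negated": 3},  # real part / i, j, k parts
}

def hermitian_dim(K, n):
    """n diagonal entries fixed by conjugation, plus n(n-1)/2 free
    off-diagonal entries (the lower triangle is determined by the upper)."""
    a = ALGEBRAS[K]
    return n * a["fixed"] + n * (n - 1) // 2 * a["dim"]

def anti_hermitian_dim(K, n):
    """Same count, but diagonal entries are negated by conjugation."""
    a = ALGEBRAS[K]
    return n * a["negated"] + n * (n - 1) // 2 * a["dim"]

for n in range(1, 6):
    assert hermitian_dim("R", n) == n * (n + 1) // 2       # symmetric
    assert anti_hermitian_dim("R", n) == n * (n - 1) // 2  # antisymmetric
    # Complex: n^2 both ways (n^2 - 1 once the trace is dropped), which is
    # why multiplying by i can trade Hermitian for anti-Hermitian.
    assert hermitian_dim("C", n) == anti_hermitian_dim("C", n) == n * n
    assert hermitian_dim("H", n) == n * (2 * n - 1)        # 6 for n = 2
    assert anti_hermitian_dim("H", n) == n * (2 * n + 1)   # 10 for n = 2: so(5)
```

The n = 2 quaternionic case gives the 6 Hermitian and 10 anti-Hermitian generators quoted above.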

Update (21 March 2024): on neutrinos. My table of particle mass calculations in https://vixra.org/abs/1408.0151 doesn’t include neutrinos and that paper only points the way towards neutrino masses by arguing that since there is very strong evidence that mass arises from vacuum polarization by the field energy of a particle’s charge(s), the fact (left-handed) neutrinos only have weak charges, implies that they have correspondingly small masses. Clearly this mechanism points in the direction of theoretical calculations and checks with experimental data, but I’ve also been more deeply into the whole Dirac matter-antimatter spinor delusion, with some progress there. The key insight seems to be the asymmetry in the equivalence (for charges in first generation of fermions), with LH = left-handed and RH = right-handed:

LH neutrino (hypercharge -1) + LH downquark (hypercharge +1/3) = RH downquark (hypercharge -2/3).

The total hypercharges on both sides are equal at -2/3, the total electric charges (observed outside the vacuum polarization IR cutoff) on each side equal -1/3, and the weak isospin charges on each side equal 0. (You could also add a right-handed neutrino to the right-hand side, but it is only mass-energy.)

To emphasise what is interesting here, the equivalent equation for electrons and upquarks (again with hypercharge adding up to -2/3 on each side) lacks this asymmetry:

LH electron (hypercharge -1) + LH upquark (hypercharge +1/3) = RH electron (hypercharge -2) + RH upquark (hypercharge +4/3).
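The charge bookkeeping in these two balance equations can be verified with exact fractions. The table below uses the standard assignments (with the convention Q = T3 + Y/2); it is my own tabulation for checking, not taken from the text:

```python
from fractions import Fraction as F

# Standard-Model hypercharge Y and weak isospin T3 for the states above.
particles = {
    "LH neutrino":  {"Y": F(-1),    "T3": F(1, 2)},
    "LH electron":  {"Y": F(-1),    "T3": F(-1, 2)},
    "LH upquark":   {"Y": F(1, 3),  "T3": F(1, 2)},
    "LH downquark": {"Y": F(1, 3),  "T3": F(-1, 2)},
    "RH electron":  {"Y": F(-2),    "T3": F(0)},
    "RH upquark":   {"Y": F(4, 3),  "T3": F(0)},
    "RH downquark": {"Y": F(-2, 3), "T3": F(0)},
}

def totals(names):
    """Sum hypercharge, weak isospin and electric charge (Q = T3 + Y/2)."""
    Y = sum(particles[p]["Y"] for p in names)
    T3 = sum(particles[p]["T3"] for p in names)
    return Y, T3, T3 + Y / 2

# First equation: LH neutrino + LH downquark vs RH downquark
assert totals(["LH neutrino", "LH downquark"]) == totals(["RH downquark"])
# Second equation: LH electron + LH upquark vs RH electron + RH upquark
assert totals(["LH electron", "LH upquark"]) == totals(["RH electron", "RH upquark"])
print(totals(["RH downquark"]))  # hypercharge -2/3, isospin 0, electric charge -1/3
```

Both sides of each equation come out at hypercharge -2/3, weak isospin 0 and electric charge -1/3 (first equation) exactly as stated.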

What I’m driving at here is the way that the left-handed weak isospin charge affects the whole concept of matter-antimatter transformations. In 1954, Pauli and others “proved” that for any Lorentz-invariant quantum field theory the fake “CPT theorem” (based on Feynman diagrams) holds, which alleged that antimatter is a transformation in which the simultaneous reversal of all charges (C) and spin parity (P) is equivalent to simply reversing the time direction (T), i.e. an electron would transform into an antimatter positron with opposite charge and spin. Pauli was then debunked experimentally just before he died in room 137!

The whole implications of this are deliberately obfuscated in physics (in the same way that 1st quantization “entangled wave functions” disappear in proper multipath-interference 2nd quantization path integrals, where there is no such thing as a single wavefunction to become entangled; there is an infinite number of wave functions and most interfere – the net result is all that matters). The experimental debunking of CPT by the entirely left-handed weak nuclear force debunks Pauli’s Lorentz-invariance-based proof of CPT.

“One can show that for CPT theorem to hold, Lorentz invariance is not always needed. … if one attempts to describe the particle and antiparticle propagation with definite masses by pole approximation, for example, then the off-shell Lorentz covariance of the propagator is lost.” – Anca Tureanu, CPT and Lorentz Invariance: Their Relation and Violation, Journal of Physics conf. series 474, 2013.

So Lorentz-invariance should be “questioned” on this basis, if not entirely debunked and kicked out of physics. Lorentz-invariance is simply not needed in the path integral if you understand the mechanism for the multipath interference it models, and therefore don’t need to impose mathematical safeguards to prevent errors. (This is the key reason for complex numbers in quantum field theory equations; something easily avoided when you understand the mechanisms of the physical processes, which averts error. The need for “i” in mathematical physics boils down to the fact that vector information is lost when wavefunctions are squared to turn them into real probabilities: both -1 and +1 give +1 when squared, so you lose vector – e.g. directional flow – information unless you include a factor of “i”, which gives -1 when squared. It’s just a calculational trick, with nothing supernatural to turn equations into a religion of the damned.) The problem of course is that Lorentz-invariance is regarded as sacrosanct, as a result of Maxwell and Einstein, the saints of mathematical physics. So a hybrid muddle persists instead of clarity.
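The point about squaring destroying sign (“direction”) information, and the factor of i recovering it, can be made concrete with a trivial numerical example (mine, purely illustrative):

```python
import cmath

# Two amplitudes of equal magnitude but opposite sign: squaring
# (taking |amplitude|^2) gives identical probabilities...
a, b = 1.0, -1.0
assert abs(a) ** 2 == abs(b) ** 2 == 1.0  # the sign is lost on squaring

# ...yet before squaring the relative sign is physical: the two
# contributions interfere destructively when summed.
assert abs(a + b) ** 2 == 0.0

# A general phase is a point on the unit circle, exp(i*theta);
# the factor i is the quarter-turn case, whose square is -1.
i = cmath.exp(1j * cmath.pi / 2)
assert abs(i * i + 1) < 1e-12
```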

If we take our example of CPT-violating weak isospin REALITY: LH neutrino + LH downquark = RH downquark. Now, how does REALITY affect the simplistic Dirac spinor’s “matter-antimatter” transformations?

Answer: the observable universe is essentially composed of hydrogen, i.e. one electron, one “downquark”, and two upquarks. If this 2×2 matrix is a 50:50 “matter-antimatter” array with some camouflage due to the chiral nature of the weak isospin force and also the vacuum polarization shielding of electric charge (converting most of the electric charge energy into short-ranged virtual particles that mediate nuclear forces), then LH neutrino + LH downquark = RH downquark shows the left-handed neutrino can be treated as an isomeric transition’s radiation emission from right-handed downquarks, which decay into left-handed downquarks. Or alternatively, a LH downquark could emit an antineutrino, thus decaying into a RH downquark: LH downquark = RH downquark + LH antineutrino.

The corollary to this equation is the prediction that not all neutrino interactions result in apparent flavour changes, i.e. a neutrino can simply change the parity of a quark without changing its flavour (i.e. without changing a downquark into an upquark). This would result in lower neutrino interaction rates than predicted, i.e. an effect analogous to the neutrino oscillations which are postulated to account for observed neutrino interactions from the sun being only about 1/3 of what was expected.

I’m not saying this is definitely a complete alternative to neutrino flavor oscillations, which can be measured at different distances to beta sources like nuclear reactors (at least in principle; there is always more accurate data around the corner), but it is defensible solid physics so it might throw some light on the problem or contribute to a better understanding of neutrinos.

In other words, like our mass mechanism in https://vixra.org/abs/1408.0151, the direction of travel of our physical understanding is just extending existing simple mechanisms, to gain more clarity and further predictions. (The failures of mainstream physics have resulted, as always, from fake “no-go theorems” used initially as barriers against rapid progress for fear of making mistakes, but then these become hardened orthodoxy, for example Lorentz invariance, or 1st quantization single-wavefunction absurdities dressed up as “proofs” of the impossibility of simple mechanisms! Ironically, it’s taboo for them to fake such a “no-go theorem” against utter bullshit like superstring “theory”, so the only thing they end up with is the biggest mistake possible.)

Update (23 March 2024): Woit’s December 2023 paper “Spacetime is Right-Handed” (https://arxiv.org/pdf/2311.00608.pdf) directed the reader to “Gravi-Weak Unification” (https://arxiv.org/pdf/0706.3307.pdf) for gravity-weak unification, which states on page 11: “Even though in the SM left- and right-handed fermions occur in different representations of the gauge group, there are many unified models where at a more basic level the symmetry between left and right is restored.

“The minimal such models were based on the Left-Right-symmetry [9, 10] and the Pati-Salam partial unification SU(2)_L × SU(2)_R × SU(4) [11]. In these models the hypercharge U(1) group is enlarged to a group SU(2)_R acting on the right-handed fermions in the same way as the weak SU(2)_L acts on the left-handed ones.

(Emphasis added to key idea. Much of this paper, as with so much speculation in this field – millions of pages of it – is concerned with the rigor mortis of mathematical trivia, not the physically intuitive hard mechanism for what’s going on in terms of particles going around and interacting to produce observable phenomena. You have to make progress by simplifying enough to understand dynamics, not shovel more mathematical trivia over the mechanisms to hide them completely. This point rarely if ever occurs to these people. There is absolutely nothing about gravity fact or mass mechanism fact in that paper!)

ABOVE: Woit’s evidence that spacetime is right-handed Euclidean spacetime, from page 5 of https://arxiv.org/pdf/2311.00608.pdf where Woit states: “Remarkably, except for the Higgs field, all the fields of the Standard Model in Euclidean spacetime have chirally asymmetric descriptions and dynamics that only depend on the right-handed space-time degrees of freedom.” On page 4 he states: “Since the work of Schwinger [J. Schwinger, Proceedings of the National Academy of Sciences 44, 956 (1958)] it has been apparent that a fundamental formulation of quantum field theory in Euclidean spacetime is possible and has many attractive features. In particular, path integrals often are well-defined in Euclidean spacetime, not in Minkowski spacetime. Euclidean quantum field theories have some very different properties than in the Minkowski case.”

The problem here is, as we have seen in the first diagram at the top of this post, the Standard Model’s distinction between “matter” and “anti-matter” – and therefore the whole basis of “lepton-quark unification” efforts based on that distinction – is BS, due to the failure to identify the anomaly of beta decay via the W vector boson, which was introduced when electroweak theory replaced the Fermi point beta decay theory in around 1967. Woit makes no mention of this kind of thing, ignores emails or comments about it, and after he’s been doing this for 20 years I feel there’s no hope of progress from “collaboration” in physics. You are forced into doing things yourself. The whole elitist groupthink delusion persists. I had the same problem with Riofrio, who shows no interest in a factual derivation of her equation. There are extreme groupthink-style reactions against any real (radical, revolutionary) progress: feigned disinterest, abuse, pretending it is invisible, doesn’t exist, etc. There’s a similar mindset on nuclear weapons effects, despite a similar mindset on chemical weapons in the 1930s leading to appeasement and WWII! You’d think that after all the disappointments of the herd socialist mentality in trying to abolish individual dissent, the media would learn to ignore “consensus science taboos” and dig up the facts!

Above: Woit explains to Tong how SO(4) = SU(2) X SU(2) needs to be complexified when the spacetime spin SO(1,3) replaces SO(4), so that you get SL(2, C) X SL(2, C), rather than SU(2) X SU(2). This leads to a chiral Spin(2,2) group, as Woit shows on page 105 of https://www.math.columbia.edu/~woit/QFT/qftmath.pdf:

Homework question: How is this related to efforts to improve understanding of the distinction between matter and antimatter in a corrected electroweak theory, say to improve the physical basis of Dirac’s spinor from an improved energy Hamiltonian so it produces, for instance, a better-understood chiral electroweak unification, rather than the ambiguous nonsense that can be fitted to virtually any data?

Above: in 2002, Woit put forward (deeply buried in his paper https://arxiv.org/pdf/hep-th/0206135.pdf and with diffident comments to back it up, the exact opposite of the hype from superstring folk for complete speculation) what looked to me like a very solid, fact-based and fact-producing electroweak model suggestion (but lacking the dynamics detail that we can supply easily from mechanisms that are ignored by the mainstream). However, his deep math approach is to submerge vital evidence in an ocean of symbolic trivia, that may appeal to funding committees and impress Lenny Susskind, and may indeed have some mathematical barrel-organ skill (like writing a play in Latin), but doesn’t lead to rapid progress in physics, being ignored by the mainstream. In addition, the same paper highlights but fails to correct the key error in quantum field theory:

There shouldn’t be any calculus in quantum field theory, because the whole point is that you are replacing continuous fields with a series of point interactions. Like pollen fragments undergoing Brownian motion due to air-molecule bombardment, the electron undergoes a series of interactions with Coulomb field quanta (“virtual” photons) which deflect it. A better example of the failure of calculus in the real world of discrete interactions came in the 1950s, with the fallout-speed error from the H-bomb’s 100,000-foot mushroom cloud: calculations with Stokes’s law and average air viscosity massively underestimate the fallout descent rate of 5 micron diameter particles at such high altitudes. The standard theory of a continuously acting force gives you a “terminal velocity” which doesn’t actually exist: the fallout particle instead plummets like an accelerating apple in free-fall in a vacuum, apart from occasional discrete collisions (impacts) with air molecules in that low-density air. It turns out that the continuous (calculus-based) approximation is only valid for particles large enough, in air of density high enough, that the rate of interaction between the dust and the bombarding air molecules is high enough to prevent significant acceleration between impacts! There is every reason to think that this kind of error is also applicable to quantum field theory.
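The fallout point can be made with a toy simulation. This is my own crude sketch, not the actual Stokes-law fallout calculation: the particle free-falls between exponentially spaced collisions, and each collision is (unrealistically) assumed to stop it dead. The mean descent speed then grows as the collision rate drops, instead of sitting at a fixed “terminal velocity”:

```python
import random

G = 9.81  # gravitational acceleration, m/s^2

def mean_descent_speed(collision_rate, total_time=200.0, seed=0):
    """Toy model (my assumption, not the fallout literature): the particle
    falls freely from rest between collisions with air molecules, and each
    collision stops it dead. Collision gaps are exponential with mean
    1/collision_rate seconds."""
    rng = random.Random(seed)
    t = x = 0.0
    while t < total_time:
        gap = min(rng.expovariate(collision_rate), total_time - t)
        x += 0.5 * G * gap * gap  # distance fallen from rest between impacts
        t += gap
    return x / total_time

# A continuous-drag (Stokes's law) treatment tuned to dense air predicts a
# fixed terminal velocity; the discrete model instead speeds up as the
# collision rate falls, because the particle accelerates between impacts.
dense = mean_descent_speed(collision_rate=1000.0)  # dense air: slow, roughly g/rate
thin = mean_descent_speed(collision_rate=1.0)      # thin air: hundreds of times faster
assert thin > 100 * dense
```

For exponential gaps the mean speed works out to roughly g/rate, so halving the collision rate doubles the average descent speed – exactly the kind of behaviour a continuous-drag formula misses.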

There’s a further application of the fallout analogy of use in understanding the path integral, namely Schuert’s method of mapping out – and adding up the contributions to – path integrals for particles falling through a wind shear structure which carries them in different directions and with different speeds at different altitudes, in order to work out the “hot line” of maximum fallout on the ground. This has a certain analogy to the use of path integrals to work out the path of least action:

So, looking again at Schuert’s graph, and comparing it to the QFT path integral as Feynman depicts it in several spatial (not space versus time, as classically done) graphs in his 1985 QED book, you can develop a clearer understanding of what’s really going on in the latter. For example, suppose Schuert had wanted not to see the “big picture” of where particles end up, but merely wanted to see what fallout particles arrive at a fixed spatial point in the fallout area. Then he would ignore all particle trajectories that didn’t end up at that termination point. All he wants to know, then, is what arrives at the designated location.

In the path integral, you’re working out the multipath interference amplitude by summing all possible spatial paths, where each individual path has a phase amplitude that’s a function of the action (K.E. – P.E. integrated over a fixed time for the path; the amplitude is always for multipath interference where, at a given time, the paths arrive at a fixed spatial point to interfere). This treats space and time differently, and Dr Woit argues for using the Euclidean not Minkowski signature for such integrals. In the usual path integral for SM particle physics cross-section and reaction-rate calculations, the amplitudes for different paths vary, due to varying SPATIAL configurations over a FIXED TIME for all the paths involved (every path integrated must arrive at the spatial end point at the SAME time), and are summed to give the total amplitude at a FIXED SPATIAL ENDPOINT LOCATION and for a FIXED TIME. Schuert’s plots, and Feynman’s revolutionary all-spatial path integral diagrams in his 1985 QED book, are a step forward in physical understanding…
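Feynman’s mirror pictures reduce to a phasor sum that is easy to reproduce numerically. The geometry and numbers below are my own illustrative choices, not from the 1985 book: each mirror segment contributes a unit arrow with phase 2π·(path length)/wavelength, and only the region around the stationary-phase (least-action, equal-angles) point survives the summation:

```python
import cmath

wavelength = 5e-7  # roughly green light, metres
h = 1.0            # source and detector height above the mirror

def path_length(x):
    """Source at (-1, h), detector at (+1, h), bounce point at (x, 0)."""
    return ((x + 1) ** 2 + h ** 2) ** 0.5 + ((x - 1) ** 2 + h ** 2) ** 0.5

def segment_sum(x0, x1, n=20000):
    """Sum unit phasors exp(2*pi*i*L/wavelength) over n mirror segments."""
    dx = (x1 - x0) / n
    return dx * sum(
        cmath.exp(2j * cmath.pi * path_length(x0 + (k + 0.5) * dx) / wavelength)
        for k in range(n)
    )

# Same mirror width in both cases: near the stationary-phase point x = 0
# the arrows line up; off to one side they spin round and cancel.
centre = segment_sum(-0.001, 0.001)
edge = segment_sum(0.799, 0.801)
assert abs(centre) > 10 * abs(edge)
```

The centre strip dominates by orders of magnitude: that is the path-of-least-action law of reflection emerging from nothing but phase arithmetic over many paths.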

“There are loads of other “clues”. One massive issue which again is totally ignored by the mainstream (including PW) and by popular science writers is that the quantum electrodynamic propagator has the form: 1/(m + k)^2, where m is the virtual massive (short ranged) electromagnetic field quanta (e.g. the virtual electrons and positrons that contribute vacuum polarization shielding and other effects between IR and UV cutoffs), and k is the term for the massless (infinite range) classical Coulomb field quanta (virtual photons which cause all electromagnetic interactions at energies lower than the IR cutoff, i.e. below collision energies of about 1 MeV, which is the minimum energy needed for pair production of electrons and positrons).”

“The point is, you have two separate contributions to the mass of a particle from such a propagator: k gives you the rest mass, while m gives you additional mass due to the running coupling for collision energy >1MeV. (See for instance fig 1 in https://vixra.org/pdf/1408.0151v1.pdf .)”

“The fact that you can separate the Coulomb propagator’s classical mass of a fermion at low energy (<1 MeV) from the increased mass due to the running coupling at higher energy proves that there’s a physical mechanism for particle masses in general: the virtual fermions must contribute the increase in mass at high energy by vacuum polarization, which pulls them apart, taking Coulomb field energy and thus shielding the electric charge (the experimentally measured and proved running of the QED coupling with energy). In being polarized by the electric field, the virtual positron and electron pair (or muon or tauon or whatever) soaks up real electric field energy E in addition to Heisenberg’s borrowed vacuum energy (h-bar/t). So the virtual particles must have a total energy E + (h-bar/t), which allows them to turn the energy they have absorbed (in being polarized) into mass. This understanding of the two terms in the propagator, m and k, therefore gives you a very simple mechanistic basis for predicting all particle masses, m, which shows how the mass gap originates from treating the propagator as a simple physical model of real phenomena…”
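The experimentally measured running of the QED coupling above the ~1 MeV IR cutoff can be sketched at one loop. This is my leading-log simplification with only the electron loop; the full Standard Model calculation includes all charged-particle loops and gives 1/α ≈ 129 at the Z mass, not the value printed here:

```python
import math

ALPHA0 = 1 / 137.036  # low-energy (classical) QED coupling
M_E = 0.000511        # electron mass in GeV

def alpha_running(Q):
    """One-loop QED running coupling with a single electron loop
    (leading-log approximation, valid for Q >> m_e):
    1/alpha(Q) = 1/alpha(0) - (2/(3*pi)) * ln(Q/m_e)."""
    return 1 / (1 / ALPHA0 - (2 / (3 * math.pi)) * math.log(Q / M_E))

# Below about 1 MeV (roughly twice the electron mass) there is no pair
# production, so the coupling stays near its classical value of 1/137.
for Q_gev in (0.001, 1.0, 91.19):  # 1 MeV, 1 GeV, the Z mass
    print(f"1/alpha({Q_gev} GeV) = {1 / alpha_running(Q_gev):.2f}")

# The charge seen grows with collision energy: vacuum-polarization
# shielding is penetrated at shorter distances.
assert alpha_running(91.19) > alpha_running(1.0) > ALPHA0
```

With only the electron loop, 1/α falls from ~137 at 1 MeV to ~134.5 at the Z mass, illustrating the shielding mechanism the quote describes.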

Basically, Woit has a useful clue, but is sailing in the wrong direction due to an elitist bias that fancy mathematics is definitely the right way to go – it helps in some ways, but you also need to correct errors in the Standard Model, and in some areas like the path integral the direction should be away from calculus and towards discrete interactions. We should look at what’s physically occurring and take the perturbative expansion as reality, relegating calculus to an approximation valid only for high rates of interaction, and understanding mechanisms as discrete summations, not continuous variables in integrals.

ABOVE: under left-wing censorship of anything that diverges from the Stalinist Party Line, most “independents” go 100% loony in the end. Raeto West of Big-Lies insanity stuff (West falsely claimed nonsense on his site, viz “nuclear weapons are fake; Hitler survived, was revived by his lover Stalin and is running the Kremlin from a bunker in the basement, via Putin, to ensure world peace and stability; David Irving is honest; the Jews have it in for everyone, etc.“) evaluating former UK MP and Russian TV Sputnik (fellow traveller) host George Galloway (who has now been re-elected as an MP again, I believe – correct me if I’m wrong – for the Monster Raving Loony Party, after Labour withdrew their rep from the election for “alleged” anti-Jewish racism and genocidal comments on Israel). Electronics World magazine “capacitor = transmission line” crosstalk (EMI interference) engineer Ivor Catt, an associate of Raeto West, unfortunately introduced me to him two decades ago in St Albans. As the 1930s proved, don’t appease dictators: the only reason they want to meet you is to stick a knife in your back. I was hoping Catt would want to make progress in theoretical physics, helping to sort out the mess there, but he was an obstructor. When I asked him what he wanted to achieve he responded: “an international conference to discuss my work!” When I pointed out that he had already published a detailed discussion in his 1996 book “The Catt Anomaly” (in which Trinity College master mathematician Sir Michael Atiyah and Bradford Uni mathematical physics Reader Dr McEwan argue over how an electron drift current is set up in both conductors at the front of a logic step proceeding at light velocity, proving there was a controversy), he responded “there is no controversy, because it is completely censored!” (Despite its having been argued over repeatedly in Electronics World and books for decades!)
Unfortunately, since the October 2023 Hamas attack on Israel, the BBC has clearly taken sides with racist terrorist propaganda fronts, and this is encouraging the Nazi fanatics. They’re all quite loony.

This is why we have to fight our own battles. Psychologist Dr Irving Janis in 1951 wrote the RAND Corporation book “Air War and Emotional Stress”, pointing out that those who ducked and covered to avoid radiation, blast wind, and flying debris at collateral damage distances in modern concrete buildings at Hiroshima and Nagasaki survived with often little or no injury, the key nuclear war civil defense fact later censored out by Russian fellow traveller groupthink propaganda for Western nuclear disarmament and surrender; so in 1972 Janis wrote his classic book “Victims of groupthink: A psychological study of foreign-policy decisions and fiascoes”, examining how politically expedient “let’s agree to disagree socialism”-mentality (whereby justifiable dissent and better alternatives are simply suppressed or ignored by management to create the illusion of unity under leadership of Herr Fuhrer, Mein Tsar) leads to all the big avoidable disasters, from Kennedy’s moonlit Bay of Pigs invasion without USAF air support, to the Vietnam war sellout fiasco. Yet there are still people today who believe in bizarre and nefarious nazi groupthink philosophy, including “collaboration, teamwork and consensus”. Crazy stuff indeed!

ABOVE: Raeto West did not find a collaborator in dissident scientist Dr Hillman, a “censorship in science” associate of Catt. West claims falsely that Jews are responsible for groupthink (it is actually the Nazis and other dictators) and states at https://big-lies.org/science-revisionism/index.html#jq: “By approx 2022 Ivor Catt came to the conclusion that most or all people at the top of scientific and supposedly learned societies were mediocre and incompetent and politically unskilful.  He found plenty of evidence: the head of the Electrical Engineers complained he lived in a semi. Josephson of the Josephson Junction didn’t know how his junction worked. He knew education was in a mess, but never worked out why some people, in secret, wanted it that way. He knew immigration and housing were manipulated, but got nowhere in analysis. He knew (from experience) that the legal system ‘did not exist’ (his words, I think) without seeing beyond that.

His site reincarnates the old, mad, bad conspiracy theories of Mein Kampf and Das Kapital. What worries me now is how easily Hamas was able – by invading Israel and slaughtering kids – to start a war which is turning much of the mad media back to how it was in the 1930s. These are no longer fringe crackpots being censored out by the media. They’re crawling back out of the woodwork.

Update (24 March 2024): at least Dr Woit now finally comments damningly about the groupthink issue: https://www.math.columbia.edu/~woit/wordpress/?p=13864&cpage=2#comment-245352

“Unfortunately I don’t think fundamental physics was actually exciting 20 years ago. It was already in a very bad way, with a huge amount of outrageous hype to the public about “exciting” ideas like the multiverse. Now the hype level has died down and the public is more aware of the problems with string theory, but the situation of actual fundamental theory research is actually significantly worse than 20 years ago.

“What am I angry about? I’m looking back at a 40+ year career that started when the field was very healthy, and watched it be essentially destroyed and killed off. That the people who did this are holding public events assigning themselves grades of “A+++” is just disgraceful, and specifically aimed at making impossible any acknowledgement of what happened and any drawing of lessons for the future.”

Update (26 March 2024): “A Report From Mochizuki” – “To summarize the situation before yesterday, virtually all experts in this subject have long ago given up on the idea that Mochizuki’s IUT theory has any hope of proving the abc conjecture. … All experts I’ve talked to agree that Scholze/Stix are making a credible argument, Mochizuki’s seriously lacks credibility. The one hope for an IUT-based proof of abc has been the ongoing work of Kirti Joshi, who recently posted the last in a series of preprints purporting to give a proof of abc, starting off with “This paper completes (in Theorem 7.1.1) the remarkable proof of the abc-conjecture announced by Shinichi Mochizuki…”. My understanding is that Scholze and other experts are so far unconvinced by the new Joshi proof, although I don’t know of anyone who has gone through it carefully in detail. … Mochizuki’s new report destroys any such hope, simultaneously taking a blow-torch to his own credibility. …” From the comments section: “This is painful to look at. It does not seem to me that Prof. Mochizuki is well.” Perhaps questioning someone’s health is now a substitute for rigor in math proofs? The ignorant, poor old amateur mathematician might have been persuaded in ancient times that math is an elite discipline based on logical proofs. Nowadays if an ill celebrity publishes a photoshopped pic, they are a hero not a fraud. Surely the same applies to math proofs? Or are there double standards?

Purity of mathematics via elitist eugenics for extermination of allegedly inferior ideas?

Update (30 March 2024): the many interpretations of quantum mechanics versus the path integral of QFT

Rob Wilson has a post on the spin entanglement mechanism, which I have to object to very strongly:

“… the two measurements will still be opposite to each other, almost all the time. That is how entangled electrons work. It is also how entangled photons work. And it completely explains all of the experimental properties of measurements of entangled particles.”

Two particles are supposedly emitted in opposite directions by the same decay, with correlated spins. I’m sorry, but we have to be clear that this is a fundamental error. You cannot assume such an emission takes place, because of 2nd quantization. Every time you think you are emitting a “single” particle or a “pair of them with correlated spins”, you’re actually emitting a vast number along a huge number of paths (the path integral), most of which interfere and thus apparently “cancel out”. This is totally ignored by Einstein, Podolsky, Rosen, Bohm, Bell and Aspect.

So until the measurements are made at the detectors, there are never two particles with wavefunction “entanglement” (the false “1st quantization”, i.e. QM, model used in Aspect’s analysis of experimental data). There are loads of particles emitted, and Aspect’s detectors pick up two apparent particles that are actually resultants of a path integral. I urge you to look at the pictures of this path integral effect for reflection of light by mirrors and refraction in prisms in Feynman’s 1985 book QED. Feynman developed the path integral, then had to fight – with a bit of help from Dyson and Bethe – against Oppenheimer, Einstein, Pauli and Bohr to get it accepted, yet Feynman’s wonderful application to light in his 1985 book QED is censored out by the mainstream of QFT. It is really, really fundamental, it has hard evidence behind it, and it’s being ignored! A photon is not simply emitted by a light bulb and carried to your eye. Instead, every single photon that arrives on your retina is a superposition of numerous virtual photons, most of which are out of phase and cancel out. You have to have this model to explain the principle of least action, e.g. light takes the route of least time if it has to go through different media at different velocities (air, glass, water etc). It’s a scandal that Aspect uses 1st quantization!

Update (30 March 2024): Wilson is putting off the path integral until brainwashed by nonsense that causes confusion, which is a disaster. (I’m not hoping Wilson will come over to the Feynman view of the world and mathematicize it properly – that would be hoping for too much – but I’m just trying to stop him disappearing down the usual loony route where everyone gets brainwashed by 1st quantization, finds out it fails to give a mechanical explanation of stuff like “entanglement” (used mostly as an excuse by journal editors to reject any paper/idea that could lead anywhere beyond existing paradigms), then concludes – wrongly – that “nobody understands quantum mechanics”, which was Feynman’s own view back in the 1960s before he made progress in applying the simplified path integral – 2nd quantization – to simple atomic and light phenomena!) Thanks, it’s up to you, but if you want to get a physical handle on the mathematics of the “path integral” (better, a discrete summation of geometric paths), you really need to start with the pictures (suitable for an 8-year-old) in Feynman’s 1985 book QED: The Strange Theory of Light and Matter. There’s no equation in that book and most of the text is drivel (except the fun of him kicking the “uncertainty principle” into oblivion in a footnote!), as usual for his books, so don’t read the text, just look at the pics.

Just look at the diagrams and convert them into simple equations of geometric paths, and allow it to sink in. Feynman was finally getting into applying the path integral to basic everyday physics, and the results are phenomenal. (I’m convinced that this is the way to go. I won’t link to my papers on this at vixra, giving quantitative evidence that emission of radiation from electrically charged fundamental particles is at Hawking’s black hole radiation formula rate – see pages 32 and 35-6 of my Nov. 2011 paper, which needs updating urgently – but Feynman’s new approach enables you to understand all radiation emission and reception correctly, and this explains the zero point field vacuum (dark) energy. You get “ridiculous” numbers from Hawking’s radiation formula for an electron: ~10^53 K temperature and ~10^205 watts/m^2 Hawking radiating power from the black hole electron, or ~10^92 watts for its entire radiating surface area. When you calculate the exchange forces resulting from such charged massless radiation (most of it can only be exchanged because – being charged and massless – it has infinite magnetic self-inductance, so it can’t simply propagate unless the B fields are cancelled out by a two-way exchange equilibrium between charges, thereby explaining the physics of the Z.P. field), the huge radiating power is needed to offset the very tiny cross-sections and produce the observed force couplings. The application of this to “real” photons is that the acceleration of a charge upsets the symmetry of exchange, and the resulting asymmetry is the real photon!)
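The “ridiculous” numbers quoted here are easy to reproduce from the standard Hawking temperature and Stefan-Boltzmann formulas applied to a black hole with the mass of an electron (my own order-of-magnitude check, photon emission only):

```python
import math

# SI constants (CODATA values, rounded)
hbar = 1.0546e-34   # J s
c = 2.9979e8        # m/s
G = 6.674e-11       # m^3 kg^-1 s^-2
k_B = 1.3807e-23    # J/K
sigma = 5.670e-8    # Stefan-Boltzmann constant, W m^-2 K^-4
m_e = 9.109e-31     # electron mass, kg

T = hbar * c**3 / (8 * math.pi * G * m_e * k_B)  # Hawking temperature
r = 2 * G * m_e / c**2                           # Schwarzschild radius
area = 4 * math.pi * r**2                        # radiating surface area
flux = sigma * T**4                              # W/m^2 (black-body photons)
power = flux * area                              # total radiated power

print(f"T ~ {T:.1e} K, flux ~ {flux:.1e} W/m^2, power ~ {power:.1e} W")

# Order-of-magnitude check against the figures in the text:
assert 1e52 < T < 1e54       # ~10^53 K
assert 1e204 < flux < 1e206  # ~10^205 W/m^2
assert 1e91 < power < 1e93   # ~10^92 W
```

The three quoted orders of magnitude all come straight out of the textbook formulas, so the arithmetic in the parenthesis above checks out.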

Update (from Woit’s blog):

How I fell out of love with academia

“… in the field of fundamental theoretical physics, she is quite right that most academic research is now bullshit. This is … about the continuing disaster of overwhelming bullshit that has afflicted a field …” – Dr Peter Woit, April 5, 2024 at 1:45 pm at https://www.math.columbia.edu/~woit/wordpress/?p=13907#comment-245508

DISCLAIMER: I’ve added Sabine’s recent video on mainstream peer reviewed modern physics “bullshit” destroying the subject, but please note that this does NOT constitute an endorsement of her 1st quantization (one wavefunction per particle, instead of one wavefunction per “infinite” path interaction of each particle!) junk physics decoherence “entanglement” crap video and papers. She has previously called quantum gravity facts “crap” so I return the compliment to her unpredictive crap. Likewise, when I quote Woit: what I’m doing is showing that mainstream bigotry is not a “conspiracy theory” but a reality, just as the US government and other groupthink quackery taxpayer-leeches are not in a “conspiracy” but merely a corrupted “teamwork” effort to undermine nuclear deterrence of the invasions that set off world wars. This crap has been with me since I had blocked inner-ear hearing up to age 10 and resulting speech problems (hearing and speaking only low frequencies), and saw the bloody Nazi groupthink crap which results from any slight deviation from “normality”.

UPDATE: above, moderation queue crap YET AGAIN censoring unfashionable realities in science. Why bother? If I don’t keep trying, bigots use the fact that my comments don’t appear as if they DON’T EXIST OR WERE NEVER SUBMITTED. In other words, as with the “paranoid groupthink deceptive Nukemap” rubbish, they are abusive and lie: “look, LOADS of comments ALL GENERALLY AGREEING with these people you call thugs are allowed on their blogs, which PROVES that they are NOT REALLY CENSORING ANYTHING!” Nope. They FILTER the comments they allow and simply obstruct those that debunk them!

ABOVE: Selective deletion of comments that deliberately biases posts at Dr Woit’s Not Even Wrong. Same occurred in 1930s “debates” about Nazi threat: as Dr Freeman Dyson relates in his 1984 book “Weapons and Hope”, Herman Kahn told him: “you had to be paranoid” (and thus get censored as “paranoid” from mainstream media!) to believe Japan would bomb Pearl Harbor in 1941, or that the Nazis would try to exterminate the Jews (despite what Hitler had written in Mein Kampf in jail in 1923, or rather had dictated to Rudolf Hess in jail). Dr Irving Janis, originator of RAND Corp research on “duck and cover” (see his 1951 “Air War and Emotional Stress”) and the term “groupthink”, goes into this in his book “Victims of Groupthink”: anyone warning of any disaster is basically called “paranoid” and censored by the media. Very convenient, since the media gets a hell of a lot more money out of selling “disaster” news “stories” it creates this way, than it would earn out of campaigns to prevent disasters in the first place, by publishing warnings that help avert tragedies in advance! Thus, Janis goes into the reasons why Churchill’s factual warnings of Nazi and Commie genocide were suppressed by “free presses”, to cause war: Churchill was simply insulted as a “warmonger”, repeatedly, and even before WWI (see Professor Cyril E. M. Joad’s 1939 book “Why War?”).

Still nobody dares criticise the “free press” for Nazi fascism collaboration, any more than condemning the BBC for promoting the Hamas war, or helping USSR fake “peace” (war) propaganda! Likewise, if anyone dares criticise Woit for not being hard enough to make headway against quackery and bigotry rotting the heart of “free debate”, he just censors the comments, leaving the fascist siding “mass media” of quackery to use that censorship as “evidence” there is free debate and nobody has anything to say in the debate that changes the direction of travel! Crap. (Yes, Woit and Witten and Dr Goebbels are “free” on their own blogs to delete what the hell they want, but if that deliberately biases the resulting comment thread to make it appear that “everybody is ignorant, or nobody has anything that really debunks string theory”, then all anyone reasonable can respond is: Quacks!)

“In my freetime I practised singing in the choir of the monastery church at Lambach, and thus it happened that I was placed in a very favourable position to be emotionally impressed again and again by the magnificent splendour of ecclesiastical ceremonial. What could be more natural for me than to look upon the Abbot as representing the highest human ideal worth striving for, just as the position of the humble village priest had appeared to my father in his own boyhood days? At least, that was my idea for a while. But the juvenile disputes I had with my father did not lead him to appreciate his son’s oratorical gifts in such a way as to see in them a favourable promise for such a career … I chanced to come across some publications that dealt with military subjects. One of these publications was a popular history of the Franco-German War of 1870-71. It consisted of two volumes of an illustrated periodical dating from those years. These became my favourite reading. In a little while that great and heroic conflict began to take first place in my mind. And from that time onwards I became more and more enthusiastic about everything that was in any way connected with war or military affairs. … Why did not Austria also take part in it? Why did not my father and all the others fight in that struggle? Are we not the same as the other Germans? Do we not all belong together? … I would not become a civil servant. No amount of persuasion and no amount of ‘grave’ warnings could break down that opposition. I would not become a State official, not on any account. … My father forbade me to entertain any hopes of taking up the art of painting as a profession. I went a step further and declared that I would not study anything else.”

  • A. Hitler, Mein Kampf, chapter 1.

What we see here in this example is the tragedy of coercion: the dictator springing from the censorship of freedom, by the controlling behaviour of those who think they are elites, but who sow the seeds of war!

UPDATE (13 April 2024): THE REAL ELITIST CRACKPOTISM OF THE CENSORSHIP THUGS

Sticking up a finger at genuine commenters, whose constructive work he simply deletes, Woit has decided to publish the following greasy praise for himself:

April 12, 2024 at 7:31 pm, Interested Amateur says: “One thing I’ve been very impressed with over the past twenty years is how Peter still doesn’t bear a grudge against those within the String community that were outrageously abusive towards him on their blogs.”

The post the comment appears in quotes, as an expert, the string “theorist” (there is no “string theory” of substance whatsoever, contrary to millions of books of Marxist-Leninist-style pseudomathematics crap by overpaid “academics”), Dr Lubos Motl, who appeared to issue death “threats/wishes” against Dr Woit a decade ago after Dr Woit published his book on string theory, Not Even Wrong:

So what is my point here in bringing up this “unfortunate storm-in-a-teacup of the marvellous fantasy of 10 dimensional superstring surface membranes on 11 dimensional supergravity bulk with 10^500 metastable vacua representing either an anthropic landscape of different theories of gravity or a landscape of different parameters of spin-2 quantum gravity, that, per se, overpredicts everything so absurdly it can never ever be checked even by someone with access to the fantasy of Dr Who’s Tardis; plus a load of bollocks nobody will ever be able to check as well”? My point is sheer hypocrisy. Woit is so up the elitist ivory tower coffee pot spout, he can’t grasp the difference between string theory terrorist quackery, and genuine suppressed alternative ideas, at all. Yes, he has every right to censor me and to endlessly promote Lubos, because it’s his bloody blog. OK, I get that. Just as Hitler had every right to publish Mein Kampf, and Marx and Engels the Communist Manifesto. Fine. Publish and censor what you bloody want. But don’t expect me to write lies saying you’re sane on my blog, here. You’re all loons, loons, loons. Cheers.

(I might need to add a note here about my discussions in physics with Lubos Motl. I repeatedly tried to have discussions with him around 20 years ago, to no avail. The same occurred with the electronics engineer Ivor Catt, Peter Woit, and various others. These people all have an elitist groupthink worldview in which they are allowed to gently probe mainstream anomalies and alleged errors and suggest solutions, but they then try fanatically to use corrupt censorship techniques to block innovation that leads anywhere fast. It’s totally absurd paranoid elitist fanatical groupthink delusion. But they’re “good” at trying to project their own mental illnesses onto those they foolishly oppose. At least, the dumb (m)ass media believe them; they’re “good” at propaganda in that sense. Their mathematical physics is wooden, needlessly complicated so that the wood is lost in the trees, and their mechanistic reasoning is non-existent.)

Above (15 April 2024 update): Woit allows crazy comments, but not sensible ones. By excluding or falsely denouncing anybody telling the truth, he keeps his blog clear of anything except storms-in-teacups. Very convenient for the Moriarties and Wittens of this world to have an ineffective critic: “Look, Woit allows comments, nobody appears to say sense, so no sense exists. QED!” Again, this is the stuff I had over 20 years ago in trying to find somebody interested in a disproof of string theory’s spin-2 graviton on “Physics Forums”. Moderators there would only allow fake “no-go theorems” such as the fake claim that the path integral of quantum gravity proves spin-2 (nope, see here, here, here, here, here etc.): in reality, you can’t have a path integral of discrete interactions that makes any sense because the discontinuities are not properly represented by calculus, especially in the reductionist limit which specifically considers just a few particles interacting!

That’s why Feynman’s perturbative expansion (which is physically defensible) has an infinite number of terms. Really, you should do 3-d Monte Carlo simulations of field quanta being exchanged between charges, allowing for mechanisms of energy absorption by the screening of vacuum polarization between IR and UV cutoff distances (which are inversely proportional to collision energies) around particle cores. So the mathematical tools in current use in QFT are themselves obfuscating and stagnating progress, especially when it comes to gravity, which is more like a Casimir force, e.g. metal plates pushed together by the fact they exclude wavelengths longer than the gap between them, like a partial Faraday cage, and thus “shield” one another in LeSage fashion from virtual photon exchanges with some of the general background “zero point or dark energy” of the vacuum; without the zero point field of QFT being debunked by a “no-go theorem” of the crass Teller kind: “earth would be slowed down and heated up if it passed through any dark energy field in space”.
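As a toy illustration of the kind of Monte Carlo called for above (and it is only a toy: isotropic incoming quanta, a hard geometric shadow cone, no absorption physics; all the parameters below are made up for illustration), one can sample an isotropic momentum flux at one particle and block the solid angle shadowed by a neighbour. The residual imbalance points toward the neighbour and scales with the shadowed solid angle, i.e. roughly as 1/r² in the LeSage fashion:

```python
import numpy as np

rng = np.random.default_rng(42)

def net_push(r, shadow_radius=0.1, n_quanta=400_000):
    """Net momentum imbalance on a particle at the origin when a
    neighbour of radius `shadow_radius`, at distance r along +x,
    blocks the quanta that would arrive from its direction."""
    # Isotropic unit vectors: directions *from which* quanta arrive
    v = rng.normal(size=(n_quanta, 3))
    v /= np.linalg.norm(v, axis=1, keepdims=True)
    # A quantum arriving from direction u imparts momentum -u, so the
    # full isotropic sum is ~0.  Removing the shadowed cone toward +x
    # leaves a net momentum of +sum(u_blocked): a pull toward the neighbour.
    cos_cone = r / np.hypot(r, shadow_radius)   # cosine of cone half-angle
    blocked = v[:, 0] > cos_cone
    return v[blocked].sum(axis=0), blocked.sum()

push1, n1 = net_push(1.0)
push2, n2 = net_push(2.0)
print("net push at r=1 points toward neighbour:", push1[0] > 0)
print("blocked-count ratio r=1 vs r=2 (expect ~4 for 1/r^2):", n1 / n2)
```

The blocked solid angle is (1 − cos θ)/2 with θ = arctan(a/r) ≈ a²/(4r²) for r ≫ a, which is where the inverse-square behaviour of the toy model comes from.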

Someone (probably Dad) once said that the best place for a revolutionary to be is in the town square, arguing with the laggards. Professor Cyril Joad, the second-greatest (after Angell) pro-Nazi “pacifist” propagandist of all time – he won the 1933 Oxford Union motion to surrender to Hitler and in 1939 published a book called “Why War?” arguing (just like Charlie Chaplin in “The Great Dictator”) that Nazism can be defeated without spending a cent on bullets by laughing at goose-stepping invaders (a “funny” strategy exploited in the form of popular music like “We’re gonna hang out our washing on the Siegfried Line”, which suddenly became slightly less amusing after the B.E.F. at Dunkirk was kicked out of France by Panzers, and had to be evacuated in small boats like the economic migrants fleeing from the economic and war terrors allegedly raging in modern France today) – was rewarded for being a traitor by endless promotion on BBC radio as being the world’s great philosopher. Joad would always begin his answers to questions on BBC radio with the disclaimer: “It all depends what you mean by —–!” Anyway, what got Joad in the end was his exposure as a fraud. Not for lying to everyone that if we simply disarm and surrender to avoid being gassed by bombs, Hitler will provide free comedy, rather than “peaceful” gas chambers, but by what the public and the BBC perceived as a real fraud: not buying a railway ticket! That’s how “God” works. The same happened in the USA to the mass killer, gangster Al Capone: he was jailed for tax evasion. The hidden message (as Hitler stated it): the public loves big lies, but hates small lies!

(Basically Woit’s problem is believing that extremely sophisticated mathematics is needed to get a simpler understanding of unification issues, without mechanistic understanding. That’s like pulling a hundred rabbits and a thousand doves out of a hat which is made out of perspex. It’s asking too much. The only way forward is a hybrid of mechanism, simplifying – not complexifying – mathematics so that you have a chance to get a mental pictorial “handle” for grasping the dynamics of what’s physically going on regarding charge shielding by polarized vacuum phenomena aka “renormalization”, Monte Carlo simulations of such phenomena to reproduce the perturbative expansion result, thus at least checking your understanding by calculating the same result using two very different methodologies, and other well-established theoretical physics “tricks” used to understand physical phenomena which are used by everybody in theoretical physics, except for the dumb-ass Hitlers who dominate fake “unification” crap.)
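For concreteness, the “charge shielding by polarized vacuum” (renormalization running) mentioned above has a well-known closed one-loop form in QED. A minimal sketch keeping only the electron loop (so it deliberately under-counts the full Standard Model running, which needs all charged fermions and gives about 1/128 at the Z mass rather than the ~1/134 this toy produces):

```python
import math

alpha0 = 1 / 137.035999  # fine-structure constant at low energy
m_e = 0.000511           # electron mass, GeV

def alpha_running(Q):
    """One-loop QED running coupling, electron loop only:
    alpha(Q) = alpha0 / (1 - (alpha0 / 3 pi) * ln(Q^2 / m_e^2)).
    The log term is the vacuum-polarization screening correction."""
    return alpha0 / (1 - (alpha0 / (3 * math.pi)) * math.log(Q**2 / m_e**2))

a_mz = alpha_running(91.1876)  # evaluated at the Z boson mass
print(f"1/alpha at M_Z (electron loop only): {1 / a_mz:.1f}")
```

The point of the sketch is only the direction of the effect: the polarized vacuum screens the bare charge, so the effective coupling grows (1/α shrinks) at shorter distances / higher collision energies, exactly the running-coupling behaviour the paragraph above attributes to vacuum polarization.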

What happens when the “free” mass media is given too many brown paper envelopes stuffed with cash in collusion with state terrorism that deliberately promotes paranoia by enforcing loony “conspiracies” (better called: “celebrity-led groupthink fashion“, as Windscale is better called Sellafield, and Tactical Nuclear Weapons are better called Non-Strategic Nuclear Weapons in a deluded attempt at populist obfuscation for governmental cover-ups):

https://hbr.org/1995/05/why-the-news-is-not-the-truth/ (Peter Vanderwicken in the Harvard Business Review Magazine, May-June 1995): “The news media and the government are entwined in a vicious circle of mutual manipulation, mythmaking, and self-interest. Journalists need crises to dramatize news, and government officials need to appear to be responding to crises. Too often, the crises are not really crises but joint fabrications. The two institutions have become so ensnared in a symbiotic web of lies that the news media are unable to tell the public what is true and the government is unable to govern effectively. That is the thesis advanced by Paul H. Weaver, a former political scientist (at Harvard University), journalist (at Fortune magazine), and corporate communications executive (at Ford Motor Company), in his provocative analysis entitled News and the Culture of Lying: How Journalism Really Works … The news media and the government have created a charade that serves their own interests but misleads the public. Officials oblige the media’s need for drama by fabricating crises and stage-managing their responses, thereby enhancing their own prestige and power. Journalists dutifully report those fabrications. Both parties know the articles are self-aggrandizing manipulations and fail to inform the public about the more complex but boring issues of government policy and activity. What has emerged, Weaver argues, is a culture of lying. … The architect of the transformation was not a political leader or a constitutional convention but Joseph Pulitzer, who in 1883 bought the sleepy New York World and in 20 years made it the country’s largest newspaper. Pulitzer accomplished that by bringing drama to news—by turning news articles into stories … His journalism took events out of their dry, institutional contexts and made them emotional rather than rational, immediate rather than considered, and sensational rather than informative. 
The press became a stage on which the actions of government were a series of dramas. … The press swarmed on the story, which had all the necessary dramatic elements: a foot-dragging bureaucracy, a study finding that the country’s favorite fruit was poisoning its children, and movie stars opposing the pesticide. Sales of apples collapsed. Within months, Alar’s manufacturer withdrew it from the market, although both the EPA and the Food and Drug Administration stated that they believed Alar levels on apples were safe. The outcry simply overwhelmed scientific evidence. That happens all too often, Cynthia Crossen argues in her book Tainted Truth: The Manipulation of Fact in America. … Crossen writes, “more and more of the information we use to buy, elect, advise, acquit and heal has been created not to expand our knowledge but to sell a product or advance a cause.” “Most members of the media are ill-equipped to judge a technical study,” Crossen correctly points out. “Even if the science hasn’t been explained or published in a U.S. journal, the media may jump on a study if it promises entertainment for readers or viewers. And if the media jump, that is good enough for many Americans.” … A press driven by drama and crises creates a government driven by response to crises. Such an “emergency government can’t govern,” Weaver concludes. “Not only does public support for emergency policies evaporate the minute they’re in place and the crisis passes, but officials acting in the emergency mode can’t make meaningful public policies. 
According to the classic textbook definition, government is the authoritative allocation of values, and emergency government doesn’t authoritatively allocate values.” (Note that Richard Rhodes’ Pulitzer prize winning books such as The Making of the Atomic Bomb, which uncritically quote Hiroshima firestorm lies and survivors’ nonsense about people running around without feet, play to this kind of emotional fantasy mythology of nuclear deterrence obfuscation so loved by Uncle Sam’s folk.)

ABOVE: we don’t necessarily agree with all of the “conspiracy theories” (aka “fashion based celebrity-backed propaganda fake news used to sell mass media with governmental illicit backing”), but we do believe in freedom of speech, not merely “freedom of the press, freedom of multibillion dollar TV ‘news’, freedom of Putin, freedom of terrorist states”. We do certainly applaud Max’s condemnation of Stanley Kubrick’s cash making propaganda films, albeit not necessarily for Max’s reasons! The same applies to some of the criticisms of democracies in certain Marxist works; they sometimes identify problems, albeit to try to sell an even worse form of government than we have now (i.e. jumping from the frying pan into the fire).

More paranoid delusional stuff claiming falsely to not be celebrity culture obsessed from Woit blog Not Even Wrong.

Update (21 April 2024): Just in case you think I’m only being censored by Woit and Putin fans, the “conspiracy” (whoops, meant to say “fashion obsessed celebrity groupthink road to nowhere club”) for time wasting includes others too. “My new article in the (Peer-reviewed) Journal of Applied Mathematics and Physics describes the evidence for small Black Holes hidden in our solar system, even a Hole in the Earth!” – Louise Riofrio on Facebook 2 days ago! (She ignores the fact that the theoretical derivation of her equation which is confirmed by quantum gravity evidence, shows that all matter is black holes! But nowadays it’s much easier to get speculations past peer-reviewers, than to get proved facts published.)

Update 24 April 2024: after about 20 years of nonsense from Woit dressed up as profundity as a superstring critic, I’m not going to bother trying to get any sense from him anymore. The latest BS from him is on quantum computer hype where he states: “Quanta magazine, Nature and various other places were taken in by a publicity stunt, putting out that day videos and articles about how “Physicists Create a Wormhole Using a Quantum Computer”. … Within a few days though, people looking at the actual calculation realized that these claims were absurd.” The problem here for Woit is that quantum computers are based on 1st quantization, where there is only one wavefunction, ψ ~ exp(iS), per onshell particle, which can become “entangled” with another wavefunction, or be in an “indeterminate state until measured”, when really that is non-relativistic QM and thus wrong, and the fact is that relativistic QFT, which is the correct theory for relativistic QM, has an infinite number of wavefunctions for every onshell particle, ψ = ∑ᵢ exp(iSᵢ), leading to the “path integral” concept where “entanglement” is simply replaced by “multipath interference”. It’s all in Feynman’s QED 1985 (not Distler’s reading of that reference, which is “Feynman and Hibbs 1965”; nope, Feynman QED 1985, which denounced the uncertainty principle of 1st quantization as an effect of multipath interference in the correct theory, QFT). This severely kicks the crap out of the metaphysical world Woit et al inhabit, where Aspect’s “experimentally based” entanglement wavefunction hype is wrong merely because there are no quantum computers. Nope; that hype is wrong because the theory used to interpret the experiments is non-relativistic and wrong.
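The “multipath interference” reading can be illustrated numerically: sum the phasor exp(iS) over every reflection point in Feynman’s QED mirror picture, and the paths near the stationary (least-time) point dominate while the off-centre phasors spin and cancel. A minimal sketch (arbitrary units and a made-up geometry, assuming nothing beyond the phasor sum itself):

```python
import numpy as np

# Source at (0, 1), detector at (2, 1), flat mirror along the x-axis.
wavelength = 0.05
x = np.linspace(0.0, 2.0, 4001)  # candidate reflection points on the mirror
path_len = np.hypot(x, 1.0) + np.hypot(2.0 - x, 1.0)  # source->mirror->detector
phasor = np.exp(2j * np.pi * path_len / wavelength)    # one "arrow" per path

total  = phasor.sum()
centre = phasor[(x > 0.8) & (x < 1.2)].sum()  # paths near the stationary point
edges  = phasor[(x < 0.4) | (x > 1.6)].sum()  # rapidly spinning, self-cancelling

print("|total|  =", abs(total))
print("|centre| =", abs(centre))
print("|edges|  =", abs(edges))
```

Running this shows the central (near-least-time) paths contributing almost the whole amplitude while the edge paths nearly cancel among themselves: “reflection angle equals incidence angle” emerges from the sum, with no uncertainty-principle postulate needed, which is exactly Feynman’s 1985 picture-book argument.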

They’re covering garbage because people want to believe in smarty-pants geniuses and smarty-pants physics, like you do. Only you don’t like their particular version of it. The only way you’re going to get past this is to disprove string theory unification by finding the correct theory for gravity, unification, etc.

What these smarty pants should be looking at is:

  1. why assume the dimensional-analysis numerology of Planck-scale unification of couplings, when the black hole size for, say, an electron’s mass is smaller than the Planck scale and more fundamental.
  2. why couplings should become similar at any scale, when it’s clear from vacuum polarization – which absorbs field energy, giving energy to the polarized “virtual particles” to last longer than Heisenberg’s energy-time uncertainty formula – that this phenomenology creates pions etc. that mediate nuclear forces! In other words, at high energy (short ranges) the physical mechanisms of vacuum polarization required to explain running couplings (propagators) give a mechanism to convert one force field like EM into another type of force field. Is there any mechanism that stops vacuum polarization fields around leptons from containing quarks with gluons etc., at very short distances? Or don’t you guys bother to think past your textbook SM equations from over 50 years ago? Witten leads on this nutcase groupthink quackery conspiracy to censor reality out of physics:

War mongering, paranoid, elitist, homophobic, sly “superstring defending” (by deleting genuine comments debunking superstring entirely, while pretending to attack superstring himself using strawman arguments) Columbia University “physicist” rant in 2016 about Trump: “To those planning on voting for Donald Trump: Please don’t. … note that he has reportedly told Peter Thiel that he would like to appoint him to the Court. Thiel is a gay, radical libertarian Silicon Valley billionaire from San Francisco with highly eccentric views. … This same argument applies to those opposing Clinton and supporting Trump’s election on grounds such as “she’s a war-mongerer, unlike Trump”, since (on some days) Trump claims to oppose US military interventions abroad. If you really believe that “Make America Great” means Trump will institute a policy of restraint on the use of the US military, I think it will likely be just a few weeks into the Trump administration before you find out that you, like your right-wing brethren in flyover country, have been conned. … You’re angry at well-off coastal elites who you feel look down on you and your culture, and you want to spit in their face by voting for Trump. If so, you are quite right to feel the way you do. From a lifetime spent among such elites I can tell you that, yes, they do look down on you. Most people here in New York City probably do think you’re an ignorant racist. Your problem though is that Donald Trump is one of us.”

Dictatorial fake physics leader Witten (hero of Woit) totally ignores Woit’s anti-string arguments:
