http://vixra.org/abs/1302.0004

**RECENT COMMENTS BY GERARD ‘T HOOFT ON PEER REVIEWS OF THIS PAPER:**

http://vixra.org/abs/1111.0111 was submitted (Foundations of Physics submission FOOP2945) to Gerard ‘t Hooft, Chief Editor of “Foundations of Physics”, who emailed on January 11, 2012: “Both the structure and the unduly high degree of speculativeness of the arguments presented in this manuscript place it outside the scope of Foundations of Physics.” This is precisely the opposite of the confirmed, fact-based predictions given in the paper, and precisely what the paper itself says about mainstream “string theory” trash hype, which contains no checkable predictions and is poorly structured, with a landscape of 10^{500} metastable vacua. However, to remove all excuses, a briefer version, cut from 63 pages to 7 and now hosted at http://vixra.org/abs/1302.0004, with the detailed literature survey of 43 references completely removed, was prepared to focus concisely on the key prediction and its confirmed, factual basis (Foundations of Physics submission FOOP-D-13-00076). On 28 February 2013, Gerard ‘t Hooft emailed, reversing his original 2012 criteria: “The author of this manuscript fails to make clear how his work relates to current discussions in the foundations of physics. Regrettably, this fact places the current submission outside the scope of Foundations of Physics. This is displayed by a lack of references to recent literature.”

This contradicts the original submission, which *did* have a recent literature survey of 43 references (http://vixra.org/abs/1111.0111) and a very detailed discussion of how the new result overthrows “current discussions in the foundations of physics.” Those 43 references were removed in the resubmission precisely to force the peer reviewers to focus on the accuracy of the scientific calculations and their factual, defensible basis. First he claimed that the discussion of the problems in existing research and the 43-reference literature survey had distracted him from seeing the factual basis of the confirmed predictions; then, when the references and literature discussion were removed, he reversed his argument and simply ignored the facts presented in the paper, complaining instead that the 43 references and literature discussion were now missing! This contradiction comes from contriving inconsistent, trivial reasons for ignoring the hard science in both papers.

However, we’ll improve the paper in an effort to reach a compromise, and see what happens. Notice that the role of “Foundations of Physics” (and all other journals) is no longer to physically communicate science or data (which anybody can put on the internet), but is purely advertising/marketing/publicity/hype. With the internet available, nobody needs to publish in this or that journal/newspaper/TV show in order to make information directly available to people who actually want it.

Instead, the role of these media is all about advertising or hyping a result, in other words, it is the purely unscientific, political act of making a song and dance out of science just to attract serious funding for further research. (Peer review politics is described in http://vixra.org/abs/1211.0156.)

It should be added that “Foundations of Physics” editor Gerard ‘t Hooft (who proved that the U(1) × SU(2) electroweak theory is renormalizable, since the infinite-momenta problem disappears in the UV or high-energy unbroken-symmetry limit where the SU(2) field quanta lose their mass, thus helping to solidify the current dogma which excludes quantum gravity) is the author of misleading and unpredictive papers on QM, including “Determinism beneath quantum mechanics”, whose Abstract states:

“Contrary to common belief, it is not difficult to construct deterministic models where stochastic behavior is correctly described by quantum mechanical amplitudes, in precise accordance with the Copenhagen-Bohr-Bohm doctrine. What is difficult however is to obtain a Hamiltonian that is bounded from below, and whose ground state is a vacuum that exhibits complicated vacuum fluctuations, as in the real world. Beneath Quantum Mechanics, there may be a deterministic theory with (local) information loss. This may lead to a sufficiently complex vacuum state, and to an apparent non-locality in the relation between the deterministic (“ontological”) states and the quantum states, of the kind needed to explain away the Bell inequalities.”

He also states on page 1:

“The need for an improved understanding of what Quantum Mechanics really is, needs hardly be explained in this meeting. My primary concern is that Quantum Mechanics, in its present state, appears to be mysterious. It should always be the scientists’ aim to take away the mystery of things. It is my suspicion that there should exist a quite logical explanation for the fact that we need to describe probabilities in this world quantum mechanically. This explanation presumably can be found in the fabric of the Laws of Physics at the Planck scale. … Attempts to reconcile General Relativity with Quantum Mechanics lead to a jungle of complexity that is difficult or impossible to interpret physically. … What we need instead is a unique theory that not only accounts for Quantum Mechanics together with General Relativity, but also explains for us how matter behaves.”

The problem with what he writes is that he ignores Feynman’s solution in his 1985 book *QED*: the “uncertainty principle” is just the result of multipath interference in 2nd quantization, i.e. there is a separate wavefunction amplitude (psi) for each potential interaction between an orbital electron and a Coulomb field quantum. There are numerous ways an orbital electron can interact with the Coulomb field quanta that bind it into its orbit. Each potential interaction has a wavefunction amplitude; to find the probability of an electron taking a particular path, you sum the wavefunction amplitudes for all the electron interactions with field quanta that make it take that path, then work out the sum of histories for all paths. You square the modulus of the results to get relative probabilities, then divide the result for the chosen electron path by the result for all possible paths to get the absolute probability. There is no reality to first quantization or the usual “quantum mechanics” hype with its “indeterminacy principle”: it is non-relativistic and considers only a single wavefunction amplitude for each onshell particle (e.g. only one wavefunction amplitude for each orbital electron). In reality there is no single wavefunction amplitude for an electron, so Schroedinger’s equation is misleading: there is a separate wavefunction amplitude for every potential interaction between an electron and a quantum of the Coulomb field (i.e., “field quanta”). The huge number of possible interactions have wavefunction amplitudes which mostly interfere and cancel out, unless they have very small action (in comparison to Planck’s constant over twice Pi, i.e. h-bar).
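The recipe above can be sketched numerically. This is a toy illustration only: the actions and path labels below are invented for demonstration, not taken from the paper. Each candidate interaction contributes an amplitude exp(iS) with action S in h-bar units; amplitudes are summed per path, squared moduli give relative probabilities, and dividing a chosen path’s result by the total over all paths gives the absolute probability.

```python
import cmath

# Hypothetical actions (in h-bar units) for the interactions that would
# steer the electron along each candidate path. Invented toy numbers.
interaction_actions = {
    "path_A": [0.10, 0.15, 0.30],   # small actions: amplitudes add coherently
    "path_B": [2.90, 6.10, 9.40],   # large actions: amplitudes largely cancel
}

def relative_probability(actions):
    # Sum the amplitude exp(iS) over every contributing interaction,
    # then square the modulus of the resultant.
    resultant = sum(cmath.exp(1j * s) for s in actions)
    return abs(resultant) ** 2

relative = {path: relative_probability(s)
            for path, s in interaction_actions.items()}

# Absolute probability: divide the chosen path's relative probability
# by the total over all paths.
total = sum(relative.values())
absolute = {path: r / total for path, r in relative.items()}
```

Running this shows the small-action path dominating, since its amplitudes interfere constructively while the large-action amplitudes mostly cancel, which is the cancellation mechanism described in the paragraph above.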

Feynman argued (*QED,* Princeton U.P., 1985) that multipath interference (i.e. the Coulomb field quanta of 2nd quantization) provides a simple mechanism to replace the uncertainty principle of non-relativistic 1st quantization. Why not go further in this direction and simply replace the usual complex path amplitude exp(iS) (where action S is in h-bar units) with just its real component, cos S? [From Euler’s formula: exp(iS) = cos S + i sin S.] When you think about it mathematically, exp(iS) is a vector on the complex plane (Argand diagram), while cos S is a scalar amplitude. All cross-sections and other observables calculated from a path integral [summing exp(iS) contributions] are real numbers, hence the resultant arrow must always be parallel to the real axis, so you get exactly the same result using exp(iS) or cos S. You aren’t losing complex plane directional information that has any use in the practical calculations of QFT. It seems that the only reason to stick to exp(iS) is historical, going back to Dirac’s derivation of exp(iHt) as the amplitude for a single wavefunction from Schroedinger’s equation, where the periodic real solutions produce the quantization. If you’re doing 2nd quantization, multipath interference for large path actions is the mechanism for quantization, so you don’t need Schroedinger’s equation (which is a non-relativistic approximation).
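The exp(iS) versus cos S claim can be checked numerically in the special case the argument relies on: the resultant arrow lying parallel to the real axis. In this sketch that condition is enforced by hand, using an invented toy set of actions symmetric about zero so the sine contributions cancel pairwise; it illustrates the stated case rather than proving the general claim.

```python
import cmath
import math

# Toy path actions (in h-bar units), chosen symmetric about zero so that
# the imaginary (sine) parts of exp(iS) cancel pairwise. Invented numbers.
actions = [-2.4, -1.1, -0.3, 0.3, 1.1, 2.4]

# Standard sum of complex path amplitudes exp(iS):
complex_resultant = sum(cmath.exp(1j * s) for s in actions)

# Real-component-only sum, cos S:
real_resultant = sum(math.cos(s) for s in actions)

# With the imaginary parts cancelled, the squared moduli agree:
print(abs(complex_resultant) ** 2, real_resultant ** 2)
```

For an asymmetric set of actions the imaginary part of a partial sum need not vanish, so the agreement here holds exactly in the symmetric-cancellation case described in the text.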

Finally, there is an interesting exchange of blows between ‘t Hooft and Peter Woit on Woit’s *Not Even Wrong* weblog post of 13 August 2012, *’t Hooft on Cellular Automata and String Theory*, where Woit writes: “Gerard ’t Hooft in recent years has been pursuing some idiosyncratic ideas about quantum mechanics. … those who are interested might like to know that ’t Hooft has taken to explaining himself and discussing things with his critics at a couple places on-line, including Physics StackExchange, and Lubos Motl’s blog. If you want to discuss ’t Hooft’s ideas, best if you use one of these other venues, where you can interact with the man himself. One of ’t Hooft’s motivations is a very common one, discomfort with the non-determinism of the conventional interpretation of quantum mechanics. The world is full of crackpots with similar feelings who produce reams of utter nonsense. … I don’t think what he is producing is nonsense. It is, however, extremely speculative, and, to my taste, starting with a very unpromising starting point. Looking at the results he has, there’s very little of modern physics there, including pretty much none of the standard model (which ’t Hooft himself had a crucial role in developing). If you’re going to claim to solve open problems in modern physics with some radical new ideas, you need to first show that these ideas reproduce the successes of the established older ones.”

‘t Hooft wrote in a comment there to respond to the criticism: “I did not choose to side with Einstein on the issue of QM, it just came out that way, I can’t help that. It is also not an aversion of any kind that I would have against Quantum Mechanics as it stands, it is only the interpretation where I think I have non-trivial observations.”

Woit then replied: “I hope you’ll keep in mind that I often point out that “Not Even Wrong” is where pretty much all speculative ideas start life. Some of the ideas I’m most enthusiastic about are certainly now “Not Even Wrong”, in the sense of being far, far away from something testable.”

That certainly is nothing to be proud of; checkable predictions are hyped as being more important than politics for science, but the socialist dictators in charge of the journals prefer politics (literature surveys of nonsense) to hard calculations.