Peter Woit vs. elitist snobbery by journals

Peter Woit, the wavefunction amplitude assumption by J.S. Bell and other non-relativistic 1st quantization quacks, the multiple wavefunction amplitudes for the path integral which debunk Bell’s inequality assumptions, the censorship of facts by inequality dictators, elitist snobbery by journals, and the freedom of the press barons and supposedly liberal communists to censor unfashionable facts and half-baked ideas

Here’s some less funny news for a change.  As Peter Woit finishes off his monumental and very interesting textbook, Quantum Theory, Groups and Representations: An Introduction, currently 601 pages (October 20, 2016 version), which I will review when completed, he’s taking politically correct potshots at journals which publish half-baked heresies by outsiders:

Retraction at Annals of Physics

Retraction Watch reports that Annals of Physics has removed a recently published article by Joy Christian, replacing it by a publisher’s note that just says:

“This article was erroneously included in this issue. We apologize for any inconvenience this may cause.”

The paper is available on the arXiv here. Christian’s affiliation in the abstract is listed as “Oxford”. This refers to the Einstein Centre for Local-Realistic Physics which is not at Oxford University, but at a location in the town that I think I unknowingly walked past on my way to go punting last week. The only person involved with the centre who lists an academic affiliation is Dr. Jay R. Yablon (MIT), who appears to be a patent attorney in Schenectady.

This story brings back memories of the Bogdanov affair of 2002, one aspect of which was the publication by the Bogdanovs in Annals of Physics of a paper that, as far as I could tell, made little sense. That paper was never removed or retracted. The editor-in-chief when the Bogdanov paper was accepted was Roman Jackiw. Frank Wilczek took over from him and said at the time that he was hoping to improve the journal’s standards. The current editor-in-chief is my Columbia colleague Brian Greene.

Comments are off since I would rather not host a discussion involving the merits of this paper. I haven’t tried to seriously read it, and don’t want to spend time doing so. In the Bogdanov case I spent (wasted…) a lot of time reading their papers, so felt comfortable discussing them, not about to do the same in this case.

That is, at first glance, plainly arrogant and unreasonable: if you “don’t want to spend time” trying to seriously read a paper, you’re not qualified to make any comment whatsoever about it.  However, we live in a world where publicity is so precious that “all publicity is good publicity”: taking flak for supposed heresies and being crucified for it, whether your name is Jesus Christ or Donald Trump, may be the only way to motivate people to say anything, to have any real debate.

The withdrawal notice states:

“This article has been withdrawn at the request of the Editors.  Soon after the publication of this paper was announced, several experts in the field contacted the Editors to report errors.  After extensive review, the Editors unanimously concluded that the results are in obvious conflict with a proven scientific fact, i.e., violation of local realism that has been demonstrated not only theoretically but experimentally in recent experiments. On this basis, the Editors decided to withdraw the paper.” (Emphasis added in bold type.)

Now, the problem of looking to conventionality to decide whether a new paper is right or wrong is sometimes called the Galileo problem: in other words, common sense and “established fact” have turned out wrong in the past.  The “flat Earthers”, who could “prove” their ideas by experiments on samples of flat ground and then extend the concept to the whole planet, were not censored out for being lunatics.  They were the ones censoring.  They censored the idea that the Earth is spherical, and refused to find the time to discuss the evidence.

I know Bell’s work is based on the assumption of the correctness of first quantization, i.e. a single wavefunction that is indeterministic, as in the Schroedinger equation. In fact, second quantization shows that you have a separate wavefunction amplitude for each path or interaction between the real particle and an infinite number of offshell or virtual field quanta, which must be summed in the path integral. It is the addition of all these different path amplitudes, each proportional to exp(iS) where S is the action of the virtual particle interaction path, which produces the interference of these paths in the path integral, and thereby causes all of the indeterminacy. Feynman’s 1985 book “QED” explains how this invalidates and replaces the old-fashioned uncertainty principle of first quantization, such as Schroedinger’s equation, on which Bell’s work is based. Since Bell ignores this physical mechanism of multiple path wavefunction amplitudes interfering to produce indeterminacy, by choosing to represent only a single wavefunction amplitude exp(iHt), all of his statistical analysis is obfuscating and misguided.
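
To make the contrast concrete, here is a minimal numerical sketch (my own illustration, not taken from Bell’s or Christian’s papers, with assumed wavelength and slit geometry): the observable probability comes from adding one complex amplitude per path before squaring, and it is this addition of separate path amplitudes that produces the interference pattern.

```python
# Toy illustration (my own sketch): intensity at a screen obtained by summing
# one complex amplitude per path, exp(i*phase), for a two-slit geometry.
# The interference comes entirely from adding separate path amplitudes,
# not from a single "collapsing" wavefunction.
import numpy as np

k = 2 * np.pi / 0.5e-6         # wavenumber for 500 nm light (assumed value)
slit_y = [-50e-6, +50e-6]      # two slit positions, 100 microns apart (assumed)
L_source, L_screen = 0.1, 1.0  # source-to-slits and slits-to-screen distances (m)

ys = np.linspace(-10e-3, 10e-3, 2001)  # detector positions on the screen
intensity = []
for y in ys:
    amp = 0j
    for sy in slit_y:                  # one amplitude per path (per slit here)
        path = np.hypot(L_source, sy) + np.hypot(L_screen, y - sy)
        amp += np.exp(1j * k * path)   # amplitude proportional to exp(i*phase)
    intensity.append(abs(amp) ** 2)    # probability ~ |sum of amplitudes|^2

print("max/min intensity ratio:", max(intensity) / max(min(intensity), 1e-12))
```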

The path integral approach derives from the multiple virtual particles in the field (the field quanta) of Dirac’s relativistic equation, which supersedes Schroedinger’s non-relativistic equation. In other words, when you replace Schroedinger’s flawed non-relativistic Hamiltonian with Dirac’s spinor Hamiltonian, you are naturally led to the reality of field quanta, and each interaction with a field quantum contributes a wavefunction amplitude.  The interferences of these amplitudes in the path integral replace the 1st quantization “uncertainty principle”, a fact that Feynman makes clear graphically for several examples in “QED” (the 1985 book, not the 1965 one he co-authored with Albert Hibbs!).

I downloaded Joy Christian’s paper On the Fatal Mistake Made by John S. Bell in the Proof of His Famous Theorem, because I studied Bell’s work in detail while a physics undergraduate, and found that these facts about Bell’s failure to look at path integrals, which we’ve made clear on this blog for years, are ignored.  Instead, disappointingly, Christian tries to use the old trick of pointing out that Bell relies on unobservables: “That is to say, no physical experiment can ever be performed … that can meaningfully measure or evaluate the above average, since none of these quantities could have experimentally observable values.”

Let’s now make a comment about the current religion of the uncertainty principle.  Basically, the conventional textbook hype in modern physics is inverted from reality: almost all of the particles in the universe, leptons, quarks, even dark energy and dark matter, are producing effects on us all the time via their gravity fields and other fields.  To answer Einstein’s famous question to the Bohr-ites, “Is the Moon there when you are not looking?”: you are always “looking” at the Moon and at every subatomic particle in the radioactive nucleus controlling the fate of Schroedinger’s cat, because the Moon and those particles have fields that are continually affecting us: gravity, dark energy, electromagnetism, etc.  If the Moon really disappeared, massive tidal effects would occur.  Regarding radioactivity, the very clear differences in half-lives between different nuclei point not to indeterminism but to a definite shell structure.

Electron spins likewise determine the magnetic fields of magnets.  Sure, there is randomness, as observed in mechanical situations such as Brownian motion, but the fact is that the alleged non-relativistic 1st quantization wavefunctions (even if such a non-relativistic model were valid) are always “collapsed by measurement” into a definite state, as we observe when we measure the average properties of a good sample size to eliminate small-scale randomness.  The half-life of plutonium-239 nuclei is deterministic for large sample sizes, and differs from the half-life of americium-241 nuclei.  And you don’t even need a large sample: alpha particles are emitted with discrete energies (like nuclear gamma rays, unlike beta particles), and Gamow’s tunnelling formula (better understood with 2nd quantization, i.e. a particulate field barrier which offers a statistical penetration probability to a particle “missing field quanta”, by analogy to a football missing a crowd of defenders and scoring a goal, than by classical 1st quantization field concepts!) relates half-life to alpha particle energy.

You can measure an individual alpha particle energy with a zinc sulphide phosphor, a photomultiplier and a pulse height discriminator, therefore, and get a pretty good estimate of the half-life. Americium-241 always emits 5.486 MeV alpha particles, corresponding to a half-life of 432.2 years; plutonium-239 has a longer half-life (24,110 years) because it emits lower energy alpha particles, 5.245 MeV, which therefore take longer to break through the barrier of virtual pions and other discrete particles that act to bind the nucleus together.
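
As a rough numerical check (my own sketch, using only the half-life and energy figures quoted in the paragraph above), here is the Geiger–Nuttall form of Gamow’s tunnelling relation, fitted through the two nuclides mentioned. It is a two-point empirical fit, not a full Gamow barrier calculation.

```python
# Sketch: Geiger-Nuttall form log10(T_half) = a / sqrt(E_alpha) + b, an
# empirical summary of Gamow's tunnelling result, fitted to the two nuclides
# quoted above (energies in MeV, half-lives in years, as given in the text).
import math

data = {
    "Am-241": (5.486, 432.2),
    "Pu-239": (5.245, 24110.0),
}

(x1, y1), (x2, y2) = [(1 / math.sqrt(E), math.log10(T)) for E, T in data.values()]
a = (y2 - y1) / (x2 - x1)     # two-point fit: slope
b = y1 - a * x1               # intercept
print(f"fitted constants: a = {a:.1f}, b = {b:.2f}")

for name, (E, T) in data.items():
    predicted = 10 ** (a / math.sqrt(E) + b)
    print(f"{name}: quoted {T:g} yr, fit gives {predicted:.4g} yr")

# Lower alpha energy -> larger 1/sqrt(E) -> exponentially longer half-life,
# which is the barrier-penetration trend described above.
```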

As Feynman explains in that 1985 book “QED” which we keep referring everyone to, every electron in a block of glass is influencing and being influenced by every other one. This explains the otherwise counter-intuitive fact that a photon has a probability of reflecting off the front face of a block of glass that depends on the thickness of the glass, something that a photon merely interacting with the front surface would not be affected by, unless (as is in fact the case) the electrons’ vibration frequencies are a function of the thickness of the glass, so that even the electrons on the front of the glass are affected by its thickness before the photon arrives.
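
Here is a minimal sketch (mine, not Feynman’s actual arrow diagrams, with assumed values for the wavelength, refractive index and single-surface amplitude) of the arrow-adding calculation described in “QED”: the front-face and back-face amplitudes are summed before squaring, so the reflection probability oscillates with the glass thickness between roughly 0% and 16%.

```python
# Toy sketch of the thin-glass reflection result: add the front-surface arrow
# (with its half-turn phase flip) to the back-surface arrow, then square.
# r = 0.2 is the single-surface reflection amplitude (~4% intensity), the
# round number Feynman uses for glass.
import numpy as np

wavelength = 550e-9          # vacuum wavelength of the light (assumed)
n_glass = 1.5                # refractive index of the glass (assumed)
r = 0.2                      # single-surface reflection amplitude (assumed)

thickness = np.linspace(0, 1e-6, 500)                  # glass thickness (m)
phase = 4 * np.pi * n_glass * thickness / wavelength   # round-trip phase lag
amplitude = -r + r * np.exp(1j * phase)                # front arrow + back arrow
reflectance = np.abs(amplitude) ** 2

print(f"reflection probability ranges from {reflectance.min():.3f} "
      f"to {reflectance.max():.3f} as thickness varies")   # ~0.000 to ~0.160
```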

On the topic of heresy, a question in a comment on the previous post stimulated a reaction from me on quantum gravity:

Yes, quantum gravity, as I show in the linked papers and in the diagrams that are easy to grasp and understand, does predict all non-Newtonian gravitational contraction effects, which replicate and derive general relativity’s predictions precisely, just as you list! In fact, Einstein’s own original derivation of the field equations shows that energy conservation accounts for the contraction numerically, which is precisely what we’re doing physically. Newton’s equation ignores the fact that a falling apple can’t acquire kinetic energy from nothing. What’s occurring is that the gravitational field’s potential energy is being reduced as the apple acquires a corresponding amount of kinetic energy.
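
As a quick sanity check of that energy bookkeeping (my own arithmetic, with an assumed 100 g apple and a 3 m drop), the Newtonian gravitational potential energy released matches the familiar m·g·h kinetic energy gained:

```python
# Numerical check: kinetic energy gained by a falling apple equals the
# reduction in gravitational potential energy of the field-apple system.
G, M_earth, R_earth = 6.674e-11, 5.972e24, 6.371e6   # standard SI values
m_apple, drop = 0.1, 3.0                             # 100 g apple, 3 m fall (assumed)

r_top, r_bottom = R_earth + drop, R_earth
pe_released = G * M_earth * m_apple * (1 / r_bottom - 1 / r_top)  # joules
ke_simple = m_apple * 9.81 * drop                                 # m*g*h estimate

print(f"potential energy released: {pe_released:.4f} J")
print(f"m*g*h estimate:            {ke_simple:.4f} J")   # agree to ~0.1%
```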

All of the general relativity predictions that differ from Newton’s come from the contraction term, which Feynman showed (see his 1963 Lectures on Physics) is a gravitational version of the Lorentz-FitzGerald contraction of restricted or special relativity.

The contraction due to the distortion of space is small for most Newtonian situations; in fact it is something like 1.5 mm for the Earth. You get it by replacing v in the Lorentz transformation with the escape velocity, and dividing the resulting contraction by 3 to account for the fact that only one dimension is contracted by linear motion (the dimension in the direction of that motion), whereas gravitational compression contracts all three spatial dimensions.
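
Here is that arithmetic spelled out as a short sketch (mine, using standard values for G, c and the Earth’s mass and radius): replace v by the escape velocity in the Lorentz factor, multiply by the Earth’s radius, and divide by 3. For weak fields this reduces to the GM/(3c²) “excess radius” of about 1.5 mm quoted above.

```python
# Sketch of the contraction estimate described above: Lorentz contraction
# with v = escape velocity, shared over three spatial dimensions.
import math

G, c = 6.674e-11, 2.998e8
M_earth, R_earth = 5.972e24, 6.371e6

v_escape = math.sqrt(2 * G * M_earth / R_earth)
fractional = 1 - math.sqrt(1 - (v_escape / c) ** 2)  # ~ v^2 / (2 c^2) for v << c
contraction = fractional * R_earth / 3               # divide by 3: three dimensions

print(f"escape velocity:    {v_escape / 1000:.2f} km/s")
print(f"radial contraction: {contraction * 1000:.2f} mm")   # ~1.5 mm, as stated
```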

In fact, the key differences between Newtonian gravitation and the quantum gravity mechanism are that the quantum gravity mechanism additionally predicts (1) all of the local general relativity predictions, (2) dark energy quantitatively, which general relativity fails to do, and (3) the quantization of masses, which neither Newtonian gravity nor general relativity does.

I think it is time that work was rewritten and improved from the popular standpoint.  The physics snobs are flat-Earth “facters” when it comes to new ideas that contradict their money-spinning quackery, and they are also hypocrites, usually claiming vocally to demand equality while their actions enforce inequality.  (Though maybe some of them claim they’re communists and will give away all their money as soon as everyone else does.  Conveniently, they back equality only where it can never exist, and resist it where they could make a difference! But power tends to corrupt scientists more than politicians.  At least Trump and Hillary are having debates.)

Photo of John Ellis’s office at CERN

Hat tip to Dr Woit, who spotted the message on the skeleton in a CERN news page about the crisis in theoretical physics.  John Ellis, CBE, FRS is the James Clerk Maxwell professor of theoretical physics at King’s College London.

Photo credit: Sophia Bennett, CERN photo of John Ellis (freely usable for non-commercial purposes, since CERN is funded by taxpayers).

[Images: John Ellis’s office at CERN, with enlargements of the message on the skeleton.]

Enlargements of the political message of dogmatic, consensus-based “mainstream science”. Inflation and SUSY are complex, epicycle-type interpretations of evidence which survive by dominating the landscape, effectively squashing attempts to get the mainstream to investigate other options that actually work and make useful predictions that have been confirmed afterwards.

Nature peer-reviewed article debunking computer model doomsday climate change predictions is censored by mainstream media

“It has been claimed that the early-2000s global warming slowdown or hiatus, characterized by a reduced rate of global surface warming, has been overstated, lacks sound scientific basis, or is unsupported by observations. The evidence presented here contradicts these claims.” – John C. Fyfe, et al., Making sense of the early-2000s warming slowdown, Nature Climate Change journal, v6, March 2016, page 224 (below).

Summary of the key evidence of the failure of doom predictions from 124 simulations of 24 CMIP5 scare-mongering computer models, as published in the peer-reviewed Nature Climate Change.  This confirms that the official models which ignore negative feedback are wrong, as explained in detail in our “Failure Evidence for All 21 IPCC Positive-Feedback Climate Models”.  They can be fitted to the pre-1998 data (by means of epicycles and fiddles), but are debunked by the latest observations.

1. In the most widely-hyped scare-mongering climate change report by the IPCC in 2007, all 21 of its “different” models identically ignored negative feedback entirely, while including all-positive feedback from water. Thus, they implicitly and wrongly assumed that warm moist air that absorbs heat doesn’t rise, expand and condense into sunlight-absorbing clouds well above the ground, ocean and icecaps. (A sketch of the feedback arithmetic follows this list.)

2. Dr Roy Spencer’s peer reviewed paper found that you can only identify negative feedback in specific tropical weather systems, where CO2 heating of the ocean generates cloud cover that soon wipes out surface temperature rises (below cloud bases). Idiot “critics” then pointed out that you don’t see water negative feedback in other data pertaining to land (no water to evaporate) and clear skies (no cloud cover). You can only see water’s negative feedback over the tropical monsoon systems that Spencer studied. This, “critics” claimed falsely, debunked Spencer’s findings. (By similar crackpot “reasoning”, the absence of ice in the sun would be held to prove that ice doesn’t exist.)

3. Correlation is not causation, so the biased data selection of a temperature correlation to CO2 doesn’t validate the simplistic greenhouse effect of CO2 controlling climate. In a greenhouse with an atmosphere about 100 miles high and with 71% ocean area, CO2 heating inevitably causes additional water evaporation: moist air that absorbs sunshine, heats, expands and rises buoyantly until it reaches cold air, where it makes additional clouds. The upper surfaces of the clouds heat up, reflecting and also absorbing energy and trapping the “energy imbalance” far away from the ground, ocean, ice caps.

4. This is “negative feedback” from water: the heating of the atmosphere seen from satellite albedo (reflected heat) and microwave temperature sensors that determine oxygen’s temperature. The satellite temperature data is biased against recording any negative feedback at all, because negative feedback by its nature only occurs under cloud cover (evaporation causes more cloud cover, negating, as Spencer found, most of the warming effects of CO2).

5. The main driver of temperature, as Nigel Calder (former New Scientist editor) recently showed, is cloud cover seeding by natural cosmic rays. This is the inverse effect of the “no go theorem” used by deniers of natural climate change to debunk the idea that energy delivery is proportional to temperature. In the case of cosmic rays, as proved by C.T.R. Wilson’s Nobel Prize-winning “Wilson cloud chamber”, the more ionizing radiation, the more ion trails for water droplets to condense upon, and thus the more cooling by clouds. The bigoted human climate change crackpots ignore this vital mechanism, and instead claim that Calder’s inverse correlation between cosmic ray intensity and climate temperature debunks the role of cosmic rays! In fact, it proves it, since cosmic rays boost the ionization that causes water vapour to condense into clouds, but deliver insignificant heating energy!
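
For readers who want the feedback arithmetic made explicit, here is a toy sketch (my own, with assumed round numbers, and emphatically not any IPCC model): the same CO2 forcing implies very different warming depending on the sign assumed for the net water/cloud feedback factor.

```python
# Toy feedback arithmetic: warming = (no-feedback response) * forcing / (1 - f),
# where f is the net feedback factor.  All numbers are assumed round figures,
# used only to show how strongly the sign of f matters.
no_feedback_response = 0.3   # K per (W/m^2), approximate Planck-only response (assumed)
co2_doubling_forcing = 3.7   # W/m^2, standard figure for doubled CO2

for label, f in [("strong positive feedback (model-style)", +0.6),
                 ("no feedback", 0.0),
                 ("net negative feedback (cloud cover rises)", -0.6)]:
    warming = no_feedback_response * co2_doubling_forcing / (1 - f)
    print(f"{label:45s} f = {f:+.1f}  ->  {warming:.2f} K per CO2 doubling")
```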

[Image: Michael Mann’s “hockey stick” temperature reconstruction, discussed below.]

Above: as we’ve long explained using controlled experiments with ice outdoors in the sun and in the shade, and trees in shade and sun, tree-ring growth proxies and ice sublimation (oxygen-16 to oxygen-18 ratio data, since lighter water molecules in ice sublime more readily, with less energy) are debunked by negative feedback (cloud cover increases due to increased evaporation from warming oceans in a real-world “greenhouse” with oceans). In a nutshell, tree growth and ice sublimation don’t respond to temperature in the way observed in controlled experiments where cloud cover isn’t varied.

In the real world, the mean percentage of the sky covered by cloud increases with ocean temperature, due to evaporation increasing the humidity and thus the percentage of the earth covered by saturated air (clouds).  This increase in clouds with temperature cuts down solar radiation exposure to trees and ice, thus shading them, and offsetting the effects of air temperature variation!  Thus, the flat part of Michael Mann’s hockey stick is not a real constant temperature, but instead is provably just a misinterpretation of the proxies.  You cannot determine any temperatures from ice sublimation or tree ring growth, because as mean air temperature rises, mean cloud cover also increases (evaporation of water from warm oceans), causing negative feedback and offsetting the effect of the air temperature increase.  It has proved impossible to get the few climate-hype-sceptical journalists, like James Delingpole, to grasp this.  This universally suppressed mechanism proves that climate is naturally far more variable than Mann indicates using unreliable ice and tree proxies.  Mann’s rising part of the hockey stick (20th century direct temperature measurements) is more reliable, but disagrees with tree records from the same period.  Instead of using this fact to debunk the entire set of tree and ice “proxies”, he simply cuts and pastes in the direct measurements, ignoring the discrepancy.  Journalists are complicit in this cover-up, by making speculative or strawman-style arguments instead of sticking to hard facts.  Once you grasp the mechanism, you can see that the recent apparent correlation of temperature to CO2 level isn’t impressive, since the natural variability means that at any time it’s about 50% likely that the temperature is naturally rising and 50% likely that it’s falling.  It’s not a flat line that suddenly goes up when CO2 emissions rise.
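
To show the mechanism being claimed here in the simplest possible terms, the following toy model (entirely my own assumed numbers, including a cloud-cover slope deliberately chosen so that the shading roughly cancels the direct temperature effect) illustrates how a proxy that depends on both temperature and sunshine can stay nearly flat while the temperature itself varies.

```python
# Toy model of the claimed proxy-flattening mechanism: cloud cover is assumed
# to rise with temperature, and the extra shading offsets the direct
# temperature effect on a tree-ring or ice proxy.  All numbers are assumed.
import numpy as np

temps = np.linspace(13.0, 17.0, 9)                 # mean air temperature, deg C
cloud_fraction = 0.5 + (temps - 15.0) / 30.0       # assumed: ~3.3% more cloud per deg C
sunshine = 1.0 - cloud_fraction                    # fraction of sky not clouded

# assumed proxy response: grows with temperature AND with sunshine
proxy = (temps / 15.0) * (sunshine / 0.5)

for T, p in zip(temps, proxy):
    print(f"T = {T:4.1f} C   proxy response = {p:.3f}")
# The proxy column barely moves while temperature spans 4 deg C, which is the
# sense in which (on this argument) a flat proxy record need not mean a flat
# temperature record.
```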

This Nature paper debunking the official models is being ignored by the BBC, just as my paper explaining the mechanism is ignored by Nature.  The mainstream media avoids direct reporting of scientific controversy, and scientific investigative reporting is taboo.

This is also relevant to quantum field theory controversy.  The mainstream media’s position of reverence to science’s “expert authority” (where it happens to suit their political agenda) can be amusingly debunked by taking the same position with their political reporting as follows:

  1. Only the government’s own famous politicians in charge are contemporary “expert authorities” in politics because they have power and full access to secret data, so only their speeches and writings are worthy of reporting. Opposition politicians are not in power, don’t know all the facts, and certainly aren’t in a position of similar authority.
  2. Anyone criticising the government is unfashionable by definition and thus boring.
  3. Reporting criticisms of the government will confuse the public, who won’t know what to believe.
  4. Trying to get to the bottom of controversy by looking at evidence and facts takes too much time, effort, and expertise that the media don’t have.  Money is made more easily from fashionable celebrity interviews, while fact-based criticisms and alternative ideas are censored out.

These tactics by the mainstream media in politics would turn democracy into dictatorship. So why on earth do they do the same in science, which is supposed to be liberal with regards to freedom of information, new ideas and criticism of dogma?

(This post on politically corrupted media pseudo-science is cross-posted on our other blog, here.)

In the same way, nuclear weapons effects are routinely exaggerated by using idealized test data for blast and radiation transmission in open deserts, and from people outdoors (not in concrete buildings) in low-skyline Japanese cities (Hiroshima and Nagasaki, 1945). The exaggerations are then used by antinuclear bigots like CND not to lower the yields, but to try to ban nuclear weapons. However, this is debunked by an inspection of declassified surveys proving excellent survival rates in concrete buildings in Hiroshima and Nagasaki, where fires were extinguished with simple water buckets (the firestorm peaked 2-3 hours later, not instantly as in CND-type propaganda). It is further debunked for nuclear terrorism by recent studies of blast and radiation transmission in modern concrete skylines, which greatly absorb energy from the blast wave, thus attenuating it, as well as absorbing radiation. Few people end up with burns, blast injuries or radiation sickness inside modern concrete buildings. But even if they did, and even if nuclear weapons effects were as bad as the exaggerations claim, couldn’t we just reduce the yields of stockpiled weapons instead of disarming? This is never discussed, a fact which tells you what’s going on.

As physicist Richard P. Feynman put it in his lecture “This Unscientific Age”, since nuclear fallout is insignificant compared to natural background radiation, the anti-nuclear folk should logically be more concerned about banning natural exposure than nuclear test fallout (e.g., banning the naturally radioactive potassium-40 in coffee and bananas, and the natural cosmic radiation in visits to mountain tops, Denver, and air travel). It is simply untruthful to hype a smaller threat as a danger while ignoring a natural one a hundred times greater, and it is untruthful to claim that natural hazards are unavoidable. In other words, idealistic politics, not genuine nuclear safety, drives CND folk. Some are deluded by personality liars and nasty pseudo-scientists, but most can grasp that we need compact nuclear weapons to deter the invasions and military attacks that set off both world wars, when bulky, expensive conventional arms and mobilization not only failed to deter war, but helped to start it in 1914. As with eugenics, today’s media accepts anti-nuclear bigotry due to its lazy reliance on “science authority”. When you take account of the actual scaling for realistic city effects of nuclear weapons, the effects are not of a different order to conventional weapons. The millions of conventional weapons in a large war are actually equivalent to the thousands of nuclear weapons in a stockpile: there’s no “overkill”. The use of weapons to produce particular effects such as fallout, akin to lingering mustard gas bomb fears in WWII, was deterred and also largely negated by simple countermeasures and widespread education in defence.

In all of these examples, the media refuses to get engaged with the scientific arguments, preferring to quote “authority” figures, personalities, instead. This is precisely why the public remains ill-informed and the controversies are never ended by hard factual debunking of propaganda. Fundamental physics controversies are similarly treated as taboo by the media, by claiming it to be a mathematically “boring” subject, far beyond the skills of journalists to engage with. Instead, obsolete and often wrong interpretations of the equations are given, such as the notion that there is a single amplitude or wavefunction associated with a particle (that is the false 1st quantization theory, debunked by Dirac’s 2nd quantization and Feynman’s path integral). A particle in a “quantum computer” doesn’t have a single wavefunction amplitude which remains unchanging and indeterminate until measured, storing entangled state information that can be used to compute. Instead, as Feynman showed clearly in 1985, in relativistic quantum mechanics, it’s being endlessly affected by random interactions with field quanta. There is one wavefunction amplitude for every one of these interactions, which must be summed: the electron’s state is continually being changed by discrete, quantum interactions with its particulate Coulomb field. This has never been clearly revealed in the popular media, to debunk Bohr’s and Schroedinger’s incorrect (non-relativistic) belief in a single wavefunction amplitude per particle. Enforced ignorance and apathy results.
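
To illustrate the difference between the two pictures (a toy sketch of my own, not a statement about any particular quantum computer or hardware), compare a single smoothly evolving phase with an ensemble whose phase is repeatedly kicked by discrete random interactions: the kicked ensemble loses its definite relative phase, which is the sense in which continual field-quanta interactions destroy the idealised single-amplitude description.

```python
# Toy contrast: one smooth amplitude (exp(-iHt)-style phase drift) versus an
# ensemble of histories whose phase is repeatedly kicked by discrete random
# interactions.  Averaging the kicked histories makes the coherence decay.
import numpy as np

rng = np.random.default_rng(0)
steps, runs = 200, 2000
drift = 0.05                 # deterministic phase advance per step (assumed)
kick = 0.2                   # assumed r.m.s. random phase kick per interaction

smooth = np.full((runs, steps), drift).cumsum(axis=1)                    # no kicks
kicked = (drift + kick * rng.standard_normal((runs, steps))).cumsum(axis=1)

# coherence = |average over runs of exp(i*phase)| at the final step
coh_smooth = abs(np.exp(1j * smooth[:, -1]).mean())   # stays ~1.0
coh_kicked = abs(np.exp(1j * kicked[:, -1]).mean())   # decays toward 0

print(f"coherence, single smooth amplitude:  {coh_smooth:.3f}")
print(f"coherence, randomly kicked ensemble: {coh_kicked:.3f}")
```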