Entanglement lies exposed by the late Caroline H. Thompson

Editorial policy of the American Physical Society journals (including PRL and PRA):

http://freespace.virgin.net/ch.thompson1/Papers/Crasemann-CHT%20correspondence%202004.htm

From: Physical Review A [mailto:pra@aps.org]
Sent: 19 February 2004 19:47
To: ch.thompson1@virgin.net
Subject: To_author AG9055 Thompson

Re: AG9055

Dear Dr. Thompson,

‘… With regard to local realism, our current policy is summarized succinctly, albeit a bit bluntly, by the following statement from one of our Board members:

“In 1964, John Bell proved that local realistic theories led to an upper bound on correlations between distant events (Bell’s inequality) and that quantum mechanics had predictions that violated that inequality. Ten years later, experimenters started to test in the laboratory the violation of Bell’s inequality (or similar predictions of local realism). No experiment is perfect, and various authors invented ‘loopholes’ such that the experiments were still compatible with local realism. Of course nobody proposed a local realistic theory that would reproduce quantitative predictions of quantum theory (energy levels, transition rates, etc.). This loophole hunting has no interest whatsoever in physics.” …’

The author censored by this ‘rebuke’, the late Caroline H. Thompson of the University of Wales, Aberystwyth, had earlier written in her mainstream-damning arXiv preprint, Subtraction of ‘accidentals’ and the validity of Bell tests, http://arxiv.org/PS_cache/quant-ph/pdf/9903/9903066v2.pdf:

‘In some key Bell experiments, including two of the well-known ones by Alain Aspect, 1981-2, it is only after the subtraction of ‘accidentals’ from the coincidence counts that we get violations of Bell tests. The data adjustment, producing increases of up to 60% in the test statistics, has never been adequately justified. Few published experiments give sufficient information for the reader to make a fair assessment. There is a straightforward and well known realist model that fits the unadjusted data very well. In this paper, the logic of this realist model and the reasoning used by experimenters in justification of the data adjustment are discussed. It is concluded that the evidence from all Bell experiments is in urgent need of re-assessment, in the light of all the known ‘loopholes’. Invalid Bell tests have frequently been used, neglecting improved ones derived by Clauser and Horne in 1974. ‘Local causal’ explanations for the observations have been wrongfully neglected.’
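To see mechanically how such a subtraction can inflate a Bell test statistic, here is a minimal sketch in Python using invented coincidence counts (purely hypothetical numbers, not taken from any experiment). Subtracting the same ‘accidentals’ background from all four outcome channels of a setting pair leaves the numerator of the correlation E unchanged while shrinking its denominator, so each |E|, and hence the CHSH sum S, can only grow:

```python
# Minimal sketch with invented coincidence counts (NOT data from any real
# experiment): subtracting the same "accidentals" background from all four
# channels of a setting pair leaves E's numerator unchanged but shrinks its
# denominator, so |E| -- and hence the CHSH statistic S -- can only increase.

def correlation(npp, nmm, npm, nmp, accidentals=0.0):
    """E = (N++ + N-- - N+- - N-+) / N_total, optionally after subtracting
    the same accidental count from every channel."""
    npp, nmm, npm, nmp = (n - accidentals for n in (npp, nmm, npm, nmp))
    return (npp + nmm - npm - nmp) / (npp + nmm + npm + nmp)

# Hypothetical raw counts (N++, N--, N+-, N-+) for the four CHSH setting pairs.
raw_counts = {
    "ab":   (730, 720, 270, 280),
    "ab'":  (275, 270, 725, 730),
    "a'b":  (735, 715, 275, 275),
    "a'b'": (720, 730, 265, 285),
}

def chsh(counts, accidentals=0.0):
    E = {pair: correlation(*c, accidentals=accidentals) for pair, c in counts.items()}
    return E["ab"] - E["ab'"] + E["a'b"] + E["a'b'"]

print("S from raw counts:                  ", round(chsh(raw_counts), 3))
print("S after subtracting 150 per channel:", round(chsh(raw_counts, accidentals=150), 3))
```

With these invented numbers the raw counts give S of roughly 1.8, below the local-realist bound of 2, while the ‘accidental’-subtracted counts give roughly 2.6: an increase of about 40% produced entirely by the data adjustment.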

After her tragic death from cancer in 2006, her website was preserved; there she had written, in defiance of the Physical Review editor:

http://freespace.virgin.net/ch.thompson1/EPR_Progress.htm:

‘The story, as you may have realised, is that there is no evidence for any quantum weirdness: quantum entanglement of separated particles just does not happen. This means that the theoretical basis for quantum computing and encryption is null and void. It does not necessarily follow that the research being done under this heading is entirely worthless, but it does mean that the funding for it is being received under false pretences. It is not surprising that the recipients of that funding are on the defensive. I’m afraid they need to find another way to justify their work, and they have not yet picked up the various hints I have tried to give them. There are interesting correlations that they can use. It just happens that they are ordinary ones, not quantum ones, better described using variations of classical theory than quantum optics.

‘Why do I seem to be almost alone telling this tale? There are in fact many others who know the same basic facts about those Bell test loopholes, though perhaps very few who have even tried to understand the real correlations that are at work in the PDC experiments. I am almost alone because, I strongly suspect, nobody employed in the establishment dares openly to challenge entanglement, for fear of damaging not only his own career but those of his friends.’

The stringy mainstream still ignores Feynman’s path integrals as a reformulation of QM (a third option), viewing them instead as merely part of QFT: Feynman’s paper ‘Space-Time Approach to Non-Relativistic Quantum Mechanics’, Reviews of Modern Physics, volume 20, page 367 (1948), makes it clear that his path integrals are a reformulation of quantum mechanics which gets rid of the uncertainty principle and all the pseudoscience it brings with it.

Richard P. Feynman, QED, Penguin, 1990, pp. 55-6 and 84:

‘I would like to put the uncertainty principle in its historical place: when the revolutionary ideas of quantum physics were first coming out, people still tried to understand them in terms of old-fashioned ideas … But at a certain point the old fashioned ideas would begin to fail, so a warning was developed that said, in effect, “Your old-fashioned ideas are no damn good when …”. If you get rid of all the old-fashioned ideas and instead use the ideas that I’m explaining in these lectures – adding arrows [arrows = path phase amplitudes in the path integral, i.e. e^(iS(n)/ℏ), where S(n) is the action for path n] for all the ways an event can happen – there is no need for an uncertainty principle! … on a small scale, such as inside an atom, the space is so small that there is no main path, no “orbit”; there are all sorts of ways the electron could go, each with an amplitude. The phenomenon of interference [by field quanta] becomes very important …’
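As a crude numerical illustration of the ‘adding arrows’ prescription (my own toy sketch, not Feynman’s calculation): label each alternative path by a single deviation d from the classical path, assume a quadratic action about that path (as for a free particle), work in units with ℏ = 1, and add up the unit arrows. The arrows from paths far from the stationary-action path spin around in phase and largely cancel one another, while the narrow bundle of nearby paths adds almost coherently:

```python
# Toy "adding arrows" sum (assumed quadratic action about the classical path,
# units with hbar = 1; the parameters are arbitrary illustrative choices).
import numpy as np

k = 1.0                          # curvature of the action about the classical path
d = np.linspace(-40, 40, 8001)   # one sampled path per deviation d from the classical path
arrows = np.exp(1j * k * d**2)   # each path's unit arrow exp(iS/hbar), dropping the common phase

near = np.abs(d) < 1             # paths close to the classical (stationary-action) path

print("arrows from nearby paths :", near.sum(), "  |sum| =", round(abs(arrows[near].sum()), 1))
print("arrows from distant paths:", (~near).sum(), "|sum| =", round(abs(arrows[~near].sum()), 1))
# The couple of hundred nearby arrows add almost coherently; the thousands of
# distant arrows nearly cancel among themselves.
```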

Copy of a comment of mine submitted (in moderation queue) to Dr Peter Woit’s blog post:

http://www.math.columbia.edu/~woit/wordpress/?p=2089&cpage=1#comment-48831

June 13, 2009 at 3:53 am

‘… my original problems with them seemed to have to do with powerful people who did not like having their multiverse pseudo-science disrespected.’

Peter, you’re not just opposing a handful of bigots at arXiv. Multiverse lies have been hyped in QM for a long time. Unless you address all that QM multiverse BS, you won’t get anywhere.

Classical and quantum field theories differ because of the physical exchange of field quanta between charges. This exchange of discrete virtual quanta produces chaotic interference for individual fundamental charges in strong force fields. Field quanta induce a Brownian-type motion of individual electrons inside atoms, but this does not arise for very large charges (the many electrons in a big, macroscopic object), because in such cases the randomness of the virtual field quanta statistically averages out. If the average rate of exchange of field quanta is N quanta per second, then the random standard deviation is 100/√N percent. Hence the statistics show that the bigger the rate of field-quanta exchange, the smaller the chaotic variation. For large numbers of field quanta producing forces over long distances, and for large charges like charged metal spheres in a laboratory, the rate at which charges exchange field quanta with one another is so high that the Brownian motion imparted to individual electrons by the chaotic exchange is statistically cancelled out, so we see a smooth net force and classical physics is accurate to an extremely good approximation.
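As a quick check of that 100/√N figure, here is a minimal sketch which assumes the exchange events arrive independently at an average rate of N per second (a Poisson process, the simplest such model; the rates and random seed are arbitrary choices):

```python
# Relative scatter of independently arriving quanta falls off as 1/sqrt(N).
import numpy as np

rng = np.random.default_rng(0)
for N in (100, 10_000, 1_000_000):                   # mean quanta exchanged per second
    counts = rng.poisson(lam=N, size=100_000)        # 100,000 simulated one-second intervals
    scatter = 100 * counts.std() / counts.mean()     # random standard deviation, in percent
    print(f"N = {N:>9,}  simulated scatter = {scatter:6.3f}%   100/sqrt(N) = {100/np.sqrt(N):6.3f}%")
```

The simulated scatter comes out close to 10%, 1% and 0.1% for the three rates, matching 100/√N.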

Thus, chaos on small scales has a provably simple and beautiful physical mechanism and mathematical model behind it: path integrals with phase amplitudes for every path. This is analogous to Brownian motion: individual air molecules moving at around 500 m/sec strike a dust particle and make it jiggle chaotically, because air pressure is random on small scales, whereas a ship with a large sail is blown steadily, since the immense number of molecular impacts per second averages out. So nature is extremely simple: there is no evidence for the mainstream ‘uncertainty principle’-based metaphysical selection of parallel universes upon wavefunction collapse. (Stringers love metaphysics.) Dr Thomas Love, who sometimes writes comments at Dr Woit’s Not Even Wrong blog, kindly emailed me a preprint explaining:

‘The quantum collapse [in the mainstream interpretation of quantum mechanics, where a wavefunction collapse occurs whenever a measurement of a particle is made] occurs when we model the wave moving according to Schroedinger (time-dependent) and then, suddenly at the time of interaction we require it to be in an eigenstate and hence to also be a solution of Schroedinger (time-independent). The collapse of the wave function is due to a discontinuity in the equations used to model the physics, it is not inherent in the physics.’

‘… nature has a simplicity and therefore a great beauty.’

– Richard P. Feynman (The Character of Physical Law, p. 173)

The double slit experiment, Feynman explains, proves that light uses a small core of space where the phase amplitudes for paths add together instead of cancelling out, so if that core overlaps two nearby slits, the photon diffracts through both slits:

‘Light … uses a small core of nearby space. (In the same way, a mirror has to have enough size to reflect normally: if the mirror is too small for the core of nearby paths, the light scatters in many directions, no matter where you put the mirror.)’

– R. P. Feynman, QED, Penguin, 1990, page 54.
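A two-arrow version of this for the double slit (a toy sketch; the wavelength, slit separation and screen distance are arbitrary illustrative values): the amplitude at each screen position is the sum of one arrow per slit, exp(2πiL/λ) with L the path length through that slit. With both slits open the intensity shows fringes; with one slit blocked it is flat:

```python
# Toy two-path "arrow" sum for the double slit: the amplitude at each screen
# position is the sum of one arrow per slit, exp(2*pi*i*L/wavelength), where L
# is the path length through that slit. All numbers are illustrative only.
import numpy as np

wavelength  = 500e-9     # 500 nm light
slit_sep    = 50e-6      # 50 micron slit separation
screen_dist = 1.0        # 1 m from the slits to the screen
x = np.linspace(-0.02, 0.02, 2001)   # screen positions (metres)

def arrow(slit_y):
    path_length = np.hypot(screen_dist, x - slit_y)       # slit at height slit_y to screen point x
    return np.exp(2j * np.pi * path_length / wavelength)  # unit arrow for that path

two_slits = np.abs(arrow(+slit_sep / 2) + arrow(-slit_sep / 2)) ** 2   # both paths interfere
one_slit  = np.abs(arrow(+slit_sep / 2)) ** 2                          # single path, no fringes

print(f"two slits open: intensity ranges from {two_slits.min():.2f} to {two_slits.max():.2f}")
print(f"one slit open:  intensity ranges from {one_slit.min():.2f} to {one_slit.max():.2f}")
```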

Hence nature is very simple, with no need for wavefunction collapse or the ‘multiverse’ lie of crackpot Hugh Everett III, who wouldn’t even incorporate the physical dynamics of fallout particle sizes and deposition phenomena in his purely statistical paper allegedly predicting fallout casualties:

‘It always bothers me that, according to the laws as we understand them today, it takes a computing machine an infinite number of logical operations to figure out what goes on in no matter how tiny a region of space, and no matter how tiny a region of time. How can all that be going on in that tiny space? Why should it take an infinite amount of logic to figure out what one tiny piece of spacetime is going to do? So I have often made the hypothesis that ultimately physics will not require a mathematical statement, that in the end the machinery will be revealed, and the laws will turn out to be simple, like the chequer board with all its apparent complexities.’

– R. P. Feynman, The Character of Physical Law, November 1964 Cornell Lectures, broadcast and published in 1965 by BBC, pp. 57-8.

8 thoughts on “Entanglement lies exposed by the late Caroline H. Thompson”

  1. copy of a “fast comment” of mine to stringy Lubos Motl’s blog (from which it will probably be deleted):

    http://motls.blogspot.com/2009/09/schrodinger-virus-and-decoherence.html

    Entanglement and the Copenhagen Interpretation are based on QM, which is first quantization, i.e. the quantization of particle position/momentum using a wave equation (Schroedinger) or the uncertainty principle (Heisenberg), in each case with a classical Coulomb potential.

    Actually, QM with 1st quantization is false: it is inconsistent with special relativity. 2nd quantization is correct, and quantizes the field, not the position/momentum. I.e., the field quanta cause the indeterminacy in 2nd quantization. Indeterminacy is a physical effect of chaotically arriving field quanta on small scales of spacetime, such as inside an atom.

    Dr Thomas Love of California State University has shown that:

    “The quantum collapse [in the mainstream interpretation of quantum mechanics, which has wavefunction collapse occur when a measurement is made] occurs when we model the wave moving according to Schroedinger (time-dependent) and then, suddenly at the time of interaction we require it to be in an eigenstate and hence to also be a solution of Schroedinger (time-independent). The collapse of the wave function is due to a discontinuity in the equations used to model the physics, it is not inherent in the physics.”

    http://arxiv.org/PS_cache/quant-ph/pdf/9903/9903066v2.pdf:

    ‘In some key Bell experiments, including two of the well-known ones by Alain Aspect, 1981-2, it is only after the subtraction of ‘accidentals’ from the coincidence counts that we get violations of Bell tests. The data adjustment, producing increases of up to 60% in the test statistics, has never been adequately justified. Few published experiments give sufficient information for the reader to make a fair assessment. There is a straightforward and well known realist model that fits the unadjusted data very well. In this paper, the logic of this realist model and the reasoning used by experimenters in justification of the data adjustment are discussed. It is concluded that the evidence from all Bell experiments is in urgent need of re-assessment, in the light of all the known ‘loopholes’. Invalid Bell tests have frequently been used, neglecting improved ones derived by Clauser and Horne in 1974. ‘Local causal’ explanations for the observations have been wrongfully neglected.’

  2. A new experiment, a ‘Measurement Problem’ physical sequence derivation, and a paper published by Springer Nature in ‘Foundations of Physics’ have confirmed Caroline Thompson’s hypothesis and analysis. It should be noted that Gregor Weihs et al. (whose assistance the authors P. Jackson and J. Minkowski express gratitude for) similarly discarded significant raw data to obtain the QM prediction. A video is being developed to help explain the interaction mechanism, involving 3D ellipticised helicity and vector additions. Agreement with John Bell’s view is identified and ‘entanglement’ explained causally. It is, however, notable that despite its credentials the arXiv inexplicably refused to archive the paper, raising serious questions about competence at Cornell and honesty in physics which need to be addressed. See: Springer Nature, DOI 10.1007/s10701-021-00475-4, or the preprint archived at https://www.researchgate.net/publication/352056822_The_Measurement_Problem_an_Ontological_Solution
