Quantum Gravity Film and Scientific Paper in Amazon Paperback Format

Spin-2 graviton deceivers: the stress-energy tensor of general relativity is a classical, continuously differentiable entity that can’t represent discrete quantum fields realistically (masses are put in as “perfect fluids”, not real discrete particles). So the argument that quantum gravity must be spin-2 because classical general relativity says so uses the most flawed part of the classical theory to dictate what quantum gravity must look like. A complete delusion. See also this link and this paper. Please admit you repeatedly censored out the hard facts and abused the ethics of science, and apologise now.

Typical spin-2 delusion example: Steven Weinberg’s paper “Photons and Gravitons in S-Matrix Theory: Derivation of Charge Conservation and Equality of Gravitational and Inertial Mass”, Phys. Rev. 135 (1964), B1049-B1056, shows that spin-2 gravitons couple to the rank-2 stress-energy tensor. Yet Weinberg himself disowned the geometric, rank-2 tensor interpretation of gravitation in Gravitation and Cosmology, Wiley, 1972, page 147:

“At one time it was even hoped that the rest of physics could be brought into a geometric formulation, but this hope has met with disappointment, and the geometric interpretation of the theory of gravitation has dwindled to a mere analogy, which lingers in our language in terms like ‘metric’, ‘affine connection’, and ‘curvature’, but is not otherwise very useful. The important thing is to be able to make predictions about the images on the astronomer’s photographic plates, frequencies of spectral lines, and so on, and it simply doesn’t matter whether we ascribe these predictions to the physical effect of a gravitational field on the motion of planets and photons or to a curvature of space and time.”

I’m sure Weinberg is back to the stress-energy tensor now, to shore up the spin-2 graviton delusion on which string theory hype is based. Nevertheless, the stress-energy tensor is a 4×4 matrix of continuously variable field components which can’t represent discrete particles: you typically have to represent actual mass (particles of matter, quanta of energy) by a physically false “perfect fluid” continuum distribution, just so that general relativity pops out the smooth (pseudo) curvature. It is not a quantum field theory.

Using a classical entity like the rank-2 stress-energy tensor to “determine” the spin of the quanta of quantum gravity is like using epicycles to determine the structure of the universe. It’s absurd. If Riemann had never been born, and Einstein had formulated gravity in rank-1 tensors (curving field lines) instead of rank-2 tensors (spacetime curvature), you would still have been able to include all the observed features of relativistic gravitation correctly, with spin-1 field quanta. Hence there’s no logic here. Going to rank-1 tensors (ordinary vectors) gives the confirmed, quantitatively accurate 1996 prediction that spin-1 graviton exchange between similar-sign gravitational charges (e.g. masses) causes cosmological repulsion, aka “dark energy”, as well as the observed gravity effects (you can still keep your rank-2 general relativity as a classical duality for constructing the metric, with its energy-conservation spacetime contraction effects):

Watermelons by James Delingpole (a book review)
Quantum Gravity Successes

Quantum gravity paper overview: http://vixra.org/abs/1111.0111. Further information: http://www.quantumfieldtheory.org. Paperback book version of the paper: http://www.amazon.co.uk/dp/1470997452/ or http://www.amazon.com/Quantum-Gravity-Standard-Model-Nigel/dp/1470997452

If you sweep away 1st quantization, and allow all “wave effects” and eigenvalues (discrete energy levels of electrons in the atom, etc.) to arise from multipath interference, then the “uncertainty principle” becomes a result of multipath interference. No wavefunction collapses, because there is no single wavefunction in the path integral. The whole basis of the path integral is summing wavefunction amplitudes from all paths between source and receiver (instrument). The instrument only plays a part in determining the end point of each path; hence there is an understandable mechanism for relativistic quantum mechanics:

“… My way of looking at things was completely new [path integrals], and I could not deduce it from other known mathematical schemes … Bohr … said: “… one could not talk about the trajectory of an electron in the atom, because it was something not observable.” … Bohr thought that I didn’t know the uncertainty principle …”

The Beat of a Different Drum: The Life and Science of Richard Feynman, by Jagdish Mehra (Oxford 1994, pp. 245-248).

“Scepticism is … directed against the view of the opposition and against minor ramifications of one’s own basic ideas, never against the basic ideas themselves. Attacking the basic ideas evokes taboo reactions … scientists only rarely solve their problems, they make lots of mistakes … one collects ‘facts’ and prejudices, one discusses the matter, and one finally votes. But while a democracy makes some effort to explain the process so that everyone can understand it, scientists either conceal it, or bend it … No scientist will admit that voting plays a role in his subject. Facts, logic, and methodology alone decide – this is what the fairy-tale tells us. … This is how scientists have deceived themselves and everyone else … Science itself uses the method of ballot, discussion, vote, though without a clear grasp of its mechanism, and in a heavily biased way.”

– Professor Paul Feyerabend, Against Method, 1975, final chapter

“‘Science says’ has replaced ‘scripture tells us’ but with no more critical reflection on the one than on the other. … the masses still move by faith. … I have fear of what science says, not the science that is hard-won knowledge but that other science, the faith imposed on people by a self-elected administering priesthood. … In the hands of an unscrupulous and power-grasping priesthood, this efficient tool, just as earlier … has become an instrument of bondage. … A metaphysics that ushered in the Dark Ages is again flourishing. … Natural sciences turned from description to a ruminative scholarship concerned with authority. … Our sales representatives, trained in your tribal taboos, will call on you shortly. You have no choice but to buy. For this is the new rationalism, the new messiah, the new Church, and the new Dark Ages come upon us.”

– Jerome Y. Lettvin, The Second Dark Ages, paper given at the UNESCO Symposium on “Culture and Science”, Paris, 6-10 September 1971 (in Robin Clarke, Notes for the Future, Thames and Hudson, London, 1975, pp. 141-50).

“Crimestop means the faculty of stopping short at the threshold of any dangerous thought. It includes the power of not grasping analogies, of failing to perceive logical errors, of misunderstanding the simplest arguments … and of being bored or repelled by any train of thought which is capable of leading in a heretical direction. Crimestop, in short, means protective stupidity.”

– George Orwell, 1984

“Denialism” can be directed both ways in science. It’s just a vacuous piece of playground name-calling. What matters is the substance of the science, not how fashionable something is. Fashionability matters for getting funding, of course, and this is where Lord Acton’s “All power corrupts…” comes in. Scientists are no more ethical than anyone else.

Educational psychologist Lawrence Kohlberg’s “Stage and Sequence: the Cognitive Development Approach to Socialization” (in D. A. Goslin, Ed., Handbook of Socialization Theory and Research, Rand McNally, Chicago, 1969, pp. 347-380) lists six stages of ethical development:

(1) Conformity to rules and obedience to authority, to avoid punishment.
(2) Conformity to gain rewards.
(3) Conformity to avoid rejection.
(4) Conformity to avoid censure. (Chimps and baboons.)
(5) Arbitrariness in enforcing rules, for the common good.
(6) Conscious revision and replacement of unhelpful rules.

The same steps could be expected to apply to scientific ethical development. However, the disguised form of politics which exists in science, where decisions are taken behind closed doors and with no public discussion of evidence, stops at stage (4), the level of ethics that chimpanzees and baboons have been observed to achieve socially in the wild.

(It’s a fact that “entanglement” is 1st quantization – non-relativistic – single-wavefunction nonsense. There are no single wavefunctions for particles, as Feynman discovered! There’s a separate wavefunction amplitude for every possible path, and indeterminacy is not due to wavefunction collapse, but instead is due to multipath interference. Do you grasp the analogy between multipath interference of HF skywave radio from partial reflection by different regions of the ionosphere – the D, E, and F layers – and multipath interference in the path integral? The whole of Bell’s inequality/wavefunction collapse/entanglement is a propaganda exercise of 1st quantization disinformation. Its aim, like Complementarity, is to promote mathematical misunderstanding and obfuscation, to revert science to ancient metaphysical dogma. Bohr’s statements prove that he wanted no understanding of nature: he wanted to freeze 1st quantization at the 1926 level for all time, with the correspondence and complementarity principles. He and others wanted nobody to understand, or to progress physics realistically with empirically confirmed predictions. The “nobody understands quantum mechanics” statement, presented as a factual proof of the non-existence of simple mechanisms, is extremely destructive. If people are prejudiced and not looking, even if they find facts they’ll ignore them. They will declare that the people promoting facts are giving out boring propaganda, or are just plain wrong because they have been denied any publicity compared to the over-hyped mainstream liars. They will object to calling Hitler a “liar” because of the Nazi dogma that “hard words make wounds”. They are socially evil dictators: deliberately marketing ignorant propaganda and drivel that makes no confirmed predictions, unlike this paper, which correctly predicted the cosmological acceleration in 1996; and their aim is to increase the “noise level” in journals and popular media to help the mainstream use “guilt by false conflation of all alternative ideas” as a pseudo-argument to “justify” censoring calculative papers that predict facts later demonstrated in nature. What I mean is the increase in the noise level to drown out real physics, analogous to the sheep bleating continually and loudly “Four Legs Good, Two Legs Bad” in George Orwell’s Animal Farm: the objective of people like Lee Smolin and Garrett Lisi, in addition to consistent-histories propaganda, is to drown out all realistic physics. Then people like Ed Witten can announce that the sheep are making a lot of incoherent noise, and he gets applauded for it. Real physics remains unheard. Few today – as a result of this successful arXiv.org policy – have time even to read, never mind check, confirmed predictions, when both the mainstream and the loudest-bleating alternative ideas, which are even more decrepit, put them off the whole subject of understanding the world.)

“I would like to put the [1st quantization dogma/wavefunction collapse/entanglement/quantum computing/quackery] uncertainty principle in its historical place: when the revolutionary ideas of quantum physics were first coming out, people still tried to understand them in terms of old-fashioned ideas … But at a certain point the old fashioned ideas would begin to fail, so a warning was developed that said, in effect, “Your old-fashioned ideas are no damn good when …”. If you get rid of all the old-fashioned ideas and instead use the ideas that I’m explaining in these lectures – adding arrows [wavefunction phase amplitudes] for all the ways an event can happen – there is no need for an [1st quantization lying] uncertainty principle! … on a small scale [path actions small compared to h-bar], such as inside an atom, the space is so small that there is no main path, no “orbit”; there are all sorts of ways the electron could go, each with an amplitude. The phenomenon of interference [by 2nd quantization field quanta] becomes very important [providing an understandable multipath interference mechanism for indeterminacy, taking the metaphysics from quackery and replacing it with understandable, predictive path integrals, which work, unlike metaphysical dogma; quack obfuscators who fill up the journals with pseudoscientific non-relativistic gibberish increase the “noise level” in a way that helps the mainstream dogma censors find an excuse to “discredit” alternatives in general without bothering to even check them properly first; hence it is completely taboo to understand physics, and a sign of stupidity to make checkable predictions]”

– Richard P. Feynman, QED, Penguin, 1990, pp. 55-6 and 84. (Beware of Feynman’s older books from the 1960s, which are pro-quackery and contain statements like “nobody understands quantum mechanics”. Once you get lots of people making incoherent claims that QM or SR are wrong, while ignoring 2nd quantization, criticisms backfire and enable mainstream thought eugenicists to censor all future critics of the status quo by peer review politics.)

“The quantum collapse [in the mainstream interpretation of quantum mechanics, where a wavefunction collapse occurs whenever a measurement of a particle is made] occurs when we model the wave moving according to Schroedinger (time-dependent) and then, suddenly at the time of interaction we require it to be in an eigenstate and hence to also be a solution of Schroedinger (time-independent). The collapse of the wave function is due to a discontinuity in the equations used to model the physics, it is not inherent in the physics.”

– Dr Thomas Love, Departments of Physics and Mathematics, California State University, by email.

The first half of the video disproves the quacks by proving that there is a physical difference between 1st and 2nd quantization, beyond the description of antimatter and the path integral. As Feynman explained in his 1985 book QED, in the path integral formulation of quantum mechanics all wave-particle duality effects arise from a physical mechanism: multipath interference of the cyclically varying wavefunction amplitudes for each path. This means, quoting Feynman’s book, “you don’t NEED an uncertainty principle”; in other words, multipath interference is the mechanism normally ascribed to the equation of the uncertainty principle. Put another way, you can derive the uncertainty principle from the multipath interference mechanism of the path integral. In relativistic 2nd quantization (contrary to Bohr/Schroedinger/Heisenberg/Bell/Bohm 1st quantization wavefunction entanglement/collapse) there is no single wavefunction for any particle, and hence no collapse or entanglement of that single wavefunction. Instead there is a sum over histories of many wavefunction amplitudes, one for each possible path, with multipath interference totally replacing the uncertainty principle of 1st quantization with a simple physical mechanism for indeterminacy. There is no single wavefunction to collapse, and thus no “entanglement” or Bell inequality test, as in non-relativistic 1st quantization “quantum computing” quackery.

There’s a mechanism for indeterminacy, the interference between multiple paths, and it’s similar to the multipath interference mechanism that causes skywave fading in HF radio. This occurs when some radio energy is reflected back to earth by the different layers of the ionosphere, at different altitudes, so that the different paths taken are received out of phase, causing the received signal to suffer from self-interference. The exchange of quanta is behind the Coulomb field binding electrons to nuclei, and since this is a discrete interaction, the electron’s motion on small scales is non-classical and indeterministic.
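To make the HF radio analogy concrete, here is a minimal two-ray sketch (the carrier frequency and path-length difference are illustrative assumptions, not measured ionospheric values): two copies of the same carrier arriving via paths of different lengths sum to a signal whose strength depends on the path difference, which is exactly the self-interference described above.

```python
import numpy as np

# Two-ray skywave model: the same HF carrier arrives via two ionospheric
# layers, so the two copies travel different distances and add with a
# phase offset set by the path difference.
c = 3.0e8                    # speed of light, m/s
f = 10.0e6                   # 10 MHz HF carrier (illustrative)
wavelength = c / f

for extra in np.linspace(0, 2 * wavelength, 9):   # longer path's excess, m
    phase = 2 * np.pi * extra / wavelength
    # Sum of two unit-amplitude rays; |resultant| ranges from 0 to 2.
    amplitude = abs(1 + np.exp(1j * phase))
    print(f"path difference {extra:6.2f} m -> received amplitude {amplitude:.3f}")
```

The deep nulls (amplitude near zero) at half-wavelength path differences are the fades familiar to HF operators, and play the role of the dark fringes in the path-integral picture.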

A “wave” is just a periodic oscillation. Every particle’s path has an oscillating phase or “wave amplitude”, exp(iS), S being the action (the particle energy multiplied by the time taken, or more precisely, the integral of the Lagrangian) in Planck units. Since exp(iS) = cos S + i sin S, it follows that the wave amplitude is a periodically oscillating function of the distance the particle goes on a path between emission and absorption (i.e. of the time taken). If it is possible for it to go several different ways (e.g. if two slits exist in a screen), there’s a separate wave amplitude for each possible path, and you add them up (the path integral). The result maximises contributions from paths of least action (or least time, if the energy is constant). Feynman explains how this resolves all wave-particle duality issues in his 1985 book QED, e.g. the double slit experiment, so the wave-type nature of light is due to multipath interference of periodically oscillating amplitudes from each path.
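As a sanity check on this description, here is a minimal two-path sketch (the wavelength, slit separation and screen distance are illustrative assumptions): adding one oscillating exp(iS)-style amplitude per slit, then squaring the resultant, reproduces the familiar double-slit fringe pattern.

```python
import numpy as np

# Toy double slit: one phase amplitude exp(i*phase) per path (one per slit),
# with phase proportional to the path length in wavelengths.
wavelength = 500e-9          # 500 nm light (illustrative)
slit_sep = 250e-6            # slit separation, m (illustrative)
screen_dist = 1.0            # slits-to-screen distance, m
k = 2 * np.pi / wavelength   # wavenumber

for x_mm in np.linspace(-3, 3, 13):      # detector position on screen, mm
    x = x_mm * 1e-3
    r1 = np.hypot(screen_dist, x - slit_sep / 2)   # path length via slit 1
    r2 = np.hypot(screen_dist, x + slit_sep / 2)   # path length via slit 2
    amplitude = np.exp(1j * k * r1) + np.exp(1j * k * r2)  # sum over paths
    intensity = abs(amplitude) ** 2      # 0 (dark fringe) to 4 (bright)
    print(f"x = {x_mm:+5.2f} mm  intensity = {intensity:.3f}")
```

With these numbers the bright fringes recur every 2 mm (the standard spacing, wavelength × screen distance / slit separation), purely from the interference of the two path amplitudes.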

Note that the path integral always gives real (not complex) results for cross-sections and probabilities, so the “resultant arrow” on an Argand diagram (the sum over histories that Feynman draws in many illustrations in his 1985 QED book) is always parallel to the real axis, with no complex component (Feynman’s diagrams don’t make this detail crystal clear). This means that you don’t really need exp(iS) for the phase amplitude, at least mathematically: you can drop i sin S from Euler’s equation and just replace exp(iS) by cos S in all path integral calculations. It makes absolutely no difference in real-world checked calculations whether you use complex space (an Argand diagram with one axis in units of i) or whether the phase amplitude rotates in the real Euclidean plane. It turns out that the only reason why exp(iS) is dogma instead of cos S is that Weyl’s first attempt to quantize gravity, in 1918, tried to scale the metric in proportion to exp(iX), where X is a function of the electromagnetic field.

Einstein debunked it, but Schroedinger loved the idea, and in 1922 scaled the periodic real solutions to exp(iN) to represent Bohr’s discrete energy levels for an electron in a hydrogen atom (with all unobserved energy levels conveniently located in the complex plane!). (Ref: Schroedinger, “On a remarkable property of the quantum-orbits of a single electron”, Zeitschrift für Physik, v. 12, 1922, p. 13.) After Heisenberg’s matrix mechanics, Schroedinger then reformulated the idea as his “wave equation”, since i dY/dt = HY has the solution Y proportional to exp(-iHt), as follows from the basic mathematical fact that exponential solutions always exist for equations of the form dY/dt = Y.
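Spelling out that step (a standard one-line solution, in units with $\hbar = 1$ and assuming a time-independent H):

$$ i\,\frac{dY}{dt} = HY \quad\Longrightarrow\quad Y(t) = e^{-iHt}\,Y(0), $$

and for a stationary state of energy E this phase factor is just $e^{-iEt}$, matching the simplified statement above that the action is the energy multiplied by the time taken.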

Dirac then converted Schroedinger’s equation back into the exp(-iHt) phase factor in 1933, and in 1948 Feynman reinterpreted it correctly as applying to each possible path (not just to a single path or a classical path), so that different paths interfere to produce wave effects. So exp(-iHt), or its more general form exp(iS), is really a relic of 1st quantization and Weyl. It’s not needed any more. We can replace it with cos S if we forget 1st quantization (Schroedinger’s single wave amplitude equation) and go for path integrals instead. Quantization occurs due to multipath interference, not complex space, which was just a stopgap idea dating back to Weyl and to Schroedinger in 1922. There is no effect whatsoever on a path integral’s mathematical results: it’s still a real cross-section or probability in all checked experiments.

The electron’s discrete energy levels occur because “non-permitted” orbits don’t have paths of minimal action, and so are eliminated by multipath interference. In QFT, the integral of exp(iS) over all paths can be replaced with the integral of cos S, which has no effect except eliminating Hilbert space, and with it all the problems of Haag’s theorem for renormalization. This makes quantum mechanics explicable, and understandable, in terms of simple physical concepts not requiring complex space.

My argument follows Maxwell’s SU(2) electromagnetic theory: Maxwell makes the point in his 1861-2 papers “On Physical Lines of Force” that magnetic fields are physically propagated by spinning field quanta or vortices. The handedness of the magnetic field vector, which loops or curls in the plane perpendicular to the direction of an electric current, is therefore a chiral handedness effect. In other words, magnetism involves a chiral handedness of gauge bosons, by analogy to SU(2) weak interactions, which are chiral in involving left-handed neutrinos. Thus, by making SU(2) a full electroweak theory, the role of U(1) is no longer the SM’s fiddled hypercharge (fractional quark charges result from a vacuum polarization cloaking mechanism, where some electromagnetic field energy is converted into strong field quanta energy by vacuum interactions between pairs or triplets of nearby quarks which exchange gluons).

U(1) charge is now gravitational charge (mass), i.e. the charge of quantum gravity. U(1) mixing with chiral SU(2) gives rise to massive weak bosons. By including gravitation correctly as a gauge symmetry in the Standard Model, it is possible to predict masses, since mass is the natural unit for the “charge” of quantum gravity. Because U(1) hypercharge mixing with SU(2) now yields predominantly (from the mixing angle standpoint) U(1) gravity and predominantly (from the mixing angle standpoint) SU(2) electroweak forces, rather than U(1) electrodynamics and SU(2) weak interactions as in the SM, we can see that the breaking of SU(2) into weak and electromagnetic interactions is chiral and depends on the spin of the single U(1) charge (mass). Giving mass to left-handed SU(2) gauge bosons breaks the SU(2) symmetry (creating Goldstone bosons which gain mass, and which have been experimentally mistaken for the SM’s “Higgs bosons” by SM propagandists), and thus gives massive bosons which undergo weak interactions (their mass provides inertia to overcome the magnetic self-inductance which blocks one-way flows of charged massless gauge bosons); the remaining massless SU(2) gauge bosons convey electromagnetic fields, as explained in detail in the video.

Massless electric charges can’t move in one direction only, because they have no inertial mass to overcome the magnetic self-inductance due to motion. However, they can be exchanged in an equilibrium (exchange radiation) between charges of similar sign, because the geometry then cancels out the magnetic field curls, totally preventing the self-inductance problem! This mechanism of equilibrium for current exchange (well known in electric circuits and logic signals) also has another vital effect: it constrains to zero the net charge-transfer term in the SU(2) Yang-Mills equation! This physical constraint reduces the massless-boson SU(2) Yang-Mills equation to Maxwell’s equations. So the theory is completely self-consistent, in addition to having made confirmed predictions!
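Schematically (this is the standard textbook form of the non-Abelian field strength, not a derivation of the constraint itself), the SU(2) Yang-Mills field strength is

$$ F^a_{\mu\nu} = \partial_\mu A^a_\nu - \partial_\nu A^a_\mu + g\,\varepsilon^{abc} A^b_\mu A^c_\nu , $$

where the last term is the one responsible for net charge transfer between the gauge bosons. If that term’s contribution is constrained to zero, as argued above, what remains is the Abelian form $F_{\mu\nu} = \partial_\mu A_\nu - \partial_\nu A_\mu$, i.e. Maxwell’s field strength.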

It’s the fashionable preoccupation with string theory which has drawn mainstream attention away from efforts to find a simple and useful way to put gravity into the Standard Model as a gauge theory, since the mainstream arguments rely on a spin-2 graviton (the basis of string theory arguments), which is itself based on the rank-2 general relativity field tensors. That doesn’t seem to be a strong scientific fact, bearing in mind that general relativity is not a quantum theory, and that you can describe gravity using rank-1 vector field equations like Poisson’s equation, with a relativistic metric to correct for spacetime contraction.
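For reference, the rank-1 field description meant here is classical Newtonian potential theory,

$$ \nabla^2 \phi = 4\pi G \rho, \qquad \mathbf{g} = -\nabla \phi , $$

with the claim being that relativistic effects (energy conservation, spacetime contraction) are then supplied as corrections through the metric, rather than being built in from the outset via rank-2 curvature tensors.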

The theory successfully predicted the cosmological acceleration (“dark energy”) of the universe in May 1996, published in October 1996, two years before experimental confirmation by Perlmutter. In 1996, when the cosmological acceleration calculation was sent to Mike Renardson, editor of “First Thoughts” magazine, his initial reaction was that the predicted cosmological acceleration of roughly 10^-10 m s^-2 was far too small ever to be observed in the real world.

Yet just two years later, Perlmutter’s computer-automated CCD (charge-coupled device) telescopes detected the signature of a fixed-energy (standard candle) supernova at half the age of the universe, confirming quantum gravity’s prediction!
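As a rough numerical check (assuming the predicted magnitude is of order Hc, the Hubble constant times the speed of light – an assumption consistent with the “roughly 10^-10 m s^-2” figure quoted above, not a statement of the paper’s actual derivation), the order of magnitude is easy to reproduce:

```python
# Order-of-magnitude check on the quoted cosmological acceleration,
# assuming it is of order H0*c (Hubble constant times speed of light).
H0_km_s_Mpc = 70.0                  # Hubble constant (illustrative value)
m_per_Mpc = 3.0857e22               # metres per megaparsec
c = 2.998e8                         # speed of light, m/s

H0 = H0_km_s_Mpc * 1e3 / m_per_Mpc  # Hubble constant in 1/s
a = H0 * c                          # acceleration scale, m/s^2
print(f"H0 = {H0:.3e} /s   a = H0*c = {a:.2e} m/s^2")  # ~7e-10 m/s^2
```

This lands within an order of magnitude of the 10^-10 m s^-2 figure that Renardson thought unobservably small.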

Understanding science corruption and deception

Science liars don’t think they are liars, for the most part. Take the typical example of “eugenics” pseudoscience, popularized by media censorship in democracies by fashionable bigots like Sir Francis Galton, and by the gas chamber “final solution” proposer and later Nazi collaborator, Medical Nobel Laureate Alexis Carrel. Society needs a diversity of ideas. The mixing of the “educational” establishment with science research in the 1850s was a disaster, standardizing theories and thinking methodologies prematurely (it’s always premature to edit diversity down to a single way of thinking) in order to set simplistic teaching syllabuses and exams, cloning scientists into a groupthink approach to fundamentals which are taboo.

Now you might construct a straw-man argument against this. You might say: OK, but in the real world lots of diversity leads to chaos, and we don’t have time to teach lots of diverse ideas, so we need to simplify and censor for time constraints (Plato’s defense of bigotry). In that case you can still lean over backwards to point out repeatedly what you are doing in defining the theory you select as preferable in particular, specific ways. For example, Bertrand Russell said that, as an alternative to evolution, God could have made the universe five minutes ago, complete with the fossil record: in other words, you use evolution not because it is the only possible theory (it isn’t) but because it is the most useful theory for scientific reasons (although for religious-promotion reasons, other theories may be more useful). In other words, you test theories but don’t “prove” them. This is a very difficult point. I can’t “prove” quantum gravity; I “just” have evidence which nobody else has: the prediction, however, doesn’t involve any extravagant hypotheses, just well-defensible empirical data.

Nature’s editors Phil Campbell and Karl Ziemelis wrote letters refusing to publish the “extremely unlikely” prediction in 1996 (along with several other prominent journals), and then chickened out of publishing the fact that they had got their decision wrong when the experiments confirmed the prediction in 1998 (despite repeated letters sent by recorded delivery, and unreturned phone calls)! So did New Scientist’s “ecowarrior” editors Richard Fifield and Jeremy Webb, and even their letters editor, who all went silent, went into abusive tantrums, or claimed that confirmed predictions were a “waste of time for the science news media unless they were FIRST published in journals ‘peer’ reviewed by (bigoted) string theorists” (EW publications were ignored by New Scientist). Classical and Quantum Gravity (Institute of Physics, Bristol) sent the paper for “peer” review to an anonymous and brilliant string theorist, who astutely reported: “This paper is detached from current work in superstring theory.” Duh, we told you! Superstring theory doesn’t make a single falsifiable prediction despite thirty years of mainstream effort, so it has turned into crank groupthink (worse than phlogiston crackpotism, which at least was a falsifiable conjecture that could be disproved by experiments!). The only role of the defenders of superstring is to silence falsifiable predictions from genuine alternative theories, using the bogus “peer” (not!) review system!

The recent BBC “news” bias controversy (censoring a programme exposing abuse allegations against its own 1980s star TV hero in December 2011, and then on Friday last week admitting it had transmitted a programme making damning allegations against a senior Thatcher-era politician without even bothering to first check whether the alleged attacker had been correctly identified, which he hadn’t been) should tell you that we don’t live in a free world. Priesthoods of unelected greasy-pole-climbing liars act as both censors and witchfinder generals, and control the TV channel that you are forced to pay for by law, under threat of prison, if you own a television: precisely the propaganda trick of the Nazis and the USSR. (The USSR also had public elections where voters could choose between two public relations expert clones every few years, a dictatorial propaganda process which in our country is given the misleading term “democracy”, despite having nothing to do with democracy, which was a daily referendum on issues – something that would be easy using the internet, given today’s internet banking security techniques – not an election choice between two dictators for a period of five years.)

Nothing wrong there: I am not objecting to any of these decisions to tell lies, but just to the censorship of those who point out the facts which disprove the lies; we are merely suggesting that defensible, free criticism of these mainstream lies is being censored out, in order to allow the lies to continue to brainwash people in an undemocratic, non-free, Third Reich-style corrupted media groupthink of “politically correct” thought dictatorship. If you want to tell lies, do so by all means, but my point is that in order to make progress we need to be able to criticise lying statements objectively. THE FREEDOM TO DO OBJECTIVE CENSORSHIP OF LIES IN THE FASHIONABLY PREJUDICED MEDIA IS MISSING. Censorship is only wrong when used in a one-sided, dictatorial manner: an emotional and fact-evading manner, by thugs who ignore (will not respond to) objective criticisms because they don’t need to. It’s not wrong when done objectively, to sort out and distinguish the facts and to ask challenging questions. You can’t make progress if you can’t criticise the status quo. The doublethink whereby we pretend we live with freedom of speech, when in fact one-to-many USSR-type quango media like the BBC saturate the world with groupthink lying propaganda from dictators like Paul Nurse (see the linked page here), is dangerous. It happened before with eugenics and the use of the gas chamber, proposed first for use against critics of government policy by Medical Nobel Laureate Alexis Carrel in his 1935 French bestseller Man the Unknown (a bestseller in Germany with a Nazi foreword in 1936). Deliberate misunderstanding of evolution was convenient for eugenics, so the big shots did not scientifically criticise it. Would the holocaust have occurred if Darwin had shot down Sir Francis Galton’s eugenics? Maybe not! Science isn’t an abstract game. It has human consequences.

Deliberate misunderstanding of quantum mechanics (non-relativistic 1st quantization complex-space lies – not simple multipath interference in relativistic 2nd quantization real-space path integrals – being the implicit assumption behind the “nobody understands” propaganda) is unnecessary, and leads to a culture of misunderstanding of, and hatred towards, progress in understanding physics. The liquid droplet model of the nucleus explains why the nuclear fission of a large nucleus into smaller nuclei releases energy: the surface tension (binding energy reduction) is proportional to the surface area of the nucleus, which scales as the square of the radius, whereas the number of nucleons present scales roughly as the volume, or the cube of the radius, and the Coulomb repulsion energy between the protons grows faster still. So the binding energy per nucleon gets smaller in very heavy nuclei, because the growing Coulomb repulsion outpaces the gain from having less surface area per nucleon. If you look at the curve of binding energy, you see that very heavy nuclei like uranium have about 7 MeV of binding energy per nucleon, compared to about 8.7 MeV/nucleon for iron. It’s this fall in binding energy per nucleon for very heavy nuclei which allows them to fission.

However, the total amount of binding energy increases after fission: from about 7 MeV/nucleon for uranium to well over 8 MeV/nucleon for the fission fragments. Nuclear binding energy is not being released; it’s getting bigger in fission! Fission doesn’t release energy from the nuclear (strong) force; on the contrary, fission increases that binding energy. What physicists call “nuclear energy” from fission is electromagnetic (Coulomb field) energy. The electromagnetic repulsion between protons in the nucleus is trying to push it apart, while the strong nuclear force, mediated (at its maximum range) by pions (gluons are exchanged between quarks on shorter distance scales), holds it together. When you hit the nucleus of uranium with a neutron (preferably a uranium isotope with an odd number of nucleons, which is less stable than the closed nuclear shell structures with even numbers of nucleons), it causes a distortion of the nucleus which may allow the electromagnetic repulsion force briefly to overcome the strong binding force and break the nucleus up. The point is, “nuclear energy” is not nuclear energy.
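These binding-energy figures are easy to check against the liquid droplet model itself, via the semi-empirical mass formula. The sketch below uses one common set of textbook coefficients (fits vary slightly between sources, so treat the outputs as approximate), and the fragment nuclides are merely typical examples of fission products:

```python
# Liquid drop model (semi-empirical mass formula); coefficients in MeV
# are one common textbook fit and vary slightly between sources.
A_V, A_S, A_C, A_A, A_P = 15.75, 17.8, 0.711, 23.7, 11.18

def binding_energy(A, Z):
    """Total binding energy (MeV) of a nucleus with mass number A, charge Z."""
    if A % 2 == 1:
        delta = 0.0                          # odd A: no pairing term
    elif Z % 2 == 0:
        delta = +A_P / A ** 0.5              # even-even: extra binding
    else:
        delta = -A_P / A ** 0.5              # odd-odd: reduced binding
    return (A_V * A                          # volume term
            - A_S * A ** (2 / 3)             # surface term
            - A_C * Z * (Z - 1) / A ** (1 / 3)  # Coulomb repulsion term
            - A_A * (A - 2 * Z) ** 2 / A     # asymmetry term
            + delta)

for name, A, Z in [("Fe-56", 56, 26), ("U-235", 235, 92),
                   ("Ba-141", 141, 56), ("Kr-92", 92, 36)]:
    print(f"{name:6s} B/A = {binding_energy(A, Z) / A:.2f} MeV/nucleon")
```

The output shows roughly 8.8 MeV/nucleon for iron, about 7.6 for uranium, and well over 8 for the fragments: the fragments are more tightly bound than the uranium, exactly as stated above, with the difference driven by the Coulomb term.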

It’s electromagnetic energy. It’s the Coulomb repulsion of protons, accelerating the fission fragments apart and thus imparting energy to them. It should be called electromagnetic energy (or atomic energy) to reduce confusion. But it isn’t. It’s called “nuclear energy” for political purposes, not scientific ones. The energy doesn’t come from nuclear binding energy: the fission fragments altogether have more nuclear binding energy than the unfissioned uranium! Now, if you look in popular books on science, they say that nuclear energy is explained by Einstein’s E = mc^2. Nope. An equation doesn’t explain the mechanism, and it’s not even “matter” that is being converted into energy anyway, as we have already explained: some electromagnetic Coulomb field energy is released in fission, and for the most part this energy is being converted into matter (increasing the binding energy per nucleon of the fission products; essentially all of the mass of atoms comes not from quarks or electrons but from the short-ranged, short-lived field quanta around the quarks). Fission converts Coulomb electromagnetic field energy (which has mass, but is not real, on-shell matter) into the kinetic energy of matter. Now, most physicists have learned that “there is nothing to be understood” and see no value in understanding mechanisms, which just obfuscate the complex numbers in their equations, so they censor this out. What’s new? Remember Feynman’s immediate acceptance by all the great despots of the age:

“… My way of looking at things was completely new, and I could not deduce it from other known mathematical schemes … Bohr … said: “… one could not talk about the trajectory of an electron in the atom, because it was something not observable.” … Bohr thought that I didn’t know the uncertainty principle …”

The Beat of a Different Drum: The Life and Science of Richard Feynman, by Jagdish Mehra (Oxford 1994, pp. 245-248).

This attitude of Bohr persists today with regard to the difference between 1st and 2nd quantization; the attitude is that because non-relativistic 1st quantization was discovered first, and is taught first in courses, it must somehow take precedence over the mechanism for indeterminacy in quantum field theory (2nd quantization). The doublethink of most textbooks omits this, and glues on 2nd quantization as a supplement to 1st quantization, rather than as a replacement for it! Why not have doublethink, with two reasons for indeterminacy: intrinsic, unexplained, magical indeterminacy, typified by the claim “nobody understands quantum mechanics” (1st quantization), plus the mechanism that the virtual particles in every field randomly deflect charges on small scales (like Brownian motion of dust)!

Einstein and Infeld in their book The Evolution of Physics discuss the randomness of Brownian motion. When the random, indeterministic motion of fragments of pollen grains was first seen under a microscope, the water molecules bombarding the fragments were invisible, and Brown actually believed that the motion was intrinsic to small particles: an inherent indeterminacy on small scales in space and time! This error is precisely Bohr’s 1st quantization error. It is no wonder that Bohr was so ignorantly opposed to Feynman’s path integral, or that most people still profess that they can’t understand mechanisms.
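A minimal random-walk sketch of the Brownian analogy (the kick sizes and counts are arbitrary illustrative numbers): many tiny, individually invisible impacts per observation frame add up to motion that looks intrinsically random if the impacts themselves cannot be resolved, which is exactly Brown’s mistaken inference described above.

```python
import numpy as np

rng = np.random.default_rng(42)

# A pollen fragment buffeted by many invisible molecular impacts per frame.
# Only the net displacement per frame is visible in the microscope, so the
# motion looks like "intrinsic" randomness of the fragment itself.
impacts_per_frame = 10_000
position = np.zeros(2)
for frame in range(5):
    kicks = rng.normal(0.0, 1e-3, size=(impacts_per_frame, 2))  # tiny impacts
    position += kicks.sum(axis=0)       # net effect of the invisible kicks
    print(f"frame {frame}: observed position = "
          f"({position[0]:+.3f}, {position[1]:+.3f})")
```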

Feynman’s answer, of course, is that 1st quantization is plain wrong, since it is non-relativistic; and Occam’s Razor tells us that we need 2nd quantization only, because it explains everything mechanically, without needing a 1st quantization (intrinsic or magical) uncertainty principle:

“I would like to put the [1st quantization] uncertainty principle in its historical place: when the revolutionary ideas of quantum physics were first coming out, people still tried to understand them in terms of old-fashioned ideas … But at a certain point the old fashioned ideas would begin to fail, so a warning was developed that said, in effect, “Your old-fashioned ideas are no damn good when …”. If you get rid of all the old-fashioned ideas and instead use the ideas that I’m explaining in these lectures – adding arrows [wavefunction phase amplitudes] for all the ways an event can happen – there is no need for an [1st quantization] uncertainty principle! … on a small scale [path actions small compared to h-bar], such as inside an atom, the space is so small that there is no main path, no “orbit”; there are all sorts of ways the electron could go, each with an amplitude. The phenomenon of interference [by 2nd quantization field quanta] becomes very important …”

– Richard P. Feynman, QED, Penguin, 1990, pp. 55-6, and 84.

Statistical correlation tests are the most easily corrupted form of science, and this corruption is rife: you test for “correlation” between one model and the experimental data, given a null (default) hypothesis that the “correlation” is just random coincidence. The flaw here is that the “evidence” you gain from a successful correlation test only tells you that the model accords with the data better than random noise does. It doesn’t tell you anything about the problem that another theory may also agree: e.g. FitzGerald’s, Lorentz’s, Poincare’s and Larmor’s equations match special relativity’s transformation and the E = mc^2 law, so “experimental tests” of these equations don’t specifically support Einstein’s theory over the more mechanical derivations of the same equations by the earlier investigators. It’s also been shown that the confirmed predictions of general relativity come from energy conservation, and are not specific confirmation of the geometric spacetime continuum model. Therefore, it is Popperian sophistry to claim that a specific theory is “confirmed” by experiments merely when its predictions are confirmed, unless you have somehow disproved the possibility of any other theory predicting the same results by a different route. Politically, this sophistry gives rise to the “historical accident syndrome”, whereby the first theory which gives the correct prediction in a politically correct, fashionable manner is hyped by the popular media as having been “confirmed” by experiment, when in fact only the predictions (which are also given by totally different theoretical frameworks sharing the same mathematical duality in the limits of the experimental regime) are confirmed. This is fascist hubris. We saw it with the earth-centred universe of Ptolemy. Once you have a fashionable model, it gets into the educational textbooks, it is “understood” by the popular media, and any alternative framework is wrongly dismissed as superfluous, unnecessary, boring, etc., without first being properly investigated to see if it fits more data more accurately.
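A toy numerical illustration of this degeneracy point (the data and both models are invented for illustration, not taken from any real experiment): over a limited experimental range, two different functional forms can both pass a naive correlation test against the same data, so a high correlation alone cannot single out either theory.

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented "experimental" data: the true law is gamma(v) = 1/sqrt(1 - v^2)
# (v in units of c), sampled only at low speeds, with small measurement noise.
v = np.linspace(0.01, 0.3, 30)
data = 1 / np.sqrt(1 - v**2) + rng.normal(0, 1e-4, v.size)

model_a = 1 / np.sqrt(1 - v**2)   # "theory A": the exact relativistic factor
model_b = 1 + v**2 / 2            # "theory B": a different functional form

for name, model in [("A", model_a), ("B", model_b)]:
    r = np.corrcoef(data, model)[0, 1]
    print(f"model {name}: correlation with data = {r:.6f}")
# Both correlations come out essentially equal (~0.999+): the test cannot
# distinguish the two theories in this regime, which is the point above.
```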

It’s important to note that this is a general problem in politics and human endeavour generally. The advice is to keep to well-worn paths or you will get lost. However, you’re unlikely to find much on well-worn paths, because so many people keep to them, and the probability of finding anything on them is therefore low. Ironically, this point is “controversial” because you get the counter-argument that you’re unlikely to find anything if you go off the beaten track. More to the point, if you do find anything off the beaten track, you still have a difficulty in convincing anybody that it actually exists, as Niccolò Machiavelli explains in the political context (The Prince, Chapter VI): “the innovator has for enemies all those who have done well under the old conditions, and lukewarm defenders in those who may do well under the new. This coolness arises partly from fear of the opponents, who have the laws on their side, and partly from the incredulity of men, who do not readily believe in new things until they have had a long experience of them. Thus it happens that whenever those who are hostile have the opportunity to attack they do it like partisans, whilst the others defend lukewarmly, in such wise that the prince is endangered along with them.”

It’s quite correct that a lukewarm argument on a radical and unpopular proposal leads either nowhere or to failure (suppression). You cannot easily overthrow a tyrant with kindly, gentle words alone. By the time a tyrant is susceptible to arguments (in dementia), it is easier to overthrow the regime by other means anyway. Diplomacy is the policy of feeding wolves in the expectation of achieving peace through appeasement. Groupthink is never revolutionary: it is always counter-revolutionary, developing political structures to stabilize a success by preventing a further revolution. New ideas are only welcome within the narrow confines of an existing theory, like epicycles.

Irving L. Janis, Victims of Groupthink, Houghton Mifflin, Boston, 1972

Janis, civil defense research psychologist and author of Psychological Stress (Wiley, N.Y., 1958), Stress and Frustration (Harcourt Brace, N.Y., 1971), and Air War and Emotional Stress (RAND Corporation/McGraw-Hill, N.Y., 1951), begins Victims of Groupthink with a study of classic errors by “groupthink” advisers to four American presidents (page iv):

“Franklin D. Roosevelt (failure to be prepared for the attack on Pearl Harbor), Harry S. Truman (the invasion of North Korea), John F. Kennedy (the Bay of Pigs invasion), and Lyndon B. Johnson (escalation of the Vietnam War) … in each instance, the members of the policy-making group made incredibly gross miscalculations about both the practical and moral consequences of their decisions.”

Joseph de Rivera’s The Psychological Dimension of Foreign Policy showed how a critic of Korean War tactics was excluded from the advisory group, to maintain a complete consensus for President Truman. Schlesinger’s A Thousand Days shows how President Kennedy was misled by a group of advisers on the decision to land 1,400 Cuban exiles in the Bay of Pigs to try to overthrow Castro’s 200,000 troops, a 1:143 ratio. Janis writes in Victims of Groupthink:

“I use the term “groupthink” … when the members’ strivings for unanimity override their motivation to realistically appraise alternative courses of action.”(p. 9)

“… the group’s discussions are limited … without a survey of the full range of alternatives.”(p. 10)

“The objective assessment of relevant information and the rethinking necessary for developing more differentiated concepts can emerge only out of the crucible of heated debate [to overcome inert prejudice/status quo], which is anathema to the members of a concurrence-seeking group.”(p.61) [“Let’s all be friends” was the initial approach of both Hitler and Stalin to their enemies; Hitler, especially, hated rudeness and encouraged his enemies to stick to the rules of gentlemanly behavior. The German proverb was that “hard words make wounds”. It’s easier for tyrants to censor those who are polite, without “even making a scene”. Hence Hitler’s repeated meetings and peace accords – later broken, of course – with Neville Chamberlain, which gave Hitler time to start WWII. Hence also the cold-blooded use of gas, and of classical music played to keep the concentration camps “in order”, with minimal conflict and none of the hot-blooded violence which would be “bad for morale”. The only useful, understandable communication with despots is hot-blooded violence, as proved by WWII. The pacifist belief in the “reasonableness of man” to resolve problems by calm negotiation is a delusion prevalent in those who have had a cushy time in life, away from desperate thugs. This was why Chamberlain was taken in, later lying that Britain had been “rearming”, when the arms gap had been widening with every second prior to war, increasing the relative strength of the Nazis and making the war, when it did come, more and more dangerous and costly in human lives. For the “let’s all be friends” approach to Hitler, see the book by Professor Cyril Joad, Why War? (1st ed. August 1939, 2nd ed. September 1939), which exaggerates weapons effects and then tells the reader that the author believes in his heart, without proof, that all people are reasonable and we can just negotiate with Hitler. Sure we could. Just what Hitler wanted and tried to do: peaceful conquests and genocide conducted entirely in concentration camps and gas chambers, without expending ammunition. The problem isn’t war. The problem is socialist Professor Joad, who led the Oxford Union’s 1933 pacifist “we won’t fight” motion to victory straight after Hitler’s election as Chancellor. No popular historian mentions this, naturally. Are they all liars, or ignorant?]

“One rationalization, accepted by the Navy right up to December 7 [1941], was that the Japanese would never dare attempt a full-scale assault against Hawaii because they would realize that it would precipitate an all-out war, which the United States would surely win. It was utterly inconceivable … But … the United States had imposed a strangling blockade … Japan was getting ready to take some drastic military counteraction to nullify the blockade.”(p.87)

“… in 1914 the French military high command ignored repeated warnings that Germany had adopted the Schlieffen Plan, which called for a rapid assault through Belgium … their illusions were shattered when the Germans broke through France’s weakly fortified Belgian frontier in the first few weeks of the war and approached the gates of Paris. … the origins of World War II … Neville Chamberlain’s … inner circle of close associates … urged him to give in to Hitler’s demands … in exchange for nothing more than promises that he would make no further demands”(pp.185-6)

“Eight main symptoms run through the case studies of historic fiascoes … an illusion of invulnerability … collective efforts to … discount warnings … an unquestioned belief in the group’s inherent morality … stereotyped views of enemy leaders … dissent is contrary to what is expected of all loyal members … self-censorship of … doubts and counterarguments … a shared illusion of unanimity … (partly resulting from self-censorship of deviations, augmented by the false assumption that silence means consent)… the emergence of … members who protect the group from adverse information that might shatter their shared complacency about the effectiveness and morality of their decisions.”(pp.197-8)

“… other members are not exposed to information that might challenge their self-confidence.”(p.206)

“Higgs boson” propaganda

Dr Woit reports: “On Monday LHCb will report the latest results on B(s)->mu+mu-, and the latest Higgs news should come at the Higgs parallel session on Wednesday.” The problems for the SM Higgs boson arise from the non-prediction of the mass of the SM Higgs boson:

“Higgs did not resolve the dilemma between the Goldstone theorem and the Higgs mechanism. … I emphasize that the Nambu-Goldstone boson does exist in the electroweak theory. It is merely unobservable by the subsidiary condition (Gupta condition). Indeed, without Nambu-Goldstone boson, the charged pion could not decay into muon and antineutrino (or antimuon and neutrino) because the decay through W-boson violates angular-momentum conservation. … I know that it is a common belief that pion is regarded as an “approximate” NG boson. But it is quite strange to regard pion as an almost massless particle. It is equivalent to regard nuclear force as an almost long-range force! The chiral invariance is broken in the electroweak theory. And as I stated above, the massless NG boson does exist.”

– Professor N. Nakanishi (Not Even Wrong blog comment, November 14, 2010 at 9:42 pm).

“Pion’s spin is zero, while W-boson’s spin is one. People usually understand that the pion decays into a muon and a neutrino through an intermediate state consisting of one W-boson. But this is forbidden by the angular-momentum conservation law in the rest frame of the pion.”

– Professor N. Nakanishi, Not Even Wrong blog comment, November 15, 2010 at 1:46 am.

Nakanishi states that, despite the Higgs mechanism which produces the massive weak bosons (the massive Z and W particles), a massless Nambu-Goldstone boson is also required in electroweak theory, in order to permit the spin-0 charged pion to decay without having to decay via a spin-1 massive weak boson. In other words, there must be a “hidden” massless alternative to the weak bosons as intermediaries. This is explained clearly in our theory of SU(2).

Update (12 November 2012):

Leo McKinstry on BBC Common Purpose fanatics and liars

Biggest BBC Science Politics Ignorant Fact-Abuser and Obfuscator to be Appointed next BBC Director General?

The Guardian newspaper’s Stalinist clone (the Daily Telegraph) blogs editor Damian Thompson has written a piece headed: “The next director-general of the BBC should be Jeremy Paxman. No, seriously” (linked here). This is typical of the ploys used to enforce evil: the pretence that Stalinist extremists are actually right-wingers putting forward objective ideas. A typical example of Jeremy Paxman’s science groupthink propaganda falsehoods was exposed by Dr Julian Lewis MP, who points out, in his letter to the Sunday Telegraph on 29 August 1999, that Paxman’s own statements disingenuously contradict data given in his own book:

“In recounting the story of the discovery of deadly nerve gases by the Nazis, Jeremy Paxman surprisingly states: “Why Hitler chose not to use the weapons is one of the enduring mysteries of the Second World War” (Comment, August 22). … “… no matter how tempted he felt to use his secret gases, Hitler had always to balance in his mind the conviction of his scientists that the Allies had them too.” That quotation is to be found on page 64 of a book about chemical and biological warfare, entitled A Higher Form of Killing and published in 1982. Its authors were Robert Harris and Jeremy Paxman.”

Of course, Paxman wasn’t “simply lying”. Let’s invent some typical political-style excuses for this “little anomaly”. (1) Maybe Paxman can’t or won’t proof-read book galleys before publication, and the quotation is stuff written by his co-author which he signed off without even reading. (2) Maybe he forgot his own views. (3) Maybe he changed his mind on the issue. Yes. Of course. Lots of excuses. (When someone in the media like Paxman gets everything wrong, it’s other people’s fault or has some “simple explanation”; but when these “big shots” inquisition others who make a similar slip-up, it’s a different story: they’re “hero” witchfinder generals!)

Note that Dr Julian Lewis reviewed Paxman’s highly biased pseudo-science book A Higher Form of Killing in The Times (8 April 1982):

“… in June 1940, Sir John Dill, Chief of the Imperial General Staff, declared: ‘At a time when our National existence is at stake, when we are threatened by an implacable enemy who himself recognizes no rules save those of expediency, we should not hesitate to adopt whatever means appear to offer the best chance of success.’

“What the authors of this book clearly demonstrate – albeit reluctantly and with various critical asides – is the sheer irrelevance of unenforceable conventions aimed at limiting the application of science to warfare. … The 1925 Geneva Protocol on Gas and Bacteriological Warfare was to have negligible influence upon the conflicts that followed. Its prohibition of the first use of Chemical weapons did nothing to deter Mussolini in Abyssinia in 1936, and would probably not have prevailed with the British had an invasion been mounted after Dunkirk. Hitler’s failure to exploit his monopoly in nerve-gases was likewise determined by purely military factors [LEWIS IS SADLY IGNORANT OF THE FACTUAL BASIS FOR THE EFFICIENCY OF SIMPLE BRITISH WWII ANTI-NERVE GAS-PROOFING OF ROOMS AGAINST SKIN CONTAMINATION BY LIQUID NERVE AGENT DROPLETS OR NERVE GAS VAPOUR, AND THE UTILITY OF WWII GAS MASKS AGAINST NERVE GASES; see detailed experimental proof in the papers I have personally published on the internet archive, linked here, and also the 1999 nerve gas absorption experiments on buildings with closed windows, linked here: William K. Blewett and Victor J. Arca, Experiments in Sheltering in Place: How Filtering Affects Protection Against Sarin and Mustard Vapor (report ADA365348): “sorption of the agent by the shell and interior surfaces of the building … was found to produce substantially higher protection factors than are predicted simply by air exchange. In hour-long challenges with mustard vapor, passive filtering increased the protection provided by the cottage by a factor ranging from 15 to 50. Increases in protection factor were significant with sarin, the more volatile agent …”] …

“Faced with the problem of retained documents and incomplete archives, Messrs Harris and Paxman inevitably tend to stray into the realms of speculation. … At least Robert Harris’s notorious televised claim that Churchill pressed for a biological attack which would have left German cities indefinitely contaminated, is not resurrected. The Prime Minister’s advocacy of gas retaliation to the V-weapons is now carefully distinguished from questions of germ warfare.”

Frederick Forsyth on EUSSR intimidation attempt

Above: brief extract from Frederick Forsyth’s exposure of the problems of Hitler’s “National Socialism” as it now stands, with Britain bailing out Greece’s communist credit-card spending sprees as well as its own home-brewed thugs. Britain and the USA had to drop 1.3 megatons of conventional bombs on Germany in WWII to stop Hitler’s eugenics racism pseudoscience and “European Integration” lunacy back in the 40s. Why on earth do German Chancellors keep on trying to start wars they can’t win? For more words of wisdom from Mr Forsyth, please see page 58 of my paper http://vixra.org/abs/1111.0111 (linked here) and also the blog post on the politics of science linked here (quotation from Frederick Forsyth on evil fascism dressed up as “political correctness” or groupthink do-gooder fascist-socialist “liberalism” lies).

James Delingpole, “28 Gates later”, The Telegraph, November 13th, 2012:

“…the unsuccessful attempt by blogger Tony Newbery (Harmless Sky) to get to the truth of the now-infamous January 2006 seminar where the BBC decided to give up even pretending to be balanced on the climate change issue and start reporting it like a full-on Greenpeace activist. The BBC’s excuse: clever experts made us do it. But this won’t wash …

“Here are allegedly ‘the best scientific experts’ who attended:

BBC Television Centre, London
Specialists:

Blake Lee-Harwood, Head of Campaigns, Greenpeace
Li Moxuan, Climate campaigner, Greenpeace China
Kevin McCullough, Director, Npower Renewables
Sacha Baveystock, Executive Producer, Science
Helen Boaden, Director of News
Andrew Lane, Manager, Weather, TV News
Anne Gilchrist, Executive Editor Indies & Events, CBBC
Dominic Vallely, Executive Editor, Entertainment
Eleanor Moran, Development Executive, Drama Commissioning
Elizabeth McKay, Project Executive, Education
Emma Swain, Commissioning Editor, Specialist Factual
Fergal Keane, (Chair), Foreign Affairs Correspondent
Fran Unsworth, Head of Newsgathering
George Entwistle, Head of TV Current Affairs
Glenwyn Benson, Controller, Factual TV
John Lynch, Creative Director, Specialist Factual
Jon Plowman, Head of Comedy
Jon Williams, TV Editor Newsgathering
Karen O’Connor, Editor, This World, Current Affairs
Catriona McKenzie, Tightrope Pictures catriona@tightropepictures.com
Liz Molyneux, Editorial Executive, Factual Commissioning
Matt Morris, Head of News, Radio Five Live
Neil Nightingale, Head of Natural History Unit
Paul Brannan, Deputy Head of News Interactive
Peter Horrocks, Head of Television News
Peter Rippon, Duty Editor, World at One/PM/The World this Weekend
Phil Harding, Director, English Networks & Nations
Steve Mitchell, Head Of Radio News
Sue Inglish, Head Of Political Programmes
Frances Weil, Editor of News Special Events …

Good work, Maurizio. Nice job! … ‘…now the BBC has yet another big problem on its hands. It turns out it has lied to the public who pay for it … This is no small matter considering the billions of pounds involved in the Green energy industry. Additional carbon taxation has directly led to fuel poverty for hundreds of thousands. The excess cold-related deaths in the UK have shot up in the last few years.’

“BBC’s latest excuse: forget Jimmy Savile, blame Nigel Lawson

“by James Delingpole, November 12th, 2012

“The other day I argued that, following the Jimmy Savile and Lord McAlpine disasters [McAlpine has been hired for advertising by the BBC, using a massive out-of-court settlement paid for with licence payers’ cash as an allegedly contractual bribe to declare the BBC immaculate; conservatives will always lie for brown envelopes of cash], the BBC will learn nothing and do nothing. Patten – I’ll bet you: and there’s no bet I’d more happily lose – will keep his well-upholstered rear stuck firmly in the Chairman’s seat. The BBC will remain, as it is now, a bastion of entrenched left-liberal orthodoxy. If you need proof, have a read of this astonishing speech just delivered to Oxford University by the BBC’s ex-Director General Mark Thompson.

“Though Thompson probably bore more responsibility than anyone for the Jimmy Savile fiasco – he was in charge when the BBC took its ludicrous decision to shelve a programme exposing Savile and run one praising him instead – he escaped in the nick of time to go to his new cushy £4 million a year job editing one of the few media institutions in the world even more nauseatingly bien-pensant than the BBC – the New York Times, aka Pravda. … Nigel Lawson’s Global Warming Policy Foundation has been a consistent thorn in the side of the BBC, by exposing the lamentable bias of its climate change coverage. Its publications include Christopher Booker’s devastating report The BBC and Climate Change: A Triple Betrayal … No one likes being told they’ve been naughty and done a bad thing. Especially not the gingery-beardie Mark Thompson … He quotes the Doran survey (‘97 per cent of scientists say…’), quite unaware that it has been exposed as rubbish; he is impressed by Bob Ward whom he seeks to brandish as an expert in the field; he constructs his whole speech around the argumentum ad verecundiam – blissfully unaware throughout that by citing supposed authorities such as the Royal Society he is guilty of precisely the rhetorical fallacy he is striving to criticise. … he resorts to yet another rhetorical fallacy (the argumentum ad populum) to demonstrate that ‘scientists’ are considered in opinion surveys to be much more trustworthy than ‘journalists.’ … my immediate thought is: wow! These people are so shameless.”

Update

Feedback on first quantum gravity YouTube video from Dr Mario Rabinowitz:

From: Mario Rabinowitz
To: Nige Cook
Sent: Sunday, November 11, 2012 4:26 PM
Subject: Re: My recent paper, Challenges to Bohr’s Wave-Particle Complementarity Principle

Hello Nige,

NG: “What do you think of my brief YouTube video which goes through my quantum gravity paper quickly? https://nige.wordpress.com/2012/11/11/quantum-gravity-film-11-november-2012-upload/”

Thanks, I enjoyed your well-done YouTube video for a number of reasons:

*You cover some material that I and probably most physicists have not seen before.

*I think the historical perspective you provide is valuable.

*Most people do not have the time to unearth the material you provide.

*I’m glad you credited Maxwell for some of his seminal contributions. Maxwell is one of my heroes. He was an exceptionally gifted and honest scientist. I read many of his papers decades ago. I recall one in which he said that he had been wrong on a thermodynamic question, and Clausius was right.

*It is the first time I’ve heard your voice and seen your face.

*You are to be applauded for the hard work you put into it.

Here are a couple of suggestions in case you revise your video:

*Feynman’s voice is not identified until well after it is heard. It might be good to do it sooner. You may have done this intentionally to raise the viewer’s curiosity. However, it may be unnecessarily distracting to many who don’t recognize his voice right away.

*There is a flash of something from Glasstone & Dolan. You may have done this intentionally to be subliminal. But it may be puzzling if not perplexing to those who are not familiar with their book on the effects of nuclear weapons.

Thanks also for the Feynman info. He is another of my heroes who honored me by visiting with me in my office for about an hour.

In my recent paper, Challenges to Bohr’s Wave-Particle Complementarity Principle, on the arXiv at

http://arxiv.org/abs/1211.1916

I conclude that violation of complementarity breaches the prevailing probabilistic (Copenhagen) interpretation of Quantum Mechanics. Do you agree with me?

Best, Mario

On Sun, Nov 11, 2012 at 1:19 AM, Nige Cook wrote:

Hello Mario,

Thank you very much. It is very interesting!

What do you think of my brief YouTube video which goes through my quantum gravity paper quickly? https://nige.wordpress.com/2012/11/11/quantum-gravity-film-11-november-2012-upload/

“… My way of looking at things was completely new, and I could not deduce it from other known mathematical schemes … Bohr … said: “… one could not talk about the trajectory of an electron in the atom, because it was something not observable.” … Bohr thought that I didn’t know the uncertainty principle …”

– The Beat of a Different Drum: The Life and Science of Richard Feynman, by Jagdish Mehra (Oxford 1994, pp. 245-248).

This attitude of Bohr’s persists today with regard to the difference between 1st and 2nd quantization; the attitude is that because non-relativistic 1st quantization was discovered first, and is taught first in courses, it must somehow take precedence over the mechanism for indeterminacy in quantum field theory (2nd quantization). The doublethink of most textbooks omits this and glues 2nd quantization on as a supplement to 1st quantization, rather than as a replacement for it! Why not have doublethink, with two reasons for indeterminacy: the intrinsic, unexplained, magical indeterminacy typified by the claim “nobody understands quantum mechanics” (1st quantization), plus the mechanism that virtual particles in every field randomly deflect charges on small scales (like Brownian motion on dust)!

Einstein and Infeld, in their book “Evolution of Physics”, discuss the randomness of Brownian motion. When the random, indeterministic motion of fragments of pollen grains was first seen under a microscope, the water molecules bombarding the fragments were invisible, and Brown actually believed that the motion was intrinsic to small particles: an inherent indeterminacy on small scales in space and time! This error is precisely Bohr’s 1st quantization error. It is no wonder that Bohr was so ignorantly opposed to Feynman’s path integral, or that most people still profess that they can’t understand mechanisms. [A numerical sketch of this Brownian mechanism follows this email.]

Feynman’s answer, of course, is that 1st quantization is plain wrong, since it is non-relativistic; Occam’s Razor tells us that we need only 2nd quantization, because it explains everything mechanically without needing a 1st quantization (intrinsic or magical) uncertainty principle [a numerical “adding arrows” sketch also follows this email]:

“I would like to put the [1st quantization] uncertainty principle in its historical place: when the revolutionary ideas of quantum physics were first coming out, people still tried to understand them in terms of old-fashioned ideas … But at a certain point the old-fashioned ideas would begin to fail, so a warning was developed that said, in effect, “Your old-fashioned ideas are no damn good when …”. If you get rid of all the old-fashioned ideas and instead use the ideas that I’m explaining in these lectures – adding arrows [wavefunction phase amplitudes] for all the ways an event can happen – there is no need for a [1st quantization] uncertainty principle! … on a small scale [path actions small compared to h-bar], such as inside an atom, the space is so small that there is no main path, no “orbit”; there are all sorts of ways the electron could go, each with an amplitude. The phenomenon of interference [by 2nd quantization field quanta] becomes very important …”

– Richard P. Feynman, QED, Penguin, 1990, pp. 55-6, and 84.

Kind regards,

Nige
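As promised above, here is a minimal Python sketch of the Brownian point made in that email: a pollen fragment kicked by many invisible molecular impacts per time step looks “intrinsically” random if you can only see the fragment. The impact count and kick size are illustrative assumptions of mine, not a fitted model of water molecules:

import random

random.seed(2012)  # fixed seed, so the "random" track is exactly reproducible
position = 0.0
track = []
for step in range(10):
    # Each visible time step hides about 1000 tiny molecular impacts;
    # only their summed effect on the fragment is observable.
    kick = sum(random.choice((-1.0, 1.0)) for _ in range(1000)) * 1e-3
    position += kick
    track.append(position)
print("observed jiggle:", ["%+.3f" % x for x in track])

Nothing here is indeterministic in principle: run it again with the same seed and the track repeats exactly. The apparent “inherent” randomness is entirely an artefact of hiding the individual impacts, which is the analogy being drawn with the random deflections by 2nd quantization field quanta.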
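And here is a minimal Python sketch of Feynman’s “adding arrows” for a free particle between fixed endpoints, summing the phase amplitude exp(iS/h-bar) over a family of single-bend paths. The path family, the step sizes and the effective value of h-bar are illustrative assumptions chosen only to make the cancellation visible; this is a toy, not a full path integral:

import cmath

HBAR = 0.01  # assumed effective h-bar; smaller values sharpen the classical path

def action(d, mass=1.0, length=1.0, time=1.0):
    # Action S of a free-particle path with one sideways bend of size d
    # at mid-time: two straight legs, each contributing (m/2) v^2 (t/2).
    leg = ((length / 2.0) ** 2 + d ** 2) ** 0.5
    v = leg / (time / 2.0)
    return mass * v * v * (time / 2.0)

def arrow(d):
    # Feynman's "arrow" for one path: unit phase amplitude exp(iS/h-bar).
    return cmath.exp(1j * action(d) / HBAR)

full = sum(arrow(i * 0.002) for i in range(-250, 251))  # all 501 paths
near = sum(arrow(i * 0.002) for i in range(-50, 51))    # 101 near-straight paths
print("all 501 arrows:           |sum| = %5.1f" % abs(full))
print("101 near-straight arrows: |sum| = %5.1f" % abs(near))
# The 400 strongly bent paths spin rapidly in phase and largely cancel
# among themselves; the total is dominated by arrows from paths near the
# classical least-action (straight) path, where the phase is stationary.

The output shows that the 101 near-straight arrows alone already reproduce the order of magnitude of the full sum, while the other 400 arrows mostly cancel one another: no uncertainty principle is put in by hand, yet there is no single sharp path either, exactly as in the Feynman quotation above.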