B-mode quantum gravity evidence from 300,000 years after the big bang and its abuse by inflationists

  • “Have you detected B-modes from inflation?
    We have detected B-mode polarization at precisely the angular scales where the inflationary signal is expected to peak with very high significance (> 5 sigma). We have extensively studied possible contamination from instrumental effects and feel confident we can limit them to much smaller than the observed signal. Inflationary gravitational waves appear to be by far the most likely explanation for the signal we see.” – BICEP2 propaganda error, http://bicepkeck.org/faq.html
  • The problem with assuming that, because inflation theory is the most fashion-hyped mechanism to suppress gravitational curvature in the early universe, it is therefore the most likely mechanism.  (E.g., in 250 BC, Aristarchus of Samos suggested that the earth rotates daily and orbits the sun annually, as an “alternative theory” to the theory that the sun orbits the earth daily; but Aristarchus was later falsely dismissed as contrived, complex and improbable by Ptolemy in 150 AD.  Science is about objectivity, which means not subjectively dismissing alternative theories because they look more complex or less fashionable or popular!  If you have more than one theory which models the data, you should be honest and admit it, not lie!)
  • The “New Scientist” and other mainstream media are ignoring the alternative quantum gravity theory which predicts the correct weak curvature at 300,000 years given by the cosmological background radiation temperature fluctuations, and falsely assuming that inflation is the only theory!  As a “strawman criticism” against BICEP2 data, New Scientist is now reporting MHD-EMP (compression of magnetic field lines by expanding supernova explosion debris) as a contaminant to the data.  However, this just deflects attention from the key argument, which is over the mechanism for the small scale of the temperature fluctuations:

cosmological background radiation polarization

Lee Smolin astutely pointed out on Woit’s Not Even Wrong blog that the B-mode polarization of the CBR is more important for quantum gravity than for anything else: “we may have confirmation of quantum gravity effects before we have direct detection of classical gravitational waves”.

Left handed circularly polarized light animation

B mode polarization diagram

There’s a kids-level description of the very simple polarization of the cosmic background radiation (which is just a normal Planck radiation spectrum at 2.7 K temperature) in section 6.5 (Polarization of the Cosmic Microwave Background Radiation) on pages 121-5 of Luis Álvarez-Gaumé and Miguel Á. Vázquez-Mozo, “An Invitation to Quantum Field Theory” (Springer-Verlag Berlin Heidelberg, 2012):

“The differential cross section of Thomson scattering we have derived is relevant in many areas of physics, but its importance is paramount in the study of the cosmological microwave background radiation (CMB). Here we are going to review briefly how polarization emerges in the cosmic background radiation and discuss why its detection could serve as a window to the physics of the very early universe. … Just before recombination [3000 K or ~0.3 eV temperature at 300,000 years after the big bang] the universe is filled with a plasma of electrons interacting with photons via [non-relativistic] Compton scattering [hence the low energy Thomson cross-section applies to the Compton effect, not the Klein-Nishina formula that is needed for higher energy, relativistic effects from 1 keV to 1 MeV when pair-production starts]. … so the approximations leading to Thomson differential cross section apply. … Since Thomson scattering suppresses all polarizations in the direction of the incoming photons we find that the two polarizations in the scattered radiation come from the ‘horizontal’ polarizations of the incoming photons …

“The previous heuristic arguments show that the presence of a net polarization in the CMB is the smoking gun of quadrupole anisotropies in the photon distribution at the last scattering surface. … Gravitational waves propagating through the plasma induce changes in its density with precisely the quadrupole component necessary to produce the polarization in the CMB radiation. [Lengthy calculation follows …] In other words, what we have concluded is that the measurement of the polarization of the CMB gives direct information about the quadrupole component of the distribution function of photons at decoupling!”
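The Thomson cross section argument quoted above is easy to check numerically. A minimal sketch (the CODATA value of the classical electron radius is assumed; the angular factor is the standard unpolarized Thomson result):

```python
import math

# Classical electron radius (CODATA value), in metres
r_e = 2.8179403262e-15

def thomson_dsigma_domega(theta):
    """Unpolarized Thomson differential cross section, m^2 per steradian."""
    return 0.5 * r_e**2 * (1.0 + math.cos(theta)**2)

# Integrating over solid angle gives the total Thomson cross section:
# sigma_T = (8*pi/3) * r_e^2
sigma_T = (8.0 * math.pi / 3.0) * r_e**2
print(f"sigma_T = {sigma_T:.3e} m^2")  # ~6.652e-29 m^2, the accepted value

# The (1 + cos^2 theta) factor is the point of the quoted passage: at
# theta = pi/2 only one incoming linear polarization can scatter, which
# is how Thomson scattering converts a quadrupole anisotropy at last
# scattering into a net CMB polarization.
```

The 90-degree suppression of one polarization is exactly the “horizontal polarizations” argument in the Álvarez-Gaumé and Vázquez-Mozo quotation.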

B mode polarization calculation

Dr Igor Khavkine, who you will remember is one of those who censored out my quantitative, fully predictive and proof checked evidence for quantum gravity (1996 prediction of the correct quantity of dark energy, first seen two years later), now writes at Not Even Wrong on 29 March 2014 somewhat ironically about the new B-mode gravity wave evidence and the big bang inflation hypers who seize it as proof of their particular non-existent “theory”:

“I’m not an expert in the various physical effects surrounding inflation. However, based on information provided in some of the presentations of the BICEP2 results, it seems to me that the status of the B-mode signal detection as evidence of quantum gravitational effects + inflation can be put on the same footing as the status of CMB temperature fluctuations as evidence [of] primordial quantum fluctuations of the inflaton field. Indeed, the latter seems to be fairly well accepted and the two observations are indirect in similar ways. I’ve based the following on information from various talks I’ve attended and discussions with cosmologists. It would be rather hard for me, unfortunately, to dig up specific references. …

“In my understanding, the temperature anisotropies that we see (provided all foreground effects can be assumed to have been eliminated) tell us directly only about the photon times of flight (accounting for different amounts of red shift) from the surface of last scattering to us. These varying times of flight are then considered evidence for (classical) density fluctuations present at the time of recombination. The distribution of mode amplitudes of these fluctuations appears to be gaussian …

“Similarly, the observed B-modes tell us directly only the presence of (classical) gravitational waves at the time of recombination. Actually, already this point could be disputed, because the degree of directness depends on the ability to exclude other sources of B-modes. Perhaps magnetic fields could be another source, but the BICEP2 analysis team didn’t seem to think that it was likely. I’m not sure about all the reasons, but lets take that for granted now. Lets also presume that the distribution of mode amplitudes of these gravitational waves was also gaussian, with covariance matrix estimated from the B-mode 2-point function. At the very least, I have not yet seen anyone bring up any evidence of non-gaussianity in the detected B-modes. … If inflation did happen, then it would leave behind this kind of signature, as amplified quantum vacuum fluctuations: (a) gaussian distribution of fluctuations connected to the gaussian shape of the quantum vacuum, (b) “large” amplitude (large enough for the fluctuations to have become classical) set by the amount of expansion during inflation, (c) a fixed relationship between gravitational and scalar amplitudes as a function of frequency …

“So, in the absence of other pre-recombination physics that would generate signals with specific signatures (a), (b) and (c), the observations of temperature anisotropies and B-modes do point toward inflation, an inflaton-driven period of rapid expansion in the early universe. And, if inflation did happen, then the detected B-modes do in fact descend from amplified graviton quantum vacuum fluctuations. A similar thing was said, and widely accepted, of temperature anisotropies long before the B-mode detection. Of course, alternatives where a signal with signatures (a), (b) and (c) is not of quantum origin might be possible, but they’d have to be subject to investigation and testing like any other hypothesis. At the moment, the inflation hypothesis seems to be doing rather well compared to its rivals.”

You have a theory, let’s say inflation.  At 10^{-32} second or so, fundamental forces decouple from an unproved grand unification (by which most of these political guys mean a communist “equality” of running coupling parameters), and the universe “inflates” faster than the velocity of light, thereby distributing the matter over a large volume and drastically reducing the gravitational field curvature.  That’s Guth’s “theory”.  Now there are many problems with calling it a “theory”.  First, there’s no proved grand unified theory for inflation.  Second, there are no hard quantitative predictions, merely equations with unknown and therefore adjustable parameters, which permit (but don’t prove) epicycle-like fits to data.  But the worst thing is that the “theory” isn’t unique.  It has to be hyped with giant neon lights in order to deflect attention from rival theories that do better, predicting quantum gravity, dark energy and weak gravitational curvature at 300,000 years after the big bang in a quantitative way (including predictions of constants and parameters, which don’t need to be “deduced” from the data they are claiming to “explain”), unlike the adjustable (ad hoc or qualitative) equations of inflation “theory”.  Among the rival theories, you get politics.  The biggest hyped rival theory is equally non-predictive nonsense, as is the third.  Only a totally ignored theory way down the list, which is censored out by all the lying hype and neon adverts for the “top” theories, has actual evidence that replaces inflation!  But if you point this out quietly, you’re ignored; whatever you say, you’re ignored.

(In fact, if you merely point out that you’re ignored, you get angry ad hominem attacks claiming that you’re a publicity seeker or whatever, which totally ignore what you’re saying, and the “editors” refuse to edit or even make constructive criticisms.  They’re paranoid and bitter with anyone truly innovative, but like the censors in George Orwell’s fairy tale, they redefine words to try to project those qualities onto the people whose ideas they refuse to check.  Innovators are then labelled paranoid and bitter.  All this simply wastes time and effort.  Trying to get through biased peer-review is a waste of everybody’s time, just like “peaceful diplomacy” with Nazis.)

 

84.5% of mass (dark matter) is massive right-handed neutrinos

Right handed neutrinos: dark matter

See links here, here, here, here and here.  The hard fact is: massive right handed neutrinos don’t contribute much to weak interactions because of their immense mass, but do interact with gravity unlike massless left handed neutrinos.  I can’t understand why dark matter in the form of massive right handed neutrinos isn’t already considered a confirmed fact, based on experimental evidence of neutrino flavor mixing!  (See my discussion of massive right handed neutrino lifespan evidence below.)

Right handed neutrinos are implied by neutrino flavor mixing data and the see-saw mechanism for neutrino mass: left handed neutrinos are massless, right handed neutrinos are massive, so the small apparent (“observed”) masses of neutrinos are an average over time for oscillations between the briefly-existing massive right-handed neutrinos (which, due to their large mass, have a short mean-free-path before transforming back into massless left-handed neutrinos in the vacuum) and the longer-existing (massless) left handed neutrinos, which can only undergo weak interactions!
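The see-saw arithmetic behind this is a one-line estimate: the light neutrino mass comes out as roughly m_D^2/M_R. The Dirac and right-handed masses below are illustrative assumptions (an electroweak-scale m_D and a grand-unification-scale M_R), not measured values:

```python
# See-saw mechanism estimate: m_light ~ m_D^2 / M_R, where m_D is a
# Dirac mass and M_R is the heavy right-handed Majorana mass.
# Both input values are assumptions chosen for illustration only.
m_D = 100.0    # GeV (assumed: roughly the electroweak scale)
M_R = 1.0e14   # GeV (assumed heavy right-handed neutrino mass)

m_light_GeV = m_D**2 / M_R
m_light_eV = m_light_GeV * 1.0e9  # 1 GeV = 1e9 eV
print(f"m_light ~ {m_light_eV:.2f} eV")  # ~0.1 eV
```

A sub-eV light neutrino mass of this order is what the see-saw naturally yields: the heavier the right-handed partner, the smaller the apparent mass of the observed left-handed neutrino.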

The (large) mass of right-handed neutrinos makes them couple to the gravity field, not only the weak interaction; but the lack of mass of left-handed neutrinos ensures that those merely couple to the weak force, not gravity.  This asymmetry in couplings for the two kinds of neutrinos is responsible for the small observable apparent mass of neutrinos, which is simply a time-average superposition between both of the states.  I don’t understand how anyone can accept the model for neutrino oscillation between left and right handed states, if they don’t accept that both states have at least one interaction (i.e. Standard Model weak charge) in common, so I disagree strongly with Peter Woit’s statement that right-handed massive neutrinos don’t undergo weak (or any other S.M.) interactions:

Right-handed neutrino fields fit naturally into the SM pattern of fundamental fields, but with zero SU(3)xSU(2)xU(1) charges. That such fields have something to do with dark matter looks more promising than the SUSY or axion proposals of introducing a new and different sector of fields. – Woit

I disagree that right-handed neutrinos need to have a lack of weak charge: their short life (due to their mass) reduces the effective weak charge of right-handed neutrinos, simply because they aren’t there for long, as compared to left-handed neutrinos!  So I very much prefer Professor Matt Strassler’s far more cautiously-worded comment about right handed neutrinos:

… the dark matter particles are kind of like neutrinos — they’re fermions, like neutrinos, and they are connected to neutrinos in some way, though they aren’t as directly affected by the weak nuclear force.  [Emphasis added to key words.]

The fact that the right-handed neutrinos “aren’t as directly affected by the weak nuclear force” as left-handed neutrinos is simply down to their short-lifetime due to their immense mass.

The lifetime for spontaneously produced particles of mass m in the vacuum is only h-bar/(mc^2) seconds, whereas left handed neutrinos are massless and therefore have an effectively infinite lifetime, and so they remain unchanged until they undergo a weak interaction with either a flavour-changing, massive, short-lived right-handed neutrino in the vacuum, or else a Standard Model weak charge.
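This uncertainty-principle lifetime is a one-line calculation. A sketch; the heavy right-handed neutrino mass used below is an assumed illustrative value, not a measurement:

```python
hbar = 1.054571817e-34  # J*s (CODATA)
eV = 1.602176634e-19    # J per eV (exact SI definition)

def vacuum_lifetime(mass_GeV):
    """tau ~ hbar / (m c^2): uncertainty-principle lifetime of a
    spontaneously produced particle whose rest energy is m c^2."""
    rest_energy_J = mass_GeV * 1.0e9 * eV
    return hbar / rest_energy_J

# Illustrative cases (the right-handed neutrino mass is an assumption):
print(vacuum_lifetime(0.000511))  # electron-mass scale: ~1.3e-21 s
print(vacuum_lifetime(1.0e14))    # hypothetical M_R = 1e14 GeV: ~6.6e-39 s
```

The heavier the particle, the shorter its spontaneous lifetime, which is the quantitative content of the claim that massive right-handed neutrinos “aren’t there for long”.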

Since the massive Standard Model charges that form ordinary matter are long lived (not spontaneous pair-production short-life particles in the vacuum), their weak charge cross-sections are more apparent, simply because they last far longer than massive right-handed neutrinos, a simple fact that apparently appears so “facile” to some highly technical dudes that no effort is made to grasp it at all!

Milner-Zuckerberg Prizes for Mathematics

Peter Woit states: “At the Hollywood-style awards ceremony last night for $3 million string theory and biomedical research prizes, it was announced that Yuri Milner and Mark Zuckerberg will now start funding something similar in mathematics, called the Breakthrough Prize in Mathematics. … I’ve written extensively about the “Fundamental Physics Prize” and what I see as the worst problem with it (heavily rewarding and propping up a failed research program).  … The physics prize has turned out to be extremely narrowly targeted at one particular subfield of physics … the main argument for the prizes is that the money (and Academy Awards-style ceremonies) will help make them celebrities … I still think the whole concept is problematic. The US today is increasingly dominated by a grotesque winner-take-all culture that values wealth and celebrity above all else.”

I strongly disagree with everything Peter Woit states here (see the footnote at the end of this post for my take on his classic anti-capitalism politics), but especially with his hypocrisy in speaking out against celebrity while also claiming to take a stand against the dictatorship of physics by one failed unification idea, which has become a religious dogma among leading physicists, with objections deemed heretical and resulting in excommunication.  The dangers here need spelling out clearly:

(1) that if he acquires celebrity status as the “debunker of string theory” and gets his argument wrong, he’ll only make the problem worse for others (in other words, if he leads the anti-string lobby and fails to overturn string, he’ll be used as a straw man opponent by the string theorists);

(2) the only way to overturn a failed dogma theory historically has been to replace it with something better.  This is not Woit’s approach, which is to make criticisms without suggesting a better theory.  So, on this basis, Woit is making the problem worse and providing a straw man target;

(3) Woit reproduces the electroweak sector charges of the standard model (including chiral features, since right-handed spinors in his model have zero weak charge) by picking out a U(2) symmetry as a subset of SO(4) spacetime (on page 51 of http://arxiv.org/abs/hep-th/0206135, based on his 1988 paper “Supersymmetric quantum mechanics, spinors and the standard model”, Nuclear Physics, vol. B303, pp. 329-42), yet he does not try to strongly market this theory as an alternative by making it the focus of a book or popular article, instead writing weakly/humbly on page 51 of a long technical paper: “The above comments are exceedingly speculative and very far from what one needs to construct a consistent theory. They are just meant to indicate how the most basic geometry of spinors and Clifford algebras in low dimensions is rich enough to encompass the standard model and seems to be naturally reflected in the electro-weak symmetry properties of Standard Model particles.”

Woit weak leadership

However it is clear that this fact – that progress in low dimensions is possible – leads Woit to his criticisms of string dogma.  In other words, Woit appears to me to be putting forward arguments against string which are weaker than they need to be, for a psychological reason (modesty).  Let’s make this fact crystal clear: Woit in 1988 discovered an alternative approach to developing a better understanding of electroweak symmetry, based on the mathematical representation of the U(2) symmetry in simple 4 dimensional Euclidean space.  This caused Woit to feel uneasy with Witten’s 1995 10/11 dimensional M-theory hype, despite the fact that Woit’s graduate work in computational (Wilson formulation) lattice QCD nuclear physics utilized Edward Witten’s earlier conjecture on the large N expansion (Witten’s 1979 paper: “Baryons in the 1/N expansion”, Nuclear Physics, vol. B160, pp. 57-115, a mathematical conjecture which seems to be based on thinking of the strong force using a hadronic string model).

Witten’s problem for physics today is his 1995 M-theory (conjecture) that 10 dimensional superstring is a brane surface on an 11 dimensional supergravity bulk. This speculation reinforces and hardens dogmas like SUSY, increasing the parameters of the Standard Model from 19 to at least 125 parameters in the minimal supersymmetric standard model.

The bottom line is, instead of presenting his strongest (objective) evidence against M-theory (his own research as a replacement direction for physics to go in), Woit instead raises a lot of relatively subjective arguments about the lack of “progress” in M-theory.  This is unsatisfactory, because “progress” is ill-defined in science: to someone digging in a hole, the deeper the hole gets, the more “progress” is being made.  To critics, it’s the opposite, and people in holes should stop digging.  Such arguments go nowhere, because if you are digging for gold and don’t know how deep the gold is (if it is there at all), it’s an arbitrary decision to quit.  Moreover, the more time and effort you “invest” (to critics: “waste”) in digging your hole, the less inclined you are to admit failure, lose face, etc.  Only when you get hungry and run out of supplies, are you likely to relent, and then you won’t admit failure. You’ll go to your grave dreaming of digging deeper in your hole.  The only way to defeat this, is for someone else to find the gold.  What drives some of us, is not the dream of seeing gold, but the desire to find the gold simply to discredit smug mathematical elitism.

Footnote:

Peter Woit attacks prizes for promoting capitalism with smug words: “The US today is increasingly dominated by a grotesque winner-take-all culture that values wealth and celebrity above all else.”

The problem is that this attitude ends up making prizes even more warped, because it introduces a political-type crusading aspect, rewarding high-profile scientists with failed grand unification theories but who are “worthy” in some other way. For instance, people either famous for making lots of money out of best-selling non-mathematical hype-style kids books “about mathematics”, or else famous for some kind of politically correct anti-capitalism or pro-environmentalism crusade (based on subjective or controversial interpretations of ambiguous data).

Apart from this purely “Matthew effect” corruption in prize ceremonies, there is also the egotism of those giving the prizes, which sometimes corrupts the selection of recipient: money is used to “buy” free publicity in the media, so you must give a prize to an already interesting or famous celebrity, to host an awards ceremony with media attendance. This is contrived “news” but it works.

Staged ad-style philanthropy is more praiseworthy than high-profile mega-rich celebrities begging those poorer to donate to good causes, while pretending to do this “free” (their payback is the relatively positive free “positioning” publicity they receive in the process of doing it, usually aided by public service awards). This is what Woit is missing in his analysis.

It’s not a choice of good versus bad options, but of bad versus very bad. It’s far better to take the lesser of two evils. Capitalism has its problems, but it works better than the USSR type socialist idealism, with its monolithic centralized control and its demotivating, restricting bureaucracy. Similarly, arbitrary prizes are vulnerable to corruption like capitalism, but probably work better than regimented consensus, which has its own set of groupthink problems.

As noted above, Woit reproduces the electroweak sector charges of the standard model (including chiral features, since right-handed spinors in his model have zero weak charge) by picking out a U(2) symmetry as a subset of SO(4) (page 51 of http://arxiv.org/abs/hep-th/0206135, based on his 1988 paper “Supersymmetric quantum mechanics, spinors and the standard model”, Nuclear Physics, vol. B303, pp. 329-42), yet he does not try to market this theory as an alternative by making it the focus of a book or popular article.

Although Woit “only” reproduced the electroweak charges and chiral features of the electroweak sector correctly in 1988, there has been some technical work since then dealing with the non-symmetry details of U(2) theory which Woit left untouched.  See, for example, the paper by Aranda, Carone and Lebed, U(2) Flavor Physics without U(2) Symmetry, http://arxiv.org/abs/hep-ph/9910392 which models the weak mixing angles (CKM matrix) and fermion mass relations.  So U(2) is not just a threadbare model of the electroweak sector charges and handedness.

Whether this specific example is totally correct or not, Woit’s conjecture that “The quantum field theory of the standard model may be understood purely in terms of the representation theory of the automorphism group of some geometric structure” (quoted from http://arxiv.org/pdf/hep-th/0206135.pdf, page 4) remains a promising avenue of investigation and should be rigorously pursued as an alternative to superstring.

Relevant technical trivia

Sophus Lie invented Lie symmetry group theory in 1874 and William Clifford invented Clifford algebras in 1876.  For the purposes of particle physics (but not necessarily math de la rigor mortis), since Spin(n) is a double-cover of SO(n), the two share the same Lie algebra and fit together as local isomorphisms geometrically.  From the perspective of the number patterns involved, as utilized in particle physics, the following useful isomorphisms or equivalences hold:

Spin(2) = U(1) = SO(2)

Spin(3) = Sp(1) = SU(2) = SO(3)

Spin(4) = SU(2) × SU(2) = Sp(1) × Sp(1) = SO(4)

(It’s not always mathematically rigorous to treat an isomorphism as a strict equality, however useful it is in physics.  For example, E = mc^2, if literally true, would imply that 9 × 10^16 Joules of energy has exactly the same price as 1 kilogram of manure.  If Einstein was literally asserting a simple equivalence, we could substitute or sell one for the other in that exact ratio.  Since nobody will buy 1 kilogram of manure for the same price as 9 × 10^16 Joules of energy, it’s obvious that the conversion equivalence is not always as simple as that.  Similarly, the equation 1 + 1 = 2 taken naively would suggest that two halves of a wedding cake are the same value as a whole wedding cake.  It’s obviously not true.  If you chop 10 feet of rope into 10 separate 1 foot sections, you still have literally “10 feet of rope”, but it may be of far less value to a sailor.  The point is, any equivalence in general may only have a limited range of exact validity, like an analogy between the similarities of different systems.  Two halves of a car are less useful than one whole car.  This is so obvious that it is omitted from arithmetic, but this logical “reductionist problem” can cause problems in more abstract areas of science where things are not so obvious, and so you need to be far more careful.)
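This caveat can be made concrete for the first entry in the list: the SU(2) = Spin(3) commutation relations can be checked numerically, and the double-cover (rather than strict equality) relation Spin(3) → SO(3) shows up as a 2-pi rotation giving minus the identity. A sketch using the Pauli matrices:

```python
import numpy as np

# Pauli matrices: generators of SU(2) = Spin(3)
sigma = [np.array([[0, 1], [1, 0]], dtype=complex),
         np.array([[0, -1j], [1j, 0]], dtype=complex),
         np.array([[1, 0], [0, -1]], dtype=complex)]

def eps(i, j, k):
    """Levi-Civita symbol for indices 0, 1, 2."""
    return (i - j) * (j - k) * (k - i) / 2

# su(2) commutation relations: [s_i, s_j] = 2i eps_ijk s_k
for i in range(3):
    for j in range(3):
        comm = sigma[i] @ sigma[j] - sigma[j] @ sigma[i]
        expected = sum(2j * eps(i, j, k) * sigma[k] for k in range(3))
        assert np.allclose(comm, expected)

# Double cover: a full 2*pi rotation, exp(-i*(theta/2)*s3) at theta = 2*pi,
# is -I rather than +I, which is why Spin(3) -> SO(3) is 2-to-1
# rather than a strict group equality.
theta = 2 * np.pi
U = np.diag(np.exp(-1j * (theta / 2) * np.array([1, -1])))
assert np.allclose(U, -np.eye(2))
print("su(2) algebra and Spin(3) double cover verified")
```

A spinor picks up a sign under a full rotation even though the corresponding SO(3) rotation is the identity, which is exactly the limited-validity-of-equality point made above.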

With the above isomorphism

SU(2) × SU(2) = SO(4)

where SO(4) is used to produce Woit’s U(2) electroweak particle charges, we can represent weak interactions by one SU(2) with massive bosons, and the other SU(2) as a hidden electrodynamics symmetry with massless bosons.  A mutual magnetic self-inductance mechanism (which prevents the one-way motion of charged massless bosons, but not of massive charged bosons) then eliminates the charge-transfer quadratic term in the Yang-Mills equations, reducing them to Maxwell’s equations, so that this SU(2) appears like the familiar Abelian U(1) Maxwell electrodynamics theory.
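The group-theoretic half of this claim, SU(2) × SU(2) = SO(4), can at least be verified at the Lie algebra level: the self-dual and anti-self-dual combinations of so(4) generators give two mutually commuting copies of su(2). A minimal numerical check:

```python
import numpy as np

def L(a, b, n=4):
    """so(n) generator rotating the a-b coordinate plane (0-indexed)."""
    M = np.zeros((n, n))
    M[a, b], M[b, a] = 1.0, -1.0
    return M

# Rotation generators J_i and fourth-axis generators K_i of so(4)
J = [L(1, 2), L(2, 0), L(0, 1)]   # rotations in the 3 spatial planes
K = [L(0, 3), L(1, 3), L(2, 3)]   # rotations mixing in the 4th axis

# Self-dual / anti-self-dual combinations: two candidate su(2) algebras
Mgen = [(J[i] + K[i]) / 2 for i in range(3)]
Ngen = [(J[i] - K[i]) / 2 for i in range(3)]

comm = lambda A, B: A @ B - B @ A
for i in range(3):
    for j in range(3):
        # The two sets commute with each other: independent su(2) factors
        assert np.allclose(comm(Mgen[i], Ngen[j]), 0)
# Each set closes on itself with su(2) structure constants
assert np.allclose(comm(Mgen[0], Mgen[1]), -Mgen[2])
assert np.allclose(comm(Ngen[0], Ngen[1]), -Ngen[2])
print("so(4) = su(2) + su(2) verified at the Lie algebra level")
```

This is of course only the standard algebra decomposition; whether the mass/masslessness assignment of the two SU(2) factors works as described is the speculative part.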

Backing this SU(2) electrodynamics up further, the three Pauli matrices of SU(2) isospin are extremely similar in basic structure to the two gamma matrices of Dirac, with the third Pauli matrix being equivalent to Weyl’s chiral spinor:

SU(2) electrodynamics spinor. The sigma components in the second Dirac gamma matrix are themselves given by the SU(2) Pauli matrices, a fact which has helped to confuse the simplicity of this SU(2) electrodynamics symmetry. Dirac’s omission of chiral handedness from QED was later corrected by the addition of a chiral spinor by Weyl, yet the hype of Dirac’s work and the initial obscurity of Weyl’s (Pauli dismissed Weyl’s prediction of chiral effects until 1957, when the left handed nature of weak interactions was discovered experimentally) turned half-baked initial ideas into a dogma, which resists correction to this day.  The standard textbook approach to the standard model is that Dirac’s equation gives the lagrangian for massive fermions, and Weyl’s spinor only comes into play for “massless fermions” (formerly believed to be neutrinos).  But this is falsified by the empirical observation that neutrinos change flavor as they propagate and therefore have mass, despite engaging in weak left-handed interactions.  So it does appear that massive fermions can contradict Pauli’s ad hoc parity conservation law, and thus Weyl’s handedness spinor applies to massive particles.  Electrodynamics is an SU(2) theory; the fact that you need 4 polarizations (not two, as for onshell photons) for electromagnetic gauge bosons (offshell photons) to mediate attractive and repulsive forces in QED should make this clear, but it is currently camouflaged by “proud statements” of the sort: “nobody understands quantum mechanics”, which are today used as an excuse to censor out progress.
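The structural relation between the Pauli matrices and the Dirac gamma matrices described in this caption can be exhibited directly. One standard way (a sketch, using the Weyl chiral basis, where the gamma matrices are built block-wise from the Pauli matrices and the chirality operator gamma^5 comes out diagonal):

```python
import numpy as np

I2 = np.eye(2, dtype=complex)
s = [np.array([[0, 1], [1, 0]], dtype=complex),      # Pauli sigma_1
     np.array([[0, -1j], [1j, 0]], dtype=complex),   # Pauli sigma_2
     np.array([[1, 0], [0, -1]], dtype=complex)]     # Pauli sigma_3

def offdiag(A, B):
    """4x4 block matrix [[0, A], [B, 0]] built from 2x2 blocks."""
    Z = np.zeros((2, 2), dtype=complex)
    return np.block([[Z, A], [B, Z]])

# Weyl (chiral) basis: gamma^0 = [[0, I], [I, 0]], gamma^i = [[0, s_i], [-s_i, 0]]
gamma = [offdiag(I2, I2)] + [offdiag(s[i], -s[i]) for i in range(3)]

# Clifford algebra: {gamma^mu, gamma^nu} = 2 eta^{mu nu}, eta = diag(+,-,-,-)
eta = np.diag([1.0, -1.0, -1.0, -1.0])
for mu in range(4):
    for nu in range(4):
        anti = gamma[mu] @ gamma[nu] + gamma[nu] @ gamma[mu]
        assert np.allclose(anti, 2 * eta[mu, nu] * np.eye(4))

# gamma^5 = i g0 g1 g2 g3 is diagonal in this basis: it projects the
# left and right handed (Weyl) spinor components directly.
g5 = 1j * gamma[0] @ gamma[1] @ gamma[2] @ gamma[3]
print(np.round(g5.real))  # diagonal -1, -1, +1, +1
```

In this basis the Dirac spinor visibly splits into two 2-component Weyl spinors of opposite handedness, which is why the Pauli matrices sit inside the gamma matrices in the first place.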

The rationale for including Weyl’s chiral spinor in QED (not just in weak theory) goes back to Maxwell himself, who argued that the fixed direction of curl of the magnetic field circling around moving electrons (or a wire carrying a current) is evidence for a chiral handedness of spin: Maxwell had a spin angular momentum transfer (spinning vortex or “gear box”) model for the mediation of magnetic forces through space.  Abstract gauge theory today needs to properly replace Faraday’s old field line theory, or Einstein’s curved spacetime theory, with a mechanism for force production by gauge boson exchange.  The Casimir force is an example: two conducting metal plates exclude from the space between them the virtual photons of wavelengths longer than the distance between the plates.  Therefore, there is a deficit in the cut-off spectrum of wavelengths exerting pressure between the Casimir plates, compared with the full spectrum that is pushing them together from the surrounding space.  So the net effect is that they get pushed together.  Extending this, it’s easy to see that if electrodynamics is SU(2), the magnetic curl (self-inductance problem) for massless charged bosons only allows the exchange of charged bosons between similarly charged particles (thus causing them to repel); opposite charges can’t exchange charged bosons, because the geometry of the magnetic vectors of the exchanged bosons is such that they don’t cancel out but add together instead (so this exchange is impossible due to the uncancelled, infinite magnetic self-inductance of the charged bosons).  In summary, similar charges repel by exchanging charged bosons, while opposite charges can’t exchange them and are hence pushed together by a Casimir-type “attraction” mechanism.
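The Casimir example above is quantitative: for ideal parallel plates the standard result is an attractive pressure P = pi^2 hbar c / (240 d^4). A sketch putting numbers to the “pushed together” mechanism:

```python
import math

hbar = 1.054571817e-34  # J*s (CODATA)
c = 2.99792458e8        # m/s (exact)

def casimir_pressure(d):
    """Attractive Casimir pressure between ideal parallel plates, in Pa.
    P = pi^2 * hbar * c / (240 * d^4).  Virtual-photon modes with
    wavelengths longer than d are excluded between the plates, so the
    full external spectrum pushes the plates together."""
    return math.pi**2 * hbar * c / (240.0 * d**4)

# At 1 micron separation the pressure is around a millipascal,
# which is small but has been measured in the laboratory.
print(f"{casimir_pressure(1e-6):.2e} Pa")  # ~1.30e-03 Pa
```

Note the steep d^-4 dependence: halving the gap multiplies the pressure by 16, which is why the effect only becomes measurable at sub-micron separations.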

Professor Edsger Wybe Dijkstra (1930-2002), The strengths of the academic enterprise, EWD 1175, University of Texas, 9 February 1994:

“In the wake of the Cultural Revolution and now of the recession I observe a mounting pressure to co-operate and to promote ‘teamwork’.  For its anti-individualistic streak, such a drive is of course highly suspect; some people may not be so sensitive to it, but having seen the Hitlerjugend in action suffices for the rest of your life to be very wary of ‘team spirit’.  Very.  I have even read one text that argued that university scientists should co-operate more in order to become more competitive….. Bureaucracies are in favour of teamwork because a few groups are easier to control than a large number of rugged individuals.  Granting agencies are in favour of supporting large established organizations rather than individual researchers, because the support of the latter, though much cheaper, is felt to be more risky; it also requires more thinking per dollar funding.  Teamwork is also promoted because it is supposed to be more efficient, though in general this hope is not justified. … the co-operation seems more to force the researchers to broaden their outlook than to increase the efficiency of the research. … everybody complains about the amount of red tape … Why should a vigorous, flourishing department seek co-operation when it is doing just fine all by itself? It is the weak departments that are more tempted to seek each other’s support and to believe that there is might in numbers.  But such co-operation is of course based on the theory that, when you tie two stones together, the combination will float.”

Update (17 December 2013):

As was predictable, Woit is now deemed by Business Insider a “famous math professor” for his pretty much worthless criticism of Zuckerberg.  All Woit is doing in the “criticism” is a disfavor to physics: he effectively prevents himself from being considered a possible recipient of such prizes, and thereby prevents his own theory from being funded with the sort of money required for the media hype needed for it to replace string theory as a major research direction!

Update (3 Jan 2014): To his credit, Peter Woit has taken a stand against the cringeworthy, self-imposed, money-making-aimed self-censorship of fashion-dominated journals: “The policy of Physics Today to charge $30 to look at an article seems to have no point other than to ensure that no one does it.”

Real “freedom” of the press (internet) for everyone, or “intellectual communism” – as contrasted with the anti-“intellectual communism” of the pro-Marxist “financial communism” of anti-capitalist people in the BBC/Guardian/left wing, who are paid subsidies or USSR-style taxation funding for issuing biased “information” or propaganda (analogous to opinions of the Witten M-theory variety, dressed up as facts) – is the number one “problem” for “journalism” in the internet age.  How do journals and journalists retain their elitism when anybody is now free to circumvent their groupthink censorship?  The whole idea of “freedom of the press” is a complete lie: see the secret UK National Archives file PREM 19/1394, released on 3 January 2014, which contains a 22 May 1984 report by Sir James Goldsmith for the Defense Strategy Forum of the National Strategy Information Center, Soviet Active Measures versus the Free Press: A European Perspective, stating:

“Then comes the outer layer consisting of those who follow fashion and seek easy praise. Responsible journalists can also be disinformed by these campaigns. When a journalist works on an article, he refers to the press cuttings file which covers the subject about which he is writing. Information … will be used over and over again. So, once the press cuttings files have been polluted by propaganda, the false information will be repeated quite innocently and as it is repeated will gather further credibility and momentum. … Here are some thoughts … We need … better journalism. The better informed the public, the better equipped it is … The trouble with today’s intellectual environment is that few dare discuss the problem. … It is taboo. … It is a genuine problem which needs free and open discussion. … in a free country the best remedy is wide publication of the true facts. … journalists should investigate and publish. But they face a problem. There is a tradition of forbidden areas. Dog must not eat dog. Not only is it unpopular to expose a colleague or a journal, but it is also difficult to find papers who would publish your material. Investigation should not be concentrated on the unpopular. It takes no courage to be fashionable, to express conventional wisdom and comfortably to join the pack in attacking the same wounded stag. Courage resides in saying the truth that does not please and which can make you a pariah in the eyes of your peers. This precisely is the duty of the press and one of the great justifications for the freedom of the press.”

The usual defense of a “free press” is totally wrong: it was the mass press in Britain in the 1930s which rubbished, ridiculed and censored Churchill’s warnings about Hitler and the Nazis, instead playing the song of appeasement and “collaboration” for pacifist utopia, with Nobel Peace Prizes all round for pacifists like Sir Norman Angell.  The mass media is professional, which means its profession (money-making) relies on being fashionable: money corrupts the professional journal or journalist (who won’t sell papers or TV time, or be paid, if unpopular), just as it corrupts the professional politician (who won’t get elected and paid unless he is popular enough to get votes) or the professional scientist (who won’t get paid unless he gets sponsorship).  Real freedom and equality of speech on facts, a “communism of thought”, is opposed by precisely the bigots who are professional liars: the “Marxist communists” who want not a communism or freedom and equality of ideas, but a communism of money.  They want to censor people for any reason other than fact (because they have no facts to defend themselves with).  “Marxist communists” want to dictate opinions but never to listen to facts.  They are professional (money-making) quacks.  God knows how long they will continue to be lauded.  What’s wrong is allowing freedom of speech on unsubstantiated opinion while permitting fashionable bigots to censor facts that contradict their popular opinions.  There are many ways to sort this problem out.  Bullets.  Vitriol.  A censorship of opinion to clear a breathing space for an airing of facts.  A discrimination between opinion and facts based on objective evidence.
These methods traditionally “don’t work” because they don’t maintain hegemony; in other words, they’re like the Ancient Greek method of democracy (daily referendums on issues, not a choice between two near-clone parties once every four years), which was considered too “volatile” or “insecure” by the founders of dictatorship or “modern democracy”.  (Daily referendums are perfectly possible logistically and technologically, using the same systems as the secure databases that allow millions of people to safely access online bank accounts daily.)  If you look at how modern “democracy” works, with people forced to start campaigns and effectively fight propaganda wars for years against the status quo for every tiny revision to nonsensical groupthink-error laws, it’s very similar to dictatorship.  Its whole aim is to hinder change as much as possible, not to aid or objectively facilitate it!  No wonder people get tired of political propaganda.  Politics, like string theory, attracts the Stalinist mind-set.

Sir Basil Henry Liddell Hart, Why Don’t We Learn from History?, PEN Books, 1944; revised edition, Allen and Unwin, 1972:

“If a man reads or hears a criticism of anything in which he has an interest, watch whether his first question is as to its fairness and truth. If he reacts to any such criticism with strong emotion; if he bases his complaint on the ground that it is not in ‘good taste,’ or that it will have a bad effect – in short, if he shows concern with any question except ‘is it true?’ he thereby reveals that his own attitude is unscientific. Likewise if in his turn he judges an idea not on its merits but with reference to the author of it; if he criticizes it as ‘heresy’; if he argues that authority must be right because it is authority; if he takes a particular criticism as a general depreciation; if he confuses opinion with facts; if he claims that any expression of opinion is ‘unquestionable’; if he declares that something will ‘never’ come about, or it is ‘certain’ that any view is right. The path of truth is paved with critical doubt, and lighted by the spirit of objective enquiry… We learn from history that in every age and every clime the majority of people have resented what seems in retrospect to have been purely matter of fact … We learn too that nothing has aided the persistence of falsehood, and the evils resulting from it, more than the unwillingness of good people to admit the truth … Always the tendency continues to be shocked by natural comment, and to hold certain things too ‘sacred’ to think about. I can conceive no finer ideal of a man’s life than to face life with clear eyes instead of stumbling through it like a blind man, an imbecile, or a drunkard – which, in a thinking sense, is the common preference. How rarely does one meet anyone whose first reaction to anything is to ask: ‘is it true?’ Yet, unless that is a man’s natural reaction, it shows that truth is not uppermost in his mind, and unless it is, true progress is unlikely.”  (Emphasis added.)

The BBC, the Guardian newspaper, and others who read their copy unfailingly manage to swallow the liars’ propaganda (hook, line and sinker), thus taking the wrong side, because journalists and their readers always find fiction more appealing and saleable (££££$$$$, money) than facts!  Thus, they prefer utopian hopeful fantasies to tough reality. They are ideologues who want to believe in contrived propaganda that reinforces their ideals:

“… fashionable trends of thought and ideas are carefully separated from those which are not fashionable … what is not fashionable will hardly ever find its way into periodicals or books or be heard in colleges.  Legally your researchers are free, but they are conditioned by the fashion of the day.  There is no open violence such as in the East; however, a selection dictated by fashion and the need to match mass standards frequently prevent independent-minded people from giving their contribution to public life. There is a dangerous tendency to form a herd, shutting off successful development. I have received letters in America from highly intelligent persons, maybe a teacher in a faraway small college who could do much for the renewal and salvation of his country, but his country cannot hear him because the media are not interested in him. This gives birth to strong mass prejudices, blindness, which is most dangerous in our dynamic era.”

- Aleksandr Solzhenitsyn’s 1978 Harvard address (section discussing the dictatorship by fashion in the Western media).

Higgs censorship/dictatorship/fashion/consensus non-science dogma of 125 GeV weak and electromagnetic decaying spin-0 bosons

“It would be unrealistic to believe that dogmatism in science ended … flagrant examples [are] the Nazi doctrine of Aryan racial supremacy and the Communist credo of dialectic materialism … less publicized instances … are known in every discipline in small or large degree. Every area of knowledge at the present time has its ‘big names’ whose opinions in science … prevail over the views of lesser lights just because they are recognised … Dogmatism is a frequent concomitant of a systematized creed and a well-institutionalized priestly hierarchy … unified control with a discipline that is dedicated to its unquestioning support. This condition directly parallels the requirement for authoritative secular administration. … there be only one source of truth … the source be afforded enough power to enforce its dictates. … Heretical views may not be tolerated … because they threaten the economic and the ideological commitment …”

Professor H. G. Barnett, Innovation: the Basis of Cultural Change, McGraw-Hill, New York, 1953, pages 69-70.

Irving L. Janis, Victims of Groupthink, Houghton Mifflin, Boston, 1972:

“I use the term “groupthink” … when the members’ strivings for unanimity override their motivation to realistically appraise alternative courses of action.”(p. 9)

“… the group’s discussions are limited … without a survey of the full range of alternatives.”(p. 10)

“The objective assessment of relevant information and the rethinking necessary for developing more differentiated concepts can emerge only out of the crucible of heated debate [to overcome inert prejudice/status quo], which is anathema to the members of a concurrence-seeking group.”(p.61)

“Eight main symptoms run through the case studies of historic fiascoes … an illusion of invulnerability … collective efforts to … discount warnings … an unquestioned belief in the group’s inherent morality … stereotyped views of enemy leaders … dissent is contrary to what is expected of all loyal members … self-censorship of … doubts and counterarguments … a shared illusion of unanimity … (partly resulting from self-censorship of deviations, augmented by the false assumption that silence means consent)… the emergence of … members who protect the group from adverse information that might shatter their shared complacency about the effectiveness and morality of their decisions.”(pp.197-8)

“… other members are not exposed to information that might challenge their self-confidence.”(p.206)

Don’t be fooled: we’re not arguing that censorship is wrong or that individualism is right, but that subjective censorship is wrong (we need more objective censorship, i.e. fewer authority-based dismissals of hard evidence, and more technical, fact-driven debate rather than debates driven by the mere opinions of famous bigots or personalities who act as “expert” authorities asserting lies), and that socialism in science can only work if heated debates are allowed to break down Hitler-type eugenics pseudoscience fantasies. The present version of socialism used in science protects bigots by (1) ignoring polite statements of facts and (2) censoring more assertive statements of facts as being “rude”, precisely the Nazi eugenics “hard words make wounds” censorship technique. (The idea of “penetrating” the existing regime in disguise to force revolutionary change is like saying Churchill in 1935 should have volunteered to serve as a Nazi concentration camp guard in order to try to destroy eugenics pseudoscience from within. Bad, rather than good, comes from collaboration with bigots.)

The only way to make real progress is not to assert individualism or to ban censorship, but to ban bigotry within socialism and to enforce fact-based rather than dogma-based censorship. We need more censorship in science of the fact-based type, to get rid of existing dogmatic eugenics-type pseudosciences (the incremental progress side of science, which fills the journals up with politically-correct trivia, such as adding more and more epicycles to mainstream pseudosciences). We need more socialism in science of the unbigoted type, with heated debates rather than dictatorship by Stalin-like bigots (who claim they are morally and ethically “maintaining nice politeness” in debates by sending “rude” critics into exile or worse).

“… prizes only give one view of how science is done. They encourage the idea that the typical manner of progress in science is the breakthrough of a lone genius. In reality, while lone geniuses and breakthroughs do occur, incremental progress and collaboration are more important in increasing our understanding of nature. Even the theory breakthrough behind this prize required a body of incrementally acquired knowledge to which many contributed.”

- Jon Butterworth, Guardian, 8 October 2013

The fascist Catch-22 “Godwin law” states that people must never learn the lessons of eugenics groupthink ideologue pseudoscience, because (1) until 6 million defenseless people have been massacred in the name of moralistic eugenics lies, all comparisons with the Nazi regime are inappropriate, and (2) after another 6 million defenseless human beings have been massacred by pseudoscience, it is too late to prevent that tragedy.  If you warn that eugenics is murdering people and is equivalent to the 1920s-1930s Nazi eugenics fashion hate campaign, backed by eugenicists like the famous Medical Nobel Laureate Alexis Carrel, whose 1935 bestseller Man the Unknown argued for gas chamber eugenics on pseudo-moralistic grounds, you are attacked with the false, contrived and dogmatically asserted “Godwin law” argument that until a pseudoscience actually succeeds in murdering as many people as the Nazi regime had by 1945, all comparisons with the Nazi regime in its earlier stages (when it could have been stopped without a shot, as Churchill argued) are “inappropriate” or “misleading”.  Godwin, of course, is just inventing a false argument which prevents the lessons of eugenics pseudoscience from being widely comprehended and applied.  These ideologues seek to prevent rational discussion, and all progress based on hard facts, by censoring it out as “rude” or “boring”, because it contradicts their dogmatic religion of hatred.

This is of course complete nonsense, because in science, all forms of groupthink i.e. censorship for the benefit of Marxist or fascist unity or racist/eugenics/flat earth conspiracy (sometimes called “socialism”) by definition lead either to the censorship of “unlikely” alternative ideas (some of which are proved right by history), or to warfare or “damaging controversy” if the alternatives have any popular credibility (which counts not a cent towards determining whether they are experimentally and theoretically helpful or useful data-summarizing empirical models for future progress, or just another caloric/phlogiston or flat earth dogma of the mainstream that takes centuries to debunk but is then recorded by deluded historians as being some kind of proof of the danger not of consensus science, but of alternative ideas).

All you ever get from socialist peer-review is a conspiracy to suppress progress unless it comes from a member of the same tribe, effectively a member of the same trade union; and even then it has to be “exciting” news, not “depressing” news that the reigning theory is BS, to be welcomed.  In other words, dogmatic bias is held to be sacred, not facts.  When it comes to groupthink, it is by definition impossible to make any revolutionary advance without overthrowing the status quo.  Socialist groupthink sees only danger in revolutionary advances.  What happens to all the scientists working on the present paradigm?  But this question doesn’t arise directly.  Anything revolutionary is sneered at automatically, simply because it is revolutionary.  Name a single revolutionary advance in science that ever occurred due to incremental progress, and you are just contriving a contradiction in terms.  By definition, revolutionary advances aren’t incremental.

What Butterworth means by “the typical manner of progress in science” is trivia.  His argument is that mundane trivia, the invention of yet more epicycles within a dogmatic religion of cosy socialist conference proceedings, is more typical of progress than revolutionary advances.  That’s a piece of revisionism, like saying that the gas chamber operators of the holocaust were more to blame than the revolutionary eugenicists.  Sorry, Butterworth, but you’re not telling the truth.

Butterworth and the loss-making Guardian newspaper’s propagandist editor and publisher (or whoever supports his Guardian-headed articles, maybe a web editor) totally neglect the fact that “lone geniuses” are for the most part not mavericks or fairy-tale loners, but men and women of principle (Curie and Noether being examples) who do what they know is worthy of research, DESPITE the lack of cosy peer support in any way, shape or form.  Butterworth paints a racial-type stereotype of the lone scientist as someone who chooses to be a loner, rather than as someone who chooses to follow up the evidence, regardless of heresy or taboo, of social exclusion, of ridicule, of abuse, of censorship, of hatred.

Comment on the Higgs Nobel Prize

This is a puzzling Nobel Prize decision in the sense that it merely assumes, without proof, that the 125 GeV spin-0 massive boson decays confirm the prediction, just as the muon (a lepton) was on its discovery initially believed to be the Yukawa strong force mediator, which led to a Nobel prize for Yukawa. (The pion was discovered later, after his Nobel prize, confirming his theory; but it might not have been, if the theory had been wrong.)

While Higgs and others certainly did work of value, the danger is that the foundations of the Standard Model will now be assumed to be proved real and there will be ever more censorship of alternative variants of basic ideas. Certainly the SU(2) weak interaction is beyond question in my opinion, but the Abelian U(1) electrodynamics model is a contrived piece of nonsense because you can’t represent electromagnetism by one charge, its anticharge and a single photon; you need 2 extra “polarizations” on the photons to account for electromagnetic photons, which can be considered charges. It turns out you can have an SU(2) electrodynamics and a U(1) quantum gravity, considerably changing the meaning of “electroweak symmetry” in the SM (viXra:1111.0111).

The whole basis of prize-giving seems to be to encourage and reward groupthink, conformity, and censorship. I’ve always believed in one myth: that science is different from fashion and politics. Apart from making a fortune supplying dynamite to both sides in the Crimean War, Nobel’s legacy of rewarding ideas after they have become dogmatic consensus is a toxic poison for nascent science.

Quantum mechanics from quantum field theory

Image

Fig. 1: the “not equal to” signs above are controversial, because according to status quo (dogma), exp(-iHt/hbar) from quantum mechanics is precisely equal to the path integral over exp(iS/hbar).  In fact, it is commonplace in quantum field theory to treat them as mathematically equal, so you Fourier-transform an action (calculated by integrating a quantum field lagrangian over time) into a hamiltonian to allow easy calculations (avoiding the path integral).  Zee, for example, claims to do path integrals while actually only using a single-path evaluation, exp(-iHt/hbar), and then claims that his non-multipath, non-path-integral mathematics is proof that the graviton travels along the path of least action, i.e. directly between the two masses which “attract”, proving that gravitons are spin-2.  By using as an input assumption the very thing you are trying to prove, you don’t prove anything other than that you are using a circular argument.  1st quantization quantum mechanics effectively states that there is only a single wavefunction for every “particle”, and this wavefunction’s amplitude is proportional to the complex number exp(iS/hbar) = cos(S/hbar) + i sin(S/hbar), which reduces to simply cos(S/hbar) if you just want probabilities (from squaring the wavefunction and normalizing) and you aren’t interested in polarization vectors (which can be represented using the complex component; the angle with respect to the real axis on the Argand diagram is zero in non-polarization situations).  You would, however, want to use the polarization vector when scattering two particles which each have a spin.  The spin vectors interfere either constructively or destructively, affecting the resultant cross-section in the S-matrix.
The mechanism for this spin effect is magnetic fields, but mechanisms are not usually discussed (preference is given to actually making calculations, rather than contriving a system of mechanisms that gives an understanding of what happens; likewise, a magician who starts off by explaining his tricks is less revered than one who deceives).  Polarization effects also exist with light photons, and the complex amplitude is needed in the optical theorem, which is so beloved by Jacques Distler.  In the atom, electrons are paired with opposite spins, so complex vectors are used for the spin polarization.  However, for calculating probabilities where spin vectors do not affect the S-matrix, the wavefunction amplitude exp(iS/hbar) effectively reduces to its purely real component, cos(S/hbar).
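The difference between a single-path evaluation and a genuine sum over paths of exp(iS/hbar) can be illustrated numerically. In this toy sketch (the quadratic “action” and its coefficient are invented purely for illustration), amplitudes from paths near the stationary-action path add coherently, while amplitudes from paths far from it have rapidly varying phases and cancel:

```python
import cmath

hbar = 1.0                     # natural units for the toy model

def action(d):                 # toy action: stationary (dS/dd = 0) at d = 0
    return 50.0 * d**2

def path_sum(center):
    # Sum the amplitudes exp(iS/hbar) over a bundle of 21 paths,
    # indexed by their deviation d from the classical path.
    ds = [center + 0.01*i for i in range(-10, 11)]
    return sum(cmath.exp(1j * action(d) / hbar) for d in ds)

near = abs(path_sum(0.0))      # bundle around the least-action path: phases agree
far = abs(path_sum(2.0))       # bundle far from it: phases spin and cancel
print(near, far)
```

This stationary-phase cancellation is why classical least-action behaviour emerges from the path integral for large actions, without assuming a single path at the outset.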

 

The electron doesn’t “orbit” the nucleus classically, because the Coulomb field that binds it to the nucleus isn’t classical, but is instead composed of discrete interactions due to the exchange of off-shell photons (observable only through the forces they impart), called “field quanta”.  Feynman explains this breakdown of classical mechanics in the case of an atom using the path integral in his 1985 book QED (which Jacques Distler, in a brief discussion with me, kindly confused with Feynman’s 1965 book on path integrals – not mechanisms for quantum mechanics and optics – coauthored with Albert Hibbs).  Now, how does quantum field theory (2nd quantization, i.e. indeterminacy caused by random, chaotic field quanta impacts on particles) differ from “quantum mechanics” (1st quantization, intrinsic indeterminacy/magic/non-mechanism dogma/“complementarity principle”/Bohring physics)?  Answer: multipath interference. In the double slit experiment, a transversely spatial photon (behaving like a skywave HF “radio wave” undergoing multipath reflection from different charged regions in the ionosphere) goes through two slits.  The part of the spatially extended (i.e. transverse) photon which goes through the slit with the shorter path length to the screen arrives at the screen first (because the distance is shorter), and is therefore slightly out of phase with the remainder of the photon, which goes through the other slit.  This causes the interference, just as it would with water waves which arrive out of phase and undergo amplitude (not energy) cancellation.  Energy is conserved.
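The two-slit phase argument can be sketched directly: the amplitude at each screen position is the sum of one phase factor per path, and the intensity is the squared magnitude of that sum. (The wavelength, slit separation, and geometry below are arbitrary illustrative values, not from the text.)

```python
import cmath
import math

wavelength = 500e-9                   # illustrative: green light
k = 2 * math.pi / wavelength          # wavenumber
slit_sep, screen_dist = 50e-6, 1.0    # illustrative geometry, metres

def intensity(x):
    # Path lengths from each slit to screen position x.
    r1 = math.hypot(screen_dist, x - slit_sep / 2)
    r2 = math.hypot(screen_dist, x + slit_sep / 2)
    amp = cmath.exp(1j * k * r1) + cmath.exp(1j * k * r2)  # sum over two paths
    return abs(amp)**2                # from 0 (out of phase) up to 4 (in phase)

vals = [intensity(i * 1e-4) for i in range(-100, 101)]
print(max(vals), min(vals))           # bright and dark fringes
```

Averaged over the fringe pattern, the intensity equals the sum of the two single-path intensities, which is the energy-conservation point made above: the amplitude cancels locally, the energy does not.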

Altogether it’s a complicated mechanism for indeterminacy.  But it’s reality.  So why is 1st quantization quantum mechanics – which ignores the mechanism of multipath interference for indeterminacy – still in wide use almost 30 years after Feynman debunked it?  Magic.  Interesting controversy.  Feynman makes this point in another book, where he writes of his battles with the physics textbook-buying committee in California.  They recommended a crass book which contained pictures of machines with the question underneath: “What makes it go?”  Feynman explained how enthusiastic he was on finding this approach – thinking that the book would then go on to explain the mechanism of each of the machines.  Instead, when he turned the page, the answer was: “Energy makes it go!”  Feynman explains that this is crass, because it conveys no information whatsoever.  If something stops, you can equally say that “energy makes it stop” (you use energy in applying the brake).

The basic problem with “energy makes it go” is analogous to “the Heisenberg indeterminacy principle makes the wavefunction collapse” in quantum mechanics: it omits the mechanism.

Gerard ‘t Hooft’s peer reviewer on quantum gravity paper

gravity

Entire content of Editor Gerard ‘t Hooft’s response (Foundations of Physics, FOOP-D-13-00186, “Planck data and correction of Frampton’s repulsive gravity”):

“[Dyslexic] Reviewer #1: In my opinion this manuscript has a contents [sic?] that is not correct. The author has a [sic] idea about repulsive gravity, but seems to neglect a general theorem concerning gravitational fields of a spherically symmetric source, namely that only the part of the source closer to the center than the field point contibutes [sic] to the gravitational field at the point.”

 

To spell out what’s wrong with this dyslexic “peer reviewer” report that Nobel Laureate ‘t Hooft relied on, here’s the equivalent “argument” against Einstein’s 1905 paper: “The author has an idea about relativity, but seems to neglect a general theorem by Isaac Newton about absolute space and time.”  Got it? The point is, principles and assumptions don’t count for nothing (to use bad grammar); all that counts is agreement with measurements.

You cannot assess a new principle by seeing whether it agrees with old prejudiced assumptions and “theorems” which are based on quicksand.  You need to see whether it agrees with experiment and observation, not subjectivity.

 

The whole point of the paper is to overturn the prejudice that gravity is due to innate attraction; there is no proved “general theorem” establishing the dogma that gravity depends only on the mass in the earth: it’s merely a (wrong) assumption.  It’s a good approximation for most purposes, but it fails to predict dark energy from gravitation, unlike the correct model!  The fact that ‘t Hooft passed this nonsense on to the friend who submitted my paper to ‘t Hooft’s journal shows that he is purely prejudiced and easily deluded, and/or doesn’t have enough time to do his job properly and actually read papers himself.

Let’s repeat the basic principles:

(1) Distant mass (supernovae etc) is observably accelerating: this is modelled in an ad hoc way using a small positive cosmological constant (dark energy gives an outward acceleration, Newtonian/Einsteinian gravity an inward acceleration in the big bang universe).  This is currently accepted evidence and isn’t in the least controversial.

(2) Here’s the controversial bit.  If we apply Newton’s 2nd and 3rd laws to the small (~ 10^-10 ms^-2) observed acceleration of the mass in the universe around us, we find an inward reaction force F = ma, and using the uncontroversial gravity cross-section we show that this “dark energy” predicts Newton’s law of gravitational attraction, by analogy to the Casimir force.  The measured weak force cross-sectional area is simply scaled to gravity using Feynman’s rules for calculating Feynman diagrams, where the cross-section is proportional to the square of the coupling.  All of this is empirical input, however.  There is nothing scientifically wrong with doing new things with existing, well-established empirical laws (Newton’s laws) and the empirical weak force cross-section and coupling!  Quite the opposite!  We should be doing new things, trying new calculations, and publishing successful results!

(3) The result is that we can calculate terrestrial gravitation accurately from dark energy.  This is new, because it predicts the gravitational coupling from lambda, the dark energy cosmological constant, or vice-versa.  The old, established approach is different: it says that dark energy and gravitation are different things, not the same.  It cannot calculate one from the other.  It treats the two numbers as unrelated.

(4) What we’ve done here is analogous to what Newton did when he combined or unified Kepler’s solar system laws with Galileo’s terrestrial gravitation.  Prior to Newton, the solar system was presumed (by Gilbert and also Kepler) to be held together by magnetic forces.  Newton did away with the magnetic force theory; he showed that the force that caused apples to fall was the same as that which caused the moon to stay in orbit around the earth, or the earth to orbit the sun.

What we’re doing is similarly eliminating one fictional force by showing it to really be another force in disguise: dark energy and gravitation are different aspects of the same thing. The apple is pushed down by the same fundamental gauge interaction that accelerates distant supernovae.
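The one number in step (2) that is straightforwardly checkable is the ~10^-10 m s^-2 cosmological acceleration itself, which is of the order of cH₀. The sketch below verifies only that order of magnitude, using an assumed round Hubble parameter of 70 km/s/Mpc; the F = ma step and the coupling-squared cross-section scaling are the paper’s argument and are not reproduced here:

```python
# Order-of-magnitude check of the ~1e-10 m/s^2 acceleration quoted in step (2),
# taking a ~ c * H0 with an assumed round Hubble parameter of 70 km/s/Mpc.
c = 2.99792458e8            # speed of light, m/s
Mpc = 3.0857e22             # metres per megaparsec
H0 = 70e3 / Mpc             # Hubble parameter converted to 1/s
a = c * H0                  # characteristic cosmological acceleration
print(a)                    # ~7e-10 m/s^2, consistent with the quoted ~1e-10 order
```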

Why we need objective censorship

Censorship, quality censorship, consists of constructive criticism, like requesting predictions and comparisons to reality.  Junk censorship or quackery consists of “no-go theorems”, contrived excuses to ignore innovations simply because they’re not in any textbooks and don’t follow from professional frontier research by Sean Carroll, or from a million-dollar Nobel Laureate like Alexis Carrel, who advocated gas chamber eugenics after being awarded a medical Nobel prize.  Prizes create a political system in science that in politics is called groupthink dictatorship. Of course, if you criticise a politician for behaving as a dictator in a so-called democracy, he will falsely claim that you’re “against democracy” (daily referendums, as in ancient Greece) and in favour of anarchy, instead of being in favour of real democracy.  He’ll very quickly “draw a line” under the discussion to prevent you from saying that in response, thus confirming that he really is a dictator.  The situation is then one where, to avoid being labelled “rude”, nobody is actually able to “say” that someone is a dictator who censors facts; or, at least, such remarks are simply censored out of videotape and transcripts for publication (precisely the situation with critics of dictatorship in “honest” dictatorships, i.e. those which don’t try to misleadingly use propaganda to paint themselves in the colours of democracy).  So all that we actually see in the fashion-bigoted media is a filtered version of what is actually going on.  That’s a dictatorship of fashion, censoring innovation in the name of preserving rationality.  The ultra-conservatives (in the sense of suppressing dissent) who censor most severely are of course those who declare themselves, falsely, to be most progressive: the Marxists, the socialists, the communists, and the fascists.
Those whom the left falsely label “ultra-conservatives” are ironically the real radicals, because they are unattached to the dogmatic dictatorship of bigotry, and so are more prepared than the thugs to listen to and evaluate alternative courses of action.

There are two types of controversy: fashionable and unfashionable.  The former generates money and excitement; the latter does the opposite.  To market research, it is a fact that you need to be fashionable.  To generate funding for research, you need to market ideas.  Therefore, you need to be fashionable.  Nobody cares what you do, as long as you keep out of the newspapers; otherwise, you court controversy and damnation.

Fashionable controversy -> media interest and funding from media-deluded billionaires
Unfashionable controversy -> no media interest and no funding from media-deluded billionaires.

Take Darwin’s Origin of Species, or Newton’s Principia, as contrasted with Maxwell’s Treatise on Electricity and Magnetism, for example.  Darwin and Newton both courted fashion.  Darwin in the first edition deliberately excluded the history of evolution: of course he was well aware of earlier theories of evolution which contained errors, but he did not frame his book as a defense of, or correction of, Lamarck’s flawed evolution theory.

Darwin was controversial in a more fashionable (media-interest generating) way than ordinary research.  The point is, the controversy that Darwin’s book catered to wasn’t the boring technical errors in the Lamarck theory of evolution (Lamarck’s evolution was more complex and wrong than Darwin’s: it claimed that evolution of species occurred magically through the inheritance of acquired habits, like neck stretching), but was instead the evidence for a very simple mechanism of evolution which was rejected out of hand by creationists.  Darwin didn’t even need to frame an argument with creationists in his book: merely presenting the evidence was enough to generate the controversy which gave publicity to his evidence!  That’s the point.

By seeming merely to present facts, without spelling out their implications for creationism or for Lamarck, Darwin came across as more objective, providing more effective fuel for existing media controversy than he would have done if his writings had spelled out moral implications.  Likewise, Newton did not review and correct his predecessors’ work and errors in Principia in 1687.  Instead, Newton just presented evidence.  He did not review the literature.  Einstein tended to follow this same approach of Newton’s, leading to criticisms that he did not provide enough literature references in his papers (Einstein’s key 1905 relativity paper contains no references at all).  The point is, the most fashionable way to be controversial is to appear totally uncontroversial, by making no mention of the existing situation.  However, as many have pointed out, Einstein’s paper could not be published today under the bigoted “research” dogmas that insist new papers build on existing prejudices in the subject, or at least cite them and discuss their errors.

Finally, we come to Maxwell, who is the exception.  Whereas Newton, Darwin and Einstein wrote classics by going against the “research” dogma of science, by ignoring in their key papers the histories of their subjects and the existing controversies, and presenting fresh evidence (search results, not “research”), Maxwell used the technique of research.  Maxwell’s Treatise on Electricity and Magnetism reviews the history of the subject in detail, adding new ideas within the historical discussion.  It is not the approach taken by Newton, Darwin, or Einstein.  It is full of footnotes giving obsolete literature references to research long since edited out of modern textbooks.

The point is, Maxwell’s book didn’t create a revolution, any more than Kepler’s long-winded astrology book on planetary motions (which claimed planets were bound to the sun by magnetism, that their motions were musical, etc.).  It was Oliver Heaviside who extracted and reformulated in vector calculus notation the “Maxwell equations”, from Maxwell’s 20 longhand differential equations.  This chronology, contrary to misinformed modern mathematical physicists, means that Maxwell could not have used an asymmetry in the “four” vector calculus equations as his basis for adding displacement current.  Maxwell added displacement current to allow for open circuits, e.g. a charging capacitor with a “vacuum dielectric”, not because he could see any asymmetry in his equations.  In any case, even Heaviside didn’t have just four vector calculus equations: he had five, the extra one being the conservation of charge.  For this reason, plus Maxwell’s support of the “aether”, his book is unfashionable even today.  Modern physicists prefer a sanitized description of Maxwell, written by Einstein.  Maxwell’s theory, it is said, is Maxwell’s equations.   Only they’re actually Einstein’s equations.

To summarize: (1) everyday research techniques (reviews of existing work) are boring and unfashionable compared to a non-research-style book presenting revolutionary new evidence minus history; (2) the history of science is always unfashionable and boring or diverting.  If Darwin (or Newton or Einstein) had presented the new evidence hidden within the context of a long-winded evaluation and correction of Lamarck’s theory of evolution, his book would have been a technical, boring PhD- or Maxwell-style research thesis which would not have been popular reading for the media or public, and it would therefore not have been a tool of popular and fashionable controversy.  (Lamarck was “old hat” to the media.)

Precisely because Darwin omitted unpopular technical controversy, he was able to court the more fashionable type of controversy needed to sell his ideas to the world.  By discriminating between unfashionable controversy (technical research trivia) and fashionable controversy (interesting and popular evidence), we can understand the kind of writing necessary to market new ideas successfully, obtaining sufficient funding to develop them usefully.  However, I don’t think it’s a disaster to produce densely written, compact, and sketchy technical reports as an interim stage in this process.

First, it helps to draft material and to establish a paper trail, so that the evolutionary improvement of the ideas and calculations is on record.  Secondly, it helps to obtain criticisms and to highlight areas that require reformulation or better presentation.

To start a scientific book by writing down a contents list and then filling in the chapter content is to start and finish with bias.  The contents of a scientific book should be determined by the research, not the other way around.  Science should be driven by evidence, rather than evidence being selected and contrived to fit theoretical dogma.  The contrast between a textbook, assembled to cater to the prejudices of a dogmatic exam syllabus, and a scientific book could not be greater.  It’s exceedingly easy to produce the outline for a textbook: you just look at the syllabus.  It’s harder to approach the problem scientifically, because of the many interconnections between different aspects of a subject.  If you are unifying two different forces, for example, it might not be possible merely to treat each force in a separate chapter, and then you have the difficulty of breaking material down into chapter-sized chunks without losing the connections.  So the key organizational problem for the revolutionary is a non-problem for the textbook writer.  The textbook writer has the basic chapter list on a platter from the exam syllabus, while the revolutionary science writer has the problem of deciding how to organize radically new evidence.

Zombies: Sean Carroll versus Jacques Distler

Occasionally, as in June 1941, a couple of dictators find themselves giving up their pretense of groupthink socialist unity, and try to overcome their differences using more constructive techniques than simply “agreeing to disagree”.  E.g., war.  This is then labelled in the media as some kind of disproof of dictatorship.  Their “logic” is that, if two thugs fight, then surely that proves that one side must be right and the other must be wrong?  So in June 1941, the fact that Hitler attacked Stalin (not vice-versa!!!) was seen in the Marxist-duped Western media to somehow elevate Stalin to sainthood (ignoring Stalin’s joint invasion of Poland with Hitler in 1939, and the Katyn forest massacre of 1940).  So anyone with useful ideas continues to be ignored, and the media goes from reporting on the dictators to worshipping one of the dictators while criticising the other.  No progress in media ethics ever occurs.

Unrelated to this political problem is Professor Jacques Distler’s wonderful defense of physics against the zombies perpetuated by the media-dominating Sean Carroll:

August 24, 2013

ZOMBIES

Normally, I wouldn’t touch a paper, with the phrase “Boltzmann brains” in the title, with a 10-foot pole. And anyone accosting me, intent on discussing the subject, would normally be treated as one of the walking undead.

But Sean Carroll wrote a paper and a blog post and I really feel the need to do something about it.

Sean’s “idea,” in a nutshell, is that the large-field instability of the Standard-Model Higgs potential — if the top mass is a little heavier than current observations tell us that it is — is a “feature”: our (“false”) vacuum will eventually decay (with a mean lifetime somewhere on the order of the age of the universe), saving us from being Boltzmann brains.

This is plainly nuts. How can a phase transition that may or may not take place, billions of years in the future, affect anything that we measure in the here-and-now? And, if it doesn’t affect anything in the present, why do I #%@^} care?

The whole Boltzmann brain “paradox” is a category error, anyway.

The same argument leads us to conclude that human civilization (and perhaps all life on earth) will collapse sometime in the not-too-distant future. If not, then “most” human beings — out of all the humans who have ever lived, or ever will live — live in the future. So, if I am a typical human (and I have no reason to think that I am atypical), then I am overwhelmingly likely to be living in the future. So why don’t I have a rocket car? To avoid this “paradox,” we conclude that human civilization must end before the number of future humans becomes too large.

The trouble is that there is no theory of probability (Bayesian, frequentist, unicorn, …) under which the reasoning of the previous paragraph is valid. In any theory of probability, that I know of, it’s either nonsensical or wrong.

Now where’s my shovel … ?

Posted by distler at August 24, 2013 2:10 PM

Carroll replies there in the comments section:

RE: ZOMBIES

In both the paper and the blog post I explain that our reasoning is quite different from the silly arguments rejected above. Naturally, taking the time to read them, understand the point, and engage constructively is a bit of effort with which not everyone will choose to bother.

Posted by: Sean Carroll on August 25, 2013 12:16 PM | Permalink | Reply to this

CONSTRUCTIVE ENGAGEMENT

How about this:

Your notion of “cognitive instability” is better-understood as the statement that a proper Bayesian, even if his priors strongly favour the hypothesis that he is a Boltzmann brain, will very quickly come to reject that hypothesis.

Call it survivorship-bias, if you wish, but Bayesians have no Boltzmann brain problem (and frequentists would reject the “problem” as nonsensical in the first place).

Posted by: Jacques Distler on August 25, 2013 12:37 PM | Permalink | PGP Sig | Reply to this

New paper: http://vixra.org/abs/1302.0004


RECENT COMMENTS BY GERARD ‘T HOOFT ON PEER REVIEWS OF THIS PAPER:

http://vixra.org/abs/1111.0111 was submitted (Foundations of Physics submission FOOP2945) to Gerard ‘t Hooft, Chief Editor of “Foundations of Physics”, who emailed on January 11, 2012: “Both the structure and the unduly high degree of speculativeness of the arguments presented in this manuscript place it outside the scope of Foundations of Physics.” This is precisely the opposite of the confirmed predictions based on facts which are given in the paper, and is precisely what the paper itself says about mainstream “string theory” trash hype, which contains no checkable predictions and is poorly structured, with a landscape of 10^500 metastable vacua. However, to remove all excuses, a briefer version, cut from 63 pages to 7 pages and now hosted at http://vixra.org/abs/1302.0004, with the detailed literature survey (including 43 references) completely removed, was prepared in order to focus concisely on the key prediction and its confirmed, factual basis (Foundations of Physics submission FOOP-D-13-00076). Gerard ‘t Hooft emailed again on 28 February 2013, reversing his original 2012 criteria: “The author of this manuscript fails to make clear how his work relates to current discussions in the foundations of physics. Regrettably, this fact places the current submission outside the scope of Foundations of Physics. This is displayed by a lack of references to recent literature.”

This contradicts the original submission, which did have a recent literature survey of 43 references (http://vixra.org/abs/1111.0111) and a very detailed discussion of how the new result overthrows “current discussions in the foundations of physics.” These 43 references were removed in the resubmission to force the peer reviewers to focus on the accuracy of the scientific calculations and their factual, defensible basis. First the man claimed that the discussion of the problems in existing research and the literature survey of 43 references had distracted him from seeing the factual basis of the confirmed predictions, and then when the references and literature discussions were removed, he reversed his argument and simply ignored the facts presented in the paper by complaining instead that the 43 references and literature discussion were now missing from the paper! This contradiction is due to contriving inconsistent and trivial reasons for ignoring the hard science in both papers.

However, we’ll improve the paper in an effort to reach a compromise and see what happens. Notice that the role of “Foundations of Physics” (and all other journals) is no longer to physically communicate science or data (which anybody can put on the internet), but is purely advertising/marketing/publicity/hype. With the internet available, nobody needs to publish in this or that journal/newspaper/TV show in order to directly make information physically available for people who actually want that information.

Instead, the role of these media is all about advertising or hyping a result, in other words, it is the purely unscientific, political act of making a song and dance out of science just to attract serious funding for further research. (Peer review politics is described in http://vixra.org/abs/1211.0156.)

It should be added that “Foundations of Physics” editor Gerard ‘t Hooft (who proved that the U(1) × SU(2) electroweak theory is renormalizable, since the infinite-momentum problem disappears in the UV or high-energy unbroken-symmetry limit where the SU(2) field quanta lose their mass, thus helping to solidify the current dogma that doesn’t include quantum gravity) is the author of misleading and unpredictive papers on QM, including “Determinism beneath quantum mechanics”, whose Abstract states:

“Contrary to common belief, it is not difficult to construct deterministic models where stochastic behavior is correctly described by quantum mechanical amplitudes, in precise accordance with the Copenhagen-Bohr-Bohm doctrine. What is difficult however is to obtain a Hamiltonian that is bounded from below, and whose ground state is a vacuum that exhibits complicated vacuum fluctuations, as in the real world. Beneath Quantum Mechanics, there may be a deterministic theory with (local) information loss. This may lead to a sufficiently complex vacuum state, and to an apparent non-locality in the relation between the deterministic (“ontological”) states and the quantum states, of the kind needed to explain away the Bell inequalities.”

He also states on page 1:

“The need for an improved understanding of what Quantum Mechanics really is, needs hardly be explained in this meeting. My primary concern is that Quantum Mechanics, in its present state, appears to be mysterious. It should always be the scientists’ aim to take away the mystery of things. It is my suspicion that there should exist a quite logical explanation for the fact that we need to describe probabilities in this world quantum mechanically. This explanation presumably can be found in the fabric of the Laws of Physics at the Planck scale. … Attempts to reconcile General Relativity with Quantum Mechanics lead to a jungle of complexity that is difficult or impossible to interpret physically. … What we need instead is a unique theory that not only accounts for Quantum Mechanics together with General Relativity, but also explains for us how matter behaves.”

The problem with what he writes is that he is ignoring Feynman’s solution in his 1985 book QED, which is that the “uncertainty principle” is just the result of multipath interference in 2nd quantization; i.e. you have a separate wavefunction amplitude (psi) for each potential interaction between an orbital electron and a Coulomb field quantum. There are numerous ways an orbital electron can interact with the Coulomb field quanta that bind it into its orbit. Each potential interaction has a wavefunction amplitude. To find the probability of an electron taking a particular path, you sum the wavefunction amplitudes for all the electron interactions with field quanta that would make it take that path, then you work out the sum of histories for all paths. You square the modulus of each result to get relative probabilities, then divide the result for the chosen electron path by the result for all possible paths to get the absolute probability. There is no reality to first quantization or the usual “quantum mechanics” hype with its “indeterminacy principle”: it is non-relativistic and only considers a single wavefunction amplitude for each onshell particle (e.g. only one wavefunction amplitude for each orbital electron). There is in reality no single wavefunction amplitude for an electron, so Schroedinger’s equation is misleading: there is a separate wavefunction amplitude for every potential interaction between an electron and a quantum of the Coulomb field (i.e., “field quanta”). The huge number of possible interactions have wavefunction amplitudes which mostly interfere and cancel out, unless they have very small action (in comparison to Planck’s constant over twice Pi, or h-bar).
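The summing procedure just described can be sketched numerically. This is a hedged illustrative toy, not taken from Feynman’s book: the action values are hypothetical numbers in units of h-bar, and the only point being illustrated is that paths with nearly equal small actions add constructively, while paths with large, widely spread actions mostly cancel.

```python
import cmath

def relative_probability(actions):
    """Sum the amplitude exp(i*S) over the contributing paths (S in
    h-bar units), then square the modulus of the resultant to get a
    relative probability, as described in the paragraph above."""
    amplitude = sum(cmath.exp(1j * s) for s in actions)
    return abs(amplitude) ** 2

# Paths with nearly equal, small actions interfere constructively...
coherent = relative_probability([0.00, 0.01, 0.02, 0.03])

# ...while paths with large, widely spread actions mostly cancel out.
incoherent = relative_probability([10.0, 17.3, 31.9, 45.2])

print(coherent > incoherent)  # True: small-action paths dominate
```

Dividing one route’s relative probability by the total over all routes would then give the absolute probability, per the recipe above.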

Feynman argued (QED, Princeton U.P., 1985) that multipath interference (i.e. the Coulomb field quanta of 2nd quantization) provides a simple mechanism to replace the uncertainty principle of non-relativistic 1st quantization. Why not go further in this direction and simply replace the usual complex path amplitude exp(iS) (where action S is in h-bar units) with just its real component, cos S? [Taken from Euler's equation: exp(iS) = i sin S + cos S.] When you think about it mathematically, exp(iS) is a vector on the complex plane (Argand diagram), and cos S is a scalar amplitude. All cross-sections and other observables calculated from a path integral [summing exp(iS) contributions] are real numbers, hence the resultant arrow must always be parallel to the real axis, so you get exactly the same result using exp(iS) or cos S. You aren’t losing complex plane directional information that has any use in the practical calculations of QFT. It seems that the only reason to stick to exp(iS) is historical, going back to Dirac’s derivation of exp(iHt) as the amplitude for a single wavefunction from Schroedinger’s equation, where the periodic real solutions produce the quantization. If you’re doing 2nd quantization, multipath interference for large path actions is the mechanism for quantization, so you don’t need Schroedinger’s equation (which is a non-relativistic approximation).
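The claim that cos S reproduces the exp(iS) result when the resultant lies on the real axis can be checked numerically. A minimal sketch, assuming a hypothetical set of path actions chosen symmetric about zero (in h-bar units) so that the imaginary parts cancel pairwise:

```python
import cmath
import math

# Hypothetical path actions, symmetric about zero so the resultant
# of exp(i*S) has no net imaginary part.
actions = [-1.3, -0.4, 0.0, 0.4, 1.3]

complex_sum = sum(cmath.exp(1j * s) for s in actions)  # exp(iS) amplitudes
cosine_sum = sum(math.cos(s) for s in actions)         # real cos S amplitudes

# The sin S contributions cancel pairwise, leaving a real resultant...
print(abs(complex_sum.imag) < 1e-12)  # True

# ...so the squared modulus agrees with the squared cosine sum.
print(math.isclose(abs(complex_sum) ** 2, cosine_sum ** 2))  # True
```

For a resultant with a nonzero imaginary part the two squared values would differ, so this sketch only illustrates the case the paragraph describes, where the resultant arrow is parallel to the real axis.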

Finally, there is an interesting exchange of blows between ‘t Hooft and Peter Woit on Woit’s Not Even Wrong weblog post of 13 August 2012, “’t Hooft on Cellular Automata and String Theory”, where Woit writes: “Gerard ’t Hooft in recent years has been pursuing some idiosyncratic ideas about quantum mechanics. … those who are interested might like to know that ’t Hooft has taken to explaining himself and discussing things with his critics at a couple places on-line, including Physics StackExchange, and Lubos Motl’s blog. If you want to discuss ’t Hooft’s ideas, best if you use one of these other venues, where you can interact with the man himself. One of ’t Hooft’s motivations is a very common one, discomfort with the non-determinism of the conventional interpretation of quantum mechanics. The world is full of crackpots with similar feelings who produce reams of utter nonsense. … I don’t think what he is producing is nonsense. It is, however, extremely speculative, and, to my taste, starting with a very unpromising starting point. Looking at the results he has, there’s very little of modern physics there, including pretty much none of the standard model (which ’t Hooft himself had a crucial role in developing). If you’re going to claim to solve open problems in modern physics with some radical new ideas, you need to first show that these ideas reproduce the successes of the established older ones.”

‘t Hooft wrote in a comment there to respond to the criticism: “I did not choose to side with Einstein on the issue of QM, it just came out that way, I can’t help that. It is also not an aversion of any kind that I would have against Quantum Mechanics as it stands, it is only the interpretation where I think I have non-trivial observations.”

Woit then replied: “I hope you’ll keep in mind that I often point out that “Not Even Wrong” is where pretty much all speculative ideas start life. Some of the ideas I’m most enthusiastic about are certainly now “Not Even Wrong”, in the sense of being far, far away from something testable.”

That certainly is nothing to be proud of; checkable predictions are hyped as being more important than politics for science, but the socialist dictators in charge of the journals prefer politics (literature surveys of nonsense) to hard calculations.