Yuri Milner’s Fundamental Physics Prize of $3 million each to Edward Witten, Alan Guth, Andrei Linde, Nima Arkani-Hamed, Juan Maldacena, Nathan Seiberg, Alexei Kitaev, Maxim Kontsevich, and Ashoke Sen (4 September 2012 update)


Above: here’s something really annoying to me. Facebook suggested adding $3 million Yuri Milner prize winner Edward Witten (“Luboš Motl and Jack Sarfatti are mutual friends”), who is widely regarded as both the greatest and the most quietly modest mathematical physics genius of his generation. To see why this automated nonsense is funny, look at the jokes and Witten’s Nature article telling string theory colleagues to abstain from controversy, linked here. Witten currently uses a picture of a brain on his profile, which is hardly the emblem of “quiet modesty” that the big “science”-hyping media prefers. (A quick scan through the 176 friends listed reveals anthropic landscape multiverse proponent Professor Susskind, drug smuggling-charged Professor Paul Frampton, and popular physics author Professor Lawrence Krauss, so this appears to be Witten’s genuine Facebook profile.) One thing you realize early on is that any association with fashionable status quo groupthink is toxic poison.

Dr Peter Woit of Columbia University maths department comments on Not Even Wrong, the fundamental physics blog critical of non-falsifiable speculations:

“… I noticed what is odd about this prize, after realizing that the winners are kind of a list of the most prominent people in the field who haven’t won a Nobel Prize. What this does is turn the Nobel Prize on its head; you get it for doing work that is untestable or wrong, but that has a high profile … The Fundamental Physics Prize winners get about six times more [than the Nobel Prize] for ideas that have gotten a lot of hype, but no experimental test (or at least not enough to satisfy the Nobel Committee of physicists). Even better, you get the prize for your over-hyped ideas even if experiment does show them to be wrong … One wonders about the implications of this for the future of theoretical physics: why should young theorists work on unpopular ideas and/or try hard to find testable ones? …”

“… it’s now all too clear where we end up: the textbooks of string theory and supersymmetry have already been written, and that will be codified as humanity’s best understanding of fundamental physical reality for the indefinite future. …” – Dr Woit.

Dr Woit: “… if string theorists all of a sudden calculate accurately all the parameters of the SM, using a string theory landscape calculation that implies a multiverse, that would be good evidence for a multiverse. It’s also true that if I discover tonight a wonderful TOE which explains everything perfectly and has no multiverse, that will be evidence against the multiverse. At this point, both of those eventualities seem equally irrelevant: I’m a lot more likely to get run over by a truck on my way home tonight. As for … the smart high school student, what if he or she instead of getting excited about the hype sees it for what it is, decides “frontier science” is BS, and decides to become a lawyer instead of a scientist? I think a lot of that has been happening in recent years ….”
Nine string theorists, inflationary cosmology theorists, and salesmen of non-existent quantum computers have been awarded $3 million each, plus the opportunity to control the awarding of future similar prizes, funded by Russian Facebook investor Yuri Milner. All the ideas have failed, and quantum computing hype rests on logic based upon non-relativistic 1st quantization, i.e. single-wavefunction quantum mechanics from 1926, not the relativistic 1948 Feynman path integral (relativistic 2nd quantization), in which electrons have multiple wavefunctions for different paths, which interfere on small scales (path actions near Planck’s constant) to cause indeterminacy.

The foundations of quantum field theory and quantum mechanics

As documented (with literature references) in detail here and in several blog posts, the basis of quantum mechanics and quantum field theory as they now stand is Weyl’s 1918 gauge theory of quantum gravity. Weyl there suggested that the metric of general relativity is directly proportional to exp(iF), where F is a function of the electromagnetic field. Euler’s equation gives: exp(iF) = cos F + i sin F. In other words, there is an infinite series of real-valued, discrete (“quantized”) multipliers for the metric, like a series of railway tracks of different widths (which is where the name “gauge” theory comes from).

Weyl’s theory was wrong (as Einstein pointed out, it would mess up spectral lines in light from heavy stars with intense gravitational curvature), but the idea of using the discrete real solutions of exp(iF) to quantize or “gauge” a theory persisted, since Schroedinger read Weyl’s paper and tried to apply exp(iF) to explain Bohr’s discrete quantum states in a paper he wrote in 1922. Schroedinger was well aware that any equation of the type dX/dt = X must have an exponential solution: separate the variables and integrate, so that the integral of (1/X)dX = the integral of dt gives ln(X_t/X_0) = t, which, on getting rid of the natural logarithm, is equivalent to X_t = X_0 exp(t). If instead you have dX/dt = -iHX, then the solution is X_t = X_0 exp(-iHt).
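
For anyone who wants to check this numerically rather than take the calculus on trust, here is a minimal sketch (mine, not from any paper discussed here; H, X0, the evolution time and the step count are arbitrary illustrative values) which integrates dX/dt = -iHX step by step and compares the result with X_0 exp(-iHt):

```python
# Minimal numerical check that dX/dt = -iHX has the solution X(t) = X(0)*exp(-iHt)
# for a constant scalar H. All numbers below are arbitrary illustrative choices.
import numpy as np

H = 2.0            # constant "Hamiltonian" (scalar, natural units)
X0 = 1.0 + 0.0j    # initial wavefunction value
T = 1.5            # total evolution time
N = 100000         # number of small Euler steps

dt = T / N
X = X0
for _ in range(N):
    X = X + dt * (-1j * H * X)   # Euler step for dX/dt = -iHX

exact = X0 * np.exp(-1j * H * T)
print("numerical:", X)
print("exact    :", exact)
print("difference:", abs(X - exact))   # shrinks as the step count N is increased
```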

Hence the phase factor of exp(-iHt) or exp(iS), which is integrated to give the path integral in quantum field theory, is very simply a solution to an equation of the form dX/dt = -iHX, which we call Schroedinger’s equation (X is the wavefunction, since this blog does not support the Greek symbol for the letter Psi). Schroedinger in 1926 came up with his equation dX/dt = -iHX after becoming familiar with Weyl’s gauge theory and being given de Broglie’s wave-particle duality paper; Feynman writes in his Lectures on Physics that it isn’t possible to derive the wave equation, i.e. it is taken as an axiom rather than a result of quantum mechanics theory. Clearly Schroedinger was just extending his 1922 idea of applying Weyl’s gauge theory to quantum mechanics. Weyl eventually (following London’s suggestion) reapplied his gauge theory of gravity to the electromagnetic field, scaling the local wavefunction phase (rather than the metric of general relativity) in direct proportion to exp(iF), where F is a function of the electromagnetic field. This gave electromagnetic gauge theory. Put in causal terms (rather than Noether’s mathematical theorem): if a force field does work upon a particle to change its wavefunction, then the conservation of energy implies that there is an effect on the field which did the work in causing the phase change.

Basically, the wavefunction amplitude for each actual offshell or “virtual” quantum in the double-slit experiment (or for each possible interaction involving a virtual quantum and a fermion in bound states, like the quantized Coulomb field binding between orbital electrons and nuclei) is proportional to exp(iF). The periodic circular variation of the resulting complex wavefunction amplitude around the origin of an Argand diagram allows paths of different lengths (taken by different offshell quanta) to arrive with varying phases, which cancel out if their actions are much larger than Planck’s constant divided by 2π.
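
Here is a toy numerical sketch of that cancellation (my illustration, with ħ set to 1 and arbitrary spreads of action): summing phase factors exp(iS/ħ) over many paths gives essentially nothing when the actions are spread over many units of ħ, but adds up almost coherently when the spread of actions is small compared with ħ.

```python
# Toy illustration of phase cancellation in a sum over paths: when the spread of
# the actions S is much larger than hbar the phase factors cancel, when it is
# comparable to hbar they add nearly coherently. All numbers are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n_paths = 10000
hbar = 1.0

for spread in (0.1, 1.0, 100.0):           # spread of actions, in units of hbar
    S = rng.uniform(0.0, spread, n_paths)  # actions of the individual paths
    amplitude = np.sum(np.exp(1j * S / hbar)) / n_paths
    print("action spread %6.1f hbar -> |mean phase factor| = %.3f"
          % (spread, abs(amplitude)))
# Large spread: |mean| is near zero (destructive interference, classical actions).
# Small spread: |mean| is near one (paths with nearly stationary action dominate).
```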

What Weyl did in producing the first quantum mechanics gauge theory in 1929 was to consider the derivative of this relationship. As proved on page 7 of our paper, Weyl noticed that the equation

wavefunction(t) = wavefunction(0) × exp(iF)

does not yield the same ratio when the wavefunctions are differentiated with respect to spacetime variables to find their rates of change. The product rule of differentiation and the rule for differentiating an exponential give an extra term, which turns out to be a compensation for the energy taken from the electromagnetic field in order to produce a change in the wavefunction amplitude of a particle. In other words, it’s the mechanism of quantum field theory: to change the state (wavefunction amplitude) of a particle you must supply energy from the surrounding quantum field by means of the absorption of an offshell field quantum. The energy lost from the field is the energy gained by the onshell particle which absorbs the field quantum, as depicted in Feynman diagrams. Noether’s theorem linking conservation laws to symmetries in physics influenced Weyl’s original development of gauge theory: if two electrons are paired in an orbital with opposite directions of spin (by the Pauli exclusion principle), then inverting the relative direction of both spins will preserve symmetry, because although each electron will now have a changed spin, the two will still have opposite spins. This conserves energy because symmetry is preserved. (Electrons 1 and 2 have spins up and down before inversion, then down and up respectively afterwards. Although the spins have inverted, each electron still has the opposite spin to its neighbor.)
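
The “extra term” from the product rule can be checked symbolically. The following sketch (mine, not the derivation in the paper; psi and F are generic placeholder functions) differentiates the rescaled wavefunction psi(x) exp(iF(x)) and displays the i (dF/dx) psi term which the gauge field has to compensate for:

```python
# Symbolic check of the extra term produced when the locally rescaled wavefunction
# psi(x)*exp(i*F(x)) is differentiated: the product rule yields i*psi*dF/dx on top
# of d(psi)/dx, which is what the gauge field must compensate.
import sympy as sp

x = sp.symbols('x', real=True)
psi = sp.Function('psi')(x)   # original wavefunction
F = sp.Function('F')(x)       # phase function of the electromagnetic field

rescaled = psi * sp.exp(sp.I * F)
derivative = sp.diff(rescaled, x)

# Divide out the common phase factor to expose the two terms:
print(sp.simplify(derivative / sp.exp(sp.I * F)))
# prints d(psi)/dx plus the extra term i*psi*dF/dx contributed by the phase change
```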

What Weyl did physically here was to quantify the modification (“gauge transformation”) to the electromagnetic field (Maxwell field potential) which corresponds to a given rate of change or derivative of the wavefunction. Weyl found that the derivative of the wavefunction is a function of the electromagnetic field; symmetry is preserved where the energy of the electromagnetic field is conserved. Only when a quantum of field energy is imparted to a particle is work done. Classical physics ignores this entirely. Newton’s law of gravity contains no allowance for the gravitational field energy to be reduced by an amount equal to the kinetic energy gained by a falling apple. This is a flaw in classical physics, which generally ignores the application of the conservation of energy to fields.

In terms of gravity, if you drop an apple, the acquisition of kinetic energy by the apple’s mass comes from the gravitational force field which surrounds it (relativity shows that the apple gains mass slightly by accelerating, so the acceleration of the apple is not due to energy originally stored in its mass). The gravitational force field surrounding the apple is therefore depleted by precisely the amount of energy which the apple gains from the field as it falls. When it hits the ground, the sound waves are energy which was originally gravitational field energy (the potential energy of the apple in the gravitational field). So we know (from energy conservation) that some graviton energy is used, i.e. converted into kinetic energy.
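
As a trivial check of the bookkeeping (the apple’s mass and drop height below are assumed illustrative numbers), the kinetic energy gained at impact equals the mgh of potential energy given up by the field:

```python
# Energy bookkeeping for the falling-apple example: kinetic energy gained equals
# the gravitational potential energy m*g*h given up by the field.
m = 0.1      # apple mass in kg (assumed)
g = 9.81     # gravitational acceleration, m/s^2
h = 2.0      # drop height in metres (assumed)

potential_energy_lost = m * g * h        # energy taken from the field
v = (2 * g * h) ** 0.5                   # speed just before impact
kinetic_energy_gained = 0.5 * m * v ** 2

print(potential_energy_lost, kinetic_energy_gained)   # both ~1.962 J
```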

Similarly, in Weyl’s electromagnetic gauge theory, every phase change of a particle requires the absorption of a virtual particle from the electromagnetic field in which it is immersed. This is the basis of the simple “tree” level Feynman diagrams, which tend to contribute the vast majority of the amplitude of fundamental forces like electromagnetism, at least at low energy. Weyl’s theory quantifies the relationship between the size of the phase change and the amount of electromagnetic field energy required for that phase change. Quantum gravity works the same way.

Euler’s equation shows that the phase space amplitude, exp(iS) = cos S + i sin S. You can’t have a non-real phase amplitude or a non-real path integral result (the non-observable complex states are precisely what quantize quantum mechanics). So the resultant of any path integral comes only from the non-complex term, i.e. exp(iS) can be replaced by cos S. (Ever heard of a cross-section or probability which is complex? Me neither, so goodbye i sin S.) Then the phase amplitude is a simple rotation of some hidden variable in real space. If you don’t like hidden variables, then you don’t like quantum mechanics, because the complex wavefunction itself is a “hidden variable”: you can only see probabilities and cross-sections, which are proportional to the square of its modulus, so it’s not directly observable and thus is a hidden variable. This is entirely consistent with all experimental results in QFT and QM!
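
The elementary identity behind this replacement is easy to verify numerically: the real part of any sum of exp(iS) phase factors is identical to the sum of the corresponding cos S terms. A quick sketch (mine, using arbitrary random actions):

```python
# Numerical check that for real actions S_k, the real part of the sum of
# exp(i*S_k) equals the sum of cos(S_k), which is the elementary identity
# behind replacing exp(iS) by cos S for the resultant of a path integral.
import numpy as np

rng = np.random.default_rng(1)
S = rng.uniform(-50, 50, 1000)          # arbitrary real path actions

full_sum = np.sum(np.exp(1j * S))       # conventional complex phase-factor sum
cos_sum = np.sum(np.cos(S))             # real "cos S" phase-factor sum

print(full_sum.real, cos_sum)           # identical to rounding error
print(abs(full_sum.real - cos_sum))     # ~1e-13 or smaller
```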

This has nothing to do with 1st quantization “hidden variables” such as Bohm’s derivation of the Schroedinger equation or the experimental tests of the Bell inequality, both of which are 1st quantization quantum mechanics, i.e. they implicitly assume a single wavefunction exists. In fact, a single wavefunction doesn’t exist in relativistic quantum mechanics, i.e. 2nd quantization. Each path has a wavefunction amplitude, and there is no (single) “wavefunction” to collapse when a measurement is made. As Feynman states in his book QED (1985, not to be confused with his earlier works), you “don’t need” the Heisenberg uncertainty principle as an axiom in 2nd quantization, because multipath wavefunction interference causes indeterminacy and produces quantitatively the same result, without any single wavefunction collapsing. It is the superposition of varying real wavefunction amplitudes from multiple paths that causes the interference phenomena in the double slit experiment and the indeterminacy of the electron’s orbit in the atom. Feynman explains in QED that for an orbital electron, the path integral is indeterministic because the Coulomb field (binding it) consists of discrete interactions with field quanta, which cause deflections to the motion of the electron. In other words, the electron doesn’t take all possible paths; it’s just that to model the electron statistically (to find the probability of finding it at any given location), you need to include in the calculation all of the various possible interactions which can occur between the electron and field quanta.

Similarly, if you want to predict the path of a pollen grain fragment, you need to include the various possible random interactions with air molecules in the calculation; the result is Brownian motion “random walk” statistics. The pollen fragment doesn’t take all possible paths. But because a large number of paths are possible, the model that mathematically describes the motion is indeterministic (giving only a statistical description). This originally caused confusion for Brown, who discovered Brownian motion but attributed it initially to some kind of intrinsic law of nature that particles are chaotic on small scales (the reason being that he could not see the water molecules which were impacting the pollen grain fragments with his microscope). Einstein and Infeld discuss this history in their book, The Evolution of Physics. It should be noted, however, that although by Occam’s razor the simplest path integral model for the electron is that it doesn’t take all possible paths (you just have to take all possible paths into account in the calculation of probabilities by the path integral), the photon does take multiple paths (since the double slit experiment gives interference with “single” photons, which must therefore be spatially extended and influenced by both of the paths given by the two slits).
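
A minimal random-walk sketch (my illustration, with made-up numbers of fragments and impacts) makes the point: each simulated fragment follows one definite path, yet only the statistical spread of the final positions is predictable.

```python
# Random-walk sketch of the Brownian-motion analogy: each simulated pollen
# fragment takes one definite path, but because the individual molecular impacts
# are unknown, only the statistical spread of outcomes can be predicted.
import numpy as np

rng = np.random.default_rng(2)
n_fragments = 5000
n_impacts = 1000

steps = rng.choice([-1.0, 1.0], size=(n_fragments, n_impacts))  # random kicks
final_positions = steps.sum(axis=1)

print("mean displacement:", final_positions.mean())               # ~0
print("root-mean-square displacement:", final_positions.std())    # ~sqrt(1000) ~ 31.6
```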

Those who are proud of mathematics will try to defend epicycles as elegant, by obfuscation techniques. Are we “losing solutions” if we replace exp(iS) with cos S, in an analogous manner to the fact that you do lose negative solutions if you square a real function which can be positive or negative? No, because the resultant arrow for a sum over histories (path integral) on an Argand diagram must always be parallel to the real axis! It can’t be pointed any other way, or else the result of a path integral or quantum mechanics probability will not be a real number! “Maybe”, they will allege (with lots of arm-waving), “the complex exp(iS) epicycle carries vital interference information which magically goes to work in the multipath interference of the path integral, and ensures the Standard Model and QM give valid results?” Wrong!

It is a fact that cos S is just as periodic as i sin S; the only difference is that one is on the real plane (which we deal with) and the other isn’t on the real plane (so it’s a hidden variable that’s not only hidden, but superfluous as well). Schroedinger needed the complex number, i, because he used first quantization, i.e. a single wavefunction rather than wavefunctions interfering for all paths. He did not have the path integral (second quantization), where discrete results arise from multipath interference, each path having a separate wavefunction amplitude which contributes to the path integral’s resultant arrow or amplitude.

Schroedinger had to explain discrete lines and discrete orbit diameters (most probable electron locations) with a defective, non-relativistic model in which there is only one wavefunction and there is no multipath interference mechanism for the quantization phenomena. The path integral provides this mechanism. We no longer need Schroedinger’s complex conjugate, if we use second quantization rather than first quantization.

The multipath interference mechanism of the path integral makes Hilbert space obsolete. We simply don’t need the exp(iS) wavefunction amplitude in the path integral; we can use cos S instead. It does the same job. No information is lost; there is no information input which gets lost in the output if we do the path integral as an integral of phase factor cos S in place of exp(iS). The only differences are advantages: we stop thinking about imaginary (Hilbert) space, which is at best a spurious epicycle in the theory, and we lose the problems with Haag’s theorem (which makes renormalization impossible to prove self-consistent if we continue to use imaginary Hilbert space). Haag’s theorem doesn’t apply to a real phase factor of cos S, only to exp(iS). So cos S solves a lot of problems, with no drawbacks. It reduces the number of hidden variables in QFT and QM. Very nice!

Errors from fashionably dogmatic ignorance and Orwellian doublethink

Let’s examine the string theorists like Ed Witten and the inflationary universe people like Alan Guth. Our paper explains on pages 1-14 what’s wrong with the stress-energy tensor’s spin-2 gravity coupling, and on pages 15-16 what’s wrong with inflation and the correct prediction for the flatness of the early universe.

Feynman explains clearly in his 1985 book QED that particles can’t exist in a 1st quantization Schroedinger-Heisenberg superposition of classical states, because the indeterminacy is produced by the mechanism of multiple-path interference. So there is simply no single real wavefunction with an eternal “superposition” of states. There is just one state for each quantum number of a fermion, whose indeterminism is due to the multiple wavefunctions from field quanta exchange which interfere with one another, without a constant superposition that can store any data whatsoever. Although a two-state, spin-1/2 “qubit” such as the up/down spin direction of an electron may appear to store information, it’s a fantasy, because field quanta from the electrons’ own fields are continually interacting with them and reversing their spins; the only “information” is that each electron in a pair has the opposite spin direction to the other. The actual spin directions are continually changing. David Deutsch simply ignores Feynman’s 1985 QED book and, in his article on “Quantum Computation” in Physics World, 1/6/92, misleadingly asserts that Feynman backed quantum computing in 1982! To make a quantum computer requires turning the universe non-relativistic so there is just one wavefunction per fermion, or else using massless bosons – photons – which store information during their light-velocity journey because of relativity. The problem is the same with quantum computing as with string theory and inflation cosmology: these are failed superficial speculations built on shifting foundations (the stress-energy tensor in general relativity doesn’t represent discontinuous matter, so there is no smooth curvature unless it is fiddled with a false continuous fluid approximation), shored up with false public relations claims. The media fails to report these deliberate or inadvertent religious-dogma-style deceptions (science fantasy is popular and sells).

Dr Woit’s point quoted at the start of this post bears repeating: the prize rewards the most prominent people in the field who haven’t won a Nobel Prize, with about six times the Nobel money, for ideas that have had plenty of hype but no experimental confirmation – or which have even been shown to be wrong.

In other words, the worst version of the Matthew effect. Awards to celebrities are self-publicity for the award-giver. To put it rudely, if Milner had rewarded unhyped ideas which have been confirmed experimentally, such as the 1996 prediction of the dark energy of the universe, then the media would have either ignored him or at best crucified him with their usual self-righteous drivel about everyone unfamous being wrong/unworthy/failures/bitter/pathetic or simply non-newsworthy. It’s exceptionally hard for the media to hype anything unless it already has popular appeal, because the media is useless at marketing completely unpopular ideas (I obtained grade A on a marketing course dealing with the use of the media for publicity, so I do know how this works). No editor can sell plain old news (ordinary births, deaths, marriages, etc.) without front-page controversy/fame/infamy/celebrity gossip, which is popular. Nobody has the time to know what every Tom, Dick and Harry do.

The media’s interest is in promoting the press releases of the popular status quo “Max Clifford” spin-doctors, the serial murderers, the megarich, or megapowerful politicians, actors, or rock stars. Even in the supposedly fair and level playing fields of the Olympic Games coverage, the mainstream media in fact hypes, interviews, and endlessly promotes the profile and public recognition of the “expected” winners (or the previous winners) ahead of the new contests and new results, and in preference to the new competitors who are unknowns. This is the Matthew effect. Take Michael Phelps, who has 17 Olympic medals from previous Games. The media has already devoted more coverage to the “news” that he didn’t win a medal in his first Olympic competition of 2012 than to the new winners who did! This “paradox” has a simple reason: Phelps established a profile and a massive fan base, and well deserved his publicity for the 17 previous medals. The self-serving media priority is thus to pander to this established popularity and the established interests of viewers and readers, not simply to report what I’d consider to be the real “news”. Past history prejudices the media coverage, by defining in advance what “news” is considered worthy of reporting! In other words, the power of the media is corrupted (like all other power), and for precisely the same reason there is no media market for real “news” (non-string) of checked predictions in fundamental physics! Dr Woit stated yesterday, 30 July 2012 (even before Milner’s prize was announced):

“The lesson … from the failure of … one trendy subject is just to change to a different trendy subject.” – Dr Woit

The danger is that fashionable groupthink will be encouraged, wasting money by concentrating too many eggs in one basket, or in baskets located on one bandwagon:

“You really think that this year’s winners will continue to do research as if nothing has happened? And given their financial power over the rest of their colleagues, you think their relationships will stay as natural? If someone had an incredible amount of money and wanted to sabotage a subject, you think there is a more effective way? Mr Milner could have started a new, well-funded institute dedicated to fundamental research in physics, along the lines of the Perimeter Institute, but this time in a different continent. He could have subsidized the research of a very large number of young, talented scientists (including many in Russia who live hand to mouth). But he decided to take the easy way and splash incredible amounts of cash on those who need it least.” – Not Even Wrong commentator MathPhysics

“As several have pointed out, it makes the problem of follow-my-leader physics worse. As it is there are too many young people whose work is based on what is fashionable at Princeton, and the prospect of a 100k/3M dollar carrot will just make this worse.” – Not Even Wrong commentator P.

“… these theories have (or this one [Witten’s “M-theory”] has) the remarkable property of predicting gravity [this emphasis is Witten’s own] – that is, of requiring the existence of a massless spin-2 particle whose couplings at long distances are those of general relativity. (There are also calculable, generally covariant corrections that are unfortunately unmeasurably small under ordinary conditions.) This result is in striking contrast to the situation in conventional quantum field theory, where gravity is impossible because of the singularities of the Feynman graphs.”

– Edward Witten, “Reflections on the fate of spacetime”, Physics Today, April 1996, p24.

Notice that Fierz and Pauli introduced spin-2 gravitons in 1939, in connection with “gravitational waves” (which would be composed of gravitons), by stating:

“In the particular case of spin 2, rest-mass zero, the equations agree in the force-free case with Einstein’s equations for gravitational waves in general relativity in first approximation …”

– Conclusion of the paper by M. Fierz and W. Pauli, “On relativistic wave equations for particles of arbitrary spin in an electromagnetic field”, Proc. Roy. Soc. London., v. A173, pp. 211-232 (1939).

It is well “accepted” by the theoretical physics community that gravitational waves “must” be composed of gravitons of spin-2, because they couple to the rank-2 stress-energy tensor of general relativity.  This is despite the fact that the stress-energy tensor requires an unrealistic ideal/perfect classical fluid approximation to yield a differentiably smooth distribution of mass, energy etc., so that the resulting curvature is a smooth, differentiable function of spacetime.  Similarly, it was well accepted by the theoretical physics community in 1867 that atoms were composed of the stable aether vortices of Lord Kelvin, von Helmholtz, and Tait, because they agreed mathematically with the “established laws of nature” such as the conservation of mass and momentum. Any messenger who ridiculed Kelvin for this error of not making falsifiable predictions was simply censored out.  Planck, inventor of the quantum theory of radiation and editor of the journal which published Einstein’s first papers, remarked depressingly in his autobiography that new ideas triumph “one death at a time” as famous bigots/leaders are rat poisoned/die. Should critics of any status quo group-think fashion really be “ridiculed” and then banned from the right to reply properly? Or should the media stop chickening out and start investigating the liars instead of believing the current epicycle theory like religious dogma?

“The student … is accustomed to being told what he should believe, and to the arbitration of authority. … Ultimately, self-confidence requires a rational foundation. … we should face our tasks with confidence based upon a dispassionate appreciation of attested merits. It is something gained if we at least escape the domination of inhibiting ideas.”

– Cecil Alec Mace, The Psychology of Study, 1963, p90

String theorists have arXiv to host their preprints (and censor out critical trackbacks), so they don’t really need Physical Review Letters for peer review. Once upon a time, peer reviewed journals were the mechanism for peer-to-peer communication. Now, however, the top journals are merely a prestige publicity/marketing mechanism used to communicate to the media and thus the public (advertising media). Top string theorists can now simply put out arXiv papers with “press releases”, instead.

That’s precisely the problem: all hard-core stringers do think that dissent from the dominant theoretical approach should be discouraged (i.e. ignored). This is because they feel that string theory is the only decent idea out there and – unless or until a better option emerges – it makes sense to focus on string and temper old-fashioned Popperian prejudices about theories being judged on their falsifiable predictions. Dissent amounts to defeatism, which lowers morale. The power of status quo is measured by its ability to ignore all opposition (critics and dissenters).

Woit also comments dismally on Witten’s defense of string theory:

“I don’t think any of his examples addressed the real issue, which is not that practical tests of string theory are far away, but that it makes no predictions, even if you had the technology to test it. To defend the falsifiability of string theory he gave the dubious argument that if table-top experiments showed quantum mechanics to be wrong, that would show string theory was wrong.”

Witten’s 1996 “defense” of string theory using spin-2 gravitons implicitly assumes that the rank-2 stress-energy tensor, which requires spin-2 graviton coupling, is correct. In fact, the rank-2 stress-energy tensor has long been known to be false because it can’t model discontinuities: the discrete particles of mass and energy are not represented accurately by the stress-energy tensor! Instead, a falsely smooth distribution is required to force the stress-energy tensor to give a smooth Ricci curvature. In addition, you can describe gravity with a rank-1 field vector (the gradient of the potential in the Poisson equation), which is analogous to a spin-1 force law in QED, so Witten’s argument is subject to the easily disproved assumption that the smooth distributions (required for the differentiable rank-2 stress-energy tensor in general relativity) are a perfectly correct model of mass distributions in quantum gravity! Duh! Even Phoebe grasped this!
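
For reference (this is standard textbook material, not a new claim of this post), the contrast being drawn is between the Newtonian description, where the field is the rank-1 gradient of a scalar potential obeying the Poisson equation, and Einstein’s rank-2 field equation:

```latex
\nabla^{2}\phi = 4\pi G\rho, \qquad \mathbf{g} = -\nabla\phi,
\qquad\text{versus}\qquad
G_{\mu\nu} = 8\pi G\,T_{\mu\nu} \quad (c = 1).
```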

Quantizing general relativity is the deeper argument for string theory, not falsifiable experiments. The problem is that this gives an opportunity to move the goalposts from tests to the need to overcome singularities in general relativity, by replacing them with Planck length strings of compactified dimensions.

“Wow, the theoretical physics field is crazy, now a bunch of ‘top’ physicists in string theory and other areas with untestable theories get 3 million dollars each for ostensibly over hyping their discoveries? It seems you should be a better PR guy than physicist now a days and you’ll be more successful. Plus as several people have stated earlier in posts, this just reinforces the old guard. They get to choose who gets next years prizes? Wow. As an aside, how can physicists who champion untestable, unproven ideas past any reasonable time frame remain so revered?” – Hack.

They remain so revered because it is taboo to send them a message, just as the Emperor’s New Clothes were a taboo subject for discussion. It’s no coincidence that society’s liars just happen to be protected by the rules of taboo. On the contrary, these people seek to hide their attire from criticism precisely because they work in subject areas which the media considers taboo for discussion. Why didn’t someone point out to Hitler that eugenics theory – promoted by revered Medical Nobel Laureate Alexis Carrel and Darwinian pseudoscientist Sir Francis Galton – was a lie, simply because evolution utilises diversity? Answer: taboo. Nobody ever wants to discuss mechanisms, causes, and understanding the evidence objectively. People want lies and spin, and that’s therefore precisely what they get, masquerading as fact.

Another popular example: IPCC taboos on negative feedback data from cloud cover

The IPCC entirely ignores the cloud feedback data from Spencer et al. It’s precisely because of the lower air pressure and lower temperature above the surface that rising water vapour expands and condenses: the Wilson cloud chamber effect. If you reduce the air pressure, the parcel of warm moist air expands, so its temperature falls, and cool air holds less water vapour, so the super-saturated (excess) water content then condenses into cloud droplets.
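
A rough numerical sketch of this mechanism (mine; the parcel temperature, the pressures and the Magnus approximation coefficients are standard illustrative values, not data from Spencer): dry adiabatic expansion from 1000 hPa to 700 hPa cools the parcel by roughly 30 K, and the saturation vapour pressure falls by a factor of several, so the excess vapour has to condense.

```python
# Why a rising parcel of moist air condenses: adiabatic expansion to lower
# pressure cools it, and the saturation vapour pressure falls steeply with
# temperature (Magnus approximation). All parcel numbers are illustrative.
import math

def saturation_vapour_pressure_hpa(t_celsius):
    # Magnus approximation for saturation vapour pressure over water, in hPa.
    return 6.112 * math.exp(17.62 * t_celsius / (243.12 + t_celsius))

T1 = 300.0                 # surface parcel temperature, K (assumed)
P1, P2 = 1000.0, 700.0     # surface and higher-altitude pressures, hPa (assumed)
gamma = 1.4                # ratio of specific heats for dry air

T2 = T1 * (P2 / P1) ** ((gamma - 1.0) / gamma)   # dry adiabatic expansion

print("parcel cools from %.1f K to %.1f K" % (T1, T2))
print("saturation vapour pressure: %.1f hPa -> %.1f hPa"
      % (saturation_vapour_pressure_hpa(T1 - 273.15),
         saturation_vapour_pressure_hpa(T2 - 273.15)))
# The cooled parcel can hold far less vapour, so the excess condenses as cloud.
```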

Go further up in altitude and the air gets cleaner, with less dust and fewer condensation nuclei. It’s exactly like a Wilson cloud chamber, in which air ions from cosmic rays act as condensation nuclei which attract water molecules and set off cloud formation. This produces vapour trails around the tracks of alpha and beta particles, and charged cosmic ray collision products. Nigel Calder, former New Scientist editor, has correlated the inverse cosmic ray cycle with radiosonde temperature: http://calderup.files.wordpress.com/2012/03/101.jpg and http://calderup.wordpress.com/2012/03/03/climate-physics-101/ The lower the cosmic ray intensity, the greater the temperature! This is precisely what the Wilson cloud chamber mechanism predicts for cloud cover such as cirrus (around 15,000 feet). The more Wilson cloud cover, the greater Earth’s albedo, and thus the cooler the temperature, because more sunshine is reflected away by the cloud cover. The fewer the cosmic rays, the less high altitude cloud cover, and the warmer the surface is.

The Wilson cloud chamber is not an opinion or a speculative theory, it’s hard fact. Whether Calder’s correlation is based on the world’s best data for temperatures is another question, but I think this is the kind of mechanism that at least contributes to the Earth’s temperature fluctuations. By ignoring this physical mechanism entirely, the IPCC descends into pseudoscience. Their approach is to ignore Spencer and Calder, instead of objectively investigating mechanisms other than AGW.

The hockey stick curve is wrong due to negative feedback from cloud cover. The variation in cloud cover as a function of temperature opposes the effect of air temperature on tree growth and ice molecule sublimation. When the Earth is hot, there is more high altitude cirrus cloud due to evaporation of water, and this reduces the sunshine for photosynthesis and ice sublimation. Fact. This effect opposes the effect of air temperature (which promotes tree growth). Fact.

The effect is quantitative: greenhouse experiments on the rate of growth of trees under varying air temperature do not allow for the fact that there is more cloud cover when the planet heats up. Therefore, the correlation used between air temperature and tree growth is inaccurate. I state that this has a quantitative effect on the error margins in the IPCC tree ring proxies: they underestimate the error bars on the temperature fluctuations. The actual air temperature varies dramatically, but because cloud cover increases with global warming, tree growth is less affected than their greenhouse data suggest.

Trees of identical species in similar soil grow at very different rates depending on exposure to sunshine for photosynthesis. (What stops this kind of objective quantitative research is the fact that it’s not going to profit anyone, apart from the taxpayer. The politicians and professional (quack) “scientists” are in it for research grants, political “saving the universe” hero worship/votes, etc. However, I’m more interested in the science.)

I’ve explained in detail what’s wrong with the “error bounds” in the hockey stick curve. Earth’s temperature fluctuates widely, but this has less effect on tree ring growth and ice sublimation than the IPCC believes, because as the air temperature goes up the cirrus cloud cover increases which partially cancels the increased growth of trees and the increased sublimation of ice (both of which depend on sunlight exposure to trees and ice, not just air temperature as the IPCC assume).

The models are incorrect because they omit the Wilson cloud chamber effect entirely, and Spencer’s negative feedback water data. All of the IPCC models are wrong!

Try saying this, and you are into classic taboo territory, in which it is socially nice to tell lies and pretend that CO2 is causing the temperature rise in the contrived hockey stick, which mashes together a horizontal line from tree ring proxies (where naturally variable temperature swings are cancelled out by corresponding cloud cover variations) with more recent satellite data showing a real temperature swing upward which isn’t seen in the tree ring proxy falsehood. There is a 50% chance of a natural temperature swing increasing or decreasing, since a variable can increase or decrease with time (two possibilities). CO2 has an effect, but due to negative feedback (increased cloud cover reflecting sunlight away as the earth warms up), there is a thermostat in place which the IPCC exclude from the entire range of their climate models. The IPCC assumes (without evidence) that 100% of the temperature rise since satellite data arrived has been due to CO2 and related greenhouse gases.

To make this assumption look credible, the IPCC uses the lie of the tree ring proxy data, which don’t correlate to temperature, since cloud cover affects photosynthesis, just as cloud cover affects the sublimation of oxygen isotopes from surface ice which goes on to form the ice-core “temperature record”. This allows them the hockey stick fiddle, and to claim that recent temperature changes are unprecedented, correlate with CO2 output, and are not natural random fluctuations. The geological evidence shows that negative feedback from cloud cover prevents CO2 rises from affecting temperature: most major CO2 level changes lag behind temperature swings. Temperature is regulated by the Wilson cirrus cloud chamber effect, which controls the natural global variations in temperature. When cloud cover decreases, temperature rises, and this results in a rise in CO2 due to a proliferation of CO2-emitting animals in the warmer climate, faster than CO2-absorbing rainforests can expand. Hence, geological record temperature rises preceded CO2 rises. The IPCC approach to science is epicycles and lying propaganda.

There is no data correction method known for cloud cover; all these studies assume implicitly (never explicitly) that by taking more and more data, local variations in cloud cover will cancel out. This assumes that the mean global cloud cover is not varying as a function of the global mean temperature. In fact, as temperature rises, mean cloud cover increases due to evaporation, and this reduces the mean amount of sunlight available to trees. Tree ring growth consequently doesn’t correlate with mean global temperature as strongly as greenhouse-calibrated tree ring proxies assume, so those proxies suggest falsely that temperature variations prior to, say, 1900 were smaller than the real temperature variations. This is why the “official” error bars on individual data sets of tree-ring proxies are far too small. The real fluctuations in individual temperature data sets would be far greater still than the fluctuations on the data in the “official” hockey stick curve.

This also applies to the temperature proxy using the ratio of oxygen-16/18 isotopes, since sunshine on the surface of Vostok ice increases sublimation of ice to H2O vapour regardless of the air temperature: sunlight supplies infrared energy directly to the water molecules in the ice crystals. This cloud cover effect on Vostok ice core data is ignored by the IPCC. There is no foolproof correction method. You simply can’t resolve two variables from one piece of measured data. You cannot deduce both temperature and cloud cover variations from tree rings or oxygen ratios.
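
To make the “two unknowns from one measurement” point concrete, here is a toy sketch (my illustration; the sensitivity coefficients are invented) in which ring growth responds to both temperature and sunshine, so a single measured growth value is consistent with a whole line of (temperature, sunshine) pairs:

```python
# Toy illustration of why one proxy value cannot pin down two variables: if
# growth = a*temperature + b*sunshine, many (temperature, sunshine) pairs give
# the same growth. The coefficients and the observation are invented numbers.
a, b = 0.6, 0.4           # assumed sensitivities to temperature and sunshine
observed_growth = 1.0     # one measured proxy value (arbitrary units)

for temperature in (0.0, 0.5, 1.0, 1.5):
    sunshine = (observed_growth - a * temperature) / b
    print("temperature %+.1f, sunshine %+.2f -> growth %.2f"
          % (temperature, sunshine, a * temperature + b * sunshine))
# Every row reproduces growth 1.00: the single proxy cannot separate the two.
```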

There is direct evidence in the data since 1960, where tree ring proxies indicate a smaller temperature variation than direct temperature data measurements. Ice cores aren’t available over the entire earth’s surface for a fairly obvious reason (summer temperatures), so it’s polar data only. Tree rings are the major proxies, introducing random noise into the “temperature” data sets whose average gives the flat part of the hockey stick curve.

There’s a huge scatter and disagreement in the temperature proxies (oxygen isotope ratios, tree ring data) used for the hockey stick prior to circa 1900, when you take account of the cloud cover effect I’ve explained. So Mann averaged a huge number of differently fluctuating temperature proxies, to obtain the constant temperature part of the hockey stick. If that’s wrong, and the temperature really was fluctuating wildly before the 20th century (as critics claim citing the Medieval warm period and the iced Dickensian Thames in the 1850s), then the correlation between recent CO2 and temperature rise may not be so impressive. If the temperature is always fluctuating with a period of a century or so, then for any given century there’s a 50% chance of rising temperature and 50% of falling. So the correlation is not proof of causation. Even if you have a billion or a trillion falsely analyzed oxygen isotope ice core and tree ring data sets, if you ignore cloud cover variations (increasing cloud cover as global temperature rises), you’re not doing science.

The only way the IPCC get a big disaster prediction is to assume positive feedback from water evaporation, boosting global warming. However, water vapour can’t have a self-feedback that’s positive, or else Earth would be boiling in a runaway greenhouse effect. Because Earth isn’t in a runaway greenhouse effect naturally, you know that the greenhouse properties of ocean evaporated H2O are somehow limited in nature. You don’t see anyone announcing that dihydrogen oxide must be banned because it could all evaporate from the oceans and roast the world.

Although water vapour absorbs IR, when too much water evaporates, it heats up, rises buoyantly, then expands and cools until the air gets saturated and the water turns to cloud droplets which shadow (and cool) the surface below. Dr Roy Spencer published some data on this negative feedback from clouds in monsoons; it seems H2O has positive feedback (as the IPCC assume) for small temperature rises due to CO2, but has negative feedback (opposing CO2) for higher temperature rises. This subtle effect is what’s been missed out. Clearly it must exist, or we wouldn’t exist; the Earth would be a runaway greenhouse world.
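
To see why the sign and size of this feedback dominates everything, here is a toy zero-dimensional energy-balance sketch (mine, not an IPCC or Spencer model; the no-feedback sensitivity is an assumed round number, and only the CO2-doubling forcing of about 3.7 W/m^2 is a standard figure): net negative feedback damps the warming, while a feedback of one or more runs away, which is exactly the runaway greenhouse the Earth evidently avoids.

```python
# Toy zero-dimensional energy-balance sketch: negative feedback damps the
# no-feedback warming, while feedback >= 1 diverges (a "runaway greenhouse").
def equilibrium_warming(forcing, base_sensitivity, feedback, steps=200):
    # Iterate dT_next = base_sensitivity * forcing + feedback * dT towards a
    # fixed point (or towards divergence, if feedback >= 1).
    dT = 0.0
    for _ in range(steps):
        dT = base_sensitivity * forcing + feedback * dT
    return dT

forcing = 3.7            # W/m^2, roughly a CO2-doubling forcing
base_sensitivity = 0.3   # K per (W/m^2) with no feedbacks (assumed round number)

for feedback in (-0.5, 0.0, 0.5, 1.1):
    print("feedback %+.1f -> warming after 200 steps: %.2f K"
          % (feedback, equilibrium_warming(forcing, base_sensitivity, feedback)))
# Negative feedback gives less warming than the no-feedback case; feedback 1.1
# blows up, i.e. the runaway greenhouse the real Earth evidently does not have.
```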

The danger is that science is being perverted by the usual conflation of data and politically correct interpretations (involving “reasonable looking” assumptions that must not be questioned, for political correctness reasons rather than scientific reasons). It is a fact, not a mere hypothesis, that there is a conflict between individualism and the union of people into tribes, religious sects, and so on. If you want evidence that mainstream science can become corrupted, take a look at medical Nobel Laureate Alexis Carrel’s 1935 book “Man the Unknown”, which suggested gas chambers as a totally civilized and humane method to deal with people deemed undesirable to the state government. The book was published with an enthusiastic foreword in Germany the next year, and the idea was later implemented with disastrous consequences. The origins of “civilized” gas chamber eugenics go back to “greats” like Sir Francis Galton, who asserted Darwinian evolution could be put to good use to purify humanity. This is the danger: it has happened before, when fashionable authority was worshipped by the public and its attending media in the Nazi and USSR scams. If it is an “insult” to claim this occurs in “democracies”, then democracies as functioning today need insulting badly. Very badly.

The eugenics society was still powerful in Britain in the late 1940s (Penguin reprinted Carrel in 1946, omitting to mention the use of gas chambers and Carrel’s collaboration charges), until the full evidence of eugenics results – in photos and films – was published after being presented in evidence at Nuremberg. But there are lots of examples. Marxism as a perversion of science went right through a major segment of Western academic idealism from 1917 to the end of the Cold War, with endless physicists claiming that nuclear power is proof we must embrace socialism or be vapourised. They were wrong. In those parts of science where personal attacks take the place of scientific criticism, there is a problem of groupthink (corruption due to funding pressures) which is like fascism. Fascism is the short cut whereby you assert and suppress dissent. Pseudosciences like eugenics were promoted with paranoid scare-mongering tactics: why take the risk with the Jews? If some famous and fashionable people become dictators who suppress the facts, then they need to be insulted, which is the only option left after they have banned all discussion and rational debate about understanding the facts scientifically. Conflicts don’t occur because the weaker side prefers war to rational discussion, but because the weaker side is banned from discussion. The problem with AGW dogma is the money sucked into the industry. Get trade unions fighting for the jobs of workers who build solar panels and windfarms, and the media supporting them, and there will be a political intervention to control science. Once you have invested too much in something, you have to see it through even if it turns out wrong. There is too much momentum to stop or reverse it. If Michael Mann or Phil Jones reversed their position, that wouldn’t reverse the financial bandwagon built on the back of AGW dogma. There is no conspiracy, just momentum. If a large ball of snow begins rolling down a ski slope, getting bigger as it goes, you don’t need a conspiracy to explain why people who get in its way disappear suddenly.

“Never ascribe to malice that which is adequately explained by incompetence” (Napoleon). AGW started out in 1896 with the “genuine” idea by Arrhenius (famous for his reaction rate equation) that CO2 will increase temperature. Lacking any evidence about cloud cover and its negative feedback, or about the reason why there is not a runaway greenhouse effect from water vapour on Earth, he falsely believed that trace gases were responsible for ice ages. If you look at the actual correlations in Vostok ice core data between trapped CO2 bubbles and oxygen isotope ratios, you see some correlation, but vitally, in many spikes the temperature changes slightly before the CO2. So it suggests that a temperature rise killed off some rainforests and thus reduced global photosynthesis of CO2 to oxygen, allowing CO2 levels to rise as a result of temperature rises. This is the exact opposite of the CO2 temperature-driving mechanism Arrhenius speculated. Arrhenius was wrong. The problem I have is with authority being mistaken for fact in mainstream science. I think the whole basis of science as a political or officialdom-based totem-pole of power is wrong. This is not about “conspiracy”.

We see this in the inquiry into the January 1986 Challenger space shuttle disaster. The engineers testing the rubber O-rings knew that an immense risk was being taken by launching the shuttle in cold conditions, because the rubber was brittle and leaked fuel when icy cold. Their boss, however, had to maintain the contract with NASA, or they might all lose their jobs. He asked them sarcastically if he should tell NASA to delay the launch until April. No concerns were raised with NASA until the shuttle exploded and the Presidential inquiry with Feynman was done. It’s no good claiming a “conspiracy”. Once you have an error but money is flowing, you don’t need to order people to shut up. The pressures to conform in the usual management structure prevent any clear message being passed upwards to a level where it will do any good. In fact, like the “Emperor’s New Clothes”, even if the Emperor at the top hears that he’s been conned, he is in a jam and can’t do anything without becoming unpopular or looking even more of a complete fool. So he keeps his Godly act up until he is out of power, or dies.

You can never be “wrong” when you want to save the planet. It’s not about science so much as emotional claptrap. The same is true of superstring theory: it’s emotionally defended as “beautiful” and “the work of great minds”. This kind of emotion is a goalpost that is switched for falsifiable predictions whenever needed. It’s pure hubris, the kind of propaganda poured out by Dr Goebbels and later by the Moscow-based World Peace Council during the Cold War. There is a point at which conformity becomes dogma, and professionalism becomes conformity. Then professionalism is concerned with dogma. At some point, however, let’s assume that the critics come up with a viable alternative theory. By that time, the mainstream science has hardened into an orthodoxy supported by billions of people and trillions of dollars. The facts will then be received about as well as Jesus was by the High Priest. The only option is to ignore or shoot the messenger. This is where the fascism comes in. Take the “Physics Forums” issue. I posted a discussion thread on gravity. The only people to comment hadn’t bothered to read my paper, but asserted it was wrong because of errors in other people’s papers, or asserted that spin-2 graviton dogma is a proved fact because two masses exchanging gravitons must exchange spin-2 gravitons in order to attract (which is true, but irrelevant, because two masses exchanging gravitons would actually repel if a two-mass universe existed, which it doesn’t; the “attraction” arises from repulsion because there are masses all around). Anyway, one string theory student commented there that if he thought I was really correct, he would give up physics, because the universe would be ugly and mechanical and the elegance of the maths of string theory was the whole reason for his passion in physics. I think this is really bitter fascism, this complete and unashamed loss of scientific honesty in favour of fashionable lies. Someone else, Professor Sean Carroll, who has Feynman’s old desk at Caltech, blogged that there is no censorship of alternative ideas, and that if anyone really has the final theory of quantum gravity he would see it into print. He hasn’t done so, but by making such false claims he appeases those who would otherwise be worried by groupthink in science.

(Furthermore, if only a “final theory” is going to be supported, that would rule out Newton’s and Einstein’s papers, and everything previously done in science. If you claim that everyone is free to pursue any new idea in science because, should they find the final theory, some Professor claims he will publish it, you’re ruling out anything short of a final theory, which would rule out everything in science to the present time. In other words, it’s too stringent a criterion. Newton and Einstein in any case didn’t work out new ideas in complete isolation, they relied on the data and tools of people like Galileo, Brahe and Kepler, and Riemann, Levi-Civita and Ricci. If you block off alternative ideas unless or until a “final theory” emerges from them, it’s just fascism, because it takes away the motivation to try to publish the intermediate stages on the way to a final theory in the alternative framework. AGW does the same thing, by asserting authority to suppress controversy using peer-review politics.)

This was the Lebensraum problem tackled by a German chancellor in the 1930s. The traditional solution to overcrowding is starvation. AGW will help ensure this, because the economic resources being invested in AGW will take away resources from the usual poverty-fighting efforts as the global recession deepens. You can’t have your cake and eat it. Sustainable wind power and carbon balancing schemes are expensive, and what is spent preventing an imaginary AGW disaster will be unavailable to help prevent mass starvation when harvests fail. Debt will limit responses. However, I don’t think any doomsday scenario is real. There are automatic feedback mechanisms in place. When overpopulation really gets bad, most people (with the exception of some regional irresponsibles) will start having smaller families, because the expense of having many kids is excessive. Similarly, when pollution really gets bad, if something can be done about it, people will do it. E.g., the histories of the New York sewage system and the London sewage system. People live with problems until they become a real nuisance, then solve them. Predictions of doom creeping up by accident while everyone looks the other way, except for scientific journals that censor alternatives and criticisms of the lying propaganda, are absurd (see Herman Kahn’s “The Resourceful Earth”). Doom creeps up because of censorship of criticisms by mainstream dictatorial fascist movements which disguise themselves as planet-saving, zero-risk groupthink idealism. The pacifist movement led by Cyril Joad’s Oxford Union 1933 pacifist motion (which encouraged the new dictator Hitler to do what he wanted) is a perfect example of the “why take the risk?” approach of these idealists. They always claim – without proof – that the only risk is from the alleged danger they hype (i.e., the “risk” that Britain would become “war minded” if it tried to stop the Nazis by force rather than by peaceful collaboration, civilized talking, peace deals, and mutual cooperation pacts). They ignore the risks from the courses of action they propose, while exaggerating the risks from the course of action they oppose. In order to prevent criticism, they shoot the messenger in fascist style whenever anyone disagrees with them, e.g. see Cyril Joad’s attack on Winston Churchill in his August 1939 best-seller “Why War?” The danger since the time of Jeremiah has been excessive doom-mongering (usually for fame, political power, or financial profit), not doomsday. Doomsday claims are used to “justify” costly political moves like unjustified wars, dictatorships, and genocide. Ignoring critics is key to this ongoing process.

Update: Dr Woit has published an article in the left-wing Italian newspaper Il Manifesto and comments depressingly on his blog:

Article for Il Manifesto

“… it’s now all too clear where we end up: the textbooks of string theory and supersymmetry have already been written, and that will be codified as humanity’s best understanding of fundamental physical reality for the indefinite future. …”

This has always happened historically in what the media call “science”: the social education side of knowledge (exam syllabus and media hero-worship of alleged-“genius” bigots) forces fundamental physics to turn its reigning “best guess” theory into educational dogma. Then the best guess theory (flat earth/creationism/epicycles/vortex atoms/aether) hardens into orthodoxy, and fascist doorkeepers shoot down alternative ideas with the simple lie that anything that disagrees with the mainstream belief must be “wrong” by definition. The arguments for this:

(1) There are no viable alternatives, so you must support it or you are a terrible proponent of anarchy. (This amounts to saying if you live in dictatorial regimes, you must support dictatorship because you have “no alternative”.)

(2) For harmony, civilized behaviour and politeness in science, everybody must always sing from the same hymn sheet for the common good or for socialist/fascist/environmentalist/universe preserving/antinuclear ideals. Otherwise, the result is confusion or ugly chaos. (This amounts to the support of power through corrupt unity; sheer group-think power politics. By analogy, the argument would be that if you oppose USSR/Nazi dictatorship, you should join it, because then you will have more chance of reforming them in the direction you want, than you ever had while on the outside of that group. If you refuse to cooperate with the dictators, Dr Goebbels becomes angry with you, calling you a “rebel”.)

(3) Science is defined by human socialist consensus, and not by experiments or confirmed predictions. (Despite the lessons of Ptolemy’s epicycles, Maxwell’s mechanical aether, Kelvin’s vortex aether atoms, Witten’s M-theory, and so on, this statement is still taboo. Those brainwashed in lies will claim that science is a consensus of experimental evidence, despite string theory, and then they move the goalposts specifically to excuse the difficulties with today’s dogma.)

My first contact with the problems of science was when my hair changed colour from red to brown when I was a teenager, despite the reigning educational genetics dogma that genes produce permanent, unchangeable characteristics. This was not dye, and was not a speculative theory. We inherit two versions of each gene, and the old genetic theory of dominant and recessive genes is wrong: no gene is 100% dominant or 100% recessive. (I’m not saying that hair colour is controlled by just a single dominant gene, but gene switching does control colour change.) Further, the “actual” percentages of deviation from Mendel’s simplistic dominant/recessive genetics theory (based on peas) are simply not fixed constants, as was originally believed when epicycles were inserted into the original theory to allow for discrepancies. They are variable, depending on circumstances: hence “gene switching” between the supposedly dominant and supposedly recessive gene is possible. During life, the concentrations of different chemicals in the blood stream vary (due to hormones, diet, stress, exercise, etc.) and these chemical changes can sometimes be sufficient to cause “gene switching”; the “dominant” gene in the pair of genes in each cell is not fixed, but depends on the chemical environment it is immersed in. Hence the “genetic” diseases inherited identically by both individuals in a pair of identical twins do not occur with equal likelihood in each twin. Although they each contain the same pairs of genes, the dominance of each gene in a pair within an individual is a function of the environmental circumstances (work stress, diet, exercise, sunshine exposure, etc.). Therefore, it is possible for differences to occur between identical twins, due to gene switching.

Suppose you have a faulty gene for the protein P53, a DNA repair enzyme which repairs breaks in DNA strands caused by free radicals and natural water-molecule bombardment at body temperature. If the DNA breaks are not repaired rapidly enough (before further breaks occur), the DNA fragments can be transposed (repaired out of order), creating a cancer risk. Therefore, if gene switching at some point during your life turns on a faulty version of P53, you are at risk of cancer. If this gene switching does not occur, your good version of P53 remains in operation, and you have much better protection. It follows, then, that the old fatalistic idea that “genes are immutable” is false dogma. The way to prevent cancers and other genetically related diseases is to understand the epigenetic mechanism by which “dominant” genes are expressed, as a function of their chemical environment. Thus, the action of some empirically discovered cancer drugs, whose chemical mechanisms are not well understood, is probably related to gene switching: they turn off the genes of defective cancer-suppressing enzymes, and turn back on the working versions of those genes. This would explain some statistical anomalies in the effectiveness of these treatments. E.g., a person who has inherited two defective versions of a cancer-suppressing gene will be at risk of cancer from an early age and will not respond to these chemical treatments, because switching from one defective gene to the other, equally defective, gene makes no difference to the cancer situation. Most people are statistically likely to have only one bad copy, and will therefore respond to treatment. In summary, the switching role of drugs used for disease treatment at present may be obfuscated by ignorant accepted dogma. This affects the funding and the research priorities.
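To make the statistical point concrete, here is a toy Hardy-Weinberg (random mating) calculation in Python; it is only a sketch of the population arithmetic, and the allele frequency used is an assumed illustrative value, not data from any study cited here:

# Toy illustration: under the Hardy-Weinberg approximation, carriers of a
# defective tumour-suppressor allele are far more often heterozygous (one
# good copy, one bad copy) than homozygous (two bad copies).  The allele
# frequency q is an assumed illustrative value.
q = 0.05                      # assumed frequency of the defective allele
hom_bad = q**2                # two defective copies: no good gene to switch to
het = 2 * q * (1 - q)         # one defective copy: gene switching could help
carriers = hom_bad + het
print(f"homozygous defective: {hom_bad:.4f} of population")
print(f"heterozygous:         {het:.4f} of population")
print(f"carriers with a good copy to switch to: {het/carriers:.1%}")

On these assumed numbers about 97% of carriers still have one working copy, which is why, on the gene-switching argument above, most patients would be expected to respond to a switching treatment, while the small minority with two defective copies would not.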

Another emerging taboo is the effect of insulin-like growth factor 1 (IGF-1) in the ageing process and disease. By promoting rapid cell division and inhibiting cell death, high levels of IGF-1 in the blood promote cancer proliferation and ageing. Goodwin and others showed in 2002 (Journal of Clinical Oncology, v20, pp42-51) that excess insulin promotes cancer growth and correlates with mortality. (Unfortunately Goodwin’s research studied the end results in people who already had cancer, not the risk of getting cancer in the first place as a function of insulin level.) Malignant cells divide continuously, have high energy requirements, and cannot survive fasting; non-cancer cells can regulate their metabolism to survive it. Fasting therefore affects cancer risks. It is a pity this isn’t better researched (drug companies have a very different approach!).

Dr Woit: how to be greasy on the subject of Gerard ’t Hooft

’t Hooft won a share of the Nobel Prize for proving that the Higgs mechanism used for electroweak symmetry breaking in the Standard Model of particle physics is mathematically renormalizable. I.e., at very high energy the Higgs mechanism (which makes the weak bosons massive at low energy) allows symmetry to exist between electromagnetic and weak interactions, by making the weak gauge bosons (W and Z bosons) massless. Since it is the mass of the weak bosons at low energy which slows them down and makes the weak force weaker than the electromagnetic force at low energy, taking away their mass at high energy makes the weak coupling the same as the electromagnetic coupling, thus “unifying” electromagnetism and weak interactions. However, as people like Dr Woit have pointed out, the problem with electroweak unification is that the weak force is chiral (acting only on left-handed helicity spinors), while the electromagnetic force isn’t supposed to be. Maxwell in 1861 argued that magnetism is due to the spin of field quanta (he called them vacuum vortices or aether, but that was the fashion in 1861), resulting from charges spinning while in motion and imparting some angular momentum to the force-mediating vacuum field quanta. According to Maxwell, therefore, the fixed direction of the curl of the magnetic field which loops around a wire carrying an electric current is evidence that electromagnetism is a chiral effect, so electromagnetism has a preferred handedness. This is completely ignored in textbook QFT. The chiral handedness of electrons for the weak force only emerges as a function of their velocity. At low velocity they don’t have a definite helicity, just a spin whose axis is not necessarily aligned with the direction of motion. However, as the velocity approaches that of light, the spin becomes aligned along the direction of motion due to relativity (i.e. the Lorentz contraction, which flattens the electron): this is helicity. For an electric current of 1 amp in a wire, the electrons typically drift at only about 1 mm/second, so you don’t expect much helicity, since their velocity is so small compared to the velocity of light. However, the magnetic force is relatively weak, and the way it emerges as a function of the velocity of the electrons is what you would expect for helicity of spin on the basis of Maxwell’s model of magnetic fields. All of this is ignored in the Standard Model, which does not explain the emergence of the left-handed weak force when electroweak symmetry breaks at low energy.
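To make the smallness of the electron velocity concrete, here is a minimal sketch of the standard drift-velocity estimate v = I/(neA) in Python; the copper electron density and the wire cross-section are assumed illustrative values (a thin wire), chosen only to reproduce the order-of-magnitude figure quoted above:

# Drift velocity of conduction electrons carrying a 1 A current: v = I / (n e A).
I_current = 1.0    # current in amperes
n = 8.5e28         # conduction electrons per cubic metre in copper (standard value)
e = 1.602e-19      # electron charge in coulombs
A = 1.0e-7         # wire cross-section in square metres (0.1 mm^2, assumed thin wire)
c = 3.0e8          # speed of light in m/s
v_drift = I_current / (n * e * A)
print(f"drift velocity ~ {v_drift*1000:.2f} mm/s, v/c ~ {v_drift/c:.1e}")

This gives roughly 0.7 mm/s, i.e. v/c of order 10^-12, which is the point being made: the helicity effect for conduction electrons is tiny, just as the magnetic force they produce is relatively weak.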

In his post http://www.math.columbia.edu/~woit/wordpress/?p=5022 , Dr Woit states:

“Gerard ’t Hooft in recent years has been pursuing some idiosyncratic ideas about quantum mechanics … Personally I find it difficult to get at all interested in this (for reasons I’ll try and explain in a moment) … One of ’t Hooft’s motivations is a very common one, discomfort with the non-determinism of the conventional interpretation of quantum mechanics. The world is full of crackpots with similar feelings who produce reams of utter nonsense. ’t Hooft is a scientist though of the highest caliber, and as with some other people who have tried to do this sort of thing, I don’t think what he is producing is nonsense. It is, however, extremely speculative, and, to my taste, starting with a very unpromising starting point.

“Looking at the results he has, there’s very little of modern physics there, including pretty much none of the standard model (which ’t Hooft himself had a crucial role in developing). If you’re going to claim to solve open problems in modern physics with some radical new ideas, you need to first show that these ideas reproduce the successes of the established older ones. From what I can tell, ’t Hooft may be optimistic he can get there, but he’s a very long way from such a goal.

“Another reason for taking very speculative ideas seriously, even if they haven’t gotten far yet, is if they seem to involve a set of powerful and promising ideas. This is very much a matter of judgement: what to me are central and deep ideas about mathematics and physics are quite different than someone else’s list. In this case, the central mathematical structures of quantum mechanics fit so well with central, deep and powerful insights into modern mathematics (through symmetries and representation theory) that any claim these should be abandoned in favor of something very different has a big hurdle to overcome. Basing everything on cellular automata seems to me extremely unpromising: you’re throwing out deep and powerful structures for something very simple and easy to understand, but with little inherent explanatory power.”

’t Hooft commented on these remarks on the blog post (August 13, 2012 at 6:24 pm):


“Even though my work is here sketched as “not even wrong”, I will avoid any glimpse of hostility, as requested; I do think I have the right to say something here in my defense … I want to stress as much as I can that I am striving at a sound and interesting mathematical basis to what I am doing; least of all I would be tempted to throw away any of the sound and elegant mathematics of quantum mechanics and string theory. Symmetries, representation theory, and more, will continue to be central themes. I am disappointed about the reception of my paper on string theory, as I was hoping that it would open some people’s eyes. Perhaps it will, if some of my friends would be prepared to put their deeply rooted scepsis against the notion of determinism on hold. I think the mathematics I am using is interesting and helpful. I encounter elliptic theta functions, and hit upon an elegant relation between sets of non-commuting operators p and q on the one hand, with integer, commuting variables P and Q on the other. All important features of Quantum Mechanics are kept intact as they should. I did not choose to side with Einstein on the issue of QM, it just came out that way, I can’t help that. It is also not an aversion of any kind that I would have against Quantum Mechanics as it stands, it is only the interpretation where I think I have non-trivial observations.
If you like the many world interpretation, or Bohm’s pilot waves, fine, but I never thought those have anything to do with the real world; my interpretation I find far superior, but I just found out from other blogs as well as this one, that most people are not ready for my ideas. Since the mud thrown at me is slippery, it is hard to defend my ideas but I think I am making progress. They could well lead to new predictions, such as a calculable string coupling constant g_s, and (an older prediction) the limitations for quantum computers. They should help investigators to understand what they are doing when they discuss “quantum cosmology”, and eventually, they should be crucial for model building. G. ’t H.”

Dr Woit then responded (August 13, 2012 at 6:39 pm):


“Prof. ‘t Hooft,

“Thanks for writing here with your reaction to and comments on the blog posting. I hope you’ll keep in mind that I often point out that “Not Even Wrong” is where pretty much all speculative ideas start life. Some of the ideas I’m most enthusiastic about are certainly now “Not Even Wrong”, in the sense of being far, far away from something testable.

“While my own enthusiasms are quite different than yours, and lead me to some skepticism about your starting point, the reason for this blog posting was not to launch a hostile attack, but to point others to what I thought was an interesting discussion, one which many of my readers might find valuable to know about.

“Good luck pursuing these ideas, may you show my skepticism and that of others to be mistake…”

Subsequent anonymous comments, which Dr Woit allowed to be published, falsely claimed that ’t Hooft was wrong because Bell’s inequality had dismissed deterministic hidden variables theories:

Anonymous says:
August 13, 2012 at 8:10 pm


“Prof. ‘t Hooft,

“While I am not familiar with your particular work, I am familiar with previous explorations on the theme of interpretations on quantum mechanics and determinism, particularly with old things such as de Broglie-Bohm’s theory, Bell’s contextual ontological model, Kochen-Specker’s model, and newer things such as Harrigan & Spekkens classification of ontological models, Lewis et al. psi-epistemic model, Hardy’s excess baggage theorem, etc. But after studying them with interest for a while, I gradually developed the opinion that they have no good motivation, use uninteresting mathematics, and have been generally fruitless. Since then I have stopped paying attention to this area of research …”

What Dr Woit should learn is that Darwin was deterred from publishing his theory of evolution for twenty years, not just because of the religious taboo, but also because the earlier Lamarckian theory of evolution had contained errors and been rejected. There is an industry of “peer” review censorship liars, which responds to every new advance with something like:

“There is nothing new under the sun. While I am not familiar with your particular work, and can’t be bothered to read it and check it carefully, I am familiar with previous explorations on the theme. Because these are well known, their authors are of higher profile than you are, which proves them more intelligent than you. If they got it all wrong, what hope is there that your paper contains anything worthy of being published? Previous research had no good motivation, used uninteresting mathematics, and was generally fruitless. Since then I have stopped paying attention to this area of research. If you can convince someone like me, who won’t read your work or check it, that it is correct, then I will read your work. But note: I won’t read your paper until after you have convinced me. If I need to read your paper to be convinced, then too bad…”

Deterministic hidden variables theories and Bell’s inequality have nothing to do with real world physics, which isn’t 1st quantization. It’s like epicycles. You can still use Ptolemaic epicycles today to give rough predictions of apparent (two-dimensional celestial sphere) planetary positions, despite the theory having nothing to do with real world physics (planets have elliptical orbits around the sun, not epicycle orbits around the earth; Ptolemy’s model was complex and failed to account correctly for the distances of the planets from the earth). Just because 1st quantization looks good at first glance (just as the sun appears to orbit the earth at first glance) does not prove it to be relativistic and correct. This is of course completely taboo, despite being factually correct. The industry of “wavefunction collapse” popularization has succeeded in selling false epicycles to the public. There is no indeterministic wavefunction that collapses upon measurement; multiple wavefunctions exist, one wavefunction amplitude per path, and it is the interference of these multiple wavefunctions which gives rise to indeterminism. We still use words like “sunrise” even though we know that the earth’s rotation is bringing the sun into our field of view; the sun is not orbiting the earth daily. This is the situation with 1st quantization; the taboo over mechanisms in quantum field theory allows both 1st and 2nd quantization to co-exist side by side. Fine for rough calculations. Not so good for understanding what is really going on. When is multipath interference of many wavefunctions going to replace the non-relativistic single wavefunction “collapse” dogma?
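As an illustration of the “one amplitude per path” point, here is a minimal two-path sketch in Python; the action values are arbitrary illustrative numbers in units of the reduced Planck constant, not a real calculation. Each path contributes an amplitude exp(iS/ħ); summing and squaring gives the interference, with complete cancellation when the path actions differ by πħ:

import numpy as np

# Two-path interference: one amplitude exp(i*S/hbar) per path, summed, then
# squared to give a relative probability.  Delta S is the action difference
# between the two paths, in units of hbar (illustrative values only).
hbar = 1.0
for dS in np.linspace(0.0, 2 * np.pi, 9):
    amplitude = np.exp(1j * 0.0 / hbar) + np.exp(1j * dS / hbar)
    probability = abs(amplitude) ** 2
    print(f"Delta S = {dS:4.2f} hbar  ->  relative probability {probability:4.2f}")

The probability oscillates between 4 (constructive interference) and 0 (destructive interference) as the action difference varies; averaging over many paths whose actions differ by much more than ħ washes the interference out, which is why indeterminism only shows up on small (quantum) scales.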

History is the problem. Dirac in 1928 only half introduced 2nd quantization: he made the 1st quantization Schroedinger equation relativistic by his relativistic spinor equation for the Hamiltonian energy (which replaced the non-relativistic Hamiltonian of 1st quantization). While the spinor Dirac introduced implied that the field was quantized, Dirac failed to realize that the single wavefunction of the Dirac equation (the Schroedinger equation with Dirac’s relativistic Hamiltonian energy operator) was rendered obsolete by the quantized field. Interviewed in America when the Weyl gauge theory of quantum electrodynamics was published, Dirac stated that he didn’t understand Weyl’s work. The fact is, there is an amplitude (wavefunction) for every possible quantum field interaction with a charge, so you require a path integral, summing all the amplitudes, to make a probabilistic prediction of what will occur due to the interference of those amplitudes. This was finally grasped by Feynman, but continued to be opposed (and rejected) until Dyson battled Oppenheimer in 1948. There were numerous spurious reasons given by “greats” like Einstein, Bohr (who said that modelling electron orbit paths with a path integral was against the dogma of the uncertainty principle) and others, to dismiss path integrals as obviously wrong. They aren’t, but the taboo over their reality persists today, to the detriment of progress in physics. Instead of progress in the mechanism of quantum field theory being well funded and published, it is censored out, and the messengers with useful confirmed predictions are dismissed by people who are too grand even to read the messages and check them.

A long term solution to this problem would involve replacing today’s subjective and abusive form of so-called “peer” review (“your theory is about evolution so it must be wrong, because Lamarck came up with a theory of evolution which turned out to be wrong, and he is more famous and thus more intelligent than you are!”) with objective and scientific genuine peer review, where the “peer” reviewers are actual peers, interested in communicating progress in science more than in publishing fashionable papers by fashionable scientists. You know how this works in the real world. You put forward a confirmed prediction, and the response is a rhetorical question (to which answers are not permitted) or inaccuracy-filled “responses” which ignore the point you make and point out the errors in somebody else’s theory instead. As you calmly correct the errors and give scientific answers to the rhetorical questions, the “critic” becomes more and more infuriated, instead of being won over. Such people are not behaving rationally. The problem with science is not peer review, therefore, but the absence of peer review. If there were constructive criticism, there would be no problem. Instead, there is dogmatic bigotry masquerading as peer review. This corruption of peer-review power is getting worse.

One way to look at this is the nature of evolution or special relativity in the context of Popper’s definition of science. In special relativity, Lorentz contraction, time dilation and mass increase are all functions of the velocity of a particle relative to the observer. This relativism is also present in Maxwell’s equations, where a magnetic field is observed if an electric charge is in motion relative to the observer. This is all well justified by experiments. What’s not so clear is whether the great utility of relativism is a proof that there is no absolute motion or absolute time. This is where pedagogical sophistry comes in. What is science? Is it a proof of the nature of the universe, or just a way of making some falsifiable predictions? Teachers want both, despite the failures of past theories: a teacher has to attract students, and you do that better by offering “truths” about curved space or extra dimensions than by just making predictions with a handy mathematical model (whose ultimate physical validity is controversial).

Popper insisted that science is not absolute truth and is just a best guess theory, justified by the failure of experiments to disprove (falsify) it. Occam’s razor says science is the simplest theory to fit the facts. Feyerabend says in his book “Against Method” that science is pragmatic: it is whatever method works best for those who have to use it. Thus, if the people using a theory don’t need very great accuracy they can choose non-relativistic physics such as 1st quantization, but if they want better accuracy they have to go over to using relativistic 2nd quantization quantum field theory. Similarly, the Bohr atom is still taught in high school physics courses simply because it uses less sophisticated mathematics than 1st quantization quantum mechanics or 2nd quantization path integrals. This mathematics effect is very important: it introduces Orwellian doublethink into physics. People get used to false models being used for pragmatic purposes, to facilitate quick calculations.

Anyone trying to point out the “correct” theory in this situation is then dismissed as being ignorant of the fact that simplistic theories can be used for convenient calculations. In other words, wrong theories end up surviving and cluttering up the scene, preventing the right questions from being asked (since they allow the goalposts to be changed whenever a question is asked), and advanced mathematical theories are less widely understood than pedagogical sophistry like the claim that general relativity has experimentally proved space to be curved. Eugenics is such a wrong theory. Popper’s idea that you can falsify a theory by experimental test is naive. Anyone can usually add epicycles to a wrong theory to bring it into agreement with the data. The world is complicated, and sometimes it is impossible to avoid modifying a theory to include variables which were originally omitted and ignored. In other cases, it might be best to re-examine the foundations of the theory when experiments come out against it.

Newtonian gravity failed to predict the precession of the perihelion of Mercury. This did not “falsify” Newtonian gravity. People use the most useful available theory for the problem they have. There are issues with all theories, but this doesn’t falsify all theories in any sense. If science research runs up against a wall, there are two popular pieces of advice: (1) “when in a hole, stop digging”, and (2) “when going through hell, keep going.” These contradict each other. Diversity is needed in science, because it’s a subjective judgement call when the groupthink herd decides to move away from one particular idea, or to approach another idea. If everyone sticks to existing fashion, you end up with a technician-led science which is just concerned with applying and using existing theories like superstrings, not developing new ideas. So the superstring technicians keep hyping their fiddling as being “new” in press releases. Likewise, Ptolemaic epicycles were added to and modified for generations, giving the appearance of a dynamic, progressive scientific discipline, with spin-offs like trigonometry. It is the cult-like dogma of reigning “scientific” orthodoxies which leads to uninformed claims that they are justified by predicting gravity (non-quantitatively). Bertrand Russell said that, as a rival theory to evolution, God could have created the universe 5 minutes ago, including the fossil record, the works of Darwin, and everybody’s memories, just for entertainment. You cannot disprove this “simple theory”, because there are no falsifiable predictions. Just like superstring theory.
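For the record, the size of the effect in the Mercury example at the start of the last paragraph can be sketched with the standard general-relativistic formula for the perihelion advance per orbit, Δφ = 6πGM/[c²a(1−e²)]; the orbital constants below are standard textbook values, and the result is the famous ~43 arcseconds per century that Newtonian gravity leaves unexplained:

import math

# General-relativistic perihelion advance of Mercury per orbit:
#   delta_phi = 6*pi*G*M_sun / (c^2 * a * (1 - e^2))
GM_sun = 1.327e20        # G * M_sun in m^3/s^2
c = 2.998e8              # speed of light in m/s
a = 5.79e10              # Mercury's semi-major axis in metres
e = 0.2056               # Mercury's orbital eccentricity
period_days = 87.97      # Mercury's orbital period in days
dphi = 6 * math.pi * GM_sun / (c**2 * a * (1 - e**2))          # radians per orbit
orbits_per_century = 100 * 365.25 / period_days
arcsec_per_century = dphi * orbits_per_century * 180 / math.pi * 3600
print(f"GR perihelion advance ~ {arcsec_per_century:.0f} arcsec per century")

That unexplained 43 arcseconds per century was eventually accounted for by re-examining the foundations (general relativity), not by “falsifying” and discarding Newtonian gravity for everyday use, which is the point being made above.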

Nobel Laureate Prof. Josephson has a discussion of arXiv initially barring a paper of his at http://www.tcm.phy.cam.ac.uk/~bdj10/archivefreedom/main.html

Professor Josephson’s discussion of arXiv censorship finishes:

“It is true, of course, that standards should be maintained. But the problem with the uninspired persons who operate the archive is that they seem unable to make the distinction between ‘nutty’ ideas (which either have no scientific meaning or contain serious errors), which should be barred from the archive, and unusual ideas which may or may not be right, and also may turn out to be important, which should be allowed on the archive.”

The arXiv itself states at http://arxiv.org/help/endorsement :

What are my responsibilities as an endorser?

“The endorsement process is not peer review. You should know the person that you endorse or you should see the paper that the person intends to submit. We don’t expect you to read the paper in detail, or verify that the work is correct, but you should check that the paper is appropriate for the subject area. You should not endorse the author if the author is unfamiliar with the basic facts of the field, or if the work is entirely disconnected with current work in the area.”

This bans endorsers from permitting radical new ideas (“entirely disconnected with current work”), while permitting the more usual incremental development of stringy ideas. I.e., Copernicus would be banned since his solar system was entirely disconnected with current work on epicycles in the Earth-centred universe. Likewise, other radical new breakthroughs from outsiders like Patent Examiner Einstein would not fit into the current work. This disconnection from current work is the whole definition of a radical breakthrough. If arXiv had been around before quantum theory, it could have kept physics classical by deleting quantum submissions and blocking the hosting of those papers. “Better safe than sorry” has two sides to it when it comes to censorship. If you want to ban ideas without reading them to check them (you don’t have time, like Hitler), you’re into Nazi book-burning territory. It’s amazing how so many Guardian or Washington Post newspaper readers have no concern about the early symptoms of dictatorial fascism, and are prepared to declare that the press is free because their bigoted and incorrect views are represented without informed debate. In true Orwellian “1984” style, emotional claptrap is used to “justify” the banning of any meaningful dissent against the fashionable and popular ideology which aims to “save the world” by causing an insignificant decrease in carbon emissions at economically disastrous cost. “Four legs good, two legs bad”, as Orwell put the endless “protestor” bleatings in another book. This endless chanting of hype and half-truths actually works. That’s why they use it in adverts!

Update (5 September 2012): Dr Woit on the alleged abc conjecture proof by Shinichi Mochizuki

Proof of the abc Conjecture?

Link: Mochizuki’s paper, “Inter-universal Teichmuller Theory IV” (PDF).

“In the case of the Szpiro proof, the techniques he was using were relatively straightforward and well-understood, so experts very quickly could read through his proof and identify places there might be a problem. This is a very different situation. What Mochizuki is claiming is that he has a new set of techniques, which he calls “inter-universal geometry”, generalizing the foundations of algebraic geometry in terms of schemes first envisioned by Grothendieck. In essence, he has created a new world of mathematical objects, and now claims that he understands them well enough to work with them consistently and show that their properties imply the abc conjecture.

“What experts tell me is that, very much unlike the case of Szpiro’s proof, here it may take a very long time to see if this is really a proof. They can’t just rely on their familiarity with the usual scheme-theoretic world, but need to invest some serious time and effort into becoming familiar with Mochizuki’s new world. Only then can they hope to see how his proof is supposed to work, and be able to check carefully that a proof is really there, not just a mirage. It’s important to realize that this is being taken seriously because such experts have a high opinion of Mochizuki and his past work. If someone unknown were to write a similar paper, claiming to have solved one of the major open questions in mathematics, with an invention of a strange-sounding new world of mathematical objects, few if any experts would think it worth their time to figure out exactly what was going on, figuring instead this had to be a fantasy. Even with Mochizuki’s high reputation, few were willing in the past to try and understand what he was doing, but the abc conjecture proof will now provide a major motivation.” [Emphasis added to key sentences in bold print.]

This is precisely analogous to the rebuilding of the foundations of quantum field theory and the Standard Model built on it, which yields quantum gravity with checked predictions. The whole way of thinking about the “problems” of unifying the Standard Model with general relativity is traditionally biased in favour of the existing framework, built on foundations which are inadequate and misleading in some key respects. This means that, as with Mochizuki’s proof, you have a situation where “few if any experts would think it worth their time to figure out exactly what was going on, figuring instead this had to be a fantasy.” In other words, there is a pedagogical and marketing problem in presenting a predictive theory that renovates the foundations of a subject in order to work.

This statement by Dr Woit is enlightening in view of his statements in the past about “elitism” in science, which are only partly helpful. The world has always had different kinds of “elitism”:

1. Dictatorial obfuscation. Become “respected” by force or by cunning sneakiness. Appear mysterious by using secrecy or by obscuring the unpleasant facts that most people don’t want to hear.

2. Innovate, predict, check results, correct errors.

Both Woit and Witten have commented on “elitism” unhelpfully, by failing to distinguish which “elitism” they refer to. The word has two diametrically opposed meanings. It can mean the elite leadership of a dictatorship, media or popular fashion, or it can mean an attempt to achieve genuine scientific integrity (hence people like Galileo being put under house arrest for innovation). It is convenient for most people to conflate these two opposing meanings into Orwellian “doublethink”, so as to pretend that “science” is an all-inclusive term both for teaching “established” educational group-think dogma about today’s fashionable theories, and for innovating and being critical. Then they can switch between opposite meanings of the same word when people object to “elitism”. If critics object to “elitism”, they’re objecting to ignorant dictatorship, but Witten’s letter to Nature seems to conveniently move the goalposts at the critical moment, interpreting “elitism” not as ignorant dictatorship but as scientific integrity. We need more good “elitism”, and less bad “elitism”.

Identical semantic sophistry occurs with the word “censorship”, something that again we need more of in the positive sense. We need more censorship to objectively criticise fashionable speculation and to publish factual, confirmed predictions and corrections to errors in existing “well established” theories. We need less censorship of ideas on the basis that they contradict unconfirmed fashionable speculation. This fact, that we need more objective censorship, is routinely ignored. If you are constructively critical of censorship, censors try to “defend” themselves by lying that you are simply against “censorship”, and then “explaining” why “censorship” is necessary to reduce the noise level. Yes, censorship is necessary to reduce the noise level and so to allow communication of facts, but it must be objective, not based on fashion. We need objective censorship, not lazy censorship.

Data on cross-sections (relative reaction rates) for Higgs boson decay processes

Enrico Fermi suggested that when a neutron decays into a proton, electron, and antineutrino, the process is essentially the same interaction as a neutrino scattering off a neutron (a reaction with an effective cross-sectional target area, or “cross-section”), with a change of charge and mass, so that a proton and an electron emerge. This enabled weak decay to be treated as a “simple” particle scattering interaction, with an effective cross-section. In 1967 the “electroweak theory” was developed, which unified this feeble weak reaction with the electromagnetic gauge theory force (far stronger at low energy) by inserting a massive (80 GeV) charged W vector boson into the weak interaction process, this mass being necessary to explain the observed weakness of the weak force relative to the electromagnetic force (a numerical sketch of this relation follows the quotation below). The W boson, with the 80 GeV mass and other properties as predicted, was discovered in 1983 at CERN, and now the “Higgs” particle which is postulated to give the 80 GeV mass to the W boson has supposedly been discovered, again at CERN:

“The only fly in the ointment is its decay rate to two photons. This is nearly twice as large as expected. The significance of the discrepancy with the standard model is about 2.5 sigma. It could be a fluke. We have learnt to show some healthy skepticism when it comes to observations of physics beyond the standard model. However it is also consistent with an enhancement due to the presence of another charged boson. If that boson exists it must have a mass at least a bit larger than the W otherwise the Higgs would decay to this particle in pairs and we would see the effect on the other decay rates. It can’t be too massive otherwise it would not enhance the diphoton rate enough.” – Dr Philip Gibbs
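Here is the numerical sketch promised above, using the tree-level electroweak relation M_W² = πα/(√2 G_F sin²θ_W) between Fermi’s constant and the W mass; the input values are standard, and the ~78 GeV tree-level result is raised to the measured ~80 GeV by radiative corrections:

import math

# Tree-level electroweak relation between the Fermi constant G_F, the
# fine-structure constant alpha, the weak mixing angle, and the W boson mass:
#   M_W^2 = pi * alpha / (sqrt(2) * G_F * sin^2(theta_W))
alpha = 1 / 137.036      # fine-structure constant
G_F = 1.166e-5           # Fermi constant in GeV^-2
sin2_thetaW = 0.231      # sin^2 of the weak mixing angle
M_W = math.sqrt(math.pi * alpha / (math.sqrt(2) * G_F * sin2_thetaW))
print(f"tree-level W mass estimate ~ {M_W:.1f} GeV")

This is the quantitative sense in which the huge 80 GeV mass of the W boson accounts for the observed feebleness of Fermi’s weak interaction relative to electromagnetism at low energy.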

The latest data on the quantities of 125 GeV massive spin-0 bosons seen by the CMS and ATLAS detectors at CERN’s LHC can be compared to the Higgs boson cross-sections for different reactions (e.g. decay processes) predicted by the Standard Model of particle physics (the electroweak theory).  The results show that the ratios of observed/expected signals for different decays are:

1.0 for two neutral weak bosons (ZZ),

1.75 for two gamma rays, and

0.75 for two charged weak bosons (WW).

Dr Woit comments: “The bottom line is that, within errors, everything is consistent with the SM predictions. The gamma-gamma channel is the one to watch, it is about 2 sigma high.”
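To show what Dr Woit’s “about 2 sigma high” means in terms of the ratios listed above, here is a minimal sketch; the measured gamma-gamma ratio of 1.75 is from the data quoted above, but the uncertainty on it is an assumed illustrative value (the experimental error bars are not quoted here), chosen to reproduce the ~2 sigma remark:

# Converting an observed/expected signal ratio into a "sigma" statement.
ratio = 1.75         # observed / Standard Model expectation for gamma-gamma (from the data above)
uncertainty = 0.37   # assumed one-standard-deviation error on that ratio (illustrative)
sigma = (ratio - 1.0) / uncertainty
print(f"deviation from the Standard Model expectation ~ {sigma:.1f} sigma")

A 2 sigma excess has roughly a 1 in 20 chance of being a statistical fluke, which is why more data is needed before reading anything into the gamma-gamma channel.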

A preprint issued yesterday by Pier Paolo Giardino and others, called “Is the resonance at 125 GeV the Higgs boson?”, states: “The recently discovered resonance at 125 GeV has properties remarkably close to those of the Standard Model Higgs boson.”

A comment today by Mohit Sinha on Woit’s blog discusses the discrepancies in decays, suggesting that the 2-sigma excess in double gamma ray production (i.e. 2 statistical standard deviations on a Gaussian/normal distribution error curve; not to be confused with the observed/expected ratios) “could be pointing to another not-yet-discovered boson along with the Higgs-like boson just discovered”, while the deficit in double W production decay data may weaken the case for spin-0 and instead suggest that the new 125 GeV boson is a massive spin-2 boson (of relevance to quantum gravity gauge theories). The detection of double gamma ray decay rules out spin-1, which would violate the conservation of angular momentum (the Landau–Yang theorem), since gamma rays are spin-1, but it doesn’t rule out spin-0 or spin-2. If the low WW production debunks spin-0, then that would leave spin-2 by elimination. However, gravity itself is long-ranged and so its quanta can’t have rest mass, so if there is a spin-2 massive boson it’s not the graviton, although if quantum gravity is a gauge theory which connects into the Standard Model, you can expect some symmetry-breaking boson (although conventional stringy ideas would suggest that the quantum gravity symmetry breaking scale would be near the immense Planck mass, far greater than the LHC can see). But the most probable explanation is simply that the relatively small amount of data available on WW production in spin-0 decays has given an inaccurate result, which will improve when more data is accumulated.

One good example of a symmetry-breaking massive pseudo-Goldstone boson which acts as an exchange boson is the pion, which mediates the strong nuclear attractive force between nucleons (neutrons, protons) in the nucleus, keeping it bound together against the mutual electromagnetic repulsion of the protons.  The pion is a pseudo-Goldstone boson of broken QCD (chiral) symmetry, but acts as the exchange particle of the nuclear force.  Note that the pion is a composite particle, containing one quark and one anti-quark, each having spin-1/2; the combination acts as an effective boson (the two spin-1/2 constituents combine to integer total spin), just as superconductivity arises from Cooper pairs of electrons (fermions, each spin-1/2) coupling together to form effective “bosons”, which lose all electrical resistance and propagate like massive (slower than light) photons.  It’s possible that the spin-0 massive boson is a composite, by analogy to these examples.  The pion is not a fundamental particle, since it contains two fundamental particles, but nevertheless (1) it arises through symmetry breaking, and (2) it acts as the exchange boson for the nuclear-scale strong force (gluons of course mediate the QCD force between individual quarks).  What concerns me, as my paper shows, is that the electroweak Z boson’s 91 GeV mass seems to be the building block of the masses of fundamental particles.
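A tiny counting sketch of the composite-spin point (this is just standard angular-momentum addition, nothing specific to the model above): two spin-1/2 constituents give 2 ⊗ 2 = 3 ⊕ 1, so a composite of two fermions can behave either as a spin-1 or as a spin-0 boson:

from itertools import product
from collections import Counter

# Combining two spin-1/2 constituents (e.g. a quark and an antiquark, or the
# two electrons of a Cooper pair): the four product states decompose into a
# spin-1 triplet plus a spin-0 singlet.  This just counts the total S_z values.
sz_values = [+0.5, -0.5]
totals = Counter(s1 + s2 for s1, s2 in product(sz_values, repeat=2))
print(dict(totals))   # {1.0: 1, 0.0: 2, -1.0: 1} -> triplet (S = 1) plus singlet (S = 0)

The pion sits in the spin-0 combination, which is exactly why a quark-antiquark composite can appear as a massive spin-0 boson at the nuclear scale.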

Another commentator today on Woit’s blog, “truth” (who seems to think like a string theorist) claims: “The Goldstone models couple to the W, Z bosons to give them mass and the vev gives mass to the fermions. None of that requires the extra degree of freedom which is the Higgs boson. The only reason we have to add this extra degree of freedom is to ensure the theory is unitary at high energies. So what the LHC has discovered is that unitarity is respected by nature. This is the real content of the discovery. It is quite interesting to me that unitarity is the guiding principle of string theory, i.e., string theory is the only known consistent theory of gravity that exactly respects unitarity. This is extremely interesting.”

This is just a circular argument or assertion of dogma.  The data available are no proof that the massive spin-0 boson detected is precise confirmation of the electroweak theory with Higgs mechanism, so interpreting the data this way and then asserting that this speculative assertion amounts to a proof of unitarity and string theory is absurd.

Another commentator on Woit’s blog, David Nataf, points out that there is a “4-sigma signal of a gamma-ray emission line (that could be a dark matter annihilation line) toward the Galactic center at an energy 130 GeV”, i.e. close in energy to the 125 GeV massive spin-0 LHC particle, in the papers “A Tentative Gamma-Ray Line from Dark Matter Annihilation at the Fermi Large Area Telescope” by Christoph Weniger, http://arxiv.org/abs/1204.2797 , and “Strong Evidence for Gamma-ray Line Emission from the Inner Galaxy” by Meng Su and Douglas P. Finkbeiner, http://arxiv.org/abs/1206.1616 . Nataf states: “The first version of the abstract of the second paper comments on how the energy is very close to that of the Higgs, I think they suggest the dark matter particle might decay into the Higgs.”

Peter Shor in the same comments section on Woit’s blog states: “you can easily add sterile heavy right-handed neutrinos to the Standard Model, and that these could both explain dark matter and the low mass of the left-handed neutrinos [using the see-saw mechanism], so maybe Occam’s razor actually predicts the Standard Model with added heavy sterile neutrinos.”  Massive (125 GeV) right-handed neutrinos could decay, but since they are fermions (with spin-1/2) it’s hard to see how they can decay into bosons (with integer spin), unless there is some mechanism for spin angular momentum to be conserved.  For example, to conserve spin angular momentum, a massive spin-0 boson could be emitted when a 125 GeV right-handed neutrino decayed into a left-handed, trivial-mass neutrino.
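For readers unfamiliar with the see-saw mechanism Shor mentions, the standard relation is that a heavy right-handed (sterile) Majorana mass M_R suppresses the light neutrino mass to roughly m_D²/M_R, where m_D is the Dirac mass. A minimal sketch, with illustrative masses that are assumptions (not values from Shor’s comment):

# See-saw estimate of the light neutrino mass: m_light ~ m_D^2 / M_R.
m_D = 100.0      # Dirac mass in GeV (assumed, electroweak scale)
M_R = 1.0e15     # heavy sterile right-handed neutrino mass in GeV (assumed)
m_light_GeV = m_D**2 / M_R
print(f"light neutrino mass ~ {m_light_GeV * 1e9:.3f} eV")

With these assumed values the light neutrino mass comes out around 0.01 eV, which is the sense in which very heavy sterile neutrinos would “explain the low mass of the left-handed neutrinos”; note that a sterile neutrino as light as 125 GeV would need a much smaller Dirac mass to do the same job.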