Nature peer-reviewed article debunking computer model doomsday climate change predictions is censored by mainstream media

climate change scaremongering scandal

“It has been claimed that the early-2000s global warming slowdown or hiatus, characterized by a reduced rate of global surface warming, has been overstated, lacks sound scientific basis, or is unsupported by observations. The evidence presented here contradicts these claims.” – John C. Fyfe, et al., Making sense of the early-2000s warming slowdown, Nature Climate Change journal, v6, March 2016, page 224 (below).

Nature debunks climate change hype liars

Summary of the key evidence of the failure of doom predictions from 124 simulations of 24 CMIP-5 scare-mongering computer models, as published in the peer-reviewed journal Nature Climate Change. This confirms that the official models, which ignore negative feedback, are wrong, as explained in detail in our “Failure Evidence for All 21 IPCC Positive-Feedback Climate Models”. They can be fitted to 195-98 (by means of epicycles and fiddles), but are debunked by the latest observations.

1. In the most widely hyped scare-mongering climate change report, the IPCC’s 2007 assessment, all 21 of its “different” models identically ignored negative feedback entirely, while including all-positive feedback from water. Thus they implicitly and wrongly assumed that warm moist air which absorbs heat doesn’t rise, expand and condense into sunlight-absorbing clouds well above the ground, ocean and icecaps.

2. Dr Roy Spencer’s peer-reviewed paper found that you can only identify negative feedback in specific tropical weather systems, where CO2 heating of the ocean generates cloud cover that soon wipes out surface temperature rises (below the cloud bases). Idiot “critics” then pointed out that you don’t see water’s negative feedback in other data, pertaining to land (no water to evaporate) and clear skies (no cloud cover): you can only see it over the tropical monsoon systems that Spencer studied. This, the “critics” claimed falsely, debunked Spencer’s findings. (By similar crackpot “reasoning”, the absence of ice in the sun would be held to prove that ice doesn’t exist.)

3. Correlation is not causation, so a biased selection of data showing temperature correlating with CO2 doesn’t validate the simplistic greenhouse model of CO2 controlling the climate. In a “greenhouse” with an atmosphere roughly 100 miles deep and 71% ocean area, CO2 heating inevitably causes additional water evaporation: moist air absorbs sunshine, heats, expands and rises buoyantly until it reaches cold air, where it forms additional clouds. The upper surfaces of those clouds heat up, reflecting and also absorbing energy, trapping the “energy imbalance” far away from the ground, ocean and ice caps.

4. This is “negative feedback” from water: the heating of the atmosphere is seen by satellites via albedo (reflected sunlight) and via microwave sensors that measure the temperature of atmospheric oxygen. The satellite temperature data is biased against recording any negative feedback at all, because negative feedback by its nature only occurs under cloud cover (evaporation causes more cloud cover, negating, as Spencer found, most of the warming effect of CO2).

5. The main driver of temperature, as Nigel Calder (1950s New Scientist editor) recently proved, is cloud seeding by natural cosmic rays. This is the inverse of the “no go theorem” used by deniers of natural climate change, who assume that energy delivery must be proportional to temperature. In the case of cosmic rays, as demonstrated by C.T.R. Wilson’s Nobel Prize-winning “Wilson cloud chamber”, the more ionizing radiation there is, the more ion trails exist for water droplets to condense upon, and thus the more cooling by clouds. The bigoted human climate change crackpots ignore this vital mechanism, and instead claim that Calder’s inverse correlation between cosmic ray intensity and climate temperature debunks the role of cosmic rays! In fact, it proves it, since cosmic rays boost the ionization that causes water vapour to condense into clouds, while delivering insignificant heating energy themselves!
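As a purely hypothetical illustration of the negative-feedback argument in points 3 and 4 (this is not a calculation from the Nature paper; every number below is an assumption chosen for illustration), a zero-dimensional energy-balance sketch shows how a cloud term that grows with warming shrinks the equilibrium temperature response to a CO2 forcing:

```python
# Toy zero-dimensional energy balance (all values are illustrative assumptions):
# at equilibrium, forcing = (Planck response + cloud feedback) * warming,
# so a negative cloud feedback term in the denominator damps the warming.

PLANCK = 3.2          # W/m^2 per K of Planck radiative response (assumed)
CLOUD_FEEDBACK = 2.0  # W/m^2 of extra cloud reflection per K of warming (assumed)
CO2_FORCING = 3.7     # W/m^2, conventional figure for a CO2 doubling (assumed)

def equilibrium_warming(forcing, cloud_feedback):
    """Equilibrium dT (K) satisfying forcing = (PLANCK + cloud_feedback) * dT."""
    return forcing / (PLANCK + cloud_feedback)

no_clouds = equilibrium_warming(CO2_FORCING, 0.0)
with_clouds = equilibrium_warming(CO2_FORCING, CLOUD_FEEDBACK)
print(f"warming, clouds ignored:          {no_clouds:.2f} K")
print(f"warming, negative cloud feedback: {with_clouds:.2f} K")
```

With these assumed numbers the same 3.7 W/m² forcing gives roughly 1.2 K of warming without the cloud term but only about 0.7 K with it; the point is qualitative, namely that a feedback term added to the denominator damps the response.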

debunked hockey stick

Above: as we’ve long explained using controlled experiments with ice outdoors in the sun and in the shade, and with trees in shade and sun, tree-ring growth proxies and ice sublimation (oxygen-16 to oxygen-18 ratio data, since the lighter water molecules in ice sublime more readily, with less energy) are debunked by negative feedback (cloud cover increases due to increased evaporation from warming oceans in a real-world “greenhouse” with oceans). In a nutshell, tree growth and ice sublimation don’t respond to temperature in the way observed in controlled experiments where cloud cover isn’t varied.

In the real world, the mean percentage of the sky covered by cloud increases with ocean temperature, because evaporation increases the humidity and thus the percentage of the earth covered by saturated air (clouds). This increase in cloud with temperature cuts down the solar radiation reaching trees and ice, shading them and offsetting the effects of air temperature variation! Thus the flat part of Michael Mann’s hockey stick is not a real constant temperature, but provably just a misinterpretation of the proxies. You cannot determine temperatures from ice sublimation or tree-ring growth, because as mean air temperature rises, mean cloud cover also increases (evaporation of water from warm oceans), causing negative feedback and offsetting the effect of the air temperature increase. It proves impossible to get the few climate-hype-skeptical journalists, like James Delingpole, to grasp this. This universally suppressed mechanism proves that climate is naturally far more variable than Mann indicates using his unreliable ice and tree proxies.

Mann’s rising part of the hockey stick (20th-century direct temperature measurements) is more reliable, but disagrees with tree records from the same period. Instead of using this fact to debunk the entire set of tree and ice “proxies”, he simply cuts and pastes in the direct measurements, ignoring the discrepancy. Journalists are complicit in this cover-up, making speculative or strawman-style arguments instead of sticking to hard facts. Once you grasp the mechanism, you can see that the recent apparent correlation of temperature to CO2 level isn’t impressive, since natural variability means that at any given time the temperature is about 50% likely to be rising naturally and 50% likely to be falling. It’s not a flat line that suddenly goes up when CO2 emissions rise.
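The claim that cloud shading can flatten a temperature proxy can be sketched numerically. This is a toy illustration only (the functions and every number in them are assumptions, deliberately tuned so that the shading exactly cancels the temperature signal); it is not a model of real tree rings or ice cores:

```python
# Illustrative toy (all numbers assumed and tuned for exact cancellation):
# a "proxy" that responds to sunlight as well as to air temperature can read
# flat when cloud cover rises with temperature, because the extra shading
# removes as much of the growth signal as the extra warmth adds.

def cloud_fraction(temp_anomaly):
    # Assumed: each 1 K of warming adds 5% cloud cover over a 50% baseline.
    return min(1.0, 0.5 + 0.05 * temp_anomaly)

def proxy_response(temp_anomaly):
    # Assumed: the proxy gains from warmth but loses from cloud shading.
    sunlight = 1.0 - cloud_fraction(temp_anomaly)
    return temp_anomaly * 0.1 + (sunlight - 0.5) * 2.0

for t in [-2, -1, 0, 1, 2]:
    print(t, round(proxy_response(t), 3))  # reads near zero at every temperature
```

With these tuned numbers the proxy reads essentially zero across a 4 K temperature range, which is the article’s argument in miniature: a flat proxy record need not mean a flat temperature record if cloud cover co-varies with temperature.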

This Nature paper debunking the official models is being ignored by the BBC, just as my paper explaining the mechanism is ignored by Nature. The mainstream media avoids reporting on direct scientific controversy, and scientific investigative reporting is taboo.

This is also relevant to the quantum field theory controversy. The mainstream media’s reverence for science’s “expert authority” (where it happens to suit their political agenda) can be amusingly debunked by applying the same position to their political reporting, as follows:

  1. Only the government’s own famous politicians in charge are contemporary “expert authorities” in politics because they have power and full access to secret data, so only their speeches and writings are worthy of reporting. Opposition politicians are not in power, don’t know all the facts, and certainly aren’t in a position of similar authority.
  2. Anyone criticising the government is unfashionable by definition and thus boring.
  3. Reporting criticisms of the government will confuse the public, who won’t know what to believe.
  4. Trying to get to the bottom of a controversy by looking at evidence and facts takes more time, effort, and expertise than the media have. More money can be made more easily from fashionable celebrity interviews and from censoring out fact-based criticisms and alternative ideas.

These tactics by the mainstream media in politics would turn democracy into dictatorship. So why on earth do they do the same in science, which is supposed to be liberal with regards to freedom of information, new ideas and criticism of dogma?

(This post on politically corrupted media pseudo-science is cross-posted on our other blog, here.)

In the same way, nuclear weapons effects are routinely exaggerated by using idealized test data for blast and radiation transmission in open deserts, and from people caught outdoors (not in concrete buildings) in the low-skyline Japanese cities of Hiroshima and Nagasaki in 1945. The exaggerations are then used by anti-nuclear bigots like CND not to argue for lower yields, but to try to ban nuclear weapons altogether. However, this is debunked by inspection of declassified surveys proving excellent survival rates in concrete buildings in Hiroshima and Nagasaki, where fires were extinguished with simple water buckets (the firestorm peaked 2-3 hours later, not instantly as in CND-type propaganda). It is further debunked for nuclear terrorism by recent studies of blast and radiation transmission in modern concrete skylines, which greatly absorb energy from the blast wave, attenuating it, as well as absorbing radiation. Few people end up with burns, blast injuries or radiation sickness inside modern concrete buildings. But even if they did, and even if nuclear weapons effects were as bad as the exaggerations claim, couldn’t we just reduce the yields of stockpiled weapons instead of disarming? This is never discussed, a fact which tells you what’s going on.
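The yield-reduction point can be made quantitative with the standard cube-root blast-scaling law (as tabulated in Glasstone and Dolan’s The Effects of Nuclear Weapons): the distance at which a given peak overpressure occurs scales as the cube root of the yield, so halving the acceptable blast range permits an eight-fold cut in yield. The reference range used below is an arbitrary placeholder for illustration, not survey data:

```python
# Cube-root blast scaling: range for a fixed peak overpressure ~ yield**(1/3).
# reference_range_km is an assumed placeholder, not a measured figure.

def scaled_range(yield_kt, reference_range_km=1.0, reference_yield_kt=1.0):
    """Distance at which a given overpressure occurs, scaled from a reference yield."""
    return reference_range_km * (yield_kt / reference_yield_kt) ** (1.0 / 3.0)

# Raising the yield 8-fold only doubles the blast range; equivalently,
# halving the required range lets the stockpiled yield drop 8-fold.
print(scaled_range(8.0))      # about 2x the reference range
print(scaled_range(1.0 / 8))  # about half the reference range
```

This free-field scaling applies to open terrain; the article’s further claim is that modern concrete skylines attenuate the blast wave well below these open-desert figures.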

As physicist Richard P. Feynman put it in his lecture “This Unscientific Age”, since nuclear fallout is insignificant compared to natural background radiation, the anti-nuclear folk should logically be more concerned with banning natural exposure than with nuclear test fallout (e.g., banning the naturally radioactive potassium-40 in coffee and bananas, and the natural cosmic radiation received on mountain tops, in Denver, and in air travel). It is simply untruthful to hype a smaller threat as a danger while ignoring a natural one a hundred times greater, and it is untruthful to claim that natural hazards are unavoidable. In other words, idealistic politics, not genuine nuclear safety, drives the CND folk. Some are deluded by lying personalities and nasty pseudo-scientists, but most can grasp that we need compact nuclear weapons to deter the invasions and military attacks that set off both world wars, when bulky, expensive conventional arms and mobilization not only failed to deter war, but helped to start it in 1914. As with eugenics, today’s media accepts anti-nuclear bigotry through its lazy reliance on “science authority”. When you take account of the actual scaling of nuclear weapons effects in realistic cities, those effects are not of a different order to conventional weapons: the millions of conventional weapons in a large war are actually equivalent to the thousands of nuclear weapons in a stockpile, and there is no “overkill”. The use of weapons to produce particular effects such as fallout, akin to the lingering mustard gas bomb fears of WWII, was deterred and also largely negated by simple countermeasures and widespread education in civil defence.

In all of these examples, the media refuses to engage with the scientific arguments, preferring to quote “authority” figures and personalities instead. This is precisely why the public remains ill-informed and why the controversies are never ended by hard factual debunking of propaganda. Fundamental physics controversies are similarly treated as taboo by the media, which claims the subject is mathematically “boring” and far beyond the skills of journalists to engage with. Instead, obsolete and often wrong interpretations of the equations are given, such as the notion that there is a single amplitude or wavefunction associated with a particle (that is the false first-quantization theory, debunked by Dirac’s second quantization and by Feynman’s path integral). A particle in a “quantum computer” doesn’t have a single wavefunction amplitude which remains unchanging and indeterminate until measured, storing entangled state information that can be used to compute. Instead, as Feynman showed clearly in 1985, in relativistic quantum mechanics it is being endlessly affected by random interactions with field quanta. There is one wavefunction amplitude for every one of these interactions, and they must all be summed: the electron’s state is continually being changed by discrete quantum interactions with its particulate Coulomb field. This has never been clearly explained in the popular media, to debunk Bohr’s and Schroedinger’s incorrect (non-relativistic) belief in a single wavefunction amplitude per particle. Enforced ignorance and apathy result.
