Edward A. Schuert’s fallout path integral mapping system for understanding the QFT path integral mechanism of multipath interference

Above: is there any relationship between Edward A. Schuert's method of plotting particle trajectories, falling from a height through air with differing wind speeds and directions at different altitudes, and the quantum field path integral? At first sight, none, because the path integral considers multiple paths with the same point of origin and termination, whereas the fallout particle trajectories shown above don't all end up at the same termination point (shown for the 4.5 megaton "cleaner" 95% fusion Los Alamos Redwing-Navajo test at Bikini lagoon in 1956, a device of only 5% fission yield, or one tenth the fallout of typical Teller-Ulam devices; an innovation opposed automatically by everyone in the world, arms controllers and the military alike, who all loved dirty bombs). Additionally, the whole point of the path integral is that each particle has a phase vector, usually denoted by exp(iS) in complex space (although if you lack appreciation for mathematics you can, for bog standard calculations, replace exp(iS) with cos S, S being the action expressed in units of Planck's parameter, h, divided by twice Pi). This action is the Lagrangian energy of a given path integrated over time. So the path integral is a double integral, in which you're calculating the sum of the phases of all possible paths, each with its own Lagrangian integral over time. If we are going to consider only very simple phenomena, perhaps we can go one step further, replacing S in cos S with the Hamiltonian-time product, S ~ -Ht, so that exp(iS) gets replaced with cos(-Ht), where H is the Hamiltonian energy (multiplied by twice Pi, and divided by Planck's parameter, h). Then, as Feynman shows in the Schuert-like spatial "path integral" diagrams of his 1985 book QED for light reflection, you just have a phase vector in each photon which rotates with a frequency equal to the frequency of the photon (duh!) as the photon moves!
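To make the "rotating arrow" point concrete, here is a minimal numerical sketch (my own toy illustration with made-up path lengths and an assumed wavelength, not Schuert's or Feynman's calculation): each path gets a phase arrow exp(iS), with S taken as 2*Pi times the path length divided by the wavelength for a freely propagating photon, and the multipath interference amplitude is simply the sum of the arrows. The cos S shortcut mentioned above is just the real part of each arrow.

```python
import numpy as np

# Toy illustration (hypothetical numbers): each photon path of length L contributes
# a phase arrow exp(iS), with S ~ 2*pi*L/wavelength for free propagation, i.e. the
# arrow rotates once for every wavelength travelled, as described above.
wavelength = 500e-9                        # assumed 500 nm light
path_lengths = np.array([1.000000e-3,      # three slightly different path lengths (metres)
                         1.000125e-3,
                         1.000250e-3])

S = 2 * np.pi * path_lengths / wavelength  # phase (action in units of h-bar)

arrows = np.exp(1j * S)                    # the full complex phase vectors exp(iS)
cos_shortcut = np.cos(S)                   # the "cos S" shortcut: just the real part

resultant = arrows.sum()                   # multipath interference amplitude
print("resultant amplitude:", resultant)
print("probability ~ |amplitude|^2:", abs(resultant) ** 2)
print("cos S for each path:", cos_shortcut)
```

Nudge one of the path lengths by a quarter of a wavelength and the arrows stop lining up; that misalignment is all that "destructive interference" means here.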

So, looking again at Schuert's graph, and comparing it to the QFT path integral as Feynman depicts it in several spatial (not space versus time, as classically done) graphs in his 1985 QED book, you can develop a clearer understanding of what's really going on in the latter. For example, suppose Schuert had wanted not to see the "big picture" of where particles end up, but merely wanted to see which fallout particles arrive at a fixed spatial point in the fallout area. Then he would ignore all particle trajectories that didn't end up at that termination point. All he wants to know, then, is what arrives at the designated location. This helps in understanding what the hell is going on, if you want a mechanical understanding of the universe, which we do, if we aren't all 100% total quacks.

In the path integral, you're working out the multipath interference amplitude by summing all possible spatial paths, where the individual paths have a phase amplitude that's a function of the action (K.E. – P.E. integrated over a fixed time for a path; the amplitude is always for multipath interference where, at a given time, the paths arrive at a fixed spatial point to interfere). This treats space and time differently, and Dr Woit argues for using the Euclidean, not the Minkowski, signature for such integrals. In the usual path integral for SM particle physics cross-section and reaction-rate type calculations, the amplitudes for the different paths vary, due to varying SPATIAL configurations over a FIXED TIME for all the paths involved (every path integrated must arrive at its spatial end point at the SAME time), and are summed to give the total amplitude at a FIXED SPATIAL ENDPOINT LOCATION and for a FIXED TIME. Schuert's plots, and Feynman's revolutionary all-spatial path integral diagrams in his 1985 QED book, are a step forward in physical understanding that will forever be ignored by the epicycles/supersymmetry nuts. Shame!
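As a sketch of this fixed-endpoint idea (the geometry and numbers here are my own assumptions, loosely modelled on Feynman's mirror-reflection figures in the 1985 QED book, not taken from it): every path runs from the same source, bounces off some point on a mirror, and ends at the same detector; the phase of each arrow is set by the total path length, and when you add the arrows, only the paths near the classical equal-angle reflection point have slowly varying phase and add up coherently.

```python
import numpy as np

# Fixed-endpoint arrow sum (illustrative geometry only): source -> mirror point x -> detector.
# Every path ends at the SAME detector; only the reflection point x along the mirror varies.
wavelength = 0.03                         # 3 cm microwaves, so the sampling below is adequate
source = np.array([-1.0, 1.0])            # (x, y) in metres; the mirror lies along y = 0
detector = np.array([+1.0, 1.0])
x = np.linspace(-2.0, 2.0, 4001)          # candidate reflection points on the mirror

leg1 = np.hypot(x - source[0], source[1])      # source-to-mirror path length
leg2 = np.hypot(detector[0] - x, detector[1])  # mirror-to-detector path length
S = 2 * np.pi * (leg1 + leg2) / wavelength     # phase of each path's arrow

full = np.exp(1j * S).sum()                    # sum over ALL reflection points
near = np.abs(x) < 0.5                         # strip around the classical reflection point x = 0
partial = np.exp(1j * S[near]).sum()

print("|amplitude|, all paths:           ", abs(full))
print("|amplitude|, near-classical strip:", abs(partial))
# The strip around x = 0 supplies most of the resultant arrow; arrows from paths far
# from the classical reflection point spin rapidly and largely cancel among themselves.
```

That is the "least time" law of reflection emerging from nothing but arrow addition at a fixed spatial endpoint, which is exactly the point of Feynman's all-spatial diagrams.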

There are loads of other "clues". One massive issue which again is totally ignored by the mainstream (including PW) and by popular science writers is that the quantum electrodynamic propagator has the form 1/(m + k)^2, where m is the term for the virtual massive (short-ranged) electromagnetic field quanta (e.g. the virtual electrons and positrons that contribute vacuum polarization shielding and other effects between the IR and UV cutoffs), and k is the term for the massless (infinite-range) classical Coulomb field quanta (virtual photons, which cause all electromagnetic interactions at energies lower than the IR cutoff, i.e. below collision energies of about 1 MeV, the minimum energy needed for pair production of electrons and positrons).

The point is, you have two separate contributions to the mass of a particle from such a propagator: k gives you the rest mass, while m gives you additional mass due to the running coupling at collision energies above 1 MeV. (See for instance Fig. 1 in https://vixra.org/pdf/1408.0151v1.pdf.)

The fact that you can separate the Coulomb propagator's classical mass of a fermion at low energy (below 1 MeV) from the increased mass due to the running coupling at higher energy proves that there's a physical mechanism for particle masses in general: the virtual fermions must contribute the increase in mass at high energy by vacuum polarization, which pulls them apart, taking Coulomb field energy and thus shielding the electric charge (the experimentally measured and proved running of the QED coupling with energy). In being polarized by the electric field, the virtual positron and electron pair (or muon or tauon pair, or whatever) soaks up real electric field energy E in addition to Heisenberg's borrowed vacuum energy (h-bar/t). So the virtual particles must have a total energy E + (h-bar/t), which allows them to turn the energy they have absorbed (in being polarized) into mass. This understanding of the two terms in the propagator, m and k, therefore gives you a very simple mechanistic basis for predicting all particle masses, m, which shows how the mass gap originates from treating the propagator as a simple physical model of real phenomena, and not as sacred scripture dictated by a God. But efforts like this to explain all SM parameters and masses are unfashionable to superstringers.
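For reference, the experimentally measured running of the QED coupling that this argument leans on can be written down independently of any mass mechanism. Below is a minimal sketch of the standard one-loop, leading-logarithm formula with only the electron loop included (the formula and the roughly 1 MeV pair-production threshold are textbook QED; the particular energies chosen are my own illustrative values):

```python
import math

ALPHA_0 = 1 / 137.036     # low-energy (below the IR cutoff) fine-structure constant
M_E_MEV = 0.511           # electron rest mass-energy in MeV, setting the ~1 MeV pair threshold

def alpha_running(q_mev: float) -> float:
    """One-loop, leading-log QED running with only the electron loop included.

    Textbook form: alpha(Q) = alpha0 / (1 - (2*alpha0/(3*pi)) * ln(Q/m_e)) for Q >> m_e;
    below the pair-production scale the coupling just stays at ~1/137.
    (Sketch only: a real comparison also needs the muon, tau and hadronic loops.)
    """
    if q_mev <= M_E_MEV:
        return ALPHA_0
    return ALPHA_0 / (1 - (2 * ALPHA_0 / (3 * math.pi)) * math.log(q_mev / M_E_MEV))

for q in (0.511, 1.0, 1000.0, 91_188.0):   # MeV; 91,188 MeV is roughly the Z boson mass
    print(f"Q = {q:>9.1f} MeV   1/alpha(Q) = {1 / alpha_running(q):6.2f}")
```

The electron loop alone takes 1/alpha from about 137 at low energy down to roughly 134.5 at the Z mass; adding the heavier charged-particle loops brings it to the measured value of about 129, which is the experimentally proved "running coupling" increase referred to above.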

The media's continued obsession with Nukemap lies (run by a colleague of John Horgan's at the Stevens Institute in the home of evil, America) has been driving me stir crazy with anger, as it continues the lying "WMD disarmament will lead to universal love and world peace" mythology (debunked for poison gas in the 1930s, when Angell, Noel-Baker, and other Nobel Peace Prize winning liars used it to start WWII, with no retribution at all, just as Nobel Laureate Dr Carrel's call for gas chamber eugenics in 1938, in a book that became a German best-seller called Man the Unknown, went unpunished): the mythology that nuclear weapons shouldn't be used to deter the invasions that kill kids. The Nukemap guy has a blog, and years ago I explained on my nuclear weapons facts blog why he was wrong, in a post about his deceit, receiving the first and only comment from him on my blog, so I deleted the post (though it is saved on a HDD with his comment), hoping he'd correct his stuff. But he didn't. It delivers what people want: lies that nuclear weapons can only be used to kill huge numbers of civilians, not to stop invasions and DETER war. There's nothing about the most important effect of nuclear weapons in Nukemap: DETERRENCE IS AN EFFECT OF NUCLEAR WEAPONS YOU IGNORE AT YOUR PERIL, AND AT THE PERIL OF UKRAINIAN KIDS, AND, IN FUTURE, AMERICAN KIDS.

I remember the same frustration at groupthink fascism (yep, fascism is what killing kids for eugenics pseudoscience, or whatever Marxists use as their so-called "excuse", amounts to, and get lost Mr "Godwin's law") back in 1997, when I met the physicist David A. Chambers, who had done integrals of the energy delivered on the screen in the Feynman-style double-slit experiment, using a laser and photomultiplier. The key thing here, as I saw when he showed me, was that you can make pinholes (yep, with a small pin!) in the screen at key places and analyse what light gets through: the "interference fringe" spots where photons arrive and cancel out or reinforce (depending on whether their amplitudes are in phase or out of phase, as discussed above – see the figures in Feynman's 1985 book, QED: The Strange Theory of Light and Matter) can be the pinhole locations! You can learn a lot about the mechanics this way, even without firing photons one at a time, if you measure the amount of light getting through. Also, when you actually do the experiment (rather than looking at the exaggerated diagrams in certain obfuscating books), the gap between the two slits in front of the screen has to be so small that there is no mystery as to what is going on: the transverse extent of a photon will always overlap both slits, and so be affected by both! The whole mystery arises from the artificially exaggerated gap between slits shown in textbooks. With such a large gap between two slits, you simply don't get interference fringes! Anyway, I checked and published Chambers's paper in the first issue of Science World magazine, ISSN 1367-6172 (if I remember the barcode correctly), and then got simply ill-informed abuse and death threats in response from, I think, some nutters at Hull University. The police weren't interested in this, unsurprisingly. You can't tell the facts without upsetting nutters!
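To put rough numbers on the slit-gap point (all values below are my own assumptions for illustration: a 633 nm He-Ne laser and a screen 2 m away, nothing from Chambers's actual apparatus), here is a toy two-arrow sum, one arrow per slit, at each point on the screen; the fringe spacing it produces shrinks as the slit gap grows, so an exaggerated textbook-sized gap gives fringes far too fine to see, quite apart from the coherence issue just discussed.

```python
import numpy as np

# Toy two-slit phase-arrow sum (illustrative numbers only).  Each slit contributes
# one arrow exp(i*2*pi*L/wavelength) at a given screen position y; the detected
# intensity is |sum of the two arrows|^2, which gives the familiar fringes.
wavelength = 633e-9            # assumed He-Ne laser
screen_distance = 2.0          # metres from slits to screen
y = np.linspace(-0.02, 0.02, 4001)   # positions across the screen (metres)

def intensity(slit_gap: float) -> np.ndarray:
    top = np.hypot(screen_distance, y - slit_gap / 2)     # path length from one slit
    bottom = np.hypot(screen_distance, y + slit_gap / 2)  # path length from the other slit
    arrows = np.exp(2j * np.pi * top / wavelength) + np.exp(2j * np.pi * bottom / wavelength)
    return np.abs(arrows) ** 2

for gap in (0.1e-3, 10e-3):    # a realistic 0.1 mm gap vs an exaggerated 1 cm "diagram" gap
    pattern = intensity(gap)
    peaks = np.where((pattern[1:-1] > pattern[:-2]) & (pattern[1:-1] > pattern[2:]))[0] + 1
    measured = np.mean(np.diff(y[peaks])) if len(peaks) > 1 else float("nan")
    predicted = wavelength * screen_distance / gap        # small-angle formula, lambda*D/d
    print(f"slit gap {gap*1e3:5.1f} mm: fringe spacing ~{measured*1e3:7.3f} mm "
          f"(small-angle formula gives {predicted*1e3:7.3f} mm)")
```

With the centimetre gap the fringes come out only about 0.13 mm apart, which is one reason real demonstrations use sub-millimetre slit separations.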

Going back to the Stevens Institute where John Horgan is, I followed a link from Woit's blog to Horgan's site, containing back-numbers of some of his Scientific American articles and some general posts, before realising he was at Stevens, where Nukemap originates, allegedly (actually it goes back to the terrible Carter-administration politically correct – i.e. trash – 1977 version of Glasstone's book, The Effects of Nuclear Weapons, which deletes all the useful nuclear-test data on protective measures found in previous versions, creating the delusion that a nuclear bomb on an unobstructed desert creates the same effect as in a highly shielded concrete city, where buildings PROVABLY absorb all the effects – radiation and also blast, as proved by Lord Penney to the continuing horror of the Pentagon's nuke disarmament freaks – VERY effectively, reducing casualties by a factor on the order of 100 from what you get with Glasstone's assumption of nukes over nudist beaches). Anyway, Horgan has a blog post claiming we have free-will. I don't think he wants my comment on his site, because like all the Scientific American fascists and the general American fascism in the media over there, he dismisses anyone with proven facts as a quack without bothering to check his facts first; also he's at the same place as the Nukemap charlatan who seemingly wants Putin to go on murdering people without credible nuclear deterrence (correct me if you have proof he's corrected Nukemap now, please), so I'll comment here instead: Jews in Nazi territory had a very limited free-will choice of slavery or death, if lucky (if unlucky, they were first used in science experiments to determine survival time in a vacuum chamber or in ice-cold water, etc.). Most people in the world have constraints on their free-will, hopefully not all as bad as those in the most extreme fascist regimes! In 2021, 1.07 per 10,000 people in the UK committed suicide, the most extreme exercise of free-will. Guess which country tops the charts of suicide rates? Correct! Russia, with a figure just above that of the fascist media dictatorship and pseudo-democracy UK (Russia had 1.16 per 10,000 people in 2019!). So certainly, free-will appears to exist, in some "form", for everyone: you can always go jump off a cliff if reading my stuff depresses you even more than it depresses me to write it. (Don't do that!) What's in question is not the qualitative existence of free-will, but its quantitative distribution as a function of wealth, country, race, education, and so on. Clearly, we don't all have the same amount of free-will, and those with the greatest choices tend to be most loath to make good use of this luxury by going off the beaten track in an honest way, particularly if they have got to where they are by being conformist. Those who got millions easily to fund their adventures, like Trump and Meg/Harry, display the most free-will (alas, usually the easy "controversial" forms of it, rather than 100% originality), and become polarizing media figures (called "marmite" if you are British; i.e. something 50%-loathed, 50%-loved).

UPDATE (31 DECEMBER 2022): if you want a really good WICKED laugh and you are, like me, a practical mathematician and NOT an elitist snob who believes God, the universe, and everything is a "beautiful equation" (Feynman has a pretty good debunking of this, based on the difficulty of finding a non-perturbative – i.e. mathematically exact – solution to anything, such as the infinite series of terms in the perturbative expansion of even the simplest two-particle interaction in the REAL WORLD rather than some BS world certain elitist mathematicians live in), then Dr Peter Woit has a new blog post for you to enjoy: "Earlier this year I bought a copy of the recently published version of Grothendieck's Récoltes et Semailles, and spent quite a lot of time reading it. I wrote a bit about it here, intended to write something much longer when I finished reading, but I've given up on that idea. At some point this past fall I stopped reading, having made it through all but 100 pages or so of the roughly 1900 total. I planned to pick it up again and finish, but haven't managed to bring myself to do that, largely because getting to the end would mean I should write something, and the task of doing justice to this text looks far too difficult." Ha. Ha Ha. Serves you right. Bertrand Russell took over 100 pages to "prove" 1 + 1 = 2 in his acclaimed pure mathematics book (Principia Mathematica, with Whitehead). In the real world 1 + 1 = 2 is always a lie, because no two real-world electrons have precisely the same polarized vacuum state around them (which partly shields their core electric charge and has effects on mass, spin, magnetic moment, etc., etc.), and that vacuum state is inherently non-deterministic and NON-MATHEMATICAL, due to the random nature of pair production in that vacuum shield. Mathematics is a human invention of ego, not a real-world phenomenon. The fact anyone thinks differently, with all the physics evidence against them, tells you to STAY CLEAR OF THEM. They're the nutters.

Above: very brief PDF "flavour" extracts from two long books that contain data vital to today's debate over the deterrent use and effects of nuclear weapons, but which are basically as unavailable to most people as Top Secret classified bomb design documents: John A. Northrop, Handbook of Nuclear Weapon Effects Abstracted from EM-1, and Frank H. Shelton, Reflections of a Nuclear Weaponeer (2nd revised, updated and expanded edition, 1990). In my opinion, for what it is worth, the entire subject is corrupted as presented in the populist media, which has always lied maliciously to encourage wars while pretending to do the opposite.

Update 5 Jan 2023: John Horgan’s blog now contains the nonsense “no-go theorem” style BS quotation:

“It is true that you can convince yourself that you understand quantum mechanics without calculus (or using simpler mathematics than that which professional physicists use) but that’s totally delusional. Good luck calculating the Lamb shift!”

The whole point of Feynman diagrams is that any perturbative correction whatsoever in QM or QFT, including the Lamb shift in the hydrogen spectrum, can be calculated very simply by a two-year-old, drawing two-year-old-style simple interaction diagrams, and then applying Feynman's very simple calculating rules (counting up the number of vertices in the diagrams to determine the total power of the interaction coupling, etc.). Sorry, this guy doesn't understand that each term in the perturbative expansion (equivalent to the "path integral", which CANNOT be directly evaluated using calculus in general) has a simple calculating procedure [using Feynman's rules]. Things get more complicated for the strong nuclear interaction, but even then you can use simple lattice approximations to evaluate it reasonably well on a computer, without getting an [exact] analytical solution using calculus to [the] Lagrangian (impossible to date). Cheers!

(I’ve submitted this comment there but it has as much chance of appearing as a comment on the Nukemap guy’s site. They only allow comments that appease them, to make it appear as if they don’t make mistakes. Typical media charlatan con-trick! Duh!)
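To make the vertex-counting remark above concrete, here is a back-of-envelope sketch (bookkeeping only, not a Lamb shift calculation, and the example process is my own illustrative choice): in QED each vertex carries one factor of the electron charge, so a diagram with n vertices contributes an amplitude of order alpha^(n/2), and each extra loop (two extra vertices) suppresses the contribution by another factor of roughly alpha, about 1/137, which is why a handful of simple diagrams already gets you very close.

```python
ALPHA = 1 / 137.036   # QED coupling (fine-structure constant)

def diagram_order(vertices: int) -> float:
    """Rough size of a QED Feynman diagram's contribution to an amplitude.

    Pure bookkeeping, as described above: each vertex carries one factor of the
    electron charge e, so an n-vertex diagram is of order e**n = alpha**(n/2).
    """
    return ALPHA ** (vertices / 2)

# Example: a process whose lowest-order diagrams have 2 vertices; each additional
# loop adds 2 more vertices, i.e. one more power of alpha relative to the last term.
for vertices in (2, 4, 6, 8):
    relative = diagram_order(vertices) / diagram_order(2)
    print(f"{vertices} vertices: order alpha^{vertices // 2}, "
          f"size relative to the leading diagram ~ {relative:.1e}")
```

The actual numerical coefficients come from evaluating each diagram with Feynman's rules, but successive terms shrink fast enough in practice that the first few simple diagrams dominate.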

Update: 8 January 2023. Professor John Horgan at Stevens has another blog post up, referring to a new 2023 Nature paper: Michael Park, Erin Leahey and Russell J. Funk, "Papers and patents are becoming less disruptive over time", Nature, volume 613, pages 138–144 (2023), which states: "Theories of scientific and technological change view discovery and invention as endogenous processes1,2, wherein previous accumulated knowledge enables future progress by allowing researchers to, in Newton's words, 'stand on the shoulders of giants'3,4,5,6,7. Recent decades have witnessed exponential growth in the volume of new scientific and technological knowledge, thereby creating conditions that should be ripe for major advances8,9. Yet contrary to this view, studies suggest that progress is slowing in several major fields10,11. Here, we analyse these claims at scale across six decades, using data on 45 million papers and 3.9 million patents from six large-scale datasets … We find that papers and patents are increasingly less likely to break with the past in ways that push science and technology in new directions. This pattern holds universally across fields and is robust across multiple different citation- and text-based metrics1,13,14,15,16,17. Subsequently, we link this decline in disruptiveness to a narrowing in the use of previous knowledge, allowing us to reconcile the patterns we observe with the 'shoulders of giants' view. We find that the observed declines are unlikely to be driven by changes in the quality of published science, citation practices or field-specific factors. Overall, our results suggest that slowing rates of disruption may reflect a fundamental shift in the nature of science and technology." Yup. It's called "practicality", and is what happened when Marx and Engels's Communist Manifesto was implemented by Stalin, requiring the massacre of millions of Ukrainians.

My dad was an idealistic Communist, as am I, in principle. However, no humane person who is sane and rational can go along with the "practicalities of Communism", including the genocide and dictatorship involved in what Stalinists like Putin refer to as the "practicality of implementation". The early 1920s USSR had 20,000,000 small farms. Lenin, Stalin and Trotsky wanted more efficient, social, Communist big farms, called collectives, the State-owned "kolkhozy", and tried to use propaganda to make this work from 1921-1928. Problem: the propaganda failed, and by 1927 USSR statistics proved that only 0.8% of the 20,000,000 small farmers had joined the efficient State-owned collectives or kolkhozy. Stalin was thus "forced" to use starvation under Five-Year Plans to encourage obedience. At this point, Communism became a fascist dictatorship of theft and genocide. Millions were starved to death in Ukraine for the crime of not joining the collectives. They were driven off their land, their properties were burned, their livestock was stolen by the state, and they were told that people who refuse to work on the State collectives (not their own land) "shall not eat". So they starved to death under state compulsion. What difference from Anne Frank in Belsen under Hitler? Not a bit. Using Comrade Corbyn's preferred adjectives (but not, of course, his preferred analogy!), you can't put a cigarette paper between Stalin and Hitler. They jointly invaded Poland in September 1939. They both massacred scapegoat "capitalist Jews". They were both evil. The good news? In 1931, USSR statistics showed that 52.7% of farmers had joined the kolkhozy, rather than see their kids starve to death before their eyes. Wonderful! By 1940, it was 97% membership of the kolkhozy, and 99.9% of cultivated USSR land under state control. Communism made practical. This is analogous to what has happened to "big science".

By bringing everything under dictatorial control for "efficiency", it has been turned into a similar conformity of rigor mortis:

Above: graph from the 2023 Nature paper, “A new study by Russell Funk et al shows a sharp decline in “disruptive” science over the past 60 years” – Professor Horgan, from https://www.johnhorgan.org/blog/posts/42122.

THE FOLLOWING IS A BRIEF RELEVANT EXTRACT FROM a 2015 post on our other blog: Lapp’s 1965 book The New Priesthood begins (page 1) with the following quotation from President Woodrow Wilson, on the dangers of [BIG SCIENCE] dictatorship by secretive expert advisers, like a Manhattan project:

“What I fear is a government of experts.  God forbid that in a democratic society we should resign the task and give the government over to experts.  What are we for if we are to be scientifically taken care of by a small number of gentlemen who are the only men who understand the job?  because if we don’t understand the job, then we are not a free people.”

Lapp then points out how he saw science change during WWII from a poorly funded, low-prestige business of struggling individuals pursuing unpopular technical questions to find the truth, into today’s “big science” of groupthink-dominated government (taxpayer)-funded teams of aim-biased technicians, seeking wealth and prestige, paying only lip-service to freedom and objectivity:

“Today … the lone researcher is a rara avis (rare bird); most scientists team up to work together toward agreed upon objectives [not an unbiased agenda]. … A single experiment may involve a hundred scientists … the research is no longer unspecified as to objective … democracy faces its most severe test in preserving its traditions in an age of scientific revolution. … scientists in key advisory positions wield enormous power.  The ordinary checks and balances in a democracy fail when the Congress, for example, is incapable of intelligent discourse on vital issues.  The danger to our democracy is that national policy will be decided by the few acting without even attempting to enter a public discourse … our democracy will become a timocracy. … Even if no formal secrecy is invoked by the government, an issue might as well be classified ‘secret’ if the people in a democracy are incapable of carrying on an intelligent discussion of it. … The danger is that a new priesthood of scientists may usurp the traditional roles of democratic decision-making”
– Dr Ralph E. Lapp, The New Priesthood: The Scientific Elite and the Uses of Power, Harper, New York, 1965, pages 2-3.

Lapp on page 8 quotes President Thomas Jefferson:
“To furnish the citizens with full and correct information is a matter of the highest importance.  If we think them not enlightened enough to exercise their control with a wholesome discretion, the remedy is not to take it from them, but to inform their discretion by education.”

Education in fact, not groupthink indoctrination nor the propaganda substitutes for fact used by dictatorships.

Lapp on page 14 quotes President Dwight Eisenhower’s 17 January 1961 farewell address:
“Today the solitary inventor, tinkering in his shop, has been overshadowed by task forces of scientists … In the same fashion, the free university, historically the fountainhead of free ideas and scientific discovery, has experienced a revolution … Partly because of the huge costs involved, a government contract becomes virtually a substitute for intellectual curiosity. … The prospect of domination of the nation’s scholars by federal employment, project allocations, and the power of money is ever present – and is gravely to be regarded.”

Lapp on page 16 quotes Dr Alvin Weinberg (director of Oak Ridge National Laboratory, 1955-1973):
“I do believe that big science can ruin our universities, by diverting the universities from their primary purpose and by converting our university professors into administrators, housekeepers and publicists.”

Alvin Weinberg expanded on his critique of “big science” in his 1967 book, Reflections on Big Science.

We quoted Alvin Weinberg’s analogy of populist anti-nuclear pseudoscientific rants to witch hunts, in a previous post (linked here). Weinberg wrote Appendix B: Civil Defense and Nuclear Energy, pages 275-7 of The Control of Exposure of the Public to Ionizing Radiation in the Event of Accident or Attack, Proceedings of a Symposium Sponsored by the National Council on Radiation Protection and Measurements (NCRP), April 27-29, 1981, Held at the International Conference Center, Reston, Virginia. (The proceedings were published on May 15, 1982, by the U.S. National Council on Radiation Protection and Measurements, Bethesda, Md.):

“That people will eventually acquire more sensible attitudes towards low level radiation is suggested by an analogy, pointed out by William Clark, between our fear of very low levels of radiation insult and of witches. In the fifteenth and sixteenth centuries, people knew that their children were dying and their cattle were getting sick because witches were casting spells on them. During these centuries no fewer than 500,000 witches were burned at the stake. Since the witches were causing the trouble, if you burn the witches, then the trouble will disappear. Of course, one could never be really sure that the witches were causing the trouble. Indeed, though many witches were killed, the troubles remained. The answer was not to stop killing the witches – the answer was: kill more witches. … I want to end on a happy note. The Inquisitor of the south of Spain, Alonzo Frias, in 1610 decided that he ought to appoint a committee to examine the connection between witches and all these bad things that were happening. The committee could find no real correlation … So the Inquisitor decided to make illegal the use of torture to extract a confession from a witch. … it took 200 years for the Inquisition to run its course on witches.”

Above: Herman Kahn's graph of the massive rise in U.S. government taxpayer-funded research and development from 1940-1960, about 20% of which is military and 80% civilian.  (Lapp states on page 45 of The New Priesthood that in 1939 the entire U.S. Federal research and development budget was just $50 million, mostly for agricultural science, with a small portion for ship studies at the Naval Research Laboratory, and just $2 million for physics research, by the National Bureau of Standards.)  The Manhattan project, which resulted in the first nuclear weapons (used in 1945), was reported to have cost $2 billion from 1942-1945.  Thus began "big science".  By the 1960s, six times as much as that was being spent per year.  Essentially all of this expenditure is decided in advance by timetable- and grant-proposal-dominated groupthink bureaucracy and officialdom, not by a completely unbiased search for the truth by individuals who are free to follow the evidence.  You cannot find the unknown by a search governed by planned timetables.

Lapp quotes an editorial by Science editor Dr Philip Abelson on page 30 of The New Priesthood:

“The witness in questioning the wisdom of the establishment pays a price and incurs hazards.  He is diverted from his professional activities.  He stirs the enmity of powerful foes.  He fears that reprisals may extend beyond him to his institution.  Perhaps he fears shadows, but … prudence seems to dictate silence.”

Critics have their own critics: some rational, others crazy

On 14 December 2022, shortly after dear old CIA Director William Burns poetically announced that there was "no evidence" of Putin being ready for war (maybe a special military operation, you understand, but not a war), Putin ordered the loading of an ICBM into a Russian silo to be filmed and published; just love the fact they tilt the entire lorry with the wheels up into the air, don't you? So practical, not needing a huge crane and a million bucks. This publication is purely a matter of setting the record straight, true and honest. Just to clarify, you understand. To make sure Biden doesn't get fake news. From his CIA Director. Fake news that is classified Top Secret – Restricted Data, Sigma 1, so nobody who actually has the facts can see the fake news to debunk it. So we're all good. (Putin, the kindly old son of a gun, has lovingly also been declassifying and publishing all the other Russian nuclear weapons secrets, too. Just for clarification. Maybe he thinks, unlike America, that for deterrence to work the other side has to believe it? Who knows. Who cares?)

Above: moving on from the deplorable way that Putin has disgracefully debunked the lies of the CIA Director, William Burns, we have another type of "Criticism of Criticism" (c-squared, henceforth). This c-squared example is an "anonymous" hate attack on Woit's Not Even Wrong blog. It mentions Prof. Gil Kalai, who has put the case that quantum computers are fake news, like the fake claim that the USA exploded a "hydrogen bomb" in 1952 (it was over 75% fission, under 25% fusion). The argument here is that, at the quantum level, noise prohibits practical computing. I have no interest in this. Freeman Dyson has a lecture about heresy being rejected by "experts" before being proved right. The point I'd like to make here is that, if we are going to censor freedom of speech, either using "secrecy laws" or attacks on critics, then we might as well surrender to Putin and live in the utopian Russian Federation. (Tip for the kind of people who love mad censorship: I'm being sarcastic, not endorsing Putin.)

Above: beyond c-squared (critics of critics) there is c-cubed (criticism of critics of critics), as evidenced by Ivor Catt (he quotes bits from me and other critics on that page, naturally failing to lower himself to respond to the criticisms), whose successfully peer-reviewed paper "Cross-talk in Digital Systems", I.E.E.E. Trans. Electronic Comput., vol. 16 (1967), went to his head, leading him to claim to "disprove" first displacement current (Maxwell's term to explain the charging and discharging of capacitors, which also gives us a theory for electromagnetic radiation, a very uncontroversial "theory" well substantiated by evidence transmitted by radio waves!), and then electric current itself! (Yeah, he thinks he can use Occam's Razor to replace it with electromagnetic radiation in the form of Heaviside's "slab of energy current"; in his own little world there are no vacuum tubes or old-fashioned TV screens with electron beams, or beta particles from the Sr-90 source I gave him – all these are just little packets of trapped Heaviside "energy current" that acquire rest mass by magic.) I first met him shortly after my first article was published in the November 1994 issue of Electronics World. Like a Sir Jimmy Savile or a superstring theorist, at first he appears humble and genuinely interested in defending freedom of progress, but you eventually discover he's just a conman, like a typical "superstring theorist". Actually, Catt's 1967 paper, though peer-reviewed, is based on what is just an approximation (by Heaviside) to more interesting quantum electrodynamic mechanics, and it is the fake part of the approximation that Catt assumes to be Gospel truth, not the empirical evidence which it fits. In other words, it's the old problem of "factoids" in science; usually the hardest-to-kill factoids are mixtures of measured evidence and approximate theory, which are fashionably presented as if they were 100% empirical evidence, and are then used to falsely denounce alternative analyses of the data that are more accurate. (Classic examples being caloric and phlogiston; in other words, once you have a name for a symbol in an equation, and it has received, say, a trillion-dollar funding grant from Harry and Meghan's charitable foundation for universal love and anti-racism, the factoid becomes as "real" to simpletons as wormholes in computer simulations or, dare I say, the non-water-vapour exponential increase in the amount of Marx-media hot air. You might think that "dark energy" and "dark matter" are similar epicycles, but there is evidence that they are in fact more fundamental than currently believed by populist fashion.)