Comment to blog of Andrew Thomas

I’ve just come across a physics website by Andrew Thomas: http://www.ipod.org.uk/reality/

Here’s a comment about it submitted to the blog http://andrewthomas10.blogspot.com/

If I may go off-topic, your physics website, Quantum Reality and Hidden Variables Theories, is very interesting:

“It is this phase value which introduces the interference effects. The wavefunction appears to have a structure – it’s certainly more than a simple probability. So now we’re considering the wavefunction as describing “reality before observation”. This makes a lot of sense: if it is a mathematical model then it must be a mathematical model of an underlying mechanism. You do not have a functional model without an underlying mechanism. Surely a mathematical model does not exist on its own – it models something real. Even if we do not understand that mechanism, the wavefunction can provide us with a good understanding of its structure.”

Can I draw your attention to something about wavefunction collapse that Dr Thomas Love of the Departments of Mathematics and Physics, California State University, stated in a preprint he emailed me, “Towards an Einsteinian Quantum Theory”:

“The quantum collapse [in the mainstream interpretation of quantum mechanics, which has wavefunction collapse occur when a measurement is made] occurs when we model the wave moving according to Schroedinger (time-dependent) and then, suddenly at the time of interaction we require it to be in an eigenstate and hence to also be a solution of Schroedinger (time-independent). The collapse of the wave function is due to a discontinuity in the equations used to model the physics, it is not inherent in the physics.”

Laws are rarely true mechanistic models. They’re usually just approximations. E.g., when radioactive material decays it does so in discrete units of atoms, not as a continuous variable as in the exponential decay law, which is just a statistical approximation valid for large numbers of atoms.
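To make this concrete, here’s a minimal Python sketch (the decay constant and sample sizes are arbitrary values of my choosing) comparing atom-by-atom decay with the continuous exponential law; the two only agree well when the number of atoms is large:

```python
import math
import random

random.seed(42)

lam = 0.1                          # decay constant per unit time (arbitrary)
dt = 1.0                           # time step
p_decay = 1 - math.exp(-lam * dt)  # chance a single atom decays in one step

for n0 in (10, 100, 100000):       # small vs large initial populations
    atoms, t = n0, 0.0
    while t < 2 * math.log(2) / lam:   # run for about two half-lives
        # each surviving atom decays independently, one atom at a time
        atoms -= sum(1 for _ in range(atoms) if random.random() < p_decay)
        t += dt
    predicted = n0 * math.exp(-lam * t)  # continuous exponential-law value
    print(f"N0 = {n0:>6}: simulated {atoms:>6}, exponential law {predicted:9.1f}")
```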

Similarly, air pressure isn’t constant, so the mathematical models in hydrodynamics are not really mechanistic, and they break down on small scales (e.g., for Brownian motion, where individual air molecule impacts become important).

The key thing is that Coulomb’s law is false on small scales, since electromagnetic fields are composed of light-velocity gauge bosons, field quanta, being exchanged between charges.

On large scales and in large time frames, the rate of the random, chaotic exchange of field quanta is so great that the randomness gets cancelled out, just as the randomness of air molecules striking a ship’s sail gets averaged out and appears to be a continuous pressure.
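The averaging can be made quantitative: the relative fluctuation in the summed effect of N independent random impacts falls off roughly as 1/sqrt(N). A small Monte Carlo sketch (illustrative numbers only):

```python
import random
import statistics

random.seed(1)

def relative_fluctuation(n_impacts, trials=1000):
    """Relative spread of the summed effect of n_impacts random unit impacts."""
    totals = [sum(random.random() for _ in range(n_impacts))
              for _ in range(trials)]
    return statistics.stdev(totals) / statistics.mean(totals)

for n in (10, 100, 10000):
    # the relative fluctuation falls roughly as 1/sqrt(n), so a huge number
    # of chaotic impacts looks like a smooth, continuous pressure
    print(f"N = {n:>5}: relative fluctuation ~ {relative_fluctuation(n):.4f}")
```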

This is modelled in quantum field theory by the path integral of an infinite number of interactions over the spacetime involved.

But on small scales and times, such as for an electron inside an atom, the quantum nature of the electric field binding the electron to the proton is extremely random and chaotic, with impacts from gauge boson field quanta throwing the electron about and not keeping it on the classical, smooth orbit implied by Coulomb’s law.

Also, in an electric field of strength above 1.3*10^18 volts/metre, which occurs out to a range of about 33 fm from the middle of the electron (see equation 359 in http://arxiv.org/abs/quant-ph/0608140 or equation 8.20 in http://arxiv.org/abs/hep-th/0510040 ), pair production occurs spontaneously in the Dirac sea, and the pairs get radially polarized by the electron’s core electric field before annihilating back into field quanta (this radial polarization consists of virtual positrons being on average slightly closer to the real electron core than virtual electrons). This polarization shields part of the core charge of the electron, necessitating the renormalization of charge in calculations of quantities like the magnetic moment of the electron, which is known accurately to many decimal places. The polarization shielding effect was experimentally demonstrated by Koltick and others in 1997, and is published in Physical Review Letters (I. Levine, D. Koltick, et al., Physical Review Letters, v. 78, 1997, no. 3, p. 424). They found that when electrons are scattered at 90 GeV energy, the effective electric charge increases by 7% (the coupling constant representing electric charge increases from alpha = 1/137 to 1/128.5).
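Incidentally, that 1.3*10^18 volts/metre threshold is the Schwinger critical field, E_c = (m_e^2)(c^3)/(e*hbar), which is easy to check numerically (a quick sketch using standard constants, not code from the cited papers):

```python
# Schwinger critical field E_c = m_e^2 c^3 / (e * hbar): the field strength
# above which spontaneous electron-positron pair production sets in.
m_e  = 9.10938e-31   # electron mass, kg
c    = 2.99792e8     # speed of light, m/s
e    = 1.60218e-19   # elementary charge, C
hbar = 1.05457e-34   # reduced Planck constant, J*s

E_c = m_e**2 * c**3 / (e * hbar)
print(f"Schwinger critical field: {E_c:.2e} V/m")  # about 1.32e18 V/m
```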

But what’s more important, the spontaneous production of pairs of virtual (i.e. short-lived) fermions at random around electrons, due to the intense gauge boson (electric field quanta) radiation in strong fields breaking down the vacuum “Dirac sea”, will have chaotic effects on the motion of the electron on small scales. (On large scales the chaos will cancel out, just as a large number of random air molecule impacts averages out to what is well approximated by a constant air pressure, but it doesn’t cancel out on small scales where individual impacts become important, causing Brownian motion of small particles.)

It will randomly cause small-scale deflections, each deflection occurring when pair production produces pairs of fermions at random near an electron.

Feynman states in a footnote printed on pages 55-6 of his book QED (Penguin, 1990):

“… I would like to put the uncertainty principle in its historical place: when the revolutionary ideas of quantum physics were first coming out, people still tried to understand them in terms of old-fashioned ideas … But at a certain point the old-fashioned ideas would begin to fail, so a warning was developed … If you get rid of all the old-fashioned ideas and instead use the [path integral] ideas that I’m explaining in these lectures – adding arrows [each arrow representing the contribution to one kind of reaction, embodied by a single Feynman diagram] for all the ways an event can happen – there is no need for an uncertainty principle!”

Feynman on p85 points out that the effects usually attributed to the ‘uncertainty principle’ are actually due to interferences from virtual particles or field quanta in the vacuum (which don’t exist in classical theories but must exist in an accurate quantum field theory):

“But when the space through which a photon moves becomes too small (such as the tiny holes in the screen), these [classical] rules fail – we discover that light doesn’t have to go in straight lines, there are interferences created by two holes … The same situation exists with electrons: when seen on a large scale, they travel like particles, on definite paths. But on a small scale, such as inside an atom, the space is so small that there is no main path, no ‘orbit’; there are all sorts of ways the electron could go, each with an amplitude. The phenomenon of interference becomes very important, and we have to sum the arrows to predict where an electron is likely to be.”

Hence, in the path integral picture of quantum mechanics – according to Feynman – all the indeterminacy is due to interferences. It’s analogous to the indeterminacy of the motion of a small grain of pollen (less than 5 microns in diameter) due to jostling by individual interactions with air molecules, which represent the field quanta being exchanged with a fundamental particle.
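That analogy can be put in toy-model form as a one-dimensional random walk, where each step stands for one molecular (or field-quantum) impact; the net displacement grows only as the square root of the number of impacts (arbitrary units, purely illustrative):

```python
import random

random.seed(0)

def net_displacement(n_impacts):
    """Net 1-D displacement after n_impacts random +/-1 kicks."""
    return sum(random.choice((-1, 1)) for _ in range(n_impacts))

for n in (100, 10000, 1000000):
    # typical |displacement| grows only like sqrt(n): individual kicks
    # dominate on small scales but average out to near-zero drift on large ones
    print(f"{n:>8} impacts: net displacement = {net_displacement(n):>6}")
```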

The path integral then makes a lot of sense, as it is the statistical resultant of a lot of interactions, just as the path integral was actually used for Brownian motion (diffusion) studies in physics before its role in QFT. The path integral still has the problem that it’s unrealistic in using calculus and averaging an infinite number of possible paths, determined by the continuously variable Lagrangian equation of motion in a field, when in reality there are not going to be an infinite number of interactions taking place. But at least it is possible to see the problems, and entanglement may be a red herring:

“It always bothers me that, according to the laws as we understand them today, it takes a computing machine an infinite number of logical operations to figure out what goes on in no matter how tiny a region of space, and no matter how tiny a region of time. How can all that be going on in that tiny space? Why should it take an infinite amount of logic to figure out what one tiny piece of spacetime is going to do? So I have often made the hypothesis that ultimately physics will not require a mathematical statement, that in the end the machinery will be revealed, and the laws will turn out to be simple, like the chequer board with all its apparent complexities.”

– R. P. Feynman, The Character of Physical Law, BBC Books, 1965, pp. 57-8.

Notice that Feynman had a serious argument with Niels Bohr at the 1948 Pocono conference, where Bohr tried to dismiss the whole path integral approach to quantum field theory using Mach’s principle:

” … Bohr … said: ‘… one could not talk about the trajectory of an electron in the atom, because it was something not observable.’ … Bohr thought that I didn’t know the uncertainty principle …”

– Feynman, quoted in Jagdish Mehra, “The Beat of a Different Drum: The Life and Science of Richard Feynman” (Oxford, 1994, pp. 245-248); see: http://www.tony5m17h.net/goodnewsbadnews.html#badnews

I’m less sympathetic to some ideas you mention on your site, like inflation from virtual particles. It’s ad hoc speculation, and there are better ways to do physics.

My area is cosmology. Hubble expressed his discovery of cosmological recession as

v = HR

velocity increasing linearly with distance. But in spacetime you’re looking back into the past with increasing distance, so he could equally have written the recession in terms of time past, R = ct (where t is time past):

v = HR = Hct

Here the product Hc is the constant, and it has physically interesting units: those of acceleration. It is the cosmic acceleration of the universe.
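Numerically, taking H to be about 70 km/s/Mpc (a round assumed value, just for illustration), Hc comes out close to the acceleration figure quoted further below:

```python
# a = Hc, the acceleration implied by writing Hubble's law as v = Hct
km_per_Mpc = 3.0857e19        # kilometres in one megaparsec
H = 70.0 / km_per_Mpc         # Hubble parameter of ~70 km/s/Mpc, in 1/s
c = 2.99792e8                 # speed of light, m/s

print(f"H  = {H:.2e} /s")
print(f"Hc = {H * c:.1e} m/s^2")  # about 6.8e-10 m/s^2
```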

Another approach is to differentiate v = HR with respect to time,

a = dv/dt
= d(HR)/dt
= H*dR/dt + R*dH/dt
= H*v + R*0 (taking H constant in time, so dH/dt = 0)
= H*(HR)
= RH^2
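The differentiation is easy to verify symbolically, e.g. with Python’s sympy (a sketch; H is treated as constant in time, which is the dH/dt = 0 assumption above):

```python
import sympy as sp

t = sp.symbols('t', positive=True)
H = sp.symbols('H', positive=True)      # treated as constant, so dH/dt = 0
R = sp.Function('R')(t)                 # distance as a function of time

v = H * R                               # Hubble's law: v = HR
a = sp.diff(v, t)                       # a = H * dR/dt
a = a.subs(sp.Derivative(R, t), H * R)  # substitute dR/dt = v = HR
print(sp.simplify(a))                   # prints H**2*R(t), i.e. a = RH^2
```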

This is something I used to predict the cosmological acceleration of the universe, via a paper published through the letters pages of Electronics World in October 1996 after it was rejected by more relevant journals! When the discovery of the acceleration was made a couple of years later and published by Perlmutter et al. in Nature, Nature’s editor Philip Campbell censored out my letter pointing out that the discovery had been predicted. I edited the letter down in size a few times and resubmitted, but they wouldn’t publish it. The mainstream cosmology dictum is that Hubble’s law is analysed with the Friedmann-Robertson-Walker metric of general relativity, end of story.

Anyway, what’s interesting about the acceleration is that it produces predictions from quantum gravity. The receding universe has an outward acceleration, so its mass has an outward force by Newton’s 2nd law. The cosmic acceleration is only small, around 6*10^-10 m/s^2, but it’s what has been measured in the absence of gravitational deceleration of distant supernovae.

Putting that acceleration into F = ma, with m the mass of the accelerating universe, you get an outward force of roughly 7*10^43 N. Newton’s 3rd law of motion then comes into play, and suggests an equal inward-directed reaction force. Looking at the quantum field theories and their vacuum gauge bosons, gravitons would appear to be the only candidate for mediating this inward-directed force.
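As an order-of-magnitude check, taking the mass of the observable universe as roughly 10^53 kg (an assumed round figure):

```python
# Outward force of the receding universe, F = ma (order of magnitude only)
m = 1.0e53    # mass of the observable universe, kg (assumed round figure)
a = 6.8e-10   # cosmic acceleration, m/s^2 (the Hc estimate above)

print(f"F = ma ~ {m * a:.0e} N")  # about 7e+43 N
```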

Using a = RH^2 from the derivative dv/dt = d(HR)/dt above, it’s clear that nearby masses (for which R is approximately 0) will have outward acceleration a = 0, and won’t produce either an outward force or the corresponding inward-directed force mediated by graviton exchange radiation.

So nearby objects which aren’t receding relativistically will not exchange gravitons with you forcefully, unlike the receding matter at great distances. Nearby masses therefore act as a shield, over the cross-section they present for graviton interactions.

Therefore you get pushed towards the Earth by the imbalance. This predicts gravity accurately, see https://nige.wordpress.com/2008/01/30/book/

What’s fascinating about physics is that many of the people in it are so prejudiced they’d be more honest if they preached religion openly, instead of doing so unethically under the cover of claiming to be scientists.

I’m a database designer and network engineer by the way. I’ve still got a lot of questions to resolve in physics, and while I can’t get papers published in the right places, I’m taking the opportunity as time permits to follow up loose ends and tidy up my work as far as I can.

Instead of inflation making the universe’s density extremely smooth before the cosmic background radiation was emitted, my work shows that the gravitational coupling G increases in strength linearly with time after the big bang. This doesn’t make fusion rates vary, because electromagnetic force strengths are also predicted by the mechanism and vary the same way: the increased G suggests increased gravitational compression in a star, but this doesn’t increase fusion rates, because the Coulomb repulsion between protons increases too, reducing the ability of protons to approach closely enough to fuse. The two force variations with time cancel one another for practical purposes. The main observable effect is that at early times the universe was less lumpy than predicted by mainstream calculations, because gravitation was weaker at earlier times and caused less clumping, so the density was more uniform. No inflation is needed.

9 thoughts on “Comment to blog of Andrew Thomas”

  1. Hi Nige, thanks for the long comment on my blog. Very interesting.

    You’re very ambitious, aren’t you?! I think revolutionary ideas in physics turn some people off. I say all ideas are good ideas. Keep up the good work.

  2. Hi Andrew,

    Thanks for responding!

    Maybe it’s revolutionary that nature is not as non-mechanistic as is claimed by the proponents of physical ignorance. Quantum field theory is supported by the successes of the standard model of particle physics, which makes accurate predictions that are confirmed in numerous particle physics experiments.

    So the idea that the quantum (virtual photon) nature of the electric (Coulomb-type) field is responsible for the random motion of the electron on small scales (inside an atom) is pretty well established.

    Force-causing virtual particles (gauge bosons such as gravitons and virtual photons) need to be treated by simple physical interaction models in physics, not just as abstract gauge interactions.

    The Feynman diagrams represent physically real interactions in the vacuum, and a path integral can be established without calculus just by summing all the individual interaction histories (Feynman diagram contributions) which contribute to a force or event.

    This resolves many problems very simply, making checkable predictions.

    Unfortunately, it’s not popular because of the popularity of speculations like 10/11 dimensional string theory, which can’t make any checkable predictions, but is so mainstream that string theorists get to censor out my papers from Classical and Quantum Gravity etc. as if they are my “peer”-reviewers.

    There is nobody really who is interested in this. Those who are indoctrinated in the established ideology aren’t interested because they have a bias due to the things they have been immersed in for years, while students are busy preparing for exams and don’t have time for non-orthodox ideas. Basically, nobody has time for something that evidently faces a lot of hostility from the mainstream. It’s not a bandwagon which looks likely to get anywhere fast, so people stay clear of it.

    Regarding the ambitious-seeming nature of it: what happens is that in debates, people raise questions which need answering, and that causes the theory to be applied to questions and problems it hadn’t been applied to before. So it snowballs into something bigger, just because of the critics.

    Amongst the rejection reasons are being too concise and brief (not providing enough evidence/detail to convince people) and being too lengthy for people to be able to invest the reading time.

    When I have the time maybe I’ll try making a video about it, with hi-tech illustrations to demonstrate all the physics. I think that people may be more willing to passively watch and listen to a well-organised demonstration than to just read a book about it.

  3. Andrew

    I just discovered your reality website and found it extremely interesting and informative. As a layperson with only a BS in chem, I was able to follow it and see where you were going. It also shook me up a little concerning the “matrix” portion of the site. My question concerns another experiment that was performed, firing an electron gun at a sheet of graphite. The resulting pattern was a very ordered dot pattern. The person discussing the experiment stated that “while looking at the screen we see this ordered pattern but while not viewing the screen all we see is a schmeer of light”. I may be quoting incorrectly, but my question is a basic one: how did the experimenter know it was a “schmeer” if not observing it?
    thank you
    marshall rowe

  4. Dr. Thomas… I do not know any other way of getting in touch with you. I’ve read your 3 “Hidden In Plain Sight” books and this comment is about #3. I have always believed that the “block model” of time was merely an effect of the vast distances in the universe coupled with the speed of light. You seem to favor it as “the real story” on time, but your own delightful little book belies this. If, as you so well reveal, we are all traveling together at “the speed of light” in spacetime, then there must be genuine simultaneity in the universe even if we cannot map it physically. To elaborate: every particle in the universe has been traveling together at the same speed in spacetime since the instant of the big bang. That means that there is a constantly widening interval between every point in spacetime and the big bang, and every one of those points is widening at the same rate. To take an example you use in the book…

    You and I walk past one another on the street in opposite directions relative to Andromeda. We pass one another at exactly 3.8 billion years + 12 hours after the instant of the big bang. Now on some planet in Andromeda, a legislative body decides to launch its space fleet toward the Earth, and that decision also occurs exactly 3.8 billion years + 12 hours after the big bang. Although we (that is, the two of us passing on the street) cannot map our time physically to that decision in an identical way, since both we and the Andromedans have been moving at the same speed in spacetime since the big bang, it is nevertheless meaningful for us to say that both events take place simultaneously, that is, 3.8 billion years + 12 hours after the instant of the big bang.

    Thank you for your time.
    matthew rapaport
    quineatal@gmail.com

  5. This is an invitation to a virtual dinner. Up until now I’ve told folks that of all the persons living or dead, it was Richard Feynman with whom I would most like to spend an evening (excluding Lisa gleaned or Susan Sennett in their heydays). After reading all your “Hidden…” books, please join us. Barry Swartz, MD, JD, lay person, ex-physics major from a liberal arts school many eons ago.
