I’m windsurfing at Jandia in Fuerteventura until 19 December, but logged on to see what is happening in high energy physics. Dr Woit has a new post about the LHC beam energy being gradually worked up into new territory.
He was born before either the Standard Model or supersymmetry (the 10/11 dimensional string theory scheme for unifying the fundamental forces, which the Standard Model does not unify properly) was formulated, so presumably he is hoping to live to see some new discovery confirming the mechanism of electroweak symmetry breaking and the origin of mass, if electroweak symmetry really is “broken” at low energy rather than simply being a force which is not unified at high energy (see the end of the previous post for more explanation).
In summary, as Richard P. Feynman explained in the final chapter of his 1985 book which deals with quantum field theories beyond electrodynamics, QED, the “electroweak unification” U(1) x SU(2) is glued together by the ad hoc Weinberg mixing angle (which literally mixes the neutral unobserved hypercharge gauge boson of U(1) with the neutral W of SU(2) to produce two mixed up observed gauge bosons: the observed electromagnetic photon and the observed weak Z boson).
Additionally, the electroweak unification is stuck together with a piece of artificial duct tape called the Higgs mechanism (there are endless varieties of Higgs theories, none of which has been confirmed, and the simplest versions with just one Higgs boson conflict with experiment; proponents believe in some kind of supersymmetric Higgs field theory). This breaks the symmetry at low energy as observed: weak forces are not symmetric with electromagnetism at low energy because they are short ranged, being mediated by massive gauge bosons. Weak forces also apply only to particles with left-handed spin, unlike electromagnetism.
Why believe in electroweak unification of the sort postulated in the Standard Model, when it has not been confirmed? Similarities between many elements of the weak and electromagnetic gauge interactions certainly suggest a connection between the W, Z and photon, and thus some kind of “unification” of electromagnetism and the weak force, but this unification need not be that of the Standard Model (there is a better option).
What always happens in physics is that the first model made up to predict phenomena becomes a Mecca for religious mainstream worship, and is “believed”. Then they go on building on the original assumption without adequate experimental support, adding epicycles or adding in “quantum gravity” using the same approach to “unification” that has not even been confirmed experimentally for electroweak interactions (outside the low energy, broken symmetry regime). George Bernard Shaw cynically explained:
“We cannot help it because we are so constituted that we always believe finally what we wish to believe. The moment we want to believe something, we suddenly see all the arguments for it and become blind to the arguments against it. The moment we want to disbelieve anything we have previously believed, we suddenly discover not only that there is a mass of evidence against, but that this evidence was staring us in the face all the time.”
Update (27 December 2009):
Dr Lubos Motl has a new post about an article in the cringeworthy, sycophantically mainstream magazine New Scientist, which he refers to (appropriately, with his usual right-wing deprecation) as Nude Socialist:
“Nude Socialist printed a short article with a simple purpose: the Higgs particle may be hard to be seen in 2010 but the lightest supersymmetric particle – possibly a neutralino which would be the best dark matter candidate – could be much easier to find.
“For the first time, the readers of this magazine are correctly told that supersymmetry could actually be discovered by the LHC before the Higgs boson – even though supersymmetry is much more revolutionary and much less certain than the Higgs mechanism. That’s how the masses, cross sections, and signals work.
“As Patrick Goss says, supersymmetry is so 2010. The recent mixed non-observation of some possibly dark-matter-induced events by CDMS has somewhat increased chances that the dark matter is right behind the corner although this question remains to be settled. …
“Afshar, Telegraph, and war
“Meanwhile, The Telegraph gives a lot of space to a crazy physicist, Shahriar Afshar (whom I know in person), who has previously made ludicrous statements about the invalidity of the postulates of quantum mechanics.
“The eager Muslim predicts that there will be a civil war in the physics community if the God particle is not found.
“Well, first of all, it is extremely unlikely that the Higgs won’t be found at all. Second of all, if it is not found, everyone will be equally confused. There will be no room for “infighting”. Instead, there will be room for completely new theories and for people’s creativity.”
The Standard Model’s success in the “broken” electroweak symmetry regime (i.e. the non-symmetric regime, where weak vector bosons are massive, making that force weak and short-ranged, while electromagnetic vector bosons are massless, making that force relatively strong and thus not symmetrical with the weak force) suggests that some mechanism is needed by which weak gauge bosons acquire mass, and also the bias to interact only with particles of left-handed spin, not right-handed.
There is no evidence that the Higgs field is the correct way to give mass to weak gauge bosons and all other particles with mass. The Higgs field is designed to give mass at low energy to weak vector bosons, but not to give them mass above a certain energy (the electroweak unification energy), above which both weak and electromagnetic gauge bosons are supposed to be massless, making their couplings similar and thus “unifying” numerically the strengths of the interactions.
This mainstream “unification” idea (equality of coupling parameters for fundamental interactions at high energy, and inequality – explained by broken symmetry due to Higgs type mechanisms – at low energy) contradicts conservation of energy for the fields mediated by vector bosons. At small distances (femtometres) from a charged fundamental particle such as an electron, the electric field strength exceeds Schwinger’s threshold for spontaneous pair-production of virtual fermions in the vacuum. In other words, the field is strong enough that normally invisible fermions suddenly appear for a brief period governed by the uncertainty principle. The strong electric field of the real (long-lived) electron polarizes the virtual fermion pair during this brief time, so that the positively charged virtual fermion is attracted slightly closer to the real electron’s core and the negatively charged virtual fermion is repelled slightly further away. Thus, the “cloud” of virtual fermions acts to shield the electric field from the core, reducing its observable value at greater distances.
This is the physical basis for the renormalization of electric charge in quantum field theory: it is well understood and has plenty of evidence behind it. Now what happens to the energy of the electric field when it is shielded by the cloud of polarized virtual fermions? Any electric field has an energy density (joules per cubic metre) equal to the product of half the permittivity of the dielectric (such as the vacuum) and the square of the electric field strength measured in volts per metre. The electric field strength at any distance from an electron is simply given by Coulomb’s electric force law and F = qE, where F is force, q is charge and E is electric field strength. Hence E = q/(4 × Pi × permittivity × distance²).
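For readers who want to check the numbers, here is a rough Python sketch (my own illustration, not part of the original argument; the constants are standard CODATA values) finding the distance at which an electron’s Coulomb field reaches Schwinger’s pair-production threshold:

```python
# Sketch: distance from an electron at which its Coulomb field reaches
# Schwinger's critical field for spontaneous pair production,
# E_c = m_e^2 c^3 / (e*hbar). All constants are standard CODATA values.
import math

e = 1.602176634e-19        # elementary charge (C)
eps0 = 8.8541878128e-12    # vacuum permittivity (F/m)
m_e = 9.1093837015e-31     # electron mass (kg)
c = 2.99792458e8           # speed of light (m/s)
hbar = 1.054571817e-34     # reduced Planck constant (J s)

# Schwinger critical field strength (V/m)
E_c = m_e**2 * c**3 / (e * hbar)

# Coulomb field of a point electron: E(r) = e / (4*pi*eps0*r^2),
# so E(r) = E_c at r = sqrt(e / (4*pi*eps0*E_c)).
r_c = math.sqrt(e / (4 * math.pi * eps0 * E_c))

# Energy density of the field at that radius: u = (1/2)*eps0*E^2 (J/m^3)
u_c = 0.5 * eps0 * E_c**2

print(f"Schwinger field E_c ≈ {E_c:.3g} V/m")
print(f"Pair-production radius r_c ≈ {r_c:.3g} m")
print(f"Field energy density there ≈ {u_c:.3g} J/m^3")
```

This gives a radius of a few tens of femtometres, consistent with the femtometre-scale distances discussed above.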
The electromagnetic running coupling due to vacuum polarization increases in high energy collisions, i.e. at very small distances (as you penetrate the shielding cloud of polarized virtual fermions and thus see less shielding of the field from the charge). Taken at face value, this violates energy conservation (Coulomb’s inverse square law compensates for the geometric fall in energy density with distance, but does not include the running coupling variation): something must happen to the energy shielded at small distances.
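The running coupling itself can be illustrated numerically. The sketch below uses the standard one-loop QED formula with the electron loop only (heavier charged fermion loops, which the full calculation includes and which bring the value at the Z mass to the measured ~1/128, are deliberately omitted for simplicity):

```python
# Sketch: one-loop QED running of the fine-structure "constant",
# alpha(Q) = alpha0 / (1 - (alpha0/3pi) ln(Q^2/m_e^2)), valid for Q >> m_e.
# Electron loop only; heavier fermion loops are ignored here.
import math

ALPHA_0 = 1 / 137.035999  # low-energy fine-structure constant
M_E_GEV = 0.000511        # electron mass in GeV

def alpha_running(q_gev):
    """Effective coupling alpha(Q): rises as the polarized-vacuum
    shielding is penetrated at higher energy / shorter distance."""
    log_term = (ALPHA_0 / (3 * math.pi)) * math.log(q_gev**2 / M_E_GEV**2)
    return ALPHA_0 / (1 - log_term)

# The electron loop alone gives ~1/134.5 at the Z mass (~91 GeV).
for q in (0.001, 1.0, 91.0):
    print(f"Q = {q:>6} GeV: 1/alpha ≈ {1 / alpha_running(q):.1f}")
```

The shielded fraction of the charge (the gap between the low-energy and high-energy couplings) is the energy bookkeeping at issue in the argument above.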
This “missing” energy produces a whole range of virtual particles so it automatically goes into powering the short-range nuclear interactions, the weak and the strong forces. This simple, experimentally substantiated argument constitutes a fact-based theory that rivals mainstream electroweak symmetry at high energy, Higgs theory, supersymmetry, and thus mainstream string theory claims. Instead of symmetry via couplings all becoming identical at high energy (“symmetry to the eyes of a physically ignorant mathematician”), there is a physical mechanism at work, and the energy lost from the electromagnetic field due to shielding via the polarized cloud of virtual fermions gets converted into the potential energy of the short-ranged strong and weak nuclear force fields. Because the energy density lost from the electromagnetic field by vacuum polarization is easily calculated using the known running coupling parameter (confirmed experimentally by the high-energy collisions of leptons, for instance), this theory is predictive and gives you estimates for the energy densities of the short-ranged nuclear force fields as a function of distance.
For leptons (which don’t have strong force i.e. color charges), 100% of the energy lost due to vacuum polarization shielding of the electric field will be converted into the potential energy of the weak field, mediated by weak gauge bosons. For hadrons (which have color charges), the lost energy will be shared according to the relative weak isospin charge and strong color charge of the particle in question. We have outlined this definite prediction in previous posts, and a detailed scientific paper of predictions is on the way.
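The allocation rule just outlined can be sketched as a toy function. The rule (leptons: 100% to the weak field; hadrons: shared by relative weak isospin and color charges) is from the post; the numerical share used for a hadron below is a hypothetical placeholder, since the detailed weightings are left for the forthcoming paper:

```python
# Toy sketch of the allocation rule outlined above. The rule is the post's;
# the hadron share fraction is a made-up illustrative number, NOT a
# prediction (the paper with actual isospin/color weightings is "on the way").

def allocate_shielded_energy(e_shielded_j, is_lepton, weak_share=0.5):
    """Split electromagnetic field energy lost to vacuum polarization
    between the short-ranged weak and strong force fields."""
    if is_lepton:
        # Leptons carry no color charge: all lost energy goes to the weak field.
        return {"weak": e_shielded_j, "strong": 0.0}
    # Hadrons: shared by relative weak isospin and color charge
    # (weak_share here is purely a placeholder fraction).
    return {"weak": e_shielded_j * weak_share,
            "strong": e_shielded_j * (1.0 - weak_share)}

print(allocate_shielded_energy(1.0, is_lepton=True))
print(allocate_shielded_energy(1.0, is_lepton=False))
```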
Electroweak symmetry, the supposed equality of coupling parameters for electromagnetism and weak interactions, due to massless weak and electromagnetic vector bosons above the postulated electroweak unification energy, is really a complete scam. Electroweak “symmetry” as described conventionally by the equality of couplings (effective charges) is broken at all energies: it simply does not exist in nature. There is no symmetry in terms of electromagnetic and weak interactions having the same effective charges (couplings) at very high energy. There is no “unification energy”. Instead, the way nature works is that energy lost from the electromagnetic field due to vacuum polarization shielding is converted into vector bosons for short-ranged interactions.
This is a complete departure from existing thinking on the subject of symmetry in particle physics. It will be experimentally verified at some point in the future, because it is a fact-based theory, based on energy conservation and other observed facts, unlike the high energy symmetry guesswork in the Standard Model; at low energy the Standard Model fits the facts very well and the equations are right, but it will fail at high energy. There is a field that gives weak gauge bosons and other particles mass, but it isn’t the symmetry breaking Higgs field. Again, this has been outlined in previous posts, and a full scientific paper is on the way.
Update: Woit has a new post about a couple of books, including one by Professor Sean Carroll claiming that the low-entropy beginning of the universe (entropy is wasted thermal energy – the energy in a closed system which is unable to do useful work – divided by the temperature of the coldest heat sink available to the closed system; e.g. you cannot power a steam engine to do any work if you have no lower-temperature heat sink with which to condense the steam back into water, so all the energy will be wasted in a closed system at a uniform temperature, giving “high entropy”; similarly, it’s easier for an egg to become an omelette than for an omelette to convert itself back into an egg, because an omelette is a more uniform or “high entropy” state for the molecules than the egg with yolk and white separated, and it would require a lot of energy to separate all the molecules again) appears to be a random freak of nature that can only be “explained” scientifically by the anthropic selection of an initial low-entropy state from the string theory multiverse landscape of 10^500 vacuum states. Carroll summarizes this false argument as follows (the American spelling for omelette is omelet):
“You can turn an egg into an omelet, but not an omelet into an egg. This is good evidence that we live in a multiverse. Any questions?”
Actually, as Woit states:
“I don’t disagree that to understand cosmology you want to explain the low entropy at the Big Bang. I just continue to not see why this explanation, whatever it is, is necessary … the problem seems to me to be that questions like this inherently can’t be adjudicated by any conceivable experiment. Arguing about them thus tends to be a rather pointless activity. We’re really in the realm of philosophy of science here, not science.”
However, Woit’s comment applies to Professor Carroll’s uncheckable pseudoscience, not to checkable physics. The law of ever-increasing entropy applies to a closed system, not to the universe, which is open, expanding, and therefore subject to the exchange of radiation which is always being received in a low-energy, red-shifted state. Entropy measures disordered, useless energy. The second law of thermodynamics states that entropy (disorder) always increases with time in an isolated non-equilibrium system. In the universe, the opposite occurs: the cosmic background radiation, originating from around 400,000 years after the big bang, is extremely uniform, indicating that the universe had very high entropy (disorder) at that time, not the low entropy Carroll claims. You couldn’t extract any useful energy at 400,000 years after the big bang, because the exceedingly good uniformity of temperature throughout the universe precluded any possibility of a heat sink! Hence, the energy was useless for doing work, even in principle, so the entropy was exceedingly high, equal to approximately the total heat energy divided by the uniform ~3,000 K temperature at that time. Since then, the condensation of hot radiation into colder atoms and also the effect of gravity have clumped matter together, ending the uniformity of temperature by creating an irregular distribution of hot stars in cold space, which acts as a heat sink and allows work to be done. This effect of gravity has allowed the entropy of the universe to fall from its initial high value, which was due to the initial temperature uniformity.
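The heat-sink point can be made concrete with the textbook Carnot limit (a standard formula; the specific temperatures below are illustrative round numbers):

```python
# Sketch: the Carnot limit, 1 - T_cold/T_hot, gives the maximum fraction
# of heat that can in principle be converted into work between two
# reservoirs. Temperatures below are illustrative round numbers.

def carnot_efficiency(t_hot_k, t_cold_k):
    """Maximum fraction of heat convertible to work between two reservoirs."""
    return 1.0 - t_cold_k / t_hot_k

# A uniform ~3,000 K universe at 400,000 years: no colder sink exists,
# so zero work can be extracted, even in principle.
print(carnot_efficiency(3000.0, 3000.0))

# Today: a star's surface (~5,800 K) against cold space (~2.7 K)
# allows nearly all the heat flow to do work in principle.
print(f"{carnot_efficiency(5800.0, 2.7):.4f}")
```

At uniform temperature the extractable fraction is exactly zero, which is the sense in which the energy of the early universe was useless for work.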
The universe started with very high entropy (uniform temperature) and has evolved, due to the temperature decline with expansion and due to the clumping of matter by gravity, into a low entropy state where the centres of stars are at millions of times the temperature of the vacuum of space.
Carroll believes that the universe started with low entropy because he doesn’t calculate the entropy of the whole universe as its heat energy divided by its temperature. He believes that entropy was low initially because he disagrees about the amount of disordered energy initially, which he believes was smaller. He would argue that entropy should not be calculated for the whole universe using the heat energy and temperature at early times, because the second law of thermodynamics (rising entropy) applies to closed (isolated) systems not in equilibrium.
Carroll simply takes the 2nd law of thermodynamics and extrapolates into the past. Because the 2nd law of thermodynamics states that entropy rises in isolated, non-equilibrium systems, Carroll finds that the entropy of the early universe must have been very small, and then he tries to explain this “problem” by invoking an uncheckable “explanation”: the anthropic selection of a small entropy value from a string theory multiverse landscape with 10^500 differing universes to pick from.
This isn’t experimentally validated physics; it is politically correct (groupthink stringy thought) philosophy. A physicist would approach the problem by questioning whether the 2nd law of thermodynamics really applies to the universe. The 2nd law of thermodynamics applies to isolated or closed systems which are not in equilibrium. Even if the universe should be subject to the 2nd law of thermodynamics, the law may nevertheless be wrong for very large systems. It is a “law” not because it was found on a tablet of stone, but because it was found that steam engines wouldn’t work without a heat sink at a lower temperature than the boiler! That’s the basis for the law. It is well-established experimentally for steam engines, but that doesn’t mean you can assume it applies to the big bang. It is also a pretty abstract human-invented concept, a bit like the concept of “probability”. In the real world, things happen or don’t happen: the whole concept of probability is due to human ignorance, such as the human failure to produce a mathematical model and measure input parameters to allow precise (non-probabilistic) predictions. E.g., radioactive decays are triggered by the random exchange of particles (field quanta) between quarks: this randomness is just a physical phenomenon like Brownian motion; it is not governed by mathematics but merely conveniently averaged by mathematics. The fact that you can’t develop a practical deterministic mathematical model, due to the vast complexity of the system, is not proof that nature is inherently beyond mechanistic physical understanding: on the contrary, it’s very easy to see how random exchanges of field quanta cause chaotic effects on orbital electrons and on nuclear particles.
A commenter on Woit’s post adds:

“I look forward to reading Sean’s book, but I am confused by some of these accounts of the issue. Even if the entropy of the universe as a whole can be defined (which I am confused about) and increases over all, the important thing for eggs, chickens and cooks is that nevertheless there are large regions far from equilibrium today.”
Instead of philosophical speculations that can’t be checked (at best) and which moreover look like a lot of rubbish, Carroll would be doing more for science if he objectively reviewed quantum gravity research which did make checkable predictions. However, it seems that isn’t his cup of tea.
Update (12 January 2010):
An anonymous comment puncturing Sean’s omelette entropy propaganda states: “As someone else once pointed out in this blog, turning omelettes into eggs is almost as trivial as the inverse process. Just finely chop the omelettes and feed a hen with them, she will turn them back into eggs, increasing the overall entropy of the barn with heat and waste in the process…”
In an earlier post on Woit’s blog, John Rennie (a former editor of Scientific American?) argues:
“Re the low entropy at the Big Bang: if gravity were repulsive then the state of maximum entropy would be an (approximately) uniformly distributed gas. Doesn’t LQG predict that gravity becomes repulsive as you wind back time towards the Big Bang? In that case isn’t it possible that the low initial entropy is due to gravity flipping from a repulsive to an attractive force?”
Rennie is making two assumptions based on groupthink, not observational fact: first, that LQG (loop quantum gravity) correctly predicts that gravity was repulsive at early times (forcing particles away from one another and preventing them from clumping together gravitationally); and second, that uniformity at early times implies low entropy (little disorder), rather than high entropy (much disorder). Is a uniform gas ordered or disordered? It depends on what you mean by order! The whole concept of entropy loses more and more practical meaning when you take it away from the practicalities of steam engine efficiency…
Chris then pointed out that the cosmological acceleration implies that gravitational attraction only holds over a limited range of small distances, while over larger distances masses repel one another (the cosmological acceleration), responding to Rennie simply as follows:
“on a cosmic scale gravity IS repulsive.”
Rennie then responded:
“@chris: I assume you’re referring to dark energy, but this has only recently started dominating and certainly wasn’t a factor near the Big Bang, or indeed during the formation of galaxies.
“Of course there may eventually be a big rip. If so presumably the state of maximum entropy would again be a roughly uniform distribution. That’s assuming the concept of entropy has any meaning in such a radically non-equilibrium system!”
Rennie is not correct in assuming that “… dark energy … has only recently started dominating …”. This is what mainstream cosmologists say when they are talking out of their as*ses: cosmological acceleration is only seen in data from the earliest times ever observed, not from recent times, so the situation is the opposite of that stated by Rennie. We see cosmological acceleration dominating when we look back across immense distances to very early times after the big bang. If cosmological acceleration were observed to be a recent event that has only just started dominating, we would need local evidence to justify that, because the local time is 13,700 million years after the big bang. When you look to immense distances, you’re looking back to very early times, not at “recent” events! Cosmological acceleration due to dark energy (spin-1 graviton exchange) is only observable over immense distances, corresponding to looking back to very early times after the big bang.
Rennie could, and probably would, argue that the lambda-CDM model suggests that dark energy has a bigger effect on the overall expansion of the universe at later rather than earlier times after the big bang. But we can’t see large distances at late times. If we look over large distances, we’re looking back in time (because of the travel time of light), so we can only actually observe very large distances at times tending back towards zero (early times).
We can observe early time phenomena over vast distances, and we can observe late time phenomena in the big bang over small distances. If we look to distance S, we automatically see things at that distance when they were at time S/c into the past. There is no way even in principle to escape this linkage of space and time when making observations, because light doesn’t travel instantly. Hence, all claims that are outside of this rigid observational reference frame are “not even wrong” speculations that cannot be experimentally validated, even in principle, without waiting around for billions of years! So the actual observations of dark energy which we have only apply to both large distances and early times after the big bang.
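The distance-time linkage above amounts to a one-line formula, sketched here (my own numerical illustration; the 13.7 billion year age is the figure used in this post):

```python
# Sketch: observing at distance S means observing time S/c into the past,
# so "large distance" observations are automatically "early time" ones.
C = 2.99792458e8          # speed of light (m/s)
LY_M = 9.4607e15          # one light year in metres
AGE_YR = 13.7e9           # age of the universe used in this post (years)

def lookback_years(distance_ly):
    """Light-travel lookback time for a given distance, in years."""
    return distance_ly * LY_M / C / (365.25 * 24 * 3600)

d = 12.0e9  # looking 12 billion light years away...
print(f"...means seeing the universe at ~{AGE_YR - lookback_years(d):.2g} "
      f"years after the big bang")
```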
Rennie’s implication that at large distances and late or “recent” times the dark energy predominates is not a scientific observation: it is merely a lambda-CDM model prediction. It’s nevertheless wrong to claim it as scientific fact, when it is not an observation and cannot be checked even in principle. Like the 10^500 vacuum states in the “multiverse” landscape of string theory, it is an inherently fruitless speculation that Pauli would have had to dismiss as being “not even wrong”.
When you instead look at the evidence for dark energy correctly, you get a simple scientific model which does make checkable predictions. Inevitably, it isn’t as glamorous as the sci-fi nonsense. General relativity assumes without evidence that the cosmological acceleration is independent of gravity: in fact the correct quantum gravity model proves that the two are interdependent (linked), because gravity is a result of cosmological acceleration.