This morning ATLAS and CMS reported new Higgs results at the Hadron Collider Physics Conference in Kyoto. Only a subset of the available decay channels have been updated. In particular, the crucial diphoton channels have not been updated by either experiment. This may be due to increased difficulty in the analysis, with possible issues over systematic errors and mass/energy calibration; obviously the systematics become more significant as the statistical errors diminish. The earlier diphoton update at 8 TeV from ATLAS already showed some signs of inconsistency, with the excess peaking at around 127.5 GeV compared to lower estimates of around 125.5 GeV from CMS. We will have to be more patient while they sort it out.

However, there have also been some sensational new updates. Both CMS and ATLAS have provided new results in the ditau decay mode, showing a signal of 0.72 ± 0.52 times the standard model at 125 GeV in CMS and 0.7 ± 0.7 in ATLAS. A crude combination gives 0.71 ± 0.42 times the standard model. This agreement with the standard model, using 17/fb in each detector, supersedes earlier results from CMS in July where the signal seemed a little on the low side.
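The "crude combination" quoted above is just an inverse-variance weighted average of the two signal strengths. A minimal sketch, assuming the two measurements are independent Gaussians (the numbers come from the post; the code itself is illustrative):

```python
import math

def combine(measurements):
    """Inverse-variance weighted average of (value, sigma) pairs,
    treating the experiments as independent Gaussian measurements."""
    weights = [1.0 / sigma**2 for _, sigma in measurements]
    mean = sum(v * w for (v, _), w in zip(measurements, weights)) / sum(weights)
    sigma = 1.0 / math.sqrt(sum(weights))
    return mean, sigma

# Ditau signal strengths in units of the SM expectation (from the post)
cms = (0.72, 0.52)
atlas = (0.7, 0.7)
mu, err = combine([cms, atlas])
print(f"{mu:.2f} +- {err:.2f}")  # 0.71 +- 0.42
```

A proper combination would account for correlated systematics between the experiments, which is why the experiments' own combinations are preferred when available.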

In the ZZ channels CMS have shown a useful update that extends the mass range up to 1 TeV, with no sign of any excess anywhere other than the known one at 125 GeV. Although this search only applies directly to a standard-model-like Higgs, it is also a bit of a blow for models with a second ordinary Higgs in this mass range.


This entry was posted on Wednesday, November 14th, 2012 at 8:38 am and is filed under Higgs Hunting, Large Hadron Collider. You can follow any responses to this entry through the RSS 2.0 feed.

[…] More information on other blogs. Philip Gibbs, “Higgs at HCP2011,” viXra log, Nov. 14, 2012; Tommaso Dorigo, “Higgs: New ATLAS And CMS Results,” […]

[…] about all this, see Tommaso Dorigo here and here, Matt Strassler here and here, Philip Gibbs here. For primary sources, CMS here, ATLAS here, presentations […]

With mass/energy calibration an acknowledged issue we have an operational definition of mass uncertainty! It follows logically that the Higgs mechanism fails in favor of something like a bulk modulus. It remains my comment that the mass/kinematic viscosity split of the action pertains, for a new chapter in the Copenhagen saga.

Fascinating post! As a fellow science-lover, I thought you might be interested in theoretical physicist Sean Carroll’s new book about the Higgs boson, The Particle at the End of the Universe. I’m happy to share more info with you and send a review copy if you’re interested.

As to the ZZ4l CMS data being
“… a bit of a blow for models with a second ordinary Higgs …”
I do not agree with respect to my model
which has two ordinary Higgs mass states beyond the 125 GeV state
with masses around 200 GeV and 270 GeV
and cross sections around 25% or so of the SM cross section.

As you can see from your CMS ZZ4l Brazil band plot above there are:

200 GeV peak in 2 sigma yellow zone at cross section between 25% and 30%

270 GeV peak in 2 sigma yellow zone at cross section around 30%

while in that mass region the expected cross section is now well below 20%.

As more data is accumulated, those two peaks will either
sink below the 20% line and go away
or
stay above the 20% line and increase in significance to 3 sigma and more.

Unfortunately, it is likely that it will take more than 30/fb of data to determine whether my peaks live or die,
so we probably will not know before 2015 at the earliest.

However, as of now I think that my model is still alive
and will likely remain so for the next few years.

Phil, what do the new measurements tell us about the validity of the Higgs boson? For instance, what of the article http://arxiv.org/pdf/1207.1093v2.pdf?
Do these new measurements confirm or reject anything in it?

The new measurements increase precision a little and make it look more like the standard Higgs boson, but other “imposter” scenarios such as doublets and triplets are not ruled out since they have a parameter space that includes points that look very much like the standard model one.

There is a lot of very naive ontological activity involved with the Higgs. This is especially so in the blog literature.

Already in string models the Higgs and gauge fields emerge at the QFT limit as effective notions. In the TGD framework this is even more the case, since at the microscopic level bosons consist of fermion-antifermion pairs, meaning that all bosons are emergent.

Higgs fields and gauge boson fields appear only in the effective action defining the QFT limit of the microscopic dynamics. The Higgs-like particle is real, but the Higgs mechanism only reproduces the masses predicted by p-adic thermodynamics. The Higgs vacuum expectation has no counterpart at the microscopic level; a coherent state of the Higgs-like particle would have been a candidate for this counterpart. The Higgs mechanism predicts nothing, as we should already know from four decades of failed attempts to predict the mass spectrum.

Piron’s Theorem in quantum logic requires four (4) particles to constitute an algebra of probabilities, but the Higgs decay channels do not feature that many particles at any one time! So this data appears strictly beyond the QFT limit, but then the Higgs can be quantizing anything to constitute the limit, and you can as well think of proton spin dissipating for lack of regular QFT structure!

All the data gives is an interaction pattern, with mass varying between channels, which suggests that within channels there is a complementary variation in viscosity, the well-known quark-gluon viscosity, which pertains to proton spin. Unfortunately the interaction Hamiltonian is not renormalizable, and Witten’s crew are driven to N=4 SYM to integrate the model. But the same query applies: can you strictly assume N=4? Yes, IF you include the gluons, which means in a parton model, the way Feynman did it.

It’s impressive to me that the N=4 SYM residue of string theories echoes the logic of Piron’s Theorem. But it’s looking more plausible now to think of intrinsic uncertainty in proton spin. This will surely be a long saga, through 2016 at least.

[…] The researchers’ announcements at a conference held a few days ago in Kyoto have somewhat rekindled interest in the Higgs particle. For what is new regarding the Higgs, one can read HERE, HERE, HERE and HERE. […]

According to my finding (http://vixra.org/abs/0907.0012), the 18-degree phenomenon for pseudoscalar mesons can give
18 × 15 = 270 GeV
18 × 7 = 126 GeV (the latest value)
My guess: 18 degrees is connected with the golden spiral:
phi = 1 + 2 sin(pi/10) = 1 + 2 sin 18°
phi = (1/2) csc(pi/10) = (1/2) csc 18° (http://en.wikipedia.org/wiki/Golden_ratio)
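The two golden-ratio identities quoted are standard trigonometric facts and easy to check numerically. A quick sanity check of the identities only, not of the GeV numerology:

```python
import math

phi = (1 + math.sqrt(5)) / 2   # golden ratio, ~1.618
deg18 = math.pi / 10           # 18 degrees in radians

# phi = 1 + 2 sin(18 deg)
assert abs(phi - (1 + 2 * math.sin(deg18))) < 1e-12
# phi = (1/2) csc(18 deg) = 1 / (2 sin(18 deg))
assert abs(phi - 1 / (2 * math.sin(deg18))) < 1e-12
print("both identities hold")
```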


Sir R.A. Fisher found a group structure in factorial interactions, and then a prime factor structure in the group. So it looks like this interaction is generated by the group of quark symmetries.

I wonder if the CMS vs. ATLAS spin-0 masses come from a statistical effect of relativistic mass increase. CMS measures the production of two gamma rays from the decay of massive spin-0 bosons; ATLAS measures the production of four leptons produced by the decay of massive spin-0 bosons into two Z bosons (which each decay into two leptons). The larger mass found by CMS may be the first evidence that a statistical effect is at work: faster spin-0 bosons (with relativistic mass increase over the rest-mass) may be statistically more prone to undergo electromagnetic (double gamma) decay emission, rather than weak (double Z) emission.

Although it is traditional to assume that decay rates are intrinsic and uniformly affected by relativity’s time-dilation formula (which predicts that both electromagnetic and weak decays are affected in precisely the same manner by relativity), this is only a first-order approximation and could neglect subtle vacuum corrections, corresponding to the effects of the motion of a particle on its interaction with its own quantum field. The Lorentz contraction may have some effect on the UV cutoff energy (and radius in the direction of motion) for high-order quantum field perturbative corrections (loops carrying large momenta), and since weak decay rates depend on quanta in the field, it could be that the CMS vs. ATLAS spin-0 massive boson decays from electroweak symmetry breaking are the first clean evidence for a second-order relativistic correction. (Other VERY high-energy experiments always create jets with too much noise to see this effect.)

I hasten to add that I am not referring to the SM “electroweak symmetry breaking” or non-mass-predicting “Higgs boson” in my comment, but to the actual experiments at CERN, which should not be dogmatically tied down to mainstream dogma any more than Kepler should have interpreted Brahe’s data on Mars using Ptolemy’s epicycles:

“Higgs did not resolve the dilemma between the Goldstone theorem and the Higgs mechanism. … I emphasize that the Nambu-Goldstone boson does exist in the electroweak theory. It is merely unobservable by the subsidary condition (Gupta condition). Indeed, without Nambu-Goldstone boson, the charged pion could not decay into muon and antineutrino (or antimuon and neutrino) because the decay through W-boson violates angular-momentum conservation. … I know that it is a common belief that pion is regarded as an “approximate” NG boson. But it is quite strange to regard pion as an almost massless particle. It is equivalent to regard nuclear force as an almost long-range force! The chiral invariance is broken in the electroweak theory. And as I stated above, the massless NG boson does exist … Pion’s spin is zero, while W-boson’s spin is one. People usually understand that the pion decays into a muon and a neutrino through an intermediate state consisting of one W-boson. But this is forbidden by the angular-momentum conservation law in the rest frame of the pion.”

– Professor N. Nakanishi (at N.E.W.) Nakanishi states that despite the Higgs mechanism which produces massive weak bosons (Z and W massive particles), a massless Nambu-Goldstone boson is also required in electroweak theory, in order to permit the charged pion with spin-0 to decay without having to decay into a spin-1 massive weak boson. In other words, there must be a “hidden” massless alternative to weak bosons as intermediaries. (I’m finishing off a paper about this ASAP.)

ATLAS gave 123.5 GeV for the weak decay chain h→ZZ→4l, while CMS gave 126.5 GeV for h→γγ. We argue that if this mass difference is real (rather than a systematic detector miscalibration of some kind), it indicates a statistical relativistic effect: the Lorentz contraction in the direction of motion affects self-interactions of a moving massive spin-0 boson with its own field quanta, affecting weak and electromagnetic decays to a differing extent. So in a spectrum of massive spin-0 boson velocities produced by an LHC collision, the fastest moving massive spin-0 bosons could be more likely than expected to decay by double gamma emission; the slower ones might be more likely than expected to undergo weak decays and four lepton emissions. The higher the speed, the greater the slowing due to time-dilation on massive Z boson decay processes, whereas there is no time-dilation velocity effect for massless gammas (which travel at light velocity regardless).

To consider relativistic effects is much like calling for a parton model, which shows the vacuum reaction in a relativistic frame. And it opens new thinking: is this “massivization” an effect in space-time, not a string but a tension in space-time?

One would want to see the traces in the parameter-space as in Jester’s latest post on Resonaances. The traces are very distinct, and they show systematic variation, not in the parameter-space, but in the parameters of the space. It’s “the parameters are the model”!

Also we have cf = ±1 at 99% confidence: about 3 sigma, the same as the SM as a whole. If it’s the Higgs, we’re seeing double!! But that’s not an SM prediction…

“If it’s the Higgs, we’re seeing double!! But that’s not an SM prediction…”

Precisely (assuming the data is accurate).

(I remember well the “two yardsticks” Hubble parameter conflict in the early 90s: one set of observations consistently gave roughly 50 km/s/megaparsec ± 10%, while other researchers consistently got nearer to 100, again with a standard deviation of roughly ± 10%. So the universe was either about 10 gigayears old ± 10% or else about 20 gigayears ± 10%. Quite a conflict. It turned out there was a miscalibration, due to a conceptual error concerning some cepheid variables. Ironically, Hubble had originally overestimated the Hubble parameter by a large factor, estimating roughly 500 km/s/Mpc, due to a cruder cepheid misidentification. That bigger error was corrected back in the 1950s. Majority opinion held that the 50s correction of Hubble’s original cepheid yardstick error had settled the science forever on this topic. Then they eventually discovered that there were still subtler distinctions among the populations of cepheids! Classic groupthink.)
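The arithmetic behind the two competing ages is just the naive Hubble time 1/H0. A quick sketch of the unit conversion (it ignores deceleration/acceleration corrections, so it only reproduces the rough 10 vs. 20 gigayear figures):

```python
KM_PER_MPC = 3.0857e19      # kilometres in a megaparsec
SECONDS_PER_GYR = 3.156e16  # seconds in a gigayear

def hubble_time_gyr(h0_km_s_mpc):
    """Naive age of the universe, 1/H0, in gigayears."""
    h0_per_second = h0_km_s_mpc / KM_PER_MPC
    return 1.0 / h0_per_second / SECONDS_PER_GYR

print(hubble_time_gyr(50))   # ~19.6 Gyr
print(hubble_time_gyr(100))  # ~9.8 Gyr
print(hubble_time_gyr(500))  # Hubble's original estimate: ~2 Gyr
```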

Suppose that the data are correct, but that (unlike my idea above) it’s not a relativistic effect that’s causing the two masses for spin-0 massive bosons. Then, as you say, there must be two bosons. “We’re seeing double.”

The spin-0 boson decaying by gamma emission is more massive than that undergoing weak interactions for a simple reason: the mass is linked to the interactions each boson undergoes. ATLAS gave 123.5 GeV for weak decay chain h→ZZ→4l, while CMS gave 126.5 GeV for electromagnetic decay route h→γγ.

So somehow the electroweak symmetry breaking is producing two massive spin-0 bosons, not one. One is 2% more massive than the other and is prone to decay by emitting two gamma rays. The lighter spin-0 boson is more prone to decaying by emitting two massive neutral weak bosons.

The key difference between the two spin-0 bosons is rest mass (the charge for quantum gravity). The SM is too simplistic. Glashow and his adviser Schwinger in 1956 prematurely attempted an SU(2) purely Yang-Mills electroweak theory (unlike the SM’s U(1) X SU(2) electroweak mixing scheme) but fouled it up by naively trying to make the neutral boson the electromagnetic photon!

This meant giving mass to the two charged SU(2) bosons for weak interactions (neutral currents of Z bosons were totally unknown in 1956), but not to the uncharged SU(2) boson. This was experimentally wrong, but instead of continuing the effort for an SU(2) electroweak theory, it led to the complete abandonment of the scheme in favor of U(1) X SU(2) mixing, where U(1) is a hidden hypercharge symmetry that is only manifested as electromagnetism after “mixing” with SU(2). What should have been done is the development of an SU(2) electroweak theory where left-handed spinor field quanta acquire mass from the mixing with a quantum gravity gauge group (thus weak gauge bosons); the other handedness fails to acquire mass this way, and consequently it remains massless and acts as electromagnetism (the electromagnetic handedness shows up in Lenz’s law, the handedness of the curl of the magnetic field carrying a current, as Maxwell himself argued using a mechanical model of spinning field quanta back in 1861; see his “On Physical Lines of Force” series of papers). The Yang-Mills SU(2) equation for electromagnetism reduces to Maxwell’s equations due to a physical mechanism which constrains the term for net transfer of charge to be zero for massless charged bosons (infinite magnetic self-inductance of massless field quanta prevents one-way propagation; they can only be exchanged under a perfect two-way propagation or exchange equilibrium).

To be fair, I should add that Sheldon Glashow does want the SM to be disproved and replaced:

‘Yet there are new things to discover, if we have the courage and dedication (and money!) to press onwards. Our dream is nothing else than the disproof of the standard model and its replacement by a new and better theory. We continue, as we have always done, to search for a deeper understanding of nature’s mystery: to learn what matter is, how it behaves at the most fundamental level, and how the laws we discover can explain the birth of the universe in the primordial big bang.’

– Sheldon L. Glashow, The Charm of Physics (American Institute of Physics, New York, 1991).

nige, I find your perspective really interesting: my old conundrum is just the displacement current, from Maxwell’s correction of Ampere’s Law, with physical basis still unknown. We have here a Bohr correspondence horizon for a revision of the SM.

Attending for once to Einstein’s old complaint: not just Maxwell, but also the classical symmetry analysis, is incomplete; one must turn from Spin groups to the Pin group to capture neutrino handedness. The Pin group adds an extra dimension of polarity (±1) which must still be interpreted across the Spin formalism, opening onto twistor representations.

A Pin view of the Dirac and Pauli matrices: arXiv:hep-th/9810018v1

If the Higgs parameter space has an unexpected symmetry in the “fermiphobic” factor, the symmetry also breaks in one channel at least.

I see there’s some confusion among physicists about Pin groups: they are well known in algebraic geometry, but that again opens on twistor representations, where work is rather slow. Such things keep us “strung out” for now.

112th LHCC Meeting: OPEN Session agenda, Dec 5, and Status of the LHC and Experiments, Dec 13 at 9 am. Will we see an update of those gamma-gamma and ZZ channels?



For the interpretation of the results in the TGD framework see

http://matpitka.blogspot.fi/2012/11/higgs-like-state-according-to-tgd-after.html

and also the preceding postings

http://matpitka.blogspot.fi/2012/11/higgs-without-higgs.html

and

http://matpitka.blogspot.fi/2012/11/to-deeper-waters.html


Phil,

Thanks for the update. There are a few typos that need correction (“parrticlular”, “diphton” and “signifcant”).

Cheers,

Ervin


Should the title be changed to “Higgs at HCP2012”?



[…] A Rare Sight, then Jester on Resonaances, Higgs: what’s new, and Philip Gibbs on viXra log, Higgs at HCP2012. The summary? Nothing, nothing, […]



For details: http://vixra.org/abs/0907.0012

Mpi = Mpr × tan(45 - 2×18)

Mk = Mpr × tan(45 - 18)

Md = Mpr × tan(45 + 18)

Mb = Mpr × tan(45 + 2×18)

Here Mpi is the mass of the pi meson (not to be confused with pi = 3.14) and Mpr is the mass of the proton.

Clarification, if somebody asks what the tangent has in common with these numbers:

tan 18° = 0.29, approximately 1/3

pi/10 = 3.14/10 = 0.31, approximately 1/3
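For concreteness, the commenter's formulas can be evaluated numerically. A sketch only, assuming a proton mass of 938.27 MeV and angles in degrees; no claim of agreement with measured meson masses is made here:

```python
import math

M_PROTON = 938.27  # MeV, proton mass (assumed input)

def meson_mass(angle_deg):
    """Commenter's ansatz: meson mass = Mpr * tan(angle in degrees)."""
    return M_PROTON * math.tan(math.radians(angle_deg))

# Angles from the formulas above: 45 - 2*18, 45 - 18, 45 + 18, 45 + 2*18
for name, angle in [("Mpi", 9), ("Mk", 27), ("Md", 63), ("Mb", 81)]:
    print(f"{name}: tan({angle} deg) -> {meson_mass(angle):.0f} MeV")
```

Readers can compare the printed values against the measured pi, K, D and B meson masses themselves.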


I’ve put a brief summary paper on this problem here for the time being: http://quantumfieldtheory.org/Spin%200%20boson%20mass%20variations%20and%20relativity.pdf

ATLAS gave 123.5 GeV for weak decay chain h→ZZ→4l, while CMS gave 126.5 GeV for h→γγ. We argue that if this mass difference is real (rather than a systematic detector miscalibration of some kind), it indicates a statistical relativistic effect: the Lorentz contraction in the direction of motion affects self-interactions of a moving spin-0 massless boson with its own field quanta, affecting weak and electromagnetic decays to a differing extent. So in a spectrum of massive spin-0 boson velocities produced by an LHC collision, the fastest moving massive spin-0 bosons could be more likely than expected to decay by double gamma emission; the slower ones might be expected to be more likely than expected to undergo weak decays and four lepton emissions. The higher the speed, the greater the slowing due to time-dilation on massive Z boson decay processes, whereas there is no time-dilation velocity effect for massless gammas (which go at light velocity in regardless).

To consider relativistic effects is much like calling for a parton model, which shows the vacuum reaction in a relativistic frame. And opens on new thinking: is this “massivization” not an effect in space-time, not string but a tension in space-time?

One would want to see the traces in the parameter-space as in Jester’s latest post on Resonaances. The traces are very distinct, and they show systematic variation, not in the parameters-space, but in the parameters of the space. Its “the parameters are the model”!

Also we have cf = +/- 1 at 99% confidence: about 3 sigma, the same as the SM model as a whole. If its Higgs, we’re seeing double!! But that’s not an SM prediction…

“If its Higgs, we’re seeing double!! But that’s not an SM prediction…”

Precisely (assuming the data is accurate).

(I remember well the “two yardsticks” Hubble parameter in the early 90s; one set of observations consistently gave roughly 50 km/s/megaparsec +/- 10%, the other researchers consistently gave nearer to 100, yet again with a standard deviation of roughly +/- 10%. So the universe was either 10 gigayears old +/- 10% or else 20 gigayears +/- 10% standard deviation. Quite a conflict. There was a miscalibration of a conceptual error in some cepheid variables. Ironically, Hubble had originally overestimated the Hubble parameter by a large factor, estimating roughly 500 km/s/Mps, due to a cruder cepheid variable misidentification. This bigger error was corrected back in the 1950s. Majority opinion was thought that the 50s correction of Hubble’s original cepheid yardstick error had settled the science forever on this topic. Then they eventually discovered that there were still subtler distinctions among the populations of cepheids! Classic groupthink.)

Suppose that the data are correct, but that (unlike my idea above) it’s

nota relativistic effect that’s causing the two masses for spin-0 massive bosons. Then, as you say, there must be two bosons. “We’re seeing double.”The spin-0 boson decaying by gamma emission is more massive than that undergoing weak interactions for a simple reason: the

mass is linked to the interactions each boson undergoes.ATLAS gave 123.5 GeV for weak decay chain h→ZZ→4l, while CMS gave 126.5 GeV for electromagnetic decay route h→γγ.So somehow the electroweak symmetry breaking is producing

twomassive spin-0 bosons, not one. One is 2% more massive than the other and is prone to decay by emitting two gamma rays. The lighter spin-0 boson is more prone to decaying by emitting two massive neutral weak bosons.The key difference between the two spin-0 bosons is rest mass (the charge for quantum gravity). The SM is too simplistic. Glashow and his adviser Schwinger in 1956 prematurely attempted a SU(2) purely Yang-Mills electroweak theory (unlike the SM’s U(1) X SU(2) electroweak mixing scheme) but fouled it up by naively trying to making the neutral boson the electromagnetic photon!

This meant giving mass to the two charged SU(2) bosons for weak interactions (neutral currents of Z bosons were totally unknown in 1956), but not to the uncharged SU(2) boson. This was experimentally wrong, but instead of continuing the effort toward an SU(2) electroweak theory, it led to the complete abandonment of the scheme in favor of U(1) X SU(2) mixing, where U(1) is a hidden hypercharge symmetry that is only manifested as electromagnetism after “mixing” with SU(2). What should have been done is the development of an SU(2) electroweak theory in which left-handed spinor field quanta acquire mass from mixing with a quantum gravity gauge group (thus becoming weak gauge bosons); the other handedness fails to acquire mass this way, so it remains massless and acts as electromagnetism. (The electromagnetic handedness shows up in Lenz’s law, in the handedness of the curl of the magnetic field around a current, as Maxwell himself argued using a mechanical model of spinning field quanta back in 1861; see his “On Physical Lines of Force” series of papers.) The Yang-Mills SU(2) equation for electromagnetism then reduces to Maxwell’s equations through a physical mechanism that constrains the net charge-transfer term to be zero for massless charged bosons: the infinite magnetic self-inductance of massless field quanta prevents one-way propagation, so they can only be exchanged in a perfect two-way propagation or exchange equilibrium.

To be fair, I should add that Sheldon Glashow does want the SM to be disproved and replaced:

‘Yet there are new things to discover, if we have the courage and dedication (and money!) to press onwards. Our dream is nothing else than the disproof of the standard model and its replacement by a new and better theory. We continue, as we have always done, to search for a deeper understanding of nature’s mystery: to learn what matter is, how it behaves at the most fundamental level, and how the laws we discover can explain the birth of the universe in the primordial big bang.’

– Sheldon L. Glashow, The Charm of Physics (American Institute of Physics, New York, 1991).

OT: Do you know the LHC schedule for the remaining few weeks? They’re doing VdM scans, and these weren’t in the plan at the last update.

nige, I find your perspective really interesting: my old conundrum is just the displacement current, from Maxwell’s correction of Ampere’s Law, whose physical basis is still unknown. We have here a Bohr-correspondence horizon for a revision of the SM.

Attending for once to Einstein’s old complaint, not just Maxwell’s theory but also the classical symmetry analysis is incomplete: one must pass from the Spin groups to the Pin groups to capture neutrino handedness. The Pin group adds an extra dimension of polarity (+/- 1) which must still be interpreted across the Spin formalism, opening onto twistor representations.

A Pin view of the Dirac and Pauli matrices: arXiv:hep-th/9810018v1
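For readers unfamiliar with the distinction drawn above, the standard relationship (stated here only for context) is that Spin and Pin are the double covers of the rotation group and the full orthogonal group respectively:

```latex
\mathrm{Spin}(n) \xrightarrow{\;2:1\;} \mathrm{SO}(n),
\qquad
\mathrm{Pin}(n) \xrightarrow{\;2:1\;} \mathrm{O}(n)
```

Since O(n) also contains the reflections (parity transformations) that SO(n) lacks, the Pin groups are the natural setting whenever handedness and parity are in play, which is why the commenter invokes them for neutrino chirality.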

If the Higgs parameter space has an unexpected symmetry in the “fermiphobic” factor, then that symmetry is also broken in at least one channel.

I see there’s some confusion among physicists about Pin groups: they are well known in algebraic geometry, but that again opens onto twistor representations, where work is rather slow. Such things keep us “strung out” for now.

112th LHCC Meeting: the open-session agenda is on Dec 5, and “Status of the LHC and Experiments” is on Dec 13 at 9 am. Will we see an update of those γγ and ZZ channels?

I think these tend to be reports on the running of the machine and detectors rather than the scientific results, but I could be wrong.