Before Independence Day and the ICHEP Higgs discovery I raised a question about the Higgs decay in the WW channel. In its early days this channel had shown a broad excess, but that had since faded to the point where it was consistent with no Higgs at all, rather than with the signal seen in some other channels. I asked how far we could trust these results.

The deficit was especially noticeable in ATLAS, with CMS showing a less significant shortfall in its event count. Today at the Higgs Hunting workshop ATLAS have released an update for their WW channel at low mass, combining the 7 TeV and 8 TeV data. Now they once again have a broad excess more consistent with a boson in the low mass range. There is also a conference note giving all the details.

Using unofficial combinations I can now update the plot that shows the size of the signal in each channel. Here it is with the earlier results from 2011 shown in blue and the updated versions in green. This is a global combination with the Tevatron data helping in the bb channel.

The diphoton channel still shows an excess, while the ditau now has a deficit. The others are broadly in line with the standard model Higgs. In any case there is not yet enough data to draw firm conclusions, but that is no reason not to speculate about what might explain the results if they hold up.


This entry was posted on Wednesday, July 18th, 2012 at 12:53 pm and is filed under Higgs Hunting. You can follow any responses to this entry through the RSS 2.0 feed.


Could the lack of tau compensate for the excess of gamma, via branching ratios? Surely. And then, a leptophobic Higgs? Giving mass to quarks but politely asking the leptons to get their mass elsewhere?

Wouldn’t that only work if Gamma(H -> tau tau) constituted 50% of the Higgs width in the SM, so that setting the partial width Gamma(H -> tau tau) = 0 would cut the total width Gamma(H -> anything) in the denominator in half and thus double the remaining BFs?

Since the tau tau final state only accounts for about 8% of Higgs decays in the SM, erasing it would only raise the other BFs by ~8%.

(This would also raise the WW and ZZ a bit.

You can lower the ZZ coupling to compensate, but the WW coupling is positively correlated to the gamma gamma rate and can’t be lowered too much without weakening that excess again.)
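The renormalisation of the surviving branching fractions can be checked with a few lines of arithmetic. A minimal sketch, using approximate SM branching fractions for a ~125 GeV Higgs (my round numbers, not from the comment):

```python
# Approximate SM branching fractions for a ~125 GeV Higgs (rounded;
# the comment above quotes tau tau as ~8%).
sm_bf = {"bb": 0.58, "WW": 0.21, "gg": 0.085, "tautau": 0.063,
         "cc": 0.029, "ZZ": 0.026, "gammagamma": 0.0023}

def rescaled_bfs(bfs, removed):
    """Zero out one partial width and renormalise the remaining fractions."""
    remaining = {k: v for k, v in bfs.items() if k != removed}
    total = sum(remaining.values())
    return {k: v / total for k, v in remaining.items()}

new = rescaled_bfs(sm_bf, "tautau")

# Each surviving BF rises by only ~7%, not by a factor of two.
print(new["gammagamma"] / sm_bf["gammagamma"])
```

Erasing a partial width only rescales the others by 1/(1 − BF_removed), so a small channel like tau tau cannot double the diphoton rate.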

[…] For more about this, and a nice summary of the latest combined data for various Higgs channels, see viXra log. The gamma-gamma channel Higgs signal is high, the tau-tau channel is low, others close to […]

A question about terminology: how should we define the Higgs if we allow a leptophobic Higgs? ;-) Do we give up the idea of fermion mass generation via the Higgs expectation value, or do we require separate Higgses for quarks and leptons? Should we just talk about a spin-zero boson?

The original Higgs mechanism did not include leptons, just gauge bosons, so I think a leptophobic Higgs is still a Higgs.

I agree with Gibbs here. The Higgs mechanism is about boson mass. Another question is what to do about leptons. My opinion is that the mechanisms for quarks and leptons are not independent. Remember that the postulate of “orthogonality” in Koide triplets allowed me to calculate the mass of the top quark to within one sigma starting from the leptons, just using the link between e-mu-tau and bottom-strange-charm and then going one step up, to charm-bottom-top (this was my viXra/arXiv dual preprint last year). So whatever the mechanism is, it is something that reaches both kinds of fermions.

The original Higgs mechanism did not include elementary fermions at all. It is absurd that an elementary spin-0 boson could give mass to all elementary fermions. A much richer mechanism is required. And… of course there is the question of what gives mass to the Higgs? :-)

Howdy,

Why is it absurd? It obviously works in the SM.

As for the Higgs, since it is a scalar one can simply write down a mass term, as it is not forbidden by the EW symmetries. You give it a negative mass-squared parameter, which destabilizes the symmetric vacuum, and around the new minimum the self-interaction bootstraps the Higgs boson to a real, positive mass-squared of +(125 GeV)^2.
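The tree-level bookkeeping behind that sign flip can be checked numerically. A minimal sketch, assuming the standard potential V = mu2*|phi|^2 + lam*|phi|^4 with v ≈ 246 GeV and m_H ≈ 125 GeV (values assumed here for illustration):

```python
import math

v = 246.0   # Higgs vacuum expectation value in GeV
mH = 125.0  # observed boson mass in GeV

# Tree-level potential V = mu2*|phi|^2 + lam*|phi|^4 with mu2 < 0.
# Minimising gives v^2 = -mu2/lam, and the physical mass obeys
# mH^2 = 2*lam*v^2 = -2*mu2.
lam = mH**2 / (2 * v**2)   # quartic self-coupling, ~0.13
mu2 = -lam * v**2          # negative mass-squared parameter, ~ -(88.4 GeV)^2

# The physical mass-squared comes out real and positive.
assert math.isclose(math.sqrt(-2 * mu2), mH)
print(lam, math.sqrt(-mu2))
```

Note that at tree level the unstable parameter is mu2 = -mH^2/2 ≈ -(88 GeV)^2, not -(125 GeV)^2; the factor of two comes from expanding the potential around the new minimum.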

@Alex,

The way fermions get their mass from the Higgs is via a Yukawa coupling. That means there is a different coupling for each and every elementary fermion. It is not really a mechanism or “explanation” at all for how fermions get their mass; it is completely ad hoc. Kane shows in “Modern Elementary Particle Physics”,

m_f = g_f v/sqrt(2),

where g_f is the coupling for the particular fermion and v is the vacuum expectation value ~= 246 GeV. You still have to plug all the different fermion masses into the Lagrangian density by hand. I think fermions get their mass from interaction with a quantized Dirac-Fermi field, and perhaps the Higgs gets its mass from that also. All that has to be figured out is the geometrical aspects of the quantized Dirac-Fermi field needed to generate masses for everything. Is there a geometrical math wizard in the house? :-) There may be 8 space dimensions involved.
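The ad hoc spread of couplings is easy to see by inverting m_f = g_f v/sqrt(2): one free parameter per fermion, spanning many orders of magnitude. A quick sketch with approximate round-number masses (my values, not from the comment):

```python
import math

v = 246.0  # vacuum expectation value in GeV

# Approximate fermion masses in GeV (round numbers for illustration).
masses = {"electron": 0.000511, "muon": 0.1057, "tau": 1.777,
          "bottom": 4.18, "top": 173.0}

# Invert m_f = g_f * v / sqrt(2): a separate, unexplained Yukawa
# coupling for each fermion.
couplings = {f: m * math.sqrt(2) / v for f, m in masses.items()}

print(couplings["top"])       # close to one
print(couplings["electron"])  # about five orders of magnitude smaller
```

The top coupling comes out of order one while the electron's is ~3e-6; nothing in the mechanism itself explains that hierarchy, which is the commenter's point.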

All that the theorists aimed to do when building the standard model was to find a consistent way to describe the observed physics. The chiral gauge invariance of the weak interaction does not allow mass terms, but Yukawa terms are renormalisable, and so are quadratic and quartic terms for the scalar. That is all that matters. It is not about beauty, simplicity, principles of symmetry or particles with god-like properties, just the need for a consistent renormalisable theory that matches nature.

A charge with no mass is an immediate crisis. Mass actually comes from the Kaluza-Klein construct, as a corollary to the existence of the spacetime metric. Can anyone else here figure out how to draw a Kaluza-Klein charge (by strictly adhering to the demands of general covariance)?

And a couple more epicycles are not going to cause anyone to doubt the wisdom of the Higgs mechanism.

You are apparently operating under the assumption that no one in the scientific community dares consider alternatives to the Higgs mechanism. I know it’s hard to let go of conspiracy theories, but that’s silly. It’s just that every known alternative so far has needed many more epicycles to just barely accommodate all the experimental observations that are automatically predicted by the standard mechanism.

So the only choices are stick with a few epicycles or go to a model with many more epicycles?

How about theories of principle that do not require epicycles, and that can make definitive predictions that are prior, feasible, quantitative, NON-ADJUSTABLE and unique to the theory being tested?

How about a willingness to say maybe the dark matter is not “WIMPs” after 40 years of being completely AWOL?

How about theoretical physicists who can seriously consider that they have wound up in a cul-de-sac?

The trendline towards an SM Higgs result in four of the five measured channels as more data is collected is pretty encouraging. So is the fact that for the two decay modes with the largest expected branching ratios (bb = 60%; WW = 21%) the SM prediction is within the experimental margin of error and the measurements are moving closer to it as more data is collected. By comparison, the below-expectation tau tau has a 5% branching ratio, the converging and within-MOE ZZ is 2.5%, and the above-expectation diphoton is 0.2%. Channels representing 83.5% of events are right where they should be, a channel representing 5% of events is quite low, and a channel representing 0.2% of events is quite high. Channels representing the remaining 11.3% of events have not been measured so far.

We would expect something like 60,000 bb events and 21,000 WW events among the roughly 100,000 Higgs decays in the data so far (in round, approximate numbers for ease of illustration), as opposed to 5,000 tau tau, 2,500 ZZ, and 200 diphoton decays (some channels are omitted because the experiments cannot pick them out of messy backgrounds).
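Those counts are just the assumed ~100,000 decays multiplied by the round branching fractions; a quick sketch using the comment's own numbers:

```python
n_higgs = 100_000  # rough total number of Higgs decays, per the comment

# Round branching fractions quoted in the comment above.
br = {"bb": 0.60, "WW": 0.21, "tautau": 0.05, "ZZ": 0.025, "gammagamma": 0.002}

# Expected raw event counts per channel (before detector efficiencies
# and backgrounds, which are what actually limit each measurement).
expected = {ch: round(n_higgs * f) for ch, f in br.items()}
print(expected)
# {'bb': 60000, 'WW': 21000, 'tautau': 5000, 'ZZ': 2500, 'gammagamma': 200}
```

The striking point is how few raw diphoton events (~200) underlie the most-discussed excess, which is why statistical fluctuations there are so plausible.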

There is also an inherent experimental bias towards a diphoton excess just at the moment of a confirmed Higgs boson discovery: because the diphoton channel is the cleanest of the lot, it is critical to claiming that one has discovered a Higgs boson, as opposed to something else. The odds that the data supporting a Higgs boson discovery arrive at a point in data collection when random variation has produced an excess of diphoton events over the SM expectation, rather than a deficit, are considerable.

The tau/tau deficit is more of a concern, because the number of events attributable to Higgs boson decays is much greater in absolute terms than in the diphoton channel, and because the tau/tau channel, being less diagnostic of a Higgs boson discovery than the diphoton channel, should be less prone to a discovery bias in favor of an excess over the SM. And if anything, there would still be a slight discovery bias towards an excess rather than a deficit of such events. The trend away from the SM expectation with more data is also notable, although not unduly so; it is still one of the smaller of the five channels. If the tau/tau rate doesn’t move back towards the SM expectation in the next round or two of LHC data, one has to seriously consider the possibility of new physics influencing this channel.

The tau/tau channel is also harder to explain with new physics independent of a SM Higgs. Adding a new unstable even-integer-spin particle to the zoo can easily give you an excess of diphotons in the presence of a SM Higgs, and its properties can be tuned to the data. A process that gobbles up tau/tau events produced by SM Higgs boson decays so that they are hidden in the final data is far harder to imagine (perhaps tau pairs annihilate each other along correlated paths more often than expected and convert themselves into diphoton events, but it is hard to see the expectation calculations getting that wrong). It is much easier to imagine systematic experimental errors causing an underreporting of tau/tau events (either in the software used to make cuts or in the hardware used to detect them) than a theoretical tweak to the model that could do the same without screwing everything else up more conspicuously.

http://phys.org/news/2012-07-higgs-absolute.html#jCp

http://arxiv.org/pdf/1206.7114.pdf

Not that good news?

CERN Papers (7/31/2012):

ATLAS: Observation of a new particle in the search for the Standard Model Higgs boson with the ATLAS detector at the LHC:

http://arxiv.org/abs/1207.7214

CMS: Observation of a new boson at a mass of 125 GeV with the CMS experiment at the LHC:

http://arxiv.org/abs/1207.7235