The Power of Theory

February 28, 2012

I couldn’t resist. No offense intended :)


Higgs 2012

February 23, 2012

2011 lived up to all expectations and hopes for news about the Higgs boson, but 2012 promises to be its crunch year, and the excitement is about to begin.

LHC startup

The Large Hadron Collider is getting ready to restart operations after the winter shutdown. The first part of the schedule looks like this

As you can see, they should be starting full powering tests today, which means the complete circuit of magnets should be cooled down to its working temperature of 1.9 Kelvin. However, as the picture below shows, one of the RF cavities is still at room temperature. I hope it is something that can be sorted out soon so that they can be on their way.

A summary of how the LHC will run during 2012 can be found in a presentation by Rogelio Garcia. This slide in particular says it all.

They expect 15/fb to 19/fb of integrated luminosity for the whole year. Last year they expected 1/fb and delivered more than 5/fb, but this year we should not expect such a large overshoot. The machine has been brought close to its present operating capabilities and the peak luminosity cannot be pushed much beyond the numbers they are aiming for. The main uncertainty is in how efficiently it will run. Last year there were times when it ran smoothly for two weeks and other times when technical issues held up progress for almost as long. The estimates for 2012 are based on the assumption that an average of the two extremes will be seen, but reality may differ. The decision to stick with 50ns at least means that the running will not be too different, although the higher energy and tighter squeeze than last year will be challenging enough.

ICHEP 2012

This year the International Conference on High Energy Physics will be taking place in July in Melbourne. This is the largest meeting on the HEP calendar and it is only held every two years. The experiments will be keen to have something new to say about the Higgs for the occasion, so they have asked for 5/fb by June. With the 5/fb already analysed at 7 TeV and another 5/fb at 8 TeV there is a good chance that very convincing evidence for the Higgs will be found. However, I understand that they will not be combining the results at different energies immediately. I will of course perform my usual party trick of combining the plots unofficially to provide combinations over the different energies, channels, and experiments. I expect to be very busy. However, the approximate combinations do not give a definitive answer to how many sigmas of statistical significance have been observed. That will have to wait for official combinations to provide the p-value plots.

Moriond 2012

Much sooner than ICHEP we will have the Moriond meetings. These are split into several parts including the Electroweak conference and the QCD conference (there are also Theory and Cosmology meetings). The Higgs reports should be in the electroweak section, but from the preliminary programs you can see that there is more about the Higgs at the QCD meeting, with Sunday 11th of March being the crunch day. One talk that is so far conspicuous by its absence is the ATLAS+CMS Higgs combination. I am led to believe that this will not now be ready in time due to the recent update by CMS. Producing the combinations is a long process, and as the amount of data to analyse increases it can only get longer. It is also possible that the difference in position of the peak excess from the two experiments is giving some cause for delay while they improve their calibration methods to see if the peaks can be brought closer together. I would not be surprised if they abandon the full combination and aim to just get decisive results from both individual experiments instead.

Since the LHC has nothing new to show about the Higgs, the interest will be in what the Tevatron can produce from its complete 10/fb of data. In the last month they have published their final results for the diphoton channel (already reported at AP and TRF); here is the plot:

This plot tells us almost nothing because the limits are at ten times the expected Higgs cross-section. Any bumps at this level of sensitivity would almost certainly be statistical fluctuations. The Tevatron is not very sensitive in the diphoton channel because Higgs production is lower at the lower energy and because the mass resolution is not very good compared to the LHC. However, the Tevatron does much better in the bb decay channel and their complete combination should be quite good.

The overall expected sensitivity of the Tevatron to a 125 GeV Higgs is 3-sigma, we are told. Previously published results reached 2-sigma sensitivity but only a 1-sigma excess was seen; they were probably unlucky. Due to the inferior energy resolution of the Tevatron, any excess in the low mass region should also be expected to be quite wide compared to what we have seen recently at the LHC. Here is my simulation of what we might see as the final result.
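For anyone who wants to cook up their own version, here is a minimal toy sketch of the kind of thing I mean. Every number in it is an illustrative assumption rather than a Tevatron figure: a 3-sigma expected signal at 125 GeV, a broad width to mimic the poorer mass resolution, and point-to-point statistical noise.

# Toy sketch of a fake "final Tevatron" excess (illustrative assumptions only:
# ~3 sigma expected signal at 125 GeV, ~5 GeV mass resolution, unit noise).
import numpy as np

rng = np.random.default_rng(42)
mass = np.arange(100.0, 151.0, 5.0)                 # GeV scan points, ~one per resolution width

expected_sig = 3.0                                  # assumed expected sensitivity at 125 GeV
resolution = 5.0                                    # assumed mass resolution in GeV
peak = expected_sig * np.exp(-0.5 * ((mass - 125.0) / resolution) ** 2)

observed = peak + rng.standard_normal(mass.size)    # add roughly independent 1 sigma noise

for m, s in zip(mass, observed):
    print(f"m = {m:5.1f} GeV  ->  local significance ~ {s:+.1f} sigma")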

Hopefully we will see the real deal on 11th March if they are ready in time. If the excess is any bigger than this fake version we should be happy; any less will be a bit disappointing, but it is all down to chance.


What to print in 3D

February 21, 2012

Over the last few years we have watched a whole load of new technologies go from expensive items for the professional to cheap gadgets for the home. Laptops, mobile phones, plasma TVs, digital cameras, HD camcorders, GPS, photo printers, scanners, the list goes on. This year everything is going 3D. TVs and laptops with 3D screens are already available at reasonable prices, and within a few years most gadgets that can be made to work in 3D will be sold mostly in 3D versions. If you are an early adopter you may already have your 3D phone with 3D screen and camera, but what about 3D printers?


3D printers don’t print 3D pictures that you view with 3D glasses like 3D TVs do; they print real 3D objects made out of plastic. Already they are being used by manufacturing and design companies for rapid prototyping, and in medicine they are being used to print bone replacements for knees, hips and jaws. Each part is a one-off with exactly the right shape, produced directly from a computer model. The cost of a 3D printer such as this HP Designjet is about €13,000, so a few rich gadget freaks may already have one in their home. For most of us the cost will need to come down by one or two orders of magnitude before it gets onto the Xmas wishlist. Will that happen, and if so, how fast? Assuming there is no technical obstacle, the answer depends on the demand. What would we use it for?

If you think the only thing a 3D printer could be used for in the home is printing spare buttons for your shirt then you are sadly lacking in imagination. Somebody with a bit more vision would see things differently, and he or she may be the next entrepreneur to reach the top ten on the world’s rich list. I regret that it isn’t going to be me, but it might be someone like Oskar van Deventer, who has been using Shapeways’ 3D printing services to make ingenious (and often amusing) puzzles based on Rubik’s Cube. When you need some inspiration you could do worse than browse some of the many videos on his YouTube channel. Here are some favourites.

FadiCube

Hollow Cube

Kilominx

Mixup Cube

Anisotropic cube

Unlucky Twist

YouCube

17x17x17 Rubik’s Cube

Twisty Tree

Gear Snake

Thank you for watching.

Bonus – How to solve Rubik’s cube in 5.66 seconds

When you are 3 years old it may take you a little longer


LHC Running parameters for 2012

February 10, 2012

The LHC performance workshop in Chamonix is over and the summary talk gives the proposed running parameters for this year (to be rubber-stamped by the directorship). As expected, the beam energy for this year will be 4 TeV.

As it turned out they never did do those thermal tests to see if the splices were up to it, but they formed a new risk assessment based on the fact that there were no beam-induced quenches in 2011. This was attributed to the successful use of snubber resistors (in case you needed to know such details).

To decide whether they should run with 25ns bunch spacing or stick with 50ns, they took completion of the Higgs search as the priority criterion. They figure this will require precisely 13.3/fb at 4 TeV so that ATLAS and CMS have enough data independently. Using the luminosity predictions that were reported in December, they find that at 50ns they would collect 16.5/fb while at 25ns they could only reach 11.5/fb, so obviously they must run at 50ns. I suspect that if they added error bars to some of those numbers the conclusion would not be so clear cut :-) . Beta* will be 0.6m whereas the earlier luminosity estimates assumed a beta* of 0.7m; never mind that this brings the estimate at 25ns up to 13.5/fb, just enough to find the Higgs :-) . 50ns is the right decision anyway.
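For what it is worth, here is the back-of-envelope rescaling I have in mind, as a rough sketch that simply assumes the delivered luminosity scales as 1/beta* with everything else unchanged.

# Crude rescaling of the December projections from beta* = 0.7m to the
# now-chosen 0.6m, assuming luminosity ~ 1/beta* and nothing else changes.
def rescale(lumi_fb, beta_old=0.7, beta_new=0.6):
    return lumi_fb * beta_old / beta_new

target = 13.3                                    # /fb said to be needed for the Higgs search
for spacing, lumi in (("50ns", 16.5), ("25ns", 11.5)):
    new = rescale(lumi)
    status = "above" if new > target else "below"
    print(f"{spacing}: {lumi:.1f}/fb at beta*=0.7m -> ~{new:.1f}/fb at 0.6m ({status} the {target}/fb target)")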

They also have an option to run for an extra 2 months before the long shutdown if that is needed to complete the search. Don’t be surprised if they do need it.


Some technical points about combining sigmas.

February 10, 2012

In the latest reports ATLAS is claiming 3.5σ local significance for their combined plot and 2.2σ global significance after the ‘Look Elsewhere Effect’ (LEE). For the diphoton channel alone they have 2.8σ local significance and 1.5σ global significance. Meanwhile over at CMS the figures are 3.1σ local significance and 2.1σ global significance for the combined plot, and for the diphoton channel they also have 3.1σ local significance and 2.1σ after the LEE.

Now everyone wants to combine these numbers. How can that be done and what is the answer? Concentrating on the combined plots for the moment, a common method is just to add them in quadrature

s = \sqrt{s_1^2 + s_2^2}

giving \sqrt{3.5^2 + 3.1^2} = 4.7 for the combined local significance and \sqrt{2.2^2 + 2.1^2} = 3.0 for the global significance. Is this correct?

No, that is wrong. The look elsewhere effect must be applied after combining.

The global significance is wrong because we have combined two results with the LEE already applied. We should combine the local sigmas first and then apply the LEE again. Well, the LEE is a subjective, observer-dependent quantity that nobody agrees about how to apply, so let’s just look at the local significance and let everyone estimate their own LEE afterwards. So are we correct for the local significance?

That is wrong too. The observed excesses were not in the same place.

It’s a good point. We can only combine the excesses at the same mass, and the peaks of the excesses differ by 1 or 2 GeV. If we do this we will get a smaller answer, but is that fair? The difference could be due to a systematic calibration error in one or other of the experiments. In fact this is looking increasingly likely as more data is added and the peaks do not get closer. We will have a much better idea about that when the data is doubled by the summer. So let’s be optimistic and just assume that the peaks will nearly coincide after recalibration. In that case we still have 4.7σ. Have we got it right now?

It is still wrong. Combining the numbers in quadrature is not the right formula.

If you think combining sigmas in quadrature is right, or even just approximately right, consider this scenario. Suppose in the first run of data I get a 2 sigma excess at some mass, but it is really just a statistical fluctuation. When the data is doubled we expect no excess at that mass, so we combine in quadrature to get \sqrt{2^2+0^2} = 2 and the excess is still two sigma. Double the data again and we might get a 1 sigma excess, so the total significance is \sqrt{2^2+1^2} = 2.2. Even if we double the data again and get a deficit of one sigma below expectation, we add in quadrature to get \sqrt{2.2^2+(-1)^2} = 2.4. So if you believe that sigmas are added in quadrature, you must believe that no excess can ever get smaller as more data is added. In fact excesses everywhere would grow like a random walk. This is obviously rubbish.
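To see this concretely, here is a small Monte Carlo sketch. The assumption is that each new equal-sized batch of background-only data gives an independent local significance drawn from a standard normal distribution; the proper combination stays near zero on average, while the quadrature ‘combination’ only ever grows.

# Monte Carlo sketch: combine per-batch significances correctly, sum/sqrt(n),
# versus the bogus quadrature rule sqrt(sum of squares), under the null hypothesis.
import numpy as np

rng = np.random.default_rng(1)
n_batches, n_trials = 8, 20000
s = rng.standard_normal((n_trials, n_batches))       # per-batch local significances

correct = s.cumsum(axis=1) / np.sqrt(np.arange(1, n_batches + 1))
quadrature = np.sqrt((s ** 2).cumsum(axis=1))

for n in range(n_batches):
    print(f"after {n + 1} batches: mean |correct combo| = {np.abs(correct[:, n]).mean():.2f}, "
          f"mean quadrature 'significance' = {quadrature[:, n].mean():.2f}")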

The reason for this is that these numbers are not error estimates that are normally added in quadrature. Have a look at this signal plot for the CMS and ATLAS signals in the diphoton channel

The excess for CMS is about 2.1 times the standard model signal, plus or minus 0.6. For ATLAS it is 2.4 ± 0.7. These are not sigmas. Those would be given by the size divided by the error, so 2.1/0.6 = 3.5 sigma for CMS and 2.4/0.7 = 3.4 sigma for ATLAS (not quite right, but I’ll come back to that). If we assume flat normal distributions then figures like these have to be combined as a weighted average, with weights given by the inverse squares of the errors. It is those errors which are combined in quadrature. For equal-sized data sets the errors should normally be about the same, which means that the correct formula for combining the sigmas is actually

s = \frac{s_1+s_2}{\sqrt{2}}

So redoing the calculation we get (3.5 + 3.1) \times 0.707 = 4.7 , the same answer. In fact, if the two sigma levels are similar, this formula gives an answer very close to what you would get by adding in quadrature, so why should we care?

The present excesses in the diphoton channel are larger than predicted by the standard model with a Higgs boson at that mass (though not as large as the excess over the standard model with no Higgs boson at all). It could be a sign that something non-standard is at work, but let’s assume that it is just a statistical fluctuation. In that case, when we double the data we expect to get just the standard model signal strength of 1.0 for the second half of the data. Since the errors shrink together, we can apply the same combination formula directly to the signal strengths: the effective signal next time, measured against the original error, will be (2.1 + 1.0) \times 0.707 = 2.2 for CMS and (2.4 + 1.0) \times 0.707 = 2.4 for ATLAS, barely different from the 2.1 and 2.4 we have now. In other words, if the excess is due to a standard model Higgs boson then we should not expect much increase when the data is doubled. Don’t get your hopes up for a discovery by the summer. In fact the size of the signal in diphoton could easily go down, and even with quadruple the data it may not grow much bigger. Hopefully the combination with ZZ and WW will fare better, because we have not seen the same over-excess in those channels, and they will have a discovery by the end of the year, but don’t bank on it unless you think the over-excess is a real non-standard effect.
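Here is a minimal sketch of that weighted combination, using the signal-plot numbers quoted above (which, as noted, are not quite the official sigmas) and ignoring the 2 GeV offset between the peaks. Each measurement is treated as a signal strength with a roughly Gaussian error, and ‘significance’ means the excess over zero signal.

# Inverse-variance weighted combination of signal-strength measurements.
import math

def combine(measurements):
    """Weighted average of (mu, error) pairs; the errors add in quadrature (inverted)."""
    weights = [1.0 / err ** 2 for _, err in measurements]
    mu = sum(w * m for w, (m, _) in zip(weights, measurements)) / sum(weights)
    err = math.sqrt(1.0 / sum(weights))
    return mu, err, mu / err                  # combined strength, error, significance

# Diphoton strengths quoted in the text: CMS 2.1 +/- 0.6, ATLAS 2.4 +/- 0.7
mu, err, sig = combine([(2.1, 0.6), (2.4, 0.7)])
print(f"CMS + ATLAS now: mu = {mu:.2f} +/- {err:.2f}  ->  {sig:.1f} sigma")

# Toy projection for CMS with doubled data, second half at the SM strength of 1.0
mu, err, sig = combine([(2.1, 0.6), (1.0, 0.6)])
print(f"CMS doubled:     mu = {mu:.2f} +/- {err:.2f}  ->  {sig:.1f} sigma")
print(f"equivalent strength at the original 0.6 error: {sig * 0.6:.1f}")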

So do we have the right number of sigmas yet?

It is still wrong. You forgot the systematic correlations and have produced NONSENSE.

OK, but let’s get serious. The previous unofficial combinations have shown that the correlations have a negligible effect on the combination when compared with official results. So the combined result of 4.7 sigma still stands.

It is still wrong. The distribution is not flat normal. It is log normal.

Again, this has been found to be a good approximation for doing the combination, but there is a good point to be made here. Should we read the number of sigmas off the plot when the CLs scale is linear or logarithmic? Have a look at these two plots, which show the same thing on log and linear scales.

Remember that the green and yellow bands show one and two sigma deviations, so the excess looks like three sigma on the log scale and four sigma on the linear scale. Which is right? If we assume the flat normal distribution is correct we should be using the linear scale, but the bands are more equally spaced on the log scale, so presumably that is more correct. The flat normal approximation is good for generating the plot, but we should be careful to read the size of the excess from the log scale. If we do that, will the answer be correct?

It’s still wrong. For best results use the combined p-value plot.

Have a look at what ATLAS and CMS are quoting for their local significance for the diphoton channel. CMS say 3.1 and ATLAS say it is 2.8. This does not match what you would get from reading either the linear or logarithmic plots. The numbers come from the p-value plots which are converted to sigma-equivalents. It looks like trying to get these numbers from the exclusion plots or the signal plots will never be that accurate. The bottom line is that we have to wait for the full official combination if we want to know the real answer. Until then adding in quadrature is probably as good as anything else. :-)
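For reference, the p-value to sigma conversion behind those quoted numbers is just the one-sided Gaussian tail; a quick sketch:

# Convert between a local p-value and its sigma equivalent (one-sided convention).
from scipy.stats import norm

def p_to_sigma(p):
    return norm.isf(p)        # inverse survival function of the standard normal

def sigma_to_p(s):
    return norm.sf(s)         # one-sided tail probability

for s in (2.8, 3.1, 3.5, 5.0):
    print(f"{s:.1f} sigma  <->  p = {sigma_to_p(s):.2e}")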


Prof or Hobo Quiz

February 10, 2012

I think this has been going round for some time, but in case you have not seen it, please have a go and let us know how you score. I managed an average of 7/10.


February Higgs Update

February 9, 2012

ATLAS and CMS have updated their Higgs publications based on last year’s data. These results were released rapidly at the CERN council meeting in December, but since then they have had time to polish the reports, and as a bonus CMS have added some new data into the diphoton channel. This has already been covered nicely on the other blogs including QDS, NEW, TRF, OPS and, best of all, the report from Resonaances, which gives a nice account of where the extra events come from.

“CMS added a new category of events which, apart from 2 photons, contain 2 energetic jets in the forward (closer to the beam pipe) region of the detector. Such events could arise in the so-called vector boson fusion (VBF) process, where each of the 2 colliding quarks emits a W or Z boson which coalesce to create a Higgs boson.” — Jester

The outcome of this is a stronger signal at 124 GeV, making it look very similar in strength to the one in ATLAS at 126 GeV. Here is a plot showing the new CLs for diphoton from CMS, with the December version in red for comparison:

As you can see, the peak at 124 GeV is stronger while the other bumps have gone down. I have plotted this on a log CLs scale rather than the linear scale that CMS use because you need a log scale to read the significance properly. You can see that it has now gone up to 2.5 sigma for this channel alone. Here is the same plot again with the results from ATLAS added for comparison (these have not changed since December).

If ATLAS looks better don’t forget that its expected value (the dotted blue line) is also higher so in fact the statistical significance is now about the same for both.

One mystery is why the peaks are about 2 GeV apart. This could simply be a statistical deviation, or it may be a sign that they still have some work to do on calibrating the calorimeters which measure the energies of the photons. The two experiments have different systems for detecting these photons. ATLAS uses a tank of liquid argon, a clear liquid that looks like this.

CMS, on the other hand, uses lead tungstate crystals, which look like glass but are as heavy as lead. See this old article from Symmetry Breaking for the remarkable story behind them.

When the CMS detector is finally laid to rest in about 40 years time they will be able to take the crystals out and make a great chandelier out of them as a memorial.

Another mystery about the diphoton results is that the signal is about twice as strong as expected. This can be seen clearly in this signal plot.

The excess over the standard model Higgs signal is about 1.5 sigma for each experiment. It could be a sign that the particle is not as standard as expected, but more probably this is just a fluctuation that will go away with more data. It does mean that we should be cautious about how much we expect the signal to improve if we double the data. It won’t be as much as you might think, and in fact the signal could get worse. This is why you should not expect conclusive results until all of this year’s data is in.

The unofficial Higgs combinations now need updating and here is the new ATLAS+CMS combo for the diphoton channel with the December version in red for reference.

The significance is not as good as it would be if the two excesses coincided better in mass but it is now almost 3 sigma.

The CMS all-channel combination has also been updated, of course. Here is a replot showing how it has changed, with the December levels in red. There is no change at higher masses.

The significance on this plot has now just passed 3-sigma at the peak of the excess.

This means that the unofficial Higgs combination needs to be updated too, so here it is.

The peak significance here has now reached about 3.5 sigma.

I should remind you that this is an unofficial approximate combination that ignores correlations and non-normal probability distributions. The official version from the Higgs combination group should be due out soon. It will be similar, but the differences are important.


Stop rumours!

February 7, 2012

Meaning that there are rumours going round about stops, or scalar tops, not that we should stop spreading rumours. In SUSY theories stops are typically the lightest sfermions (scalar fermions, which are bosons not fermions), related to top quarks, which are the heaviest quarks and indeed the heaviest particles in the standard model. If stops exist they would help stabilise the Higgs vacuum, which could be too unstable if the Higgs mass is around 125 GeV as now expected. But no one has seen one yet, and the situation for theorists has been getting a bit desperate because they had expected to see them at the LHC, so the anti-SUSY bloggers have been poking fun and saying “I told you so”.

Now rumours have been squarked to the blogotwittersphere via Motl at TRF and Jester of Resonaances that a signal for the stop has been seen in the data. The story so far has been summed up by Cliff Harvey on Google+, so look there for the details. There is a seminar next week that could be relevant to the rumour, but Jester’s last tweet says knowingly “Caution: theorists rumoring about stops is fact, but what is now out on blogs is 100% false. Dont jump unless more reliable rumors appear”, so what is going on?

Sooner or later someone is going to start a rumour just to catch us out. So is the greatest news story in the history of science about to break or have we been duped by a revengeful experimentalist who saw the next seminar as an opportunity to get back at us for all those earlier leaks on the theory blogs? Is it indeed a slepton or something we should have slept on?

By the way, there is an LHCb seminar about to be webcast, and they are the only ones with plausible BSM signals so far, so let’s slide back to reality and enjoy that until next week.


LHC Update: Chamonix

February 6, 2012

This week the operations groups of the Large Hadron Collider are holding their annual workshop in Chamonix to determine how to run the collider during 2012 and beyond. Many technical slides are up, giving us a good indication of the status of the machine now that most of the winter maintenance is complete.

Last year they were expected to produce 1/fb of integrated luminosity and they gave the experiments 5/fb, so this year we reward them with an expectation of at least 15/fb; no pressure. According to Lamont, the tentative running parameters include a beta* of 60cm in ATLAS and CMS compared to 1m last year. This means a tighter squeeze at the collision points and potentially a 60% increase in luminosity, so they are already halfway to the target. The rest depends on stability and the time they can keep it in stable beams. Last year 50% of fills lasted less than 3 hours and turnaround times were dominated by machine availability. Improving the efficiency will be key to getting more luminosity.
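As a rough check of that figure, here is the naive scaling, assuming luminosity goes as 1/beta* with every other parameter held fixed (the quoted ~60% presumably folds in other effects).

# Naive luminosity gain from squeezing beta* from 1.0m (2011) to 0.6m (2012),
# assuming L ~ 1/beta* with everything else unchanged.
beta_2011, beta_2012 = 1.0, 0.6
gain = beta_2011 / beta_2012 - 1.0
print(f"naive luminosity increase from the tighter squeeze: ~{gain:.0%}")   # ~67%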

Will they stick with 50ns or switch to 25ns to double the number of bunches? This won’t double the luminosity as you might expect, because the injector has to split the bunches, resulting in lower intensity. They will also be limited by beam-induced heating and emittance. Overall the smaller bunch spacing may decrease the luminosity, but it would also provide some much-wanted relief from the effects of pile-up in the experiments. There are other downsides to consider. The increased effects of the e-cloud at 25ns mean lower luminosity lifetimes. To mitigate this problem they will need a lot of scrubbing runs, taking about two weeks of machine running time compared to 2 or 3 days if they stick with 50ns. Will the extra uncertainty that this implies rule out 25ns operations for this year? See the talk by Rumolo for more details.

Another decision concerns the beam energy. Last year they ran at 3.5 TeV per beam, but this year they tentatively hope to increase this to 4 TeV. That would be great news for physics because it increases the discovery potential at the higher masses where new particles may be waiting to revolutionize our knowledge. Whether they can run safely at the higher energy depends largely on the results of tests on defective busbar joints. These will be reported tomorrow, so look out here for an update.

In July the biennial ICHEP conference is to be held in Melbourne. It is the biggest particle physics conference on the planet and it would make a big splash if they can produce some good results by then. The best hope is for a successful update on the Higgs search, which would require doubling the total luminosity by adding another 5/fb by mid-June. It’s not impossible, but they have already lost a week of runs due to a problem with RF in CMS, as reported in the CERN Bulletin. The start of beam operations is now scheduled for 21st March.

Update 7-Feb-2012: Today’s talks indicate that 4 TeV per beam will not present risks any larger than those accepted for 2011. This is due to the reduced number of quenches and the results of checks on the splices. See this talk in particular. Final decisions are not yet in, and as we saw last year, nothing is settled until they are.

