Physics In Collision 2010

August 31, 2010

One particle physics conference ends and another begins. After Toronto, the next one is Physics In Collision 2010, starting tomorrow at Karlsruhe. Access to slides of the talks can be found here.

Further results from ATLAS and CMS will be presented tomorrow, and with the rapid increase in data collected there is always a chance that new plots will appear that were not seen in Toronto last week. These LHC experiments now have 3.6/pb of data collected. New negative results are possible. However, it is unlikely that they will show any results with positive hints of new physics, because anything that exciting would take longer to check and approve.

Speakers face a little dilemma at these conferences because they need to upload their slides in advance, but they don’t want to spoil the talk by showing their results early, especially if there is a risk that someone might blog about them. One poster presenter has tackled this problem by allowing some cats to wander in front of his critical result. If you are going to try this technique, don’t forget that it is possible to extract images and text from a PDF even if they are hidden by overlays.
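As a caution for anyone relying on overlays, here is a minimal sketch of how easily hidden text can be recovered, using the pdfminer.six library (the file name poster.pdf is hypothetical, and this assumes the hidden value is still stored as a text object rather than flattened into an image):

    # Minimal sketch: text placed underneath an image overlay in a PDF is
    # usually still stored as a text object, so plain text extraction will
    # recover it. Requires the pdfminer.six package; "poster.pdf" is a
    # hypothetical file name.
    from pdfminer.high_level import extract_text

    text = extract_text("poster.pdf")
    print(text)  # includes any text hidden behind the cat pictures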

In this case we can reveal that the latest result for CP-violating asymmetry at Belle is

XXX ± 0.50 ± 0.22

Oh well…


LHC update and plans for Sept/Oct

August 29, 2010

The experiments at the Large Hadron Collider have now collected over 3/pb of data each. This plot shows the integrated luminosity collected by ATLAS as 2.83/pb, but you can add 0.28/pb from the current overnight run, which will continue to collect data this morning. Physics runs will go on until Monday morning, when the collider will be shut down for a regular start-of-month technical stop.

There had been some debate among the beam controllers about whether or not to increase luminosity further before the stop, but it was decided that the priority should be to perfect the injection procedures.

There are now 9 weeks left to increase the luminosity before the LHC is turned into a heavy ion collider during November. In those 9 weeks the luminosity must be increased from its current peak value of 10/μb/s to 100/μb/s. That is equivalent to about 3/fb per year, and they want to collect 1/fb in 2011. When you take into account the luminosity decay and the available running time, you find that they really need twice that luminosity to achieve their goal, so either they will have to increase luminosity further in 2011 or settle for less than 1/fb of data.
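As a rough sanity check on these numbers, here is a minimal sketch of the arithmetic (the average-to-peak ratio and the fraction of the year spent colliding are illustrative assumptions, not figures from the machine schedule):

    # Rough check of the luminosity arithmetic.
    # Units: inverse microbarns per second (/ub/s); 1/fb = 1e9 /ub.
    PEAK = 100.0               # /ub/s, end-of-year target peak luminosity
    SECONDS_PER_YEAR = 3.15e7

    ideal = PEAK * SECONDS_PER_YEAR / 1e9        # in /fb
    print(f"running at peak all year: {ideal:.1f}/fb")   # ~3.2/fb, matching the ~3/fb quoted

    # Real running: luminosity decays during a fill and much of the
    # calendar is downtime. Both factors below are illustrative guesses.
    AVG_OVER_PEAK = 0.5        # assumed average/peak luminosity within a fill
    PHYSICS_FRACTION = 0.3     # assumed fraction of the year in stable beams

    realistic = ideal * AVG_OVER_PEAK * PHYSICS_FRACTION
    print(f"with decay and downtime: {realistic:.2f}/fb")  # ~0.5/fb
    # Hence the factor of two: reaching 1/fb in 2011 needs roughly twice
    # the 100/ub/s target, or a further increase during the 2011 run.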

To get to higher luminosities they need to perfect the injection procedures so that they can pack the proton bunches into the collider ring more closely. They have recently debated whether to aim for injections with a spacing between bunches of 75ns or 150ns (22.5m or 45m). The decision is to aim for 150ns separations because it avoids parasitic collisions. These are collisions between bunches that happen outside the normal collision point when the angle at which the beams cross is not sufficient to keep them apart. Such collisions produce false signals in the detectors and may produce radiation in unwanted places that ages the equipment.

With 150ns separations they can fit a maximum of 384 bunches into each beam, which should be enough to increase the luminosity to 100/μb/s. However, the plan is still tight and they cannot afford to lose much time to unplanned stoppages. Otherwise they will be left with more work to do when they restart next year, making it even harder to hit the 1/fb target. The work to complete the injection method has already been held back by a problem with the UPS systems, so they cannot afford further delays after the technical stop.
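To see why 384 bunches gets them most of the way there, here is a back-of-envelope scaling sketch (it assumes peak luminosity grows linearly with bunch count and that the per-bunch performance of the current 50-bunch scheme carries over, both simplifications):

    # Back-of-envelope scaling from 50 bunches to the 150 ns maximum.
    # Assumes luminosity scales linearly with bunch count and that the
    # current per-bunch performance carries over -- both simplifications.
    current_bunches = 50       # bunches per beam in the current scheme
    current_peak = 10.1        # /ub/s (see the Aug 24 post below)
    max_bunches = 384          # maximum bunches with 150 ns spacing

    projected = current_peak * max_bunches / current_bunches
    print(f"projected peak: {projected:.0f}/ub/s")   # ~78/ub/s
    # Close to the 100/ub/s goal; the remainder has to come from brighter
    # bunches or tighter focusing at the collision points.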


“crackpots” who were right: the conclusion

August 28, 2010

I have been posting a blog series about scientists who were called “crackpots” but eventually turned out to be right. There is a convenient archive of the posts under the tag crackpots-who-were-right in case you missed any of these fascinating stories. I could carry on the series forever, but I want to do other things so I’m going to conclude it with this last post.

If I had continued I would have gone on to tell you about Barry Marshall, who got the Nobel Prize after showing that stomach ulcers are caused by a bacterium rather than stress as everyone believed. He found it so difficult to convince anyone that he eventually drank a petri dish of the bacteria to prove it. I also wanted to write a bit about Robert Chambers, who wrote a popular book about evolution before Darwin. He was ridiculed by biologists for his misuse of terminology, but the public were won over and he paved the way for acceptance of Darwin’s theory while much of the scientific establishment held on to creationism. I also never got round to the famous case of Hannes Alfvén, another Nobel laureate, who faced ridicule when he realised that plasmas and magnetic and electric fields are important in galactic physics, not just gravity as everyone else believed. Nor have I mentioned Subrahmanyan Chandrasekhar, who showed that stars above a certain size would eventually collapse to form black holes at a time when others did not believe they could really exist. The lambasting he got from Eddington almost ended his brilliant career. Then there was Joseph Goldberger, who showed that pellagra is a disease caused by dietary deficiency, but for political reasons his opponents continued to claim it was infectious. Others on my list are William Harvey for blood circulation, Doppler for light frequency shifts, Peyton Rous for showing viruses can cause cancer, Boltzmann, Dalton, Tesla, Alvarez, Margulis, Krebs, and on and on. All of them had to fight against resistance before their groundbreaking work gained the recognition it deserved.

But so what? What can we draw from this? Some people have commented that these people were not real crackpots. They worked as real scientists and had ideas that just took time to establish. They are not like the people who turn up in physics and maths forums with crazy ideas that show no respect for hundreds of years of progress in science. Furthermore, our “crackpots”-who-were-right are a tiny minority compared to all the ones who were wrong.

I disagree with these points. Firstly, these people really were treated as crazy and were subjected to ridicule or were ignored. The cases described here are the extremes. There are many more who have merely had an important paper rejected. In fact it is hard to know the real extent of the problem because only the most important stories get told in the history of science. My guess is that these people represent the tip of a large iceberg, most of which lies hidden below the threshold at which historians take note.

Furthermore, even if the “crackpots” who were right are the minority among all “crackpots”, they are still the most significant part. It is better to create an environment in which these people can have their theories recorded for the sake of the few who are right, than to try to dispel them all because of some irrational fear that they disrupt real science.

And, even amongst those who have really crazy ideas there will be the people like Ohm who also have some valid ideas hidden underneath. No amount of peer-review or archive moderation can reliably separate the good ideas from the bad. The only solution is to allow everyone to have their say and to record it in a permanent, accessible form. Some people ask me why I expect scientists to wade through so many papers looking for something they find worthwhile. The answer is I don’t. Work of no value will be ignored, while useful ideas will be found by someone doing related research through keyword searches or other means. Even in the academically run archives there are vast numbers of papers that will never be cited or read by many people. Scientists find out about new ideas through citations, seminars, conferences, word of mouth, etc.

I hope that at least some people will read this series and get the point about why we run the viXra archive with an open policy that allows any work on a scientific topic to be recorded. I can’t say that some future Nobel Prize winner will be among our deposits, but it is not impossible. More likely there will be lots of smaller good ideas that move science along in less dramatic steps, but that is the way most science is done.


Beyond The Standard Model CMS/ATLAS at Toronto

August 27, 2010

“Beyond the Standard Model” or BSM has become this year’s trending phrase in particle physics as the Tevatron, the Large Hadron Collider and a range of passive particle physics experiments battle it out to see who will find the first conclusive signs of physics that cannot be explained within the parameters of the Standard Model of particle physics.

There may already be some signs of BSM physics from the Tevatron, where the matter/anti-matter asymmetry is more consistent with a multiple Higgs sector as predicted by supersymmetry. Now attention is focussed on the Large Hadron Collider, where the exponential growth in accumulated collision events at energies 3.5 times higher than the Tevatron means that new physics could surface there at any moment. When it does, they may take months to analyse it carefully before they dare make an announcement, but negative results can be released tentatively much more quickly and will appear at the never-ending series of particle physics conferences.

The conference currently running is the HCP symposium in Toronto, and yesterday morning was the time for BSM physics reports from the Tevatron and the LHC. The talks about BSM physics from the ATLAS and CMS experiments were the most interesting. There were no positive results shown, but there were new plots using more data, with some new limits set on BSM parameters.

Only the slides are online with no video, and I am no expert at interpreting particle experiment plots, but at least some information is clear from the slides. The CMS talk by Sung-Won Lee had the most to show. This plot of dijet events goes up to 1.9 TeV using 836/nb. This will be one to watch as more data is accumulated.

For more, see the slides for yourself on the symposium website.


New results from the LHC at Toronto

August 24, 2010

It’s less than a month since we saw results from the Large Hadron Collider presented at the ICHEP conference and marvelled that they were able to show the analysis from 350/nb of luminosity collected just a few days before the conference opened. But ICHEP is not the only conference worth watching, and now they have about 2200/nb (2.2/pb) of data, with much of it collected in the last week.

At the Hadron Collider Physics Symposium in Toronto this week more new results are being shown, some using over 1/pb of data, or about three times what was seen at ICHEP. Sadly there are no videos of the talks, but the slides are being put online and I hope that some of the more expert particle physics bloggers will soon be able to tell us some details of what has been presented.

From a talk today by Corrine Mills on W and Z physics at ATLAS we have this nice update to the beautiful plot of muon pairs, showing some classic resonances very clearly. This is using 0.9/pb of data. I can’t help thinking they have cut the x-axis off a bit shorter than necessary. What lies just above the 100 GeV energy range shown, I wonder?

More plots using over 1/pb appear at the end of the slides. I don’t think there is any new physics here, but it’s nice to know that some results from the first 1/pb have come out so quickly. It is only 16 days since they passed that milestone. ATLAS have really thrown down the gauntlet to CMS, who only showed plots using 200/nb in their talk on the same topic.

Tomorrow they will present top physics and Higgs searches, with Beyond Standard Model results and heavy ion physics the day after. On Friday they conclude by looking to the future, including the possibility of LHC upgrades and the case for keeping the Tevatron running. Startling announcements are not expected, but it’s worth watching just in case.


LHC Luminosity passes 10 inverse microbarns per second

August 24, 2010

The Large Hadron Collider has passed a new luminosity milestone, with a peak of 10.1/μb/s (or 10^31 Hz/cm2, or 0.32/fb/year) recorded in the CMS detector. This was achieved at the start of a run today with a new filling scheme using 50 proton bunches per beam. This figure is one tenth of the target luminosity for the end of this year that is needed to get them ready for collecting 1/fb during 2011.
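For anyone wanting to check the unit conversions between those three figures, here is a minimal sketch (the seconds-per-year value is the usual approximation):

    # Unit check for the quoted peak luminosity.
    peak = 10.1                   # /ub/s, as recorded by CMS
    UB_TO_CM2 = 1e-30             # 1 microbarn = 1e-30 cm^2
    print(peak / UB_TO_CM2)       # ~1.0e31 per cm^2 per second
    SECONDS_PER_YEAR = 3.15e7     # approximate seconds in a year
    FB_PER_UB = 1e-9              # 1/ub = 1e-9 /fb
    print(peak * SECONDS_PER_YEAR * FB_PER_UB)   # ~0.32/fb per year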


Can nuclear decay rates change with solar flares?

August 24, 2010

Following a news article from Stanford, Slashdot has followed up with a report on observed changes in the decay rates of radioactive elements caused by solar flares. This is not really new: it follows other claims that nuclear decay rates change by about 0.1% on an annual cycle (or on longer cycles corresponding to various forms of solar activity). Of course, Motl reported it two years ago.

The explanation for the effect offered by the scientists involved is that it is caused by changes in neutrino fluxes from the Sun. There are small variations in neutrino flux of this sort, but the effect on decay rates seems unlikely because neutrino interactions as we know them are too weak for this to happen. It would have to be something far outside the Standard Model that has somehow avoided detection in very sensitive experiments such as Super-Kamiokande. If not neutrinos, then the next suggestion is some as-yet-unknown particle. It is an appealing idea, but it is still unlikely that something like that could affect nuclear decay rates without being observed in different ways in other experiments.

Could the effect be caused by something more basic such as changes in electric or magnetic fields or even temperature in the environment of the experiment? If such dramatic effects on decay rates could be caused in these ways it would have been observed in controlled experiments a long time ago, so this can be ruled out. Environmental effects on the measuring apparatus are another matter, so suspicions immediately arise.

Putting aside the solar flare result for the moment, it turns out that someone has already done more careful experiments to look for annual variations in decay rates and found nothing (see “Evidence against correlations between nuclear decay rates and Earth–Sun distance”). Here is one of their plots. I have chosen the one that, to my eye, comes closest to showing some effect, but it is nowhere near the claimed effect shown by the sine curve and is not statistically significant.

Could there be some effect seen better with solar flares? The trouble is that solar flares are relatively brief. One claim was for an effect that lasted 43 minutes. The studies of annual variations were performed over a period of two years, and even then the statistical errors are only a little below the size of the expected effect. If solar flares caused an effect observable over 43 minutes, it would surely have to be much larger to stand out from the noise.
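A rough way to see this: counting statistics shrink as the square root of the observation time, so here is a minimal sketch of how much a 43-minute window inflates the error bar relative to a two-year study (the square-root scaling is the standard Poisson assumption):

    import math

    # Poisson counting errors scale as 1/sqrt(observation time), so a
    # 43-minute window has a far larger error bar than a two-year study.
    # The ~0.1% annual-effect size is from the post; the scaling is the
    # standard statistical assumption.
    two_years_min = 2 * 365.25 * 24 * 60   # two years in minutes
    flare_min = 43                          # claimed flare effect duration

    inflation = math.sqrt(two_years_min / flare_min)
    print(f"error bar grows by ~{inflation:.0f}x")   # ~156x
    # A two-year study barely resolves a ~0.1% annual effect, so a flare
    # signal would need to be of order 15% of the decay rate to stand out
    # equally well in 43 minutes -- far larger than anything claimed.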

Sadly then, the warning signs are not aligning in favour of these results, but ruling out an effect conclusively might require more experiments.

