LHC Heavy Ion Collisions

October 29, 2010

In about a week's time the Large Hadron Collider will stop proton-proton physics for this year, and the physicists working on ATLAS, CMS and LHCb will work hard on their 50/pb of data to try to figure out if supersymmetry exists in nature. Meanwhile the LHC will continue running for another month colliding lead ions instead of protons. The main experiment designed to take advantage of these heavy-ion collisions is ALICE, but ATLAS and CMS will also take a look.

One of the exciting features of the heavy-ion collisions will be the total amount of energy in each collision. I don't know how high they will actually get in this first series of runs but the target seems to be about 2.75 TeV per nucleon in each beam. Lead-208 nuclei have 208 nucleons so the total centre-of-mass energy could be as high as 1100 TeV. That is enough energy to create a million protons. The LHC is about to become the world's first Petatron collider. (Edit: actually it will be half this energy for this year with full energy in 2013, see comments)
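For anyone who wants to check the back-of-envelope arithmetic, here is a quick sketch. The figures used (208 nucleons for Pb-208, 2.75 TeV per nucleon per beam) are my rough numbers, not official machine parameters:

```python
# Rough centre-of-mass energy for Pb-Pb collisions at the LHC
# (back-of-envelope figures, not official machine parameters)
nucleons = 208              # Pb-208 is the isotope used
e_per_nucleon_beam = 2.75   # TeV per nucleon in each beam (nominal)

# Both beams contribute to the centre-of-mass energy
total_tev = 2 * e_per_nucleon_beam * nucleons
print(total_tev)            # about 1144 TeV, i.e. over a petaelectronvolt

# Proton rest mass is ~0.938 GeV = 0.000938 TeV, so this is
# enough rest energy for over a million protons
proton_mass_tev = 0.000938
print(total_tev / proton_mass_tev)
```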

Sadly this does not mean they will be exploring the particle spectrum at such high energies. When heavy ions collide it is more like lots of small collisions between the quarks and gluons in the nuclei, so the energy does not get concentrated into the production of any single heavy particle. Instead you can get lots of lighter particles that can form a very hot plasma ball. The interactions involved are dominated by QCD so this experiment is mostly a study of QCD phenomena.

If you want an idea of what such a collision will look like have a look at what has been seen at the RHIC collider. Here is a typical example with thousands of particles produced. RHIC collides gold ions at energies of 200 GeV per nucleon pair, so we can expect something like five times as many particles at the LHC.

The aim of these experiments is to find out something about the phase diagram of QCD. According to various theories it probably looks like this

All the stuff to the bottom right is what happens in neutron stars at very high densities of matter. There is not much possibility of recreating such conditions in any experiment because the only way to produce the densities required is by using the enormous pressures due to gravity that occur inside neutron stars. We will have to rely on astronomical observations to probe those regions of the phase diagram. RHIC and the LHC are better suited to looking at the top left where the enormous collision energies produce a plasma at very high temperature.

The hadronic matter phase is what we are used to at low temperatures and up to nuclear densities. Here the quarks are confined inside baryons and mesons. At higher temperatures and densities theory predicts that the quarks will enter a deconfined phase where hadrons do not form. Instead the quarks and gluons just form a liquid-like plasma where they can flow around freely. You can cross from the hadronic phase to the quark gluon plasma over a first order phase transition (the thick red line) where the two phases mix just like gas and liquid in boiling water. However, at lower densities you can pass from one phase to the other without going through the phase transition. A similar thing happens with water turning to steam at high pressure. The first order phase transition stops at a critical point, and one objective of RHIC has been to try to find this point experimentally. This requires running with lower energy, not higher energy, so the LHC is not looking for the same thing.

Instead the LHC will be able to explore the crossover region where there is a smooth change from confined to deconfined matter, but there is something else in this region. Another phase transition is thought to be crossed over, but it is a second order phase transition, not first order. This is the phase transition for chiral symmetry breaking.

The QCD Lagrangian has an approximate symmetry known as chiral symmetry that relates different flavours of quarks and left and right chiral states. The symmetry is broken by any quark mass, but the up and down quark masses are small enough for this symmetry to be a good approximation. The symmetry is also broken by the electric charges, but this is also a relatively small effect. The spontaneous symmetry breaking leaves a residual symmetry, which is isospin, and it generates Goldstone bosons such as the pion. The pion would be massless if the symmetry were exact. At high temperatures the chiral symmetry is restored, so there must be a transition. Lattice calculations suggest that it occurs at temperatures corresponding to about 170 MeV.
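In standard textbook notation (nothing specific to this post), the two-flavour pattern can be written compactly:

```latex
% Two-flavour chiral symmetry, spontaneously broken by the quark condensate:
SU(2)_L \times SU(2)_R \;\xrightarrow{\;\langle\bar{q}q\rangle \neq 0\;}\; SU(2)_V \quad\text{(isospin)}

% Three broken generators give three Goldstone bosons: the pions.
% The Gell-Mann--Oakes--Renner relation shows why the pion is light
% but not massless when the quark masses are small but nonzero:
m_\pi^2 \, f_\pi^2 \simeq -(m_u + m_d)\,\langle\bar{q}q\rangle
```

Since the condensate is negative, the right-hand side is positive, and the pion mass vanishes only in the limit of exactly massless quarks.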

With symmetry-breaking phase transitions it is not possible to have a cross-over region if the symmetry is exact. A symmetry is either broken or not; you can't go smoothly from one phase to the other. However, chiral symmetry is not exact in QCD so the phase transition will not be sharp. It is thought to coincide with the deconfining phase transition up to the critical point and then continue across to the zero-density axis as shown on my diagram. Actually it could separate before the critical point; we don't know for sure.

When the LHC starts ion collisions at ramped-up energies it will be a leap ahead of where RHIC has been looking. It has been said that it is one of the largest jumps in energy for a specific type of accelerator ever taken. The regions explored are thought to be similar to the conditions in the big bang a little after inflation stopped. It will be very interesting to see what happens.


LHC Back With Gusto

October 25, 2010

Following a four-day technical stop to fix a restriction at the injection point, the Large Hadron Collider is back up and running with even more power. A few days before the stop they passed the target luminosity of 100/μb/s, but this morning they reached 206/μb/s. This is much better than expected and they are not finished yet.

Six weeks ago they had collected 3.6/pb and were at the end of a three-week commissioning phase for bunch trains. I predicted then that by the end of the pp run they would be collecting that much data in a single run. Yesterday they collected 4.2/pb in a single run of 10.5 hours with 312 bunches. In another run with 368 bunches still ongoing this morning, they have collected 3.8/pb in just 6.5 hours.

With 12 days left they could easily collect another 50/pb or more, but they have a long list of measurements and tests that need to be done. In fact their aim now is to switch to higher density bunch trains and push towards even higher luminosities. Going for higher intensities now is a good idea because new problems keep cropping up as they step up (e.g. the injection restrictions, UFOs and electron cloud background were only found at high intensity.) If they know about these things now then they have time to fix them properly while the machine is shut down during December to February.

Next year they aim to collect 1/fb of data per experiment. They can reasonably expect to have 200 days of running with pp collisions, so they need to be able to deliver 5/pb per day. The luminosities they have now reached mean that they can achieve this quite comfortably. In fact they have scope to increase luminosity by a considerable factor with tighter bunch trains and smaller beta*. When you do the maths, even peak luminosities approaching 1/nb/s are starting to look possible for early next year. How far they choose to go is up for discussion at a technical meeting in Evian in December.
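The 5/pb figure is straightforward arithmetic, and the run quoted above already covers most of a day's quota if sustained. A quick sketch (the extrapolation from yesterday's run rate is my own, not an official projection):

```python
# Sanity check on the 2011 target arithmetic (rough numbers)
target_pb = 1000.0        # 1/fb = 1000/pb
running_days = 200
per_day = target_pb / running_days
print(per_day)            # 5.0 /pb per day, as stated

# Yesterday's run delivered 4.2/pb in 10.5 hours; if that pace
# were sustained over a full day it would comfortably exceed the quota
print(4.2 / 10.5 * 24)    # roughly 9.6/pb per day
```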

This means that the target of 1/fb could be reached much sooner than expected in 2011. They will then have several options: They could close down early to ready the LHC for nominal energies of 7 TeV ahead of schedule, they could carry on and collect more data during the year, or they could aim for slightly higher proton energies straight away. Operation at 4.5 TeV per beam next year is a possibility already being considered.

Update: The latest fill ended with 6/pb collected by ATLAS.

On this version of the plan from a meeting this afternoon, they intend to run with the current 368 bunch scheme for three more days. This should take the collected data up to the 50/pb that the experiments were hoping for. If all goes well they can then spend the last week trying out 50ns separations which will allow them to push the bunch numbers over 400 for even higher luminosity.


LHC end of run troubles

October 21, 2010

As the Large Hadron Collider approaches its end of proton-proton physics for 2010, the CERN bulletin reports on some of the problems that have hit the final runs. One particularly frustrating issue concerns beam losses at the time of injection. X-rays of the injection point have revealed a “non-conformity” in the mounting of the interconnection.

Despite this they managed to reach peak luminosities of 148/μb/s last week, but not without difficulty. To run efficiently at this luminosity and higher they need to remove the obstruction so the collider has been closed down for four days while they try to fix it. This is a blow for their hopes to collect 50/pb this year. So far they have about 20/pb, but there are now just 13 days of running left until they start setting up for Heavy Ion physics on 5th November.

How much more data the experiments can get in time to show at the winter conferences will depend on how fast they can get back up to speed after this technical stop.


update 24-Oct-2010: The technical stop is now over and beams are once again colliding with 312 bunches giving peak luminosities of 152/μb/s. The plan now is to move to 360 bunches as soon as possible. After that they may just continue running at that level for the remaining 12 days to see how much physics data they can collect, or they may try injecting with 50ns bunch spacing and push for even higher bunch numbers.

update: ATLAS collected 4.3/pb on this run! Next run increases to 368 bunches.


INSPIRE

October 15, 2010

Berenstein and Motl have posted about the new HEP search engine INSPIRE which replaces the old and trusted SPIRES. INSPIRE can do everything SPIRES could, but better! Or at least faster. But can it do anything new? Yes it can. Try entering the search term “vixra” and you will get a list of all the HEP papers on viXra. These were also indexed by SPIRES, but as far as I know there was no way of searching for them all in one list.


Horizon: Before the Big Bang

October 15, 2010

This week the BBC showed a program in their long running “Horizon” series about the question “What came before the Big Bang?”  Here is the gist of the message: A few years back cosmologists accepted that time did not exist before the big bang, so the question did not make sense. The universe along with time itself just started to exist and has been evolving nicely ever since. But now cosmologists are forming all kinds of theories that do put something before the big bang to explain how and why it happened.

So here is a list of the scientists that featured and the theory they adhere to:

  • Andrei Linde: Multiverse inspired eternal inflation
  • Param Singh: Big Bounce due to repulsive gravity at small distances
  • Lee Smolin: Black Holes spawning baby universes
  • Michio Kaku: Vacuum fluctuation from empty space
  • Neil Turok: Colliding Branes
  • Roger Penrose: The future is empty expanding space = a new big bang
  • Laura Mersini Houghton: String cosmology

Each of these ideas has been around for some time and has been worked on by several people. The individuals mentioned here are not necessarily the ones who invented them. The Penrose theory is an exception in that it is a new idea that features in his next book.

In the program each of these scientists was interviewed while they tried to solve one of those wooden puzzles.

The obvious conclusion to draw is that there are a lot of viable theories out there which cannot all be right. Each of the scientists seemed to have quite a strong belief in the theory they supported, but they would all acknowledge that more experimental input is needed to resolve the question. All of them are driven by a philosophical argument that temporal causality must hold absolutely, so some prior cause of the big bang is needed.

Along with all the theorising and philosophising, a couple of experiments were mentioned which they think might help test these different hypotheses. The first was LOFAR, a low-frequency radio telescope array that may detect background remnants from the big bang. The standard prediction is that it will be white noise, but anything else could be a clue that separates the different theories (prepare your predictions in advance, please). The second experiment was the more familiar LIGO and its space-bound successor LISA. These may be able to detect a gravitational wave remnant from the big bang that could also have a distinctive signature. It is hoped that either of these experiments may see past the wall of last scattering from which the cosmic microwave background emerged to provide information from an earlier time.

Personally, I don’t accept the philosophical need for something before the big bang and I don’t particularly like any of the theories mentioned. I think it is more likely that there was no space or time prior to the big bang singularity, which itself is a high-temperature, high-density phase with no fixed topology or geometry for spacetime. I am not alone in preferring theories that do not require time to extend before the big bang, but the program has selected those that do. Where was Hawking’s view for example?

I think that explaining the universe requires us to look at ontological causality rather than temporal causality and the big bang is just one feature of the universe, not the reason for its existence. Although the experiments mentioned and others may throw some light on the nature of the big bang, we first need a better understanding of quantum gravity. There is still scope for theoretical developments that may help even before the experiments bear fruit. Even if you favour the string theory/M-theory route to quantum gravity (as I do), a better understanding of their foundations is required before we can hope to answer these questions about cosmology.

Despite that, I don’t think it is wrong to explore a wide range of cosmological ideas of this kind provided they have some good mathematics behind them. It is time for science to start trying to answer such questions. They will have to be looked at from all angles, philosophical, mathematical and experimental, if we want to get the right understanding.

For the record, I thought this was a good Horizon program; some of their physics/cosmology episodes lately have been a bit empty and ill-conceived. The position was too one-sided, but well researched. I’m glad they did not make the mistake of mentioning the LHC as if it were likely to resolve these questions, but did mention some other experiments that stand a better chance.

If you missed the program or it has not yet aired in your country, I dare say you will find it on the web using Google video search. I won’t provide any links because I don’t know which if any are legal copies, or how long they will remain available, or whether the same links will work everywhere.


LHC Reaches 100/μb/s target for 2010

October 14, 2010

The Large Hadron Collider has reached its official 2010 target for peak luminosity of 100/μb/s or 10³²/cm²/s. Massive congratulations are due to all the teams at CERN who have worked incredibly hard this year to achieve this success!

This luminosity corresponds to 3.15/fb per year. The target next year is to collect 1/fb of data, but it is not possible to run at peak luminosity continuously so an extra margin is required. If they can run at peak for 40% of the allocated running time they really need peak luminosities of about 200/μb/s, so they still have a little ground to make up.
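The conversion is easy to check. A quick sketch (the duty-factor assumptions are mine, and real fills decay from peak luminosity, which is why the required figure ends up nearer 200/μb/s than the idealised number below):

```python
# Converting peak luminosity to an idealised yearly integral
# (rough sketch; the running-time assumptions are mine)
peak = 100.0                    # /μb/s
seconds_per_year = 3.15e7
per_year_fb = peak * seconds_per_year / 1e9   # 1/fb = 1e9/μb
print(per_year_fb)              # 3.15 /fb/year at continuous peak luminosity

# With 200 days allocated and a 40% duty factor:
usable_s = 200 * 86400 * 0.4
needed_peak = 1e9 / usable_s    # /μb/s needed to integrate 1/fb
print(needed_peak)              # ~145/μb/s before allowing for luminosity
                                # decay during a fill, hence ~200 in practice
```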

The 100/μb/s has been reached with 248 bunches by pushing to tighter emittances and higher intensities. The next physics run will be with 300 bunches.

There are still three more weeks of proton running this year, so with the official target passed they can relax and use the time remaining for whatever seems most useful. That will include running to collect physics data before the winter break, pushing the number of bunches a little higher, and running various tests and scans that help them understand the machine better.

One last thing they might possibly try is something extra to improve luminosity. There are options to squeeze the beams to a lower beta* of 2m, or reduce the bunch separation to 75 ns or even 50 ns, but these might take time and result in less data collected for this year. If they don’t do these things now there should be time next year.

Whatever they do, this year’s proton physics runs will be counted a great success.

Update (16 Oct 2010): In a short run last night with 312 bunches the peak luminosity was increased to about 135/μb/s. The filling scheme is 150ns_312b_295_16_295_3x8bpi19inj which means 312 bunches per beam with a 150ns minimum separation between bunches, 295 collisions per turn in ATLAS, CMS and LHCb, 16 collisions per turn in ALICE and 19 injections of up to 3 trains of 8 bunches at each go.
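The filling-scheme names are fairly systematic, so you can pull the numbers out mechanically. Here is a toy parser; the field layout (spacing, bunches, ATLAS/CMS, ALICE, LHCb, injection pattern) is just my reading of the name as decoded above, not an official format specification:

```python
# Toy parser for LHC filling-scheme names like
# "150ns_312b_295_16_295_3x8bpi19inj" (field layout is my own
# interpretation of the naming convention, not an official spec)
def parse_scheme(name: str) -> dict:
    fields = name.split("_")
    return {
        "spacing_ns": int(fields[0].rstrip("ns")),        # bunch separation
        "bunches": int(fields[1].rstrip("b")),            # bunches per beam
        "collisions_atlas_cms": int(fields[2]),           # per turn
        "collisions_alice": int(fields[3]),               # per turn
        "collisions_lhcb": int(fields[4]),                # per turn
        "injection": fields[5],                           # e.g. 3x8bpi19inj
    }

print(parse_scheme("150ns_312b_295_16_295_3x8bpi19inj"))
```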

Tentative plans are to move to 360 bunches after three good runs at 312. Then they hope to try out a 50ns bunch spacing and 32 bunch injections so that they can go to 400 bunches or higher. They are also still hoping to collect 50/pb before stopping pp physics. Current total delivered is just over 20/pb. Whether they can get there (or even go beyond) depends on how many running problems come up in the two weeks they have left, but with the goals for 2010 already met, anything extra is icing on the cake.


Science is Vital

October 10, 2010

Yesterday there was a rally in London by scientists who want to convince the government not to cut back on science. I would love to have gone to show my support and bring back some pictures but I had family duties to perform.

I’m afraid the government here is set to make savage cuts to the UK science budget. They don’t understand at all that our economy depends on science, perhaps more than any other country. Britain has always built its wealth on manufacturing industries based on technological innovation, but in time we lose the old industries and can only continue to prosper if we keep creating new ones. That requires scientific research and people educated in science. The only other sector that has kept our economy going for the last few decades is finance, but now we know it is dangerous to rely on the banks.

When Sarkozy spoke at ICHEP we knew that he was serious about supporting science. Other world leaders of countries that rely on science are doing the same. They know that cutting science is a false economy at times like this. In Britain they are not just cutting back the budget. They are also making it more difficult for people from outside the EU to come here to do science. They have made an exception to the rules to allow footballers to come here from all around the world, but scientists like Andre Geim and Konstantin Novoselov, who won the Nobel prize for work done in the UK, will no longer be welcome here. Young scientists are very mobile. They will go elsewhere. Science cannot just be stopped and started again for short-term savings. Once people leave it is hard to get them back.


LHC Ready for 248 Bunches

October 7, 2010

The Large Hadron Collider is now ready for its next step up to 248 bunches, which should raise the peak luminosity to 85/μb/s. That will be one step short of this year’s target of 100/μb/s, which will require 296 bunches.

To progress at each step when they add another 48 bunches, they need three fills and 20 hours of stable beams. At 152 bunches they had a rough time and ended up with only 16 hours, but they decided that was OK because that’s 20 in octal (I’m not making it up). The latest run at 200 bunches has gone very well with one run of 14 hours delivering 2/pb. Two shorter runs take the total time to over 20 hours so they are preparing for the next step.
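For what it's worth, the octal excuse does check out:

```python
# 16 hours of stable beams, written in base 8, really is "20"
assert int("20", 8) == 16
print(oct(16))  # prints '0o20'
```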

To inject 248 bunches they need to switch to injecting 24 bunches in one go. If all goes well they should be ready to set the new record tonight. There is no sleep for the LHC.

The plot above shows the delivered luminosity for each experiment, which is now between 12/pb and 13/pb for the big three. However, the experiments do not collect everything that is delivered. The plot below shows that ATLAS has collected 11.5/pb. I don’t have the equivalent plots for the others, but it is likely that CMS and LHCb are also past the 10/pb mark at least.

Update (8 Oct 2010): 248 bunches now running with peak luminosity in ATLAS of around 88/μb/s

In case you want to see what 248 bunches per beam looks like here is a picture taken from the LHC filling schemes page. The current fill pattern is 150ns_248b_233_16_233_3x8bpi15inj.txt 

update (11 Oct 2010): There was a three day interruption to LHC operations due to a problem with cryogenics and other niggles. They are now back up and have increased luminosity to 94/μb/s for the second run with 248 bunches. Remember, the target for this year is 100/μb/s, so nearly there!

update (12 Oct 2010): In a surprise move they have switched to a new filling scheme with 256 bunches giving 3% more collisions for ATLAS and CMS. If they get a good fill they might just reach 100/μb/s which means party time! If they don’t get there with this fill but they keep it going for a few hours they will be able to add 48 more bunches and that will certainly do it. Of course that might be someone else’s shift :)


Nobel Prize in Chemistry to Richard Heck, Ei-ichi Negishi and Akira Suzuki for Palladium-Catalyzed Cross Coupling

October 6, 2010

The 2010 Nobel prize in chemistry has been awarded to Richard Heck, Ei-ichi Negishi and Akira Suzuki for work that has greatly improved the ability to synthesise organic chemicals. This is important in many fields of manufacturing, especially pharmaceuticals and electronics where rare compounds found in nature are often in higher demand than nature can provide. Of course the methods can also be used to produce organic (carbon-based) molecules that nature has not discovered yet, but which may still be useful to us. The work was done in the US and Japan.


Nobel Prize in Physics to Andre Geim and Konstantin Novoselov for Graphene

October 5, 2010

The 2010 Nobel Prize in physics has been awarded to Andre Geim and Konstantin Novoselov for the discovery of graphene. Both laureates work at Manchester University making it the second Nobel Prize this year to be awarded to work in the UK. The winners themselves are from Russia.

Graphene is a material one atom thick made of carbon atoms. Graphite, which is often used in pencils or as a lubricant, is actually just layers of graphene. Separating the graphene and studying its properties started with the idea of using sticky tape to peel the layers off. It sounds like a simple idea but you can be sure that it was not an easy process otherwise other people would have done it first.

It turns out that graphene has extraordinary properties for conducting electricity and heat and is very strong for its feeble thickness. Recently the world record for rotational speed of objects was taken by a flake of graphene that was spun using light to a million revolutions per second. Any other material would have broken apart but graphene has the potential to go even faster before it breaks.

You may be wondering why a discovery of a new type of molecular substance wins the physics prize instead of the chemistry prize. Me too. Perhaps it makes up for the fact that some recent chemistry prizes would have been better suited to the medicine Nobel.  

