LHC to go for higher intensity next

April 28, 2010

Following the record luminosities a few days ago, the Large Hadron Collider was closed down for three days this week to make some technical corrections. This evening they are ready to restart and new plans for the next two weeks have been drawn up.

The main goal now is to increase the number of protons per bunch. The physics runs so far have used up to about 12 billion protons per bunch. A few weeks ago they did some brief tests with 100 billion protons per bunch but the beams were dumped very quickly. Now they will try to maintain high intensities long enough to declare stable beams and do physics.

Increasing the number of protons per bunch is one of three ways to raise the overall luminosity of the beams. They can also increase the number of bunches, and they can squeeze the beams as they did last week. But there is a good reason why increasing the number of protons per bunch is especially exciting. If they multiply the number by ten in one beam, that means ten times the luminosity and ten times the number of collisions; but of course they can increase the number by a factor of ten in both beams, so the luminosity then goes up by a factor of 100 (assuming the bunches stay the same size). By comparison, if they increase the number of bunches by a factor of ten in both beams they still only increase the luminosity by a factor of ten.
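To make the scaling explicit, here is the standard textbook expression for the luminosity of two head-on Gaussian beams (a general accelerator-physics formula, not a figure quoted by the LHC operators):

$$ L \;=\; \frac{f_{\mathrm{rev}}\, n_b\, N_1 N_2}{4\pi\, \sigma_x \sigma_y} $$

Here $f_{\mathrm{rev}}$ is the revolution frequency, $n_b$ is the number of colliding bunch pairs, $N_1$ and $N_2$ are the protons per bunch in the two beams, and $\sigma_x$, $\sigma_y$ are the transverse beam sizes at the collision point. The product $N_1 N_2$ is why raising the bunch intensity in both beams pays off quadratically, while $n_b$ only enters linearly.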

The current golden orbit for physics is a configuration of 3.5TeV/2m/12Billion/3-bunches. The plan shows them aiming for 3.5TeV/2m/40Billion/2-bunches. This will actually give a modest increase in luminosity of about 5. If they put the number of bunches back up to 3 it will be a factor of ten. That's not bad just a few weeks after the last factor of ten. Of course, sometimes they do actually keep to the plan.


LHC achieves stable squeezed beams

April 24, 2010

Over the last few weeks the LHC controllers have been working towards an improved luminosity target using squeezed beams. This morning they succeeded when they declared stable beams using the new configuration. Since the 30th of March, when the protons were collided at 3.5 TeV per beam for the first time, they have been running with a configuration of 3.5TeV/11m/10Billion/2-bunches (energy per beam / beta / protons per bunch / bunches per beam). The new configuration is 3.5TeV/2m/12Billion/3-bunches. This should increase the luminosity by a factor of about 10 (x5 from the squeeze and x2 from the bunches), but they may need to do some luminosity scans to reposition the beams before they actually increase the collision rate by that amount.
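As a rough cross-check, using only the simplified rule that luminosity scales inversely with beta and linearly with the number of colliding bunch pairs per experiment (my estimate, not an official figure):

$$ \frac{L_{\text{new}}}{L_{\text{old}}} \;\approx\; \frac{\beta_{\text{old}}}{\beta_{\text{new}}} \times \frac{(\text{colliding pairs})_{\text{new}}}{(\text{colliding pairs})_{\text{old}}} \;=\; \frac{11\ \text{m}}{2\ \text{m}} \times \frac{2}{1} \;\approx\; 11 , $$

which is roughly the factor of ten quoted above; the small rise from 10 to 12 billion protons per bunch adds a little more on top.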

The bucket configuration being used is (1, 8941, 17851) for beam 1 and (1, 8911, 17851) for beam 2. The total number of buckets is determined by the frequency of the RF fields used to accelerate the beam and the size of the collider ring. The result is that there are exactly 35640 buckets in each beam where the bunches of protons can be positioned. The bunches are injected from the SPS ring into the main LHC ring with careful timing so that they are placed in the buckets the controllers want.  
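A back-of-envelope check of where the 35640 figure comes from, using approximate published machine parameters (the values below are my assumptions, not numbers quoted in this post):

```python
# Number of RF buckets = RF cycles per revolution (rough sketch).
# Assumed approximate values: ring circumference ~26,659 m,
# RF frequency ~400.79 MHz, protons travelling at essentially the speed of light.
C_RING = 26659.0        # m, approximate LHC circumference
F_RF = 400.79e6         # Hz, approximate RF frequency
C_LIGHT = 299792458.0   # m/s

revolution_time = C_RING / C_LIGHT   # about 89 microseconds per turn
buckets = F_RF * revolution_time     # RF cycles per revolution
print(round(buckets))                # -> 35640 possible bucket positions
```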

The buckets chosen determine where the protons in the two beams will cross over and collide. Bunches in bucket 1 of beam 1 and bucket 1 of beam 2 circulate in opposite directions, so they will come together at two points diametrically opposite each other around the ring. These two points are called IP1 and IP5, and they are where the two biggest experiments live (ATLAS and CMS). Other bunches that share the same bucket number will also collide at these points. For example, with today's bucket numbers the bunches in bucket 17851 of either beam will also collide in ATLAS and CMS, but the bunches in buckets 8941 and 8911 will miss, so these experiments are now getting twice as many collisions per circuit as with the previous configuration.

The other bucket numbers are chosen to provide collisions at the other two intersection points, IP2 (ALICE) and IP8 (LHCb). The point IP2 is exactly one eighth of the way round the collider ring, so bunches will collide there when the difference of bucket numbers (b2 – b1) is exactly 35640/4 = 8910 (a quarter of the ring rather than an eighth, because both beams are moving, so a bucket offset of Δ shifts the crossing point by only Δ/2). With today's configuration the bunch in bucket 8911 of beam 2 collides with the bunch in bucket 1 of beam 1, and 17851 of beam 2 collides with 8941 of beam 1. So ALICE is also getting twice as many colliding bunches as before.

Finally, LHCb at IP8 is at a point approximately one eighth of the way round the ring in the other direction, but because of the nature of the detector its collision point is 11.5 meters away from the exact point. This means that the difference in bucket numbers must be -8940 rather than the more convenient -8910. With the new numbers we have bucket 8941 of beam 1 colliding with bucket 1 of beam 2 and bucket 17851 of beam 1 colliding with 8911 of beam 2. So LHCb also sees two collisions for every circuit of the ring and the controllers have been fair to each of the experiments. 

b2 - b1 (beam 2 bucket down the side, beam 1 bucket along the top):

                 b1 = 1           b1 = 8941        b1 = 17851
b2 = 1           0 (IP1+IP5)      -8940 (IP8)      -17850
b2 = 8911        8910 (IP2)       -30              -8940 (IP8)
b2 = 17851       17850            8910 (IP2)       0 (IP1+IP5)
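The bookkeeping is simple enough to script. Here is a minimal sketch (my own, not an official tool) that reproduces the table above, assuming the simplified bucket rules described in the last few paragraphs: a difference of 0 gives collisions at IP1 and IP5, +8910 gives IP2, and -8940 gives IP8, all taken modulo 35640.

```python
# Which experiments see collisions for a given pair of bucket numbers?
# Simplified rules taken from the discussion above.
BUCKETS = 35640

def collision_points(b1, b2):
    """Experiments where a beam-1 bunch in bucket b1 meets a beam-2 bunch in bucket b2."""
    d = (b2 - b1) % BUCKETS
    if d == 0:
        return ["IP1 (ATLAS)", "IP5 (CMS)"]
    if d == 8910:
        return ["IP2 (ALICE)"]
    if d == BUCKETS - 8940:
        return ["IP8 (LHCb)"]
    return []

beam1 = [1, 8941, 17851]
beam2 = [1, 8911, 17851]

for b1 in beam1:
    for b2 in beam2:
        ips = collision_points(b1, b2)
        if ips:
            print(f"beam 1 bucket {b1:5d} meets beam 2 bucket {b2:5d} at {', '.join(ips)}")
```

Running it lists two collisions per circuit for each of ATLAS, CMS, ALICE and LHCb, matching both the table and the description above.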

As the number of bunches is increased the controllers will have to work harder to find the best bucket numbers. For CMS and ATLAS they want all bunches in equal bucket numbers to maximise the number of collisions. To please ALICE they should place them at intervals a quarter of the way round. Four bunches in buckets (1, 8911, 17821, 26731) for both beams would be ideal for CMS, ALICE and ATLAS, which would each see four collisions per circuit, but it would fail for LHCb. They will have to offset the bucket numbers to get the best results.

As the number of bunches gets larger the problem eases. If they could have 1188 bunches placed at 30 bucket intervals then all four experiments would be seeing 1188 collisions per circuit. In practice this is not possible because some gaps must be left to allow safe dumping of the beams at the end of each run. The bunches must also be at least 10 bucket numbers apart. There are other constraints depending on how bunches can be injected and various other considerations. In fact the highest number of bunches planned is 2808 per beam.  
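The same simplified bucket rules can be used to check the hypothetical 1188-bunch scheme mentioned above (again just a sketch, ignoring the abort gap and injection constraints):

```python
# Count collisions per circuit at each interaction point for 1188 bunches
# spaced 30 buckets apart, with identical filling in both beams.
from collections import Counter

BUCKETS = 35640
TARGET_DIFFS = {0: "IP1+IP5 (ATLAS & CMS)",
                8910: "IP2 (ALICE)",
                BUCKETS - 8940: "IP8 (LHCb)"}

scheme = [1 + 30 * k for k in range(1188)]   # bucket numbers used in both beams
occupied = set(scheme)

counts = Counter()
for b1 in scheme:                              # beam 1 bunches
    for diff, name in TARGET_DIFFS.items():
        b2 = ((b1 + diff - 1) % BUCKETS) + 1   # partner bucket in beam 2, wrapped to 1..35640
        if b2 in occupied:
            counts[name] += 1

print(counts)   # each entry comes out as 1188 collisions per circuit
```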

Before they get there we will see them go through other carefully worked out bucket schemes with possibly 16, 43, 96, 156 or 936 bunches per beam. Juggling the precise bucket numbers to please all the experiments is going to be a delicate business.

Update: The stable beams were held for 30 hours before being deliberately dumped. This is a new record for longevity.


When will the LHC increase its energy?

April 21, 2010

Now that the LHC is running routinely, everyone is wondering when it will produce some new physics. Over at “Not Even Wrong” Peter Woit did a nice comparison of the LHC with the Tevatron based on luminosity. Of course luminosity is not the only thing that counts here. The LHC runs at over three times the energy and this increases the cross-section for most interactions of interest such as Higgs production. In fact it is possible that the LHC could find new physics at higher energies that could not be seen at the Tevatron no matter how long it runs for. It would also be interesting to hear a comparison of the detectors at the LHC and the Tevatron. If any detector experts are listening please tell us about it.

So the important question is not just “when will the LHC increase its luminosity?”, it is “when will it increase its energy further?” The design energy of the LHC is twice the energy it is currently running at. At present they are keeping the energy lower because of the problems they had in 2008, when an overheated joint caused an explosion of helium that delayed the start of physics at the LHC for over a year.

The current plan is to run the LHC at 3.5 TeV per beam while concentrating on increasing the luminosity. It will run this way through 2011 until it is shut down for a long repair lasting up to one year. Then they will be ready to increase energy to 7 TeV per beam in 2013.

But if you have been watching the LHC progress you will know that their plans are written on soluble paper and have been known to dissolve very quickly. So what is really likely to happen? At the press conference for the start of physics on 30th March, Steve Myers, CERN’s Director for Accelerators and Technology, was asked about the prospects of increasing the energy. He explained that the problem in 2008 was caused by a joint with a resistance of 200 nΩ. They have now tested all the joints and the highest resistance left in any of them is about 1 nΩ. Most are about 0.3 nΩ, which is about what they should be. This was not the only problem, but Myers says that he is sure the magnets can take the currents required to go to 6.5 TeV.

If this is the case, it is hard to believe that they will shut down the collider for a year without at least trying to go to higher energies. Even if there are problems the quench protection system should ensure that no serious damage is done. There have already been rumours circulating that they will try for 5 TeV per beam later this year. Realistically it depends on how easily they can increase the luminosity and how stable the systems prove to be. It may also depend on the physics. If there are indications that new physics lies at the higher energies they will be inclined to increase it, but if they just need more time at lower energies to turn an uncertain observation into a solid discovery, they will keep running at the current energy.

This video shows Steve Myers at the press conference, so have a watch and see what you think he would like to do.

Update: From the current plan it can be seen that they want to get stable squeezed beams for the first time this weekend. That will be at 3.5 TeV but with a beta of 2m and an intensity of 35 billion protons per bunch, which is a bit higher than current runs. Later they will also try for stable beams with 100 billion protons per bunch, but only at 0.9 TeV. If they are successful it will bring them closer to their next operational target, which seems to be 3.5TeV/2m/100Billion. They have not indicated how many bunches they will circulate at this intensity, but presumably they will try to step up the number of bunches at these levels. (The design limit is something like 7TeV/0.5m/200Billion/2808-bunches, and they are currently running stable beams with 3.5TeV/11m/10Billion/2-bunches.) The plan also shows a long technical stop during next week. Let’s hope they will be ready for the higher luminosities when that is completed.

Followers may also enjoy the video log at  http://www.collidingparticles.com/


“crackpots” who were right 8: James Lovelock

April 17, 2010

James Lovelock is the first scientist in this series who is still alive. This also means that some of his work remains controversial, but a great deal of his research that was originally attacked is now widely accepted.

As a child, Lovelock was fascinated by science and read many books about physics and chemistry at the library. His school life was not very happy and his teachers did not rate him very highly. Towards the end of his secondary schooling he took part in a written test on general knowledge and came top. His teachers were indignant.

Although he was interested in a broad range of science topics he went on to study chemistry at university because he had a form of dyslexia that made it difficult for him to succeed in more mathematical subjects such as physics. He went on to work for the Medical Research Council and gained a doctorate in medicine in 1948. He invented a number of detection devices including the electron capture detector which made it possible to detect very small amounts of certain chemicals in the atmosphere. Although he has at times worked for various institutions and universities he has done most of his research as an independent scientist funded by revenues from his inventions.

Working from his home laboratory, Lovelock decided to investigate the effects of human pollution on atmospheric conditions such as haze. He used his electron capture detector to measure concentrations of CFC compounds in the atmosphere and correlated the results with visibility conditions, finding a strong relationship. Because CFCs have no natural origin, this demonstrated a clear link between pollution and its effects on the weather. The work drew attention to the buildup of CFCs in the atmosphere, which nobody else had measured before Lovelock. It was then realised by others that CFC gases were harming the ozone layer that protects us from ultraviolet radiation, and a worldwide ban on the substances was put in place to head off a disaster. Frank Rowland and Mario Molina, who published this discovery in 1974, were awarded the Nobel Prize for it in 1995, shared with Paul Crutzen. Once again we see how the most independent thinkers seem to make discoveries that lead to Nobel Prizes for others who work in a more institutionalised environment.

 Lovelock established a good reputation through his work and was called on by NASA when they wanted to develop tests that would detect life on Mars. Lovelock worked with other scientists on the project but became critical of the approaches others were taking. The director told him he must produce a good test himself or leave the team. He came back with the suggestion that they should measure the composition of the Martian atmosphere because if there was life on Mars it would result in a mixture of compounds that would be hard to explain through inorganic processes. This idea had the benefit that it could be carried out without sending probes to Mars and measurements were soon taken showing that the atmosphere was almost entirely carbon dioxide. It was concluded that there is probably very little or no life on Mars at this time.

It was inspiration from this work that led Lovelock to the hypothesis for which he is now well-known. He suggested that the atmosphere and climate on Earth is not just affected by life, it is actually controlled by it. The temperature of our atmosphere can be controlled by phytoplankton that live in the upper sunlit layers of the ocean. In response to the sunlight they produce chemicals that rise into the atmosphere and increase the cloud levels. This in turn cools the planet. Carbon dioxide levels can be controlled by algae that bloom when there are high concentrations of the gas. This removes the carbon dioxide and deposits it on the seafloor. Even oxygen levels are controlled by vegetation, which will burn more frequently when concentrations get too high.

At first Lovelock’s hypothesis did not get much attention, so in 1979 he gave it a catchy name and wrote a popular book about it: “Gaia: A New Look at Life on Earth”. In the book he described the Earth as acting like a superorganism that self-regulates its systems. The reaction was probably not quite what he had anticipated. The cause was taken up by New Age thinkers in ways he did not particularly like. Evolutionary scientists such as Dawkins, Gould and Doolittle attacked the idea, saying that it was not consistent with evolution. Lovelock was not anti-evolution and set about more research aimed at showing how the Gaia hypothesis could arise naturally. Eventually he started to receive more support for his work.

Thirty years later, scientists now accept that there are strong links between biological systems and the way our atmosphere is regulated by nature, much as Lovelock proposed. The way such systems developed is still open to question. Lovelock went on to suggest that human activity is now upsetting the balance that nature established, and this has laid the foundations of the environmentalist movement.

The reaction to Lovelock’s research shows how the scientific establishment still reacts negatively to new ideas that go against its accepted views. As he said himself: “Nearly all scientists nowadays are slaves. They are not free men or women. They have to work in institutes or universities or government places or industry. Very few of them are free to think outside the box. So when you come along with a theory like Gaia, it’s so far beyond their experience that they are not able to react to it.” For a long time Lovelock and his supporters in science found it hard to get their results published in scientific journals because of the opposition from other scientists. He has called this “wicked censorship”.

Now in his nineties, Lovelock is no longer considered a crank. He is appreciated as the founder of a new area of science investigating the relationship between biological systems and the atmosphere. Without his insight we would have been much slower to understand the negative effects we have been having on our climate through pollution.

 


“crackpots” who were right 7: Fritz Zwicky

April 11, 2010

Fritz Zwicky was a Swiss astronomer who worked most of his life at Caltech in the US. He had a good reputation as an astronomical observer but his real passion was for astronomical theory based on applications of physics. He was in fact one of the first true astrophysicists from the 1920s onwards. But during most of his lifetime he was very underappreciated for his theories of cosmology and stellar physics. In fact that is really putting it mildly. Many of his colleagues were very hostile towards him and his theories. Of the scientists described in this series he is arguably the one who was most regarded as a “crackpot”. That is until many of his ideas were proved right many years later.  

Zwicky had a remarkable ability to consider a problem from a fresh perspective and disregard any misguided preconceptions of the time. Because of this he was capable of coming up with what seemed like wild theories to others. With hindsight it seems like about half of these ideas turned out to be right while the others really were just too wild, or perhaps some of them are still ahead of their time.

In 1934 Zwicky, in collaboration with Walter Baade, published the theory that when supernovae explode they leave behind them a star with the density of nuclear matter, made of neutrons. They predicted that these neutron stars were responsible for cosmic rays and proposed supernovae as standard candles for measuring distances to other galaxies.

Today these ideas are so much a standard part of our astrophysics that it is hard to appreciate just how revolutionary they were at the time. Neutrons had been discovered just two years earlier, while cosmic rays had only been observed since 1912. Even the term “supernova” had only been coined a few years earlier by Zwicky himself. To other scientists of the time, putting these new ideas together in such a way must have seemed far too much to swallow.

In fact the theory was based on sound reasoning and built on the theory of white dwarfs as a degenerate Fermi gas, which had been developed over the previous decade. At the same time as Zwicky and Baade proposed their theory of supernovae, another controversy was raging on the other side of the Atlantic between Chandrasekhar and Eddington. Chandrasekhar predicted that there was a limit to how heavy a white dwarf could be before it must collapse, ultimately into a black hole. Eddington could not accept that nature would include black holes and argued that relativity must be modified in such extreme circumstances to avoid the Chandrasekhar limit. Astronomers now understand that a star above this limit can collapse to a neutron star, the densest stable state of matter, and that beyond a further limit nothing can prevent collapse to a black hole.

The resistance towards these ideas persisted so long that when pulsars were observed more than three decades later, few people were prepared for the discovery. The pulsing radio signals observed by Jocelyn Bell in Cambridge were at first thought to be interference and then alien signals. Bell’s supervisor Antony Hewish could not accept the observation at first, because the strength, rapidity and regularity of the signal meant that it had to come from a small, dense source. It was not until the following year that Thomas Gold and Franco Pacini proposed that pulsars were rotating neutron stars. When Stephen Hawking heard of the discovery that neutron stars exist, he told Hewish that now they must accept that black holes too are out there in space waiting to be found.

There is a story that in the 1950s a member of the public, viewing the Crab Nebula through the University of Chicago’s telescope, pointed out to the astronomer Elliot Moore that it appeared to be flashing. Moore told her that it was just the star twinkling due to atmospheric scintillation. The woman protested that as a qualified pilot she understood scintillation and this was something else. We now know that it contains a neutron star that flashes about 30 times a second. At the time most astronomers could not have accepted such an explanation.

Neutron stars were not Zwicky’s only successful theory. He also believed that galaxies were held together in clusters by unseen dark matter accounting for the gravitational forces needed. He predicted on this basis that such clusters could act as gravitational lenses, producing effects that could be observed. He was of course right on all counts, but it is only in the last few decades that the theory of dark matter has finally become widely accepted over alternative explanations.

Not everything Zwicky thought of turned out to be right. His notable failures include his theory of tired light, which he invented because he did not accept the expanding universe theory. Even though such ideas are now discounted, at the time they were not so unreasonable, and such alternative theories are important in the development of cosmology and physics as counterfoils against which observations can be tested.

Nevertheless, in his time almost all of Zwicky’s theories were rejected by his colleagues. He garnered respect only for his careful astronomical observations, which included the discovery of over a hundred supernovae, more than any other individual has found. He lived just long enough to see neutron stars become accepted, but it took longer for other astronomers to admit he had been right and credit him with the greatness he deserved. He received very few honours for his scientific work, but he was awarded the Gold Medal of the Royal Astronomical Society.


LHC needs more luminosity

April 10, 2010

The LHC continues to make physics runs with beam energies of 3.5 TeV. Although this is three times as much energy as the Tevatron in the US, we are unlikely to see new physics until the LHC beats the Tevatron on luminosity too. The current luminosity of runs at the LHC is about 10²⁷ cm⁻²s⁻¹, but the Tevatron is reaching peak luminosities of 3 × 10³² cm⁻²s⁻¹. That’s an impressive 300,000 times better.

These accelerators are looking for the collision events where new particles are created, but these events are extremely rare. Furthermore, even if the particles are being created they are masked by background processes that mimic their signature in the detectors. The physicists can only know the particles are there when they see significantly more signal than the expected background noise. This means collecting many events, and the number of events they can collect is proportional to the luminosity and the length of time they can run the collider for.
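In other words the figure of merit is the integrated luminosity. For a process with cross-section $\sigma$ the expected number of events is (standard notation, not specific to either machine):

$$ N_{\text{events}} \;=\; \sigma \int L\, dt , $$

so a machine can make up for lower instantaneous luminosity only by running for proportionally longer.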

The higher energy does give the LHC some advantage though. For most processes of interest, the number of events seen is likely to be something like ten to a hundred times higher at 3.5 TeV than at 1 TeV. That still leaves the LHC a long way short of the Tevatron, for now.

Luckily the design luminosity of the LHC is much higher than what they are currently running at. In fact it should be able to reach luminosities of 10³⁴ cm⁻²s⁻¹. That’s 30 times what the Tevatron runs at. But to reach this luminosity the collider operators have a lot of work to do. They need to gain seven orders of magnitude more luminosity. To achieve this they have three main tactics, and already we have seen them testing out some of these over the last few days.

The first trick is to “squeeze” the beams by focusing them with magnets as they pass through the detectors. The amount of squeeze is measured by a parameter called beta, which starts out at 11 meters. Last week they were able to squeeze this down to 2 meters, but the target is to get it down to 0.5 meters. When they do this the beam becomes much narrower, so they have to control its position more carefully, otherwise the tight beams will miss each other completely and there will be no collisions at all. If they get it right they stand to gain a factor of up to about 20 using this method.
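In the simplified picture where the transverse beam size at the collision point scales as the square root of beta, the luminosity scales as $1/\beta$, so the available gain is roughly (my estimate from the numbers above):

$$ \frac{L_{\text{squeezed}}}{L_{\text{unsqueezed}}} \;\approx\; \frac{\beta_{\text{initial}}}{\beta_{\text{final}}} \;=\; \frac{11\ \text{m}}{0.5\ \text{m}} \;=\; 22 \;\approx\; 20 . $$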

The second trick is much simpler. They just put more protons in each bunch. For the physics runs they have been doing this month they have been putting about 10 billion protons in each bunch, but last night they ran some trial injections gradually increasing the number up to 100 billion. That’s another factor of 10 in luminosity for each beam ready to be tapped, or 100 overall.

The final step is to increase the number of bunches circulating round the collider ring. They are currently circulating 2 bunches in each direction, but each detector is seeing just the collisions from one bunch in each beam. Last year they were able to circulate 16 bunches in each beam, but the design limit is 2808 bunches, so there is another factor of nearly 3000 to be gained.

Overall that makes a potential increase of 20 × 100 × 3000 = 6,000,000 times the current luminosity. When they reach that point, the work of several weeks of running at the current luminosities will be done in a fraction of a second. Obviously they will want to get to these higher luminosities as soon as possible, but they have to be careful. At present luminosities the beams are relatively safe, and if they make a mistake that sends the beam into the collimators there is not too much damage. At the higher luminosities an error could send particles flying in all directions, causing serious damage to the detectors.
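Tallying up the three rough factors quoted above (the post’s round numbers, not official machine parameters) gives the same six-million figure:

```python
# Toy tally of the potential luminosity gain from the three tactics described above.
squeeze = 20      # beta squeezed from 11 m down to 0.5 m
intensity = 100   # ~10x more protons per bunch, in both beams
bunches = 3000    # from one colliding bunch pair per detector to ~2808

total = squeeze * intensity * bunches
print(f"potential gain: ~{total:,} times the current luminosity")   # ~6,000,000x
```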

So hopefully they will increase the luminosity carefully, but not too slowly. In the meantime the detectors are rediscovering known physics. One of the latest candidate events is this W to muon from last week. Soon they should also rediscover the Z, then as the luminosity goes up they should start to collect top quark events for the first time in Europe. The Higgs and other new physics will have to wait for better luminosity, unless of course there is something unexpectedly easy to see at 7 TeV.


“crackpots” who were right 6: Abel Niepce de Saint-Victor

April 8, 2010

Who discovered Radioactivity? Every physics student has heard the story of how Henri Becquerel made the chance discovery while studying the effects of light on chemicals using photographic plates in 1896. He normally exposed the chemicals to sunlight and then left them on a photographic plate in a dark drawer to see if they would expose the plate. One day he was trying the experiment with some uranium salts but the sky stayed cloudy, so he put the plates and chemicals in the drawer without the exposure to wait for better light. When he took them out he decided to develop the plates anyway and was surprised to find that they had been darkened despite the lack of light. He had discovered the effects of radioactivity.

Becquerel was awarded the Nobel Prize in 1903. His contribution is further recognised in the modern name for the unit of radioactivity.

What most people don’t know is that the same discovery had been made some four decades earlier by Abel Niepce de Saint-Victor. Like Becquerel, he was studying the effects of light on various chemicals and was using photographic plates to test the reaction. He also used uranium salts and found that they continued to blacken the plates long after any exposure to light had stopped. Fluorescence and phosphorescence had been known for many years, and Niepce knew that this new observation did not conform to either phenomenon. He reported his results to the Academy of Sciences in France several times.

A few scientists, including Foucault, commented on the findings but no-one had a good explanation. Surprisingly, no-one seems to have tried to replicate them, and most probably assumed there was some experimental error. In any case Niepce and his discovery were soon forgotten.

When Becquerel rediscovered the same result as Niepce the situation was very different. By then X-rays were known and physicists were ready to appreciate that another new type of ray could exist. One physicist, Gustave Le Bon, pointed to the prior work of Niepce de Saint-Victor, but he was ridiculed. Any further chance that Niepce might gain some recognition was extinguished when the Nobel Committee awarded the physics prize to Becquerel.

The story of Abel Niepce de Saint-Victor is typical of what happens to scientists who make a discovery whose importance is not recognised at the time. You would expect that when the effect is rediscovered later, people would appreciate the original discoverer, but that is not what happens. Usually the new discoverer gets most or all of the credit and the original scientist’s contribution is neglected because he failed to grab everybody’s attention, even if he managed to publish the result six times. In this case it is only in recent years that some small amount of appreciation for the work of Abel Niepce de Saint-Victor has finally emerged.


Open Grants for Research

April 8, 2010

If you need some money to help you carry out your research on fundamental science questions you may be able to apply for a grant. The FQXi is currently inviting proposals for grants on the “Nature of Time”. This was also the subject of their first essay contest in 2008, and it is curious that they have gone for such a narrow area for their large grants program. Initial proposals are required by June 14th 2010.

They have $2 million to give out but it is not clear where the money comes from. Previous FQXi grants were funded  by the John Templeton Foundation. If the “Nature of Time” does not fit your research program you could apply directly to the JTF for a grant under one of their active funding priorities. These include “Quantum Physics and the nature of Reality”, “Foundational Questions in the Mathematical Sciences”, “Culture, Biology and Human Uniqueness”, etc. For these you have to hurry because they want proposals by April 15th.

The catch with any of these grants is that you need to be backed by an institution to be taken seriously, but it could be worth applying even if you are not affiliated to anything. In the past a few independent researchers such as Garrett Lisi have been successful.


Long Physics Run at the LHC

April 5, 2010

 

Physicists at the Large Hadron Collider have just completed their longest physics run yet, with about 20 hours of collecting events. Fill number 1022, which started yesterday evening and finished at lunchtime today, will have more than doubled the total amount of data taken so far at CERN’s new collider. The run finally ended when a glitch in the cryogenics caused the beams to be automatically dumped. The stability of the beams was so good that, if such glitches can be avoided, they could circulate for many days before gradual loss of protons degrades the beam intensity. This is a great boost for the prospects of the collider.

As with earlier fills this one used two bunches of protons circulating in each direction of the collider ring. The bunches were injected with timings carefully worked out so that the protons would collide at intersection points around the circuit where the different particle detectors are positioned. With just one bunch in each beam the protons will collide at two points diametrically opposed on the 27 kilometer circle. The two main experiments CMS and ATLAS are positioned at such points, but two other experiments LHCb and ALICE are at different locations about one eighth of the way round the ring in either direction starting from ATLAS. With two bunches per beam the protons can be made to collide at these points as well and that has been the configuration used since first collisions last week.  For this run all the LHC experiments were switched on and should have been able to collect data.

We will have to wait some time before we know what the results will be. The physicists have to collect vast amounts of collision data and analyse it before they can publish their results. Exactly how long depends on what physics is waiting to be discovered and how efficiently the accelerator engineers can increase the luminosity of the beams to generate more collision events in the detectors.

Already the proton beams are circulating with an energy of 3.5 TeV per proton and 20 billion protons in each beam. The experiments were collecting data at the rate of about 100 events per second on this latest run which means they already have some 10 million events to analyse from each detector. But the most interesting events are the very few where new particles never seen before are created. To separate those from the background the teams will have to collect many times more events. To achieve this we expect to see more bunches of protons injected into the collider before they are ramped up in energy. Last year during early test runs at lower energy they already had 16 bunches circulating in each beam but the ultimate target is to increase this figure to 2808. Further increases in collision rates can be achieved by adjusting and “squeezing” the beams using magnetic fields to make them collide in a smaller space. If all this can be accomplished the number of events seen during the latest long run will be seen in just a few seconds. Obviously they will want to get there as soon as possible and the stability of the latest fills should give them confidence to move forward quickly.
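A quick sanity check of those event numbers (rough figures from the post, not detector bookkeeping):

```python
# ~100 events per second over a ~20 hour fill.
rate_hz = 100
hours = 20
events = rate_hz * hours * 3600
print(f"{events:,} events per detector")   # 7,200,000 -- of order the 10 million quoted
```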


LHC Watch

April 4, 2010

Now that the Large Hadron Collider is doing physics runs you may wish to try to follow how the experiment is going. It may be some time before results are published and any new physics will be kept secret within the experiment collaborations until it has been very carefully checked and signed off. In the meantime we outsiders can still watch how they are progressing with the job of colliding the beams and increasing the intensity to get better results.

There are a few good links you can use to see how it is coming along. The best known is the status screen called Page 1. The exact contents change depending on what they are doing, but there is always a status message and an indication of beam intensity and energy. When all is well and the beams are stable, all the lights go green. That is how it looks now as I write this. They also feed the status messages to Twitter so you can look back on what has been happening.

A related page is the operation screen that shows a longer historical graph of the energy and the status of each of the experiments. As I look at it now I can see that all the experiments are running except ALICE. The graph shows that the two beams were injected about three hours ago in two bunches for each beam. The energy was ramped up starting an hour later.

After the webcast last week they left us another status page with a graphic display showing how the beams circulate in the collider ring and how the experiments are placed. If you click on one of the experiments it takes you to a real-time plot of the rate at which events are currently being seen.

Of course it is brilliant to be able to see live displays of events from the experiments themselves. One experiment that is making this possible from outside is LHCb. You may need to fiddle with your browser to get this page working due to the use of an unrecognised certificate, but it does work. Check the timestamp at the bottom left to confirm whether or not it is currently showing live events.

There is now also a live event display for ATLAS. If you spot a Higgs boson here, publish your result immediately along with your calculation for its mass, then claim your Nobel Prize before they do :) If you want more details you can download the Atlantis/Minerva application.

There are a lot more links and discussion forums to be found at the LHCPortal if you want to delve in any deeper, and of course anything new that comes along after I write this is likely to be found there.

Finally if you want the latest news about what has been going on you should check the official press releases and unofficial blogs listed on the right of this page.

