“first direct evidence of cosmic inflation” BICEP2 results

March 17, 2014

At a presentation from the Harvard-Smithsonian Center for Astrophysics the BICEP2 team have announced that they have the “first direct evidence of cosmic inflation”. As rumoured, they have detected what they believe to be primordial gravitational waves, with a ratio of tensor to scalar modes of r = 0.2 (+0.07 −0.05), which is 5 sigma over the null hypothesis. This is a game-changing result for inflationary cosmology, and possibly for quantum gravity research, because it indicates that the scale of inflation is only about a factor of 100 below the Planck scale. These results, and the follow-ups that will no doubt be carried out, could be the experimental test-bed for the leading edge of theoretical physics, including string theory.
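To see roughly where that “factor of 100” comes from, here is a back-of-envelope check of my own, using the standard slow-roll relation between r and the energy scale of inflation and taking the reduced Planck mass as the reference scale (which seems to be the comparison being made):

```python
# Rough estimate of the energy scale of inflation implied by r ~ 0.2,
# using the standard slow-roll relation V^(1/4) ≈ 1.06e16 GeV * (r/0.01)^(1/4).
r = 0.2                                    # BICEP2 central value
V_quarter = 1.06e16 * (r / 0.01) ** 0.25   # GeV, energy scale of inflation
M_planck_reduced = 2.4e18                  # GeV, reduced Planck mass

print(f"V^(1/4) ≈ {V_quarter:.2e} GeV")
print(f"reduced Planck mass / V^(1/4) ≈ {M_planck_reduced / V_quarter:.0f}")
# Gives roughly 2.2e16 GeV, about a factor of 100 below the reduced Planck scale.
```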

The papers are now online but everything is down at the moment due to heavy load. I just managed to get this snapshot of the abstract and a few pictures. More when I have it.

[Image: snapshot of the BICEP2 paper abstract]

These images show the actual signal from a small patch of the sky (left) compared to a simulation based on predictions from inflation and cold dark matter (right).

[Image: BICEP2 B-mode signal maps compared with simulation]

The signal is stronger than many theories predicted, so it will have an immediate effect on the direction of theoretical research in quantum gravity and the first moments of the universe.

http://www.youtube.com/watch?v=jq-OvV-XHdc

Full paper is at http://bicepkeck.org/b2_respap_arxiv_v1.pdf

[Image: constraints on inflationary models from BICEP2, Planck and other sources]

This graph is the money plot showing where we now stand in observational inflationary cosmology. Blue is the new result using BICEP2 compared to previous results from Planck and other sources. Note that Planck should release more polarisation measurements soon.

https://www.youtube.com/watch?v=ZlfIVEy_YOA

Should have spotted this video yesterday, very moving.

The most interesting thing now is going to be seeing how theorists react to these results. They will have implications for inflationary cosmology (obviously), galaxy formation and quantum gravity. To get the ball rolling, theorist Liam McAllister has a guest post on Lubos Motl’s blog with the quote “The tensor fluctuations write quantum gravity on the sky”. Exciting stuff!


Who should get the Nobel Prize for cosmic inflation?

March 16, 2014

Tomorrow we might hear some good news about a discovery of primordial gravitational waves, and within a few more weeks that could be confirmed in more detail by Planck. If this happens the observational status of the theory of cosmic inflation will change dramatically, because primordial gravitational waves have been described as a smoking gun for the theory. Well, that may be an exaggeration, but the observed scale invariance of the CMB anisotropy spectrum is already a good pointer towards inflation, so could the combination be enough to sway the notoriously cautious Nobel committee towards awarding a prize for the theory?


Rumours say that Alan Guth and Andrei Linde have been invited to tomorrow’s meeting where the team of astronomers who work with the BICEP2 observatory in Antarctica will announce a “major discovery” about B-modes in the cosmic microwave background. E-mode polarisation in the cosmic radiation was produced at the time of last scattering, when the radiation decoupled from the atomic gas in the early universe. These E-modes could then have been distorted by the tensor modes of the primordial gravitational waves permeating space, twisting the polarisation field of the microwave background into the (hopefully) observed B-modes. So the B-modes are a signature of the gravitational waves that are themselves a remnant of the much earlier inflationary epoch of the universe.
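For anyone who wants a concrete feel for what the E/B decomposition means, here is a minimal flat-sky toy sketch, entirely my own illustration and nothing to do with the BICEP2 pipeline: Stokes Q and U maps are rotated in Fourier space so that gradient-like patterns land in E and curl-like patterns land in B.

```python
import numpy as np

def qu_to_eb(Q, U, pixel_size=1.0):
    """Toy flat-sky E/B decomposition of Stokes Q/U maps (periodic boundaries assumed)."""
    n = Q.shape[0]
    lx = np.fft.fftfreq(n, d=pixel_size) * 2 * np.pi
    ly = np.fft.fftfreq(n, d=pixel_size) * 2 * np.pi
    LX, LY = np.meshgrid(lx, ly, indexing="xy")
    phi = np.arctan2(LY, LX)                      # angle of the Fourier wavevector
    Qk, Uk = np.fft.fft2(Q), np.fft.fft2(U)
    Ek = Qk * np.cos(2 * phi) + Uk * np.sin(2 * phi)   # gradient-like part
    Bk = -Qk * np.sin(2 * phi) + Uk * np.cos(2 * phi)  # curl-like part
    return np.real(np.fft.ifft2(Ek)), np.real(np.fft.ifft2(Bk))

# Quick usage with random maps (a real analysis would apodise and handle boundaries):
rng = np.random.default_rng(0)
Q, U = rng.normal(size=(2, 128, 128))
E, B = qu_to_eb(Q, U)
```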

The presence of Guth and Linde at this meeting echoes the presence of Higgs and Englert at the announcement of the discovery of the Higgs Boson in 2012, and that is probably no coincidence. Just as Higgs and Englert were awarded the Nobel Prize last year for the theory behind the Higgs discovery, Guth and Linde will be prime candidates for any Nobel Prize awarded for the theory of inflation. However, there was much discussion about who else might have deserved the Higgs prize, and the similar decision for inflation could be equally awkward and controversial.

Guth and Linde have already been jointly awarded several honours for their work on inflation theory, including the Gruber Prize and Milner’s Fundamental Physics Prize, but the Swedish committee sets a higher bar for empirical verification. The general idea of the inflationary universe may pass with the new evidence, giving Guth his ticket, but Linde has worked on more specific models of inflation such as slow-roll and chaotic inflation. Brilliant and important though his work is, I am not convinced that he is destined for the Nobel yet. Argue with me if you disagree. Update: after the first 24 hours of analysis Linde’s model of chaotic inflation with a quadratic potential appears to be in particularly good shape, so perhaps I was being premature. In any case his widely seen status as one of the “principal architects of inflation theory” along with Guth is sure to win him many nominations.
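For what it is worth, the textbook slow-roll predictions for a quadratic potential are easy to check, and they show why r ≈ 0.2 puts that model back in the game. This is just a sketch of the standard formulas, not a serious fit:

```python
# Textbook slow-roll predictions for V = (1/2) m^2 phi^2 (chaotic inflation):
#   n_s = 1 - 2/N,  r = 8/N,  for N e-folds before the end of inflation.
for N in (50, 60):
    n_s = 1 - 2 / N
    r = 8 / N
    print(f"N = {N}:  n_s ≈ {n_s:.3f},  r ≈ {r:.3f}")
# Gives r ≈ 0.13-0.16, within roughly one sigma of the BICEP2 value of
# 0.2 (+0.07 -0.05) given its lower error bar.
```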


On the other hand Guth is not the only one with a claim to the original idea of inflation. It has been recorded that he first had the breakthrough idea on 6th December 1979, gave a seminar on the theory at SLAC on 23rd January 1980, and that his paper was received on 11th August 1980. At around the same time Katsuhiko Sato in Japan had written a paper proposing inflation by 21st February 1980, which was received for publication on 9th September 1980, and another similar paper by Demosthenes Kazanas had already been received by 5th May 1980. All three contributions seem to have been independent and similar. The only things that may have singled out the work of Guth were that his term “inflation” stuck and that he was part of a more influential circuit of physicists. Closer examination of the dates and the points they made in their papers may separate them, but I think it would be hard to be truly objective about what really counts.

[Image: Viatcheslav Mukhanov and Alexei Starobinsky]

But then all three were preempted by the Soviet physicist Alexei Starobinsky, who had already worked out the ideas behind inflation in 1979. Wikipedia describes his contribution like this:

“Although Alexei Starobinsky of the L.D. Landau Institute of Theoretical Physics in Moscow developed the first realistic inflation theory in 1979 he failed to articulate its relevance to modern cosmological problems. Due to political difficulties in the former Soviet Union, regarding the free exchange of scientific knowledge, most scientists outside the USSR remained ignorant about Starobinsky’s work until years later. Starobinsky’s model was relatively complicated, however, and said little about how the inflation process could start.”

I think this is an overly negative view of his contribution and I suspect that it owes more to a bias that tries to rationalize the fact that we do not recognize his work as well as we recognize Guth’s. It is notable that he had already predicted the primordial gravitational waves in 1979, before anyone else had even started thinking about inflation. How the Nobel committee will see it I can only guess. Starobinsky did also win the Gruber Prize, independently of the prize given earlier to Guth and Linde. He was recognised along with Viatcheslav Mukhanov who, in collaboration with Chibisov (deceased), first calculated the spectrum of anisotropies arising from quantum fluctuations during inflation, and who could therefore be yet another candidate for the Nobel. Once again the Nobel committee will be afflicted with the headache that strikes them when more than three people deserve recognition for the same discovery.

Update 26/03/2014: Since the first version of this post I have learnt about another thread of discoveries concerning inflationary cosmology that preceded even the work of Starobinsky. In 1965 the Soviet physicist Erast Gliner published the earliest known proposal for inflationary cosmology. Andrei Sakharov built on the idea, looking at its cosmological consequences including implications for the horizon problem. A number of papers were written that are now hard to access, but a history by Chris Smeenk available online gives a detailed account of their ideas. Another paper, by L. E. Gurevich in 1975, is also accessible and shows just how advanced this work had become by that time, five years before the burst of interest in cosmic inflation. Gurevich considered the horizon and flatness problems and the primordial inhomogeneities that could lead to galaxy formation, and he even speculated about a version of perpetual inflation with multiple universes, or “metagalaxies” as he called them.


Brain Power

June 28, 2013

Supercomputers

In 1984, when Big Brother was meant to be invading our privacy, I was a graduate student in Glasgow working on lattice gauge theories. As part of the research towards my doctorate I spent a week on a special mission to Germany, where I was allowed into a secret nuclear base to borrow some computer time on a Cray X-MP. It was the world’s fastest supercomputer of the time and there was only one in Europe, so I was very privileged to get some time on it even if it was only a few hours of CPU. Such resources would have been hugely expensive if we had had to pay for them. I remember how the Germans jokingly priced the unseen cost in BMWs. The power of that computer was 400 megaflops and it had a massive 512 megabyte RAM disk.

The problem I was working on was to look for chiral symmetry breaking in QCD at high temperatures and densities using lattice simulations. In the last few years this has been seen experimentally at the LHC and other heavy ion accelerators, but back then it was just theory. To do this I had to look at the linear discretised Dirac equation for quarks on a background of lattice gauge fields. This gives a big Hermitian N×N matrix, where N is the number of lattice sites times 3 for the QCD colours. On a small lattice of 16⁴ sites (working in 4D spacetime) this gave matrices of size 196,608 × 196,608, and I had to find their smallest eigenvalues. The density of this spectrum says whether or not chiral symmetry is broken. Those are pretty big matrices to calculate the eigenvalues of, but they are sparse, with only 12 complex non-zero components in each row or column. My collaborators and I had some good tricks for solving the problem. Our papers are still collecting a trickle of citations.
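For anyone curious what that kind of computation looks like with today’s off-the-shelf tools, here is a toy sketch: a random sparse symmetric matrix stands in for the real staggered Dirac operator (so the numbers mean nothing physically), and a Lanczos-type solver pulls out the eigenvalues nearest zero.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import eigsh

# Toy stand-in for a lattice Dirac operator: a large, very sparse symmetric matrix.
# (The real staggered Dirac matrix is complex Hermitian with 12 non-zero entries
#  per row; a random real symmetric matrix keeps this sketch simple.)
rng = np.random.default_rng(0)
n = 5000
A = sp.random(n, n, density=12 / n, format="csr", random_state=rng)
H = (A + A.T) * 0.5 + sp.diags(rng.normal(scale=0.1, size=n))  # symmetric, non-singular

# The eigenvalues nearest zero are the interesting ones: their density near zero
# is what signals chiral symmetry breaking (the Banks-Casher relation).
vals = eigsh(H, k=20, sigma=0.0, which="LM", return_eigenvectors=False)
print(np.sort(np.abs(vals))[:10])
```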

[Image: the Tianhe-2 supercomputer]

Thirty years later Big Brother has finally succeeded in monitoring what everyone is doing in the privacy of their own homes, and my desktop computer has perhaps 100 times the speed and 30 times the memory of the Cray X-MP, which makes me wonder what I should be doing with it. The title of fastest supercomputer has recently been taken by China’s Tianhe-2, which has been benchmarked at 33.86 petaflops and has a theoretical peak performance of 53.9 petaflops, so it is about 100,000,000 times faster than the Cray. This beats Moore’s law by a factor of about 5000, which may be in part due to governments being willing to spend much more money on them. The US, which more commonly holds the record, won’t be beaten for long because the NSA is said to have a secret and very expensive project to build a supercomputer to surpass the exaflop mark in the next few years. I doubt that any HEP grad students will have a chance to use it.
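As a sanity check on that factor, here is the arithmetic, with the main assumption being a two-year doubling time for Moore’s law:

```python
# Rough check of the "beats Moore's law by a factor of ~5000" claim.
cray_flops = 400e6          # Cray X-MP, ~400 megaflops (1984)
tianhe2_flops = 33.86e15    # Tianhe-2 Linpack benchmark (2013)
years = 2013 - 1984

actual_gain = tianhe2_flops / cray_flops      # ~8.5e7
moore_gain = 2 ** (years / 2.0)               # assuming doubling every 2 years
print(f"actual gain ≈ {actual_gain:.1e}")
print(f"Moore's-law gain ≈ {moore_gain:.1e}")
print(f"ratio ≈ {actual_gain / moore_gain:.0f}")   # comes out at a few thousand
```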

This raises the question: why do they need such powerful computers? In the past they may have been used to simulate nuclear explosions or design stealth fighters. Now they may be needed to decrypt and search all our e-mails for signs of dissenting tendencies, or perhaps there is an even more sinister purpose.

Artificial Intelligence

When computer pioneers such as von Neumann and Turing conceived the possibility of building electronic computers they thought it would be easy to make computers think like humans, even though they had no idea how fast computers would become. This turned out to be much harder than expected. Despite some triumphs, such as “superhuman” chess programs which can now crush the best grandmasters (see the discussion at the World Science Festival), the problem of making computers think like us has seen little progress. One possibility that looked promising back in the 1980s was neural networks. When I left academia some of my colleagues at Edinburgh were switching to neural networks because the theory and the computing problems were very similar to lattice calculations. Today their work has applications in areas such as facial recognition, but it has failed to deliver any real AI.

Now a new idea is raising hopes, based on the increasing power of computers and scanning technologies. Can we simply map the brain and simulate it on a computer? To get a flavour of what is involved you can watch this TED talk by neuroscientist Sebastian Seung. His aim is to simulate a small part of a mouse brain, which seems quite unambitious, but actually it is a huge challenge. If they can get that working then it may simply be a case of scaling up to simulate a complete human brain. If you want to see a project that anyone can join, try OpenWorm, which aims to simulate the 7,000 neural connections of a nematode worm, the simplest functioning brain in nature (apart from [insert your favourite victim here]).

Brain Scans

An important step will be to scan every relevant detail of the brain, which consists of 100 billion neurons connected by a quadrillion synapses. Incredibly, the first step towards this has already been taken. As part of the European Human Brain Project, funded with a billion euros, scientists have taken the brain of a 65-year-old woman who died with a healthy brain and sliced it into 7,404 sections, each just 20 microns thick (Obama’s Brain Mapping Project, which has had a lot of publicity, is just a modest scaled-down version of the European one). This is nearly good enough detail to get a complete map of the synaptic connections of every neuron in the brain, but that is not clear yet. If it is not quite good enough yet, it is at least clear that with an order of magnitude more detail it will be, so it is now only a matter of time before that goal is achieved.

If we can map the brain in such precise detail, will we be able to simulate its function? The basic connectivity graph of the neurons forms a sparse matrix much like the ones I used to study chiral symmetry breaking, but with about a trillion times as many numbers. An exaflop supercomputer is about a trillion times more powerful than the one I used back in 1984, so we are nearly there (assuming linear scaling). The repeated firing of neurons in the brain is (to a first approximation) just like multiplying the signal repeatedly by the connection matrix. Stable signals will be represented by eigenvectors of the matrix, so it is plausible that our memories are just the eigenvalue spectrum of the synaptic map and that the numerical methods we used in lattice gauge theories will be applicable here too.
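Here is a minimal sketch of that analogy, entirely my own illustration with a random sparse matrix standing in for a real synaptic map: repeatedly applying the connection matrix to an activity vector drives it towards the dominant eigenvector, which is the sense in which stable patterns correspond to eigenvectors.

```python
import numpy as np
import scipy.sparse as sp

rng = np.random.default_rng(1)
n = 10000                                  # toy "neurons"
W = sp.random(n, n, density=20 / n, format="csr", random_state=rng)  # random "synapses"

x = rng.normal(size=n)                     # initial pattern of activity
for _ in range(200):                       # repeated "firing": x -> W x, renormalised
    x = W @ x
    x /= np.linalg.norm(x)

# x has been driven towards the dominant eigenvector of W; the corresponding
# eigenvalue can be read off from the Rayleigh quotient.
print("dominant eigenvalue ≈", x @ (W @ x))
```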

However, the processes of logical reasoning are more than just recalling memories and will surely depend on non-linear effects in the brain just as the real physics of lattice QCD depends on the highly non-linear interactions of the gauge field. Will they be able to simulate those for a human brain on a computer? I have no idea, but the implications of being able to do so are enormous. People are starting to talk seriously about the moral implications as well as what it may bring in capability. I can understand that some agencies may want any such simulations to be conducted under a veil of secrecy if possible. Is this what is driving governments to push supercomputer power so far?

It would be ironic if the first true artificial intelligence is actually a faithful simulation of a human brain. No doubt billionaires will want to fund the copying of their own brains to giant supercomputers at the end of their lives if this becomes possible. But once we have the capability to simulate a brain we will also start to understand how it works, and then we will be able to build intelligent computers whose power of thought goes far beyond our own. Soon it may no longer be a question of if this is possible, just when.


BSM CPV in LHCb at HCP11

November 14, 2011

Beyond-standard-model CP violation has been reported by Mat Charles for the LHCb collaboration at the Hadron Collider Physics conference today. Here is the relevant plot, in which the cyan coloured band indicating the measurement fails to cross the black dot that marks the standard model prediction.

The numerical result, which was already rumoured at the weekend, is ΔACP = −0.82% ± 0.25%, a significance of just over 3 sigma.
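The quoted significance is just the central value divided by its uncertainty; a quick check, treating the quoted error as a single Gaussian uncertainty (a simplification, since the real result combines statistical and systematic errors):

```python
from scipy.stats import norm

delta_acp = -0.82   # per cent
sigma = 0.25        # per cent

z = abs(delta_acp) / sigma
p = 2 * norm.sf(z)                 # two-sided Gaussian p-value
print(f"significance ≈ {z:.1f} sigma, p ≈ {p:.1e}")
# ≈ 3.3 sigma, consistent with "just over 3 sigma".
```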

This measurement is sensitive to new physics, such as higher mass particles with CP violating interactions, so that could be the explanation. On the other hand it is also a very tricky measurement, subject to background and systematics. The significance will improve with more data, and already twice as much is on tape, so this is one to watch. The interesting thing will be to see whether the phenomenologists can account for this result using models that are consistent with the lack of other BSM results from ATLAS and CMS.

Update: This is also being reported in other blogs of course e.g. here and here, but for the most expert details see the LHCb public page and the CERN bulletin


Expected LHC Higgs Significance at 5/fb+5/fb

November 14, 2011

The Hadron Collider Physics conference starts today in Paris and we eagerly await updates for various searches, including the Higgs. 5/fb of luminosity has been collected by each experiment, but it is too soon for the analysis of the full data to be out, and this week we are only expecting results at 2/fb to be shown (though surprises are always possible). Indeed, ATLAS have recently revealed updates for three of the diboson Higgs channels at 2/fb in conference notes and at other conferences. These do not make much difference, but an update to the diphoton search would be worth seeing; it has so far only been shown by ATLAS at 1/fb. CMS have only released results at 1.6/fb for the main Higgs decay modes, so they are even more overdue for updates.

While we are waiting for that we can look forward to next year when results for 5/fb will be revealed, probably in March. When the results are combined we will see 10/fb and here is a plot showing the expected significance at that level. This is for 10/fb at CMS which can be taken as a good approximation for the ATLAS+CMS combination at 5/fb for each.
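As an aside, the reason 10/fb at one experiment approximates a 5/fb + 5/fb combination is that, for a background-dominated counting experiment, significance grows roughly as the square root of the luminosity, and two independent experiments combine in quadrature. A toy illustration under those simplifying assumptions (the signal and background rates are made-up numbers):

```python
import math

def significance(lumi, s_per_fb=2.0, b_per_fb=100.0):
    """Naive S/sqrt(B) for a counting experiment; the rates per /fb are invented."""
    return (s_per_fb * lumi) / math.sqrt(b_per_fb * lumi)

z_10 = significance(10)                       # one experiment with 10/fb
z_5x2 = math.sqrt(2) * significance(5)        # two experiments with 5/fb each, in quadrature
print(f"10/fb alone: {z_10:.2f} sigma;  5/fb + 5/fb combined: {z_5x2:.2f} sigma")
# The two agree exactly under these assumptions, since S/sqrt(B) scales as sqrt(L).
```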

From this you can see that they expect at least 4 sigma significance all the way from 120 GeV to 550 GeV, which suggests that a good clear signal for the Higgs is almost certain if it exists, but not so fast. There are a couple of caveats that should be added.

Firstly the WW decay channels have been very good for excluding the Higgs over a wide mass range. Here is the viXra combined plot using 2/fb of ATLAS data and 1.5/fb from CMS.

This is only a rough approximation to what would be produced if they did an official version, because it assumes a flat normal distribution, uses a linear interpolation for the CMS points, and ignores any correlations.
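For the curious, the general idea of such a quick-and-dirty combination can be sketched like this. It is an illustration of the approach just described, not the actual viXra combination code, and the limit values below are invented:

```python
import numpy as np

# Hypothetical per-experiment 95% CL limits on the signal strength mu at a few masses.
mass_atlas = np.array([120, 130, 140, 150, 160])       # GeV
mu_atlas   = np.array([2.1, 1.4, 0.9, 0.7, 0.6])
mass_cms   = np.array([118, 128, 138, 148, 158])
mu_cms     = np.array([2.4, 1.6, 1.0, 0.8, 0.7])

# Linear interpolation of the CMS points onto the ATLAS mass grid.
mu_cms_interp = np.interp(mass_atlas, mass_cms, mu_cms)

# Treat each 95% limit as a Gaussian constraint with sigma = limit / 1.96,
# then combine by inverse-variance weighting (correlations ignored).
sig_a = mu_atlas / 1.96
sig_c = mu_cms_interp / 1.96
sig_comb = 1.0 / np.sqrt(1.0 / sig_a**2 + 1.0 / sig_c**2)
mu_comb_limit = 1.96 * sig_comb
print(np.round(mu_comb_limit, 2))
```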

Within those limitations we get an exclusion from 140 GeV to 218 GeV, with a broad excess around 2 sigma extending all the way from 120 GeV to 160 GeV. A Standard Model Higgs in this region would only have a width of a few GeV, and no bump of that sort is seen, so what does it mean? ATLAS and CMS will probably need to consider this question for a long time before approving results like this with more data, along with a suitable explanation. For now you should just bear in mind that this plot suffers from large backgrounds and poor energy resolution due to the use of missing energy to identify the two neutrinos. These effects have been worsened by the high pile-up this year. I suspect that this channel will have to be used only where it provides a 5 sigma exclusion and should be left out when looking for a positive signal.

For this reason I have added a red line to the projected significance plot above showing the expected significance for just the diphoton plus ZZ to 4 lepton channels. These decay modes have very good energy resolution because the photons and high energy leptons (electrons and muons) are detected directly with good clarity and are not affected by pile-up. I think that the best early signal for the Higgs boson will be seen in a combination of these channels alone. The projected significance plot shows that with the data delivered in 2011 we can expect a signal or exclusion at a level of significance ranging from about 3 sigma to 6 sigma in the mass range of 115 GeV to 150 GeV, where the Higgs boson is now most likely to be found.

Does this mean that we will get at least a 3 sigma “observation” of the Higgs by March? No, not quite. There is one other obvious caveat that is often forgotten when showing these projected significance plots. These are only expected levels of significance, and like everything else they are subject to fluctuations. Indeed, given twenty uncorrelated mass points we should expect fluctuations of up to 2 sigma somewhere over the range. How could this affect the result? The next plot illustrates what this could mean, assuming an expected significance of 4 sigma.

In this plot the green line represents the expected level for a positive signal of a standard model Higgs, while the red line represents the level where there is no Higgs. The data points have error bars of the size you will get when you expect a 4-sigma level of significance. So point A shows where the bars are expected to sit if the SM Higgs exists at a given mass value, and point B shows where the bars are expected if there is no Higgs. If they get observed data in these locations they will be able to claim a 4-sigma observation or exclusion, but remember that fluctuations are also expected. Point C shows what happens when the Higgs is there but an unlucky one sigma fluctuation reduces the number of observed events. The result is a reduced significance of three sigma. Likewise point D shows an unlucky one sigma fluctuation when there is no Higgs, which still gives a healthy three sigma exclusion. But remember that we expect fluctuations of up to two sigma somewhere in the range. Point E shows what happens when a Higgs is there but an unlucky two sigma fluctuation hits that mass point, and point F shows what happens when there is no Higgs with an unlucky two sigma fluctuation. The points are the same, corresponding to either a two sigma signal or a two sigma exclusion. We have already seen some points that look just like this at the summer conferences. This is why the CERN DG has cautiously promised to identify or exclude the Higgs only by the end of 2012 and not by the end of 2011. More optimistically, we can also hope for some lucky fluctuations. If they fall at the mass where the Higgs actually lives we will get a 6 sigma discovery-level signal like point G instead of merely a 4-sigma observation.
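The “up to 2 sigma somewhere” remark is easy to check numerically: with twenty roughly independent mass points, the largest fluctuation among them is typically around two sigma. A quick toy Monte Carlo, purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(42)
n_points = 20          # roughly independent mass points across the search range
n_trials = 100000

# For each pseudo-experiment, record the largest |fluctuation| among the 20 points.
max_fluct = np.abs(rng.normal(size=(n_trials, n_points))).max(axis=1)
print(f"median largest fluctuation ≈ {np.median(max_fluct):.2f} sigma")
# Comes out close to 2 sigma, as claimed.
```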

It’s a simple point and my explanation is a little too long-winded, but I think this had to be said clearly before the next results come out, in case people do not see what they expected to see. With another year of data 10/fb becomes (perhaps) 40/fb and 4 sigma becomes 8 sigma. Even with unlucky 2 sigma fluctuations they will be left with 6 sigma signals. The results will probably be good enough to claim discoveries even for the individual experiments and some individual decay channels, but for this year’s data there could still be a lot of ambiguity to mull over.


HCP 2011: Will it Deliver?

November 10, 2011

The rumour mill is once again turning its rusty wheels, and there are suggestions that an interesting result will be revealed at Hadron Collider Physics conference in Paris next week. More on that in a minute.

You may think that things have been quiet lately, but there have been a lot of workshops going on. They have not been reported much, but of course we bloggers have been trawling the slides for anything new and exciting. In case you want to search for anything we might have missed, here is a convenient list of links:

One thing that turned up was an update to the Higgs -> WW analysis for ATLAS, upgrading it from 1.7/fb to 2/fb. The effect is not terribly exciting; nothing has changed.
So now we are waiting for the HCP conference, but not much is expected, or is it? The full schedule of talks can be found here. If this is to be believed, even the new update for H -> WW will not be shown. The only thing certainly new is the ATLAS+CMS combination of data shown at Lepton Photon nearly three months ago.
But then an organizer speaks of a last-minute talk being added and a comment over at NEW says “…or maybe something else violates CP at 3.5 sigma level.” So do we have a new rumour about – perhaps – a result from LHCb, or is someone just hyping the conference?
Apart from that the next big question is when will the next wave of Higgs results be revealed? They must have done more analysis at 2/fb, yet we have not had anything beyond 1/fb for the crucial diphoton search from ATLAS. I am sure they must have also looked at plots using 3/fb to 4/fb but nothing has been said, except a few vague rumours that I don’t find convincing.
Now they will be preparing the 5/fb plots that should be ready for approval in December. We may see them soon after but if the results are really so inconclusive we may have to wait for the 5/fb ATLAS+CMS combination. That means there may be nothing ready to show until Moriond in March, unless…
Rumour Update 24-Nov-2011: The rumour apparently concerns a measurement of ΔACP at 600/pb and will be shown in the last talk today at HCP11. This quantity is the difference between the CP asymmetries in charmed D meson decays to a pair of kaons and to a pair of pions. It is not yet clear whether the rumoured 3.5 sigma result is merely a signal of CP violation or a deviation from the standard model.

What is the Future for Particle Accelerators?

November 6, 2011

This year all physics eyes are on the Large Hadron Collider as it approaches its promised landmark discovery of the Higgs Boson (or maybe its undiscovery). At the same time some physicists are planning the future for the next generation of colliders. What will they be like?

The answer depends in part on what the LHC finds. Nothing is likely to be built if there is no sign that it will do anything useful, but decisions are overdue and they have to make some choices soon.

Hadron colliders

Accelerators like the LHC that collide protons are at the leading edge of the Energy and Luminosity frontiers because they work with the heaviest stable particles that are available. The downside of colliding protons is that they produce messy showers of hadrons making it difficult to separate the signal from the noise. With the Tevatron and now the LHC, hadron colliders have been transformed into precision experiments using advanced detectors.

One technique is to capture and track nearly all the particles from the collisions making it possible to reconstruct the jets corresponding to interesting high energy particles such as bottom quarks created in the heart of the collision. Missing energy and momentum can also be calculated by subtracting the observed energy of all the particles from the original energy of the protons. This may correspond to neutrinos that cannot be detected or even to new stable uncharged particles that could be candidates for dark matter.
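As a toy illustration of that bookkeeping (not any experiment’s reconstruction code): in practice the balance is done in the transverse plane, because the longitudinal momenta of the colliding partons are unknown, so the missing transverse momentum is simply minus the vector sum of everything that was seen.

```python
import math

# Toy reconstructed particles: (transverse momentum in GeV, azimuthal angle phi).
particles = [(45.0, 0.3), (38.0, 2.9), (22.0, -1.2), (15.0, 1.7)]

# Missing transverse momentum = minus the vector sum of all visible particles.
px = -sum(pt * math.cos(phi) for pt, phi in particles)
py = -sum(pt * math.sin(phi) for pt, phi in particles)
met = math.hypot(px, py)
print(f"missing transverse energy ≈ {met:.1f} GeV")
```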

High luminosities have been achieved making it possible to scour the data for rare events and build up a picture of the interactions with high statistics. As luminosity increases further there can be many collision events at once making it difficult to reconstruct everything that happens. The LHC is now moving towards a new method of operation where it looks for rare events producing high energy electrons, muons and photons that escape from the heart of the collision giving precise information about new particles that decayed without producing jets or missing energy. In this way hadron colliders are getting a new lease of life that turns them into precision tools very different from how they have been seen in the past.

So what is the future of hadron colliders? The LHC will go on to increase its energy to the design limit of 14 TeV while pushing its luminosity even higher over the coming years. Its luminosity is currently limited by the capabilities of the injection chain and the cryogenics. These could undergo an upgrade to push luminosities ten times higher, so that each year they collect 50 times as much data as they have in 2011. Beyond that, a higher energy upgrade is being planned that could push its energy up to 33 TeV. The magnets used in the LHC main ring today are based on superconducting niobium-titanium coils that generate magnetic fields of 8.3 tesla. Newer magnets could be built using niobium-tin to push the field up to 20 tesla and more than double the energy. If they could revive the tunnel of the abandoned SSC collider in Texas and use niobium-tin magnets it would be possible to build a 100 TeV collider, but the cost would be enormous. The high-energy upgrade for the LHC is not foreseen before 2030 and anything beyond that is very distant. Realistically we must look to other methods for earlier advances.
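The scaling behind those energy numbers is the simple relation between beam momentum, dipole field and bending radius, p [GeV/c] ≈ 0.3 B [T] ρ [m]. A quick check using a round figure for the LHC’s bending radius (the numbers here are approximations, not machine specifications):

```python
# Beam energy from dipole field and bending radius: p [GeV/c] ≈ 0.3 * B [T] * rho [m].
rho = 2800.0        # m, approximate LHC bending radius (round figure)

for B in (8.3, 20.0):                       # current Nb-Ti dipoles vs hoped-for Nb3Sn
    p_beam = 0.3 * B * rho / 1000.0         # TeV per beam
    print(f"B = {B:4.1f} T  ->  ~{p_beam:.1f} TeV per beam, ~{2 * p_beam:.0f} TeV collisions")
# 8.3 T gives ~7 TeV per beam (14 TeV collisions); 20 T gives ~17 TeV per beam (~34 TeV).
```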

Is the future linear?

The only linear collider built to date is the SLC at Stanford, which reached a centre of mass energy of about 90 GeV. As hadron colliders reach their physical limits, physicists are returning to the linear design for the next generation of colliders. When accelerating in a straight line there is no advantage in using heavy particles, so linear colliders work equally well with electrons and positrons, which give much cleaner collisions.

The most advanced proposal is the International Linear Collider, which would provide centre of mass energies of at least 500 GeV, with 1 TeV also possible. The aim of the ILC would be to study the Higgs boson and top quark with very high precision measurements of their mass, width and other parameters. This may seem like an unambitious goal, but if the LHC finds nothing beyond the standard model in the data collected in 2011 this could be the best option. The standard model makes very precise predictions about the quantities that a linear collider could measure. If these can be checked, any deviations could give clues to the existence of new particles at higher energies. Such precision measurements have already been useful in predicting where the mass of the Higgs Boson lies, but once all the parameters of the standard model can be measured the technique will really come into its own. Finding solid evidence for deviations from the standard model would be the requirement to choose and justify the construction of the next collider at the energy frontier.

But there is an alternative. A new innovative design for a compact linear collider (CLIC) is being studied at CERN, and it could push the energy of linear colliders up to 3 TeV or even 5 TeV. The principle behind CLIC is to use a high intensity drive beam of electrons at lower energy to accelerate another, lower intensity, beam of electrons to a much higher energy. Just think of how a simple transformer can be used to convert a high current, low voltage source of electricity into a low current, high voltage source. CLIC does a similar trick, but the coils of wire in the transformer are replaced by resonant cavities. It is a beautiful idea, but is it worth doing?
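To put rough numbers on the transformer analogy (these are illustrative figures of about the right order, not the official CLIC parameters): the drive beam carries a high current at modest energy, and decelerating it makes RF power available to accelerate the low-current main beam to a much higher energy.

```python
# Transformer analogy for two-beam acceleration, with purely illustrative numbers.
I_drive = 100.0      # A, high-current drive beam (order of magnitude, assumed)
V_decel = 0.25e9     # V, energy extracted per drive-beam electron (assumed)
efficiency = 0.25    # fraction of drive-beam power reaching the main beam (assumed)
I_main = 1.0         # A, low-current main beam (order of magnitude, assumed)

P_drive = I_drive * V_decel                  # power given up by the drive beam
V_accel = efficiency * P_drive / I_main      # accelerating voltage seen by the main beam
print(f"drive beam gives up {P_drive/1e9:.0f} GW, main beam gains ~{V_accel/1e9:.0f} GV")
# Low voltage at high current is traded for high voltage at low current,
# just as in the step-up transformer analogy in the text.
```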

The answer depends on whether there is anything to be found in the extended energy range. This is being explored by the LHC and so far nothing new has been seen with any level of certainty. There is still plenty of room for discovery but decisions must be made soon so the data collected in 2011 will be what any decision has to be based on.

It is going to be a hard choice. For me it would be swung towards CLIC if it could be the start of a design that could lead to even higher energies. Could the same trick be used a second time to provide even higher energies, or is it limited by the amount of power needed to run it? Do other designs have better prospects, such as a muon collider? Big money and decades of development are at stake so let’s hope that the right decision is made based on physics rather than politics.

Perhaps it is worth a poll. If it was a straight choice, which of these would you prefer to see international funds spent on?


LHC end of run update

October 30, 2011

Today is scheduled as the end of proton physics at the Large Hadron Collider and the last few fills are circulating this morning. The integrated luminosity recorded this year will end at about 5.2/fb each for CMS and ATLAS, 1.1/fb for LHCb and 5/pb for ALICE. For the remainder of this year they will return to heavy ion physics until the winter shutdown.

The good news this year has been the high luminosity achieved, with peaks at 3.65/nb/s. This compares with the expectation of 0.288/nb/s estimated before the 2011 run began. The higher luminosity has been made possible by pushing beam parameters (number of bunches, bunch intensity, emittance, beta*) to give better than expected performance. The not so good news is that out of the 230 days that were available for physics runs, only 55 (24%) were spent in stable beams. This was due to a barrage of technical difficulties, including problems with RF, vacuum, cryogenics, power stability, UFOs (unidentified falling objects), SEUs (single event upsets) and more. There were times when everything ran much more smoothly and the time in stable beams was then twice the average. The reality is that the Large Hadron Collider pushes a number of technologies far beyond anything attempted before, and nothing on such a scale can be expected to run smoothly first time out. The remarkable amount of data collected this year is testament to the competence and dedication of the teams of engineers and physicists in the operation groups.
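For a sense of how those numbers hang together, here is a back-of-envelope estimate; the average-to-peak luminosity ratio is an assumed round figure, since the peak was only reached late in the year and luminosity decays during each fill.

```python
# Does a 3.65/nb/s peak luminosity and ~55 days of stable beams roughly
# account for ~5/fb of integrated luminosity?
peak_lumi = 3.65e-6          # /fb per second  (3.65 /nb/s, since 1 fb^-1 = 1e6 nb^-1)
stable_seconds = 55 * 86400  # ~55 days in stable beams
avg_over_peak = 0.3          # assumed effective average-to-peak ratio

integrated = peak_lumi * stable_seconds * avg_over_peak
print(f"~{integrated:.1f} /fb")   # comes out around 5/fb, in the right ballpark
```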

After the heavy ion runs they will start looking towards next year. There will be a workshop at Evian in mid-December to review the year and prepare for 2012. Mike Lamont, the LHC machine coordinator, will be providing a less technical overview in the John Adams Lecture on 18th November.


Brian Cox, Bloggers and Peer Review

October 24, 2011

Brian Cox is a professor of physics at Manchester and a member of the ATLAS collaboration. He is very well known as a television presenter for science, especially in the UK, and has been credited with a 40% increase in the uptake of maths and science subjects at UK schools. He is clever, funny and very popular. If you are in the UK and missed his appearance on the comedy quiz QI last week you should watch it now (4 days left to view).

At the weekend the Guardian published a great question and answers article with Brian Cox and Jeff Forshaw, who I am less familiar with. The answers all made perfect sense except one:

How do you feel about scientists who blog their research rather than waiting to publish their final results?

BC: The peer review process works and I’m an enormous supporter of it. If you try to circumvent the process, that’s a recipe for disaster. Often, it’s based on a suspicion of the scientific community and the scientific method. They often see themselves as the hero outside of science, cutting through the jungle of bureaucracy. That’s nonsense: science is a very open pursuit, but peer review is there to ensure some kind of minimal standard of professionalism.

JF: I think it’s unfair for people to blog. People have overstepped the mark and leaked results, and that’s just not fair on their collaborators who are working to get the result into a publishable form.

I would be interested to know what Brian Cox was thinking of here. Which bloggers does he think see themselves as “the hero outside of science”, and what examples back up the idea that bloggers try to circumvent peer review?

It is not clear to me whether Brian Cox is referring to the internal review process that experimental collaborations go through or the peer review provided by journals as part of publication. Surely it cannot be the latter, because most science research, and especially that from CERN, is widely circulated long before it reaches the desk of any journal editor, not by bloggers but by CERN itself through conferences, press releases, preprints etc. So Cox must be talking about internal review, but that does not really count as peer review in the usual sense. In any case, people within a collaboration do not get away with blogging about results before approval.

There have been a few leaks of results from CERN and Fermilab before approval by the collaboration. For example, one plot featured here earlier this year came from a talk that turned out not to be intended for the public. However, by the time I had passed it on it was already in Google, having been “accidentally” released in a form that made it look like any other seminar where new preliminary results are shown. There were a few other examples of leaks, but none that I can think of that fit what Cox is saying.

Given his obvious dislike for blogs I can’t hold much optimism that Brian will comment here and elaborate on what he means, but it would be nice if he did. Otherwise perhaps someone else knows some examples that could justify his answer. Please let us know.


Help CERN search for the Higgs boson

August 11, 2011

If you have been following our reports on new developments in the search for the Higgs Boson you may be itching to get involved yourself. Now you can, by joining LHC@Home 2.0, a new project for people to run simulations of LHC particle collisions on their home PCs.

Projects like this used to be difficult to set up because the software is written to run on Linux systems, but a new virtual machine environment from Oracle has made it much easier. CERN runs simulations on a powerful global computing grid, but you can never have too much CPU time for the calculations they have to do.

Running Monte Carlo simulations to compare with the latest experiments is an important part of the analysis that goes into the plots they show at the conferences. CERN have been making extraordinary efforts to show results as quickly as possible to the public, but these calculations are one of the limiting factors that keep us waiting. Getting the public involved in the process may be one way to solve the problem.
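To give a flavour of why such simulations matter, here is a deliberately over-simplified toy, nothing like the full detector simulations LHC@Home runs: even deciding whether an observed event count is interesting means generating many background-only pseudo-experiments and asking how often they fluctuate up to what was seen.

```python
import numpy as np

rng = np.random.default_rng(7)
expected_background = 100.0   # expected background events (made-up number)
observed = 130                # observed events (made-up number)

# Generate background-only pseudo-experiments and count how often they reach the observation.
pseudo = rng.poisson(expected_background, size=1_000_000)
p_value = np.mean(pseudo >= observed)
print(f"p-value ≈ {p_value:.4f}")   # how often background alone fluctuates this high
```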

