Is someone trying to sabotage the Large Hadron Collider?

May 25, 2010

This morning the Large Hadron Collider set yet another record as it injected 13 proton bunches in each beam to reach a luminosity of 2 × 10²⁹ cm⁻²s⁻¹, about three times previous records. The next step will probably be to increase the intensity of the bunches from 20 billion protons in each bunch up to the nominal design limit of 120 billion.

With such high bunch intensities the beam controllers must take extra care. If the beam becomes unstable, the protons can go astray and plough into the sensitive particle detectors, damaging the instruments. If instabilities grow, the beams must be dumped quickly before all control is lost. This is why the gradual increase in luminosity has been a slow process.

Since they started up the LHC at the end of last year, the physicists have been dogged by a mysterious source of interference christened “The Hump”. Some unknown vibration is causing an extra vertical oscillation in the beams. The movement is very tiny, just microns in magnitude, but when its frequency hits the tune frequency of the beam, it resonates and the protons start to spread out from the beam vertically, causing beam losses. So far this has been just a nuisance, decreasing the lifetimes of the beams and sometimes triggering an emergency dump, but as beam intensities increase it becomes a bigger threat and could lead to damage to the accelerator.

Over the past few months the engineers have been trying to trace the source of the interference, but without success. Accelerators are very sensitive to any kind of movement. The previous accelerator at CERN, known as LEP, was affected by the tidal pull of the moon and by the passing of France’s high-speed train, the TGV. Recently the Tevatron in the US detected the devastating earthquake in Haiti, thousands of miles away, when it caused a small wobble in the beams.

But The Hump is different. It is a high-frequency vibration which drifts in frequency between about 3.1 and 3.8 kHz, sliding up and down the spectrum repeatedly over a period of 7 to 10 minutes. It could be either a sonic vibration or a low-frequency radio wave. It is unpredictable, sometimes present and sometimes not. The CERN engineers have looked for sources of vibration from equipment in the accelerator such as vacuum pumps or the cryogenics, but nothing matches the observed interference and now they are stumped.

At this point they must start to consider a more sinister possibility. Could The Hump be a deliberate plot to sabotage the LHC? There are plenty of people who would be motivated to attempt such an act. Since its inception the collider has been a source of controversy because a few fringe physicists have suggested that it could create black holes or other unnatural entities that could grow in size to swallow up the Earth. Scientists at CERN have dispelled such theories, pointing out that anything created will fly off at nearly the speed of light and will not be captured by the Earth. Black holes would decay just like any other particle and could not be dangerous, even in the unlikely event that they are created. It has also been pointed out that the kind of particle collisions the LHC produces are commonly produced in the atmosphere by cosmic rays and would have caused problems long ago if there were any danger. Further studies of particle interactions around neutron stars have eliminated any possibility of catastrophic events beyond any reasonable doubt.

This has not satisfied the more hardened detractors, who claim that the only people qualified to make such a risk assessment are particle physicists with a self-interest in running the experiment. Some activists have even instigated court cases to try to get the LHC closed down by legal process. With the failure of such initiatives it is certainly not beyond the realms of possibility that some fanatics would attempt to disrupt the LHC directly.

The Hump has especially annoying characteristics. The LHC beam has a tune frequency of around 3.5 kHz, which can be modified to avoid most sources of noise, but The Hump varies in frequency, crisscrossing the range in which the tune can be set. If you wanted to design a source of interference to disrupt the accelerator you could do no better than this.
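To see why a drifting frequency is so much worse than a fixed one, here is a minimal toy simulation, with all parameters chosen purely for illustration rather than taken from the machine: a proton's vertical oscillation is modelled as a lightly damped oscillator, driven by a weak tone that sweeps across the band. The response stays tiny except when the drive crosses the tune, where the amplitude spikes, and because the sweep keeps recrossing the whole range, no fixed tune setting escapes it.

```python
import numpy as np

# Toy model of The Hump: a proton's vertical betatron motion treated as a
# lightly damped harmonic oscillator, driven by a weak tone whose frequency
# sweeps back and forth across the 3.1-3.8 kHz band.
# All parameters are illustrative, not real LHC values.

f_tune = 3.5e3                  # vertical tune frequency, Hz
omega0 = 2 * np.pi * f_tune
damping = 5.0                   # weak damping rate, 1/s
drive_amp = 1e-3                # tiny driving force, arbitrary units

dt = 1e-5
t = np.arange(0.0, 2.0, dt)

# The real sweep takes 7-10 minutes; it is compressed here to 0.5 s
# so the run stays short.
f_drive = 3.45e3 + 0.35e3 * np.sin(2 * np.pi * t / 0.5)
phase = 2 * np.pi * np.cumsum(f_drive) * dt   # integrated phase of the sweep

# Semi-implicit Euler integration of x'' = -w0^2 x - g x' + F sin(phase)
x, v = 0.0, 0.0
envelope = np.zeros_like(t)
for i in range(len(t)):
    a = -omega0**2 * x - damping * v + drive_amp * np.sin(phase[i])
    v += a * dt
    x += v * dt
    envelope[i] = abs(x)

# Off resonance the response is tiny; each time f_drive crosses f_tune
# the amplitude spikes, and no fixed choice of tune escapes the sweep.
print("peak response:", envelope.max())
print("median response:", np.median(envelope))
```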

If The Hump is caused by a sonic wave or radio wave, it must be coming from directly above or below the collider ring at some point, because it is disturbing the beams in the vertical direction. More distant sources would cause horizontal vibrations. Natural causes from below can be ruled out because they would have been seen before. This points to a source of interference coming from above the ring, probably at ground level. It is possible that a sonar device or radio transmitter operating at the observed frequency could be deliberately targeted at the ring from a station or movable vehicle above ground.

The LHC scientists are nearly ready to step up the beam intensity so that real physics experiments capable of discovering new phenomena can begin. The saboteurs, if they are really out there, will be waiting for the right moment to throw the collider beams off course, damaging the detectors and making further experiments impossible.

Perhaps this is just another conspiracy theory and a benign source for The Hump will be found soon, but after several months of searching without result, no possibility can be ruled out.


“crackpots” who were right 10: Roger Apéry

May 24, 2010

Up to now our series on people who were regarded as “crackpots” but who turned out to be right has not included any mathematicians. That is because most issues in mathematics can be resolved quite quickly: the logic of a claimed proof can normally be checked and either verified or debunked beyond doubt.

Not that mathematics is without its controversies. These have mostly arisen when a mathematician introduced a new level of abstraction whose validity was disputed. Here are a few classic examples just to give a flavour:

  • 1545: Gerolamo Cardano introduced imaginary numbers, which were further developed by his disciple Rafael Bombelli, but they were not widely accepted until the work of Leonhard Euler in the eighteenth century.
  • 1830: Nikolai Lobachevsky introduced hyperbolic geometry, an early example of non-Euclidean geometry, but his work was rejected by the St. Petersburg Academy of Sciences. Gauss and Bolyai had similar ideas at the same time, but it was not until Riemann introduced more general non-Euclidean geometries that the validity of the work started to be accepted.
  • 1852: Ludwig Schläfli classified the six regular polytopes in four dimensions but his manuscript was rejected by the Austrian Academy of Sciences and then by the Berlin Academy of Sciences and was not published in full until after his death.
  • 1873: Georg Cantor analysed transfinite numbers but his work was opposed by a number of mathematicians who believed that such entities do not exist. Leopold Kronecker was especially critical and prevented publication of the work in Crelle’s Journal.
  • 1888: David Hilbert’s proof of his basis theorem was heavily criticised for being non-constructive. Paul Gordan, who had originated the problem, was particularly unimpressed and refused to accept the solution.
  • 1945: When Saunders Mac Lane and Samuel Eilenberg tried to publish their seminal work on category theory, it was at first rejected as being devoid of content.

Some of these examples could be counted as cases of “crackpots” who were right, and there are others, but a more striking story is that of Roger Apéry. His mathematics was initially rejected, not because it was too abstract, but because his colleagues would not believe that it could be right.

Roger Apéry was a French mathematician of Greek origin, born in 1916. As a working mathematician in France he took a rebellious political and philosophical position that was not liked by his contemporaries. When he was asked to join the Bourbaki group, who were famously compiling an encyclopedia of mathematics under a collective pseudonym, Apéry declined because he saw mathematics as a more individual pursuit. This led to his isolation from French mathematicians.

At the age of 61 Apéry was an undistinguished mathematician suffering from the dislike of his colleagues and his own problems with alcoholism. At such a late age mathematicians are not normally expected to produce groundbreaking results, so it is easy to understand the level of skepticism that greeted his announcement of a proof that the number ζ(3) = 1 + 1/2³ + 1/3³ + … is irrational.

The incredulity heightened when it was seen that the method of proof was very basic, using techniques that could have been understood by mathematicians such as Euler, who died nearly 200 years earlier. Many claimed proofs of old problems are rejected today at a glance simply because an experienced mathematician “knows” that methods that are too elementary cannot solve the problem: all such avenues should have been explored before.

In 1978 he presented a lecture on his proof at the Journées Arithmétiques de Marseille, which was greeted with doubt and disbelief. Each step he wrote on the blackboard appeared to be a remarkable identity that his audience considered unlikely to be true. When someone asked him “where do these identities come from?” he replied “They grow in my garden.” Obviously this did not boost anyone’s confidence.

Nevertheless, a few mathematicians recognised that there was something significant in the proposed proof. They checked the identities numerically and found that they did indeed seem to hold. It was not long before the full validity of Apéry’s work was confirmed and the skeptics were forced to eat their words.
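That numerical check is easy to reproduce today. Here is a minimal sketch using the recurrence at the heart of Apéry's proof: two sequences generated by the same rule, n³uₙ = (34n³ − 51n² + 27n − 5)uₙ₋₁ − (n − 1)³uₙ₋₂, from different starting values, whose ratio converges to ζ(3) remarkably fast.

```python
from fractions import Fraction

# Apery's recurrence:
#   n^3 u_n = (34n^3 - 51n^2 + 27n - 5) u_{n-1} - (n-1)^3 u_{n-2}
# Run it from two different starting pairs; the ratio a_n / b_n converges
# to zeta(3) fast enough to force the irrationality of zeta(3).

def apery_ratio(steps):
    a = [Fraction(0), Fraction(6)]   # a_0 = 0, a_1 = 6
    b = [Fraction(1), Fraction(5)]   # b_0 = 1, b_1 = 5
    for n in range(2, steps + 1):
        c1 = 34 * n**3 - 51 * n**2 + 27 * n - 5
        c2 = (n - 1) ** 3
        a.append((c1 * a[-1] - c2 * a[-2]) / n**3)
        b.append((c1 * b[-1] - c2 * b[-2]) / n**3)
    return a[-1] / b[-1]

ZETA3 = 1.2020569031595942854   # zeta(3) to 20 digits
for n in (1, 2, 5, 10):
    approx = float(apery_ratio(n))
    print(f"n={n:2d}  a_n/b_n = {approx:.15f}  error = {abs(approx - ZETA3):.1e}")
```

Already at n = 2 the ratio 87.75/73 agrees with ζ(3) to six decimal places, which is exactly the kind of evidence that persuaded the first checkers to take the identities seriously.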


LHC progress

May 22, 2010

The rediscovery of physics at the LHC progresses, with ATLAS showing the first candidate events for the Z boson. These were first discovered in 1983 at CERN using the smaller SPS collider, which now serves as the injection ring for the LHC. ATLAS and the other LHC experiments have been seeing plenty of Z bosons, but we outsiders have to wait for them to approve the reports before we can see them. The next physics milestone will be observations of the top quark, which was discovered at the Tevatron in 1995.

Peak luminosity in ATLAS is currently 7.7 × 10²⁷ cm⁻²s⁻¹. Over the next few weeks they plan to increase this in a series of doubling steps. The process has been going very smoothly thanks to the good performance of the magnets and cryogenics. The plot below, using a log scale, shows how the integrated luminosity has been increasing exponentially, with a factor-of-a-thousand improvement in less than two months. The LHC controllers will want to see this trend continuing until luminosity reaches about 10,000 times current levels towards the end of the summer. One of the main bottlenecks holding up faster progress is the need to train the shift staff that control the LHC so that they all know how to deal with any issues that have been found.

[Plot: integrated luminosity at the LHC over time, log scale]
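A back-of-envelope check shows why doubling steps can plausibly deliver that target by the end of the summer (the weekly cadence below is our guess, not an announced schedule):

```python
import math

current = 7.7e27          # current ATLAS peak luminosity, cm^-2 s^-1
target_factor = 10_000    # hoped-for improvement by the end of the summer

doublings = math.log2(target_factor)
print(f"doublings needed: {doublings:.1f}")                    # ~13.3

# At, say, one doubling step per week (our assumption), that is
# roughly three months, i.e. towards the end of the summer.
print(f"target luminosity: {current * target_factor:.1e} cm^-2 s^-1")  # ~7.7e31
```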


1000 Papers on viXra.org

May 15, 2010

It is less than a year since we launched viXra.org, yet today we passed a major milestone when we uploaded the 1000th submission. When we started I had no idea that it would prove so popular, but now I can see that we have just scratched the surface. I now believe there are many more scientists who could be taking advantage of our e-print archive and I hope more of them will start to use it over the next year.

The reaction to viXra.org from the scientific community has been mixed. Those of us who are unable to use arXiv.org have of course welcomed it, but even some who submit regularly to arXiv.org have been submitting to viXra.org as well as a sign of support. On the other hand, some scientists regard viXra.org as a hotbed of crackpottery and send e-mails to those who use it warning them that it could harm their scientific careers!

It is encouraging to see that a large number of submissions to viXra.org are now marked with comments indicating that they have been submitted to, or even published in, peer-reviewed journals. Indeed the quality of many of the papers has surprised even us. There will always be those who look for the worst paper they can find and then hold it up as an indication of how poor viXra.org is. Clearly they are missing the point.

We have always acknowledged that our open submission policy means that we gather some work of dubious quality. However, we also recognise that our judgement can be wrong. The series of posts here under the heading “crackpots who were right” is a warning to us all that radical research is often rejected by the scientific community for many years before it is accepted. The 9 cases described so far are just the tip of the iceberg. These are the scientists whose work eventually led to a paradigm shift in their discipline, and in several cases the work was recognised with a Nobel Prize, though not always for the scientists who made the initial breakthrough. For every scientist who makes such a major advance in science there are many others who take smaller steps. Undoubtedly there must be many other independent scientists whose work was so completely rejected and ignored that it never garnered any recognition and has long been forgotten. Science suffers through such neglect, and that is why we think viXra.org is so important.

Let’s not forget the work of James Lovelock, who discovered the buildup of CFC gases in the atmosphere while working as an independent scientist. Were it not for his research, sponsored by no outside agency, we might not have been able to act fast enough to prevent the destruction of the ozone layer and the catastrophe that would have followed. Today the chance of such a discovery being ignored is greater than ever. Scientists who work in academic institutions are quicker to reject anything from outsiders. The new ease of communication brought about by the internet means that they are exposed to much more research from all sources, and to keep down their workload very few will take the trouble to read anything sent by an outsider. The major journals are obsessed with their impact factor metric and will reject work merely because they think it is not likely to get many citations. It is ironic that in the era of the world-wide web it is more difficult than ever to gain recognition for scientific research and easier to be ignored and forgotten.

When arXiv.org started in the 1990s it was possible for anyone to submit papers. Any “inappropriate” submissions were filtered out by hand. As the volume of submissions increased this process became impractical. The solution should have been to stop filtering altogether and use other systems, such as public rating, to help readers select the papers they are prepared to read. Instead they introduced a system of endorsement that means submission to arXiv.org now depends on who you know rather than what you write. In theory an authorised endorser should be ready to endorse good work from an outsider, but in practice the effort required to evaluate anything from people they don’t know (coupled with the threats from arXiv administrators in case they get it wrong) means that almost all work from independent scientists is rejected.

The frustrating part is that many people who could make use of viXra.org still won’t. They fear that their work will be seen as a reject from arXiv.org and will not have the credibility it deserves. In fact a worse fate is to submit successfully to arXiv.org and have the paper sent to the graveyard categories “general physics” or “general mathematics”. That is a definitive insult intended to remove it from the view of serious scientists. But neither arXiv.org nor viXra.org is a peer-reviewed journal, so there is no need for anyone to consider work in either as rejected. The only bad thing that can happen is that a paper is not archived anywhere.

Most papers these days are read either when someone else refers to them, or when someone finds them through a keyword search on Google or another internet search engine. The papers on viXra.org are just as well indexed by Google as those from arXiv.org, and we see many paper downloads as a result of such searches. Nevertheless it is disappointing that some specialised search engines do not yet recognise our archive. Google Scholar only includes papers from viXra.org if they have been cited or are related to some other paper from another source. SPIRES will apparently include links to papers from viXra.org if the author requests it. Others such as CiteSeer simply will not include us at all. I wonder how many papers viXra.org will have archived before we see a change of attitude.


Another Luminosity Record at the Large Hadron Collider

May 14, 2010

Three weeks ago we reported that the LHC had achieved a record luminosity by squeezing the proton beams to get a factor of 10 improvement. Now they have upped the numbers once again to get a theoretical increase by a further factor of about three. The new configuration is 3.5TeV/2m/20Billion/4-bunches compared to the previous 3.5TeV/2m/12Billion/3-bunches.

Increasing the number of bunches normally increases the luminosity, but it depends on how the bunches are organised in buckets to make them collide at the right points of the collider ring. The new arrangement is bucket numbers (1,3231,21081,26731) for beam 1 and (1,12141,17791,26731) for beam 2. Bunches with the same bucket number in both beams collide in the CMS and ATLAS experiments, so from these numbers you get 2 collisions per turn. The other numbers provide the same collision rates at the other two points where LHCb and ALICE are situated. In fact these bucket numbers provide the same number of collisions as the previous 3-bunch configuration, so this change alone does not increase the luminosity.
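Counting collisions from the bucket lists is simple set arithmetic. A minimal sketch for the ATLAS and CMS points, taking the bucket lists above at face value (the LHCb and ALICE points involve bucket offsets that are not spelled out above, so they are omitted here):

```python
# Bucket numbers for the 4-bunch fill quoted above.
beam1 = {1, 3231, 21081, 26731}
beam2 = {1, 12141, 17791, 26731}

# Bunches occupying the same bucket number in both beams meet at the
# ATLAS and CMS collision points.
shared = sorted(beam1 & beam2)
print(shared)                                  # [1, 26731]
print(len(shared), "collisions per turn in each of ATLAS and CMS")
```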

However they have also increased the intensity of the beams from 12 billion to about 20 billion protons per bunch. This gives a theoretical increase in luminosity of about 3 times. The actual figure will depend on factors such as how much the beam spreads out and how well the beams can be aimed at each other at the collision points.
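The factor of about 3 is just the square law at work: luminosity at a collision point scales with the product of the two bunch populations, i.e. as N² for equal beams. A quick check of the numbers (a sketch only; as noted above, beam size and aiming matter too):

```python
# Luminosity per bunch crossing scales as N1 * N2, i.e. N^2 for equal beams.
old_N = 12e9     # protons per bunch, previous fill
new_N = 20e9     # protons per bunch, this fill

print((new_N / old_N) ** 2)    # ~2.8, the quoted "factor of about 3"

# The same scaling explains the factor of 25 quoted below for
# nominal-intensity bunches of 100 billion protons:
print((100e9 / new_N) ** 2)    # 25.0
```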

The plan is to increase to 6 bunches per beam in the next few days. Depending on the buckets they use this could provide 4 collisions per turn which means another doubling of the luminosity. At the same time they have been doing test runs at much higher intensities of 100 billion protons per bunch but so far this has not been used in combination with multiple bunches and squeezed beams. This means much improved luminosities should be possible soon.

What does this mean for the physics? With the luminosities they have been running at so far they have already announced observations of W bosons and bottom quarks. They must have already seen Z bosons too, which are rarer, and perhaps they will have seen some top quarks. It takes time for them to review and approve any announcements, so we have not heard much about that yet. All of these particles are well known and are routinely produced in large numbers at the Tevatron in the US. The LHC needs to keep improving its luminosities so that it has enough collisions to see new particles that are only produced very rarely. Of course they are also using energies three times higher than the Tevatron, which means they could see new heavier particles that the Tevatron could never produce.

Update Saturday Morning: They have just succeeded in ramping “nominal bunches” to 3.5 TeV. This means bunches with 100 billion protons, which is 5 times the number used in the latest physics runs. This is exciting because luminosity increases with the square of this number. In fact if they used nominal bunches on a physics run right now they could set a new luminosity record even without the squeezing and multiple bunches. On this occasion they lost 40% of the beams during ramping due to “excitation of synchrotron sidebands”, but it was still a good step forward. It seems that using such high intensities can lead to problems with instabilities and beam spreading, but they nearly have it cracked. Looking forward to seeing physics with squeezed nominal bunches; hope that is not too far off now.

Highest luminosity recorded so far in ATLAS is 30 × 10²⁷ cm⁻²s⁻¹.

Update Saturday Afternoon: After the nominal intensity run of the morning they started a 6-bunch physics run with configuration 3.5TeV/2m/20Billion/6-bunches. The fill pattern provides 3 collisions per turn in each experiment. It would have been possible to get 4 collisions per turn with six bunches, but it looks like they are now opting for schemes that avoid displaced collisions: collisions that happen 11.5 m away from the collision point of one of the experiments, which are impossible to avoid in the more efficient filling schemes.

In any case the injection and ramp for this fill went very well with no sharp beam losses. The run is still continuing after 18 hours and by now they must have doubled the total integrated luminosity of the LHC. Highest luminosities have now been reported as 60 × 10²⁷ cm⁻²s⁻¹, which is double the previous record from last week and 6 times the earlier record. It is also 60 times that of the earlier runs before they squeezed the beams a few weeks back.

The LHC is now only a factor of 6000 behind the peak luminosities seen at the Tevatron. They can make up another factor of 25 if they use the nominal intensity bunches that they tested in the morning. The rest of the factor can be made up by increasing the number of bunches in the fill. They also have the option to further squeeze the beams down to the nominal beta of 0.5m rather than the current 2.0m. This would give another factor of four.
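Putting those factors together as a rough consistency check (a sketch of the arithmetic, nothing more):

```python
gap = 6000              # factor behind the Tevatron's peak luminosity
intensity_gain = 25     # (100/20)^2 from moving to nominal bunch intensity
squeeze_gain = 4        # from squeezing beta from 2.0 m down to 0.5 m

remaining = gap / (intensity_gain * squeeze_gain)
print(remaining)   # 60.0: roughly 60x more collisions per turn still
                   # needed from filling in many more bunches
```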

The plan of a few days ago was to reach this stage late on Sunday, so they have really had a good few days.


“crackpots” who were right 9: Karl Jansky

May 2, 2010

In 1930 astronomers studied the heavens using only optical telescopes. Today things are very different and we now know that electromagnetic waves of different wavelengths from radio waves up to gamma rays are radiated by stars and other cosmic bodies, so we use a range of telescopes covering the whole spectrum. This change was instigated in 1931, not by a professional astronomer, but by an amateur and it took twenty years before the academics took notice.

Karl Jansky was a young radio engineer working for the Bell Telephone Labs who was tasked with looking for sources of radio interference so that they could be eliminated from communication equipment. Jansky built a large directional radio antenna, some 30 meters across, mounted on a rotating platform and designed to pick up radio signals at 20 MHz. After some months he had identified different types of radio static, including some that he understood to come from distant lightning storms, but another source was more enigmatic. He noticed that it repeated on a cycle of 23 hours and 56 minutes, the length of the sidereal day. This meant that it was coming from a fixed point against the stars, and with more investigation he found that the strongest signal came from the centre of our galaxy.
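The 23 hours 56 minutes period deserves a quick arithmetic aside, because it is the whole key to the discovery: a source fixed on the sky peaks about four minutes earlier by the clock each day, adding up to a roughly two-hour shift over a month, behaviour that no man-made daily cycle or solar effect can mimic.

```python
solar_day = 24 * 3600                       # seconds in a clock day
sidereal_day = 23 * 3600 + 56 * 60 + 4      # 23 h 56 min 4 s

drift = solar_day - sidereal_day            # 236 s
print(drift / 60, "minutes earlier each day")        # ~3.9 minutes

# Over a month the peak slides by about two hours against the clock;
# this is the signature that tied Jansky's signal to the stars, not the Sun.
print(30 * drift / 3600, "hours of drift over 30 days")   # ~2.0 hours
```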

The discovery was widely publicised and was even reported in the New York Times in 1933, yet professional astronomers did not see it as more than a curiosity. Jansky wanted to study the signal with a better purpose-built antenna, but times were hard and the Bell Labs were only interested in the practicalities of radio transmissions. Jansky’s application was rejected. Writing to his father in 1934, he said: “I’m not working on the interstellar waves anymore. Friis has seen fit to make me work on the problems of and methods of measuring noise in general. A fundamental and necessary work, but not near as interesting as interstellar waves, nor will it bring near as much publicity. I’m going to do a little theoretical research of my own at home on the interstellar waves, however.”

The Great Depression was followed by World War II, during which pure scientific research was put on hold while scientists contributed to the war effort. During that time another amateur engineer took up the cause of radio astronomy. Grote Reber built an impressive radio telescope dish in his backyard in 1937. He confirmed Jansky’s observations and drew up detailed contour maps of the signal strength, which he published in the Astrophysical Journal.

Wanting to pursue his research further, Jansky applied for a teaching post at Iowa State University, hoping to make use of their facilities. Sadly, however, he was struck by illness and could not fulfill his dreams. He died in 1950 without seeing how important his pioneering research was about to become.

There were several reasons why it took so long for the importance of Jansky’s discovery to be recognised. The economic hardship and political unrest played their part, but in addition astronomers simply did not believe that the galaxy could be such a strong emitter of radio waves. This was not an observation they were willing to accept from amateurs at a time when professionals were concentrating on the revolutionary discoveries being made using optical instruments.

After the war radio astronomy gradually started to take off. In the US, John Kraus founded a radio observatory at Ohio State University, but it was Europe that took the first major lead. Bernard Lovell used equipment left over from the war to start a major project at Jodrell Bank in the UK. By 1957 he had built the giant radio dish, 76 meters in diameter, that would go on to revolutionise our knowledge of radio galaxies and the distant universe.

Karl Jansky did not live long enough to be honoured in his lifetime, but posthumously he has been accorded one of science’s most enduring forms of recognition: the unit used to measure the strength of radio sources is named the jansky.

