Hazel Muir

SOMETIME in 2007, physicists will come closer than ever to seeing what the universe was like a split-second after the big bang. Inside a 27-kilometre-long circular tunnel that straddles the border of France and Switzerland 100 metres underground, the Large Hadron Collider will push protons to almost the speed of light and smash them head-on at energies never before created on Earth.

But it will be a messy business. The torrent of information gushing forth from the LHC each year will be enough to fill a stack of CDs three times as high as Mount Everest. To make sense of it will require some 100,000 of today's most powerful PCs, so it is little wonder that CERN - the European centre for particle physics near Geneva that is building the collider - is co-opting a worldwide "grid" of computers to help store and analyse the data. Physicists hope that this collective computing power will help them spot exotic new particles, including the elusive Higgs boson, and validate theories that aim to unite three of the four fundamental forces of nature.
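The Everest comparison is easy to check with back-of-the-envelope arithmetic. Here is a quick sketch in Python, using the article's figure of 15 million gigabytes a year and assuming a standard 700-megabyte CD that is 1.2 millimetres thick (the disc figures are assumptions for illustration, not CERN's):

```python
# Back-of-envelope check of the "three Everests of CDs" claim.
# Disc capacity and thickness are standard CD figures, assumed
# here for illustration.

DATA_PER_YEAR_GB = 15_000_000   # ~15 petabytes/year, per the article
CD_CAPACITY_GB = 0.7            # ~700 MB per disc
CD_THICKNESS_MM = 1.2           # standard disc thickness
EVEREST_M = 8_848

discs = DATA_PER_YEAR_GB / CD_CAPACITY_GB
stack_m = discs * CD_THICKNESS_MM / 1000

print(f"{discs:,.0f} CDs, a stack {stack_m / 1000:.1f} km high")
print(f"...about {stack_m / EVEREST_M:.1f} times the height of Everest")
```

Run it and the stack comes out at roughly 26 kilometres, or about 2.9 Everests, which squares with the claim.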

"It will be very exciting to have a new facility searching for new particles in a completely unknown regime of energy," says Peter Watkins, head of particle physics at the University of Birmingham in the UK and a member of GridPP, the collaboration organising the UK's contribution to the LHC computing grid. "Many people have been working on the design for a significant part of their lives, and they're very excited about the day the LHC switches on."

In April, computer scientists at CERN got a taste of what they will be up against when the LHC powers up in 2007. Computers in Geneva sustained a continuous data flow of 600 megabytes per second to seven sites in Europe and the US over a period of 10 days. They transmitted a total of 500 terabytes, which would take 250 years to download on a typical domestic broadband link. "People were very happy with that," says Francois Grey, an IT spokesman for CERN. He adds that it was a challenge to get everyone to cooperate. "Computer centres tend to guard their particular way of doing things, and they are uncomfortable for security reasons about opening up to the world," he says.

This cooperation will be tested further in September, when CERN hopes to transmit data to many other computing centres and keep the stream going for three months, allowing scientists to test their software for analysing the data. "Then we'll really start to see how robust the system is, to understand its strengths and limitations," says Grey. "Everybody is quite excited, but also I think a little nervous about how that's going to go." The ultimate goal is to sustain triple the data flow achieved in April by the time the LHC gets down to business.
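Those figures hang together, as a quick sanity check shows. The sketch below assumes decimal units (1 megabyte = 10^6 bytes) and a 512-kilobit-per-second home connection, a typical domestic broadband speed of the period; the home link speed is an assumption, not a figure from the article:

```python
# Sanity-check the April service challenge figures.

RATE_MBPS = 600                      # sustained flow, megabytes/second
DAYS = 10
total_tb = RATE_MBPS * 1e6 * DAYS * 86_400 / 1e12
print(f"Total shipped: {total_tb:.0f} TB")          # ~518 TB, i.e. ~500 TB

# How long would that take on a 512 kbit/s home link? (An assumed
# figure for mid-2000s domestic broadband.)
HOME_BPS = 512_000 / 8               # bytes per second
years = total_tb * 1e12 / HOME_BPS / (365.25 * 86_400)
print(f"Download time at home: {years:.0f} years")  # ~250 years

# The stated goal before the LHC starts: three times the April rate.
print(f"Target rate: {3 * RATE_MBPS / 1000:.1f} GB/s")  # 1.8 GB/s
```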

And CERN will definitely need that capacity and more when the LHC goes full throttle. Nearly 5000 superconducting magnets, weighing up to 35 tonnes each and chilled to -271 °C, will accelerate counter-rotating bunches of protons to nearly the speed of light and smash them into each other 40 million times a second. The proton bunches will collide with an energy of 14 teraelectronvolts - about seven times what any previous accelerator has achieved.
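For a sense of scale, 14 teraelectronvolts can be converted into everyday units, and the "seven times" claim cross-checked against the previous record holder, Fermilab's Tevatron at roughly 2 TeV (background knowledge, not a figure from the article):

```python
# Convert the 14 TeV collision energy to joules and compare with
# the Tevatron (~1.96 TeV; background knowledge, not from the article).

EV_TO_JOULES = 1.602e-19
LHC_TEV = 14.0
TEVATRON_TEV = 1.96

joules = LHC_TEV * 1e12 * EV_TO_JOULES
print(f"14 TeV = {joules:.2e} J per collision")             # ~2.2e-6 J
print(f"Ratio to Tevatron: {LHC_TEV / TEVATRON_TEV:.1f}x")  # ~7x
```

A couple of microjoules sounds tiny, but squeezed into a volume far smaller than an atom it is an extraordinary concentration of energy.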

At each collision vast amounts of energy will be squeezed into a microscopic volume, reproducing the conditions inside the hot fireball that filled the universe just a million-millionth of a second after the big bang. Hundreds or thousands of particles will spray out from each collision, and a large fraction of these will have to be tracked and identified.

Detecting and sifting through nearly a billion trajectories each second will be the job of four main detectors. The two largest, ATLAS and CMS, will search for the Higgs particle, also known as the "God particle" as it is believed to be what endows other particles with mass. The detectors will also look for so-called "supersymmetric" particles. Glimpsing these would be a boost for theories of supersymmetry, which go beyond the standard model of particle physics. Although these theories help to unify the strong, weak and electromagnetic forces of nature, to do so they require the existence of a whole host of partners for all the known particles. The LHC will be the first particle accelerator to even come close to the energies thought necessary to create these supersymmetric partners.

The third detector, called the LHC "Beauty" experiment, will look for evidence of asymmetry in the production of particles called B mesons, to explain why today's universe is dominated by matter rather than antimatter. The fourth detector, ALICE, will be looking for an exotic and extremely dense state of matter called the quark-gluon plasma, which is thought to have existed a fraction of a second after the big bang. For ALICE, the LHC will collide heavy ions such as lead, rather than protons.

These four experiments will generate a staggering 15 million gigabytes of data each year. CERN could have built a massive centralised computing centre to analyse this data, but instead it has opted for the distributed grid approach. CERN will store all raw data in huge tape silos in a local facility, but it will also relay the data to a dozen storage sites in Europe, North America and Asia. From there it will filter down to about 100 smaller sites in 40 countries, then on to individual institutes and universities.
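That tiered fan-out maps naturally onto a simple tree. The sketch below is purely illustrative: the tier structure follows the article, but the site names and the code are hypothetical, not the grid's actual middleware:

```python
# Illustrative model of the tiered distribution the article describes:
# CERN's tape store feeds ~12 large centres, which feed ~100 smaller
# sites, which feed individual institutes. Site names are hypothetical.

from dataclasses import dataclass, field

@dataclass
class Site:
    name: str
    tier: int
    children: list["Site"] = field(default_factory=list)

    def distribute(self, dataset: str) -> None:
        # Store locally, then cascade a copy down the hierarchy.
        print(f"Tier {self.tier} {self.name}: stored {dataset}")
        for child in self.children:
            child.distribute(dataset)

# Build a tiny slice of the hierarchy for illustration.
cern = Site("CERN", 0)
regional = Site("RegionalCentre", 1)
university = Site("UniversityCluster", 2)
cern.children.append(regional)
regional.children.append(university)

cern.distribute("raw-run-0001")
```

The attraction of the design is that each tier only ever talks to its neighbours, so no single site, CERN included, has to serve every physicist directly.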

A worldwide grid of computers will allow the 6000-plus physicists working on the experiments to log on to local PCs and request a data analysis. To complicate things further, the data covering the 10 to 15 years that the LHC is expected to operate for will have to be
