CERN Scientists Test Big ‘Crunch’ Theory

GENEVA — CERN, the world’s biggest particle physics laboratory and birthplace of the World Wide Web, on Friday unveiled a new computer network allowing thousands of scientists around the world to crunch data on its huge experiments.

Some 7,000 scientists in 33 countries are now linked through the computing network at CERN, the European Organization for Nuclear Research, to analyze data from its particle-smashing experiment probing the nature of matter, which began last month.

That experiment, which could provide clues about the origins of the universe, began on September 10 and was shut down nine days later because of a helium leak in the 27 km (17 mile) tunnel of CERN’s Large Hadron Collider (LHC).

When it starts up again next year, physicists involved in the experiment will have access to real-time data on their desktops, thanks to CERN’s computing grid that links more than 100,000 processors at 140 institutes around the world.

The massive distributed supercomputer was built for the LHC project but has wide implications for the study of science, said Ian Bird, leader of the Worldwide LHC Computing Grid project.

“Many other researchers and projects are already benefiting,” Bird said. “Grid computing is enabling all-new ways of doing science where large data handling and analysis capabilities are required.”

The amounts of data involved in the largest scientific experiment ever conducted are hard to comprehend.

The LHC experiment involves firing beams of protons in opposite directions around the tunnel buried 100 meters (330 feet) below the French-Swiss border, on the outskirts of Geneva.

At full capacity the LHC will produce 600 million proton collisions per second, with its detectors taking readings 40 million times per second.

These will be filtered down in the four massive subterranean detectors — the largest of which is the size of a five-storey building — to 100 collisions of interest per second.

The data flow will be about 700 megabytes per second or 15 million gigabytes a year for 10 to 15 years — enough to fill three million DVDs a year or create a tower of CDs more than twice as high as Mount Everest.
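The article’s figures can be sanity-checked with a few lines of arithmetic. A quick sketch, assuming standard media capacities (4.7 GB per DVD, 700 MB and 1.2 mm per CD) and 8,848 m for Mount Everest, none of which are stated in the article:

```python
# Back-of-envelope check of the stated LHC data figures.
rate_mb_s = 700            # stated data flow, megabytes per second
yearly_gb = 15_000_000     # stated yearly volume, gigabytes

# Implied running time: the LHC does not collide beams year-round,
# so 700 MB/s sustained for ~15 million GB implies this many days.
running_seconds = yearly_gb * 1000 / rate_mb_s
print(f"implied running time: {running_seconds / 86400:.0f} days/year")

# DVDs per year, assuming 4.7 GB per single-layer DVD
dvds = yearly_gb / 4.7
print(f"DVDs per year: {dvds / 1e6:.1f} million")

# CD tower, assuming 700 MB and 1.2 mm per disc, vs. Everest at 8,848 m
cds = yearly_gb * 1000 / 700
tower_m = cds * 0.0012
print(f"CD tower: {tower_m / 1000:.1f} km ({tower_m / 8848:.1f} x Everest)")
```

The numbers hang together: they imply roughly 250 days of data-taking per year, about three million DVDs, and a CD stack well over twice the height of Everest, consistent with the comparisons in the article.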

“To analyze that amount of data you require not only a lot of computing but a new computing paradigm — that’s what we call the Grid, and that’s what we’re here to celebrate today,” CERN spokesman James Gillies told a press briefing.

Just as the World Wide Web — invented in 1990 at CERN — allows users to share access to information over the Internet, computer grids allow the linking of computing resources such as data storage capacity and processing power.

CERN has only 10 percent of the computing capacity needed for the LHC experiment, which will allow scientists to observe sub-atomic particles and probe the nature of gravity and matter. The grid will provide the rest.
