June 10th, 2007 07:45 EST
Small Cogs Drive a Big Machine at Vanderbilt
University scientists are the backbone of particle physics; like cogs in a complex machine, they deliver expertise, funding, and equipment exactly where needed. At Vanderbilt, they’re developing ways to handle a flood of data from the Large Hadron Collider.
Will Johns lugs his computer everywhere. This wouldn’t be a big deal if the computer weren’t a desktop PC, and if his travel didn’t include frequent trips across the United States and Europe. For this formidable task, he claims to own the largest suitcase in the world.
"I’ve learned to covet huge hard-side luggage," he says. "I need it for the world’s worst laptop."
Johns is an assistant professor of physics and astronomy at Vanderbilt University in Tennessee, an urban campus just a few blocks from downtown Nashville, the "Music City USA" of country-music fame. He needs that big, unwieldy desktop and its custom features to test electronics being assembled around the world for a massive detector at the Large Hadron Collider at CERN, the European laboratory for particle physics in Geneva, Switzerland.
Vanderbilt is one of 157 US institutions involved in high-energy physics, and one of 49 American universities collaborating on the collider’s Compact Muon Solenoid (CMS) detector. Like many others, the university faces the challenge of operating an experiment that’s on the other side of the ocean. By developing and constantly improving the technology that allows them to access data from very far away, Vanderbilt physicists are making it possible for their own group, and more than 2000 of their colleagues around the globe, to fully participate. The high-energy physics community is a leader in demanding fast, high-bandwidth networking, and advances in the field expand that capability each year.
Universities contribute a large part of the funding, equipment, manpower, and expertise needed for large collaborations such as the CMS detector. They hold more than half the project contracts for the US part of the collaboration and are involved in designing and operating the detector and analyzing the data from it. The University of Wisconsin, for example, designed and coordinated the assembly of the discs that form the end caps of the detector, while Ohio State University designed and built a portion of the electronics that will go into them.
"Many universities have specialty mechanical shops, which allow them to contribute to the design and manufacture of specialized electronics," says Fermilab’s Dan Green, former head and current member of the US CMS collaboration. He adds that university groups have the unique advantage of seeing the process from the start, which gives them the best perspective on the results: "The people who discover something will be the people who really understand the detector."
Universities are breeding grounds for technological development as well as for scholarship. Vanderbilt is part of a multi-university collaboration, led by Caltech, setting world records in computer networking that will allow data from the LHC to be transmitted across the Atlantic. The Massachusetts Institute of Technology, the University of Florida, and the University of Maryland will be leading physics analyses once the LHC is commissioned. In addition, universities nurture the future by training the next generation of particle physicists. For graduate students and post-doctoral researchers, participation in an experiment the size of CMS develops practical skills that can’t be learned from a textbook.
"University physicists are at the heart of modern particle physics. They develop theories, construct detectors, analyze data, and attract students to science," says Robin Staffin, associate director for high-energy physics at the Department of Energy’s Office of Science. "The physicists at Vanderbilt carry out their research at Fermilab and CERN, a strong testament to the partnership between DOE, its laboratories, and the university community."
Tony Chan, assistant director for Mathematical and Physical Sciences at the National Science Foundation, values NSF’s unique role among federal funding agencies in supporting physics research. "It is always satisfying for us at NSF to support basic science in universities, where most of the future generations of scientists are trained," he says. "This education role is recognized by the research community as critical to the health of the field."
The Department of Physics and Astronomy at Vanderbilt doubles as a high-performance computer science group. Vanderbilt’s physicists develop and test software to handle the massive amounts of data produced by the CMS experiment. Each LHC detector will observe up to 600 million collisions per second, storing data from only about 100 of the most "interesting" collisions each second.
The 12,500-ton CMS detector will include the largest solenoid magnet ever built. Four layers of sub-detectors will identify particles splintering off from proton-proton collisions, recording mass, speed, and electric charge.
"The LHC and CMS are a whole new scale of activity," says Bob Scherrer, head of the physics department. "Everybody’s feeling like it’s new."
The innermost part of the CMS detector, called the silicon pixel tracker, contains thousands of tissue-thin silicon squares densely packed around the beam pipe. They record changes in electrical charge left by particles whizzing away from collisions. Each signal goes to one of 38 readout boards, which convert the energies to a format that can be read by a computer.
After all the data are read out, the trigger system selects only the most interesting particle collisions for storage. Even so, the data accumulated in one day would fill more than 50,000 home computers, and researchers need a way to access it from their home institutions.
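The trigger's job, selecting a tiny fraction of collisions for permanent storage, can be illustrated with a toy sketch. This is a hypothetical simplification, not CMS's actual trigger logic, which applies far more sophisticated criteria (track patterns, particle identification, and multi-level hardware and software filters); here each event carries a single made-up "energy" value, and only the rare, energetic ones pass:

```python
import random

def toy_trigger(events, energy_threshold):
    """Keep only events whose energy exceeds the threshold.

    A toy stand-in for the trigger's real selection criteria,
    which are far more elaborate than a single cut.
    """
    return [e for e in events if e["energy"] > energy_threshold]

# Simulate a batch of collisions with random energies (arbitrary units).
random.seed(42)
collisions = [{"id": i, "energy": random.expovariate(1.0)}
              for i in range(1_000_000)]

# A high threshold keeps only the rare, energetic events worth storing.
kept = toy_trigger(collisions, energy_threshold=10.0)
print(f"stored {len(kept)} of {len(collisions)} collisions")
```

With an exponential energy distribution, a threshold of 10 keeps only a few dozen events out of a million, roughly mirroring the severity of the real selection.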
"Say I want to grab CMS data and manipulate it myself from my office at Vanderbilt," says Professor Paul Sheldon, a founder of Vanderbilt’s Advanced Computing Center for Research and Education. "I’m looking for something specific, and I want it to be fast. I want to compute with it, transform it, and even write it out temporarily. The less I have to think about where it comes from, the better." With Vanderbilt’s Daniel Engh, Sheldon is helping develop and deliver a US-based computing infrastructure to handle and distribute CMS data.
As the data are transmitted, they must immediately be written to disk. This will require massive storage centers.
Sheldon leads a team of US researchers who have developed a network of data storage clusters at nine universities and institutions across the Americas, including Vanderbilt and Fermilab. It’s called the Research and Education Data Depot Network, or REDDnet, and it works as a grid-computing system, with each institution acting as a separate storage drive. When a file is saved to the system, it’s split into many smaller pieces and distributed among the institutions. Later, when a scientist retrieves that file, the pieces travel back all at once from different locations and are pieced together. These small chunks of data load into the researcher’s local system much faster than they would in one large lump, greatly speeding access.
"Half of the file might be at Caltech and half of it somewhere else," explains Sheldon. "As a user, I don’t care about where it is, as long as it works."
Researchers across the US and at several institutions in Brazil already use REDDnet to store and access data from a number of disciplines, including medicine, geology, and astrophysics. Sheldon hopes physicists will soon do the same.
Once data are available and accessible, physicists need to understand what they are seeing. The detector records a jumble of energy measurements at many places within its many layers. Like detectives following a trail of clues, researchers use these measurements to trace events back through the layers to the center of the detector and find out what happened in the instant after the collision.
Vanderbilt’s Eric Vaandering writes software that will reconstruct these events. The pathways are based on data from previous and current collider experiments, as well as theoretical expectations. To detect new physics, such as supersymmetry or the Higgs boson, this software must be able to handle a vast range of data at a precise scale.
"We want to make sure we can do this reconstruction accurately with as many different kinds of input as possible," says Vaandering. "This gives us some confidence that we will do okay when the real data come in."
Vanderbilt capitalizes on a strong ongoing connection with Fermilab, which has been designated an LHC Tier 1 data center, meaning it will receive much of the CMS data directly from the experiment at CERN.
"The Vanderbilt group has had a tradition of computing success on many experiments at Fermilab," says Joel Butler, director of US CMS.
Group members often travel to the LHC Physics Center at Fermilab and meet with other CMS physicists to work out the kinks. "I can work on a problem for two weeks and just get really stuck on it," says Engh, who travels to the physics center about once every other month. "Then I go to Fermilab and get the whole thing worked out in two hours by talking to five of the right people."
Once the LHC switches on, the Vanderbilt group will join the ranks of experts sifting the data for inklings of new physics at the world’s highest-energy collider.
"Particle physics has been waiting a long time for something like the LHC," says Scherrer. "If amazing discoveries are going to happen, that’s where they will be."
This summer, Will Johns will move to Vienna to test final changes to the readout boards before they are installed in the CMS detector. He’s hoping advances in technology will someday make it unnecessary to lug the "world’s worst laptop" and its specialized data acquisition card.
"They just made a new card that works with a non-special computer," he says, "so maybe my schlepping days are drawing to a close."
Source: Christine Buckley, Symmetry magazine: http://www.symmetrymagazine.org/cms/?pid=1000465