Abstract
CMS is one of the two general-purpose HEP experiments currently under construction for the Large Hadron Collider at CERN. The handling of multi-petabyte data samples in a worldwide context requires computing and software systems of unprecedented scale and complexity. We describe how CMS is meeting the many data analysis challenges of the LHC era. We cover in particular our system of globally distributed regional centers, the status of our object-oriented software, and our strategies for Grid-enriched data analysis.
| Original language | English |
|---|---|
| Pages (from-to) | 353-357 |
| Number of pages | 5 |
| Journal | Nuclear Instruments and Methods in Physics Research, Section A: Accelerators, Spectrometers, Detectors and Associated Equipment |
| Volume | 502 |
| Issue number | 2-3 |
| DOIs | |
| Publication status | Published - 21 Apr 2003 |
| Event | ACAT 2002, Moscow, Russian Federation, 24 Jun 2002 → 28 Jun 2002 |