Data Analysis in ALICE

Lukasz Kamil Graczykowski & Malgorzata Anna Janik

1. Introduction

The statement “Complex experiments, such as ALICE, generate huge amounts of data that need to be processed and analyzed” can hardly be called revelatory. It does, however, trigger an interesting question: “How does this processing and analysis actually work?”. This in turn raises further questions: Does each physicist have a supercomputer designed for such "noble tasks" under their desk, or is a regular laptop more than enough? Are the computer programs physicists use a scientific secret, or are they publicly available and free? And... what exactly do those programs do, and how do they work?

2. A brief history of data analysis

When the High Energy Physics (HEP) experiments started, physicists did not have an easy task analyzing the recorded data to extract physics results. For some detectors, such as cloud and bubble chambers, films similar to those used for cinema movies were employed to record the trajectories of particles as they traversed the sensitive material of the detectors. Scientists sat with rulers, protractors and other geometric tools to analyze such images. The analysis took a lot of time and was possible only when the number of particles (visualized as tracks) was small: they all had to be visible in one image! Fortunately, together with the increasing energy of accelerators came the development of computers. This allowed the recorded data to be digitized and many of the procedures to be automated, not only in high energy physics but in science in general. We develop the algorithms, and the computers then process the data much faster (by many orders of magnitude) than we ever could.


Figure: Before electronic data analysis, physicists visually examined photographs of bubble chamber particle interactions. (Image: http://www.fnal.gov/)

3. Data analysis in ALICE

Physics experiments used to develop their own data-analysis software, each one independently. To provide common basic functionality, an open-source data-analysis framework was developed at CERN: ROOT. It is an object-oriented software package, written in C++, originally targeted at particle physics data analysis. By now, almost all large HEP experiments in the world use ROOT as the basis for developing their own software. Naturally, many of the actual computations are experiment specific; for example, the geometry of the detector has to be taken into account in the reconstruction process. Each experiment therefore complements ROOT with its own specific algorithms.
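To give a flavour of what working with ROOT looks like, below is a minimal sketch of a ROOT macro. It is a hypothetical example for illustration only (the file name, function name and histogram parameters are invented here), not actual ALICE analysis code: it books a histogram, fills it with randomly generated data, and fits a Gaussian to the result.

    // example.C -- a minimal, hypothetical ROOT macro (not ALICE code).
    // Books a histogram, fills it with 10 000 Gaussian-distributed random
    // numbers, fits a Gaussian back to the data, and draws the result.
    #include "TH1F.h"
    #include "TCanvas.h"

    void example() {
        // 100 bins between -5 and 5; the title string follows ROOT's
        // "title;x-axis;y-axis" convention
        TH1F *h = new TH1F("h", "Example distribution;x;entries", 100, -5., 5.);
        h->FillRandom("gaus", 10000);  // sample from ROOT's built-in Gaussian
        h->Fit("gaus");                // fit a Gaussian, printing the parameters
        TCanvas *c = new TCanvas("c", "Example");
        h->Draw();                     // draw the histogram with the fit overlaid
    }

Running "root -l example.C" would execute the macro. Everything used here is standard ROOT (TH1F for histograms, TCanvas for drawing); a real ALICE analysis would instead loop over reconstructed collision events through the experiment's own framework built on top of such classes.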
