\documentclass[12pt,a4paper,twoside]{article}
% ---------------------------------------------------------------
% define new commands/symbols
% ---------------------------------------------------------------
% General stuff
\newcommand {\pT} {\mbox{$p_{\rm t}$}}
\newcommand {\grid} {Grid\@\xspace}
\newcommand {\MC} {Monte~Carlo\@\xspace}
\newcommand {\alien} {AliEn\@\xspace}
\newcommand {\pp} {\mbox{p--p}\@\xspace}
\newcommand {\pA} {\mbox{p--A}\@\xspace}
\newcommand {\PbPb} {\mbox{Pb--Pb}\@\xspace}
\newcommand {\aliroot} {AliRoot\@\xspace}
\newcommand {\ROOT} {ROOT\@\xspace}
\newcommand {\OO} {Object-Oriented\@\xspace}
\newcommand{\Jpsi} {\mbox{J\kern-0.05em /\kern-0.05em$\psi$}\xspace}
\newcommand{\psip} {\mbox{$\psi^\prime$}\xspace}
\newcommand{\Ups} {\mbox{$\Upsilon$}\xspace}
\newcommand{\Upsp} {\mbox{$\Upsilon^\prime$}\xspace}
\newcommand{\Upspp} {\mbox{$\Upsilon^{\prime\prime}$}\xspace}
\newcommand{\qqbar} {\mbox{$q\bar{q}$}\xspace}
\newcommand {\grad} {\mbox{$^{\circ}$}}
\newcommand {\rap} {\mbox{$\left | y \right | $}}
\newcommand {\mass} {\mbox{\rm GeV$\kern-0.15em /\kern-0.12em c^2$}}
\newcommand {\tev} {\mbox{${\rm TeV}$}}
\newcommand {\gev} {\mbox{${\rm GeV}$}}
\newcommand {\mev} {\mbox{${\rm MeV}$}}
\newcommand {\kev} {\mbox{${\rm keV}$}}
\newcommand {\mom} {\mbox{\rm GeV$\kern-0.15em /\kern-0.12em c$}}
\newcommand {\mum} {\mbox{$\mu {\rm m}$}}
\newcommand {\gmom} {\mbox{\rm GeV$\kern-0.15em /\kern-0.12em c$}}
\newcommand {\mmass} {\mbox{\rm MeV$\kern-0.15em /\kern-0.12em c^2$}}
\newcommand {\mmom} {\mbox{\rm MeV$\kern-0.15em /\kern-0.12em c$}}
\newcommand {\nb} {\mbox{\rm nb}}
\newcommand {\musec} {\mbox{$\mu {\rm s}$}}
\newcommand {\cmq} {\mbox{${\rm cm}^{2}$}}
\newcommand {\cm} {\mbox{${\rm cm}$}}
\newcommand {\mm} {\mbox{${\rm mm}$}}
\newcommand {\dens} {\mbox{${\rm g}\,{\rm cm}^{-3}$}}
%
\newcommand{\FR}{ALICE alignment framework}
\lstset{ % general command to set parameter(s)
% basicstyle=\small, % print whole listing small
 basicstyle=\ttfamily, % print whole listing monospace
 keywordstyle=\bfseries, % bold keywords
 identifierstyle=, % plain identifiers
 commentstyle=\itshape, % comments in italic
 stringstyle=\ttfamily, % typewriter type for strings
 showstringspaces=false, % no special string spaces
 columns=fullflexible, % flexible columns
 xleftmargin=2em, % extra margin, left
 xrightmargin=2em, % extra margin, right
 numbers=left, % line numbers on the left
 numberfirstline=true, % first line numbered
 firstnumber=1, % always start at 1
 stepnumber=5, % number every fifth line
 numberstyle=\footnotesize\itshape, % style of line numbers
 frame=lines} % lines above and below listings
% ---------------------------------------------------------
% - End of Definitions
% ---------------------------------------------------------
\title{AliRoot Primer}
\author{Editor P.Hristov}
\date{Version v4-05-06 \\
% -----------------------------------------------------------------------------
\subsection{About this primer}
The aim of this primer is to give some basic information about the
ALICE offline framework (AliRoot) from the user's perspective. We
explain the installation procedure in detail and give examples of some
typical use cases: detector description, event generation, particle
transport, generation of ``summable digits'', event merging,
reconstruction, particle identification, and generation of event
summary data. The primer also includes some examples of analysis and a
short description of the existing analysis classes in AliRoot. An
updated version of the document can be downloaded from

For the reader interested in the AliRoot architecture and in the
performance studies done so far, a good starting point is Chapter 4 of
the ALICE Physics Performance Report\cite{PPR}. Another important
document is the ALICE Computing Technical Design Report\cite{CompTDR}.
Some information contained there has been included in the present
document, but most of the details have been omitted.
AliRoot uses the ROOT\cite{ROOT} system as a foundation on which the
framework for simulation, reconstruction and analysis is built. The
transport of the particles through the detector is carried out by the
Geant3\cite{Geant3} or FLUKA\cite{FLUKA} packages. Support for the
Geant4\cite{Geant4} transport package is coming soon.

Except for large existing libraries, such as Pythia6\cite{MC:PYTH} and
HIJING\cite{MC:HIJING}, and some remaining legacy code, this framework
is based on the Object-Oriented programming paradigm and is written in
C++.
The following packages are needed to install a fully operational
software distribution:
\item ROOT, available from \url{http://root.cern.ch}
or from the ROOT CVS repository
\item AliRoot from the ALICE offline CVS repository
\item transport packages:
\item GEANT~3, available from the ROOT CVS repository
\item the FLUKA library, which can
be obtained after registration from \url{http://www.fluka.org}
\item the GEANT~4 distribution from \url{http://cern.ch/geant4}.
Access to the Grid resources and data is provided by the
AliEn\cite{AliEn} system.

The installation details are explained in Section~\ref{Installation}.
\subsection{AliRoot framework}\label{AliRootFramework}
In HEP, a framework is a set of software tools that enables data
processing. For example, the old CERN Program Library was a toolkit to
build a framework. PAW was the first example of integration of tools
into a coherent ensemble specifically dedicated to data analysis. The
role of the framework is shown in Fig.~\ref{MC:Parab}.

 \centering
 \includegraphics[width=10cm]{picts/Parab}
 \caption{Data processing framework.} \label{MC:Parab}
The primary interactions are simulated via event generators, and the
resulting kinematic tree is then used in the transport package. An
event generator produces a set of ``particles'' with their momenta. The
set of particles, for which one maintains the production history (in
the form of mother-daughter relationships and production vertices),
forms the kinematic tree. More details can be found in the ROOT
documentation of the class \class{TParticle}. The transport package
transports the particles through the set of detectors and produces
\textbf{hits}, which in ALICE terminology means energy depositions at
given points. The hits also contain information (``track labels'')
about the particles that have generated them. In the case of the
calorimeters (PHOS and EMCAL) the hit is the energy deposition in the
whole active volume of a detecting element. In some detectors the
energy of the hit is used only for comparison with a given threshold,
for example in the TOF and the ITS pixel layers.
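Schematically, and ignoring the real AliRoot class hierarchy, a hit
can be thought of as a deposited energy at a point plus the label of
the generating particle. The toy structure below is purely
illustrative (it is \emph{not} the actual \class{AliHit} interface):

\begin{lstlisting}[language=C++]
// Toy model of a hit: NOT the real AliHit class,
// just the information a hit conceptually carries.
struct ToyHit {
  float fX, fY, fZ; // position of the energy deposition
  float fEloss;     // deposited energy
  int   fTrack;     // label of the generating MC particle
};
\end{lstlisting}

The track label is what survives the transition from hits to digits
and preserves the Monte Carlo history.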
In the next step the detector response is taken into account, and the
hits are transformed into \textbf{digits}. As explained above, the
hits are closely related to the tracks which generated them. The
transition from hits/tracks to digits/detectors is marked in the
picture as ``disintegrated response'': the tracks are
``disintegrated'' and only the labels carry the \MC information.
There are two types of digits: \textbf{summable digits}, where one
uses low thresholds and the result is additive, and \textbf{digits},
where the real thresholds are used and the result is similar to what
one would get in real data taking. In some sense the \textbf{summable
digits} are precursors of the \textbf{digits}. The noise simulation is
activated when \textbf{digits} are produced. There are two differences
between the \textbf{digits} and the \textbf{raw} data format produced
by the detector: firstly, the information about the \MC particle
generating the digit is kept as a data member of the class
\class{AliDigit}, and secondly, the raw data are stored in binary
format as ``payload'' in a ROOT structure, while the digits are stored
in ROOT classes. Two conversion chains are provided in AliRoot:
\textbf{hits} $\to$ \textbf{summable digits} $\to$ \textbf{digits},
and \textbf{hits} $\to$ \textbf{digits}. The summable digits are used
for the so-called ``event merging'', where a signal event is embedded
in a signal-free underlying event. This technique is widely used in
heavy-ion physics and makes it possible to reuse the underlying events
with a substantial saving of computing resources. Optionally it is
possible to perform the conversion \textbf{digits} $\to$ \textbf{raw
data}, which is used to estimate the expected data size, to evaluate
the high-level trigger algorithms, and to carry out the so-called
computing data challenges. The reconstruction and the HLT algorithms
can work with either \textbf{digits} or \textbf{raw data}. There is
also the possibility to convert the \textbf{raw data} between the
following formats: the format coming from the front-end electronics
(FEE) through the detector data link (DDL), the format used in the
data acquisition system (DAQ), and the ``rootified'' format. More
details are given in Section~\ref{Simulation}.
After the creation of digits, the reconstruction and analysis chain
can be activated to evaluate the software and the detector
performance, and to study particular signatures. The reconstruction
takes as input digits or raw data, real or simulated. Users can
intervene in the cycle provided by the framework to replace any part
of it with their own code or to implement their own analysis of the
data. I/O and user interfaces are part of the framework, as are data
visualization and analysis tools and all procedures that are
considered of general enough interest to be introduced into the
framework. The scope of the framework evolves with time as the needs
and understanding of the physics community evolve.
The basic principles that have guided the design of the AliRoot
framework are re-usability and modularity. There are almost as many
definitions of these concepts as there are programmers. However, for
our purpose, we adopt an operative heuristic definition that expresses
our objective to minimize the amount of unused or rewritten code and
maximize the participation of the physicists in the development of the
\textbf{Modularity} allows replacement of parts of our system with
minimal or no impact on the rest. Not every part of our system is
expected to be replaced, so we aim at modularity targeted to those
elements that we expect to change. For example, we require the ability
to change the event generator or the transport \MC without affecting
the user code. There are elements that we do not plan to interchange,
but rather to evolve in collaboration with their authors, such as the
ROOT I/O subsystem or the ROOT User Interface (UI), and therefore no
effort is made to make our framework modular with respect to these.
Whenever an element has to be modular in the sense above, we define an
abstract interface to it. The code of the different detectors is kept
independent so that the detector groups can work concurrently on the
system while minimizing interference. We understand and accept the
risk that at some point the need may arise to make modular a component
that was not designed to be. For these cases, we have elaborated a
development strategy that can handle design changes in production
code.
\textbf{Re-usability} is the protection of the investment made by the
programming physicists of ALICE. The code embodies a large amount of
scientific knowledge and experience and is thus a precious resource.
We preserve this investment by designing a modular system in the sense
above and by making sure that we maintain the maximum amount of
backward compatibility while evolving our system. This naturally
generates requirements on the underlying framework, prompting
developments such as the introduction of automatic schema evolution in
ROOT.
The \textbf{support} of the AliRoot framework is a collaborative effort
within the ALICE experiment. Questions, suggestions, topics for
discussion and messages are exchanged on the mailing list
\url{alice-off@cern.ch}. Bug reports and tasks are submitted on the
Savannah page \url{http://savannah.cern.ch/projects/aliroot/}.
\section{Installation and development tools}\label{Installation}

% -----------------------------------------------------------------------------

\subsection{Platforms and compilers}
The main development and production platform is Linux on 32-bit Intel
processors. The official Linux\cite{Linux} distribution at CERN is
Scientific Linux SLC\cite{SLC}. The code also works on
RedHat\cite{RedHat} versions 7.3, 8.0 and 9.0, Fedora
Core\cite{Fedora} 1 -- 5, and many other Linux distributions. The main
compiler on Linux is gcc\cite{gcc}: the recommended versions are gcc
3.2.3 -- 3.4.6. The older releases (2.91.66, 2.95.2, 2.96) have
problems in the FORTRAN optimization, which has to be switched off for
all the FORTRAN packages. AliRoot can be used with gcc 4.0.X, where
the FORTRAN compiler g77 is replaced by g95. The latest release series
of gcc (4.1) works with gfortran as well. As an option you can use the
Intel icc\cite{icc} compiler, which is also supported. You can
download it from \url{http://www.intel.com} and use it free of charge
for non-commercial projects. Intel also provides free of charge the
VTune\cite{VTune} profiling tool, which is one of the best available
so far.
AliRoot is supported on 64-bit Intel processors
(Itanium\cite{Itanium}) running Linux. Both the gcc and the Intel icc
compilers can be used.

On 64-bit AMD\cite{AMD} processors, such as the Opteron, AliRoot runs
successfully with the gcc compiler.
The software is also regularly compiled and run on other Unix
platforms. On Sun (SunOS 5.8) we recommend the CC compiler Sun
WorkShop 6 update 1 C++ 5.2. The WorkShop integrates nice debugging
and profiling facilities which are very useful for code development.

On the Compaq alpha server (Digital Unix V4.0) the default compiler is
cxx (Compaq C++ V6.2-024 for Digital UNIX V4.0F). Alpha also provides
its own profiling tool, pixie, which works well with shared libraries.
AliRoot also works on alpha servers running Linux, where the compiler
is gcc.

Recently AliRoot was ported to MacOS (Darwin). This OS is very
sensitive to circular dependencies in the shared libraries, which
makes it a very useful test platform.
% -----------------------------------------------------------------------------

\subsection{Essential CVS information}
CVS\cite{CVS} stands for Concurrent Versions System. It permits a
group of people to work simultaneously on groups of files (for
instance program sources). It also records the history of the files,
which allows back tracking and file versioning. The official CVS Web
page is \url{http://www.cvshome.org/}. CVS has a host of features, the
most important of which are:
\item CVS facilitates parallel and concurrent code development;
\item it provides easy support and simple access;
\item it offers the possibility to establish group permissions (for
 example, only detector experts and CVS administrators can commit
 code to a given detector module).
CVS has a rich set of commands; the most important ones are described
below. There exist several tools for visualization, logging and
control which work with CVS. More information is available in the CVS
documentation and manual\cite{CVSManual}.
Usually the development process with CVS has the following features:
\item all developers work on their \underline{own} copy of the project
 (in one of their directories);
\item they often have to \underline{synchronize} with a global
 repository, both to pick up modifications from other people and to
 commit their own changes.
Below we give an example of a typical CVS session:
 # Login to the repository. The password is stored in ~/.cvspass
 # If no cvs logout is done, the password remains there and
 # one can access the repository without a new login
 % cvs -d :pserver:hristov@alisoft.cern.ch:/soft/cvsroot login
 (Logging in to hristov@alisoft.cern.ch)
 CVS password:
 xxxxxxxx

 # Check out a local version of the TPC module
 % cvs -d :pserver:hristov@alisoft.cern.ch:/soft/cvsroot checkout TPC
 cvs server: Updating TPC
 U TPC/.rootrc
 U TPC/AliTPC.cxx
 U TPC/AliTPC.h
 ...

 # edit file AliTPC.h
 # compile and test modifications

 # Commit your changes to the repository with an appropriate comment
 % cvs commit -m "add include file xxx.h" AliTPC.h
 Checking in AliTPC.h;
 /soft/cvsroot/AliRoot/TPC/AliTPC.h,v <-- AliTPC.h
 new revision: 1.9; previous revision: 1.8
 done
Instead of specifying the repository and user name with the -d option,
one can export the environment variable CVSROOT, for example:

 % export CVSROOT=:pserver:hristov@alisoft.cern.ch:/soft/cvsroot

Once the local version has been checked out, inside the directory tree
CVSROOT is not needed anymore. The name of the actual repository can
be found in the CVS/Root file. This name can be overridden again using
the -d option.
In case somebody else has committed changes to the AliTPC.h file, the
developer has to update the local version, merging in his own changes,
before committing them:
 % cvs commit -m "add include file xxx.h" AliTPC.h
 cvs server: Up-to-date check failed for `AliTPC.h'
 cvs [server aborted]: correct above errors first!

 % cvs update
 cvs server: Updating .
 RCS file: /soft/cvsroot/AliRoot/TPC/AliTPC.h,v
 retrieving revision 1.9
 retrieving revision 1.10
 Merging differences between 1.9 and 1.10 into AliTPC.h

 M AliTPC.h
 # edit, compile and test modifications

 % cvs commit -m "add include file xxx.h" AliTPC.h
 Checking in AliTPC.h;
 /soft/cvsroot/AliRoot/TPC/AliTPC.h,v <-- AliTPC.h
 new revision: 1.11; previous revision: 1.10
 done
\textbf{Important note:} CVS performs a purely mechanical merge, and
it is the developer's responsibility to verify the result of this
operation. This is especially true in the case of conflicts, when the
CVS tool is not able to merge the local and remote modifications
consistently.
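In the case of a conflict, CVS writes both versions into the file
between conflict markers, and the developer has to edit the file by
hand before committing. The fragment below is an illustrative sketch
(the member names and the revision number are hypothetical):

\begin{lstlisting}
 <<<<<<< AliTPC.h
 Int_t fNumberOfRows; // local, uncommitted change
 =======
 Int_t fNRows;        // change already committed by somebody else
 >>>>>>> 1.10
\end{lstlisting}

Everything between \texttt{<<<<<<<} and \texttt{=======} is the local
version; everything between \texttt{=======} and \texttt{>>>>>>>} is
the repository version.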
\subsection{Main CVS commands}

In the following examples we suppose that the CVSROOT environment
variable is set, as shown above. In case a local version has already
been checked out, the CVS repository is defined automatically inside
the directory tree.
\item\textbf{login} stores the password in .cvspass. It is enough to
 log in once to the repository.

\item\textbf{checkout} retrieves the source files of AliRoot version v4-04-Rev-08:
 \begin{lstlisting}[language=sh]
 % cvs co -r v4-04-Rev-08 AliRoot
 \end{lstlisting}

\item\textbf{update} retrieves modifications from the repository and
 merges them with the local ones. The -q option reduces the verbose
 output, and -z9 sets the compression level during the data
 transfer. The option -A removes all the ``sticky'' tags, -d creates
 in the local distribution any directories that are new in the
 repository, and -P removes (``prunes'') local directories that have
 become empty. In this way the local distribution will be updated to
 the latest code from the main development branch.
 \begin{lstlisting}[language=sh]
 % cvs -qz9 update -AdP STEER
 \end{lstlisting}
\item\textbf{diff} shows the differences between the local and
 repository versions of the whole module STEER:
 \begin{lstlisting}[language=sh]
 % cvs -qz9 diff STEER
 \end{lstlisting}

\item \textbf{add} adds files or directories to the repository. The
 actual transfer is done when the commit command is invoked:
 \begin{lstlisting}[language=sh]
 % cvs -qz9 add AliTPCseed.*
 \end{lstlisting}

\item\textbf{remove} removes old files or directories from the
 repository. The -f option forces the removal of the local files. In
 the example below the whole module CASTOR is scheduled for removal:
 \begin{lstlisting}[language=sh]
 % cvs remove -f CASTOR
 \end{lstlisting}
\item\textbf{commit} checks the local modifications into the
 repository and increments the versions of the files. In the example
 below all the changes made in the different files of the module
 STEER will be committed to the repository. The -m option is followed
 by the log message; if you don't provide it, you will be prompted by
 an editor window. No commit is possible without a log message
 explaining what was done:
 \begin{lstlisting}[language=sh]
 % cvs -qz9 commit -m "Coding convention" STEER
 \end{lstlisting}

\item\textbf{tag} creates new tags and/or branches (with the -b option):
 \begin{lstlisting}[language=sh]
 % cvs tag -b v4-05-Release .
 \end{lstlisting}

\item\textbf{status} returns the actual status of a file: revision,
 sticky tag, dates, options, and local modifications:
 \begin{lstlisting}[language=sh]
 % cvs status Makefile
 \end{lstlisting}

\item\textbf{logout} removes the password stored in
 \$HOME/.cvspass. It is not really necessary unless the user wants to
 remove the password from that account.
% -----------------------------------------------------------------------------

\subsection{Environment variables}
Before the installation of AliRoot the user has to set some
environment variables. In the following examples the user is working
on Linux and the default shell is bash. It is enough to add a few
lines to the .bash\_profile file, as shown below:

 # ROOT
 export ROOTSYS=/home/mydir/root
 export PATH=$PATH\:$ROOTSYS/bin

 # AliRoot
 export ALICE=/home/mydir/alice
 export ALICE_ROOT=$ALICE/AliRoot
 export ALICE_TARGET=`root-config --arch`
 export PATH=$PATH\:$ALICE_ROOT/bin/tgt_${ALICE_TARGET}

 # Geant3
 export PLATFORM=`root-config --arch` # Optional, defined otherwise in Geant3 Makefile
 export

 # FLUKA
 export FLUPRO=$ALICE/fluka # $FLUPRO is used in TFluka
 export PATH=$PATH\:$FLUPRO/flutil

 # Geant4: see the details later
where ``/home/mydir'' has to be replaced with the actual directory
path. The meaning of the environment variables is the following:

\texttt{ROOTSYS} -- the place where the ROOT package is located;

\texttt{ALICE} -- top directory for all the software packages used in ALICE;

\texttt{ALICE\_ROOT} -- the place where the AliRoot package is located,
usually as a subdirectory of ALICE;

\texttt{ALICE\_TARGET} -- specific platform name. Up to release
v4-01-Release this variable was set to the result of the ``uname''
command. Starting from AliRoot v4-02-05 the ROOT naming scheme was
adopted, and the user has to use the ``root-config --arch'' command.

\texttt{PLATFORM} -- the same as ALICE\_TARGET, but for the GEANT~3
package. Until GEANT~3 v1-0 the user had to use `uname` to specify the
platform. From version v1-0 on, the ROOT platform is used instead
(``root-config --arch''). This environment variable is set by default
in the Geant3 Makefile.
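A quick way to check that the derived variables are consistent is to
print them in a fresh shell. The snippet below reuses the hypothetical
``/home/mydir'' paths from the example above:

\begin{lstlisting}[language=sh]
 # hypothetical paths, as in the example above
 export ALICE=/home/mydir/alice
 export ALICE_ROOT=$ALICE/AliRoot
 # the derived variable should expand to the AliRoot location
 echo $ALICE_ROOT
\end{lstlisting}

If the printed path does not point at the AliRoot checkout, the
.bash\_profile lines above were not sourced correctly.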
% -----------------------------------------------------------------------------

\subsection{Software packages}
The installation of AliEn should be done first if you plan to access
the Grid or need Grid-enabled ROOT. You can download the AliEn
installer and use it in the following way:
 \begin{lstlisting}[language=sh, title={AliEn installation}]
 % wget http://alien.cern.ch/alien-installer
 % chmod +x alien-installer
 % ./alien-installer
 \end{lstlisting}
The alien-installer runs a dialog which prompts for the default
selection and options. The default installation place for AliEn is
/opt/alien, and the typical packages one has to install are ``client''
and ``gshell''.
All ALICE offline software is based on ROOT\cite{ROOT}. The ROOT
framework offers a number of important elements which are exploited in
\item a complete data analysis framework including all the PAW
 features;
\item an advanced Graphical User Interface (GUI) toolkit;
\item a large set of utility functions, including several commonly
 used mathematical functions, random number generators,
 multi-parametric fit and minimization procedures;
\item a complete set of object containers;
\item integrated I/O with class schema evolution;
\item C++ as a scripting language;
\item documentation tools.
There is a nice ROOT user's guide which contains important and
detailed information. For those who are not familiar with ROOT, a good
starting point is the ROOT Web page at \url{http://root.cern.ch}.
There the experienced users may easily find the latest version of the
class descriptions and search for useful information.
The recommended way to install ROOT is from the CVS sources, as shown
below:
\item Log in to the ROOT CVS repository if you haven't done it yet:
 \begin{lstlisting}[language=sh]
 % cvs -d :pserver:cvs@root.cern.ch:/user/cvs login
 % CVS password: cvs
 \end{lstlisting}

\item Download (check out) the needed ROOT version (v5-13-04 in the example):
 \begin{lstlisting}[language=sh]
 % cvs -d :pserver:cvs@root.cern.ch:/user/cvs co -r v5-13-04 root
 \end{lstlisting}
 The appropriate combinations of ROOT, Geant3 and AliRoot versions
 can be found at
 \url{http://aliceinfo.cern.ch/Offline/AliRoot/Releases.html}

\item The code is stored in the directory ``root''. You have to go
 there, set the ROOTSYS environment variable (if this has not been
 done in advance), and configure ROOT. ROOTSYS contains the full path
 to the ROOT directory.

 \lstinputlisting[language=sh, title={Root configuration}]{scripts/confroot}

\item Now you can compile and test ROOT:
 \lstinputlisting[language=sh,title={Compiling and testing ROOT}]{scripts/makeroot}

At this point the user should have a working ROOT version on Linux
(32-bit Pentium processor with the gcc compiler). The list of
supported platforms can be obtained with the ``./configure --help''
command.
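The scripts/confroot and scripts/makeroot files referenced above are
part of the primer distribution and are not reproduced here. For
orientation only, a typical configuration and build sequence for a
ROOT~5 release on 32-bit Linux looks roughly as follows (the exact
configure options depend on the installation and are an assumption
here):

\begin{lstlisting}[language=sh]
 % cd root
 % export ROOTSYS=`pwd`
 # "linux" selects the 32-bit Linux/gcc architecture in ROOT 5
 % ./configure linux
 % make
\end{lstlisting}

Refer to the actual scripts for the options used in the ALICE
environment.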
The installation of GEANT~3 is needed since for the moment this is the
default particle transport package. A GEANT~3 description is available
at
You can download the GEANT~3 distribution from the ROOT CVS repository
and compile it in the following way:

\lstinputlisting[language=sh,title={Make GEANT3}]{scripts/makeg3}

Please note that GEANT~3 is downloaded into the \$ALICE directory.
Another important feature is the PLATFORM environment variable. If it
is not set, the Geant3 Makefile sets it to the result of `root-config
To use GEANT~4\cite{Geant4}, some additional software has to be
installed. GEANT~4 needs the CLHEP\cite{CLHEP} package; the user can
get the tar file (hereafter ``tarball'') from
 Then the installation can be done in the following way:

\lstinputlisting[language=sh, title={Make CLHEP}]{scripts/makeclhep}

Another possibility is to use the CLHEP CVS repository:

\lstinputlisting[language=sh, title={Make CLHEP from CVS}]{scripts/makeclhepcvs}

Now the following lines should be added to the .bash\_profile

The next step is to install GEANT~4. The GEANT~4 distribution is
available from \url{http://geant4.web.cern.ch/geant4/}. Typically the
following files will be downloaded (the current versions may differ
from the ones below):
\item geant4.8.1.p02.tar.gz: source tarball
\item G4NDL.3.9.tar.gz: G4NDL version 3.9 neutron data files with thermal cross sections
\item G4EMLOW4.0.tar.gz: data files for low-energy electromagnetic processes, version 4.0
\item PhotonEvaporation.2.0.tar.gz: data files for photon evaporation, version 2.0
\item RadiativeDecay.3.0.tar.gz: data files for radioactive decay hadronic processes, version 3.0
\item G4ELASTIC.1.1.tar.gz: data files for high-energy elastic scattering processes, version 1.1

Then the following steps have to be executed:

\lstinputlisting[language=sh, title={Make GEANT4}]{scripts/makeg4}

The env.sh script can be sourced from \texttt{\~{}/.bash\_profile} to
have the GEANT~4 environment variables initialized automatically.
The installation of FLUKA\cite{FLUKA} consists of the following steps:

\item register as a FLUKA user at \url{http://www.fluka.org} if you
 haven't done so yet. You will receive your ``fuid'' number and will
 set your password;

\item download the latest FLUKA version from
 \url{http://www.fluka.org}. Use your ``fuid'' registration and
 password when prompted. You will obtain a tarball containing the
 FLUKA libraries, for example fluka2006.3-linuxAA.tar.gz;

\item install the libraries;

 \lstinputlisting[language=sh, title={install FLUKA}]{scripts/makefluka}

\item compile TFluka;

 \begin{lstlisting}[language=sh]
 % cd $ALICE_ROOT
 % make all-TFluka
 \end{lstlisting}

\item run AliRoot using FLUKA;

 \begin{lstlisting}[language=sh]
 % cd $ALICE_ROOT/TFluka/scripts
 % ./runflukageo.sh
 \end{lstlisting}

 This script creates the directory tmp with all the necessary links
 for data and configuration files inside, and starts aliroot. For the
 next run it is not necessary to run the script again. The tmp
 directory can be kept or renamed. The user should run aliroot from
 inside this directory.

\item from the AliRoot prompt start the simulation;

 \begin{lstlisting}[language=C++]
 root [0] AliSimulation sim;
 root [1] sim.Run();
 \end{lstlisting}

 You will get the results of the simulation in the tmp directory.

\item reconstruct the simulated event;

 \begin{lstlisting}[language=sh]
 % cd tmp
 % aliroot
 \end{lstlisting}

 and from the AliRoot prompt

 \begin{lstlisting}[language=C++]
 root [0] AliReconstruction rec;
 root [1] rec.Run();
 \end{lstlisting}

\item report any problems you encounter to the offline list \url{alice-off@cern.ch}.
The AliRoot distribution is taken from the CVS repository and then
 % cd $ALICE
 % cvs -qz2 -d :pserver:cvs@alisoft.cern.ch:/soft/cvsroot co AliRoot
 % cd $ALICE_ROOT
 % make

The AliRoot code (the above example retrieves the HEAD version from
CVS) is contained in the ALICE\_ROOT directory. ALICE\_TARGET is
defined automatically in the \texttt{.bash\_profile} via the call to
`root-config --arch`.
While developing code or running an ALICE program, the user may be
confronted with the following execution errors:
\item floating point exceptions: division by zero, sqrt of a negative
 argument, assignment of NaN, etc.;
\item segmentation violations/faults: attempt to access a memory
 location that the program is not allowed to access, or in a way
 which is not allowed;
\item bus error: attempt to access memory that the computer cannot
 address.
In these cases, the user will have to debug the program to determine
the source of the problem and fix it. There are several debugging
techniques, which are briefly listed below:
\item using \texttt{printf(...)}, \texttt{std::cout}, \texttt{assert(...)}, and
 \texttt{AliDebug}.
 \begin{itemize}
 \item often this is the only easy way to find the origin of the
 problem;
 \item \texttt{assert(...)} aborts the program execution if the
 argument is FALSE. It is a macro from \texttt{cassert}, and it can
 be deactivated by compiling with -DNDEBUG.
 \end{itemize}
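 For illustration, a minimal use of \texttt{assert} (a generic
 example, not ALICE code):

 \begin{lstlisting}[language=C++]
 #include <cassert>
 #include <cmath>

 double safeSqrt(double x)
 {
   // aborts here if x is negative, unless compiled with -DNDEBUG
   assert(x >= 0.);
   return sqrt(x);
 }
 \end{lstlisting}

 Compiling the same file with -DNDEBUG removes the check entirely, so
 assertions cost nothing in production builds.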
822\item using gdb
823 \begin{itemize}
824 \item gdb needs compilation with -g option. Sometimes -O2 -g
825 prevents from exact tracing, so it is save to use compilation with
826 -O0 -g for debugging purposes;
827 \item One can use it directly (gdb aliroot) or attach it to a
828 process (gdb aliroot 12345 where 12345 is the process id).
829 \end{itemize}
Below we list the main gdb commands and their descriptions:
\begin{itemize}
\item \textbf{run} starts the execution of the program;
\item \textbf{Control-C} stops the execution and switches to the gdb shell;
\item \textbf{where <n>} prints the program stack. Sometimes the program
  stack is very long; the user can get the last n frames by specifying
  n as a parameter to where;
\item \textbf{print} prints the value of a variable or expression;
\begin{lstlisting}[language=sh]
 (gdb) print *this
\end{lstlisting}
\item \textbf{up} and \textbf{down} are used to navigate in the program stack;
\item \textbf{quit} exits the gdb session;
\item \textbf{break} sets a breakpoint;
\begin{lstlisting}[language=C++]
 (gdb) break AliLoader.cxx:100
 (gdb) break 'AliLoader::AliLoader()'
\end{lstlisting}
  Automatic completion of class methods via the Tab key is available
  if an opening quote (`) is put in front of the class name.
\item \textbf{cont} continues the run;
\item \textbf{watch} sets a watchpoint (execution becomes very slow). The
  example below shows how to check each change of \texttt{fData};
\begin{lstlisting}[language=C++]
 (gdb) watch *fData
\end{lstlisting}
\item \textbf{list} shows the source code;
\item \textbf{help} shows the description of commands.
\end{itemize}
Profiling is used to discover where the program spends most of the
time, and to optimize the algorithms. Several profiling tools are
available on different platforms:
\begin{itemize}
\item Linux tools:\\
  gprof: requires compilation with the \texttt{-pg} option and static libraries;\\
  oprofile: uses a kernel module;\\
  VTune: instruments shared libraries.
\item Sun: Sun Workshop (Forte agent). It requires compilation with the
  profiling option (\texttt{-pg}).
\item Compaq Alpha: pixie profiler. Instruments shared libraries for profiling.
\end{itemize}
On Linux, AliRoot can be built with static libraries using the special
target ``profile'':
\begin{lstlisting}[language=sh]
 % make profile
 # change LD_LIBRARY_PATH to replace lib/tgt_linux with lib/tgt_linuxPROF
 # change PATH to replace bin/tgt_linux with bin/tgt_linuxPROF
 % aliroot
 root [0] gAlice->Run()
 root [1] .q
\end{lstlisting}
At the end of the aliroot session a file called gmon.out is created.
It contains the profiling information, which can be inspected with gprof:
\begin{lstlisting}[language=sh]
 % gprof `which aliroot` | tee gprof.txt
 % more gprof.txt
\end{lstlisting}
\textbf{VTune profiling tool}

VTune is available from the Intel Web site
\url{http://www.intel.com/software/products/index.htm}. It is free for
non-commercial use on Linux. It provides both call-graph and sampling
profiling. VTune instruments shared libraries, and needs only the
\texttt{-g} option during compilation. Here is an example of
call-graph profiling:
\begin{lstlisting}[language=sh]
 # Register an activity
 % vtl activity sim -c callgraph -app aliroot,'' -b -q sim.C'' -moi aliroot
 % vtl run sim
 % vtl show
 % vtl view sim::r1 -gui
\end{lstlisting}
\subsection{Detection of run-time errors}

The Valgrind tool can be used for the detection of run-time errors on
Linux. It is available from \url{http://www.valgrind.org}. Valgrind
is equipped with the following set of tools:
\begin{itemize}
\item memcheck: detector of memory-management problems;
\item addrcheck: lightweight memory checker;
\item cachegrind: cache profiler;
\item massif: heap profiler;
\item helgrind: thread debugger;
\item callgrind: extended version of cachegrind.
\end{itemize}
The most important tool is memcheck. It can detect:
\begin{itemize}
\item use of uninitialized memory;
\item reading/writing memory after it has been freed;
\item reading/writing off the end of malloc'd blocks;
\item reading/writing inappropriate areas on the stack;
\item memory leaks -- where pointers to malloc'd blocks are lost forever;
\item mismatched use of malloc/new/new[] vs.\ free/delete/delete[];
\item overlapping source and destination pointers in memcpy() and
  related functions;
\item some misuses of the POSIX pthreads API.
\end{itemize}
Here is an example of Valgrind usage:
\begin{lstlisting}[language=sh]
 % valgrind --tool=addrcheck --error-limit=no aliroot -b -q sim.C
\end{lstlisting}
%\textbf{ROOT memory checker}

% The ROOT memory checker provides tests of memory leaks and other
% problems related to new/delete. It is fast and easy to use. Here is
% the recipe:
% \begin{itemize}
% \item link aliroot with -lNew. The user has to add `\-\-new' before
% `\-\-glibs' in the ROOTCLIBS variable of the Makefile;
% \item add Root.MemCheck: 1 in .rootrc
% \item run the program: aliroot -b -q sim.C
% \item run memprobe -e aliroot
% \item Inspect the files with .info extension that have been generated.
% \end{itemize}
\subsection{Useful information about LSF and CASTOR}

\textbf{The information in this section is included for completeness:
  users are strongly advised to rely on the Grid tools for massive
  productions and data access.}

LSF is the batch system at CERN. Every user is allowed to submit jobs
to the different queues. Usually the user has to copy some input files
(macros, data, executables, libraries) from a local computer or from
the mass-storage system to the worker node on lxbatch, then execute
the program, and finally store the results on the local computer or in
the mass-storage system. The methods explained in this section are
suitable if the user doesn't have direct access to a shared directory,
for example on AFS. The main steps and commands are described below.
In order to have access to the local desktop and to be able to use scp
without a password, the user has to create a pair of SSH keys. Currently
lxplus/lxbatch uses RSA1 cryptography. After logging into lxplus, the
following has to be done:
\begin{lstlisting}[language=sh]
 % ssh-keygen -t rsa1
 # Use empty password
 % cp .ssh/identity.pub public/authorized_keys
 % ln -s ../public/authorized_keys .ssh/authorized_keys
\end{lstlisting}
A list of useful LSF commands is given below:
\begin{itemize}
\item \textbf{bqueues} shows the available queues and their status;
\item \textbf{bsub -q 8nm job.sh} submits the shell script job.sh to
  the queue 8nm, where the name of the queue indicates the
  ``normalized CPU time'' (maximal job duration of 8 min of normalized CPU time);
\item \textbf{bjobs} lists all unfinished jobs of the user;
\item \textbf{lsrun -m lxbXXXX xterm} returns an xterm running on the
  batch node lxbXXXX. This makes it possible to inspect the job output
  and to debug a batch job.
\end{itemize}
Each batch job stores its output in the directory LSFJOB\_XXXXXX, where
XXXXXX is the job id. Since the home directory is on AFS, the user has
to redirect the verbose output, otherwise the AFS quota might be
exceeded and the jobs will fail.
The CERN mass-storage system is CASTOR2~\cite{CASTOR2}. Every user has
his/her own CASTOR2 space, for example /castor/cern.ch/user/p/phristov.
The CASTOR2 commands start with the prefix ``ns'' or ``rf''. Here is a
very short list of useful commands:
\begin{itemize}
\item \textbf{nsls /castor/cern.ch/user/p/phristov} lists the CASTOR
  space of user phristov;
\item \textbf{rfdir /castor/cern.ch/user/p/phristov} the same as
  above, but the output is in long format;
\item \textbf{nsmkdir test} creates a new directory (test) in the
  CASTOR space of the user;
\item \textbf{rfcp /castor/cern.ch/user/p/phristov/test/galice.root .}
  copies the file from CASTOR to the local directory. If the file is
  on tape, this triggers the stage-in procedure, which might take
  some time;
\item \textbf{rfcp AliESDs.root /castor/cern.ch/user/p/phristov/test}
  copies the local file AliESDs.root to CASTOR in the subdirectory
  test and schedules it for migration to tape.
\end{itemize}
The user also has to be aware that the behavior of CASTOR depends on
the environment variables RFIO\_USE\_CASTOR\_V2(=YES),
STAGE\_HOST(=castoralice) and STAGE\_SVCCLASS(=default). They are set
by default to the values for the group (z2 in the case of ALICE).
Below the user can find an example of a job where the simulation and
reconstruction are run using the corresponding macros sim.C and rec.C.
An example of such macros is given later.
\lstinputlisting[language=sh,title={LSF example job}]{scripts/lsfjob}

\section{Simulation} \label{Simulation}

% -----------------------------------------------------------------------------
Heavy-ion collisions produce a very large number of particles in the
final state. This is a challenge for the reconstruction and analysis
algorithms. The detector design and the development of these
algorithms require a predictive and precise simulation of the detector
response. Model predictions, discussed in the first volume of the
ALICE Physics Performance Report, for the charged multiplicity in
\mbox{Pb--Pb} collisions at the LHC vary from 1400 to 8000 particles
in the central unit of rapidity. The experiment was designed when the
highest nucleon--nucleon center-of-mass energy available for heavy-ion
interactions was $20\,{\rm GeV}$ per nucleon--nucleon pair, at the
CERN SPS, i.e.\ a factor of about 300 less than the LHC energy. More
recently, the RHIC collider came online; its top energy of
$200\,{\rm GeV}$ per nucleon--nucleon pair is still 30 times less
than the LHC energy. The RHIC data seem to suggest that the LHC
multiplicity will be on the lower side of the interval. However, the
extrapolation is so large that both the hardware and the software of
ALICE have to be designed for the highest multiplicity. Moreover, as
the predictions of different generators of heavy-ion collisions differ
substantially at LHC energies, we have to use several of them and
compare the results.
The simulation of the processes involved in the transport through the
detector of the particles emerging from the interaction is confronted
with several problems:

\begin{itemize}
\item existing event generators give different answers for parameters
  such as expected multiplicities, $p_T$ dependence and rapidity
  dependence at LHC energies;

\item most of the physics signals, like hyperon production, high-$p_T$
  phenomena, open charm and beauty, quarkonia, etc., are not exactly
  reproduced by the existing event generators;

\item the simulation of small cross-sections would demand prohibitively
  large computing resources to produce a number of events comparable
  with the expected number of detected events in the experiment;

\item the existing generators do not provide event characteristics such
  as momentum correlations, azimuthal flow, etc.
\end{itemize}
Nevertheless, to allow for efficient simulations, we have adopted a
framework that provides a number of options:

\begin{itemize}
\item the simulation framework provides an interface to external
  generators, like HIJING~\cite{MC:HIJING} and
  DPMJET~\cite{MC:DPMJET};

\item a parameterized, signal-free, underlying event where the
  produced multiplicity can be specified as an input parameter is
  provided;

\item rare signals can be generated using the interface to external
  generators like PYTHIA or simple parameterizations of transverse
  momentum and rapidity spectra defined in function libraries;

\item the framework provides a tool to assemble events from
  different signal generators (event cocktails);

\item the framework provides tools to combine underlying events and
  signal events at the primary-particle level (cocktail) and at the
  summable-digit level (merging);

\item ``afterburners'' are used to introduce particle correlations in a
  controlled way. An afterburner is a program which changes the
  momenta of the particles produced by another generator, and thus
  modifies the multi-particle momentum distributions as desired.
\end{itemize}
The implementation of this strategy is described below. The results of
different \MC generators for heavy-ion collisions are
described in Section~\ref{MC:Generators}.

\subsection{Simulation framework}

The simulation framework covers the simulation of primary collisions
and the generation of the emerging particles, the transport of particles
through the detector, the simulation of energy depositions (hits) in
the detector components, their response in the form of so-called summable
digits, the generation of digits from summable digits with the
optional merging of underlying events, and the creation of raw data.
The \class{AliSimulation} class provides a simple user interface to
the simulation framework. This section focuses on the simulation
framework from the (detector) software developer's point of view.
  \centering
  \includegraphics[width=10cm]{picts/SimulationFramework}
  \caption{Simulation framework.} \label{MC:Simulation}
\textbf{Generation of Particles}

Different generators can be used to produce the particles emerging from
the collision. The class \class{AliGenerator} is the base class
defining the virtual interface to the generator programs. The
generators are described in more detail in the ALICE PPR Volume 1 and
in the next chapter.
\textbf{Virtual Monte Carlo}

The simulation of particles traversing the detector components is
performed by a class derived from \class{TVirtualMC}. The Virtual
Monte Carlo also provides an interface to construct the geometry of
the detectors. The task of the geometry description is done by the
geometrical modeler \class{TGeo}. The concrete implementation of the
virtual Monte Carlo application \class{TVirtualMCApplication} is
\class{AliMC}. The Monte Carlo transport codes used in ALICE are
GEANT~3.21, GEANT~4 and FLUKA. More information can be found on the
VMC Web page:

As explained above, our strategy was to develop a virtual interface to
the detector-simulation code. We call the interface to the transport
code the Virtual Monte Carlo. It is implemented via C++ virtual classes
and is schematically shown in Fig.~\ref{MC:vmc}. The codes that
implement the abstract classes are real C++ programs or wrapper
classes that interface to FORTRAN programs.
  \centering
  \includegraphics[width=10cm]{picts/vmc}
  \caption{Virtual \MC.} \label{MC:vmc}
Thanks to the Virtual Monte Carlo we have converted all FORTRAN user
code developed for GEANT~3 into C++, including the geometry definition
and the user scoring routines (\texttt{StepManager}). These have been
integrated in the detector classes of the AliRoot framework. The
output of the simulation is saved directly with ROOT I/O, simplifying
the development of the digitization and reconstruction code in C++.
\textbf{Modules and Detectors}

Each module of the ALICE detector is described by a class derived from
\class{AliModule}. Classes for active modules (= detectors) are not
derived directly from \class{AliModule} but from its subclass
\class{AliDetector}. These base classes define the interface to the
simulation framework via a set of virtual methods.
\textbf{Configuration File (Config.C)}

The configuration file is a C++ macro that is processed before the
simulation starts. It creates and configures the Monte Carlo object,
the generator object, the magnetic field map and the detector modules.
A detailed description is given below.
\textbf{Detector Geometry}

The virtual Monte Carlo application creates and initializes the
geometry of the detector modules by calling the virtual functions
\method{CreateMaterials}, \method{CreateGeometry}, \method{Init} and
\textbf{Vertexes and Particles}

In case the simulated event is intended to be merged with an
underlying event, the primary vertex is taken from the file containing
the underlying event by using the vertex generator
\class{AliVertexGenFile}. Otherwise the primary vertex is generated
according to the generator settings. Then the particles emerging from
the collision are generated and put on the stack (an instance of
\class{AliStack}). The transport of particles through the detector is
performed by the Monte Carlo object. The decay of particles is usually
handled by the external decayer \class{AliDecayerPythia}.
\textbf{Hits and Track References}

The Monte Carlo simulates the transport of a particle step by step.
After each step the virtual method \method{StepManager} of the module
in which the particle is currently located is called. In this step-manager
method, the hits in the detector are created by calling
\method{AddHit}. Optionally, track references (location and
momentum of simulated particles at selected places) can also be created by
calling \method{AddTrackReference}. \method{AddHit} has to be
implemented by each detector, whereas \method{AddTrackReference} is
already implemented in \class{AliModule}. The container and the branch for the
hits -- and for the (summable) digits -- are managed by the detector
class via a set of so-called loaders. The relevant data members and
methods are fHits, fDigits, \method{ResetHits}, \method{ResetSDigits},
\method{ResetDigits}, \method{MakeBranch} and \method{SetTreeAddress}.

For each detector, methods like \method{PreTrack}, \method{PostTrack},
\method{FinishPrimary}, \method{FinishEvent} and \method{FinishRun}
are called during the simulation when the conditions indicated by the
method names are fulfilled.
\textbf{Summable Digits}

Summable digits are created by calling the virtual method
\method{Hits2SDigits} of a detector. This method loops over all
events, creates the summable digits from the hits and stores them in
the sdigits file(s).
\textbf{Digitization and Merging}

Dedicated classes derived from \class{AliDigitizer} are used for the
conversion of summable digits into digits. Since \class{AliDigitizer}
is a \class{TTask}, this conversion is done for
the current event by the \method{Exec} method. Inside this method the summable
digits of all input streams have to be added, combined with noise,
converted to digital values taking into account possible thresholds,
and stored in the digits container.

The input streams (more than one in case of merging) as well as the
output stream are managed by an object of type \class{AliRunDigitizer}. The
methods \method{GetNinputs}, \method{GetInputFolderName} and
\method{GetOutputFolderName} return the relevant information. The run
digitizer is accessible inside the digitizer via the protected data
member fManager. If the flag fRegionOfInterest is set, only the
detector parts where summable digits from the signal event are present
should be digitized. When \MC labels are assigned to digits, the
stream-dependent offset given by the method \method{GetMask} is added
to the label of the summable digit.
The detector-specific digitizer object is created in the virtual
method \method{CreateDigitizer} of the concrete detector class. The run
digitizer object is used to construct the detector
digitizer. The \method{Init} method of each digitizer is called before the loop
over the events starts.

A direct conversion from hits to digits can be implemented in
the method \method{Hits2Digits} of a detector. The loop over the events is
inside the method. Of course, merging is not supported in this case.
An example of a simulation script that can be used for the simulation of
proton--proton collisions is given below:

\begin{lstlisting}[language=C++, title={Simulation run}]
 void sim(Int_t nev=100) {
   AliSimulation simulator;
   // Measure the total time spent in the simulation
   TStopwatch timer;
   timer.Start();
   // List of detectors, where both summable digits and digits are provided
   simulator.SetMakeSDigits("TRD TOF PHOS EMCAL HMPID MUON ZDC PMD FMD T0 VZERO");
   // Direct conversion of hits to digits for faster processing (ITS TPC)
   simulator.SetMakeDigitsFromHits("ITS TPC");
   simulator.Run(nev);
   timer.Stop();
   timer.Print();
 }
\end{lstlisting}
The following example shows how one can do event merging:

\begin{lstlisting}[language=C++, title={Event merging}]
 void sim(Int_t nev=6) {
   AliSimulation simulator;
   // The underlying events are stored in a separate directory.
   // Three signal events will be merged in turn with each
   // underlying event
   simulator.MergeWith("../backgr/galice.root",3);
   simulator.Run(nev);
 }
\end{lstlisting}
\textbf{Raw Data}

The digits stored in ROOT containers can be converted into the DATE~\cite{DATE}
format that will be the ``payload'' of the ROOT classes containing the
raw data. This is done for the current event in the method
\method{Digits2Raw} of the detector.
The simulation of raw data is managed by the class \class{AliSimulation}. To
create the raw-data DDL files it loops over all events. For each event it
creates a directory, changes to this directory and calls the method
\method{Digits2Raw} of each selected detector. In the \method{Digits2Raw}
method the DDL files of a detector are created from the digits for the
current event.
For the conversion of the DDL files to a DATE file the
\class{AliSimulation} class uses the tool dateStream. To create a
raw-data file in ROOT format with the DATE output as payload, the
program alimdc is used.
The only part that has to be implemented in each detector is
the \method{Digits2Raw} method. In this method one file per
DDL has to be created, obeying the conventions for file names and DDL
IDs. Each file is a binary file with a DDL data header at the
beginning. The DDL data header is implemented in the structure
\class{AliRawDataHeader}. The data member fSize should be set to the total
size of the DDL raw data including the size of the header. The
attribute bit 0 should be set by calling the method \method{SetAttribute(0)} to
indicate that the data in this file is valid. The attribute bit 1 can
be set to indicate compressed raw data.
The detector-specific raw data are stored in the DDL files after the
DDL data header. The format of these raw data should be as close as
possible to the one that will be delivered by the detector. This
includes the order in which the channels will be read out.
Below we show an example of raw-data creation for all the detectors:

\begin{lstlisting}[language=C++]
 void sim(Int_t nev=1) {
   AliSimulation simulator;
   // Create raw data for ALL detectors, rootify it and store it in the
   // file raw.root. Do not delete the intermediate files
   simulator.SetWriteRawData("ALL","raw.root",kFALSE);
   simulator.Run(nev);
 }
\end{lstlisting}
\subsection{Configuration: example of Config.C}

The example below contains as comments the most important information:

\lstinputlisting[language=C++]{scripts/Config.C}

% -----------------------------------------------------------------------------
\subsection{Event generation}

  \centering
  \includegraphics[width=10cm]{picts/aligen}
  \caption{\texttt{AliGenerator} is the base class, which has the
    responsibility to generate the primary particles of an event. Some
    realizations of this class do not generate the particles themselves
    but delegate the task to an external generator like PYTHIA through the
    \texttt{TGenerator} interface. }
  \label{MC:aligen}
\subsubsection{Parameterized generation}

Event generation based on parameterization can be used to produce
signal-free final states. It avoids the dependence on a
specific model, and is efficient and flexible. It can be used to
study the track-reconstruction efficiency
as a function of the initial multiplicity and occupancy.

\class{AliGenHIJINGparam}~\cite{MC:HIJINGparam} is an example of an internal
AliRoot generator based on parameterized
pseudorapidity density and transverse-momentum distributions of
charged and neutral pions and kaons. The pseudorapidity
distribution was obtained from a HIJING simulation of central
Pb--Pb collisions and scaled to a charged-particle multiplicity of
8000 in the pseudorapidity interval $|\eta | < 0.5$. Note that
this is about 10\% higher than the corresponding value for a
rapidity density with an average ${\rm d}N/{\rm d}y$ of 8000 in
the interval $|y | < 0.5$.
The transverse-momentum distribution is parameterized from the
measured CDF pion $p_T$ distribution at $\sqrt{s} = 1.8\,{\rm TeV}$.
The corresponding kaon $p_T$ distribution was obtained from the
pion distribution by $m_T$ scaling. See Ref.~\cite{MC:HIJINGparam}
for the details of these parameterizations.
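The $m_T$-scaling prescription can be summarized as follows (a standard formulation, shown here for orientation; see Ref.~\cite{MC:HIJINGparam} for the exact form used): the kaon spectrum is the pion parameterization evaluated at equal transverse mass.

```latex
% m_T scaling: spectra of different species coincide as a function of
% the transverse mass m_T
\begin{equation}
  \left.\frac{{\rm d}N_K}{{\rm d}p_T^2}\right|_{p_T^2}
  \propto
  \left.\frac{{\rm d}N_\pi}{{\rm d}p_T^2}\right|_{p_T^2 + m_K^2 - m_\pi^2},
  \qquad m_T = \sqrt{p_T^2 + m^2} .
\end{equation}
```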
In many cases, the expected transverse-momentum and rapidity
distributions of particles are known. In other cases the effect of
variations in these distributions must be investigated. In both
situations it is appropriate to use generators that produce
primary particles and their decays by sampling from parameterized
spectra. To meet the different physics requirements in a modular
way, the parameterizations are stored in independent function
libraries wrapped into classes that can be plugged into the
generator. This is schematically illustrated in
Fig.~\ref{MC:evglib}, where four different generator libraries can
be loaded via the abstract generator interface.

It is customary in heavy-ion event generation to superimpose
different signals on an event in order to tune the reconstruction
algorithms. This is possible in AliRoot via the so-called cocktail
generator (Fig.~\ref{MC:cocktail}). It creates events from
user-defined particle cocktails by choosing as ingredients a list
of particle generators.
  \centering
  \includegraphics[width=10cm]{picts/evglib}
  \caption{\texttt{AliGenParam} is a realization of \texttt{AliGenerator}
    that generates particles using parameterized \pT and
    pseudorapidity distributions. Instead of coding a fixed number of
    parameterizations directly into the class implementations,
    user-defined parameterization libraries (AliGenLib) can be connected at
    run time, allowing for maximum flexibility.} \label{MC:evglib}
An example of \class{AliGenParam} usage is presented below:

\begin{lstlisting}[language=C++]
 // Example for Upsilon production from parameterization
 // using default library (AliMUONlib)
 AliGenParam *gener = new AliGenParam(ntracks, AliGenMUONlib::kUpsilon);
 gener->SetMomentumRange(0,999); // Wide cut on the Upsilon momentum
 gener->SetPtRange(0,999); // Wide cut on Pt
 gener->SetPhiRange(0. , 360.); // Full azimuthal range
 gener->SetYRange(2.5,4); // In the acceptance of the MUON arm
 gener->SetCutOnChild(1); // Enable cuts on Upsilon decay products
 gener->SetChildThetaRange(2,9); // Theta range for the decay products
 gener->SetOrigin(0,0,0); // Vertex position
 gener->SetSigma(0,0,5.3); // Sigma in (X,Y,Z) (cm) on IP position
 gener->SetForceDecay(kDiMuon); // Upsilon->mu+ mu- decay
 gener->SetTrackingFlag(0); // No particle transport
 gener->Init();
\end{lstlisting}
To facilitate the usage of different generators, we have developed
an abstract generator interface called \texttt{AliGenerator}, see
Fig.~\ref{MC:aligen}. The objective is to provide the user with
an easy and coherent way to study a variety of physics signals as
well as a full set of tools for testing and background studies. This
interface allows the study of full events, signal processes, and
a mixture of both, i.e.\ cocktail events (see an example later).

Several event generators are available via the abstract ROOT class
that implements the generic generator interface, \texttt{TGenerator}.
Through implementations of this abstract base class we wrap
FORTRAN \MC codes like PYTHIA, HERWIG, and HIJING, which are
thus accessible from the AliRoot classes. In particular, the
interface to PYTHIA includes the use of the nuclear structure
functions of LHAPDF.
PYTHIA is used for the simulation of proton--proton interactions and
for the generation of jets in the case of event merging. An example of
minimum-bias PYTHIA event generation is presented below:

\begin{lstlisting}[language=C++]
 AliGenPythia *gener = new AliGenPythia(-1);
 gener->SetMomentumRange(0,999999);
 gener->SetThetaRange(0., 180.);
 gener->SetYRange(-12,12);
 gener->SetPtRange(0,1000);
 gener->SetProcess(kPyMb); // Min. bias events
 gener->SetEnergyCMS(14000.); // LHC energy
 gener->SetOrigin(0, 0, 0); // Vertex position
 gener->SetSigma(0, 0, 5.3); // Sigma in (X,Y,Z) (cm) on IP position
 gener->SetCutVertexZ(1.); // Truncate at 1 sigma
 gener->SetVertexSmear(kPerEvent);// Smear per event
 gener->SetTrackingFlag(1); // Particle transport
 gener->Init();
\end{lstlisting}
HIJING (Heavy-Ion Jet Interaction Generator) combines a
QCD-inspired model of jet production~\cite{MC:HIJING} with the
Lund model~\cite{MC:LUND} for jet fragmentation. Hard or
semi-hard parton scatterings with transverse momenta of a few GeV
are expected to dominate high-energy heavy-ion collisions. The
HIJING model has been developed with special emphasis on the role
of minijets in pp, pA and A--A reactions at collider energies.

Detailed systematic comparisons of HIJING results with a wide
range of data demonstrate a qualitative understanding of the
interplay between soft string dynamics and hard QCD interactions.
In particular, HIJING reproduces many inclusive spectra,
two-particle correlations, and the observed flavor and
multiplicity dependence of the average transverse momentum.
The Lund FRITIOF~\cite{MC:FRITIOF} model and the Dual Parton
Model~\cite{MC:DPM} (DPM) have guided the formulation of HIJING
for soft nucleus--nucleus reactions at intermediate energies,
$\sqrt{s_{\rm NN}}\approx 20\,{\rm GeV}$. The hadronic-collision
model has been inspired by the successful implementation of
perturbative QCD processes in PYTHIA~\cite{MC:PYTH}. Binary
scattering with Glauber geometry for multiple interactions is
used to extrapolate to pA and A--A collisions.
Two important features of HIJING are jet quenching and nuclear
shadowing. Jet quenching is the energy loss of partons in nuclear
matter. It is responsible for an increase of the particle
multiplicity at central rapidities. Jet quenching is modeled by an
assumed energy loss of partons traversing dense matter. A simple
color configuration is assumed for the multi-jet system, and the Lund
fragmentation model is used for the hadronization. HIJING does not
simulate secondary interactions.
Shadowing describes the modification of the free-nucleon parton
density in the nucleus. At the low momentum fractions, $x$,
probed in collisions at the LHC, shadowing results in a decrease
of the multiplicity. Parton shadowing is taken into account using
a parameterization of the modification.
Here is an example of event generation with HIJING:

\begin{lstlisting}[language=C++]
 AliGenHijing *gener = new AliGenHijing(-1);
 gener->SetEnergyCMS(5500.); // center-of-mass energy
 gener->SetReferenceFrame("CMS"); // reference frame
 gener->SetProjectile("A", 208, 82); // projectile
 gener->SetTarget ("A", 208, 82); // target
 gener->KeepFullEvent(); // HIJING will keep the full parent-child chain
 gener->SetJetQuenching(1); // enable jet quenching
 gener->SetShadowing(1); // enable shadowing
 gener->SetDecaysOff(1); // neutral pion and heavy particle decays switched off
 gener->SetSpectators(0); // don't track spectators
 gener->SetSelectAll(0); // kinematic selection
 gener->SetImpactParameterRange(0., 5.); // impact parameter range (fm)
 gener->Init();
\end{lstlisting}

\subsubsection{Additional universal generators}

The following universal generators are available in AliRoot:

\begin{itemize}
\item DPMJET: an implementation of the Dual Parton
  Model~\cite{MC:DPMJET};
\item ISAJET: a \MC event generator for pp, $\bar pp$, and $e^+e^-$
  reactions~\cite{MC:ISAJET};
\item HERWIG: a \MC package for simulating Hadron Emission
  Reactions With Interfering Gluons~\cite{MC:HERWIG}.
\end{itemize}

An example of HERWIG configuration in the Config.C is shown below:

\begin{lstlisting}[language=C++]
AliGenHerwig *gener = new AliGenHerwig(-1);
// final state kinematic cuts
gener->SetPhiRange(0., 360.);
gener->SetThetaRange(0., 180.);
// vertex position and smearing
gener->SetOrigin(0, 0, 0);  // vertex position
gener->SetSigma(0, 0, 5.6); // sigma in (X,Y,Z) (cm) on IP position
// Beam momenta
// Beams
// Structure function
// Hard scattering
// Min bias
\end{lstlisting}

\subsubsection{Generators for specific studies}

\textbf{MEVSIM}

MEVSIM~\cite{MC:MEVSIM} was developed for the STAR experiment to
quickly produce a large number of A--A collisions for some
specific needs -- initially for HBT studies and for testing of
reconstruction and analysis software. However, since the user is
able to generate specific signals, it was extended to flow and
event-by-event fluctuation analysis. A detailed description of
MEVSIM can be found in Ref.~\cite{MC:MEVSIM}.

MEVSIM generates particle spectra according to a momentum model
chosen by the user. The main input parameters are: types and
numbers of generated particles, momentum-distribution model,
reaction-plane and azimuthal-anisotropy coefficients, multiplicity
fluctuations, number of generated events, etc. The momentum models
include factorized $p_{\rm t}$ and rapidity distributions, non-expanding
and expanding thermal sources, arbitrary distributions in $y$ and
$p_{\rm t}$, and others. The reaction plane and azimuthal anisotropy are
defined by Fourier coefficients (maximum of six) including
directed and elliptic flow. Resonance production can also be
included.

MEVSIM was originally written in FORTRAN. It was later integrated into
AliRoot. A complete description of the AliRoot implementation of MEVSIM can
be found on the web page (\url{http://home.cern.ch/~radomski}).

\textbf{GeVSim}

GeVSim~\cite{MC:GEVSIM} is a fast and easy-to-use \MC
event generator implemented in AliRoot. It can provide events of
similar type configurable by the user according to the specific
needs of a simulation project, in particular, those of flow and
event-by-event fluctuation studies. It was developed to facilitate
detector performance studies and tests of algorithms.
GeVSim can also be used to generate signal-free events to be
processed by afterburners, for example the HBT processor.

GeVSim is based on the MevSim~\cite{MC:MEVSIM} event generator
developed for the STAR experiment.

GeVSim generates a list of particles by randomly sampling a
distribution function. The parameters of single-particle spectra
and their event-by-event fluctuations are explicitly defined by
the user. Single-particle transverse-momentum and rapidity spectra
can be either selected from a menu of four predefined
distributions, the same as in MevSim, or provided by the user.

Flow can be easily introduced into simulated events. The parameters of
the flow are defined separately for each particle type and can be
either set to a constant value or parameterized as a function of
transverse momentum and rapidity. Two parameterizations of elliptic
flow based on results obtained by RHIC experiments are provided.

GeVSim also has extended possibilities for simulating
event-by-event fluctuations. The model allows fluctuations
following an arbitrary analytically defined distribution in
addition to the Gaussian distribution provided by MevSim. It is
also possible to systematically alter a given parameter to scan
the parameter space in one run. This feature is useful when
analyzing performance with respect to, for example, multiplicity
or event-plane angle.

The current status and further development of GeVSim code and documentation
can be found in Ref.~\cite{MC:Radomski}.

\textbf{HBT processor}

Correlation functions constructed with the data produced by MEVSIM
or any other event generator are normally flat in the region of
small relative momenta. The HBT-processor afterburner introduces
two-particle correlations into the set of generated particles. It
shifts the momentum of each particle so that the correlation
function of a selected model is reproduced. The imposed
correlation effects due to Quantum Statistics (QS) and Coulomb
Final State Interactions (FSI) do not affect the single-particle
distributions and multiplicities. The event structures before and
after passing through the HBT processor are identical. Thus, the
event reconstruction procedure with and without correlations is
also identical. However, the track reconstruction efficiency, momentum
resolution and particle identification need not be, since
correlated particles have a special topology at small relative
velocities. We can thus verify the influence of various
experimental factors on the correlation functions.

The method, proposed by L.~Ray and G.W.~Hoffmann~\cite{MC:HBTproc},
is based on random shifts of the particle three-momentum within a
confined range. After each shift, a comparison is made with
correlation functions resulting from the assumed model of the
space--time distribution and with the single-particle spectra,
which should remain unchanged. The shift is kept if the
$\chi^2$-test shows better agreement. The process is iterated
until satisfactory agreement is achieved. In order to construct
the correlation function, a reference sample is made by mixing
particles from some consecutive events. An important practical
consequence of this method is that at least two events must be
processed simultaneously.

Some specific features of this approach are important for practical
use:
\begin{itemize}
\item{} the HBT processor can simultaneously generate correlations of up
  to two particle types (e.g. positive and negative pions).
  Correlations of other particles can be added subsequently;
\item{} the form of the correlation function has to be parameterized
  analytically. One- and three-dimensional parameterizations are
  possible;
\item{} a static source is usually assumed. Dynamical effects,
  related to expansion or flow, can be simulated in a stepwise form by repeating
  simulations for different values of the space--time parameters
  associated with different kinematic intervals;
\item{} Coulomb effects may be introduced by one of three
  approaches: Gamow factor, experimentally modified Gamow correction, and
  integrated Coulomb wave functions for discrete values of the source radii;
\item{} strong interactions are not implemented.
\end{itemize}

The detailed description of the HBT processor can be found in
Ref.~\cite{MC:HBTproc}.

\textbf{Flow afterburner}

Azimuthal anisotropies, especially elliptic flow, carry unique
information about collective phenomena and consequently are
important for the study of heavy-ion collisions. Additional
information can be obtained by studying different heavy-ion
observables, especially jets, relative to the event plane.
Therefore it is necessary to evaluate the capability of ALICE to
reconstruct the event plane and to study elliptic flow.

Since there is no well-understood microscopic description of
the flow effect, it cannot be correctly simulated by microscopic
event generators. Therefore, to generate events with flow the user has
to use event generators based on macroscopic models, like GeVSim
\cite{MC:GEVSIM}, or an afterburner which can generate flow on top
of events produced by event generators based on a microscopic
description of the interaction. In the AliRoot framework such a
flow afterburner is implemented.

The algorithm to apply azimuthal correlations consists in shifting the
azimuthal coordinates of the particles. The transformation is given
by \cite{MC:POSCANCER}:
\[
\varphi \rightarrow \varphi' = \varphi + \Delta\varphi,
\]
\[
\Delta\varphi = \sum_{n}\frac{-2}{n}\, v_{n}\left( p_{\rm t},y\right)
\sin\left[ n \left( \varphi - \psi \right) \right],
\]
where \( v_{n}(p_{\rm t},y) \) is the flow coefficient to be obtained, \( n \)
is the harmonic number and \( \psi \) is the event-plane angle.
Note that the algorithm is deterministic and does not involve any
random number generation.

The value of the flow coefficient can be either constant or parameterized as a
function of transverse momentum and rapidity. Two parameterizations
of elliptic flow are provided, as in GeVSim.

\begin{lstlisting}[language=C++]
  AliGenGeVSim* gener = new AliGenGeVSim(0);

  Float_t mult = 2000; // number of charged particles in |eta| < 0.5
  Float_t vn   = 0.01; // Vn

  Float_t sigma_eta = 2.75; // sigma of the Gaussian dN/dEta
  Float_t etamax    = 7.00; // maximum eta

  // Scale from multiplicity in |eta| < 0.5 to |eta| < etamax
  Float_t mm = mult * (TMath::Erf(etamax/sigma_eta/sqrt(2.)) /
                       TMath::Erf(0.5/sigma_eta/sqrt(2.)));

  // Scale from charged to total multiplicity
  mm *= 1.587;

  // Define particles

  // 78% pions (26% pi+, 26% pi-, 26% pi0), T = 250 MeV
  AliGeVSimParticle *pp =
    new AliGeVSimParticle(kPiPlus, 1, 0.26 * mm, 0.25, sigma_eta) ;
  AliGeVSimParticle *pm =
    new AliGeVSimParticle(kPiMinus, 1, 0.26 * mm, 0.25, sigma_eta) ;
  AliGeVSimParticle *p0 =
    new AliGeVSimParticle(kPi0, 1, 0.26 * mm, 0.25, sigma_eta) ;

  // 12% kaons (3% K0short, 3% K0long, 3% K+, 3% K-), T = 300 MeV
  AliGeVSimParticle *ks =
    new AliGeVSimParticle(kK0Short, 1, 0.03 * mm, 0.30, sigma_eta) ;
  AliGeVSimParticle *kl =
    new AliGeVSimParticle(kK0Long, 1, 0.03 * mm, 0.30, sigma_eta) ;
  AliGeVSimParticle *kp =
    new AliGeVSimParticle(kKPlus, 1, 0.03 * mm, 0.30, sigma_eta) ;
  AliGeVSimParticle *km =
    new AliGeVSimParticle(kKMinus, 1, 0.03 * mm, 0.30, sigma_eta) ;

  // 10% protons / neutrons (5% protons, 5% neutrons), T = 250 MeV
  AliGeVSimParticle *pr =
    new AliGeVSimParticle(kProton, 1, 0.05 * mm, 0.25, sigma_eta) ;
  AliGeVSimParticle *ne =
    new AliGeVSimParticle(kNeutron, 1, 0.05 * mm, 0.25, sigma_eta) ;

  // Set elliptic-flow properties

  Float_t pTsaturation = 2. ;

  pp->SetEllipticParam(vn,pTsaturation,0.) ;
  pm->SetEllipticParam(vn,pTsaturation,0.) ;
  p0->SetEllipticParam(vn,pTsaturation,0.) ;
  pr->SetEllipticParam(vn,pTsaturation,0.) ;
  ne->SetEllipticParam(vn,pTsaturation,0.) ;
  ks->SetEllipticParam(vn,pTsaturation,0.) ;
  kl->SetEllipticParam(vn,pTsaturation,0.) ;
  kp->SetEllipticParam(vn,pTsaturation,0.) ;
  km->SetEllipticParam(vn,pTsaturation,0.) ;

  // Set directed-flow properties

  pp->SetDirectedParam(vn,1.0,0.) ;
  pm->SetDirectedParam(vn,1.0,0.) ;
  p0->SetDirectedParam(vn,1.0,0.) ;
  pr->SetDirectedParam(vn,1.0,0.) ;
  ne->SetDirectedParam(vn,1.0,0.) ;
  ks->SetDirectedParam(vn,1.0,0.) ;
  kl->SetDirectedParam(vn,1.0,0.) ;
  kp->SetDirectedParam(vn,1.0,0.) ;
  km->SetDirectedParam(vn,1.0,0.) ;

  // Add particles to the list

  gener->AddParticleType(pp) ;
  gener->AddParticleType(pm) ;
  gener->AddParticleType(p0) ;
  gener->AddParticleType(pr) ;
  gener->AddParticleType(ne) ;
  gener->AddParticleType(ks) ;
  gener->AddParticleType(kl) ;
  gener->AddParticleType(kp) ;
  gener->AddParticleType(km) ;

  // Random event plane, uniform in [0, 360) degrees
  TF1 *rpa = new TF1("gevsimPsiRndm","1", 0, 360);

  gener->SetPtRange(0., 9.) ; // used for bin size in numerical integration
  gener->SetPhiRange(0, 360);

  gener->SetOrigin(0, 0, 0);  // vertex position
  gener->SetSigma(0, 0, 5.3); // sigma in (X,Y,Z) (cm) on IP position
  gener->SetCutVertexZ(1.);   // truncate at 1 sigma
  gener->SetVertexSmear(kPerEvent);
  gener->SetTrackingFlag(1);
  gener->Init();
\end{lstlisting}

\textbf{Generator for e$^+$e$^-$ pairs in Pb--Pb collisions}

In addition to strong interactions of heavy ions in central and
peripheral collisions, ultra-peripheral collisions of ions give
rise to coherent, mainly electromagnetic, interactions among which
the dominant process is the (multiple) e$^+$e$^-$-pair
production \cite{MC:AlscherHT97}
\begin{equation}
  AA \to AA + n({\rm e}^+{\rm e}^-), \label{nee}
\end{equation}
where $n$ is the pair multiplicity. Most electron--positron pairs
are produced in the very forward direction and escape the
experiment. However, for Pb--Pb collisions at the LHC the
cross-section of this process, about $230\,{\rm kb}$, is
enormous. A sizable fraction of pairs produced with large momentum
transfer can contribute to the hit rate in the forward detectors,
increasing the occupancy or trigger rate. In order to study this
effect an event generator for e$^+$e$^-$-pair production has
been implemented in the AliRoot framework \cite{MC:Sadovsky}. The
class \texttt{TEpEmGen} is a realisation of the \texttt{TGenerator}
interface for external generators and wraps the FORTRAN code used
to calculate the differential cross-section. \texttt{AliGenEpEmv1}
derives from \texttt{AliGenerator} and uses the external generator to
put the pairs on the AliRoot particle stack.

\subsubsection{Combination of generators: AliGenCocktail}

\begin{figure}[ht]
  \centering
  \includegraphics[width=10cm]{picts/cocktail}
  \caption{The \texttt{AliGenCocktail} generator is a realization of {\tt
  AliGenerator} which does not generate particles itself but
  delegates this task to a list of objects of type {\tt
  AliGenerator} that can be connected as entries ({\tt
  AliGenCocktailEntry}) at run time. In this way different physics
  channels can be combined in one event.} \label{MC:cocktail}
\end{figure}

Here is an example of a cocktail used for studies in the TRD detector:

\begin{lstlisting}[language=C++]
  // The cocktail generator
  AliGenCocktail *gener = new AliGenCocktail();

  // Phi meson (10 particles)
  AliGenParam *phi =
    new AliGenParam(10,new AliGenMUONlib(),AliGenMUONlib::kPhi,"Vogt PbPb");
  phi->SetPtRange(0, 100);
  phi->SetYRange(-1., +1.);
  phi->SetForceDecay(kDiElectron);

  // Omega meson (10 particles)
  AliGenParam *omega =
    new AliGenParam(10,new AliGenMUONlib(),AliGenMUONlib::kOmega,"Vogt PbPb");
  omega->SetPtRange(0, 100);
  omega->SetYRange(-1., +1.);
  omega->SetForceDecay(kDiElectron);

  // J/psi family
  AliGenParam *jpsi = new AliGenParam(10,new AliGenMUONlib(),
                                      AliGenMUONlib::kJpsiFamily,"Vogt PbPb");
  jpsi->SetPtRange(0, 100);
  jpsi->SetYRange(-1., +1.);
  jpsi->SetForceDecay(kDiElectron);

  // Upsilon family
  AliGenParam *ups = new AliGenParam(10,new AliGenMUONlib(),
                                     AliGenMUONlib::kUpsilonFamily,"Vogt PbPb");
  ups->SetPtRange(0, 100);
  ups->SetYRange(-1., +1.);
  ups->SetForceDecay(kDiElectron);

  // Open-charm particles
  AliGenParam *charm = new AliGenParam(10,new AliGenMUONlib(),
                                       AliGenMUONlib::kCharm,"central");
  charm->SetPtRange(0, 100);
  charm->SetYRange(-1.5, +1.5);
  charm->SetForceDecay(kSemiElectronic);

  // Beauty particles: semi-electronic decays
  AliGenParam *beauty = new AliGenParam(10,new AliGenMUONlib(),
                                        AliGenMUONlib::kBeauty,"central");
  beauty->SetPtRange(0, 100);
  beauty->SetYRange(-1.5, +1.5);
  beauty->SetForceDecay(kSemiElectronic);

  // Beauty particles decaying to J/psi -> e+e-
  AliGenParam *beautyJ = new AliGenParam(10, new AliGenMUONlib(),
                                         AliGenMUONlib::kBeauty,"central");
  beautyJ->SetPtRange(0, 100);
  beautyJ->SetYRange(-1.5, +1.5);
  beautyJ->SetForceDecay(kBJpsiDiElectron);

  // Adding all the components of the cocktail
  gener->AddGenerator(phi,"Phi",1);
  gener->AddGenerator(omega,"Omega",1);
  gener->AddGenerator(jpsi,"J/psi",1);
  gener->AddGenerator(ups,"Upsilon",1);
  gener->AddGenerator(charm,"Charm",1);
  gener->AddGenerator(beauty,"Beauty",1);
  gener->AddGenerator(beautyJ,"J/Psi from Beauty",1);

  // Settings common to all components
  gener->SetOrigin(0, 0, 0);  // vertex position
  gener->SetSigma(0, 0, 5.3); // sigma in (X,Y,Z) (cm) on IP position
  gener->SetCutVertexZ(1.);   // truncate at 1 sigma
  gener->SetVertexSmear(kPerEvent);
  gener->SetTrackingFlag(1);
  gener->Init();
\end{lstlisting}

\subsection{Particle transport}

\subsubsection{TGeo essential information}

A detailed description of the ROOT geometry package is available in
the ROOT User's Guide~\cite{RootUsersGuide}. Several examples can be
found in \$ROOTSYS/tutorials, among them assembly.C, csgdemo.C,
geodemo.C, nucleus.C, rootgeom.C, etc. Here we show a simple example
of export/import of the ALICE geometry and of checking for overlaps
and extrusions:

\begin{lstlisting}[language=C++]
  aliroot
  root [0] gAlice->Init()
  root [1] gGeoManager->Export("geometry.root")
  root [2] .q
  aliroot
  root [0] TGeoManager::Import("geometry.root")
  root [1] gGeoManager->CheckOverlaps()
  root [2] gGeoManager->PrintOverlaps()
  root [3] new TBrowser
  # Now you can navigate in Geometry->Illegal overlaps
  # and draw each overlap (double click on it)
\end{lstlisting}

Below we show an example of VZERO visualization using the ROOT
geometry package:

\begin{lstlisting}[language=C++]
  aliroot
  root [0] gAlice->Init()
  root [1] TGeoVolume *top = gGeoManager->GetMasterVolume()
  root [2] Int_t nd = top->GetNdaughters()
  root [3] for (Int_t i=0; i<nd; i++) \
             top->GetNode(i)->GetVolume()->InvisibleAll()
  root [4] TGeoVolume *v0ri = gGeoManager->GetVolume("V0RI")
  root [5] TGeoVolume *v0le = gGeoManager->GetVolume("V0LE")
  root [6] v0ri->SetVisibility(kTRUE);
  root [7] v0ri->VisibleDaughters(kTRUE);
  root [8] v0le->SetVisibility(kTRUE);
  root [9] v0le->VisibleDaughters(kTRUE);
  root [10] top->Draw();
\end{lstlisting}

\subsubsection{Particle decays}

We use Pythia to carry out particle decays during the transport. The
default decay channels can be seen in the following way:

\begin{lstlisting}[language=C++]
  aliroot
  root [0] AliPythia * py = AliPythia::Instance()
  root [1] py->Pylist(12); >> decay.list
\end{lstlisting}

The file decay.list will contain the list of particle decays
available in Pythia. Now if we want to force the decay $\Lambda^0 \to
p \pi^-$, the following lines should be included in the Config.C
before we register the decayer:

\begin{lstlisting}[language=C++]
  AliPythia * py = AliPythia::Instance();
  py->SetMDME(1059,1,0);
  py->SetMDME(1060,1,0);
  py->SetMDME(1061,1,0);
\end{lstlisting}

where 1059, 1060 and 1061 are the indices of the decay channels (from
decay.list above) that we want to switch off.

\textbf{Fast simulation}

This example is taken from the macro
\$ALICE\_ROOT/FASTSIM/fastGen.C. It shows how one can create a
Kinematics tree which later can be used as input for the particle
transport. A simple selection of events with high multiplicity is
applied:

\lstinputlisting[language=C++] {scripts/fastGen.C}

\textbf{Reading the kinematics tree as input for the particle transport}

We suppose that the macro fastGen.C above has been used to generate
the corresponding set of files galice.root and Kinematics.root, and
that they are stored in a separate subdirectory, for example kine. Then
the following code in Config.C will read the set of files and put the
particles on the stack for transport:

\begin{lstlisting}[language=C++]
  AliGenExtFile *gener = new AliGenExtFile(-1);

  gener->SetMomentumRange(0,14000);
  gener->SetPhiRange(0.,360.);
  gener->SetThetaRange(45,135);
  gener->SetYRange(-10,10);
  gener->SetOrigin(0, 0, 0);  // vertex position
  gener->SetSigma(0, 0, 5.3); // sigma in (X,Y,Z) (cm) on IP position

  AliGenReaderTreeK * reader = new AliGenReaderTreeK();
  reader->SetFileName("../galice.root");

  gener->SetReader(reader);
  gener->SetTrackingFlag(1);

  gener->Init();
\end{lstlisting}

\textbf{Usage of different generators}

Many examples are available in
\$ALICE\_ROOT/macros/Config\_gener.C. The corresponding part can be
extracted and placed in the relevant Config.C file.

% -----------------------------------------------------------------------------

\subsection{Reconstruction Framework}

This chapter
focuses on the reconstruction framework from the (detector) software
developer's point of view.

Unless specified otherwise, we refer
to the `global ALICE coordinate system'~\cite{CoordinateSystem}. It is a right-handed coordinate
system with
the $z$ axis coinciding with the beam-pipe axis and pointing in the direction
opposite to the muon arm, the $y$ axis pointing up, and the origin of
coordinates defined by the intersection point of the $z$ axis
and the central-membrane plane of the TPC.

Here is a reminder of the following terms which are used in the
description of the reconstruction framework (see also section~\ref{AliRootFramework}):

\begin{itemize}
\item {\it Digit}: a digitized signal (ADC count) obtained by
  a sensitive pad of a detector at a certain time.
\item {\it Cluster}: a set of adjacent (in space and/or in time)
  digits that were presumably generated by the same particle crossing the
  sensitive element of a detector.
\item Reconstructed {\it space point}: the estimated
  position where a particle crossed the sensitive element of a detector
  (often obtained by calculating the center of gravity of the
  `cluster').
\item Reconstructed {\it track}: a set of five parameters (such as the
  curvature and the angles with respect to the coordinate axes) of the particle's
  trajectory, together with the corresponding covariance matrix, estimated at a given
  point in space.
\end{itemize}

The input to the reconstruction framework consists of digits in ROOT-tree
format or in raw-data format. First, a local reconstruction of clusters is
performed in each detector. Then vertices and tracks are reconstructed
and particle identification is carried out. The output of the reconstruction
is the Event Summary Data (ESD). The \class{AliReconstruction} class provides
a simple user interface to the reconstruction framework, which is
explained in the source code.

\begin{figure}[ht]
  \centering
  \includegraphics[width=10cm]{picts/ReconstructionFramework}
  \caption{Reconstruction framework.} \label{MC:Reconstruction}
\end{figure}

\textbf{Requirements and Guidelines}

The development of the reconstruction framework has been carried out
according to the following requirements and guidelines:

\begin{itemize}
\item the prime goal of the reconstruction is to provide the data that
  are needed for a physics analysis;
\item the reconstruction should aim at high efficiency, purity and resolution;
\item the user should have an easy-to-use interface to extract the
  required information from the ESD;
\item the reconstruction code should be efficient but also maintainable;
\item the reconstruction should be as flexible as possible.
  It should be possible to do the reconstruction in one detector even in
  the case that other detectors are not operational.
  To achieve such a flexibility each detector module should be able to
  \begin{itemize}
  \item find tracks starting from seeds provided by another detector
    (external seeding),
  \item find tracks without using information from other detectors
    (internal seeding),
  \item find tracks from external seeds and add tracks from internal seeds,
  \item and propagate tracks through the detector using the already
    assigned clusters in inward and outward direction;
  \end{itemize}
\item where it is appropriate, common (base) classes should be used in
  the different reconstruction modules;
\item the interdependencies between the reconstruction modules should
  be minimized.
  If possible, the exchange of information between detectors should be
  done via a common track class;
\item the chain of reconstruction program(s) should be callable and
  steerable in an easy way;
\item there should be no assumptions on the structure or names of files
  or on the number or order of events;
\item each class, data member and method should have correct,
  precise and helpful html documentation.
\end{itemize}

The interface from the steering class \class{AliReconstruction} to the
detector-specific reconstruction code is defined by the base class
\class{AliReconstructor}. For each detector there is a derived reconstructor
class. The user can set options for each reconstructor in the form of a
string parameter which is accessible inside the reconstructor via the
method \method{GetOption}.

The detector-specific reconstructors are created via
plugins. Therefore they must have a default constructor. If no plugin
handler is defined by the user (in .rootrc), it is assumed that the
name of the reconstructor for detector DET is AliDETReconstructor and
that it is located in the library libDETrec.so (or libDET.so).

\textbf{Input Data}

If the input data are provided in the form of ROOT trees, either the
loaders or directly the trees are used to access the digits. In case
of raw-data input, the digits are accessed via a raw reader.

If a galice.root file exists, the run loader will be retrieved from
it. Otherwise the run loader and the headers will be created from the
raw data. The reconstruction cannot work if there is neither a galice.root
file nor raw-data input.

\textbf{Output Data}

The clusters (rec. points) are considered as intermediate output and
are stored in ROOT trees handled by the loaders. The final output of
the reconstruction is a tree with objects of type \class{AliESD} stored in the
file AliESDs.root. This Event Summary Data (ESD) contains lists of
reconstructed tracks/particles and global event properties. The detailed
description of the ESD can be found in section~\ref{ESD}.

\textbf{Local Reconstruction (Clusterization)}

The first step of the reconstruction is the so-called ``local
reconstruction''. It is executed for each detector separately and
without exchanging information with other detectors. Usually the
clusterization is done in this step.

The local reconstruction is invoked via the method \method{Reconstruct} of the
reconstructor object. Each detector reconstructor runs the local
reconstruction for all events. The local reconstruction method is
only called if the method \method{HasLocalReconstruction} of the reconstructor
returns kTRUE.

Instead of running the local reconstruction directly on raw data, it
is possible to first convert the raw-data digits into a digits tree
and then to call the \method{Reconstruct} method with a tree as input
parameter. This conversion is done by the method \method{ConvertDigits}. The
reconstructor has to announce that it can convert the raw-data digits
by returning kTRUE in the method \method{HasDigitConversion}.

\textbf{Primary Vertex Reconstruction}

The current reconstruction of the primary-vertex
position in ALICE is done using the information provided by the
silicon pixel detectors, which constitute the two innermost layers of the
ITS.

The algorithm starts by looking at the
distribution of the $z$ coordinates of the reconstructed space points
in the first pixel layer.
At a vertex $z$ coordinate $z_{\rm true} = 0$ the distribution is
symmetric and
its centroid ($z_{\rm cen}$) is very close to the nominal
vertex position. When the primary vertex is moved along the $z$ axis, an
increasing fraction
of hits will be lost and the centroid of the distribution no longer gives
the primary
vertex position. However, for primary-vertex locations not too far from
$z_{\rm true} = 0$
(up to about 12~cm), the centroid of the distribution is still correlated with
the true vertex position.
The saturation effect at large $z_{\rm true}$ values of the vertex position
($z_{\rm true} = $12--15~cm)
is, however, not critical, since this procedure is only meant to find a rough
vertex position, in order to introduce some cut along $z$.

To find the final vertex position,
the correlation between the points $z_1$, $z_2$ in the two layers
was considered. More details and performance studies are available in

The primary vertex is reconstructed by a vertexer object derived from
\class{AliVertexer}. After the local reconstruction has been done for all detectors,
the vertexer method \method{FindVertexForCurrentEvent} is called for each
event. It returns a pointer to a vertex object of type \class{AliESDVertex}.

The vertexer object is created by the method \method{CreateVertexer} of the
reconstructor. So far only the ITS is used to determine the primary
vertex (\class{AliITSVertexerZ} class).

The precision of the primary-vertex reconstruction in the bending plane
required for the reconstruction of D and B mesons in pp events
can be achieved only after the tracking is done. The method is
implemented in \class{AliITSVertexerTracks}. It is called as a second
estimation of the primary vertex. The details of the algorithm can be
found in Appendix~\ref{VertexerTracks}.

\textbf{Combined Track Reconstruction}

The combined track reconstruction tries to accumulate the information from
different detectors in order to optimize the track-reconstruction performance.
The result is stored in the combined track objects.
The \class{AliESDTrack} class also
provides the possibility to exchange information between detectors
without introducing dependencies between the reconstruction modules.
This is achieved by using just integer indexes pointing to the
specific track objects, which on the other hand makes it possible to
retrieve the full information if needed.
The list of combined tracks can be kept in memory and passed from one
reconstruction module to another.
The storage of the combined tracks should be done in the standard way.

The classes responsible for the reconstruction of tracks are derived
from \class{AliTracker}. They are created by the method
\method{CreateTracker} of the
reconstructors. The reconstructed position of the primary vertex is
made available to them via the method \method{SetVertex}. Before the track
reconstruction in a detector starts, the clusters are loaded from the
clusters tree by the method \method{LoadClusters}. After the track reconstruction, the
clusters are unloaded by the method \method{UnloadClusters}.

The track reconstruction (in the barrel part) is done in three passes. The first
pass consists of track finding and fitting in the inward direction in
the TPC and then in the ITS. The virtual method \method{Clusters2Tracks} (of
class \class{AliTracker}) is the
interface to this pass. The method for the next pass is
\method{PropagateBack}. It does the track reconstruction in the outward direction and is
invoked for all detectors starting with the ITS. The last pass is the
track refit in the inward direction, in order to get the track parameters
at the vertex. The corresponding method \method{RefitInward} is called for TRD,
TPC and ITS. All three track-reconstruction methods have an AliESD object as
argument, which is used to exchange track information between detectors
without introducing dependencies between the detector reconstruction modules.
Depending on the way the information is used, the tracking methods can be
divided into two large groups: global methods and local methods. Each
group has advantages and disadvantages.

With the global methods, all the track measurements are treated
simultaneously and the decision to include or exclude a measurement is
taken when all the information about the track is known.
Typical algorithms belonging to this class are combinatorial methods,
the Hough transform, templates, and conformal mappings. Their advantages are
stability with respect to noise and mismeasurements, and the possibility
to operate directly on the raw data. On the other hand, these methods
require a precise global track model. Such a track model may sometimes be
unknown, or may not even exist because of stochastic processes (energy
loss, multiple scattering), non-uniformity of the magnetic field, etc.
In ALICE, global tracking methods are used extensively in the
High-Level Trigger (HLT) software. There, we
are mostly interested in the reconstruction of high-momentum tracks;
the required precision is not crucial, but the speed of the
calculations is of great importance.
Local methods do not require knowledge of the global track model.
The track parameters are always estimated `locally' at a given point
in space. The decision to accept or to reject a measurement is made using
either the local information or the information coming from the previous
`history' of the track. With these methods, all the local track
peculiarities (stochastic physics processes, magnetic fields, detector
geometry) can be naturally accounted for. Unfortunately, the local methods
rely on sophisticated space-point reconstruction algorithms (including
unfolding of overlapped clusters). They are sensitive to noise, to wrong or
displaced measurements, and to the precision of the space-point error
parameterization. The most advanced kind of local track-finding method is the
Kalman filter, which was introduced to track fitting by P.~Billoir in 1983~\cite{MC:billoir}.
When applied to the track reconstruction problem, the Kalman-filter
approach shows many attractive properties:
\item It is a method for simultaneous track recognition and
  fitting.
\item It offers the possibility to reject incorrect space points `on
  the fly', during a single tracking pass. Such incorrect points can
  appear as a consequence of imperfections of the cluster finder,
  they may be due to noise, or they may be points from other tracks
  accidentally captured in the list of points to be associated with
  the track under consideration. With other tracking methods one
  usually needs an additional fitting pass to get rid of incorrectly
  assigned points.
\item In the case of substantial multiple scattering, track
  measurements are correlated and therefore large matrices (of the
  size of the number of measured points) would need to be inverted during
  a global fit. In the Kalman-filter procedure we only have to
  manipulate up to $5 \times 5$ matrices (although as many times as
  there are measured space points), which is much faster.
\item It handles multiple scattering and
  energy losses in a simpler way than the global
  methods: at each step the material budget can be calculated and the
  mean correction applied accordingly.
\item It is a natural way to find the extrapolation
  of a track from one detector to another (for example from the TPC
  to the ITS or to the TRD).
In ALICE we require good track-finding efficiency and reconstruction
precision for tracks down to \mbox{\pt = 100~MeV/$c$.} Some of the ALICE
tracking detectors (ITS, TRD) have a significant material budget.
Under such conditions one cannot neglect the energy losses or the multiple
scattering in the reconstruction. There are also rather
big dead zones between the tracking detectors, which complicate finding
the continuation of the same track. For all these reasons,
the Kalman-filtering approach has been our choice for the
offline reconstruction since 1994.
% \subsubsection{General tracking strategy}

The reconstruction software for the ALICE central tracking detectors (the
ITS, TPC and TRD) shares a common convention on the coordinate
system used. All the clusters and tracks are always expressed in some local
coordinate system related to a given sub-detector (TPC sector, ITS module,
etc.). This local coordinate system is defined as follows:
\item It is a right-handed Cartesian coordinate system;
\item its origin and the $z$ axis coincide with those of the global
  ALICE coordinate system;
\item the $x$ axis is perpendicular to the sub-detector's `sensitive plane'
  (TPC pad row, ITS ladder, etc.).
Such a choice reflects the symmetry of the ALICE set-up
and therefore simplifies the reconstruction equations.
It also enables the fastest possible transformations from
a local coordinate system to the global one and back again,
since these transformations become simple single rotations around the
$z$ axis.
The reconstruction begins with cluster finding in all of the ALICE central
detectors (ITS, TPC, TRD, TOF, HMPID and PHOS). Using the clusters
reconstructed in the two pixel layers of the ITS, the position of the
primary vertex is estimated and the track finding starts. As
described later, the cluster-finding as well as the track-finding procedures
performed in the different detectors have detector-specific features.
Moreover, within a given detector, on account of the high occupancy and the
large number of overlapping clusters, the cluster finding and the track finding are
not completely independent: the number and the positions of the clusters are
fully determined only at the track-finding step.
The general tracking strategy is the following. We start from our
best tracking device, i.e.\ the TPC, and from the outer radius where the
track density is minimal. First, the track candidates (`seeds') are
found. Because of the small number of clusters assigned to a seed, the
precision of its parameters is not sufficient to safely extrapolate it outwards
to the other detectors. Instead, the tracking stays within the TPC and
proceeds towards the smaller TPC radii. Whenever
possible, new clusters are associated with a track candidate
at each step of the Kalman filter, if they are within a given distance
from the track prolongation, and the track parameters are more and
more refined. When all of the seeds have been extrapolated to the inner limit of
the TPC, the tracking proceeds into the ITS. The ITS tracker tries to prolong
the TPC tracks as close as possible to the primary vertex.
On the way to the primary vertex, the tracks are assigned additional,
precisely reconstructed ITS clusters, which also improves
the estimation of the track parameters.
After all the track candidates from the TPC have been assigned their clusters
in the ITS, a special ITS stand-alone tracking procedure is applied to
the rest of the ITS clusters. This procedure tries to recover the
tracks that were not found in the TPC because of the \pt cut-off, the dead zones
between the TPC sectors, or decays.

At this point the tracking is restarted from the vertex back to the
outer layer of the ITS and then continued towards the outer wall of the
TPC. For tracks that were labeled by the ITS tracker as potentially
primary, several particle-mass-dependent time-of-flight hypotheses
are calculated. These hypotheses are then used for the particle
identification (PID) with the TOF detector. Once the outer
radius of the TPC is reached, the precision of the estimated track
parameters is
sufficient to extrapolate the tracks to the TRD, TOF, HMPID and PHOS
detectors. Tracking in the TRD is done in a similar way to that
in the TPC. Tracks are followed to the outer wall of the TRD and the
assigned clusters further improve the momentum resolution.
Next, the tracks are extrapolated to the TOF, HMPID and PHOS, where they
acquire the PID information.
Finally, all the tracks are refitted with the Kalman filter backwards to
the primary vertex (or to the innermost possible radius, in the case of
secondary tracks). This gives the most precise information about
the track parameters at the point where the track appeared.

The tracks that pass the final refit towards the primary vertex are used
for the secondary-vertex (V$^0$, cascade, kink) reconstruction. There is also
an option to reconstruct the secondary vertexes `on the fly' during the
tracking itself. The potential advantage of such a possibility is that
the tracks coming from a secondary-vertex candidate are not extrapolated
beyond the vertex, thus minimizing the risk of picking up a wrong track
prolongation. This option is currently under investigation.

The reconstructed tracks (together with the PID information), kink, V$^0$
and cascade particle decays are then stored in the Event Summary Data (ESD).

More details about the reconstruction algorithms can be found in
Chapter~5 of the ALICE Physics Performance Report~\cite{PPRVII}.
\textbf{Filling of the ESD}

After the tracks have been reconstructed and stored in the \class{AliESD} object,
further information is added to the ESD. For each detector the method
\method{FillESD} of the reconstructor is called. Inside this method, e.g., V0s
are reconstructed or particles are identified (PID). For the PID a
Bayesian approach is used (see Appendix~\ref{BayesianPID}). The constants
and some functions that are used for the PID are defined in the class
\textbf{Monitoring of the Performance}

For the monitoring of the track reconstruction performance, objects of the class
\class{AliTrackReference} are used.
Corresponding objects are created during the
reconstruction at the same locations as the \class{AliTrackReference}
objects from the simulation,
so the reconstructed tracks can be easily compared with the simulated
ones.
This allows one to study and monitor the performance of the track reconstruction in detail.
The creation of the objects used for the comparison should not
interfere with the reconstruction algorithm and can be switched on or
off.
Several ``comparison'' macros permit monitoring of the efficiency and the
resolution of the tracking. Here is a typical usage (the simulation
and the reconstruction have been done in advance):

    aliroot
    root [0] gSystem->SetIncludePath("-I$ROOTSYS/include \
                                      -I$ALICE_ROOT/include")
    root [1] .L $ALICE_ROOT/TPC/AliTPCComparison.C++
    root [2] .L $ALICE_ROOT/ITS/AliITSComparisonV2.C++
    root [3] .L $ALICE_ROOT/TOF/AliTOFComparison.C++
    root [4] AliTPCComparison()
    root [5] AliITSComparisonV2()
    root [6] AliTOFComparison()

Another macro can be used to provide a preliminary estimate of the
combined acceptance: \texttt{STEER/CheckESD.C}.
The following classes are used in the reconstruction:

\item \class{AliTrackReference}:
  This class is used to store the position and the momentum of a
  simulated particle at given locations of interest (e.g.\ when the
  particle enters or exits a detector, or decays). It is used
  mainly for debugging and tuning of the tracking.

\item \class{AliExternalTrackParam}:
  This class describes the status of a track at a given point.
  It knows the track parameters and their covariance matrix.
  This parameterization is used to exchange tracks between the detectors.
  A set of functions returning the position and the momentum of tracks
  in the global coordinate system, as well as the track impact parameters,
  is implemented. The track can be propagated to a
  given radius with the methods \method{PropagateTo} and \method{Propagate}.

\item \class{AliKalmanTrack} and derived classes:
  These classes are used to find and fit tracks with the Kalman approach.
  \class{AliKalmanTrack} defines the interfaces and implements some
  common functionality. The derived classes know about the clusters
  assigned to the track. They also update the information in an
  \class{AliESDtrack}.
  The current status of the track during the track reconstruction can be
  represented by an \class{AliExternalTrackParam}.
  The history of the track during the track reconstruction can be stored
  in a list of \class{AliExternalTrackParam} objects.
  \class{AliKalmanTrack} defines the methods:
  \begin{itemize}
  \item \method{Double\_t GetDCA(...)}: returns the distance
    of closest approach between this track and the track passed as the
    argument.
  \item \method{Double\_t MeanMaterialBudget(...)}: calculates the mean
    material budget and material properties between two points.
  \end{itemize}

\item \class{AliTracker} and subclasses:
  \class{AliTracker} is the base class for all the trackers in the
  different detectors. It fixes the interface needed to find and
  propagate tracks. The actual implementation is done in the derived classes.

\item \class{AliESDtrack}:
  This class combines the information about a track from the different detectors.
  It knows the current status of the track
  (\class{AliExternalTrackParam}) and it has (non-persistent) pointers
  to the individual \class{AliKalmanTrack} objects from each detector
  that contributed to the track.
  It knows about detector-specific quantities like the number or
  bit pattern of assigned clusters, dE/dx, $\chi^2$, etc.,
  and it can calculate a conditional probability for a given mixture of
  particle species following the Bayesian approach.
  It defines a track label pointing to the corresponding simulated
  particle in the case of \MC data.
  The combined track objects are the basis for any physics analysis.
The example below shows reconstruction with a non-uniform magnetic field
(the simulation is also done with a non-uniform magnetic field, by adding
the following line in the Config.C: field$\to$SetL3ConstField(1)). Only
the barrel detectors are reconstructed, a specific TOF reconstruction
has been requested, and the raw data have been used:

    void rec() {
      AliReconstruction reco;

      reco.SetRunReconstruction("ITS TPC TRD TOF");
      reco.SetNonuniformFieldTracking();
      reco.SetInput("raw.root");

      reco.Run();
    }
% -----------------------------------------------------------------------------

\subsection{Event summary data}\label{ESD}

The classes which are needed to process and analyze the ESD are packed
together in a standalone library (libESD.so) which can be used
separately from the \aliroot framework. Inside each
ESD object the data are stored in polymorphic containers filled with
reconstructed tracks, neutral particles, etc. The main class is
\class{AliESD}, which contains all the information needed during the
physics analysis:

\item fields to identify the event, such as event number, run number,
  time stamp, type of event, trigger type (mask), trigger cluster (mask),
  version of reconstruction, etc.;
\item reconstructed ZDC energies and the number of participants;
\item primary vertex information: the vertex $z$ position estimated by the T0,
  the primary vertex estimated by the SPD, and the primary vertex estimated using
  the ESD tracks;
\item SPD tracklet multiplicity;
\item interaction time estimated by the T0, together with additional
  time and amplitude information from the T0;
\item array of ESD tracks;
\item arrays of HLT tracks, both from the conformal mapping and from
  the Hough transform reconstruction;
\item array of MUON tracks;
\item array of PMD tracks;
\item array of TRD ESD tracks (triggered);
\item arrays of reconstructed $V^0$ vertexes, cascade decays and
  kinks;
\item array of calorimeter clusters for PHOS/EMCAL;
\item indexes of the information from the PHOS and EMCAL detectors in the
  array above.
% -----------------------------------------------------------------------------

The analysis of experimental data is the final stage of event
processing and it is usually repeated many times. Analysis is a very diverse
activity, where the goals of each
particular analysis pass may differ significantly.

The ALICE detector~\cite{PPR} is optimized for the
reconstruction and analysis of heavy-ion collisions.
In addition, ALICE has a broad physics programme devoted to
\pp and \pA interactions.

The data analysis is coordinated by the Physics Board via the Physics
Working Groups (PWGs). At present the following PWGs have started
their activity:

\item PWG0 \textbf{first physics};
\item PWG1 \textbf{detector performance};
\item PWG2 \textbf{global event characteristics:} particle multiplicity,
  centrality, energy density, nuclear stopping; \textbf{soft physics:} chemical composition (particle and resonance
  production, particle ratios and spectra, strangeness enhancement),
  reaction dynamics (transverse and elliptic flow, HBT correlations,
  event-by-event dynamical fluctuations);
\item PWG3 \textbf{heavy flavors:} quarkonia, open charm and beauty production;
\item PWG4 \textbf{hard probes:} jets, direct photons.

Each PWG has a corresponding module in AliRoot (PWG0 -- PWG4). The code
is managed by the CVS administrators.

The \pp and \pA programme will provide, on the one hand, reference points
for comparison with heavy ions. On the other hand, ALICE will also
pursue genuine and detailed \pp studies. Some
quantities, in particular the global characteristics of interactions, will
be measured during the first days of running, exploiting the low-momentum
measurement and particle identification capabilities of ALICE.

The ALICE computing framework is described in detail in the Computing
Technical Design Report~\cite{CompTDR}. This section is based on
Chapter~6 of that document.
\paragraph{The analysis activity.}

We distinguish two main types of analysis: scheduled analysis and
chaotic analysis. They differ in their data access pattern, in the
storage and registration of the results, and in the frequency of
changes in the analysis code (more details are given below).

In the ALICE Computing Model the analysis starts from the Event Summary
Data (ESD). These are produced during the reconstruction step and contain
all the information needed for the analysis. The size of the ESD is
about one order of magnitude smaller than that of the corresponding raw
data. The analysis tasks produce Analysis
Object Data (AOD) specific to a given set of physics objectives.
Further passes for a specific analysis activity can be performed on
the AODs, until the selection parameters or the algorithms are changed.
A typical data analysis task usually requires processing of
selected sets of events. The selection is based on the event
topology and characteristics, and is done by querying the tag
database. The tags represent physics quantities which characterize
each run and event, and permit fast selection. They are created
after the reconstruction and also contain the unique
identifier of the ESD file. A typical query, when translated into
natural language, could look like ``Give me
all the events with impact parameter in $<$range$>$
containing jet candidates with energy larger than $<$threshold$>$''.
This results in a list of events and file identifiers to be used in the
subsequent event loop.
The next step of a typical analysis consists of a loop over all the events
in the list and the calculation of the physics quantities of
interest. Usually, for each event, there is a set of nested loops over the
reconstructed entities such as tracks, ${\rm V^0}$ candidates, neutral
clusters, etc., the main goal of which is to select the signal
candidates. Inside each loop a number of criteria (cuts) are applied to
reject the background combinations and to select the signal ones. The
cuts can be based on geometrical quantities, such as the impact parameters
of the tracks with
respect to the primary vertex, the distance between the cluster and the
closest track, the distance of closest approach between the tracks, or the
angle between the momentum vector of the particle combination
and the line connecting the production and decay vertexes. They can
also be based on
kinematic quantities, such as momentum ratios, minimal and maximal
transverse momentum, or
angles in the rest frame of the particle combination.
Particle identification criteria are also among the most common
selection criteria.
The optimization of the selection criteria is one of the most
important parts of the analysis. The goal is to maximize the
signal-to-background ratio in the case of search tasks, or another
ratio (typically ${\rm Signal/\sqrt{Signal+Background}}$) in the
case of the measurement of a given property. Usually, this optimization is
performed using simulated events, where the information from the
particle generator is available.
After the optimization of the selection criteria, one has to take into
account the combined acceptance of the detector. This is a complex,
analysis-specific quantity which depends on the geometrical acceptance,
the trigger efficiency, the decays of particles, the reconstruction
efficiency, and the efficiency of the particle identification and of the
selection cuts. The components of the combined acceptance are usually
parameterized and their product is used to unfold the experimental
distributions or during the simulation of some model parameters.

The last part of the analysis usually involves quite complex
mathematical treatments and sophisticated statistical tools. Here one
may include the corrections for systematic effects, the estimation of
statistical and systematic errors, etc.
\paragraph{Scheduled analysis.}

The scheduled analysis typically uses all
the available data from a given period, and stores and registers the results
using \grid middleware. The tag database is updated accordingly. The
AOD files generated during the scheduled
analysis can be used by several subsequent analyses, or by a class of
related physics tasks.
The procedure of scheduled analysis is centralized and can be
considered as data filtering. The requirements come from the PWGs and
are prioritized by the Physics Board, taking into
account the available computing and storage resources. The analysis
code is tested in advance and released before the beginning of the
data processing.

Each PWG will require some sets of
AOD per event, which are specific to one or
a few analysis tasks. The creation of the AOD sets is managed centrally.
The event list of each AOD set
will be registered and the access to the AOD files will be granted to
all ALICE collaborators. AOD files will be generated
at different computing centers and will be stored on
the corresponding storage
elements. The processing of each file set will thus be done in a
distributed way on the \grid. Some of the AOD sets may be quite small
and would fit on a single storage element or even on one computer; in
this case the corresponding tools for file replication, available
in the ALICE \grid infrastructure, will be used.
\paragraph{Chaotic analysis.}

The chaotic analysis is focused on a single physics task and
is typically based on the filtered data from the scheduled
analysis. Each physicist may also
access directly large parts of the ESD in order to search for rare
events or processes.
Usually the user develops the code using a small subsample
of data, and changes the algorithms and criteria frequently. The
analysis macros and software are tested many times on relatively
small data volumes, both experimental and \MC.
The output is often only a set of histograms.
Such tuning of the analysis code can be done on a local
data set or on distributed data using \grid tools. The final version
of the analysis
will eventually be submitted to the \grid and will access large
portions or even
the totality of the ESDs. The results may be registered in the \grid file
catalog and used at later stages of the analysis.
This activity may or may not be coordinated inside
the PWGs, via the definition of priorities. The
chaotic analysis is carried out within the computing resources of the
physics groups.
% -----------------------------------------------------------------------------

\subsection{Infrastructure tools for distributed analysis}

The main infrastructure tools for distributed analysis have been
described in Chapter~3 of the Computing TDR~\cite{CompTDR}. The actual
middleware is hidden by an interface to the \grid,
gShell~\cite{CH6Ref:gShell}, which provides a
single working shell.
The gShell package contains all the commands a user may need for file
catalog queries, creation of sub-directories in the user space,
registration and removal of files, job submission and process
monitoring. The actual \grid middleware is completely transparent to
the user.

The gShell overcomes the scalability problem of direct client
connections to databases. All clients connect to the
gLite~\cite{CH6Ref:gLite} API
services. This service is implemented as a pool of preforked server
daemons, which serve single-client requests. The client-server
protocol implements a client state, which is represented by a current
working directory, a client session ID, and a time-dependent symmetric
cipher on both ends to guarantee client privacy and security. The
server daemons execute client calls with the identity of the connected
client.
\subsubsection{PROOF -- the Parallel ROOT Facility}

The Parallel ROOT Facility, PROOF~\cite{CH6Ref:PROOF}, has been specially
designed and developed
to allow the analysis and mining of very large data sets, minimizing the
response time. It makes use of the inherent parallelism in event data
and implements an architecture that optimizes I/O and CPU utilization
in heterogeneous clusters with distributed storage. The system
provides transparent and interactive access to terabyte-scale data
sets. Being part of the ROOT framework, PROOF inherits the benefits of
a high-performance object storage system and a wealth of statistical and
visualization tools.
The most important design features of PROOF are:

\item transparency -- there is no difference between a local ROOT session and
  a remote parallel PROOF session;
\item scalability -- there are no implicit limitations on the number of computers
  used in parallel;
\item adaptability -- the system is able to adapt to variations in the
  remote environment.

PROOF is based on a multi-tier architecture: the ROOT client session,
the PROOF master server, optionally a number of PROOF sub-master
servers, and the PROOF worker servers. The user connects from the ROOT
session to a master server on a remote cluster, and the master server
creates sub-masters and worker servers on all the nodes in the
cluster. All workers process queries in parallel and the results are
presented to the user as coming from a single server.

PROOF can be run either in a purely interactive way, with the user
remaining connected to the master and worker servers and the analysis
results being returned to the user's ROOT session for further
analysis, or in an `interactive batch' way, where the user disconnects
from the master and workers (see Fig.~\vref{CH3Fig:alienfig7}). By
reconnecting later to the master server, the user can retrieve the
analysis results for that particular
query. This last mode is useful for relatively long-running queries
(several hours) or for submitting many queries at the same time. Both
modes will be important for the analysis of ALICE data.

  \centering
  \includegraphics[width=11.5cm]{picts/alienfig7}
  \caption{Setup and interaction with the \grid middleware of a user
    PROOF session distributed over many computing centers.}
  \label{CH3Fig:alienfig7}
% -----------------------------------------------------------------------------

\subsection{Analysis tools}

This section is devoted to the existing analysis tools in \ROOT and
\aliroot. As discussed in the introduction, some very broad
analysis tasks include the search for rare events (in this case the
physicist tries to maximize the signal-to-background ratio), or
measurements where it is important to maximize the signal
significance. The tools that provide the possibility to apply certain
selection criteria and to find the interesting combinations within
a given event are described below. Some of them are very general and are
used in many different places, for example the statistical
tools. Others are specific to a given analysis.

\subsubsection{Statistical tools}

Several commonly used statistical tools are available in
\ROOT~\cite{ROOT}. \ROOT provides
classes for efficient data storage and access, such as trees
and ntuples. The
ESD information is organized in a tree, where each event is a separate
entry. This allows a chain of the ESD files to be made and
elaborate selector mechanisms to be used in order to exploit the PROOF
services. The tree classes
permit easy navigation, selection, browsing, and visualization of the
data in the branches.

\ROOT also provides histogramming and fitting classes, which are used
for the representation of all the one- and multi-dimensional
distributions, and for the extraction of their fitted parameters. \ROOT provides
an interface to powerful and robust minimization packages, which can be
used directly during some special parts of the analysis. A special
fitting class allows one to decompose an experimental histogram into a
superposition of source histograms.

\ROOT also has a set of sophisticated statistical analysis tools, such as
principal component analysis, robust estimators, and neural networks.
The calculation of confidence levels is provided as well.

Additional statistical functions are included in \texttt{TMath}.
\subsubsection{Calculation of kinematic variables}

The main \ROOT physics classes include 3-vectors and Lorentz
vectors, and operations
such as translation, rotation, and boost. The calculation of
kinematic variables,
such as transverse and longitudinal momentum, rapidity,
pseudorapidity, effective mass, and many others, is provided as well.
\subsubsection{Geometrical calculations}

There are several classes which can be used for the
measurement of the primary vertex: \texttt{AliITSVertexerZ},
\texttt{AliITSVertexerIons}, \texttt{AliITSVertexerTracks}, etc. A fast estimation of the {\it z}-position can be
done by \texttt{AliITSVertexerZ}, which works for both lead--lead
and proton--proton collisions. A universal tool is provided by
\texttt{AliITSVertexerTracks}, which calculates the position and
covariance matrix of the primary vertex based on a set of tracks, and
also estimates the $\chi^2$ contribution of each track. An iterative
procedure can be used to remove the secondary tracks and improve the
precision of the estimate.

Track propagation to the primary vertex (inward) is also provided.

The secondary vertex reconstruction in the case of ${\rm V^0}$s is provided by
\texttt{AliV0vertexer}, and in the case of cascade hyperons by
\texttt{AliCascadeVertexer}.
\texttt{AliITSVertexerTracks} can be used to find secondary
vertexes close to the primary one, for example decays of open charm
like ${\rm D^0 \to K^- \pi^+}$ or ${\rm D^+ \to K^- \pi^+ \pi^+}$. All
the vertex
reconstruction classes also calculate the distance of closest approach (DCA)
between the track and the vertex.
The calculation of impact parameters with respect to the primary vertex
is done during the reconstruction and the information is available in
\texttt{AliESDtrack}. It is then possible to recalculate the
impact parameter during the ESD analysis, after an improved determination
of the primary vertex position using reconstructed ESD tracks.

\subsubsection{Global event characteristics}

The impact parameter of the interaction and the number of participants
are estimated from the energy measurements in the ZDC. In addition,
the information from the FMD, PMD, and T0 detectors is available. It
gives a valuable estimate of the event multiplicity at high rapidities
and permits global event characterization. Together with the ZDC
information it improves the determination of the impact parameter,
number of participants, and number of binary collisions.

The event plane orientation is calculated by the \texttt{AliFlowAnalysis} class.

\subsubsection{Comparison between reconstructed and simulated parameters}

The comparison between the reconstructed and simulated parameters is
an important part of the analysis. It is the only way to estimate the
precision of the reconstruction. Several example macros exist in
\aliroot and can be used for this purpose: \texttt{AliTPCComparison.C},
\texttt{AliITSComparisonV2.C}, etc. As a first step in each of these
macros the list of so-called `good tracks' is built. The definition of
a good track is explained in detail in the ITS\cite{CH6Ref:ITS_TDR} and
TPC\cite{CH6Ref:TPC_TDR} Technical Design
Reports. The essential point is that the track
goes through the detector and can be reconstructed. Using the `good
tracks' one then estimates the efficiency of the reconstruction and
the resolution.

Another example is specific to the MUON arm: the \texttt{MUONRecoCheck.C}
macro compares the reconstructed muon tracks with the simulated ones.

There is also the possibility to calculate the resolutions directly, without
additional requirements on the initial track. One can use the
so-called track label and retrieve the corresponding simulated
particle directly from the particle stack (\texttt{AliStack}).

\subsubsection{Event mixing}

One particular analysis approach in heavy-ion physics is the
estimation of the combinatorial background using event mixing. Part of the
information (for example the positive tracks) is taken from one
event, another part (for example the negative tracks) is taken from
a different, but
`similar', event. The event `similarity' is very important, because
only in this case do the combinations produced from different events
represent the combinatorial background. Typically `similar' in
the example above means with the same multiplicity of negative
tracks. One may in addition require similar impact parameters of the
interactions, rotation of the tracks of the second event to adjust the
event plane, etc. The possibility of event mixing is provided in
\aliroot by the fact that the ESD is stored in trees, so that one can chain
and access many ESD objects simultaneously. The first pass would then
be to order the events according to the desired criterion of
`similarity' and to use the obtained index for accessing the `similar'
events in the embedded analysis loops. An example of event mixing is
shown in Fig.~\ref{CH6Fig:phipp}. The background distribution has been
obtained using `mixed events'. The signal distribution has been taken
directly from the \MC simulation. The `experimental distribution' has
been produced by the analysis macro and decomposed as a
superposition of the signal and background histograms.
\begin{figure}[th]
 \centering
 \includegraphics*[width=120mm]{picts/phipp}
 \caption{Mass spectrum of the ${\rm \phi}$ meson candidates produced
 inclusively in the proton--proton interactions.}
 \label{CH6Fig:phipp}
\end{figure}

\subsubsection{Analysis of the High-Level Trigger (HLT) data}

This is a specific analysis which is needed in order to adjust the cuts
in the HLT code, or to estimate the HLT
efficiency and resolution. \aliroot provides a transparent way of doing
such an analysis, since the HLT information is stored in the form of ESD
objects in a parallel tree. This also helps in the monitoring and
visualization of the results of the HLT algorithms.

\subsubsection{EVE -- Event Visualization Environment}

EVE is composed of:
\begin{itemize}
\item small application kernel;
\item graphics classes with editors and OpenGL renderers;
\item CINT scripts that extract data, fill graphics classes and register
  them to the application.
\end{itemize}

The framework is still evolving ... some things might not work as expected.

\underline{Usage}
\begin{enumerate}
\item Initialize the ALICE environment.
\item Spawn the `alieve' executable and invoke the alieve\_init.C macro,
  for example:

To load the first event from the current directory:
\begin{verbatim}
 # alieve alieve_init.C
\end{verbatim}
To load the 5th event from the directory /data/my-pp-run:
\begin{verbatim}
 # alieve 'alieve_init.C("/data/my-pp-run", 5)'
\end{verbatim}
To start without arguments and load the macro from the ROOT prompt:
\begin{verbatim}
 # alieve
 root[0] .L alieve_init.C
 root[1] alieve_init("/somedir")
\end{verbatim}
\item Use the GUI or the CINT command line to invoke further visualization macros.
\item To navigate the events use the macros `event\_next.C' and `event\_prev.C'.
  These are equivalent to the command-line invocations:
\begin{verbatim}
 root[x] Alieve::gEvent->NextEvent()
\end{verbatim}
or
\begin{verbatim}
 root[x] Alieve::gEvent->PrevEvent()
\end{verbatim}
The general form to go to an event via its number is:
\begin{verbatim}
 root[x] Alieve::gEvent->GotoEvent(<event-number>)
\end{verbatim}
\end{enumerate}

See files in EVE/alice-macros/. For specific uses these should be
edited to suit your needs.

\underline{Directory structure}

EVE is split into two modules: REVE (ROOT part, not dependent on
AliROOT) and ALIEVE (ALICE-specific part). For the time being both
modules are kept in AliROOT CVS.

Alieve/ and Reve/ -- sources\\
macros/ -- macros for bootstrapping and internal steering\\
alice-macros/ -- macros for ALICE visualization\\
alice-data/ -- data files used by the ALICE macros\\
test-macros/ -- macros for tests of specific features; usually one needs
  to copy and edit them\\
bin/, Makefile and make\_base.inc are used for the stand-alone build of the
modules.

\underline{Known problems}
\begin{itemize}
\item Problems with macro execution.

A failed macro execution can leave CINT in a poorly defined state that
prevents further execution of macros. For example:
\begin{verbatim}
 Exception Reve::Exc_t: Event::Open failed opening ALICE ESDfriend from
 '/alice-data/coctail_10k/AliESDfriends.root'.

 root [1] Error: Function MUON_geom() is not defined in current scope :0:
 *** Interpreter error recovered ***
 Error: G__unloadfile() File "/tmp/MUON_geom.C" not loaded :0:
\end{verbatim}
`gROOT$\to$Reset()' helps in most of the cases.
\end{itemize}

% ------------------------------------------------------------------------------

\subsection{Existing analysis examples in \aliroot}

There are several dedicated analysis tools available in \aliroot. Their results
were used in the Physics Performance Report and described in
ALICE internal notes. There are two main classes of analysis: the
first one based directly on the ESD, and the second one first extracting
AODs and then analyzing them.

\item\textbf{ESD analysis}

 \begin{itemize}
 \item[ ] \textbf{${\rm V^0}$ and cascade reconstruction/analysis}

 The ${\rm V^0}$ candidates
 are reconstructed during the combined barrel tracking and stored in
 the ESD object. The following criteria are used for the selection:
 minimal-allowed impact parameter (in the transverse plane) for each
 track; maximal-allowed DCA between the two tracks; maximal-allowed
 cosine of the
 ${\rm V^0}$ pointing angle
 (the angle between the momentum vector of the particle combination
 and the line connecting the production and decay vertexes); minimal
 and maximal radius of the fiducial volume; maximal-allowed ${\rm
 \chi^2}$. The
 last criterion requires the covariance matrix of the track parameters,
 which is available only in \texttt{AliESDtrack}. The reconstruction
 is performed by \texttt{AliV0vertexer}. This class can also be used
 in the analysis. An example of reconstructed kaons taken directly
 from the ESDs is shown in Fig.~\ref{CH6Fig:kaon}.
 \begin{figure}[th]
 \centering
 \includegraphics*[width=120mm]{picts/kaon}
 \caption{Mass spectrum of the ${\rm K_S^0}$ meson candidates produced
 inclusively in the \mbox{Pb--Pb} collisions.}
 \label{CH6Fig:kaon}
 \end{figure}
 The cascade hyperons are reconstructed using a ${\rm V^0}$ candidate and a
 `bachelor' track selected according to the cuts above. In addition,
 one requires that the reconstructed ${\rm V^0}$ effective mass belongs to
 a certain interval centered on the true value. The reconstruction
 is performed by \texttt{AliCascadeVertexer}, and this class can be
 used in the analysis.
 \item[ ] \textbf{Open charm}

 This is the second elaborated example of ESD
 analysis. There are two classes, \texttt{AliD0toKpi} and
 \texttt{AliD0toKpiAnalysis}, which contain the corresponding analysis
 code. The decay under investigation is ${\rm D^0 \to K^- \pi^+}$ and its
 charge conjugate. Each ${\rm D^0}$ candidate is formed by a positive and
 a negative track, selected to fulfill the following requirements:
 minimal-allowed track transverse momentum, minimal-allowed track
 impact parameter in the transverse plane with respect to the primary
 vertex. The selection criteria for each combination include
 maximal-allowed distance of closest approach between the two tracks,
 decay angle of the kaon in the ${\rm D^0}$ rest frame within a given region,
 product of the impact parameters of the two tracks larger than a given value,
 and pointing angle between the ${\rm D^0}$ momentum and flight line smaller than
 a given value. The particle
 identification probabilities are used to reject the wrong
 combinations, namely ${\rm (K,K)}$ and ${\rm (\pi,\pi)}$, and to enhance the
 signal-to-background ratio at low momentum by requiring the kaon
 identification. All proton-tagged tracks are excluded before the
 analysis loop on track pairs. More details can be found in
 Ref.\cite{CH6Ref:Dainese}.
 \item[ ] \textbf{Quarkonia analysis}

 Muon tracks stored in the ESD can be analyzed, for example, by the macro
 \texttt{MUONmassPlot\_ESD.C}.
 This macro performs an invariant-mass analysis of muon unlike-sign pairs
 and calculates the combinatorial background.
 Quarkonia \pt and rapidity distributions are built for \Jpsi and \Ups.
 This macro also performs a fast single-muon analysis: \pt,
 rapidity, and
 ${\rm \theta}$ vs ${\rm \varphi}$ acceptance distributions for positive
 and negative muon
 tracks with a maximal-allowed ${\rm \chi^2}$.
 \end{itemize}

 % \newpage
\item\textbf{AOD analysis}

{\bf OBSOLETE}

 Often only a small subset of the information contained in the ESD
 is needed to perform an analysis. This information
 can be extracted and stored in the AOD format in order to reduce
 the computing resources needed for the analysis.

 The AOD analysis framework implements a set of tools such as data readers,
 converters, cuts, and other utility classes.
 The design is based on two main requirements: flexibility and a common
 AOD particle interface. This guarantees that several analyses can be
 done in sequence within the same computing session.

 In order to fulfill the first requirement, the analysis is driven by the
 `analysis manager' class, and particular analyses are added to it.
 It performs the loop over events, which are delivered by a
 user-specified reader. This design allows the analyses to be ordered
 appropriately if some of them depend on the results of the others.

 The cuts are designed to provide high flexibility
 and performance. A two-level architecture has been adopted
 for all the cuts (particle, pair, and event). A class representing a cut
 has a list of `base cuts'. Each base cut implements a cut on a
 single property or performs a logical operation (and, or) on the results of
 other base cuts.

 A class representing a pair of particles buffers all the results,
 so they can be re-used if required.

 \vspace{-0.2cm}
 \begin{itemize}
 \item[ ] \textbf{Particle momentum correlations (HBT) -- HBTAN module}

 Particle momentum correlation analysis is based on the event-mixing technique.
 It allows one to extract the signal by dividing the appropriate
 particle spectra coming from the original events by those from the
 mixed events.

 Two analysis objects are currently implemented to perform the mixing:
 the standard one and the one implementing the Stavinsky
 algorithm\cite{CH6Ref:Stavinsky}. Others can easily be added if needed.

 An extensive hierarchy of function base classes has been implemented,
 facilitating the creation of new functions.
 A wide set of correlation, distribution and monitoring
 functions is already available in the module. See Ref.\cite{CH6Ref:HBTAN}
 for the details.

 The package contains two implementations of weighting algorithms, used
 for correlation simulations (the first developed by Lednicky
 \cite{CH6Ref:Weights} and the second due to CRAB \cite{CH6Ref:CRAB}), both
 based on a uniform interface.

 \item[ ] \textbf{Jet analysis}

 The jet analysis\cite{CH6Ref:Loizides} is available in the module JETAN. It has a set of
 readers of the form \texttt{AliJetParticlesReader<XXX>}, where \texttt{XXX}
 = \texttt{ESD},
 \texttt{HLT}, \texttt{KineGoodTPC}, \texttt{Kine}, derived from the base class
 \texttt{AliJetParticlesReader}. These
 provide a uniform interface to
 the information from the
 kinematics tree, from the HLT, and from the ESD. The first step in the
 analysis is the creation of an AOD object: a tree containing objects of
 type \texttt{AliJetEventParticles}. The particles are selected using a
 cut on the minimal-allowed transverse momentum. The second analysis
 step consists of jet finding. Several algorithms are available in the
 classes of the type \texttt{Ali<XXX>JetFinder}.
 An example of AOD creation is provided in
 the \texttt{createEvents.C} macro. The usage of jet finders is illustrated in
 the \texttt{findJets.C} macro.

 \item[ ] \textbf{${\rm V^0}$ AODs}

 The AODs for ${\rm V^0}$ analysis contain several additional parameters,
 calculated and stored for fast access. The methods of the class {\tt
 AliAODv0} provide access to all the geometrical and kinematic
 parameters of a ${\rm V^0}$ candidate, and to the ESD information used
 for the calculations.

 \vspace{-0.1cm}
 \item[ ] \textbf{MUON}

 There is also a prototype MUON analysis provided in
 \texttt{AliMuonAnalysis}. It simply fills several histograms, namely
 the transverse momentum and rapidity for positive and negative muons,
 the invariant mass of the muon pair, etc.
 \end{itemize}

\section{Analysis Foundation Library}

{\bf OBSOLETE}

The result of the reconstruction chain is the Event Summary Data (ESD)
object. It contains all the information that may
be useful in {\it any} analysis. In most cases only a small subset
of this information is needed for a given analysis.
Hence, it is essential to provide a framework for analyses, where
the user can extract only the information required and store it in
the Analysis Object Data (AOD) format, to be used in all
further analyses. Proper data preselection speeds up
the computation significantly. Moreover, the interface of the ESD classes is
designed to fulfill the requirements of the reconstruction
code. Unlike the AOD interface, it is inconvenient for most
analysis algorithms. Additionally, the AOD interface can be customized
to the needs of a particular analysis, if required.

We have developed the analysis foundation library, which
provides a skeleton framework for analyses, defines the AOD data format,
and implements a wide set of basic utility classes which facilitate
the creation of individual analyses.
It contains classes that define the following entities:
\begin{itemize}
\item AOD event format
\item Event buffer
\item Particle(s)
\item Pair
\item Analysis manager class
\item Base class for analyses
\item Readers
\item AOD writer
\item Particle cuts
\item Pair cuts
\item Event cuts
\item Other utility classes
\end{itemize}

It is designed to fulfill two main requirements:
\begin{itemize}
\item \textbf{Allow for flexibility in designing individual analyses.}
 Each analysis has its own best-performing solutions. The most trivial example is
 the internal representation of a particle momentum: in some cases the
 Cartesian coordinate system is preferable, and in other cases the cylindrical one.
\item \textbf{All analyses use the same AOD particle interface to access the data.}
 This guarantees that analyses can be chained. It is important when
 one analysis depends on the result of another, so that the latter can
 process exactly the same data without the need for any conversion.
 It also makes it possible to carry out many analyses in the same job;
 consequently, the computation time connected with
 data reading, job submission, etc. can be significantly reduced.
\end{itemize}

% ..
The design of the framework is described in detail below.

% -----------------------------------------------------------------------------
\subsection{AOD event format}

The \texttt{AliAOD} class contains only the information required
for an analysis. It is not only the data format in which events are
stored in files; it is also used internally throughout the package
as a particle container.
Currently it contains a \texttt{TClonesArray} of particles and
data members describing the global event properties.
This class is expected to evolve further as new analyses continue to be
developed and their requirements are implemented.

% -----------------------------------------------------------------------------
\subsection{Particles}

\texttt{AliVAODParticle} is a pure virtual class that defines the particle
interface.
Each analysis is allowed to create its own particle class
if none of the already existing ones meets its requirements.
Of course, it must derive from \texttt{AliVAODParticle}.
However, all analyses are obliged to
use the interface defined in \texttt{AliVAODParticle} exclusively.
If additional functionality is required, an appropriate
method is also added to the virtual interface (as a pure virtual or an empty one).
Hence, any analysis can be run on any AOD, although the processing time
might be longer in some cases (if the internal representation is not
the optimal one).

We have implemented the standard concrete particle class,
called \texttt{AliAODParticle}. The momentum is stored in
Cartesian coordinates, and the class also has data members
describing the production vertex. All the PID information
is stored in two dynamic arrays. The first array contains
probabilities sorted in descending order,
and the second one the corresponding PDG (Particle Data Group) codes.
The PID of a particle is defined by a data member which is
an index in these arrays. This solution allows for faster information
access during analysis and minimizes memory and disk space consumption.
% -----------------------------------------------------------------------------
\subsection{Pairs}

The pair object points to two particles and implements
a set of methods for the calculation of the pair properties.
It buffers calculated values and intermediate
results for performance reasons. This solution applies to
quantities whose computation is time consuming and
also to quantities with a high reuse probability. A
Boolean flag is used to mark the variables already calculated.
To ensure that this mechanism works properly,
the pair always uses its own methods internally,
instead of accessing its variables directly.

The pair object has a pointer to another pair with the swapped
particles. The existence of this feature is connected to
the implementation of the mixing algorithm in the correlation
analysis package: if particle A is combined with B,
the pair with the swapped particles is not mixed.
In non-identical particle analysis the particle order is important, and
a pair cut may reject a pair while the reversed one would be
accepted. Hence, in the analysis the swapped pair is also tried
if the regular one is rejected. In this way the buffering feature is
automatically used for the swapped pair as well.
% -----------------------------------------------------------------------------
\subsection{Analysis manager class and base class}

The {\it analysis manager} class (\texttt{AliRunAnalysis}) drives the whole
process. A particular analysis, which must inherit from the
\texttt{AliAnalysis} class, is added to it.
The user triggers the analysis by calling the \texttt{Process} method.
The manager performs a loop over events, which are delivered by
a reader (a derivative of the \texttt{AliReader} class, described below).
This design makes it possible to chain the analyses in the proper order if any
of them depends on the results of another.

The user can set an event cut in the manager class.
If an event is not rejected, the \texttt{ProcessEvent}
method is executed for each analysis object.
This method requires two parameters, namely pointers to
a reconstructed and a simulated event.

The events have a parallel structure, i.e.\ the corresponding
reconstructed and simulated particles always have the same index.
This allows for easy implementation of an analysis where both
are required, e.g.\ when constructing residual distributions.
It is also very important in correlation simulations
that use the weight algorithm\cite{CH6Ref:Weights}.
By default, the pointer to the simulated event is null,
as it is in experimental data processing.

An event cut and a pair cut can be set in \texttt{AliAnalysis}.
The latter points to two particle cuts, so
an additional particle cut data member is redundant:
the user can set it in the pair cut.

The \texttt{AliAnalysis} class has a feature that allows the user to choose
which data the cuts check:
\begin{itemize}
\item the reconstructed (default),
\item the simulated,
\item both.
\end{itemize}

It has four pointers to methods (data members):
\begin{itemize}
\item \texttt{fkPass1} -- checks a particle; the cut is defined by the
 cut on the first particle in the pair cut data member
\item \texttt{fkPass2} -- as above, but the cut on the second particle is used
\item \texttt{fkPass} -- checks a pair
\item \texttt{fkPassPairProp} -- checks a pair, but only two-particle properties
 are considered
\end{itemize}

Each of them has two parameters, namely pointers to the
reconstructed and simulated particles or pairs.
The user switches the behavior with the
method that sets the above pointers to the appropriate methods.
We have decided to implement
this solution because it performs faster than the simpler one that uses
boolean flags and ``if'' statements. These cuts are used mostly inside
multiply nested loops, and even a small performance gain transforms
into a noticeable reduction of the overall computation time.
In the case of the event cut, the simpler solution was applied.
The \texttt{Rejected} method is always used to check events.
A developer of analysis code must always use this method and
the pointers to methods itemized above in order to benefit from this feature.
% -----------------------------------------------------------------------------
\subsection{Readers}

A reader is the object that provides data for an analysis.
\texttt{AliReader} is the base class that defines a pure virtual
interface.

A reader may stream the reconstructed and/or the
simulated data. Each of them is stored in a separate AOD.
If it reads both, the corresponding reconstructed and
simulated particles always have the same index.

The most important methods for the user are the following:
\begin{itemize}
\item \texttt{Next} -- triggers the reading of the next event. It returns
 0 in case of success and 1 if no more events
 are available.
\item \texttt{Rewind} -- rewinds the reading to the beginning.
\item \texttt{GetEventRec} and \texttt{GetEventSim} -- return
 pointers to the reconstructed and the simulated events, respectively.
\end{itemize}

The base reader class implements functionality for
particle filtering at the reading level. A user can set any
number of particle cuts in a reader, and a particle is
read if it fulfills the criteria defined by any of them.
In particular, a particle's type is never certain, and the readers
are constructed in such a way that all the PID hypotheses (with non-zero
probability) are verified.
In principle, a track can be read with more than one mass
hypothesis. For example, consider a track
which is a pion with 55\% probability and a kaon with 40\% probability,
while the user wants to read
all pions and kaons with PID probabilities higher than
50\% and 30\%, respectively. In such a case two particles
with different PIDs are added to the AOD.
However, both particles have the same Unique Identification
number (UID), so it can easily be checked that they are in fact
the same track.

% Multiple File Sources
\texttt{AliReader} implements a feature that makes it possible to specify and
manipulate multiple data sources, which are read sequentially.
The user can provide a list of directory names where the data are searched for.
The \texttt{ReadEventsFromTo} method allows the user to limit the range of events
that are read
(e.g.\ when only one event out of the hundreds stored in an AOD is of interest).
% Event Buffering