The HLT runs very complex physics tests to look for specific signatures, for instance matching tracks to hits in the muon chambers, or spotting photons through their high energy but lack of charge. Overall, from every one hundred thousand events per second it selects just tens of events, and the remaining tens of thousands are thrown out.
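As a rough illustration of such a signature test, the sketch below applies a "photon-like" predicate to toy events. The event fields, thresholds, and function names are invented for illustration; they are not the actual ALICE HLT selection code.

```python
# Illustrative sketch only: fields and thresholds are invented,
# not the real ALICE HLT selection logic.

def looks_like_photon(cluster):
    """A photon candidate: a high-energy calorimeter cluster
    with no charged track pointing at it (high energy, no charge)."""
    return cluster["energy_gev"] > 10.0 and not cluster["matched_track"]

def select_event(event):
    """Keep the event if any cluster passes the photon signature."""
    return any(looks_like_photon(c) for c in event["clusters"])

events = [
    {"clusters": [{"energy_gev": 25.0, "matched_track": False}]},  # kept
    {"clusters": [{"energy_gev": 3.0, "matched_track": False}]},   # too soft
    {"clusters": [{"energy_gev": 40.0, "matched_track": True}]},   # charged
]
kept = [e for e in events if select_event(e)]
print(len(kept))  # 1
```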
This is achieved by a combination of different techniques, all of which require detailed online event reconstruction:
- Trigger: selecting interesting events based on detailed online analysis of their physics observables.
- Selection: selecting the Regions of Interest (the interesting parts of single events).
- Compression: reducing the event size by advanced data compression without any loss of the contained physics.
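The three techniques above can be pictured as successive data-reduction steps in a pipeline. The sketch below is purely hypothetical: the thresholds, event layout, and function names are invented, and standard `zlib` stands in for the advanced lossless compression.

```python
# Hypothetical sketch of the three reduction steps; names and
# thresholds are invented, zlib stands in for the real compressor.
import zlib

def trigger(event):
    # Step 1 (Trigger): keep only events with an interesting signature.
    return event["max_pt"] > 2.0  # illustrative threshold

def select_roi(event):
    # Step 2 (Selection): keep only the interesting part of the event,
    # e.g. the region around a trigger candidate.
    return {"roi": event["payload"][:100]}

def compress(roi):
    # Step 3 (Compression): lossless compression of what remains.
    return zlib.compress(repr(roi).encode())

def process(event):
    if not trigger(event):
        return None  # event discarded entirely
    return compress(select_roi(event))

event = {"max_pt": 5.0, "payload": bytes(1000)}
out = process(event)
print(out is not None and len(out) < 1000)  # True: event kept, much smaller
```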
The HLT system thus significantly increases the number of collected events that contain the physics signals of interest.
The data from the detectors are read out in parallel according to the natural detector granularity. The detector data are shipped with optical links to the first processing layer where local pattern recognition is done. The results are forwarded to the following processing layers where a global pattern recognition (track finding and sector merging) takes place. The trigger decision is generated in the final layer.
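The layered processing described above can be sketched as a simple chain: per-sector local pattern recognition feeding a global merging step, with the trigger decision at the end. The functions below are toy placeholders, assumed for illustration only, not the real HLT reconstruction code.

```python
# Toy sketch of the hierarchical trigger chain; the "pattern
# recognition" here is deliberately trivial.

def local_pattern_recognition(sector_data):
    """First layer: find track segments within one detector sector."""
    return [hit for hit in sector_data if hit > 0]  # toy cluster finding

def global_pattern_recognition(sector_results):
    """Following layer: merge per-sector segments into global tracks."""
    merged = []
    for segments in sector_results:
        merged.extend(segments)
    return merged

def trigger_decision(tracks, min_tracks=3):
    """Final layer: accept the event if enough tracks were found."""
    return len(tracks) >= min_tracks

sectors = [[1, 0, 2], [0, 3], [4, 5, 0]]  # raw data, one list per sector
# The sectors are processed in parallel in the real system; here we loop.
segments = [local_pattern_recognition(s) for s in sectors]
tracks = global_pattern_recognition(segments)
print(trigger_decision(tracks))  # True: 5 segments found, threshold is 3
```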
For the online event analysis a powerful computing cluster with a few hundred computing nodes is needed. This cluster is built from inexpensive standard components, such as ordinary PCs connected by a standard network.
The HLT cluster is administered fully autonomously with the help of the Computer-Health-And-Remote-Monitoring (CHARM) card, which is plugged into the PCI bus of the host computer. It is a small single-board computer running Linux as its operating system that can monitor and control the host computer. Connections to the CHARM card are possible via its own network, allowing full remote control of the host system.
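The basic idea of such autonomous administration is a watchdog loop: the management card polls its host over its own network and takes action when the host stops responding. The sketch below is a generic, invented illustration of that pattern, not the CHARM card's actual software or API.

```python
# Generic watchdog sketch, assumed for illustration; this is NOT the
# CHARM card's real interface.

def monitor(probe_results, max_failures=3):
    """Count consecutive failed health probes; signal a power-cycle
    when the host has been unresponsive max_failures times in a row."""
    failures = 0
    actions = []
    for alive in probe_results:
        failures = 0 if alive else failures + 1
        if failures >= max_failures:
            actions.append("power-cycle host")
            failures = 0  # start counting again after the reset
    return actions

# Simulated probe history: host healthy, hangs for three polls, recovers.
history = [True, True, False, False, False, True]
print(monitor(history))  # ['power-cycle host']
```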
Read more about the ALICE HLT