Commissioning of the ATLAS Trigger and Data Acquisition System with Single-Beam and
Cosmic Rays
Kostas KORDAS, on behalf of the ATLAS Collaboration
The ATLAS experiment [1][2] is a general-purpose experiment designed to exploit the full discovery potential of the Large Hadron Collider (LHC) at CERN, from exploring the fundamental nature of matter and the basic forces that shape our universe to investigating the space-time structure of our world. With proton bunches crossing at a rate of 40 MHz at the heart of the detector, and about 23 interactions per bunch crossing, a highly selective trigger and data acquisition (TDAQ) system is needed to reduce the expected 10⁹ interactions per second to an acceptable rate of ~200 Hz, without compromising the full physics potential of the experiment.
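As a quick sanity check of these design figures, the required reduction can be worked out directly (a minimal arithmetic sketch in Python; the variable names are purely illustrative):

    # Illustrative arithmetic only, using the design figures quoted above.
    bunch_crossing_rate_hz = 40e6    # 40 MHz bunch-crossing rate
    interactions_per_crossing = 23   # average interactions per crossing at design luminosity
    output_rate_hz = 200             # final rate written to permanent storage
    interaction_rate = bunch_crossing_rate_hz * interactions_per_crossing   # ~9.2e8 per second, i.e. ~10^9
    rejection_factor = bunch_crossing_rate_hz / output_rate_hz              # ~2e5 reduction in crossing rate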
The TDAQ system is composed of a hardware-implemented Level-1 system,
followed by a farm of PCs running quasi-offline software for the
High Level Trigger (HLT) selection, which is performed in two steps:
Level-2 and Event Filter (EF).
The HLT farm is currently composed of ~800 nodes, a third of its final size. Throughout the system, the event selection is based on the reconstruction
of potentially interesting physical objects like electrons, muons, jets, etc. In the HLT (both Level-2 and EF) the event processing proceeds in steps, each including a) feature extraction algorithms for the reconstruction of useful quantities and candidate physical objects, and b) hypothesis algorithms which accept or reject these candidates according to pre-defined criteria. To achieve the earliest possible rejection of background events, steps with high background rejection are executed first.
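A minimal sketch of this step-wise processing with early rejection is given below (the function names, chain structure and example steps are illustrative assumptions, not the actual ATLAS HLT code):

    # Sketch of step-wise HLT processing: each step pairs a feature-extraction
    # algorithm with a hypothesis algorithm, and the event is dropped as soon
    # as one hypothesis fails.
    def run_hlt_chain(event, steps):
        candidates = None
        for feature_extraction, hypothesis in steps:
            # a) reconstruct useful quantities and candidate physical objects
            candidates = feature_extraction(event, candidates)
            # b) accept or reject the candidates according to pre-defined criteria
            if not hypothesis(candidates):
                return False   # early rejection: stop processing this event
        return True            # all steps passed: accept the event

    # Hypothetical chain; cheap steps with high background rejection come first, e.g.
    # steps = [(extract_calo_clusters, cluster_et_hypothesis),
    #          (extract_inner_detector_tracks, track_match_hypothesis)]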
The Level-1 trigger is implemented in custom hardware, uses coarse calorimeter and muon detector information, and reduces the initial event rate of 40 MHz to about 75 kHz (upgradable to 100 kHz) within a time budget of 2.5 μs. Upon an event accept from Level-1, the front-end electronics of the various sub-detectors push the corresponding data into Readout Buffers; the event data remain there and are pulled on demand by the Level-2 trigger and then by the event builder. The Level-2 system fetches, as needed, only information from the geometrical Regions of Interest (RoI) identified by Level-1; the accepted event rate is thereby reduced to ~3 kHz within a latency of ~40 ms. Thanks to the RoI mechanism, only ~2% of the full event data are needed for the decision process at Level-2. The Event Filter always works on the complete event information provided by the event builder, and accepts the final ~200 Hz of events within a latency of a few seconds. While Level-2 reconstructs only localized detector regions, the baseline for the EF is a full offline-like event reconstruction, guided by the findings of Level-2 and using more complete calibration, alignment and magnetic-field data. At the end of the data acquisition chain, the data-logger streams
and indexes the events into different files, according to
the trigger classification.
From there, the data are passed to the
central mass storage facility at CERN. ATLAS records about 1.5 MB of information per event, resulting in ~300 MB/s of data written to permanent storage for further offline analysis.
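The rates, latencies and data volumes quoted above can be collected in a short illustrative sketch (the numbers are the figures from the text; the data structure and print-out are assumptions made only for illustration):

    # Illustrative summary of the three-level trigger chain, using the figures quoted above.
    event_size_mb = 1.5     # recorded event size, MB
    roi_fraction  = 0.02    # ~2% of the event data is read out for the Level-2 decision
    chain = [
        ("Level-1",      75_000, "2.5 μs"),   # custom hardware, coarse calorimeter and muon information
        ("Level-2",       3_000, "~40 ms"),   # software, seeded by Level-1 Regions of Interest
        ("Event Filter",    200, "seconds"),  # software, full offline-like event reconstruction
    ]
    for name, accept_rate_hz, latency in chain:
        print(f"{name:12s} output rate {accept_rate_hz:>6d} Hz, latency {latency}")
    print(f"Level-2 input per event ~ {roi_fraction * event_size_mb:.2f} MB (RoI data only)")
    print(f"Throughput to storage ~ {chain[-1][1] * event_size_mb:.0f} MB/s")   # 200 Hz x 1.5 MB = 300 MB/s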
The LHC startup and the short single-beam run in September 2008 provided a "stress test" of the trigger system. The ATLAS detector started recording events when the first proton beams circulating in the LHC were dumped on collimators near the ATLAS experiment. These events, named "splash events", produced a cascade of pions and muons that were detected by the different sub-detectors of ATLAS, as seen in Fig. 1. After all collimators were opened, the proton beams were allowed to circulate along the LHC tunnel, producing beam-gas events near ATLAS. During this period, event recording in ATLAS was triggered by Level-1 alone. The HLT ran in real time with no rejection: it scrutinized the Level-1 RoI content and, based on it, directed the events into the different recording "streams". The HLT algorithms were then exercised offline on the recorded data soon after the single-beam data were taken, which allowed testing of the reconstruction algorithms in both Level-2 and the EF without any selection bias. Following this period, ATLAS collected cosmic-ray events for detector alignment and calibration purposes. During cosmic-ray data taking, the HLT ran in real time accepting all events, allowing the trigger algorithms to be exercised online. The integrated number of cosmic-ray events recorded by ATLAS versus time, from Sep. 13, 2008 until the end of October 2008, is shown in Fig. 2. The data sample was increased by ~50% during a two-week data-taking session in July 2009.
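A minimal sketch of this "no rejection" streaming mode is shown below (the stream names, RoI type labels and routing logic are illustrative assumptions, not the actual ATLAS trigger menu):

    # Sketch of the HLT in "no rejection" mode: every event is kept, and the
    # Level-1 RoI content is only used to route the event to recording streams.
    def assign_streams(level1_roi_types):
        streams = set()
        if "MU" in level1_roi_types:
            streams.add("L1Muon")
        if "EM" in level1_roi_types or "JET" in level1_roi_types:
            streams.add("L1Calo")
        if not streams:
            streams.add("MinBias")
        return streams   # the event is always recorded, in one or more streams

    # e.g. assign_streams({"MU", "EM"}) -> {"L1Muon", "L1Calo"}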
These periods allowed testing the reconstruction and selection algorithms, as well as the whole framework, with the full-scale ATLAS detector and data acquisition system. The trigger commissioning proceeded in parallel with the commissioning of each sub-detector system, creating a fast-changing environment to which the trigger needed to adapt. Most importantly, the configuration of the system proved flexible enough to follow the rapid changes of the trigger selection menu. Track-finding, muon-identification and calorimeter-reconstruction algorithms were all tested under different conditions, stressing the robustness of the software. The experience gained while running the trigger system online was very valuable for designing and implementing the strategy for the 2009 collision data-taking period. Nevertheless, despite the invaluable feedback from the use of the system in the cosmic-ray and first-beam data-taking periods, the typical system working point (in terms of event rates and data throughputs) was quite far from its design specification. Therefore, dedicated performance, robustness and scalability measurements have also been performed.
In conclusion, this paper describes the trigger design and its innovative features, such as the step-wise reconstruction and the RoI mechanism. The early experience with LHC single-beam running in 2008 is presented, followed by the valuable experience gained in running the DAQ and the trigger reconstruction and event selection in the fast-changing environment of detector commissioning with cosmic-ray data. Results on system performance based on preselected simulated events are also presented. Finally, the readiness of the system for LHC collisions, expected later this year, is discussed.
Figure 1: First event seen in ATLAS.
Figure 2: Integrated cosmic events recorded by ATLAS in 2008.
References
[1] The ATLAS Collaboration, G. Aad et al., "The ATLAS Experiment at the CERN Large Hadron Collider", JINST 3 (2008) S08003.
[2] The ATLAS Collaboration, "Expected Performance of the ATLAS Experiment: Detector, Trigger and Physics", CERN-OPEN-2008-020, Geneva, 2008.