
Available on CMS information server CMS CR -2017/420

The Compact Muon Solenoid Experiment

Mailing address: CMS CERN, CH-1211 GENEVA 23, Switzerland

Conference Report, 13 November 2017 (v4, 07 January 2018)

A modern and versatile data-acquisition package for calorimeter prototypes test-beams: H4DAQ

Andrea Carlo Marini for the CMS Collaboration

Abstract

The upgrade of the calorimeters for the HL-LHC or for future colliders requires an extensive programme of tests to qualify different detector prototypes with dedicated test beams. A common data-acquisition system (called H4DAQ) was developed for the H4 test beam line at the North Area of the CERN SPS in 2014 and it has since been adopted by an increasing number of teams involved in the CMS experiment and AIDA groups. Several different calorimeter prototypes and precision timing detectors have used H4DAQ from 2014 to 2017, and it has proved to be a versatile application, portable to many other beam test environments (the CERN beam lines EA-T9 at the PS, H2 and H4 at the SPS, and the INFN Frascati Beam Test Facility). The H4DAQ is fast, simple, modular and can be configured to support different setups. The different functionalities of the DAQ core software are split into three configurable finite state machines: the data readout, run control, and event builder. The distribution of information and data between the various computers is performed using ZEROMQ (0MQ) sockets. Different plugins are available to read different types of hardware, including VME crates with different types of boards, PADE boards, custom front-end boards and beam instrumentation devices. The raw data are saved as ROOT files, using the CERN C++ ROOT libraries. A Graphical User Interface, based on the Python GTK libraries, is used to operate the H4DAQ and integrated data quality monitoring (DQM), written in C++, allows for fast processing of the events for quick feedback to the user. The 0MQ libraries are available as well for the National Instruments LabVIEW program. This facilitates communication with existing instrumentation and detector control systems, via commands issued by the H4DAQ GUI. The design, functionality and operational experience with the H4DAQ system will be described in this talk.

Presented at CHEF2017: Calorimetry for the High Energy Frontier 2017


Prepared for submission to JINST

H4DAQ: a modern and versatile data-acquisition package for calorimeter prototypes test-beams

Andrea Carlo Marini,^a on behalf of the CMS collaboration

^a Massachusetts Institute of Technology, 77 Massachusetts Avenue, Cambridge, MA, US

E-mail: [email protected]

Abstract: The upgrade of the particle detectors for the HL-LHC or for future colliders requires an extensive program of tests to qualify different detector prototypes with dedicated test beams. A common data-acquisition system, H4DAQ, was developed for the H4 test beam line at the North Area of the CERN SPS in 2014 and it has since been adopted in various applications for the CMS experiment and AIDA project. Several calorimeter prototypes and precision timing detectors have used our system from 2014 to 2017. H4DAQ has proven to be a versatile application and has been ported to many other beam test environments.

H4DAQ is fast, simple, modular and can be configured to support various kinds of setup. The functionalities of the DAQ core software are split into three configurable finite state machines: data readout, run control, and event builder. The distribution of information and data between the various computers is performed using ZEROMQ (0MQ) sockets. Plugins are available to read different types of hardware, including VME crates with many types of boards, PADE boards, custom front-end boards and beam instrumentation devices. The raw data are saved as ROOT files, using the CERN C++ ROOT libraries. A Graphical User Interface, based on the Python GTK libraries, is used to operate the H4DAQ and integrated data quality monitoring (DQM), written in C++, allows for fast processing of the events for quick feedback to the user. As the 0MQ libraries are also available for the National Instruments LabVIEW program, this environment can easily be integrated within H4DAQ applications.

Keywords: H4DAQ, DAQ, H4, H2, VME, ZEROMQ, SPS, PS, BTF, H4DQM, H4GUI, CERN, INFN


The upgrade of the particle detectors for the high luminosity (HL) project of the LHC and for future colliders requires an extensive program of tests to qualify several detector prototypes with dedicated test beams. A common data acquisition system (called “H4DAQ”) was developed [1, 2] for the H4 test beam line at the North Area of the CERN SPS in 2014 and it has since been adopted by an increasing number of teams involved in the CMS experiment [3] and the AIDA-2020 project. So far, the package has been used at several beam facilities: EA-T9 at the CERN PS, H2 and H4 at the CERN SPS, the INFN Frascati Beam Test Facility, and in cosmic-muon table-top laboratory setups. The H4DAQ is fast, simple and modular. It can be configured to support different setups, and it is easily adaptable to several configurations in the test areas. Numerous calorimeter prototypes and precision timing detectors have been tested using H4DAQ from 2014 to 2017. It has proved to be a versatile application, portable to many other beam test environments, and for these reasons H4DAQ has been used in many research and development studies, among which the new calorimeter detectors based on W-CeF3 [4–7] and W-LYSO [8], the timing detectors based on micro-channel plates [9–12], and new sensors based on silicon devices [13].

The core of the DAQ software is organized and split into three configurable finite state machines: the data readout (DR), the run control (RC), and the event builder (EB). These machines are meant to run on conventional desktop computers (PCs) and conventional LAN/IP networks, in order to avoid the costs related to the use of dedicated hardware. All the components of the DAQ are configurable through XML files to provide adaptability to different setups with a convenient user interface.
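As an illustration of this decomposition, each component can be modelled as a small table-driven finite state machine. The sketch below (Python; all state and command names are hypothetical, not taken from the H4DAQ code) shows the idea of a run-control-like machine that only accepts legal transitions:

```python
# Minimal finite-state-machine sketch illustrating the DAQ decomposition.
# State and command names are illustrative, not the actual H4DAQ ones.

class FSM:
    def __init__(self, transitions, state="INIT"):
        self.state = state
        self.transitions = transitions  # (state, command) -> new state

    def handle(self, command):
        key = (self.state, command)
        if key not in self.transitions:
            raise ValueError(f"illegal command {command!r} in state {self.state!r}")
        self.state = self.transitions[key]
        return self.state

# A run-control-like machine: configure the setup, then start/stop runs.
run_control = FSM({
    ("INIT", "configure"): "CONFIGURED",
    ("CONFIGURED", "start_run"): "RUNNING",
    ("RUNNING", "stop_run"): "CONFIGURED",
})

run_control.handle("configure")
run_control.handle("start_run")
print(run_control.state)  # RUNNING
```

In the real system the transition table for each machine would be derived from the XML configuration rather than hard-coded.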

The RC is in charge of controlling and synchronizing the data acquisition, while coordinating the different parts of the DAQ. It communicates with the other machines, with the user through the graphical user interface (GUI) or the command line interface (CLI), and with the accelerator when possible (e.g. SPS status commands).

The system was originally designed for a test setup at the CERN SPS, and therefore the extraction structure and information system of this machine played a significant role in the original H4DAQ design. The SPS issues three status commands: the warning warning extraction (WWE, 1 s before extraction), the warning extraction (WE, 10 ms before extraction), and the extraction end (EE). This status information is read by the RC, which prepares the DAQ for the acquisition, enabling/disabling the triggers and subsequently starting the aggregation of the data, as detailed below. Simulated software versions of these commands are used for pedestal/laser runs, where the end-of-spill signal is emulated after a predefined number of events are collected. The emulation can also be obtained in hardware, using e.g. an Arduino board [14]. The SPS extraction structure consists of two spills of about 5 s every minute, as shown in fig. 1. During the spills the maximum number of events is collected, postponing potentially CPU/disk/network intensive operations, such as the data transfer or the event building, to the idle time available between two consecutive spills.

The distribution of information and data between the computers (DAQ machine, user interface) is performed using ZEROMQ (0MQ) sockets [15]. This choice was driven by the need to have access both to asynchronous network operations for data streaming, and to synchronous sockets for the DAQ internal commands and updates.
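The role of the three SPS status commands can be sketched in a few lines of Python (handler and attribute names are hypothetical, not the actual H4DAQ code): the warning signals arm the triggers, and EE marks the inter-spill gap where the deferred heavy work is executed.

```python
# Sketch of how an RC-like component could react to the SPS status
# commands (WWE ~1 s before extraction, WE ~10 ms before, EE at the end).
# Names and structure are illustrative only.

class SpillCycle:
    def __init__(self):
        self.triggers_enabled = False
        self.deferred = []   # CPU/disk/network-heavy tasks for the gap
        self.log = []

    def on_wwe(self):        # warning warning extraction: prepare the DAQ
        self.log.append("prepare DAQ")

    def on_we(self):         # warning extraction: spill imminent, arm triggers
        self.triggers_enabled = True

    def on_ee(self):         # extraction end: idle time between spills
        self.triggers_enabled = False
        while self.deferred:  # run event building / data transfer now
            task = self.deferred.pop(0)
            self.log.append(task())

cycle = SpillCycle()
cycle.deferred.append(lambda: "event building done")
cycle.on_wwe()
cycle.on_we()
assert cycle.triggers_enabled
cycle.on_ee()
print(cycle.log)  # ['prepare DAQ', 'event building done']
```

The same structure also covers the pedestal/laser case: the emulated end-of-spill simply calls the EE handler after a fixed event count.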

The DR machine handles several boards and crates in order to collect all the information read out by the many hardware components in the setup. Multiple DRs can be located in different places of the acquisition area. The DR receives a hardware trigger, sent e.g. by a trigger telescope located on the beam line, and starts to collect data from all the boards assigned to it. It



Figure 1. Spill structure in the CERN SPS (2014): spills of about 5 s separated by about 30 s of idle time.

communicates with the RC, exchanging status information, to know which operations need to be performed at each moment.

Plugins are available to read different types of hardware, including VME crates with numerous types of boards (CAEN®, LeCroy, including digitizers, ADCs, ...), PADE boards, custom front-end boards (IPBUS [16], TOFPET [17]) and beam instrumentation devices (wire chambers, hodoscopes, scintillators, ...). The DRs are interfaced to the boards using the respective proprietary libraries.
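A plugin mechanism of this kind can be sketched as a registry that maps a board type named in the configuration to a reader class. This is a hypothetical Python illustration (the real H4DAQ plugins are C++ classes wrapping the vendor libraries, and the board-type strings below are invented):

```python
# Sketch of a readout-plugin registry: the configuration names a board
# type, and the DR instantiates the matching reader. Illustrative only.

READERS = {}

def register(board_type):
    """Class decorator adding a reader to the registry."""
    def wrap(cls):
        READERS[board_type] = cls
        return cls
    return wrap

@register("CAEN_V1742")
class DigitizerReader:
    def read_event(self):
        # A real plugin would call the vendor library here.
        return {"board": "CAEN_V1742", "samples": [0.0] * 1024}

@register("WIRE_CHAMBER")
class WireChamberReader:
    def read_event(self):
        return {"board": "WIRE_CHAMBER", "hits": []}

def reader_for(board_type):
    return READERS[board_type]()  # KeyError -> unsupported hardware

event = reader_for("CAEN_V1742").read_event()
print(event["board"])  # CAEN_V1742
```

The benefit of this layout is that supporting a new board only requires adding one registered class, without touching the DR core.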

The EB machine aggregates all the data in a spill collected by the different DRs. A hardware/software busy logic is implemented to coordinate all the machines and avoid the acquisition of overlapping events. When an event is triggered by the trigger system, the whole DAQ status is set to busy, and the acquisition starts. When the data are successfully transferred from the board to the system, each DR unsets its own busy flag. The presence of a busy flag from any machine (DR, RC) vetoes the acquisition of a new trigger, resulting in a unique event interpretation.

Each event carries the timestamp produced by the commercial CPU in order to be able to match the events together. A precision in time differences of 10 µs can be achieved with commercially available computers, and it is sufficient to match the events from the different DRs. Figure 2 (left) shows the time difference recorded between two DRs during one pedestal spill, in units defined by the desktop CPU-timing precision (10 µs), while fig. 2 (right) shows how this information can be used to align the events among three DR machines (red, blue, and green) during one spill, lasting slightly less than 5 s. Currently the main limitation to the acquisition rate is the transfer rate of the data from the boards to the PCs, in particular when one or many channels of the digitizers are read out, which represents the typical usage of the DAQ; in this configuration an event rate of about 1 kHz is reached.

After the event is successfully reconstructed, the raw data are saved as binary files, and on-line tools can convert them to ROOT format, using the CERN C++ ROOT libraries [18], for convenient access to the registered information.
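The timestamp-based matching across DRs can be illustrated with a small sketch: two sorted event streams are paired whenever their timestamps agree within the ~10 µs CPU-clock precision. This is hypothetical logic written for illustration, not the actual event-builder code:

```python
# Sketch of matching events from two data readouts (DRs) by CPU
# timestamp, tolerating clock differences up to ~10 us. Illustrative.

TOLERANCE = 10e-6  # seconds, the quoted CPU-timing precision

def match_events(dr1, dr2, tol=TOLERANCE):
    """Pair events whose timestamps agree within tol (both lists sorted)."""
    pairs, j = [], 0
    for i, t1 in enumerate(dr1):
        # Skip dr2 events that are too early to match t1.
        while j < len(dr2) and dr2[j] < t1 - tol:
            j += 1
        if j < len(dr2) and abs(dr2[j] - t1) <= tol:
            pairs.append((i, j))
            j += 1
    return pairs

dr1 = [0.000000, 0.001000, 0.002000]
dr2 = [0.000004, 0.001008, 0.005000]  # first two agree within 10 us
print(match_events(dr1, dr2))  # [(0, 0), (1, 1)]
```

The third event of each stream stays unmatched, which is exactly the situation the busy logic is designed to prevent in the running system.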

A Graphical User Interface (GUI), based on the Python GTK libraries, is used to operate the H4DAQ; it integrates a web browser for visual access to IP-based webcams, and the data quality monitoring (DQM). The 0MQ libraries are available for the National Instruments LabVIEW program. This facilitates



Figure 2. Time difference recorded between the events of one spill with commercial CPUs, in units of 10 µs, for pedestal events (left). The test shows the possibility to align events from three commercial desktop computers (right).

Figure 3. Graphical User Interface (GUI) used to operate H4DAQ.

communication with existing instrumentation and detector control systems, via commands issued by the H4DAQ GUI. For example, we controlled the detector position on a LabVIEW-based remote-controllable table, allowing the physical translation of the setup without access to the experimental area.

The DQM software, written in C++, allows a fast processing of the events from the raw data format to the ROOT-based data format, and the production of graphical summaries of the latest acquired data. It can be run online on a subset of the total acquired events, and allows for quick feedback to the user. A standard set of plots can be produced for each board type supported by the DAQ; examples are shown in fig. 4.
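Running the DQM on a subset of the acquired events can be sketched as a simple prescale: only every N-th event is unpacked and histogrammed, so the summary plots stay fresh without having to keep up with the full rate. This is an illustrative Python sketch (the real DQM is C++, and the prescale mechanism shown here is an assumption):

```python
# Sketch of online DQM prescaling: monitor every `prescale`-th event so
# the plots update quickly at a fraction of the full rate. Illustrative.

def dqm_sample(events, prescale=10):
    """Yield every `prescale`-th event for monitoring."""
    for i, ev in enumerate(events):
        if i % prescale == 0:
            yield ev

raw_amplitudes = [float(i % 100) for i in range(1000)]  # fake raw events
monitored = list(dqm_sample(raw_amplitudes, prescale=100))
print(len(monitored))  # 10
```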

Finally, an example of acquired pulse shapes is shown in fig. 5 for a silicon-based precision timing prototype and for the W-CeF3 calorimeter detector prototype.

In conclusion, a modern and versatile data-acquisition package has been developed and used for calorimeter and timing detector prototypes, and has since been successfully employed in several beam lines and setups. Numerous acquisition boards are supported and a single DAQ can be split across different physical locations. It is robust, fast, reliable, and modular, and runs on commercial computers.



Figure 4. Example of DQM plots from the hodoscope (left) and the maximum amplitude collected by the digitizer (right).



Figure 5. Example of pulse shapes taken with the Si precision timing detector [13] (left) and with the W-CeF3 calorimeter prototype [6] (right).


References

[1] A. C. Marini, Differential studies of vector boson plus jet and Higgs production with data from the CMS experiment. PhD thesis, ETH Zurich, Zurich, Switzerland, Apr. 2015. DOI: 10.3929/ethz-a-010469567.

[2] M. Peruzzi, Measurement of differential cross sections for diphoton production in pp collisions with the CMS experiment. PhD thesis, ETH Zurich, Zurich, Switzerland, Apr. 2015. DOI: 10.3929/ethz-a-010558087.

[3] CMS collaboration, S. Chatrchyan et al., The CMS experiment at the CERN LHC, JINST 3 (2008) S08004.

[4] R. Becker, G. Dissertori, L. Djambazov, M. Donegà, M. Droge, C. Haller et al., Test beam results with a sampling calorimeter of cerium fluoride scintillating crystals and tungsten absorber plates for calorimetry at the HL-LHC, Nucl. Instrum. Meth. A 824 (2016) 681–683.

[5] R. Becker, V. Candelise, F. Cavallari, I. Dafinei, G. D. Ricca, M. Diemoz et al., Beam test results for a tungsten-cerium fluoride sampling calorimeter with wavelength-shifting fiber readout, JINST 10 (2015) P07002.

[6] Performance of a tungsten-cerium fluoride sampling calorimeter in high-energy electron beam tests, Nucl. Instrum. Meth. A 804 (2015) 79.

[7] L. Bianchini, G. Dissertori, M. Donegà, W. Lustermann, A. Marini, F. Micheli et al., High-energy electron test results of a calorimeter prototype based on CeF3 for HL-LHC applications, in 2015 IEEE Nuclear Science Symposium and Medical Imaging Conference (NSS/MIC), pp. 1–2, Oct. 2015.

[8] M. Lucchini, S. Gundacker, P. Lecoq, A. Benaglia, M. Nikl, K. Kamada et al., Timing capabilities of garnet crystals for detection of high energy charged particles, Nucl. Instrum. Meth. A 852 (2017) 1–9.

[9] A. Barnyakov, M. Barnyakov, L. Brianza, F. Cavallari, V. Ciriolo, D. D. Re et al., Micro-channel plates in ionization mode as a fast timing device for future hadron colliders, JINST 12 (2017) C08014.

[10] A. Y. Barnyakov et al., Response of microchannel plates in ionization mode to single particles and electromagnetic showers, Nucl. Instrum. Meth. A 879 (2018) 6–12, [1707.08503].

[11] A. Barnyakov, M. Barnyakov, L. Brianza, F. Cavallari, V. Ciriolo, D. D. Re et al., Micro-channel plates in ionization mode as a fast timing device for future hadron colliders, JINST 12 (2017) C08014.

[12] L. Brianza et al., Response of microchannel plates to single particles and to electromagnetic showers, Nucl. Instrum. Meth. A 797 (2015) 216–221, [1504.02728].

[13] N. Akchurin, V. Ciriolo, E. Currás, J. Damgov, M. Fernández, C. Gallrapp et al., On the timing performance of thin planar silicon sensors, Nucl. Instrum. Meth. A 859 (2017) 31–36.

[14] Arduino, “Arduino.” www.arduino.cc.

[15] ZeroMQ, “0MQ.” http://zeromq.org/.

[16] C. G. Larrea, K. Harder, D. Newbold, D. Sankey, A. Rose, A. Thea et al., IPbus: a flexible Ethernet-based control system for xTCA hardware, JINST 10 (2015) C02019.

[17] A. D. Francesco, R. Bugalho, L. Oliveira, L. Pacher, A. Rivetti, M. Rolo et al., TOFPET2: a high-performance ASIC for time and amplitude measurements of SiPM signals in time-of-flight applications, JINST 11 (2016) C03042.

[18] R. Brun and F. Rademakers, ROOT: An object oriented data analysis framework, Nucl. Instrum. Meth. A 389 (1997) 81–86.

[19] A. Benaglia, S. Gundacker, P. Lecoq, M. Lucchini, A. Para, K. Pauwels et al., Detection of high energy muons with sub-20 ps timing resolution using L(Y)SO crystals and SiPM readout, Nucl. Instrum. Meth. A 830 (2016) 30–35.

[20] H4DAQ, “H4DAQ.” github.com:cmsromadaq/H4DAQ.

[21] H4DQM, “H4DQM.” github.com:cmsromadaq/H4DQM.

[22] H4GUI, “H4GUI.” github.com:cmsromadaq/H4GUI.