
July 7, 2008 SLAC Annual Program Review Page 1

Current ATLAS Program

Charles Young

July 7, 2008 SLAC Annual Program Review Page 2

Outline

* History and context.
* Initial involvements.
  – High Level Trigger (HLT).
  – Pixel.
* Additional activities.
  – Western Tier 2 (WT2).
  – Simulation.
  – 3D Sensors.
  – Remote Monitoring.
  – Production support.
* Personnel, roles and financial data.
* Summary.

July 7, 2008 SLAC Annual Program Review Page 3

History of SLAC’s ATLAS Involvement

* Initial investigations in 2005.
  – Internal discussions at SLAC in spring.
  – Informal meetings with US ATLAS physicists.
  – Met with ATLAS and US ATLAS management in June.
    • Identified the high level trigger and pixel detector as two areas where ATLAS needs help and where SLAC has corresponding interests and strengths.
  – Started work on pixel module testing at LBNL.
* Formal steps to join ATLAS.
  – Submitted application in February 2006.
  – ATLAS Collaboration Board vote in July 2006.
  – Separate applications to join ATLAS HLT and ATLAS pixel.
  – Competition for 2nd round of US ATLAS Tier 2.
* Further engagements in the last two years.
  – Aligned with SLAC’s core competencies and long-term interests.

July 7, 2008 SLAC Annual Program Review Page 4

Context of SLAC’s ATLAS Involvement

* Some smaller institutions are members of ATLAS by affiliation with larger groups like SLAC.
  – University of Iowa.
  – California State University at Fresno.
* Informal arrangements to supervise/mentor non-Stanford students resident at SLAC and at CERN.
* Western Tier 2 provides computing resources and a fully supported ATLAS environment for users.
* Support of ATLAS-wide activities, e.g.
  – First ATLAS Physics Workshop of the Americas in 2007.
  – Jet/Energy Scale workshop in 2008.
* Coordination with and support of university groups.
  – Office space.
  – Common interests and joint activities.

July 7, 2008 SLAC Annual Program Review Page 5

Outline

* History and context.
* Initial involvements.
  – High Level Trigger (HLT).
  – Pixel.
* Additional activities.
  – Western Tier 2 (WT2).
  – Simulation.
  – 3D Sensors.
  – Remote Monitoring.
  – Production support.
* Personnel, roles and financial data.
* Summary.

July 7, 2008 SLAC Annual Program Review Page 6

High Level Trigger

* Details in talks by Rainer Bartoldus and Ignacio Aracena.
* HLT configuration.
  – Delivering a total of O(0.1–1) TB of data to O(2000) nodes x 8 clients/cores per node is not possible with a single server.
  – DbProxy solution is hierarchical and scalable (a toy sketch of the idea follows this slide).
  – Rainer Bartoldus, Sarah Demers, Amedeo Perazzo, Andy Salnikov, Su Dong.
* HLT algorithms: Jet, Missing ET, Tau and b-jet slices.
* Partial event building.
  – Calibration or alignment events require information from only part of ATLAS.
  – Can afford much greater rate.
  – Code implemented and validated; now working on monitoring.
  – Ignacio Aracena, Anson Hook.
* Online beam spot measurement.
  – Critical for certain trigger algorithms such as b-tagging.
  – Useful to LHC.
  – Rainer Bartoldus, David Miller, Su Dong.
* Operational responsibilities.
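The DbProxy layout can be pictured as a tree of caching proxies sitting between the configuration database and the HLT nodes, so the origin server is hit a handful of times instead of once per client. The Python toy below is only a sketch of that fan-out idea under invented names (ConfigSource, CachingProxy); it is not the actual DbProxy code.

```python
# Toy model of a hierarchical, caching configuration proxy
# (in the spirit of DbProxy). Hypothetical names throughout.

class ConfigSource:
    """Stands in for the origin configuration database."""
    def __init__(self, data):
        self.data = data
        self.requests = 0          # count hits on the origin

    def get(self, key):
        self.requests += 1
        return self.data[key]

class CachingProxy:
    """Caches answers and forwards misses to its parent (proxy or origin)."""
    def __init__(self, parent):
        self.parent = parent
        self.cache = {}

    def get(self, key):
        if key not in self.cache:
            self.cache[key] = self.parent.get(key)
        return self.cache[key]

# One origin, two mid-tier proxies, 2000 "node" proxies with 8 clients
# each: the origin is hit once per top-level proxy, not once per client.
origin = ConfigSource({"trigger_menu": "physics_v1"})
top = [CachingProxy(origin) for _ in range(2)]
nodes = [CachingProxy(top[i % 2]) for i in range(2000)]
for n in nodes:
    for _ in range(8):                    # 8 clients/cores per node
        n.get("trigger_menu")
print(origin.requests)                    # -> 2, not 16000
```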

July 7, 2008 SLAC Annual Program Review Page 7

Why the Pixel Detector?

* Historical and current interest in silicon sensors.
  – SLD.
  – SiD.
  – LSST.
  – ATLAS B-Layer Replacement.
  – ATLAS Tracker Upgrade.
* Data flow and read-out.
* Interest in physics topics that rely on the pixel detector.
  – b-tagging.

July 7, 2008 SLAC Annual Program Review Page 8

Pixel Detector

* Pixel package lowered into the Pit on 6/25/07.
  – Two weeks after the review last year.
  – Installed on 6/29/07.
* Connections started in February 2008.
  – SCT services and cooling had to be installed first.
  – Connection finished on time in April 2008.
* Sign-off, i.e. certification that the Inner Detector (ID) volume can be sealed.
  – Suspended on 5/1/08 after 6 days due to cooling plant failure.
    • Three out of 6 compressors failed “simultaneously”.
  – Approximately 2/3 of sign-off activities completed.
  – ID volume sealed.
* Plan and schedule.
  – Cooling plant back in operation on 7/21/08.
  – First used for beam pipe bake-out (~10 days).
    • Cooling is an absolute requirement!
  – Then resume pixel sign-off and start commissioning in early August.
    • LHC will start injections at about the same time.
      – Pixel HV interlocked to be off during injection.
      – Incompatible with sign-off and commissioning.
    • One month time estimate will stretch out depending on actual time available.

July 7, 2008 SLAC Annual Program Review Pages 9–15

[Image-only slides.]

Pixel Activities

* Cooling system and monitoring.
* Calibration.
* Digital Signal Processor (DSP).
* High voltage (HV) distribution.
* Track based alignment.

July 7, 2008 SLAC Annual Program Review Page 16

Cooling System and Monitoring

* Evaporative cooling system is mandatory for detector operations (and for beam pipe bake-out) but has presented serious problems.
  – Heater failures last year.
  – Some instability during sign-off in May 2008.
  – Cooling plant failure.
* Detailed study of operational parameters and temperature measurements from sign-off to understand the system better and to find a more stable operating point.
  – Bart Butler, Joe Dipirro (Fresno).
* Implement additional monitoring and alarms (a minimal trend-check sketch follows this slide).
  – Lawrence Carlson (Fresno).
* Long term trends of operational and environmental conditions.
  – Prafulla Behera (Iowa), Gerald Rude (Fresno), Steve Wilburn (Fresno), CY.
* Use event display program to display non-event data in geographically accurate format.
  – Jim Black, Tim Nelson, Ben Zastovnick (Fresno).
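As a flavor of the monitoring-and-alarm idea, here is a minimal sketch of a long-term trend check. The channel, window size and temperature limit are invented for illustration; the real system runs on the detector control system, not standalone Python.

```python
# Minimal sketch of a running-trend check with an alarm threshold.
# Window and limit values are illustrative only.
from statistics import mean

def check_trend(samples, window=60, limit_c=-15.0):
    """Alarm if the running mean of the last `window` temperature
    readings (deg C) drifts above `limit_c`."""
    if len(samples) < window:
        return None                      # not enough history yet
    avg = mean(samples[-window:])
    return ("ALARM", avg) if avg > limit_c else ("OK", avg)

readings = [-18.2 + 0.05 * i for i in range(120)]   # toy slow warm-up
print(check_trend(readings))             # -> ('ALARM', ...) once it drifts
```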

July 7, 2008 SLAC Annual Program Review Page 17

Calibration

* Enormous undertaking by many people over a very long time!
* Detector noise studies (see the masking sketch after this slide).
  – Essential to understand noise and mask out bad channels (out of 80 × 10⁶ channels) to avoid overwhelming DAQ bandwidth or compromising track and vertex performance.
* Digital scan analysis.
  – Basic communications with pixel modules.
* Back of Crate card (BOC) tuning algorithm.
  – Optimize operational parameters for robust optical communications between the VME crate and the pixel detector.
* Analysis framework.
  – Track calibration results over time.
  – Compare a calibration with a reference or another calibration.
  – Combine multiple calibration runs into one coherent set of parameters for the entire pixel detector.
* Bart Butler, Claus Horn, Ariel Schwartzman.
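At its core, noise masking means flagging the pixels whose occupancy in randomly triggered events sits far above expectation. The sketch below illustrates that cut on a toy array; the array size, trigger count and occupancy cut are invented, and the real calibration works per front-end with detector conditions attached.

```python
# Sketch of noisy-channel masking: flag pixels whose occupancy in
# random triggers greatly exceeds expectation. Cut value illustrative.
import numpy as np

def noisy_mask(hits, n_triggers, max_occ=1e-5):
    """hits: per-pixel hit counts over n_triggers random triggers.
    Returns a boolean mask of pixels to disable."""
    occupancy = hits / n_triggers
    return occupancy > max_occ

rng = np.random.default_rng(0)
hits = rng.poisson(0.001, size=80_000)   # toy data: mostly quiet pixels
hits[42] = 5000                          # one hot pixel
mask = noisy_mask(hits, n_triggers=1_000_000)
print(mask.sum(), "pixels masked")       # -> 1
```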

July 7, 2008 SLAC Annual Program Review Page 18

Digital Signal Processor

* Pixel read-out system has 132 Read-out Drivers (ROD) in 9 ROD crates.
* Each ROD has 5 DSPs (one Master and four Slaves).
* DSP functions include:
  – Data taking related code, which is relatively stable.
  – Calibration tasks, e.g. threshold scan, noisy pixel scan and so on.
    • Focus of much of the activity and significant progress in the last two years.
    • Work still going on at a furious pace.
* Manpower is slim and declining.
  – Jed Biesiada (LBNL), Alex Schreiner (Iowa) and Matthias Wittgen.
* SLAC’s roles.
  – Improve the software environment to enhance productivity.
  – Maintain test stands and assure their coherence with the CERN setup.
  – Effort hampered by recent developments at CERN introducing incompatibilities with test stands.
* Long-term sustainable manpower needed for pixel operations.
  – Matthias Wittgen will increase his level of involvement.
  – Martin Kocian is joining.
  – New RA may get involved also.
* Data flow is an area SLAC has long participated in, and one where we intend to stay active and take a leading role for ATLAS and other projects.

July 7, 2008 SLAC Annual Program Review Page 19

Outline

* History and context.
* Initial involvements.
  – High Level Trigger (HLT).
  – Pixel.
* Additional activities.
  – Western Tier 2 (WT2).
  – Simulation.
  – 3D Sensors.
  – Remote Monitoring.
  – Production support.
* Personnel, roles and financial data.
* Summary.

July 7, 2008 SLAC Annual Program Review Page 20

Tier 2 Computing Center

* Three sites selected in 2005 by US ATLAS.
* Second call for proposals in 2006.
  – SLAC was one of two sites selected.
* Structure of Western Tier 2 (WT2).
  – Supported by 8 institutions.
    • Lawrence Berkeley National Lab.
    • SLAC.
    • University of Arizona.
    • University of California, Irvine.
    • University of California, Santa Cruz.
    • University of Oregon.
    • University of Washington.
    • University of Wisconsin, Madison.
  – Advisory group with a representative from each institution.
    • Current Chair is Gordon Watts (Washington).
* Functions.
  – Production use via centralized control.
  – User activities, particularly analysis.
* Leverages and continues SLAC’s capabilities and expertise in computing, especially in the area of data-intensive analysis applications.

July 7, 2008 SLAC Annual Program Review Page 21

Simulation

* Formation of the “fast shower” working group by ATLAS ~2 years ago was due largely to our initiative (a toy profile sketch follows this slide).
  – Calorimeter was the dominant consumer of CPU time.
  – Kick-off workshop at SLAC in late 2006.
  – SLAC participation: one rotation student, Bart Butler, Zach Marshall (Columbia), Dennis Wright, CY.
* ATLAS created the Simulation Optimization Group to address other optimization issues.
  – Code speed in other places, but also physics quality.
  – Work mostly delegated to subsystem experts with assistance and advice from SLAC Geant4 experts Makoto Asai and Dennis Wright.
  – Zach Marshall and CY are members.
* Recent engagement in the muon system.
  – Geometry issues.
    • Remove clashes of sensitive detectors as well as dead material.
    • Implement cut-out chambers with complicated geometry.
  – Code optimization.
  – Structural or fundamental improvements.
    • “Stepper dispatcher”, field integration and the AMDB database.
    • Transition radiation process and hadronic physics validation and improvements.
  – Makoto Asai and Dennis Wright.
    • Core developers in Geant4.
  – Mentoring of less experienced people.
* Leveraging of SLAC’s team of Geant4 core developers and experience from BaBar.
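Fast shower simulation replaces step-by-step tracking in the calorimeter with a parameterized energy profile. As a flavor of the idea, the toy below deposits energy layer by layer according to a Gamma-function longitudinal profile (the classic Grindhammer-style form); the α and β values and layer granularity are illustrative, not the ATLAS tuning.

```python
# Toy longitudinal shower profile: dE/dt follows a Gamma distribution
# in depth t (radiation lengths), Grindhammer-style. Parameters here
# are illustrative, not an ATLAS parameterization.
import numpy as np
from scipy.stats import gamma

def deposit_energy(e_gev, n_layers=20, x0_per_layer=1.0, alpha=4.0, beta=0.5):
    """Split a shower's energy across calorimeter layers according to
    a Gamma-distribution longitudinal profile."""
    edges = np.arange(n_layers + 1) * x0_per_layer
    cdf = gamma.cdf(edges, a=alpha, scale=1.0 / beta)
    return e_gev * np.diff(cdf)          # energy per layer

layers = deposit_energy(50.0)
print(layers.argmax(), layers.sum())     # shower-max layer, ~total energy
```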

July 7, 2008 SLAC Annual Program Review Page 22

Overlay

* Average of >20 minimum bias events per beam crossing at design luminosity.
  – Governed by bunch luminosity rather than total beam luminosity.
* Other background sources.
  – Beam halo and beam gas.
  – Cavern background.
  – Adjacent bunches (due to electronics integration time).
* Crucial to include all such effects in simulation, but difficult to achieve from pure simulation.
  – Cross sections not known a priori.
  – Dependency on parameters such as beam conditions, (dynamic) beam pipe vacuum, etc.
* Most reliable method is to record “zero bias” data and overlay it on simulated events (see the sketch after this slide).
* Mike Kelsey and Peter Kim will join Bill Lockman (UCSC) on overlay in the muon system and in validation tools.
* Leverages experience from BaBar.
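Conceptually, overlay merges the recorded detector response of a zero-bias event into the simulated hard-scatter event before reconstruction. A minimal sketch, assuming an invented event representation (a dict of channel id to ADC counts); the real job must also handle thresholds, saturation and per-channel conditions.

```python
# Minimal sketch of zero-bias overlay: add the recorded background
# response on top of the simulated signal, channel by channel.
# The event format here is invented for illustration.
from collections import Counter

def overlay(simulated, zero_bias):
    """Merge ADC counts from a recorded zero-bias event into a
    simulated event."""
    merged = Counter(simulated)
    merged.update(zero_bias)
    return dict(merged)

sim = {"pix_000123": 40, "lar_884": 210}
zb  = {"pix_000123": 3, "mdt_17": 12}      # pile-up + cavern background
print(overlay(sim, zb))                    # pix_000123 -> 43, etc.
```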

July 7, 2008 SLAC Annual Program Review Page 23

3D Sensor

* 3D pixel sensor is a candidate for B-layer replacement and/or tracker upgrade.
  – Radiation hard due to electrode proximity.
  – Active edge allows for butt joints.
* Beam tests.
  – Three 3D sensors.
    • Two non-irradiated sensors.
    • One irradiated sensor (10¹⁵ p/cm²).
    • ATLAS front-end chip.
  – External tracking with three sets of (x,y) silicon strip detectors.
  – Just concluded a 3-week run.
    • Bart Butler, David Miller, CY.
  – Analysis to determine efficiency and resolution near the sensor edges (see the sketch after this slide).
  – Next run scheduled for November.
* Leverages our connections to Stanford University (where 3D sensors were first fabricated and continue to be developed), the new silicon lab at SLAC, and a close working relationship with LBNL.
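The edge-efficiency analysis amounts to extrapolating telescope tracks to the sensor plane and asking, bin by bin in the local coordinate, whether a matched hit was found. A sketch with made-up arrays and bin sizes, not the actual beam-test analysis code:

```python
# Sketch of an edge-efficiency measurement: fraction of telescope
# tracks with a matched sensor hit, binned in distance from the edge.
import numpy as np

def edge_efficiency(track_x_mm, matched, bin_mm=0.05, span_mm=1.0):
    """track_x_mm: track impact position relative to the sensor edge.
    matched: bool array, True if a sensor hit matched the track."""
    edges = np.arange(0.0, span_mm + bin_mm, bin_mm)
    idx = np.digitize(track_x_mm, edges) - 1
    eff = np.empty(len(edges) - 1)
    for i in range(len(eff)):
        sel = idx == i
        eff[i] = matched[sel].mean() if sel.any() else np.nan
    return edges, eff

rng = np.random.default_rng(1)
x = rng.uniform(0, 1.0, 10000)                           # toy tracks
hit = rng.random(10000) < np.where(x < 0.1, 0.5, 0.98)   # toy turn-on
edges, eff = edge_efficiency(x, hit)
print(eff[:4])                           # lower efficiency near the edge
```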

July 7, 2008 SLAC Annual Program Review Page 24

Remote Monitoring

* Routine shift role by remote collaborators.
  – Reduce travel time and costs.
  – Take on workload during normal waking hours.
  – Implementation must be affordable at remote sites.
* Experts cannot all be resident at CERN all the time.
  – May be on travel and have a laptop only.
* Decided to use the ATLAS Control Room (ACR) tools and export the interface and display.
* NX implementation first tested by Eric Feng (Chicago).
  – Peter Kim tested it at SLAC during a recent weekend combined run.
    • Performance is generally acceptable.
    • Start-up is a little slow.
    • Scatterplots can be very slow if not properly handled.
  – Working with other US groups who want to test.
* Plan to install a high-functionality station at SLAC.
  – Available for use not only by SLAC physicists but also by ATLAS collaborators, whether normally resident at SLAC or visiting.

July 7, 2008 SLAC Annual Program Review Page 25

Production Support

* ATLAS production currently runs (0.5–1) M jobs per month, and already has ~15M jobs in its DB.
* We estimate it will go up by (at least) 100x.
* Production DB is needed to bridge requests and job executions.
* Must be reliable and scalable to track all jobs (a bookkeeping sketch follows this slide).
  – Which ones have finished successfully?
  – Which ones need to be re-run?
  – Which ones should be aborted?
  – What are the worst production problems?
    • Cannot fix problems that we cannot identify!
* Douglas Smith will help re-structure the ATLAS Production DB.
* Takes advantage of the experience from BaBar, and continues our involvement in data management.
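The bookkeeping questions above map naturally onto indexed queries against a job-state table. The sketch below uses SQLite with an invented minimal schema, purely to illustrate the kind of tracking involved; the real Production DB schema is far larger and richer.

```python
# Sketch of production bookkeeping with an invented minimal schema.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE jobs (
    id INTEGER PRIMARY KEY, task TEXT, state TEXT, error TEXT)""")
db.execute("CREATE INDEX jobs_state ON jobs(state)")  # scalable lookups

db.executemany("INSERT INTO jobs(task, state, error) VALUES (?, ?, ?)", [
    ("mc08.zee",   "finished", None),
    ("mc08.zee",   "failed",   "stage-out timeout"),
    ("mc08.ttbar", "failed",   "stage-out timeout"),
    ("mc08.ttbar", "running",  None)])

# Which jobs need to be re-run?
print(db.execute("SELECT id FROM jobs WHERE state='failed'").fetchall())
# What are the worst production problems?
print(db.execute("""SELECT error, COUNT(*) AS c FROM jobs
                    WHERE state='failed' GROUP BY error
                    ORDER BY c DESC""").fetchall())
```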

July 7, 2008 SLAC Annual Program Review Page 26

Outline

* History and context.
* Initial involvements.
  – High Level Trigger (HLT).
  – Pixel.
  – Western Tier 2 (WT2).
* Additional activities.
  – Simulation.
  – 3D Sensors.
  – Remote Monitoring.
  – Production support.
* Personnel, roles and financial data.
* Summary.

July 7, 2008 SLAC Annual Program Review Page 27

SLAC ATLAS Personnel

* RA: Ignacio Aracena, Sarah Demers, Claus Horn, Paul Jackson. [4]
* Students: Jim Black, Bart Butler, David Miller, Dan Silverstein, Anson Hook. [5]
* Engineers: David Nelson, Marco Oriunno. [2]
* Staff: Makoto Asai, Tim Barklow, Rainer Bartoldus, Mike Kelsey, Peter Kim, Martin Kocian, Tim Nelson, Amedeo Perazzo, Andy Salnikov, Douglas Smith, Matthias Wittgen, Dennis Wright, CY. [13]
* Faculty: Ariel Schwartzman, Su Dong. [2]

July 7, 2008 SLAC Annual Program Review Page 28

Roles in ATLAS

* B-Layer Replacement Task Force.
* DAQ Steering Group Scientific Secretary.
* Collaboration Council Chair Advisory Group.
  – Also functioned as Spokesperson Nominating Committee.
* DAQ Partial Event Build.
* ETmiss Validation coordinator of the jet/ETmiss combined performance group.
* HLT DbProxy responsible.
* HLT Release Validation Manager.
* Pixel Monitoring Coordinator (till 6/08).
* Pixel Run Coordinator.
* Referees for CSC Notes:
  – Soft Muons and b-tagging.
  – Jet Algorithms.
  – In-situ Calibration.

July 7, 2008 SLAC Annual Program Review Page 29

Evolution of ATLAS budget: FY07-FY10

[Bar chart: ATLAS FY 2007–2010 total M$ by cost type (Labor, M&S, Allocation of PPA DPS) for FY07–FY10; vertical axis 0–12 M$.]

[Bar chart: ATLAS FY 2007–2010 total M$ by activity (Research, Operation, Upgrade, ATLAS Tier 2, Allocation of PPA DPS) for FY07–FY10; vertical axis 0–12 M$.]

July 7, 2008 SLAC Annual Program Review Page 30

ATLAS manpower and budget: FY08

FY 2008 FTE by Job Category (ATLAS):
  – Engineer / Computing Professional: 5.2
  – Permanent PhD: 5.5
  – Temporary PhD: 3.4
  – Graduate Students: 0.8
  – Administrative / Technician: 0.6
  – Other: 0.0
  Total FTE: 15.6

FY 2008 Total M$ by Activity (ATLAS):
  – Research: 3.9
  – ATLAS Tier 2: 0.6
  – Allocation of PPA DPS: 0.5
  – Upgrade: 0.0
  Total M$ of ATLAS: 4.5

July 7, 2008 SLAC Annual Program Review Page 31

ATLAS manpower and budget: FY10

FY 2010 FTE by Job Category (ATLAS):
  – Engineer / Computing Professional: 9.6
  – Permanent PhD: 9.3
  – Temporary PhD: 11.0
  – Graduate Students: 0.9
  – Administrative / Technician: 0.7
  – Other: 0.0
  Total FTE: 31.5

FY 2010 Total M$ by Activity (ATLAS):
  – Research: 3.2
  – Operation: 2.7
  – Upgrade: 2.2
  – ATLAS Tier 2: 0.6
  – Allocation of PPA DPS: 0.8
  Total M$ of ATLAS: 8.7

July 7, 2008 SLAC Annual Program Review Page 32

Summary

* SLAC joined ATLAS 2 years ago.
  – Started in HLT and pixels.
  – Involvement has grown since that time.
* Key contributions to a number of areas.
  – Also see Ariel’s talk on physics.
* Significant roles in the collaboration.
* Support of US institutions.
  – Western Tier 2.
  – Affiliation of smaller groups.
  – Support of people resident at/near SLAC.
* Expect to grow in size of group and in scope of work.
* Focus on projects and areas consistent with SLAC’s experience, expertise and interest.

July 7, 2008 SLAC Annual Program Review Page 33

Additional Material

July 7, 2008 SLAC Annual Program Review Page 34

Sign-Off Status

|                                     | Q1                              | Q2                              | Q3                                       | Q4                              |
| Modules                             | (13 bistaves + 6 bisectors) 410 | (14 bistaves + 6 bisectors) 436 | (14 bistaves + 6 bisectors) 436          | (15 bistaves + 6 bisectors) 462 |
| Config modules for cooling sign-off | 410                             | 436                             | 206 (7 bistaves and 4 bisectors missing) | 462                             |
| Threshold Scan                      | 356                             | 316                             | --                                       | 368                             |
| Results                             | 242                             | 191                             | --                                       | 271                             |
| BOC tuning                          | --                              | Done on failing                 | --                                       | Done on failing                 |
| Threshold Scan                      | --                              | Not analyzed (by me)            | --                                       | All but four modules            |

87% / 60% / 40%

July 7, 2008 SLAC Annual Program Review Page 35

Vertical Heater in Pixel

July 7, 2008 SLAC Annual Program Review Page 36

High Voltage Distribution

* HV distribution and module-level voltage control at Patch Panel 4 (HVPP4).
  – David Miller.
* Significant redesign and rework needed on HVPP4.
* Analyzed vulnerability to noise (an s-curve noise-extraction sketch follows this slide).
  – Inject 1.25 V peak-to-peak signal at HVPP4.
    • 100 Hz to 40 MHz frequency range.
    • For comparison, depletion voltage ~100 V.
  – Readout noise increased from ~180 to ~450 electrons equivalent for the worst case frequency of 600 kHz.
    • Typical operational threshold ~4000 electrons.
  – Data collected by Konstantin Toms (New Mexico) and Michael Koehler (Siegen).
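A noise figure in "electrons equivalent" is conventionally extracted from a threshold scan: the occupancy versus injected charge follows an error function whose width is the noise. A sketch of that standard technique with illustrative numbers, not the pixel calibration code:

```python
# Sketch of noise extraction from a threshold scan: fit the occupancy
# s-curve with an error function whose width is the noise (in e-).
import numpy as np
from scipy.optimize import curve_fit
from scipy.special import erf

def s_curve(q, threshold, noise):
    return 0.5 * (1.0 + erf((q - threshold) / (np.sqrt(2.0) * noise)))

charge = np.linspace(3000, 5000, 41)         # injected charge (e-)
rng = np.random.default_rng(2)
occ = s_curve(charge, 4000.0, 450.0)         # toy "worst-case" data
occ = np.clip(occ + rng.normal(0, 0.01, occ.size), 0, 1)

(thr, sigma), _ = curve_fit(s_curve, charge, occ, p0=(3900.0, 200.0))
print(round(thr), round(sigma))              # ~4000, ~450 e-
```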

July 7, 2008 SLAC Annual Program Review Page 37

[Plots: threshold-scan noise with and without noise injection.]

July 7, 2008 SLAC Annual Program Review Page 38

Track Based Alignment

* Basic approach is to use high-momentum tracks from p-p collisions during normal data taking (a 1-D toy of the idea follows this slide).
  – Adjust module positions and orientations to improve tracking residuals.
* Tracks all coming from a point source lead to degenerate solutions.
  – Such as overall size scale or momentum bias.
  – Luminous region is small compared with the ATLAS trackers.
* Non-collision tracks can provide additional constraints.
  – Cosmic rays: low rate and restricted angles.
  – Beam halo: poor to non-existent illumination of the pixel detector.
  – Beam gas: hope to have very little!
  – Offset interaction point (IP).
* Proposed to offset the interaction point (IP) by one RF bucket, i.e. +/- 37.5 cm.
  – LHC indicated this is not difficult.
  – Investigating what improvements are possible.
    • Bart Butler, Heather Gray (Columbia), CY.
* Trigger algorithms and DAQ bandwidth issues.
  – Use partial event build to minimize data size.
  – Applicable to both normal IP and offset IP.
  – Ignacio Aracena, Anson Hook.
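In its simplest form, track-based alignment is a least-squares problem: choose module offsets that minimize hit residuals. The one-dimensional toy below recovers a single module's transverse offset from straight-line tracks; layer positions, resolutions and track spectra are all invented, and the degenerate modes mentioned above do not appear because the toy tracks have varied slopes.

```python
# 1-D toy of track-based alignment: recover a module's transverse
# offset by averaging track residuals. Not the ATLAS alignment code.
import numpy as np

rng = np.random.default_rng(3)
z = np.array([0.0, 5.0, 10.0])          # layer positions (cm)
true_offset = 0.012                     # misalignment of layer 1 (cm)
n_tracks, sigma_hit = 500, 0.002

resid_sum = 0.0
for _ in range(n_tracks):
    a, b = rng.normal(0, 0.1), rng.normal(0, 0.02)   # track x = a + b*z
    hits = a + b * z + rng.normal(0, sigma_hit, 3)
    hits[1] += true_offset                           # layer 1 misaligned
    # predict layer-1 hit from the two well-aligned layers
    slope = (hits[2] - hits[0]) / (z[2] - z[0])
    pred1 = hits[0] + slope * (z[1] - z[0])
    resid_sum += hits[1] - pred1

print(resid_sum / n_tracks)             # -> ~0.012: the offset to correct
```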

July 7, 2008 SLAC Annual Program Review Page 39

Offset IP for Alignment

July 7, 2008 SLAC Annual Program Review Page 40

Cutout Muon Chambers

* Special chambers with notches or holes cut out to fit in tight spaces.

* Blue: Monitored Drift Tube (MDT).

* Red: Resistive Plate Chamber (RPC).

* Green: cut-away view of detector feet assembly.

July 7, 2008 SLAC Annual Program Review Page 41

[Diagram: Production request (# events, configurations, etc.) → Bamboo (creates Panda commands) → Panda (job execution) → Grid sites (Grid-A … Grid-Z), each with many CPUs. Tasks & jobs flow through the Production DB, and results & status flow back.]

July 7, 2008 SLAC Annual Program Review Page 42

LHC Commissioning Plan

| Parameter             | Phase A      | Phase B    | Phase C    | Nominal |
| k / no. bunches       | 43-156       | 936        | 2808       | 2808    |
| Bunch spacing (ns)    | 2021-566     | 75         | 25         | 25      |
| N (10¹¹ protons)      | 0.4-0.9      | 0.4-0.9    | 0.5        | 1.15    |
| Crossing angle (μrad) | 0            | 250        | 280        | 280     |
| ε/ε_nom               | 2            | 1          | 1          | 1       |
| β* (m, IR1&5)         | 32           | 22         | 16         | 16      |
| L (cm⁻²s⁻¹)           | 6×10³⁰-10³²  | 10³²-10³³  | (1-2)×10³³ | 10³⁴    |

July 7, 2008 SLAC Annual Program Review Page 43