
Page 1: Status of PRS/Muon Activities

D. Acosta, University of Florida

Work toward the “June” HLT milestones; US analysis environment

Page 2:

US CMS S&C and Physics Meeting, July 26, 2002


HLT Milestones

The June HLT milestones are:
- Complete the HLT selection for the high-lumi scenario
- HLT results on B physics
- CPU analysis for the high-lumi selection
- Repeat the online selection for low lumi

Must have results in DAQ TDR by September!

We don’t have these results yet, but the current status and L1 results were reported at the June CMS Week:
- The HLT muon code had severe crashes, infinite loops, and memory leaks that prevented collecting any statistics on our HLT algorithms
- After a monumental debugging effort, the crashes were traced to incorrect use of “ReferenceCounted” objects: the user must never delete one, even if the user performed the new!
- In the L1 results, a rate spike appeared at η = 1.6. New holes in the geometry?
- Problems were “fixed” for the ORCA_6_2_0 release
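The “never delete” rule can be illustrated with a minimal intrusive reference-counting sketch (my own illustrative code, not the actual ORCA `ReferenceCounted` interface): ownership belongs to the smart pointer holding the count, so user code that deletes an object still referenced elsewhere produces exactly the kind of crashes described above.

```cpp
#include <cassert>

// Minimal sketch of intrusive reference counting (illustrative only;
// not the real ORCA class).  The object deletes itself when the last
// reference goes away, so user code must never call delete -- even if
// it performed the new.
class ReferenceCounted {
public:
  ReferenceCounted() : count_(0) {}
  void addReference() { ++count_; }
  void removeReference() {
    if (--count_ == 0) delete this;  // self-deletion on last release
  }
protected:
  virtual ~ReferenceCounted() {}     // protected: "delete obj" through a
                                     // base pointer will not even compile
private:
  unsigned int count_;
};

// Smart pointer that manages the count; the only sanctioned owner.
template <class T>
class RCPtr {
public:
  explicit RCPtr(T* p) : p_(p) { if (p_) p_->addReference(); }
  RCPtr(const RCPtr& o) : p_(o.p_) { if (p_) p_->addReference(); }
  RCPtr& operator=(const RCPtr& o) {
    if (o.p_) o.p_->addReference();  // add first: safe on self-assignment
    if (p_) p_->removeReference();
    p_ = o.p_;
    return *this;
  }
  ~RCPtr() { if (p_) p_->removeReference(); }
  T* operator->() const { return p_; }
  T& operator*() const { return *p_; }
private:
  T* p_;
};
```

In such a scheme a manual `delete` leads to a double deletion once the last smart pointer also releases the object, which matches the crash pattern reported on the slide.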

Page 3:

L1 CSC Rate Spike

- Contributes ~1 kHz to the L1 rate
- Spike occurs in the region of the crack between barrel and endcap
- Traced to an ambiguity in the pT assignment for low-pT muons (or punch-through)
- Fixed in the CSC Track-Finder (but not sure why this is a problem only now)

Page 4:

Not out of the woods…

Tony reports that the Muon RootTreeMaker has a massive memory leak (200–500 kB/event)
- Analysis stopped at CERN (batch nodes were dying)
- But the muon HLT code alone was shown to leak “only” 16 kB/event when released
- So is it because events have more occupancy with pile-up, or because of the jet/tau/pixel code? Still under investigation
- At FNAL, I find a leak of 800 kB/event for Z, and it is lumi dependent (600 kB/event for 2×10^33)
- Nicola Amapane promises some results (fix?) this evening
- Moreover, the DT reconstruction is known to have some deficiencies

So we have a two-fold plan:
- Continue with the current Root Tree production at remote sites to get a set of baseline results for the HLT milestone
  - We already had Pt1, Pt10, and Pt4 (low lumi) done before CERN shut down. Can Fermilab help?
  - Run ~1000 short jobs on Pt4 if the leak is not fixed
- Push hard to get the new HLT reconstruction code written to solve the remaining problems in time for the September deadline
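The “many short jobs” workaround follows from trivial arithmetic. In this sketch the 800 kB/event leak rate is the figure from the slide, while the per-job memory budget is an assumed, illustrative number:

```cpp
#include <cassert>

// How many events can one batch job survive before the leak exhausts
// an assumed memory budget?  The leak rate (kB/event) is the number
// quoted on the slide; the budget is a made-up illustrative figure.
long eventsBeforeMemoryExhausted(long leakKBPerEvent, long budgetMB) {
  return (budgetMB * 1024L) / leakKBPerEvent;
}
```

With an assumed 500 MB budget and the 800 kB/event leak, a job lasts only ~640 events before dying, so a large sample has to be split into on the order of a thousand short jobs.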

Page 5:

Status of New Reconstruction Code

Stefano Lacaprara has authored a new version of the DT reconstruction code
- Corrects some naïve handling of the DT hits, incorrect pulls, new code organization, …
- It turns out the DT reconstruction must know the drift velocity to ~1%

This code has been examined (some bugs fixed) and cleaned up by Bart Van de Vyver (and also by Norbert and Nicola)

Aiming to get new results in August, hopefully with new reconstruction code
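A quick numerical check of why ~1% knowledge of the drift velocity matters (the cell parameters below are typical DT values I have assumed, not numbers from the talk): the reconstructed hit position is x = v_drift · t_drift, so a relative bias in v_drift shifts each hit in proportion to its drift time.

```cpp
#include <cassert>
#include <cmath>

// Position shift caused by a relative bias in the drift velocity:
//   x = v * t   =>   dx = (dv/v) * v * t
double positionShiftUm(double vDriftUmPerNs, double driftTimeNs,
                       double relativeBias) {
  return relativeBias * vDriftUmPerNs * driftTimeNs;
}
```

Assuming v_drift ≈ 55 µm/ns and a maximum drift time of ~400 ns (half-cell ≈ 2 cm), a 1% bias displaces the farthest hits by ~220 µm, i.e. on the order of the single-hit resolution, so a larger miscalibration would dominate the measurement.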

Page 6:

Muon Analysis at Fermilab

Request made to get the Muon Federations copied from CERN
- Pt4 single-muon sample at highest priority; Shafqat predicts it will be copied by Monday
- Pt1, Pt10, W, Z, and t-tbar to follow
- Z (on-peak and above) already available
- Root Trees will be copied as well, when available

US users thus have at least one local choice for an analysis center, in addition to CERN

The mechanism to obtain an FNAL visitor ID and computer accounts remotely works well (thanks, Hans…)

When the Pt4 sample is ready, the PRS/Muon group is interested in running a large number of RootTreeMaker jobs at FNAL
- INFN is still trying to copy the Pt4 tracker digis (0.7 TB)

Page 7:

Florida Prototype Tier-2 Center

Currently host the Z samples in Florida
- But only for local accounts, I think, at the moment; eventually they should be accessible world-wide
- Limited by available disk space; several TB of RAID ordered

Problematic analysis environment
- Although the production environment is working quite well with DAR distributions, the analysis environment (where users can compile and run jobs) is a little unstable
- Some difficulties building code in ORCA6, perhaps traced to using RH6.2 vs. RH6.1 (loader version)
- Need a more standard way to set up the analysis environment
- I think INFN also had some initial difficulties getting ORCA_6_2_0 installed and working
- Should be solved once key people come back from vacation

Page 8:

Side Note…

For better or worse, we are working with a complex set of software for CMS
- Definitely not easy for newcomers to contribute to development or debugging (or to create a DB)
- Case in point: how can a summer student plug a new L2 module into ORCA?
- Many layers to the ORCA software, difficult to navigate, little documentation of the “common” classes
- Sometimes counterintuitive rules must be followed
- The complexity is probably partly intrinsic to ORCA/COBRA, and partly due to inexperienced physicists working in this environment

That being the case, we MUST have professional tools for development and debugging
- Must be able to debug and profile the code, and check for memory leaks, corruption, etc.
- This is standard for CDF, and the reliability of production code there has increased dramatically
- Requires analysis workstations with enough memory to handle these tools

Should start defining a set of validation plots to show problems early in production
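The validation-plot idea could start as small as an automated histogram comparison. Here is a sketch (a hypothetical helper, not existing ORCA or CDF code) that flags a production sample whose distribution drifts away from a reference:

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Per-bin chi-square between a reference histogram and a freshly
// produced one, assuming Poisson-like errors on the reference bins.
double chi2PerBin(const std::vector<double>& ref,
                  const std::vector<double>& prod) {
  double chi2 = 0.0;
  std::size_t used = 0;
  for (std::size_t i = 0; i < ref.size() && i < prod.size(); ++i) {
    if (ref[i] <= 0.0) continue;       // skip empty reference bins
    const double d = prod[i] - ref[i];
    chi2 += d * d / ref[i];
    ++used;
  }
  return used ? chi2 / used : 0.0;
}

// Flag a sample early if it disagrees badly with the reference.
bool validationPasses(const std::vector<double>& ref,
                      const std::vector<double>& prod,
                      double maxChi2PerBin) {
  return chi2PerBin(ref, prod) < maxChi2PerBin;
}
```

Run on the first few files of a production, a check like this could catch problems such as rate spikes or distorted distributions long before full statistics are accumulated.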