
Page 1

Information Technology and Computing Infrastructure for U.S. LHC Physics

Lothar A.T. Bauerdick, Fermilab
Project Manager, U.S. CMS Software and Computing

Feb 21, 2003, Meeting with the NSF

Page 2

LHC Discovery is Through Software and Computing

LHC Computing: Unprecedented in Scale and Complexity

Page 3

Physics Discoveries by Researchers at U.S. Universities

U.S. LHC is committed to empowering the universities to do research on LHC physics data.

This is why we are interested in the Grid as an enabling technology.

Page 4

Distributed Grid Computing and Data Model is the LHC Baseline Model

R&D and testbeds: prototyping Grid infrastructure, deploying Grid tools, developing Grid applications.

[Figure: the US ATLAS Grid Testbed]

Page 5

Tiered System of Regional Centers

Developing the hierarchically organized fabric of "Grid Nodes" ...

[Diagram: the tiered LHC computing model. The LHC experiment's online system feeds the CERN Computer Center (Tier 0, > 20 TIPS) at 100-200 MBytes/s. Tier 1 regional centers (USA, Korea, Russia, UK) connect to CERN at 2.5 Gbits/s, with backbone links of 2.5-10 Gbits/s; Tier 2 centers connect at ~0.6 Gbits/s and institutes (Tier 3) at 1 Gbits/s. Tier 4 comprises physics caches, PCs, and other portals.]
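To get a rough feel for what these link speeds mean for moving LHC data between tiers, here is a back-of-the-envelope sketch. The 10 TB dataset size is an illustrative assumption, not a figure from the talk, and the calculation ignores protocol overhead and contention:

```python
def transfer_time_hours(size_bytes: float, link_gbits_per_s: float) -> float:
    """Ideal transfer time over a link, ignoring overhead and contention."""
    link_bytes_per_s = link_gbits_per_s * 1e9 / 8  # Gbits/s -> bytes/s
    return size_bytes / link_bytes_per_s / 3600

# Hypothetical 10 TB dataset shipped between tiers at the link speeds
# that appear in the diagram:
dataset = 10e12  # bytes
for gbps in (0.6, 2.5, 10.0):
    print(f"{gbps:4.1f} Gbits/s -> {transfer_time_hours(dataset, gbps):6.1f} hours")
```

Even at the fastest backbone speed shown, bulk replication takes hours, which is one reason the baseline model places popular data at regional centers rather than serving everything from CERN.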

Page 6

Transition to a Production-Quality Grid

Centers taking part in the LHC Grid 2003 production service: around the world, around the clock!

Page 7

... towards Dynamic Workspaces for Scientists

To empower communities of scientists to analyze data and to work locally within a global context:

- Resource management, knowledge systems, human-Grid interfaces, collaboration tools
- Infrastructure to support sharing and consistency of physics and calibration data plus schema, data provenance, workflow, etc.

Page 8

The Goal

Provide individual physicists and groups of scientists capabilities from the desktop that allow them:

- To participate as an equal in one or more "Analysis Communities", with full representation in the global experiment enterprise
- To receive on demand whatever resources and information they need to explore their science interest, while respecting collaboration-wide priorities and needs

That is, provide massive computing, storage, and networking resources, including "opportunistic" use of resources that are not LHC-owned!

Provide full access to dauntingly complex "meta-data" that need to be kept consistent to make sense of the event data.

Page 9

These Goals Require Substantial R&D

- Global access and global management of massive and complex data
- Location transparency of complex processing environments and of massive data collections
- Monitoring, simulation, scheduling, and optimization on a heterogeneous Grid of computing facilities and networks
- Virtual data, workflow, and knowledge management technologies
- End-to-end networking performance, application integration
- Management of virtual organizations across the Grid
- Technologies and services for security, privacy, and accounting
- Scientific collaboration over distance
- Etc.

Grids are the enabling technology, and LHC needs are pushing the limits.

Technology and architecture are still evolving; new IT and CS research is required.

Page 10

Start of the NSF LHC Research Program

Exciting opportunities with the start of NSF Research Program funding this year!

U.S. CMS universities will profit in major ways, developing the strong U.S. LHC environment for physics analysis.

Address the core issues in U.S. LHC S&C: developing and implementing the distributed computing model, which is central to the success of U.S. university participation.

- Focus on end-to-end services
- Focus on distributed data access and management

Major injection of new R&D manpower, plus running Tier-2 centers. E.g., for U.S. CMS:

- At U.S. universities, for architecture, middleware, and physics support
- Start of a pilot Tier-2, possibly a Tier-2-based PAC
- Start of Grid operations R&D and support (2-3 FTE)

Page 11

U.S. LHC Grid Technology Cycles

"Rolling Prototypes": evolution of the facility and data systems.

- Prototyping
- Early roll-out
- Emphasis on quality, documentation, and dissemination
- Tracking of external "practices"

Page 12

Grid Testbeds and Production Grids

Grid testbeds: development and dissemination!

- The LHC Grid testbeds are the first real-life large Grid installations, and are becoming production quality
- Strong partnership between universities and labs, with the Grid projects (iVDGL, GriPhyN, PPDG) and middleware projects (Condor, Globus)
- Strong dissemination component, together with the Grid projects

E.g., the U.S. CMS testbed: Caltech, UCSD, U. Florida, UW Madison, Fermilab, CERN.

Expressions of interest: MIT, Rice, Minnesota, Iowa, Princeton, Brazil, South Korea.

Page 13

Example: Monitoring and Information Services

MonALISA (Caltech), currently deployed in the testbed environment:

- Dynamic information/resource discovery mechanism using intelligent agents
- Java/Jini, with interfaces to SNMP, MDS, Ganglia, and Hawkeye
- WSDL/SOAP with UDDI
- Aim to incorporate it into a "Grid Control Room" service
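MonALISA itself is a Java/Jini system; the following is only a toy Python illustration of the register-and-discover pattern that such an information service provides. All class, site, and attribute names here are invented for the example:

```python
class ResourceRegistry:
    """Toy stand-in for a dynamic discovery service: monitoring agents
    register resource descriptions, and clients query by attribute."""

    def __init__(self):
        self._entries = []

    def register(self, name: str, **attributes) -> None:
        """An agent publishes a named resource with arbitrary attributes."""
        self._entries.append({"name": name, **attributes})

    def discover(self, **criteria):
        """Return entries matching every given attribute=value pair."""
        return [e for e in self._entries
                if all(e.get(k) == v for k, v in criteria.items())]

# Hypothetical testbed resources being published:
registry = ResourceRegistry()
registry.register("tier2-caltech", site="Caltech", free_cpus=120, service="compute")
registry.register("tier2-ucsd", site="UCSD", free_cpus=40, service="compute")
registry.register("dcache-fnal", site="Fermilab", service="storage")

# A client discovering all compute resources:
compute_nodes = registry.discover(service="compute")
print([e["name"] for e in compute_nodes])
```

In the real system the registry is distributed, entries carry leases and expire, and the interfaces to SNMP, MDS, Ganglia, and Hawkeye feed the attribute values; this sketch only shows the query pattern.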

Page 14

Distributed Collaborative Engineering

Projectization is essential for a software and computing effort of this complexity; physics and detector groups at universities are the first to profit from it.

Example: ATLAS detector geometry description databases.

Idea and concept:
- Geometry modeller based on CDF [U. Pittsburgh]

Massive development effort:
- NOVA MySQL database [BNL]: repository of persistent configuration information
- NOVA service [ANL]: retrieval of transient C++ objects from the NOVA database
- Conditions database service [ANL/Lisbon]: access to time-varying information based on type, time, version, and key; used in conjunction with other persistency services (e.g. the NOVA service)
- Interval-of-validity service [LBNL]: registration of clients; retrieval of updated information when validity expires; caching policy management

Released as scheduled to detector and physics groups; a prototype was shown at the silicon alignment workshop in December 2002.
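The interval-of-validity idea above can be sketched as follows. This is a minimal illustration of the concept, not the LBNL implementation; the class and method names are invented, and each payload is simply taken to be valid from its start time until the next entry's start time:

```python
import bisect

class IOVStore:
    """Toy interval-of-validity store for time-varying conditions data."""

    def __init__(self):
        self._starts = []    # sorted validity start times
        self._payloads = []  # payload valid from _starts[i] to _starts[i+1]

    def insert(self, start_time: float, payload) -> None:
        i = bisect.bisect_left(self._starts, start_time)
        self._starts.insert(i, start_time)
        self._payloads.insert(i, payload)

    def lookup(self, t: float):
        """Return (payload, valid_until) covering time t."""
        i = bisect.bisect_right(self._starts, t) - 1
        if i < 0:
            raise KeyError(f"no conditions valid at t={t}")
        valid_until = (self._starts[i + 1] if i + 1 < len(self._starts)
                       else float("inf"))
        return self._payloads[i], valid_until

# Hypothetical alignment constants with two validity intervals:
store = IOVStore()
store.insert(0.0, {"alignment": "v1"})
store.insert(100.0, {"alignment": "v2"})

payload, until = store.lookup(42.0)
print(payload, "valid until", until)
```

A caching client would hold the returned payload and re-query only once its current time passes `valid_until`, which is the "retrieval of updated information when validity expires" behavior described above.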

Page 15

Example: Detector Description

Geometry modeller, database, visualization, optimization.

[Figures: detail from the TRT; detail from the Barrel Liquid Argon (parameterized, 40 kB in memory)]

Page 16

Grid Projects Directly Related to LHC

A large international effort.

Page 17

Pieces for the LHC Computing Infostructure

[Diagram: the National Science Foundation coordination (synergy) matrix, with rows for operations in support of end users, development or acquisition, and research in technologies, systems, and applications, spanning three layers: applications of information technology to science and engineering research; cyberinfrastructure in support of applications; and core technologies incorporated into cyberinfrastructure. Mapped onto it are CERN/LCG, US LHC S&C, the ITR proposals, the Blue Ribbon Panel on Cyberinfrastructure, iVDGL, and GriPhyN.]

Page 18

Universities Have an Enormous Impact on R&D&D&D

The inter-agency partnership between the NSF-funded Tier-2 and the DOE-funded Tier-1 efforts addresses a major part of the 24x7 support and Grid services issues.

Software and computing for LHC discovery requires research, development, deployment, and dissemination, and also running facilities and services.

Page 19

The U.S. LHC Mission is Physics Discovery at the Energy Frontier!

This model takes advantage of the significant strengths of U.S. universities in the areas of CS and IT.

Draw strength from, and exploit the synergy between, U.S. universities and national labs, software professionals and physicists, computer scientists and high-energy physicists.

LHC is among the first to put a truly distributed "cyberinfrastructure" in place, spearheading important innovations in how we do science.