Open Science Grid
For CI-Days
Internet2: Fall Member Meeting, 2007
John McGee – [email protected]
Engagement Manager
Renaissance Computing Institute, University of North Carolina at Chapel Hill
Why should my University facilitate (or drive) resource sharing?
• Because researchers with federally funded clusters can’t wait to meet you and your Grid middleware?
• Because it’s the right thing to do
  – Enables new modalities of collaboration
  – Enables new levels of scale
  – Democratizes large-scale computing
  – Sharing locally leads to sharing globally
  – Better overall resource utilization
  – Funding agencies
At the heart of the cyberinfrastructure vision is the development of a cultural community that supports peer-to-peer collaboration and new modes of education based upon broad and open access to leadership computing; data and information resources; online instruments and observatories; and visualization and collaboration services.
– Arden Bement, introduction to the NSF Cyberinfrastructure Vision for 21st Century Discovery
Clemson Campus Condor Pool
• Machines in 27 different locations on campus
• ~1,700 job slots
• >1.8M hours served in 6 months
• Users from Industrial and Chemical Engineering, and from Economics
• Fast ramp up of usage
• Accessible to the OSG through a gateway
Purdue Campus Condor Pool
• 6,400 CPUs available
• Campus Condor pool backfills idle nodes in PBS clusters: provided 5.5 million CPU-hours in 2006, all from idle cluster nodes
• Use on TeraGrid: 2.4 million hours in 2006 spent building a database of hypothetical zeolite structures; 5.5 million hours allocated on TeraGrid for 2007
http://www.cs.wisc.edu/condor/PCW2007/presentations/cheeseman_Purdue_Condor_Week_2007.ppt
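Work enters a Condor pool like the ones above through a submit description file. A minimal sketch, using standard HTCondor submit syntax; the file name, executable, and input naming are hypothetical:

```
# sweep.sub -- hypothetical submit file for 100 independent jobs
universe   = vanilla
executable = analyze           # the researcher's program (assumption)
arguments  = input.$(Process)  # $(Process) counts 0..99
output     = out.$(Process)
error      = err.$(Process)
log        = sweep.log
queue 100
```

Running `condor_submit sweep.sub` queues 100 jobs; the Condor matchmaker then places each one on an idle slot, which is the cycle-scavenging mechanism behind both the Clemson pool and the PBS backfill numbers above.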
The Open Science Grid
• OSG is a consortium of software, service, and resource providers and researchers from universities, national laboratories, and computing centers across the U.S. who together build and operate the OSG. The project is funded by the NSF and DOE and provides staff for managing various aspects of the OSG.
• Brings petascale computing and storage resources into a uniform grid computing environment
• Integrates computing and storage resources from over 50 sites in the U.S. and beyond
A framework for large-scale distributed resource sharing, addressing the technology, policy, and social requirements of sharing
Principal Science Drivers
• High energy and nuclear physics
  – 100s of petabytes (LHC) in 2007
  – Several petabytes in 2005
• LIGO (gravitational-wave search)
  – 0.5 to several petabytes since 2002
• Digital astronomy
  – 10s of petabytes by 2009
  – 10s of terabytes in 2001
• Other sciences emerging
  – Bioinformatics (10s of petabytes)
  – Nanoscience
  – Environmental science
  – Chemistry
  – Applied mathematics
  – Materials science
Virtual Organizations (VOs)
• The OSG infrastructure trades in groups, not individuals.
• VO Management services allow registration, administration and control of members of the group.
• Facilities trust and authorize VOs.
• Storage and Compute Services prioritize according to VO group.
[Diagram: a VO Management Service and VO applications mediate between campus grids and the set of available resources across OSG and the WAN]
Image courtesy: UNM
Date range: 2007-04-29 00:00:00 GMT - 2007-05-07 23:59:59 GMT
“What impressed me most was how quickly we were able to access the grid and start using it. We learned about it at RENCI, and we were running jobs about two weeks later,” says Kuhlman.
“For each protein we design, we consume about 3,000 CPU hours across 10,000 jobs,” says Kuhlman. “Adding in the structure and atom design process, we’ve consumed about 100,000 CPU hours in total so far.”
Designing proteins in the Kuhlman Lab
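Kuhlman’s figures imply very short individual jobs, which is exactly the workload a cycle-scavenging grid handles well. A quick check of the arithmetic, using only the numbers from the quote above:

```python
# Figures from the Kuhlman quote above
cpu_hours_per_protein = 3000   # CPU hours consumed per designed protein
jobs_per_protein = 10000       # independent jobs per designed protein

hours_per_job = cpu_hours_per_protein / jobs_per_protein
minutes_per_job = hours_per_job * 60
print(minutes_per_job)  # -> 18.0 (about 18 CPU-minutes per job)

# The ~100,000 total CPU hours quoted correspond to roughly
proteins_equivalent = 100000 / cpu_hours_per_protein
print(round(proteins_equivalent, 1))  # -> 33.3 protein-design runs' worth
```

Jobs this small finish within a single idle window on a desktop or backfilled cluster node, so no dedicated supercomputer allocation is needed.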
[Diagram: the campus IT landscape — Campus IT and Enterprise Systems, Campus Research Computing, Department IT, and Lab IT — surrounding the campus researcher and student]

… so, what can we do together to advance scientific research and education?
What can we do together?
• OSG is looking for a few partners to help deploy campus-wide grid infrastructure that integrates with local enterprise infrastructure and the national CI
• RENCI’s OSG team is available to help scientists get their applications running on OSG
  – A low-impact starting point
  – Helps your researchers gain significant compute cycles while exploring OSG as a framework for your own campus CI
mailto: [email protected]
E N D