Environmental Hydrology Team Meeting
National Computational Science, University of Illinois at Urbana-Champaign
Environmental Hydrology Team Meeting
• Show how the Alliance has demonstrably changed the nation’s computational infrastructure
• Show explicitly how we have empowered communities to do things better
• Focus on deployment of infrastructure
• Deploy real codes to real researchers coupled with real metrics of use
• “Harden” software to be accompanied by documentation, training, and dissemination
Educational focus secondary for this year in AT
Primary Alliance Objectives in Year 4
Purpose of Workshop
1) Learn about the current status of EH projects
2) Develop overall team goals for this year in light of Alliance goals
3) Refine SOWs in light of team goals
4) Determine current and future AT and/or EH team collaboration
5) Define expected deliverables this year, both dependent on and independent of EH development
6) Identify what is holding us back from reaching our individual and group goals
7) Discuss communication and reporting
8) Plan group PR
9) Other?
Presentation Guidelines
1) Briefly summarize past accomplishments and deliverables with at most 3 to 4 bullets on 1 or 2 overheads
2) State your main goals and deliverables (and your group's specific contributions to them) for this year. I would like at most 4 bullets here.
3) What communities will you actually impact this year, and how will this be accomplished? (Include plans for disseminating software or interacting with communities who will benefit from your work.)
4) Note who on your team will actually carry out the work and what percentage of time they will be contributing
5) Note your questions/concerns about realizing the goals and deliverables (e.g., looking for a student to work on this, work can't begin until January due to other commitments, when will a Linux cluster be available for me to work on, ...)
Schedule
8:30 Alliance Directions – Dick Crutcher, Alliance Chief Scientist
9:00 Purpose/Goals of the Workshop – Bob Wilhelmson, EH Team Lead
9:15 VisAD and Coupling – Bill Hibbard, U. of Wisconsin
9:45 SME Developments – Tom Maxwell, U. of Maryland
10:15 ARPS Development – Dan Weber, U. of Oklahoma
10:45 Infrastructure and Benchmarking – Danesh Tafti, NCSA
11:15 Hydrologic Developments – Frank Weirich, U. of Iowa
11:45 Surface Modeling – Baxter Vieux, U. of Oklahoma
12:15 Lunch and demo setup
12:30 Digital River Basin – Doug Johnston, NCSA
1:00 Demos
2:00 Portal Development – Jay Alameda, Alliance Chemistry Team
2:30 Regional Ocean Modeling – Dale Haidvogel, Rutgers U.
3:00 Visualization of Fluids – Polly Baker, NCSA, Dir. Data Mining/Vis.
3:30 OPIE – Doug Fine, NCSA
4:00 Clusters at NCSA – Rob Pennington, NCSA, Acting Dir. C&C
4:30 HDF5 Developments – Mike Folk, NCSA
5:00 General Discussion – Bob Wilhelmson
7:00 Dinner at Silvercreek
Alliance Technology Roadmap
• Capability computing
  – attack complex problems
  – move from rationing to computing on demand
• Building the Grid
  – eliminate distance for virtual teams
  – convert computing into a utility
• Science portals
  – bring commercial web technology to scientists
  – build electronic research communities
• Clusters as the unifying mechanism
  – user wants and review recommendations
[Roadmap diagram layers: Science Portals & Workbenches; Twenty-First Century Applications; Computational Services; Performance; Networking, Devices and Systems; Grid Services (resource independent); Grid Fabric (resource dependent); Access Services & Technology; Access Grid; Computational Grid.]
Linux Terascale Cluster
[Diagram: (a) Terascale architecture overview – Clos mesh interconnect (each line = 8 × 2 Gb/s links) joining compute (782 IA-64), I/O-storage (32 IA-32, 64 TB RAID), and visualization (32 IA-32; local display, networks for remote display) partitions, plus a 100 Mb/s switched Ethernet management network and links to additional clusters and external networks. (b) Example 320-node Clos network – five groups of 64 hosts on 128-port Clos switches, with spine switches and 64 inter-switch links (each drawn line = 4 links). (c) I/O – Storage. (d) Visualization. (e) Compute.]
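The interconnect figures on this slide can be sanity-checked with a little arithmetic. The sketch below assumes only the numbers stated in the diagram (8 × 2 Gb/s links per drawn line, 64 inter-switch links); it is an illustrative check, not part of the cluster design documents.

```python
# Back-of-the-envelope check of the interconnect figures on the slide.
# All constants are taken directly from the diagram labels.

LINKS_PER_LINE = 8          # each drawn line = 8 links
GBPS_PER_LINK = 2           # 2 Gb/s per link
INTER_SWITCH_LINKS = 64     # inter-switch links between switch groups

def line_bandwidth_gbps():
    """Aggregate bandwidth of one drawn line in the diagram."""
    return LINKS_PER_LINE * GBPS_PER_LINK

def inter_switch_bandwidth_gbps():
    """Aggregate bandwidth across the 64 inter-switch links."""
    return INTER_SWITCH_LINKS * GBPS_PER_LINK

print(line_bandwidth_gbps())          # 16 Gb/s per drawn line
print(inter_switch_bandwidth_gbps())  # 128 Gb/s across inter-switch links
```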
Linking People and Resources
Sensor Arrays
Prototypical Grid Applications
• NSF Network for Earthquake Engineering Simulation (NEES) – Tom Prudhomme
  – integrated instrumentation, collaboration, simulation
  – planning study for $10M deployment
• Grid Physics Network (GriPhyN) – Ian Foster
  – largest NSF ITR award
  – ATLAS, CMS, LIGO, SDSS
  – distributed analysis of petascale data
• Environmental modeling
  – mobile, disposable sensors and wireless networks
  – integrated measurement, simulation, and adaptation
  – EH atmosphere, land, ocean, and ecosystem modeling
[Images: NSF NEES Earthquake Grid; GriPhyN Physics Grid Network]
• Access Grid does for people what the Computational Grid does for machines
  – enables group interaction with the Grid
  – streaming multicast audio/video and shared presentations
Collaborative Technologies
PC options: Alliance Access Grid; NetMeeting; new voice/video technology coupled with a large-screen TV or flat screen
“Standard” Portal Model
[Diagram: the user's browser and other desktop tools on the user's desktop machine connect to a Portal Server. The Portal Server draws on an authentication service, a MyProxy certificate server, a job management service, and information, file, and security services (built on COG/GPDK) to reach the Grid – remote compute, data, and application resources.]
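The portal flow in the diagram can be sketched in a few lines. The class and method names below are hypothetical stand-ins for the real MyProxy, job-management, and security services; the point is only the sequence: the portal server fetches a short-lived proxy credential for the user and uses it, rather than the user's long-lived key, for job and file operations on the Grid.

```python
# Illustrative sketch of the "standard" portal model; all names are
# hypothetical, not the actual MyProxy or COG/GPDK APIs.

class ProxyCredential:
    def __init__(self, user, lifetime_hours):
        self.user = user
        self.lifetime_hours = lifetime_hours

class CertificateServer:
    """Stands in for a MyProxy-style certificate server."""
    def fetch_proxy(self, user, passphrase):
        # A real server would verify the passphrase and delegate a
        # short-lived proxy from the user's stored credential.
        return ProxyCredential(user, lifetime_hours=12)

class PortalServer:
    def __init__(self, cert_server):
        self.cert_server = cert_server

    def submit_job(self, user, passphrase, job_description):
        proxy = self.cert_server.fetch_proxy(user, passphrase)
        # The job-management service authenticates to remote resources
        # with the proxy, not the user's long-lived private key.
        return {"owner": proxy.user, "job": job_description,
                "status": "submitted"}

portal = PortalServer(CertificateServer())
result = portal.submit_job("alice", "s3cret", "run EH simulation")
print(result["status"])   # submitted
```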
Grand Challenges in Environmental Sciences
New NRC Report requested by NSF
• Biogeochemical Cycles
• Biological Diversity and Ecosystem Functioning
• Climate Variability
• Hydrologic Forecasting
• Infectious Disease and the Environment
• Institutions and Resource Use
• Land-Use Dynamics
• Reinventing the Use of Materials
Recommended for immediate research investment
Weather Research/Forecasting Model
Wilhelmson Objectives for 2001
• Adding HDF5 parallel I/O capabilities
• Porting/optimizing for IA-32 and IA-64 clusters
• Deployment/documentation of these added capabilities
• Woodward collaboration to improve performance for very large problems on hundreds of processors

Group objectives
• Couple with VisAD
• Couple with surface model
• Develop portal interface in grid-enabled environment

Usage and dissemination
• WRF beta release in November 2000
• Updates during year

Staff: Wilhelmson 25%, Shaw 30%
Issues: Shaw on leave
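The HDF5 parallel-I/O objective above boils down to each MPI rank writing its own hyperslab of a shared dataset. A minimal sketch of the decomposition arithmetic, assuming a 1-D slab split along one dimension (the helper below is hypothetical, not WRF code):

```python
# Sketch of the domain decomposition behind parallel I/O: each rank owns
# a slab of the global grid and writes that hyperslab of the shared
# HDF5 dataset. Hypothetical helper, not taken from WRF.

def slab(global_n, nranks, rank):
    """Return (offset, count) of `rank`'s slab of a dimension of size global_n."""
    base, extra = divmod(global_n, nranks)
    count = base + (1 if rank < extra else 0)
    offset = rank * base + min(rank, extra)
    return offset, count

# 100 grid points across 4 ranks: equal 25-point slabs.
slabs = [slab(100, 4, r) for r in range(4)]
print(slabs)  # [(0, 25), (25, 25), (50, 25), (75, 25)]

# The slabs tile the dimension with no gaps or overlap.
assert sum(c for _, c in slabs) == 100
```

When the dimension does not divide evenly, the low-numbered ranks absorb the remainder one point each, which keeps adjacent offsets contiguous.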
Environmental Hydrology Visualization
Need:
• Access to distributed large datasets
• Cross platform
• Cluster algorithms
• Lightweight component
• Auto-data translators
• Metadata support
• Heterogeneous format support
• Component libraries
• Real-time GIS/model
• Distance collaboration
• Vector/raster
• Nested grid

Options:
• Vis5D
• SGI Explorer
• Fluid Tracers
• VisAD
• Cave5D
• nViz
• GeoVis
• VisBench
• NCAR Graphics
• IBM Data Explorer
• IDL
• …
VisBench and Geospatial Data
Early results for combining terrain plus GIS info plus simulation output.
[Diagram: Terrain/GIS server(s), visualization generator, and client application. Credit: Rob Stein]
Multigrid VTK Visualization of Hurricane Opal
Graphics: How Far We’ve Come
• Toy Story™
  – 2-12 million triangles/frame
  – in 2001 we will be close to Toy Story graphics in real time on PCs
• "Reality"
  – 80 million triangles/frame
  – within 5-10 years a PC game will be on par with "reality"
• PlayStation2 story (stay tuned …)
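The triangle counts above imply a throughput target once a frame rate is assumed. The sketch below assumes 30 frames/second, which the slide does not state; it just converts the per-frame figures into triangles per second.

```python
# Triangles-per-second implied by the slide's per-frame counts.
# FPS is an assumption (the slide gives no frame rate).

TOY_STORY_TRIANGLES = 12e6   # upper end of "2-12 million triangles/frame"
REALITY_TRIANGLES = 80e6     # the slide's "reality" estimate
FPS = 30                     # assumed real-time frame rate

toy_story_rate = TOY_STORY_TRIANGLES * FPS   # triangles/s for real-time Toy Story
reality_rate = REALITY_TRIANGLES * FPS       # triangles/s for "reality"

print(toy_story_rate)  # 360 million triangles/s
print(reality_rate)    # 2.4 billion triangles/s
```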
Computing On Toys
• Sony PlayStation2 features
  – 6.2 GF peak
  – 70M polygons/second
  – 10.5M transistors
  – superscalar RISC core
  – plus vector units, each: 19 mul-adds & 1 divide, each 7 cycles
• $299 suggested retail
  – U.S. release October 2000
  – 980,000 units sold first week in Japan
• Terascale computing
  – $60K/teraflop
  – scalable visualization
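The $60K/teraflop figure follows from the unit price and peak rating quoted above. A quick check (peak flops only; it ignores interconnect, packaging, and sustained-vs-peak performance):

```python
# How many $299 PlayStation2 units (6.2 GF peak each) make a peak teraflop?
import math

PEAK_GFLOPS = 6.2    # per unit, from the slide
UNIT_PRICE = 299     # USD, suggested retail
TERAFLOP_GF = 1000   # 1 TF in GF

units = math.ceil(TERAFLOP_GF / PEAK_GFLOPS)
cost = units * UNIT_PRICE
print(units, cost)   # 162 units, $48,438 — same order as the quoted $60K/teraflop
```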
The New Quadrangle
• $110M IT infrastructure
  – the world’s best
  – living laboratories
• North research park
  – three-stage R&D pipeline: basic research, prototyping, transfer and development
  – industrial partners nearby