Page 1: Climate and Weather Research at NASA Goddard

NASA Center for Computational Sciences

Climate and Weather Research at NASA Goddard

9 September 2009, 33rd HPC User Forum, Broomfield, CO

Phil Webster, Goddard Space Flight Center, [email protected]

Page 2: Climate and Weather Research at NASA Goddard

NASA Mission Structure

To implement NASA’s Mission, NASA Headquarters is organized into four Mission Directorates.

• Aeronautics: Pioneers and proves new flight technologies that improve our ability to explore and have practical applications on Earth.

• Exploration Systems: Creates new capabilities and spacecraft for affordable, sustainable human and robotic exploration.

• Science: Explores the Earth, moon, Mars, and beyond; charts the best route of discovery; and reaps the benefits of Earth and space exploration for society.

• Space Operations: Provides critical enabling technologies for much of the rest of NASA through the space shuttle, the International Space Station, and flight support.

Page 3: Climate and Weather Research at NASA Goddard

Science Mission Directorate

Page 4: Climate and Weather Research at NASA Goddard

Earth Science Division Overview

Overarching Goal: to advance Earth System science, including climate studies, through spaceborne data acquisition, research and analysis, and predictive modeling

Six major activities:

• Building and operating Earth observing satellite missions, many with international and interagency partners

• Making high-quality data products available to the broad science community

• Conducting and sponsoring cutting-edge research in six thematic focus areas
  – Field campaigns to complement satellite measurements
  – Modeling
  – Analyses of non-NASA mission data

• Applied Sciences

• Developing technologies to improve Earth observation capabilities

• Education and Public Outreach

Page 5: Climate and Weather Research at NASA Goddard

Earth Science Division Focus Areas

Page 6: Climate and Weather Research at NASA Goddard

Modeling, Analysis and Prediction (MAP) Program

• Seeks an understanding of the Earth as a complete, dynamic system

• Emphasis on climate and weather

• Key questions include:

− How is the Earth system changing?

− What are the forcing mechanisms driving observed changes?

− How does the Earth system respond to natural and human-induced changes?

− What are the consequences of Earth system change to society?

− What further changes can be anticipated, and what can be done to improve our ability to predict such changes through improved remote sensing, data assimilation, and modeling?

The MAP program supports observation-driven modeling that integrates the research activities in NASA’s Earth Science Program

Page 7: Climate and Weather Research at NASA Goddard

NASA’s Climate and Weather Modeling

• Spans timescales from weather to short-term climate prediction to long-term climate change

• Spans weather, climate, atmospheric composition, water & energy cycles, carbon cycle

• “Unique” in bringing models and observations together through assimilation and simulation

• Products to support NASA instrument teams and atmospheric chemistry community

• Contributes to international assessments (WMO/UNEP, IPCC); contributions to IPCC AR5 establish a new paradigm of “data delivery” for the NASA modeling community, in partnership with NCCS, PCMDI, and LLNL

• Contributes to WWRP and WCRP

Page 8: Climate and Weather Research at NASA Goddard

Tomorrow’s Science

• New missions – increased sensing of the Earth’s climate system as recommended by decadal studies; more data and more types of data

• Advanced assimilation – use more data to produce better model initialization

• Higher resolution – better representation of atmospheric processes to improve prediction

• Greater complexity - understanding and predicting/projecting future climate

• Coupled ocean-atmosphere-land models - including full carbon cycle

• Increased collaboration – of models, model output, simulation & observational data sets

[Figure: different gradients in the Atlantic Ocean; different gradients in the Pacific Ocean; opposite-sign gradient change in the Atlantic and Pacific.]

Page 9: Climate and Weather Research at NASA Goddard

High-Resolution Climate Simulations with GEOS-5 Cubed-Sphere Model

• GMAO, GISS, NCCS, and SIVO staff are refining techniques for Intel Nehalem (e.g., concurrent serial I/O paths).

• Bill Putman’s (SIVO) 3.5-km non-hydrostatic simulations with the GEOS-5 Cubed-Sphere Finite-Volume Dynamical Core, run on approximately 4,000 Nehalem cores of the NASA Center for Computational Sciences (NCCS) Discover supercomputer, yielded promising results, including cloud features not seen in lower-resolution runs (a rough grid-sizing sketch follows this list).

• Exploring techniques for more efficient memory use and parallel I/O, e.g., evaluating the effects of replacing Intel MPI with MVAPICH.
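As a point of reference, a small Python sketch (an editorial illustration, not from the slides) of how cell counts scale with cubed-sphere resolution; the face sizes c360 and c2880 are assumptions chosen to give roughly the 27-km and 3.5-km spacings quoted above.

    # Rough sizing of GEOS-5 cubed-sphere grids (illustrative; the exact face
    # sizes used by GMAO are assumptions, not taken from this slide).
    EARTH_CIRCUMFERENCE_KM = 40_075.0

    def cubed_sphere_stats(n_per_face, levels=72):
        """Approximate spacing (km) and 3-D cell count for a 6-face cubed sphere."""
        spacing_km = EARTH_CIRCUMFERENCE_KM / (4 * n_per_face)  # ~4*N cells around the equator
        cells_3d = 6 * n_per_face**2 * levels
        return spacing_km, cells_3d

    for n in (360, 2880):  # assumed faces: c360 ~ 27 km, c2880 ~ 3.5 km
        dx, ncells = cubed_sphere_stats(n)
        print(f"c{n}: ~{dx:.1f} km spacing, ~{ncells / 1e6:.0f} million grid cells")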

Low cloud features from the 3.5-km GEOS-5 Cubed Sphere for 2 January 2009 (left), compared to the GOES-14 first full-disk visible image on 27 July 2009 (center) and the 27-km (roughly ¼-degree) GEOS-5 Cubed Sphere for 2 January 2009 (right).

NCCS Discover Scalable Unit 5’s Nehalem architecture and larger core count enable researchers to exploit methods for higher-resolution models, advancing NASA Science Mission Directorate science objectives.

Bill Putman, Max Suarez, NASA Goddard Space Flight Center; Shian-Jiann Lin, NOAA Geophysical Fluid Dynamics Laboratory

Page 10: Climate and Weather Research at NASA Goddard

GEOS-5 – Impact of Resolution: Kármán Vortex Streets

[Figure: Kármán vortex streets simulated at 28-km, 14-km, 7-km, and 3.5-km resolution.]

Page 11: Climate and Weather Research at NASA Goddard

MERRA Project: Modern Era Retrospective-analysis for Research and Applications

• GMAO’s 30-year reanalysis of the satellite era (1979 to present) using GEOS-5

• Largest assimilation data set available today

• The focus of MERRA is the hydrological cycle and climate variability

• Today’s observing system - ~1.6M observations per 6-hour snapshot. Close to 90% are from satellites

• Public record supporting broad range of scientific research

• Climate Data Assimilation System efforts will continue

• Single largest compute project at the NCCS

Products are accessed online at the GES DISC http://disc.sgi.gsfc.nasa.gov/MDISC
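A minimal sketch (editorial illustration) of inspecting a MERRA product once it has been downloaded from the GES DISC; the file name and the T2M variable below are hypothetical placeholders, not a specific MERRA product specification.

    # Inspect a downloaded MERRA file with the netCDF4 package.
    # File and variable names are hypothetical placeholders.
    from netCDF4 import Dataset

    with Dataset("MERRA_sample_slv.nc") as nc:         # hypothetical file name
        print(list(nc.variables))                      # list the available fields
        t2m = nc.variables["T2M"][:]                   # e.g., 2-m temperature (time, lat, lon)
        print("shape:", t2m.shape, "global mean:", float(t2m.mean()))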

Michael Bosilovich, Global Modeling and Assimilation Office, NASA Goddard Space Flight Center

Page 12: Climate and Weather Research at NASA Goddard

High-Resolution Modeling of Aerosol Impacts on the Asian Monsoon Water Cycle

• Objectives include 1) clarifying the interactions between aerosols (dust and black carbon) and the Indian monsoon water cycle and how they may modulate regional climatic impacts of global warming, and 2) testing feedback hypotheses using high-resolution models as well as satellite and in situ observations.

• The team runs the regional-scale, cloud-resolving Weather Research and Forecasting (WRF) Model at very high resolution—less than 10-km horizontal grid spacing with 31 vertical layers. To mitigate the large computational demands of over 200,000 grid cells, the team uses a triple-nest grid with resolutions of 27, 9, and 3 km.

• For the aerosol-monsoon studies, a radiation module within WRF links to the Goddard Chemistry Aerosol Radiation and Transport (GOCART) aerosol module.

• Using the Discover supercomputer at the NASA Center for Computational Sciences (NCCS), the team conducted a model integration for May 1 to July 1 in both 2005 and 2006.

• Among other results, studies documented the “elevated-heat-pump” hypothesis, highlighting the role of the Himalayas and Tibetan Plateau in trapping aerosols over the Indo-Gangetic Plain, and showed preliminary evidence of aerosol impacts on monsoon variability.

Rainfall distributions from Weather Research and Forecasting (WRF) Model simulations at 9-kilometer resolution (top row) and from Tropical Rainfall Measuring Mission (TRMM) satellite estimates (bottom row). Units are in millimeters per day. Both WRF and TRMM show heavy rain (red) over the Bay of Bengal and the western coast.

By using 256 Intel Xeon processors on Discover, the WRF Model can finish a 1-day integration in less than 3 hours.
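A back-of-the-envelope estimate of the total wall-clock cost implied by that throughput, assuming the quoted rate holds for the full May 1 to July 1 integrations in both years (an editorial sketch, not figures from the slide):

    # Wall-clock estimate for the two aerosol-monsoon integrations, assuming
    # the quoted <3 wall-clock hours per simulated day on 256 cores.
    hours_per_sim_day = 3.0     # upper bound quoted above
    days_per_run = 61           # May 1 through July 1
    runs = 2                    # 2005 and 2006

    total_hours = hours_per_sim_day * days_per_run * runs
    core_hours = total_hours * 256
    print(f"~{total_hours:.0f} wall-clock hours (~{total_hours / 24:.1f} days), "
          f"~{core_hours / 1e3:.0f}k core-hours")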

William Lau, Kyu-Myong Kim, Jainn J. Shi, Toshi Matsui, and Wei-Kuo Tao, NASA Goddard Space Flight Center

Page 13: Climate and Weather Research at NASA Goddard

Observing System Experiments: Evaluating and Enhancing the Impact of Satellite Observations

• An Observing System Experiment (OSE) assesses the impact of an observational instrument by producing two or more data assimilation runs, one of which (the Control run) omits data from the instrument under study. From the resulting analyses, the team initializes corresponding forecasts and evaluates them against operational analyses.

• The team runs the NASA GEOS-5 data assimilation system at a resolution of 1/2 degree longitude and latitude, with 72 vertical levels, and the GEOS-5 forecasting system at a resolution of 1/2 or 1/4 degree.

• The team uses high-end computers at the NASA Center for Computational Sciences (NCCS) and the NASA Advanced Supercomputing (NAS) facility. The mass storage allows continual analysis of model results with diagnostic tools.

• This research has demonstrated the impact of quality-controlled Atmospheric Infrared Sounder (AIRS) observations under partly cloudy conditions. In modeling tropical cyclogenetic processes, the team found that using AIRS data leads to better-defined tropical storms and improved GEOS-5 track forecasts (a schematic of this comparison is sketched below). Depicted in the figure is a set of experiments centering on April–May 2008, during which Tropical Cyclone Nargis hit Myanmar.
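A schematic Python sketch of the OSE bookkeeping described above: scoring the AIRS and Control forecasts against a verifying analysis. The RMSE metric, field names, and random stand-in arrays are illustrative assumptions, not the team's actual diagnostics.

    import numpy as np

    def rmse(forecast, verifying_analysis):
        """Root-mean-square error of a forecast field against a verifying analysis."""
        return float(np.sqrt(np.mean((forecast - verifying_analysis) ** 2)))

    # Same forecast lead time, with and without AIRS; random stand-ins for
    # sea-level pressure fields (lat x lon), purely for illustration.
    rng = np.random.default_rng(0)
    analysis = rng.normal(1010.0, 5.0, (361, 576))
    slp_airs = analysis + rng.normal(0.0, 1.0, analysis.shape)     # run assimilating AIRS
    slp_control = analysis + rng.normal(0.0, 2.0, analysis.shape)  # Control run (AIRS withheld)

    print("AIRS    RMSE:", rmse(slp_airs, analysis))
    print("Control RMSE:", rmse(slp_control, analysis))
    print("Impact (Control minus AIRS):", rmse(slp_control, analysis) - rmse(slp_airs, analysis))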

Impact of the Atmospheric Infrared Sounder (AIRS) on the 1/2-degree Goddard Earth Observing System Model, Version 5 (GEOS-5) forecast for Tropical Cyclone Nargis. Upper left: Differences (AIRS minus Control) in 6-hour forecasts of 200-hPa temperature (°C, shaded) and sea-level pressure (hPa, solid line). Lower left: The 6-hour sea-level pressure forecast from the AIRS run shows a well-defined low close to the observed storm track (green solid line). Lower right: The corresponding 108-hour forecast for 2 May 2008 (landfall time) compares very well with the observed track. Upper right: The 6-hour sea-level pressure forecast from the Control run shows no detectable cyclone.

NASA computational resources hosted 70 month-long assimilation experiments and corresponding 5-day forecasts.

Oreste Reale and William Lau, NASA Goddard Space Flight Center

Page 14: Climate and Weather Research at NASA Goddard

GEOS-5 Support of NASA Field Campaigns: TC4 • ARCTAS • TIGERZ

• A Goddard Space Flight Center team supports NASA field campaigns with real-time products and forecasts from the Global Modeling and Assimilation Office’s GEOS-5 model to aid in flight planning and post-mission data interpretation.

• Recent supported campaigns include the TC4 (Tropical Composition, Cloud and Climate Coupling), ARCTAS (Arctic Research of the Composition of the Troposphere from Aircraft and Satellites), and TIGERZ missions.

• The most-often-used GEOS-5 configuration was a 2/3-degree longitude by 1/2-degree latitude grid with 72 vertical levels. Data assimilation analyses were conducted every 6 hours.

• The NASA Center for Computational Sciences (NCCS) hosted the GEOS-5 model runs on its high-end computers and provided a multi-faceted data delivery system through its Data Portal.

• The mission support was successful, with GEOS-5 products delivered on time for most of the mission duration, thanks to the NCCS ensuring timely execution of job streams and supporting the Data Portal.

• One example of mission success was a June 29, 2008 DC-8 flight’s sampling of the Siberian fire plume transported to the region in the mid-troposphere, as predicted by GEOS-5.
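A quick sizing sketch for the grid quoted above (editorial illustration; assumes a regular 2/3-degree by 1/2-degree global latitude-longitude grid with 72 levels and 4-byte floats):

    # Approximate size of one 3-D field on the 2/3 x 1/2 degree, 72-level GEOS-5 grid.
    nlon = round(360 / (2 / 3))        # 540 longitudes
    nlat = round(180 / 0.5) + 1        # 361 latitudes, pole to pole
    nlev = 72

    points = nlon * nlat * nlev
    mb_per_field = points * 4 / 2**20  # assuming 4-byte floats
    print(f"{nlon} x {nlat} x {nlev} = {points:,} points, ~{mb_per_field:.0f} MB per 3-D field")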

This image shows 500-hectopascal (hPa) temperatures (shading) and heights (contours) during NASA’s ARCTAS (Arctic Research of the Composition of the Troposphere from Aircraft and Satellites) mission. An analysis from the GEOS-5 model is shown with 24- and 48-hour forecasts and validating analyses. These fields, with the accompanying atmospheric chemistry fields, were used to help plan a DC-8 flight on June 29, 2008.

The GEOS-5 systems were run on 128 processors of the NCCS Explore high-end computer, with a continuous job stream allowing timely delivery of products to inform flight planning.

Michele Rienecker, Peter Colarco, Arlindo da Silva, Max Suarez, Ricardo Todling, Larry Takacs, Gi-Kong Kim, and Eric Nielsen, NASA Goddard Space Flight Center

Page 15: Climate and Weather Research at NASA Goddard

NASA HPC

• NCCS at Goddard Space Flight Center
  – Focused on Climate and Weather Research in the Earth Science Division of the Science Mission Directorate
  – Support code development
  – Environment for running models in production mode
  – Capacity computing for large, complex models
  – Analysis & visualization environments

• NAS at Ames Research Center
  – Supports all Mission Directorates
  – For Earth Science: capability runs for test & validation of next-generation models

Page 16: Climate and Weather Research at NASA Goddard

NCCS Data Centric Climate Simulation Environment

HPC Compute
• Large-scale HPC computing
• Comprehensive toolsets for job scheduling and monitoring

Data Archival and Stewardship
• Large capacity storage
• Tools to manage and protect data
• Data migration support

Code Development*
• Code repository for collaboration
• Environment for code development and test
• Code porting and optimization support
• Web-based tools

Analysis & Visualization*
• Interactive analysis environment
• Software tools for image display
• Easy access to data archive
• Specialized visualization support

User Services*
• Help Desk
• Account/allocation support
• Computational science support
• User teleconferences
• Training & tutorials

Data Sharing
• Capability to share data & results
• Supports community-based development
• Facilitates data distribution and publishing

Data Transfer
• Internal high-speed interconnects for HPC components
• High bandwidth to NCCS for GSFC users
• Multi-gigabit network supports on-demand data transfers

Data Storage & Management
• Global file system enables data access for the full range of modeling and analysis activities

* Joint effort with SIVO

Page 17: Climate and Weather Research at NASA Goddard

Data Centric Architecture: Highlights of Current Activities

Data Storage and Management: Petabyte online storage plus technology-independent software interfaces to provide data access to all NCCS services

Data Archiving and Stewardship: Petabyte mass storage facility to support project data storage, access, and distribution, plus access to data sets in other locations

High Performance Computing: Building toward petascale computational resources to support advanced modeling applications

Analysis and Visualization: Terascale environment with tools to support interactive analytical activities

Data Sharing and Publication: Web-based environments to support collaboration, public access, and visualization

Highlighted current activities:
• Nehalem cluster upgrades
• Dali – interactive data analysis
• Data Portal & Earth System Grid
• Data Management System

Page 18: Climate and Weather Research at NASA Goddard

Interactive Data Analysis & Visualization Platform - Dali

• Interactive Data Analysis Systems
  – Direct login for users
  – Fast access to all file systems
  – Supports custom and 3rd-party applications
  – Visibility and easy access to post data to the Data Portal
  – Interactive display of analysis results

• In-line and Interactive Visualization
  – Synchronize analysis with model execution
  – Access to intermediate data as they are being generated
  – Generate images for display back to the user’s workstations
  – Capture and store images during execution for later analysis

• Develop Client/Server Capabilities
  – Extend analytic functions to the user’s workstations
  – Data reduction (subsetting, field/variable/temporal extractions, averaging, etc.) and manipulation (time series, display, etc.) functions
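An illustrative sketch of the data-reduction operations listed above (subsetting, temporal extraction, averaging, time series), using xarray on a hypothetical output file; xarray is shown only as one possible tool, and the file, variable, and coordinate names are assumptions.

    # Typical reductions of the kind listed above, using xarray (illustrative).
    import xarray as xr

    ds = xr.open_dataset("geos5_sample_output.nc")        # hypothetical file name

    # Spatial subsetting: a regional box (coordinate names assumed to be lat/lon).
    region = ds["T2M"].sel(lat=slice(0, 40), lon=slice(60, 100))

    # Temporal extraction plus averaging: mean over June.
    june_mean = region.sel(time=slice("2008-06-01", "2008-06-30")).mean(dim="time")

    # Time series: area-averaged daily values for display on the user's workstation.
    timeseries = region.mean(dim=("lat", "lon")).resample(time="1D").mean()
    print(june_mean.shape, dict(timeseries.sizes))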

Dali Analytics Platform: 1.2 TF peak, 128 cores, 2 TB main memory
• 8 nodes of 2.4 GHz Dunnington (quad core), 16 cores and 256 GB memory per node
• Direct GPFS I/O connections: ~3 GB/s bandwidth per node to the GPFS file system
• Software: CDAT, ParaView, GrADS, Matlab, IDL, Python, FORTRAN, C, Quads, LATS4D

Currently configured as (8) 16-core nodes with 256 GB RAM per node, with the flexibility to support up to (2) 64-core nodes with 1 TB RAM per node.

Page 19: Climate and Weather Research at NASA Goddard

Data Management System

• Improving access to shared observational and simulation data through the creation of a data grid

• Adopting an iRODS grid-centric paradigm
  – iRODS intermediates between vastly different communities:
    • the world of operational data management
    • the world of operational scientific practice

• Challenges
  – Creating a catalog of NCCS policies to be mapped into iRODS rules
  – Creating an architecture for workflows to be mapped into iRODS microservices
  – Defining metadata and the required mappings
  – Capturing and publishing metadata
  – Doing all of this without disrupting operations!
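As a small illustration of the "capturing and publishing metadata" item, a sketch using the python-irodsclient package (a client library shown purely for illustration; the host, zone, path, and attribute names are placeholders, not NCCS configuration):

    # Attach searchable metadata (AVUs) to a data object registered in iRODS.
    # Connection details, paths, and attribute names are placeholders.
    from irods.session import iRODSSession

    with iRODSSession(host="irods.example.nasa.gov", port=1247,
                      user="nccs_user", password="********", zone="nccsZone") as session:
        obj = session.data_objects.get("/nccsZone/home/nccs_user/merra_sample.nc")
        obj.metadata.add("experiment", "MERRA")    # attribute, value
        obj.metadata.add("variable", "T2M")
        for avu in obj.metadata.items():
            print(avu.name, avu.value)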

Page 20: Climate and Weather Research at NASA Goddard

Data Portal and Earth Systems Grid

• Web-based environments to support collaboration, public access, and visualization

• Interfaces to the Earth Systems Grid (ESG) and PCMDI for sharing IPCC model data

• Connectivity to observational data, Goddard DISC, and other scientific data sets

• Direct connection back to NCCS data storage and archive for prompt publication; minimizes data movement and multiple copies of data

• Sufficient compute capability for data analysis

[Diagram: the Data Portal connects externally to the NASA ESG and PCMDI and internally to local disk, NFS, iRODS, and GPFS storage; it runs on an HP c7000 BladeSystem (128 cores, 1.2 TF, 120 TB of disk).]

Page 21: Climate and Weather Research at NASA Goddard

Nehalem Cluster Upgrades

• Additional IBM iDataPlex scalable compute unit added into the Discover cluster in FY09

– Additional 512 nodes (+46 TFLOPS)

– 4,096 cores (2.8 GHz quad-core Nehalem processors)

– 24 GB RAM per node (+12 TB RAM)

– Infiniband DDR interconnect

• An additional 4K core Nehalem scalable unit to be integrated later this calendar year

• Performance:– 2x speedup of some major NCCS applications

– 3x to 4x improvement in memory to processor bandwidth

– Dedicated I/O nodes to the GPFS file system provide much higher throughput
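A quick sanity check on the quoted "+46 TFLOPS" and "+12 TB RAM" figures (editorial sketch; assumes dual-socket quad-core nodes and 4 double-precision floating-point operations per clock per core):

    # Peak-performance sanity check for the FY09 Nehalem scalable unit.
    nodes = 512
    cores_per_node = 8          # assumed dual-socket, quad-core Nehalem
    ghz = 2.8
    flops_per_cycle = 4         # assumed double-precision FLOPs per clock per core

    peak_tflops = nodes * cores_per_node * ghz * flops_per_cycle / 1e3
    added_ram_tb = nodes * 24 / 1024
    print(f"peak ~{peak_tflops:.0f} TFLOPS, added memory ~{added_ram_tb:.0f} TB")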

“Discover” Cluster: 110 TF peak, 10,752 cores, 22.8 TB main memory, InfiniBand interconnect

• Base Unit: 128 nodes, 3.2 GHz Xeon Dempsey (dual core)
• SCU1 and SCU2: 512 nodes, 2.6 GHz Xeon Woodcrest (dual core)
• SCU3 and SCU4: 512 nodes, 2.5 GHz Xeon Harpertown (quad core)
• SCU5: 512 nodes, 2.8 GHz Xeon Nehalem (quad core)

Page 22: Climate and Weather Research at NASA Goddard

Where we’re going…

• NASA is aggressively moving forward to deploy satellite missions supporting the Decadal Survey*.

• NCCS is moving forward to support the climate & weather research that will extract the scientific value from this exciting new data!

* (January 15, 2007, report entitled Earth Science and Applications from Space: National Imperatives for the Next Decade and Beyond).

Page 23: Climate and Weather Research at NASA Goddard

Thank you

Page 24: Climate and Weather Research at NASA Goddard

NCCS Architecture

[Architecture diagram: the existing Discover cluster (65 TF) plus the FY09 upgrade (~45 TF) and a planned FY10 upgrade (~45 TF); GPFS disk subsystems (~1.3 PB, increasing by ~1.8 PB in FY10) served by GPFS I/O nodes and direct-connect GPFS nodes; an archive with ~300 TB of disk and ~8 PB of tape; analysis, visualization, login, data gateway, and Data Portal nodes; internal services (management, license, PBS, GPFS management, and data management servers); all connected by the NCCS LAN (1 GbE and 10 GbE).]