
Bill Kuo

DTC AOP 2011 and Challenges

Outline
Review of DTC AOP 2010 tasks and budgets
Highlights of DTC accomplishments
Recommendations of SAB and DTC responses
Proposed DTC AOP 2011
Future direction and challenges

DTC Organization & AOP 2010 Tasks

DTC Director's Office
Task areas: WRF | WRF for Hurricanes | GSI | MET | HMT | HWT | DET | NEMS
DTC Visitor Program

WRF: WRF modeling system
WRF for Hurricanes: HWRF, HFIP
GSI: Grid-point Statistical Interpolation data assimilation system
MET: Model Evaluation Tools
HMT: Hydrometeorology Testbed collaboration
HWT: Hazardous Weather Testbed collaboration
DET: DTC Ensemble Testbed
NEMS: NOAA Environmental Modeling System

Two major functions of DTC:

A. Provide support for community systems

B. Conduct testing and evaluation of community systems for research and operations

DTC FY 2010 Budget Allocations (in $K)

Task             NOAA Core   HFIP   USWRP   AFWA   NCAR   GSD   NSF   Total
DTC Dir. Office        373                          150   100           623
Visitor Prog.          100                           22         100     222
WRF support            593                    55     50                 698
HWRF                   735    254                     7   127         1,123
GSI                    176                   396     50                 622
MET                            92            374     21                 487
HMT-DTC                               300                  23           323
HWT-DTC                 98                                 50           148
DET                    651                                              651
NEMS                   173                                              173
Total                2,900    346     300    825    300   300   100   5,071

DTC Tasking and FY10 Funding

Task areas:
DTC Director's office: $623K
Visitor program: $222K
WRF (Wolff): $698K
HWRF (Bernardet): $1,123K
GSI (Huang): $622K
MET (Fowler): $487K
HMT (Tollerud): $323K
HWT (Jensen): $148K
DET (Toth): $651K
NEMS (Carson): $173K

Funding sources:
NOAA Core: $2,900K
HFIP: $346K
USWRP: $300K
AFWA: $825K
NCAR: $300K
GSD: $300K
NSF: $100K
Total budget: $5,071K

Collaborations with other testbeds

Major DTC milestones:
Feb 8-9, 2010: DRAFT AOP 2010 reviewed by DTC MB
Feb 28, 2010: Received DTC EC approval for AOP 2010
March 1, 2010: Begin AOP 2010 execution
May 25, 2010: DTC MB teleconference meeting
Aug 26, 2010: DTC EC meeting in WDC
Sep 21-23, 2010: Joint DTC SAB/MB Meeting
Oct 18, 2010: Received SAB written report
Jan 11-12, 2011: DTC MB to review DRAFT AOP 2011

What we will discuss in this meeting:
SAB recommendations and DTC responses:
Do we agree with all their recommendations?
For those we agree with, how do we incorporate their recommendations into AOP 2011 (with a potentially reduced budget)?

Continuing resolution and its impact on DTC AOP 2011 budget and operation:
We don't have final numbers from all DTC sponsors
May not know the numbers until well into AOP 2011
How should DTC operate under these budget uncertainties?

Highlights of DTC Activities
Hired DTC software engineer (Eugene Mirvis) to work on NEMS task at EMC, in collaboration with EMC staff.
Hired DTC scientist (Mrinal Kanti Biswas) to work on hurricane task.
Established the Community GSI repository, and a process for the community to contribute to GSI development.
Started the development of DET modules.
Tested WRF community code for 2011 operational hurricane prediction at NCEP.
Additional highlights will be presented by task leads.

GSI R2O Transition Procedure (draft)

[Flow diagram: community-research code-development candidates enter via DTC and EMC development branches; the DTC and EMC trunks are kept in sync.]

1. GSI Review Committee - initial scientific review
2. DTC - developer code merging and testing
3. GSI Review Committee - code and commitment review
4. DTC->EMC GSI code commitment
5. EMC->DTC GSI repository syncing

GSI Transition: O2R and R2O

Past (two years ago):
Distributed development
Different applications and operational requirements (GFS, RTMA, NAM, RR, AFWA, ...)
Manual version control
No system documentation
Limited system support (for GSI partners and community)
Non-portable system

Current:
GSI Review Committee (DTC, NCEP/EMC, NASA/GMAO, NOAA/ESRL, NCAR/MMM-AFWA)
Multiple-platform GSI system supported by DTC
SVN version control (dual repositories with synced trunk and localized development branches)
Completed GSI User's Guide and website
Annual community release and residential tutorial
Community user support by DTC ([email protected])
R2O infrastructure and application

Support by EMC (John Derber) was essential!

HWRF code management: atmospheric model

[Timeline figure: WRF releases V2 (05/2004), V3.1.1 (07/2009), V3.2 (03/2010), V3.2.1 (08/2010), V3.2.1+ (09/2010 and 02/2011), and V3.3 (04/2011), alongside the operational HWRF track: oper HWRF2009; oper HWRF2010 "R2"; 2010 baseline "R1"; Tutorial and HWRF Beta release; contributions from EMC and NOAA Research to HWRF (preliminarily tested 3rd nest, new nest motion); 2011 Baseline "R2 final"; HWRF release; and HWRF for operations (2011).]

Functionally-equivalent T&E suite

The Developmental Testbed Center HWRF 2011 Baseline Test Plan
Point of Contact: Ligia Bernardet
December 15, 2010

Introduction
The DTC will be performing testing and evaluation for the Hurricane WRF system, known as HWRF (Gopalakrishnan et al. 2010). HWRF will be configured as close as possible to the operational HWRF model, employing the same domains, physics, coupling, and initialization procedures as the model used at NOAA NCEP Central Operations and by the model developers at NCEP EMC. The configuration to be tested matches the 2011 HWRF Baseline, which is the configuration that served as control for all developments at EMC geared towards the 2011 operational implementation.

• Pre-processing (including ability to read binary GFS in spectral coordinates)
• Cycled HWRF vortex initialization and relocation
• GSI Data Assimilation
• Coupled (POM + WRF) model
• Post-processing
• Tracking
• NHC Verification & confidence intervals
• Display
• Archival
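The component list above includes NHC verification with confidence intervals. One common way to attach a confidence interval to a mean verification statistic such as track error is a percentile bootstrap; the sketch below is illustrative only (the error values are made up, and this is not necessarily the DTC's actual procedure).

```python
import random

def bootstrap_mean_ci(errors, n_resamples=10000, alpha=0.05, seed=0):
    """Percentile-bootstrap confidence interval for the mean of a sample."""
    rng = random.Random(seed)
    n = len(errors)
    # Resample with replacement and collect the mean of each resample
    means = sorted(
        sum(rng.choice(errors) for _ in range(n)) / n
        for _ in range(n_resamples)
    )
    lo = means[int(alpha / 2 * n_resamples)]
    hi = means[int((1 - alpha / 2) * n_resamples) - 1]
    return lo, hi

# Illustrative 48-h track errors in nautical miles (hypothetical numbers)
errors = [92, 110, 75, 140, 88, 105, 120, 95, 83, 101]
low, high = bootstrap_mean_ci(errors)
```

The resulting (low, high) pair brackets the sample mean and widens as the sample gets smaller or noisier, which is why extended retrospective test periods matter for drawing conclusions.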

SAB Recommendations and DTC Responses

General Conclusions:
SAB believes DTC currently risks becoming spread over too many projects and directions. While most activities are relevant, some tasks appear to be peripheral to DTC and critical U.S. NWP goals.

DTC response: We restructured the DTC tasks for AOP 2011 into five focused areas. We examined all tasks to ensure their relevance to the core missions of DTC. The new approach should improve the way we communicate DTC activities to the outside community.

The SAB believes that these accomplishments, although substantial, have not been sufficient to realize DTC's core objectives, as noted in the introduction. For example, a functionally equivalent operational environment available for extended-period tests is not available, although progress has been made in that direction.

DTC response:
AFWA considers DTC to have a functionally equivalent test environment for AFWA-related work.
DTC now has a functionally equivalent test environment for HWRF (including full cycling capability).
Efforts on a "functionally equivalent test environment" are needed for mesoscale modeling (e.g., NAM, NEMS, and NMM-B).

General Conclusions: To date, a substantial amount of DTC effort has been placed on the long-term development of infrastructure in a range of areas, such as the MET software package. However, it is time to place increased emphasis on infrastructure utilization that directly addresses DTC's central goals.

DTC response: SAB asked DTC to increase testing and evaluation efforts that directly contribute to the core missions of the DTC. With a flat budget, the efforts on community support have to be decreased.

Specific Recommendations and Conclusions
(1) The DTC should give priority to building a functionally equivalent operational environment to test and evaluate new NWP methods for extended retrospective periods and significant events using advanced verification tools. Such an environment should include operations-like data assimilation/cycling. Sufficient resources should be provided to ensure this is in place within 12 months.

DTC response: DTC will place greater effort on this, particularly related to mesoscale modeling. Assembling a functionally equivalent environment will require enhanced collaboration among several task areas, including mesoscale modeling, hurricane, data assimilation, and ensemble efforts.

(2) The SAB was in general agreement that a central role for DTC is to be the nation's model "scorekeeper" that would track mesoscale model forecast improvement and serve as the key benchmark center for U.S. mesoscale NWP.

DTC response: DTC will develop a test plan for Reference Configuration (RC) testing that can help track mesoscale model forecast improvement. RC testing will include EMC and AFWA operational configurations. The concept will provide a framework for tracking the improvements of WRF systems with releases of new versions. The test plan will be reviewed by SAB representatives, EMC, and MMM.

Increased efforts on T&E will require additional resources at the expense of community support services.

Specific Recommendations and Conclusions
(3) The establishment of a suite of model verification tools by DTC (including, but not limited to, the MET package) was appropriate, although several enhancements are still required. For example, plan-view spatial verification is a critical tool that is currently missing from MET. Furthermore, DTC verification should support the full range of mesoscale assets, such as ACARS, NEXRAD, and profiler data. However, the current lack of comprehensive, actionable verification statistics for major contemporary U.S. modeling systems (see 2 above) is of some concern, and higher priority must be given to making this information available more rapidly, even if that requires redirecting some of the resources currently being provided to construct and support more specialized MET capabilities.

DTC response: Some of these features are already available in MET. Further enhancement will require additional resources. DTC will contribute to actionable verification statistics for major contemporary U.S. modeling systems through Reference Configuration testing.

Specific Recommendations and Conclusions
(4) The board notes that DTC has supported workshops on some modeling systems that are not widely used in the community (e.g., the NMM model). Such activities may be justifiable, but it may be worthwhile to review whether their current frequency reflects optimal use of DTC resources.

DTC response: DTC does not hold NMM workshops. DTC and EMC need to decide on NMM-B (and NEMS) tutorials, as well as joint ARW-NMM tutorials with MMM, for AOP 2012.

(5) The committee believes that although the DTC itself must maintain competency and knowledge in the new NWS model infrastructure (NEMS), for the immediate future there is little requirement for any significant NEMS outreach activities to the research community.

DTC response: DTC has no plan to start a tutorial on the NEMS framework. DTC hired a staff member to work at EMC to gain expertise in NEMS, and will have the ability to implement components of other modeling systems into the NEMS-based EMC operational system for testing and evaluation.

Louis Uccellini believes that the most effective R2O is for the research community to use operational systems for their research.

Specific Recommendations and Conclusions
(6) The general sentiment of the SAB is that, although the NOAA Hydrometeorological Testbed (HMT) and Hazardous Weather Testbed (HWT) are important national endeavors, DTC participation in these activities may be a secondary priority with respect to DTC's core mission.

DTC response: Testbed collaborations are important to NOAA. HWT and HMT collaborations are no longer identified as DTC task areas. Rather, they will be DTC special projects that are directly linked to a few DTC focused areas. Efforts are made to ensure that these projects contribute to DTC core missions.

(7) The committee believes that the DTC can potentially play a unique role in bringing the research and operational communities together to examine and address key national NWP problems. Dealing with existing deficiencies in NWP physics parameterizations is one such problem.

DTC response: For AOP 2011, the DTC will organize a Physics Workshop, in collaboration with EMC and the academic community, to be held in the WDC area in August 2011.

Specific Recommendations and Conclusions
(8) ... there remained concern on the board that, because DET is inevitably a costly long-term project, it may limit DTC's ability to demonstrate sufficient short-term value and relevancy to its sponsoring agencies and the NWP community at large. Thus, some difficult resource decisions may become necessary.

DTC response: DET is an important activity for NWS to realize the goals outlined in the "White Paper". DTC will continue to explore additional resources to augment and support this effort. DET needs to quickly demonstrate short-term value and relevancy to the sponsoring agencies and NWP community at large.

(9) ... Based on this experience and a review of previous cycles of the program, the board believes that the visitor program is an appropriate and important element of DTC activities. However, the board believes that the current program casts too wide a net in its request for proposals. ... Future calls should be crafted to clearly identify one or two mission-critical areas where collaborations with applicants having particularly relevant expertise can significantly augment DTC capabilities and hasten transition of effective NWP solutions to the broader community.

DTC response: DTC will improve the process for the visitor program for 2011. Input from the MB will be an important part of this process!

Specific Recommendations and Conclusions
(10) Regarding hurricane-related activities, the board felt that there is a need to establish and maintain a reference configuration and operational testbed for the Hurricane WRF (HWRF). Such a testbed must include a data assimilation component. Furthermore, there is a need for the development and support of relevant diagnostic tools for hurricanes.

DTC response: The HWRF system maintained by DTC includes a data assimilation component. DTC will include HWRF as a Reference Configuration for testing, with fully cycled data assimilation. DTC will also develop and support relevant diagnostic tools, in collaboration with HFIP partners, pending HFIP support.

AOP 2011 Tasks

DTC Director's Office
Five focused areas: Mesoscale modeling | Hurricanes | Data assimilation | Ensembles | Verification
DTC Visitor Program
Testbed collaborations: HWT & HMT

1. DTC activities are distilled into five focused areas
2. HWT & HMT are cross-DTC special projects, with contributions to DTC focused areas identified
3. NEMS is included as part of mesoscale modeling

Required budget for proposed AOP 2011 activities

Proposed Task                         Budget
Director's Office + Visitor Program   $873,436
Mesoscale modeling (including NEMS)   $1,032,363
Verification                          $464,989
Data assimilation                     $633,394
Hurricanes                            $1,262,758
Ensembles                             $723,442
HWT                                   $206,636
HMT                                   $321,798
Total                                 $5,517,818

Total actual funding for AOP 2010 was $5,070K. The AOP 2011 total is an 8.8% increase.
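The year-over-year change implied by the two totals quoted on the slides can be checked directly:

```python
# Totals as stated on the slides (in dollars)
aop_2010_actual = 5_070_000    # total actual funding for AOP 2010 ($5,070K)
aop_2011_proposed = 5_517_818  # proposed AOP 2011 total

# Relative increase of the proposed budget over the actual funding
increase = (aop_2011_proposed - aop_2010_actual) / aop_2010_actual
print(f"{increase:.1%}")  # prints 8.8%
```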

Future Directions and Challenges
Mesoscale modeling, NMM-B and NEMS:
NCEP is moving forward with NMM-B and NEMS, and expects DTC to entrain the community into these operational systems.
SAB does not recommend DTC provide community support for NMM-B and NEMS.
DTC needs a stronger linkage to mesoscale modeling at NCEP (that will contribute directly to NCEP operations). The Physics Workshop is a start.
Future NCEP short-range ensembles will include NMM-B and ARW in the NEMS framework.
In 2012, DTC needs to decide whether to continue joint ARW-NMM tutorials (and whether to offer a NEMS tutorial).

Future Directions and Challenges
Data Assimilation:
NCEP is moving toward GSI-EnKF hybrid data assimilation for global modeling.
Multiple EnKF systems are being developed under HFIP sponsorship.
AFWA asked DTC to examine a few regional EnKF systems for possible operational use.
DTC needs to work with EMC and AFWA to decide on a community hybrid GSI-EnKF system for both regional and global NWP applications. DTC should not support multiple EnKF systems for the community.

Future Directions and Challenges
Hurricanes:
DTC made significant progress in merging the operational HWRF code into the WRF repository, and in testing the community HWRF code for operations.
Need to assess the impact of EnKF on hurricane prediction.
Need to facilitate transfer of improvements made by the HFIP community into operations (i.e., the Hurricane Test Plan).
Future of the operational hurricane model at NCEP: Will HWRF migrate toward NMM-B on NEMS? Should we consider migrating toward AHW?

Future Directions and Challenges
Ensembles:
Very good start on the development of the DET infrastructure and first few modules during AOP 2010.
Need to demonstrate sufficient short-term value to operations and the NWP community:
EMC: next-generation SREF
HWT: collaboration with CAPS and SPC on cloud-scale ensembles
HMT: ensemble verification
Making the DET modules available to the community
DET test and evaluation activities will require significant compute resources.
Additional resources are required to accelerate development.

THANK YOU!

Louisa Nance

DTC Director’s Office

Director's Office Responsibilities
Internal Coordination:
Manage and coordinate DTC tasks
Planning, budgeting, execution and reporting

External Communication:
Conduct or assist with workshops and tutorials
Interact with DTC partners on collaborative efforts
Create and maintain the DTC website
Provide administrative support for DTC EC, MB & SAB meetings
Host the DTC Visitor Program

Director's Office Staff
Management:
Bill Kuo - Director (0.5 FTE)
Louisa Nance - Assistant Director (0.5 FTE)
Barb Brown - JNT Director (0.10 FTE)
Bonny Strong - JNT DTC Manager (0.25 FTE)
Steve Koch - GSD Director (0.10 FTE)
TBD - GSD DTC Manager (0.25 FTE)

Administrative Support:
Pam Johnson (0.50 FTE)

Total: 2.20 FTEs

Major accomplishments for AOP 2010
Internal Coordination:
DTC staff retreat (Apr 2010)
Monthly staff / task lead meetings

External Communication:
DTC Management Meetings
1st face-to-face meeting of DTC Executive Committee (Aug 2010)
1st SAB/MB meeting (Sept 2010)

DTC Visitor Program:
Announcement of Opportunity (Jun 2010)
Selected 6 proposals from 22 submissions for funding (Sept 2010)
Selected hosts from DTC staff for each project
Hosted initial visits for 3 projects

DTC Visitor Program

Principal Investigator | Home Institution | Project Type | Project Title

Brian Ancell | Texas Tech University | PI + student | Development of operational Weather Research and Forecasting model ensemble sensitivity/data assimilation tools

Michael E. Baldwin / Kimberly Elmore | Purdue University / University of Oklahoma | PI | Incorporating spatial analysis of forecast performance into the MET package at the DTC

Vincent E. Larson | University of Wisconsin - Milwaukee | PI | A generalized parameterization for clouds and turbulence in WRF

Sarah-Jane Lock | University of Leeds | PI | Cut-cell representation of orography: Exploring an alternative method for dealing with flows over steep and complex terrain

Don Morton | University of Alaska Fairbanks | PI | Alaska High Resolution Rapid Refresh (HRRR-AK) - Verification and study of model configuration for operational deployment

Di Wu & Xiquan Dong | University of North Dakota | PI + student | Evaluation of WRF microphysics schemes with observations in 3D environment (Withdrawn)

Proposed activities for AOP 2011
Internal Communication:
Implement tools to assist with task and staff coordination
Provide a framework for cross-coordination between overlapping activities of major task areas
Coordinate strategic planning activities

External Communication:
Continue making improvements to the DTC website
Host MB, EC & SAB meetings and keep open channels of communication

DTC Visitor Program:
Host visitors for 5 funded projects
Prepare a new AO & select next round of visitor projects
Provide support for upcoming workshops and tutorials

Challenges for Director's Office
Meeting the needs/interests of the NWP community during a tight funding period
Cross-coordination between task areas:
Verification, HMT & HWT
EnKF for DA, Hurricanes & Ensembles
Determining appropriate focus and scope for the next DTC Visitor Program Announcement of Opportunity

Jamie Wolff

Mesoscale Modeling

Collaborators: NOAA/NCEP/EMC, NOAA/ESRL/GSD, NCAR/NESL/MMM

Mesoscale Modeling Goals
Community Support:
Maintain, support and facilitate contributions to the community-released code repositories, currently including:
WRF**
Post-Processing Software*
Support community outreach events**
(in collaboration with MMM* and EMC*)

Testing and Evaluation:
Extensively test and evaluate a variety of model configurations that will provide benefits to both the operational and research communities: Reference Configurations (RCs)
Provide a functionally equivalent operational environment for testing and evaluating new techniques or capabilities

Major Accomplishments for AOP 2010
Community outreach and support**: WRF v3.2 release, WRF Workshop, WRF Tutorials, wrfhelp email
Significant progress on transition from WPP to UPP: established community-UPP code repository (in collaboration with MMM*, EMC*, GSD* and AFWA*)
Published/updated online docs for past & current activities on the DTC T&E webpage: http://www.dtcenter.org/eval
Designated several DTC RCs and retested two with WRF v3.2.1: http://www.dtcenter.org/config
Performed evaluation of GFS/NAM precipitation forecast comparison; presented results at relevant conferences
Began to develop DTC expertise in NEMS software: Software Engineer hire focused on NEMS development (at EMC); held a technical information exchange meeting (October 2010) with EMC, GSD, MMM, and DTC representatives

Highlight: GFS/NAM Precip Forecast Comparison

FSS: NAM15 is consistently higher than the GFS60 across multiple thresholds (12-h lead time shown)

MODE: Counts (left) and size distribution (right) for all objects defined within the NAM4 forecast are more consistent with the observation field than the GFS4 forecast
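The FSS result above refers to the Fractions Skill Score, a neighborhood verification statistic. The sketch below is a minimal, unoptimized implementation of the standard FSS (Roberts and Lean 2008); the threshold and neighborhood size in the example are illustrative and not taken from the DTC comparison.

```python
import numpy as np

def neighborhood_fractions(binary_field, n):
    """Fraction of points exceeding the threshold within an n x n window."""
    padded = np.pad(binary_field.astype(float), n // 2, mode="constant")
    out = np.empty(binary_field.shape, dtype=float)
    for i in range(binary_field.shape[0]):
        for j in range(binary_field.shape[1]):
            out[i, j] = padded[i:i + n, j:j + n].mean()
    return out

def fss(forecast, observed, threshold, n):
    """Fractions Skill Score for one threshold and one neighborhood size."""
    pf = neighborhood_fractions(forecast >= threshold, n)
    po = neighborhood_fractions(observed >= threshold, n)
    mse = np.mean((pf - po) ** 2)
    mse_ref = np.mean(pf ** 2) + np.mean(po ** 2)  # worst-case (no-overlap) MSE
    return 1.0 - mse / mse_ref if mse_ref > 0 else np.nan
```

FSS is 1 for a perfect forecast and approaches 0 when forecast and observed rain areas never overlap within the neighborhood, which is why it is well suited to comparing models of different resolution such as NAM and GFS.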

Community Support - AOP 2011
Proposed Activities:
Maintain/enhance WRF regression testing*
Continue ongoing efforts in community support**
Finish preparations of community-UPP software package for distribution*: extensively test, write documentation, provide community support upon release

Anticipated major accomplishments:
WRF release (v3.3, April 2011)**
Community-UPP version 1.0 release (April 2011)*
Bi-annual WRF Tutorials (July 2011, January 2012)**
Annual WRF Users Workshop (June 2011)*
(in collaboration with MMM* and EMC*)

Testing and Evaluation - AOP 2011
Proposed Activities:
Continue expansion of RC testing and evaluation
Strengthen the foundation of DTC expertise with the NEMS software
Continue to contribute to development process in select areas (e.g., portability, inter-operability, I/O layer capabilities)
Co-host physics workshop for mesoscale modeling
Work on publications for DTC methodology and results from select test activities

Anticipated major accomplishments:
New RC designations or retests, as appropriate
Functionally equivalent operational environment established at the DTC to assess new NWP methods
White paper outlining short-term and longer-term approaches for making significant progress toward improving physics parameterizations
Manuscript(s) submitted to appropriate peer-reviewed journals

Resource requirements for AOP 2011

Staff Category       FTE
Scientists           3.83 (0.43)
Software Engineers   2.21 (0.05)
Students             0.68 (0.18)

Scientists: Jimy Dudhia, Tressa Fowler, Eric Gilleland, Michelle Harrold, Tara Jensen, Louisa Nance, Ed Tollerud, Wei Wang, Jamie Wolff, Chunhua Zhou, AS II
Software Engineers: Laurie Carson, Dave Gill, John Halley Gotway, Chris Harrop, Eugene Mirvis, Paul Oldenburg, Tricia Slovacek, Brent Ertz, Lara Ziady
Students: Stanislav Stoytchev, Zach Trabold

Non-salary Cost
Travel        $15K
Workshops     $25K
Publications  $2K

Ligia Bernardet

Hurricanes

External collaborators:
NOAA Environmental Modeling Center
NOAA Geophysical Fluid Dynamics Laboratory
NOAA Atlantic Oceanographic and Meteorological Laboratory
NCAR Mesoscale and Microscale Meteorology Division
University of Rhode Island

Hurricanes Goals
Facilitate transfer of research to operations by creating a framework for NCEP and the research community to collaborate
Support the community in using operational hurricane models
Develop and maintain a hurricane testing and evaluation infrastructure at DTC
Perform tests to assure integrity of community code and evaluate new developments for potential operational implementation

Major Accomplishments for AOP 2010
Release of HWRF to the community: code management, documentation, support to 150 registered users.
Transition of community code to EMC to serve as baseline for the 2011 operational implementation.
Creation of a functionally-equivalent infrastructure to run HWRF on jet (including data assimilation and cycling).
Testing and evaluation:
Routine extended regression tests to evaluate integrity of code
Ran 400 cases from the 2008 and 2009 hurricane seasons in preparation to designate a DTC RC.


Proposed activities for AOP 2011
Continue HWRF community support.
Continue HWRF code management, keeping the evolving community and EMC versions of HWRF connected.
Upgrade DTC testing suite by adding HYCOM, UPP and ability to run at high resolution.
Perform extensive testing of HWRF. Actual tests are TBD, but could include changes in resolution, alternate physics and initialization (EnKF).
Diagnostic activities:
Evaluate HWRF to understand weaknesses and sources of error.
Begin assembling a diagnostic toolbox for DTC and the community*

*Pending HFIP funding.

Anticipated major accomplishments for AOP 2011
Hurricane Prediction Test Plan, in collaboration with EMC and HFIP, describing plans for tests to be conducted by DTC in 2011 and a protocol for how future tests will be determined.
Hurricane Tutorial in April 2011.
Publication of a HWRF Reference Configuration.
Expanded WRF community code with addition of new developments for HWRF.
Expanded HWRF testing infrastructure on jet with the addition of new components (HYCOM and high resolution).
Input to NCEP pre-implementation decisions through HWRF testing and evaluation at DTC.

Resource requirements for AOP 2011

Staff Category       FTE
Scientists           3.7*
Software Engineers   2.3*
Students             -

Scientists: Shaowu Bao, Ligia Bernardet, Mrinal Biswas, Louisa Nance, Jamie Wolff, new AS II, new AS III*
Software Engineers: Timothy Brown, Laurie Carson, Christopher Harrop, Donald Stark, Tricia Slovacek*, Bonny Strong*

Non-salary Cost
Travel        $8K
Workshops     $5K
Publications  $1.5K

*Includes funding for hurricane diagnostic toolbox

Xiang-Yu (Hans) Huang (presented by Ming Hu)

Data Assimilation

Collaborators: AFWA, NCEP/EMC, NASA/GMAO, NOAA/GSD, NCAR/MMM

Data Assimilation Goals
Community code:
Provide current operational GSI capability to the research community (O2R) and a pathway for the research community to contribute to operational GSI (R2O)
Provide a framework to enhance collaboration among distributed developers

T&E:
Provide a rational basis to operational centers and the research community for selecting a GSI configuration for their NWP system
Explore alternative data assimilation methods - EnKF

Major Accomplishments for AOP 2010
GSI Community Code:
Established GSI Review Committee
Community support (v2 release, tutorial, & helpdesk)
Developer support
Maintained Community GSI repository and conducted appropriate testing
R2O applications

GSI T&E:
Configuration testing
Month-long data impact tests
Investigated several special issues

GSI R2O applications

GSI R2O Transition Procedure:
1. GSI Review Committee - initial scientific review
2. DTC:
• Merge code with the latest GSI trunk following the GSI coding standard
• Perform the DTC regression test
• Perform impact study of the code changes
3. GSI Review Committee - code and commitment review
4. DTC->EMC GSI code commitment
5. EMC->DTC GSI repository syncing

Applications:
1. GSD cloud analysis package for Rapid Refresh operation
2. Assimilation of surface PM2.5 observations in GSI for CMAQ regional air quality model
3. Portability issues from repository testing
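Step 2 of the transition procedure includes a DTC regression test. At its core, a regression check of this kind compares a candidate analysis field against a stored baseline within a tolerance; the sketch below is a hedged illustration of that idea, not the actual DTC harness, and the field values are made up.

```python
import numpy as np

def regression_check(baseline, candidate, rtol=0.0, atol=1e-6):
    """Return (pass_flag, max_abs_diff) comparing a candidate field to a
    baseline field within absolute/relative tolerances."""
    diff = np.abs(candidate - baseline)
    max_diff = float(diff.max())
    passed = max_diff <= atol + rtol * float(np.abs(baseline).max())
    return passed, max_diff

# Toy 2x2 temperature analysis field (K), hypothetical values
baseline = np.array([[287.1, 287.4], [286.9, 287.0]])
candidate = baseline + 2e-7  # a tiny, tolerable numerical difference
ok, max_diff = regression_check(baseline, candidate)
```

A code merge that changes answers beyond the tolerance would fail this check and trigger the impact study called for in the same step.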

Proposed activities for AOP 2011
Community Code:
Continue to provide support for Community GSI package
Continue to maintain Community GSI repository
Coordinate GSI Review Committee meetings and activities
Test (and adjust if necessary) procedure for managing R2O transition with input from GSI Review Committee
Explore alternative DA methods/systems, including collaboration and community support

Testing and Evaluation:
Conduct extensive tests of end-to-end system
GSI baseline tests (including comparison with WRF-Var)
Regional EnKF system
Reassess the long-term strategy for DA task

Anticipated major accomplishments for AOP 2011
GSI Community Code:
Annual Community GSI Tutorial/Workshop
GSI v3.0 release
GSI community contribution (R2O) procedure and implementation

Testing and Evaluation:
Final report summarizing results of GSI and WRF-Var comparison experiments
Final report summarizing the DART-EnKF test results and recommendation for regional EnKF testbed

Resource requirements for AOP 2011

Staff Category       FTE
Scientists           3.2
Software Engineers   0.5

Scientists: Kathryn Crosby, Hans Huang, Ming Hu, Hui Shao, Chunhua Zhou
Software Engineers: Don Stark

Non-salary Cost
Travel        $18K

Tressa L. Fowler

Verification

In collaboration with AFWA, NASA, HWT, HMT

Verification Goals
MET Development:
Provide complete, quality verification tools to the NWP community.

MET Support:
Provide instruction for and demonstration of those tools.

Major Accomplishments for AOP 2010
MET Support:
New expanded tutorials
MET-related talks at AMS
HWT and HMT collaborations

MET Development:
Workshop
Release of MET v3.0
Adaptation and demonstration of METViewer for NCEP database

Proposed activities for AOP 2011
MET Support:
Semi-annual tutorials, expanded and improved.

MET Development:
Annual workshop
Improved methods for verifying clouds
Research on methods for verifying through time
Improvements in ensemble methods in collaboration with DET
Expanded capabilities of METViewer
Initial capabilities to verify hurricanes

Anticipated major accomplishments for AOP 2011
MET Support:
Improved tutorials

MET Development:
Informative workshop
MET software release
Improved ensemble, cloud, and time verification
METViewer database and display release

Resource requirements for AOP 2011

Staff Category       FTE
Scientists           1.40 (0.65)
Software Engineers   2.50 (0.73)
Students             0.40 (0.40)

Scientists: Tressa Fowler*, Tara Jensen*, Eric Gilleland, Michelle Harrold, Anne Holmes, Ed Tollerud
Software Engineers: Randy Bullock*, John Halley Gotway*, Paul Oldenburg*, Bonny Strong*
Students: Lisa Coco

Non-salary Cost
Travel        $11K ($5K)
Workshops     $26K ($1K)

*Contributions from testbed collaborations - proposed work presented separately

Zoltan Toth

Ensembles

DET website: http://www.dtcenter.org/det

External collaborators:
NOAA Environmental Modeling Center
Hazardous Weather Testbed
Hydrometeorological Testbed
Hurricane Forecast Improvement Project

Ensembles Goals
Develop & maintain DET infrastructure:
Six modules
User interface - DET Portal

Establish benchmark:
Functionally reproduce NCEP operational system
First benchmark is NCEP's upcoming implementation(s)

Test & evaluate community-developed methods:
Collaborative work with the research community

Link with other testbeds and programs/projects:
Support ensemble activities (HMT, HWT, HFIP, etc.)
Link with user community

Major Accomplishments for AOP 2010

Plans developed for
- Overall architecture of DET infrastructure
- Each of the 6 modules
- Test and evaluation

Established and tested basic capabilities for 2 modules
- Configuration
- Initial perturbations

Collaboration with other DTC tasks/projects
- HMT – Joint plans for testing DET & HMT ensemble generation
- HWT – Joint plans for evaluation of 2011 HWT ensemble products

Outreach
- Organized DET Workshop and engaged with WRF Ensemble WG
- Activities coordinated with NCEP/EMC via regular meetings

[Diagram: DET modules – Module 1: Configuration; Module 2: Initial Perturbations; Module 3: Model Perturbations; Module 4: Statistical Post-Processing; Module 5: Product Generation; Module 6: Verification – with External Input (HMT, HWT, HFIP, etc.)]

Ongoing work on Modules 1-2

Establishing basic capabilities, closely coordinated with EMC

6-member ensemble test (Dec 2010)
- ARW with various physics
- NA SREF domain, 22 km grid spacing; to be expanded S & E for HFIP
- GFS initial conditions, GEFS lateral boundary conditions
- Initial perturbations from GEFS are cycled (dynamically downscaled) and tested against interpolated GEFS initial perturbations
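The contrast between cycled and interpolated initial perturbations can be sketched in toy form. This is not the DET implementation: the function names, the nearest-neighbor regridding, and the breeding-style amplitude rescaling are illustrative assumptions only.

```python
import numpy as np

def interpolated_perturbation(global_pert, regional_shape):
    """Interpolated approach (toy): regrid a coarse global (GEFS-like)
    perturbation onto the regional grid fresh at each cycle.
    Nearest-neighbor regridding stands in for real interpolation."""
    gy, gx = global_pert.shape
    ry, rx = regional_shape
    yi = np.arange(ry) * gy // ry
    xi = np.arange(rx) * gx // rx
    return global_pert[np.ix_(yi, xi)]

def cycled_perturbation(pert, model_step, n_cycles, target_rms):
    """Cycled (dynamically downscaled) approach (toy): let the regional
    model act on the perturbation each cycle, then rescale its amplitude
    so that it neither collapses nor grows without bound."""
    for _ in range(n_cycles):
        pert = model_step(pert)                 # regional dynamics act on the perturbation
        rms = np.sqrt(np.mean(pert ** 2))
        if rms > 0:
            pert = pert * (target_rms / rms)    # breeding-style rescaling
    return pert
```

The cycled perturbation inherits regional-scale structure from the regional model dynamics, whereas the interpolated one only ever contains the smooth, coarse-grid structure of the global field.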

Proposed activities for AOP 2011

Establish benchmark for initial perturbation module
- Testing & evaluation to contribute to next NCEP SREF implementation

Establish basic capability for model perturbation module
- Capability of using different versions of NMM, and possibly ARW, under NEMS

Interface with other testbeds & projects
- Evaluation of HMT ensemble
- Products & evaluation for HWT ensemble
- Joint planning for HFIP ensemble development & testing

Continued engagement with community
- Co-organize 5th Ensemble User Workshop with NCEP & possibly NUOPC

Anticipated major accomplishments for AOP 2011

- Test results and software for cycling initial perturbations
- ARW & NMM incorporated into DET under NEMS
- Improved user interface – basic capability for DET Portal
- Report on HMT & HWT ensemble product evaluation (hydromet & hazardous weather forecast products)
- Verification metric packages identified for hydromet & hazardous weather, in collaboration with HMT, OHD, NCEP, etc.
- 5th Ensemble User Workshop

Resource requirements for AOP 2011

Staff Category       FTE
Scientists           1.90 (0.40)
Software Engineers   1.40 (0.25)
Students             0.10 (0.10)

Scientists:          Software Engineers:
Barbara Brown        Brent Ertz
Michelle Harrold     Anne Holmes
Isidora Jankov       Eugene Mirvis
Tara Jensen*         Paula McCaslin
Ed Tollerud          Paul Oldenburg
Zoltan Toth          Linda Wharton
New DET Task Lead    SEII New Hire

Students:
Lisa Coco

Non-salary Cost
Travel: $15 K ($6 K)
Workshops: $12 K
Publications: $3 K

*Contributions from testbed collaborations – proposed work presented separately

Issues

Computational resources for DET
- Collaboration with HFIP established – access to Jet in Boulder; run HFIP regional ensemble embedded in the DET NA ensemble
- TeraGrid start-up allocation secured for testing portability; full allocation will be requested in spring
- NOAA Site B research computer – how to request an allocation for DET for FY12 & beyond?
- New NCAR facility – how to request an allocation for DET?

Real-time testing of DET ensemble
- NA domain, with embedded HMT, HWT, HFIP, etc. (movable) nests?
- Subject to availability of additional resources

Accelerate development of benchmarks for statistical post-processing & products
- Subject to availability of additional resources

Ed Tollerud

HMT/DTC Collaboration

External Collaborators:
- NOAA Earth System Research Laboratory
- NOAA Environmental Modeling Center
- California Nevada River Forecast Center
- California NWS forecast offices

HMT/DTC Collaboration Goals

- Work toward the DTC mission statement: improvement through verification/evaluation of EMC operational models
- Improve forecasting methods for extreme precipitation events: model techniques, data impacts, and physical parameterizations
- Demonstrate the usefulness of prototype ensemble forecast systems, thereby providing long-range guidance for the operational ensemble forecast systems in development at EMC
- Collaborate with and advance the missions of the cross-cutting DTC tasks: model development, verification tools, ensemble methods

HMT-West ensemble QPF evaluation is the current focus for attaining these goals; HMT-West is an effective testbed for evaluating forecasting methods in a real-life setting.

Major Accomplishments for AOP 2010

- HMT-West winter exercise real-time demonstration website for QPF verification
- Mesoscale modeling: operational EMC model QPF verification
- Ensemble modeling (DET): verification for the WRF regional ensemble model
- Data impact analyses: verification uncertainty due to observational data-stream choices and data quality
- Verification methods: MODE and the assessment of Atmospheric River forecasts

Proposed activities for AOP 2011

- Evaluate the impact of microphysical schemes on operational and research model performance using the HMT-West observation base
- Perform QPF verification of heavy-rain events for EMC operational and research models, the HMT-West WRF ensemble, and others
- Expand the ensemble/probabilistic content of the HMT-West verification demonstration
- Investigate the use of moisture flux for MODE object identification and its value for model AR forecast assessment

Anticipated major accomplishments for AOP 2011

Written evaluation of QPF for several EMC operational models and regional ensemble systems using 1-2 year statistics from HMT winter exercises

Development of new techniques to verify microphysical forecasts in time and space domains

An expanded and interactive verification website with new aggregation, regionalization, and probabilistic scoring options

Identification of effective MODE-based methods to evaluate leading edge, moisture flux, and other AR attributes
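The MODE-based methods above rest on object identification, which in MODE amounts to convolution smoothing of a field followed by thresholding and connected-region labeling. A toy sketch follows; a box filter stands in for MODE's circular filter, and the radius and threshold values are arbitrary illustrations, not MODE defaults.

```python
import numpy as np

def identify_objects(field, radius=2, threshold=5.0):
    """MODE-style object identification (simplified): smooth the field,
    threshold it, then label connected regions (4-connectivity).
    Returns (label array, number of objects)."""
    h, w = field.shape
    k = 2 * radius + 1
    # Box-filter smoothing (stand-in for MODE's circular convolution filter).
    padded = np.pad(field, radius, mode="edge")
    smooth = np.zeros((h, w), dtype=float)
    for dy in range(k):
        for dx in range(k):
            smooth += padded[dy:dy + h, dx:dx + w]
    smooth /= k * k
    mask = smooth >= threshold
    # Connected-component labeling by iterative flood fill.
    labels = np.zeros((h, w), dtype=int)
    count = 0
    for start in zip(*np.nonzero(mask)):
        if labels[start]:
            continue
        count += 1
        stack = [start]
        while stack:
            y, x = stack.pop()
            if not (0 <= y < h and 0 <= x < w):
                continue
            if not mask[y, x] or labels[y, x]:
                continue
            labels[y, x] = count
            stack.extend([(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)])
    return labels, count
```

Running the identifier on forecast and observed fields (e.g. QPF or a moisture-flux field, as proposed above) and comparing object counts and attributes is the essence of the object-based evaluation.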

Resource requirements for AOP 2011

Staff Category       FTE
Scientists           0.83
Software Engineers   0.53
Students             0.48

Scientists:          Software Engineers:
Ed Tollerud          John Halley Gotway
Tara Jensen          Paul Oldenburg
Tressa Fowler        Randy Bullock
                     Brent Ertz

Students:
Lisa Coco            Stanislav Stoytchev

Non-salary Cost
Travel: $8 K
Publications: $2 K

Compute Resources
NOAA Jet system

Tara Jensen

HWT-DTC Collaboration

Collaborating with:
- NOAA: SPC, NSSL, EMC, HPC, AWC, ESRL
- Universities: OU/CAPS
- NCAR: MMM
- UCAR: COMET

HWT-DTC Collaboration Goals

• Support the DTC mission by evaluating convection-allowing models/ensembles at the cutting edge of NWP
• Gather robust datasets for ensemble and deterministic studies
• Datasets and algorithms can be leveraged by DET and the Mesoscale focus area
• Demonstrate the utility of MET-based objective evaluation in a forecast environment
• Provide near real-time evaluation during the Spring Experiment
• Application of new MET tools feeds back into MET development

Focus areas: Mesoscale | Data Assimilation | Hurricanes | Ensembles | Verification

Major Accomplishments for AOP 2010

- Enormous expansion of near real-time evaluation capability
- Evaluation of 30 models during the Spring Experiment (CAPS ensemble + 3 operational baselines)
- 10 deterministic and 4 ensemble products evaluated using traditional and spatial verification methods
- Three additional research models available for retrospective studies

DTC staff participation in each week of Spring Experiment 2010

Papers and Presentations

- Kain et al., October 2010: Assessing Advances in the Assimilation of Radar Data and Other Mesoscale Observations within a Collaborative Forecasting–Research Environment, MWR
- Presentations at the 11th WRF Users' Workshop, 25th AMS Conf. on Severe Local Storms, FAA Interagency Meeting, 24th AMS Conf. on Weather and Forecasting, & 25th AMS Conf. on Hydrology

Example of Real-time Evaluation: Radar Echo Tops (RETOP)

Ensemble products are not always useful.

[Figure: RETOP panels – Observed; CAPS PM Mean; CAPS Ensemble Mean; members using Thompson, WSM6, WDM6, and Morrison microphysics]

Quick Glance at HWT-DTC Evaluation Results

- Frequency bias indicates the CAPS ensemble mean field exhibits a large over-forecast of the areal coverage of cloud complexes
- The ratio of MODE forecast objects to observed objects implies 2-4x over-prediction of convective cells by the CAPS ensemble mean, whereas the HRRR and CAPS 1-km ratios are near 1 for the first 15 hours

[Figure: Frequency bias time series for RETOP > 35 kft, comparing the CAPS 1-km run and the HRRR]
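The frequency bias behind these comparisons has a simple definition: the number of grid points where the forecast exceeds the event threshold divided by the number where the observation does. A minimal sketch (the field setup and threshold value below are illustrative, not the actual RETOP data):

```python
import numpy as np

def frequency_bias(forecast, observed, threshold):
    """Frequency bias = forecast event count / observed event count.
    Values > 1 mean the forecast over-predicts the areal coverage of the
    event (e.g. RETOP > 35 kft); values < 1 mean under-prediction."""
    f_yes = np.count_nonzero(forecast >= threshold)
    o_yes = np.count_nonzero(observed >= threshold)
    return f_yes / o_yes if o_yes else float("inf")
```

A bias of 3, for instance, would mean the forecast covers three times the observed area above the threshold, which is the kind of over-forecast flagged for the ensemble mean field above.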

Proposed activities for AOP 2011

Ensembles Focus Area: Help build the foundation for the Product Generation module by
- Identifying promising ensemble product algorithms through evaluation of NCEP and CAPS SSEF ensemble products
  - NCEP ensembles: SREF and HREF
  - Variables: reflectivity, accumulated precipitation, and synthetic satellite (if available)
- Working with EMC and CAPS to incorporate promising algorithms into the module

Verification Focus Area: Demonstrate MODE-Time Dimension (MODE-TD) for the convective initiation forecast problem


Anticipated major accomplishments for AOP 2011

Report describing the evaluation of real-time CAPS SSEF and NCEP products.

Begin incorporation of select NCEP and CAPS ensemble product algorithms into DET Product Generation module.

Demonstration of MODE-TD during Spring Experiment

DTC staff participation during each week of Spring Experiment.


Resource requirements for AOP 2011

Staff Category       FTE
Scientists           0.70
Software Engineers   0.45
Students             0.20

Scientists:          Software Engineers:
Tara Jensen          Paul Oldenburg
Michelle Harrold     John Halley Gotway
Tressa Fowler        Brent Ertz
                     Randy Bullock
                     Bonny Strong

Students:
Lisa Coco

Non-salary Cost

Travel: $11 K
Fall meeting: $1 K

Compute Resources: 8-12 dedicated processors from 1 May to 30 June