
Page 1

Forecast Verification Research

Barbara Brown, NCAR

With thanks to Beth Ebert and Laurie Wilson

S2S Workshop, 5-7 Feb 2013, Met Office

Page 2


Verification working group members

• Beth Ebert (BOM, Australia)
• Laurie Wilson (CMC, Canada)
• Barb Brown (NCAR, USA)
• Barbara Casati (Ouranos, Canada)
• Caio Coelho (CPTEC, Brazil)
• Anna Ghelli (ECMWF, UK)
• Martin Göber (DWD, Germany)
• Simon Mason (IRI, USA)
• Marion Mittermaier (Met Office, UK)
• Pertti Nurmi (FMI, Finland)
• Joel Stein (Météo-France)
• Yuejian Zhu (NCEP, USA)

Page 3


Aims

Verification component of WWRP, in collaboration with WGNE, WCRP, CBS

(“Joint” between WWRP and WGNE)

• Develop and promote new verification methods

• Training on verification methodologies

• Ensure forecast verification is relevant to users

• Encourage sharing of observational data

• Promote importance of verification as a vital part of experiments

• Promote collaboration among verification scientists, model developers and forecast providers

Page 4

Relationships / collaboration


[Diagram: collaboration links with WGNE, SDS-WAS, YOTC, HyMeX, Subseasonal to Seasonal Prediction, Polar Prediction, WGCM, WGSIP, TIGGE, SWFDP, CG-FV, SRNWP, COST-731]

Page 5


FDPs and RDPs

Sydney 2000 FDP

Beijing 2008 FDP/RDP

SNOW-V10 RDP

FROST-14 FDP/RDP

MAP D-PHASE

Severe Weather FDP

Typhoon Landfall FDP

Page 6


SNOW-V10

• Nowcast and regional model verification at obs sites

• User-oriented verification

– Tuned to decision thresholds of VANOC, whole Olympic period

• Model-oriented verification

– Model forecasts verified in parallel, January to August 2010

• Status

– Significant effort to process and quality-control observations

– Multiple observations at some sites provide estimates of observation error

Page 7


Wind speed verification (model-oriented)

Visibility verification (user-oriented): lam1k minimum visibility (m) at VOL, HSS = 0.095. Rows are forecast categories, columns are observed categories (visibility in m):

Forecast      |  < 30 | 30–50 | 50–200 | 200–300 | 300–500 | > 500 | Total
< 30          |     0 |     0 |      0 |       0 |       0 |     0 |     0
30 ≤ x < 50   |     0 |     0 |      0 |       0 |       0 |     0 |     0
50 ≤ x < 200  |     0 |     0 |     52 |      20 |      22 |    43 |   137
200 ≤ x < 300 |     0 |     0 |     76 |      18 |      19 |   103 |   216
300 ≤ x < 500 |     0 |     1 |     26 |      15 |      12 |    60 |   114
> 500         |     0 |     9 |    831 |     246 |     170 |  3743 |  4999
Total         |     0 |    10 |    985 |     299 |     223 |  3949 |  5466
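The HSS quoted on the slide is the multi-category Heidke skill score: the fraction of correct forecasts in excess of those expected by chance. A minimal sketch in Python, reproducing the 0.095 from the table above (the code itself is illustrative, not from the talk):

```python
# Multi-category Heidke skill score from the visibility contingency
# table above. Rows are forecast categories, columns observed ones.
import numpy as np

table = np.array([
    [0, 0,   0,   0,   0,    0],   # fcst < 30 m
    [0, 0,   0,   0,   0,    0],   # 30-50
    [0, 0,  52,  20,  22,   43],   # 50-200
    [0, 0,  76,  18,  19,  103],   # 200-300
    [0, 1,  26,  15,  12,   60],   # 300-500
    [0, 9, 831, 246, 170, 3743],   # > 500
])

n = table.sum()
hits = np.trace(table)                                        # correct forecasts
expected = (table.sum(axis=1) * table.sum(axis=0)).sum() / n  # chance hits
hss = (hits - expected) / (n - expected)
print(round(hss, 3))   # 0.095, matching the value on the slide
```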

Page 8


FROST-14

User-focused verification

• Threshold-based as in SNOW-V10

• Timing of events – onset, duration, cessation

• Real-time verification

• Road weather forecasts?

Model-focused verification

• Neighborhood verification of high-resolution NWP (see the sketch below)

• Spatial verification of ensembles

Account for observation uncertainty
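One widely used neighborhood method for high-resolution NWP is the fractions skill score (FSS) of Roberts and Lean (2008), which compares event frequencies in windows around each grid point instead of demanding point matches. A minimal sketch, with an illustrative threshold and window sizes; the talk does not specify which neighborhood method FROST-14 planned to use:

```python
# A minimal FSS sketch: binary event fields are smoothed into
# neighborhood fractions, then compared with an MSE-based skill score.
import numpy as np
from scipy.ndimage import uniform_filter

def fss(forecast, observed, threshold, window):
    """FSS of a forecast field against an observed field on the same grid."""
    f_frac = uniform_filter((forecast >= threshold).astype(float),
                            size=window, mode="constant")
    o_frac = uniform_filter((observed >= threshold).astype(float),
                            size=window, mode="constant")
    mse = np.mean((f_frac - o_frac) ** 2)
    mse_ref = np.mean(f_frac ** 2) + np.mean(o_frac ** 2)
    return 1.0 - mse / mse_ref if mse_ref > 0 else np.nan

# Example: skill rises with neighborhood size for a displaced field.
rng = np.random.default_rng(0)
fcst = rng.gamma(2.0, 1.0, size=(200, 200))      # synthetic rain field
obs = np.roll(fcst, 5, axis=1)                   # same field, shifted 5 points
for w in (1, 11, 41):
    print(w, round(fss(fcst, obs, threshold=1.0, window=w), 3))
```

FSS grows toward 1 as the window widens; the smallest window with useful skill is often reported as the skilful scale.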

Page 9


Promotion of best practice

Recommended methods for evaluating cloud and related parameters

Page 10


Promotion of best practice

Verification of tropical cyclone forecasts

1. Introduction

2. Observations and analyses

3. Forecasts

4. Current practice in TC verification – deterministic forecasts

5. Current verification practice – Probabilistic forecasts and ensembles

6. Verification of monthly and seasonal tropical cyclone forecasts

7. Experimental verification methods

8. Comparing forecasts

9. Presentation of verification results

Page 11

Verification of deterministic TC forecasts


Page 12

Beyond track and intensity…


[Figure panels: precipitation (MODE spatial method, Model 1 vs Model 2); track error distribution; TC genesis; wind speed]
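Deterministic track verification usually reduces to the great-circle distance between the forecast and observed storm centers, whose distribution over many cases gives plots like the track error panel above. A minimal sketch, with hypothetical positions rather than data from the talk:

```python
# Great-circle track error via the haversine formula; the 72-h
# positions below are hypothetical, purely for illustration.
import math

def track_error_km(lat_f, lon_f, lat_o, lon_o):
    """Distance (km) between forecast and observed storm centers."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat_f), math.radians(lat_o)
    dphi = math.radians(lat_o - lat_f)
    dlam = math.radians(lon_o - lon_f)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Forecast center vs. best-track center (hypothetical values)
print(round(track_error_km(23.5, -75.0, 24.1, -73.8), 1), "km")
```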

Page 13


Promotion of best practice

Verification of forecasts from mesoscale models (early DRAFT)

1. Purposes of verification

2. Choices to be made

a. Surface and/or upper-air verification?

b. Point-wise and/or spatial verification?

3. Proposal for a 2nd Spatial Verification Intercomparison Project in collaboration with Short-Range NWP (SRNWP)

Page 14


Spatial Verification Method Intercomparison Project

• International comparison of many new spatial verification methods

• Phase 1 (precipitation) completed

– Methods applied by researchers to the same datasets (precipitation; perturbed cases; idealized cases)

– Subjective forecast evaluations

– Weather and Forecasting special collection 2009-2010

• Phase 2 in planning stage

– Complex terrain

– MAP D-PHASE / COPS dataset

– Wind and precipitation, timing errors

Page 15


Outreach and training

• Verification workshops and tutorials

– On-site, travelling

– SWFDP (e.g., east Africa)

• EUMETCAL training modules

• Verification web page

• Sharing of tools

http://www.cawcr.gov.au/projects/verification/

Page 16


5th International Verification Methods Workshop, Melbourne, 2011

Tutorial

• 32 students from 23 countries

• Lectures and exercises (took tools home)

• Group projects - presented at workshop

Workshop

• ~120 participants

• Topics:

– Ensembles and probabilistic forecasts

– Seasonal and climate

– Aviation verification

– User-oriented verification

– Diagnostic methods and tools

– Tropical cyclones and high impact weather

– Weather warning verification

– Uncertainty

• Special issue of Meteorol. Applications in early 2013

Page 17

Seamless verification


Seamless forecasts:

– consistent across space/time scales

– single modelling system or blended

– likely to be probabilistic / ensemble

[Figure: forecast types (nowcasts, very short range, NWP, sub-seasonal prediction, seasonal prediction, decadal prediction, climate change) arranged by spatial scale (point, local, regional, global) against forecast aggregation time (minutes, hours, days, weeks, months, years, decades)]

Page 18

"Seamless verification" – consistent across space/time scales

• Modelling perspective – is my model doing the right thing?

– Process approaches

• LES-style verification of NWP runs (first few hours)

• T-AMIP style verification of coupled / climate runs (first few days)

• Single column model

– Statistical approaches

• Spatial and temporal spectra

• Spread-skill

• Marginal distributions (histograms, etc.)

Perkins et al., J.Clim. 2007
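The Perkins et al. (2007) approach scores the overlap between modelled and observed marginal distributions, i.e. the common area under the two PDFs. A minimal histogram-based sketch; the bin width and synthetic temperature data are illustrative assumptions:

```python
# Distribution-overlap skill score in the spirit of Perkins et al.
# (J. Climate, 2007): sum over bins of the minimum of the two
# normalized frequencies; 1 = identical distributions.
import numpy as np

def perkins_skill(model, obs, bins):
    zm, _ = np.histogram(model, bins=bins)
    zo, _ = np.histogram(obs, bins=bins)
    return np.minimum(zm / zm.sum(), zo / zo.sum()).sum()

rng = np.random.default_rng(1)
obs = rng.normal(15.0, 5.0, 10000)     # "observed" daily Tmax (deg C)
model = rng.normal(16.0, 6.0, 10000)   # model with a warm, broad bias
bins = np.arange(-10, 45, 0.5)
print(round(perkins_skill(model, obs, bins), 2))
```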

Page 19

"Seamless verification" – consistent across space/time scales

• User perspective – can I use this forecast to help me make a better decision?

– Neighborhood approaches - spatial and temporal scales with useful skill

– Generalized discrimination score (Mason & Weigel, MWR 2009): consistent treatment of binary, multi-category, continuous, and probabilistic forecasts (see the sketch after this list)

– Calibration - accounting for space-time dependence of bias and accuracy?

– Conditional verification based on larger scale regime

– Extreme Forecast Index (EFI) approach for extremes
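For continuous forecasts of continuous observations, the generalized discrimination score reduces to a two-alternative forced choice (2AFC) statistic: the probability that the forecasts correctly rank a randomly chosen pair of occasions with different outcomes. A minimal pairwise sketch with synthetic data; the O(n²) loop is a simplification, not Mason and Weigel's full formulation:

```python
# 2AFC sketch: over all pairs with different outcomes, count the
# fraction the forecasts rank correctly; tied forecasts score 0.5.
import numpy as np

def afc2(forecasts, observations):
    correct, pairs = 0.0, 0
    n = len(observations)
    for i in range(n):
        for j in range(i + 1, n):
            if observations[i] == observations[j]:
                continue                      # only discriminable pairs
            pairs += 1
            df = forecasts[i] - forecasts[j]
            do = observations[i] - observations[j]
            if df == 0:
                correct += 0.5                # tied forecasts: coin flip
            elif (df > 0) == (do > 0):
                correct += 1.0
    return correct / pairs if pairs else np.nan

rng = np.random.default_rng(2)
obs = rng.normal(size=200)
fcst = obs + rng.normal(scale=1.0, size=200)  # skilful but noisy
print(round(afc2(fcst, obs), 2))              # 0.5 = no discrimination
```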

• JWGFVR activity

– Proposal for research on verifying forecasts at the weather-climate interface

– Assessment component of UK INTEGRATE project

Page 20

Questions

• What should be the role of JWGFVR in S2S?

– Defining protocols? Metrics?

– Guidance on methods?

– Participation in activities?

– Linking forecasting and applications?

• What should be the interaction with other WMO verification activities? E.g., the Standardized Verification System for Long-range Forecasts (SVS-LRF); the WGNE/WGCM Climate Metrics Panel

• How do metrics need to change for S2S?

• How do we cope with small sample sizes? (a bootstrap sketch follows below)

• Is a common set of metrics required for S2S?
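On small sample sizes, one standard partial answer is to attach resampling confidence intervals to whatever score is reported, making the sampling uncertainty explicit when an S2S verification set has only tens of cases. A minimal sketch; the RMSE score, sample size, and plain pair-resampling are illustrative, and autocorrelated hindcasts may require block resampling instead:

```python
# Percentile bootstrap confidence interval for a verification score,
# resampling whole forecast-observation pairs with replacement.
import numpy as np

def bootstrap_ci(score_fn, fcst, obs, n_boot=10000, alpha=0.05, seed=0):
    rng = np.random.default_rng(seed)
    n = len(obs)
    scores = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(0, n, size=n)   # resampled case indices
        scores[b] = score_fn(fcst[idx], obs[idx])
    return np.quantile(scores, [alpha / 2, 1 - alpha / 2])

rmse = lambda f, o: np.sqrt(np.mean((f - o) ** 2))
rng = np.random.default_rng(3)
obs = rng.normal(size=40)                  # small, S2S-like sample
fcst = obs + rng.normal(scale=0.8, size=40)
print(bootstrap_ci(rmse, fcst, obs))       # wide interval: n is small
```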


Page 21

Database comments

• Database should be designed to allow easy access for

– Applications

– Verification

• Will need observations for evaluations and applications

– Will these (or links to these) be included in the database?

– Lack of obs can be a big challenge / detriment to use of the database

• Access to data

– For applications and verification, users often will not want a whole field or set of fields

– Also may want to be able to examine time series of forecasts at points

– Data formats and access can limit uses


Page 22

Opportunities!

• New challenges

– Methods for evaluating extremes

• Sorting out some of the thorny problems (small sample sizes, limited observations, etc.)

• Defining meaningful metrics associated with research questions

• Making a useful connection between forecast performance and forecast usefulness/value

– Application areas (e.g., precipitation onset in Africa)

– A new research area

• Using spatial methods for evaluation of S2S forecast patterns


Page 23


Thank you