
Downscaling Global Climate Models: What is it good for?

Adam Terando

Department of Biology, NCSU

How to generate robust projections?

Socio-Economic Projections (Population, Technology, Economy)
→ Energy Modeling (Emissions of GHGs & other substances)
→ Global Modeling (Climate, Chemistry)
→ Downscaling (Dynamical, statistical)
→ Impact Modeling (Hydrology, Ecosystems, etc.)

Source: K. Hayhoe

GLOBAL CLIMATE MODELS


Simulate the climate (or weather) using the physical laws that govern the motion of a fluid as well as the laws of thermodynamics.

Climate models are driven by fundamental physics.

1. Conservation of momentum (F=ma for pressure differences and the Coriolis force)

2. Hydrostatic equation (how pressure varies with height - gravitational force balanced by pressure gradient force)

3. Conservation of energy (change in energy is equal to net transfer across boundaries by advection, evaporation, condensation)

4. Continuity equation (conservation of mass – mass is neither created nor destroyed)

5. Equation of state (ideal gas law relates pressure, density and temperature)

6. Water vapor equation (accounts for changes in water vapor amounts due to advection, condensation, evaporation)

THE PRIMITIVE EQUATIONS: THE ROSETTA STONE OF CLIMATE SCIENCE

Equations of motion

Equation of mass conservation

Equation of state for an ideal gas

Equation of energy conservation
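In standard form (a sketch using common textbook notation; the slide's exact formulation may differ), these are:

    \frac{D\mathbf{v}}{Dt} = -\frac{1}{\rho}\nabla p - 2\,\boldsymbol{\Omega}\times\mathbf{v} + \mathbf{g} + \mathbf{F}   % momentum
    \frac{\partial \rho}{\partial t} + \nabla \cdot (\rho\,\mathbf{v}) = 0                                               % mass conservation
    p = \rho R T                                                                                                         % ideal gas law
    c_p \frac{DT}{Dt} - \frac{1}{\rho}\frac{Dp}{Dt} = Q                                                                  % energy (first law)

where v is the wind vector, ρ density, p pressure, Ω the Earth's rotation vector, F friction, T temperature, R the gas constant for air, and Q diabatic heating.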

GCMs

• Dynamic, physically based numerical models of the atmosphere, ocean, and land surface
• "Coupled" models: modeled ocean processes affect modeled atmospheric and terrestrial processes, and vice versa
• GCMs are NOT statistical or empirical models of climate based on a sample of observations

IPCC AR4 2007

GCMs

• Increasing complexity of climate models as computational power increases
• First models were very crude, basically energy balance models
• As computing power increases: more processes included, greater resolution, less 'tuning', more direct coupling, etc.
• Increasing sophistication, yet consistent predictions of warming as CO2 increases
• Major problems still exist, especially in areas like cloud formation and ice dynamics

IPCC AR4 2007

Why are high resolution projections needed?

Climate change is a global issue, but local information is needed to determine how it will affect human and natural systems around the world.

Global temperature change: 2–6 °C by 2100
Implications for Chicago: 1995-like heat waves up to 3×/year by 2100

Source: IPCC Third Assessment Report; Hayhoe et al. 2009. JGLR

Why are high resolution projections needed?

Average winter snowfall
[Map: Toronto, ON (YYZ): 53" · Buffalo, NY (BUF): 92" · Kissing Bridge, NY (KB): 150"+]

Source: K. Hayhoe

How are future projections generated?

Socio-Economic Projections (Population, Technology, Economy)
→ Energy Modeling (Emissions of GHGs & other substances)
→ Global Modeling (Climate, Chemistry)
→ Downscaling (Dynamical, statistical)
→ Impact Modeling (Hydrology, Ecosystems, etc.)

Source: K. Hayhoe

DOWNSCALING

Hayhoe 2009

How are high resolution projections generated?

DOWNSCALING: the simulation of sub-gridscale variables from coarser-resolution fields

WHERE DID DOWNSCALING COME FROM?

Downscaling developed from weather forecasting, as a way to correct the biases and other systematic errors in large-scale models that occurred at the local scale, where the information was being used.

PHYSICAL BASIS: the assumption that variables at a finer resolution than the spatial or temporal scale of the input are a reproducible function of (1) large-scale features resolvable by the input and (2) available high-resolution information

Hayhoe 2009

Each method has its strengths

Statistical:
- Can generate a large number of realizations in order to assess uncertainty
- Flexible, easy to use for a variety of applications, gridded or point output
- Doesn't require a lot of CPU
- Can relate GCM-derived data directly to impact-relevant variables not simulated by climate models

Dynamic:
- Delivers a meteorologically consistent downscaled variable response to forcing
- Explicitly simulates both large-scale and sub-grid-scale processes
- Can resolve many local-scale feedbacks, including biophysical feedbacks
- No need to assume the current relationship between large- and local-scale climate variables remains valid in the future

Source: K. Hayhoe

EMPIRICAL STATISTICAL DOWNSCALING

Empirical-Statistical Approaches

• Stochastic weather generators
• Weather typing
• Regression methods
• Delta
• BCSD (bias-correction and spatial disaggregation)
• Neural networks
• Bayesian analyses
• Clustering methods
• Combined statistical-dynamical approaches
• And so forth and so on…

Traditional approaches to statistical downscaling: 4 families

• Regression (linear, non-linear)
• Geostatistics (kriging)
• Weather generators (stochastic methods)
• Analogues / weather states (clustering)

Source: M. Vrac & K. Hayhoe

Statistical Downscaling Assumptions

1. GCM (large-scale) predictors are relevant to local climate

2. GCM predictions are realistic at the large scale

3. Transfer functions are valid under altered forcing conditions

4. Predictors fully represent the climate change signal
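Assumption 3 is the hardest to verify. One common check is a differential split-sample test: calibrate the transfer function on the coolest years and validate it on the warmest, as a rough proxy for altered forcing. A minimal sketch in Python (all variable names and the linear transfer function are illustrative):

    import numpy as np

    def differential_split_sample(gcm, obs, mean_temp):
        """Calibrate a linear transfer function on the coolest half of the
        years and report its error on the warmest half, as a rough test of
        whether the relationship holds under warmer conditions."""
        order = np.argsort(mean_temp)                  # sort years cool -> warm
        cool, warm = order[:len(order) // 2], order[len(order) // 2:]
        a, b = np.polyfit(gcm[cool], obs[cool], deg=1)
        resid = obs[warm] - (a * gcm[warm] + b)
        return np.sqrt(np.mean(resid ** 2))            # warm-half RMSE

A warm-half error much larger than the calibration error is a warning sign that the transfer function may not survive a changing climate.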

Limits to Empirical Downscaling

• Assumes the observational data is a perfectly accurate representation of actual conditions.

• Numerous factors from observer error to long-term creep in measurement equipment can bias observations relative to reality.

• The unresolved "processes" that determine the relationship between large-scale features and local climate may include observational error, a bias that is then carried forward into future projections

The Statistical Downscaling Process

1. SELECT: global climate model, downscaling technique, predictor variables

2. CALIBRATE & VALIDATE: fit to observational data, test against independent data

3. DOWNSCALE: force the downscaling model or technique with GCM predictor variables

Source: K. Hayhoe
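As a concrete illustration of the select / calibrate-and-validate / downscale sequence, here is a minimal Python sketch using a linear regression transfer function (the simplest possible choice; the variable names and synthetic data are illustrative only):

    import numpy as np

    # --- CALIBRATE: fit obs ~ a * gcm + b over the historical period ---
    def calibrate(gcm_hist, obs):
        a, b = np.polyfit(gcm_hist, obs, deg=1)
        return a, b

    # --- DOWNSCALE: apply the transfer function to future GCM output ---
    def downscale(gcm_future, a, b):
        return a * gcm_future + b

    # Synthetic example: calibrate on half the record, validate on the rest
    rng = np.random.default_rng(0)
    gcm_hist = rng.normal(15.0, 3.0, 1000)                    # coarse GCM temperature (degC)
    obs = 1.1 * gcm_hist - 2.0 + rng.normal(0.0, 1.0, 1000)   # local station record
    a, b = calibrate(gcm_hist[:500], obs[:500])
    err = downscale(gcm_hist[500:], a, b) - obs[500:]
    print(f"validation RMSE: {np.sqrt(np.mean(err**2)):.2f} degC")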

1. Delta
2. Bias-correction
3. Quantile mapping
4. Asynchronous regression

[Figure: comparison of methods for a Southeast station (PDFs)]

Source: K. Hayhoe, in preparation.
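Quantile mapping, in particular, corrects the whole distribution rather than just the mean. A minimal empirical sketch in Python (the sorted-quantile interpolation here is one simple variant, not a specific published implementation):

    import numpy as np

    def quantile_map(gcm_hist, obs, gcm_future):
        """Replace each future GCM value with the observed value found at
        the same quantile of the historical GCM distribution."""
        q = np.searchsorted(np.sort(gcm_hist), gcm_future) / len(gcm_hist)
        return np.quantile(obs, np.clip(q, 0.0, 1.0))

    def delta_method(gcm_hist, obs, gcm_future):
        """Delta method for comparison: shift the observed record by the
        GCM-projected change in the mean."""
        return obs + (gcm_future.mean() - gcm_hist.mean())

Because quantile mapping adjusts every quantile, it can correct biases in the tails (extremes) that a mean-only delta shift leaves untouched.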

Future projections: 2080–2099 daily Tx (maximum temperature), Southeast

Source: K. Hayhoe, in preparation.

Statistical Downscaling: What is it good for?

Statistical methods should be used if…

• The only variables required are monthly or daily temperature and/or precipitation
• It is important to have continuous simulations for decades to centuries
• The researcher needs the climate model output to match the observational record

Dynamical Downscaling

Maraun et al. 2010

Dynamic Downscaling

Use high-resolution regional climate models to simulate local climate, updating the regional model’s boundary conditions every few hours with output fields from a large-scale global climate model.
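The nesting logic can be illustrated with a toy 1-D model (a sketch only: a diffusion equation stands in for the regional model's physics, and the "GCM" is a sine wave; nothing here corresponds to a real model's API):

    import numpy as np

    def gcm_boundary(t_hours):
        """Toy 'GCM' temperatures (K) at the two lateral boundaries."""
        return 280.0 + 5.0 * np.sin(2 * np.pi * t_hours / 24.0), 282.0

    def rcm_step(field, left_bc, right_bc, dt=0.1, nsub=60):
        """Advance a 1-D diffusion 'regional model', holding the edges at
        the GCM-supplied boundary values."""
        for _ in range(nsub):
            field[0], field[-1] = left_bc, right_bc
            field[1:-1] += dt * (field[2:] - 2.0 * field[1:-1] + field[:-2])
        return field

    field = np.full(50, 281.0)         # high-resolution regional grid
    for t in range(0, 48, 6):          # new boundary conditions every 6 h
        lbc, rbc = gcm_boundary(t)
        field = rcm_step(field, lbc, rbc)

The interior evolves at high resolution, but the edges are pinned to the coarse driving fields, which is exactly where the boundary artifacts discussed below come from.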

Laprise et al. 2008

Limitations to Dynamic Downscaling

• No matter how high the resolution, there may be physical processes we are not aware of, or processes operating at smaller temporal or spatial scales than can be simulated
• Cannot produce climate projections at a scale finer than the resolution of its grid cells
• A resolution of 25 km² may still be insufficient for regions with rapidly changing topography, or for urban areas with heat island effects
• Boundary effects arise where model output must snap back to the coarser GCM output

Hostetler et al. 2011

Source: S. Hostetler

[Figure: surface fields archived every 3 h; atmospheric fields every 6 h]

Dynamic Downscaling: What is it good for?

Dynamically downscaled models should be used if…

• You need other variables besides temperature and precipitation, such as surface wind, humidity, pressure, or upper-air fields (although some of these could also be produced with statistical downscaling)
• You need 3 h or 6 h (sub-daily) outputs
• A few decades' worth of simulations are enough
• You are not concerned about resolving the range of likely model uncertainty (although you probably should be!)
• It's important to model dynamic feedback processes at the local to regional scale

Pitfalls and Traps

• Higher resolution is not always better or even necessary!

What resolution is good enough?

• Downscaled models should be at as high a resolution as possible, but no higher than needed

• The key is to match the scale of the (physically and computationally feasible) simulated climate output to the scale of the local climatic processes that affect ecosystem processes (the importance of covariance)

• Interfaces between systems that jointly impact both climate and ecosystems present a special challenge (e.g., the coastline)

Limitations to ALL Approaches:

It’s REALLY Hard to Downscale Precipitation

• Non-Gaussian
• Discontinuous process
• On-off (wet-dry) runs
• Poorly understood or crudely parameterized processes

Maraun et al. 2010
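These properties are why simple Gaussian corrections fail for precipitation. A minimal stochastic weather-generator sketch in Python makes them concrete (a two-state Markov chain for wet/dry occurrence plus gamma-distributed amounts; all parameter values are illustrative):

    import numpy as np

    rng = np.random.default_rng(42)
    P_WET_GIVEN_DRY = 0.25    # illustrative transition probabilities
    P_WET_GIVEN_WET = 0.60    # wet days tend to cluster into runs

    def simulate_precip(n_days, shape=0.7, scale=8.0):
        """Daily precipitation (mm): Markov-chain occurrence (on-off runs),
        gamma-distributed amounts (right-skewed, non-Gaussian)."""
        precip = np.zeros(n_days)
        wet = False
        for d in range(n_days):
            p = P_WET_GIVEN_WET if wet else P_WET_GIVEN_DRY
            wet = rng.random() < p
            if wet:
                precip[d] = rng.gamma(shape, scale)
        return precip

    series = simulate_precip(365)
    print(f"wet days: {(series > 0).sum()}, max daily: {series.max():.1f} mm")

The simulated series is zero on most days and heavily right-skewed on the rest, so any method that corrects only a mean or assumes normality will misrepresent it.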

[Figures: global model vs. downscaled RCM vs. observations]

Stefanova et al. 2011; Kang and Ramirez 2010

BOTTOM LINE

• The best approach may be to combine statistical methods with regional climate model simulations.

• Explicitly solve for the process-based physical dynamics of the regional climate system, and incorporate essential historical observations into future projections (gridded average to a single location).

• A key challenge remains translating climate projections generated by the downscaling into information directly relevant to an impact community, which may have previously based its planning on historical observations.

BOTTOM LINE

• Any downscaling approach is nearly always better than none at all.

• Quantile-based approaches tend to be better than approaches that correct for the mean only.

• Some future projections (especially extremes) are sensitive to the downscaling approach used.

• Understanding the limitations and biases in these methods can help you select the appropriate one… or, if it's too late, interpret the results.

• But there is no single ideal method best suited to every analysis.

• The right method for each analysis depends on your resources, time frame, familiarity with the data and methods used, and the specific focus of the study.