University of Massachusetts, Amherst • Department of Computer Science
Switching Kalman Filters for Prediction and Tracking in an Adaptive Meteorological Sensing Network
Victoria Manfredi, Sridhar Mahadevan, Jim Kurose
SECON’05, September 28, 2005
Introduction
• CASA: Collaborative Adaptive Sensing of the Atmosphere
  – Distributed, collaborative, adaptive radar network
Where/what, when, and how to sense?
Configure radars based on predicted locations of meteorological phenomena
Our focus? Storm cells
Problem
• Track storm cells over time
• Use predicted storm locations to identify future radar configurations
• Constraints/assumptions
  – Existing meteorological algorithms identify storms from raw radar data
  – Tracking only a single storm cell
  – Less than 30 seconds for prediction
Outline
• Meteorological vs. statistical approaches
• Kalman filter approaches
• Experiments
• Conclusions
• Future work
Storm Tracking
• Extrapolation (simpler)
  – SCIT: linear least-squares over the last five points [JMWMSET98]
  – TITAN: extrapolation plus cross-correlation [DW93]
  – K-means to identify storm clusters; smooth storm movements with a Kalman filter [LRD03]
• Knowledge-intensive (computationally expensive)
  – GANDOLF: model the meteorological evolution of each storm [PHCH00]
  – Growth and Decay Storm Tracker: track the encompassing storm instead of the storm cell [WFHM98]
  – Ensemble Kalman filter: project a set of points forward in time using a meteorological model [E03]
Meteorological vs. Statistical

• Goal
  – Good predictions
  – Satisfy real-time constraints
• Meteorological approaches
  – Extrapolation, e.g. SCIT: linear least-squares regression [JMWMSET98] (developed at NSSL, Kurt Hondl); linear, Gaussian, no state
  – Knowledge-intensive
• Other statistical approaches
  – Kalman filter: linear, Gaussian, state
  – Switching Kalman filter: non-linear, Gaussian, state
Kalman Filter (KF)
[Figure: state chain x at t=1, t=2, t=3, each state emitting an observation]

• Model the (linear) dynamics of an object
• States, observations: linear function plus Gaussian noise

State transitions: x_{t+1} = A x_t + N[0, Q]
Observations: y_{t+1} = B x_{t+1} + N[0, R]

X = [lat, long, v_lat, v_long]
Y = [lat, long]
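As a concrete sketch of this model, here is one predict/update cycle of a Kalman filter over the state X = [lat, long, v_lat, v_long] and observation Y = [lat, long]. The constant-velocity transition matrix and the noise levels are illustrative assumptions, not the paper's fitted parameters.

```python
import numpy as np

dt = 1.0
A = np.array([[1, 0, dt, 0],
              [0, 1, 0, dt],
              [0, 0, 1,  0],
              [0, 0, 0,  1]], dtype=float)    # state transition (constant velocity)
B = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0]], dtype=float)     # observe position only
Q = 0.01 * np.eye(4)                          # process noise covariance
R = 0.05 * np.eye(2)                          # observation noise covariance

def kf_step(x, P, y):
    """One predict + update cycle; returns filtered mean and covariance."""
    x_pred = A @ x                            # predict: x_{t+1} = A x_t + N[0, Q]
    P_pred = A @ P @ A.T + Q
    S = B @ P_pred @ B.T + R                  # innovation covariance
    K = P_pred @ B.T @ np.linalg.inv(S)       # Kalman gain
    x_new = x_pred + K @ (y - B @ x_pred)     # update with y_{t+1} = B x_{t+1} + N[0, R]
    P_new = (np.eye(4) - K @ B) @ P_pred
    return x_new, P_new

# Track a storm drifting northeast at 0.1 degrees per step.
x, P = np.array([35.0, -97.0, 0.0, 0.0]), np.eye(4)
for t in range(1, 11):
    y = np.array([35.0 + 0.1 * t, -97.0 + 0.1 * t])
    x, P = kf_step(x, P, y)
# x[2:] (the velocity estimate) converges toward [0.1, 0.1]
```

Note that the filter starts with a zero velocity estimate and learns the drift from the position observations alone.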
Switching Kalman Filter (SKF)
State transitions: x_{t+1} = A_i x_t + N[0, Q_i]
Observations: y_{t+1} = B_i x_{t+1} + N[0, R_i]

• Model object dynamics with a set of Kalman filters
• Piecewise-linear approximation of a nonlinear path

[Figure: switch chain S at t=1, t=2, t=3 over the state chain; each switch value selects one filter's parameters: A_1, Q_1, B_1, R_1, …; A_2, Q_2, B_2, R_2, …; A_3, Q_3, B_3, R_3, …]

X = [lat, long, v_lat, v_long]
Y = [lat, long]
S = which Kalman filter
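A generative sketch of the SKF model above: the switch picks which A_i drives the state, so the sampled track is piecewise linear. The two regimes ("keep heading" vs. "turn 90 degrees") and the fixed switch sequence are illustrative assumptions, not the paper's learned filters; process noise Q_i is omitted for clarity.

```python
import numpy as np

dt = 1.0
A_straight = np.array([[1, 0, dt, 0],
                       [0, 1, 0, dt],
                       [0, 0, 1,  0],
                       [0, 0, 0,  1]], dtype=float)
A_turn = A_straight.copy()
A_turn[2:, 2:] = [[0, -1], [1, 0]]             # rotate velocity 90 degrees
regimes = [A_straight, A_turn]                 # the per-switch A_i matrices

x = np.array([0.0, 0.0, 0.1, 0.0])             # heading "east" at 0.1 deg/step
switches = [0] * 10 + [1] + [0] * 9            # straight, turn once, straight
track = []
for s in switches:
    x = regimes[s] @ x                         # x_{t+1} = A_i x_t (noiseless)
    track.append(x[:2].copy())
track = np.array(track)
# track moves east for 10 steps, then north: two linear segments
```

A single switch event is enough to produce the corner in the path; with noise and inferred (rather than fixed) switches, this is exactly the piecewise-linear approximation the slide describes.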
Inference + Prediction

Kalman Filter
[Figure: state chain t=1 … t=4 with observations; annotated steps: observe, infer, predict]
X = [lat, long, v_lat, v_long]
Y = [lat, long]

Least-Squares
• Use the five most recent observations only
Inference + Prediction

Switching Kalman Filter
[Figure: switch and state chains t=1 … t=4 with observations; annotated steps: observe, infer, collapse, most likely, predict]
X = [lat, long, v_lat, v_long]
Y = [lat, long]
S = which Kalman filter

• Switch values unknown ⇒ inference in the SKF is hard
  – t=1: K possible states with K Kalman filters
  – t=2: K^2 possible states
  – …
  – t=n: K^n possible states
• Solution? Approximate inference: generalized pseudo-Bayesian
  – Order 2: collapse over state and switches two time steps ago
• Prediction
  – Compute the most likely sequence of switches
  – Use the corresponding KFs to infer the hidden state and predict the next state
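The prediction rule above (pick the most likely switch, then predict with that filter's dynamics) can be sketched in a few lines. The switch probabilities, state estimate, and the three A_i matrices below are invented for illustration.

```python
import numpy as np

W = np.array([0.1, 0.7, 0.2])                  # switch posterior (illustrative)
x_est = np.array([35.0, -97.0, 0.05, 0.10])    # current state estimate

A_list = [np.eye(4) for _ in range(3)]
for A in A_list:
    A[0, 2] = A[1, 3] = 1.0                    # constant-velocity blocks
A_list[2][2:, 2:] = [[0, -1], [1, 0]]          # one "turning" regime

best = int(np.argmax(W))                       # most likely switch value
x_next = A_list[best] @ x_est                  # one-step prediction with its A_i
# best = 1; x_next = [35.05, -96.90, 0.05, 0.10]
```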
Experiments
• Compare the Kalman filter, switching Kalman filter, and linear least-squares regression (SCIT [JMWMSET98]) on tracking and predicting storm locations
• Data
  – 35 storm tracks courtesy of Kurt Hondl at NSSL
  – Each track is a sequence of latitude and longitude coordinates
  – Tracks range in length from 10 to 30 data points
  – Identified using SCIT [JMWMSET98]
Parameter Learning

• Kalman filter and switching Kalman filter parameters
  – What are the dynamics of storm cells?
  – How to obtain a model of the dynamics?
• Expectation-maximization to learn parameters
  – E-step: assume the parameters are known; compute expected values of the hidden variables (state, switch)
  – M-step: assume the hidden-variable values are known; compute maximum-likelihood parameters
• Compare hand-coded parameters (KF, SKF) with learned parameters (KF-EM, SKF-EM)
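The EM recipe above can be sketched for a scalar Kalman filter: the E-step runs the filter plus an RTS smoother under the current parameters, and the M-step re-estimates a parameter in closed form. To keep it short, only the observation noise R is learned here (A, B, Q fixed and known); the paper's setup learns the full KF/SKF parameter set, and all numbers below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
a, q, r_true, T = 1.0, 0.1, 1.0, 500           # known dynamics, true obs noise
x_true = np.zeros(T)
for t in range(1, T):
    x_true[t] = a * x_true[t - 1] + rng.normal(0, np.sqrt(q))
y = x_true + rng.normal(0, np.sqrt(r_true), T)  # noisy observations

def smooth(y, r):
    """Kalman filter + RTS smoother; returns smoothed means and variances."""
    n = len(y)
    xf = np.zeros(n); Pf = np.zeros(n)          # filtered
    xp = np.zeros(n); Pp = np.zeros(n)          # one-step predictions
    xp[0], Pp[0] = 0.0, 10.0                    # vague prior
    for t in range(n):
        if t > 0:
            xp[t] = a * xf[t - 1]
            Pp[t] = a * a * Pf[t - 1] + q
        K = Pp[t] / (Pp[t] + r)                 # gain under current r
        xf[t] = xp[t] + K * (y[t] - xp[t])
        Pf[t] = (1 - K) * Pp[t]
    xs = xf.copy(); Ps = Pf.copy()
    for t in range(n - 2, -1, -1):              # RTS backward pass
        J = Pf[t] * a / Pp[t + 1]
        xs[t] = xf[t] + J * (xs[t + 1] - xp[t + 1])
        Ps[t] = Pf[t] + J * J * (Ps[t + 1] - Pp[t + 1])
    return xs, Ps

r_est = 5.0                                     # deliberately bad initial guess
for _ in range(20):                             # EM iterations
    xs, Ps = smooth(y, r_est)                   # E-step: expected hidden states
    r_est = np.mean((y - xs) ** 2 + Ps)         # M-step: ML estimate of R
# r_est moves from 5.0 toward the true value 1.0
```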
Results
(Not surprisingly) On the nonlinear track, the switching Kalman filter performs better
Results
On linear tracks, both methods perform similarly
Results
Method         Avg 1-Step RMSE (lat)   Avg 1-Step RMSE (long)   Avg 2-Step RMSE (lat)   Avg 2-Step RMSE (long)
KF             0.2248                  0.1923                   0.3359                  0.2871
KF-EM          0.1967                  0.1680                   0.2680                  0.2389
SKF            0.1914                  0.1702                   0.2642                  0.2421
SKF-EM         0.2577                  0.2070                   0.3948                  0.3110
Least-Squares  0.2114                  0.2107                   0.3030                  0.3081

0.1° latitude = 6.9 miles; 0.1° longitude ≈ 6.9 miles
Timing

# of KFs   Time for 1-step prediction (seconds)
           avg        max        min
1          0.000155   0.000864   0.000103
4          0.001689   0.006479   0.001138
8          0.006655   0.028375   0.004553

Within timing constraints
Conclusions and Future Work
• Although the tracks were identified with a least-squares method (SCIT), KF-EM and SKF have lower prediction error
• Storm dynamics can be learned to improve the prediction model
• Future work
  – Obtain more data to improve the learned models, especially for the SKF
  – Incorporate meteorological information
  – Track multiple targets and other meteorological phenomena
  – Combine decision-making with prediction
  – Add higher layers to the SKF
Thank you. Questions?
References
[JMWMSET98] J. Johnson, P. MacKeen, A. Witt, E. Mitchell, G. Stumpf, M. Eilts, and K. Thomas. The storm cell identification and tracking algorithm: An enhanced WSR-88D algorithm. Weather and Forecasting, 13:263-276, 1998.
[DW93] M. Dixon and G. Wiener. TITAN: Thunderstorm identification, tracking, analysis, and nowcasting: A radar-based methodology. J. Atmos. Ocean. Tech., 10:785-797, 1993.
[LRD03] V. Lakshmanan, R. Rabin, and V. DeBrunner. Multiscale storm identification and forecast. Atmospheric Research, 367-380, 2003.
[PHCH00] C. Pierce, P. Hardaker, C. Collier, and C. Haggett. GANDOLF: A system for generating automated nowcasts of convective precipitation. Meteorol. Appl., 7:341-360, 2000.
[WFHM98] M. Wolfson, B. Forman, R. Hallowell, and M. Moore. The growth and decay storm tracker. American Meteorological Society 79th Annual Conference, 1999.
[E03] G. Evensen. The ensemble Kalman filter: Theoretical formulation and practical implementation. Ocean Dynamics, 53:343-367, 2003.
Generalized Pseudo-Bayesian
• Values of the switch variables are unknown ⇒ inference in the SKF is hard
  – Time step 1: K possible states with K Kalman filters
  – Time step 2: K^2 possible states
  – …
  – Time step n: K^n possible states
• Solution? Approximate inference
  – Generalized pseudo-Bayesian
  – Variational
  – Sampling
  – Viterbi
Generalized Pseudo-Bayesian
• Order-two generalized pseudo-Bayesian algorithm
  – Collapse over everything two time steps ago
  – x = mean, V = covariance, W = switch probability

(x_j, V_j) = Collapse(x_ij, V_ij, W_i)
x_j = Σ_i W_i x_ij
V_j = Σ_i W_i V_ij + Σ_i W_i (x_ij − x_j)(x_ij − x_j)^T

• The covariance depends on the observations through x
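The Collapse operation above is moment matching: a mixture of Gaussians {(x_ij, V_ij) with weight W_i} is replaced by the single Gaussian with the same mean and covariance. A direct transcription, with illustrative test values:

```python
import numpy as np

def collapse(xs, Vs, W):
    """xs: list of means, Vs: list of covariances, W: weights summing to 1."""
    x_j = sum(w * x for w, x in zip(W, xs))              # x_j = sum_i W_i x_ij
    V_j = sum(w * V for w, V in zip(W, Vs))              # sum_i W_i V_ij
    V_j = V_j + sum(w * np.outer(x - x_j, x - x_j)       # spread-of-means term
                    for w, x in zip(W, xs))
    return x_j, V_j

# Two equally weighted unit-covariance components with means 2 apart:
xs = [np.array([0.0, 0.0]), np.array([2.0, 0.0])]
Vs = [np.eye(2), np.eye(2)]
x_j, V_j = collapse(xs, Vs, [0.5, 0.5])
# x_j = [1, 0]; V_j = [[2, 0], [0, 1]] (variance inflated along the spread)
```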
Linear Least-Squares Regression
• Given a set of points, find the best-fit line
• Assumes constant covariance
• Solve Ax = b for the coefficient vector x
• With too many equations, the problem is over-constrained
• Error: difference between the model's predicted response value and the actual value, Ax − b
• Minimize the squared vertical distance to the best-fit line: ||Ax − b||^2
• So instead solve A^T A x = A^T b for the coefficient vector x
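This recipe, applied in the SCIT spirit (fit a line to the five most recent positions, then extrapolate), looks like the following; the latitude values are made up for illustration.

```python
import numpy as np

t = np.arange(5, dtype=float)                   # last five time steps
lat = np.array([35.0, 35.1, 35.2, 35.3, 35.4])  # observed latitudes (noiseless here)

A = np.column_stack([np.ones(5), t])            # design matrix: intercept, slope
coef, *_ = np.linalg.lstsq(A, lat, rcond=None)  # solves A^T A x = A^T b internally
lat_pred = coef[0] + coef[1] * 5                # extrapolate one step ahead
# coef = [35.0, 0.1], so lat_pred = 35.5
```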
Kalman Filter (KF)
• Assume A = identity and Q = zero matrix
  – Then for all t, x_{t+1} = x_t
• This can be used to derive the recursive least-squares update equations
• Implies least-squares assumes constant covariance while the KF does not

State transitions: x_{t+1} = A x_t + N[0, Q]
Observations: y_{t+1} = B x_{t+1} + N[0, R]
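A quick numerical check of the reduction described above, in the scalar case: with A = I and Q = 0 the Kalman filter estimate of a constant state is exactly recursive least squares, i.e. the running mean of the observations. The data values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
y = rng.normal(3.0, 1.0, 50)        # noisy observations of a constant state x = 3
r = 1.0                             # observation noise variance

x_hat, P = y[0], r                  # start from the first observation
for obs in y[1:]:
    # Predict with A = I, Q = 0: x_hat and P are unchanged.
    K = P / (P + r)                 # Kalman gain
    x_hat = x_hat + K * (obs - x_hat)
    P = (1 - K) * P
# x_hat equals the sample mean of y (up to floating-point error)
```

After t observations the gain is 1/t, which is precisely the recursive least-squares step for a constant parameter.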
Inference + Prediction

Kalman Filter
[Figure: state chain t=1 … t=5 with observations; annotated steps: observe, infer, predict]
X = [lat, long, v_lat, v_long]
Y = [lat, long]
Inference + Prediction

Switching Kalman Filter
[Figure: switch and state chains t=1 … t=5 with observations; annotated steps: observe, infer, predict, collapse]
X = [lat, long, v_lat, v_long]

Least-Squares
• Use the five most recent observations only