Probability, Gaussians and Estimation

David Johnson


TRANSCRIPT

Page 1: Probability, Gaussians and Estimation

Probability, Gaussians and Estimation

David Johnson

Page 2: Probability, Gaussians and Estimation

Basic Problem

• Approaches so far
  – Robot state is a point in state space: q = (x, y, vx, vy, heading)
  – This state is based on measurements
    • External (GPS, beacons, vision)
    • Internal (odometry, gyros)
• Measurements not exact
• Errors can accumulate
• How to "clean" measurements? (filtering)
• How to combine measurements? (estimation)

Page 3: Probability, Gaussians and Estimation

Problems to solve

• Localization
  – Given a map, where am I?
• Mapping
  – Given my position, build a map
• SLAM – Simultaneous Localization and Mapping
  – Drop down a robot, then build a map and find its location within that map at the same time

Page 4: Probability, Gaussians and Estimation

Approach

• Treat state variables as probabilities
• Combine measurements weighted by reliability
• Use filtering to improve the estimate of the state

Page 5: Probability, Gaussians and Estimation

Example – Triangulation

• Time of flight from beacons gives distance
  – Constrains the position to a circle
  – 2 beacons narrow this to point solutions

Page 6: Probability, Gaussians and Estimation

Noise in Measurements

• Uncertainty in measurement
  – Can be reported as plus/minus some value
• Creates solution regions
• Even this is a simplification
  – Measurements follow a distribution

Page 7: Probability, Gaussians and Estimation

Coin Flip

• F = {heads, tails}
  – Discrete distribution
• Probability of a coin flip being heads or tails is 0.5 + 0.5 = 1
• But what about continuous distributions?
• The probability that someone in the room is exactly 2 meters tall is infinitesimal
• Talk about the probability of intervals instead

Page 8: Probability, Gaussians and Estimation

Continuous distributions

• Probability density function f(x)
• Find the probability of a measurement being within an interval
• What is f(x) for a uniform distribution over the range [u, v]?

[Figure: a probability density function f(x) plotted against x]
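The slide's integral is not in the transcript; as a sketch of the standard definitions, the interval probability and the uniform density over [u, v] are:

$$
P(a \le X \le b) = \int_a^b f(x)\,dx,
\qquad
f(x) = \begin{cases} \dfrac{1}{v-u} & u \le x \le v \\[4pt] 0 & \text{otherwise.} \end{cases}
$$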

Page 9: Probability, Gaussians and Estimation

Gaussian distributions

• Bell-shaped f(x)
• Can assume most measurements with noise follow a Gaussian distribution
• Why?
  – Central Limit Theorem
  – Applet
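The original slide linked an applet; a minimal Python sketch of the same idea (sums of many independent uniform samples look Gaussian) could be:

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)

# Each "measurement" is the sum of 30 independent uniform noise terms.
# By the Central Limit Theorem, the sums are approximately Gaussian.
sums = rng.uniform(-1.0, 1.0, size=(10000, 30)).sum(axis=1)

plt.hist(sums, bins=60, density=True)
plt.title("Sums of uniform samples approach a Gaussian")
plt.xlabel("sum")
plt.ylabel("estimated density")
plt.show()
```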

Page 10: Probability, Gaussians and Estimation

Gaussian Definition

Univariate, p(x) ~ N(μ, σ²):

$$
p(x) = \frac{1}{\sqrt{2\pi}\,\sigma}\,
\exp\!\left(-\frac{(x-\mu)^2}{2\sigma^2}\right)
$$

Multivariate, p(x) ~ N(μ, Σ):

$$
p(\mathbf{x}) = \frac{1}{(2\pi)^{d/2}\,|\Sigma|^{1/2}}\,
\exp\!\left(-\tfrac{1}{2}(\mathbf{x}-\boldsymbol{\mu})^{T}\Sigma^{-1}(\mathbf{x}-\boldsymbol{\mu})\right)
$$

[Figure: a univariate bell curve centered at μ, with the interval from −σ to σ marked]

Page 11: Probability, Gaussians and Estimation

The Mean of a Continuous Distribution
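The formula on this slide did not survive the transcript; the standard definition, as a sketch, is:

$$
\mu = \mathbb{E}[X] = \int_{-\infty}^{\infty} x\, f(x)\, dx
$$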

Page 12: Probability, Gaussians and Estimation

Discrete Variance vs Continuous

• Discrete

• Continuous
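The slide's formulas are not in the transcript; as a sketch of the standard (population) definitions:

$$
\sigma^2 = \frac{1}{N}\sum_{i=1}^{N}(x_i - \mu)^2 \quad\text{(discrete)}
\qquad\qquad
\sigma^2 = \int_{-\infty}^{\infty} (x-\mu)^2 f(x)\, dx \quad\text{(continuous)}
$$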

Page 13: Probability, Gaussians and Estimation

Gaussians

• Gaussians are completely described by their mean and variance

• A non-zero mean implies a bias in the measurement

• Zero-mean noise can be removed by filtering

Page 14: Probability, Gaussians and Estimation

Properties of Gaussians

Linear transformation:

$$
X \sim N(\mu, \sigma^2),\; Y = aX + b
\;\Longrightarrow\;
Y \sim N(a\mu + b,\; a^2\sigma^2)
$$

Product of two Gaussian densities:

$$
X_1 \sim N(\mu_1, \sigma_1^2),\; X_2 \sim N(\mu_2, \sigma_2^2)
\;\Longrightarrow\;
p(X_1)\,p(X_2) \sim
N\!\left(
\frac{\sigma_2^2}{\sigma_1^2 + \sigma_2^2}\,\mu_1
+ \frac{\sigma_1^2}{\sigma_1^2 + \sigma_2^2}\,\mu_2,\;
\frac{1}{\sigma_1^{-2} + \sigma_2^{-2}}
\right)
$$
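A small Python check of the product property (the values and variable names are mine, not the slides'): multiply two Gaussian densities pointwise and compare the result with the closed-form fused mean and variance.

```python
import numpy as np

def gaussian_pdf(x, mu, var):
    """Evaluate a univariate Gaussian density."""
    return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)

mu1, var1 = 2.0, 0.5   # first distribution
mu2, var2 = 3.0, 1.5   # second distribution

# Closed-form fused mean and variance (product of Gaussians).
var3 = 1.0 / (1.0 / var1 + 1.0 / var2)
mu3 = var3 * (mu1 / var1 + mu2 / var2)

# Numerical check: pointwise product, renormalized to integrate to 1.
x = np.linspace(-5, 10, 20001)
prod = gaussian_pdf(x, mu1, var1) * gaussian_pdf(x, mu2, var2)
prod /= np.trapz(prod, x)

mu_num = np.trapz(x * prod, x)
var_num = np.trapz((x - mu_num) ** 2 * prod, x)
print(mu3, var3)        # closed form
print(mu_num, var_num)  # numerical, should match closely
```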

Page 15: Probability, Gaussians and Estimation

Filtering

• Gaussian noise ~ N(0, σ²)
• Make repeated measurements
• Histogram the samples
• Find the peak – that is the mean
  – Easy!
• What is the size of the whiteboard in meters (1 decimal place precision)?

Page 16: Probability, Gaussians and Estimation

Non-static situation

• What happens when the state evolves?
  – Can't repeat measurements
• Moving average filter (a short sketch follows below)
• Introduces lag into the system!
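A minimal Python sketch of a moving average filter (the window length and signal are my own illustration, not from the slides); note how the smoothed output trails the true signal, which is the lag the slide warns about:

```python
import numpy as np

def moving_average(samples, window=5):
    """Smooth noisy samples by averaging the last `window` values."""
    kernel = np.ones(window) / window
    # 'valid' keeps only positions where the full window fits,
    # so the output is delayed relative to the input.
    return np.convolve(samples, kernel, mode="valid")

rng = np.random.default_rng(1)
t = np.linspace(0, 1, 200)
true_position = 2.0 * t                       # state evolves (robot moves)
measured = true_position + rng.normal(0, 0.1, t.size)

smoothed = moving_average(measured, window=15)
# The filtered estimate lags the true position by roughly half the window.
```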

Page 17: Probability, Gaussians and Estimation

Use a state model

• Estimate position from measurements
• Measure velocity as well
• Evolve position from velocity

– Incorporate evolved state into position measurements

– Need to combine multiple, uncertain measurements

Page 18: Probability, Gaussians and Estimation

Back to the non-evolving case

• Two different processes measure the same thing

• Want to combine into one better measurement

• Estimation

Page 19: Probability, Gaussians and Estimation

Estimation

What is meant by estimation?

[Block diagram: several "data + noise" streams feed an estimator, which produces ŷ, an estimate of the underlying stochastic process; the original figure labels the system H and the measurements z.]

Page 20: Probability, Gaussians and Estimation

A Least-Squares Approach

• Suppose two noisy measurements of the same range r:

$$
z_1 = r + v_1, \quad v_1 \sim N(0, R_1)
\qquad\qquad
z_2 = r + v_2, \quad v_2 \sim N(0, R_2)
$$

• We want to fuse these measurements to obtain a new estimate for the range

• Using a weighted least-squares approach, the resulting sum of squared errors is

$$
e = \sum_{i=1}^{n} w_i\,(\hat{r} - z_i)^2
$$

• Minimizing this error with respect to r̂ yields

$$
\frac{\partial e}{\partial \hat{r}}
= 2\sum_{i=1}^{n} w_i\,(\hat{r} - z_i) = 0
$$

Page 21: Probability, Gaussians and Estimation

A Least-Squares Approach

• Rearranging, we have

$$
\sum_{i=1}^{n} w_i\,\hat{r} \;-\; \sum_{i=1}^{n} w_i z_i = 0
\quad\Longrightarrow\quad
\hat{r} = \frac{\sum_{i=1}^{n} w_i z_i}{\sum_{i=1}^{n} w_i}
$$

• If we choose the weight to be

$$
w_i = \frac{1}{\sigma_i^2} = \frac{1}{R_i}
$$

we obtain

$$
\hat{r}
= \frac{\dfrac{z_1}{R_1} + \dfrac{z_2}{R_2}}{\dfrac{1}{R_1} + \dfrac{1}{R_2}}
= \frac{R_2}{R_1 + R_2}\,z_1 + \frac{R_1}{R_1 + R_2}\,z_2
$$
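As an illustration of this inverse-variance weighting (the function name and values are mine, not the slides'), fusing two noisy range measurements in Python:

```python
def fuse_measurements(z1, R1, z2, R2):
    """Fuse two measurements of the same quantity by inverse-variance weighting.

    z1, z2: measured values
    R1, R2: their noise variances
    Returns the fused estimate and its variance.
    """
    w1, w2 = 1.0 / R1, 1.0 / R2
    r_hat = (w1 * z1 + w2 * z2) / (w1 + w2)
    var_hat = 1.0 / (w1 + w2)   # fused variance is smaller than either R1 or R2
    return r_hat, var_hat

# Example: a precise beacon (variance 0.1) and a noisy one (variance 0.4).
print(fuse_measurements(10.2, 0.1, 10.8, 0.4))  # estimate leans toward the precise beacon
```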

Page 22: Probability, Gaussians and Estimation

A Least-Squares Approach

• For merging Gaussian distributions, the update rule is

$$
\frac{1}{\sigma_3^2} = \frac{1}{\sigma_1^2} + \frac{1}{\sigma_2^2}
\qquad\Longleftrightarrow\qquad
\sigma_3^2 = \frac{\sigma_1^2\,\sigma_2^2}{\sigma_1^2 + \sigma_2^2}
$$

• Exercise: show this for N(0, a) and N(0, b)

Page 23: Probability, Gaussians and Estimation

A Least-Squares Approach

• This can be rewritten as

$$
\hat{r} = z_1 + \frac{R_1}{R_1 + R_2}\,(z_2 - z_1)
$$

or, if we think of this as adding a new measurement to our current estimate of the state, we would get

$$
\hat{r}_k = \hat{r}_{k-1} + P_{k-1}\,(P_{k-1} + R_k)^{-1}\,(z_k - \hat{r}_{k-1})
\qquad\text{i.e.}\qquad
\hat{r}_k = \hat{r}_{k-1} + K_k\,(z_k - \hat{r}_{k-1})
$$

where K_k is the Kalman gain.

• For merging Gaussian distributions, the update rule is

$$
\frac{1}{\sigma_3^2} = \frac{1}{\sigma_1^2} + \frac{1}{\sigma_2^2}
$$

which, if we write it in our measurement-update equation form, gives

$$
P_k = \left(P_{k-1}^{-1} + R_k^{-1}\right)^{-1} = (1 - K_k)\,P_{k-1}
$$

Page 24: Probability, Gaussians and Estimation

What happens when you move?

$$
X \sim N(\mu, \sigma^2),\; Y = aX + b
\;\Longrightarrow\;
Y \sim N(a\mu + b,\; a^2\sigma^2)
$$

(derive)

Page 25: Probability, Gaussians and Estimation

Moving

• As you move
  – Uncertainty grows
  – Need to make new measurements
  – Combine measurements using the Kalman gain

Page 26: Probability, Gaussians and Estimation

The Kalman Filter: "an optimal recursive data processing algorithm"

OPTIMAL:
- Linear dynamics
- Measurements linear with respect to the state
- Errors in sensors and dynamics must be zero-mean (unbiased) white Gaussian noise

RECURSIVE:
- Does not require all previous data
- Incoming measurements 'modify' the current estimate

DATA PROCESSING ALGORITHM:
The Kalman filter is essentially a technique of estimation given a system model and concurrent measurements (it is not a function of frequency).

Page 27: Probability, Gaussians and Estimation

The Discrete Kalman Filter

Estimate the state of a discrete-time controlled process that is governed by the linear stochastic difference equation:

with a measurement:
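The equations themselves did not survive the transcript; in the standard notation (consistent with the H, P, K, and R symbols used on the later slides, though the exact symbols on the original slide are an assumption) they read:

$$
x_k = A\,x_{k-1} + B\,u_{k-1} + w_{k-1}
\qquad\qquad
z_k = H\,x_k + v_k
$$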

The random variables w_k and v_k represent the process and measurement noise, respectively. They are assumed to be independent of each other, white, and to have normal probability distributions (PDFs).

In practice, the process noise covariance and measurement noise covariance matrices might change with each time step or measurement.

Page 28: Probability, Gaussians and Estimation

The Discrete Kalman Filter

First part – model forecast: prediction

[The slide annotates the prediction equations with: the "prior" estimate, the state transition, the control signal, the state prediction, the process noise covariance, and the error covariance prediction.]

Prediction is based only on the model of the system dynamics.
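The prediction equations were lost from the transcript; in the standard form (symbol choice is an assumption, kept consistent with the later slides):

$$
\hat{x}_k^- = A\,\hat{x}_{k-1} + B\,u_{k-1}
\qquad\qquad
P_k^- = A\,P_{k-1}\,A^{T} + Q
$$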

Page 29: Probability, Gaussians and Estimation

The Discrete Kalman Filter

Second part – measurement update: correction

[The slide annotates the correction equations with: the "posterior" estimate, the state correction, the "prior" state prediction, the Kalman gain, the actual measurement, and the predicted measurement; it also updates the (posterior) error covariance matrix.]
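The correction equations were lost from the transcript; the standard measurement-update step (again a sketch in the conventional notation) is:

$$
\hat{x}_k = \hat{x}_k^- + K_k\,(z_k - H\,\hat{x}_k^-)
\qquad\qquad
P_k = (I - K_k H)\,P_k^-
$$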

Page 30: Probability, Gaussians and Estimation

The Discrete Kalman Filter

The Kalman gain, K: “Do I trust my model or measurements?”

$$
K_k = P_k^- H^{T}\,\bigl(H\,P_k^- H^{T} + R\bigr)^{-1}
\;=\;
\frac{\text{variance of the predicted states}}
     {\text{variance of the predicted} + \text{measured states}}
$$

Here H is the measurement sensitivity matrix and R is the measurement noise covariance.

As the measurement error covariance, R, approaches zero, the actual measurement z_k is "trusted" more and more, while the predicted measurement H x̂_k⁻ is trusted less and less.

But as the "prior" (predicted) estimate error covariance, P_k⁻, approaches zero, the actual measurement is trusted less and less, and the predicted measurement H x̂_k⁻ is trusted more and more.

Page 31: Probability, Gaussians and Estimation

Estimate a constant voltage

• Measurements have noise
• Update step (prediction)
• Measurement step (correction)

(A code sketch of both steps follows below.)
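A minimal Python sketch of this example (the true voltage, noise levels, and names are my own choices in the spirit of the slide, not values from it): a scalar Kalman filter with A = 1, H = 1, and no control input, estimating a constant voltage from noisy readings.

```python
import numpy as np

def kalman_constant(measurements, R, Q=1e-5, x0=0.0, P0=1.0):
    """Scalar Kalman filter for a constant state (A = 1, H = 1, B = 0).

    measurements: noisy voltage readings z_k
    R: measurement noise variance
    Q: process noise variance (small, since the voltage is constant)
    """
    x_hat, P = x0, P0
    estimates = []
    for z in measurements:
        # Time update (prediction): the constant model carries the state forward.
        x_pred = x_hat
        P_pred = P + Q

        # Measurement update (correction).
        K = P_pred / (P_pred + R)          # Kalman gain
        x_hat = x_pred + K * (z - x_pred)  # blend prediction and measurement
        P = (1.0 - K) * P_pred             # shrink the error covariance
        estimates.append(x_hat)
    return np.array(estimates)

# Example: true voltage 0.38 V, measurement noise standard deviation 0.1 V.
rng = np.random.default_rng(2)
z = 0.38 + rng.normal(0.0, 0.1, size=50)
print(kalman_constant(z, R=0.1**2)[-1])   # converges toward 0.38
```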

Page 32: Probability, Gaussians and Estimation

Results

Page 33: Probability, Gaussians and Estimation

Variance

Page 34: Probability, Gaussians and Estimation

Parameter tuning

Page 35: Probability, Gaussians and Estimation

More tuning