Markov Processes
MBAP 6100 & EMEN 5600 Survey of Operations Research
Professor Stephen Lawrence
Leeds School of Business, University of Colorado, Boulder, CO 80309-0419


Page 1:

Markov Processes

MBAP 6100 & EMEN 5600

Survey of Operations Research Professor Stephen Lawrence

Leeds School of Business

University of Colorado

Boulder, CO 80309-0419

Page 2:

OR Course Outline

• Intro to OR
• Linear Programming
• Solving LPs
• LP Sensitivity/Duality
• Transport Problems
• Network Analysis
• Integer Programming
• Nonlinear Programming
• Dynamic Programming
• Game Theory
• Queueing Theory
• Markov Processes
• Decision Analysis
• Simulation

Page 3:

Whirlwind Tour of OR

Markov Analysis

Andrey A. Markov (1856-1922). Early work in probability theory; proved the central limit theorem under general conditions and introduced the chains that now bear his name.

Page 4:

Agenda for This Week

• Markov Processes
  – Stochastic processes
  – Markov chains
  – Future probabilities
  – Steady-state probabilities
  – Markov chain concepts

• Markov Applications
  – More Markov examples
  – Markov decision processes

Page 5:

Stochastic Processes

• Series of random variables {Xt}

• Series indexed over time interval T

• Examples: X1, X2, …, Xt, …, XT can represent
  – monthly inventory levels
  – daily closing price for a stock or index
  – availability of a new technology
  – market demand for a product

Page 6:

Markov Chains

• The next state depends only on the present state Xt
  – previous states and events have no additional influence once the current state is known

• The process moves between states with known transition probabilities

• Transition probabilities are stationary
  – probabilities do not change over time

• There is a finite number of possible states

Page 7:

An Example of a Markov Chain

A small community has two service stations: Petroco and Gasco. The marketing department of Petroco has found that customers switch between stations according to the following transition matrix:

                 Next Month
This Month     Petroco   Gasco
Petroco          0.60      0.40
Gasco            0.20      0.80

Note: each row sums to 1.0!
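A minimal NumPy sketch of this chain (an illustrative addition, not part of the original slides): it stores the transition matrix, checks the row sums, and simulates one customer's station choice month by month. The `simulate` helper and the random seed are assumptions made for the example.

```python
import numpy as np

# Transition matrix from the slide: rows = this month, columns = next month
states = ["Petroco", "Gasco"]
P = np.array([[0.60, 0.40],
              [0.20, 0.80]])
assert np.allclose(P.sum(axis=1), 1.0)  # every row of a transition matrix sums to 1

def simulate(start, n_months, seed=0):
    """Illustrative helper: follow one customer's station choice month by month."""
    rng = np.random.default_rng(seed)
    path = [start]
    for _ in range(n_months):
        path.append(rng.choice(len(states), p=P[path[-1]]))
    return [states[i] for i in path]

print(simulate(start=0, n_months=12))  # start at Petroco
```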

Page 8:

Future State Probabilities

Probability that a customer buying from Petroco this month will buy from Petroco next month:

p11(1) = 0.6

In two months:

p11(2) = (0.6)(0.6) + (0.4)(0.2) = 0.44

Probability of buying from Gasco in two months:

p12(2) = (0.6)(0.4) + (0.4)(0.8) = 0.56
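These two-step probabilities are simply entries of P·P, so they can be checked with one matrix multiplication (a quick illustrative check, not from the slides):

```python
import numpy as np

P = np.array([[0.60, 0.40],
              [0.20, 0.80]])
P2 = P @ P           # two-step transition probabilities
print(P2[0, 0])      # 0.44: Petroco -> Petroco in two months
print(P2[0, 1])      # 0.56: Petroco -> Gasco in two months
```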

Page 9:

Graphical Interpretation

[Figure: two-period probability tree starting from Petroco]

First period:
  Petroco → Petroco (0.6)
  Petroco → Gasco (0.4)

Second period:
  Petroco → Petroco → Petroco: 0.6 × 0.6 = 0.36
  Petroco → Petroco → Gasco:   0.6 × 0.4 = 0.24
  Petroco → Gasco → Petroco:   0.4 × 0.2 = 0.08
  Petroco → Gasco → Gasco:     0.4 × 0.8 = 0.32
                               Total = 1.00

Page 10:

Chapman-Kolmogorov Equations

Let P be the transition matrix for a Markov process. Then the n-step transition probability matrices can be found from:

P(2) = P·P
P(3) = P·P·P
…
P(n) = Pⁿ (the n-th power of P)

Page 11:

CK Equations for Example

P(1) = P:
    0.60  0.40
    0.20  0.80

P(2) = P·P:
    0.60  0.40     0.60  0.40     0.44  0.56
    0.20  0.80  ×  0.20  0.80  =  0.28  0.72
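A brief numerical check of the Chapman-Kolmogorov relation for this example (an illustrative addition, not from the slides); it verifies that P(3) can be built from P(2) and P(1):

```python
import numpy as np

P = np.array([[0.60, 0.40],
              [0.20, 0.80]])

P2 = np.linalg.matrix_power(P, 2)
P3 = np.linalg.matrix_power(P, 3)
print(np.allclose(P3, P2 @ P))   # True: P(3) = P(2)·P(1)
print(P2)                        # the slide's P(2): [[0.44 0.56], [0.28 0.72]]
```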

Page 12:

Starting States

In the current month, 70% of customers shop at Petroco and 30% at Gasco. What will the mix be in two months?

P:
    0.60  0.40
    0.20  0.80

s0 = [0.70  0.30]

sn = s0 · P(n)

s2 = s0 · P(2)

   = [0.7  0.3]  ×  0.44  0.56
                    0.28  0.72

   = [0.39  0.61]
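The same state-mix calculation in NumPy (an illustrative addition; s0 is the starting distribution given on the slide):

```python
import numpy as np

P = np.array([[0.60, 0.40],
              [0.20, 0.80]])
s0 = np.array([0.70, 0.30])

s2 = s0 @ np.linalg.matrix_power(P, 2)
print(s2)  # [0.392 0.608], i.e. roughly 39% Petroco and 61% Gasco in two months
```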

Page 13:

CK Equations in Steady State

P(1):
    0.60  0.40
    0.20  0.80

P(2):
    0.44  0.56
    0.28  0.72

…

P(9) = P⁹ ≈
    0.33  0.67
    0.33  0.67
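A short loop (an illustrative addition, not from the slides) showing how the rows of P(n) converge to a common vector, which is the steady-state distribution:

```python
import numpy as np

P = np.array([[0.60, 0.40],
              [0.20, 0.80]])

for n in (1, 2, 3, 5, 9):
    Pn = np.linalg.matrix_power(P, n)
    print(f"P({n}) =\n{np.round(Pn, 3)}")
# By n = 9 both rows are approximately [0.333 0.667]
```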

Page 14:

Convergence to Steady-State

[Figure: probability of buying at Petroco in each period, starting from Petroco; the curve falls from 1.0 and levels off at about 0.33 within roughly 10 periods]

If a customer buys at Petroco this month, what is the long-run probability that the customer will buy at Petroco during any month in the future?

Page 15:

Calculation of Steady State

• Want the outgoing state probabilities to equal the incoming state probabilities

• Let s = [s1, s2, …, sn] be the vector of steady-state probabilities

• Then we want s = s P

• That is, the state probabilities do not change from transition to transition (i.e., steady state!)

Page 16:

Steady-State for Example

P:
    0.60  0.40
    0.20  0.80

s = [p  g]

s = s P:

[p  g] = [p  g]  ×  0.60  0.40
                    0.20  0.80

p = 0.6p + 0.2g
g = 0.4p + 0.8g
p + g = 1

Solving: p = 0.333, g = 0.667
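A small NumPy sketch (an illustrative addition, not from the slides) that solves the same steady-state system, replacing one balance equation with the normalization p + g = 1:

```python
import numpy as np

P = np.array([[0.60, 0.40],
              [0.20, 0.80]])

# Solve s = s P together with sum(s) = 1:
# use one row of (I - P^T) s = 0 plus the normalization row.
A = np.vstack([(np.eye(2) - P.T)[:-1], np.ones(2)])
b = np.array([0.0, 1.0])
print(np.linalg.solve(A, b))  # approximately [0.333 0.667]
```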

Page 17:

Markov Chain Concepts

• Steady-State Probabilities
  – long-run probability that a process starting in state i will be found in state j

• First-Passage Time
  – number of steps to go from state i to state j for the first time (estimated by simulation in the sketch below)

• Recurrence Time
  – number of steps to return to state i when starting in state i
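As a concrete illustration (an addition, not part of the original slides), the sketch below estimates the first-passage time from Gasco to Petroco and the recurrence time of Petroco by simulating the service-station chain; the exact values for this chain are 5 and 3 steps, respectively.

```python
import numpy as np

P = np.array([[0.60, 0.40],   # state 0 = Petroco
              [0.20, 0.80]])  # state 1 = Gasco
rng = np.random.default_rng(0)

def average_steps(start, target, n_runs=20_000):
    """Average number of transitions needed to reach `target` from `start`."""
    total = 0
    for _ in range(n_runs):
        state, steps = start, 0
        while True:
            state = rng.choice(2, p=P[state])
            steps += 1
            if state == target:
                break
        total += steps
    return total / n_runs

print("First-passage time Gasco -> Petroco:", average_steps(1, 0))  # about 5
print("Recurrence time of Petroco:         ", average_steps(0, 0))  # about 3
```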

Page 18:

Markov Chain Concepts (cont.)

• Accessible States
  – state j can be reached from state i (pij(n) > 0 for some n)

• Communicating States
  – states i and j are accessible from one another

• Irreducible Markov Chains
  – all states communicate with one another

Page 19:

Markov Chain Concepts (cont.)

• Recurrent State
  – a state that will certainly return to itself (fii = 1)

• Transient State
  – a state that might never return to itself (fii < 1)

• Absorbing State
  – a state that never moves to another state (pii = 1)
  – a "black hole"

Page 20:

Markov Examples
Markov Decision Processes

Page 21:

Matrix Multiplication

    a  b     m  n     am + bp   an + bq
    c  d  ×  p  q  =  cm + dp   cn + dq

Matrix multiplication in Excel…
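The same 2 × 2 product can be checked in Python (an illustrative addition, not from the slides; the numbers below are placeholders for a, b, …, q). In Excel the equivalent calculation is typically done with the MMULT array function, though the slide's Excel walkthrough is not reproduced here.

```python
import numpy as np

A = np.array([[1.0, 2.0],    # [[a, b],
              [3.0, 4.0]])   #  [c, d]]
B = np.array([[5.0, 6.0],    # [[m, n],
              [7.0, 8.0]])   #  [p, q]]

# Element by element, following the slide's formula
product = np.array([
    [A[0, 0]*B[0, 0] + A[0, 1]*B[1, 0], A[0, 0]*B[0, 1] + A[0, 1]*B[1, 1]],
    [A[1, 0]*B[0, 0] + A[1, 1]*B[1, 0], A[1, 0]*B[0, 1] + A[1, 1]*B[1, 1]],
])
print(product)
print(A @ B)   # NumPy's matrix product gives the same result
```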

Page 22:

Machine Breakdown Example

A critical machine in a manufacturing operation breaks down with some frequency. The hourly up-down transition matrix for the machine is shown below. What percentage of the time is the machine operating (up)?

              Up    Down
    Up       0.90   0.10
    Down     0.70   0.30
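One way to answer the question (an illustrative sketch, not from the slides) is to compute the steady-state distribution of this up/down chain; the long-run fraction of time the machine is operating is the "Up" component of that distribution.

```python
import numpy as np

P = np.array([[0.90, 0.10],   # Up   -> Up, Down
              [0.70, 0.30]])  # Down -> Up, Down

# Steady state: s = s P with the components of s summing to 1
A = np.vstack([(np.eye(2) - P.T)[:-1], np.ones(2)])
s = np.linalg.solve(A, np.array([0.0, 1.0]))
print(s)  # roughly [0.875 0.125]: the machine is up about 87.5% of the time
```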

Page 23:

Credit History Example

The Rifle, CO Mercantile Department Store wants to analyze the payment behavior of customers who have outstanding accounts. The store’s credit department has determined the following bill payment pattern from historical records:

              Pay    No Pay
    Pay       0.90    0.10
    No Pay    0.80    0.20

Page 24:

Credit History Continued

Further analysis reveals the following credit transition matrix at the Rifle Mercantile:

             Pay    1     2     3     Bad
    Pay      1.0    0     0     0     0
    1        0.8    0     0.2   0     0
    2        0.8    0     0     0.2   0
    3        0.6    0     0     0     0.4
    Bad      0      0     0     0     1.0

Page 25:

University Graduation Example

Fort Lewis College in Durango has determined that students progress through the college according to the following transition matrix:

             F      So     J      Sr     D      G
    F       0.10   0.70   0      0      0.20   0
    So      0      0.10   0.80   0      0.10   0
    J       0      0      0.15   0.75   0.10   0
    Sr      0      0      0      0.15   0.05   0.80
    D       0      0      0      0      1      0
    G       0      0      0      0      0      1

(F = Freshman, So = Sophomore, J = Junior, Sr = Senior, D = Dropout, G = Graduate)
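As a closing illustration (an addition, not part of the original slides), the standard absorbing-chain calculation for this matrix: partition P into a transient-to-transient block Q and a transient-to-absorbing block R, form the fundamental matrix N = (I - Q)^(-1), and read each class's eventual dropout and graduation probabilities from B = N·R. The state order and the D/G labels follow the reconstruction above.

```python
import numpy as np

# Rows/columns ordered F, So, J, Sr, D, G (D and G are absorbing)
P = np.array([
    [0.10, 0.70, 0.00, 0.00, 0.20, 0.00],
    [0.00, 0.10, 0.80, 0.00, 0.10, 0.00],
    [0.00, 0.00, 0.15, 0.75, 0.10, 0.00],
    [0.00, 0.00, 0.00, 0.15, 0.05, 0.80],
    [0.00, 0.00, 0.00, 0.00, 1.00, 0.00],
    [0.00, 0.00, 0.00, 0.00, 0.00, 1.00],
])

Q = P[:4, :4]                       # transient -> transient (F, So, J, Sr)
R = P[:4, 4:]                       # transient -> absorbing (D, G)
N = np.linalg.inv(np.eye(4) - Q)    # fundamental matrix: expected visits to each class
B = N @ R                           # absorption probabilities

print("P(dropout), P(graduate) by starting class (F, So, J, Sr):")
print(np.round(B, 3))
print("Expected periods until leaving:", np.round(N.sum(axis=1), 2))
```

Under this reconstructed matrix, an entering freshman eventually graduates with probability of about 0.57 and drops out with probability of about 0.43.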