TRANSCRIPT
8D-L3 Hidden Markov Models
CS4495/6495 Introduction to Computer Vision
Outline
• Time Series
• Markov Models
• Hidden Markov Models
• 3 computational problems of HMMs
• Applying HMMs in vision – Gesture Recognition
Audio Spectrum of the Song of the Prothonotary Warbler
Chestnut-sided Warbler Prothonotary Warbler
Questions One Could Ask
• What bird is this? – Time series classification
• How will the song continue? – Time series prediction
• Is this bird sick? – Outlier detection
• What phases does this song have? – Time series segmentation
[Stock price charts: Intel, Cisco, General Electric, Microsoft]
Questions One Could Ask
• Will the stock go up or down? – Time series prediction
• What type of stock is this (e.g., risky)? – Time series classification
• Is the behavior abnormal? – Outlier detection
Music Analysis
Questions One Could Ask
• Is this Beethoven or Bach? – Time series classification
• Can we compose more of that? – Time series prediction/generation
• Can we segment the piece into themes? – Time series segmentation
For vision: Waving, pointing, controlling?
The Real Question
• How do we model these problems?
• How do we formulate these questions as inference/learning problems?
Outline For Today
• Time Series
• Markov Models
• Hidden Markov Models
• 3 computational problems of HMMs
Weather: A Markov Model (maybe?)

[State diagram: Sunny, Rainy, Snowy, with transition probabilities
Sunny→Sunny 80%, Sunny→Rainy 15%, Sunny→Snowy 5%;
Rainy→Sunny 38%, Rainy→Rainy 60%, Rainy→Snowy 2%;
Snowy→Sunny 75%, Snowy→Rainy 5%, Snowy→Snowy 20%]

The probability of moving to a given state depends only on the current state: 1st-order Markovian.
Ingredients of a Markov Model
• States: S = {S_1, S_2, …, S_N}
• State transition probabilities: a_ij = P(q_{t+1} = S_j | q_t = S_i)
• Initial state distribution: π_i = P(q_1 = S_i)
Ingredients of a Markov Model (weather example)
• States: S = {S_sunny, S_rainy, S_snowy}
• State transition probabilities:
  A = | .80  .15  .05 |
      | .38  .60  .02 |
      | .75  .05  .20 |
• Initial state distribution: π = (.7  .25  .05)
Probability of a Time Series
• Given:
  π = (.7  .25  .05)
  A = | .80  .15  .05 |
      | .38  .60  .02 |
      | .75  .05  .20 |
• What is the probability of this series: sunny, rainy, rainy, rainy, snowy, snowy?

P(S_sunny) P(S_rainy|S_sunny) P(S_rainy|S_rainy) P(S_rainy|S_rainy) P(S_snowy|S_rainy) P(S_snowy|S_snowy)
= 0.7 × 0.15 × 0.6 × 0.6 × 0.02 × 0.2 = 0.0001512
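As a sanity check on the arithmetic, the chain-rule product above can be coded directly. A minimal Python sketch; the state coding 0 = sunny, 1 = rainy, 2 = snowy is an arbitrary choice, not from the slides:

```python
# Weather Markov chain from the slides.
# State coding (assumed): 0 = sunny, 1 = rainy, 2 = snowy.
pi = [0.7, 0.25, 0.05]            # initial state distribution
A = [[0.80, 0.15, 0.05],          # A[i][j] = P(q_{t+1} = j | q_t = i)
     [0.38, 0.60, 0.02],
     [0.75, 0.05, 0.20]]

def sequence_probability(states):
    """P(q_1, ..., q_T) = pi[q_1] * product over t of A[q_t][q_{t+1}]."""
    p = pi[states[0]]
    for prev, nxt in zip(states, states[1:]):
        p *= A[prev][nxt]
    return p

# sunny, rainy, rainy, rainy, snowy, snowy -> 0.0001512, as on the slide
print(sequence_probability([0, 1, 1, 1, 2, 2]))
```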
Outline For Today
• Time Series
• Markov Models
• Hidden Markov Models
• 3 computational problems of HMMs
Hidden Markov Models: Intuition
• Suppose you can’t observe the state
• You can only observe some evidence…
Hidden Markov Models: Weather Example
Observables:
Emission probabilities: b_j(k) = P(o_t = k | q_t = S_j)
Hidden Markov Models: Weather Example

[Two-layer diagram: the weather states Sunny, Rainy, Snowy with the same transition probabilities as before are NOT OBSERVABLE; what IS OBSERVABLE is what people wear (e.g., coat, umbrella), linked to each hidden state by emission probabilities.]
Probability of a Time Series
• Given:
  π = (.7  .25  .05)
  A = | .80  .15  .05 |
      | .38  .60  .02 |
      | .75  .05  .20 |
  B = | .60  .30  .10 |
      | .05  .30  .65 |
      | .00  .50  .50 |
• What is the probability of this series?
Probability of a Time Series
• Given:
  π = (.7  .25  .05)
  A = | .80  .15  .05 |
      | .38  .60  .02 |
      | .75  .05  .20 |
  B = | .60  .30  .10 |
      | .05  .30  .65 |
      | .00  .50  .50 |

P(O) = P(O_coat, O_coat, O_umbrella, …, O_umbrella)

P(O | λ) = Σ over all Q = q_1, …, q_7 of P(O | q_1, …, q_7) P(q_1, …, q_7)
         = (0.3² × 0.1⁴ × 0.6) × (0.7 × 0.8⁶) + …   (the term shown: all sun!)
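The sum over all N^T state paths can be written down literally for a short sequence. A Python sketch; note the column-to-observable mapping in B is an assumption here (the slides never fix the order), with column 1 read as "coat" and column 2 as "umbrella":

```python
from itertools import product

# Weather HMM from the slides; state coding 0 = sunny, 1 = rainy, 2 = snowy.
pi = [0.7, 0.25, 0.05]
A = [[0.80, 0.15, 0.05],
     [0.38, 0.60, 0.02],
     [0.75, 0.05, 0.20]]
# Emission matrix B; column-to-observable mapping assumed: 1 = coat, 2 = umbrella.
B = [[0.60, 0.30, 0.10],
     [0.05, 0.30, 0.65],
     [0.00, 0.50, 0.50]]

def brute_force_likelihood(obs):
    """P(O | lambda) = sum over all N^T state paths of P(O | q) P(q)."""
    N, T = len(pi), len(obs)
    total = 0.0
    for path in product(range(N), repeat=T):
        p = pi[path[0]] * B[path[0]][obs[0]]
        for t in range(1, T):
            p *= A[path[t - 1]][path[t]] * B[path[t]][obs[t]]
        total += p
    return total

# O = coat, coat, umbrella, umbrella (a short prefix of the slide's sequence)
print(brute_force_likelihood([1, 1, 2, 2]))
```

Fine for T = 4 (3⁴ = 81 paths), but the N^T blow-up is exactly why the forward algorithm in the next slides exists.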
Specification of an HMM
• N – number of states
• S = {S_1, S_2, …, S_N} – set of states
• Q = {q_1, q_2, …, q_T} – sequence of states
Specification of an HMM: 𝜆= (A,B,π)
A - the state transition probability matrix 𝑎𝑖𝑗 = 𝑃(𝑞𝑡+1 = 𝑗|𝑞𝑡 = 𝑖)
B- observation probability distribution
Discrete: 𝑏𝑗 𝑘 = 𝑃 𝑜𝑡 = 𝑘 𝑞𝑡 = 𝑗 1 ≤ 𝑘 ≤ 𝑀
Continuous: b_j(x) = p(o_t = x | q_t = j)
π - the initial state distribution
𝜋 (𝑗) = 𝑃(𝑞1 = 𝑗)
Specification of an HMM
Some form of output symbols
• Discrete – finite vocabulary of symbols of size M. One symbol is “emitted” each time a state is visited
• Continuous – an output density in some feature space associated with each state; an output is emitted with each visit
![Page 26: CS4495/6495 Introduction to Computer Vision8D-L3 Hidden Markov Models CS4495/6495 Introduction to Computer Vision Outline •Time Series •Markov Models •Hidden Markov Models •3](https://reader036.vdocuments.net/reader036/viewer/2022081507/5fc393fe1a70d477f653697d/html5/thumbnails/26.jpg)
Specification of an HMM
Considering a given observation sequence O
• 𝑂 = {𝑜1, 𝑜2, … , 𝑜𝑇} – o_i is the observed symbol or feature at time i
(sometimes a set of them)
![Page 27: CS4495/6495 Introduction to Computer Vision8D-L3 Hidden Markov Models CS4495/6495 Introduction to Computer Vision Outline •Time Series •Markov Models •Hidden Markov Models •3](https://reader036.vdocuments.net/reader036/viewer/2022081507/5fc393fe1a70d477f653697d/html5/thumbnails/27.jpg)
Specification of an HMM: 𝜆= (𝐴, 𝐵, 𝜋)
𝐴 - the state transition probability matrix 𝑎𝑖𝑗 = 𝑃(𝑞𝑡+1 = 𝑗|𝑞𝑡 = 𝑖)
Specification of an HMM: 𝜆= (𝐴, 𝐵, 𝜋)
𝐵- observation probability distribution
Discrete: 𝑏𝑗 𝑘 = 𝑃 𝑜𝑡 = 𝑘 𝑞𝑡 = 𝑗 1 ≤ 𝑘 ≤ 𝑀
Continuous: b_j(x) = p(o_t = x | q_t = j)
Specification of an HMM: 𝜆= (𝐴, 𝐵, 𝜋)
𝜋 - the initial state distribution
𝜋 (𝑗) = 𝑃(𝑞1 = 𝑗)
What does this have to do with Vision?
• Given some sequence of observations, what "model" generated those?
• Using the previous example: given some observation sequence of clothing, is this Philadelphia, Boston, or Newark?
• Notice that for Boston vs. Arizona we would not need the sequence!
![Page 31: CS4495/6495 Introduction to Computer Vision8D-L3 Hidden Markov Models CS4495/6495 Introduction to Computer Vision Outline •Time Series •Markov Models •Hidden Markov Models •3](https://reader036.vdocuments.net/reader036/viewer/2022081507/5fc393fe1a70d477f653697d/html5/thumbnails/31.jpg)
Outline For Today
• Time Series
• Markov Models
• Hidden Markov Models
• 3 computational problems of HMMs
![Page 32: CS4495/6495 Introduction to Computer Vision8D-L3 Hidden Markov Models CS4495/6495 Introduction to Computer Vision Outline •Time Series •Markov Models •Hidden Markov Models •3](https://reader036.vdocuments.net/reader036/viewer/2022081507/5fc393fe1a70d477f653697d/html5/thumbnails/32.jpg)
The 3 great problems in HMM modelling
1. Evaluating 𝑃(𝑂|𝜆): Given the model 𝜆 = (𝐴, 𝐵, 𝜋), what is the probability of occurrence of a particular observation sequence 𝑂 = 𝑜1, … , 𝑜𝑇?
• Classification/recognition problem: given a trained model for each of a set of classes, which one would most likely generate what I saw?
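The recognition recipe can be sketched in a few lines: score one HMM per class with the forward likelihood and take the argmax. The two "city" models below are made-up numbers purely so the example runs; only the argmax-over-likelihoods idea comes from the slide.

```python
# Likelihood-based classification: pick the class whose HMM best explains O.
def forward(obs, pi, A, B):
    """P(O | lambda) via the forward recursion (detailed later in the lecture)."""
    N = len(pi)
    alpha = [pi[i] * B[i][obs[0]] for i in range(N)]
    for o in obs[1:]:
        alpha = [sum(alpha[i] * A[i][j] for i in range(N)) * B[j][o]
                 for j in range(N)]
    return sum(alpha)

# Hypothetical two-state models (pi, A, B); numbers invented for illustration.
# Observable coding (assumed): 0 = no umbrella seen, 1 = umbrella seen.
models = {
    "rainy_city": ([0.5, 0.5],
                   [[0.6, 0.4], [0.3, 0.7]],
                   [[0.2, 0.8], [0.7, 0.3]]),
    "sunny_city": ([0.9, 0.1],
                   [[0.9, 0.1], [0.5, 0.5]],
                   [[0.8, 0.2], [0.1, 0.9]]),
}

obs = [1, 1, 0, 1]                     # mostly umbrellas
best = max(models, key=lambda m: forward(obs, *models[m]))
print(best)
```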
![Page 33: CS4495/6495 Introduction to Computer Vision8D-L3 Hidden Markov Models CS4495/6495 Introduction to Computer Vision Outline •Time Series •Markov Models •Hidden Markov Models •3](https://reader036.vdocuments.net/reader036/viewer/2022081507/5fc393fe1a70d477f653697d/html5/thumbnails/33.jpg)
The 3 great problems in HMM modelling
2. Decoding: What is the optimal state sequence to produce an observation sequence 𝑂 = {𝑜1, … , 𝑜𝑇}?
• Useful in recognition problems – helps give meaning to states.
![Page 34: CS4495/6495 Introduction to Computer Vision8D-L3 Hidden Markov Models CS4495/6495 Introduction to Computer Vision Outline •Time Series •Markov Models •Hidden Markov Models •3](https://reader036.vdocuments.net/reader036/viewer/2022081507/5fc393fe1a70d477f653697d/html5/thumbnails/34.jpg)
The 3 great problems in HMM modelling
3. Learning: Determine model λ, given a training set of observations
• Find λ, such that 𝑃(𝑂|𝜆) is maximal
![Page 35: CS4495/6495 Introduction to Computer Vision8D-L3 Hidden Markov Models CS4495/6495 Introduction to Computer Vision Outline •Time Series •Markov Models •Hidden Markov Models •3](https://reader036.vdocuments.net/reader036/viewer/2022081507/5fc393fe1a70d477f653697d/html5/thumbnails/35.jpg)
Problem 1 𝑃(𝑂|𝜆) : Naïve solution
Assume
• We know the state sequence Q = q_1, …, q_T
• Independent observations:
then

P(O | q, λ) = ∏_{t=1..T} P(o_t | q_t, λ) = b_{q1}(o_1) b_{q2}(o_2) … b_{qT}(o_T)
![Page 36: CS4495/6495 Introduction to Computer Vision8D-L3 Hidden Markov Models CS4495/6495 Introduction to Computer Vision Outline •Time Series •Markov Models •Hidden Markov Models •3](https://reader036.vdocuments.net/reader036/viewer/2022081507/5fc393fe1a70d477f653697d/html5/thumbnails/36.jpg)
Problem 1 𝑃(𝑂|𝜆) : Naïve solution
• But we know the probability of any given sequence of states:

P(q | λ) = π_{q1} a_{q1q2} a_{q2q3} … a_{q(T−1)qT}
![Page 37: CS4495/6495 Introduction to Computer Vision8D-L3 Hidden Markov Models CS4495/6495 Introduction to Computer Vision Outline •Time Series •Markov Models •Hidden Markov Models •3](https://reader036.vdocuments.net/reader036/viewer/2022081507/5fc393fe1a70d477f653697d/html5/thumbnails/37.jpg)
Problem 1 𝑃(𝑂|𝜆) : Naïve solution
• Given:

P(O | q, λ) = ∏_{t=1..T} P(o_t | q_t, λ) = b_{q1}(o_1) b_{q2}(o_2) … b_{qT}(o_T)
P(q | λ) = π_{q1} a_{q1q2} a_{q2q3} … a_{q(T−1)qT}

• We get:

P(O | λ) = Σ_q P(O | q, λ) P(q | λ)

But this is summed over all paths. There are N^T state paths, each 'costing' O(T) calculations, leading to O(T N^T) time complexity.
![Page 38: CS4495/6495 Introduction to Computer Vision8D-L3 Hidden Markov Models CS4495/6495 Introduction to Computer Vision Outline •Time Series •Markov Models •Hidden Markov Models •3](https://reader036.vdocuments.net/reader036/viewer/2022081507/5fc393fe1a70d477f653697d/html5/thumbnails/38.jpg)
Problem 1 𝑃(𝑂|𝜆) : Efficient solution
Define the auxiliary forward variable α:

α_t(i) = P(o_1, …, o_t, q_t = i | λ)

α_t(i) is the probability of observing the partial sequence of observables o_1, …, o_t AND being in state q_t = i at time t.
![Page 39: CS4495/6495 Introduction to Computer Vision8D-L3 Hidden Markov Models CS4495/6495 Introduction to Computer Vision Outline •Time Series •Markov Models •Hidden Markov Models •3](https://reader036.vdocuments.net/reader036/viewer/2022081507/5fc393fe1a70d477f653697d/html5/thumbnails/39.jpg)
Problem 1 𝑃(𝑂|𝜆) : Efficient solution
Forward recursive algorithm:
• Initialise: α_1(i) = π_i b_i(o_1)
• Each time step: α_{t+1}(j) = [ Σ_{i=1..N} α_t(i) a_ij ] b_j(o_{t+1})
  (we can reach j from any preceding state i)
• Conclude: P(O | λ) = Σ_{i=1..N} α_T(i)
  (the probability of the entire observation sequence is just the sum over observing everything and ending up in state i, for all i)
• Complexity: O(N²T)
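The three steps above translate almost line-for-line into code. A Python sketch using the weather model's π, A, B from the earlier slides (the observable coding of B's columns is the same assumption as before):

```python
# Weather HMM from the slides; state coding 0 = sunny, 1 = rainy, 2 = snowy.
pi = [0.7, 0.25, 0.05]
A = [[0.80, 0.15, 0.05],
     [0.38, 0.60, 0.02],
     [0.75, 0.05, 0.20]]
B = [[0.60, 0.30, 0.10],          # column order of observables assumed
     [0.05, 0.30, 0.65],
     [0.00, 0.50, 0.50]]

def forward(obs):
    """Return P(O | lambda) in O(N^2 T) time via the forward recursion."""
    N = len(pi)
    # Initialise: alpha_1(i) = pi_i * b_i(o_1)
    alpha = [pi[i] * B[i][obs[0]] for i in range(N)]
    for o in obs[1:]:
        # Induction: alpha_{t+1}(j) = (sum_i alpha_t(i) * a_ij) * b_j(o_{t+1})
        alpha = [sum(alpha[i] * A[i][j] for i in range(N)) * B[j][o]
                 for j in range(N)]
    # Conclude: P(O | lambda) = sum_i alpha_T(i)
    return sum(alpha)

print(forward([1, 1, 2, 2]))
```

For long sequences a practical implementation would rescale α at each step (or work in log space) to avoid underflow; that detail is omitted here.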
![Page 40: CS4495/6495 Introduction to Computer Vision8D-L3 Hidden Markov Models CS4495/6495 Introduction to Computer Vision Outline •Time Series •Markov Models •Hidden Markov Models •3](https://reader036.vdocuments.net/reader036/viewer/2022081507/5fc393fe1a70d477f653697d/html5/thumbnails/40.jpg)
Rest of HMMs (in brief)
• The forward recursive algorithm could compute the likelihood of being in a state i at time t and having observed the sequence from the start until t, given HMM λ
![Page 41: CS4495/6495 Introduction to Computer Vision8D-L3 Hidden Markov Models CS4495/6495 Introduction to Computer Vision Outline •Time Series •Markov Models •Hidden Markov Models •3](https://reader036.vdocuments.net/reader036/viewer/2022081507/5fc393fe1a70d477f653697d/html5/thumbnails/41.jpg)
Rest of HMMs (in brief)
•A backward recursive algorithm could compute the likelihood of being in a state i at time t and observing the remainder of the observed sequence, given HMM λ
![Page 42: CS4495/6495 Introduction to Computer Vision8D-L3 Hidden Markov Models CS4495/6495 Introduction to Computer Vision Outline •Time Series •Markov Models •Hidden Markov Models •3](https://reader036.vdocuments.net/reader036/viewer/2022081507/5fc393fe1a70d477f653697d/html5/thumbnails/42.jpg)
So… or hmmmm…
1. If we know HMM λ, then with the forward and backward algorithms we can get an estimate of the distribution over which state the system is in at each time t.
2. With those distributions, and having actually observed the output data, we can determine the emission probabilities b_j(k) that would maximize the probability of the sequence.
![Page 43: CS4495/6495 Introduction to Computer Vision8D-L3 Hidden Markov Models CS4495/6495 Introduction to Computer Vision Outline •Time Series •Markov Models •Hidden Markov Models •3](https://reader036.vdocuments.net/reader036/viewer/2022081507/5fc393fe1a70d477f653697d/html5/thumbnails/43.jpg)
So… or hmmmm…
3. Given the distributions over states, we can also determine the transition probabilities a_ij that maximize the probability.
4. With the new a_ij and b_j(k) we can get a new estimate of the state distributions at all times. (Go to 1.)
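Steps 1–4 form the Baum-Welch (EM) training loop. Here is a minimal sketch of step 1, the state posteriors γ_t(i) = P(q_t = i | O, λ) computed from the forward and backward tables, assuming the weather HMM from the earlier slides (no rescaling, single sequence, so illustrative only; the re-estimation of a_ij and b_j(k) would follow from these posteriors):

```python
# Weather HMM from the slides; state coding 0 = sunny, 1 = rainy, 2 = snowy.
pi = [0.7, 0.25, 0.05]
A = [[0.80, 0.15, 0.05],
     [0.38, 0.60, 0.02],
     [0.75, 0.05, 0.20]]
B = [[0.60, 0.30, 0.10],          # column order of observables assumed
     [0.05, 0.30, 0.65],
     [0.00, 0.50, 0.50]]
N = 3

def forward_table(obs):
    """alpha[t][i] = P(o_1..o_{t+1}, q_{t+1} = i | lambda)."""
    alpha = [[pi[i] * B[i][obs[0]] for i in range(N)]]
    for o in obs[1:]:
        prev = alpha[-1]
        alpha.append([sum(prev[i] * A[i][j] for i in range(N)) * B[j][o]
                      for j in range(N)])
    return alpha

def backward_table(obs):
    """beta[t][i] = P(o_{t+2}..o_T | q_{t+1} = i, lambda); beta_T(i) = 1."""
    T = len(obs)
    beta = [[1.0] * N for _ in range(T)]
    for t in range(T - 2, -1, -1):
        for i in range(N):
            beta[t][i] = sum(A[i][j] * B[j][obs[t + 1]] * beta[t + 1][j]
                             for j in range(N))
    return beta

def state_posteriors(obs):
    """gamma[t][i] = P(q_t = i | O, lambda): step 1 of the loop above."""
    alpha, beta = forward_table(obs), backward_table(obs)
    gamma = []
    for a_t, b_t in zip(alpha, beta):
        z = sum(x * y for x, y in zip(a_t, b_t))   # = P(O | lambda) at every t
        gamma.append([x * y / z for x, y in zip(a_t, b_t)])
    return gamma

for t, g in enumerate(state_posteriors([1, 1, 2, 2])):
    print(t, [round(x, 3) for x in g])
```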
![Page 44: CS4495/6495 Introduction to Computer Vision8D-L3 Hidden Markov Models CS4495/6495 Introduction to Computer Vision Outline •Time Series •Markov Models •Hidden Markov Models •3](https://reader036.vdocuments.net/reader036/viewer/2022081507/5fc393fe1a70d477f653697d/html5/thumbnails/44.jpg)
HMMs: General
• HMMs: Generative probabilistic models of time series (with hidden state)
• Forward-backward: algorithm for computing probabilities over hidden states
• Given the forward-backward algorithm, you can also train the models
• Best-known methods in speech, computer vision, and robotics, though for really big data CRFs are winning
![Page 45: CS4495/6495 Introduction to Computer Vision8D-L3 Hidden Markov Models CS4495/6495 Introduction to Computer Vision Outline •Time Series •Markov Models •Hidden Markov Models •3](https://reader036.vdocuments.net/reader036/viewer/2022081507/5fc393fe1a70d477f653697d/html5/thumbnails/45.jpg)
Some thoughts about gestures
• There is a conference on Face and Gesture Recognition, so obviously gesture recognition is an important problem…
• Prototype scenario:
  • Subject does several examples of "each gesture"
  • System "learns" (or is trained) to have some sort of model for each
  • At run time, compare the input to the known models and pick one
![Page 46: CS4495/6495 Introduction to Computer Vision8D-L3 Hidden Markov Models CS4495/6495 Introduction to Computer Vision Outline •Time Series •Markov Models •Hidden Markov Models •3](https://reader036.vdocuments.net/reader036/viewer/2022081507/5fc393fe1a70d477f653697d/html5/thumbnails/46.jpg)
Newfound life for Gesture Recognition:
![Page 47: CS4495/6495 Introduction to Computer Vision8D-L3 Hidden Markov Models CS4495/6495 Introduction to Computer Vision Outline •Time Series •Markov Models •Hidden Markov Models •3](https://reader036.vdocuments.net/reader036/viewer/2022081507/5fc393fe1a70d477f653697d/html5/thumbnails/47.jpg)
Generic Gesture Recognition using HMMs
Nam, Y., & Wohn, K. (1996). Recognition of space-time hand-gestures using hidden Markov model. In Proceedings of the ACM Symposium on Virtual Reality Software and Technology (pp. 51–58).
![Page 48: CS4495/6495 Introduction to Computer Vision8D-L3 Hidden Markov Models CS4495/6495 Introduction to Computer Vision Outline •Time Series •Markov Models •Hidden Markov Models •3](https://reader036.vdocuments.net/reader036/viewer/2022081507/5fc393fe1a70d477f653697d/html5/thumbnails/48.jpg)
Generic Gesture Recognition using HMMs (1)
Data glove
![Page 49: CS4495/6495 Introduction to Computer Vision8D-L3 Hidden Markov Models CS4495/6495 Introduction to Computer Vision Outline •Time Series •Markov Models •Hidden Markov Models •3](https://reader036.vdocuments.net/reader036/viewer/2022081507/5fc393fe1a70d477f653697d/html5/thumbnails/49.jpg)
Generic Gesture Recognition using HMMs (2)
![Page 50: CS4495/6495 Introduction to Computer Vision8D-L3 Hidden Markov Models CS4495/6495 Introduction to Computer Vision Outline •Time Series •Markov Models •Hidden Markov Models •3](https://reader036.vdocuments.net/reader036/viewer/2022081507/5fc393fe1a70d477f653697d/html5/thumbnails/50.jpg)
Generic Gesture Recognition using HMMs (3)
![Page 51: CS4495/6495 Introduction to Computer Vision8D-L3 Hidden Markov Models CS4495/6495 Introduction to Computer Vision Outline •Time Series •Markov Models •Hidden Markov Models •3](https://reader036.vdocuments.net/reader036/viewer/2022081507/5fc393fe1a70d477f653697d/html5/thumbnails/51.jpg)
Generic Gesture Recognition using HMMs (4)
![Page 52: CS4495/6495 Introduction to Computer Vision8D-L3 Hidden Markov Models CS4495/6495 Introduction to Computer Vision Outline •Time Series •Markov Models •Hidden Markov Models •3](https://reader036.vdocuments.net/reader036/viewer/2022081507/5fc393fe1a70d477f653697d/html5/thumbnails/52.jpg)
Generic Gesture Recognition using HMMs (5)
![Page 53: CS4495/6495 Introduction to Computer Vision8D-L3 Hidden Markov Models CS4495/6495 Introduction to Computer Vision Outline •Time Series •Markov Models •Hidden Markov Models •3](https://reader036.vdocuments.net/reader036/viewer/2022081507/5fc393fe1a70d477f653697d/html5/thumbnails/53.jpg)
Pluses and minuses of HMMs in Gesture
Good points about HMMs:
• A learning paradigm that acquires spatial and temporal models and does some amount of feature selection.
• Recognition is fast; training is not so fast but not too bad.
![Page 54: CS4495/6495 Introduction to Computer Vision8D-L3 Hidden Markov Models CS4495/6495 Introduction to Computer Vision Outline •Time Series •Markov Models •Hidden Markov Models •3](https://reader036.vdocuments.net/reader036/viewer/2022081507/5fc393fe1a70d477f653697d/html5/thumbnails/54.jpg)
Pluses and minuses of HMMs in Gesture
Not so good points:
• Not great for on-the-fly labeling, e.g. segmentation of input streams; requires lots of data to train for that, much like language.
• Works well when the problem is easy; less clear at other times.