NLP: Hidden Markov Models

Page 1:

NLP

Page 2:

Introduction to NLP

Hidden Markov Models

Page 3:

Markov Models

• Sequence of random variables that aren’t independent

• Examples
– weather reports
– text

Page 4:

Properties

• Limited horizon: P(X_{t+1} = s_k | X_1, …, X_t) = P(X_{t+1} = s_k | X_t)

• Time invariant (stationary): P(X_{t+1} = s_k | X_t) = P(X_2 = s_k | X_1) for all t

• Definition: in terms of a transition matrix A and initial state probabilities π

Page 5:

Example

[Figure: a state-transition diagram over the states a, b, c, d, e, f plus a start state; the arcs carry the weights 1.0, 1.0, 1.0, 0.8, 0.7, 0.7, 0.3, 0.2, 0.2, 0.1. The path used on the next slide reads off P(d|start) = 1.0, P(a|d) = 0.7, and P(b|a) = 0.8.]

Page 6:

Visible MM

P(X_1, …, X_T) = P(X_1) P(X_2|X_1) P(X_3|X_1,X_2) … P(X_T|X_1,…,X_{T-1})
              = P(X_1) P(X_2|X_1) P(X_3|X_2) … P(X_T|X_{T-1})

P(d, a, b) = P(X_1 = d) P(X_2 = a | X_1 = d) P(X_3 = b | X_2 = a)
           = 1.0 × 0.7 × 0.8 = 0.56
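A minimal sketch of this chain-rule computation in Python; only the transitions needed for the path (d, a, b) are filled in, and the remaining entries from the diagram are omitted:

# Minimal sketch: probability of a path through a visible Markov model.
# Only the entries needed for the path (d, a, b) are listed here.
initial = {"d": 1.0}                        # P(X1 = d)
trans = {("d", "a"): 0.7, ("a", "b"): 0.8}  # P(X_{t+1} | X_t)

def sequence_prob(states):
    """P(X1, ..., XT) = P(X1) * product over t of P(X_t | X_{t-1})."""
    p = initial.get(states[0], 0.0)
    for prev, curr in zip(states, states[1:]):
        p *= trans.get((prev, curr), 0.0)
    return p

print(sequence_prob(["d", "a", "b"]))       # 1.0 * 0.7 * 0.8 = 0.56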

Page 7:

Hidden MM

• Motivation
– Observing a sequence of symbols
– The sequence of states that led to the generation of the symbols is hidden

• Definition
– Q = sequence of states
– O = sequence of observations, drawn from a vocabulary
– q0, qf = special (start, final) states
– A = state transition probabilities
– B = symbol emission probabilities
– π = initial state probabilities
– µ = (A, B, π) = complete probabilistic model
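Read concretely, the definition amounts to a small parameter container. A minimal sketch in Python (the class and field names are my own; the later sketches in this transcript reuse it):

from dataclasses import dataclass

@dataclass
class HMM:
    """µ = (A, B, π): the complete probabilistic model."""
    states: list   # hidden state names (Q is a sequence over these)
    vocab: list    # observation vocabulary
    A: dict        # A[(si, sj)] = P(sj | si), state transition probabilities
    B: dict        # B[(s, o)]   = P(o | s), symbol emission probabilities
    pi: dict       # pi[s]       = P(s | start), initial state probabilities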

Page 8:

Hidden MM

• Uses
– part of speech tagging
– speech recognition
– gene sequencing

Page 9:

Hidden Markov Model (HMM)

• Can be used to model state sequences and observation sequences

• Example: P(S, W) = ∏_i P(s_i | s_{i-1}) P(w_i | s_i)

[Figure: the chain S0 → S1 → S2 → S3 → … → Sn, with each state Si emitting the observation Wi.]

Page 10:

Generative Algorithm

• Pick the start state from π
• For t = 1..T
– Move to another state based on A
– Emit an observation based on B
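A sketch of this procedure, taking the slide's order literally (pick a start state from π, then transition and emit at each step); random.choices is from the standard library, and the HMM container is the one sketched above:

import random

def generate(hmm, T):
    """Sample a length-T observation sequence from the model."""
    # Pick the start state from pi.
    state = random.choices(hmm.states, weights=[hmm.pi[s] for s in hmm.states])[0]
    observations = []
    for _ in range(T):
        # Move to another state based on A.
        state = random.choices(hmm.states,
                               weights=[hmm.A[(state, s)] for s in hmm.states])[0]
        # Emit an observation based on B.
        observations.append(random.choices(hmm.vocab,
                                           weights=[hmm.B[(state, o)] for o in hmm.vocab])[0])
    return observations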

Page 11:

State Transition Probabilities

[Figure: a two-state diagram over G and H with an arc from the start state into G; the arc weights 0.8, 0.2, 0.6, and 0.4 correspond to P(G|G), P(H|G), P(G|H), and P(H|H) as listed on Page 13.]

Page 12:

Emission Probabilities

• P(O_t = k | X_t = s_i, X_{t+1} = s_j) = b_{ijk}

      x     y     z
G    0.7   0.2   0.1
H    0.3   0.5   0.2

Page 13:

All Parameters of the Model

• Initial
– P(G|start) = 1.0, P(H|start) = 0.0

• Transition
– P(G|G) = 0.8, P(G|H) = 0.6, P(H|G) = 0.2, P(H|H) = 0.4

• Emission
– P(x|G) = 0.7, P(y|G) = 0.2, P(z|G) = 0.1
– P(x|H) = 0.3, P(y|H) = 0.5, P(z|H) = 0.2
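Written out with the container sketched after Page 7 (the variable name hmm is my own; the numbers are exactly those above):

hmm = HMM(
    states=["G", "H"],
    vocab=["x", "y", "z"],
    A={("G", "G"): 0.8, ("G", "H"): 0.2,
       ("H", "G"): 0.6, ("H", "H"): 0.4},
    B={("G", "x"): 0.7, ("G", "y"): 0.2, ("G", "z"): 0.1,
       ("H", "x"): 0.3, ("H", "y"): 0.5, ("H", "z"): 0.2},
    pi={"G": 1.0, "H": 0.0},
)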

Page 14:

Observation sequence “yz”

• Starting in state G (or H), P(yz) = ?
• Possible sequences of states:
– GG
– GH
– HG
– HH

• P(yz) = P(yz, GG) + P(yz, GH) + P(yz, HG) + P(yz, HH)
= (0.8 × 0.2 × 0.8 × 0.1) + (0.8 × 0.2 × 0.2 × 0.2) + (0.2 × 0.5 × 0.6 × 0.1) + (0.2 × 0.5 × 0.4 × 0.2)
= 0.0128 + 0.0064 + 0.0060 + 0.0080 = 0.0332
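The same sum as a brute-force sketch that enumerates every state sequence, matching the slide's convention of starting in G and transitioning before each emission (reusing the hmm instance from Page 13):

from itertools import product

def brute_force_likelihood(hmm, obs, start="G"):
    """P(obs) = sum over all state sequences of the joint probability."""
    total = 0.0
    for path in product(hmm.states, repeat=len(obs)):  # GG, GH, HG, HH for length 2
        p, prev = 1.0, start
        for state, o in zip(path, obs):
            p *= hmm.A[(prev, state)] * hmm.B[(state, o)]
            prev = state
        total += p
    return total

print(brute_force_likelihood(hmm, ["y", "z"]))  # 0.0332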

Page 15:

States and Transitions

• The states encode the most recent history
• The transitions encode likely sequences of states
– e.g., Adj-Noun or Noun-Verb
– or perhaps Art-Adj-Noun

• Use MLE to estimate the transition probabilities

Page 16:

Emissions

• Estimating the emission probabilities
– Harder than transition probabilities
– There may be novel uses of word/POS combinations

• Suggestions
– It is possible to use standard smoothing
– As well as heuristics (e.g., based on the spelling of the words)

Page 17:

Sequence of Observations

• The observer can only see the emitted symbols
• Observation likelihood
– Given the observation sequence S and the model λ = (A, B, π), what is the probability P(S|λ) that the sequence was generated by that model?

• Being able to compute the probability of the observation sequence turns the HMM into a language model

Page 18:

Tasks with HMM

• Tasks
– Given λ = (A, B, π), find P(O|λ)
– Given O and λ, what is the most likely state sequence (X_1, …, X_{T+1})?
– Given O and a space of all possible λ, find the model that best describes the observations

• Decoding – most likely sequence
– tag each token with a label

• Observation likelihood
– classify sequences

• Learning
– train models to fit empirical data

Page 19:

Inference

• Find the most likely sequence of tags, given the sequence of words
– t* = argmax_t P(t|w)

• Given the model µ, it is possible to compute P(t|w) for all values of t
• In practice, there are way too many combinations
• Possible solution:
– Use beam search (partial hypotheses)
– At each state, only keep the k best hypotheses so far
– May not find the globally best sequence

Page 20:

Viterbi Algorithm

• Find the best path up to observation i and state s
• Characteristics
– Uses dynamic programming
– Memoization
– Backpointers
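A compact sketch of the algorithm over the model from Page 13, reusing the earlier container (end-state probabilities are left out, since the Page 13 parameters don't define them):

def viterbi(hmm, obs, start="G"):
    """Most likely state sequence for obs, by dynamic programming."""
    # best[t][s] = score of the best path ending in state s after t emissions
    best, backptr = [{start: 1.0}], [{}]
    for o in obs:
        scores, ptrs = {}, {}
        prev_row = best[-1]  # memoized scores from the previous time step
        for s in hmm.states:
            p = max(prev_row, key=lambda q: prev_row[q] * hmm.A[(q, s)])
            scores[s] = prev_row[p] * hmm.A[(p, s)] * hmm.B[(s, o)]
            ptrs[s] = p  # backpointer to the best predecessor
        best.append(scores)
        backptr.append(ptrs)
    # Follow the backpointers from the best final state.
    state = max(best[-1], key=best[-1].get)
    score, path = best[-1][state], [state]
    for t in range(len(obs), 1, -1):
        state = backptr[t][state]
        path.append(state)
    return list(reversed(path)), score

print(viterbi(hmm, ["y", "z"]))  # (['G', 'G'], 0.0128)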

Page 21:

HMM Trellis

[Figure: the trellis unrolled over four time steps: a start state, a row of G nodes and a row of H nodes, and an end state at each step; arcs carry transition labels such as P(H|G) and P(H|H) and emission labels such as P(y|G).]

Page 22:

HMM Trellis

[Figure: the same trellis reading the observation sequence "y z ."; the G node at t=1 is highlighted.]

P(G,t=1) = P(start) × P(G|start) × P(y|G)

Page 23:

HMM Trellis

[Figure: the same trellis with the H node at t=1 highlighted.]

P(H,t=1) = P(start) × P(H|start) × P(y|H)

Page 24:

HMM Trellis

[Figure: the same trellis with the H node at t=2 highlighted.]

P(H,t=2) = max(P(G,t=1) × P(H|G) × P(z|H),
               P(H,t=1) × P(H|H) × P(z|H))
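Plugging in the parameters from Page 13 (with P(start) = 1.0), these recurrences give concrete values:

P(G,t=1) = 1.0 × 1.0 × 0.2 = 0.2
P(H,t=1) = 1.0 × 0.0 × 0.5 = 0.0
P(H,t=2) = max(0.2 × 0.2 × 0.2, 0.0 × 0.4 × 0.2) = 0.008
P(G,t=2) = max(0.2 × 0.8 × 0.1, 0.0 × 0.6 × 0.1) = 0.016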

Page 25:

HMM Trellis

[Figure: the same trellis, again highlighting P(H,t=2).]

Page 26:

HMM Trellis

[Figure: the same trellis, with no new quantities labeled.]

Page 27:

HMM Trellis

[Figure: the same trellis with P(end,t=3), the end state after the final observation, highlighted.]

Page 28:

HMM Trellis

[Figure: the same trellis with P(end,t=3) highlighted.]

P(end,t=3) = max(P(G,t=2) × P(end|G),
                 P(H,t=2) × P(end|H))

Page 29:

HMM Trellis

[Figure: the same trellis with the best path traced back from the end state.]

P(end,t=3) = max(P(G,t=2) × P(end|G),
                 P(H,t=2) × P(end|H))

P(end,t=3) = best score for the sequence.
Use the backpointers to find the sequence of states.

Page 30:

Observation Likelihood

• Given multiple HMMs
– e.g., for different languages
– Which one is the most likely to have generated the observation sequence?

• Naïve solution
– try all possible state sequences

Page 31:

Forward Algorithm

• Dynamic programming method
– Computing a forward trellis that encodes all possible state paths
– Based on the Markov assumption that the probability of being in any state at a given time point only depends on the probabilities of being in all states at the previous time point
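A sketch of the forward trellis under the same start-in-G, transition-then-emit convention as the brute-force example on Page 14; it reproduces the 0.0332 computed there, but in time linear in the sequence length:

def forward(hmm, obs, start="G"):
    """Observation likelihood P(obs) via the forward trellis."""
    # alpha[s] = probability of emitting obs[:t] and being in state s at time t
    alpha = {s: hmm.A[(start, s)] * hmm.B[(s, obs[0])] for s in hmm.states}
    for o in obs[1:]:
        # The Markov assumption: each state's probability at time t depends only
        # on the probabilities of all states at time t-1.
        alpha = {s: sum(alpha[p] * hmm.A[(p, s)] for p in hmm.states) * hmm.B[(s, o)]
                 for s in hmm.states}
    return sum(alpha.values())

print(forward(hmm, ["y", "z"]))  # 0.0332, same as summing over all paths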

Page 32:

HMM Learning

• Supervised
– Training sequences are labeled

• Unsupervised
– Training sequences are unlabeled
– Known number of states

• Semi-supervised
– Some training sequences are labeled

Page 33:

Supervised HMM Learning

• Estimate the state transition probabilities using MLE
• Estimate the observation probabilities using MLE
• Use smoothing

a_ij = Count(q_t = s_i, q_{t+1} = s_j) / Count(q_t = s_i)

b_j(k) = Count(q_i = s_j, o_i = v_k) / Count(q_i = s_j)
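A sketch of these counts in Python (Counter is from the standard library; the smoothing the slide calls for is left out to keep it short):

from collections import Counter

def mle_estimate(labeled_seqs):
    """labeled_seqs: list of sequences of (state, observation) pairs.
    Returns unsmoothed MLE transition and emission probabilities."""
    trans, state_ctx = Counter(), Counter()
    emit, emit_ctx = Counter(), Counter()
    for seq in labeled_seqs:
        states = [s for s, _ in seq]
        for si, sj in zip(states, states[1:]):
            trans[(si, sj)] += 1   # Count(q_t = s_i, q_{t+1} = s_j)
            state_ctx[si] += 1     # Count(q_t = s_i)
        for s, o in seq:
            emit[(s, o)] += 1      # Count(q_i = s_j, o_i = v_k)
            emit_ctx[s] += 1       # Count(q_i = s_j)
    A = {pair: c / state_ctx[pair[0]] for pair, c in trans.items()}
    B = {pair: c / emit_ctx[pair[0]] for pair, c in emit.items()}
    return A, B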

Page 34:

Unsupervised HMM Training

• Given:
– observation sequences

• Goal:
– build the HMM

• Use EM (Expectation Maximization) methods
– forward-backward (Baum-Welch) algorithm
– Baum-Welch finds an approximate solution for P(O|µ)

Page 35:

Outline of Baum-Welch

• Algorithm
– Randomly set the parameters of the HMM
– Until the parameters converge, repeat:
• E step – determine the probability of the various state sequences for generating the observations
• M step – reestimate the parameters based on these probabilities

• Notes
– the algorithm guarantees that at each iteration the likelihood of the data P(O|µ) increases
– it can be stopped at any point and give a partial solution
– it converges to a local maximum
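A minimal sketch of this E/M loop for a discrete HMM, reusing the earlier container. Assumptions worth flagging: it uses the common convention where the state at t = 1 is drawn from π and every state emits; it expects the randomly initialized, strictly positive parameters the slide describes; and a real implementation would rescale α and β to avoid numeric underflow on long sequences:

def baum_welch(hmm, sequences, iterations=20):
    """Forward-backward (Baum-Welch) re-estimation of µ = (A, B, π)."""
    S, V = hmm.states, hmm.vocab
    for _ in range(iterations):
        # Accumulators for expected counts across all observation sequences.
        pi_new = {s: 0.0 for s in S}
        A_num = {(i, j): 0.0 for i in S for j in S}
        A_den = {s: 0.0 for s in S}
        B_num = {(s, v): 0.0 for s in S for v in V}
        B_den = {s: 0.0 for s in S}
        for obs in sequences:
            T = len(obs)
            # E step: forward (alpha) and backward (beta) probabilities.
            alpha = [{s: hmm.pi[s] * hmm.B[(s, obs[0])] for s in S}]
            for t in range(1, T):
                prev = alpha[-1]
                alpha.append({s: sum(prev[p] * hmm.A[(p, s)] for p in S)
                                 * hmm.B[(s, obs[t])] for s in S})
            beta = [{s: 1.0 for s in S}]
            for t in range(T - 2, -1, -1):
                nxt = beta[0]
                beta.insert(0, {s: sum(hmm.A[(s, n)] * hmm.B[(n, obs[t + 1])] * nxt[n]
                                       for n in S) for s in S})
            likelihood = sum(alpha[-1][s] for s in S)
            # gamma[t][s] = P(state s at time t | obs, current parameters)
            gamma = [{s: alpha[t][s] * beta[t][s] / likelihood for s in S}
                     for t in range(T)]
            for s in S:
                pi_new[s] += gamma[0][s]
            for t in range(T):
                for s in S:
                    B_num[(s, obs[t])] += gamma[t][s]
                    B_den[s] += gamma[t][s]
            for t in range(T - 1):
                for i in S:
                    A_den[i] += gamma[t][i]
                    for j in S:  # expected count of the i -> j transition at time t
                        A_num[(i, j)] += (alpha[t][i] * hmm.A[(i, j)]
                                          * hmm.B[(j, obs[t + 1])]
                                          * beta[t + 1][j] / likelihood)
        # M step: re-estimate the parameters from the expected counts.
        hmm.pi = {s: pi_new[s] / len(sequences) for s in S}
        hmm.A = {(i, j): A_num[(i, j)] / A_den[i] for i in S for j in S}
        hmm.B = {(s, v): B_num[(s, v)] / B_den[s] for s in S for v in V}
    return hmm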

Page 36:

NLP