
Page 1:

Probabilistic Context Free Grammars

Grant Schindler

8803-MDM

April 27, 2006

Page 2:

Problem

PCFGs can model a more powerful class of languages than HMMs. Can we take advantage of this property?

[Figure: the Chomsky hierarchy (Regular ⊂ Context Free ⊂ Context Sensitive ⊂ Unrestricted), with Hidden Markov Models (HMMs) paired with regular languages and Probabilistic Context Free Grammars (PCFGs) paired with context-free languages.]
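For example, the language a^n b^n (n a's followed by n b's) is context-free but not regular, so a two-rule PCFG captures it exactly while no HMM can. A minimal Python sketch (this illustrative grammar is my own example, not from the slides):

    import random

    def sample_anbn(p_recurse=0.5):
        """Sample from the PCFG  S -> a S b (0.5) | empty (0.5),
        which generates exactly the strings a^n b^n."""
        if random.random() < p_recurse:
            return "a" + sample_anbn(p_recurse) + "b"
        return ""

    # Every sample has matched counts of a's and b's -- a constraint that
    # an HMM's finite state memory cannot enforce for unbounded n.
    print([sample_anbn() for _ in range(5)])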

Page 3:

PCFG Background

Production Rule:

<Left-Hand Side> → <Right-Hand Side> (Probability)

Example Grammar:

S → N V (1.0)
N → Bob (0.3) | Jane (0.7)
V → V N (0.4) | loves (0.6)

Example Parse: "Jane loves Bob."

[Parse tree: S → N V; N → Jane; V → V N; V → loves; N → Bob]
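To make this concrete, the probability of a parse is the product of the probabilities of all rules used in it. A minimal Python sketch, using the example grammar above (the dictionary encoding is my own):

    # Rule probabilities from the example grammar above.
    RULES = {
        ("S", ("N", "V")): 1.0,
        ("N", ("Bob",)): 0.3,
        ("N", ("Jane",)): 0.7,
        ("V", ("V", "N")): 0.4,
        ("V", ("loves",)): 0.6,
    }

    def parse_probability(rules_used):
        """Probability of a parse tree = product of its rule probabilities."""
        p = 1.0
        for rule in rules_used:
            p *= RULES[rule]
        return p

    # Parse of "Jane loves Bob": S -> N V, N -> Jane, V -> V N, V -> loves, N -> Bob
    tree = [("S", ("N", "V")), ("N", ("Jane",)), ("V", ("V", "N")),
            ("V", ("loves",)), ("N", ("Bob",))]
    print(parse_probability(tree))  # 1.0 * 0.7 * 0.4 * 0.6 * 0.3 = 0.0504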

Page 4:

PCFG Applications

• Natural Language Processing: parsing written sentences

• Bioinformatics: RNA sequences

• Stock Markets: model rise/fall of the Dow Jones (?)

• Computer Vision: parsing architectural scenes

Page 5:

PCFG Application: Architectural Facade Parsing

Page 6:

Goal: Inferring 3D Semantic Structure

Page 7:

Discrete vs. Continuous Observations

• Natural Language Processing: parsing written sentences (discrete values)

• Bioinformatics: RNA sequences (discrete values)

• Stock Markets: model rise/fall of the Dow Jones (?) (continuous values)

How do we estimate the parameters of PCFGs with continuous observation densities (terminal nodes in the parse tree)?

Page 8:

PCFG Parameter Estimation

In the discrete case, there exists an Expectation Maximization (EM) algorithm:

E-Step: Compute the expected number of times each rule (A → BC) is used in generating a given set of observation sequences (based on the previous parameter estimates).

M-Step: Update parameters as normalized counts computed in E-Step.

Essentially: P*(N → Bob) = #Bobs / #Nouns
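The M-step is therefore just count normalization. A minimal Python sketch of that step, assuming the expected counts have already been produced by an Inside-Outside E-step (the counts below are made-up placeholders):

    from collections import defaultdict

    def m_step(expected_counts):
        """Re-estimate rule probabilities as normalized expected counts:
        P*(A -> alpha) = E[count(A -> alpha)] / E[count(A used at all)]."""
        totals = defaultdict(float)
        for (lhs, _), count in expected_counts.items():
            totals[lhs] += count
        return {rule: count / totals[rule[0]]
                for rule, count in expected_counts.items()}

    # Hypothetical expected counts from an E-step:
    counts = {("N", ("Bob",)): 12.0, ("N", ("Jane",)): 28.0}
    print(m_step(counts))  # {('N', ('Bob',)): 0.3, ('N', ('Jane',)): 0.7}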

Page 9:

Gaussian Parameter Update Equations

NEW!

The update equations are weighted by the probability that rule A was applied to generate the observed value at location i, computed from the Inside-Outside algorithm via the CYK algorithm.
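The equations themselves did not survive the transcript. Assuming the standard EM updates for a Gaussian emission density attached to rule A, they would take the form

    \mu_A = \frac{\sum_i \gamma_A(i)\, x_i}{\sum_i \gamma_A(i)},
    \qquad
    \sigma_A^2 = \frac{\sum_i \gamma_A(i)\, (x_i - \mu_A)^2}{\sum_i \gamma_A(i)},

where x_i is the observed value at location i and \gamma_A(i) is exactly the Inside-Outside quantity described above.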

Page 10:

Significance

We can now begin applying probabilistic context-free grammars to problems with continuous data (e.g., stock market data) rather than restricting ourselves to discrete outputs (e.g., natural language, RNA).

We hope to find problems for which PCFGs offer a better model than HMMs.

Page 11:

Questions

Page 12:
Page 13:

Open Problems

How do we estimate the parameters of PCFGs with:

A. continuous observation densities (terminal nodes in the parse tree)?

B. continuous values for both non-terminal and terminal nodes?

Page 14:

CYK Algorithm

Inside-Outside Probabilities
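Only these headings survive in the transcript. As a sketch of what the inside pass computes, here is a CYK-style dynamic program over the Page 3 grammar, which is already in Chomsky Normal Form (function and variable names are my own):

    def inside_probabilities(words, lexical, binary, nonterminals):
        """Inside pass: beta[i][j][A] = P(A derives words[i..j]).
        lexical: {(A, word): prob}; binary: {(A, B, C): prob}; grammar in CNF."""
        n = len(words)
        beta = [[{A: 0.0 for A in nonterminals} for _ in range(n)] for _ in range(n)]
        for i, w in enumerate(words):                  # length-1 spans
            for A in nonterminals:
                beta[i][i][A] = lexical.get((A, w), 0.0)
        for span in range(2, n + 1):                   # longer spans, bottom up
            for i in range(n - span + 1):
                j = i + span - 1
                for k in range(i, j):                  # split point
                    for (A, B, C), p in binary.items():
                        beta[i][j][A] += p * beta[i][k][B] * beta[k + 1][j][C]
        return beta

    # The example grammar from Page 3:
    lexical = {("N", "Jane"): 0.7, ("N", "Bob"): 0.3, ("V", "loves"): 0.6}
    binary = {("S", "N", "V"): 1.0, ("V", "V", "N"): 0.4}
    beta = inside_probabilities("Jane loves Bob".split(), lexical, binary, {"S", "N", "V"})
    print(beta[0][2]["S"])  # P(S derives "Jane loves Bob") = 0.0504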