Issues in Computational Linguistics: Ambiguity Management
Dick Crouch and Tracy King


Page 1: Issues in Computational Linguistics: Ambiguity Management
Dick Crouch and Tracy King

Page 2: The next two days

Today
– Mostly computational
– Generalizing charts for packing ambiguity

Tomorrow
– Mostly linguistic
– An introduction to glue semantics

Page 3: Several Types of Ambiguity

Phonetic:
– “I scream” or “ice cream”

Tokenization:
– “I like Jan.” --- |Jan| or |Jan.| (abbreviation of January)

Morphological:
– “walks” --- noun or verb
– “untieable knot” --- un(tieable) or (untie)able

Lexical:
– “bank” --- river bank or financial institution

Syntactic:
– “The turkeys are ready to eat” --- fattened or hungry

Semantic:
– “Two boys ate fifteen pizzas” --- 15 each or 15 total

Pragmatic:
– “Sue won. Ed gave her a good luck charm” --- cause or result

Page 4: Some more ambiguities

Women like chocolate more than men
Visiting relatives can be boring
– Visiting relatives is boring
– Visiting relatives are boring
Pets must be carried on escalator
Clothes must be worn in public
Warning: keep all body parts out of window

Page 5: PP Attachment: a classic example of syntactic ambiguity

PP adjuncts can attach to VPs and NPs
Strings of PPs in the VP are ambiguous
– I see the girl with the telescope.
  I see [the girl with the telescope].
  I see [the girl] [with the telescope].

Ambiguities proliferate exponentially
– I see the girl with the telescope in the park
  I see [the girl with (the telescope in the park)]
  I see [the (girl with the telescope) in the park]
  I see the girl [with the (telescope in the park)]
  I see the girl [with the telescope] [in the park]
  I see [the girl with the telescope] [in the park]
– The syntax has no way to determine the attachment, even if humans can.

Page 6: Coverage entails Ambiguity

I fell in the park.              (needs PP attachment to VP)
+ I know the girl in the park.   (needs PP attachment to NP)
= I see the girl in the park.    (a grammar covering both now licenses either attachment)

Page 7: Ambiguity: bug or feature?

Bug in computer programming languages
Feature in natural language
– People are good at resolving ambiguity in context
– Ambiguity consequently often goes unperceived, even though thousand-fold ambiguities are common
  » “Cable breakage damages the printer”
  » “Readjust paper holding clip”
– Ambiguity promotes conciseness
Computers can’t resolve ambiguity like humans

Page 8: Ambiguity can be explosive

If alternatives multiply within or across components…

[Diagram: processing pipeline Tokenize → Morphology → Syntax → Semantics → Discourse]

Page 9: Computational consequences of ambiguity

Serious problem for computational systems
– Broad coverage, hand-written grammars frequently produce thousands of analyses, sometimes millions
– Machine-learned grammars easily produce hundreds of thousands of analyses if allowed to parse to completion

Three approaches to ambiguity management:
– Pruning: block unlikely analysis paths early
– Procrastination: do not expand analysis paths that will lead to ambiguity explosion until something else requires them
  » Also known as underspecification
– Packing: compact representation and computation of all possible analyses

Page 10: The Problem with Pruning: premature disambiguation

The conventional approach: use heuristics to prune as soon as possible

[Diagram: pipeline Tokenize → Morphology → Syntax → Semantics → Discourse, with statistics pruning alternatives (marked X) at each stage]

Strong constraints may reject the so-far-best (= only) option
Fast computation, wrong result

Page 11: Issues in Computational Linguistics: Ambiguity Management Dick Crouch and Tracy King

The problem with procrastinationThe problem with procrastination passing the buckpassing the buck

Chunk parsing as an example:– Collect noun groups, verb groups, PP groups– Leave it to later processing to figure out the

correct way of putting these together– Not all combinations are grammatically acceptable

Later processing must either– Call parser to check grammatical constraints– Have its own model of grammatical constraints– In the best case, solve a set of constraints the

partial parser includes with its output

Page 12: The Problem with Packing

There may just be too many analyses to pack efficiently

A major problem for relatively unconstrained machine-induced grammars
– Grammars overgenerate massively
– Statistics used to prune out unlikely sub-analyses

Less of a problem for carefully hand-coded broad coverage grammars

Page 13: Outline of rest of talk

Packing
– Charts for context free grammars
– Generalizing free choice packing

Statistical pruning
– PCFGs
– Viterbi and inside-outside algorithms
– Lexicalized PCFGs

Packing and Pruning
– Maximum entropy disambiguation and avoiding premature choice

Page 14: Packing

Explosion of ambiguity results from a small number of sub-analyses combining in different ways to produce a large number of total analyses (e.g. PP attachment)

Compute and represent each sub-analysis just once
Compute a factored representation of how these sub-analyses combine

Charts for context free grammars are an example of (free choice) packing
– Compute and represent 2^N analyses in time and space N^3

Page 15: Charts

Sentence: saw the girl in the park with the telescope

Lexical edges: V:1  NP:2  P:3  NP:4  P:5  NP:6
PP:7  <P:3, NP:4>
PP:8  <P:5, NP:6>
NP:9  <NP:4, PP:8>
PP:10 <P:3, NP:9>
NP:11 <NP:2, PP:7>
NP:12 <NP:11, PP:8> or <NP:2, PP:10>
VP:13 <V:1, NP:2>
VP:14 <V:1, NP:11> or <VP:13, PP:7>
VP:15 <VP:14, PP:8> or <VP:13, PP:10> or <V:1, NP:12>

Page 16: Context Free Charts: Free choice structures

5 analyses in 15 components
– Each contains ~11 constituents
– Can count analyses without unpacking the chart

Exploits context-freeness to factor combinations:
– V:1 combines with NP:12 regardless of internal structure
– Chart packs all ambiguities of NP:12 together with no sacrifice of correctness
– VP:15 can freely choose any analysis of NP:12

But what happens if things are not context free?
– Can we preserve free choice?

(Chart from Page 15 repeated on the slide.)
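Counting the packed analyses is one operation that never requires unpacking. The sketch below is a minimal illustration written for this transcript (Python, not XLE's implementation): it encodes the chart from Page 15 as a shared forest, with edge names mirroring the slide; the daughters of VP:13 are assumed to be <V:1, NP:2> (the slide's "V1:3" looks like a typo).

```python
from functools import lru_cache

# Packed chart for "saw the girl in the park with the telescope".
# Each edge maps to its alternative analyses; each alternative is a
# tuple of daughter edges. Lexical edges have one daughterless analysis.
chart = {
    "V:1": [()], "NP:2": [()], "P:3": [()], "NP:4": [()], "P:5": [()], "NP:6": [()],
    "PP:7":  [("P:3", "NP:4")],
    "PP:8":  [("P:5", "NP:6")],
    "NP:9":  [("NP:4", "PP:8")],
    "PP:10": [("P:3", "NP:9")],
    "NP:11": [("NP:2", "PP:7")],
    "NP:12": [("NP:11", "PP:8"), ("NP:2", "PP:10")],
    "VP:13": [("V:1", "NP:2")],
    "VP:14": [("V:1", "NP:11"), ("VP:13", "PP:7")],
    "VP:15": [("VP:14", "PP:8"), ("VP:13", "PP:10"), ("V:1", "NP:12")],
}

@lru_cache(maxsize=None)
def count(edge):
    """Number of analyses packed under an edge: the sum over its
    alternatives of the product of the daughters' counts."""
    total = 0
    for daughters in chart[edge]:
        n = 1
        for d in daughters:
            n *= count(d)
        total += n
    return total

print(count("VP:15"))   # -> 5, without ever enumerating the five trees
```

The computation is linear in the size of the packed chart rather than in the number of readings, which is the point of the factored representation.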

Page 17: Generalizing Free Choice Packing

The sheep saw the fish. How many sheep? How many fish?

Options multiplied out:
The sheep-sg saw the fish-sg.
The sheep-pl saw the fish-sg.
The sheep-sg saw the fish-pl.
The sheep-pl saw the fish-pl.

Options packed:
The sheep-{sg|pl} saw the fish-{sg|pl}

In principle, a verb might require agreement of Subject and Object: have to check it out.
But English doesn’t do that: any combination of choices is OK.
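As a tiny illustration of the contrast (Python, invented for this transcript): the packed form stores 2 + 2 alternatives, and multiplying out only happens if some consumer insists on enumerating full readings.

```python
from itertools import product

# Packed, independent number choices for the two nouns.
packed = {"sheep": ["sg", "pl"], "fish": ["sg", "pl"]}

# Multiplying out recovers all 2 x 2 readings (exponential in general).
readings = [dict(zip(packed, combo)) for combo in product(*packed.values())]
print(len(readings))   # -> 4 readings from 2 + 2 packed alternatives
```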

Page 18: Dependent choices

Das Mädchen-nom sah die Katze-nom    bad
Das Mädchen-nom sah die Katze-acc    The girl saw the cat
Das Mädchen-acc sah die Katze-nom    The cat saw the girl
Das Mädchen-acc sah die Katze-acc    bad

Options packed:
Das Mädchen-{nom|acc} sah die Katze-{nom|acc}    The girl saw the cat / The cat saw the girl

Again, packing avoids duplication… but it’s wrong:
it doesn’t encode the dependencies, and the choices are not free.

Page 19: Solution: Label dependent choices

Das Mädchen-nom sah die Katze-nom    bad
Das Mädchen-nom sah die Katze-acc    The girl saw the cat
Das Mädchen-acc sah die Katze-nom    The cat saw the girl
Das Mädchen-acc sah die Katze-acc    bad

• Label each choice with distinct Boolean variables p, q, etc.
• Record acceptable combinations as a Boolean expression φ
• Each analysis corresponds to a satisfying truth-value assignment (a line from φ’s truth table that assigns it “true”)

Packed, with labeled choices:
Das Mädchen-[p: nom | ¬p: acc] sah die Katze-[q: nom | ¬q: acc]
φ = ¬(p ∧ q) ∧ ¬(¬p ∧ ¬q)

Page 20: Simplifying Truth Tables

Das Mädchen-[p: nom | ¬p: acc] sah die Katze-[q: nom | ¬q: acc]
φ = ¬(p ∧ q) ∧ ¬(¬p ∧ ¬q)

p  q | φ
1  1 | 0
1  0 | 1
0  1 | 1
0  0 | 0
(i.e. q = ¬p)

Substituting q = ¬p:
Das Mädchen-[p: nom | ¬p: acc] sah die Katze-[¬p: nom | p: acc]
φ = (p ∨ ¬p) = true

p | φ
1 | 1
0 | 1

Freely choose any line from the truth table.
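A compact sketch of the labeling idea (Python; all names invented for this transcript): each packed value carries a context over the Boolean variables, the constraint φ records which combinations are acceptable, and every satisfying assignment yields exactly one reading.

```python
from itertools import product

# Packed case values, each guarded by a context over the variables p, q.
packed = {
    "Mädchen": [("nom", lambda p, q: p), ("acc", lambda p, q: not p)],
    "Katze":   [("nom", lambda p, q: q), ("acc", lambda p, q: not q)],
}

# phi: acceptable combinations (exactly one of the two NPs is nominative).
phi = lambda p, q: not (p and q) and not (not p and not q)

# Each satisfying assignment of phi is one reading.
for p, q in product([True, False], repeat=2):
    if phi(p, q):
        reading = {noun: next(case for case, ctx in values if ctx(p, q))
                   for noun, values in packed.items()}
        print(p, q, reading)
# Two assignments survive, corresponding to
# "The girl saw the cat" and "The cat saw the girl".
```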

Page 21: (Almost) Free choice for PP attachment

[Tree diagram: a VP containing V, NP and three PPs (each P NP); the candidate attachment sites are labelled A1–A4, B1–B3 and C1–C2]

Let PPs freely attach to all NPs and VPs, and then rule out crossing attachments.

Independent choices:
A1 xor A2 xor A3 xor A4 = 1
B1 xor B2 xor B3 = 1
C1 xor C2 = 1

NoGoods (crossing attachments):
(A2 & B1), (A2 & C1), (A3 & B1), (A3 & B2), (B2 & C1)

xor: exclusive or
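A brute-force check of this choice space (Python, purely illustrative; a packed system would not enumerate): take the product of the three independent choices and filter out any combination containing a nogood pair.

```python
from itertools import product

# The three independent attachment choices from the slide.
choices = [["A1", "A2", "A3", "A4"],
           ["B1", "B2", "B3"],
           ["C1", "C2"]]

# NoGoods: pairs ruled out as crossing attachments.
nogoods = [{"A2", "B1"}, {"A2", "C1"}, {"A3", "B1"}, {"A3", "B2"}, {"B2", "C1"}]

def allowed(combo):
    """True if no nogood pair is contained in the combination."""
    return not any(ng <= set(combo) for ng in nogoods)

readings = [combo for combo in product(*choices) if allowed(combo)]
print(len(readings))   # -> 14 non-crossing analyses out of 4 * 3 * 2 = 24
```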

Page 22: Eliminating NoGoods / Simplifying Truth Tables

Starting from:
– A set of N independent m-way choices between Boolean variables
– And a set of nogoods constraining combinations of these variables

This defines:
– A truth table for a nogood function on the Boolean variables
– Where disallowed combinations of variables correspond to lines in the truth table assigning false to the nogood function

We can:
– Produce a corresponding free choice truth table
– Where no combinations of variables are disallowed

Page 23: Example of NoGood Elimination

[Tree diagram: a VP and an NP with two PPs; one PP has attachment choices B1/B2, the other C1/C2]

Before elimination:
B1 xor B2 = 1
C1 xor C2 = 1
NoGood: (B1 & C1)

B1 B2 C1 C2 | nogood function
1  0  1  0  | 0
1  0  0  1  | 1
0  1  1  0  | 1
0  1  0  1  | 1

Simplifies to

[Same tree, with the attachments relabelled: B1 under b1, B2 under b2, C1 under c1, C2 under (c2 ∨ b1)]

b1 xor b2 = 1
c1 xor c2 = 1 (a choice that exists only under b2)

b1 b2 c1 c2 | nogood function
1  0  0  0  | 1
0  1  1  0  | 1
0  1  0  1  | 1

Page 24: The Free Choice Gamble

Eliminating nogoods increases the number of Boolean choice variables.
– The more interactions between choices, the greater the number of additional variables

Worst case, where everything interacts:
– As many choice variables as there are readings
– Packing blows up and becomes exponential

Best case, no interactions:
– N completely independent choices represent 2^N readings

Language interactions are mostly limited and local
– Tends towards the best case
– Free choice packing pays off for linguistic analysis

Page 25: Advantages of Free Choice Packing

Avoids procrastination
– Nogoods are constraints that the parser sends to other components to check when using packed structures
– Eliminating nogoods: other components don’t do the parser’s work

Independence between choices allows processing that relies on independence assumptions
– Counting the number of readings
  » Apparently trivial but of crucial importance, since statistical modelling requires the ability to count
– Hence, statistical processing

A general mechanism extending beyond parsing

Page 26: Parsing in the XLE: Extending free choice packing

C-structure parsing produces a context free chart
– Already a free choice packed structure

F-structure constraints applied to the c-structure chart
– Rule out certain c-structures through unification failures (i.e. contradictions in a logic of equality)
– Need to impose additional nogoods on the chart

How do we get the f-structure logic of equality to
– Operate within the free choice structure of the chart
– Impose additional nogoods on the structure?

That is, we know how to pack context free grammars, but how do we pack a logic of equality?

Page 27: How to Pack a Logic

Let φ, ψ be formulas from the logic you want to pack.
Goal: ambiguity-enable components to defer choosing and unpacking.

1. New choice: φ ∨ ψ is satisfiable iff (p → φ) ∧ (¬p → ψ) is satisfiable (p a new Boolean variable)
   Proof: (If) If φ is true, let p be true, in which case (p → φ) ∧ (¬p → ψ) is true; symmetrically for ψ with p false.
   (Only if) If p is true, then φ is true, in which case φ ∨ ψ is true; symmetrically if p is false, ψ holds.

2. Ambiguity-enabled inference (by trivial logic): if φ, ψ ⊢ χ is a rule of inference, then (C1 → φ), (C2 → ψ) ⊢ (C1 ∧ C2 → χ) is also valid
   E.g. substitution of equals for equals: x=y, φ ⊢ φ[x/y] is a rule of inference
   Therefore: (C1 → x=y), (C2 → φ) ⊢ (C1 ∧ C2 → φ[x/y])

3. Nogoods: if φ is contradictory and C1 → φ, then nogood(C1).

Page 28: Packing Equalities in F-structure

“Visiting relatives is boring”

[Packed c-structure: the subject “visiting relatives” is either a gerund VP (choice A1, NUM=sg) or an NP “[Adj visiting] relatives” (choice A2, NUM=pl); the verb “is” requires SUBJ NUM=sg]

Contexted f-structure constraints:
1:  SUBJ NUM = sg     (from “is”; 1 is the always-true context)
A1: SUBJ NUM = sg
A2: SUBJ NUM = pl

Ambiguity-enabled inference:
1 & A1 ⊢ sg = sg
1 & A2 ⊢ sg = pl   ⟹ nogood(A2)
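A toy version of this step (Python; helper names and the data layout are invented for this transcript): facts carry contexts, facts about the same attribute are combined under the conjunction of their contexts, and a value clash turns the conjoined context into a nogood.

```python
from itertools import combinations

# Contexted facts: (context, attribute, value). "1" is the always-true context;
# A1/A2 are the two packed analyses of "visiting relatives".
facts = [
    ("1",  "SUBJ NUM", "sg"),   # from "is"
    ("A1", "SUBJ NUM", "sg"),   # gerund reading
    ("A2", "SUBJ NUM", "pl"),   # plural-NP reading
]

def conjoin(c1, c2):
    """Conjunction of two contexts, treating '1' as true."""
    return c2 if c1 == "1" else c1 if c2 == "1" else f"{c1} & {c2}"

# Ambiguity-enabled equality reasoning: a clash between two facts about
# the same attribute makes the conjunction of their contexts a nogood.
nogoods = {conjoin(c1, c2)
           for (c1, a1, v1), (c2, a2, v2) in combinations(facts, 2)
           if a1 == a2 and v1 != v2}

print(nogoods)   # -> {'A2', 'A1 & A2'}: any analysis using A2 is ruled out
```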

Page 29: Ambiguity Enabled Language Processing

Extend free choice packing to other components in language processing, e.g.
– Structure transfer for machine translation
– Generation
– Semantic interpretation
– Reasoning (?)

A single, uniform mechanism for managing ambiguity within and across module boundaries
Allows deferment of disambiguation choices across module boundaries
Efficient, provided choices are relatively independent

Page 30: Pruning: Statistical Parsing

Probabilistic Context Free Grammars (PCFGs)
Viterbi parsing of PCFGs
Inside-Outside training of PCFGs
Lexicalized PCFGs
Disambiguation of packed structures in XLE
– Maximum entropy models

Page 31: PCFGs

Context free grammar with probabilities assigned to rules
– Probabilities of all rules rewriting a non-terminal N sum to 1

Probability of a tree
– Product of the probabilities of all the rules used to derive the tree

Probability of a sentence
– Sum of the probabilities of all trees assigned to the sentence by the grammar

Independence assumption: all rules are statistically independent
– Choice of rule used to analyze a subconstituent does not influence the choice of rule used to embed it in a larger constituent
– Empirically implausible assumption (hence lexicalized PCFGs)
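These definitions translate directly into code. The sketch below (Python, with an invented toy grammar) computes a tree's probability as the product of its rule probabilities; summing tree_prob over all parses of a sentence gives the sentence probability.

```python
from math import prod

# Toy PCFG: for each nonterminal, the probabilities of its rules sum to 1.
rule_prob = {
    ("S",  ("NP", "VP")): 1.0,
    ("NP", ("NP", "PP")): 0.3,
    ("NP", ("dogs",)):    0.4,
    ("NP", ("parks",)):   0.3,
    ("VP", ("VP", "PP")): 0.2,
    ("VP", ("bark",)):    0.8,
    ("PP", ("in", "NP")): 1.0,
}

def tree_prob(tree):
    """P(tree) = product of the probabilities of the rules it uses.
    A tree is (label, children); a leaf word is a plain string."""
    if isinstance(tree, str):
        return 1.0
    label, children = tree
    rhs = tuple(c if isinstance(c, str) else c[0] for c in children)
    return rule_prob[(label, rhs)] * prod(tree_prob(c) for c in children)

# One parse of "dogs bark in parks" (the PP attached to the VP).
t = ("S", [("NP", ["dogs"]),
           ("VP", [("VP", ["bark"]),
                   ("PP", ["in", ("NP", ["parks"])])])])
print(tree_prob(t))   # 1.0 * 0.4 * 0.2 * 0.8 * 1.0 * 0.3 = 0.0192
# P(sentence) = sum of tree_prob over all parses of the sentence.
```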

Page 32: Viterbi Parsing of PCFGs

How do you find the most probable parse of a sentence without inspecting all of its parses?

Observation:
– The most probable parse of a constituent is the most probable combination of
  » a rule
  » the most probable subconstituents matching the rule

Viterbi parsing (dynamic programming)
– For each word span i-j and non-terminal N, record only the most probable parse of i-j as N
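A minimal sketch of that dynamic program (Python, assuming binary rules only; unary rules such as the NP → VP rule in the next slide's grammar would need an extra step). Each cell keeps just the best probability per nonterminal.

```python
def viterbi_parse(words, lexicon, rules):
    """lexicon: {(N, word): prob}. rules: {(N, A, B): prob} for binary rules N -> A B.
    Returns the best probability per nonterminal for the whole sentence."""
    n = len(words)
    best = {}                                    # best[(i, j)][N] = max prob of words[i:j] as N
    for i, w in enumerate(words):
        best[(i, i + 1)] = {N: p for (N, word), p in lexicon.items() if word == w}
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            j = i + span
            cell = {}
            for k in range(i + 1, j):            # split point
                for (N, A, B), p in rules.items():
                    if A in best[(i, k)] and B in best[(k, j)]:
                        q = p * best[(i, k)][A] * best[(k, j)][B]
                        if q > cell.get(N, 0.0): # keep only the most probable analysis
                            cell[N] = q
            best[(i, j)] = cell
    return best[(0, n)]
```

Backpointers recording which rule and split point won each cell would be stored alongside the probabilities in order to read the best tree back out.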

Page 33: Example of Viterbi Parsing (charts with probabilities)

Grammar:
N → Adj N 0.1      NP → N 0.9     VP → V NP 1       Adj → old 0.1
N → relatives 0.1  NP → VP 0.1    V → visiting 0.1  Adj → visiting 0.01
…

Chart for “visiting old relatives”:
Span 1-1: V 0.1, Adj 0.01
Span 2-2: Adj 0.1
Span 3-3: N 0.1, NP 0.09
Span 2-3: N 0.001, NP 0.0009
Span 1-3: VP 9×10⁻⁵, NP 9×10⁻⁶ (NP → VP), N 1×10⁻⁶, NP 9×10⁻⁷ (NP → N)

Page 34: Training: Expectation Maximization

Start with a set of rules and an initial estimate of rule probabilities
– e.g. rule counts from an annotated corpus, or random assignment
– For PCFGs, results are extremely sensitive to initial estimates

Adjust rule probabilities to maximize the probability of the sentences in an (unannotated) corpus
– Parse the corpus with the existing rules and calculate sentence probabilities
– Reassign rule probabilities based on rule counts weighted by the probabilities of the sentences in which those rules occurred
– Iterate until the probabilities settle/converge

Requires an efficient way of calculating rule probabilities weighted by sentence probability
– Inside-Outside algorithm

Page 35: Inside and Outside Probabilities

Inside probability: IN_N(j,k) = P(w_j..k | N spans j..k)
– Probability that a string of words expands N

Outside probability: OUT_N(j,k) = P(w_1..j-1, N spanning j..k, w_k+1..l)
– Probability of having an expansion of N between the surrounding sequences of words

[Diagram: sentence w_1 … w_l with N dominating the span j..k; IN covers the material under N, OUT the material outside it]

Page 36: Calculating Inside & Outside Probabilities

Compute inside probabilities bottom up
– Sum inside probabilities of all chart edges dominated by an N-rule

Compute outside probabilities top down
– From the outside probabilities of dominating constituents in the chart

Can show that the weighted rule count is

C(N → A B) = (1 / P(w_1..n)) · Σ_{k,l,m} OUT_N(k,l) · P(N → A B) · IN_A(k,m) · IN_B(m+1,l)
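The inside pass is just the Viterbi recursion with max replaced by sum. A sketch (Python, same input format as the earlier Viterbi sketch, binary rules assumed):

```python
def inside(words, lexicon, rules):
    """IN[(i, j)][N] = P(words[i:j] | N), summed over all analyses of the span."""
    n = len(words)
    IN = {}
    for i, w in enumerate(words):
        IN[(i, i + 1)] = {N: p for (N, word), p in lexicon.items() if word == w}
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            j = i + span
            cell = {}
            for k in range(i + 1, j):            # split point
                for (N, A, B), p in rules.items():
                    if A in IN[(i, k)] and B in IN[(k, j)]:
                        cell[N] = cell.get(N, 0.0) + p * IN[(i, k)][A] * IN[(k, j)][B]
            IN[(i, j)] = cell
    return IN
```

The outside pass runs top down over the same chart; together the two give the expected rule counts in the formula above.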

Page 37: Lexicalized PCFGs

Probabilities of a rule conditioned on the lexical heads of its daughters/RHS

Conceptually, each non-terminal N becomes a complex <N, LexicalHead> category
– NP(man) → Det(the) N(man)
– Done more efficiently in practice

Major problems with data sparseness
– Unlikely to see NP(zygote) → Det(the) N(zygote) in training
– Various smoothing techniques to assign reasonable non-zero probabilities to examples unseen in training

Page 38: Packing & Pruning in XLE

XLE produces (too) many candidates
– All valid (with respect to the grammar and OT marks)
– Not all equally likely
– Some applications require a single best guess

Grammar writer can’t specify correct choices
– Many implicit properties of words and structures with unclear significance

Appeal to a probability model to choose the best parse
– Assume: previous experience is a good guide for future decisions
– Collect a corpus of training sentences, build a probability model that optimizes for previous good results
– Apply the model to choose the best analysis of new sentences

Page 39: Issues

What kind of probability model?
What kind of training data?
Efficiency of training, efficiency of disambiguation?
Benefit vs. random choice of parse

Page 40: Probability model

Conventional models: stochastic branching process
– Probabilistic Context-Free Grammars
– Sequence of decisions, each independent of previous decisions, each choice having a certain probability
  » PCFG: choose from alternative expansions of a given category
– Probability of an analysis = product of choice probabilities

Efficient algorithms
– Training: inside/outside
– Disambiguation: Viterbi

Abney 1997 and others: not appropriate for LFG, HPSG…
– Choices are not independent: information from different CFG branches interacts through the f-structure
– Probability models are biased (don’t make the right choices on the training set)

Page 41: Exponential models are appropriate (aka Maximum Entropy or Log-linear models)

Assign probabilities to representations, not to choices in a derivation
No independence assumption
Arithmetic combined with human insight
– Human:
  » Define properties of representations that may be relevant
  » Based on any computable configuration of features, trees
– Arithmetic:
  » Train to figure out the weight of each property

Page 42:
Page 43:
Page 44: What does this mean??

Define yes-no / 1-0 features, f, that seem important
Training determines weights, λ, on these features, to reflect their actual importance
Select parse x: count occurrences of features (0, 1) and multiply by the corresponding weights, λ·f(x)
Convert weighted feature counts to probabilities:

P(x) = exp(λ·f(x)) / Σ_X exp(λ·f(X))
       (numerator: un-normalized probability; denominator: normalizing factor)
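Numerically this is a two-line computation. A small sketch (Python; the feature names and weights are invented):

```python
from math import exp

# Candidate parses of one sentence, each described by its 0/1 features.
features = {
    "parse1": {"pp_attaches_to_vp": 1, "coord_parallel": 1},
    "parse2": {"pp_attaches_to_vp": 0, "coord_parallel": 1},
}
weights = {"pp_attaches_to_vp": 0.8, "coord_parallel": 0.3}

def score(f):
    """lambda . f(x): sum of the weights of the features that fire."""
    return sum(weights[name] * value for name, value in f.items())

Z = sum(exp(score(f)) for f in features.values())            # normalizing factor
probs = {x: exp(score(f)) / Z for x, f in features.items()}  # P(x)
print(probs)   # parse1 gets the higher probability: its extra feature fires
```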

Page 45: Entropy and Exponentials

Sparse and interdependent training data
– Not enough parse/feature combinations to uniquely determine feature weights consistent with the training data
– Features are not independent

Entropy: the average number of bits required to communicate a result, under an optimal coding scheme
– Optimal coding: the two most probable results encoded in 1 bit, the next 4 in 2 bits, etc.
– Dependencies between features: if f1 makes f2 more likely, the encoding of f2 can be shortened when the code for f1 is present

Maximize the entropy of the training data to smooth dependent feature weights

Entropy defined as (hence the exponential):
H(p) = -Σ_x p(x) · log p(x)

Page 46:
Page 47: And what does that mean??

A variant of an Expectation-Maximization (EM) algorithm (conjugate-gradient minimization) is used to compute the weight distribution with maximum entropy

The inside-outside algorithm is used to calculate all inside and outside probabilities as determined by the feature weights
Page 48: Efficiency

Property counts
– Associated with the Boolean tree of XLE contexts (a1, b2)
– Shared among many parses

Training
– Inside/outside algorithm of PCFGs, but applied to the Boolean tree, not the parse tree
– Fast algorithm for choosing the best properties
– Can train on sentences with relatively low ambiguity
– 5 hours to train over the WSJ (given a file of parses)

Disambiguation
– Viterbi algorithm applied to the Boolean tree
– 5% of parse time to disambiguate
– 30% gain in F-score

Page 49: Conclusions

Free choice packing
– Extending charts beyond parsing context free grammars

Efficient if there is limited interaction between alternatives
– Generally true for natural language processing
– Though it sometimes hits bad cases

Uniform ambiguity management architecture across all language processing modules
– Not forced to make premature disambiguation choices at module boundaries

Statistical disambiguation of packed representations

Page 50: Issues in Computational Linguistics: Ambiguity Management (Dick Crouch and Tracy King)