
15-505: Lecture 11

Generative Models for Text Classification and Information Extraction

Kamal Nigam

Some slides from William Cohen, Andrew McCallum

Text Classification by Example

How could you build a text classifier?

• Take some ideas from machine learning
  – Supervised learning setting
  – Examples of each class (a few or thousands)

• Take some ideas from machine translation
  – Generative models
  – Language models

• Simplify each and stir thoroughly


Basic Approach of Generative Modeling

1. Pick representation for data

2. Write down probabilistic generative model

3. Estimate model parameters with training data

4. Turn model around to calculate unknown values for new data


Naïve Bayes: Bag of Words Representation

Corn prices rose today while corn futures dropped in surprising trading activity. Corn ...

All words in dictionary, with occurrence counts:

  activity  1
  cable     0
  corn      3
  damp      0
  drawer    0
  dropped   1
  elbow     0
  earning   0
  …         …
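A minimal sketch of this representation (the vocabulary and tokenization here are simplified for illustration):

    import re
    from collections import Counter

    def bag_of_words(text, vocabulary):
        """Map a document to occurrence counts over a fixed dictionary."""
        tokens = re.findall(r"[a-z]+", text.lower())
        counts = Counter(tokens)
        # Every dictionary word gets a count; word order is discarded.
        return {word: counts[word] for word in vocabulary}

    vocab = ["activity", "cable", "corn", "damp", "drawer", "dropped", "elbow", "earning"]
    doc = "Corn prices rose today while corn futures dropped in surprising trading activity. Corn ..."
    print(bag_of_words(doc, vocab))
    # {'activity': 1, 'cable': 0, 'corn': 3, 'damp': 0, 'drawer': 0, 'dropped': 1, 'elbow': 0, 'earning': 0}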


Naïve Bayes: Mixture of Multinomials Model

1. Pick the class: P(class)

2. For every word, pick from the class urn: P(word|class)

[Figure: two word urns. The COMPUTERS urn holds words such as "the", "in", "web", "windows", "java", "modem", "again"; the SPORTS urn holds words such as "the", "while", "soccer", "polo", "ball", "dropped", "activity".]

Word independence assumption!
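A sketch of this two-step generative story; the class priors and urn contents below are invented for illustration:

    import random

    # Hypothetical parameters: P(class) and P(word|class) for two class "urns".
    priors = {"COMPUTERS": 0.5, "SPORTS": 0.5}
    urns = {
        "COMPUTERS": {"the": 0.3, "in": 0.2, "web": 0.2, "windows": 0.2, "java": 0.05, "modem": 0.05},
        "SPORTS":    {"the": 0.3, "ball": 0.2, "soccer": 0.2, "polo": 0.15, "dropped": 0.15},
    }

    def generate_doc(length):
        # 1. Pick the class: P(class)
        cls = random.choices(list(priors), weights=priors.values())[0]
        # 2. For every word, pick from the class urn: P(word|class).
        #    Each draw is independent of the others (word independence assumption).
        words = random.choices(list(urns[cls]), weights=urns[cls].values(), k=length)
        return cls, " ".join(words)

    print(generate_doc(8))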


Naïve Bayes: Estimating Parameters

• Just like estimating biased coin flip probabilities

• Estimate MAP word probabilities:

  P(word|class) = (1 + N(word, class)) / (|Vocab| + Σ_w N(w, class))

  where N(word, class) is the number of occurrences of word in training documents of that class

• Estimate MAP class priors:

  P(class) = (1 + N(class)) / (|Classes| + N(docs))

  where N(class) is the number of training documents in the class and N(docs) is the total number of training documents
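A sketch of these smoothed counts in code, assuming documents arrive as token lists (function and variable names are mine):

    from collections import Counter

    def train_naive_bayes(docs, labels, vocab):
        """MAP estimates with add-one (Laplace) smoothing, as in the formulas above."""
        vocab = set(vocab)
        classes = sorted(set(labels))
        word_counts = {c: Counter() for c in classes}
        class_counts = Counter(labels)
        for doc, c in zip(docs, labels):
            word_counts[c].update(w for w in doc if w in vocab)

        # P(word|class) = (1 + N(word, class)) / (|Vocab| + sum_w N(w, class))
        p_word = {c: {w: (1 + word_counts[c][w]) /
                         (len(vocab) + sum(word_counts[c].values()))
                      for w in vocab}
                  for c in classes}
        # P(class) = (1 + N(class)) / (|Classes| + N(docs))
        p_class = {c: (1 + class_counts[c]) / (len(classes) + len(docs))
                   for c in classes}
        return p_class, p_word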


Naïve Bayes: Performing Classification

• Word independence assumption:

  P(doc|class) = Π_{word ∈ doc} P(word|class)

• Take the class with the highest probability:

  P(class|doc) = P(doc|class) P(class) / P(doc)
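Continuing the training sketch above, classification is an argmax over classes; P(doc) is the same for every class, so it can be dropped:

    import math

    def classify(doc, p_class, p_word):
        """argmax over classes of P(class) * prod of P(word|class), in log space."""
        scores = {}
        for c in p_class:
            score = math.log(p_class[c])
            for w in doc:
                if w in p_word[c]:          # ignore out-of-vocabulary words
                    score += math.log(p_word[c][w])
            scores[c] = score
        return max(scores, key=scores.get)

Summing logs rather than multiplying probabilities avoids the numeric underflow mentioned in the rules of thumb below.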


Classification Tricks of the Trade

• Stemming
  – run, runs, running, ran → run
  – table, tables, tabled → table
  – computer, compute, computing → compute

• Stopwords
  – Very frequent function words, generally uninformative
  – if, in, the, like, …

• Information gain feature selection
  – Keep just the most indicative words in the vocabulary


Naïve Bayes Rules of Thumb

• Need hundreds of labeled examples per class for good performance (~85% accuracy)
• Stemming and stopwords may or may not help
• Feature selection may or may not help
• Predicted probabilities will be very extreme
• Use sum of logs instead of multiplying probabilities, for underflow prevention
• Coding this up is trivial, either as a mapreduce or not


Information Extraction with Generative Models


Example: A Problem

• Genomics job
• Mt. Baker, the school district
• Baker Hostetler, the company
• Baker, a job opening


Example: A Solution


Job Openings:
  Category = Food Services
  Keyword = Baker
  Location = Continental U.S.


Extracting Job Openings from the Web

Title: Ice Cream Guru

Description: If you dream of cold creamy…

Contact: [email protected]

Category: Travel/Hospitality

Function: Food Services


Potential Enabler of Faceted Search


Lots of Structured Information in Text


IE from Research Papers


What is Information Extraction?

• Recovering structured data from formatted text
  – Identifying fields (e.g. named entity recognition)
  – Understanding relations between fields (e.g. record association)
  – Normalization and deduplication

• Today, focus on field identification


IE Posed as a Machine Learning Task

• Training data: documents marked up with ground truth

• In contrast to text classification, local features are crucial. Features of:
  – Contents
  – Text just before item
  – Text just after item
  – Begin/end boundaries

  … 00 : pm Place : [Wean Hall Rm 5409] Speaker : Sebastian Thrun …
         prefix          contents                 suffix


Good Features for Information Extraction

Example word features:
  – identity of word
  – is in all caps
  – ends in “-ski”
  – is part of a noun phrase
  – is in a list of city names
  – is under node X in WordNet or Cyc
  – is in bold font
  – is in hyperlink anchor
  – features of past & future
  – last person name was female
  – next two words are “and Associates”

begins-with-number

begins-with-ordinal

begins-with-punctuation

begins-with-question-word

begins-with-subject

blank

contains-alphanum

contains-bracketed-number

contains-http

contains-non-space

contains-number

contains-pipe

contains-question-mark

contains-question-word

ends-with-question-mark

first-alpha-is-capitalized

indented

indented-1-to-4

indented-5-to-10

more-than-one-third-space

only-punctuation

prev-is-blank

prev-begins-with-ordinal

shorter-than-30

Creativity and Domain Knowledge Required!
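For illustration, a sketch of a few of the line-level features above as simple predicates (naming and details are mine; the original systems' code is not shown in the slides):

    import re

    def line_features(line):
        """Compute a handful of the boolean line features listed above."""
        stripped = line.strip()
        return {
            "blank": stripped == "",
            "begins-with-number": stripped[:1].isdigit(),
            "begins-with-punctuation": bool(re.match(r"[^\w\s]", stripped)),
            "contains-http": "http" in stripped,
            "contains-question-mark": "?" in stripped,
            "ends-with-question-mark": stripped.endswith("?"),
            "first-alpha-is-capitalized": next((c.isupper() for c in stripped if c.isalpha()), False),
            "indented": line.startswith((" ", "\t")),
            "shorter-than-30": len(stripped) < 30,
        }

    print(line_features("  Where is Wean Hall?"))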

Good Features for Information Extraction

Is Capitalized

Is Mixed Caps

Is All Caps

Initial Cap

Contains Digit

All lowercase

Is Initial

Punctuation

Period

Comma

Apostrophe

Dash

Preceded by HTML tag

Character n-gram classifier says string is a person name (80% accurate)

In stopword list (the, of, their, etc)

In honorific list (Mr, Mrs, Dr, Sen, etc)

In person suffix list (Jr, Sr, PhD, etc)

In name particle list (de, la, van, der, etc)

In Census lastname list; segmented by P(name)

In Census firstname list; segmented by P(name)

In locations lists (states, cities, countries)

In company name list (“J. C. Penny”)

In list of company suffixes (Inc, & Associates, Foundation)

Word Features
  – lists of job titles
  – lists of prefixes
  – lists of suffixes
  – 350 informative phrases

HTML/Formatting Features
  – {begin, end, in} × {<b>, <i>, <a>, <hN>} × {lengths 1, 2, 3, 4, or longer}
  – {begin, end} of line

Creativity and Domain Knowledge Required!


Landscape of ML Techniques for IE:

Any of these models can be used to capture words, formatting or both.

• Classify Candidates: run a classifier over each candidate string in “Abraham Lincoln was born in Kentucky.” and ask: which class?

• Sliding Window: classify every window of text, trying alternate window sizes.

• Boundary Models: classify positions as BEGIN or END of a field, then pair them up.

• Finite State Machines: find the most likely state sequence for the whole sentence.

• Wrapper Induction: learn and apply a formatting pattern for a website, e.g. in “<b><i>Abraham Lincoln</i></b> was born in Kentucky.” the pattern <b><i> … </i></b> marks a PersonName.


Sliding Windows & Boundary Detection


Information Extraction by Sliding Windows

GRAND CHALLENGES FOR MACHINE LEARNING

Jaime Carbonell School of Computer Science Carnegie Mellon University

3:30 pm 7500 Wean Hall

Machine learning has evolved from obscurity in the 1970s into a vibrant and popular discipline in artificial intelligence during the 1980s and 1990s. As a result of its success and growth, machine learning is evolving into a collection of related disciplines: inductive concept acquisition, analytic learning in problem solving (e.g. analogy, explanation-based learning), learning theory (e.g. PAC learning), genetic algorithms, connectionist learning, hybrid systems, and so on.

CMU UseNet Seminar Announcement

E.g. looking for the seminar location



Information Extraction with Sliding Windows
[Freitag 97, 98; Soderland 97; Califf 98]

  … 00 : pm Place : Wean Hall Rm 5409 Speaker : Sebastian Thrun …
    w_t-m … w_t-1 | w_t … w_t+n | w_t+n+1 … w_t+n+m
        prefix        contents         suffix

• Standard supervised learning setting
  – Positive instances: windows with real label
  – Negative instances: all other windows
  – Features based on candidate, prefix and suffix

• Special-purpose rule learning systems work well:

  courseNumber(X) :-
      tokenLength(X,=,2),
      every(X, inTitle, false),
      some(X, A, <previousToken>, inTitle, true),
      some(X, B, <>, tripleton, true)
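A minimal sketch of the windowing itself (window and context sizes are arbitrary choices here):

    def candidate_windows(tokens, max_len=4, context=3):
        """Enumerate every window up to max_len tokens, with prefix/suffix context."""
        for start in range(len(tokens)):
            for end in range(start + 1, min(start + max_len, len(tokens)) + 1):
                yield {
                    "prefix":   tokens[max(0, start - context):start],
                    "contents": tokens[start:end],
                    "suffix":   tokens[end:end + context],
                }

    tokens = "00 : pm Place : Wean Hall Rm 5409 Speaker : Sebastian Thrun".split()
    # Positive instances: windows matching the labeled field; negatives: all others.
    for w in candidate_windows(tokens):
        if w["contents"] == ["Wean", "Hall", "Rm", "5409"]:
            print(w)

A classifier is then trained on features of the prefix, contents and suffix of each window.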


IE by Boundary Detection

GRAND CHALLENGES FOR MACHINE LEARNING

Jaime Carbonell School of Computer Science Carnegie Mellon University

3:30 pm 7500 Wean Hall

Machine learning has evolved from obscurity in the 1970s into a vibrant and popular discipline in artificial intelligence during the 1980s and 1990s. As a result of its success and growth, machine learning is evolving into a collection of related disciplines: inductive concept acquisition, analytic learning in problem solving (e.g. analogy, explanation-based learning), learning theory (e.g. PAC learning), genetic algorithms, connectionist learning, hybrid systems, and so on.

CMU UseNet Seminar Announcement

E.g. looking for the seminar location



BWI: Learning to detect boundaries

• Another formulation: learn three probabilistic classifiers:
  – START(i) = Prob(position i starts a field)
  – END(j) = Prob(position j ends a field)
  – LEN(k) = Prob(an extracted field has length k)

• Then score a possible extraction (i, j) by START(i) × END(j) × LEN(j - i)

• LEN(k) is estimated from a histogram

• START(i) and END(j) learned by boosting over simple boundary patterns and features

[Freitag & Kushmerick, AAAI 2000]


Problems with Sliding Windows and Boundary Finders

• Decisions in neighboring parts of the input are made independently from each other.

– Sliding Window may predict a “seminar end time” before the “seminar start time”.

– It is possible for two overlapping windows to both be above threshold.

– In a Boundary-Finding system, left boundaries are laid down independently from right boundaries, and their pairing happens as a separate step.


Hidden Markov Models


Citation Parsing

• Fahlman, Scott & Lebiere, Christian (1989). The cascade-correlation learning architecture. Advances in Neural Information Processing Systems, pp. 524-532.

• Fahlman, S.E. and Lebiere, C., “The Cascade Correlation Learning Architecture,” Neural Information Processing Systems, pp. 524-532, 1990.

• Fahlman, S. E. (1991) The recurrent cascade-correlation learning architecture. NIPS 3, 190-205.


Can we do this with probabilistic generative models?

• Could have classes for {author, title, journal, year, pages}

• Could classify every word or sequence?
  – Which sequences?

• Something interesting in the sequence of fields that we’d like to capture
  – Authors come first
  – Title comes before journal
  – Page numbers come near the end


Hidden Markov Models: The Representation

• A document is a sequence of words
• Each word is tagged by its class

• fahlman s e and lebiere c the cascade correlation learning architecture neural information processing systems pp 524 532 1990


HMM: Generative Model (1)

[State-transition diagram over the states Author, Title, Journal, Year, Pages]


HMM: Generative Model (2)

[State-transition diagram over the states Author, Title, Year, Pages]


HMM: Generative Model (3)

• States: x_i
• State transitions: P(x_i|x_j) = a[x_i|x_j]
• Output probabilities: P(o_i|x_j) = b[o_i|x_j]

• Markov independence assumption
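A sketch of this generative process for citation parsing; every probability below is invented for illustration:

    import random

    a = {  # state transitions P(next state | current state)
        "START":   {"Author": 1.0},
        "Author":  {"Author": 0.7, "Title": 0.3},
        "Title":   {"Title": 0.8, "Journal": 0.2},
        "Journal": {"Journal": 0.6, "Pages": 0.2, "Year": 0.2},
        "Pages":   {"Pages": 0.5, "Year": 0.5},
        "Year":    {"END": 1.0},
    }
    b = {  # output probabilities P(word | state), with tiny illustrative vocabularies
        "Author":  {"fahlman": 0.5, "lebiere": 0.5},
        "Title":   {"cascade": 0.4, "correlation": 0.3, "learning": 0.3},
        "Journal": {"nips": 0.6, "neural": 0.4},
        "Pages":   {"524": 0.5, "532": 0.5},
        "Year":    {"1990": 1.0},
    }

    def generate():
        """Walk the Markov chain, emitting one word per state visited."""
        state, output = "START", []
        while True:
            state = random.choices(list(a[state]), weights=a[state].values())[0]
            if state == "END":
                return output
            word = random.choices(list(b[state]), weights=b[state].values())[0]
            output.append((state, word))

    print(generate())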


HMMs: Estimating Parameters

• With fully-labeled data, just like naïve Bayes

• Estimate MAP output probabilities:

  b[o_i|x_j] = (1 + N(o_i, x_j)) / (|Vocab| + Σ_w N(w, x_j))

  where N(o_i, x_j) is the number of times word o_i is emitted from state x_j in the labeled data

• Estimate MAP state transitions:

  a[x_i|x_j] = (1 + N(x_j → x_i)) / (|States| + Σ_k N(x_j → x_k))

  where N(x_j → x_i) is the number of transitions from state x_j to state x_i in the labeled data
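A counting sketch of these estimates, assuming fully-labeled training sequences of (state, word) pairs (names are mine):

    from collections import Counter

    def estimate_hmm(tagged_docs, states, vocab):
        """MAP estimates of a and b from labeled sequences, as in the formulas above."""
        trans, emit = Counter(), Counter()
        for doc in tagged_docs:                    # doc = [(state, word), ...]
            for (s1, _), (s2, _) in zip(doc, doc[1:]):
                trans[(s1, s2)] += 1
            for s, w in doc:
                emit[(s, w)] += 1

        # a[x_i|x_j] = (1 + N(x_j -> x_i)) / (|States| + sum_k N(x_j -> x_k))
        a = {xj: {xi: (1 + trans[(xj, xi)]) /
                      (len(states) + sum(trans[(xj, xk)] for xk in states))
                  for xi in states}
             for xj in states}
        # b[o_i|x_j] = (1 + N(o_i, x_j)) / (|Vocab| + sum_w N(w, x_j))
        b = {xj: {o: (1 + emit[(xj, o)]) /
                     (len(vocab) + sum(emit[(xj, w)] for w in vocab))
                  for o in vocab}
             for xj in states}
        return a, b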


HMMs: Performing Extraction

• Given output words:
  – fahlman s e 1991 the recurrent cascade correlation learning architecture nips 3 190 205

• Find the state sequence that maximizes:

  Π_i a[x_i|x_(i-1)] b[o_i|x_i]

• Lots of possible state sequences to test (5^14)

Hmm…


Representation for Paths: Trellis

Page 56: 15-505: Lecture 11 Generative Models for Text Classification and Information Extraction Kamal Nigam Some slides from William Cohen, Andrew McCallum

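The trellis of states × positions is what makes extraction tractable: the Viterbi algorithm keeps only the best-scoring path into each state at each position, costing O(n·|States|²) instead of |States|^n. A sketch, assuming smoothed nonzero parameters (e.g. from estimate_hmm above) and a distinguished start state:

    import math

    def viterbi(words, states, a, b, start="START"):
        """Find the state sequence maximizing prod_i a[x_(i-1)][x_i] * b[x_i][o_i]."""
        # trellis[i][s] = (best log-score of a path ending in s at position i, backpointer)
        trellis = [{s: (math.log(a[start][s]) + math.log(b[s][words[0]]), None)
                    for s in states}]
        for word in words[1:]:
            column = {}
            for s in states:
                prev = max(states, key=lambda p: trellis[-1][p][0] + math.log(a[p][s]))
                score = (trellis[-1][prev][0] + math.log(a[prev][s])
                         + math.log(b[s][word]))
                column[s] = (score, prev)
            trellis.append(column)
        # Trace the backpointers from the best final state.
        state = max(states, key=lambda s: trellis[-1][s][0])
        path = [state]
        for column in reversed(trellis[1:]):
            state = column[state][1]
            path.append(state)
        return list(reversed(path))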

HMM Example: Nymble

Task: Named Entity Extraction

Train on 450k words of news wire text.

[State diagram: start-of-sentence → {Person, Org, five other name classes, Other} → end-of-sentence]

• Bigram within classes
• Backoff to unigram
• Special capitalization and number features…

Results:

  Case    Language    F1
  Mixed   English     93%
  Upper   English     91%
  Mixed   Spanish     90%

[Bikel, et al 97]


Nymble word features


HMMs: A Plethora of Applications

• Information extraction
• Part of speech tagging
• Word segmentation
• Gene finding
• Protein structure prediction
• Speech recognition
• Economics, Climatology, Robotics, …