
Page 1:

Principles and Applications of Probabilistic Learning

Padhraic Smyth, Department of Computer Science

University of California, Irvine
www.ics.uci.edu/~smyth

Page 2:

New Slides

• Original slides created in mid-July for ACM

– Some new slides have been added
• “new” logo in upper left

NEW

Page 3:

New Slides

• Original slides created in mid-July for ACM

– Some new slides have been added
• “new” logo in upper left

– A few slides have been updated
• “updated” logo in upper left

• Current slides (including new and updated) at: www.ics.uci.edu/~smyth/talks

UPDATED

Page 4:

From the tutorial Web page:

“The intent of this tutorial is to provide a starting point for students and researchers……”

NEW

Page 5:

Probabilistic Modeling vs. Function Approximation

• Two major themes in machine learning:

1. Function approximation / "black box" methods
• e.g., for classification and regression
• Learn a flexible function y = f(x)
• e.g., SVMs, decision trees, boosting, etc.

2. Probabilistic learning
• e.g., for regression, model p(y|x) or p(y,x)
• e.g., graphical models, mixture models, hidden Markov models, etc.

• Both approaches are useful in general
– In this tutorial we will focus only on the 2nd approach, probabilistic modeling

Page 6:

Motivations for Probabilistic Modeling

• leverage prior knowledge

• generalize beyond data analysis in vector-spaces

• handle missing data

• combine multiple types of information into an analysis

• generate calibrated probability outputs

• quantify uncertainty about parameters, models, and predictions in a statistical manner

Page 7:

Learning object models in vision (Weber, Welling, Perona, 2000)

NEW

Page 8:

Learning object models in vision (Weber, Welling, Perona, 2000)

NEW

Page 9:

Learning to Extract Information from Documents

e.g., Seymore, McCallum, Rosenfeld, 1999

NEW

Page 10:

NEW

Page 11:

NEW
Segal, Friedman, Koller, et al., Nature Genetics, 2005

Page 12:

Probabilistic Model

Real World Data

P(Data | Parameters)

Page 13:

Probabilistic Model

Real World Data

P(Data | Parameters)

P(Parameters | Data)

Page 14:

Probabilistic Model

Real World Data

P(Data | Parameters)

P(Parameters | Data)

(Generative Model)

(Inference)

Page 15:

Outline

1. Review of probability

2. Graphical models

3. Connecting probability models to data

4. Models with hidden variables

5. Case studies
(i) Simulating and forecasting rainfall data

(ii) Curve clustering with cyclone trajectories

(iii) Topic modeling from text documents

Page 16:

Part 1: Review of Probability

Page 17:

Notation and Definitions

• X is a random variable
– Lower-case x is some possible value for X
– "X = x" is a logical proposition: that X takes value x
– There is uncertainty about the value of X

• e.g., X is the Dow Jones index at 5pm tomorrow

• p(X = x) is the probability that proposition X=x is true
– often shortened to p(x)

• If the set of possible x's is finite, we have a probability distribution and Σx p(x) = 1

• If the set of possible x's is infinite, p(x) is a density function, and p(x) integrates to 1 over the range of X

Page 18:

Example

• Let X be the Dow Jones Index (DJI) at 5pm Monday August 22nd (tomorrow)

• X can take real values from 0 to some large number

• p(x) is a density representing our uncertainty about X
– This density could be constructed from historical data, e.g.,

– After 5pm p(x) becomes infinitely narrow around the true known x (no uncertainty)

Page 19:

Probability as Degree of Belief

• Different agents can have different p(x)'s
– Your p(x) and the p(x) of a Wall Street expert might be quite different
– OR: if we were on vacation we might not have access to stock market information
• we would still be uncertain about p(x) after 5pm

• So we should really think of p(x) as p(x | BI)

– Where BI is background information available to agent I

– (will drop explicit conditioning on BI in notation)

• Thus, p(x) represents the degree of belief that agent I has in proposition x, conditioned on available background information

Page 20:

Comments on Degree of Belief

• Different agents can have different probability models
– There is not necessarily a "correct" p(x)
– Why? Because p(x) is a model built on whatever assumptions or background information we use
– Naturally leads to the notion of updating
• p(x | BI) -> p(x | BI, CI)

• This is the subjective Bayesian interpretation of probability
– Generalizes other interpretations (such as frequentist)
– Can be used in cases where frequentist reasoning is not applicable
– We will use "degree of belief" as our interpretation of p(x) in this tutorial

• Note!
– Degree of belief is just our semantic interpretation of p(x)
– The mathematics of probability (e.g., Bayes rule) remain the same regardless of our semantic interpretation

Page 21:

Multiple Variables

• p(x, y, z)
– Probability that X=x AND Y=y AND Z=z
– Possible values: cross-product of X, Y, Z

– e.g., X, Y, Z each take 10 possible values
• x,y,z can take 10^3 possible values
• p(x,y,z) is a 3-dimensional array/table
– Defines 10^3 probabilities
• Note the exponential increase as we add more variables

– e.g., X, Y, Z are all real-valued
• x,y,z live in a 3-dimensional vector space
• p(x,y,z) is a positive function defined over this space, integrates to 1

Page 22:

Conditional Probability

• p(x | y, z)
– Probability of x given that Y=y and Z=z
– Could be
• hypothetical, e.g., "if Y=y and if Z=z"
• observational, e.g., we observed values y and z
– can also have p(x, y | z), etc.
– "all probabilities are conditional probabilities"

• Computing conditional probabilities is the basis of many prediction and learning problems, e.g.,
– p(DJI tomorrow | DJI index last week)
– expected value of [DJI tomorrow | DJI index last week]
– most likely value of a parameter given observed data

Page 23:

Computing Conditional Probabilities

• Variables A, B, C, D
– All distributions of interest related to A, B, C, D can be computed from the full joint distribution p(a,b,c,d)

• Examples, using the Law of Total Probability

– p(a) = Σ{b,c,d} p(a, b, c, d)

– p(c,d) = Σ{a,b} p(a, b, c, d)

– p(a,c | d) = Σ{b} p(a, b, c | d)

where p(a, b, c | d) = p(a,b,c,d) / p(d)

• These are standard probability manipulations: however, we will see how to use these to make inferences about parameters and unobserved variables, given data
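As a concrete illustration of these manipulations (not from the original slides; the joint table and variable names are made up for the example), a short Python sketch:

import numpy as np

# Hypothetical joint distribution over four binary variables A, B, C, D,
# stored as a 4-dimensional table whose entries sum to 1.
rng = np.random.default_rng(0)
joint = rng.random((2, 2, 2, 2))
joint /= joint.sum()                       # p(a, b, c, d)

# Law of Total Probability: sum out the variables we do not need.
p_a = joint.sum(axis=(1, 2, 3))            # p(a)   = sum over b, c, d
p_cd = joint.sum(axis=(0, 1))              # p(c,d) = sum over a, b

# Conditioning: p(a, b, c | d) = p(a, b, c, d) / p(d)
p_d = joint.sum(axis=(0, 1, 2))            # p(d)
p_abc_given_d = joint / p_d                # divides along the last axis (d)
p_ac_given_d = p_abc_given_d.sum(axis=1)   # sum out b to get p(a, c | d)

print(p_a.sum())                           # 1.0: p(a) is a proper distribution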

Page 24:

Conditional Independence

• A is conditionally independent of B given C iff p(a | b, c) = p(a | c)

(also implies that B is conditionally independent of A given C)

• In words, B provides no information about A, if value of C is known

• Example:
– a = "patient has upset stomach"
– b = "patient has headache"
– c = "patient has flu"

• Note that conditional independence does not imply marginal independence

Page 25:

Two Practical Problems

(Assume for simplicity each variable takes K values)

• Problem 1: Computational Complexity
– Conditional probability computations scale as O(K^N)

• where N is the number of variables being summed over

• Problem 2: Model Specification
– To specify a joint distribution we need a table of O(K^N) numbers

– Where do these numbers come from?

Page 26:

Two Key Ideas

• Problem 1: Computational Complexity
– Idea: Graphical models

• Structured probability models lead to tractable inference

• Problem 2: Model Specification
– Idea: Probabilistic learning

• General principles for learning from data

Page 27:

Part 2: Graphical Models

Page 28:

“…probability theory is more fundamentally concerned with the structure of reasoning and causation than with numbers.”

Glenn Shafer and Judea Pearl, Introduction to Readings in Uncertain Reasoning, Morgan Kaufmann, 1990

Page 29:

Graphical Models

• Represent dependency structure with a directed graph
– Node <-> random variable
– Edges encode dependencies
• Absence of edge -> conditional independence
– Directed and undirected versions

• Why is this useful?
– A language for communication
– A language for computation

• Origins:
– Wright, 1920's
– Independently developed by Spiegelhalter and Lauritzen in statistics and Pearl in computer science in the late 1980's

Page 30:

Examples of 3-way Graphical Models

[Graph: three nodes A, B, C with no edges]

Marginal Independence:
p(A,B,C) = p(A) p(B) p(C)

Page 31:

Examples of 3-way Graphical Models

[Graph: A → B, A → C]

Conditionally independent effects:
p(A,B,C) = p(B|A) p(C|A) p(A)

B and C are conditionally independent given A

e.g., A is a disease, and we model B and C as conditionally independent symptoms given A

Page 32:

Examples of 3-way Graphical Models

[Graph: A → C, B → C]

Independent Causes:
p(A,B,C) = p(C|A,B) p(A) p(B)

Page 33:

Examples of 3-way Graphical Models

[Graph: A → B → C]

Markov dependence:
p(A,B,C) = p(C|B) p(B|A) p(A)

Page 34:

Real-World Example

Monitoring Intensive-Care Patients
• 37 variables
• 509 parameters ... instead of 2^37

(figure courtesy of Kevin Murphy / Nir Friedman)

[Figure: the ALARM network, a directed graphical model over 37 intensive-care variables such as PCWP, CO, HRBP, SAO2, VENTLUNG, HYPOVOLEMIA, CVP, and BP]

Page 35:

Directed Graphical Models

[Graph: A → C, B → C]

p(A,B,C) = p(C|A,B) p(A) p(B)

Page 36:

Directed Graphical Models

[Graph: A → C, B → C]

In general, p(X1, X2, ..., XN) = Π_i p(Xi | parents(Xi))

p(A,B,C) = p(C|A,B) p(A) p(B)

Page 37:

Directed Graphical Models

[Graph: A → C, B → C]

• Probability model has simple factored form

• Directed edges => direct dependence

• Absence of an edge => conditional independence

• Also known as belief networks, Bayesian networks, causal networks

In general, p(X1, X2, ..., XN) = Π_i p(Xi | parents(Xi))

p(A,B,C) = p(C|A,B) p(A) p(B)
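To make the factored form concrete, here is a small Python sketch (illustrative only; the CPT values are invented) that evaluates p(A,B,C) = p(C|A,B) p(A) p(B) for binary variables and checks that the factored joint sums to 1:

import numpy as np

# Hypothetical CPTs for the graph A -> C <- B, all variables binary.
p_A = np.array([0.7, 0.3])            # p(A)
p_B = np.array([0.6, 0.4])            # p(B)
# p(C | A, B), indexed as [a, b, c]; each row over c sums to 1.
p_C_given_AB = np.array([[[0.9, 0.1], [0.5, 0.5]],
                         [[0.4, 0.6], [0.2, 0.8]]])

def joint(a, b, c):
    """p(A=a, B=b, C=c) = p(C=c | A=a, B=b) p(A=a) p(B=b)."""
    return p_C_given_AB[a, b, c] * p_A[a] * p_B[b]

# The factored joint still sums to 1 over all configurations.
total = sum(joint(a, b, c) for a in (0, 1) for b in (0, 1) for c in (0, 1))
print(joint(1, 0, 1), total)          # total is 1.0 up to rounding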

Page 38:

Example

[Graph: D → B, B → A, B → C, D → E, E → F, E → G]

Page 39:

Example

[Graph: D → B, B → A, B → C, D → E, E → F, E → G]

p(A, B, C, D, E, F, G) = Π p( variable | parents )
= p(A|B) p(C|B) p(B|D) p(F|E) p(G|E) p(E|D) p(D)

Page 40:

Example

[Graph: same as above, with C = c and G = g observed]

Say we want to compute p(a | c, g)

Page 41:

Example

[Graph: same as above, with C = c and G = g observed]

Direct calculation: p(a|c,g) = Σ{b,d,e,f} p(a,b,d,e,f | c,g)

Complexity of the sum is O(K^4)

Page 42:

Example

[Graph: same as above, with C = c and G = g observed]

Reordering (using the factorization):

Σb p(a|b) Σd p(b|d,c) Σe p(d|e) Σf p(e,f | g)

Page 43:

Example

[Graph: same as above, with C = c and G = g observed]

Reordering:

Σb p(a|b) Σd p(b|d,c) Σe p(d|e) [ Σf p(e,f | g) ]
                                  = p(e|g)

Page 44:

Example

[Graph: same as above, with C = c and G = g observed]

Reordering:

Σb p(a|b) Σd p(b|d,c) [ Σe p(d|e) p(e|g) ]
                        = p(d|g)

Page 45:

Example

[Graph: same as above, with C = c and G = g observed]

Reordering:

Σb p(a|b) [ Σd p(b|d,c) p(d|g) ]
            = p(b|c,g)

Page 46:

Example

[Graph: same as above, with C = c and G = g observed]

Reordering:

Σb p(a|b) p(b|c,g) = p(a|c,g)

Complexity is O(K), compared to O(K^4)
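The same idea in code: a Python sketch (illustrative; the conditional probability tables are random) that computes p(a | c, g) for this graph by brute-force summation over the full joint and again by eliminating one variable at a time, confirming the two agree:

import numpy as np

K = 3
rng = np.random.default_rng(1)
def cpt(*shape):
    """Random conditional table, normalized over its first axis (the child)."""
    t = rng.random(shape)
    return t / t.sum(axis=0, keepdims=True)

# CPTs for D -> B -> {A, C} and D -> E -> {F, G}
pD = cpt(K)                  # p(d)
pB_D = cpt(K, K)             # p(b | d)
pA_B = cpt(K, K)             # p(a | b)
pC_B = cpt(K, K)             # p(c | b)
pE_D = cpt(K, K)             # p(e | d)
pF_E = cpt(K, K)             # p(f | e)
pG_E = cpt(K, K)             # p(g | e)

# Brute force: build the full joint p(a,b,c,d,e,f,g), then condition on c, g.
joint = np.einsum('ab,cb,bd,ed,fe,ge,d->abcdefg',
                  pA_B, pC_B, pB_D, pE_D, pF_E, pG_E, pD)
c_obs, g_obs = 0, 2
slice_cg = joint[:, :, c_obs, :, :, :, g_obs]       # fix C = c, G = g
brute = slice_cg.sum(axis=(1, 2, 3, 4))              # sum out b, d, e, f
brute /= brute.sum()                                 # p(a | c, g)

# Variable elimination: push the sums inside, one variable at a time.
m_f = pF_E.sum(axis=0)                    # sum_f p(f|e)
m_e = (pG_E[g_obs] * m_f) @ pE_D          # sum_e p(g|e) p(e|d), indexed by d
m_d = pB_D @ (pD * m_e)                   # sum_d p(b|d) p(d) m_e(d), indexed by b
m_b = pA_B @ (pC_B[c_obs] * m_d)          # sum_b p(a|b) p(c|b) m_d(b), indexed by a
elim = m_b / m_b.sum()                    # p(a | c, g)

print(np.allclose(brute, elim))           # True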

Page 47:

A More General Algorithm

• Message Passing (MP) Algorithm
– Pearl, 1988; Lauritzen and Spiegelhalter, 1988

– Declare 1 node (any node) to be a root

– Schedule two phases of message-passing

• nodes pass messages up to the root

• messages are distributed back to the leaves

– In time O(N), we can compute P(….)

Page 48:

Sketch of the MP algorithm in action

Page 49:

Sketch of the MP algorithm in action

1

Page 50:

Sketch of the MP algorithm in action

1 2

Page 51:

Sketch of the MP algorithm in action

1 2

3

Page 52:

Sketch of the MP algorithm in action

1 2

3 4

Page 53:

Complexity of the MP Algorithm

• Efficient
– Complexity scales as O(N K^m)
• N = number of variables
• K = arity of variables
• m = maximum number of parents for any node

– Compare to O(K^N) for the brute-force method

Page 54:

Graphs with “loops”

[Graph: a version of the example graph in which some pairs of nodes are connected by more than one path]

Message passing algorithm does not work whenthere are multiple paths between 2 nodes

Page 55:

Graphs with “loops”

[Graph: the same graph with multiple paths between some pairs of nodes]

General approach: “cluster” variablestogether to convert graph to a tree

Page 56:

Reduce to a Tree

[Graph: B and E merged into a single node (B, E); the clustered graph D → (B, E) → A, C, F, G is a tree]

Page 57:

Reduce to a Tree

[Graph: the clustered tree with the merged node (B, E)]

Good news: can perform the MP algorithm on this tree

Bad news: complexity is now O(K^2)

Page 58:

Probability Calculations on Graphs

• Structure of the graph reveals
– Computational strategy
– Dependency relations

• Complexity is typically O(K^(max number of parents))
– If single parents (e.g., a tree) -> O(K)
– The sparser the graph, the lower the complexity

• Technique can be "automated"
– i.e., a fully general algorithm for arbitrary graphs
– For continuous variables:
• replace sum with integral
– For identification of most likely values:
• replace sum with max operator

Page 59:

Hidden Markov Model (HMM)

[Graphical model: hidden Markov chain S1 → S2 → S3 → ... → Sn, with each hidden state St emitting an observed variable Yt]

Two key assumptions:
1. hidden state sequence is Markov

2. observation Yt is CI of all other variables given St

Widely used in speech recognition, protein sequence models

Motivation: switching dynamics, low-d representation of Y’s, etc

Page 60:

HMMs as graphical models…

• Computations of interest

• p(Y) = Σs p(Y, S = s) -> "forward-backward" algorithm (sketched below)

• arg max_s p(S = s | Y) -> Viterbi algorithm

• Both algorithms...
– computation time linear in T
– special cases of the MP algorithm

• Many generalizations and extensions...
– Make state S continuous -> Kalman filters
– Add inputs -> convolutional decoding
– Add additional dependencies in the model

• Generalized HMMs
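A minimal Python sketch of the forward pass (the HMM parameters here are made up for illustration); it computes p(Y) in time linear in the sequence length:

import numpy as np

pi = np.array([0.6, 0.4])                  # initial state distribution p(S1)
A = np.array([[0.7, 0.3],                  # transition matrix p(S_t | S_{t-1})
              [0.2, 0.8]])
B = np.array([[0.9, 0.1],                  # emission matrix p(Y_t | S_t)
              [0.3, 0.7]])
obs = [0, 1, 1, 0]                         # an observed sequence y_1..y_T

# alpha_t(s) = p(y_1..y_t, S_t = s); p(Y) = sum_s alpha_T(s)
alpha = pi * B[:, obs[0]]
for y in obs[1:]:
    alpha = (alpha @ A) * B[:, y]
print("p(Y) =", alpha.sum())               # likelihood of the sequence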

Page 61:

Part 3: Connecting Probability Models to Data

Recommended References for this Section:

• All of Statistics, L. Wasserman, Chapman and Hall, 2004 (Chapters 6,9,11)

• Pattern Classification and Scene Analysis, 1st ed, R. Duda and P. Hart, Wiley, 1973, Chapter 3.

Page 62:

Probabilistic Model

Real World Data

P(Data | Parameters)

P(Parameters | Data)

(Generative Model)

(Inference)

Page 63:

Conditionally Independent Observations

[Graphical model: the model parameters (one node) point to each data point y1, y2, ..., yn-1, yn]

NEW

Page 64:

“Plate” Notation

[Graphical model: model parameters θ → yi, with yi inside a plate labeled i=1:n]

Data = {y1, ..., yn}

Model parameters θ

Plate = rectangle in graphical model

variables within a plate are replicated in a conditionally independent manner

Page 65:

Example: Gaussian Model

[Graphical model: θ → yi, plate i=1:n]

Generative model: p(y1, ..., yn | θ) = Π_i p(yi | θ) = p(data | parameters)

= p(D | θ), where θ = {μ, σ}

Page 66:

The Likelihood Function

• Likelihood = p(data | parameters)

= p(D | θ)

= L(θ)

• Likelihood tells us how likely the observed data are conditioned on a particular setting of the parameters

• Details
– Constants that do not involve θ can be dropped in defining L(θ)
– Often easier to work with log L(θ)

Page 67:

Comments on the Likelihood Function

• Constructing a likelihood function L(θ) is the first step in probabilistic modeling

• The likelihood function implicitly assumes an underlying probabilistic model M with parameters θ

• L(θ) connects the model to the observed data

• Graphical models provide a useful language for constructing likelihoods

Page 68:

Binomial Likelihood

• Binomial model
– N memoryless trials
– θ = probability of success at each trial

• Observed data
– r successes in n trials
– Defines a likelihood:

L(θ) = p(D | θ)

= p(successes) p(non-successes)

= θ^r (1-θ)^(n-r)

NEW

[Graphical model: θ → yi, plate i=1:n]
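A quick numerical illustration of this likelihood (the counts are invented): evaluate L(θ) on a grid of θ values and read off the maximizer, which matches the analytic ML estimate r/n:

import numpy as np

# Hypothetical data: r successes in n trials.
n, r = 20, 13

theta = np.linspace(0.001, 0.999, 999)                        # candidate parameters
log_lik = r * np.log(theta) + (n - r) * np.log(1 - theta)     # log L(theta)

theta_ml = theta[np.argmax(log_lik)]
print(theta_ml, r / n)    # grid maximizer is (close to) the analytic estimate r/n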

Page 69:

Binomial Likelihood Examples

NEW

Page 70:

Gaussian Model and Likelihood

Model assumptions:
1. y's are conditionally independent given the model
2. each y comes from a Gaussian (Normal) density

Page 71:

Page 72:

Conditional Independence (CI)

• CI in a likelihood model means that we are assuming data points provide no information about each other, if the model parameters are assumed known.

p(D | θ) = p(y1, ..., yN | θ) = Π_i p(yi | θ)

• Works well for (e.g.)
– Patients randomly arriving at a clinic
– Web surfers randomly arriving at a Web site

• Does not work well for
– Time-dependent data (e.g., stock market)
– Spatial data (e.g., pixel correlations)

CI assumption

Page 73:

Example: Markov Likelihood

• Motivation: wish to model data in a sequence where there is sequential dependence
– e.g., a first-order Markov chain for a DNA sequence

– Markov modeling assumption: p(yt | yt-1, yt-2, ..., y1) = p(yt | yt-1)

– θ = K x K matrix of transition probabilities

L(θ) = p(D | θ) = p(y1, ..., yN | θ) = Π_t p(yt | yt-1, θ)
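A small illustrative sketch of evaluating this Markov likelihood for a short DNA-like sequence (the transition matrix below is made up):

import numpy as np

# Hypothetical 1st-order Markov chain over K = 4 DNA symbols A, C, G, T.
symbols = {'A': 0, 'C': 1, 'G': 2, 'T': 3}
T = np.array([[0.4, 0.2, 0.3, 0.1],      # T[i, j] = p(y_t = j | y_{t-1} = i)
              [0.1, 0.5, 0.2, 0.2],
              [0.3, 0.2, 0.4, 0.1],
              [0.2, 0.2, 0.1, 0.5]])
init = np.full(4, 0.25)                  # p(y_1), taken as uniform here

seq = [symbols[s] for s in "ACGGTAC"]
log_lik = np.log(init[seq[0]])
for prev, cur in zip(seq[:-1], seq[1:]):
    log_lik += np.log(T[prev, cur])      # add log p(y_t | y_{t-1})
print("log L(theta) =", log_lik)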

Page 74:

Maximum Likelihood (ML) Principle (R. Fisher ~ 1922)

[Graphical model: θ → yi, plate i=1:n]

L(θ) = p(Data | θ) = Π_i p(yi | θ)

Maximum Likelihood: θ_ML = arg max_θ { L(θ) }

Select the parameters that make the observed data most likely

Data = {y1,…yn}

Model parameters

Page 75:

Example: ML for Gaussian Model

Maximum Likelihood Estimate θ_ML

Page 76:

Maximizing the Likelihood

• More generally, we solve for the value θ that maximizes the function L(θ)
– With p parameters, L(θ) is a scalar function defined over a p-dimensional space

– 2 situations:
• We can analytically solve for the maxima of L(θ)
– This is rare
• We have to resort to iterative techniques to find θ_ML
– More common

• General approach
– Define a generative probabilistic model
– Define an associated likelihood (connect model to data)
– Solve an optimization problem to find θ_ML

Page 77:

Analytical Solution for Gaussian Likelihood
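The analytic solution itself appeared as an equation slide; as a stand-in, here is a brief sketch of the standard closed-form Gaussian ML estimates (textbook results, not copied from the slide), checked numerically on synthetic data:

import numpy as np

# Setting the gradient of log L(theta) to zero for a Gaussian model gives
#   mu_ML     = (1/n) * sum_i y_i
#   sigma2_ML = (1/n) * sum_i (y_i - mu_ML)**2
rng = np.random.default_rng(2)
y = rng.normal(loc=5.0, scale=2.0, size=1000)    # synthetic data for the check

mu_ml = y.mean()
sigma2_ml = ((y - mu_ml) ** 2).mean()
print(mu_ml, np.sqrt(sigma2_ml))   # close to the generating values 5.0 and 2.0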

Page 78:

Graphical Model for Regression

[Graphical model: θ → yi and xi → yi, with xi and yi inside a plate i=1:n]

Page 79:

Example

[Scatter plot of data points y vs. x]

Page 80:

Example

[Scatter plot of y vs. x, with an unknown underlying function f(x; θ)]

Page 81:

Example: ML for Linear Regression

• Generative model: y = ax + b + Gaussian noise
p(y) = N(ax + b, σ^2)

• Conditional Likelihood
L(θ) = p(y1, ..., yN | x1, ..., xN, θ)

= Π_i p(yi | xi, θ), θ = {a, b}

• Can show (homework problem!) that

log L(θ) = - Σ_i [yi - (a xi + b)]^2 (up to constants)

i.e., finding a, b to maximize the log-likelihood is the same as finding a, b that minimize the least-squares error
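A short numerical check of this equivalence on synthetic data (illustrative only):

import numpy as np

# Synthetic data from y = a*x + b + Gaussian noise.
rng = np.random.default_rng(3)
a_true, b_true, sigma = 2.0, -1.0, 0.5
x = rng.uniform(0, 10, size=200)
y = a_true * x + b_true + rng.normal(0, sigma, size=200)

# Least-squares fit; under Gaussian noise this is also the ML estimate.
X = np.column_stack([x, np.ones_like(x)])        # design matrix [x, 1]
(a_ml, b_ml), *_ = np.linalg.lstsq(X, y, rcond=None)
print(a_ml, b_ml)                                # close to 2.0 and -1.0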

Page 82:

ML and Regression

• Multivariate case
– multiple x's, multiple regression coefficients
– with Gaussian noise, the ML solution is again equivalent to least-squares (solutions to a set of linear equations)

• Non-linear multivariate model
– With Gaussian noise we get

log L(θ) = - Σ_i [yi - f(xi; θ)]^2

– Conditions for the θ that maximizes L(θ) lead to a set of p non-linear equations in p variables

– e.g., f(xi; θ) = a multilayer neural network with 1000 weights
• Optimization = finding the maximum of a non-convex function in 1000-dimensional space!
• Typically use iterative local search based on the gradient (many possible variations)

Page 83:

Probabilistic Learning and Classification

• 2 main approaches:

1. p(c | x) = p(x|c) p(c) / p(x) ∝ p(x|c) p(c)
-> learn a model for p(x|c) for each class, use Bayes rule to classify
- example: naïve Bayes (see the sketch below)
- advantage: theoretically optimal if p(x|c) is "correct"
- disadvantage: not directly optimizing predictive accuracy

2. Learn p(c|x) directly, e.g.,
– logistic regression (see tutorial notes from D. Lewis)
– other regression methods such as neural networks, etc.
– Often quite effective in practice: very useful for ranking, scoring, etc.
– Contrast with purely discriminative methods such as SVMs, trees

NEW
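As a minimal sketch of approach 1 (not from the slides; a 1-D problem with Gaussian class-conditional densities standing in for p(x|c)):

import numpy as np
from scipy.stats import norm

# Hypothetical 1-D two-class problem: model p(x | c) for each class plus p(c),
# then classify with Bayes rule p(c | x) ∝ p(x | c) p(c).
p_c = {0: 0.7, 1: 0.3}                         # class priors
class_models = {0: norm(loc=0.0, scale=1.0),   # p(x | c=0)
                1: norm(loc=2.5, scale=1.0)}   # p(x | c=1)

def posterior(x):
    """Return p(c | x) for both classes."""
    unnorm = {c: class_models[c].pdf(x) * p_c[c] for c in p_c}
    z = sum(unnorm.values())
    return {c: v / z for c, v in unnorm.items()}

print(posterior(0.5))   # mostly class 0
print(posterior(3.0))   # mostly class 1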

Page 84:

The Bayesian Approach to Learning

[Graphical model: θ → yi, plate i=1:n, with a prior on θ]

Maximum A Posteriori: θ_MAP = arg max_θ { Likelihood(θ) x Prior(θ) }

Fully Bayesian: p(θ | Data) = p(Data | θ) p(θ) / p(Data)

Prior(θ) = p(θ)

Page 85:

The Bayesian Approach

[Graphical model: θ → yi, plate i=1:n]

Fully Bayesian:
p(θ | Data) = p(Data | θ) p(θ) / p(Data)
= Likelihood x Prior / Normalization term

Estimating p(θ | Data) can be viewed as inference in a graphical model

ML is a special case = MAP with a “flat” prior

Page 86:

More Comments on Bayesian Learning

• "fully" Bayesian: report the full posterior density p(θ | D)
– For simple models, we can calculate p(θ | D) analytically
– Otherwise we empirically estimate p(θ | D)
• Monte Carlo sampling methods are very useful

• Bayesian prediction (e.g., for regression):

p(y | x, D) = ∫ p(y, θ | x, D) dθ

= ∫ p(y | θ, x) p(θ | D) dθ

-> the prediction at each θ is weighted by p(θ | D)

[theoretically preferable to picking a single θ (as in ML)]

Page 87:

More Comments on Bayesian Learning

• In practice...
– Fully Bayesian is theoretically optimal but not always the most practical approach
• E.g., computational limitations with large numbers of parameters
• assessing priors can be tricky

• Bayesian approach particularly useful for small data sets

• For large data sets, Bayesian, MAP, and ML tend to agree
– ML/MAP are much simpler => often used in practice

Page 88:

Example of Bayesian Estimation

• Definition of Beta prior

• Definition of Binomial likelihood

• Form of Beta posterior

• Examples of plots with prior+likelihood -> posterior

Page 89:

Beta Density as a Prior

• Let θ be a proportion,
– e.g., the fraction of customers that respond to an email ad

– p(θ) is a prior for θ

– e.g., p(θ) = Beta density with parameters α and β

p(θ) ∝ θ^(α-1) (1-θ)^(β-1)

α/(α+β) influences the location; α+β controls the width

NEW

Page 90:

Examples of Beta Density Priors

NEW

Page 91:

Binomial Likelihood

• Binomial model
– N memoryless trials
– θ = probability of success at each trial

• Observed data
– r successes in n trials
– Defines a likelihood:

p(D | θ) = p(successes) p(non-successes)

= θ^r (1-θ)^(n-r)

NEW

Page 92:

Beta + Binomial -> Beta

p(θ | D) = Posterior ∝ Likelihood x Prior

= Binomial x Beta

∝ θ^r (1-θ)^(n-r) x θ^(α-1) (1-θ)^(β-1)

= Beta(α + r, β + n - r)

Prior is "updated" using the data:

Parameters: α -> α + r, β -> β + n - r

Sample size: α + β -> α + β + n

Mean: α/(α+β) -> (α + r)/(α + β + n)

NEW
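A tiny sketch of this conjugate update with invented numbers (scipy is used only to summarize the posterior):

from scipy.stats import beta

# Hypothetical prior Beta(alpha=2, beta=8) for a response rate theta,
# updated with r = 13 successes in n = 40 trials.
alpha_0, beta_0 = 2, 8
n, r = 40, 13

alpha_n, beta_n = alpha_0 + r, beta_0 + (n - r)   # Beta(alpha + r, beta + n - r)
posterior = beta(alpha_n, beta_n)

print("prior mean     ", alpha_0 / (alpha_0 + beta_0))
print("posterior mean ", alpha_n / (alpha_n + beta_n))   # pulled toward r/n = 0.325
print("95% interval   ", posterior.interval(0.95))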

Page 93:

NEW

Page 94:

NEW

Page 95:

NEW

Page 96:

Extensions

• K categories with K probabilities that sum to 1
– Dirichlet prior + Multinomial likelihood -> Dirichlet posterior
– Used in text modeling, protein alignment algorithms, etc.
• E.g., Biological Sequence Analysis, R. Durbin et al., Cambridge University Press, 1998.

• Hierarchical modeling
– Multiple trials for different individuals
– Each individual has their own θ
– The θ's ~ a common population distribution

– For applications in marketing see
• Market Segmentation: Conceptual and Methodological Foundations, M. Wedel and W. A. Kamakura, Kluwer, 1998

NEW

Page 97:

Example: Bayesian Gaussian Model

[Graphical model: independent priors on the Gaussian parameters (e.g., μ and σ), each pointing to yi inside a plate i=1:n]

Note: priors and parameters are assumed independent here

Page 98:

Example: Bayesian Regression

[Graphical model: θ → yi and xi → yi, plate i=1:n, with a prior on θ]

Model: yi = f[xi; θ] + e, e ~ N(0, σ^2)

p(yi | xi) ~ N(f[xi; θ], σ^2)

Page 99:

Other Examples

• Bayesian examples
– Bayesian neural networks

• Richer probabilistic models
– Random effects models
– E.g., learning to align curves

• Learning model structure
– Chow-Liu trees
– General graphical model structures
• e.g., gene regulation networks

Comprehensive reference:
Bayesian Data Analysis, A. Gelman, J. B. Carlin, H. S. Stern, and D. B. Rubin, Chapman and Hall, 2nd edition, 2003.

UPDATED

Page 100:

Learning Shapes and Shifts

Original data

Data after Learning

Data = smoothed growth acceleration data from teenagers

EM used to learn a spline model + time-shift for each curve

Page 101:

Learning to Track People (Sidenbladh, Black, Fleet, 2000)

Page 102:

Model Uncertainty

• How do we know what model M to select for our likelihood function?
– In general, we don't!

– However, we can use the data to help us infer which model from a set of possible models is best

Page 103:

Method 1: Bayesian Approach

• Can evaluate the evidence for each model, p(M |D) = p(D|M) p(M)/ p(D)

– Can get p(D|M) by integrating p(D, θ | M) over the parameter space (this is the "marginal likelihood")

– in theory p(M |D) is how much evidence exists in the data for model M

• More complex models are automatically penalized because of the integration over higher-dimensional parameter spaces

– in practice p(M|D) can rarely be computed directly
• Monte Carlo schemes are popular
• Also: approximations such as BIC, Laplace, etc.

Page 104:

Comments on Bayesian Approach

• Bayesian Model Averaging (BMA):
– Instead of selecting the single best model, average over all available models for prediction (theoretically the correct thing to do)

– Weights used for averaging are p(M|D)

• Empirical alternatives
– e.g., Stacking, Bagging
– Idea is to learn a set of unconstrained combining weights from the data, weights that optimize predictive accuracy
• "emulate" the BMA approach
• may be more effective in practice

Page 105:

Method 2: Predictive Validation

• Instead of the Bayesian approach, we could use the probability of new unseen test data as our metric for selecting models

• E.g., 2 models
– If p(D | M1) > p(D | M2) then M1 is assigning higher probability to the new data than M2

– This will (with enough data) select the model that predicts the best, in a probabilistic sense

– Useful for problems where we have very large amounts of data and it is easy to create a large validation data set D

Page 106:

The Prediction Game

NEW

[Figure: observed data points on the interval 0 to 10 along x, with two candidate density estimates: Model A for p(x) and Model B for p(x)]

What is a good guess at p(x)?

Page 107:

Which of Model A or B is better?

NEW

Test data generated from the true underlying q(x)

Model A

Model B

We can score each model in terms of p(new data | model)

Asymptotically, this is a fair unbiased score (irrespective of the complexities of the models)

Note: empirical average of log p(data) scores ~ negative entropy

Page 108:

[Plot: Predictive Entropy Out-of-Sample, showing negative log-likelihood [bits/token] vs. number of mixture components [K] for Mixtures of Multinomials and Mixtures of SFSMs]

NEW

Model-based clustering and visualization of navigation patterns on a Web site, Cadez et al., Journal of Data Mining and Knowledge Discovery, 2003

Page 109:

Simple Model Class

Page 110:

Data-generating process ("truth")

Simple Model Class

Page 111:

Data-generating process ("truth")

Best model is relatively far from Truth => High Bias

Simple Model Class

"Closest" model in terms of KL distance

Page 112:

Data-generating process ("truth")

Simple Model Class

Complex Model Class

Page 113:

Data-generating process ("truth")

Simple Model Class

Complex Model Class
Best model is closer to Truth => Low Bias

Page 114:

Data-generating process ("truth")

Simple Model Class

Complex Model Class

However, ... this could be the model that best fits the observed data => High Variance

Page 115:

Part 4: Models with Hidden Variables

Page 116:

Hidden or Latent Variables

• In many applications there are 2 sets of variables:
– Variables whose values we can directly measure
– Variables that are "hidden" and cannot be measured

• Examples:
– Speech recognition:
• Observed: acoustic voice signal
• Hidden: label of the word spoken
– Face tracking in images:
• Observed: pixel intensities
• Hidden: position of the face in the image
– Text modeling:
• Observed: counts of words in a document
• Hidden: topics that the document is about

Page 117:

Mixture Models

[Graphical model: S → Y, where S is a hidden discrete variable and Y is the observed variable(s)]

p(Y) = Σk p(Y | S=k) p(S=k)

Motivation:
1. models a true process (e.g., the fish example)

2. approximation for a complex process

Pearson, 1894, Phil. Trans. Roy. Soc. A.

Page 118:

[Plot: two component densities (Component 1, Component 2) and the resulting Mixture Model, p(x) vs. x]

Page 119:

[Plot: the two component densities and the resulting Mixture Model, p(x) vs. x]

Page 120:

[Plot: Component Models and the resulting Mixture Model, p(x) vs. x]

Page 121:

A Graphical Model for Clustering

[Graphical model: hidden discrete (cluster) variable S with arrows to observed variables Y1, ..., Yj, ..., Yd, which are assumed conditionally independent given S]

Clusters = p(Y1, ..., Yd | S = s)

Probabilistic Clustering = learning these probability distributions from data

Page 122:

Hidden Markov Model (HMM)

[Graphical model: hidden Markov chain S1 → S2 → S3 → ... → Sn, with each hidden state St emitting an observed variable Yt]

Two key assumptions:
1. hidden state sequence is Markov

2. observation Yt is CI of all other variables given St

Widely used in speech recognition, protein sequence models

Motivation?
- S can provide non-linear switching

- S can encode low-dim time-dependence for high-dim Y

Page 123:

Generalizing HMMs

[Graphical model: the HMM above augmented with a second, independent hidden state chain T1, T2, T3, ..., Tn]

Two independent state variables, e.g., two processes evolving at different time-scales

Page 124:

Generalizing HMMs

[Graphical model: an HMM with input variables I1, I2, I3, ..., In feeding into the hidden states S1, ..., Sn]

Inputs I provide context to influence switching, e.g., external forcing variables

Model is still a tree -> inference is still linear

Page 125:

Generalizing HMMs

[Graphical model: the input-driven HMM above, with additional edges between successive observations Yt-1 → Yt]

Add direct dependence between Y’s to better model persistence

Can merge each St and Yt to construct a tree-structured model

Page 126:

Mixture Model

[Graphical model: hidden class Si → observed yi, plate over i = 1:n]

Likelihood(θ) = p(Data | θ)

             = ∏i p(yi | θ)

             = ∏i [ Σk p(yi | si = k, θ) p(si = k) ]

Page 127: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

Learning with Missing Data

• Guess at some initial parameters θ

• E-step (Inference)
  – For each case, and each unknown variable, compute p(S | known data, θ)

• M-step (Optimization)
  – Maximize L(θ) using p(S | …)
  – This yields new parameter estimates

• This is the EM algorithm (a minimal sketch follows below):
  – Guaranteed to converge to a (local) maximum of L(θ)
  – Dempster, Laird, and Rubin, 1977
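As a concrete illustration of the E and M steps above, here is a minimal EM sketch for a one-dimensional Gaussian mixture. The function name, initialization, and toy data are illustrative assumptions, not the tutorial's own code.

```python
import numpy as np
from scipy.stats import norm

def em_1d_gaussian_mixture(y, K=2, n_iter=25, seed=0):
    """A minimal EM sketch for a 1-D Gaussian mixture (illustrative names)."""
    rng = np.random.default_rng(seed)
    # initial parameter guesses
    w = np.full(K, 1.0 / K)                      # mixture weights p(s = k)
    mu = rng.choice(y, size=K, replace=False)    # component means
    sd = np.full(K, y.std())                     # component standard deviations
    for _ in range(n_iter):
        # E-step: responsibilities p(s_i = k | y_i, theta)
        dens = np.stack([w[k] * norm.pdf(y, mu[k], sd[k]) for k in range(K)], axis=1)
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate theta from the weighted data (closed form here)
        Nk = resp.sum(axis=0)
        w = Nk / len(y)
        mu = (resp * y[:, None]).sum(axis=0) / Nk
        sd = np.sqrt((resp * (y[:, None] - mu) ** 2).sum(axis=0) / Nk)
    return w, mu, sd

# toy data from two Gaussians (made up), then fit by EM
rng = np.random.default_rng(1)
y = np.concatenate([rng.normal(-2, 1, 200), rng.normal(3, 1, 300)])
print(em_1d_gaussian_mixture(y))
```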

Page 128: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

E-Step

[Graphical model: Si → yi, plate over i = 1:n]

Page 129: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

M-Step

[Graphical model: Si → yi, plate over i = 1:n]

Page 130: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

E-Step

[Graphical model: Si → yi, plate over i = 1:n]

Page 131: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

The E (Expectation) Step

Current K components and parameters

n objects

E step: Compute p(object i is in group k)

Page 132: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

The M (Maximization) Step

New parameters for the K components

n objects

M step: Compute θ, given the n objects and their memberships

Page 133: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

Complexity of EM for mixtures

K models, n objects

Complexity per iteration scales as O( n K f(d) )

Page 134: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

ANEMIA PATIENTS AND CONTROLS
[Scatter plot: Red Blood Cell Hemoglobin Concentration vs. Red Blood Cell Volume]
Data from Prof. Christine McLaren, Dept of Epidemiology, UC Irvine

Page 135: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

EM ITERATION 1
[Figure: anemia data with the mixture model at EM iteration 1; axes: Red Blood Cell Volume (x), Red Blood Cell Hemoglobin Concentration (y)]

Page 136: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

EM ITERATION 3
[Figure: anemia data with the mixture model at EM iteration 3; same axes]

Page 137: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

EM ITERATION 5
[Figure: anemia data with the mixture model at EM iteration 5; same axes]

Page 138: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

EM ITERATION 10
[Figure: anemia data with the mixture model at EM iteration 10; same axes]

Page 139: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

EM ITERATION 15
[Figure: anemia data with the mixture model at EM iteration 15; same axes]

Page 140: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

EM ITERATION 25
[Figure: anemia data with the mixture model at EM iteration 25; same axes]

Page 141: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

ANEMIA DATA WITH LABELS
[Figure: the same scatter with the true class labels shown, Anemia Group and Control Group; axes: Red Blood Cell Volume (x), Red Blood Cell Hemoglobin Concentration (y)]

Page 142: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

LOG-LIKELIHOOD AS A FUNCTION OF EM ITERATIONS
[Figure: Log-Likelihood (y) vs. EM Iteration (x)]

Page 143: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

Example of a Log-Likelihood Surface

[Figure: log-likelihood surface over Mean 2 (x) and Log Scale for Sigma 2 (y)]

Page 144: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

Log-Likelihood Cross-Section
[Figure: Log-likelihood (y) vs. Log(sigma) (x)]

Page 145: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

[HMM graphical model: hidden states S1 … SN, observations Y1 … YN]

HMMs


Page 148: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

[HMM graphical model: hidden states S1 … SN, observations Y1 … YN]

E-Step (linear inference)

Page 149: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

[HMM graphical model: hidden states S1 … SN, observations Y1 … YN]

M-Step (closed form)

Page 150: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

Alternatives to EM

• Method of Moments– EM is more efficient

• Direct optimization– e.g., gradient descent, Newton methods– EM is usually simpler to implement

• Sampling (e.g., MCMC)

• Minimum distance, e.g., IMSE = E[ (p(x | θ) − q(x))² ]

Page 151: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

Mixtures as “Data Simulators”

For i = 1 to N

classk ~ p(class1, class2, …., class K)

xi ~ p(x | classk)

end
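The recipe above maps directly to a few lines of code. A minimal sketch, assuming Gaussian component densities for concreteness (the function name and parameter values are illustrative):

```python
import numpy as np

def simulate_mixture(N, weights, means, sds, seed=0):
    """Simulate N points from a Gaussian mixture, following the recipe above."""
    rng = np.random.default_rng(seed)
    means, sds = np.asarray(means), np.asarray(sds)
    classes = rng.choice(len(weights), size=N, p=weights)  # class_k ~ p(class_1 ... class_K)
    x = rng.normal(means[classes], sds[classes])           # x_i ~ p(x | class_k)
    return classes, x

classes, x = simulate_mixture(1000, weights=[0.3, 0.7], means=[-2.0, 3.0], sds=[1.0, 0.5])
```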

Page 152: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

Mixtures with Markov Dependence

For i = 1 to N

classk ~ p(class1, class2, …, classK | classi-1)

xi ~ p(x | classk)

end

Current class depends on the previous class (Markov dependence)

This is a hidden Markov model
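A sketch of the same simulator with the Markov dependence added (again assuming Gaussian outputs and illustrative names), which is exactly how one would simulate from such a hidden Markov model:

```python
import numpy as np

def simulate_markov_mixture(N, pi, A, means, sds, seed=0):
    """As above, but each class depends on the previous class: an HMM simulator."""
    rng = np.random.default_rng(seed)
    A, means, sds = np.asarray(A), np.asarray(means), np.asarray(sds)
    classes = np.empty(N, dtype=int)
    classes[0] = rng.choice(len(pi), p=pi)
    for i in range(1, N):
        classes[i] = rng.choice(len(pi), p=A[classes[i - 1]])  # Markov dependence
    x = rng.normal(means[classes], sds[classes])               # x_i ~ p(x | class_k)
    return classes, x

classes, x = simulate_markov_mixture(
    500, pi=[0.5, 0.5], A=[[0.95, 0.05], [0.10, 0.90]], means=[-2.0, 3.0], sds=[1.0, 0.5])
```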

Page 153: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

Mixtures of Sequences

For i = 1 to N

classk ~ p(class1, class2, …., class K)

while non-end state

xij ~ p(xj | xj-1, classk)

  end
end

Markov sequence model: produces a variable-length sequence

Page 154: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

Mixtures of Curves

For i = 1 to N

classk ~ p(class1, class2, …., class K)

Li ~ p(Li | classk)

for j = 1 to Li

yij ~ f(y | xj, classk) + ek

  end
end

Class-dependent curve model

Length of curve

Independent variable x

Page 155: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

Mixtures of Image Models

For i = 1 to N
  classk ~ p(class1, class2, …., class K)

sizei ~ p(size|classk)

for i = 1 to Vi-1

intensityi ~ p(intensity | classk)

  end
end

Pixel generation model

Number of vertices

Global scale

Page 156: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

More generally…..

p(Di) = Σk=1..K p(ck) p(Di | ck)

Generative Model

- select a component ck for individual i

- generate data according to p(Di | ck)

- p(Di | ck) can be very general

- e.g., sets of sequences, spatial patterns, etc

[Note: given p(Di | ck), we can define an EM algorithm]

Page 157: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

References

• The EM Algorithm and Mixture Models– The EM Algorithm and Extensions

G. McLachlan and T. Krishnan. John Wiley and Sons, New York, 1997.

• Mixture models– Statistical analysis of finite mixture distributions.

D. M. Titterington, A. F. M. Smith & U. E. Makov. Wiley & Sons, Inc., New York, 1985.

– Finite Mixture Models G.J. McLachlan and D. Peel, New York: Wiley (2000)

– Model-based clustering, discriminant analysis, and density estimation, C. Fraley and A. E. Raftery, Journal of the American Statistical Association 97:611-631 (2002).

NEW

Page 158: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

References

• Hidden Markov Models– A tutorial on hidden Markov models and selected

applications in speech recognition, L. R. Rabiner, Proceedings of the IEEE, vol. 77, no.2, 257-287, 1989.

– Probabilistic independence networks for hidden Markov modelsP. Smyth, D. Heckerman, and M. Jordan, Neural Computation , vol.9, no. 2, 227-269, 1997.

– Hidden Markov models, A. Moore, online tutorial slides, http://www.autonlab.org/tutorials/hmm12.pdf

NEW

Page 159: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

Part 5: Case Studies

(i) Simulating and forecasting rainfall data

(ii) Curve clustering with cyclones

(iii) Topic modeling from text documents

and if time permits…..

(iv) Sequence clustering for Web data

(v) Analysis of time-course gene expression data

Page 160: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

Case Study 1:

Simulating and Predicting Rainfall Patterns

Joint work with:

Andy Robertson, International Research Institute for Climate Prediction

Sergey Kirshner, Department of Computer Science, UC Irvine

Page 161: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

Spatio-Temporal Rainfall Data

Northeast Brazil 1975-2002

90-day time series, 24 years, 10 stations

Page 162: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

DATA FOR ONE RAIN-STATION
[Figure: rainfall occurrence by DAY (x) and YEAR (y) for one station]

Page 163: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

Modeling Goals

• “Downscaling” – Modeling interannual variability– coupling rainfall to large-scale effects like El Nino

• Prediction– e.g., “hindcasting” of missing data

• Seasonal Forecasts– E.g. on Dec 1 produce simulations of likely 90-day winters

Page 164: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

Y1

S1

Y2

S2

Y3

S3

YN

SN

I1 I2I3 IN

S = unobserved weather state Y = spatial rainfall pattern (“outputs”) I = atmospheric variables (“inputs”)

HMMs for Rainfall Modeling

Page 165: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

Learned Weather States

States provide an interpretable “view” of spatio-temporal relationships in the data

Page 166: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

Page 167: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

WeatherStates

for Kenya

Page 168: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

Page 169: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

Page 170: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

Spatial Chow-Liu Trees

- Spatial distribution given a state is a tree structure

(a graphical model)

- Useful intermediate between full pair-wise model and conditional independence

- Optimal topology learned from data using a minimum spanning tree algorithm (see the sketch below)

- Can use priors based on distance, topography

- Tree-structure over time also
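A minimal sketch of the Chow-Liu idea for binary rainfall-occurrence data: estimate pairwise mutual information between stations, then take a maximum-weight spanning tree (here via scipy's minimum_spanning_tree applied to negated weights). The data, function name, and sizes are made up for illustration.

```python
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree

def chow_liu_edges(X, eps=1e-12):
    """Chow-Liu sketch for binary data: spanning tree that maximizes pairwise MI.

    X: (n_days, n_stations) 0/1 array of rain occurrence.
    Returns the list of tree edges (station i, station j).
    """
    n, d = X.shape
    mi = np.zeros((d, d))
    for i in range(d):
        for j in range(i + 1, d):
            joint = np.histogram2d(X[:, i], X[:, j], bins=2)[0] / n   # 2x2 joint probs
            pi_, pj = joint.sum(axis=1), joint.sum(axis=0)            # marginals
            mi[i, j] = np.sum(joint * np.log((joint + eps) / (np.outer(pi_, pj) + eps)))
    # maximum-weight spanning tree = minimum spanning tree on negated MI
    tree = minimum_spanning_tree(-mi)
    return list(zip(*tree.nonzero()))

# fake rainfall-occurrence data: 24 seasons x 90 days, 10 stations (made up)
X = (np.random.rand(24 * 90, 10) < 0.3).astype(int)
print(chow_liu_edges(X))
```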

Page 171: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

Missing Data

Page 172: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

Error rate v. fraction of missing data

Page 173: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

References

• Trees and Hidden Markov Models
  – Conditional Chow-Liu tree structures for modeling discrete-valued vector time series, S. Kirshner, P. Smyth, and A. Robertson, in Proceedings of the 20th International Conference on Uncertainty in AI, 2004.

• Applications to rainfall modeling
  – Hidden Markov models for modeling daily rainfall occurrence over Brazil, A. Robertson, S. Kirshner, and P. Smyth, Journal of Climate, November 2005.

NEW

Page 174: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

Summary

• Simple “empirical” probabilistic models can be very helpful in interpreting large scientific data sets– e.g., HMM states provide scientists with a basic but useful

classification of historical spatial rainfall patterns

• Graphical models provide “glue” to link together different information– Spatial– Temporal– Hidden states, etc

• “Generative” aspect of probabilistic models can be quite useful, e.g., for simulation

• Missing data is handled naturally in a probabilistic framework

Page 175: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

Case Study 2:

Clustering Cyclone Trajectories

Joint work with:

Suzana Camargo, Andy Robertson, International Research Institute for Climate Prediction

Scott Gaffney, Department of Computer Science, UC Irvine

Page 176: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

Storm Trajectories

Page 177: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

Microarray Gene Expression Data

TIME-COURSE GENE EXPRESSION DATA
[Figure: normalized log-ratio of intensity (y) vs. time in 7-minute increments (x)]
Yeast Cell-Cycle Data, Spellman et al (1998)

Page 178: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

Clustering “non-vector” data

• Challenges with the data….– May be of different “lengths”, “sizes”, etc– Not easily representable in vector spaces– Distance is not naturally defined a priori

• Possible approaches
  – “convert” into a fixed-dimensional vector space
    • Apply standard vector clustering, but loses information
  – use hierarchical clustering
    • But O(N²) and requires a distance measure
  – probabilistic clustering with mixtures
    • Define a generative mixture model for the data
    • Learn distance and clustering simultaneously

Page 179: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

Graphical Models for Curves

[Graphical model: t → y, plate over the T points of one curve]

y = f(t ; θ)

e.g., y = at² + bt + c,  θ = {a, b, c}

Data = { (y1, t1), …, (yT, tT) }

Page 180: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

Graphical Models for Curves

[Graphical model: t → y, plate over the T points of one curve]

y ~ Gaussian density with mean = f(t ; θ), variance = σ²
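The resulting likelihood of a curve is just a product of Gaussians along its points. A minimal sketch for a quadratic f (the function name and toy numbers are illustrative):

```python
import numpy as np

def curve_loglik(y, t, theta, sigma):
    """log p(y | t, theta): product of Gaussians along one curve, quadratic mean."""
    a, b, c = theta
    resid = y - (a * t ** 2 + b * t + c)          # y_j - f(t_j ; theta)
    return -0.5 * np.sum(np.log(2 * np.pi * sigma ** 2) + resid ** 2 / sigma ** 2)

# toy curve (made-up numbers)
t = np.linspace(0, 1, 20)
y = 2 * t ** 2 - t + 0.5 + np.random.normal(0, 0.1, t.size)
print(curve_loglik(y, t, theta=(2.0, -1.0, 0.5), sigma=0.1))
```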

Page 181: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

Example

[Figure: example data, y plotted against t]

Page 182: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

Example

f(t ; θ)  <- this is hidden
[Figure: the same y vs. t data with the hidden curve f(t ; θ) overlaid]

Page 183: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

Graphical Models for Sets of Curves

[Graphical model: t → y, inner plate over the T points of a curve, outer plate over N curves]

Each curve: P(yi | ti, θ) = product of Gaussians

Page 184: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

Curve-Specific Transformations

[Graphical model: as above, with a curve-specific transformation variable for each of the N curves]

e.g., yi = at² + bt + c + δi,  θ = {a, b, c, δ1, …, δN}

Note: we can learn function parameters and shifts simultaneously with EM

Page 185: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

Learning Shapes and Shifts

Original data

Data after Learning

Data = smoothed growth acceleration data from teenagers

EM used to learn a spline model + time-shift for each curve

Page 186: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

Clustering: Mixtures of Curves

[Graphical model: as above, with a hidden cluster variable c for each curve]

Each set of trajectory points comes from 1 of K models

Model for group k is a Gaussian curve model

Marginal probability for a trajectory = mixture model
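Under this mixture, the E-step assigns each trajectory a probability of belonging to each cluster. A minimal sketch assuming quadratic cluster models (illustrative names; curves may have different lengths):

```python
import numpy as np

def quad_loglik(y, t, theta, sigma):
    """Gaussian log-likelihood of one curve under a quadratic mean function."""
    a, b, c = theta
    r = y - (a * t ** 2 + b * t + c)
    return -0.5 * np.sum(np.log(2 * np.pi * sigma ** 2) + r ** 2 / sigma ** 2)

def curve_responsibilities(curves, weights, thetas, sigmas):
    """E-step of a curve mixture: p(cluster k | trajectory i), any curve lengths."""
    K = len(weights)
    resp = np.zeros((len(curves), K))
    for i, (t, y) in enumerate(curves):
        logp = np.array([np.log(weights[k]) + quad_loglik(y, t, thetas[k], sigmas[k])
                         for k in range(K)])
        logp -= logp.max()                     # stabilize before exponentiating
        resp[i] = np.exp(logp) / np.exp(logp).sum()
    return resp

# two made-up trajectories of different lengths, two candidate cluster models
t1, t2 = np.linspace(0, 1, 15), np.linspace(0, 1, 30)
curves = [(t1, 1 - t1 ** 2 + 0.05 * np.random.randn(15)),
          (t2, 2 * t2 + 0.05 * np.random.randn(30))]
print(curve_responsibilities(curves, [0.5, 0.5],
                             thetas=[(-1, 0, 1), (0, 2, 0)], sigmas=[0.1, 0.1]))
```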

Page 187: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

The Learning Problem

• K cluster models
  – Each cluster is a shape model E[Y] = f(X ; θk) with its own parameters

• N observed curves: for each curve we learn– P(cluster k | curve data)– distribution on alignments, shifts, scaling, etc, given data

• Requires simultaneous learning of– Cluster models– Curve transformation parameters

• Results in an EM algorithm where E and M step are tractable

Page 188: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

Simulated Curves (K=2 Clusters)
[Figure: simulated curves, y vs. Time]

Page 189: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

Simulated Data after Alignment
[Figure: the same curves after alignment, y vs. Time]

Page 190: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

Results on Simulated Data

Method              Classification Accuracy   LogP    Error in Mean   Within-Cluster
True Model                     1               2.01        0              0.050
EM with Alignment              0.99            1.34        0.019          0.048
Standard EM                    0.89           -7.87        0.171          0.105
K-means                        0.79             -          0.424          0.129

*Averaged over 50 train/test sets

Page 191: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

Clusters of Trajectories

Page 192: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

Cluster Shapes for Pacific Cyclones

Page 193: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

TROPICAL CYCLONES Western North Pacific 1983-2002

Page 194: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

Page 195: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

References on Curve Clustering

• Functional Data Analysis J. O. Ramsay and B. W. Silverman, Springer, 1997.

• Probabilistic curve-aligned clustering and prediction with regression mixture models S. J. Gaffney, Phd Thesis, Department of Computer Science, University of California, Irvine, March 2004.

• Joint probabilistic curve clustering and alignment S. Gaffney and P. Smyth Advances in Neural Information Processing 17 , in press, 2005.

• Probabilistic clustering of extratropical cyclones using regression mixture modelsS. Gaffney, A. Robertson, P. Smyth, S. Camargo, M. Ghil preprint, online at www.datalab.uci.edu.

NEW

Page 196: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

Summary

• Graphical models provide a flexible representational language for modeling complex scientific data– can build complex models from simpler building blocks

• Systematic variability in the data can be handled in a principled way– Variable length time-series– Misalignments in trajectories

• Generative probabilistic models are interpretable and understandable by scientists

Page 197: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

Case Study 3:

Topic Modeling from Text Documents

Joint work with:

Mark Steyvers, Dave Newman, Chaitanya Chemudugunta, UC Irvine

Michal Rosen-Zvi, Hebrew University, Jerusalem

Tom Griffiths, Brown University

Page 198: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

Enron email data

250,000 emails

5000 authors

1999-2002

Page 199: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

Questions of Interest

– What topics do these documents “span”?

– Which documents are about a particular topic?

– How have topics changed over time?

– What does author X write about?

– Who is likely to write about topic Y?

– Who wrote this specific document?

– and so on…..

Page 200: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

Graphical Model for Clustering

[Graphical model: z → w, inner plate over the n words in a document, outer plate over D documents]

z = cluster for document
w = word
parameters: Cluster-Word distributions

Page 201: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

Graphical Model for Topics

[Graphical model: z → w, inner plate over the n words in a document, outer plate over D documents]

z = topic
w = word
parameters: Document-Topic distributions and Topic-Word distributions

Page 202: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

Topic = probability distribution over words

P(w | z)

WORD PROB.

PROBABILISTIC 0.0778

BAYESIAN 0.0671

PROBABILITY 0.0532

CARLO 0.0309

MONTE 0.0308

DISTRIBUTION 0.0257

INFERENCE 0.0253

PROBABILITIES 0.0253

CONDITIONAL 0.0229

PRIOR 0.0219

.... ...

TOPIC 209

WORD PROB.

RETRIEVAL 0.1179

TEXT 0.0853

DOCUMENTS 0.0527

INFORMATION 0.0504

DOCUMENT 0.0441

CONTENT 0.0242

INDEXING 0.0205

RELEVANCE 0.0159

COLLECTION 0.0146

RELEVANT 0.0136

... ...

TOPIC 289

Page 203: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

Key Features of Topic Models

• Generative model for documents in form of bags of words

• Allows a document to be composed of multiple topics– Much more powerful than 1 doc -> 1 cluster

• Completely unsupervised– Topics learned directly from data– Leverages strong dependencies at word level AND large data sets

• Learning algorithm
  – Gibbs sampling is the method of choice (a minimal sketch follows below)

• Scalable– Linear in number of word tokens– Can be run on millions of documents

NEW
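Here is a minimal sketch of collapsed Gibbs sampling for a topic model of this kind. The hyperparameter values, variable names, and toy corpus are illustrative assumptions, not the tutorial's own implementation.

```python
import numpy as np

def lda_gibbs(docs, V, T, n_iter=200, alpha=0.1, beta=0.01, seed=0):
    """Collapsed Gibbs sampling for a topic model: a minimal sketch.

    docs: list of word-id lists, V: vocabulary size, T: number of topics.
    Each sweep is O(total number of word tokens x T).
    """
    rng = np.random.default_rng(seed)
    ndt = np.zeros((len(docs), T))                 # document-topic counts
    ntw = np.zeros((T, V))                         # topic-word counts
    nt = np.zeros(T)                               # total tokens per topic
    z = [rng.integers(T, size=len(doc)) for doc in docs]
    for d, doc in enumerate(docs):                 # initialize counts
        for i, w in enumerate(doc):
            t = z[d][i]
            ndt[d, t] += 1; ntw[t, w] += 1; nt[t] += 1
    for _ in range(n_iter):
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                t = z[d][i]                        # remove this token's assignment
                ndt[d, t] -= 1; ntw[t, w] -= 1; nt[t] -= 1
                # p(topic | rest) is proportional to (doc-topic) x (topic-word)
                p = (ndt[d] + alpha) * (ntw[:, w] + beta) / (nt + V * beta)
                t = rng.choice(T, p=p / p.sum())   # resample the topic
                z[d][i] = t
                ndt[d, t] += 1; ntw[t, w] += 1; nt[t] += 1
    phi = (ntw + beta) / (ntw.sum(axis=1, keepdims=True) + V * beta)      # topic-word
    theta = (ndt + alpha) / (ndt.sum(axis=1, keepdims=True) + T * alpha)  # doc-topic
    return phi, theta

# toy corpus: 3 documents over a 6-word vocabulary, 2 topics (made up)
docs = [[0, 1, 2, 0, 1], [3, 4, 5, 4], [0, 1, 5, 3, 4, 4]]
phi, theta = lda_gibbs(docs, V=6, T=2, n_iter=100)
```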

Page 204: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

Topics vs. Other Approaches

• Clustering documents– Computationally simpler…– But a less accurate and less flexible model

• LSI/LSA
  – Projects words into a K-dimensional hidden space
  – Less interpretable
  – Not generalizable
    • E.g., to authors or other side-information
  – Not as accurate
    • E.g., precision-recall: Hoffman, Blei et al, Buntine, etc

• Topic Models (aka LDA model)– “next-generation” text modeling, after LSI– More flexible and more accurate (in prediction)– Linear time complexity in fitting the model

Page 205: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

Examples of Topics learned from Proceedings of the National Academy of Sciences

Griffiths and Steyvers, 2004

NEW

FORCESURFACE

MOLECULESSOLUTIONSURFACES

MICROSCOPYWATERFORCES

PARTICLESSTRENGTHPOLYMER

IONICATOMIC

AQUEOUSMOLECULARPROPERTIES

LIQUIDSOLUTIONS

BEADSMECHANICAL

HIVVIRUS

INFECTEDIMMUNODEFICIENCY

CD4INFECTION

HUMANVIRAL

TATGP120

REPLICATIONTYPE

ENVELOPEAIDSREV

BLOODCCR5

INDIVIDUALSENV

PERIPHERAL

MUSCLECARDIAC

HEARTSKELETALMYOCYTES

VENTRICULARMUSCLESSMOOTH

HYPERTROPHYDYSTROPHIN

HEARTSCONTRACTION

FIBERSFUNCTION

TISSUERAT

MYOCARDIALISOLATED

MYODFAILURE

STRUCTUREANGSTROM

CRYSTALRESIDUES

STRUCTURESSTRUCTURALRESOLUTION

HELIXTHREE

HELICESDETERMINED

RAYCONFORMATION

HELICALHYDROPHOBIC

SIDEDIMENSIONALINTERACTIONS

MOLECULESURFACE

NEURONSBRAIN

CORTEXCORTICAL

OLFACTORYNUCLEUS

NEURONALLAYER

RATNUCLEI

CEREBELLUMCEREBELLAR

LATERALCEREBRAL

LAYERSGRANULELABELED

HIPPOCAMPUSAREAS

THALAMIC

TUMORCANCERTUMORSHUMANCELLS

BREASTMELANOMA

GROWTHCARCINOMA

PROSTATENORMAL

CELLMETASTATICMALIGNANT

LUNGCANCERS

MICENUDE

PRIMARYOVARIAN

Page 206: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

What can Topic Models be used for?

– Queries
    • Who writes on this topic?
      – e.g., finding experts or reviewers in a particular area
    • What topics does this person do research on?

– Comparing groups of authors or documents

– Discovering trends over time

– Detecting unusual papers and authors

– Interactive browsing of a digital library via topics

– Parsing documents (and parts of documents) by topic

– and more…..

Page 207: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

What is this paper about?

Empirical Bayes screening for multi-item associations

Bill DuMouchel and Daryl Pregibon, ACM SIGKDD 2001

Most likely topics according to the model are…
1. data, mining, discovery, association, attribute..
2. set, subset, maximal, minimal, complete,…
3. measurements, correlation, statistical, variation,…
4. Bayesian, model, prior, data, mixture,…..

NEW

Page 208: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

CHANGING TRENDS IN COMPUTER SCIENCE
[Figure: Topic Probability (y) vs. Year (x), 1990-2002, for the topics OPERATING SYSTEMS, INFORMATION RETRIEVAL, WWW, and PROGRAMMING LANGUAGES]

Page 209: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

Pennsylvania Gazette

1728-1800

80,000 articles

(courtesy of David Newman & Sharon Block, UC Irvine)

NEW

Page 210: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

Historical Trends in Pennsylvania Gazette

[Figure: Topic Proportion (%) (y) vs. YEAR (x), 1730-1800, for two topics:
  STATE, GOVERNMENT, CONSTITUTION, LAW, UNITED, POWER, CITIZEN, PEOPLE, PUBLIC, CONGRESS
  SILK, COTTON, DITTO, WHITE, BLACK, LINEN, CLOTH, WOMEN, BLUE, WORSTED]
(courtesy of David Newman & Sharon Block, UC Irvine)

NEW

Page 211: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

Enron email data

250,000 emails

5000 authors

1999-2002

Page 212: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

Enron email topics

WORD PROB. WORD PROB. WORD PROB. WORD PROB.

FEEDBACK 0.0781 PROJECT 0.0514 FERC 0.0554 ENVIRONMENTAL 0.0291

PERFORMANCE 0.0462 PLANT 0.028 MARKET 0.0328 AIR 0.0232

PROCESS 0.0455 COST 0.0182 ISO 0.0226 MTBE 0.019

PEP 0.0446 CONSTRUCTION 0.0169 COMMISSION 0.0215 EMISSIONS 0.017

MANAGEMENT 0.03 UNIT 0.0166 ORDER 0.0212 CLEAN 0.0143

COMPLETE 0.0205 FACILITY 0.0165 FILING 0.0149 EPA 0.0133

QUESTIONS 0.0203 SITE 0.0136 COMMENTS 0.0116 PENDING 0.0129

SELECTED 0.0187 PROJECTS 0.0117 PRICE 0.0116 SAFETY 0.0104

COMPLETED 0.0146 CONTRACT 0.011 CALIFORNIA 0.0110 WATER 0.0092

SYSTEM 0.0146 UNITS 0.0106 FILED 0.0110 GASOLINE 0.0086

SENDER PROB. SENDER PROB. SENDER PROB. SENDER PROB.

perfmgmt 0.2195 *** 0.0288 *** 0.0532 *** 0.1339

perf eval process 0.0784 *** 0.022 *** 0.0454 *** 0.0275

enron announcements 0.0489 *** 0.0123 *** 0.0384 *** 0.0205

*** 0.0089 *** 0.0111 *** 0.0334 *** 0.0166

*** 0.0048 *** 0.0108 *** 0.0317 *** 0.0129

TOPIC 23   TOPIC 36   TOPIC 72   TOPIC 54

Page 213: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

Non-work Topics…

WORD PROB. WORD PROB. WORD PROB. WORD PROB.

HOLIDAY 0.0857 TEXANS 0.0145 GOD 0.0357 AMAZON 0.0312

PARTY 0.0368 WIN 0.0143 LIFE 0.0272 GIFT 0.0226

YEAR 0.0316 FOOTBALL 0.0137 MAN 0.0116 CLICK 0.0193

SEASON 0.0305 FANTASY 0.0129 PEOPLE 0.0103 SAVE 0.0147

COMPANY 0.0255 SPORTSLINE 0.0129 CHRIST 0.0092 SHOPPING 0.0140

CELEBRATION 0.0199 PLAY 0.0123 FAITH 0.0083 OFFER 0.0124

ENRON 0.0198 TEAM 0.0114 LORD 0.0079 HOLIDAY 0.0122

TIME 0.0194 GAME 0.0112 JESUS 0.0075 RECEIVE 0.0102

RECOGNIZE 0.019 SPORTS 0.011 SPIRITUAL 0.0066 SHIPPING 0.0100

MONTH 0.018 GAMES 0.0109 VISIT 0.0065 FLOWERS 0.0099

SENDER PROB. SENDER PROB. SENDER PROB. SENDER PROB.

chairman & ceo 0.131 cbs sportsline com 0.0866 crosswalk com 0.2358 amazon com 0.1344

*** 0.0102 houston texans 0.0267 wordsmith 0.0208 jos a bank 0.0266

*** 0.0046 houstontexans 0.0203 *** 0.0107 sharperimageoffers 0.0136

*** 0.0022 sportsline rewards 0.0175 doctor dictionary 0.0101 travelocity com 0.0094

general announcement 0.0017 pro football 0.0136 *** 0.0061 barnes & noble com 0.0089

TOPIC 109   TOPIC 66   TOPIC 182   TOPIC 113

Page 214: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

Topical Topics

WORD PROB. WORD PROB. WORD PROB. WORD PROB.

POWER 0.0915 STATE 0.0253 COMMITTEE 0.0197 LAW 0.0380

CALIFORNIA 0.0756 PLAN 0.0245 BILL 0.0189 TESTIMONY 0.0201

ELECTRICITY 0.0331 CALIFORNIA 0.0137 HOUSE 0.0169 ATTORNEY 0.0164

UTILITIES 0.0253 POLITICIAN Y 0.0137 WASHINGTON 0.0140 SETTLEMENT 0.0131

PRICES 0.0249 RATE 0.0131 SENATE 0.0135 LEGAL 0.0100

MARKET 0.0244 BANKRUPTCY 0.0126 POLITICIAN X 0.0114 EXHIBIT 0.0098

PRICE 0.0207 SOCAL 0.0119 CONGRESS 0.0112 CLE 0.0093

UTILITY 0.0140 POWER 0.0114 PRESIDENT 0.0105 SOCALGAS 0.0093

CUSTOMERS 0.0134 BONDS 0.0109 LEGISLATION 0.0099 METALS 0.0091

ELECTRIC 0.0120 MOU 0.0107 DC 0.0093 PERSON Z 0.0083

SENDER PROB. SENDER PROB. SENDER PROB. SENDER PROB.

*** 0.1160 *** 0.0395 *** 0.0696 *** 0.0696

*** 0.0518 *** 0.0337 *** 0.0453 *** 0.0453

*** 0.0284 *** 0.0295 *** 0.0255 *** 0.0255

*** 0.0272 *** 0.0251 *** 0.0173 *** 0.0173

*** 0.0266 *** 0.0202 *** 0.0317 *** 0.0317

TOPIC 194   TOPIC 18   TOPIC 22   TOPIC 114

Page 215: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

Using Topic Models for Information Retrieval

UPDATED

Page 216: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

Author-Topic Models

• The author-topic model– a probabilistic model linking authors and topics

• authors -> topics -> words

– Topic = distribution over words– Author = distribution over topics– Document = generated from a mixture of author

distributions

– Learns about entities based on associated text

• Can be generalized– Replace author with any categorical doc information– e.g., publication type, source, year, country of origin, etc

Page 217: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

Author-Topic Graphical Model

[Graphical model: for each word, an author x is drawn from the document's authors a, a topic z from that author's topic distribution, and a word w from that topic; inner plate over the n words, outer plate over D documents]

x = Author, z = Topic, w = Word
parameters: Author-Topic distributions and Topic-Word distributions

Page 218: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

Learning Author-Topic Models from Text

• Full probabilistic model– Power of statistical learning can be leveraged– Learning algorithm is linear in number of word occurrences

• Scalable to very large data sets• Completely automated (no tweaking required)

– completely unsupervised, no labels

• Query answering– A wide variety of queries can be answered:

• Which authors write on topic X?• What are the spatial patterns in usage of topic Y?• How have authors A, B and C changed over time?

– Queries answered using probabilistic inference• Query time is real-time (learning is offline)

Page 219: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

Author-Topic Models for CiteSeer

WORD PROB. WORD PROB. WORD PROB. WORD PROB.

DATA 0.1563 PROBABILISTIC 0.0778 RETRIEVAL 0.1179 QUERY 0.1848

MINING 0.0674 BAYESIAN 0.0671 TEXT 0.0853 QUERIES 0.1367

ATTRIBUTES 0.0462 PROBABILITY 0.0532 DOCUMENTS 0.0527 INDEX 0.0488

DISCOVERY 0.0401 CARLO 0.0309 INFORMATION 0.0504 DATA 0.0368

ASSOCIATION 0.0335 MONTE 0.0308 DOCUMENT 0.0441 JOIN 0.0260

LARGE 0.0280 DISTRIBUTION 0.0257 CONTENT 0.0242 INDEXING 0.0180

KNOWLEDGE 0.0260 INFERENCE 0.0253 INDEXING 0.0205 PROCESSING 0.0113

DATABASES 0.0210 PROBABILITIES 0.0253 RELEVANCE 0.0159 AGGREGATE 0.0110

ATTRIBUTE 0.0188 CONDITIONAL 0.0229 COLLECTION 0.0146 ACCESS 0.0102

DATASETS 0.0165 PRIOR 0.0219 RELEVANT 0.0136 PRESENT 0.0095

AUTHOR PROB. AUTHOR PROB. AUTHOR PROB. AUTHOR PROB.

Han_J 0.0196 Friedman_N 0.0094 Oard_D 0.0110 Suciu_D 0.0102

Rastogi_R 0.0094 Heckerman_D 0.0067 Croft_W 0.0056 Naughton_J 0.0095

Zaki_M 0.0084 Ghahramani_Z 0.0062 Jones_K 0.0053 Levy_A 0.0071

Shim_K 0.0077 Koller_D 0.0062 Schauble_P 0.0051 DeWitt_D 0.0068

Ng_R 0.0060 Jordan_M 0.0059 Voorhees_E 0.0050 Wong_L 0.0067

Liu_B 0.0058 Neal_R 0.0055 Singhal_A 0.0048 Chakrabarti_K 0.0064

Mannila_H 0.0056 Raftery_A 0.0054 Hawking_D 0.0048 Ross_K 0.0061

Brin_S 0.0054 Lukasiewicz_T 0.0053 Merkl_D 0.0042 Hellerstein_J 0.0059

Liu_H 0.0047 Halpern_J 0.0052 Allan_J 0.0040 Lenzerini_M 0.0054

Holder_L 0.0044 Muller_P 0.0048 Doermann_D 0.0039 Moerkotte_G 0.0053

TOPIC 205 TOPIC 209 TOPIC 289 TOPIC 10

Page 220: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

Author-Profiles

• Author = Andrew McCallum, U Mass:– Topic 1: classification, training, generalization, decision, data,…– Topic 2: learning, machine, examples, reinforcement, inductive,…..– Topic 3: retrieval, text, document, information, content,…

• Author = Hector Garcia-Molina, Stanford:- Topic 1: query, index, data, join, processing, aggregate….- Topic 2: transaction, concurrency, copy, permission, distributed….- Topic 3: source, separation, paper, heterogeneous, merging…..

• Author = Jerry Friedman, Stanford:– Topic 1: regression, estimate, variance, data, series,…– Topic 2: classification, training, accuracy, decision, data,…– Topic 3: distance, metric, similarity, measure, nearest,…

Page 221: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

Page 222: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

PubMed-Query Topics

WORD PROB. WORD PROB. WORD PROB. WORD PROB.

BIOLOGICAL 0.1002 PLAGUE 0.0296 BOTULISM 0.1014 HIV 0.0916

AGENTS 0.0889 MEDICAL 0.0287 BOTULINUM 0.0888 PROTEASE 0.0563

THREAT 0.0396 CENTURY 0.0280 TOXIN 0.0877 AMPRENAVIR 0.0527

BIOTERRORISM 0.0348 MEDICINE 0.0266 TYPE 0.0669 INHIBITORS 0.0366

WEAPONS 0.0328 HISTORY 0.0203 CLOSTRIDIUM 0.0340 INHIBITOR 0.0220

POTENTIAL 0.0305 EPIDEMIC 0.0106 INFANT 0.0245 PLASMA 0.0204

ATTACK 0.0290 GREAT 0.0091 NEUROTOXIN 0.0184 APV 0.0169

CHEMICAL 0.0288 EPIDEMICS 0.0090 BONT 0.0167 DRUG 0.0169

WARFARE 0.0219 CHINESE 0.0083 FOOD 0.0134 RITONAVIR 0.0164

ANTHRAX 0.0146 FRENCH 0.0082 PARALYSIS 0.0124 IMMUNODEFICIENCY0.0150

AUTHOR PROB. AUTHOR PROB. AUTHOR PROB. AUTHOR PROB.

Atlas_RM 0.0044 Károly_L 0.0089 Hatheway_CL 0.0254 Sadler_BM 0.0129

Tegnell_A 0.0036 Jian-ping_Z 0.0085 Schiavo_G 0.0141 Tisdale_M 0.0118

Aas_P 0.0036 Sabbatani_S 0.0080 Sugiyama_H 0.0111 Lou_Y 0.0069

Greenfield_RA 0.0032 Theodorides_J 0.0045 Arnon_SS 0.0108 Stein_DS 0.0069

Bricaire_F 0.0032 Bowers_JZ 0.0045 Simpson_LL 0.0093 Haubrich_R 0.0061

TOPIC 32   TOPIC 188   TOPIC 63   TOPIC 85

Page 223: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

PubMed-Query Topics

WORD PROB. WORD PROB. WORD PROB. WORD PROB.

ANTHRACIS 0.1627 CHEMICAL 0.0578 HD 0.0657 ENZYME 0.0938

ANTHRAX 0.1402 SARIN 0.0454 MUSTARD 0.0639 ACTIVE 0.0429

BACILLUS 0.1219 AGENT 0.0332 EXPOSURE 0.0444 SUBSTRATE 0.0399

SPORES 0.0614 GAS 0.0312 SM 0.0353 SITE 0.0361

CEREUS 0.0382 AGENTS 0.0268 SULFUR 0.0343 ENZYMES 0.0308

SPORE 0.0274 VX 0.0264 SKIN 0.0208 REACTION 0.0225

THURINGIENSIS 0.0177 NERVE 0.0232 EXPOSED 0.0185 SUBSTRATES 0.0201

SUBTILIS 0.0152 ACID 0.0220 AGENT 0.0140 FOLD 0.0176

STERNE 0.0124 TOXIC 0.0197 EPIDERMAL 0.0129 CATALYTIC 0.0154

INHALATIONAL 0.0104 PRODUCTS 0.0170 DAMAGE 0.0116 RATE 0.0148

AUTHOR PROB. AUTHOR PROB. AUTHOR PROB. AUTHOR PROB.

Mock_M 0.0203 Minami_M 0.0093 Monteiro-Riviere_NA 0.0284 Masson_P 0.0166

Phillips_AP 0.0125 Hoskin_FC 0.0092 Smith_WJ 0.0219 Kovach_IM 0.0137

Welkos_SL 0.0083 Benschop_HP 0.0090 Lindsay_CD 0.0214 Schramm_VL 0.0094

Turnbull_PC 0.0071 Raushel_FM 0.0084 Sawyer_TW 0.0146 Barak_D 0.0076

Fouet_A 0.0067 Wild_JR 0.0075 Meier_HL 0.0139 Broomfield_CA 0.0072

TOPIC 178   TOPIC 40   TOPIC 89   TOPIC 104

Page 224: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

PubMed: Topics by Country

ISRAEL, n=196 authors TOPIC 188 TOPIC 6 TOPIC 133 TOPIC 104 TOPIC 159

p=0.049 p=0.045 p=0.043 p=0.027 p=0.025 BIOLOGICAL INJURY HEALTH HD EMERGENCY

AGENTS INJURIES PUBLIC MUSTARD RESPONSE THREAT WAR CARE EXPOSURE MEDICAL

BIOTERRORISM TERRORIST SERVICES SM PREPAREDNESS

WEAPONS MILITARY EDUCATION SULFUR DISASTER POTENTIAL MEDICAL NATIONAL SKIN MANAGEMENT

ATTACK VICTIMS COMMUNITY EXPOSED TRAINING CHEMICAL TRAUMA INFORMATION AGENT EVENTS

WARFARE BLAST PREVENTION EPIDERMAL BIOTERRORISM ANTHRAX VETERANS LOCAL DAMAGE LOCAL

CHINA, n=1775 authors

TOPIC 177 TOPIC 7 TOPIC 79 TOPIC 49 TOPIC 197 p=0.045 p=0.026 p=0.024 p=0.024 p=0.023 SARS RENAL FINDINGS METHODS PATIENTS

RESPIRATORY HFRS CHEST RESULTS HOSPITAL SEVERE VIRUS CT CONCLUSION PATIENT

COV SYNDROME LUNG OBJECTIVE ADMITTED SYNDROME FEVER CLINICAL CONCLUSIONS TWENTY

ACUTE HEMORRHAGIC PULMONARY BACKGROUND HOSPITALIZED CORONAVIRUS HANTAVIRUS ABNORMAL STUDY CONSECUTIVE

CHINA HANTAAN INVOLVEMENT OBJECTIVES PROSPECTIVELY

KONG PUUMALA COMMON INVESTIGATE DIAGNOSED PROBABLE HANTAVIRUSES RADIOGRAPHIC DESIGN PROGNOSIS

Page 225: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005


PubMed-Query: Topics by Country

Page 226: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

Extended Models

• Conditioning on non-authors– “side-information” other than authors– e.g., date, publication venue, country, etc– can use citations as authors

• Fictitious authors and common author– Allow 1 unique fictitious author per document

• Captures document specific effects– Assign 1 common fictitious author to each document

• Captures broad topics that are used in many documents

• Semantics and syntax model– Semantic topics = topics that are specific to certain documents– Syntactic topics = broad, across many documents– Probabilistic model that learns each type automatically

Page 227: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

Scientific syntax and semantics(Griffiths et al., NIPS 2004 – slides courtesy of Mark Steyvers and Tom Griffiths,

PNAS Symposium presentation, 2003)

[Graphical model: a Markov chain of syntactic classes x, one per word; each class emits the word w directly, except the semantic class, which emits w from a document topic z]

semantics: probabilistic topics
syntax: probabilistic regular grammar

Factorization of language based on statistical dependency patterns:

  long-range, document-specific dependencies

  short-range dependencies constant across all documents

Page 228: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

HEART 0.2 LOVE 0.2SOUL 0.2TEARS 0.2JOY 0.2

z = 1 0.4

SCIENTIFIC 0.2 KNOWLEDGE 0.2WORK 0.2RESEARCH 0.2MATHEMATICS 0.2

z = 2 0.6

x = 1

THE 0.6 A 0.3MANY 0.1

x = 3

OF 0.6 FOR 0.3BETWEEN 0.1

x = 2

0.9

0.1

0.2

0.8

0.7

0.3

Page 229: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

HEART 0.2 LOVE 0.2SOUL 0.2TEARS 0.2JOY 0.2

SCIENTIFIC 0.2 KNOWLEDGE 0.2WORK 0.2RESEARCH 0.2MATHEMATICS 0.2

THE 0.6 A 0.3MANY 0.1

OF 0.6 FOR 0.3BETWEEN 0.1

0.9

0.1

0.2

0.8

0.7

0.3

THE ………………………………

z = 1 0.4 z = 2 0.6

x = 1

x = 3

x = 2

Page 230: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

HEART 0.2 LOVE 0.2SOUL 0.2TEARS 0.2JOY 0.2

SCIENTIFIC 0.2 KNOWLEDGE 0.2WORK 0.2RESEARCH 0.2MATHEMATICS 0.2

THE 0.6 A 0.3MANY 0.1

OF 0.6 FOR 0.3BETWEEN 0.1

0.9

0.1

0.2

0.8

0.7

0.3

THE LOVE……………………

z = 1 0.4 z = 2 0.6

x = 1

x = 3

x = 2

Page 231: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

HEART 0.2 LOVE 0.2SOUL 0.2TEARS 0.2JOY 0.2

SCIENTIFIC 0.2 KNOWLEDGE 0.2WORK 0.2RESEARCH 0.2MATHEMATICS 0.2

THE 0.6 A 0.3MANY 0.1

OF 0.6 FOR 0.3BETWEEN 0.1

0.9

0.1

0.2

0.8

0.7

0.3

THE LOVE OF………………

z = 1 0.4 z = 2 0.6

x = 1

x = 3

x = 2

Page 232: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

HEART 0.2 LOVE 0.2SOUL 0.2TEARS 0.2JOY 0.2

SCIENTIFIC 0.2 KNOWLEDGE 0.2WORK 0.2RESEARCH 0.2MATHEMATICS 0.2

THE 0.6 A 0.3MANY 0.1

OF 0.6 FOR 0.3BETWEEN 0.1

0.9

0.1

0.2

0.8

0.7

0.3

THE LOVE OF RESEARCH ……

z = 1 0.4 z = 2 0.6

x = 1

x = 3

x = 2

Page 233: Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005 Principles and Applications of Probabilistic Learning Padhraic Smyth Department of Computer

Probabilistic Learning Tutorial: P. Smyth, UC Irvine, August 2005

Semantic topics

29 46 51 71 115 125AGE SELECTION LOCI TUMOR MALE MEMORYLIFE POPULATION LOCUS CANCER FEMALE LEARNING

AGING SPECIES ALLELES TUMORS MALES BRAINOLD POPULATIONS ALLELE BREAST FEMALES TASK

YOUNG GENETIC GENETIC HUMAN SPERM CORTEXCRE EVOLUTION LINKAGE CARCINOMA SEX SUBJECTS

AGED SIZE POLYMORPHISM PROSTATE SEXUAL LEFTSENESCENCE NATURAL CHROMOSOME MELANOMA MATING RIGHTMORTALITY VARIATION MARKERS CANCERS REPRODUCTIVE SONG

AGES FITNESS SUSCEPTIBILITY NORMAL OFFSPRING TASKSCR MUTATION ALLELIC COLON PHEROMONE HIPPOCAMPAL

INFANTS PER POLYMORPHIC LUNG SOCIAL PERFORMANCESPAN NUCLEOTIDE POLYMORPHISMS APC EGG SPATIALMEN RATES RESTRICTION MAMMARY BEHAVIOR PREFRONTAL

WOMEN RATE FRAGMENT CARCINOMAS EGGS COGNITIVESENESCENT HYBRID HAPLOTYPE MALIGNANT FERTILIZATION TRAINING

LOXP DIVERSITY GENE CELL MATERNAL TOMOGRAPHYINDIVIDUALS SUBSTITUTION LENGTH GROWTH PATERNAL FRONTAL

CHILDREN SPECIATION DISEASE METASTATIC FERTILITY MOTORNORMAL EVOLUTIONARY MICROSATELLITE EPITHELIAL GERM EMISSION

Page 234:

Syntactic classes

Class 5: IN FOR ON BETWEEN DURING AMONG FROM UNDER WITHIN THROUGHOUT THROUGH TOWARD INTO AT INVOLVING AFTER ACROSS AGAINST WHEN ALONG
Class 8: ARE WERE WAS IS WHEN REMAIN REMAINS REMAINED PREVIOUSLY BECOME BECAME BEING BUT GIVE MERE APPEARED APPEAR ALLOWED NORMALLY EACH
Class 14: THE THIS ITS THEIR AN EACH ONE ANY INCREASED EXOGENOUS OUR RECOMBINANT ENDOGENOUS TOTAL PURIFIED TILE FULL CHRONIC ANOTHER EXCESS
Class 25: SUGGEST INDICATE SUGGESTING SUGGESTS SHOWED REVEALED SHOW DEMONSTRATE INDICATING PROVIDE SUPPORT INDICATES PROVIDES INDICATED DEMONSTRATED SHOWS SO REVEAL DEMONSTRATES SUGGESTED
Class 26: LEVELS NUMBER LEVEL RATE TIME CONCENTRATIONS VARIETY RANGE CONCENTRATION DOSE FAMILY SET FREQUENCY SERIES AMOUNTS RATES CLASS VALUES AMOUNT SITES
Class 30: RESULTS ANALYSIS DATA STUDIES STUDY FINDINGS EXPERIMENTS OBSERVATIONS HYPOTHESIS ANALYSES ASSAYS POSSIBILITY MICROSCOPY PAPER WORK EVIDENCE FINDING MUTAGENESIS OBSERVATION MEASUREMENTS
Class 33: BEEN MAY CAN COULD WELL DID DOES DO MIGHT SHOULD WILL WOULD MUST CANNOT THEY ALSO REMAINED BECOME MAG LIKELY

Page 235:

(PNAS, 1991, vol. 88, 4874-4876)

A23 generalized49 fundamental11 theorem20 of4 natural46 selection46 is32 derived17 for5 populations46 incorporating22 both39 genetic46 and37 cultural46 transmission46. The14 phenotype15 is32 determined17 by42 an23 arbitrary49 number26 of4 multiallelic52 loci40 with22 two39-factor148 epistasis46 and37 an23 arbitrary49 linkage11 map20, as43 well33 as43 by42 cultural46 transmission46 from22 the14 parents46. Generations46 are8 discrete49 but37 partially19 overlapping24, and37 mating46 may33 be44 nonrandom17 at9 either39 the14 genotypic46 or37 the14 phenotypic46 level46 (or37 both39). I12 show34 that47 cultural46 transmission46 has18 several39 important49 implications6 for5 the14 evolution46 of4 population46 fitness46, most36 notably4 that47 there41 is32 a23 time26 lag7 in22 the14 response28 to31 selection46 such9 that47 the14 future137 evolution46 depends29 on21 the14 past24 selection46 history46 of4 the14 population46.

(Each word's subscript gives the syntactic class or semantic topic it was assigned to; graylevel = "semanticity", the probability that the word was generated by the LDA topic component rather than the HMM syntactic component.)
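The gray shading comes from a per-token posterior. Below is a minimal sketch of how such a "semanticity" score can be computed with the forward-backward algorithm, assuming the model parameters have already been estimated and that state 0 of the HMM is the semantic (topic-model) state. The argument names (`A`, `pi0`, `class_word`, `phi`, `theta`) are illustrative, not the tutorial's notation, and the real model would average this quantity over posterior samples of the parameters rather than fix them.

```python
import numpy as np

def semanticity(words, A, pi0, class_word, phi, theta):
    """Posterior probability that each token was emitted by the semantic
    (topic-model) state rather than a syntactic class.

    words:      sequence of word ids
    A:          (S, S) HMM transition matrix, state 0 = semantic state
    pi0:        (S,) initial state distribution
    class_word: (S-1, V) word distributions for the syntactic states 1..S-1
    phi:        (T, V) topic-word distributions
    theta:      (T,) document topic weights
    """
    S, N = A.shape[0], len(words)
    em = np.empty((N, S))                   # per-state emission probabilities
    for n, w in enumerate(words):
        em[n, 0] = theta @ phi[:, w]        # semantic state: topic mixture
        em[n, 1:] = class_word[:, w]        # syntactic states
    alpha = np.empty((N, S))
    beta = np.empty((N, S))
    alpha[0] = pi0 * em[0]
    alpha[0] /= alpha[0].sum()              # scale each step for stability
    for n in range(1, N):
        alpha[n] = (alpha[n - 1] @ A) * em[n]
        alpha[n] /= alpha[n].sum()
    beta[-1] = 1.0
    for n in range(N - 2, -1, -1):
        beta[n] = A @ (em[n + 1] * beta[n + 1])
        beta[n] /= beta[n].sum()
    gamma = alpha * beta                    # state posteriors, up to scaling
    gamma /= gamma.sum(axis=1, keepdims=True)
    return gamma[:, 0]                      # "semanticity" of each token
```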

Page 236:

(PNAS, 1996, vol. 93, 14628-14631)

The14 ''shape7'' of4 a23 female115 mating115 preference125 is32 the14 relationship7 between4 a23 male115 trait15 and37 the14 probability7 of4 acceptance21 as43 a23 mating115 partner20, The14 shape7 of4 preferences115 is32 important49 in5 many39 models6 of4 sexual115 selection46, mate115 recognition125, communication9, and37 speciation46, yet50 it41 has18 rarely19 been33 measured17 precisely19, Here12 I9 examine34 preference7 shape7 for5 male115 calling115 song125 in22 a23 bushcricket*13 (katydid*48). Preferences115 change46 dramatically19 between22 races46 of4 a23 species15, from22 strongly19 directional11 to31 broadly19 stabilizing45 (but50 with21 a23 net49 directional46 effect46), Preference115 shape46 generally19 matches10 the14 distribution16 of4 the14 male115 trait15, This41 is32 compatible29 with21 a23 coevolutionary46 model20 of4 signal9-preference115 evolution46, although50 it41 does33 nor37 rule20 out17 an23 alternative11 model20, sensory125 exploitation150. Preference46 shapes40 are8 shown35 to31 be44 genetic11 in5 origin7.


Page 238:

References on Topic Models

• Latent Dirichlet allocation. D. Blei, A. Y. Ng, and M. I. Jordan. Journal of Machine Learning Research, 3:993-1022, 2003.

• Finding scientific topics. T. Griffiths and M. Steyvers. Proceedings of the National Academy of Sciences, 101 (suppl. 1), 5228-5235, 2004.

• Probabilistic author-topic models for information discovery. M. Steyvers, P. Smyth, M. Rosen-Zvi, and T. Griffiths. In Proceedings of the ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, August 2004.

• Integrating topics and syntax. T. L. Griffiths, M. Steyvers, D. M. Blei, and J. B. Tenenbaum. In Advances in Neural Information Processing Systems 17, 2005.

NEW

Page 239:

Summary

• State-of-the-art probabilistic text models can be constructed from large text data sets
– Can yield better performance than other approaches such as clustering, LSI, etc.
– Advantage of the probabilistic approach is that a wide range of queries can be supported by a single model
– See also recent work by Buntine and colleagues

• Learning algorithms are slow but scalable
– Linear in the number of word tokens (per sampling sweep)
– Applying this type of Monte Carlo statistical learning to millions of words was unheard of a few years ago (a minimal sampler sketch follows below)
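To illustrate the "linear in the number of word tokens" point, here is a minimal collapsed Gibbs sampler for plain LDA (the simplest member of this family, not the author-topic or HMM-composite variants above). Variable names and hyperparameter values are illustrative assumptions; each sweep visits every token exactly once, so its cost is O(total tokens × number of topics).

```python
import numpy as np

def lda_gibbs(docs, V, T, n_iter=200, alpha=0.1, beta=0.01, seed=0):
    """Collapsed Gibbs sampling for LDA.

    docs: list of documents, each a list of word ids in [0, V)
    V: vocabulary size, T: number of topics
    Returns the document-topic and topic-word count matrices.
    """
    rng = np.random.default_rng(seed)
    ndt = np.zeros((len(docs), T))          # document-topic counts
    ntw = np.zeros((T, V))                  # topic-word counts
    nt = np.zeros(T)                        # tokens assigned to each topic
    z = [rng.integers(T, size=len(d)) for d in docs]
    for d, doc in enumerate(docs):          # initialize counts from random z
        for i, w in enumerate(doc):
            t = z[d][i]
            ndt[d, t] += 1; ntw[t, w] += 1; nt[t] += 1
    for _ in range(n_iter):                 # one sweep = one pass over all tokens
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                t = z[d][i]                 # remove the token's current assignment
                ndt[d, t] -= 1; ntw[t, w] -= 1; nt[t] -= 1
                p = (ndt[d] + alpha) * (ntw[:, w] + beta) / (nt + V * beta)
                t = int(rng.choice(T, p=p / p.sum()))
                z[d][i] = t                 # resample and restore the counts
                ndt[d, t] += 1; ntw[t, w] += 1; nt[t] += 1
    return ndt, ntw
```

In practice one would add burn-in, multiple samples, and sparse data structures, but the per-sweep cost already scales with corpus size in exactly the way the slide describes.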

Page 240:

Conclusion

Page 241:

[Diagram: Real World Data and Probabilistic Model, connected by "Modeling" and "Learning".]

NEW

“All models are wrong, but some are useful” (G.E.P. Box)

Page 242:

Concluding Comments

• The probabilistic approach is worthy of inclusion in a data miner's toolbox
– Systematic handling of missing information and uncertainty
– Ability to incorporate prior knowledge
– Integration of different sources of information
– However, not always the best choice for "black-box" predictive modeling

• Graphical models in particular provide:
– A flexible and modular representational language for modeling
– Efficient and general computational inference and learning algorithms

• Many recent advances in theory, algorithms, and applications
– Likely to continue to see advances in new powerful models, more efficient scalable learning algorithms, etc.

Page 243:

Examples of New Research Directions

• Modeling and Learning
– Probabilistic Relational Models: work by Koller et al., Russell et al., etc.
– Conditional Markov Random Fields: information extraction (McCallum et al.)
– Dirichlet processes: flexible non-parametric models (Jordan et al.)
– Combining discriminative and generative models: e.g., Haussler and Jaakkola

• Applications
– Computer vision: particle filters
– Robotics: map learning
– Statistical machine translation
– Biology: learning gene regulation networks
– and many more…

Page 244:

General References

• All of Statistics: A Concise Course in Statistical Inference. L. Wasserman, Springer, 2004.

• Bayesian Data Analysis. A. Gelman, J. B. Carlin, H. S. Stern, and D. B. Rubin, Chapman and Hall, 2nd edition, 2003.

• Learning in Graphical Models. M. I. Jordan (ed.), MIT Press, 1998.

• Graphical models. M. I. Jordan. Statistical Science (Special Issue on Bayesian Statistics), 19, 140-155, 2004.

• The Elements of Statistical Learning: Data Mining, Inference, and Prediction. T. Hastie, R. Tibshirani, and J. H. Friedman, Springer, 2001.

• Recent research: Proceedings of the NIPS and UAI conferences, Journal of Machine Learning Research.

UPDATED