Full Bayesian inference (Learning). Peter Antal, [email protected]. April 8, 2019. A.I.


Page 1: Full Bayesian inference (Learning)

Full Bayesian inference (Learning)

Peter Antal, [email protected]

April 8, 2019, A.I.

Page 2:

Learning paradigms
◦ Learning as inference
◦ Bayesian learning, full Bayesian inference, Bayesian model averaging
◦ Model identification, maximum likelihood learning

Probably Approximately Correct learning


Page 3:

Epicurus' (342? B.C. - 270 B.C.) principle of multiple explanations states that one should keep all hypotheses that are consistent with the data.

The principle of Occam's razor, after William of Occam (c. 1285 - 1349, sometimes spelt Ockham), states that when inferring causes, entities should not be multiplied beyond necessity. This is widely understood to mean: among all hypotheses consistent with the observations, choose the simplest. In terms of a prior distribution over hypotheses, this amounts to giving simpler hypotheses higher a priori probability, and more complex ones lower probability.
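One way to make the last sentence concrete is a prior that decays with description length, so simpler hypotheses automatically get more prior mass. A minimal sketch; the hypothesis names and complexities below are made up for illustration:

```python
# Hypothetical illustration: prior probability decaying with description
# length (in bits), in the spirit of Occam's razor. Names and bit counts
# are assumptions, not from the slides.
hypotheses = {"h_simple": 2, "h_medium": 5, "h_complex": 9}  # bits to describe

weights = {h: 2.0 ** -bits for h, bits in hypotheses.items()}
total = sum(weights.values())
prior = {h: w / total for h, w in weights.items()}

# Simpler hypotheses receive higher prior probability.
assert prior["h_simple"] > prior["h_medium"] > prior["h_complex"]
```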

Page 4:


Page 5:


Page 6:
Page 7:

Russell & Norvig: Artificial Intelligence, ch. 20

Page 8:


Page 9:


Page 10:


Page 11:
Page 12:
Page 13:

[Figure: "sequential likelihood of the given data" plotted for hypotheses h1-h5 over 12 observations; likelihood values range from 0 to 1.]
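The plotted quantity can be sketched in a few lines, assuming the Russell & Norvig candy-bag setup that the legend h1-h5 suggests (five hypotheses with lime fractions 0, 0.25, 0.5, 0.75, 1, and an all-lime observation sequence); the actual data sequence behind the figure is not recoverable from the slide:

```python
# Assumed setup: hypotheses h1..h5 with lime fractions 0, 0.25, 0.5, 0.75, 1,
# observations drawn i.i.d., and an all-lime observation sequence.
lime_fraction = {"h1": 0.0, "h2": 0.25, "h3": 0.5, "h4": 0.75, "h5": 1.0}

def sequential_likelihood(h, n_limes):
    """P(first n_limes observations are all lime | h), for i.i.d. draws."""
    return lime_fraction[h] ** n_limes

# Likelihood of each hypothesis as more lime candies are observed:
for n in range(1, 6):
    print(n, {h: round(sequential_likelihood(h, n), 4) for h in lime_fraction})
```

Under every hypothesis except h5 the likelihood of an all-lime sequence decays toward 0, while under h5 it stays at 1, matching the qualitative shape of such curves.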

Page 14:

Simplest form: learn a function from examples

f is the target function

An example is a pair (x, f(x))

Problem: find a hypothesis h such that h ≈ f, given a training set of examples

(This is a highly simplified model of real learning:
◦ Ignores prior knowledge
◦ Assumes examples are given)
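A minimal sketch of this setup: enumerate a small hypothesis space and keep the hypotheses consistent with the training pairs. The hypothesis space below is an assumption chosen for illustration:

```python
# Training set: pairs (x, f(x)) for the (assumed) target f(x) = x**2.
examples = [(0, 0), (1, 1), (2, 4), (3, 9)]

# A tiny, made-up hypothesis space H.
hypothesis_space = {
    "identity": lambda x: x,
    "square": lambda x: x * x,
    "double": lambda x: 2 * x,
}

# Keep every h that agrees with f on all training examples.
consistent = [name for name, h in hypothesis_space.items()
              if all(h(x) == y for x, y in examples)]
print(consistent)  # → ['square']
```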

Page 15:

Example from concept learning

X: i.i.d. samples
n: sample size
H: hypothesis space

Probably Approximately Correct (PAC) learning

A single estimate of the expected error for a given hypothesis is convergent, but can we estimate the errors for all hypotheses uniformly well?

Page 16:

Assume that the true hypothesis f is an element of the hypothesis space H.

Define the error of a hypothesis h as its misclassification rate:

error(h) = p(h(x) ≠ f(x))

Hypothesis h is approximately correct if

error(h) < ε

(ε is the "accuracy")

For h ∈ H_bad:

error(h) > ε
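The misclassification rate can be estimated by sampling. The target f and hypothesis h below are illustrative assumptions, not from the slides:

```python
import random

# Estimate error(h) = p(h(x) != f(x)) by Monte Carlo sampling.
random.seed(0)

def f(x):          # assumed true concept: x is even
    return x % 2 == 0

def h(x):          # assumed imperfect hypothesis: x divisible by 4
    return x % 4 == 0

samples = [random.randrange(100) for _ in range(10_000)]
error = sum(h(x) != f(x) for x in samples) / len(samples)
print(error)  # ≈ 0.25: h misclassifies exactly the numbers ≡ 2 (mod 4)
```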

Page 17:

H can be separated into the approximately correct hypotheses (error(h) < ε) and H_bad.

By definition, for any h ∈ H_bad the probability of error on a single sample is larger than ε, thus the probability of no error on a single sample is less than (1 − ε).

Page 18:

Thus for n samples, for a given h_b ∈ H_bad:

p(D_n: h_b(x) = f(x) on all samples) ≤ (1 − ε)^n

The probability that some h_b ∈ H_bad survives all n samples can then be bounded as

p(D_n: ∃h_b ∈ H_bad, h_b(x) = f(x)) ≤ |H_bad| (1 − ε)^n ≤ |H| (1 − ε)^n

Page 19:

To have at most δ probability that a bad hypothesis survives, i.e. at least 1 − δ probability of approximate correctness:

|H| (1 − ε)^n ≤ δ

Using (1 − ε)^n ≤ e^(−εn) and expressing the sample size as a function of the accuracy ε and the confidence δ, we get a bound on the sample complexity:

(1/ε)(ln|H| + ln(1/δ)) ≤ n
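The bound is easy to evaluate. The values of |H|, ε, and δ below are illustrative choices, not from the slides:

```python
import math

# PAC sample-complexity bound: n >= (1/eps) * (ln|H| + ln(1/delta)).
def pac_sample_bound(h_size, eps, delta):
    return math.ceil((math.log(h_size) + math.log(1 / delta)) / eps)

# E.g. |H| = 1000 hypotheses, accuracy eps = 0.1, confidence delta = 0.05:
print(pac_sample_bound(h_size=1000, eps=0.1, delta=0.05))  # → 100
```

Note the bound grows only logarithmically in |H|, which is what makes learning over very large hypothesis spaces feasible at all.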

Page 20:

How many distinct concepts/decision trees are there with n Boolean attributes?

= number of Boolean functions

= number of distinct truth tables with 2^n rows = 2^(2^n)

E.g., with 6 Boolean attributes, there are 2^64 = 18,446,744,073,709,551,616 trees
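The count on the slide can be checked directly:

```python
# Number of distinct Boolean functions of n Boolean attributes:
# a truth table has 2**n rows, each row independently 0 or 1.
def num_boolean_functions(n):
    return 2 ** (2 ** n)

print(num_boolean_functions(6))  # → 18446744073709551616, i.e. 2**64
```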

Page 21:

Normative predictive probabilistic inference
◦ performs Bayesian model averaging
◦ implements learning through model posteriors
◦ avoids model identification

Model identification is hard
◦ Probably Approximately Correct learning
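A sketch of Bayesian model averaging in this spirit: predict by weighting each model's prediction with its posterior instead of committing to a single identified model. The priors, hypotheses, and data below are illustrative assumptions in the style of Russell & Norvig ch. 20:

```python
# Assumed candy-bag hypotheses h1..h5 with their lime fractions and priors.
priors = {"h1": 0.1, "h2": 0.2, "h3": 0.4, "h4": 0.2, "h5": 0.1}
lime_fraction = {"h1": 0.0, "h2": 0.25, "h3": 0.5, "h4": 0.75, "h5": 1.0}

def posterior(n_limes):
    """P(h | first n_limes draws were lime), via Bayes' rule."""
    unnorm = {h: priors[h] * lime_fraction[h] ** n_limes for h in priors}
    z = sum(unnorm.values())
    return {h: p / z for h, p in unnorm.items()}

def predict_next_lime(n_limes):
    """Model-averaged prediction: sum_h P(lime | h) * P(h | data)."""
    post = posterior(n_limes)
    return sum(lime_fraction[h] * post[h] for h in priors)

# The averaged prediction rises toward 1 as all-lime evidence accumulates,
# without ever selecting a single "best" hypothesis.
print(round(predict_next_lime(3), 4))
```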
