Uncertain Reasoning – CPSC 315 – Programming Studio, Spring 2009, Project 2, Lecture 6


Page 1

Uncertain Reasoning

CPSC 315 – Programming Studio

Spring 2009

Project 2, Lecture 6

Page 2

Reasoning in Complex Domains or Situations

Reasoning often involves moving from evidence about the world to decisions

Systems almost never have access to the whole truth about their environment

Reasons for lack of knowledge:
  Cost/benefit trade-off in knowledge engineering – less likely, less influential factors are often not included in the model
  No complete theory of the domain – complete theories are few and far between
  Incomplete knowledge of the situation – acquiring all knowledge of the situation is impractical

Page 3

Forms of Uncertain Reasoning

Partially believed domain features
  E.g., chance of rain = 80%
  Probability (the focus of today's lecture)
  Other approaches (we will return to these)

Partially true domain features
  E.g., cloudy = 0.8
  Fuzzy logic (outside the scope of this class)

Page 4

Making Decisions to Meet Goals

Decision theory = Probability theory + Utility theory

Decisions – the outcome of the system's reasoning: actions to take or avoid

Probability – how the system reasons

Utility – the system's goals / preferences

Page 5

Quick Question

You go to the doctor and are tested for a disease. The test is 98% accurate if you have the disease. 3.6% of the population has the disease while 4% of the population tests positive.

How likely is it you have the disease?

Page 6

Quick Question 2

You go to the doctor and are tested for a disease. The test is 98% accurate if you have the disease. 3.6% of the population has the disease while 7% of the population tests positive.

How likely is it you have the disease?

Page 7

Basics of Probability

Unconditional or prior probability
  Degree of belief that something is true in the absence of any other information
  P(cavity = true) = 0.1, or P(cavity) = 0.1
  This implies P(not cavity) = 0.9

Page 8

Basics of Probability

Unconditional or prior probability can be defined over a set of values:
  P(Weather = sunny) = 0.7
  P(Weather = rain) = 0.2
  P(Weather = cloudy) = 0.08
  P(Weather = snow) = 0.02
  Note: Weather can take only a single value – the system must know that rain and snow imply clouds
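As a minimal sketch (not from the slides), this prior distribution could be stored as a simple table; the variable and value names below are illustrative assumptions:

    # Prior (unconditional) distribution over the single-valued variable Weather.
    weather_prior = {
        "sunny": 0.70,
        "rain": 0.20,
        "cloudy": 0.08,
        "snow": 0.02,
    }

    # Because Weather takes exactly one value, the probabilities must sum to 1.
    assert abs(sum(weather_prior.values()) - 1.0) < 1e-9

    # The prior of "not sunny" is the complement of the prior of "sunny".
    p_not_sunny = 1.0 - weather_prior["sunny"]   # 0.30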

Page 9

Basics of Probability

Conditional or posterior probability
  Degree of belief that something is true given knowledge about the situation
  P(cavity | toothache) = 0.8

Mathematically, we know P(a | b) = P(a ^ b) / P(b)

This requires the system to know the unconditional probability of combinations of features, and that knowledge grows exponentially with the size of the feature set.
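A minimal sketch of that definition in Python; the joint probabilities below are made-up illustrative numbers (not from the lecture), chosen only so the result matches the slide's P(cavity | toothache) = 0.8. It also hints at the exponential cost: a full joint table needs 2**n entries for n Boolean features.

    # A full joint distribution over two Boolean features: (cavity, toothache).
    # Illustrative numbers only; a domain with n Boolean features would need 2**n entries.
    joint = {
        (True,  True):  0.08,   # cavity and toothache
        (True,  False): 0.02,   # cavity, no toothache
        (False, True):  0.02,   # no cavity, toothache
        (False, False): 0.88,   # neither
    }

    def p_cavity_given_toothache(joint):
        """P(cavity | toothache) = P(cavity ^ toothache) / P(toothache)."""
        p_both = joint[(True, True)]
        p_toothache = joint[(True, True)] + joint[(False, True)]
        return p_both / p_toothache

    print(round(p_cavity_given_toothache(joint), 3))   # 0.8, matching the slide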

Page 10

Bayes’ Rule

Remember: P(a | b) = P(a ^ b) / P(b)

This can be rewritten as P(a ^ b) = P(a | b) * P(b)

Swapping a and b yields P(a ^ b) = P(b | a) * P(a)

Thus P(b | a) * P(a) = P(a | b) * P(b)

Rewriting, we get Bayes' Rule: P(b | a) = P(a | b) * P(b) / P(a)
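As a minimal sketch, Bayes' Rule can be wrapped in a small helper function (the name bayes is my own, not from the slides); it is reused in the worked examples that follow:

    def bayes(p_a_given_b, p_b, p_a):
        """Bayes' Rule: P(b | a) = P(a | b) * P(b) / P(a)."""
        return p_a_given_b * p_b / p_a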

Page 11

Reasoning with Bayes’ Rule

Bayes' Rule: P(b | a) = P(a | b) * P(b) / P(a)

Example – let's take:
  P(disease) = 0.036
  P(test) = 0.04
  P(test | disease) = 0.98
  P(disease | test) = ?

Page 12

Reasoning with Bayes' Rule

Bayes' Rule: P(b | a) = P(a | b) * P(b) / P(a)

Example:
  P(disease) = 0.036
  P(test) = 0.04
  P(test | disease) = 0.98
  P(disease | test) = P(test | disease) * P(disease) / P(test) = 0.98 * 0.036 / 0.04 = 88.2%
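Using the hypothetical bayes helper sketched earlier, the same computation:

    p_disease = 0.036
    p_test = 0.04
    p_test_given_disease = 0.98

    p_disease_given_test = bayes(p_test_given_disease, p_disease, p_test)
    print(f"{p_disease_given_test:.1%}")   # 88.2%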

Page 13

Reasoning with Bayes' Rule

What if the test has more false positives, but is still 98% accurate for those with the disease?

Example:
  P(disease) = 0.036
  P(test) = 0.07
  P(test | disease) = 0.98
  P(disease | test) = P(test | disease) * P(disease) / P(test) = 0.98 * 0.036 / 0.07 = 50.4%

Page 14

Reasoning with Bayes' Rule

What if the test has more false negatives – now only 90% accurate for those with the disease?

Example:
  P(disease) = 0.036
  P(test) = 0.04
  P(test | disease) = 0.90
  P(disease | test) = P(test | disease) * P(disease) / P(test) = 0.90 * 0.036 / 0.04 = 81%
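The three scenarios side by side, again with the hypothetical bayes helper (the numbers are those on the slides):

    scenarios = [
        # (label, P(test | disease), P(disease), P(test))
        ("baseline",             0.98, 0.036, 0.04),
        ("more false positives", 0.98, 0.036, 0.07),
        ("more false negatives", 0.90, 0.036, 0.04),
    ]

    for name, p_t_given_d, p_d, p_t in scenarios:
        print(f"{name}: P(disease | test) = {bayes(p_t_given_d, p_d, p_t):.1%}")
    # baseline: 88.2%, more false positives: 50.4%, more false negatives: 81.0%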

Page 15

Combining Evidence

What happens when we have more than one piece of evidence?
  Example: a toothache and the dentist's tool catches on the tooth
  P(cavity | toothache ^ catch) = ?

Problem: toothache and catch are not independent – if someone has a toothache, there is a greater chance the tool will catch, and vice versa

Page 16

Independence of Events

Independence of features / events
  Independent features / events cannot be used to predict each other
  Example: the values rolled on two separate dice
  Example: hair color and food preference

Probabilistic reasoning works because systems divide the domain into independent sub-domains
  They then do not need exponentially increasing amounts of data to understand interactions
  Unfortunately, the non-independent sub-domains can still be huge (have many interacting features)
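A minimal sketch of the dice example: enumerating the 36 equally likely outcomes of two fair dice shows the joint probability factoring into the product of the individual probabilities (the specific values 3 and 5 are arbitrary choices of mine):

    from itertools import product

    # All 36 equally likely outcomes of rolling two fair dice.
    outcomes = list(product(range(1, 7), repeat=2))

    p_a_is_3 = sum(1 for a, b in outcomes if a == 3) / len(outcomes)             # 1/6
    p_b_is_5 = sum(1 for a, b in outcomes if b == 5) / len(outcomes)             # 1/6
    p_both   = sum(1 for a, b in outcomes if a == 3 and b == 5) / len(outcomes)  # 1/36

    # Independence: the joint probability equals the product of the individual ones.
    assert abs(p_both - p_a_is_3 * p_b_is_5) < 1e-12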

Page 17

Conditional Independence

What happens when we have more than one piece of evidence?
  Example: a toothache and the tool catches on the tooth
  P(cavity | toothache ^ catch) = ?

Conditional independence
  Assume an indirect relationship
  Example: toothache and catch are both caused by cavity, but not by any other feature

Then P(toothache ^ catch | cavity) = P(toothache | cavity) * P(catch | cavity)

Page 18

Conditional Independence

This lets us say:
  P(toothache ^ catch | cavity) = P(toothache | cavity) * P(catch | cavity)

So, by Bayes' Rule:
  P(cavity | toothache ^ catch)
  = P(toothache ^ catch | cavity) * P(cavity) / P(toothache ^ catch)
  = P(toothache | cavity) * P(catch | cavity) * P(cavity) / P(toothache ^ catch)

This avoids requiring the system to have data on all permutations of the evidence.

Difficulty: how true is this independence assumption? What about a chipped or cracked tooth?
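A minimal sketch of this combination step, assuming conditional independence. All the probabilities below are made-up illustrative values, not numbers from the slides; normalizing over cavity and not-cavity plays the role of dividing by P(toothache ^ catch):

    # Made-up illustrative probabilities (not from the slides).
    p_cavity = 0.1
    p_toothache_given = {True: 0.8, False: 0.05}   # P(toothache | cavity) / P(toothache | no cavity)
    p_catch_given     = {True: 0.7, False: 0.10}   # P(catch | cavity)     / P(catch | no cavity)

    # Unnormalized score P(toothache | c) * P(catch | c) * P(c) for each value of cavity,
    # using the conditional-independence assumption from the slide.
    score = {
        c: p_toothache_given[c] * p_catch_given[c] * (p_cavity if c else 1 - p_cavity)
        for c in (True, False)
    }

    # Normalizing by the sum is equivalent to dividing by P(toothache ^ catch).
    p_cavity_given_evidence = score[True] / (score[True] + score[False])
    print(f"P(cavity | toothache ^ catch) = {p_cavity_given_evidence:.2f}")   # ~0.93 with these numbers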

Page 19

Human Reasoning

Studies show that people, without training and prompting, do not reason probabilistically
  People make incorrect inferences when confronted with probabilities like those on the last few slides
  If asked for all prior and posterior probabilities, they will posit systems with rather large inconsistencies

Page 20

Human Reasoning

Studies show that people, without training, do not reason probabilistically

Some systems have used non-probabilistic forms of uncertain reasoning
  Qualitative categories rather than numbers: must be true, highly likely, likely, some chance, unlikely, virtually impossible, impossible
  Rules for how these categories combine are based on human reasoning (one possible scheme is sketched below)

The value of this approach depends on where the belief values come from
  If the belief values come from external evidence about the world, use probability
  If the belief values are provided by a user, a non-probabilistic approach may do better
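The slides do not specify the combination rules, so the following is only an illustrative assumption of one possible scheme (not the course's): treat the categories as an ordered scale, let two pieces of supporting evidence keep the stronger belief, and let a conclusion that needs both of two premises be only as strong as the weaker one.

    # Ordered qualitative belief scale, weakest to strongest (categories from the slide).
    SCALE = ["impossible", "virtually impossible", "unlikely", "some chance",
             "likely", "highly likely", "must be true"]
    RANK = {label: i for i, label in enumerate(SCALE)}

    def combine_evidence(a, b):
        """Two pieces of evidence for the same conclusion: keep the stronger belief
        (one illustrative rule, not taken from the slides)."""
        return a if RANK[a] >= RANK[b] else b

    def conjoin(a, b):
        """A conclusion needing both premises is only as strong as the weaker one."""
        return a if RANK[a] <= RANK[b] else b

    print(combine_evidence("likely", "some chance"))   # likely
    print(conjoin("highly likely", "unlikely"))        # unlikely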