Proof Methods for Propositional Logic


Outline

Automated Propositional Proof Methods

1. Resolution

2. A Practical Method: WalkSAT


Proof methods

I. Application of Inference Rules

• Each application yields the legitimate (sound) generation of a new sentence from old

• Proof = a sequence of sound inference rule applications

• Proofs can be found using search

— Inference Rules as operators for a standard search algorithm

• Typically requires transforming sentences into a normal form

• Example: Resolution

II. Model Checking Methods

• Examples:

— Truth Table Enumeration (tests satisfiability, validity)

— WalkSat (tests satisfiability)


Resolution

Applies to a DB of sentences in Conjunctive Normal Form (CNF): a conjunction of clauses, where each clause is a disjunction of literals (symbols or negated symbols).

E.g., (A ∨ ¬B) ∧ (B ∨ ¬C ∨ ¬D), where (A ∨ ¬B) is one clause

Resolution inference rule (for CNF):

  l1 ∨ … ∨ li ∨ … ∨ lk,    m1 ∨ … ∨ mj ∨ … ∨ mn
  -----------------------------------------------------------
  l1 ∨ … ∨ li-1 ∨ li+1 ∨ … ∨ lk ∨ m1 ∨ … ∨ mj-1 ∨ mj+1 ∨ … ∨ mn

where li and mj are complementary literals, i.e. li = ¬mj

e.g.  P1,3 ∨ P2,2,    ¬P2,2
      ----------------------
               P1,3

Resolution is sound and complete for propositional logic.
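To make the rule concrete, here is a minimal Python sketch of binary resolution, assuming a representation chosen for illustration (not taken from the course): a clause is a frozenset of string literals, with "~" marking negation.

# A minimal sketch of binary resolution on CNF clauses.
# A clause is a frozenset of literals; a literal is a string such as "P12" or "~P12".

def negate(literal):
    """Return the complementary literal."""
    return literal[1:] if literal.startswith("~") else "~" + literal

def resolve(ci, cj):
    """Return all clauses obtainable by resolving ci and cj
    on one pair of complementary literals."""
    resolvents = []
    for lit in ci:
        if negate(lit) in cj:
            # Drop the complementary pair and union the remaining literals.
            resolvents.append(frozenset((ci - {lit}) | (cj - {negate(lit)})))
    return resolvents

# Example from the slide: resolving P1,3 ∨ P2,2 with ¬P2,2 yields P1,3.
print(resolve(frozenset({"P13", "P22"}), frozenset({"~P22"})))  # [frozenset({'P13'})]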


Soundness of resolution inference rule

If li = ¬mj, write the two premise clauses as

  (l1 ∨ … ∨ li-1 ∨ li+1 ∨ … ∨ lk) ∨ li

  mj ∨ (m1 ∨ … ∨ mj-1 ∨ mj+1 ∨ … ∨ mn)

If li is true, then mj is false, so (m1 ∨ … ∨ mj-1 ∨ mj+1 ∨ … ∨ mn) must be true; if li is false, then (l1 ∨ … ∨ li-1 ∨ li+1 ∨ … ∨ lk) must be true. Either way the resolvent

  (l1 ∨ … ∨ li-1 ∨ li+1 ∨ … ∨ lk) ∨ (m1 ∨ … ∨ mj-1 ∨ mj+1 ∨ … ∨ mn)

is true, given that both premise clauses hold.


Review: Validity and satisfiability

A sentence is valid if it is true in all models,
e.g., True, A ∨ ¬A, A ⇒ A, (A ∧ (A ⇒ B)) ⇒ B

Validity is connected to inference via the Deduction Theorem:
KB ╞ α if and only if (KB ⇒ α) is valid

A sentence is satisfiable if it is true in some model,
e.g., A ∨ B, C

A sentence is unsatisfiable if it is false in all models,
e.g., A ∧ ¬A

Satisfiability is connected to inference via the following:
KB ╞ α if and only if (KB ∧ ¬α) is unsatisfiable
(there is no model in which KB is true and α is false)
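As an illustration of this connection, here is a small model-checking sketch (written for these notes, not the course's code): it tests KB ╞ α by enumerating every truth assignment and checking that KB ∧ ¬α has no model.

from itertools import product

def entails(kb, alpha, symbols):
    """kb and alpha are functions mapping a model (dict: symbol -> bool) to bool."""
    for values in product([True, False], repeat=len(symbols)):
        model = dict(zip(symbols, values))
        if kb(model) and not alpha(model):   # found a model of KB ∧ ¬α
            return False                     # so KB does not entail α
    return True                              # KB ∧ ¬α is unsatisfiable, so KB ╞ α

# Example: A ∧ (A ⇒ B) entails B.
kb = lambda m: m["A"] and ((not m["A"]) or m["B"])
alpha = lambda m: m["B"]
print(entails(kb, alpha, ["A", "B"]))        # True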


Proof by Resolution: Proof by contradiction

I.e., prove α by showing that KB ∧ ¬α is unsatisfiable.

Example: KB = (B1,1 ⇔ (P1,2 ∨ P2,1)) ∧ ¬B1,1

• Prove α = ¬P1,2

KB in Conjunctive Normal Form:
(¬B1,1 ∨ P1,2 ∨ P2,1) ∧ (¬P1,2 ∨ B1,1) ∧ (¬P2,1 ∨ B1,1) ∧ ¬B1,1

Negate α: ¬α = P1,2



Conversion to CNF: General Procedure

Example: B1,1 ⇔ (P1,2 ∨ P2,1)

1. Eliminate ⇔, replacing α ⇔ β with (α ⇒ β) ∧ (β ⇒ α):
   (B1,1 ⇒ (P1,2 ∨ P2,1)) ∧ ((P1,2 ∨ P2,1) ⇒ B1,1)

2. Eliminate ⇒, replacing α ⇒ β with ¬α ∨ β:
   (¬B1,1 ∨ P1,2 ∨ P2,1) ∧ (¬(P1,2 ∨ P2,1) ∨ B1,1)

3. Move ¬ inwards using de Morgan's rules and (often, but not here) double-negation elimination:
   (¬B1,1 ∨ P1,2 ∨ P2,1) ∧ ((¬P1,2 ∧ ¬P2,1) ∨ B1,1)

4. Flatten by applying the distributivity law (∨ over ∧):
   (¬B1,1 ∨ P1,2 ∨ P2,1) ∧ (¬P1,2 ∨ B1,1) ∧ (¬P2,1 ∨ B1,1)
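For comparison, the same conversion can be checked mechanically; the sketch below assumes the sympy library (not part of the course material) and uses its to_cnf function on the example sentence.

from sympy import symbols
from sympy.logic.boolalg import Equivalent, Or, to_cnf

B11, P12, P21 = symbols("B11 P12 P21")
sentence = Equivalent(B11, Or(P12, P21))     # B1,1 ⇔ (P1,2 ∨ P2,1)

# Prints a conjunction equivalent to (¬B1,1 ∨ P1,2 ∨ P2,1) ∧ (B1,1 ∨ ¬P1,2) ∧ (B1,1 ∨ ¬P2,1)
print(to_cnf(sentence))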


For convenience: Logical equivalence

To manipulate logical sentences we need some rewrite rules.

Two sentences are logically equivalent iff they are true in the same models:

α ≡ β iff α ╞ β and β ╞ α

(I told you you needed to know these!)


Resolution algorithm

Iteratively apply resolution to all pairs of clauses in KB ∧ ¬α, adding each new resolvent, until either no new clauses can be added (so KB does not entail α) or the empty clause is produced, e.g. when A and ¬A will have just been resolved (so KB entails α).
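The loop can be sketched in Python as follows, reusing resolve() and negate() from the earlier sketch; for simplicity this version assumes the query α is a single literal (an assumption made here, not in the slides).

def pl_resolution(kb_clauses, alpha_literal):
    """Prove KB ╞ alpha by deriving the empty clause from KB ∧ ¬alpha."""
    clauses = set(kb_clauses) | {frozenset({negate(alpha_literal)})}   # KB ∧ ¬alpha
    while True:
        new = set()
        for ci in clauses:
            for cj in clauses:
                if ci == cj:
                    continue
                for resolvent in resolve(ci, cj):
                    if not resolvent:        # empty clause: contradiction, so KB ╞ alpha
                        return True
                    new.add(resolvent)
        if new.issubset(clauses):            # fixed point: no new clauses, KB does not entail alpha
            return False
        clauses |= new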


Resolution example

KB = (B1,1 ⇔ (P1,2 ∨ P2,1)) ∧ ¬B1,1    α = ¬P1,2
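Running the pl_resolution sketch on this KB, with the same hypothetical string encoding of literals as before, derives the empty clause:

kb = [
    frozenset({"~B11", "P12", "P21"}),   # ¬B1,1 ∨ P1,2 ∨ P2,1
    frozenset({"~P12", "B11"}),          # ¬P1,2 ∨ B1,1
    frozenset({"~P21", "B11"}),          # ¬P2,1 ∨ B1,1
    frozenset({"~B11"}),                 # ¬B1,1
]
print(pl_resolution(kb, "~P12"))         # True: KB entails ¬P1,2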


The WalkSAT algorithm

A practical, simple algorithm to determine satisfiability for propositional logic

• Sound

• Incomplete

A hill-climbing search algorithm: a balance between greediness and randomness

• Evaluation function: the min-conflict heuristic of minimizing the number of unsatisfied clauses

• Uses random jumps to escape local minima
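A minimal Python sketch along these lines is given below; the helper names and the clause encoding are invented for this sketch, and p and max_flips are the usual parameters (probability of a random-walk step, and a flip budget).

import random

def walksat(clauses, symbols, p=0.5, max_flips=10000):
    """clauses: list of clauses, each a list of (symbol, is_positive) literals.
    Returns a satisfying model (dict) or None if none was found (incomplete!)."""
    model = {s: random.choice([True, False]) for s in symbols}

    def satisfied(clause):
        return any(model[s] == positive for s, positive in clause)

    for _ in range(max_flips):
        unsatisfied = [c for c in clauses if not satisfied(c)]
        if not unsatisfied:
            return model                        # success: every clause is satisfied
        clause = random.choice(unsatisfied)     # pick a random unsatisfied clause
        if random.random() < p:
            symbol = random.choice(clause)[0]   # random walk: flip a random symbol in it
        else:
            # Greedy step (min-conflict): flip the symbol in the clause that
            # minimizes the number of unsatisfied clauses.
            def conflicts(s):
                model[s] = not model[s]
                count = sum(not satisfied(c) for c in clauses)
                model[s] = not model[s]
                return count
            symbol = min((s for s, _ in clause), key=conflicts)
        model[symbol] = not model[symbol]
    return None                                 # failure: cannot conclude "unsatisfiable"

# Example: (A ∨ B) ∧ (¬A ∨ B) ∧ (A ∨ ¬B) is satisfied by A = True, B = True.
print(walksat([[("A", True), ("B", True)],
               [("A", False), ("B", True)],
               [("A", True), ("B", False)]], ["A", "B"]))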



Hard satisfiability problems

Consider random 3-CNF sentences, e.g.,

(¬D ∨ ¬B ∨ C) ∧ (B ∨ ¬A ∨ ¬C) ∧ (¬C ∨ ¬B ∨ E) ∧ (E ∨ ¬D ∨ B) ∧ (B ∨ E ∨ ¬C)

m = number of clauses

n = number of symbols

• Hard problems seem to cluster near m/n = 4.3 (the critical point)

• Here: m = 5, n = |{A,B,C,D,E}| = 5, so m/n = 1
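To experiment with this, here is a tiny generator sketch (names invented here) for random 3-CNF sentences in the clause format used by the WalkSAT sketch above.

import random

def random_3cnf(m, n):
    """Return m random 3-literal clauses over symbols x1..xn,
    each clause a list of (symbol, is_positive) pairs."""
    symbols = [f"x{i}" for i in range(1, n + 1)]
    return [[(s, random.random() < 0.5) for s in random.sample(symbols, 3)]
            for _ in range(m)]

# Near the critical point for n = 50 symbols: m/n = 215/50 = 4.3
# walksat(random_3cnf(215, 50), [f"x{i}" for i in range(1, 51)])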



Hard satisfiability problems

Median runtime for 100 satisfiable random 3-CNF sentences, n = 50


Encoding Wumpus in propositional logic

4x4 Wumpus World

• The “physics” of the game:

• At least one wumpus on board:
— W1,1 ∨ W1,2 ∨ W1,3 ∨ ... ∨ W4,4

• At most one wumpus on board (for any two squares, at least one is wumpus-free):
— n² rules like: ¬W1,1 ∨ ¬W1,2

• No instant death (the start square is safe):
— ¬P1,1
— ¬W1,1
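The following sketch (a hypothetical encoding, with "W11" standing for W1,1 and "~" for negation) generates these location axioms as CNF clauses and shows how quickly they multiply.

from itertools import combinations

squares = [(x, y) for x in range(1, 5) for y in range(1, 5)]      # the 4x4 board
W = {sq: f"W{sq[0]}{sq[1]}" for sq in squares}

# At least one wumpus: a single big disjunction over all 16 squares.
at_least_one = frozenset(W[sq] for sq in squares)

# At most one wumpus: for any two squares, at least one is wumpus-free.
at_most_one = [frozenset({"~" + W[a], "~" + W[b]})
               for a, b in combinations(squares, 2)]

print(len(at_most_one))   # 120 pairwise clauses for just this one axiom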


KB contains "physics" sentences for every single square

Rapid proliferation of clauses

Expressiveness limitation of propositional logic


Forward and backward chaining

Horn Clause (restricted)

• Horn clause:

— proposition symbol

— (conjunction of symbols) ⇒ symbol

• E.g.: A ∧ B ⇒ B,  A ∧ C ∧ D ⇒ B

KB = conjunction of Horn clauses, e.g., C ∧ (B ⇒ A) ∧ (C ∧ D ⇒ B)

Modus Ponens (for Horn Form): complete for Horn KBs

  α1, … , αn,    α1 ∧ … ∧ αn ⇒ β
  --------------------------------
                β

Used with forward chaining or backward chaining.

These algorithms are very natural and run in linear time


Forward chaining

Idea: Apply Modus Ponens to any Horn clause whose premises are satisfied in the KB

• Add its conclusion to the KB, until query is found

• Easy to visualize informally in graphical form:


Forward chaining algorithm

Forward chaining is sound and complete for Horn KB
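A minimal Python sketch of forward chaining is shown below, using an encoding invented here: each rule is a (premise_symbols, conclusion_symbol) pair and facts are bare symbols.

def fc_entails(rules, facts, query):
    """Return True if the Horn KB (rules plus facts) entails the query symbol."""
    count = {i: len(premises) for i, (premises, _) in enumerate(rules)}  # unmet premises per rule
    inferred = set()
    agenda = list(facts)
    while agenda:
        p = agenda.pop()
        if p == query:
            return True
        if p in inferred:
            continue
        inferred.add(p)
        for i, (premises, conclusion) in enumerate(rules):
            if p in premises:
                count[i] -= 1
                if count[i] == 0:            # all premises known: fire the rule
                    agenda.append(conclusion)
    return False

# The Horn KB from the earlier slide, C ∧ (B ⇒ A) ∧ (C ∧ D ⇒ B),
# with an extra fact D added here so that the query A becomes derivable.
rules = [(["B"], "A"), (["C", "D"], "B")]
print(fc_entails(rules, ["C", "D"], "A"))    # True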

Forward chaining example


Proof of completeness

FC derives every atomic sentence that is entailed by KB

1. FC reaches a fixed point where no new atomic sentences are derived

2. Consider the final state as a model m, assigning true to every inferred symbol and false to all others

3. Every clause a1 ∧ … ∧ ak ⇒ b in the original KB is true in m
   (otherwise a1 … ak would all be inferred while b is not, contradicting the fixed point)

4. Hence m is a model of KB

5. If KB ╞ q, then q is true in every model of KB, including m; since only inferred symbols are true in m, FC has derived q


Backward chaining

Idea: work backwards from the query q: to prove q by BC,

check if q is known already, or

prove by BC all premises of some rule concluding q

Avoid loops: check if new subgoal is already on the goal stack

Avoid repeated work: check if new subgoal

1. has already been proved true, or

2. has already failed
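A matching backward-chaining sketch over the same invented encoding is below; it implements the goal-stack loop check, while the repeated-work caching is left out for brevity.

def bc_entails(rules, facts, query, stack=frozenset()):
    """Return True if the Horn KB (rules plus facts) entails the query symbol."""
    if query in facts:
        return True                          # q is known already
    if query in stack:
        return False                         # avoid loops: q is already a pending subgoal
    for premises, conclusion in rules:
        if conclusion == query and all(
                bc_entails(rules, facts, p, stack | {query}) for p in premises):
            return True                      # all premises of some rule concluding q hold
    return False

rules = [(["B"], "A"), (["C", "D"], "B")]
print(bc_entails(rules, {"C", "D"}, "A"))    # True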

Backward chaining example


Forward vs. backward chaining

FC is data-driven, automatic, unconscious processing,
• e.g., object recognition, routine decisions

May do lots of work that is irrelevant to the goal

BC is goal-driven, appropriate for problem-solving,
• e.g., Where are my keys? How do I get into a PhD program?

Complexity of BC can be much less than linear in the size of the KB