Knowledge Representation


Page 1:

Knowledge Representation

Page 2:

Key Issue in AI

1. Mapping between objects and relations in a problem domain and computational objects and relations in a program.

2. Results of inferences on the knowledge base (KB) should correspond to the results of actions or observations in the world.

Page 3:

Have Already Examined Two (related) KR Schemes

First Order Predicate Logic
Production Systems

Page 4:

We'll look at four others

1. Semantic Nets
2. Conceptual Dependency Schemes
3. Frames
4. Scripts

1 & 2 are called network schemes
3 & 4 are called structured schemes (alternatively, slot and filler schemes)

Page 5:

Problems with FOPL

1. Emphasis is on truth-preserving relations

2. Sometimes at odds with the way that humans acquire and use knowledge

3. Leads to problems in mapping human language to FOPL

Page 6:

For Example

If … then

In English suggests causality

But

In FOPL it specifies a relationship between the truth values of the antecedent and the consequent:

(2 + 2 = 5) → color(elephants, green)

This is true (the antecedent is false, so the implication holds), but it has no commonsense meaning.
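For reference, the standard truth table for material implication (added here; it was not on the original slide) makes the point explicit: a false antecedent makes the whole conditional true.

```latex
% Truth table for material implication p -> q.
% Rows 3 and 4 show why a false antecedent (such as 2 + 2 = 5)
% makes the conditional vacuously true.
\begin{tabular}{cc|c}
  $p$ & $q$ & $p \rightarrow q$ \\ \hline
  T & T & T \\
  T & F & F \\
  F & T & T \\
  F & F & T \\
\end{tabular}
```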

Page 7:

Categorical Interlude

Category
– A group of objects that seem to go together
– Because they have significant attributes in common
– Example: DOG

Page 8:

Allows us to use our finite mental resources efficiently
– When identifying an object, we can abstract key attributes from all the sensory information presented to us.
– I am trying to determine whether that flying object is a bird or a wasp.
– I don't care that robins have orange breasts and sparrows have grey.
– What matters are those attributes of category bird that exclude instances of category wasp.

Page 9:

Categories license inductive inferences
– Most birds pose no threat to humans
– Common wasps do
– Inference from category wasp tells us to avoid its members

Page 10:

Gelman & Markman's Experiment

Children were
– Shown a picture of a fish
– Told that it breathes under water
– Shown a picture of a dolphin
– Told that it breathes by jumping out of the water
– Shown a picture of a shark
– Told that it is a fish (though it looks like a dolphin)
– Were asked how it breathes
– Answered "Under water."

Page 11:

Semantic Nets

Proposed by Quillian in the late 1960s
Tries to provide a formalism that captures taxonomic hierarchies
A graph where
– Nodes are categories
– Arcs are of three types
  - Isa links, indicating a subset relationship (a dog isa mammal)
  - Inst links, indicating an element-set relationship (Mazel is a dog)
  - Attribute links, indicating a property held by a category (Simcha is grey)

Page 12:

Example

[Semantic net diagram: nodes thing, animate thing, inanimate thing, animal, plant, Ponderosa pine, furniture, table, block, Table_1, and Block_1, connected by isa and instance_of arcs, with attribute links for color (green), shape (cubic), and supported_by relations involving legs, Table_1, and Block_1.]

Page 13:

In (what else?) Prolog
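The Prolog code for this slide did not survive in the transcript. Below is a minimal sketch of how the example net might be encoded, assuming isa/2, instance_of/2, and attribute/3 predicates (the predicate names and the exact facts are chosen here for illustration; they are not the original slide's code):

```prolog
% Subset (isa) links from the example net.
isa(animate_thing, thing).
isa(inanimate_thing, thing).
isa(animal, animate_thing).
isa(plant, animate_thing).
isa(ponderosa_pine, plant).
isa(furniture, inanimate_thing).
isa(block, inanimate_thing).
isa(table, furniture).

% Element-set (instance_of) links.
instance_of(table_1, table).
instance_of(block_1, block).

% Attribute links: attribute(Node, Attribute, Value).
attribute(plant, color, green).
attribute(block, shape, cubic).
attribute(table_1, supported_by, legs).
attribute(block_1, supported_by, table_1).

% Inheritance: a property holds of X if it is asserted directly,
% or if it holds of X's class or of any superclass reachable via isa.
holds(X, Attr, Value) :- attribute(X, Attr, Value).
holds(X, Attr, Value) :- instance_of(X, Class), holds(Class, Attr, Value).
holds(X, Attr, Value) :- isa(X, Super), holds(Super, Attr, Value).
```

With this encoding, a query such as ?- holds(ponderosa_pine, color, C). succeeds with C = green by climbing the isa chain.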

Page 14:

Strengths of Semantic Nets

1. Provides for inheritance

2. Organizes knowledge using interconnected concepts

3. Lets us discover relationships between pairs of concepts (Block_1 and Table_1 are both inanimate things and are supported by legs)

Page 15:

Weaknesses of Semantic Nets

1. Generality of the attribute links

2. As the task grows in complexity, so does the representation

3. No systematic basis for structuring semantic relationships

4. Puts the burden of constructing facts & links on the programmer

Page 16:

Key Issue

Isolation of primitives for semantic network languages

Primitives are those things that the interpreter is programmed in advance to understand.

We need a more systematic basis for structuring semantic relationships

Page 17:

Case Structure Grammars
– C.J. Fillmore, 1968
– Verb-oriented (as opposed to concept-oriented)
– Sentences are represented as verb nodes with links to specific roles played by nouns and noun phrases

Important links
– Agent
– Object
– Instrument
– Location
– Time

Page 18:

"Mary caught the ball with her glove."

[Case-frame diagram: a catch node with agent → Mary, object → ball, instrument → glove, and time → past.]
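Such a case frame drops straight into a relational encoding. Here is a small sketch using a hypothetical verb_frame/2 predicate (the names are illustrative, not Fillmore's own notation):

```prolog
:- use_module(library(lists)).  % member/2

% "Mary caught the ball with her glove." as a case frame:
% a verb node plus a list of Role-Filler pairs.
verb_frame(catch_1,
           [ verb       - catch,
             time       - past,
             agent      - mary,
             object     - ball,
             instrument - glove ]).

% Look up the filler of a given role in a given frame.
role(FrameId, Role, Filler) :-
    verb_frame(FrameId, Slots),
    member(Role - Filler, Slots).
```

A query like ?- role(catch_1, instrument, X). then returns X = glove.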

Page 19:

Advantages

The representational language captures some of the deep structure of natural languages (e.g., the relationship between any verb and its subject is the agent relationship)

This deep structure is independent of any particular sentence or even of any particular language

Page 20:

Leading To
Conceptual Dependency Theory
– Associated with Roger Schank (then of Yale, most recently of Northwestern)
– Attempts to model the semantic structure of natural language
– Attempts to provide a canonical form for the meaning of sentences
– That is, all sentences that mean the same thing (whatever that means) will be represented internally by identical graphs
– The idea is to parse two sentences that use different words but mean the same thing into identical internal representations
– Example: "John gave the book to Mary" / "Mary was given the book by John."

Page 21:

Primitives in CD Theory

ACTs – actions
PPs – picture producers
AAs – modifiers of actions (action aiders)
PAs – modifiers of objects (picture aiders)

Page 22:

Further Breakdown

Each of these classes has a well-defined number of primitives (Luger, pp. 236-37)

All ACTs (actions) can be reduced to:
1. ATRANS – transfer a relationship (give)
2. PTRANS – transfer a physical location (go)
3. PROPEL – apply physical force (push)
4. MOVE – move a body part by its owner (kick)
…
12. ATTEND – focus a sense organ (listen)

Page 23:

Yet More

[CD graph notation, originally drawn with arrow symbols:]
– Arrows indicate the direction of dependency
– P indicates past; F indicates future
– A double arrow indicates the agent-verb relationship
– An arrow marked o (ACT → PP) indicates the object of an action
– The agent's instrument is an arrow pointing left

Page 24:

[Diagram: the recipient (R) of an action, shown as an ACT with arrows to two PPs (the donor and the recipient).]

Page 25:

"John gave the book to Mary."

[CD graph: John ⇔ ATRANS (p), with an o link to book and an R link from John (donor) to Mary (recipient).]
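To make the canonical-form idea concrete, here is a sketch of how both phrasings of the sentence could reduce to one internal term (the cd/6 representation is illustrative, not Schank's actual graph notation):

```prolog
% Both "John gave the book to Mary" and "Mary was given the book by John"
% reduce to the same conceptual dependency term.
% cd(Act, Actor, Object, From, To, Tense)
cd(atrans, john, book, john, mary, past).

% Once sentences share a canonical form, simple questions can be
% answered from it, e.g. "Who has the book now?"
has_now(Person, Object) :- cd(atrans, _Actor, Object, _From, Person, past).
```

The query ?- has_now(Who, book). then answers Who = mary regardless of which surface sentence was parsed.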

Page 26:

Basic Idea

1. Parse the sentence

2. Fit it into canonical form

3. Group sentences with similar meanings

Page 27:

Strengths

1. Provides a formal theory of language semantics

2. Reduces the problem of ambiguity

3. Attempts to reduce the complexity of natural language by grouping sentences of similar meaning together.

Page 28:

Weaknesses

1. Reduction is not computable in polynomial time
2. No evidence that humans store knowledge in canonical forms
3. Does not address the difficult issue of meaning in discourse

Example:
Bill and John always walk home together. One afternoon, Bill said to John, "Let's leave early." In effect, he asked him to go along with his plan of playing hooky.

What are the referents of these three pronouns?

Page 29:

Canonical Sentences Lead to Canonical Events

NLP programs must use a large amount of background knowledge

Evidence that we organize this information into structures corresponding to typical situations

Example: if we read a story about baseball, we resolve any ambiguities in the text in a way consistent with baseball

Page 30:

Example

1. City Council refused to give the demonstrators a permit because they feared violence.

2. City Council refused to give the demonstrators a permit because they advocated revolution.

Background knowledge lets us determine the correct referent of they in each case.

Page 31:

Script

Structural representation that describes a stereotypical sequence of events in a particular context.

May be viewed as a causal chain

Page 32:

Components

1. Entry conditions: must be satisfied before the script is activated

2. Result: things that will be true after the script completes

3. Props: slots representing objects that are involved in the events of the script.

4. Roles: slots representing people involved in the events of the script.

5. Track: a specific variation on a general pattern

6. Scene: the actual sequence of events

Page 33:

Notice

Entry conditions/Result are pre/post conditions

Props and Roles are data structures
Track is overloading
Scene is an algorithm

Page 34:

Example

John went to a restaurant last night. He ordered penne arrabiata. When he paid, he noticed he was running out of money. He hurried home, since it had started to rain.

Question: Did he eat?

Page 35:

In Action

Activate Script: Restaurant

Roles:
– S = Customer
– W = Waiter

Props:
– F = Food

Scene:
– Entering: S PTRANS S into restaurant
– Ordering: …
– Eating: S INGEST F
– Exiting: S ATRANS money to W

Result: the answer to the question is yes
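A toy encoding of the restaurant script, with the components from the earlier slide (entry conditions, roles, props, track, scenes, results) as slots in a single Prolog fact. The structure and names are a sketch, not Schank & Abelson's implementation:

```prolog
:- use_module(library(lists)).  % member/2

% Restaurant script: one fact holding the script components as slots.
script(restaurant,
       [ entry_conditions - [customer_is_hungry, customer_has_money],
         roles            - [s - customer, w - waiter],
         props            - [f - food],
         track            - sit_down_restaurant,
         scenes           - [ entering - [ptrans(s, restaurant)],
                              ordering - [mtrans(s, w, order(f))],
                              eating   - [ingest(s, f)],
                              exiting  - [atrans(s, money, w)] ],
         results          - [customer_not_hungry, customer_has_less_money]
       ]).

% An event is implied by a script if it occurs in any of its scenes.
script_event(ScriptName, Event) :-
    script(ScriptName, Slots),
    member(scenes - Scenes, Slots),
    member(_SceneName - Events, Scenes),
    member(Event, Events).
```

Because ?- script_event(restaurant, ingest(s, f)). succeeds, a story that matches the script's entering and exiting events licenses the default inference that John ate.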

Page 36:

Frames

Associated with Marvin Minsky

Semantic nets informally represent
– inheritance through isa links
– relationships among entities

Frames
– More structured semantic net
– Assign structure to nodes as well as links
– Definition: a frame is a collection of attributes (slots) and associated values (along with constraints) that describe something in the world

Each frame
– Represents a set of items (isa) with given properties that are inherited by its members
– Represents an instance (inst) of a class of items with given properties, some of which are inherited

Page 37:

Example Semantic Net

[Diagram: an isa hierarchy from person down through Male and ML baseball player to pitcher and outfielder, with inst links to Koufax (Dodgers) and Mays (Giants); attribute links record handedness (right), heights (6-1, 5-10), and batting averages (.106, .253, .262).]

Page 38:

Transformed to a Frame

Issue
– Some attributes are to be inherited
– Some refer only to the frame itself
– Person has both cardinality (8,000,000,000) and locomotion (biped)
– Only locomotion is to be inherited; indicate this with an *

Page 39:

Person
  Isa: Mammal
  Card: 8,000,000,000
  *handed: right

We have a frame with three slots

Male
  Isa: Person
  Card: 4,000,000,000
  *height: 5-10

Page 40:

Baseball Player
  Isa: Male
  Card: 624
  *height: 6-1
  *avg: .252
  *team:
  *uniform color:
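These frames translate directly into facts. The sketch below marks inheritable slots with * and walks up the isa chain when a frame has no local value; the predicate names and encoding are illustrative, not the slides' own:

```prolog
:- use_module(library(lists)).  % member/2

% frame(Name, Parent, Slots): inheritable slots are wrapped in *(...).
frame(person,          mammal, [card - 8000000000, *(handed) - right]).
frame(male,            person, [card - 4000000000, *(height) - '5-10']).
frame(baseball_player, male,   [card - 624, *(height) - '6-1', *(avg) - 0.252]).

% A frame's own value for a slot, whether starred or not.
local_value(Frame, Slot, Value) :-
    frame(Frame, _, Slots),
    ( member(Slot - Value, Slots) ; member(*(Slot) - Value, Slots) ).

% value/3: prefer a local value; otherwise inherit a starred slot from
% the nearest ancestor that has one. Unstarred slots (card) never inherit.
value(Frame, Slot, Value) :-
    local_value(Frame, Slot, Value).
value(Frame, Slot, Value) :-
    \+ local_value(Frame, Slot, _),
    frame(Frame, Parent, _),
    inherited(Parent, Slot, Value).

inherited(Frame, Slot, Value) :-
    frame(Frame, _, Slots),
    member(*(Slot) - Value, Slots).
inherited(Frame, Slot, Value) :-
    frame(Frame, _, Slots),
    \+ member(*(Slot) - _, Slots),
    frame(Frame, Parent, _),
    inherited(Parent, Slot, Value).
```

For example, ?- value(baseball_player, handed, H). inherits H = right from Person, while card is never inherited because it is not starred. The empty *team and *uniform color slots would simply remain unfilled until an instance supplies values.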

Page 41:

Slots

Have inherited default values
Can be structured objects
– Frames to which they can be attached (*avg makes sense for baseball player but not for water fowl)
– Constraints on values (0 <= avg <= 1)
– Default value
– Rules for computing a value separate from inheritance
– Whether a slot is single- or multi-valued
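To make these facets concrete, here is one more sketch: a structured slot carrying an attachment list, a range constraint, a default, and a cardinality facet (all names invented for illustration):

```prolog
:- use_module(library(lists)).  % member/2

% A structured slot description: slot(Name, Facets).
slot(avg, [ attached_to - [baseball_player],   % frames the slot makes sense for
            range       - between(0.0, 1.0),   % constraint on values
            default     - 0.252,               % default value
            cardinality - single ]).           % single- vs multi-valued

% A proposed filler is legal if it satisfies the slot's range constraint.
legal_filler(Slot, Value) :-
    slot(Slot, Facets),
    member(range - between(Lo, Hi), Facets),
    Value >= Lo,
    Value =< Hi.
```

Here ?- legal_filler(avg, 0.34). succeeds, while ?- legal_filler(avg, 7). fails the range constraint.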