10 Knowledge Representation - Part 3
TRANSCRIPT
CP468, Dr. Reem K. Al-Halimi 1
Study Material:
•Part III up to the end of Chapter 7 in Luger, Artificial Intelligence: Structures and Strategies.
Other References:
•James Odell, "Objects and Agents Compared," Journal of Object Technology, Vol. 1, Issue 1, May-June 2002. http://www.jot.fm/issues/issue_2002_05/column4/column4.pdf
Knowledge Representation
Problem: represent human knowledge in a computationally acceptable language.
Desired features:
•Exhaustiveness: all needed information is in the KB.
•Modifiability: new information can be added without sacrificing consistency.
•Homomorphic mapping of objects: information is organized in a natural and intuitive fashion.
•Computational efficiency.
Approaches to Problem Solving in AI
Different views:
1. Weak problem solvers: to create intelligent systems, we simply need to transform the syntactic form of the start state to match that of the desired goal state. Example: the General Problem Solver of Newell and Simon.
2. Strong problem solvers: to create a system that acts intelligently, we must represent world knowledge in a form accessible to the system. Example: expert systems such as MYCIN.
3. Subsumption architecture: "The world is its own model."
4. Genetic approaches.
Explicit Representation of World Knowledge
Logic as a knowledge representation language:
•Propositional logic
•Predicate logic (FOL)
Other representations:
•Semantic networks
•Frames
•Conceptual dependency
Knowledge Representation Hypothesis
1. Knowledge is represented propositionally (i.e., in a form that explicitly represents the knowledge in question).
2. The behaviour of a system is seen as formally caused by the represented knowledge.
Semantic Networks
Define objects in terms of their associations with other objects, e.g. snow, white, snowman, ice, slippery.
Represent knowledge as a graph:
•nodes are concepts
•links are relations
Concepts at lower levels inherit characteristics from their parent concepts.
Semantic Networks
Well-designed semantic networks are a form of logic. For example, a memberOf link from mary to femalePersons expresses memberOf(femalePersons, mary).
Fig 7.2 Network representation of properties of snow and ice (From: Luger: Artificial Intelligence, 6th edition. © Pearson Education Limited, 2009)
Semantic Networks: Example
Semantic Networks: Example
The network shown on the slide contains the following links:
•memberOf(femalePersons, mary); memberOf(malePersons, john)
•subsetOf(femalePersons, Persons); subsetOf(malePersons, Persons); subsetOf(Persons, mammals)
•a sisterOf link between mary and john; a hasMother link is also shown
•legs: 2 on Persons (a default); legs: 1 on john (an exception)
Semantic Networks: Inference Mechanisms
•Inheritance: e.g. Persons by default have 2 legs. How many legs does Mary have? John?
•Use of inverse links (through reification): e.g. hasSister(p, s) and sisterOf(s, p), connected by an inverseOf link between hasSister and sisterOf.
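The inheritance inference above can be sketched in a few lines of Python. This is a minimal sketch, not part of the course material: the names come from the slides' example, but the dictionary-of-links data layout is an assumption made for illustration.

```python
# Sketch of inheritance over a semantic network stored as links.
# Node and relation names follow the slides; the layout is illustrative.
kb = {
    "memberOf": {"mary": "femalePersons", "john": "malePersons"},
    "subsetOf": {"femalePersons": "Persons", "malePersons": "Persons",
                 "Persons": "mammals"},
    "legs": {"Persons": 2, "john": 1},  # default on Persons; john overrides
}

def legs(node):
    """Follow memberOf/subsetOf links upward until a legs value is found."""
    while node is not None:
        if node in kb["legs"]:
            return kb["legs"][node]
        node = kb["memberOf"].get(node) or kb["subsetOf"].get(node)
    return None

print(legs("mary"))  # 2, inherited from Persons
print(legs("john"))  # 1, john's own value shadows the default
```

Mary inherits the default of 2 legs from Persons via femalePersons, while John's own legs value is found first and shadows the default, which is exactly the behaviour the slide's question is probing.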
Semantic Networks: Example (continued)
The same network as before, with a hasSister link added as the inverse of sisterOf.
Semantic Networks: Advantages
•Simple and transparent inference processes.
•Ability to assign default values for categories.
•Ability to include procedural attachment.
Semantic Networks: Disadvantages
•The simple query language may be too limiting to express complex queries.
•Does not represent full FOL, since it provides no means to use negation, disjunction, or existential quantification.
•n-ary relations must be mapped onto binary relations.
Semantic Networks
Much of this work has been done in the arena of natural language.
•First implementation: machine translation in the early 1960s.
•Quillian's dictionary (late 1960s):
 •Planes contain single-word definitions.
 •Words are defined in terms of other words in a semantic network format.
 •The program used definitions to find relationships between pairs of words (e.g. comfort and cry produce sad).
Fig 7.3 Three planes representing three definitions of the word "plant" (Quillian, 1967). (Luger: Artificial Intelligence, 6th edition. © Pearson Education Limited, 2009)
Semantic Networks
Semantic Networks: What Relationships Do We Need?
Conceptual dependency theory: primitives of meaning
1. Actions
2. Objects
3. Modifiers of actions
4. Modifiers of objects
Semantic Networks: What Relationships Do We Need?
Conceptual dependency theory: primitive actions
1. Transfer a relationship (give)
2. Transfer the physical location of an object (go)
3. Apply physical force to an object (push)
4. Move a body part by its owner (kick)
5. Grab an object by an actor (grasp)
6. Ingest an object by an animal (eat)
7. Expel from an animal's body (cry)
8. Transfer mental information (tell)
9. Mentally make new information (decide)
10. Conceptualize or think about an idea (think)
11. Produce sound (say)
12. Focus a sense organ (listen)
Semantic Networks: What Relationships Do We Need?
Conceptual dependency theory: primitives of meaning
1. Actions
2. Objects
3. Modifiers of actions
4. Modifiers of objects
Conceptual syntax rules built using these primitives constitute a grammar of meaningful semantic relationships. Conceptual dependency relationships, defined using the conceptual syntax rules, can be used to construct an internal representation of an English sentence.
Semantic Networks: What Relationships Do We Need?
Conceptual dependency relationships are defined using the conceptual syntax rules and can be used to construct an internal representation of an English sentence. Tense and mode markers are added, e.g. past, future, transition, etc.
Fig 7.6 Conceptual dependencies (Schank and Rieger, 1974). (From: Luger: Artificial Intelligence, 6th edition. © Pearson Education Limited, 2009)
Semantic Networks
Semantic Networks: Conceptual Dependency Example
"John throws the ball": John <=> *PROPEL* <-O- Ball
"John threw the ball": John <=>p *PROPEL* <-O- Ball
(O marks the object case; the p marker indicates past tense.)
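The point of the example is canonical form: both sentences share the same actor/ACT/object core and differ only in a tense marker. A hypothetical encoding of the two CD graphs as tuples makes this concrete; the (actor, ACT, object, tense) layout is purely illustrative and not part of Schank's notation.

```python
# Hypothetical encoding of the two conceptual dependency graphs as
# (actor, ACT, object, tense) tuples; the layout is illustrative only.
throws = ("John", "PROPEL", "Ball", None)  # "John throws the ball"
threw  = ("John", "PROPEL", "Ball", "p")   # "John threw the ball"; p = past

# Canonical form: the same actor/ACT/object core, different tense marker.
assert throws[:3] == threw[:3]
assert throws[3] != threw[3]
```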
Semantic Networks: Conceptual Dependency Theory
Advantages:
•Provides a formal theory of natural language semantics and reduces problems of ambiguity.
•The representation directly captures much of natural language semantics.
•Sentences with similar meaning have similar representations (canonical form).
Disadvantages:
•No program exists that can reliably reduce sentences to canonical form.
•The primitives are not sufficient to represent more subtle concepts.
Frames
•Support the organization of knowledge into more complex units reflecting the organization of objects in the domain.
•Can be viewed as a static data structure with values attached.
Fig 7.12 Part of a frame description of a hotel room. "Specialization" indicates a pointer to a superclass. (Luger: Artificial Intelligence, 6th edition. © Pearson Education Limited, 2009)
Frames: Example
Frames: Advantages
•Frames add power and clarity to semantic nets by allowing complex objects to be represented as a single frame.
•Frames provide an easier framework than semantic nets for organizing information hierarchically.
•Frames allow for procedural attachment, which runs a demon (a piece of code) as a result of another action in the KB (this has also been done in some semantic nets).
•Both frames and semantic nets support class inheritance.
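The combination of slots, specialization pointers, and demons can be sketched as a small class. This is a minimal illustration assuming a hotel-room example like Fig 7.12; the class design, slot names, and demon signature are all assumptions, not an implementation from the textbook.

```python
class Frame:
    """Minimal frame sketch: slots, a superclass pointer ("specialization"),
    and demons (procedural attachments). Names here are illustrative."""
    def __init__(self, name, parent=None):
        self.name = name
        self.parent = parent      # pointer to the superclass frame
        self.slots = {}
        self.demons = {}          # slot name -> demon to run on change

    def get(self, slot):
        frame = self
        while frame is not None:  # inherit along the specialization chain
            if slot in frame.slots:
                return frame.slots[slot]
            frame = frame.parent
        return None

    def set(self, slot, value):
        self.slots[slot] = value
        if slot in self.demons:   # procedural attachment fires
            self.demons[slot](self, value)

hotel_room = Frame("hotel_room")
room_101 = Frame("room_101", parent=hotel_room)
hotel_room.slots["bed"] = "king"
room_101.demons["occupied"] = lambda f, v: print(f"{f.name} occupied: {v}")
print(room_101.get("bed"))      # "king", inherited from hotel_room
room_101.set("occupied", True)  # the demon runs as a side effect
```

The get method shows class inheritance (the room instance falls back to its superclass for the bed slot), while set shows a demon triggered by a change elsewhere in the KB.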
Assumptions in Knowledge Representation
•Knowledge must be represented internally.
•Knowledge representation should be done in a central location (the knowledge base).
•Humans need to select precisely what knowledge is to be represented.
Subsumption Architecture
Rejects those assumptions:
•Knowledge need not be represented internally.
•Knowledge representation need not be done in a central location (knowledge base).
•Humans need not select precisely what knowledge is to be represented.
Subsumption Architecture
"The world is its own model" - Rodney Brooks
Brooks on the Subsumption Architecture
Scene 13 from E. Morris' "Fast, Cheap and Out of Control"
Subsumption Architecture
"The world is its own model."
Intelligence is the product of the interaction between an appropriately designed system and its environment.
Examples of environments: the real world for a robot, the internet for a web agent, a set of documents for a text-understanding system, a game for a game-playing system.
Intelligent behaviour emerges from the interactions of architectures that organize simpler behaviours.
Brooks on the Importance of Intelligent Systems' Interaction with their Environment
Scene 15 from E. Morris' "Fast, Cheap and Out of Control"
Subsumption Architecture
•Each task handler is a finite state machine.
•A task handler uses a set of condition → action production rules.
•Task handlers are data driven: perception-based input flows into a task handler, which produces an action.
Subsumption Architecture
•The architecture is a layered collection of task handlers.
•Each layer subsumes lower ones.
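The layered, condition-driven control described above can be sketched as three task handlers with an arbiter. This is a simplification, not Brooks's actual design: real subsumption uses suppression and inhibition on wires between finite state machines, while here that is reduced to fixed-priority arbitration in which the safety-critical lowest layer wins whenever its condition fires. All function and percept names are invented for illustration.

```python
# Simplified sketch of three layered, data-driven task handlers
# (AVOID, WANDER, EXPLORE, after the behaviours in Fig 7.26).
def avoid(percepts):
    """Layer 1 (lowest): turn away when an obstacle is sensed."""
    return "turn-away" if percepts.get("obstacle") else None

def wander(percepts):
    """Layer 2: pick a random heading when the wander timer expires."""
    return "random-heading" if percepts.get("wander_timer_expired") else None

def explore(percepts):
    """Layer 3: otherwise, head toward a distant goal."""
    return "head-to-goal"

def control(percepts, layers=(avoid, wander, explore)):
    """Condition -> action arbitration: the first layer whose condition
    fires produces the action; later layers run only if it defers."""
    for layer in layers:
        action = layer(percepts)
        if action is not None:
            return action

print(control({"obstacle": True}))  # turn-away
print(control({}))                  # head-to-goal
```

Note how each layer is purely data driven: there is no central knowledge base, only rules reacting to percepts, which is the point of "the world is its own model".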
From fig 7.26 The functions of the three-layered subsumption architecture from Brooks (1991a). The layers are described by the AVOID, WANDER, and EXPLORE behaviours. (Luger: Artificial Intelligence, 6th edition. © Pearson Education Limited, 2009)
Subsumption Architecture: First Layer - Avoid
Subsumption Architecture: First Layer - Avoid
Watch the avoid behaviour at http://www.youtube.com/watch?v=ohykDN6-aY4
Subsumption Architecture: Second Layer - Wander
Subsumption Architecture: Third Layer - Explore
Distributed Problem Solving
Main idea: there is no need for one central store of knowledge and a general-purpose inferencing scheme.
•Divide a problem into several smaller problems.
•These smaller problems interact to solve the bigger problem.
•Earlier example from history: blackboard systems.
•Intelligent behaviour is the result of the interaction between an appropriately designed problem-solving agent and its environment.
Agent-Oriented Problem Solving
An agent is a problem solver that is:
•Situated: interacts with its environment.
•Autonomous: makes its own decisions without external intervention.
•Flexible: responds to stimuli from the environment and initiates actions based on the situation.
•Social: can interact appropriately with other agents or with humans.
Multi-Agent Problem Solvers
Agents interact to:
•cooperate towards achieving a common goal;
•coordinate in organizing the problem-solving activity;
•negotiate sub-problem constraints to improve performance.
Multi-Agent Problem Solvers
Multi-agent systems form a "loosely coupled network of agents that work together" to achieve solutions to problems beyond the capabilities of any individual agent.
Agent-Oriented Problem Solving: Example - RoboCup
•"An international research and education initiative."
•Provides "a standard problem where wide range of technologies can be integrated and examined." (robocup.org)
•Main domain: soccer.
•Format: two teams of robots compete in a soccer match on a standard platform.
Agent-Oriented Problem Solving: Example - RoboCup
Team members must be: situated, autonomous, flexible, and social.
Agent-Oriented Problem Solving: Example - RoboCup
•Watch this soccer game from the Humanoid 2008 World Cup: http://www.youtube.com/watch?v=iMM_XQXJUUc
•An example of cooperation between robots during a soccer match: http://www.youtube.com/user/DarmstadtDribblers
Are Agents Simply Objects with Fancy Stuff?
Agents and objects (an object being an instantiation of a class in OOP) share some similarities but are quite different.
Objects in OOP vs. Agents - Similarities
Objects, like agents:
1. are systems with encapsulated state;
2. have methods associated with that state;
3. use methods to support interaction with the environment;
4. communicate with other objects by message passing.
Objects in OOP vs. Agents - Differences
1. Objects do not usually control their own behaviour.
2. Agents can initiate their own actions; objects generally do not.
3. Objects do not have social behaviour.
4. Agents do not invoke methods on one another.
5. Interacting agents usually have their own individual threads of control.
6. Agents can use more than just simple messages to communicate.
7. Objects are associated with their class; agents can have multiple associations, which may also change at any time.
8. Emergence can occur from groups of agents, but not from groups of objects.
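Two of the differences above (an agent's own thread of control, and message passing instead of method invocation) can be made concrete in a small sketch. This is an illustration only: the Agent class, its message format, and its "ignore irrelevant messages" policy are invented here to show the contrast, not a standard agent framework.

```python
import queue
import threading

class Agent(threading.Thread):
    """Sketch of how agents differ from objects: an agent runs on its own
    thread of control and decides for itself whether to act on a message.
    The "request:" message format is an assumption for illustration."""
    def __init__(self, name):
        super().__init__(daemon=True)
        self.name = name
        self.inbox = queue.Queue()  # messages, not direct method calls
        self.log = []

    def run(self):
        while True:
            msg = self.inbox.get()
            if msg == "stop":
                break
            # Autonomy: the agent ignores messages it deems irrelevant,
            # unlike an object, whose methods run whenever invoked.
            if msg.startswith("request:"):
                self.log.append(f"{self.name} handled {msg}")

a = Agent("a1")
a.start()
a.inbox.put("request:pass-ball")
a.inbox.put("noise")  # silently ignored
a.inbox.put("stop")
a.join()
print(a.log)
```

Calling a method on an object always executes it; here, the sender can only enqueue a message, and the agent's own control loop chooses what to do with it.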
Are Agents Simply Objects with Fancy Stuff?
Agents and objects share some similarities but are quite different. However, we can use objects to create agents.
Summary
Discussed two models of AI problem solving:
•With a central store of knowledge: weak problem solvers and strong problem solvers.
•Without a central KB: subsumption architecture and multi-agent systems.
Combinations of those, and other, models exist. The applicability of one model versus others is affected by the problem at hand, the resources available, and the problem constraints.