Fluid Concept Architectures: An experimentation platform

TRANSCRIPT

Page 1:

Fluid Concept Architectures

An experimentation platform

Page 2:

Overview

• Rationale: Common problems in AI; Running example
• Strategies: Fluid Concepts; High-level Perception; Parallel Terraced Scan
• Architecture
• Conclusion: Benefits & shortcomings; An experimentation platform

Page 3:

Common problems in AI: Bird’s nest problem (Minsky)

• Complex construction: the parts are well designed (for the robot) and available
• Simple construction: debris on the floor, not designed for building nests

Page 4:

Common problems in AI: Bird’s nest problem (Minsky)

Likewise: could a chess program cope with a 9×9 board?

“inability to handle the variation in real life”

Can we represent the fluidity of concepts?
• overlapping and associative nature
• blurry and shifting boundaries
• adaptability to context

Page 5:

Common problems in AI: Frame problem

How to identify effectively which data are relevant in solving a problem (without first solving the problem)?


No time to try all data!

• Make educated guesses (e.g. heuristics)
• Abstract the data (how?)

Page 6:

Common problems in AI: Frame problem

Can we let relevance emerge through the interplay between problem concepts and specific data?
• relevant concepts shape the abstraction of the data
• specific data adapt the relevance of the concepts

[Figure: French flag example, with candidate concepts blue, white, red; circle, trapezoid; rectangle.]

Page 7:

Common problems in AI: Combinatorial explosion

Can we use relevance to focus the search?
• avoid brute-force search

Page 8:

Overview

• Rationale: Common problems in AI; Running example
• Strategies: Fluid Concepts; High-level Perception; Parallel Terraced Scan
• Architecture
• Conclusion: Benefits & shortcomings; An experimentation platform

Page 9:

Running example: Letter recognition

Page 10:

Overview

• Rationale: Common problems in AI; Running example
• Strategies: Fluid Concepts; High-level Perception; Parallel Terraced Scan
• Architecture
• Conclusion: Benefits & shortcomings; An experimentation platform

Page 11:

Strategies: Fluid Concepts (Slipnet)

• Concepts
• Associations
• Conceptual distance (resistance)
• Relations (labels)
• Relevance (activation)
• Activation decay
• Conceptual depth
• Sparking
• Activation spreading
• Label nodes
• Conceptual shifting

[Figure: a small Slipnet fragment — nodes such as “part of”, “right”, “left”, “below” connected by links with conceptual distances (e.g. .3, .45, .6, .65); annotation: deep concepts decay slower.]
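To make these mechanics concrete, below is a minimal sketch in Python of how such a Slipnet node could be modelled. The class name, the spreading rule and the decay rule are illustrative assumptions, not the original implementation; only the ingredients (activation, conceptual depth and distance, sparking, spreading, slower decay for deep concepts) come from the slide.

class SlipNode:
    """Illustrative Slipnet node: a concept with activation and depth."""
    def __init__(self, name, depth):
        self.name = name
        self.depth = depth          # conceptual depth in [0, 1]
        self.activation = 0.0       # current relevance of the concept
        self.links = []             # (neighbour, conceptual distance) pairs

    def connect(self, other, distance):
        self.links.append((other, distance))
        other.links.append((self, distance))

    def spark(self, amount=1.0):
        # called whenever a mapping involving this concept is built
        self.activation = min(1.0, self.activation + amount)

    def spread(self):
        # activation spreads along links; conceptually close nodes receive more
        for neighbour, distance in self.links:
            neighbour.activation = min(
                1.0, neighbour.activation + self.activation * (1.0 - distance))

    def decay(self):
        # deep concepts decay slower (depth close to 1 keeps more activation)
        self.activation *= self.depth

For example, sparking a node "right" and spreading would raise the activation of a conceptually close node "left" more than that of a distant one.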

Page 12:

Strategies: High-level Perception (Workspace)

• Percepts
• Mapping
• Abstracting
• Sparking
• Focus on: “salient” percepts, relevant mappings, active mappings, low happiness

[Figure: workspace percepts for hand-drawn letters, with attributes such as height (short/tall), width (wide), curvature (slight-left/right), tip positions (NW, east) and shape (cupped), each carrying a strength value.]

Contextually relevant concepts are activated; percepts are bound to the relevance of these concepts.
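A corresponding sketch of the Workspace side, again with assumed names and an assumed salience formula; it only illustrates the slide's points that percepts carry attributes, that mappings bind them to concepts (sparking those concepts, reusing the SlipNode sketch above), and that weakly bonded ("unhappy") percepts attract attention.

class Percept:
    """Illustrative workspace percept: a bundle of attributes plus its bonds."""
    def __init__(self, attributes):
        self.attributes = attributes   # e.g. {"height": "tall", "shape": "cupped"}
        self.bonds = []                # (concept, strength) mappings built so far

    def salience(self):
        # low "happiness" (weak bonds) makes a percept more salient
        happiness = sum(strength for _, strength in self.bonds)
        return 1.0 / (1.0 + happiness)

class Workspace:
    def __init__(self, percepts):
        self.percepts = percepts

    def build_mapping(self, percept, concept, strength):
        # binding a percept to a concept also sparks that concept in the Slipnet
        percept.bonds.append((concept, strength))
        concept.spark(strength)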

Page 13:

Strategies: Parallel Terraced Scan

“A parallel investigation of many possibilities to different levels of depth, quickly throwing out bad ones and homing in accurately and rapidly on good ones.” (D. R. Hofstadter)

Build percepts in phases:
• Measure promise with a quick test
• If okay, examine closer
• If okay, build it

Work in (simulated) parallel. How?
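As a sketch of the phased evaluation just described (the candidate's method names and the thresholds below are hypothetical, chosen only to show the pattern):

def terraced_evaluation(candidate, workspace):
    """Evaluate one candidate percept in phases, abandoning it early if weak."""
    promise = candidate.quick_test()            # cheap first estimate of promise
    if promise < 0.3:
        return None                             # bad candidates are thrown out quickly
    strength = candidate.examine(workspace)     # closer, more expensive look
    if strength < 0.5:
        return None
    return candidate.build(workspace)           # only good candidates are built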

Page 14:

Strategies: Parallel Terraced Scan

Codelets (~ ant systems)

• Each performs a tiny part of the algorithm: scouts, examiners, builders
• Each is called with a specific urgency:
  - scouts are continuously added (low urgency)
  - follow-up codelets (urgency = promise of the percept)
  - codelets posted by active concepts (high urgency)
• A scheduler picks the next codelet stochastically

Strongest pressures commingle
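A minimal sketch of such a scheduler (the Coderack name matches Hofstadter's architecture; the urgency-weighted choice below is an illustrative implementation, assuming each codelet is a function that may post follow-up codelets):

import random

class Coderack:
    def __init__(self):
        self.codelets = []                       # list of (urgency, run_function)

    def post(self, urgency, run):
        self.codelets.append((urgency, run))

    def step(self, workspace):
        # pick one codelet stochastically, biased towards higher urgency
        if not self.codelets:
            return
        weights = [urgency for urgency, _ in self.codelets]
        index = random.choices(range(len(self.codelets)), weights=weights)[0]
        _, run = self.codelets.pop(index)
        run(workspace, self)                     # codelets may post follow-ups here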

Page 15:

Overview

• Rationale: Common problems in AI; Running example
• Strategies: Fluid Concepts; High-level Perception; Parallel Terraced Scan
• Architecture
• Conclusion: Benefits & shortcomings; An experimentation platform

Page 16:

Architecture

Slipnet
• Node activation spreads through conceptual links
• Activation is sparked with every mapping

Coderack
• Bottom-up codelets are continuously added
• Highly activated nodes spawn top-down codelets
• Codelets call in follow-up codelets

Workspace
• Codelets enter the workspace
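Putting the three components together, a run might look like the following loop. This is an illustrative sketch built on the SlipNode, Workspace and Coderack sketches above; the activation threshold and the top_down_codelet_for helper are assumptions.

def top_down_codelet_for(node):
    # hypothetical factory: returns a codelet biased towards the active concept
    def codelet(workspace, coderack):
        pass   # in a real system: scan the workspace for percepts matching `node`
    return codelet

def run(slipnet_nodes, workspace, coderack, steps=1000):
    for _ in range(steps):
        coderack.step(workspace)          # one codelet acts on the workspace
        for node in slipnet_nodes:        # Slipnet dynamics: spread, then decay
            node.spread()
        for node in slipnet_nodes:
            node.decay()
            if node.activation > 0.8:     # highly activated nodes push back
                coderack.post(urgency=node.activation,
                              run=top_down_codelet_for(node))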

Page 17:

Overview

• Rationale: Common problems in AI; Running example
• Strategies: Fluid Concepts; High-level Perception; Parallel Terraced Scan
• Architecture
• Conclusion: Benefits & shortcomings; An experimentation platform

Page 18:

Benefits

• Handles concepts fluently
  - flexible representations, flexible actions
• “Searches” through interpretation space
  - not trying all possible combinations
• Does its own representation building
  - sensitive to pressures from the actual situation
  - much more independent
  - can generate original viewpoints
• Remains symbolic
  - representations are easily referenced and manipulated

Page 19:

Shortcomings

• Hard to set up a good domain
  - requires thorough study
• Doesn’t learn (yet)
• Behavior depends on many parameters
  - hard to see how a change affects behavior
• Hard to experiment with
  - no flexible code base available
  - start from scratch

Page 20:

An experimentation platform: Component-based approach

• E.g. replace the semantic network
• The user writes objects that are dynamically loaded
  - codelets and percepts are very dynamic entities
• Uniform communication between components
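One way such a component-based platform could be sketched (illustrative only; the Component interface, the names and the use of importlib for dynamic loading are assumptions, not a description of the actual platform):

import importlib

class Component:
    """Uniform interface, so that e.g. the semantic network can be swapped out."""
    def receive(self, message):
        raise NotImplementedError

def load_component(module_name, class_name):
    # dynamically load a user-written component class by name
    module = importlib.import_module(module_name)
    return getattr(module, class_name)()

# e.g. slipnet = load_component("my_domain.slipnet", "MySlipnet")  # hypothetical names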

Page 21:

Generalizations

• Allow multiple workspaces (and schedulers)
  - different “parts” in the problem
  - delegate different levels of perception (≠ codelets, …)
• Allow different approaches
  - information in codelets vs. network vs. percepts