[slides3]


Page 1: [Slides3]

11

Fuzzy Set: 1965 … Fuzzy Logic: 1973 … Soft Decision: 1981 … BISC: 1990 … Human-Machine Perception: 2000 - …

Theory and the Applications of Natural Language Computing: Computation and Reasoning with Information Presented in Natural Languages

Masoud Nikravesh
BISC Program, EECS-UCB
& Informatics and Imaging - Life Sciences
Lawrence Berkeley National Laboratory (LBNL)
http://www-bisc.cs.berkeley.edu/
Email: [email protected]
Tel: (510) 643-4522; Fax: (510) 642-5775

Acknowledgements: James S. Albus
Senior NIST Fellow, Intelligent Systems Division
Manufacturing Engineering Laboratory, National Institute of Standards and Technology

Acknowledgements: Prof. Lotfi A. Zadeh
BISC Program, EECS-UCB

ICMLA'05 The Fourth International Conference on Machine Learning and Applications

15-17 December 2005, Sheraton Gateway Hotel, Los Angeles, CA, USA

Page 2: [Slides3]

22

Outline

• BISC Program

• Introduction

• Natural Language Computing
  • CONCEPTS
  • ORIGIN
  • APPLICATIONS

• Neu-Search*
• BISC-DSS

Page 3: [Slides3]

33

Page 4: [Slides3]

44

TURING’s TESTTuring: A computer can be said to be intelligent if its Turing: A computer can be said to be intelligent if its

answers are indistinguishable from the answers of a answers are indistinguishable from the answers of a human beinghuman being

??

Computer

Page 5: [Slides3]

55

Artificial Neural Network vs. Human Brain

Largest neural computer: 20,000 neurons

Worm’s brain: 1,000 neurons

But the worm’s brain outperforms neural computers

It’s the connections, not the neurons!

Human brain: 100,000,000,000 neurons; 200,000,000,000,000 connections

Page 6: [Slides3]

66

• Processing Speed: milliseconds vs. nanoseconds.

• Processing Order: massively parallel vs. serial.

• Abundance and Complexity: 10^11 neurons operate in parallel in the brain at any given moment, with roughly 10^14 connections, i.e., between 10^3 and 10^4 abutting connections per neuron.

• Knowledge Storage: adaptable vs. new information destroys old information.

• Fault Tolerance: knowledge is retained through redundant, distributed encoding of information vs. corruption of a conventional computer's memory is irretrievable and leads to failure.

Brain vs. Computer Processing

Cesare Pianese

Page 7: [Slides3]

77

Machine Intelligence – Human Intelligence Year is 2020

Computing Power ==> quadrillion operations/sec per $100; 5-15 quadrillion ops/sec (IBM’s fastest computer today = 100 trillion)

High-Resolution Imaging (Brain and Neuroscience)
Human Brain Reverse Engineering; Dynamic Neuron-Level Imaging/Scanning and Visualization

Searching, Logical Analysis, Reasoning
Searching for the Next Google; Internet Protocol TV (IPTV)

Every viewer could potentially receive different advertisements based on their profile, searches, and the shows they have watched. Families will not skip the ads, because the advertising is targeted.

Technology goes to the Nano and Molecular Level: Nanotechnology, Nano Wireless Devices and OS

Tiny, blood-cell-size robots; Virtual Reality through controlling Brain Cell Signals

Who should work and who should get paid? Human or Robots/Machines?

Page 8: [Slides3]

88

[Figure: an attribute table (A1, A2, A3 with output F1) containing entries a(i,j) and wildcard (*) entries, the equivalent decision tree (root Q with branch nodes Q1, Q2, Q3 and branches labeled a1, a2), and the IF-THEN rules read off the table, of the form IF A1 is a(.) and A2 is a(.) and A3 is a(.) THEN F1 is a1; the same figure reappears later under "MAXIMALLY COMPACT REPRESENTATION"]

Reasoning ?

Deduction ?

Page 9: [Slides3]

99

Reasoning ?

Deduction ?

Page 10: [Slides3]

1010

Human-Machine-Perception-Based Reasoning

Machine Agent Intelligent Agent Phantoms Humanoid Human

Computer Mind

Machine based on Human Brain and Neuroscience

Page 11: [Slides3]

1111

Page 12: [Slides3]

1212

The Human Mind

Arguably the most important frontier for humankind.

The next unexplored frontier of science

Mind is what distinguishes humans from the rest of creation

Your mind is who you are

Page 13: [Slides3]

1313

Recent Breakthroughs

Neurosciences – Focused on understanding the brain- chemistry, synaptic transmission, axonal connectivity, functional MRI

Cognitive Modeling – Focused on representation and use of knowledge in performing cognitive tasks

- mathematics, logic, language

Intelligent Control – Focused on making machines behave appropriately (i.e., achieve goals) in an uncertain environment

- manufacturing, autonomous vehicles, agriculture, mining

Depth Imaging – Enables geometrical modeling of the 3-D world. Facilitates grouping and segmentation. Provides a solution to the symbol-grounding problem.

Computational Power – Enables processes that rival the brain in operations per second. At 10^10 ops, heading for 10^15 ops.

Page 14: [Slides3]

1414

Page 15: [Slides3]

1515

EVOLUTION OF COMPUTATION

[Diagram: evolution of computation, from natural language through arithmetic, algebra, calculus and differential equations, numerical analysis and symbolic computation, to computing with words and precisiated natural language, each stage adding (+) to the previous ones]

Page 16: [Slides3]

1616

Page 17: [Slides3]

1717

Classical Logic is inadequate for ordinary life

Intuitionism

Non- Monotonic Logic

Second thoughts

Plausible reasoning

Quick, efficient response to problems when an exact solution is not necessary

COMMON SENSE

The World of Objects; The Measure Space; Qualitative Reasoning

Heuristics; Rules of thumb; George Polya: "Heuretics"

Page 18: [Slides3]

1818

Page 19: [Slides3]

1919

EVOLUTION OF LOGIC

two-valued (Aristotelian): nothing is a matter of degree

multi-valued: truth is a matter of degree

fuzzy: everything is a matter of degree

Page 20: [Slides3]

2020

In bivalent logic, BL, truth is bivalent, implying that every proposition, p, is either true or false, with no degrees of truth allowed

In multivalent logic, ML, truth is a matter of degree

In fuzzy logic, FL:

• everything is, or is allowed to be, partial, i.e., a matter of degree
• everything is, or is allowed to be, imprecise (approximate)
• everything is, or is allowed to be, granular (linguistic)
• everything is, or is allowed to be, perception-based

Page 21: [Slides3]

2121

EVOLUTION OF FUZZY LOGICA PERSONAL PERSPECTIVE (L.A. Zadeh)

generality (increasing upward) vs. time: 1965, 1973, 1999

1965: crisp sets → fuzzy sets
1973: fuzzy sets → granulated fuzzy sets (linguistic variable)
1999: measurements → perceptions

classical bivalent → f-generalization → f.g-generalization → nl-generalization → computing with words and perceptions (CWP)

Page 22: [Slides3]

2222

Natural Language Computing

Page 23: [Slides3]

2323

•it is 35 C°

•Eva is 28

•probability is 0.8

•It is very warm

•Eva is young

•probability is high

•it is cloudy

•traffic is heavy

•it is hard to find parking near the campus

INFORMATION

measurement-based numerical

perception-based linguistic

MEASUREMENT-BASED VS. PERCEPTION-BASED INFORMATION

• measurement-based information may be viewed as special case of perception-based information

Page 24: [Slides3]

2424

MEASUREMENT-BASED

(version 1)

a box contains 20 black and white balls
over seventy percent are black
there are three times as many black balls as white balls
what is the number of white balls?
what is the probability that a ball picked at random is white?

PERCEPTION-BASED (version 2)

a box contains about 20 black and white balls
most are black
there are several times as many black balls as white balls
what is the number of white balls?
what is the probability that a ball drawn at random is white?

Page 25: [Slides3]

2525

COMPUTATION (version 2)

measurement-based:

X = number of black balls
Y = number of white balls
X ≥ 0.7 • 20 = 14
X + Y = 20
X = 3Y
X = 15; Y = 5
p = 5/20 = 0.25

perception-based:

X = number of black balls
Y = number of white balls
X = most × 20*
X = several * Y
X + Y = 20*
P = Y/N

Page 26: [Slides3]

2626

BASIC POINT

Conventional methods of systems analysis are oriented toward numerical attributes and measurement-based information. They lack the capability to deal with linguistic attributes and perception-based information.

Computing with words is aimed at adding to methods of systems analysis and decision analysis an important high-level capability: the capability to deal computationally and logically with linguistic attributes and perception-based information.

Page 27: [Slides3]

2727

Natural Language Computing

Page 28: [Slides3]

Computation?

• Traditional Sense: Manipulation of Numbers

• Human: Uses Words for Computation and Reasoning

Words are less precise than numbers!

Computing with Words <== Natural Language

Page 29: [Slides3]

2929

Natural Language Computing?

Human: Uses Natural Languages
Human: Uses Words for Computation
Human: Uses Reasoning and Logic

Words are less precise than numbers!
Reality vs. Being Certain

Fuzzy Set and Fuzzy Logic as basis for Natural Language Computing

Page 30: [Slides3]

3030

Inspired by human’s remarkable capability to perform a wide variety of physical and mental tasks without any measurements and computations and dissatisfied with classical logic as a tool for modeling human reasoning in an imprecise environment, Lotfi A. Zadeh developed the theory and foundation of fuzzy logic with his 1965 paper “Fuzzy Sets” [1] and extended his work with his 2005 paper “Toward a Generalized Theory of Uncertainty (GTU)—An Outline”

Theory of Natural Language Computing ORIGIN, CONCEPTS, AND TRENDS

Fuzzy Set and Fuzzy Logic as basis for Natural Language Computing

Page 31: [Slides3]

3131

WHAT IS FUZZY LOGIC?

fuzzy logic (FL) is aimed at a formalization of modes of reasoning which are approximate rather than exact

examples:

exact all men are mortal

Socrates is a man

Socrates is mortal

approximate most Swedes are tall

Magnus is a Swede

it is likely that Magnus is tall

Page 32: [Slides3]

3232

“Fuzzy logic” is not fuzzy logic. Fuzzy logic is a precise logic of approximate reasoning and approximate computation.

The principal distinguishing features of fuzzy logic are:

a) In fuzzy logic everything is, or is allowed to be, graduated, that is, be a matter of degree

b) In fuzzy logic everything is allowed to be granulated

FUZZY LOGIC—KEY POINTS

Page 33: [Slides3]

3333

When do we use Fuzzy Logic?

To exploit the tolerance for imprecision, uncertainty and partial truth to achieve tractability, robustness, low solution cost and better rapport with reality

Crisp, fine-grained information is not available: economic systems, everyday decision-making

Precise information is costly: diagnosis systems, quality control, decision analysis

Fine-grained information is not necessary: cooking, balancing, parking a car

Coarse-grained information reduces cost: cameras, consumer products

Page 34: [Slides3]

3434

Natural Language Computing?

Fuzzy Set and Fuzzy Logic as basis for

Natural Language Computing

+

NEW TOOLS: Computing with words and perceptions (CWP) & Precisiated Natural Language (PNL)

Page 35: [Slides3]

3535

Fuzzy Sets (Zadeh 1965)

(Information and Control, 8, 338-353, 1965)

A fuzzy set is a class of objects with a continuum of grades of membership

Each set is characterized by a membership function which assigns to each object a grade of membership

The notion of a fuzzy set is completely nonstatistical

Page 36: [Slides3]

3636

PRECISIATION OF “approximately a,” *a

[Figure: precisiation of *a ("approximately a"). s-precisiation: a singleton at a. cg-precisiation: an interval around a, a probability distribution p(x), a possibility distribution, or a fuzzy graph. g-precisiation: a generalized constraint (maximal generality).]

Page 37: [Slides3]

3737

CONTINUED

[Figure: a bimodal probability distribution p over x]

g-precisiation (GCL-based, maximal generality): *a is precisiated as a GC-form, X isr R

Page 38: [Slides3]

3838

Fuzzy Concept(Zadeh, 1971)

If x is a term, then its meaning, M(x), is a concept

Level 1 concepts: labels for sets of objects; "white", "yellow", "green", …, "redder than", "darker than" are level 1 concepts, since for objects y1 and y2 we can compute a grade µ(y1, y2) for a relation such as "darker than".

Level 2 concept, example: "color". This concept is a collection of the concepts M(white), M(green), …, M(black), …

Level 3 concept, example: "visual attribute". This concept is a collection of concepts such as "color", "shape", "size", …

Concepts at levels higher than 1 are much harder to define by exemplification than concepts at level 1.

Page 39: [Slides3]

3939

Fuzzy Concept

Natural language for teaching a kid or machine

Exemplification a set of primitive concepts at level 1 (vocabulary)

Build up on this vocabulary by defining other level 1 concepts in terms of those already defined

Build up on this vocabulary by defining other, higher-level concepts in terms of the already defined level 1 (or lower-level) concepts

Page 40: [Slides3]

4040

Language as a fuzzy relation is closer to natural linguistic usage than formal languages are

Each Word x in a natural language L may be viewed as a summarized description of a fuzzy subset M(x) of universe of discourse U, with M(x) representing the meaning of x

Language is a fuzzy correspondence between the element of T and U, where T is a set of terms

Let x be a term in T. Then the meaning of x, denoted by M(x) is a fuzzy subset of U characterized by a membership function µ (x l y).

T: white, gray, green, blue, yellow, red, black, …
T: young, old, middle-aged, not old, not young, …

Computing with Words and Perception

Page 41: [Slides3]

4141

Computation of Meaning by the use ofQuantitative Semantics

Meaning:
Simple terms (young, old, very, not, and, or)
Composite terms (not very young and not very old)

Quantitative semantics is a procedure for computing the meaning, M(x), of a composite term x in T from knowledge of the meanings of the simple terms x1, x2, …, xn

Computing with Words and Perception: Precisiation of Variables

Page 42: [Slides3]

4242

Fuzzy Grammar for Computation of Meaning for composite terms

Example composite terms:
not very young
not very young and not very old
young and not old
old or not old
old or not very very young
young and (old or not young)

Production rules:
S → A            C → O
S → S or A       C → Y
A → B            O → old
A → A and B      O → very O
B → C            Y → young
B → not C        Y → very Y
C → (S)

Page 43: [Slides3]

4343

Fuzzy Grammar for Computation of Meaning for composite terms

Composite term: x = not very young and not very old

µ_x(y) = (1 − µ_young(y)²) ∧ (1 − µ_old(y)²)

Dual semantic rules, one per production (∧: min, ∨: max):
Y → young:      µ_Y(y) = µ_young(y)
O → old:        µ_O(y) = µ_old(y)
Y → very Y:     µ_Y = (µ_Y)²
O → very O:     µ_O = (µ_O)²
C → Y:          µ_C = µ_Y
C → O:          µ_C = µ_O
C → (S):        µ_C = µ_S
B → C:          µ_B = µ_C
B → not C:      µ_B = 1 − µ_C
A → B:          µ_A = µ_B
A → A and B:    µ_A = µ_A ∧ µ_B
S → A:          µ_S = µ_A
S → S or A:     µ_S = µ_S ∨ µ_A

Page 44: [Slides3]

4444

Fuzzy Grammar for Computation of Meaning for composite terms

Composite term: not very young and not very old

[Figure: derivation tree for the composite term under the grammar above, with nonterminal nodes S1, A2, A3, B4, C5, Y6, Y7, B8, C9, O10, O11, O12 and terminal leaves drawn from {not, very, young, and, old}]

Page 45: [Slides3]

4545

Fuzzy Grammar for Computation of Meaning for composite terms

Composite term: x = not very young and not very old

Applying the dual semantic rules node by node up the derivation tree yields

µ_x(y) = (1 − µ_young(y)²) ∧ (1 − µ_old(y)²)

i.e., the meaning of the composite term is computed from the meanings of the simple terms young, old, very, not and and.
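A minimal sketch of this meaning computation on a small discrete universe of ages; the membership values for young and old below are illustrative assumptions, not values taken from the slides:

```python
# Computing the meaning of "not very young and not very old".
# Membership grades for young/old are illustrative assumptions.
ages = [20, 30, 40, 50, 60, 70]
young = {20: 1.0, 30: 0.8, 40: 0.5, 50: 0.2, 60: 0.1, 70: 0.0}
old   = {20: 0.0, 30: 0.1, 40: 0.3, 50: 0.6, 60: 0.9, 70: 1.0}

very = lambda mu: mu ** 2          # concentration (hedge "very")
not_ = lambda mu: 1.0 - mu         # negation
and_ = min                         # conjunction as min

# mu_x(y) = (1 - young(y)^2) AND (1 - old(y)^2)
composite = {y: and_(not_(very(young[y])), not_(very(old[y]))) for y in ages}
print(composite)
```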

Page 46: [Slides3]

4646

Fuzzy Logic
(Outline of a New Approach to the Analysis of Complex Systems and Decision Processes, IEEE Trans. on Systems, Man, and Cybernetics, Vol. SMC-3, No. 1, Jan. 1973, pp. 28-44)

The use of linguistic variables
Simple relations between variables by fuzzy conditional statements
Complex relations by fuzzy algorithms

IF x is small and x is not large THEN y is very large
IF x is not very small THEN y is very large
IF x is not small and not large THEN y is not very large

Page 47: [Slides3]

4747

The use of linguistic variables: Meaning

Information Summarization

Humans: the ability to summarize information finds its most pronounced manifestation in the use of natural languages

Linguistic Variables: each word x in a natural language L may be viewed as a summarized description of a fuzzy subset M(x) of a universe of discourse U, with M(x) representing the meaning of x.

Page 48: [Slides3]

4848

The use of Linguistic variables

Object: red          Meaning: M(red)
Object: flower       Meaning: M(flower)
Object: red flower   Meaning: M(red) ∩ M(flower), where ∩ is intersection (min operator)

Variable: color of the object
Values: red, blue, yellow, green (values for the object are labels of fuzzy sets)

Attribute: Color (a fuzzy variable); values: red, blue, …, labels of fuzzy sets
Attribute: Height; values: tall, not tall, somewhat tall, very tall, …

Sentences are built from labels (tall, red, …), negation (not), connectives (and, but, or) and hedges (very, somewhat, quite, more or less, …)
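A minimal sketch of the min-intersection of meanings for "red flower"; the membership grades over the tiny universe of objects are illustrative assumptions:

```python
# M(red flower) = M(red) ∩ M(flower), with ∩ realized as min.
# Membership grades below are illustrative assumptions.
objects = ["o1", "o2", "o3"]
M_red    = {"o1": 0.9, "o2": 0.3, "o3": 0.7}
M_flower = {"o1": 0.8, "o2": 1.0, "o3": 0.2}

M_red_flower = {o: min(M_red[o], M_flower[o]) for o in objects}
print(M_red_flower)   # {'o1': 0.8, 'o2': 0.3, 'o3': 0.2}
```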

Page 49: [Slides3]

4949

Simple relations between variables by fuzzy conditional statement

IF x is small THEN y is very large
IF x is not very small THEN y is very large
IF x is not small and not large THEN y is not very large

Page 50: [Slides3]

5050

Complex relations by fuzzy Algorithms

A fuzzy algorithm is an ordered sequence of instructions, e.g.:

Reduce x slightly if y is large
Increase x very slightly if y is not very large and not very small
If x is small then stop; otherwise increase x by 2.

Page 51: [Slides3]

5151

VARIABLES AND LINGUISTIC VARIABLES

one of the most basic concepts in science is that of a variable

variable:
  numerical (X = 5; X = (3, 2); …)
  linguistic (X is small; (X, Y) is much larger)

a linguistic variable is a variable whose values are words or sentences in a natural or synthetic language (Zadeh 1973)

the concept of a linguistic variable plays a central role in fuzzy logic and underlies most of its applications

Page 52: [Slides3]

5252

Fuzzy sets, logics and reasoning: Examples (Zadeh 1973)

Set U: U = 1 + 2 + 3 + 4 + 5

small = 1/1 + 0.8/2 + 0.6/3 + 0.4/4 + 0.2/5   (degree/element)

very x := x²

very small = 1/1 + 0.64/2 + 0.36/3 + 0.16/4 + 0.04/5

very very small = very (very x) = very (x²) = x⁴

very very small ≈ 1/1 + 0.4/2 + 0.1/3

Page 53: [Slides3]

5353

Computation of the Meaning of Values of Linguistic Variables

not very small = not (very small) = ¬(very small) = ¬(small²)

very small = 1/1 + 0.64/2 + 0.36/3 + 0.16/4 + 0.04/5

not very small = 0/1 + 0.36/2 + 0.64/3 + 0.84/4 + 0.96/5
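A minimal sketch reproducing these numbers on U = {1, …, 5}, using the membership values for small given two slides earlier:

```python
# U = {1,...,5}; "very" = squaring, "not" = 1 - mu.
U = [1, 2, 3, 4, 5]
small = {1: 1.0, 2: 0.8, 3: 0.6, 4: 0.4, 5: 0.2}

very_small     = {u: round(small[u] ** 2, 2) for u in U}
not_very_small = {u: round(1 - very_small[u], 2) for u in U}

print(very_small)       # {1: 1.0, 2: 0.64, 3: 0.36, 4: 0.16, 5: 0.04}
print(not_very_small)   # {1: 0.0, 2: 0.36, 3: 0.64, 4: 0.84, 5: 0.96}
```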

Page 54: [Slides3]

5454

Computation of the Meaning of Values of Linguistic Variables

very very large = very (very large) = very (large²) = large⁴

not very very large = ¬large⁴

X = not very small and not very very large:

¬(small²) ∩ ¬(large⁴) = min(¬(small²), ¬(large⁴))

Page 55: [Slides3]

5555

Fuzzy Conditional Statement and Compositional Rule of Inference

In general: IF A THEN B is represented by the relation A × B (A → B), with

µ_{A×B}(u, v) = min(µ_A(u), µ_B(v));   ×: intersection (∩)

A = 1/1 + 0.8/2
B = 0.6/1 + 0.9/2 + 1/3

A × B:
        1     2     3
1   [ 0.6   0.9   1.0 ]
2   [ 0.6   0.8   0.8 ]

IF (x is) large THEN (y is) small
IF (the road is) slippery THEN (driving is) dangerous

R: fuzzy relation

Page 56: [Slides3]

5656

Fuzzy Conditional Statement and Compositional Rule of Inference

IF A1 THEN B1 ELSE IF A2 THEN B2 ELSE … ELSE IF An THEN Bn

= A1 × B1 + A2 × B2 + … + An × Bn

IF A THEN (IF B THEN C ELSE D) ELSE E

= A × B × C + A × ¬B × D + ¬A × E

IF A THEN B ELSE C:  A × B + (¬A × C);   +: union (∪)

Page 57: [Slides3]

5757

Fuzzy Conditional Statement and Compositional Rule of Inference

IF (x is) very small THEN (y is) large ELSE (y is) not very large

R = A × B + (¬A × C)
A: small;  B: large;  C: not very large
×: min (∩, intersection);  +: max (∪, union)
Operations used: min-max operators

small = 1/1 + 0.8/2 + 0.6/3 + 0.4/4 + 0.2/5
large = 0.2/1 + 0.4/2 + 0.6/3 + 0.8/4 + 1/5

R: fuzzy relation = A × B + (¬A × C)

Page 58: [Slides3]

5858

Fuzzy Conditional Statement and Compositional Rule of Inference

R = A × B + (¬A × C):

        1     2     3     4     5
1   [ 0.2   0.4   0.6   0.8   1.0  ]
2   [ 0.2   0.4   0.6   0.8   0.8  ]
3   [ 0.4   0.4   0.6   0.6   0.6  ]
4   [ 0.6   0.6   0.6   0.4   0.4  ]
5   [ 0.8   0.8   0.64  0.36  0.2  ]

R: fuzzy relation

IF (x is) very small THEN (y is) large ELSE (y is) not very large

x = very small (small²)
x ∘ R = y = [0.36  0.4  0.6  0.8  1]

Page 59: [Slides3]

5959

Fuzzy Conditional Statement and Compositional Rule of Inference

R = A × B + (¬A × C)   (the same fuzzy relation as on the previous slide)

IF (x is) very small THEN (y is) large ELSE (y is) not very large

x: very small = small² = [1  0.64  0.36  0.16  0.04]
x ∘ R = y = [0.36  0.4  0.6  0.8  1] ≈ not very small
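A minimal numerical sketch of this compositional rule of inference, using the membership vectors from the slides; the relation is built with min for × and max for +, and the composition x ∘ R is max-min:

```python
import numpy as np

# Membership vectors on U = {1,...,5}, taken from the slides.
small = np.array([1.0, 0.8, 0.6, 0.4, 0.2])
large = np.array([0.2, 0.4, 0.6, 0.8, 1.0])

A = small                      # antecedent term
B = large                      # "y is large"
C = 1 - large**2               # "y is not very large"

# R = (A x B) + (not A x C), with x = min and + = max
R = np.maximum(np.minimum.outer(A, B), np.minimum.outer(1 - A, C))

# Compositional rule of inference: y = x o R (max-min composition)
x = small**2                   # "x is very small"
y = np.max(np.minimum(x[:, None], R), axis=0)
print(np.round(y, 2))          # -> [0.36 0.4  0.6  0.8  1.  ]
```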

Page 60: [Slides3]

6060

Fuzzy Conditional Statement and Compositional Rule of Inference

Given the rule "IF x is A THEN y is B" and the input x; inferred: y.

A          B            | x                    | y (inferred)
small      large        | not small            | not very large
medium     medium       | very large           | very very large
large      very small   | very very small      | very very large
                        | not very large       | small or medium

Page 61: [Slides3]

6161

Joint Probability

P: if X is small then p is smallIf X is medium then p is largeIf X is large then p is small

Q: if Y is small then q is largeIf Y is medium then q is smallIf Y is large then q is large

Page 62: [Slides3]

6262

Joint Probability

P:                      Q:
X         p             Y         q
small     small         small     large
medium    large         medium    small
large     small         large     small

P: small × small + medium × large + large × small

Q: small × large + medium × small + large × small

(P, Q) = small × small × (small * large)
       + small × medium × (small * small) + …
       + large × large × (small * large)

*: the arithmetic product in fuzzy arithmetic

Page 63: [Slides3]

6363

Possibilistic Relational Universal Fuzzy

Page 64: [Slides3]

6464

PRUF – A meaning Representation Language for Natural Languages (Zadeh, 1977)

Possibilistic Relational Universal Fuzzy

Assumption: imprecision is possibilistic rather than probabilistic in nature

The logic: fuzzy logic, rather than two-valued or multivalued logic

The quantifiers in PRUF are allowed to be linguistic, “most”, “many”, “some”, “few”

Page 65: [Slides3]

6565

PRUF – A meaning Representation Language for Natural Languages (Zadeh, 1977)

The concept of Semantic Equivalence and Semantic Entailment in PRUF provide a basis for Question-Answering (Q&A) and Inference from fuzzy premises

Foundation for Approximate Reasoning

Language for representation of imprecise knowledge and as a means of precisiation of fuzzy propositions expressed in a natural language.

Precisiated Natural Language; Precisiation of Meaning

Page 66: [Slides3]

6666

PRUF – A meaning Representation Language for Natural Languages (Zadeh, 1977)

Translation rules in PRUF:

Type I: pertaining to modification Type II: pertaining to composition Type III: pertaining to quantification Type IV: pertaining to qualification

Page 67: [Slides3]

6767

PRUF Type I: pertaining to modification

X is very small

X is much larger than Y

Eleanor was very upset

The Man with the blond hair is very tall

PRUF Type II: pertaining to composition

X is small and Y is large (conjunctive composition)

X is small or Y is large (disjunctive composition)

If X is small then Y is large (conditional and conjunctive composition)

If X is small then Y is large else Y is very large (conditional and conjunctive composition)

Page 68: [Slides3]

6868

PRUF – Type III: pertaining to quantification

Most Swedes are tall

Many men are much taller than most men

Most tall men are very intelligent

PRUF – Type IV: pertaining to qualification

Abe is young is not very true (truth qualification)
Abe is young is quite probable (probability qualification)
Abe is young is almost impossible (possibility qualification)

Page 69: [Slides3]

6969

Proposition p

p : N is F

Modified proposition p+

p+ = N is mF

m: not, very, more or less, quite, extremely, etc.

PRUF Type I: pertaining to modification
Rules of Type I: the basis is the modifier

Example (approximate):
p: Vera and Pat are friends          →  π_(Name1 = Vera, Name2 = Pat) = µ_FRIENDS
p*: Vera and Pat are close friends   →  π = µ_FRIENDS²  (close friends treated as an intensified FRIENDS relation)

Example:
Lisa is very young   →   π_Age(Lisa) = µ_YOUNG²

Page 70: [Slides3]

7070

PRUF Type II: pertaining to composition
Rules of Type II: operations of composition

p: M is F
q: N is G
r = p * q  (composition of p and q)

M is F and N is G      →   (X, Y) is F ∩ G:    µ(u, v) = µ_F(u) ∧ µ_G(v)
M is F or N is G       →   (X, Y) is F ∪ G:    µ(u, v) = µ_F(u) ∨ µ_G(v)
If M is F then N is G  →   (X, Y) is F′ ⊕ G:   µ(u, v) = 1 ∧ (1 − µ_F(u) + µ_G(v))
                        or  (X, Y) is F′ ∪ (F ∩ G)

∧: min;  ∨: max;  ⊕: bounded arithmetic sum;  ′: complement (1 − µ)

Page 71: [Slides3]

7171

PRUF Type II: pertaining to composition
Rules of Type II: operations of composition

Example:  U = V = 1 + 2 + 3;   M: X,  N: Y
F: SMALL = 1/1 + 0.6/2 + 0.1/3
G: LARGE = 0.1/1 + 0.6/2 + 1/3

X is small and Y is large:  (X, Y) =
[ 0.1  0.6  1.0 ]
[ 0.1  0.6  0.6 ]
[ 0.1  0.1  0.1 ]

X is small or Y is large:  (X, Y) =
[ 1.0  1.0  1.0 ]
[ 0.6  0.6  1.0 ]
[ 0.1  0.6  1.0 ]

if X is small then Y is large (1 ∧ (1 − µ_F + µ_G) form):  (X, Y) =
[ 0.1  0.6  1.0 ]
[ 0.5  1.0  1.0 ]
[ 1.0  1.0  1.0 ]

if X is small then Y is large ((1 − µ_F) ∨ (µ_F ∧ µ_G) form):  (X, Y) =
[ 0.1  0.6  1.0 ]
[ 0.4  0.6  0.6 ]
[ 0.9  0.9  0.9 ]
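A minimal numerical sketch of these compositions, with SMALL and LARGE as defined above; the two if-then matrices follow the formulas named in the reconstruction above:

```python
import numpy as np

# U = V = {1, 2, 3}
small = np.array([1.0, 0.6, 0.1])
large = np.array([0.1, 0.6, 1.0])

and_rel = np.minimum.outer(small, large)                 # X is small and Y is large
or_rel  = np.maximum.outer(small, large)                 # X is small or Y is large
imp_bs  = np.minimum(1.0, 1.0 - small[:, None] + large)  # 1 ^ (1 - mu_F + mu_G)
imp_max = np.maximum(1.0 - small[:, None],
                     np.minimum(small[:, None], large))  # (1 - mu_F) v (mu_F ^ mu_G)

for name, rel in [("and", and_rel), ("or", or_rel),
                  ("if-then (bounded sum)", imp_bs), ("if-then (max)", imp_max)]:
    print(name)
    print(np.round(rel, 2))
```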

Page 72: [Slides3]

7272

PRUF Type II: pertaining to composition
Rules of Type II: operations of composition

If M is F then N is G else N is H
  = (If M is F then N is G) and (If M is not F then N is H)

i.e., (X1, …, Xm, Y1, …, Yn) is (F′ ⊕ G) ∩ (F ⊕ H)

Page 73: [Slides3]

7373

PRUF Type II: pertaining to composition
Rules of Type II: operations of composition

A relation R may be expressed as a disjunction of conjunctions:

R =  X1 is F11 and X2 is F12 and … and Xn is F1n
 OR  X1 is F21 and X2 is F22 and … and Xn is F2n
 OR  …
 OR  X1 is Fm1 and X2 is Fm2 and … and Xn is Fmn

R     X1     X2     …     Xn
      F11    F12    …     F1n
      F21    F22    …     F2n
      …      …      …     …
      Fm1    Fm2    …     Fmn

Page 74: [Slides3]

7474

Compactification Algorithm Interpretation
A Simple Algorithm for Qualitative Analysis

Rule Extraction and Building Decision Trees

Dr. Nikravesh and Prof. Zadeh (2005); (Zadeh, 1976)

Page 75: [Slides3]

7575

Compactification Algorithm Interpretation

A1     A2     …    An     F1
a11    a12    …    a1n    a1
a21    a22    …    a2n    a2
…      …      …    …      …
am1    am2    …    amn    am

Test attribute set:   a1  a2  …  an   →   ?b

Page 76: [Slides3]

7676

Table 1 (intermediate results)

[Table: the initial group of rows over attributes A1, A2, A3 (entries a11, a21, a31, a12, a22, a13, a23), all with output F1 = a1, is compacted in Pass (1), Pass (2) and Pass (3); at each pass, attribute entries that do not affect the output are replaced by the wildcard *]

Page 77: [Slides3]

7777

MAXIMALLY COMPACT REPRESENTATION

[Figure: the compacted attribute table (A1, A2, A3 with output F1) containing wildcard (*) entries, the equivalent decision tree (root Q with branch nodes Q1, Q2, Q3 and branches labeled a1, a2), and the IF-THEN rules read off the table, of the form IF A1 is a(.) and A2 is a(.) and A3 is a(.) THEN F1 is a1]

Page 78: [Slides3]

7878

[Figure/derivation: numerical illustration of building the decision tree rooted at Q. For each node Q1, Q2, Q3 the candidate attribute values a(.) receive scores (values such as 0.40, 0.60, 0.20, 0.30, 0.50, 0.75, 0.80, 0.25), the scores are aggregated as weighted averages C_ave over the branches, and the best-scoring attribute is attached to the node]

Page 79: [Slides3]

7979

PRUF Type II: pertaining to composition
Rules of Type II: operations of composition

X              Y
small          large
very small     not very large
not small      very small

small → SMALL;  large → LARGE;  very small → SMALL²;  not small → SMALL′;  not very large → (LARGE²)′

R = SMALL × LARGE + SMALL² × (LARGE²)′ + SMALL′ × SMALL²

Page 80: [Slides3]

8080

Type III: pertaining to quantification
Rules of Type III: p: Q N are F

Q: fuzzy quantifier (most, many, few, some, almost all, …)
Example: Most Swedes are tall

The quantifier constrains the relative count of N's that are F, e.g. for "Most Swedes are tall" the proportion ΣCount(TALL)/ΣCount(SWEDES) is MOST.

Modifier rules for N is mF (as in Type I):
µ_mF = (µ_F)²        if m: very
µ_mF = (µ_F)^0.5     if m: more or less
µ_mF = 1 − µ_F       if m: not
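A minimal sketch of the relative sigma-count reading of "Q N are F"; both the tallness grades and the membership function for MOST below are illustrative assumptions, not definitions from the slides:

```python
# "Most Swedes are tall": relative sigma-count of TALL, then evaluate MOST at it.
# All numbers are illustrative assumptions.
tall_grades = [0.9, 0.8, 1.0, 0.4, 0.7, 0.95, 0.6, 0.85]   # mu_TALL for 8 Swedes

def mu_most(r):
    # Assumed piecewise-linear MOST: 0 below 0.5, 1 above 0.8.
    return min(1.0, max(0.0, (r - 0.5) / 0.3))

rel_sigma_count = sum(tall_grades) / len(tall_grades)
print(rel_sigma_count, mu_most(rel_sigma_count))
```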

Page 81: [Slides3]

8181

Type III: pertaining to quantification
Rules of Type III: p: Q N are F

not (Q N are F)   →   (not Q) N are F
m (Q N are F)     →   (m Q) N are F

more or less (M is F and N is G)  →  M is more or less F and N is more or less G
very (M is F and N is G)          →  M is very F and N is very G
not (M is F and N is G)           →  (X, Y) is (F × G)′,  i.e.  M is not F or N is not G
m (M is F and N is G)             →  (X, Y) is m(F × G)

more or less (N is F)  →  N is more or less F
very (N is F)          →  N is very F
not (N is F)           →  N is not F
m (N is F)             →  N is mF

Page 82: [Slides3]

8282

PRUF – Type IV: pertaining to qualification
Rules of Type IV: q: p is τ

p: N is F;  q: N is F is τ, a truth-qualified version of p
τ: a linguistic truth value (more generally: a truth value, a probability value, or a possibility value)

N is F is u-true  ↔  N is F,  where µ_u-true(v) = v,  v ∈ [0, 1]

If  N is F  →  π_X(u) = µ_F(u),  then
N is F is τ  →  π_X(u) = µ_τ(µ_F(u))

Example:
q: N is small is very true
µ_SMALL(u) = 1 − S(u; 5, 10, 15),   µ_TRUE(v) = S(v; 0.6, 0.8, 1.0)
π_q(u) = [S(1 − S(u; 5, 10, 15); 0.6, 0.8, 1.0)]²

Page 83: [Slides3]

8383

PRUF – Type IV: pertaining to qualification
Rules of Type IV: q: p is τ

m (N is F is τ)             →  N is F is m τ
not (N is F is τ)           →  N is F is not τ
very (N is F is τ)          →  N is F is very τ
more or less (N is F is τ)  →  N is F is more or less τ

N is not F is τ            →  N is F is ant τ,   where ant: antonym, µ_ant τ(v) = µ_τ(1 − v);  false = ant true
N is very F is τ           →  N is F is ²τ,      where µ_²τ(v) = µ_τ(v²)
N is more or less F is τ   →  N is F is ^0.5 τ,  where µ_^0.5 τ(v) = µ_τ(v^0.5)

Page 84: [Slides3]

8484

PRUF – Type IV: pertaining to qualification
Rules of Type IV: q: p is τ

Question: "Is Barbara rich?"   Answer: "not very true."

i.e., "Barbara is rich is not very true"

Semantically equivalent propositions:
Barbara is not very rich
Barbara is not very rich is u-true
Barbara is very rich is ant u-true
Barbara is rich is ^0.5 (ant u-true),   where µ_ant u-true(v) = µ_u-true(1 − v)

So the proposition is approximated, semantically, by "Barbara is not very rich".

Page 85: [Slides3]

8585

Page 86: [Slides3]

8686

NEED FOR NEW TOOLS

Tools in current use: BL (bivalent logic) + PT (standard bivalent-logic-based probability theory)

New tools: FL (fuzzy logic), GC (Generalized Constraint), GCR, CW, CTPM, GTU, PNL

PT: standard bivalent-logic-based probability theory
CTPM: Computational Theory of Precisiation of Meaning
PNL: Precisiated Natural Language
CW: Computing with Words
GTU: Generalized Theory of Uncertainty
GCR: Theory of Generalized-Constraint-Based Reasoning

Page 87: [Slides3]

8787

PRECISIATED NATURAL LANGUAGE

Page 88: [Slides3]

8888

WHAT IS PRECISIATED NATURAL LANGUAGE (PNL)?

PRELIMINARIES

• a proposition, p, in a natural language, NL, is precisiable if it is translatable into a precisiation language

• in the case of PNL, the precisiation language is the Generalized Constraint Language, GCL

• the precisiation of p, p*, is an element of GCL (a GC-form)

Page 89: [Slides3]

8989

WHAT IS PNL?

PNL is a sublanguage of precisiable propositions in NL which is equipped with (a) a dictionary from NL to GCL; (b) a dictionary from GCL to PFL (Protoform Language); and (c) a collection of rules of deduction (rules of generalized constraint propagation) expressed in PFL.

Page 90: [Slides3]

9090

PRECISIATED NATURAL LANGUAGE (PNL)

NL → GCL

p (a precisiable proposition in NL) is translated (precisiated, explicitated) into p*: a generalized constraint form of type r, GC(p): X isr R, an element of the precisiation language GCL

Page 91: [Slides3]

9191

PNL AND THE COMPUTATIONAL THEORY OF PERCEPTIONS

in the computational theory of perceptions (CTP), perceptions are dealt with through their descriptions in a natural language

perception = descriptor(s) of perception

• a proposition, p, in NL qualifies to be an object of computation in CTP if p is in PNL

Page 92: [Slides3]

9292

DEFINITION OF p: ABOUT 20-25 MINUTES

[Figure: four definitions of p ("about 20-25 minutes") on the time axis. c-definition: crisp interval from 20 to 25. f-definition: fuzzy interval around [20, 25]. f.g-definition: fuzzy graph. PNL-definition: Prob (Time is A) is B.]

Page 93: [Slides3]

9393

PRECISIATION OF “approximately a,” *a

[Figure: precisiation of *a ("approximately a"). s-precisiation: a singleton at a. cg-precisiation: an interval around a, a probability distribution p(x), a possibility distribution, or a fuzzy graph. g-precisiation: a generalized constraint (maximal generality).]

Page 94: [Slides3]

9494

CONTINUED

[Figure: a bimodal probability distribution p over x]

g-precisiation (GCL-based, maximal generality): *a is precisiated as a GC-form, X isr R

Page 95: [Slides3]

9595

Page 96: [Slides3]

9696

THE CENTERPIECE OF PNL IS THE CONCEPT OF A GENERALIZED CONSTRAINT (ZADEH 1986)

Page 97: [Slides3]

9797

THE BASICS OF PNL

The point of departure in PNL is the key idea: a proposition, p, drawn from a natural language, NL, is precisiated by expressing its meaning as a generalized constraint

p  →  X isr R
X: constrained (focal) variable
R: constraining relation
r: identifier of modality (type of constraint)

In general, X, R and r are implicit in p

The concept of a generalized constraint serves as a bridge from natural languages to mathematics

Page 98: [Slides3]

9898

GENERALIZED CONSTRAINT (Zadeh 1986)

• Bivalent constraint (hard, inelastic, categorical): X ∈ C, with C a constraining bivalent relation

• Generalized constraint: X isr R
  X: constrained variable
  R: constraining non-bivalent (fuzzy) relation
  r: index of modality (defines semantics)

r: = | ≤ | ⊂ | blank | p | v | u | rs | fg | ps | …
(bivalent and non-bivalent (fuzzy) modalities)

Page 99: [Slides3]

9999

CONTINUED

• constrained variable X:
  • X is an n-ary variable, X = (X1, …, Xn)
  • X is a proposition, e.g., Leslie is tall
  • X is a function of another variable: X = f(Y)
  • X is conditioned on another variable: X/Y
  • X has a structure, e.g., X = Location(Residence(Carol))
  • X is a generalized constraint: X: Y isr R
  • X is a group variable: there is a group, G[A]: (Name1, …, Namen), with each member Namei, i = 1, …, n, associated with an attribute value Ai (possibly vector-valued); symbolically G[A]: (Name1/A1 + … + Namen/An)

Basically, X is a relation

Page 100: [Slides3]

100100

SIMPLE EXAMPLES

“Check-out time is 1 pm,” is an instance of a generalized constraint on check-out time

“Speed limit is 100km/h” is an instance of a generalized constraint on speed

“Vera is a divorcee with two young children,” is an instance of a generalized constraint on Vera’s age

Page 101: [Slides3]

101101

GENERALIZED CONSTRAINT—MODALITY r

X isr R

r: =      equality constraint: X = R is an abbreviation of X is= R
r: ≤      inequality constraint: X ≤ R
r: ⊂      subsethood constraint: X ⊂ R
r: blank  possibilistic constraint: X is R; R is the possibility distribution of X
r: v      veristic constraint: X isv R; R is the verity distribution of X
r: p      probabilistic constraint: X isp R; R is the probability distribution of X
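The modality index r lends itself to a simple tagged representation; the sketch below is an illustrative data structure (an assumption, not an implementation from the slides), with the modality codes taken from the list above:

```python
from dataclasses import dataclass
from typing import Any

@dataclass
class GeneralizedConstraint:
    """Illustrative representation of a generalized constraint 'X isr R'."""
    variable: str      # constrained (focal) variable X
    modality: str      # r: "" (possibilistic), "v", "p", "u", "rs", "fg", "ps", ...
    relation: Any      # constraining relation R (e.g., a fuzzy set or distribution)

# "Eva is young" -> possibilistic constraint on Age(Eva)
eva_age = GeneralizedConstraint(variable="Age(Eva)", modality="", relation="YOUNG")

# "usually (Temperature is warm)" -> usuality constraint
usual_temp = GeneralizedConstraint(variable="Temperature", modality="u", relation="WARM")
print(eva_age, usual_temp)
```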

Page 102: [Slides3]

102102

CONTINUED

r: rs random set constraint; X isrs R; R is the set-valued probability distribution of X

r: fg fuzzy graph constraint; X isfg R; X is a function and R is its fuzzy graph

r: u usuality constraint; X isu R means usually (X is R)

r: g group constraint; X isg R means that R constrains the attribute-values of the group

• Primary constraints: possibilistic, probabilistic and veristic

• Standard constraints: bivalent possibilistic, probabilistic and bivalent veristic

Page 103: [Slides3]

103103

Page 104: [Slides3]

104104

BASIC PROBLEM

search engine:
• identification of query-relevant information
• relevance-ranking of query-relevant information

question-answering system = search engine + deduction from query-relevant information (meta-deduction)

Page 105: [Slides3]

105105

[Diagram: INFORMATION flows into a SEARCH ENGINE, which produces QUERY-RELEVANT INFORMATION; a Q/A SYSTEM and SYNTHESIS produce DECISION-RELEVANT INFORMATION; a DEDUCTION ENGINE turns this into a DECISION and a COMMAND]

Page 106: [Slides3]

106106

CFS

Unit

USER

Spiders-Crawlers

Indexed WebPages Search Index

User Query

Retrieval

Web

SVM

Un-Supervised

Clustering

DB

CFS

UnitWeb

Analyzer

Terms

InLink

OutLink

tf.idf

Eqv. tf.idf

Eqv. tf.idf

Aggregation

SOM

Community Builder

DataMiner

Visualization

Image Analyzer

Image Analyzer and Annotation

Image Query

and Retrieval

Image Annotation

Image Extraction

Summarization

Deduction

Q/A System

Summarization Deduction Q/A System

Structured Information

Semantic Web

Intelligent System Analyzer

i.e. Diagnosis-Prognosis

Experts Knowledge

Model Representation Including Linguistic Formulation

• Functional Requirements• Constraints• Goals and Objectives• Linguistic Variables Requirement

Input From Decision Makers

Model Management • Query• Aggregation• Ranking• Fitness Evaluation

Evolutionary KernelGenetic Algorithm, Genetic Programming, and DNA

• Selection• Cross Over• Mutation

Model and

Data Visualization

Data Management

Un-Structured Information

User

User InterfaceDialog Function

Knowledge Base Editor

Inference Engine

Recommendation, Advice, and Explanation

KnowledgeRefinement

DataIF … THEN

Rule

Knowledge of Engineer

Knowledge Base

users ask for advice or provide preferences

inferences & conclusion

advises the user andexplains the logic

expertise is transferred and

it is stored

Data Sources and Warehouse

(databases)

Knowledge Representation, Data Visualization and

Visual Interactive Decision Making

Knowledge Discovery

and Data Mining

Generate Knowledge

Organize Knowledge Bases

Expert Knowledge


Concept-Based Intelligent Decision Analysis

BISC-DSS

Deductive Web Engine

Beyond the Semantic Web

NeuFCSearch

Page 107: [Slides3]

107107

[ 0, 1]

[ tf-idf]

[set]

Term-Document Matrix

The use of Fuzzy Set Theory

The use of statistical-Probabilistic TheoryThe use of bivalent-logic Theory

The use of Fuzzy Set-Object-Based Theory

Specialization

Imprecise Search

Lycos, etc.

GA-GP Context-Based tf-idf; Ranked tf-idf

Topic, Title,

Summarization

Concept-Based Indexing

Keyword search; classical techniques; Google, Teoma, etc.

Use Graph Theory and Semantic Net. NLP with GA-GP Based NLP; Possibly AskJeeves.

NeuFCS

RBF

PRBF

GRRBF

ANFIS

RBFNN (BP, GA-GP, SVM)

Probability

Bayesian

Fuzzy

NNnet(BP, GA-GP, SVM)

LSI

FCS Based on Neuroscience Approach

Classical Search

NeuSearch: Neuroscience ApproachSearch Engine Based on Conceptual Semantic Indexing

w(i, j)

Neuro-Fuzzy Conceptual Search (NeuFCS)
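The left branch of this comparison rests on classical tf-idf term-document weighting; a minimal sketch, using a tiny made-up corpus (an illustrative assumption):

```python
import math
from collections import Counter

# Tiny illustrative corpus; tf-idf(term, doc) = tf * log(N / df).
docs = ["fuzzy logic and natural language",
        "natural language computing with words",
        "search engine ranking with tf idf"]
tokenized = [d.split() for d in docs]
N = len(tokenized)
df = Counter(term for doc in tokenized for term in set(doc))

def tf_idf(term, doc_tokens):
    tf = doc_tokens.count(term)
    return tf * math.log(N / df[term]) if term in df else 0.0

print(tf_idf("language", tokenized[0]), tf_idf("ranking", tokenized[2]))
```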

Page 108: [Slides3]

108108

Page 109: [Slides3]

109109

w(j, k) = f(p(j, k), p(j), p(k))

Documents Space or

Concept and Context Space

Based on SOM or PCA

Word Space

Concept-Context Dependent Word Space

w(i, j) = f(p(i, j), p(i), p(j))

W(i, j) is calculated based on Fuzzy-LSI or Probabilistic LSI

(In general form, it can be Calculated based on PNL)

i: neuron in word layer

j: neuron in document or Concept-Context layer

k: neuron in word layer

Document

(Corpus)

Neu-FCS
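The slides leave f unspecified beyond its arguments p(i, j), p(i), p(j); one plausible instantiation (an assumption, not the authors' definition) is pointwise mutual information computed from word-document co-occurrence counts:

```python
import numpy as np

# Rows: words i, columns: documents/concept-contexts j. Counts are illustrative.
counts = np.array([[4, 0, 1],
                   [2, 3, 0],
                   [0, 1, 5]], dtype=float)

total = counts.sum()
p_ij = counts / total                       # joint p(i, j)
p_i = p_ij.sum(axis=1, keepdims=True)       # marginal p(i)
p_j = p_ij.sum(axis=0, keepdims=True)       # marginal p(j)

# w(i, j) = f(p(i,j), p(i), p(j)) instantiated as PMI (0 where no co-occurrence)
with np.errstate(divide="ignore"):
    w = np.where(p_ij > 0, np.log(p_ij / (p_i * p_j)), 0.0)
print(np.round(w, 2))
```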

Page 110: [Slides3]

110110

Page 111: [Slides3]

111111

ORGANIZATION OF WORLD KNOWLEDGEEPISTEMIC (KNOWLEDGE-DIRECTED) LEXICON (EL)

(ONTOLOGY-RELATED)

i (lexine): object, construct, concept (e.g., car, Ph.D. degree)

K(i): world knowledge about i (mostly perception-based)
K(i) is organized into n(i) relations R_i1, …, R_in
entries in R_ij are bimodal-distribution-valued attributes of i
values of attributes are, in general, granular and context-dependent

network of nodes and links
w_ij = granular strength of association between i and j

[Figure: lexine node i linked to lexine node j by relation r_ij with strength w_ij; K(i) is attached to node i]

Page 112: [Slides3]

112112

EPISTEMIC LEXICON

lexine i   --- r_ij --->   lexine j

r_ij:
  i is an instance of j (is or isu)
  i is a subset of j (is or isu)
  i is a superset of j (is or isu)
  j is an attribute of i
  i causes j (or usually)
  i and j are related

Page 113: [Slides3]

113113

GENERALIZED CONSTRAINT

• standard constraint: X ∈ C
• generalized constraint: X isr R

X isr R: GC-form (generalized constraint form of type r)
  X: constrained variable
  isr: copula, with r a type identifier
  R: constraining relation

• X = (X1, …, Xn)
• X may have a structure: X = Location(Residence(Carol))
• X may be a function of another variable: X = f(Y)
• X may be conditioned: (X/Y)
• r: = | ≤ | ⊂ | ⊃ | blank | v | p | u | rs | fg | ps | …

Page 114: [Slides3]

114114

CONTINUED

r: rs random set constraint; X isrs R; R is the set-valued probability distribution of X

r: fg fuzzy graph constraint; X isfg R; X is a function and R is its fuzzy graph

r: u usuality constraint; X isu R means usually (X is R)

r: ps Pawlak set constraint: X isps (X_lower, X_upper) means that X is a set and X_lower and X_upper are the lower and upper approximations to X

Page 115: [Slides3]

115115

Page 116: [Slides3]

116116

Based on the PNL approach, w(i, j) is defined in terms of r(i, j), where w(i, j) is the granular strength of association between i and j and r(i, j) is the epistemic-lexicon relation between them (w(i, j) <== r(i, j)):

r_ij:
  i is an instance of j (is or isu)
  i is a subset of j (is or isu)
  i is a superset of j (is or isu)
  j is an attribute of i
  i causes j (or usually)
  i and j are related

[Diagram: node i connected to node j by r_ij with weight W_ij]

Page 117: [Slides3]

117117

Original keywords Extended keyword

Concept-Context Nodes (RBF Nodes)

Wi,j Wj,k

Original Documents Extended Documents

Concept-Context Nodes (RBF Nodes)

W’i,j W’j,k

NeuSearch Model

Page 118: [Slides3]

118118

[Figure: an RBF network. The input vector X = (x1, x2, x3, …, xN) is mapped through m1 nonlinear basis functions φ_1(X), φ_2(X), …, φ_m1(X) (increased dimension, N → m1, so the data are more likely to be linearly separable), and the output is the weighted sum y = Σ_{i=1}^{m1} w_i φ_i(X)]

Radial Basis Functions are used to extract Concepts
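A minimal sketch of the RBF mapping described above, using Gaussian basis functions with illustrative (randomly chosen) centers and weights:

```python
import numpy as np

def rbf_features(X, centers, width=1.0):
    # phi_i(X) = exp(-||X - c_i||^2 / (2 * width^2)): N-dim input -> m1-dim features
    d2 = ((X[None, :] - centers) ** 2).sum(axis=1)
    return np.exp(-d2 / (2 * width ** 2))

rng = np.random.default_rng(0)
centers = rng.normal(size=(8, 3))   # m1 = 8 basis functions over a 3-dim input (illustrative)
weights = rng.normal(size=8)        # output weights w_i

X = np.array([0.2, -0.5, 1.0])
phi = rbf_features(X, centers)
y = weights @ phi                   # y = sum_i w_i * phi_i(X)
print(y)
```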

Page 119: [Slides3]

119119

[Figure: RBF networks as on the previous slide]

SOMs are used to make a 2-D plot of Concepts

Page 120: [Slides3]

120120

[Figure: N parallel RBF networks, one per concept (Concept 1, Concept 2, …, Concept n); each maps the input X = (x1, …, xN) through nonlinear basis functions φ_i(X) to an output y = Σ_i w_i φ_i(X)]

N Concepts are extracted based on SOMs and RBFs

SOM/LVQ are used to make a 2-D plot of Concepts (supervised and unsupervised)

Page 121: [Slides3]

121121

PNL-Based Conceptual Fuzzy Sets Using Neuroscience

Interconnection based on

Mutual Information

i: neuron in document layer

j: neuron in word layer

r_ij:
  i is an instance of j (is or isu)
  i is a subset of j (is or isu)
  i is a superset of j (is or isu)
  j is an attribute of i
  i causes j (or usually)
  i and j are related

w(i, j) <== r_ij

Word Space — Concept-Context Dependent Word Space

i: neuron in word layer
j: neuron in document or Concept-Context layer
k: neuron in word layer

Document (Corpus)

w(i, j) = f(p(i, j), p(i), p(j))
w(j, k) = f(p(j, k), p(j), p(k))

Page 122: [Slides3]

122122

Word Space

Input: Word

Neu-FCS

Activated Document or

Concept-Context

Output: Concept-Context Dependent Word

Page 123: [Slides3]

123123

Activated Document or

Concept-Context

Word Space

Input: Word

Neu-FCSOutput: Concept-Context Dependent Word

Document

(Corpus)

Page 124: [Slides3]

124124

FC-DNA as a basis for Common Sense Knowledge, Human Reasoning and

Deduction

Page 125: [Slides3]

125125

FC-DNA as a basis for Next Generation of Concept-Based Search Engine

Page 126: [Slides3]

126126

Masoud Nikravesh and Germano Resconi