Web Intelligence, World Knowledge and Fuzzy Logic. Studies in Fuzziness and Soft Computing, Vol. 164, Springer, 2005, pp. 1-18.


Web Intelligence, World Knowledge and Fuzzy Logic

Lotfi A. Zadeh
BISC Program, Computer Sciences Division, EECS Department
University of California, Berkeley, CA 94720, USA
Email: [email protected]
BISC Program URL: http://www-bisc.cs.berkeley.edu/
http://zadeh.cs.berkeley.edu/
Tel. (office): (510) 642-4959
Fax (office): (510) 642-1712

Abstract: Existing search engines, with Google at the top, have many remarkable capabilities; but what is not among them is deduction capability: the capability to synthesize an answer to a query from bodies of information which reside in various parts of the knowledge base. In recent years, impressive progress has been made in enhancing performance of search engines through the use of methods based on bivalent logic and bivalent-logic-based probability theory. But can such methods be used to add nontrivial deduction capability to search engines, that is, to upgrade search engines to question-answering systems? A view which is articulated in this note is that the answer is "No." The problem is rooted in the nature of world knowledge, the kind of knowledge that humans acquire through experience and education.

It is widely recognized that world knowledge plays an essential role in assessment of relevance, summarization, search and deduction. But a basic issue which is not addressed is that much of world knowledge is perception-based, e.g., "it is hard to find parking in Paris," "most professors are not rich," and "it is unlikely to rain in midsummer in San Francisco." The problem is that (a) perception-based information is intrinsically fuzzy; and (b) bivalent logic is intrinsically unsuited to deal with fuzziness and partial truth.

To come to grips with the fuzziness of world knowledge, new tools are needed. The principal new tool, a tool which is briefly described in this note, is Precisiated Natural Language (PNL). PNL is based on fuzzy logic and has the capability to deal with partiality of certainty, partiality of possibility and partiality of truth. These are the capabilities that are needed to be able to draw on world


    knowledge for assessment of relevance, and for summarization, search and deduction.

1. Introduction

In moving further into the age of machine intelligence and automated reasoning, we have reached a point where we can speak, without exaggeration, of systems which have a high machine IQ (MIQ) (Zadeh, [17]). The Web, and especially search engines, with Google at the top, fall into this category. In the context of the Web, MIQ becomes Web IQ, or WIQ, for short.

Existing search engines have many remarkable capabilities. However, what is not among them is the deduction capability: the capability to answer a query by a synthesis of information which resides in various parts of the knowledge base. A question-answering system is by definition a system which has this capability. One of the principal goals of Web intelligence is that of upgrading search engines to question-answering systems. Achievement of this goal requires a quantum jump in the WIQ of existing search engines [1].

Can this be done with existing tools such as the Semantic Web [3], Cyc [8], OWL [13] and other ontology-centered systems [12, 14], tools which are based on bivalent logic and bivalent-logic-based probability theory? It is beyond question that, in recent years, very impressive progress has been made through the use of such tools. But can we achieve a quantum jump in WIQ? A view which is advanced in the following is that bivalent-logic-based methods have intrinsically limited capability to address complex problems which arise in deduction from information which is pervasively ill-structured, uncertain and imprecise.

The major problem is world knowledge, the kind of knowledge that humans acquire through experience and education [5]. Simple examples of fragments of world knowledge are: usually it is hard to find parking near the campus in early morning and late afternoon; Berkeley is a friendly city; affordable housing is nonexistent in Palo Alto; almost all professors have a Ph.D. degree; most adult Swedes are tall; and Switzerland has no ports.

Much of the information which relates to world knowledge, and especially to underlying probabilities, is perception-based (Fig. 1). Reflecting the bounded ability of sensory organs, and ultimately the brain, to resolve detail and store information, perceptions are intrinsically imprecise. More specifically, perceptions are f-granular in the sense that (a) the boundaries of perceived classes are unsharp; and (b) the values of perceived attributes are granular, with a granule being a clump of values drawn together by indistinguishability, similarity, proximity or functionality [2].

Imprecision of perception-based information is a major obstacle to dealing with world knowledge through the use of methods based on bivalent logic and bivalent-logic-based probability theory, both of which are intolerant of imprecision and partial truth. What is the basis for this contention? A very simple example offers an explanation.


MEASUREMENT-BASED VS. PERCEPTION-BASED INFORMATION

measurement-based (numerical)            perception-based (linguistic)
it is 35                                 it is very warm
Eva is 28                                Eva is young
Tandy is three years older than Dana     Tandy is a few years older than Dana
                                         it is cloudy
                                         traffic is heavy
                                         Robert is very honest

Figure 1. Measurement-based and perception-based information

Suppose that I have to come up with a rough estimate of Pat's age. The information which I have is: (a) Pat is about ten years older than Carol; and (b) Carol has two children: a son, in mid-twenties; and a daughter, in mid-thirties. How would I come up with an answer to the query q: How old is Pat? First, using my world knowledge, I would estimate Carol's age, given (b); then I would add "about ten years" to the estimate of Carol's age to arrive at an estimate of Pat's age. I would be able to describe my estimate in a natural language, but I would not be able to express it as a number, interval or a probability distribution.

How can I estimate Carol's age given (b)? Humans have an innate ability to process perception-based information, an ability that bivalent-logic-based methods do not have; nor would such methods allow me to add "about ten years" to my estimate of Carol's age.

There is another basic problem: the problem of relevance. Suppose that instead of being given (a) and (b), I am given a collection of data which includes (a) and (b), and it is my problem to search for and identify the data that are relevant to the query. I come across (a). Is it relevant to q? By itself, it is not. Then I come across (b). Is it relevant to q? By itself, it is not. Thus, what I have to recognize is that, in isolation, (a) and (b) are irrelevant, that is, uninformative, but, in combination, they are relevant, i.e., informative. It is not difficult to recognize this in the simple example under consideration, but when we have a large database, the problem of identifying the data which in combination are relevant to the query is very complex.
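The flavor of the Pat's-age estimate can be caricatured with interval arithmetic: a crude, hedged stand-in for the perception-based reasoning described above. The intervals below (Carol plausibly 55 to 65 given a daughter in her mid-thirties, "about ten years" widened to 8 to 12) are illustrative assumptions, not values given in the text.

```python
# Minimal sketch, assuming illustrative intervals for the estimates.
def add_intervals(a, b):
    """Add two closed intervals [lo, hi] componentwise."""
    return (a[0] + b[0], a[1] + b[1])

carol_age = (55, 65)   # assumed estimate from world knowledge
about_ten = (8, 12)    # "about ten years" widened to an interval

pat_age = add_intervals(carol_age, about_ten)
print(pat_age)  # (63, 77)
```

The point of the sketch is its inadequacy: crisp intervals force sharp boundaries onto a perception that has none, which is exactly the mismatch the note goes on to address with fuzzy constraints.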


Suppose that a proposition, p, is, in isolation, relevant to q, or, for short, is i-relevant to q. What is the degree to which p is relevant to q? For example, to what degree is the proposition: Carol has two children: a son, in mid-twenties, and a daughter, in mid-thirties, relevant to the query: How old is Carol? To answer this question, we need a definition of measure of relevance. The problem is that there is no quantitative definition of relevance within existing bivalent-logic-based theories. We will return to the issue of relevance at a later point.

The example which we have considered is intended to point to the difficulty of dealing with world knowledge, especially in the contexts of assessment of relevance and deduction, even in very simple cases. The principal reason is that much of world knowledge is perception-based, and existing theories of knowledge representation and deduction provide no tools for this purpose. Here is a test problem involving deduction from perception-based information.

The Tall Swedes Problem

Perception: Most adult Swedes are tall, with adult defined as over about 20 years in age.
Query: What is the average height of Swedes?

A fuzzy-logic-based solution of the problem will be given at a later point.

A concept which provides a basis for computation and reasoning with perception-based information is that of Precisiated Natural Language (PNL) [15]. The capability of PNL to deal with perception-based information suggests that it may play a significant role in dealing with world knowledge. A quick illustration is the Carol example. In this example, an estimate of Carol's age, arrived at through the use of PNL, would be expressed as a bimodal distribution of the form

Age(Carol) is ((P1\V1) + ... + (Pn\Vn))

where the Vi are granular values of Age, e.g., less than about 20, between about 20 and about 30, etc.; Pi is a granular probability of the event (Age(Carol) is Vi), i = 1, ..., n; and + should be read as "and." A brief description of the basics of PNL is presented in the following.

2. Precisiated Natural Language (PNL)

PNL deals with perceptions indirectly, through their description in a natural language, Zadeh [15]. In other words, in PNL a perception is equated to its description in a natural language. PNL is based on fuzzy logic, a logic in which everything is, or is allowed to be, a matter of degree. It should be noted that a natural language is, in effect, a system for describing perceptions.


The point of departure in PNL is the assumption that the meaning of a proposition, p, in a natural language, NL, may be represented as a generalized constraint of the form (Fig. 2)

standard constraint: X ∈ C        generalized constraint: X isr R

where X is the constrained variable, R is the constraining relation, "isr" is the copula, and r is the modality identifier of the GC-form (generalized constraint form of type r).

X may have a structure: X = Location(Residence(Carol))
X may be a function of another variable: X = f(Y)
X may be conditioned: (X/Y)
r := ... / blank / v / p / u / rs / fg / ps / ...

Figure 2. Generalized constraint

where X is the constrained variable, R is a constraining relation which, in general, is not crisp (bivalent), and r is an indexing variable whose values define the modality of the constraint. The principal modalities are: possibilistic (r = blank); veristic (r = v); probabilistic (r = p); random set (r = rs); fuzzy graph (r = fg); usuality (r = u); and Pawlak set (r = ps). The set of all generalized constraints, together with their combinations, qualifications and rules of constraint propagation, constitutes the Generalized Constraint Language (GCL). By construction, GCL is maximally expressive. In general, X, R and r are implicit in p. Thus, in PNL the meaning of p is precisiated through explicitation of the generalized constraint which is implicit in p, that is, through translation into GCL (Fig. 3). Translation of p into GCL is exemplified by the following.

(a) Monika is young -> Age(Monika) is young.
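As a hedged illustration (not part of the paper), the triple (X, R, r) behind "X isr R" can be carried by a minimal data structure; the modality codes follow the list above, and the rendering method is an assumption for display only.

```python
# Minimal sketch of a generalized constraint "X isr R".
from dataclasses import dataclass

MODALITIES = {
    "": "possibilistic",    # r = blank
    "v": "veristic",
    "p": "probabilistic",
    "rs": "random set",
    "fg": "fuzzy graph",
    "u": "usuality",
    "ps": "Pawlak set",
}

@dataclass
class GeneralizedConstraint:
    variable: str        # constrained variable X
    relation: str        # constraining relation R (a fuzzy-set label)
    modality: str = ""   # indexing variable r

    def describe(self):
        return (f"{self.variable} is{self.modality} "
                f"{self.relation} ({MODALITIES[self.modality]})")

gc = GeneralizedConstraint("Age(Monika)", "young")
print(gc.describe())  # Age(Monika) is young (possibilistic)
```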


μ(u1, u2) = μA(u1) ∧ μB(u2)

where the conjunction, ∧, is taken to be min. More generally, conjunction may be taken to be a t-norm [11].

What is the rationale for introducing the concept of a generalized constraint? Conventional constraints are crisp (bivalent) and have the form X ∈ C, where X is the constrained variable and C is a crisp set. On the other hand, perceptions are f-granular, as was noted earlier. Thus, there is a mismatch between f-granularity of perceptions and crisp constraints. What this implies is that the meaning of a perception does not lend itself to representation as a collection of crisp constraints; what is needed for this purpose are generalized constraints or, more generally, an element of the Generalized Constraint Language (GCL) [15].

Representation of the meaning of a perception as an element of GCL is a first step toward computation with perceptions: computation which is needed for deduction from data which are resident in world knowledge. In PNL, a concept which plays a key role in deduction is that of a protoform, an abbreviation of "prototypical form" [15].

If p is a proposition in a natural language, NL, its protoform, PF(p), is an abstraction of p which places in evidence the deep semantic structure of p. For example, the protoform of "Monika is young" is "A(B) is C," where A is an abstraction of "Age," B is an abstraction of "Monika" and C is an abstraction of "young." Similarly,

Most Swedes are tall -> ΣCount(A/B) is Q,

where A is an abstraction of "tall Swedes," B is an abstraction of "Swedes" and Q is an abstraction of "most." Two propositions, p and q, are protoform-equivalent, or PF-equivalent, for short, if they have identical protoforms. For example, p: Most Swedes are tall, and q: Few professors are rich, are PF-equivalent.

The concept of PF-equivalence suggests an important mode of organization of world knowledge, namely protoform-based organization. In this mode of organization, items of knowledge are grouped into PF-equivalent classes. For example, one such class may be the set of all propositions whose protoform is A(B) is C, e.g., Monika is young. The partially instantiated class Price(B) is low would be the set of all objects whose price is low. As will be seen in the following, protoform-based organization of world knowledge plays a key role in deduction from perception-based information.
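Protoform-based organization can be sketched mechanically: given propositions paired with (hand-assigned, assumed) protoform strings, grouping by protoform yields the PF-equivalence classes. The pairings below mirror the examples in the text.

```python
# Sketch: grouping propositions into PF-equivalent classes.
from collections import defaultdict

propositions = [
    ("Monika is young",         "A(B) is C"),
    ("Most Swedes are tall",    "Count(A/B) is Q"),
    ("Few professors are rich", "Count(A/B) is Q"),
]

def group_by_protoform(items):
    classes = defaultdict(list)
    for proposition, protoform in items:
        classes[protoform].append(proposition)
    return dict(classes)

classes = group_by_protoform(propositions)
# "Most Swedes are tall" and "Few professors are rich" are PF-equivalent:
print(classes["Count(A/B) is Q"])
```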

Basically, abstraction is a means of generalization. Abstraction is a familiar and widely used concept. In the case of PNL, abstraction plays an especially important role because PNL abandons bivalence. Thus, in PNL, the concept of a protoform is not limited to propositions whose meaning can be represented within the conceptual structure of bivalent logic.

In relation to a natural language, NL, the elements of GCL may be viewed as precisiations of elements of NL. Abstraction of elements of GCL gives rise to


Figure 7. Basic structure of PNL: modular deduction database. The modules include a possibility module, a probability module, a fuzzy arithmetic module, a fuzzy logic module, an extension principle module, and a search module.

where A∘B is the composition of A and B, defined in the computational part, in which μA, μB and μA∘B are the membership functions of A, B and A∘B, respectively.


what is referred to as the Protoform Language, PFL (Fig. 4). A consequence of the concept of PF-equivalence is that the cardinality of PFL is orders of magnitude smaller than that of GCL or, equivalently, the set of precisiable propositions in NL. The small cardinality of PFL plays an essential role in deduction.

The basic idea: a perception p is described in a natural language as NL(p); precisiation translates NL(p) into GC(p), an element of GCL; abstraction of GC(p) yields the protoform PF(p), an element of PFL. GCL (Generalized Constraint Language) is maximally expressive.

Figure 4. Precisiation and abstraction

The principal components of the structure of PNL (Fig. 5) are: (1) a dictionary from NL to GCL; (2) a dictionary from GCL to PFL (Fig. 6); (3) a multiagent, modular deduction database, DDB; and (4) a world knowledge database, WKDB. The constituents of DDB are modules, with a module consisting of a group of protoformal rules of deduction, expressed in PFL (Fig. 7), which are drawn from a particular domain, e.g., probability, possibility, usuality, fuzzy arithmetic, fuzzy logic, search, etc. For example, a rule drawn from fuzzy logic is the compositional rule of inference [11], expressed as

symbolic part:       X is A; (X, Y) is B |- Y is A∘B

computational part:  μA∘B(v) = sup_u (μA(u) ∧ μB(u, v))
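On discretized universes the computational part of the compositional rule reduces to a sup-min composition, which can be sketched directly; the membership grades below are illustrative assumptions, not data from the paper.

```python
# Sketch: sup-min composition mu_{A o B}(v) = sup_u min(mu_A(u), mu_B(u, v)).
def compose(mu_a, mu_b):
    """mu_a: {u: grade}; mu_b: {(u, v): grade}. Returns {v: grade}."""
    vs = {v for (_, v) in mu_b}
    return {
        v: max(min(mu_a[u], mu_b.get((u, v), 0.0)) for u in mu_a)
        for v in vs
    }

mu_a = {1: 1.0, 2: 0.6}                               # fuzzy set A on U
mu_b = {(1, "x"): 0.5, (2, "x"): 0.9, (2, "y"): 0.4}  # fuzzy relation B on U x V
print(compose(mu_a, mu_b))  # x -> 0.6, y -> 0.4
```

Replacing min with another t-norm changes the conjunction, as noted above.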


Figure 5. Basic structure of PNL: a proposition p in NL is translated by dictionary 1 into its GC-form p* in GCL; dictionary 2 translates p* into its protoform p** in PFL; deduction is carried out in the deduction database, DDB, drawing on the world knowledge database, WKDB.

Figure 6. Structure of PNL: dictionaries. Dictionary 1 maps a proposition p in NL, e.g., most Swedes are tall, to its precisiation p* (GC-form): ΣCount(tall.Swedes/Swedes) is most. Dictionary 2 maps the GC-form p* to its protoform PF(p*): Q A's are B's.


The rules of deduction in DDB are, basically, the rules which govern propagation of generalized constraints. Each module is associated with an agent whose function is that of controlling execution of rules and performing embedded computations. The top-level agent controls the passing of results of computation from a module to other modules. The structure of protoformal, i.e., protoform-based, deduction is shown in Fig. 5. A simple example of protoformal deduction is shown in Fig. 8.

GC(p): Dana is young -> Age(Dana) is young; PF(p): X is A.
GC(p): Tandy is a few years older than Dana -> Age(Tandy) is (Age(Dana) + few); PF(p): Y is (X + B).

Protoformal rule:  X is A; Y is (X + B) |- Y is A + B, so that Age(Tandy) is (young + few),

with the computational part

μA+B(v) = sup_u (μA(u) ∧ μB(v − u)).

Figure 8. Example of protoformal reasoning

The principal deduction rule in PNL is the extension principle [19]. Its symbolic part is expressed as

f(X) is A |- g(X) is B

in which the antecedent is a constraint on X through a function of X, f(X); and the consequent is the induced constraint on a function of X, g(X). The computational part of the rule is expressed as

μB(v) = sup_u (μA(f(u)))

subject to


v = g(u)

To illustrate the use of the extension principle, we will consider the Tall Swedes Problem:

p: Most adult Swedes are tall.
q: What is the average height of adult Swedes?

Let P = {u1, ..., uN} be a population of Swedes, with the height of ui being hi, i = 1, ..., N, and μa(ui) representing the degree to which ui is an adult. The average height of adult Swedes is denoted as h_ave. The first step is precisiation of p, using the concept of relative ΣCount:

ΣCount(tall.adult.Swedes/adult.Swedes) is most

or, more explicitly,

(Σi μtall(hi) μa(ui)) / (Σi μa(ui)) is most.

The next step is precisiation of q:

h_ave is ?B, where B is a fuzzy number and h_ave = (Σi μa(ui) hi) / (Σi μa(ui)).

Using the extension principle, computation of h_ave as a fuzzy number is reduced to the solution of the variational problem

μB(v) = sup_{h1, ..., hN} μmost((Σi μtall(hi) μa(ui)) / (Σi μa(ui)))

subject to

v = (Σi μa(ui) hi) / (Σi μa(ui)).

    Note that we assume that a problem is solved once it is reduced to the solution of awell-defined computational problem.
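The relative ΣCount in the precisiation of p can be sketched numerically; the sample heights, the degrees of adulthood, and the piecewise-linear membership function for "tall" below are illustrative assumptions, not data from the paper.

```python
# Sketch: relative SigmaCount(tall.adult.Swedes / adult.Swedes)
#   = sum_i mu_tall(h_i) * mu_a(u_i) / sum_i mu_a(u_i).
def mu_tall(h):
    """Assumed membership of 'tall' for a height h in meters."""
    return min(1.0, max(0.0, (h - 1.70) / 0.15))

heights  = [1.65, 1.75, 1.80, 1.90]  # assumed sample of heights (m)
mu_adult = [1.0, 1.0, 0.5, 1.0]      # assumed degrees of adulthood

num = sum(mu_tall(h) * a for h, a in zip(heights, mu_adult))
den = sum(mu_adult)
print(round(num / den, 3))  # the degree to which adult Swedes are tall
```

Feeding this ratio into μmost, as in the variational problem above, would then yield the constraint on the population.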


3. PNL as a definition language

One of the important functions of PNL is that of serving as a high-level definition language. More concretely, suppose that I have a concept, C, which I wish to define, for example, the concept of distance between two real-valued functions, f and g, defined on the real line. The standard approach is to use a metric such as L1 or L2. But a standard metric may not capture my perception of the distance between f and g. The PNL-based approach would be to describe my perception of distance in a natural language and then precisiate the description.

This basic idea opens the door to (a) definition of concepts for which no satisfactory definitions exist, e.g., the concepts of causality, relevance and rationality, among many others, and (b) redefinition of concepts whose existing definitions do not provide a good fit to reality. Among such concepts are the concepts of similarity, stability, independence, stationarity and risk.

How can we concretize the meaning of "good fit"? In what follows, we do this through the concept of cointension. More specifically, let U be a universe of discourse and let C be a concept which I wish to define, with C relating to elements of U. For example, U is a set of buildings and C is the concept of tall building. Let p(C) and d(C) be, respectively, my perception and my definition of C. Let I(p(C)) and I(d(C)) be the intensions of p(C) and d(C), respectively, with "intension" used in its logical sense [6, 7], that is, as a criterion or procedure which identifies those elements of U which fit p(C) or d(C). For example, in the case of tall buildings, the criterion may involve the height of a building.

Informally, a definition, d(C), is a good fit or, more precisely, is cointensive, if its intension coincides with the intension of p(C). A measure of goodness of fit is the degree to which the intension of d(C) coincides with that of p(C). In this sense, cointension is a fuzzy concept. As a high-level definition language, PNL makes it possible to formulate definitions whose degree of cointensiveness is higher than that of definitions formulated through the use of languages based on bivalent logic.

A substantive exposition of PNL as a definition language is beyond the scope of this note. In what follows, we shall consider as an illustration a relatively simple version of the concept of relevance.
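One crude way to put a number on cointension, offered here purely as an assumption rather than the paper's formula, is to compare the membership grades assigned by a graded perception of "tall building" and by a crisp bivalent definition over a sample universe of heights, taking one minus the mean absolute difference.

```python
# Sketch: an assumed degree of cointension between p(C) and d(C)
# for C = "tall building", over a sample universe of heights (meters).
def mu_perception(h_m):
    """Assumed graded perception of 'tall building'."""
    return min(1.0, max(0.0, (h_m - 50) / 100))

def mu_crisp(h_m):
    """Crisp bivalent definition: tall iff at least 100 m (an assumption)."""
    return 1.0 if h_m >= 100 else 0.0

universe = [40, 60, 80, 100, 120, 160]
diffs = [abs(mu_perception(h) - mu_crisp(h)) for h in universe]
cointension = 1 - sum(diffs) / len(universe)
print(round(cointension, 3))  # 0.8
```

The crisp definition agrees with the perception at the extremes but misfits near the boundary, which is exactly where bivalence loses cointension.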

4. Relevance

We shall examine the concept of relevance in the context of a relational model such as shown in Fig. 9. For concreteness, the attributes A1, ..., An may be interpreted as symptoms and D as diagnosis. For convenience, rows which involve the same value of D are grouped together. The entries are assumed to be labels of fuzzy sets. For example, A5 may be blood pressure and a55 may be "high."


RELEVANCE, REDUNDANCE AND DELETABILITY

Figure 9. A relational model of decision: a decision table with rows Name1, ..., Namem, attribute columns A1, ..., An, and a diagnosis column D. Here Aj is the jth symptom, aij is the value of the jth symptom of Namei, and D is the diagnosis.

An entry represented as * means that the entry in question is conditionally redundant in the sense that its value has no influence on the value of D (Fig. 10).

Aj is conditionally redundant for Namer: A1 is ar1, ..., An is arn, if D is dr for all possible values of Aj in *. Aj is redundant if it is conditionally redundant for all values of Name. Deletable attributes are identified by the compactification algorithm (Zadeh, 1976), which is related to the Quine-McCluskey algorithm.

Figure 10. Conditional redundance and redundance

An attribute, Aj, is redundant, and hence deletable, if it is conditionally redundant for all values of Name. An algorithm, termed compactification, which identifies all deletable attributes is described in Zadeh [18]. The compactification algorithm is a

  • 8/9/2019 5_2005Web Intelligence, WK and FL-Studies in Fuzziness and Soft Computing-SpringerVol164-p1-18

    15/18

    15

generalization of the Quine-McCluskey algorithm for minimization of switching circuits. The reduct algorithm in the theory of rough sets [10] is closely related to the compactification algorithm.
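The redundance test above can be sketched for a crisp decision table: an attribute Aj is redundant (deletable) if, after deleting column j, no two rows with different diagnoses collide, i.e., varying aj never changes D. The toy table is an illustrative assumption; a compactification over fuzzy labels would be more involved.

```python
# Sketch: redundance (deletability) test for a crisp decision table.
def redundant(rows, j):
    """rows: list of (attribute-tuple, diagnosis). Aj is redundant if
    deleting column j never makes rows with different D collide."""
    seen = {}
    for attrs, d in rows:
        key = attrs[:j] + attrs[j + 1:]
        if seen.setdefault(key, d) != d:
            return False
    return True

table = [
    (("high", "yes"), "d1"),
    (("high", "no"),  "d1"),
    (("low",  "yes"), "d2"),
    (("low",  "no"),  "d2"),
]
print(redundant(table, 1))  # True: column 1 never influences D
print(redundant(table, 0))  # False: column 0 determines D
```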

Figure 11. Irrelevance: the proposition (Aj is arj) is irrelevant (uninformative) when, among the rows in which Aj is arj, D can still take any of its values d1, ..., dk.

The concept of relevance (informativeness) is weaker and more complex than that of redundance (deletability). As was noted earlier, it is necessary to differentiate between relevance in isolation (i-relevance) and relevance as a group. In the following, relevance should be interpreted as i-relevance.

D is ?d if Aj is arj: a constraint on Aj induces a constraint on D. Example: (blood pressure is high) constrains D. (Aj is arj) is uninformative if D is unconstrained. Aj is irrelevant if (Aj is arj) is uninformative for all arj. Irrelevance does not imply deletability.

Figure 12. Relevance and irrelevance

A value of Aj, say arj, is irrelevant (uninformative) if the proposition Aj is arj does not constrain D (Fig. 11). For example, the knowledge that blood pressure is high may convey no information about the diagnosis (Fig. 12).


An attribute, Aj, is irrelevant (uninformative) if, for all arj, the proposition Aj is arj does not constrain D. What is important to note is that irrelevance does not imply deletability, as redundance does. The reason is that Aj may be i-irrelevant but not irrelevant in combination with other attributes. An example is shown in Fig. 13.

As defined above, relevance and redundance are bivalent concepts, with no degrees of relevance or redundance allowed. But if the definitions in question are interpreted as idealizations, then a measure of the departure from the ideal could be used as a measure of the degree of relevance or redundance. Such a measure could be defined through the use of PNL. In this way, PNL may provide a basis for defining relevance and redundance as matters of degree, as they are in realistic settings. However, what should be stressed is that our analysis is limited to relational models. Formalizing the concepts of relevance and redundance in the context of the Web is a far more complex problem, a problem for which no cointensive solution is in sight.
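The i-relevance test can likewise be sketched for a crisp table: (Aj is a) constrains D exactly when conditioning on Aj = a rules out at least one value of D. The toy table below is an illustrative assumption.

```python
# Sketch: does the proposition (Aj is value) constrain D?
def constrains_d(rows, j, value):
    """rows: list of (attribute-tuple, diagnosis)."""
    all_d = {d for _, d in rows}
    matching_d = {d for attrs, d in rows if attrs[j] == value}
    return matching_d != all_d   # constrained iff some value of D is ruled out

table = [
    (("high", "yes"), "d1"),
    (("high", "no"),  "d2"),
    (("low",  "yes"), "d1"),
]
print(constrains_d(table, 0, "high"))  # False: both d1 and d2 remain possible
print(constrains_d(table, 0, "low"))   # True: only d1 remains
```

An attribute all of whose values fail this test is i-irrelevant, yet, as Fig. 13 illustrates, it may still matter in combination with other attributes.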

EXAMPLE. D: black or white. A1 and A2 are each irrelevant (uninformative) but not deletable, while A1.A2 is redundant (deletable).

Figure 13. Redundance and irrelevance

5. Concluding remark

Much of the information which resides in the Web, and especially in the domain of world knowledge, is imprecise, uncertain and partially true. Existing bivalent-logic-based methods of knowledge representation and deduction are of limited effectiveness in dealing with such information.


19. Zadeh, L.A. 1975. The concept of a linguistic variable and its application to approximate reasoning. Part I: Inf. Sci. 8, 199-249; Part II: Inf. Sci. 8, 301-357; Part III: Inf. Sci. 9, 43-80.