
5 On Explicandum versus Explicatum

A Few Elementary Remarks on the Birth of Innovative Notions in Fuzzy Set Theory (and Soft Computing)

Settimo Termini

5.1 Introduction

The aim of this paper is twofold. First of all I want to present some old ideas revisited in the light of some of the many interesting new developments that have occurred over the last ten years in the field of the foundations of fuzziness. Secondly, I wish to present a tentative general framework in which it is possible to compare different attitudes and different approaches to the clarification of the conceptual problems arising from fuzziness and soft computing. In what follows, then, I shall use some names as banners to indicate a (crucial) problem (i.e., Carnap's problem, von Neumann's problem, Galileian science, Aristotelian science and so on). Although it will be clear by reading the following pages, the use of these reference names (the association of a name with a certain problem) should not be considered the result of a historically grounded, profound investigation but only a sort of slogan for a specific position and point of view, an indication which, of course, I hope (and trust) does not patently contradict the historical evidence regarding the scientific attitudes, approaches and preferences of the named persons. In some cases the problem associated with a certain scientist may not be as central to his scientific interests as the connection proposed here, and my slogan, could suggest. It should, then, be taken as a sort of working hypothesis which remains independent of the historical accuracy of the label assigned to the problem itself. Above, I mentioned "old" views, ideas and reflections on these themes. They refer mainly to discussions I have had over the years with old friends, some of them also involved in the 2009 meeting. Finally, let me add that some (relatively) recent work (see [14], [15]) has vigorously indicated additional interesting points of view which open new ways of tackling problems that have already emerged as well as newly emerging ones.

5.2 Carnap’s Problem

It is well known that Rudolf Carnap, in the first pages of his Logical Foundations of Probability [2], faced the (difficult) problem of the ways and procedures according to which a prescientific concept (which by its very nature is inexact) is transformed into a (new) exact scientific concept. He called this transformation (the transition from the explicandum, the informal, qualitative, inexact prescientific notion, to the explicatum, its scientific, quantitative, exact substitute) the procedure of explication, a procedure which, as Carnap immediately observed, presents a paradoxical aspect. While in ordinary scientific problems, he observes, "both the datum and the solution are, under favorable conditions, formulated in exact terms ... in a problem of explication the datum, viz., the explicandum, is not given in exact terms; if it were, no explication would be necessary. Since the datum is inexact, the problem itself is not stated in exact terms; and yet we are asked to give an exact solution. This is one of the puzzling peculiarities of explication". A comment of a general kind is necessary at this point, although it may look like a digression. Two immediate corollaries of Carnap's argument are the following. First, we shall never be able to state that we have a proof that a proposed formal candidate is THE explicatum of a certain explicandum. Secondly, we can have different competing explicata of the same explicandum. Regarding the first point, let me recall what Carnap rightly observed: "It follows that, if a solution for a problem of explication is proposed, we cannot decide in an exact way whether it is right or wrong. Strictly speaking, the question whether the solution is right or wrong makes no good sense because there is no clearcut answer. The question should rather be whether the proposed solution is satisfactory, whether it is more satisfactory than another one, and the like." The impossibility of having a rigorous proof does not imply, however, that we cannot but proceed in a hazy way. The observation should not encourage sloppiness. In fact, "There is a temptation to think that, since the explicandum cannot be given in exact terms anyway, it does not matter much how we formulate the problem. But this would be quite wrong."¹ Although we cannot reach absolute precision, we can, and must, do our best to use the maximum of exactness or precision allowed by the problem under consideration, in the given conditions. All these comments of Carnap's are of extraordinary importance for our issues.

¹ He continues affirming: "On the contrary, since even in the best case we cannot reach full exactness, we must, in order to prevent the discussion of the problem from becoming entirely futile, do all we can to make at least practically clear what is meant as the explicandum. What X means by a certain term in contexts of a certain kind is at least practically clear to Y if Y is able to predict correctly X's interpretation for most of the simple, ordinary cases of the use of the term in those contexts. It seems to me that, in raising problems of analysis or explication, philosophers very frequently violate this requirement. They ask questions like: 'What is causality?', 'What is life?', 'What is mind?', 'What is justice?', etc. Then they often immediately start to look for an answer without first examining the tacit assumption that the terms of the question are at least practically clear enough to serve as a basis for an investigation, for an analysis or explication. Even though the terms in question are unsystematic, inexact terms, there are means for reaching a relatively good mutual understanding as to their intended meaning. An indication of the meaning with the help of some examples for its intended use and other examples for uses not now intended can help the understanding. An informal explanation in general terms may be added. All explanations of this kind serve only to make clear what is meant as the explicandum; they do not yet supply an explication, say, a definition of the explicatum; they belong still to the formulation of the problem, not yet to the construction of an answer".


There is no sharp separation between the exactness of proofs or of empirical verifications and the world of everyday language and the use of words. Also in the case of "unsystematic, inexact terms, there are means for reaching a relatively good mutual understanding as to their intended meaning". At the very root of "the procedure of explication" there is, then, a puzzling aspect, together with the indication of a way not to remain blocked by the puzzle. And this way makes implicit reference to a different kind of "exactness" which works in practice, although it would be useless (if not impossible) to try to construct a general theoretical understanding of the phenomenon.

Let us briefly dwell on the second corollary indicated above. While it looks obviously true, there exists today, however, a noteworthy exception. The informal, intuitive, everyday notion of "computable" presents a unique explicatum (thanks to the demonstrable equivalence of various different formal definitions). So the so-called and well-known Church-Turing thesis, while remaining "a thesis" (something which cannot be rigorously proved, precisely for the reasons discussed by Carnap: we are confronting an informal idea with a formal definition), is corroborated in a very strong way, stronger than one could think possible by prima facie considering the motivations provided by Carnap: any conceivable newly proposed explicatum turns out to be "equivalent" to previously considered proposals.

Let me briefly summarize what, in my view, can be fruitfully "stolen" from Carnap's observations for our problem. First of all, his "procedure of explication" can be used as a very good starting point for looking in a unified way at the foundational problems of Soft Computing (SC). We have a uniform way of looking at and comparing the different notions used in SC and the ways in which they have been and are "regimented". Also the ways suggested by Carnap to clarify the aims and qualifications of the explicandum go hand in hand with what has been done in SC. A difference remains in the general approach: the explicatum, in Carnap's view, cannot but be something exactly defined in the traditional terms of the hard sciences. All his observations, however, on the ways in which we can transform a very rough idea into something sufficiently clear to be considered reasonable as the explicandum of a certain concept provide tools for using his ideas also along different paths.

In order to proceed a little bit further, let me quote in extenso what Carnap writes regarding the methodology we could follow in order to define a formal explicatum starting from the original informal intuitive notion. "A concept must fulfil the following requirements in order to be an adequate explicatum for a given explicandum: (1) similarity to the explicandum, (2) exactness, (3) fruitfulness, (4) simplicity. Suppose we wish to explicate a certain prescientific concept, which has been sufficiently clarified by examples and explanations as just discussed. What is the explication of this concept intended to achieve? To say that the given prescientific concept is to be transformed into an exact one, means, of course, that an exact concept corresponding to the given concept is to be introduced. What kind of correspondence is required here between the first concept, the explicandum, and the second, the explicatum? Since the explicandum is more or less vague and certainly more so than the explicatum, it is obvious that we cannot require the correspondence between the two concepts to be a complete coincidence. But one might perhaps think that the explicatum should be as close to or as similar with the explicandum as the latter's vagueness permits."

A few comments are needed. Let us preliminarily recall the already mentioned unique known counterexample to the very natural general situation described by Carnap (namely, "that we cannot require the correspondence between the two concepts to be a complete coincidence"): the concept of computation. My insisting on this fact is motivated both by its intrinsic interest and by a question which will be asked subsequently in the paper. Secondly, let me observe that the second requirement asked of a good explicatum (which is also the only one of a "formal" type) is exactness. Carnap does not require mathematization or similar things but only (only!) exactness. Here we see in action the hand of the Master. What is asked for is not a formalization, an axiomatization or the like, but exactness. So his scheme can also be used in fields different from the traditional hard sciences, paying only the (necessary) price of a preliminary clarification of the form of exactness we are able to use in the given domain, and of accepting that the explicata of some central concepts of a theory aspiring to grasp some aspects of the considered domain are such just in virtue of the fact that they satisfy the requirement of the proposed form of exactness. In what follows, I shall call Carnap's problem the problem of analysing the concepts and notions used to model and describe selected pieces of reality according to the procedure indicated by him and briefly described and commented upon above.

5.3 A Unifying Framework for Looking at Typical Case Studies

As I have already observed at the end of the previous Section, Carnap's problem can be used for (at least) two different purposes. First, as a guide for looking at problems of interpretation and at new developments in some classical areas: in these cases it plays the role of providing an important reference point. Secondly, it can be used as a tool for putting order into the analysis of some difficult new problems. In our case a very important example is provided precisely by the general theme of this meeting, namely, the relationships existing between SC and the human sciences. Another important and crucial topic is that of analyzing the long-range methodological innovation provided by Zadeh's CW and CTP (see, for instance, [37], [38]). In the present Section I shall briefly indicate how some classical problems can be seen in the perspective outlined in the previous Section. What will be presented, then, is nothing more than a very brief outline of a research programme, leaving a detailed treatment of the problems and subjects involved to future contributions. What follows also has the aim of providing a preliminary test of the validity of the epistemological working hypothesis that Carnap's problem can be a useful tool. In the remaining part of this Section I shall, then, present some scattered observations on a few selected topics.

5.3.1 The Revisitation of Basic Logical Principles

Recently Enric Trillas (see, for instance, [28]) has posed the problem of the validity of logical principles starting from various technical results of fuzzy logic. Of course, that the very acceptance of different truth values besides true and false would pose interpretative problems for the unquestioned logical principles was already observed by the founding fathers of many-valued logic, namely Jan Lukasiewicz and Emil Post (see, for instance, [5]). However, the way in which the problem is posed now is more general, since different, not purely logical, aspects are also involved. A complete and satisfactory analysis of the problem requires that both the old questions and the new context in which we move today be taken into account. Among the latter we can mention the richness and the proliferation of technical tools and results, sometimes associated with a scarce awareness, we could say, of the conceptual implications of the implicit assumptions behind some of the paths followed, and in some cases a sort of (technical and conceptual) lack of significance of some developments. Carnap's scheme can be fruitfully used for a preliminary classification and clarification of at least some of these problems and developments.

5.3.2 Vagueness vs. Fuzziness

What the correct relationship is between the two notions of vagueness and fuzziness is an old and debated problem; incidentally, one more challenging – in my view – than that of the relationship between fuzziness and probability. While in the latter case, in fact, we are comparing two explicata (the interesting problem being the "inverse" one of understanding of which explicanda they are good explicata), by comparing vagueness and fuzziness we are faced with a lot of interesting open problems: is vagueness the explicandum and fuzziness the explicatum (or one of its explicata)? If not, what kinds of relations can we establish or study between these two notions (incomparable as regards their different levels of formalization)? For a review of different aspects of these questions see [18], [19], [20] and the whole volume [16]. Carnap's scheme, in my view, allows us to look at this net of interesting questions in a very general way, making it possible to examine their different facets within a unified setting. For instance, an interesting comment made by Terricabras and Trillas [17] – "Fuzzy sets can be seen as the best approximation of a quantitative, extensional representation of vagueness" – acquires a new light in this context.

5.3.3 Measuring Fuzziness (Or Controlling Booleanity?)

If one accepts the existence of fuzzy theories as mathematically meaningful descriptions of concrete systems of a high level of complexity, the question arises whether it is possible to control (and measure) the level of fuzziness present in the considered description. This is the starting point which triggered the development of the so-called "entropy measures" or "measures of fuzziness" of a fuzzy set (see, for instance, [6], [8], [7], [10]). This problem was tackled in a very general way by following the simple idea of proceeding axiomatically in two stages, strictly connecting requirements to be imposed and measures satisfying these requirements, keeping in mind, however, the fact that not every requirement one could abstractly envisage can always be imposed in each specific situation. In other words, the axioms should not all be put on the same level. One should pick out some basic properties and requirements, necessary to characterize something as a "measure of fuzziness" of a fuzzy set; other requirements could be imposed depending on the particular situation under study. By proceeding in this way, one has available a wide class of measures from which one can choose the most adequate for the specific problem under study.

As far as measures of fuzziness are concerned, in my view, the basic kernel of the theory may be considered fairly complete now, after the general classification of the various families of measures provided by Ebanks [3]. However, some interesting problems remain. The first, at the interpretative level, is whether we should interpret these measures as measures of fuzziness in a very general sense, or whether we should look at them as tools for controlling "booleanity", as Enric Trillas has often posed the problem. The second, strictly related to the previous question, is the analysis of the conceptual relationships existing between the standard (axiomatic) theory and conceptually different approaches. I refer, in particular, to the original way of facing the problem of measuring how far a fuzzy set is from a classical characteristic function due to Ron Yager [31] (for a few remarks, see also [9]). His proposal allows one to look at the problem of intuitive ideas versus formal results from another point of view. His challenging idea is that of measuring the distance, or the distinction, between a fuzzy set and its negation, and the technical tool for doing so is provided by the lattice-theoretical notion of "betweenness". It can be shown that in all the cases in which it is possible to define Yager's measure it is also possible to define a measure of fuzziness in the axiomatic sense discussed above. Yager's point of view, then, provides a very interesting new visualization, but technically it does not allow one to extend the class of measures, as one might have expected given the conceptual difference of the starting point. The general problem of the nature of, and the differences among, the various ways of approaching the problem is still in need of conceptual clarification. Moreover, the discovery of new general requirements to be imposed in the form of new axioms does not appear very likely as long as we remain inside the basic scheme of the theory of fuzzy sets with the standard connectives. What is open to future work is the adaptation of the proposed axiomatic scheme to some variants of that scheme: we may, for instance, change the connectives or also the range of the generalized characteristic functions. The problem of measuring vagueness in general remains completely open, i.e., the problem of constructing a general theory of measures of vagueness, a goal which presupposes the existence of a general formal theory of vague predicates. The alternative of finding paths that could be followed "to measure" in a non-quantitative (non-numerical) way, however interesting and appealing it may appear, seems far more difficult even to envisage.
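To fix ideas, here is a minimal sketch (in Python, purely for illustration) of the two viewpoints just mentioned: a logarithmic measure of fuzziness in the spirit of the axiomatic approach of [6], and a distance between a fuzzy set and its standard negation 1 - mu, in the spirit of Yager's proposal [31]. The function names are invented for this example, and the Minkowski-type distance is an illustrative simplification, not Yager's lattice-theoretic formulation via "betweenness"; the sketch only exhibits the extreme cases: a characteristic function has zero fuzziness and maximal distance from its negation, while the fuzzy set with all degrees equal to 0.5 coincides with its negation.

import math

def fuzziness_entropy(mu, k=1.0):
    # Logarithmic measure of fuzziness of a finite fuzzy set given by its
    # membership degrees mu (each in [0, 1]), in the style of [6].
    def s(x):
        return -x * math.log(x) if 0.0 < x < 1.0 else 0.0  # 0 * log 0 := 0
    return k * sum(s(m) + s(1.0 - m) for m in mu)

def distance_from_negation(mu, p=1):
    # Minkowski-type distance between a fuzzy set and its standard negation
    # 1 - mu (an illustrative simplification of Yager's idea, not his definition).
    return sum(abs(2.0 * m - 1.0) ** p for m in mu) ** (1.0 / p)

crisp = [0.0, 1.0, 1.0, 0.0]   # a classical characteristic function
half = [0.5, 0.5, 0.5, 0.5]    # the "maximally fuzzy" set on four elements

print(fuzziness_entropy(crisp), distance_from_negation(crisp))  # 0.0  4.0
print(fuzziness_entropy(half), distance_from_negation(half))    # ~2.77  0.0

In the same spirit, whenever the distance from the negation can be defined, a measure of fuzziness in the axiomatic sense can be obtained from it, which is the technical content of the remark made above.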

5.3.4 von Neumann’s Problem

All the developments of the theory of fuzzy sets corroborate von Neumann's conviction that escaping the constraints of all-or-none concepts would allow one to introduce (and use) results and techniques of mathematical analysis in the field of logic, thus increasing the flexibility of logical tools and widening their application to different fields.² But one can also see a non-trivial connection between measures of fuzziness and von Neumann's remarks on the role of the presence of error in logics, seen as an essential part of the considered process.³ The program of constructing a calculus of a thermodynamical type, which could be considered a development of von Neumann's vision of the role of error in "the physical implementation of logic", to quote his own words, was explicitly mentioned already in 1979 (see [8]). Let me observe that, to my knowledge, the first attempt to proceed technically in the direction indicated by von Neumann is due to Enric Trillas, who in [27] approached the problem of the logical connective of "negation" (in a subsequent paper [1] other connectives were taken into account). Measures of fuzziness are indeed an element which could contribute, inside the general framework of the theory of fuzzy sets, to the construction of a sort of "thermodynamical logic". They can, in fact, be viewed as a particular way of studying the levels of precision of a description. From this point of view they already represent a treatment of error of a "thermodynamical" type in some definite – albeit still vague – sense. Moreover, they can be inserted into logical inference schemes in which approximation, vagueness, partial or revisable information play a role either at the level of the reliability of the premises or of the inference rules (or both). Although a satisfactory and fairly complete integration of all these aspects remains to be achieved, we can mention, however, a possible development based on the fact that inside the theory of fuzzy sets many different measures have been developed. This will appear clearly after the description of an attempt at outlining the basis of a possible treatment of the dynamics of information (and in this context my use of the expression "thermodynamical logic" will become clear).

² "There exists today a very elaborate system of formal logic, and, specifically, of logic as applied to mathematics. This is a discipline with many good sides, but also with certain serious weaknesses. ... About the inadequacies ... this may be said: Everybody who has worked in formal logic will confirm that it is one of the most refractory parts of mathematics. The reason for this is that it deals with rigid, all-or-none concepts, and has very little contact with the continuous concept of the real or of the complex number, that is, with mathematical analysis. Yet analysis is the technically most successful and best-elaborated part of mathematics. Thus formal logic is, by the nature of its approach, cut off from the best cultivated portions of mathematics, and forced onto the most difficult part of the mathematical terrain, into combinatorics." ([11], p. 303)

³ "The subject matter ... is the role of error in logics, or in the physical implementation of logics – in automata synthesis. Error is viewed, therefore, not as an extraneous and misdirected or misdirecting accident, but as an essential part of the process under consideration ..." ([12], p. 329)

5.3.5 Towards an "Information Dynamics"

The informal notion of "information" (the explicandum) is very rich and multifaceted, and so it is not strange that the formal theories that have been proposed do not capture all the nuances of the informal notion. One could consider isolating some meaningful and coherent subsets of the properties and features of the explicandum and look for satisfying formalizations of these aspects. Since they are different aspects of one unique general concept, in any case, we must also pick out and study the way in which these sub-aspects interact. The process suggested above points, then, not to a very general but static theory of information, in which a unique formal quantity is able to take on the burden of a multifaceted informal notion, but instead to an information dynamics in which what the theory controls is a whole process (along with – under the pressure of changes in the boundary conditions – the relative changes of the main central (sub)notions involved in the theory itself and their mutual interactions). In this way we pass from a situation in which there is only one central notion on the stage to another in which a report of what is happening in a process (in which information is transmitted, exchanged and the like) is provided by many actors on the stage, each of which represents one partial aspect of what the informal use of the word information carries with it. This scenario resembles that of thermodynamics: no single notion suffices for determining what is happening in the system, and the value assumed by one of the thermodynamical quantities can be obtained only as a function of (some of) the others, by knowing (and working out) the quantitative relationships existing among them. That is what an information dynamics must look for: its principles, laws which quantitatively state the connections existing among some of the central quantities of the theory.

5.3.6 Infodynamics of Fuzzy Sets

But let us see what can happen if we try to apply the ideas of this very general scheme to the case of fuzzy sets. In [22] I tried to outline how a program of this type could be pursued in the setting of the theory of fuzzy sets. I shall now briefly summarize the general ideas, showing the connection with the remarks made above on "von Neumann's problem". It is well known that many quantities have been introduced to provide a global (one could say, "macroscopic") control of the information conveyed by a fuzzy set: for instance, measures of fuzziness, energy measures (or "fuzzy cardinalities") (see [6], [8], [7], [10]) or measures of specificity (see [33], [34]). Measures of fuzziness aim to provide an indication of how far a certain fuzzy set departs from a classical characteristic function; measures of specificity, instead, aim to provide an indication of how closely a fuzzy set approaches a singleton. These two classes of measures certainly control different aspects of the information conveyed by a fuzzy set; they are not, however, conceptually unrelated. For instance, if the measure of fuzziness of a certain fuzzy set is maximal (i.e., all the elements have a "degree of belonging" equal to 0.5), then we indirectly know something about specificity. Conversely, if the measure of specificity informs us that we are dealing exactly with a singleton, we can immediately calculate the corresponding measure of fuzziness. But a relationship between these measures also exists in less extreme cases; in order to refine the way in which this kind of knowledge can be exchanged, one could also think of introducing other measures. An important role in this sense can be played by the (generalized) cardinality of the fuzzy set and (if not all the elements are on an equal footing) also by a weighted cardinality, for instance the so-called "energy" of a fuzzy set [8]. It would then be very interesting to have some explicit quantitative relations among these measures, since this would provide a way of transforming our knowledge regarding one or two of these quantities into a (more or less approximate) knowledge of the remaining one(s). All this material – this was the suggestion given in [22] – should be organized in a way similar to the structure of thermodynamics, listing principles and equations connecting the various central quantities. The final goal of the project is to obtain ways of calculating the values of one of these quantities once the values of the other ones are known, or to reconstruct the fuzzy set given the values of appropriate quantities.
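The following fragment (again Python, continuing the illustrative sketch of Sect. 5.3.3) makes this kind of exchange of knowledge concrete for a finite fuzzy set, computing a measure of fuzziness, the "energy" (here simply the sum of membership degrees, optionally weighted) and a simple linear specificity index. The specificity formula and all function names are assumptions introduced for this example (Yager's measure of specificity [33] is defined differently, via the α-cuts of the distribution); the fragment only illustrates the two extreme cases discussed above, in which the value of one quantity constrains the others.

import math

def fuzziness(mu):
    # Logarithmic measure of fuzziness, as in the sketch of Sect. 5.3.3.
    s = lambda x: -x * math.log(x) if 0.0 < x < 1.0 else 0.0
    return sum(s(m) + s(1.0 - m) for m in mu)

def energy(mu, w=lambda x: x):
    # "Energy" / weighted cardinality: the sum of (weighted) membership degrees;
    # with the identity weight it is just the sigma-count of the fuzzy set.
    return sum(w(m) for m in mu)

def specificity(mu):
    # Illustrative linear specificity index: 1 for a singleton (one element at
    # degree 1, the rest at 0), smaller as membership spreads over more elements.
    # NOT Yager's original definition [33], which integrates over alpha-cuts.
    if not mu:
        return 0.0
    ordered = sorted(mu, reverse=True)
    rest = ordered[1:]
    return ordered[0] - (sum(rest) / len(rest) if rest else 0.0)

singleton = [1.0, 0.0, 0.0, 0.0]
half = [0.5, 0.5, 0.5, 0.5]

# A singleton is maximally specific, has zero fuzziness and unit energy ...
print(specificity(singleton), fuzziness(singleton), energy(singleton))  # 1.0 0.0 1.0
# ... while the maximally fuzzy set is as unspecific as possible.
print(specificity(half), fuzziness(half), energy(half))                 # 0.0 ~2.77 2.0

An "infodynamics" in the sense of [22] would have to go beyond such extreme cases, stating explicit laws that relate these quantities for arbitrary fuzzy sets, so that knowledge of some of them yields (approximate) knowledge of the remaining ones.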

So we see that, by taking Carnap's problem as a general guide, it is possible to outline a development of von Neumann's problem which goes far beyond the short (although very dense and – considering the time at which they were written – very innovative) comments we have quoted above.

5.4 As a Sort of Conclusion

To examine the relationships existing between SC and the human and social sciences is very important and crucial since, through a simple inspection, many forgotten old epistemological problems reemerge, showing the major role that they can play for understanding, in a non-sectorial way, what we could call the "enterprise of scientific investigation".⁴ At the same time a host of new problems involving completely new questions appears. So there exist different planes along which we can (and should) move. First of all, it is very important to collect all the possible interactions and applications and explore different ways of obtaining other useful applications. Secondly, we should explore the reason why the interaction has not been more extensive up to now. I have briefly indicated why, in my view, this phenomenon is not accidental: the true reasons may reside in the fact that interacting in a substantial, and not merely occasional, way can cause the emergence of deep (and, of course, both difficult and interesting) problems. But the question deserves to be investigated more extensively than I have done, in a very cursory way, in the previous pages. Third, there exist crucial and difficult problems which in my view are a corollary of the reasons for the missed interaction but which, independently of this, deserve to be studied and analyzed with intellectual passion and scientific care. Before concluding, let me, then, list a few items that – in my view – will be of more and more crucial interest in the future, since they stand at the crossroads of conceptual questions and the possibility of scientific (technical) developments along non-traditional paths:

⁴ In this regard, let me stress that I am looking at these problems from the specific (and limited) point of view which can be of interest to the working scientist. For instance, when reflecting on vagueness, I share the very pragmatic point of view of van Heijenoort in [4] and leave out all the interesting and subtle considerations present in the philosophical literature on the subject. Let me stress that all these questions should be read in the general context of the development of Cybernetics and Information Sciences. I refer to [21], [23], [24] for a few (still very preliminary) remarks. Please note that many arguments in Section 5.3 are borrowed almost literally from [23].


a) Galileian science vs Aristotelian science. With these very pompous terms I want to refer to two different attitudes which have been and still are present in scientific investigation. The traditional (Galileian) attitude, in its standard fields of investigation, proceeds by bold hypotheses and rigorous empirical control of the consequences of these same hypotheses, using sophisticated formal machinery on the theoretical side and equally sophisticated technological machinery for the empirical control. What I call "Aristotelian" science refers to a more descriptive attitude. A first preliminary discussion of my view of this opposition can be found in [25], and related comments on "family resemblances" between the human sciences and fuzzy set theory in [26] in this same volume.

b) There remains also another non-trivial problem, namely that of taking perceptions as the starting point, without reducing them to, or recovering and reconstructing them from, other classical numerical components. But this is another story, which indicates both a strong connection of Zadeh's new proposals with the questions raised in point a) above and the fact that we should also take Husserl's conception of science into account.

c) The relationships existing between Zadeh's CWW and the Church-Turing thesis.

To my knowledge, point b) has not been previously discussed, at least with reference to problems of SC. Point c) has been discussed, for instance, in [35], [30], along the traditional line of proving that the new model does not violate the Church-Turing thesis; or, put in different terms, c) has been approached in the sense of proving (under suitable "numerical" translations of CWW procedures) that the model of computation inspired by CWW is equivalent to one of the classical models of computation. Although this is very interesting, a major challenge goes exactly in the other direction: to analyze whether by manipulating words we can do "computations" (in a specific sense still to be defined) in a more general way, one which is not reducible to the classical notion of computation. Let me conclude by saying that all the developments of the last 45 years in what is now known as the huge field of Soft Computing should be periodically confronted with the original setting proposed by Zadeh in [36], looking also at the first developments of the original ideas (see [32]). This could help in appreciating extensions of the general perspective but also – something epistemologically relevant – conceptual shifts and programmatic drifts. Finally, let me point out that Enric Trillas, in his recent paper [29], which recollects some scientific exchanges with Italy, indicates in a very concise and challenging way a few crucial problems to approach in our young and intellectually stimulating field of investigation.

Acknowledgements

I want to thank Enric Trillas for many thought-provoking questions and discussions over many, many years, and Rudi Seising for equally interesting discussions over the last few years.


References

[1] Alsina, C., Trillas, E., Valverde, L.: On Some Logical Connectives for Fuzzy Sets Theory. Journal of Mathematical Analysis and Applications 93, 15–26 (1983)

[2] Carnap, R.: Logical Foundations of Probability. Chicago University Press (1950)

[3] Ebanks, B.R.: On Measures of Fuzziness and their Representations. Journal of Mathematical Analysis and Applications 94, 24–37 (1983)

[4] van Heijenoort, J.: Frege and Vagueness. In: van Heijenoort, J. (ed.) Selected Essays, pp. 85–97. Bibliopolis, Naples (1985)

[5] Lukasiewicz, J.: Philosophical remarks on many-valued systems of propositional logic. In: Borkowski, L. (ed.) Selected Works. Studies in Logic and the Foundations of Mathematics, pp. 153–178. North-Holland, Amsterdam; Polish Scientific Publishers, Warszawa (1970)

[6] De Luca, A., Termini, S.: A definition of a non probabilistic entropy in the setting of fuzzy sets theory. Information and Control 20, 301–312 (1972); reprinted in: Dubois, D., Prade, H., Yager, R.R. (eds.) Readings in Fuzzy Sets for Intelligent Systems, pp. 197–202. Morgan Kaufmann (1993)

[7] De Luca, A., Termini, S.: Entropy of L-fuzzy Sets. Information and Control 24, 55–73 (1974)

[8] De Luca, A., Termini, S.: Entropy and energy measures of a fuzzy set. In: Gupta, M.M., Ragade, R.K., Yager, R.R. (eds.) Advances in Fuzzy Set Theory and Applications, pp. 321–338. North-Holland, Amsterdam (1979)

[9] De Luca, A., Termini, S.: On Some Algebraic Aspects of the Measures of Fuzziness. In: Gupta, M.M., Sanchez, E. (eds.) Fuzzy Information and Decision Processes, pp. 17–24. North-Holland, Amsterdam (1982)

[10] De Luca, A., Termini, S.: Entropy Measures in Fuzzy Set Theory. In: Singh, M.G. (ed.) Systems and Control Encyclopedia, pp. 1467–1473. Pergamon Press, Oxford (1988)

[11] von Neumann, J.: The General and Logical Theory of Automata. In: Cerebral Mechanisms in Behaviour – The Hixon Symposium. J. Wiley, New York (1951); reprinted in [13], pp. 288–328

[12] von Neumann, J.: Probabilistic Logics and the Synthesis of Reliable Organisms from Unreliable Components. In: Shannon, C.E., McCarthy, J. (eds.) Automata Studies. Princeton University Press (1956); reprinted in [13], pp. 329–378

[13] von Neumann, J.: Design of Computers, Theory of Automata and Numerical Analysis. Collected Works, vol. V. Pergamon Press, Oxford (1961)

[14] Seising, R.: The Fuzzification of Systems. The Genesis of Fuzzy Set Theory and Its Initial Applications – Its Development to the 1970s. STUDFUZZ, vol. 216. Springer, Berlin (2007)

[15] Seising, R. (ed.): Views on Fuzzy Sets and Systems from Different Perspectives. Philosophy and Logic, Criticisms and Applications. STUDFUZZ, vol. 243. Springer, Berlin (2009)

[16] Skala, H.J., Termini, S., Trillas, E. (eds.): Aspects of Vagueness. Reidel, Dordrecht (1984)

[17] Terricabras, J.-M., Trillas, E.: Some remarks on vague predicates. Theoria 10, 1–12 (1988)

[18] Termini, S.: Aspects of Vagueness and Some Problems of their Formalization. In: [16], pp. 205–230

[19] Termini, S.: Vagueness in Scientific Theories. In: Singh, M.G. (ed.) Systems and Control Encyclopedia, pp. 4993–4996. Pergamon Press, Oxford (1988)

[20] Termini, S.: Vague Predicates and the Traditional Foundations of Mathematics. In: International Congress for Logic, Methodology and the Philosophy of Science, Salzburg (1983)

[21] Termini, S.: Remarks on the development of Cybernetics. Scientiae Mathematicae Japonicae 64(2), 461–468 (2006)

[22] Termini, S.: On some vagaries of vagueness and information. Annals of Mathematics and Artificial Intelligence 35, 343–355 (2002)

[23] Termini, S.: Imagination and Rigor: their interaction along the way to measuring fuzziness and doing other strange things. In: Termini, S. (ed.) Imagination and Rigor, pp. 157–176. Springer, Milan (2006)

[24] Termini, S. (ed.): Imagination and Rigor. Springer, Milan (2006)

[25] Termini, S.: Concepts, Theories, and Applications: the role of "experimentation" for formalizing new ideas along innovative avenues. In: Trillas, E., Bonissone, P., Magdalena, L., Kacprzyk, J. (eds.) Experimentation and Theory: Hommage to Abe Mamdani. STUDFUZZ. Physica-Verlag (to appear, 2011)

[26] Termini, S.: On some "family resemblances" of Fuzzy Set Theory and Human Sciences. In: Seising, R., Sanz, V. (eds.), this volume

[27] Trillas, E.: Sobre funciones de negación en la teoría de subconjuntos difusos. Stochastica III, 47–59; an English version appeared in: Barro, S., Bugarín, A., Sobrino, A. (eds.) Advances of Fuzzy Logic, pp. 31–43. Press of the Universidad de Santiago de Compostela, Spain (1998)

[28] Trillas, E.: Non Contradiction, Excluded Middle, and Fuzzy Sets. In: Di Gesù, V., Pal, S.K., Petrosino, A. (eds.) WILF 2009. LNCS (LNAI), vol. 5571, pp. 1–11. Springer, Heidelberg (2009)

[29] Trillas, E.: Il Laboratorio/Istituto di Cibernetica e la mia vita. In: Greco, P., Termini, S. (eds.) Memoria e progetto, pp. 23–32. GEM, Bologna (2010); an English version, not published, is also available

[30] Wang, H., Qiu, D.: Computing with words via Turing machines: a formal approach. IEEE Transactions on Fuzzy Systems 11(6), 742–753 (2003)

[31] Yager, R.R.: On the Measures of Fuzziness and Negation. II: Lattices. Information and Control 44, 236–260 (1980)

[32] Yager, R.R., Ovchinnikov, S., Tong, R.M., Nguyen, H.T. (eds.): Fuzzy Sets and Applications: Selected Papers by L.A. Zadeh. Wiley, New York (1987)

[33] Yager, R.R.: On the specificity of a possibility distribution. Fuzzy Sets and Systems 50, 279–292 (1992); reprinted in: Dubois, D., Prade, H., Yager, R.R. (eds.) Readings in Fuzzy Sets for Intelligent Systems, pp. 203–216. Morgan Kaufmann (1993)

[34] Yager, R.R.: Measures of information in generalized constraints. International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems 6, 519–532 (1998)

[35] Ying, M.S.: A formal model of computing with words. IEEE Transactions on Fuzzy Systems 10(5), 640–652 (2002)

[36] Zadeh, L.A.: Fuzzy sets. Information and Control 8, 338–353 (1965)

[37] Zadeh, L.A.: Foreword to: Dubois, D., Prade, H. (eds.) Fundamentals of Fuzzy Sets. Kluwer Academic Publishers (2000)

[38] Zadeh, L.A.: From Computing with Numbers to Computing with Words – From Manipulation of Measurements to Manipulation of Perceptions. International Journal of Applied Mathematics and Computer Science 12, 307–324 (2002)