Original Article
International Journal of Fuzzy Logic and Intelligent Systems, Vol. 13, No. 4, December 2013, pp. 245-253
http://dx.doi.org/10.5391/IJFIS.2013.13.4.245
ISSN(Print) 1598-2645, ISSN(Online) 2093-744X

Associations Among Information Granules and Their Optimization in Granulation-Degranulation Mechanism of Granular Computing

Witold Pedrycz
Department of Electrical & Computer Engineering, University of Alberta, Edmonton, Canada; Department of Electrical and Computer Engineering, Faculty of Engineering, King Abdulaziz University, Jeddah, Saudi Arabia; and Systems Research Institute, Polish Academy of Sciences, Warsaw, Poland
Abstract
Knowledge representation realized by information granules is one of the essential facets of granular computing and an area of intensive research. Fuzzy clustering and clustering are general vehicles to realize the formation of information granules. The granulation-degranulation paradigm is one of the schemes determining and quantifying the functionality and knowledge representation capabilities of information granules. In this study, we augment this paradigm by forming and optimizing a collection of associations among original and transformed information granules. We discuss several transformation schemes and analyze their properties. A series of numeric experiments is provided, by means of which we quantify the improvement of the degranulation mechanisms offered by the optimized transformation of information granules.
Keywords: Information granules, Granulation-degranulation, Association among information granules, Fuzzy clustering, Granular computing
Received: Dec. 4, 2013; Revised: Dec. 23, 2013; Accepted: Dec. 24, 2013
Correspondence to: Witold Pedrycz ([email protected])
© The Korean Institute of Intelligent Systems

This is an Open Access article distributed under the terms of the Creative Commons Attribution Non-Commercial License (http://creativecommons.org/licenses/by-nc/3.0/) which permits unrestricted non-commercial use, distribution, and reproduction in any medium, provided the original work is properly cited.
1. Introduction
Information granules, being at the center of granular computing [1, 2], are fundamental concepts using which we perceive, structure, and process knowledge in a human-centric fashion. With the ongoing progress witnessed in granular computing, it has become apparent that a number of central problems still deserve our attention, including a way of constructing information granules and a process of communication with the external world of data when conveying information granules. The first group of tasks directly relates to clustering and fuzzy clustering. Clustering has been a central conceptual and algorithmic vehicle supporting the realization of information granules. We have witnessed a great deal of development in this domain [3], including numerous generalizations of clustering methods yielding the formation of higher order granular constructs, see [4], and granular fuzzy clusters [1], in general. There has been a visible role of fuzzy clustering in system modeling [5, 6]. The second problem, communication between the world of numeric data and granular entities, revolves around the fundamental concept of the granulation-degranulation mechanism and the assessment of its results quantified in terms of the so-called degranulation criterion. The quality of information granules is captured by looking at the way numeric data are granulated and afterwards degranulated, and the extent to which the tandem of these two processing phases distorts the data. The distortions are inevitable (with several exceptions which deal with one-dimensional data and engage triangular membership functions, see [1, 7]).
The objective of this study is to further improve the granulation-degranulation abilities by introducing and optimizing a transformation of information granules so that the resulting degranulation error becomes minimized. We formulate the overall task as an optimization problem, introduce several categories of transformation functions (realizing suitable interactions/associations among information granules), and discuss the underlying optimization procedures.
The study is organized as follows. We start with a concise review of information granulation realized with the aid of fuzzy clustering (Section 2). The granulation-degranulation idea along with the formal formulation of the problem is presented in Section 3. Section 4 is devoted to the design of interactions among information granules, while in Section 5 we provide a number of illustrative experimental studies.
Throughout this study, we consider information granules formed in the n-dimensional space of real numbers, x ∈ R^n.
2. Information Granulation Through Fuzzy Clustering
Let us briefly recall the underlying concept and ensuing algorithmic aspects of the fuzzy C-means (FCM) algorithm [3, 8], which is one of the commonly used techniques of fuzzy clustering and a computational vehicle to build information granules.
Given is a collection of n-dimensional data (patterns) {x_k | k = 1, 2, ..., N} where x_k ∈ R^n. Our objective is to determine its structure, namely a collection of "c" clusters (information granules). From the algorithmic perspective, the problem is posed as a minimization of the following objective function (performance index) Q
Q = Σ_{i=1}^{c} Σ_{k=1}^{N} u_ik^m ||x_k − v_i||²   (1)
where v_1, v_2, ..., v_c are n-dimensional prototypes of the clusters and U = [u_ik] stands for a partition matrix expressing a way of allocating the data to the corresponding clusters; u_ik is the membership degree of data x_k in the i-th cluster. The distance between the data x_k and prototype v_i is denoted by ||·||. The fuzzification coefficient m (assuming values greater than 1) quantifies the impact of the membership grades on the individual clusters and implies a certain geometry of the ensuing information granules. Typically, the value of the fuzzification coefficient is set to 2. The partition matrix satisfies two essential and practically justifiable properties
0 < Σ_{k=1}^{N} u_ik < N,  i = 1, 2, ..., c   (2)

Σ_{i=1}^{c} u_ik = 1,  k = 1, 2, ..., N   (3)
The minimization of Q is completed with respect to U ∈ U and the prototypes {v_1, v_2, ..., v_c} of the clusters, namely

min Q with respect to U ∈ U, v_1, v_2, ..., v_c ∈ R^n   (4)

Here U stands for a family of partition matrices, viz. the matrices satisfying the conditions expressed by Eqs. (2) and (3).
From the optimization perspective, there are two individual optimization tasks to be carried out separately to determine the partition matrix and the prototypes. The results are well documented in the literature and thoroughly discussed, and the method has been intensively experimented with. The optimization process is iterative and involves successive computation of the partition matrix and the prototypes
u_ik = 1 / Σ_{j=1}^{c} (||x_k − v_i|| / ||x_k − v_j||)²   (5)

i = 1, 2, ..., c; k = 1, 2, ..., N.

v_st = Σ_{k=1}^{N} u_sk^m x_kt / Σ_{k=1}^{N} u_sk^m   (6)

s = 1, 2, ..., c; t = 1, 2, ..., n (in the above calculations of the prototypes it has been assumed that the distance function is Euclidean). We can look at the partition matrix by considering its individual rows; denoting them by A_1, A_2, ..., A_c, we emphasize that fuzzy clustering gives rise to a collection of information granules.
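The alternating scheme of Eqs. (5) and (6) can be sketched as follows. This is a minimal illustration, not the paper's exact implementation; the membership update uses the exponent 2/(m−1), which reduces to the exponent 2 shown in Eq. (5) for the typical choice m = 2.

```python
import numpy as np

def fcm(X, c, m=2.0, n_iter=60, seed=0):
    """Minimal fuzzy C-means sketch alternating Eqs. (5) and (6).

    X: data of shape (N, n); returns prototypes V of shape (c, n)
    and partition matrix U of shape (c, N)."""
    rng = np.random.default_rng(seed)
    N = X.shape[0]
    U = rng.random((c, N))
    U /= U.sum(axis=0)                      # enforce Eq. (3): columns sum to 1
    for _ in range(n_iter):
        Um = U ** m
        V = (Um @ X) / Um.sum(axis=1, keepdims=True)            # Eq. (6)
        d2 = ((X[None, :, :] - V[:, None, :]) ** 2).sum(axis=2)  # squared distances
        d2 = np.fmax(d2, 1e-12)             # guard against zero distances
        W = d2 ** (-1.0 / (m - 1.0))        # ||x_k - v_i||^(-2/(m-1))
        U = W / W.sum(axis=0)               # Eq. (5)
    return V, U
```

The rows of the returned partition matrix U are exactly the information granules A_1, A_2, ..., A_c referred to above.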
3. Granulation and Degranulation Principle
More formally, we can describe this knowledge representation in terms of information granules as a way of expressing any data point x in terms of the information granules, and describe the result as a vector in the c-dimensional hypercube, namely [0, 1]^c,

G: R^n → [0, 1]^c   (7)
Figure 1. The granulation-degranulation mechanism: a general view at the level of processing information granules. FCM, fuzzy C-means.
The degranulation step is about a reconstruction of x on the basis of the family of information granules (clusters). It can be treated as a certain mapping

G⁻¹: [0, 1]^c → R^n   (8)

The capabilities of the information granules to reflect the structure of the original data can be conveniently expressed by comparing how much the result of degranulation, say x̂, differs from the original pattern x, that is, x̂ ≠ x. More formally, x̂ = G⁻¹(G(x)) where G and G⁻¹ denote the corresponding phases of information granulation and degranulation [9].
The crux of the granulation-degranulation principle is visualized in Figure 1 [1]. Note that the transformations G and G⁻¹ operate between the space of data and the abstract space of information granules.
Let us start with the granulation phase. More specifically, x is expressed in the form of the membership grades u_i of x to the individual granules A_i, which form a solution to the following optimization problem

min Σ_{i=1}^{c} u_i(x) ||x − v_i||²   (9)

subject to the usual constraints imposed on the degrees of membership

Σ_{i=1}^{c} u_i(x) = 1,  u_i(x) ∈ [0, 1]   (10)
The solution to the problem (we use here Lagrange multipliers) comes in the form

A_i(x) = u_i(x) = 1 / Σ_{j=1}^{c} (||x − v_i|| / ||x − v_j||)²   (11)
For the degranulation phase, given u_i(x) and the prototypes v_i, the vector x̂ is considered as a solution to the minimization problem in which we reconstruct (degranulate) the original x using the prototypes and the corresponding membership grades

Σ_{i=1}^{c} u_i(x) ||x̂ − v_i||²   (12)

Considering the use of the Euclidean distance in the above performance index, the subsequent calculations are straightforward, yielding the result

x̂ = Σ_{i=1}^{c} u_i(x) v_i / Σ_{i=1}^{c} u_i(x)   (13)
It is important to note that describing x in a more abstract fashion by means of the A_i, followed by the consecutive degranulation, brings about a certain granulation error (which is inevitable given the fact that we move back and forth between different levels of abstraction). While the above formulas pertain to granulation realized by fuzzy sets, the granulation-degranulation error is also present when dealing with sets (intervals). In this case we are faced with a quantization error, which becomes inevitable when working with A/D (granulation) and D/A (degranulation) conversion mechanisms. The problem formulated above is also associated with vector quantization, which has been widely studied in the literature, cf. [10-14].
The quality of the granulation-degranulation scheme is quantified by means of the following performance index [9]

V = Σ_{k=1}^{N} ||x_k − x̂_k||²   (14)

This performance index articulates how much the reconstruction error is related to the representation capabilities of information granules.
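Under the assumption of the Euclidean distance, Eqs. (11), (13), and (14) can be sketched directly; the function names below are illustrative, not taken from the paper.

```python
import numpy as np

def granulate(x, V):
    """Eq. (11): membership grades of x in the granules with prototypes V (c, n)."""
    d2 = np.fmax(((V - x) ** 2).sum(axis=1), 1e-12)   # squared distances to prototypes
    inv = 1.0 / d2                                     # Eq. (11) rearranged: u_i ∝ 1/d_i^2
    return inv / inv.sum()

def degranulate(u, V):
    """Eq. (13): reconstruct x-hat as a weighted combination of the prototypes."""
    return (u @ V) / u.sum()

def degranulation_error(X, V):
    """Eq. (14): V = sum_k ||x_k - x_hat_k||^2 over the whole data set."""
    return sum(((x - degranulate(granulate(x, V), V)) ** 2).sum() for x in X)
```

When x coincides with one of the prototypes, its membership concentrates in the corresponding granule and the reconstruction is (numerically) exact, illustrating that the degranulation error stems from data lying between prototypes.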
4. Building Interactions Among Information Granules

Denoting the information granules formed through the process of clustering by A_1, A_2, ..., A_c, we consider a certain framework of interaction among them, where such interaction gives rise to an enhancement of the degranulation features and subsequently a reduction of the degranulation error. To highlight the essence of the approach, we can refer to Figure 2.
The essence of this enhanced mechanism dwells on a mapping (transformation) F: {A_1, A_2, ..., A_c} → {B_1, B_2, ..., B_c} where B_i = F(A_1, A_2, ..., A_c). The original membership functions are affected by their interaction with other information granules. The newly formed granules are developed in a way that furnishes them with better degranulation capabilities (viz., a lower degranulation error).

Figure 2. Interaction-augmented degranulation mechanism; note a processing module resulting in a layer of interaction among the original fuzzy sets. FCM, fuzzy C-means.
In general, the transformation F: [0, 1]^c → [0, 1]^c can be implemented in numerous ways. Several interesting and practically viable alternatives are summarized in Table 1. The same table includes some comments about the nature of each mapping.
The transformation F is typically endowed with some parameters (say, a matrix of weights) and those imply the flexibility of the transformation. Both the type of the mapping as well as its parameters (their numeric values) are subject to optimization guided by the degranulation error V. More formally, we can describe the result of this optimization as follows
(F_opt, w_opt) = arg min_{F,w} V   (15)
where F_opt is an optimal transformation (coming out of a finite family of alternatives) whereas w_opt is an optimized vector of its parameters. The minimization shown in Eq. (15) is realized with regard to the type of the transformation F and its parameters w.
As to the development of interactions, a certain general tendency can be observed. Consider the linear combination of original information granules shown in Table 1. Let us rewrite it in the following form
B_i(x) = A_i(x) + Σ_{j=1, j≠i}^{c} w_ij A_j(x)
       = A_i(x) + Σ_{j≠i, w_ij>0} w_ij A_j(x) + Σ_{j≠i, w_ij<0} w_ij A_j(x)   (16)

We note that B_i is a distorted (transformed) version of A_i in which, in an attempt to minimize the degranulation error, the original information granule A_i is impacted by the membership grades of the remaining A_j's.

Table 1. Selected alternatives of realization of transformation mechanisms

Type of transformation | Formula | Comments
Linear | y_i = Σ_{j=1}^{c} w_ij z_j, i.e., y = Wz with y, z ∈ [0, 1]^c | W = [w_ij] assumes values in [-a, a] where a > 0; w_ii = 1. It is assumed that the sum is contained in [0, 1]
Nonlinear | y_i = φ(Σ_{j=1}^{c} w_ij z_j) | φ: R → [0, 1] is a monotonically non-decreasing function. A typical example is a sigmoidal function, 1/(1 + exp(−u))
Higher order polynomial | y_i = Σ_{j=1}^{c} w_ij z_j + Σ_{j=1}^{c} v_ij z_j² | Generalizes a linear relationship; offers higher flexibility, yet at the expense of using more parameters
Logic | y_i = S_{j=1}^{c} (w_ij t z_j) | The transformation uses logic operations of t-norms (t) and t-conorms (S) [6, 9]

Depending upon the sign of the weights, this process is of an excitatory (w_ij > 0) or inhibitory (w_ij < 0) nature. In other words, those A_j's associated with negative weights w_ij form an inhibitory link, while an excitatory link is realized by the A_j's coming with positive entries of W. A lack of interaction is observed when the w_ij are close to zero for all i ≠ j, or W = I where I is an identity matrix. While a more sophisticated transformation may contribute to a lower degranulation error, the interpretability of the transformation itself could become more difficult because of the more nonlinear and convoluted character of the established interactions among information granules.
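The linear transformation of Table 1 and its place in the reconstruction pipeline can be sketched as follows; this is a hypothetical illustration under the assumptions stated above (w_ii = 1, clipping to keep memberships in [0, 1]), with granulation and degranulation following Eqs. (11) and (13).

```python
import numpy as np

def transform(u, W):
    """Linear transformation of Table 1: b = W u with w_ii = 1.
    Positive off-diagonal weights act as excitatory links, negative ones
    as inhibitory links; the result is clipped to stay in [0, 1]."""
    return np.clip(W @ u, 0.0, 1.0)

def reconstruct(x, V, W):
    """Granulate x (Eq. (11)), transform the memberships, degranulate (Eq. (13))."""
    d2 = np.fmax(((V - x) ** 2).sum(axis=1), 1e-12)
    u = (1.0 / d2) / (1.0 / d2).sum()      # Eq. (11)
    b = transform(u, W)                     # interaction layer F
    return (b @ V) / b.sum()                # Eq. (13) applied to B_i
```

With W = I the scheme reduces to the plain granulation-degranulation of Section 3; it is the optimization of the off-diagonal entries of W that lowers the degranulation error V.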
5. Experimental Studies

In all experiments, we consider the linear transformation of information granules (Table 1). Particle swarm optimization (PSO) is used as an optimization vehicle [15]. We consider a generic version of this method where the dynamics (velocity and position) of each individual of the swarm of "M" particles is governed
www.ijfis.org Associations Among Information Granules and Their Optimization in Granulation-Degranulation Mechanism of Granular Computing | 248
International Journal of Fuzzy Logic and Intelligent Systems, vol. 13, no. 4, December 2013
Figure 3. Two-dimensional synthetic data (x_1 versus x_2).
Figure 4. Degranulation error V reported for c = 2, 3, and 4. The upper (grey) line concerns the error obtained when no associations among information granules are studied; W = I.
by the following expressions:

velocity: v_ij(iter+1) = w·v_ij(iter) + c_1 r_1j (p_ij(iter) − x_ij(iter)) + c_2 r_2j (p_gj(iter) − x_ij(iter))

position: x_ij(iter+1) = x_ij(iter) + v_ij(iter+1)   (17)

i = 1, 2, ..., M and j = 1, 2, ..., r. Here v_i is the velocity of the i-th particle in a given search space of dimensionality "r", and r_1 and r_2 are vectors of random numbers drawn from the uniform distribution over the unit interval. p_i is the best position of the i-th particle observed so far and p_g is the best particle in the entire swarm. The update of the particle is realized by adding the velocity v_i(iter+1) to the current position x_i(iter). The main parameters of this optimization environment are set as follows (these specific numeric values are those commonly encountered in the literature): c_1 = c_2 = 1.49, w = 0.6. The range of the admissible values of the weights is set to [-4, 4].
The size of the population was set to 120 individuals whereas the number of generations was equal to 100. With regard to the use of the FCM algorithm, the fuzzification coefficient was set to 2 and the method was run for 60 iterations (at which point no changes to the values of the objective function were reported). The initial point of optimization was set up randomly by choosing a partition matrix with random entries. The distance function used to run the FCM, as well as to compute the degranulation error Eq. (14), is the weighted Euclidean distance of the form

||a − b||² = Σ_{j=1}^{n} (a_j − b_j)² / σ_j²

where σ_j stands for the standard deviation of the j-th variable.

Figure 5. Results of degranulation: (a) no associations involved, and (b) optimized associations.
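The generic PSO of Eq. (17), with the parameter values listed above (w = 0.6, c_1 = c_2 = 1.49, admissible range [-4, 4]), can be sketched as follows. In the experiments the cost function f would evaluate the degranulation error V for a candidate weight matrix W flattened into a vector; here it is left as a generic argument.

```python
import numpy as np

def pso_minimize(f, r, M=120, iters=100, w=0.6, c1=1.49, c2=1.49,
                 lo=-4.0, hi=4.0, seed=0):
    """Generic PSO per Eq. (17): f maps an r-dimensional vector to a scalar cost."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(lo, hi, (M, r))          # particle positions
    Vel = np.zeros((M, r))                   # particle velocities
    P = X.copy()                             # personal best positions p_i
    Pcost = np.array([f(x) for x in X])
    g = P[Pcost.argmin()].copy()             # global best position p_g
    for _ in range(iters):
        r1, r2 = rng.random((M, r)), rng.random((M, r))
        Vel = w * Vel + c1 * r1 * (P - X) + c2 * r2 * (g - X)   # velocity, Eq. (17)
        X = np.clip(X + Vel, lo, hi)                            # position, Eq. (17)
        cost = np.array([f(x) for x in X])
        better = cost < Pcost
        P[better], Pcost[better] = X[better], cost[better]
        g = P[Pcost.argmin()].copy()
    return g, Pcost.min()
```

The clipping keeps the candidate weights inside the admissible range [-4, 4]; this boundary handling is one of several common choices and is an assumption of this sketch rather than a detail reported in the paper.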
5.1 Synthetic Data
We start with synthetic two-dimensional data with the intent of gaining better insight into the nature of the optimization process and the produced results. The data set exhibits three well-delineated and compact clusters (groups), Figure 3.

We formed c = 2, 3, and 4 information granules (fuzzy clusters) and determined the resulting degranulation error for all these situations, see Figure 4.

The most visible improvement is noted for c = 2. To observe in which way the reduced value of V translates into the character of the reconstructed data, Figure 5 contrasts the results obtained when no associations among information granules were considered. It is apparent that in this case a number of data points were collapsed, whereas the inclusion of the associations helps retain the original data, as is visible in Figure 5(b), especially with regard to the cluster of data positioned close to the origin.

The matrix of parameters W delivers detailed insight into the relationships among information granules. Here we have
For c = 2:

W = [  1.00  −0.05
      −0.09   1.00 ]

For c = 3:

W = [  1.00   0.08  −0.06
      −0.01   1.00  −0.05
      −0.03  −0.03   1.00 ]
Figure 6. Fitness functions in successive generations: (a) e-coli, c = 10; (b) auto, c = 7; (c) housing, c = 9.
For c = 4:

W = [  1.00   0.37  −0.14   0.04
       1.01   1.00   0.22  −0.08
       0.63  −0.15   1.00  −0.03
      −1.10   0.44  −0.50   1.00 ]
The relationships (associations) between information granules become more visible as the number of clusters increases, with both inhibitory and excitatory linkages being present.
5.2 Selected Machine Learning Repository Data

A collection of three data sets coming from the Machine Learning Repository (http://archive.ics.uci.edu/ml/) is considered, namely e-coli, auto, and housing. Snapshots of the progression of the optimization process, where the fitness function (degranulation error) is reported in successive generations, are presented in Figure 6.
The plots of V treated as a function of the number of clusters are displayed in a series of figures, Figure 7.

Figure 7. V versus "c"; grey lines relate to reconstruction results produced where no associations are involved: (a) e-coli, (b) auto, and (c) housing.
In all experiments, there is a visible improvement provided by invoking associations among information granules. This enhancement is quite steady irrespective of the number of clusters used in the granulation process. We also witness a saturation effect, meaning that the values of V tend to stabilize with the increase in the number of clusters. The plots displayed in Figure 8 deliver a certain look at the values of the weights; the diversity of the inhibitory and excitatory effects delivered by their combination is also apparent. A more global view can be formed by considering the sum of the weights reported in the individual rows of the weight matrix (not counting the entries positioned on the main diagonal); these numbers tell us about the overall impact on the corresponding fuzzy set that arose as a result of the transformation. As illustrated in Figure 9, there is a visible diversity of excitatory and inhibitory influences among information granules.

Figure 8. Entries of the association matrix W: (a) e-coli, (b) auto, and (c) housing.
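The per-granule interaction levels plotted in Figure 9 correspond to the off-diagonal row sums of W described above. A small sketch, applied here to the c = 3 association matrix reported in Section 5.1:

```python
import numpy as np

def interaction_levels(W):
    """Sum of each row of W excluding the diagonal entry: the overall
    excitatory (positive) or inhibitory (negative) influence exerted
    on the corresponding fuzzy set, as plotted in Figure 9."""
    return W.sum(axis=1) - np.diag(W)

# The c = 3 association matrix from Section 5.1 (w_ii = 1):
W = np.array([[ 1.00,  0.08, -0.06],
              [-0.01,  1.00, -0.05],
              [-0.03, -0.03,  1.00]])
print(interaction_levels(W))   # off-diagonal row sums
```

For W = I (no interaction) all levels are zero, which matches the remark in Section 4 that an identity matrix signals a lack of interaction.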
6. Conclusion
Understanding and quantifying information granules in terms of their representation capabilities is essential to further advancements of granular computing and to pursuing their role in reasoning and modeling schemes. The study covered here underlines a facet of degranulation and its improvement by admitting and quantifying linkages existing among information granules. This helps gain better insight into the nature of, and interaction among, information granules built on the basis of numeric data. In the numeric studies reported in the paper, we stressed the important role of associations of fuzzy sets. While
Figure 9. Levels of interaction reported for the individual fuzzy sets: (a) e-coli, (b) auto, and (c) housing.
in this study we focused on the formalism of fuzzy sets (and fuzzy clustering), the developed framework is of a general character and as such can be investigated in depth when dealing with other formalisms of information granules (sets, rough sets, shadowed sets, and others).
Conflict of Interest

No potential conflict of interest relevant to this article was reported.
References
[1] W. Pedrycz, Granular Computing: Analysis and Design of Intelligent Systems, Boca Raton, FL: Taylor & Francis, 2013.
[2] W. Pedrycz, "From fuzzy sets to shadowed sets: interpretation and computing," International Journal of Intelligent Systems, vol. 24, no. 1, pp. 48-61, Jan. 2009. http://dx.doi.org/10.1002/int.20323

[3] W. Pedrycz, Knowledge-Based Clustering: From Data to Information Granules, Hoboken, NJ: Wiley, 2005.

[4] C. Hwang and F. C. H. Rhee, "Uncertain fuzzy clustering: interval type-2 fuzzy approach to c-means," IEEE Transactions on Fuzzy Systems, vol. 15, no. 1, pp. 107-120, Feb. 2007. http://dx.doi.org/10.1109/TFUZZ.2006.889763

[5] S. J. Kim and I. Y. Seo, "A clustering approach to wind power prediction based on support vector regression," International Journal of Fuzzy Logic and Intelligent Systems, vol. 12, no. 2, pp. 108-112, Jun. 2012. http://dx.doi.org/10.5391/IJFIS.2012.12.2.108

[6] X. Y. Ye and M. M. Han, "A systematic approach to improve fuzzy C-mean method based on genetic algorithm," International Journal of Fuzzy Logic and Intelligent Systems, vol. 13, no. 3, pp. 178-185, Sep. 2013. http://dx.doi.org/10.5391/IJFIS.2013.13.3.178

[7] W. Pedrycz, "Why triangular membership functions?," Fuzzy Sets and Systems, vol. 64, no. 1, pp. 21-30, May 1994. http://dx.doi.org/10.1016/0165-0114(94)90003-5

[8] J. C. Bezdek, Pattern Recognition With Fuzzy Objective Function Algorithms, New York, NY: Plenum Press, 1981.

[9] W. Pedrycz and J. V. de Oliveira, "A development of fuzzy encoding and decoding through fuzzy clustering," IEEE Transactions on Instrumentation and Measurement, vol. 57, no. 4, pp. 829-837, Apr. 2008. http://dx.doi.org/10.1109/TIM.2007.913809

[10] A. Gersho and R. M. Gray, Vector Quantization and Signal Compression, Boston, MA: Kluwer Academic Publishers, 1992.

[11] R. M. Gray, "Vector quantization," IEEE ASSP Magazine, vol. 1, no. 2, pp. 4-29, Apr. 1984. http://dx.doi.org/10.1109/MASSP.1984.1162229

[12] A. Lendasse, D. Francois, V. Wertz, and M. Verleysen, "Vector quantization: a weighted version for time-series forecasting," Future Generation Computer Systems, vol. 21, no. 7, pp. 1056-1067, Jul. 2005. http://dx.doi.org/10.1016/j.future.2004.03.006

[13] Y. Linde, A. Buzo, and R. M. Gray, "An algorithm for vector quantizer design," IEEE Transactions on Communications, vol. 28, no. 1, pp. 84-95, Jan. 1980. http://dx.doi.org/10.1109/TCOM.1980.1094577

[14] E. Yair, K. Zeger, and A. Gersho, "Competitive learning and soft competition for vector quantizer design," IEEE Transactions on Signal Processing, vol. 40, no. 2, pp. 294-309, Feb. 1992. http://dx.doi.org/10.1109/78.124940

[15] Y. Shi and R. C. Eberhart, "Empirical study of particle swarm optimization," in Proceedings of the 1999 Congress on Evolutionary Computation, Washington, DC, July 6-9, 1999, pp. 1945-1950. http://dx.doi.org/10.1109/CEC.1999.785511
Witold Pedrycz is a Professor and Canada Research Chair (CRC) in Computational Intelligence in the Department of Electrical and Computer Engineering, University of Alberta, Edmonton, Canada. He is also with the Systems Research Institute of the Polish Academy of Sciences, Warsaw, Poland. He also holds an appointment of special professorship in the School of Computer Science, University of Nottingham, UK. In 2009 Dr. Pedrycz was elected a foreign member of the Polish Academy of Sciences. In 2012 he was elected a Fellow of the Royal Society of Canada. Witold Pedrycz has been a member of numerous program committees of IEEE conferences in the area of fuzzy sets and neurocomputing. In 2007 he received the prestigious Norbert Wiener Award from the IEEE Systems, Man, and Cybernetics Council. He is a recipient of the IEEE Canada Computer Engineering Medal 2008. In 2009 he received the Cajastur Prize for Soft Computing from the European Centre for Soft Computing for "pioneering and multifaceted contributions to granular computing". In 2013 he was awarded a Killam Prize.

His main research directions involve computational intelligence, fuzzy modeling and granular computing, knowledge discovery and data mining, fuzzy control, pattern recognition, knowledge-based neural networks, relational computing, and software engineering. He has published numerous papers in these areas. He is also the author of 15 research monographs covering various aspects of computational intelligence, data mining, and software engineering.
Dr. Pedrycz is intensively involved in editorial activities. He is Editor-in-Chief of Information Sciences, Editor-in-Chief of IEEE Transactions on Systems, Man, and Cybernetics: Systems, and Editor-in-Chief of WIREs Data Mining and Knowledge Discovery (Wiley). He currently serves as an Associate Editor of IEEE Transactions on Fuzzy Systems and is a member of the editorial boards of a number of other international journals.