Novel Time Series Analysis and Prediction of Stock Trading Using Fractal Theory and Time Delayed Neural Network*
Fuminori Yakuwa, Yasuhiko Dote, Mika Yoneyama, Shinji Uzurabashi

Department of Computer Science & Systems Engineering, Muroran Institute of Technology, 27-1 Mizumoto-cho, Muroran 050, JAPAN (Phone: +81-143-46-5432; FAX: +81-143-46-5499)

Kushiro Branch, Hokkaido Electric Power Co., Inc., 8-1 Saiwai-cho, Kushiro, JAPAN (Phone: +81-154-23-1114; FAX: +81-154-23-2220)

Panasonic Mobile & System Engineering Co., Ltd.
Abstract - The stock markets are well known for wide variations in prices over short and long terms. These fluctuations are due to a large number of deals produced by agents who act independently of each other. However, even in the middle of this apparently chaotic world, there are opportunities for making good predictions [1]. In this paper the Nikkei stock prices over 1500 days from July 1996 to Oct. 2002 are analyzed and predicted using a Hurst exponent (H), a fractal dimension (D), and an autocorrelation coefficient (C). They are H = 0.6699, D = 2 - H = 1.3301 and C = 0.26558 over three days. This obtained knowledge is embedded into the structure of our developed time delayed neural network [2]. It is confirmed that the obtained prediction accuracy for the short term is much higher than that of a back propagation-type forward neural network. Although this predictor works for the short term, it can be embedded into our developed fuzzy neural network [3] to construct multi-blended local nonlinear models. It is then applicable to general long-term prediction, for which more accurate prediction is expected than by the method proposed in [1].
1 Introduction
The Nikkei Average Stock prices over 1500 days are in the middle of an apparently chaotic world. In this paper, on the basis of Zadeh's proposal, i.e., "From Manipulation of Measurements to Manipulation of Perceptions-Computations with Words" [25], which is a data mining technology, knowledge easily comprehensible by humans is extracted by obtaining the features of the time
series using a Hurst exponent, a fractal analysis method, and an autocorrelation analysis method. In order to extract the knowledge, decision making rules comprehensible by humans using the features are derived with rough set theory [26]. Finally the knowledge is embedded into the structure of the Time Delayed Neural Network (TDNN). An accurate prediction is thereby obtained.
This paper is organized as follows. In Section 2, time series analysis using fractal analysis is described. Section 3 illustrates the structure of neural networks for time series. Section 4 describes short-term prediction using TDNN. Some conclusions are drawn in Section 5.
2 Time Series Analysis using Fractal Analysis

2.1 Fractal
Fractal analysis provides a unique insight into a wide range of natural phenomena. Fractal objects are those which exhibit 'self-similarity'. This means that the general shape of the object is repeated at arbitrarily smaller and smaller scales. Coastlines have this property: a particular coastline viewed on a world map has the same character as a small piece of it seen on a local map. New details appear at each smaller scale, so that the coastline always appears rough. Although true fractals repeat the detail to a vanishingly small scale, examples in nature are self-similar up to some non-zero limit. The fractal dimension measures how much complexity is being repeated at each scale. A shape with a higher fractal dimension is more complicated or 'rough' than one with a lower dimension, and fills more space. These dimensions are fractional: a shape with fractal dimension of D = 1.2, for example, fills
more space than a one-dimensional curve, but less space
than a two-dimensional area. The fractal dimension thus conveys much information about the geometry of an object. Very realistic computer images of mountains,
clouds and plants can be produced by simple recursions
with the appropriate fractal dimension. Time series of
many natural phenomena are fractal. Small sections taken from these series, once scaled by the appropriate factor, cannot be distinguished from the whole signal. Being able
to recognize a time series as fractal means being able to
link information at different time scales. We call such sets
'self-affine' instead of self-similar because they scale by
different amounts
in
each axis direction.
There are many methods available for estimating the
fractal dimension of data sets. These lead to different
numerical results, yet little comparison of accuracy has
been made among them in the literature. We combine
two
methods which are known as the most popular for
assigning fractal dimensions to time series: the box-counting method and rescaled range analysis.
2.2 Box-counting
The box-counting algorithm is intuitive and easy to apply. It can be applied to sets in any dimension, and has been used on images of everything from river systems to the clusters of galaxies. A fractal curve is a curve of infinite detail, by virtue of its self-similarity. The length of the curve is indefinite, increasing as the resolution of the measuring instrument increases. The fractal dimension determines the increase in detail, and therefore length, at each resolution change. For a fractal, the length L as a function of the resolution of the measurement device \delta is

L(\delta) \propto \delta^{1-D} ,    (1)

where D is an exponent known as the fractal dimension. (For ordinary curves L(\delta) approaches a constant value as \delta decreases.) Box-counting algorithms measure L(\delta) for varying \delta by counting the number of non-overlapping boxes of size \delta required to cover the curve. These measurements are fitted to Eq. (1) to obtain an estimate of the fractal dimension, known as the box dimension. A fractal dimension can be assigned to a set of time series data by plotting it as a function of time and calculating the box dimension. Eq. (1) will hold over a finite range of box sizes; the smallest boxes will be of width r, where r is the resolution in time, and height a, where a is the resolution of the magnitude of the time series.
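To make the box-counting procedure concrete, the following Python sketch (our own illustrative code, not from the paper; the function name, box sizes and synthetic example are assumptions) counts the occupied boxes of side delta over the graph of a series rescaled to the unit square and reads the box dimension off the log-log slope.

```python
import numpy as np

def box_count_dimension(series, box_sizes):
    """Estimate the box-counting dimension of a time-series graph.

    The graph is rescaled to the unit square so that square boxes of side
    delta tile both the time axis and the value axis. Only boxes containing
    sampled points are counted, which is adequate for a densely sampled series.
    """
    x = np.linspace(0.0, 1.0, len(series))
    y = (series - series.min()) / (series.max() - series.min())
    counts = []
    for delta in box_sizes:
        ix = np.floor(x / delta).astype(int)   # box index along the time axis
        iy = np.floor(y / delta).astype(int)   # box index along the value axis
        counts.append(len(set(zip(ix, iy))))   # number of occupied boxes N(delta)
    # N(delta) ~ delta^(-D), so the slope of log N versus log(1/delta) estimates D.
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(box_sizes)), np.log(counts), 1)
    return slope

# Example: the graph of an ordinary random walk should give a dimension near 1.5.
rng = np.random.default_rng(0)
walk = np.cumsum(rng.standard_normal(4096))
print(box_count_dimension(walk, box_sizes=[1/8, 1/16, 1/32, 1/64, 1/128]))
```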
2.3 R/S method
The rescaled range analysis, also called the R/S or Hurst method, was invented by Hurst for the evaluation of time dependent hydrological data [8][9]. His original work is related to water reservoirs and the design of an ideal storage on the river Nile. After the detailed discussion of this work by Mandelbrot [10][11], the method has attracted much attention in many fields of science. For the mathematical aspects of the method we refer to the papers of Mandelbrot [10][11], Feder [12], and Daley [13]. Since its earliest days the method was used for a number of applications, whenever the question was the quantification of long range statistical interdependence within time series. As examples we can cite the analysis of the asymmetry of solar activity [14][15], relaxation of stress [16], problems in particle physics [17], and mechanical sliding in solids [19]. The Hurst analysis is also used as a tool to determine the self-similarity parameter of fractal signals [20-23], or to detect unwanted correlations in pseudo-random number generators [23]. The Hurst exponent was calculated for corrosion noise data in the work of Moon and Skerry [24], where the corrosion resistance properties of organic paint films were analyzed and a direct relationship between the Hurst exponent and the corrosion resistance of different coatings was established. Greisiger and Schauer [7] discussed the applicability of different methods to electrochemical potential and current noise analysis. They concluded that the Hurst exponent allows the extraction of mechanistic information about corrosion processes, and hence is suitable for characterizing coatings.
We give a brief introduction to the R/S method, along the lines of Feder's work [12]. Let the time coordinate, t, be discretized in terms of the time resolution, \Delta t, as t_i = i\Delta t. The discrete time record of a given process is denoted by x_i, i = 0, 1, \ldots, N, if the total duration of the observation is T = N\Delta t. According to the basic idea of the R/S method the time record is evaluated for a certain time interval, called the time lag, the length of which is \tau = j\Delta t and which begins at t_0 = l\Delta t. Obviously, j < N and l < N hold. The average of x_i over the time lag is calculated as

\langle x \rangle_{j,l} = \frac{1}{j} \sum_{i=l+1}^{l+j} x_i .    (3)

Next the accumulated deviation from the mean, y_{k,l}, is evaluated as

y_{k,l} = \sum_{i=l+1}^{l+k} \left( x_i - \langle x \rangle_{j,l} \right),    (4)

where k takes the values 1 \le k \le j.
In order to visualize the meaning of Eq. (4) let us refer to the hydrological context in which the method was devised by Hurst. Here x_i is the annual water input into a reservoir in the ith year of a series of N years, and y_{k,l} is the net gain or loss of stored water in the year l + k, reckoned from some time within the time lag in question. That is, the annual increment is the object of analysis. The ideal reservoir never empties and never overflows, so the required storage capacity is equal to the difference between the maximum and minimum value of y_{k,l} over j. This difference is called the range, R_{j,l}:

R_{j,l} = \max_{1 \le k \le j} y_{k,l} - \min_{1 \le k \le j} y_{k,l} .    (5)

The standard deviation of x_i for the same period, S_{j,l}, is given as

S_{j,l} = \sqrt{ \frac{1}{j} \sum_{i=l+1}^{l+j} \left( x_i - \langle x \rangle_{j,l} \right)^2 } ,    (6)

and the quotient R_{j,l}/S_{j,l} is called the rescaled range. The above expressions refer to a given position of the time lag on the time axis. However, the time lag can be shifted and the procedure given by Eqs. (3)-(6) can be repeated for each position. Thus a series of rescaled ranges is obtained, the average of which can be evaluated. As a non-unique but rational choice the lag is shifted by steps of j, thus a series of non-overlapping but contacting intervals is constructed. In other words a series of R_{j,l_m}/S_{j,l_m} is evaluated with j fixed and l varied as l_m = l_0 + m j, where m = 1, 2, \ldots, [N/j], with the square bracket denoting the integer part. Then the rescaled range for the time lag \tau is calculated as the average

(R/S)_j = \frac{1}{[N/j]} \sum_{m=1}^{[N/j]} R_{j,l_m} / S_{j,l_m} .    (7)

Hurst observed that there is a great number of natural phenomena for which the ratio R/S obeys the rule

R/S \propto \tau^H ,    (8)

where H is called the Hurst exponent. The Hurst exponent was seen to be between 0 and 1. The value H = 1/2 has a special significance, because it reflects that the observations are statistically independent of each other. This is the random noise case. For example the increment series, i.e. the series of displacements, in Brownian motion is a sequence of uncorrelated random events with Gaussian distribution and zero mean. The Hurst exponent for such a time record is 1/2. For 1/2 < H < 1 the time series is called persistent, i.e. an increasing trend in the past implies, on the average, a continued increasing trend in the future, or a decreasing trend in the past implies a decrease in the future. If 0 < H < 1/2 prevails, the time series observed is anti-persistent, i.e. an increasing trend in the past implies a decreasing trend in the future and vice versa. Persistency is found also in cases where the time series exhibit clear trends with relatively little noise [11][14][22].

2.4 Interpretation of fractal dimension

We have already mentioned that the fractal dimension of an object is a measure of complexity and degree of space filling. When the object is a series in time, the dimension also tells us something about the relation between increments. It is a useful and meaningful insight into series of natural processes.

2.5 Fractional Brownian motion

A particle undergoing Brownian motion moves by jumping step-lengths which are given by independent Gaussian random variables. For one-dimensional motion the position of the particle in time, X(t), is given by the addition of all past increments. The function X(t) is a self-affine fractal, whose graph has dimension 1.5. Fractional Brownian motion (fBm) generalizes X(t) by allowing the increments to be correlated. Ordinary Brownian motion can be defined by

X(t) - X(t_0) \sim \xi \, |t - t_0|^{1/2} ,    (9)

where \xi is a normalized independent Gaussian process and X(t_0) is the initial position [4][5]. Replacing the exponent 1/2 in Eq. (9) with any other number H in the range 0 < H < 1 defines an fBm function X_H(t). The exponent H here corresponds to the statistic H that R/S analysis calculates.

The correlation function of future increments with past increments for the motion X_H(t) can be shown to be [5]

C(t) = 2^{2H-1} - 1 .    (10)

Clearly, C(t) = 0 for H = 1/2; increments in ordinary Brownian motion are independent. For H > 1/2, C(t) is positive for all t. This means that after a positive
increment, future increments are more likely to be positive. This is known as persistence. When H < 1/2, increments are negatively correlated, which means an increase in the past makes a decrease more likely in the future. This is called anti-persistence.

Now it is true for self-affine functions such as X_H(t) that the fractal dimension, D, is related to H by [4]

D = 2 - H .    (11)
We can then identify persistence
or
anti-persistence
in data sets whose graphs are fractal. Persistent time series
show long-term memory effects. An increasing trend in
the past is likely to continue in the future because future
increments are positively correlated to past ones.
Similarly, a negative trend will persist. This means that
extreme values in the series tend to be more extreme than
for uncorrelated series. In the context of climatic data,
droughts or extended rain periods are more likely for
persistent data.
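As a quick numerical check of Eqs. (10) and (11), using the Hurst exponent reported for the Nikkei series in this paper, H = 0.6699:

D = 2 - H = 1.3301,    C = 2^{2H-1} - 1 = 2^{0.3398} - 1 \approx 0.2656,

which agrees with the fractal dimension and the three-day autocorrelation coefficient quoted in the abstract.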
In order to determine the Hurst exponent, log(R/S) is plotted against log \tau and the slope renders H. However, not all the points of this plot have the same statistical weights: when \tau is very small, a large number of R/S data can be calculated but their scatter is large; when \tau is very large, only a few R/S data are at hand, so the statistics are poor. For this reason the first and the last few points of the double logarithmic plot are usually discarded.
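A minimal numpy sketch of this estimation procedure, following Eqs. (3)-(8), may help fix the idea; it is our own illustrative implementation (the paper does not publish code), and the lag values in the example are arbitrary.

```python
import numpy as np

def hurst_rs(x, lags):
    """Estimate the Hurst exponent of a record x by rescaled range analysis."""
    x = np.asarray(x, dtype=float)
    rs_means = []
    for j in lags:
        rs_values = []
        # Non-overlapping, contacting windows of length j, as in Eq. (7).
        for l in range(0, (len(x) // j) * j, j):
            w = x[l:l + j]
            mean = w.mean()                    # Eq. (3)
            y = np.cumsum(w - mean)            # Eq. (4), accumulated deviation
            r = y.max() - y.min()              # Eq. (5), range
            s = w.std()                        # Eq. (6), standard deviation
            if s > 0:
                rs_values.append(r / s)
        rs_means.append(np.mean(rs_values))    # Eq. (7), average rescaled range
    # Eq. (8): R/S ~ tau^H, so the slope of the log-log fit is the Hurst exponent.
    h, _ = np.polyfit(np.log(lags), np.log(rs_means), 1)
    return h

# Uncorrelated increments should give an estimate close to H = 0.5.
rng = np.random.default_rng(1)
print(hurst_rs(rng.standard_normal(1500), lags=[8, 16, 32, 64, 128, 256]))
```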
To begin with, we verified whether the changes of the stock price time series follow the random walk hypothesis, using the rescaled range analysis. We analyzed the stock price time series of the Nikkei Stock Average. The analysis period used the data for 1500 days from July 1996 to October 2002. In the analysis, logarithmic returns were computed from the original price series and used as the analysis object.
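For example, the logarithmic returns fed to a rescaled range routine such as the sketch above could be prepared as follows; the file name is hypothetical and only the transformation itself reflects the paper.

```python
import numpy as np

# Daily log returns r_t = ln(p_t / p_{t-1}) from a closing-price series.
prices = np.loadtxt("nikkei_close.csv")     # hypothetical file of 1500 closing prices
log_returns = np.diff(np.log(prices))
```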
Figure 1. Rescaled range analysis of Nikkei Stock Average time series
Figure 1 shows the result of analyzing the Nikkei Stock Average prices for 1500 days. From the gradient of the obtained straight line, the Hurst exponent is found to be H = 0.6699. The Hurst exponent was also computed for different scaling intervals N; the result is shown in Figure 2.

Figure 2. Relationship of scaling interval (N) versus Hurst exponent (H)
As shown in Figure 2, H = 0.88, corresponding to N = 3, is the maximum. Therefore, it is found that data over 3 days show the strongest correlation according to the fractal analysis of the Nikkei Stock Average prices. This knowledge is discovered using a feature-rule map with rough set theory [26], shown in Table 1. Firstly a Hurst exponent is obtained. Then the fractal dimension and the autocorrelation coefficient are calculated from the Hurst exponent.

Table 1. Knowledge extraction with feature-rule map with rough set theory
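The paper does not spell out exactly how the scaling interval N of Figure 2 enters the calculation. One possible reading, and it is only our assumption, is that the Hurst exponent is re-estimated with the rescaled range fit restricted to lags of at most N days, giving a curve H(N) whose peak at N = 3 marks the strongest short-range correlation.

```python
# Assumed interpretation of the "scaling interval": restrict the R/S fit to lags
# no longer than N days and re-estimate H. Reuses hurst_rs and log_returns from
# the sketches above; the mapping from N to the fit is our guess, not the paper's.
for N in range(3, 11):
    print(N, hurst_rs(log_returns, lags=list(range(2, N + 1))))
```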
Table 2. Pawlak's Lower and Upper Approximation

[Table 2 lists, for the classes Brownian Motion, N = 3, and Similarity (Fractal), the number of objects, the lower approximation, the upper approximation, and the approximation accuracy of each class.]
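Table 2 reports Pawlak's lower and upper approximations and the resulting approximation accuracy for each class. As a reminder of how those quantities are defined, here is a generic Python sketch; the mini decision table is made up for illustration and is not the paper's feature-rule map.

```python
from collections import defaultdict

def pawlak_approximations(objects, condition_attrs, decision, target_class):
    """Lower/upper approximation of a decision class under the indiscernibility
    relation induced by the condition attributes."""
    # Group objects that are indiscernible on the condition attributes.
    blocks = defaultdict(set)
    for name, attrs in objects.items():
        blocks[tuple(attrs[a] for a in condition_attrs)].add(name)
    target = {n for n, attrs in objects.items() if attrs[decision] == target_class}
    lower, upper = set(), set()
    for block in blocks.values():
        if block <= target:        # block lies entirely inside the class
            lower |= block
        if block & target:         # block overlaps the class
            upper |= block
    accuracy = len(lower) / len(upper) if upper else 0.0
    return lower, upper, accuracy

# Hypothetical mini decision table (discretized features -> class).
objects = {
    "s1": {"H": "high", "C": "pos", "class": "N = 3"},
    "s2": {"H": "high", "C": "pos", "class": "N = 3"},
    "s3": {"H": "mid",  "C": "pos", "class": "Brownian Motion"},
    "s4": {"H": "high", "C": "pos", "class": "Brownian Motion"},
}
print(pawlak_approximations(objects, ["H", "C"], "class", "N = 3"))
```

In Pawlak's formulation the approximation accuracy reported in Table 2 is this ratio of the sizes of the lower and upper approximations.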
3 Structure of Neural Network for Time Series
In order to embed the discovered knowledge into the structure of neural networks, it is found that our developed time delayed neural network is suitable [2].
3.1 Time Delayed Neural Network (TDNN)

In order to handle dynamical systems, time delay elements representing the obtained knowledge are put into the inputs of neural networks [2]. The structure of the FIR filter is shown in Figure 3. It is a finite impulse response (FIR) digital filter which is connected to each input of a back propagation type forward neural network (BPNN). A time delay element is also put between the inputs of the filter.

Figure 3. FIR filter (z^{-1}: delay element)

Here f is a sigmoid function and the weights are corrected by the Back Propagation algorithm (BP).
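A compact numpy sketch of the kind of architecture described above may be useful: each input node carries a tapped delay line whose taps act as a FIR filter, and the filtered signals feed a 3-layer back-propagation network with sigmoid hidden units. This is our own illustrative implementation under those assumptions (the class name, learning rate and initialization are ours), not the paper's code.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class FirTdnn:
    """3-layer BPNN whose inputs pass through FIR-filtered tapped delay lines."""

    def __init__(self, n_inputs, fir_order, n_hidden, lr=0.01, seed=0):
        rng = np.random.default_rng(seed)
        self.fir = rng.standard_normal((n_inputs, fir_order + 1)) * 0.1  # FIR taps per input
        self.w1 = rng.standard_normal((n_hidden, n_inputs)) * 0.1        # input -> hidden
        self.w2 = rng.standard_normal(n_hidden) * 0.1                    # hidden -> output
        self.lr = lr

    def forward(self, x_delayed):
        # x_delayed has shape (n_inputs, fir_order + 1): current and delayed samples.
        self.u = np.sum(self.fir * x_delayed, axis=1)   # FIR filter output per input node
        self.h = sigmoid(self.w1 @ self.u)              # sigmoid hidden layer
        return float(self.w2 @ self.h)                  # linear output (predicted value)

    def train_step(self, x_delayed, target):
        # One back-propagation update of the squared prediction error.
        y = self.forward(x_delayed)
        err = y - target
        grad_w2 = err * self.h
        delta_h = err * self.w2 * self.h * (1.0 - self.h)
        grad_w1 = np.outer(delta_h, self.u)
        grad_fir = (self.w1.T @ delta_h)[:, None] * x_delayed
        self.w2 -= self.lr * grad_w2
        self.w1 -= self.lr * grad_w1
        self.fir -= self.lr * grad_fir
        return 0.5 * err ** 2
```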
4 Short-term Prediction using BPNN
We obtained the features (knowledge: N = 3) of the time series by the fractal analysis. Two kinds of 3-layer BP neural networks, which have a delay element between each input node, are considered. In the first one, no filter is connected at each input node. The second one has a 3rd-order FIR filter at each input node.
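The experiments below use the previous 3 or 5 daily prices as network inputs and evaluate predictions for seven days starting from day 1501. A hedged sketch of how such a data set and a multi-day forecast might be organized (the windowing details and the feedback of predictions are our assumptions; predict stands for any trained one-step-ahead predictor, such as a network of the kind sketched above):

```python
import numpy as np

def make_windows(series, n_inputs):
    """Sliding windows: the n_inputs previous prices predict the next one."""
    X = np.array([series[i:i + n_inputs] for i in range(len(series) - n_inputs)])
    t = np.asarray(series[n_inputs:])
    return X, t

def iterated_forecast(predict, history, n_inputs, horizon=7):
    """Feed each prediction back as an input to forecast `horizon` days ahead."""
    window = list(history[-n_inputs:])
    forecasts = []
    for _ in range(horizon):
        y = predict(np.asarray(window))
        forecasts.append(y)
        window = window[1:] + [y]
    return forecasts
```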
4.1 Simulation by the 3-layer BPNN without filters
No filter is connected at each input node. The structure of the neural network is shown in Table 3.

Table 3. BP network structure
                3 Input Nodes   5 Input Nodes
Output Node     1               1
Epochs          500             300
4.1.1 BPNN simulation with 3 input nodes

The structure of the neural network is illustrated in Figure 4. The simulation result with 3 input nodes is shown in Figure 5. We predicted for seven days from the 1501st day. The error and the number of epochs are given in Table 4.
Table 4. BPNN without filters with 3 input nodes
Error (¥)   59.4803
Epochs      201
4.1.2 BPNN simulation with 5 input nodes

In the same way, the structure of the neural network is illustrated in Figure 6. The simulation result with 5 input nodes is shown in Figure 7. The error and the number of epochs are listed in Table 5.

Table 5. BPNN without filters with 5 input nodes
Error (¥)   304.9743
Epochs      447
Figure 4. The structure of the BPNN without filters with 3 input nodes

Figure 5. BPNN simulation without filters with 3 input nodes

Figure 6. The structure of the BPNN without filters with 5 input nodes

Figure 7. BPNN simulation without filters with 5 input nodes

Figure 8. The structure of the BPNN with filters with 3 input nodes

Figure 9. BPNN simulation with filters with 3 input nodes

Figure 10. The structure of the BPNN with filters with 5 input nodes

Figure 11. BPNN simulation with filters with 5 input nodes
4.2 Simulation by the 3-layer BPNN with filters

Table 7 shows the structure of the 3-layer BPNN with filters.

Table 7. 3-layer BPNN with filters: network structure
                      3 Input Nodes   5 Input Nodes
3rd-order FIR filter  connected       connected
Hidden Nodes          3               3
Output Node           1               1
Epochs                500             300

4.2.1 Simulation with 3 input nodes

The structure of the neural network is illustrated in Figure 8. The simulation result is shown in Figure 9. Table 8 lists the prediction error and the number of epochs.

Table 8. Simulation with 3 input nodes
Error (¥)   47.3381
Epochs      37

4.2.2 Simulation with 5 input nodes

In the same way, the structure of the neural network is illustrated in Figure 10 and the simulation result is shown in Figure 11. Table 9 lists the prediction error and the number of epochs.

Table 9. Simulation with 5 input nodes
Error (¥)   88.2962
Epochs      154

Table 10 tells that with both 3 input and 5 input nodes the prediction accuracy is fairly high.

Table 10. Comparison of both (BPNN with filters)
            3 Input Nodes   5 Input Nodes
Error (¥)   47.3381         88.2962
Epochs      37              154

Table 11. Comparison of both networks
                                 Error (¥)   Epochs
With filters, 3 input nodes      47.3381     37
With filters, 5 input nodes      88.2962     154
Without filters, 3 input nodes   59.4803     201
Without filters, 5 input nodes   304.9743    447
5 Conclusion

A data mining technique is applied to time series analysis and prediction. From a large amount of data, understandable knowledge is extracted using a Hurst exponent, a fractal analysis method and an autocorrelation analysis method. Then it is embedded into the suitable network, the BPNN with FIR filters. An accurate prediction of the Nikkei Average Stock price time series is obtained by the BPNN with filters.
References
[1] O. Castillo and P. Melin, "Hybrid Intelligent Systems for Time Series Prediction Using Neural Networks, Fuzzy Logic, and Fractal Theory", IEEE Transactions on Neural Networks, Vol. 13, No. 6, pp. 1395-1407, Nov. 2002.
[2] M. S. Shafique and Y. Dote, "An Empirical Study on Fault Diagnosis for Nonlinear Time Series using Linear Regression Method and FIR Network", Trans. IEE of Japan, Vol. 120-C, No. 10, pp. 1435-1440, Oct. 2000.
[3] F. Yakuwa, S. Satoh, M. S. Shaikh, and Y. Dote, "Fault Diagnosis for Dynamical Systems Using Soft Computing", Proceedings of the 2002 World Congress on Computational Intelligence, Honolulu, Hawaii, U.S.A., May 12-17, 2002.
[4] J. Feder, Fractals, Plenum Press, New York, 1988, p. 288.
[5] N. Wiener, Differential space, J. Math. Phys. Mass. Inst. Technol. 2, 1923, pp. 131-174.
[6] T. Vicsek, Fractal Growth Phenomena, World Scientific, Singapore, 1992, p. 488.
[7] H. Greisiger and T. Schauer, Prog. Org. Coat. 39, 2000, p. 31.
[8] H. E. Hurst, Nature 180, 1957, p. 494.
[9] H. E. Hurst, R. P. Black and Y. M. Simaika, Long-term Storage, an Experimental Study, Constable, London, 1965.
[10] B. Mandelbrot and J. R. Wallis, Water Resour. Res. 5, 1969, p. 228.
[11] B. Mandelbrot and J. R. Wallis, Water Resour. Res. 5, 1969, p. 967.
[12] J. Feder, Fractals, Plenum, New York, 1988.
[13] D. J. Daley, Ann. Probab. 27, 1999, p. 2035.
[14] R. W. Komm, Solar Phys. 156, 1995, p. 7.
[15] R. Oliver and J. L. Ballester, Solar Phys. 169, 1996, p. 216.
[16] A. Gadomski, Mod. Phys. Lett. B 11, 1997, p. 45.
[17] I. A. Lebedev and B. G. Shaikhatdenov, J. Phys. G: Nucl. Part. Phys. 23, 1997, p. 637.
[18] M. A. F. Gomes, F. A. O. Souza and V. P. Brito, J. Phys. D 31, 1998, p. 3223.
[19] C. L. Jones, G. T. Lonergan and D. E. Mainwaring, J. Phys. A 29, 1996, p. 2509.
[20] C. Heneghan and G. McDarby, Phys. Rev. E 62, 2000, p. 6103.
[21] C. W. Lung, J. Jiang, E. K. Tian and C. H. Zhang, Phys. Rev. E 60, 1999, p. 5121.
[22] B. A. Carreras, B. Ph. van Milligen, M. A. Pedrosa, R. Balbin, C. Hidalgo, D. E. Newman, E. Sanchez, M. Frances, I. Garcia-Cortes, J. Bleuel, M. Endler, C. Ricardi, S. Davies, G. F. Matthews, E. Martines, V. Antoni, A. Latten and T. Klinger, Phys. Plasmas 5, 1998, p. 3632.
[23] B. M. Gammel, Phys. Rev. E 58, 1998, p. 2586.
[24] M. Moon and B. Skerry, J. Coat. Technol. 67, 1995, p. 35.
[25] L. A. Zadeh, Plenary talk, "From Computing with Numbers to Computing with Words: From Manipulation of Measurements to Manipulation of Perceptions", Proceedings of the IWSCI-99, Muroran, Japan, June 16-18, 1999.
[26] A. Kusiak, "Rough Set Theory: A Data Mining Tool for Semiconductor Manufacturing", IEEE Trans. on Electronics Packaging Manufacturing, Vol. 24, No. 1, pp. 44-50, January 2001.