
Annals of the „Constantin Brâncuşi” University of Târgu Jiu, Economy Series, Issue 3/2018

„ACADEMICA BRÂNCUŞI” PUBLISHER, ISSN 2344 – 3685/ISSN-L 1844 - 7007

AVERAGE MONTHLY TEMPERATURE FORECAST IN ROMANIA BY USING

SINGULAR SPECTRUM ANALYSIS

MARINOIU CRISTIAN

ASSOC. PROF. PH.D., PETROLEUM-GAS UNIVERSITY OF PLOIEŞTI

e-mail: [email protected]

Abstract

Singular spectrum analysis (SSA) is a relatively recent time series analysis method that does not require the a-priori assumption of a particular model. The method is based on classical results in mathematics and has the advantage that it relies on the estimation of only two parameters. This paper briefly describes the main steps of the method and its use for forecasting the time series of average monthly temperatures in Romania. Predictions for the same time series are also made using two other well-known forecasting methods. By comparing the methods in terms of prediction error we find that SSA leads to the best results.

Key words: time series, forecast, average monthly temperature

JEL Classification: C22, C53

1. Introduction

One of the relatively recently developed methods for the analysis and forecasting of time series is Singular Spectrum Analysis (SSA). The method was developed independently in the 1980s in the United States of America and the United Kingdom, and in the 1990s in Russia, in St. Petersburg, as Caterpillar-SSA [1].

The SSA method incorporates elements of classical time series analysis, multidimensional statistics and geometry, dynamical systems and signal processing [7, p.239]. According to [3, p.2], "the essential difference between SSA and the majority of methods that analyze time series with a trend or/and periodicities lies in the fact that SSA does not require an a-priori model for trend as well as the a-priori knowledge of number of periodicities and period values". It also has only two parameters to be estimated and, in addition, it is very flexible in modelling periodicities, which makes the additive or multiplicative character of the analyzed time series model irrelevant. These features and others make the SSA method a very powerful tool for analysis and forecasting, with applications in various fields: financial mathematics, marketing, geology, hydrology etc. For example, Briceno et al. [2] propose using SSA to predict electric load demand in a region of Venezuela; the SSA method is described in detail in [7], where, as an application, the authors also present the analysis and prediction of the Monthly Accidental Deaths in US time series (period 1973-1978); in [1] the analysis and forecast of the UK Tourism Income time series is carried out.

2. The SSA method

Let 𝒚 = (𝑦1, 𝑦2, ⋯, 𝑦𝑁) be a time series of length 𝑁. The basic idea of the SSA algorithm is, in a first phase (the decomposition phase), to decompose the time series into several independent components and to detect its fundamental elements (trend, periodic components and noise). In the second phase (the reconstruction phase), some of these components are used for reconstructing and forecasting the time series [2]. The main steps of the algorithm are as follows [5]:

2.1. The decomposition of the time series

2.1.1. Embedding


Let 𝐿 be an integer such that 1 < 𝐿 < 𝑁 and let 𝐾 = 𝑁 − 𝐿 + 1. By sliding a window of length 𝐿 along the time series 𝒚 we generate 𝐾 lagged vectors 𝑋𝑗 as follows:

𝑋𝑗 = (𝑦𝑗, 𝑦𝑗+1, …, 𝑦𝑗+𝐿−1)^𝑇, 1 ≤ 𝑗 ≤ 𝐾 (1)

The column vectors 𝑋𝑗, 1 ≤ 𝑗 ≤ 𝐾, together form the matrix 𝑿 = [𝑋1 𝑋2 … 𝑋𝐾] of dimension 𝐿×𝐾, called the trajectory matrix of the time series 𝒚. From the way the matrix 𝑿 is constructed, its elements have the property 𝑿𝑖,𝑗 = 𝑦𝑖+𝑗−1, i.e. 𝑿 is a Hankel matrix [11].
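The embedding step can be sketched in a few lines. The author's computations use R [12]; the illustration below (and those that follow) is a Python/NumPy sketch with a hypothetical helper name, not the code used in the paper:

```python
import numpy as np

def trajectory_matrix(y, L):
    """Build the L x K trajectory matrix of the series y, where
    K = N - L + 1 and column j is the lagged vector (y_j, ..., y_{j+L-1})."""
    y = np.asarray(y, dtype=float)
    K = len(y) - L + 1
    # Stack the K lagged windows as columns; the result is a Hankel
    # matrix: X[i, j] depends only on i + j.
    return np.column_stack([y[j:j + L] for j in range(K)])

# Tiny example: N = 5, L = 3 gives a 3 x 3 Hankel matrix with
# constant anti-diagonals.
X = trajectory_matrix([1, 2, 3, 4, 5], L=3)
```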

2.1.2. Singular Value Decomposition (SVD) of the trajectory matrix 𝑿

Let λ1 ≥ λ2 ≥ ⋯ ≥ λL ≥ 0 be the eigenvalues of the matrix 𝑿^𝑇𝑿, arranged in descending order. The singular value decomposition of the matrix 𝑿 (𝐿×𝐾) is [13]

𝑿 = 𝑼𝑫𝑽^𝑇, (2)

where 𝑼 (𝐿×𝐿) and 𝑽 (𝐾×𝐾) are orthogonal matrices and 𝑫 (𝐿×𝐾) is a diagonal matrix. The columns 𝑼1, 𝑼2, …, 𝑼𝐿 of the matrix 𝑼 are called the left singular vectors and the columns 𝑽1, 𝑽2, …, 𝑽𝐾 of the matrix 𝑽 are called the right singular vectors; they are the eigenvectors of the matrices 𝑿𝑿^𝑇 and 𝑿^𝑇𝑿, respectively [10]. If 𝑑 = rank(𝑿) = max{𝑖 such that λ𝑖 > 0}, then 𝑫 = diag(√λ1, √λ2, ⋯, √λ𝑑), where √λ1, √λ2, ⋯, √λ𝑑 are the singular values of the matrix 𝑿. Then, from equation (2), the matrix 𝑿 decomposes as

𝑿 = 𝑿1 + 𝑿2 + ⋯ + 𝑿𝑑, (3)

where 𝑿𝑖 = √λ𝑖 𝑼𝑖 𝑽𝑖^𝑇, 1 ≤ 𝑖 ≤ 𝑑. The triple (√λ𝑖, 𝑼𝑖, 𝑽𝑖) is called the i-th eigentriple. The proportion 𝑝𝑖 of the variation of 𝑿 explained by 𝑿𝑖 is calculated by the formula [2]: 𝑝𝑖 = λ𝑖 / (λ1 + λ2 + ⋯ + λ𝑑).
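The decomposition (2)-(3) and the proportions 𝑝𝑖 map directly onto a standard SVD routine. A minimal Python/NumPy sketch (the rank tolerance and the function name are illustrative):

```python
import numpy as np

def ssa_svd(X, tol=1e-12):
    """SVD step: decompose the trajectory matrix into rank-one elementary
    matrices X_i = sqrt(lambda_i) U_i V_i^T (equation (3)) and compute the
    explained proportions p_i = lambda_i / sum(lambda_k)."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    d = int(np.sum(s > tol))                       # d = rank(X)
    elementary = [s[i] * np.outer(U[:, i], Vt[i]) for i in range(d)]
    lam = s[:d] ** 2                               # eigenvalues of X^T X
    return elementary, lam / lam.sum()
```

Summing the elementary matrices recovers 𝑿 exactly, which is a quick sanity check on the decomposition.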

2.2. The reconstruction of the time series

2.2.1 Grouping

This step, also called eigentriple grouping, refers to partitioning the set {1, 2, …, 𝑑} into 𝑚 subsets of indices denoted 𝐼1, 𝐼2, …, 𝐼𝑚, as a strategy to enable, if necessary, the detection of the trend, of periodic components or of noise. If for each group of indices 𝐼𝑖, 𝑖 = 1, 2, …, 𝑚, we denote by 𝑿𝑰𝒊 the sum of the matrices 𝑿𝑖 in this group, then the matrix 𝑿 can be rewritten as

𝑿 = 𝑿𝑰𝟏 + 𝑿𝑰𝟐 + ⋯ + 𝑿𝑰𝒎 (4)

2.2.2. Diagonal averaging

In this step, from each matrix 𝑿𝑰𝒌, 1 ≤ 𝑘 ≤ 𝑚, an additive component 𝒚^(𝑘) = (𝑦1^(𝑘), 𝑦2^(𝑘), …, 𝑦𝑁^(𝑘)) of the initial time series is obtained, reconstructed by the so-called "diagonal averaging" process. Applied to a matrix 𝑿𝑰𝒌, this process obtains each element 𝑦𝑙^(𝑘) of the additive component as the arithmetic mean of the elements on the anti-diagonal [14] characterized by 𝑖 + 𝑗 = 𝑙 + 1. In this way, the initial time series is decomposed into 𝑚 time series, the time series 𝒚^(𝑘) being generated by diagonal averaging of the matrix 𝑿𝑰𝒌.
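The diagonal averaging process can be sketched as a direct, unoptimized loop over the anti-diagonals (again a Python/NumPy illustration):

```python
import numpy as np

def diagonal_average(X):
    """'Diagonal averaging': turn an L x K matrix back into a series of
    length N = L + K - 1, element l being the mean of the entries on the
    anti-diagonal i + j = l + 1 (1-based), i.e. i + j = l with 0-based
    indices."""
    L, K = X.shape
    N = L + K - 1
    sums = np.zeros(N)
    counts = np.zeros(N)
    for i in range(L):
        for j in range(K):
            sums[i + j] += X[i, j]
            counts[i + j] += 1
    return sums / counts
```

Applied to a genuine Hankel matrix, diagonal averaging is exact: it returns the series the matrix was built from.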


3. Using the SSA method for forecasting

Although the SSA algorithm does not require the assumption of a particular model for the analyzed time series, in order to use it as a prediction tool it is convenient to make certain assumptions. Thus, we assume that the reconstructed time series 𝒚 can be written as a sum of two approximately separable time series 𝒚^(1) and 𝒚^(2) [5, p.17], meaning 𝒚 = 𝒚^(1) + 𝒚^(2). In practice, 𝒚^(1) typically corresponds to the grouping of the first 𝑑 most significant singular values and represents the signal, while 𝒚^(2) represents the noise. If, furthermore, the behaviour of the time series 𝒚^(1) is governed by a Linear Recurrence Relation (LRR) [5, p. 35]

𝑦𝑖^(1) = ∑_{𝑗=1}^{𝑟} 𝑎𝑗 𝑦𝑖−𝑗^(1), (5)

then the recurrence formula (5) can be used for forecasting the time series 𝒚. The coefficients 𝑎𝑗 can be obtained from the eigenvectors resulting from the SVD decomposition [3]. Fortunately, the class of time series which satisfy the recurrence formula (5) is very wide and includes polynomial, harmonic, exponential etc. time series [7]. This makes forecasting based on relation (5) frequently used in practice.
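As a sketch of the whole forecasting step: the coefficients below are the standard minimum-norm LRR coefficients derived from the leading left singular vectors, in the spirit of [5]; the Python/NumPy code, the function name and the choice of r leading eigentriples are illustrative, not the author's R implementation:

```python
import numpy as np

def ssa_forecast(y, L, r, steps):
    """Reconstruct the signal from the leading r eigentriples, derive the
    LRR coefficients from the left singular vectors, and iterate relation
    (5) to extend the series by `steps` values."""
    y = np.asarray(y, dtype=float)
    N = len(y); K = N - L + 1
    X = np.column_stack([y[j:j + L] for j in range(K)])
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    Ur = U[:, :r]                        # leading left singular vectors
    # Signal reconstruction: project X on the leading eigentriples,
    # then average the anti-diagonals (diagonal averaging).
    Xr = Ur @ (Ur.T @ X)
    rec = np.zeros(N); cnt = np.zeros(N)
    for i in range(L):
        for j in range(K):
            rec[i + j] += Xr[i, j]; cnt[i + j] += 1
    rec /= cnt
    # Minimum-norm LRR coefficients: pi holds the last components of the
    # eigenvectors, nu2 the verticality coefficient; a is ordered so that
    # the forecast is the dot product with the last L-1 values.
    pi = Ur[-1, :]
    nu2 = float(pi @ pi)
    a = (Ur[:-1, :] @ pi) / (1.0 - nu2)
    out = list(rec)
    for _ in range(steps):
        out.append(float(a @ np.asarray(out[-(L - 1):])))
    return np.asarray(out)
```

On a pure harmonic (which satisfies an order-2 LRR exactly), the recurrent forecast continues the series to within numerical precision.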

4. Singular Spectrum Analysis and the time series forecast of the average monthly

temperatures in Romania

The interest in weather prediction existed long before our era. For hundreds of years, starting with the early Babylonian, Chinese and Greek civilizations and until the Renaissance, when the first meteorological measuring instruments were invented (the hygrometer for humidity, the thermometer for temperature, the barometer for pressure), predictions were based on the speculations of natural philosophers, often proving to be inadequate [6]. Today, data from measurements of certain characteristics of the weather (for example temperature, humidity and air pressure) are treated as time series. Table 1 contains the average monthly temperatures in Romania measured during 1991-2015. The data are taken from [15] and represent a time series 𝒚 of length 300, which we call for short the AMT time series. To obtain forecasts of the average monthly temperatures for the period 2016-2020 we apply the steps of the SSA method [5], presented broadly in the previous section.

For the decomposition and reconstruction of the time series it is necessary to choose the value 𝐿 of the sliding window and to group the time series components for reconstruction so as to allow a good separation of the components. The specialized literature [3, p.47] recommends choosing 𝐿 < 𝑁/2, 𝑁 being the length of the analysed time series; in addition, if the time series has an integer-period component, 𝐿 should be proportional to the value of that period [8, p.244]. The graph of the AMT time series (Figure 1) reveals, as expected, its oscillatory behaviour, with a period of 12 months. Taking this information into account, we chose 𝐿 = 144.

Figure 1. The graph of average monthly temperatures in Romania for the period 1991-2015. Source: made by the author using functions from R [12]


The way we group the components of the decomposed time series in order to reconstruct it is based on the suggestions offered by several graphics that are particularly important for this analysis. Figure 2 represents the graph of the singular values (the square roots of the eigenvalues) in descending order. Typically this graph, called the eigenspectrum, is shaped like a hockey stick: the first part is steep and the rest decays slowly. Plateau areas indicate eigenvalues that are equal or nearly equal. Practice shows that if this happens in the steep part of the graph, then the eigentriples of these singular values represent potential harmonic oscillations [8, pp.166-167]. This occurs in Figure 2, suggesting that eigentriples 2 and 3 could be grouped together to reconstruct the harmonic component of the time series.

Figure 2. The graph of the singular values (eigenspectrum) of the time series AMT. Source: made by the author using

functions from R

Figure 3 represents the graphs of the first 10 eigenvectors and their contributions to the reconstruction of the time series. It can be noted that the first three most important eigenvectors explain 56.47% + 21.05% + 20.59% = 98.11% of the time series. The eigenvectors are themselves subseries of the analyzed time series and therefore their graphical representation can suggest the correct identification of the trend, of the harmonic components and of the noise. Figure 3 suggests that the first eigenvector determines the trend of the time series, because [7, p.249] the trend is characterized by reduced variability and is usually determined by the eigentriples associated with the largest singular values.


Figure 3. Plots of the eigenvectors obtained in the phase of decomposition of the time series AMT. Source: made by the

author using functions from R

According to [7, pp.246-247], if regular polygons appear in the graphical representation of pairs of successive eigenvectors, then the corresponding eigentriples are potential harmonic components with the oscillation period equal to the number of sides of the polygon. Figure 4 represents the graphs of the first 10 successive pairs of eigenvectors. It is clearly seen that the representation of eigenvector 2 against eigenvector 3 is a regular polygon with 12 sides, which confirms the existence of a harmonic component of period 12.

Figure 4. The graph of the pairs of eigenvectors. Source: made by the author using functions from R


The degree of separability of the components proposed for reconstruction can be assessed by analyzing the w-correlogram of these components, represented graphically in Figure 6. The w-correlation of two components 𝒚^(1) and 𝒚^(2) of a time series of length 𝑁 is defined by the formula [7, p.248]:

ρ(𝒚^(1), 𝒚^(2)) = (𝒚^(1), 𝒚^(2))_𝑤 / (‖𝒚^(1)‖_𝑤 ‖𝒚^(2)‖_𝑤) (6)

where (𝒚^(1), 𝒚^(2))_𝑤 = ∑_{𝑘=1}^{𝑁} 𝑤𝑘 𝑦𝑘^(1) 𝑦𝑘^(2), ‖𝒚^(𝑖)‖_𝑤 = √(𝒚^(𝑖), 𝒚^(𝑖))_𝑤, 𝑖 = 1, 2, and 𝑤𝑘 = min(𝑘, 𝐿, 𝑁 − 𝑘), with 𝐿 ≤ 𝑁/2.

The idea underlying the w-correlogram is that the closer to zero the absolute value of the coefficient ρ (which ranges between 0 and 1 in absolute value), the better the separability of the two components. Conversely, two components with a value of |ρ| close to 1 are strongly correlated and can therefore be grouped together in the reconstruction of the time series. Figure 6 is a graphical image of the w-correlation matrix, in which the shade from white to black of each cell represents the degree of correlation of the corresponding components (the value ρ = 0 is represented in white and the values |ρ| = 1 in black).

Figure 6. The graph of the w-correlation matrix. Source: made by the author using functions from R

From the analysis of the w-correlation matrix we can see that, as expected, the component related to the first eigentriple is not w-correlated with any other component and represents the trend of the time series. Components 2 and 3 are strongly correlated with each other and are not w-correlated with any other component, a fact that justifies their grouping in the reconstruction of the time series. The remaining components are correlated to a lesser or greater extent and represent the noise.
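Formula (6) can be sketched directly. One caution: textbook conventions for the weights differ by one in the index; the form min(𝑘, 𝐿, 𝑁 − 𝑘 + 1) with 1-based 𝑘 counts exactly how many entries of the trajectory matrix contain 𝑦𝑘, so the Python/NumPy sketch below uses that counting form, assuming 𝐿 ≤ 𝑁/2:

```python
import numpy as np

def w_correlation(y1, y2, L):
    """Weighted correlation (6), assuming L <= N/2. The weight
    w_k = min(k, L, N - k + 1) (1-based k) is the number of entries of
    the L x K trajectory matrix that contain y_k."""
    y1 = np.asarray(y1, float); y2 = np.asarray(y2, float)
    N = len(y1)
    k = np.arange(1, N + 1)
    w = np.minimum(np.minimum(k, L), N - k + 1).astype(float)
    inner = lambda a, b: float(np.sum(w * a * b))
    return inner(y1, y2) / np.sqrt(inner(y1, y1) * inner(y2, y2))
```

A component is perfectly w-correlated with itself (ρ = 1) and with its negative (ρ = −1); well-separated components give |ρ| close to 0.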


In conclusion, the decomposition of the AMT time series is the following:

- the trend component, represented by the eigentriple which contains the first singular value (56.47% contribution);
- the harmonic component, represented by eigentriples 2 and 3 (21.05% + 20.59% = 41.64% contribution);
- the noise component, represented by the rest of the eigentriples (1.89% contribution).

It is worth noting that the signal obtained by reconstruction (trend and harmonic component) is very strong, representing 98.11% of the total variance in the data. The graph of the original time series and the graphs of the main components (trend, seasonality, noise) obtained by the reconstruction of the time series are shown in Figure 7.

Figure 7. The graph of the original time series and its components: trend, seasonality, noise.

Source: made by the author using functions from R

Figure 8 presents both the graph of the original time series (solid line) and the graph of the reconstructed time series (dashed line). The two time series are very similar, which confirms the high quality of the reconstruction.


Figure 8. The graph of the original time series (solid line) and the graph of reconstructed time series (dashed line).

Source: made by the author using functions from R

5. Results and discussions

The reconstruction of the AMT time series creates the premises for its use in making predictions by means of the Linear Recurrence Relation (5). The graph of the time series and of the forecast for a period of 48 months is shown in Figure 9.

Figure 9. The graph of the AMT time series and of the forecast values. Source: made by the author using functions from R

The average monthly temperatures predicted for 2019 and the related prediction intervals are shown in Table 1.


Table 1. Forecast values and their 95% (Lo 95, Hi 95) prediction intervals

Month      Point Forecast   Lo 95        Hi 95
Jan 2019   -1.01817828      -1.5573986   -0.5724255
Feb 2019    0.24152553      -0.2984936    0.6638098
Mar 2019    4.28840779       3.5683993    4.6937883
Apr 2019   10.04084436       9.2149059   10.6289533
May 2019   15.96056381      15.1014423   16.7306848
Jun 2019   20.46444728      19.6858186   21.3627096
Jul 2019   22.34829633      21.7423061   23.2679232
Aug 2019   21.10920307      20.5167169   21.9383732
Sep 2019   17.08021122      16.3841963   17.7281011
Oct 2019   11.34117963      10.6062079   11.7926256
Nov 2019    5.42978059       4.7296793    5.8410843
Dec 2019    0.92990889       0.3119404    1.3735101

Source: made by the author with results obtained in R

In order to estimate the accuracy of the forecasts obtained with the SSA method we divided the AMT time series into two parts [9]: the first part, consisting of the observations for the period 1991-2011 (80% of the data), was used as training data, and the second part, consisting of the observations for the period 2011-2015 (20% of the data), was used as test data. The estimation of the prediction accuracy was performed using the following indicators: RMSE (Root Mean Squared Error), MAE (Mean Absolute Error), MAPE (Mean Absolute Percentage Error) and MASE (Mean Absolute Scaled Error). For comparison we calculated the same indicators using two other forecasting methods: ARIMA and neural networks. The results are presented in Table 2 and show that for all the indicators the values obtained with SSA are the smallest, so in this case the prediction made by SSA is the best.

Table 2. Estimated values of the forecast error

Method           RMSE   MAE    MAPE     MASE
SSA              2.02   1.65    73.29   0.88
ARIMA            2.51   1.96   108.08   1.04
Neural Network   2.81   2.23   115.85   1.18

Source: made by the author with results obtained in R
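The four indicators are standard and easy to reproduce; below is a minimal Python/NumPy sketch using the non-seasonal MASE variant, which scales MAE by the in-sample MAE of the naive last-value forecast on the training data [9]. Note that MAPE divides by the actual values, so months with near-zero winter temperatures inflate it, which is consistent with the large MAPE values in Table 2:

```python
import numpy as np

def forecast_errors(actual, predicted, train):
    """RMSE, MAE, MAPE (in percent) and non-seasonal MASE for a set of
    forecasts, given the training series used to scale MASE."""
    a = np.asarray(actual, float); p = np.asarray(predicted, float)
    e = a - p
    rmse = float(np.sqrt(np.mean(e ** 2)))
    mae = float(np.mean(np.abs(e)))
    mape = float(100 * np.mean(np.abs(e / a)))   # undefined if any a == 0
    # Scale by the in-sample MAE of the naive (last-value) forecast.
    naive_mae = float(np.mean(np.abs(np.diff(np.asarray(train, float)))))
    mase = mae / naive_mae
    return rmse, mae, mape, mase
```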

6. Conclusions

In this paper we presented the SSA method and its application to the analysis and forecasting of the time series of average monthly temperatures for the period 2016-2020. The analysis of the time series allowed its decomposition and reconstruction by detecting its trend, its seasonal component and its noise, the obtained signal being very strong (it explains 98.11% of the total variance of the data). The forecast for the period 2016-2020 was carried out on the basis of the signal obtained by the reconstruction of the time series, using a linear recurrence relation (LRR). In order to test the ability of the SSA method to provide quality forecasts, estimations of the forecast error were made (with the same training data and the same test data) using the SSA method as well as the ARIMA and neural networks methods. For each forecast error estimator used (RMSE, MAE, MAPE and MASE) the values obtained for the SSA method were the lowest, indicating that, in this case, the SSA method was the best.


7. Bibliography

[1] Beneki, Ch., Bruno, E., Costas, L., 2009, Signal Extraction and Forecasting of the UK Tourism Income Time Series: A Singular Spectrum Analysis Approach, University Library of Munich, Germany, MPRA Paper, available at https://www.researchgate.net/publication/46446611_Signal_Extraction_and_Forecasting_of_the_UK_Tourism_Income_Time_Series_A_Singular_Spectrum_Analysis_Approach [accessed on 27.05.2018]

[2] Briceno, H., Rocco, C. M., Zio, E., 2013, Singular Spectrum Analysis for Forecasting of Electric Load Demand, Chemical Engineering Transactions, Vol. 33

[3] Golyandina, N., Korobeynikov, A., 2013, Basic Singular Spectrum Analysis and Forecasting in R, available at https://www.researchgate.net/profile/Nina_Golyandina [accessed on 27.05.2018]

[4] Golyandina, N., Nekrutkin, V., Zhigljavsky, A., 2001, Analysis of Time Series Structure: SSA and Related Techniques, Chapman & Hall/CRC, New York

[5] Golyandina, N., Zhigljavsky, A., 2013, Singular Spectrum Analysis for Time Series, Springer Briefs in Statistics, DOI: 10.1007/978-3-642-34913-3

[6] Graham, S., Parkinson, C., Chahine, M., 2002, Weather Forecasting Through the Ages, NASA Earth Observatory, available at https://earthobservatory.nasa.gov/Features/WxForecasting/wx2.php [accessed on 20.05.2018]

[7] Hassani, H., 2007, Singular Spectrum Analysis: Methodology and Comparison, Journal of Data Science, 5, pp. 239-257

[8] Huffaker, R., Bittelli, M., Rosa, R., 2017, Nonlinear Time Series Analysis with R, Oxford

University Press

[9] Hyndman, R. J., Athanasopoulos, G., Forecasting: Principles and Practice, available at https://www.otexts.org/fpp/2/5 [accessed on 20.05.2018]

[10] Massachusetts Institute of Technology (MIT), Singular Value Decomposition (SVD) tutorial

available at http://web.mit.edu/be.400/www/SVD/Singular_Value_Decomposition.htm, [accessed

on 25.05.2018]

[11] Mathworld, Hankel matrix, available at http://mathworld.wolfram.com/HankelMatrix.html,

[accessed on 24.05.2018]

[12] R Core Team, 2017, R: A Language and Environment for Statistical Computing, R Foundation for Statistical Computing, Vienna, Austria, URL https://www.R-project.org

[13] Vinod, H. U., 1981, Recent Advances in Regression Methods, Marcel Dekker, New York

[14]*** https://www.interviewbit.com/problems/anti-diagonals/, [accessed on 25.05.2018]

[15]***http://sdwebx.worldbank.org/climateportal/index.cfm?page=downscaled_data_download&

menu=historical, [accessed on 5.05.2018]
