

Cite this article: Madadizadeh F, Ghanbarnejad A, Zeraati H, Tabar VR, Batmanghelich K, et al. (2018) Introduction to Spatial First Order Discrete HMM. Ann Biom Biostat 4(1): 1029.

Central
Bringing Excellence in Open Access

Annals of Biometrics & Biostatistics

*Corresponding author

Hojjat Zeraati, Department of Epidemiology and Biostatistics, School of Public Health, Tehran University of Medical Sciences, Tehran, Iran, Email: zeraatih@tums.ac.ir

Submitted: 16 November 2017

Accepted: 07 February 2018

Published: 11 February 2018

ISSN: 2333-7109

Copyright © 2018 Zeraati et al.

OPEN ACCESS

Keywords
• Discrete HMM
• Spatial structure

Research Article

Introduction to Spatial First Order Discrete HMM

Farzan Madadizadeh1, Amin Ghanbarnejad2, Vahid Rezaei Tabar3, Kayhan Batmanghelich4, Abbas Bahrampour5, and Hojjat Zeraati1

1 Department of Epidemiology and Biostatistics, Tehran University of Medical Sciences, Iran
2 Social Determinants in Health Promotion Research Center, Hormozgan University of Medical Sciences, Iran
3 Department of Statistics, Allameh Tabatabai University, Iran
4 Computer Science and Artificial Intelligence Lab, Massachusetts Institute of Technology, USA
5 Department of Epidemiology and Biostatistics, Kerman University of Medical Sciences, Iran

Abstract

Hidden Markov models (HMMs) are well known, powerful, and flexible statistical methods for modeling one dimensional time series data; they are used when a vector of observations and hidden states is processed. Sometimes a matrix of data (a spatial structure) must be dealt with instead of a vector, and in this situation there is a clear need for a new extension of HMMs that can accommodate the spatial structure of the data.

The discrete hidden Markov model (DHMM) is a type of HMM with discrete observations. This study presents a new extension of the first order DHMM for data with more than one dimension, that is, a spatial generalization of the first order DHMM. In fact, this new model is able to model a matrix of observations and hidden states.

INTRODUCTION

Hidden Markov models (HMMs hereafter) are advanced statistical models, proposed in the late 1960s, that are well suited for pattern recognition, data sequence generation, classification, and pattern segmentation problems in different areas (Hui et al., 2013; Bilal et al., 2013). These models are still being developed.

HMM is a statistical model with finite intra-model states, capable of learning. It was originally introduced as a classifier or predictor for various applications [1]. The widespread popularity of HMM is due to its strong mathematical structure and statistical foundation, as well as the presence of efficient training procedures in its framework [2].

Although the HMM has been known for decades, it has been applied mainly to modeling one dimensional data and is used less often with spatially structured data [3].

HMM has long been of interest to researchers, and useful steps have been taken to develop these models. The first steps were taken by Rabiner (1989), whose tutorial article revealed the details of these complex models, whose mechanism had been like a hidden black box [2]. In this tutorial, two types of HMM, continuous and discrete, were explained.

The development of HMMs can generally be examined from two perspectives. The first is model development based on the hidden chain, including the introduction of first order, second order, and higher order models, factorial models, hierarchical models, etc., all of which develop the dependency of the hidden states of the model.

The second category, in addition to the purpose of the first, also models the dependency of the observations. From this category, we can mention HMMs with first-order observation dependency, second-order autoregressive HMMs with first-order observation dependency, and second-order autoregressive HMMs with second-order observation dependency.

Among the first-category studies, to make the model more compatible with discrete data, Watson et al. (1992) introduced and explained the second-order discrete HMM [4]. This type of HMM was, in fact, a generalization of the first-order discrete HMM (DHMM hereafter) with a second-order Markov chain in its hidden state sequence.

Fine et al. (1998) introduced the hierarchical HMM (HHMM hereafter) as an extension of the HMM that includes a sequence of hierarchical states [5]. In other words, in this model each hidden state is itself a hierarchical model. It has three types of hidden states (internal, production, and end states) and three types of transition probabilities (vertical, horizontal, and forced). In recent years, this model has been used to recognize finger movements [6] and to learn movement structure [7].

The autoregressive HMM (ARHMM hereafter) was simplified, with a focus on modeling the dependence of continuous observations, by Rabiner (1989) [2]. Recent applications of these models are


the study of Barber et al. (2010) on short-horizon wind forecasting [8], the study of Wu et al. (2017) using a Bayesian nonparametric vector ARHMM to test robot performance [9], and the study of Tuncel et al. (2018) using autoregressive forests for modeling multivariate time series [10].

Zuanetti et al. (2017) introduced the second-order ARHMM with continuous observations [11]. This model is in fact a generalization of the first-order HMM in which the current hidden state depends on the hidden states of the two previous times. It also models the dependence between continuous observations through an autoregressive process.

There are very few studies on the spatial property of the hidden states of the HMM; they are as follows:

Mari et al. (2006), used a continuous second-order HMM to classify the data with spatial and temporal features [12]. The data of their study included the images of agricultural lands that were collected annually. The hidden states chain of the model was the type of the product of each land. Different lands were spatially dependent due to their proximity.

Feiyang and Horace (2006) introduced the concept of the spatial Markov model for continuous observations [13]. The resulting model was used for automatic classification of images. To extract spatial-domain information, they used the two dimensional Gabor filter.

Huang and Kennedy (2008) used an HMM to reveal hidden spatial patterns; the HMM measures spatial dependence through transition probabilities [14]. In their study, spatial structures of the city and house prices were used as hidden states and observations, respectively.

Spezia used a continuous spatial HMM to explain the spatial distributions of wild species in different regions [15]. The model was introduced in a Bayesian framework, with inference via MCMC algorithms. The proposed model was described as an undirected graphical model.

Therefore, as the literature review shows, the development of HMMs has concentrated on continuous observations, and less attention has been paid to the spatial relations of the hidden states in DHMMs. This study aims to fill this gap by introducing a spatial HMM with discrete observations. In other words, this study aimed to introduce a new extension of the DHMM with an underlying first order Markov chain that can be used for modeling discrete spatial data.

The remaining parts of this paper are organized as follows: in section two, the theory of first order HMMs is reviewed; in section three, the spatial first order DHMM is discussed and the newly generalized parameters are defined; in section four, the application of the proposed method is presented; in section five, our discussion and future work are presented.

FIRST ORDER DHMM

A DHMM can be regarded as a probabilistic generative model, in which a sequence of internal hidden states, not directly visible, produces a sequence of discrete observations. The hidden sequence of the model obeys the Markov chain property, and the model is named according to the dependency order of its states [16]. For example, a first-order HMM has first-order dependency in the chain of its states.

A first order HMM is defined by giving:

• A finite set of N states, S = {s_1, s_2, …, s_N}. A random variable X_t denotes the state at time t.

• A finite set of M observation symbols, O = {φ_1, φ_2, …, φ_M}. A random variable O_t denotes the observation at time t.

• Transition probabilities and their matrix:

a_ij = P(X_t = s_j | X_{t-1} = s_i), 1 ≤ i, j ≤ N, A = [a_ij], Σ_j a_ij = 1   (1)

• Emission probabilities and their matrix:

b_i(O_t) = P(O_t = φ_f | X_t = s_i), 1 ≤ i ≤ N, 1 ≤ f ≤ M, B = [b_i(O_t)], Σ_f b_i(O_t = φ_f) = 1   (2)

• The initial state distribution:

π_i = P(X_1 = s_i), 1 ≤ i ≤ N, Σ_i π_i = 1   (3)

According to the above formulas, a first order HMM can be specified by three main parameters, λ = (a_ij, b_i(O_t), π_i) [2]. For more details, Figure 1 presents a first order HMM with three states.

The first-order HMM must address three problems for appropriate and useful application in the real world [17]. Each problem is stated below together with its solution.

Likelihood calculation

How can the likelihood be calculated, given the main parameters of the model (λ) and a sequence of observations of length T, O = {O_1, O_2, …, O_T}?

The first method computes the probability of seeing the observation sequence up to time t (t < T) while being in a certain state i, then extends this recursion forward to the full sequence of length T (T > t); the likelihood is computed as follows:

Forward variable:

α_t(i) = P(O_1, …, O_t, X_t = s_i | λ)   (4)

α_{t+1}(j) = [ Σ_{i=1}^{N} α_t(i) a_ij ] b_j(O_{t+1})   (5)

Likelihood:

P(O | λ) = Σ_{i=1}^{N} α_T(i)   (6)
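To make the recursion concrete, here is a minimal numpy sketch of the forward computation in Eqs. (4)-(6); the transition, emission, and initial values below are hypothetical, chosen only for illustration.

```python
import numpy as np

# Hypothetical 2-state, 3-symbol first order DHMM (illustrative values only).
A = np.array([[0.7, 0.3],
              [0.4, 0.6]])        # a_ij = P(X_t = s_j | X_{t-1} = s_i)
B = np.array([[0.5, 0.4, 0.1],
              [0.1, 0.3, 0.6]])   # b_i(phi_f) = P(O_t = phi_f | X_t = s_i)
pi = np.array([0.6, 0.4])         # pi_i = P(X_1 = s_i)
obs = [0, 2, 1]                   # indices of the observed symbols phi_f

# Forward recursion: alpha_t(i) = P(O_1, ..., O_t, X_t = s_i | lambda)
alpha = pi * B[:, obs[0]]
for o in obs[1:]:
    alpha = (alpha @ A) * B[:, o]

likelihood = alpha.sum()          # P(O | lambda), Eq. (6)
print(likelihood)
```

In practice the recursion is carried out in log space or with per-step scaling, since the raw probabilities underflow for long sequences.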

The other technique computes the probability of seeing the remaining T − t observations (from t + 1 to the end) while being in a certain state i, then extends this recursion backward over the total sequence of observations


Figure 1 First order HMM with three states.

and, as a result, the likelihood is calculated as follows:

Backward variable:

β_t(i) = P(O_{t+1}, …, O_T | X_t = s_i, λ)   (7)

β_t(i) = Σ_{j=1}^{N} a_ij b_j(O_{t+1}) β_{t+1}(j), with β_T(i) = 1   (8)

β_1(i) = P(O_2, …, O_T | X_1 = s_i, λ)

Likelihood:

P(O | λ) = Σ_{i=1}^{N} π_i b_i(O_1) β_1(i)   (9)
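The backward computation of Eqs. (7)-(9) can be sketched in the same style; the parameter values are hypothetical and purely illustrative.

```python
import numpy as np

# Hypothetical 2-state, 3-symbol first order DHMM (illustrative values only).
A = np.array([[0.7, 0.3],
              [0.4, 0.6]])
B = np.array([[0.5, 0.4, 0.1],
              [0.1, 0.3, 0.6]])
pi = np.array([0.6, 0.4])
obs = [0, 2, 1]

T, N = len(obs), len(pi)

# Backward recursion: beta_t(i) = P(O_{t+1}, ..., O_T | X_t = s_i, lambda)
beta = np.ones(N)                 # beta_T(i) = 1 by convention
for t in range(T - 2, -1, -1):
    beta = A @ (B[:, obs[t + 1]] * beta)

likelihood = (pi * B[:, obs[0]] * beta).sum()   # Eq. (9)
print(likelihood)
```

A useful implementation check is that the forward and backward computations must agree on the same value of P(O | λ).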

Find the single best (optimal) state sequence for a given observation sequence

How can we find the state sequence of the model that is the most likely producer of a given series of observations, given the main parameters of the model (λ) and an observation sequence of length T?

Solution: By Bayes' rule,

P(X | O, λ) = P(X, O | λ) / P(O | λ)  ⇒  argmax_X P(X | O, λ) = argmax_X P(X, O | λ)   (10)

because the denominator is constant (it does not contain X). Therefore, rather than maximizing P(X | O, λ), its equivalent P(X, O | λ) can be maximized by the Viterbi algorithm, and the optimal state sequence can be found.

The Viterbi algorithm employs the joint probability of states and observation sequences, P(X, O | λ), as the optimization criterion for the state sequence X and identifies the most probable path. In fact, among all possible state sequences, the Viterbi algorithm finds the single path with maximum likelihood.
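A minimal log-space Viterbi sketch looks as follows; all parameter values are hypothetical, chosen only for illustration.

```python
import numpy as np

# Hypothetical 2-state, 3-symbol DHMM (illustrative values only).
A = np.array([[0.7, 0.3],
              [0.4, 0.6]])
B = np.array([[0.5, 0.4, 0.1],
              [0.1, 0.3, 0.6]])
pi = np.array([0.6, 0.4])
obs = [0, 2, 1]

T, N = len(obs), len(pi)
logA, logB, logpi = np.log(A), np.log(B), np.log(pi)

# delta_t(i): highest log-probability of any state path ending in s_i at time t
delta = logpi + logB[:, obs[0]]
psi = np.zeros((T, N), dtype=int)        # back-pointers
for t in range(1, T):
    scores = delta[:, None] + logA       # scores[i, j]: come from s_i, go to s_j
    psi[t] = scores.argmax(axis=0)
    delta = scores.max(axis=0) + logB[:, obs[t]]

# Backtrack the single best state sequence
path = [int(delta.argmax())]
for t in range(T - 1, 0, -1):
    path.append(int(psi[t, path[-1]]))
path.reverse()
print(path)
```

Working in log space avoids underflow and turns the products of probabilities into sums.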

Training of model (parameter estimation)

The most difficult problem to which the first-order HMM is exposed is model training. Unfortunately, there is no exact method capable of recovering the model from a given sample set, and the search for better solutions is still ongoing. Training the model means learning the model parameters λ from training data (a set of states and a sequence of observations) so that an optimization criterion is satisfied. A common optimization criterion is maximization of the likelihood via the iterative Forward-Backward (Baum-Welch) algorithm [18].
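As a sketch of this training step, the following is a generic Baum-Welch (Forward-Backward) EM update for a first order DHMM. The function name `baum_welch` and all parameter values are illustrative, not a specific published implementation, and the unscaled recursions are only suitable for short sequences.

```python
import numpy as np

def baum_welch(A, B, pi, obs, n_iter=20):
    """One-sequence Baum-Welch (EM) training for a first order DHMM."""
    obs = np.asarray(obs)
    T, N = len(obs), len(pi)
    for _ in range(n_iter):
        # E-step: forward and backward variables (unscaled, short sequences only)
        alpha = np.zeros((T, N)); beta = np.ones((T, N))
        alpha[0] = pi * B[:, obs[0]]
        for t in range(1, T):
            alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
        for t in range(T - 2, -1, -1):
            beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])
        like = alpha[-1].sum()
        gamma = alpha * beta / like                      # P(X_t = s_i | O, lambda)
        xi = np.array([alpha[t][:, None] * A *
                       (B[:, obs[t + 1]] * beta[t + 1])[None, :] / like
                       for t in range(T - 1)])           # state-pair posteriors
        # M-step: re-estimate pi, A, B from expected counts
        pi = gamma[0]
        A = xi.sum(axis=0) / gamma[:-1].sum(axis=0)[:, None]
        B = np.stack([gamma[obs == k].sum(axis=0) for k in range(B.shape[1])],
                     axis=1) / gamma.sum(axis=0)[:, None]
    return A, B, pi, like

# Illustrative starting parameters and a short observation sequence.
A0 = np.array([[0.7, 0.3], [0.4, 0.6]])
B0 = np.array([[0.5, 0.4, 0.1], [0.1, 0.3, 0.6]])
pi0 = np.array([0.6, 0.4])
A1, B1, pi1, like = baum_welch(A0, B0, pi0, [0, 2, 1, 0, 1])
```

Each EM iteration is guaranteed not to decrease the likelihood, which is a convenient correctness check during development.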

SPATIAL FIRST ORDER DHMM

The spatial first order DHMM is an extension of the first-order HMM to spatial data with more than one dimension, such that the future state depends not only on the current state in one dimension but also on the current state in the other dimension [3].

Image processing is a good example of a spatial structure; hence, throughout this study, all extensions are based on a two dimensional spatial structure, using images as the motivating case [19].

Suppose there is an image of two dimensional size L × M, indexed by ℓ = 1, 2, …, L (rows) and m = 1, 2, …, M (columns).

Spatial first order DHMM can be defined as follows:

• The hidden state matrix X has size L × M, where each hidden state x_{ℓ,m} takes one of N possible state values.

• The observation matrix O has size L × M:

O = | o_{1,1} … o_{1,M} |
    |   ⋮    ⋱    ⋮    |
    | o_{L,1} … o_{L,M} |

• Markov property in this situation will be:


P(x_{ℓ,m} | x_{1,1}, …, x_{1,m}, …, x_{ℓ-1,1}, …, x_{ℓ-1,m}, x_{ℓ,1}, x_{ℓ,2}, …, x_{ℓ,m-1})
= P(x_{ℓ,m} | x_{ℓ-1,m}, x_{ℓ,m-1})
= P(x_{ℓ,m} | x_{ℓ-1,m}) · P(x_{ℓ,m} | x_{ℓ,m-1})

In other words, the current state of the model depends only on the previous state in each of the two dimensions.

• The transition probabilities matrix in this situation will be

A = {a_ijk},
a_ijk = P(x_{ℓ,m} = s_k | x_{ℓ-1,m} = s_i, x_{ℓ,m-1} = s_j)
      = P(x_{ℓ,m} = s_k | x_{ℓ-1,m} = s_i) · P(x_{ℓ,m} = s_k | x_{ℓ,m-1} = s_j)

Therefore, the transition probability matrix can be defined in terms of the horizontal transition matrix (H), built from P(x_{ℓ,m} = s_k | x_{ℓ,m-1} = s_j), and the vertical transition matrix (V), built from P(x_{ℓ,m} = s_k | x_{ℓ-1,m} = s_i) (Figure 2).

• Emission probability matrix is

B = {b_ijk(o_{ℓ,m})},
b_ijk(o_{ℓ,m}) = P(o_{ℓ,m} = φ_f | x_{ℓ,m} = s_k, x_{ℓ-1,m} = s_i, x_{ℓ,m-1} = s_j)

• Initial probability matrix is

Π = {π_ij}

Hence, our model generalizes the first order DHMM by introducing new basic parameters. The spatial first order HMM is shown in Figure 3.

Formally, the spatial first order DHMM can be specified by four main parameters, λ = (H, V, B, Π). Readers should note that a detailed description of the application of the spatial first order HMM requires new extensions of all the algorithms of the first order discrete HMM; our study is at this early stage.
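To illustrate the four-parameter specification λ = (H, V, B, Π), the sketch below scores one candidate hidden-state grid against an observation grid under one plausible reading of the factorised Markov property above (first row uses H only, first column uses V only, the corner uses an initial distribution; Π is reduced to a single corner distribution for simplicity). All names and values are hypothetical, and the un-normalised product of H and V is used exactly as the factorisation is written; a full treatment would renormalise it.

```python
import numpy as np

# Illustrative 2-state spatial first order DHMM on a 2x2 grid (hypothetical values).
H = np.array([[0.8, 0.2], [0.3, 0.7]])   # horizontal: P(x_{l,m} | x_{l,m-1})
V = np.array([[0.6, 0.4], [0.1, 0.9]])   # vertical:   P(x_{l,m} | x_{l-1,m})
B = np.array([[0.9, 0.1], [0.2, 0.8]])   # emission:   P(o_{l,m} = phi_f | x_{l,m} = s_k)
pi = np.array([0.5, 0.5])                # initial distribution for the corner state

X = np.array([[0, 1], [1, 1]])           # a candidate hidden-state grid
O = np.array([[0, 1], [1, 1]])           # observed symbol grid

# Score P(X, O) under the factorised spatial Markov property:
# interior states condition on both the left and the upper neighbour.
p = 1.0
L, M = X.shape
for l in range(L):
    for m in range(M):
        k = X[l, m]
        if l == 0 and m == 0:
            p *= pi[k]
        elif l == 0:
            p *= H[X[l, m - 1], k]
        elif m == 0:
            p *= V[X[l - 1, m], k]
        else:
            p *= H[X[l, m - 1], k] * V[X[l - 1, m], k]
        p *= B[k, O[l, m]]
print(p)
```

Summing this score over all possible grids X would give the spatial analogue of the likelihood P(O | λ).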

APPLICATION OF PROPOSED MODEL

As a matter of fact, without defining all of these new extensions we cannot present a real example of spatial first order DHMM application; for instance, training the parameters of the spatial first order HMM requires a new extension of the Forward-Backward algorithm, and testing model recognition requires a new extension of the Viterbi algorithm for spatial data. Nevertheless, an overall summary of the model's application is introduced here.

To explain our proposed method, suppose a random image is selected; we need feature extraction techniques such as the two dimensional Wavelet Transform (WT) to build a matrix of observations. According to their outputs, wavelet transforms can be classified into different types, such as continuous, discrete, etc. [20]. In the present study, the two dimensional DWT was used as the feature extraction technique.
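As a sketch of this feature-extraction step, a single-level two dimensional Haar DWT can be written directly in numpy (using the simple averaging/differencing form of the Haar filters; a library such as PyWavelets would normally be used instead), and the approximation band can then be quantised into discrete symbols for the observation matrix. The function name `haar_dwt2` and the toy image are illustrative.

```python
import numpy as np

def haar_dwt2(img):
    """One-level 2-D Haar DWT (averaging/differencing form)."""
    img = np.asarray(img, dtype=float)        # dimensions must be even
    lo = (img[:, 0::2] + img[:, 1::2]) / 2    # row-wise low-pass
    hi = (img[:, 0::2] - img[:, 1::2]) / 2    # row-wise high-pass
    cA = (lo[0::2, :] + lo[1::2, :]) / 2      # approximation band
    cV = (lo[0::2, :] - lo[1::2, :]) / 2      # vertical detail
    cH = (hi[0::2, :] + hi[1::2, :]) / 2      # horizontal detail
    cD = (hi[0::2, :] - hi[1::2, :]) / 2      # diagonal detail
    return cA, cH, cV, cD

# An 8x8 toy "image" gives a 4x4 approximation band.
rng = np.random.default_rng(0)
image = rng.integers(0, 256, size=(8, 8))
cA, cH, cV, cD = haar_dwt2(image)

# Quantise cA into M = 3 discrete symbols to form the observation matrix.
edges = np.quantile(cA, [1 / 3, 2 / 3])
obs_matrix = np.digitize(cA, edges)           # entries in {0, 1, 2}
print(obs_matrix.shape)
```

The resulting 4 × 4 matrix of symbols is exactly the kind of discrete observation matrix the spatial first order DHMM is meant to model.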

After applying this transformation, a matrix of observations is obtained. The spatial first order HMM assumes that these observations are sequences emitted from hidden states, and that the hidden states depend only on adjacent states in a two dimensional neighbourhood.

Usually, the parameters are selected experimentally. For example, if our observation matrix has size 4 × 4, we can run a spatial first order HMM with different numbers of states, such as 4² = 16, etc. (Figure 4, Table 1).

CONCLUSION & FUTURE WORK

HMMs are a doubly stochastic process, in which an underlying stochastic process produces a sequence of hidden states, while a second, simultaneous process produces the sequence of visible observations. Based on the discreteness or continuity of the observation sequence, Markov models are divided into discrete and continuous categories. Although HMMs have been developing over the years, this development has occurred less in the area of discrete observations; for example, the model has not been developed to consider spatial relations of hidden states with discrete observations.

The spatial first order DHMM is a generalization of the DHMM with respect to the dependence between hidden states in a two dimensional space. Although the development of spatial HMMs began in 2006 with Feiyang and Horace [13], such models have not yet been introduced for discrete observations.

The present study is a small step towards the development of spatial models in the form of a first-order DHMM. In fact, this model is a two dimensional HMM with a first-order Markov chain in the hidden part of the model. Spatial dependency between the states means the dependence of each state on the adjacent state in each dimension, which can be modeled by introducing horizontal and vertical transition probabilities.

This study is still in the early stages of complete modeling of the spatial first-order DHMM. Full development of the model will come with the introduction of generalized Forward, Backward, and Viterbi algorithms. In addition, to validate the proposed model, it is necessary to apply it to real and simulated data.

The ARHMM is a generalization of HMMs that models the observations' dependence as an autoregressive process [8]. These models are named according to the order of the Markov chain and the order of the observations' dependence. Generalizations of the autoregressive model have mostly been made for models with continuous observations, so introducing such models for discrete observations seems necessary. Therefore, we propose, as a first step, the development of autoregressive models with discrete observations and, as a next step, the development of spatial autoregressive models with discrete observations.

Hierarchical HMMs are a generalization of HMMs in which each hidden state of the model is itself a hierarchical HMM [5]. Although horizontal and vertical transition probabilities are also discussed there, they refer to transitions between different layers, which differs from the horizontal and vertical transition probabilities in the present study, defined for each hidden state. The absence of such generalizations is conspicuous in the domain of DHMMs, so generalizing these models to the discrete case is also suggested.


Figure 2 Transition probability in spatial structure of first order HMM; Abbreviations: H: Horizontal Transition Matrix; V: Vertical Transition Matrix

Figure 3 Spatial first order HMM, two dimension.

Figure 4 Scheme of proposed spatial first order HMM.


In a nutshell, our research presents a new extension of discrete first order HMMs and shows generalizations of the basic parameters for spatial first order HMMs applied to matrix data. Generally speaking, in contrast to the simple first order HMM, the spatial first order HMM incorporates more contextual information by revising all the traditional components of the model, such as its essential parameters.

ACKNOWLEDGMENT

The authors gratefully thank the reviewers and Editor for the detailed comments, which led to much improvement of the paper. This study was part of a PhD thesis.

REFERENCES

1. Madadizadeh F, Montazeri M, Bahrampour A. Predicting of liver disease using Hidden Markov Model. RJMS. 2016; 23: 66-74.

2. Rabiner L, Juang B. An introduction to hidden Markov models. IEEE assp magazine. 1986; 3: 4-16.

3. Liu R, Men C, Wang X, Xu F, Yu W. Application of spatial Markov chains to the analysis of the temporal–spatial evolution of soil erosion. Water Sci Technol. 2016; 74: 1051-1059.

4. Watson B, Tsoi AC. Second order Hidden Markov Models for speech recognition. Proceedings of Fourth Australian International Conference on Speech Science and Technology. 1992.

5. Fine S, Singer Y, Tishby N. The hierarchical hidden Markov model: Analysis and applications. Machine learning. 1998; 32: 41-62.

6. Mousas C, Anagnostopoulos CN. Real-time performance-driven finger motion synthesis. Comput Graph. 2017; 65: 1-11.

7. Mousas C. Full-Body Locomotion Reconstruction of Virtual Characters Using a Single Inertial Measurement Unit. Sensors. 2017; 17: 2589.

8. Barber C, Bockhorst J, Roebber P. Auto-regressive HMM inference with incomplete data for short-horizon wind forecasting. Adv Neural Inf Process Syst. 2010.

9. Wu H, Rojas J, Lin H, Harada K. Robot Introspection with Bayesian Nonparametric Vector Autoregressive Hidden Markov Models. 2017.

10. Tuncel KS, Baydogan MG. Autoregressive forests for multivariate time series modeling. Pattern Recognition. 2018; 73: 202-215.

11. Zuanetti DA, Milan LA. Second-order autoregressive hidden Markov model. Braz J Probab Stat. 2017; 31: 653-665.

12. Mari J-F, Le Ber F. Temporal and spatial data mining with second-order hidden markov models. Soft Computing. 2006; 10: 406-414.

13. Feiyang Y, Ip HH. Spatial-HMM: A new approach for semantic annotation of histological images. Pattern Recognition. 2006

14. Huang R, Kennedy C. Uncovering hidden spatial patterns by hidden markov model. Springer. 2008.

15. Spezia L. Generalized Spatial Hidden Markov Models and Remote Sensing. 2013.

16. Nguyen L. Tutorial on hidden markov model. Applied and Computational Mathematics. 2017; 6: 16-38.

17. Dugad R, Desai UB. A tutorial on hidden Markov models. Signal Processing and Artificial Neural Networks Laboratory. Technical Report No: SPANN-961. 1996.

18. Dean TA, Singh SS, Jasra A, Peters GW. Parameter estimation for hidden Markov models with intractable likelihoods. Scandinavian J Stat. 2014; 41: 970-987.

19. Lim S, Akiyama M, Frangopol DM, Jiang H. Experimental investigation of the spatial variability of the steel weight loss and corrosion cracking of reinforced concrete members: novel X-ray and digital image processing techniques. Struct Infrastruct E. 2017; 13: 118-134.

20. Gupta D, Choubey S. Discrete wavelet transform for image processing. Int J Emerging Technology and Advanced Engineering. 2015; 4: 598-602.

Table 1: HMM variants and their features.

Type of HMM  | Type of observation  | Features
HMM          | Continuous, Discrete | Gaussian HMM models continuous observations; DHMM models discrete observations.
ARHMM        | Continuous, Discrete | Models observations' dependence as an autoregressive process.
HHMM         | Continuous, Discrete | Models the hidden states' hierarchical structure via vertical, horizontal, and forced transition probabilities.
Spatial DHMM | Discrete             | Models the hidden states' dependence via horizontal and vertical transition probabilities.
