Rawia Sammout

Reputation: 221

Gaussian hidden Markov model

I am following a tutorial from this link http://www.blackarbs.com/blog/introduction-hidden-markov-models-python-networkx-sklearn/2/9/2017 in order to implement a hidden Markov model for my example. I have 2 hidden states and 2 observed states.

As I understand from the code in the tutorial, the first step in an HMM is to estimate the parameters of the model using maximum likelihood estimation, and then, from the resulting parameters, we can predict the hidden states.
So the Viterbi algorithm is used to train the model, i.e. to find the optimal parameters, and then to predict the observed states.
Is that the case? I can share my code if that would make it clearer.
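To make the question concrete, here is a minimal sketch of the workflow I have in mind, assuming hmmlearn's GaussianHMM (the library used in the tutorial) and placeholder data instead of my real observations:

```python
import numpy as np
from hmmlearn import hmm

# Placeholder observations, shaped (n_samples, n_features)
X = np.random.randn(100, 1)

# Step 1: estimate the model parameters from the observations
model = hmm.GaussianHMM(n_components=2, covariance_type="full", n_iter=100)
model.fit(X)

# Step 2: infer the most likely sequence of hidden states
hidden_states = model.predict(X)
print(hidden_states[:10])
```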

Upvotes: 0

Views: 1460

Answers (1)

Tala Warang

Reputation: 91

Actually, parameter estimation finds all of them: the starting probabilities, the transition probabilities (between hidden states), and the observation/emission probabilities (for the observed states). Together these are called the parameters of the HMM. There are at least two parameter estimation techniques/algorithms to get them all: 1. Baum-Viterbi (also called Viterbi training or Viterbi extraction) and 2. Baum-Welch.
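As a rough illustration, assuming hmmlearn's GaussianHMM (the library the linked tutorial uses) and some placeholder data: `fit()` estimates all of these parameters at once via Baum-Welch (EM), they are exposed as attributes afterwards, and Viterbi decoding of the hidden states is a separate step:

```python
import numpy as np
from hmmlearn import hmm

X = np.random.randn(200, 1)          # placeholder observation sequence

model = hmm.GaussianHMM(n_components=2, covariance_type="full", n_iter=100)
model.fit(X)                          # Baum-Welch (EM) parameter estimation

print(model.startprob_)               # starting probabilities
print(model.transmat_)                # transition probabilities (hidden states)
print(model.means_, model.covars_)    # Gaussian emission parameters

logprob, states = model.decode(X, algorithm="viterbi")  # Viterbi decoding
```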

Upvotes: 1
