Kylee Holcomb

Reputation: 27

Hidden Markov Model Coin & Dice Example with Prolog

I got the specific problem from here and wanted to implement it in cplint, as I am currently learning the principles of ProbLog,

using the above HMM as an example

So from the above model we have:

• A red die, having six sides, labeled 1 through 6.

• A green die, having twelve sides, five of which are labeled 2 through 6, while the remaining seven sides are labeled 1.

• A weighted red coin, for which the probability of heads is 0.9 and the probability of tails is 0.1.

• A weighted green coin, for which the probability of heads is 0.95 and the probability of tails is 0.05.

As a solution, I want to create a sequence of numbers from the set {1, 2, 3, 4, 5, 6} with the following rules:

• Begin by rolling the red die and writing down the number that comes up, which is the emission/observation.

• Toss the red coin and do one of the following:

➢ If the result is heads, roll the red die and write down the result.

➢ If the result is tails, roll the green die and write down the result.

• At each subsequent step, you flip the coin that has the same color as the die you rolled in the previous step. If the coin comes up heads, roll the same die as in the previous step. If the coin comes up tails, switch to the other die.
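The rules above describe the generative process directly; as a minimal Python sketch of it (the function names `roll` and `simulate` are mine, not from the original post):

```python
import random

# P(heads) for the coin matching each die's colour:
# heads keeps the same die, tails switches to the other one.
STAY_PROB = {"red": 0.9, "green": 0.95}

def roll(die):
    """Roll one die: the red die is fair; on the green die seven of the
    twelve sides show 1 and the remaining five show 2..6."""
    if die == "red":
        return random.randint(1, 6)
    return random.choice([1] * 7 + [2, 3, 4, 5, 6])

def simulate(steps):
    """Generate (states, observations) following the rules above."""
    die = "red"                    # the process always starts with the red die
    states, observations = [], []
    for _ in range(steps):
        states.append(die)
        observations.append(roll(die))
        if random.random() >= STAY_PROB[die]:   # tails: switch dice
            die = "green" if die == "red" else "red"
    return states, observations

states, obs = simulate(10)
```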

My state diagram for this model has two states, red and green, as shown in the figure. In addition, this figure shows: 1) the state-transition probability matrix A, 2) the discrete emission/observation probability matrix B, and 3) the initial (prior) probability vector π. The model is not hidden because you know the sequence of states from the colors of the coins and dice. Suppose, however, that someone else is generating the emissions/observations without showing you the dice or the coins. All you see is the sequence of emissions/observations. If you start seeing more 1s than other numbers, you might suspect that the model is in the green state, but you cannot be sure because you cannot see the color of the die being rolled.

Consider the Hidden Markov Model (HMM) M = (A, B, π). Assuming an observation sequence O = <1, 1, 2, 2, 3, 6, 1, 1, 1, 3>, what is the probability that the hidden sequence is

H = <RC, GC, GC, RC, RC, GC, GC, GC, GC, GC>

where RC and GC stand for Red Coin and Green Coin, respectively. Use cplint or ProbLog to calculate the probability that the model M generated the sequence O. That is, calculate the probability

P(H|O) = P(<RC, GC, GC, RC, RC, GC, GC, GC, GC, GC> | <1, 1, 2, 2, 3, 6, 1, 1, 1, 3>)
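As a numerical cross-check, P(H|O) = P(H,O)/P(O) can also be computed directly; here is a minimal Python sketch using the forward algorithm, assuming (as the rules imply) that the chain starts deterministically in the red state:

```python
# States: 0 = red, 1 = green
pi = [1.0, 0.0]                       # the first roll always uses the red die
A = [[0.9, 0.1],                      # red coin:  heads 0.9 keeps the red die
     [0.05, 0.95]]                    # green coin: heads 0.95 keeps the green die
B = [{o: 1/6 for o in range(1, 7)},                            # fair red die
     {1: 7/12, 2: 1/12, 3: 1/12, 4: 1/12, 5: 1/12, 6: 1/12}]  # green die

O = [1, 1, 2, 2, 3, 6, 1, 1, 1, 3]
H = [0, 1, 1, 0, 0, 1, 1, 1, 1, 1]    # RC,GC,GC,RC,RC,GC,GC,GC,GC,GC

def joint(H, O):
    """P(H, O): product of transition and emission probabilities along H."""
    p = pi[H[0]] * B[H[0]][O[0]]
    for t in range(1, len(O)):
        p *= A[H[t - 1]][H[t]] * B[H[t]][O[t]]
    return p

def forward(O):
    """P(O): sum over all hidden sequences via the forward recursion."""
    alpha = [pi[s] * B[s][O[0]] for s in (0, 1)]
    for o in O[1:]:
        alpha = [sum(alpha[r] * A[r][s] for r in (0, 1)) * B[s][o]
                 for s in (0, 1)]
    return sum(alpha)

posterior = joint(H, O) / forward(O)   # P(H | O)
```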

So far I have tried two approaches. Approach 1:

:- use_module(library(pita)).

:- if(current_predicate(use_rendering/1)).
:- use_rendering(c3).
:- use_rendering(graphviz).
:- endif.

:- pita.

:- begin_lpad.

% hmm(O): O is a sequence of observations emitted by the HMM
hmm(O):- hmm1(_,O).

% hmm1(S,O): O is emitted while the HMM follows state sequence S
hmm1(S,O):- hmm(q1,[],S,O).

% hmm(Q,S0,S,O): from state Q, with visited states S0, emit O ending with states S
hmm(end,S,S,[]).

hmm(Q,S0,S,[L|O]):-
    Q \= end,
    next_state(Q,Q1,S0),
    letter(Q,L,S0),
    hmm(Q1,[Q|S0],S,O).


next_state(q1,q1,S):0.9;
next_state(q1,q2,S):0.1.

next_state(q2,q1,S):0.05;
next_state(q2,q2,S):0.95.


letter(q1,rd1,S):1/6;
letter(q1,rd2,S):1/6;
letter(q1,rd3,S):1/6;
letter(q1,rd4,S):1/6;
letter(q1,rd5,S):1/6;
letter(q1,rd6,S):1/6.

letter(q2,gd1,S):7/12;
letter(q2,gd2,S):1/12;
letter(q2,gd3,S):1/12;
letter(q2,gd4,S):1/12;
letter(q2,gd5,S):1/12;
letter(q2,gd6,S):1/12.


:- end_lpad.

state_diagram(digraph(G)):-
    findall(edge(A -> B,[label=P]),
      (clause(next_state(A,B,_,_,_),
        (get_var_n(_,_,_,_,Probs,_),equalityc(_,_,N,_))),
        nth0(N,Probs,P)),
      G).

with which I create the state diagram.

The second one is this, in which I just create the two coins and dice. I don't know how to continue from this. The first one is adapted from a specific cplint example. I cannot find any other forum dedicated to this kind of task; ProbLog seems to be "dead".

:- use_module(library(pita)).

:- if(current_predicate(use_rendering/1)).
:- use_rendering(c3).
:- endif.

:- pita.

:- begin_lpad.

heads(rc): 0.9 ; tails(rc): 0.1 :- toss(rc).
heads(gc): 0.95 ; tails(gc): 0.05 :- toss(gc).

% red die: fair, sides 1..6
rd(0,1):1/6 ; rd(0,2):1/6 ; rd(0,3):1/6 ; rd(0,4):1/6 ; rd(0,5):1/6 ; rd(0,6):1/6.
rd(X,1):1/6 ; rd(X,2):1/6 ; rd(X,3):1/6 ; rd(X,4):1/6 ; rd(X,5):1/6 ; rd(X,6):1/6 :-
    X1 is X-1, X1>=0,
    rd(X1,_),
    \+ rd(X1,6).

% green die: seven of the twelve sides show 1, the rest show 2..6
gd(0,1):7/12 ; gd(0,2):1/12 ; gd(0,3):1/12 ; gd(0,4):1/12 ; gd(0,5):1/12 ; gd(0,6):1/12.
gd(X,1):7/12 ; gd(X,2):1/12 ; gd(X,3):1/12 ; gd(X,4):1/12 ; gd(X,5):1/12 ; gd(X,6):1/12 :-
    X1 is X-1, X1>=0,
    gd(X1,_),
    \+ gd(X1,12).

toss(rc).
toss(gc).

:- end_lpad.

Upvotes: 1

Views: 538

Answers (2)

VincentD

Reputation: 11

Not sure if this is still useful but in ProbLog you could have tried something like this:

%% Probabilities
1/6::red_die(1,T) ; 1/6::red_die(2,T) ; 1/6::red_die(3,T) ;
1/6::red_die(4,T) ; 1/6::red_die(5,T) ; 1/6::red_die(6,T).

7/12::green_die(1,T) ; 1/12::green_die(2,T) ; 1/12::green_die(3,T) ; 
1/12::green_die(4,T) ; 1/12::green_die(5,T) ; 1/12::green_die(6,T).

0.9::red_coin_head(T).
0.95::green_coin_head(T).

%% Rules

% Start with tossing red
toss_red(1).
% Toss red if previous toss was red, head.
toss_red(T) :- T > 1, Tprev is T - 1, toss_red(Tprev), red_coin_head(Tprev).
% Toss red if previous toss was green but tails.
toss_red(T) :- T > 1, Tprev is T - 1, toss_green(Tprev), \+green_coin_head(Tprev).
% Toss green if previous toss was green, head.
toss_green(T) :- T > 1, Tprev is T - 1, toss_green(Tprev), green_coin_head(Tprev).
% Toss green if previous toss was red but tails.
toss_green(T) :- T > 1, Tprev is T - 1, toss_red(Tprev), \+red_coin_head(Tprev).

% Observation at step T comes from the red die when toss T is red.
results([X],1) :- red_die(X,1), toss_red(1).
results([X|Y],T) :- T > 1, Tprev is T - 1, red_die(X,T), toss_red(T), results(Y,Tprev).
% Observation at step T comes from the green die when toss T is green.
results([X|Y],T) :- T > 1, Tprev is T - 1, green_die(X,T), toss_green(T), results(Y,Tprev).


results(X) :- length(X, Length), results(X,Length).

% Query
query_state :-
    toss_red(1),
    toss_green(2),
    toss_green(3),
    toss_red(4),
    toss_red(5),
    toss_green(6),
    toss_green(7),
    toss_green(8),
    toss_green(9),
    toss_green(10).

evidence(results([1,1,2,2,3,6,1,1,1,3])).
query(query_state).

According to this, it has a probability of 0.00011567338.

Upvotes: 1

Kylee Holcomb

Reputation: 27

hmmpos.pl from here seems useful enough to continue.

Upvotes: 0
