Reputation: 25
Are the Markov chain method and conditional probability related? If they are, please explain the relation between them.
Upvotes: 2
Views: 118
Reputation: 1150
Markov chains and conditional probabilities answer different questions, but they are related.
In a Markov chain, we look at a system with states and state transitions. Events trigger state transitions, and the probability of an event may depend on the state the system is currently in - and this is where conditional probabilities come into play.
Let's look at the following example to get a grasp of conditional probabilities first.
Conditional probability is defined as:
P(A|B) := P(A AND B) / P(B)
In words: given that event B has already occurred, how likely is it that event A occurs?
Example: balls in a box
We have a box of balls that are (R)ed or (B)lue and (L)ight or (H)eavy. Every ball is either heavy or light, and either red or blue:
Balls | Light | Heavy | Total
------+-------+-------+------
Red   |    10 |    20 |    30
Blue  |    30 |    40 |    70
Total |    40 |    60 |   100
The probabilities P(X) of picking a ball of type X - where X is (R)ed, (B)lue, (L)ight, (H)eavy, Red and Light (RL), Red and Heavy (RH), and so on - are the following:
Event |  N  | Total |  P
------+-----+-------+-----
R     |  30 |   100 | 0.3
B     |  70 |   100 | 0.7
L     |  40 |   100 | 0.4
H     |  60 |   100 | 0.6
RL    |  10 |   100 | 0.1
RH    |  20 |   100 | 0.2
BL    |  30 |   100 | 0.3
BH    |  40 |   100 | 0.4
We talk about conditional probabilities when we encounter a question like: what is the probability that the ball is blue, given that we picked a heavy ball?
P(B|H) = P(B AND H) / P(H) = #BH / #H = 40 / 60 = 2/3
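Here is a minimal Python sketch that computes these numbers directly from the counts in the table above (the variable and function names are my own, not part of the original example):

import random  # not needed here, just the standard library

# Counts from the table: (colour, weight) -> number of balls.
counts = {
    ("Red", "Light"): 10, ("Red", "Heavy"): 20,
    ("Blue", "Light"): 30, ("Blue", "Heavy"): 40,
}
total = sum(counts.values())  # 100 balls in the box

def p(predicate):
    """Probability of picking a ball satisfying `predicate`."""
    return sum(n for key, n in counts.items() if predicate(key)) / total

def p_given(predicate_a, predicate_b):
    """Conditional probability P(A|B) = P(A AND B) / P(B)."""
    p_a_and_b = p(lambda k: predicate_a(k) and predicate_b(k))
    return p_a_and_b / p(predicate_b)

# P(Blue | Heavy) = 40 / 60 = 2/3
print(p_given(lambda k: k[0] == "Blue", lambda k: k[1] == "Heavy"))  # 0.666...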
Markov chains are a bit different. For an example with Markov chains, we need a slightly different experiment.
Imagine a setup with two boxes: one holding the (L)ight balls (10 red, 30 blue) and one holding the (H)eavy balls (20 red, 40 blue).
Experiment:
Pick N balls, one at a time, putting each ball back into its box afterwards.
Start at box (L).
If a (B)lue ball is picked, pick the next ball from the (H)eavy box.
If a (R)ed ball is picked, pick the next ball from the (L)ight box.
Question: how likely is it that the nth ball is heavy?
When dealing with Markov chains, we first build a state machine. State (L) means we are picking from the box with the light balls, state (H) that we are picking from the box with the heavy balls. The outcome of picking a ball leads to a transition either to the same state or to the other one. Transitions are labelled with the colour {R, B} that triggers them and their probability in brackets:
        +-----+       R(2/6)       +-----+
        |     | <----------------- |     |
 .----> |  L  |                    |  H  | <----.
 |      |     |       B(3/4)       |     |      |
 '----- |     | -----------------> |     | -----'
 R(1/4) +-----+                    +-----+ B(4/6)
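To make the state machine concrete, here is a small simulation sketch in Python (the box contents come from the first table; the function names and number of trials are my own choices) that estimates how likely it is to be at the heavy box after many steps:

import random

# Box contents from the first table:
# box L holds 10 red and 30 blue balls, box H holds 20 red and 40 blue balls.
boxes = {
    "L": {"R": 10, "B": 30},
    "H": {"R": 20, "B": 40},
}

def pick_colour(box):
    """Draw a ball colour from the given box (with replacement)."""
    colours, weights = zip(*boxes[box].items())
    return random.choices(colours, weights=weights)[0]

def final_box(n_steps):
    """Start at box L; a blue ball sends us to box H, a red ball to box L."""
    box = "L"
    for _ in range(n_steps):
        colour = pick_colour(box)
        box = "H" if colour == "B" else "L"
    return box

# Estimate the probability of being at the heavy box after 1000 steps.
trials = 10_000
heavy = sum(final_box(1000) == "H" for _ in range(trials))
print(heavy / trials)  # roughly 0.69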
Now we can express the state as a vector and all transitions with their probabilities as a matrix. After one step (N=1) we will be in the following state:
            N
| 1/4  2/6 |     | 1 |     | 1/4 |     | L |
|          |  x  |   |  =  |     |  =  |   |
| 3/4  4/6 |     | 0 |     | 3/4 |     | H |
So the likelihood of being in state L is 1/4 and of being in state H is 3/4. For N = 1000 we just apply the transition matrix 1000 times, which is the same as raising the matrix to its 1000th power and applying it to the start vector. After 1000 steps the probability of being in state L is ~0.31 and in state H ~0.69.
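This is easy to check numerically, for example with a short numpy sketch (my choice of tooling, not part of the original answer):

import numpy as np

# Columns are the current state (L, H); rows are the next state (L, H).
# The entries are exactly the conditional probabilities from the first table.
P = np.array([[1/4, 2/6],
              [3/4, 4/6]])

start = np.array([1.0, 0.0])  # we start in state L

for n in (1, 1000):
    state = np.linalg.matrix_power(P, n) @ start
    print(n, state)
# 1    [0.25 0.75]
# 1000 [0.30769231 0.69230769]   -> ~0.31 and ~0.69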
Notes:
By design, the entries of the matrix are the conditional probabilities from the first problem.
The nth power of the matrix converges as n grows, and therefore so do the probabilities of being in a certain state after infinitely many steps.
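The limiting probabilities can also be read off directly as the eigenvector of the transition matrix for eigenvalue 1, normalised to sum to 1. A minimal numpy sketch (again, the tooling is my assumption):

import numpy as np

P = np.array([[1/4, 2/6],
              [3/4, 4/6]])

# Eigenvector for the eigenvalue closest to 1, normalised to sum to 1.
eigvals, eigvecs = np.linalg.eig(P)
v = eigvecs[:, np.argmin(np.abs(eigvals - 1.0))].real
print(v / v.sum())  # [0.30769231 0.69230769] -> 4/13 and 9/13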
Upvotes: 2
Reputation: 31109
In probability theory, conditional probability is a measure of the probability of an event given that (by assumption, presumption, assertion or evidence) another event has occurred.
The conditional probability of A given B is usually written as P(A|B).
Now let's turn to Markov chains:
A process satisfies the Markov property if one can make predictions for the future of the process based solely on its present state, just as well as one could knowing the process's full history; i.e., conditional on the present state of the system, its future and past are independent.
Now define the variables:
A - the predicted (next) state of the process
B - the current state of the process
Even though the future and the past are independent given the present, the prediction still depends on the current state as its condition. So we can write it as P(A|B), which is exactly a conditional probability: the probability of the predicted state A given the current state B.
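To tie this back to the first answer's example: in a Markov chain the transition matrix entry is precisely the conditional probability P(next state | current state), and it does not change if we additionally condition on earlier states. A small simulation sketch (reusing the transition probabilities from the first answer; the variable names and path length are my own) illustrates this:

import random

# Transition probabilities from the first answer, keyed by the current state.
P = {"L": {"L": 1/4, "H": 3/4},   # from L: stay with 1/4, move to H with 3/4
     "H": {"L": 2/6, "H": 4/6}}   # from H: move to L with 2/6, stay with 4/6

def step(state):
    return random.choices(["L", "H"], weights=[P[state]["L"], P[state]["H"]])[0]

# Generate a long sample path of the chain.
path = ["L"]
for _ in range(200_000):
    path.append(step(path[-1]))

# Estimate P(next = H | current = L), optionally conditioned on the previous
# state as well - the Markov property says the extra condition changes nothing.
def estimate(prev_state=None):
    hits = total = 0
    for prev, cur, nxt in zip(path, path[1:], path[2:]):
        if cur == "L" and (prev_state is None or prev == prev_state):
            total += 1
            hits += nxt == "H"
    return hits / total

print(estimate())      # ~0.75
print(estimate("L"))   # ~0.75 as well
print(estimate("H"))   # ~0.75 as well: the past adds no information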
Upvotes: 1