Markov Chain


A Markov chain describes a sequence of events in which the probability of each event depends only on the state reached in the previous event.

Figure-1: A two-state Markov chain

Figure-1 shows a two-state Markov chain with states A and B. The probability of moving from state A to B is 0.70, and the probability of staying in A is 0.30. Similarly, the probability of moving from state B to A is 0.20, and the probability of staying in B is 0.80. Note that the probabilities leaving each state sum to 1.
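As a concrete illustration, here is a minimal Python sketch (not part of the original post) that encodes the transition probabilities of figure-1 and simulates a short random walk over the two states. The dictionary layout, function name, and walk length are illustrative choices.

```python
import random

# Transition probabilities from figure-1.
# transitions[state] maps each possible next state to its probability.
transitions = {
    "A": {"A": 0.30, "B": 0.70},
    "B": {"A": 0.20, "B": 0.80},
}

def step(state):
    """Sample the next state using the current state's transition probabilities."""
    next_states = list(transitions[state].keys())
    weights = list(transitions[state].values())
    return random.choices(next_states, weights=weights)[0]

# Simulate a short walk starting in state A.
state = "A"
walk = [state]
for _ in range(10):
    state = step(state)
    walk.append(state)

print(" -> ".join(walk))  # e.g. A -> B -> B -> A -> B -> ...
```

Because each call to `step` looks only at the current state, the simulation depends on nothing but the previous event, which is exactly the Markov property described above.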


One comment on “Markov Chain”

Markov chains are really incredible and have so many applications! One of the best ways to mess around with them is to create a statistical model of English text and try to finish sentences (like fill-in-the-blanks).
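In the spirit of the commenter's suggestion, here is a small sketch (my own, not the commenter's code) of a word-level Markov model of text: each word is a state, and the next word is sampled from the words observed to follow it in a corpus. The example corpus, function names, and sentence length are illustrative.

```python
import random
from collections import defaultdict

def build_model(text):
    """Map each word to the list of words that follow it in the text."""
    words = text.split()
    model = defaultdict(list)
    for current_word, next_word in zip(words, words[1:]):
        model[current_word].append(next_word)
    return model

def finish_sentence(model, start, length=8):
    """Extend a starting word by repeatedly sampling an observed next word."""
    sentence = [start]
    for _ in range(length):
        followers = model.get(sentence[-1])
        if not followers:
            break
        sentence.append(random.choice(followers))
    return " ".join(sentence)

corpus = "the cat sat on the mat and the cat slept on the sofa"
model = build_model(corpus)
print(finish_sentence(model, "the"))  # e.g. "the cat slept on the mat and the cat"
```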
