
Markov chain definition
As we said, a Markov chain is a mathematical model of a random phenomenon that evolves over time in such a way that the past influences the future only through the present. In other words, it is a stochastic model describing a sequence of possible events in which the probability of each event depends only on the state attained in the previous event. Markov chains, therefore, have the property of memorylessness.
Let's consider a random process described by a sequence of random variables, $X = X_0, \ldots, X_n$, which can assume values in a set of states, $j_0, j_1, \ldots, j_n$. We will say that it has the Markov property if the evolution of the process depends on the past only through the present, that is, through the state we have reached after $n$ steps. This can be defined as follows:

$$P(X_{n+1} = j \mid X_0 = j_0, X_1 = j_1, \ldots, X_n = i) = P(X_{n+1} = j \mid X_n = i)$$
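To make the memorylessness idea concrete, here is a minimal Python sketch; the two-state weather chain and its probabilities are illustrative assumptions, not taken from the text. The function that draws the next state receives only the current state, so the earlier history is irrelevant by construction:

```python
import random

# Hypothetical two-state chain: each state maps to the
# distribution of the next state given the current one.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def next_state(current):
    """Draw the next state using only the current state (Markov property)."""
    states = list(TRANSITIONS[current])
    probs = [TRANSITIONS[current][s] for s in states]
    return random.choices(states, weights=probs, k=1)[0]

# The states visited before 'state' never enter the computation.
state = "sunny"
path = [state]
for _ in range(10):
    state = next_state(state)
    path.append(state)
print(path)
```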
This relation must hold for all values of the parameters for which the conditional probabilities are well defined. A discrete-time stochastic process $X$ that has the Markov property is said to be a Markov chain. A Markov chain is said to be homogeneous if the following transition probabilities do not depend on $n$, but only on $i$ and $j$:

$$p_{ij} = P(X_{n+1} = j \mid X_n = i)$$
When this happens, the formula can be rewritten as follows:

$$P(X_{n+1} = j \mid X_0 = j_0, X_1 = j_1, \ldots, X_n = i) = p_{ij}$$
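As a sketch of what homogeneity looks like in practice (the matrix values below are illustrative assumptions), the transition probabilities can be collected in a matrix $P$ whose row $i$ holds the values $p_{ij}$, and the same matrix is applied at every step, since $p_{ij}$ does not depend on $n$:

```python
import numpy as np

# Illustrative transition matrix for states 0 and 1:
# row i holds the probabilities p_ij of moving from i to j.
P = np.array([[0.8, 0.2],
              [0.4, 0.6]])

rng = np.random.default_rng(42)

def simulate(start, n_steps):
    """Simulate a homogeneous chain: the same P is reused at every step."""
    state = start
    states = [state]
    for _ in range(n_steps):
        state = rng.choice(len(P), p=P[state])  # same P, regardless of n
        states.append(int(state))
    return states

print(simulate(start=0, n_steps=10))
```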
Given this, we can calculate all the joint probabilities by knowing the numbers $p_{ij}$, plus the following initial distribution:

$$\pi_i = P(X_0 = i)$$

The joint probability of a path then factorizes into the initial probability times a product of one-step transition probabilities:

$$P(X_0 = j_0, X_1 = j_1, \ldots, X_n = j_n) = \pi_{j_0} \, p_{j_0 j_1} \, p_{j_1 j_2} \cdots p_{j_{n-1} j_n}$$
This probability is called the distribution of the process at time zero. The $p_{ij}$ probabilities are called transition probabilities; specifically, $p_{ij}$ is the probability of a transition from state $i$ to state $j$ in a single time step.
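A short sketch of this computation (the initial distribution and the matrix values are illustrative assumptions): the probability of observing an entire path is the initial probability of the first state multiplied by the one-step transition probabilities along the path, exactly as in the factorization above.

```python
import numpy as np

# Illustrative initial distribution pi_i = P(X0 = i) and transition matrix.
pi = np.array([0.5, 0.5])
P = np.array([[0.8, 0.2],
              [0.4, 0.6]])

def path_probability(path):
    """P(X0=j0, ..., Xn=jn) = pi[j0] * p[j0,j1] * ... * p[j_{n-1},j_n]."""
    prob = pi[path[0]]
    for i, j in zip(path, path[1:]):
        prob *= P[i, j]
    return prob

# Probability of observing the path 0 -> 0 -> 1 -> 1:
print(path_probability([0, 0, 1, 1]))  # 0.5 * 0.8 * 0.2 * 0.6 = 0.048
```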