All about Markov.. (Part 1)
Markov Property, Markov Chain, Hidden Markov Model (HMM), Markov Random Field (MRF), Markov Chain Monte Carlo (MCMC).

If you are involved in statistics, AI, or a similar field, you may have heard various terms like Markov Property, Markov Chain, Hidden Markov Model (HMM), Markov Random Field (MRF), Markov Chain Monte Carlo (MCMC), etc. A bunch of ‘Markov’s, isn’t it? Are they the same, and if not, what does each term stand for? There are plenty of deep papers and articles on these topics with a lot of mathematical formulas, but in this article I am going to illustrate them in a simpler and more intuitive manner.
Before going into the topics, let’s take a quick look at where all of these concepts began. Andrey Markov was the great mind behind them. He was a Russian mathematician and one of the most reputable statisticians to date. His early 20th-century work on modeling stochastic processes is admired even more today than it was in his own time. One may say that much of the development we experience today draws heavily on his theories; in other words, the world would not be this developed had his theories not been available.
Gear up! Here we go.
I assume that readers have an understanding of basic probability theory, statistics, and mathematics. If not, please get familiar with them first.
The Markov chain is a good starting point for describing all these concepts one by one. Let’s begin there.
Markov Chain
A Markov Chain is literally a chain, or a sequence, that models some phenomenon. What kind of chain is it? It is a chain of stochastic events. Oops! Too complex!! There is nothing to be worried about. Stochastic simply means random, so it is a chain of random events. The terms chain and event give some feeling of time, am I right? An event is something that happens at a particular time instance. How about this explanation: ‘a time sequence of random events’.
Ex: The daily weather report of a particular area.
Day 1- Rainy, Day 2- Cloudy, Day 3- Cloudy, Day 4- Sunny ….
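To make this concrete, here is a minimal Python sketch that represents such a time sequence of random events. The state names and the sequence are just the toy values from the example above, not real weather data.

```python
# A Markov chain lives on a finite set of states.
states = ["Rainy", "Cloudy", "Sunny"]

# The chain itself is simply a time-ordered sequence of those states,
# e.g. the daily weather observations from the example above.
weather_sequence = ["Rainy", "Cloudy", "Cloudy", "Sunny"]

for day, weather in enumerate(weather_sequence, start=1):
    print(f"Day {day} - {weather}")
```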
Is that all? No.
We didn’t define the term ‘Markov’ in Markov chains.
The events in the Markov Chain have the Markov property associated with them. That’s why we call it a Markov Chain.
Markov Property
To describe the Markov property, we need some understanding of conditional probability. If you are not familiar with it, please get acquainted with it first, as I mentioned earlier.
We say a sequence of events has the Markov property when the next event of the sequence depends only on the current event and not on any of the past events.
Too long!
An example will make it clear. Let’s use the aforementioned weather example.
Weather sequence: …, Rainy, Cloudy, Cloudy, Sunny, Cloudy, Sunny, Rainy, …
If today is a sunny day, then the probability that tomorrow is a cloudy day depends only on today’s weather condition, which is sunny. Tomorrow being a cloudy day is independent of the entire past sequence (…, Rainy, Cloudy, Cloudy) up to today.
That is why the Markov property is called a memoryless property.
I think you now have a clear idea about the Markov property. It is mathematically represented as follows.

P(X_{t+1} = x | X_t = x_t) = P(X_{t+1} = x | X_1 = x_1, X_2 = x_2, …, X_t = x_t)

In words: the probability of the state (event) at time t+1 given the state at time t is the same as the probability of the state at time t+1 given all the previous states.
Now we can define a Markov chain: a sequence of random Markovian events (events with the Markov property).
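To tie the two ideas together, here is a minimal Python sketch of a Markov chain over the weather states. The transition probabilities are made-up illustrative numbers, not real data; the point is that the next state is drawn using only the current state, which is exactly the memoryless property.

```python
import random

# Hypothetical transition probabilities (illustrative values, not real data).
# transitions[current_state][next_state] = P(next_state | current_state)
transitions = {
    "Sunny":  {"Sunny": 0.6, "Cloudy": 0.3, "Rainy": 0.1},
    "Cloudy": {"Sunny": 0.3, "Cloudy": 0.4, "Rainy": 0.3},
    "Rainy":  {"Sunny": 0.2, "Cloudy": 0.5, "Rainy": 0.3},
}

def next_state(current):
    """Sample tomorrow's weather using only today's weather (the Markov property)."""
    options = list(transitions[current].keys())
    weights = list(transitions[current].values())
    return random.choices(options, weights=weights)[0]

def simulate_chain(start, days):
    """Generate a Markov chain: a time sequence of random, Markovian events."""
    chain = [start]
    for _ in range(days - 1):
        chain.append(next_state(chain[-1]))  # depends only on the last state
    return chain

print(simulate_chain("Sunny", days=7))
# e.g. ['Sunny', 'Sunny', 'Cloudy', 'Rainy', 'Cloudy', 'Cloudy', 'Sunny']
```

Notice that next_state never looks at anything except the current state; that single line of code is the Markov property in action.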
Hurray! We have come to the end of the first part of ‘All about Markov’. Hopefully, this simple description helped you understand the two concepts, the Markov Chain and the Markov Property, intuitively and clearly.
In the next article, I will describe how to represent a Markov chain as a state diagram and cover the rest of the topics.