All about Markov.. (Part 2)

Shashika Chamod
5 min read · Apr 24, 2020
Happy Person; photo courtesy: pexels.com

Hurrah!! You have made it to the second part of the ‘All about Markov..’ article series. If you have not gone through part 1 yet, I recommend doing so first to get the complete picture. Without further ado, let’s begin.

In part 1, I introduced two concepts: the Markov chain and the Markov property. In this article, I am going to extend the Markov chain concept further, so that both the intuition and the pictorial view stay in your mind.

Note: In the previous tutorial, the term ‘event’ was used. Please keep in mind that the term ‘state’, which I will use hereafter, has the same meaning.

If we consider a simple phenomenon with a finite number of states, like daily weather conditions (cloudy, sunny, rainy: a finite number (3) of states), we can represent all the possibilities in that phenomenon in a simple state transition diagram. Don’t worry about these terms, even if you are not familiar with them.

The state transition diagram is nothing but a small diagram showing all the possible states and the transitions between them. It is also a directed graph. I will take another simple example from our lives so that we get used to seeing day-to-day incidents as Markovian processes.

Think about the moods that you experience. Some days you are gloomy. Some days you are happy. ‘Happy’ and ‘gloomy’ are the two states. You go back and forth between these moods each day. It’s Markovian: the mood you are going to be in tomorrow depends only on the mood you are in today. Let’s represent this in a state transition diagram.

State transition diagram of ‘Day Moods’

Let’s read this picture. What, read a picture? Yes, it is the way to grasp things quickly and efficiently. The numbers represent probabilities. Now it is pretty straightforward. READ IT!

In this diagram, there are two types of components: vertices for states and edges for transition probabilities. If you are gloomy today, then there is a probability of 0.7 that tomorrow will be a gloomy day as well, and a probability of 0.3 that you will be in a happy mood. Apply the same reading to the happy state as well! !!Homework!!

What else have you noticed in the state transition diagram? The probability figures! Yes, you are correct! The sum of the probabilities out of one state to all the states, including itself, is 1. This follows from simple probability theory. These tiny things are important if you are going to dive deeper into these theories later on.

The above graphical representation can also be expressed in a more mathematical manner. We use a matrix for that: the state transition matrix.

State Transition Matrix — ‘Day Moods’

Across the rows, you can see the transition probabilities from one state to the others. Simple, isn’t it?
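If you prefer code to pictures, here is a minimal sketch of this matrix in Python with NumPy. The state ordering (index 0 = gloomy, index 1 = happy) and the variable name P are assumptions I am making for illustration:

```python
import numpy as np

# State transition matrix for the 'Day Moods' chain.
# Assumed state order: index 0 = gloomy, index 1 = happy.
# Row i holds the probabilities of moving from state i to each state.
P = np.array([
    [0.7, 0.3],  # gloomy -> gloomy, gloomy -> happy
    [0.4, 0.6],  # happy  -> gloomy, happy  -> happy
])

# Each row of a transition matrix must sum to 1.
assert np.allclose(P.sum(axis=1), 1.0)
```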

With the matrix representation, we can get into the uses of Markov chains. In a general overview, there are many applications, ranging from financial predictions and search engine algorithms to speech recognition.

Let’s try to solve a simple problem with the knowledge we currently have, to build more understanding.

Let’s use the example I just made above. It may not be important as a practical case, but it’s a simple way to step into the pool and start swimming.

What’s the problem we are trying to solve with Markov Chain?

Given the probabilities of current day moods, what will be the probabilities of your moods in 2 days' time?

Here, we need one additional piece of information: the current day’s mood probabilities.

Initial probabilities

This means there is a 0.3 probability that you are gloomy today and a 0.7 probability that you are happy. (See how the general rule that probabilities sum to 1 applies here as well.)
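As a tiny continuation of the NumPy sketch above, these initial probabilities can be stored as a row vector (the name pi_0 and the state ordering are, again, my own choices):

```python
import numpy as np

# Day 0 mood probabilities, in the assumed order [gloomy, happy].
pi_0 = np.array([0.3, 0.7])

# Probabilities must sum to 1.
assert np.isclose(pi_0.sum(), 1.0)
```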

How can we solve this with the Markov chain’s state transition matrix?

State Transitions in a Tree Diagram

The state transition tree diagram is used here to give a better understanding of the upcoming calculations. It represents the same state transition diagram in a different way.

We can calculate the probability of Day 1 being ‘Happy’ as follows. (Assume the following abbreviations: D = Day, M = Mood, G = Gloomy, H = Happy, Prob_Trans = Transition Probability.)

Prob(D=1, M=H) = Prob(D=0, M=G)*Prob_Trans(G, H) + Prob(D=0, M=H)*Prob_Trans(H, H)
= 0.3*0.3 + 0.7*0.6 = 0.51

Similarly, you can do it for the Day 1 ‘Gloomy’ situation too.

Prob(D=1, M=G) = Prob(D=0, M=G)*Prob_Trans(G, G) + Prob(D=0, M=H)*Prob_Trans(H, G)
= 0.3*0.7 + 0.7*0.4 = 0.49

Or, since there are only two states here, simply compute 1 - 0.51 = 0.49.
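Here is the same hand calculation mirrored in plain Python, written out with scalars so each term matches a branch of the tree diagram (the variable names are my own):

```python
# Day 0 probabilities.
p_g0, p_h0 = 0.3, 0.7  # Prob(D=0, M=G), Prob(D=0, M=H)

# Transition probabilities.
t_gg, t_gh = 0.7, 0.3  # Prob_Trans(G, G), Prob_Trans(G, H)
t_hg, t_hh = 0.4, 0.6  # Prob_Trans(H, G), Prob_Trans(H, H)

# Day 1 probabilities, term by term as in the tree diagram.
p_h1 = p_g0 * t_gh + p_h0 * t_hh  # 0.3*0.3 + 0.7*0.6 = 0.51
p_g1 = p_g0 * t_gg + p_h0 * t_hg  # 0.3*0.7 + 0.7*0.4 = 0.49

print(p_g1, p_h1)  # approximately 0.49 and 0.51
```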

We can simply do this entire calculation using matrices according to the following formula.

Prob. of Day N = Initial Prob * (Transition Prob)^N

We get the same answers. Whoa!! It’s good to use the simple methods once you know the intuition behind them.

This is only half of the answer; the question asked about the situation two days from now.

How can we do this? The same formula can be applied here.

Probabilities after two days

We can see that the initial probability vector has been multiplied by the state transition matrix twice. And we got the two answers: 0.547 for being gloomy and 0.453 for being happy. This can be extended to any number of days.
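In NumPy, that matrix version of the calculation is a one-liner; here is a minimal sketch using np.linalg.matrix_power (the names P and pi_0 follow the earlier snippets and are my own choices):

```python
import numpy as np

P = np.array([[0.7, 0.3],    # gloomy row
              [0.4, 0.6]])   # happy row
pi_0 = np.array([0.3, 0.7])  # Day 0: [gloomy, happy]

# Prob. of Day N = Initial Prob * (Transition Prob)^N
pi_1 = pi_0 @ np.linalg.matrix_power(P, 1)  # ~[0.49, 0.51]
pi_2 = pi_0 @ np.linalg.matrix_power(P, 2)  # ~[0.547, 0.453]
print(pi_1, pi_2)
```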

Are you HAPPY now? :D

Now it’s your turn to find a phenomenon from your own life and apply these theories to obtain probabilities or predictions. Go ahead and do that!

We have come to the end of the second article on Markov-related theories. So far, we have studied two things: the Markov property (Part 1) and Markov chains (Parts 1 and 2).

From the next part onwards, we are going to explore the rest of the Markov topics, which are more complex and need more intuition. Don’t worry. Gear up. You have done a great job so far. It will be just a walk in the park!!
