This process has been categorized into discrete-time Markov chains and continuous-time Markov chains. In a discrete-time Markov chain, the transition from any state to another occurs with a probability distribution that depends only on the current state; it is not affected by the sequence of events that preceded it. In a continuous-time Markov chain, the time spent in each state is a positive real value that follows an exponential distribution. In both cases, future predictions about the process rely solely on its current state and are not influenced by its historical behaviour. Fig 2 represents a continuous-time Markov chain.
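To illustrate these two properties, the following minimal sketch simulates a hypothetical two-state continuous-time Markov chain: the holding time in each state is drawn from an exponential distribution whose rate depends only on the current state, and the next state is chosen from a transition distribution that likewise depends only on the current state. The state labels, rates, and transition probabilities are assumed values chosen purely for illustration.

```python
import numpy as np

# Hypothetical two-state continuous-time Markov chain (states 0 and 1).
# exit_rates[i]   : rate of the exponential holding time in state i
# jump_probs[i,j] : probability of jumping to state j when leaving state i
exit_rates = np.array([0.5, 1.5])
jump_probs = np.array([[0.0, 1.0],
                       [1.0, 0.0]])

def simulate_ctmc(start_state, total_time, rng=None):
    """Simulate the chain up to total_time; return the (time, state) trajectory."""
    rng = rng or np.random.default_rng()
    t, state = 0.0, start_state
    times, states = [t], [state]
    while t < total_time:
        # Holding time in the current state is exponentially distributed;
        # its rate depends only on the current state (Markov property).
        t += rng.exponential(1.0 / exit_rates[state])
        # The next state is drawn from a distribution that depends only on
        # the current state, not on the path taken so far.
        state = rng.choice(len(exit_rates), p=jump_probs[state])
        times.append(t)
        states.append(state)
    return times, states

times, states = simulate_ctmc(start_state=0, total_time=10.0)
print(list(zip(np.round(times, 2), states)))
```

A discrete-time chain is simulated the same way except that the exponential holding times are dropped and the state is updated once per fixed time step.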
