
Markov Chain

A stochastic process in which the probability of the next state depends only on the present state, not on the sequence of states that came before it (the Markov property).
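
A minimal sketch of the idea in Python, using a hypothetical two-state weather model (the states and transition probabilities are illustrative, not from the entry): each step samples the next state using only the current state's transition probabilities.

```python
import random

# Hypothetical transition table: each row gives the probabilities of the
# next state given the current state (rows sum to 1).
transitions = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def next_state(current, table):
    """Sample the next state using only the current state's row."""
    states = list(table[current].keys())
    weights = list(table[current].values())
    return random.choices(states, weights=weights)[0]

# Simulate a short chain starting from "sunny".
state = "sunny"
chain = [state]
for _ in range(10):
    state = next_state(state, transitions)
    chain.append(state)

print(chain)
```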
