
Markov chains

A stochastic process in which the conditional probability distribution of the state at any future instant, given the present state, is unaffected by any additional knowledge of the system's past history. This defining condition is known as the Markov property: P(X_{n+1} = x | X_0, X_1, ..., X_n) = P(X_{n+1} = x | X_n).
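As an illustration, the following is a minimal Python sketch of a two-state Markov chain. The state names and transition probabilities are hypothetical, chosen only to demonstrate the definition; note that the next state is drawn using the current state alone.

    import random

    # Hypothetical transition probabilities; each row sums to 1.
    transitions = {
        "sunny": {"sunny": 0.9, "rainy": 0.1},
        "rainy": {"sunny": 0.5, "rainy": 0.5},
    }

    def step(state):
        # The next state depends only on the current state (Markov property).
        r = random.random()
        cumulative = 0.0
        for nxt, p in transitions[state].items():
            cumulative += p
            if r < cumulative:
                return nxt
        return nxt  # guard against floating-point rounding

    state = "sunny"
    path = [state]
    for _ in range(10):
        state = step(state)
        path.append(state)
    print(" -> ".join(path))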

