Markov chains

A stochastic process in which the conditional probability distribution of the state at any future instant, given the present state, is unaffected by any additional knowledge of the past history of the system.
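
This defining property (the Markov property) can be written out for a discrete-time chain; the notation below, with states denoted X_0, X_1, X_2, ..., is a standard formulation supplied here for illustration and is not part of the original entry:

\[
P(X_{n+1} = x \mid X_n = x_n, X_{n-1} = x_{n-1}, \dots, X_0 = x_0) = P(X_{n+1} = x \mid X_n = x_n)
\]

In words, once the present state X_n is known, the earlier states X_0, ..., X_{n-1} carry no additional information about the next state.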
