discrete-time Markov chain
English
Noun
discrete-time Markov chain (plural discrete-time Markov chains)
- (mathematics, probability theory) A sequence of random variables (a stochastic process) in which the value of the next variable depends only on the value of the current variable, and not on any variables in the past.
- Synonym: DTMC
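The defining (Markov) property above can be illustrated with a minimal sketch: a two-state chain whose next value is sampled using only the current state. All names here (`P`, `step`) are illustrative, not part of the entry.

```python
import random

# Transition matrix for a hypothetical two-state chain:
# P[i][j] is the probability of moving from state i to state j.
P = [
    [0.9, 0.1],
    [0.5, 0.5],
]

def step(state, rng):
    """Sample the next state from the current state alone (Markov property)."""
    return 0 if rng.random() < P[state][0] else 1

rng = random.Random(0)
chain = [0]
for _ in range(10):
    # The next value depends only on chain[-1], not on earlier history.
    chain.append(step(chain[-1], rng))

print(chain)
```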