discrete-time Markov chain

English

Noun

discrete-time Markov chain (plural discrete-time Markov chains)

  1. (mathematics, probability theory) A sequence of random variables, known as a stochastic process, in which the value of the next variable depends only on the value of the current variable, and not on any earlier variable.
    Synonym: DTMC
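    The defining (Markov) property can be stated in a standard form: for a chain X_0, X_1, X_2, ... over a discrete state space,

      P(X_{n+1} = x | X_n = x_n, ..., X_0 = x_0) = P(X_{n+1} = x | X_n = x_n)

    As a minimal illustrative sketch (not part of the definition itself), the following Python snippet simulates such a chain; the two-state "sunny"/"rainy" model and its transition probabilities are hypothetical:

      import random

      # Hypothetical two-state chain; states and probabilities are illustrative.
      states = ["sunny", "rainy"]
      # transition[i][j] = P(next = states[j] | current = states[i])
      transition = [
          [0.9, 0.1],  # from "sunny"
          [0.5, 0.5],  # from "rainy"
      ]

      def step(current: int) -> int:
          # Sample the next state index from the current one only;
          # no earlier history is consulted (the Markov property).
          return random.choices([0, 1], weights=transition[current])[0]

      current = 0  # start in "sunny"
      for _ in range(10):
          current = step(current)
          print(states[current])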