Search results
Markov chain
- Pronunciation (American): [ˈmärˌkôf]
- a stochastic model describing a sequence of possible events in which the probability of each event depends only on the state attained in the previous event.
noun: Markov model, plural noun: Markov models
- Related terms
Oxford Dictionary
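The defining property above, that the probability of each event depends only on the state attained in the previous event, can be sketched as a small simulation. The two-state weather chain, its transition probabilities, and the function names below are illustrative assumptions, not part of the dictionary entry:

```python
import random

# Illustrative two-state Markov chain (assumed example, not from the entry):
# each state maps to a list of (next_state, probability) pairs.
TRANSITIONS = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def next_state(current, rng):
    """Sample the next state; note it depends only on `current`
    (the Markov property), not on any earlier history."""
    states, weights = zip(*TRANSITIONS[current])
    return rng.choices(states, weights=weights)[0]

def simulate(start, steps, seed=0):
    """Generate a chain of `steps` transitions from `start`."""
    rng = random.Random(seed)
    chain = [start]
    for _ in range(steps):
        chain.append(next_state(chain[-1], rng))
    return chain

print(simulate("sunny", 5))
```

Because each call to `next_state` reads only the current state, the whole history before it is irrelevant to the next draw, which is exactly the property the definition describes.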