Yahoo奇摩字典 (Yahoo Dictionary)

  1. Markov chain

    • IPA[ˈmärˌkôf]

    American

    • n.
      a stochastic model describing a sequence of possible events in which the probability of each event depends only on the state attained in the previous event.
    • noun: Markov model, plural noun: Markov models

    Oxford Dictionary
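
The defining property above (each event's probability depends only on the state attained in the previous event) can be sketched in a few lines. This is a minimal illustration, not from the dictionary entry: the states and transition probabilities below are invented for the example.

```python
import random

# Hypothetical two-state weather chain: the next state is sampled
# using ONLY the current state (the Markov property from the definition).
transitions = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def next_state(state, rng):
    """Sample the successor of `state`; no earlier history is consulted."""
    states, weights = zip(*transitions[state])
    return rng.choices(states, weights=weights, k=1)[0]

def walk(start, steps, seed=0):
    """Generate a chain of `steps` transitions from `start` (seeded for repeatability)."""
    rng = random.Random(seed)
    chain = [start]
    for _ in range(steps):
        chain.append(next_state(chain[-1], rng))
    return chain

print(walk("sunny", 5))
```

Because each step reads only `chain[-1]`, the sequence of possible events matches the dictionary's description of a stochastic model with single-step dependence.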