Hyper Dictionary

Meaning of MARKOV CHAIN

WordNet Dictionary
 
 Definition: [n] a Markov process for which the parameter takes discrete time values
 
 Synonyms: Markoff chain
 
 See Also: Markoff process, Markov process
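
In the usual notation, a discrete-time Markov chain is a sequence of random variables X_0, X_1, X_2, ... indexed by integer time steps, with the defining property that the next state depends only on the present state:

    P(X_{n+1} = j | X_n = i, X_{n-1} = i_{n-1}, ..., X_0 = i_0) = P(X_{n+1} = j | X_n = i)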

Computing Dictionary
 
 Definition: 

(Named after Andrei Markov) A model of sequences of events in which the probability of each event depends only on the outcome of the event immediately preceding it.

A Markov process is governed by a Markov chain.
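
For illustration, a finite Markov chain can be represented as a transition table giving the probability of each next state given only the current one. The two weather states and the probabilities in this sketch are hypothetical, chosen just to show the idea:

import random

# Hypothetical two-state "weather" chain; the states and probabilities
# are illustrative assumptions, not taken from the entry above.
# transition[current][next] = probability of moving to `next` from `current`
transition = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(current):
    """Draw the next state; it depends only on the current state."""
    states, probs = zip(*transition[current].items())
    return random.choices(states, weights=probs)[0]

# Simulate a short trajectory over discrete time steps 0, 1, 2, ...
state = "sunny"
path = [state]
for _ in range(10):
    state = step(state)
    path.append(state)
print(path)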

In simulation, the principle of the Markov chain is applied to the selection of samples from a probability density function, which are then applied to the model. Simscript II.5 uses this approach for some of its modelling functions.
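
A minimal sketch of that idea, assuming a random-walk Metropolis sampler and an illustrative standard-normal target density (this is not Simscript II.5 code): the successive samples form a Markov chain whose long-run distribution follows the target density.

import math
import random

def target_density(x):
    """Unnormalised target density; a standard normal, chosen only for illustration."""
    return math.exp(-0.5 * x * x)

def metropolis(n_samples, step_size=1.0, start=0.0):
    """Random-walk Metropolis: the successive samples form a Markov chain
    whose stationary distribution matches the target density."""
    x = start
    samples = []
    for _ in range(n_samples):
        proposal = x + random.gauss(0.0, step_size)
        # Accept the move with probability min(1, ratio of densities);
        # otherwise stay at the current point.
        if random.random() < target_density(proposal) / target_density(x):
            x = proposal
        samples.append(x)
    return samples

draws = metropolis(5000)
print(sum(draws) / len(draws))  # should be near 0 for this target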
