Kamus SABDA Mobile

Found 1 definition: markov chain.

markov chain

Pos: Noun Phrase
[WORDNET DICTIONARY]

Noun markov chain has 1 sense

   markov chain (n = noun.process) markoff chain - a Markov process for which the parameter is discrete time values;
      is a kind of: markoff process, markov process
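
As a clarifying aside (a standard textbook formulation, not part of the dictionary entry), "the parameter is discrete time values" means the chain is observed at time steps n = 0, 1, 2, ... and satisfies the Markov property:

   P(X_{n+1} = x \mid X_n = x_n, X_{n-1} = x_{n-1}, \ldots, X_0 = x_0) = P(X_{n+1} = x \mid X_n = x_n)

that is, the distribution of the next state depends only on the current state.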


[CIDE DICTIONARY]

markov chain, n. [after A. A. Markov, Russian mathematician, b. 1856, d. 1922.].

   A random process (Markov process) in which the probabilities of discrete states in a series depend only on the properties of the immediately preceding state or the next preceding state, independent of the path by which the preceding state was reached. It differs from the more general Markov process in that the states of a Markov chain are discrete rather than continuous. Certain physical processes, such as diffusion of a molecule in a fluid, are modelled as a Markov chain. See also random walk. [PJC]
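
To illustrate the definition above, here is a minimal simulation sketch (an illustration added to this write-up, not part of the dictionary entry): a three-state chain resembling a bounded random walk, in which each step depends only on the current state. The state names and transition probabilities are hypothetical.

import random

# Hypothetical three-state chain, roughly a bounded random walk.
# Each row gives the probabilities of the next state given the current state.
TRANSITIONS = {
    "left":   {"left": 0.5,  "center": 0.5, "right": 0.0},
    "center": {"left": 0.25, "center": 0.5, "right": 0.25},
    "right":  {"left": 0.0,  "center": 0.5, "right": 0.5},
}

def step(state):
    """Pick the next state using only the current state (the Markov property)."""
    nxt = TRANSITIONS[state]
    return random.choices(list(nxt.keys()), weights=list(nxt.values()))[0]

def simulate(start, n_steps):
    """Run the chain for n_steps discrete time values and return the visited path."""
    path = [start]
    for _ in range(n_steps):
        path.append(step(path[-1]))
    return path

print(simulate("center", 10))

Because each next state is drawn from TRANSITIONS for the current state alone, the path by which that state was reached has no influence, which is exactly the independence the definition describes.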


[RELATED WORDS]

alpine golden chain, anchor chain, andrei markov, apparel chain, ball and chain, bicycle chain, block chain, branched chain, branched chain ketoaciduria, brequet chain, chain, chain armor, chain armour, chain fern, chain gang, chain letter, chain lightning, chain mail, chain of mountains, chain pickerel, chain pike, chain printer, chain pump, chain reaction, chain reactor, chain saw, chain stitch, chain store, chain tie, chain tongs, chain up, chain wheel, chain wrench, chemical chain, closed chain, crotch chain, daisy chain, discount chain, engineer's chain, ernst boris chain, flower chain, food chain, golden chain, gunter's chain, long chain, markoff chain, markov, markov process, mountain chain, nautical chain, open chain, paper chain, pennine chain, pull chain, restaurant chain, retail chain, safety chain, sheet chain, sir ernst boris chain, snow chain, straight chain, tire chain, virginia chain fern, watch chain