markov chain (n, noun.process) markoff chain - a Markov process for which the parameter takes discrete time values;
is a kind of: markoff process, markov process
A random process in which the probability of each next state depends only on the current state, not on the sequence of states that preceded it.
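The discrete-time behavior described above can be sketched in a few lines: each step draws the next state using only the current state's transition probabilities. The two-state "weather" chain and its probabilities below are illustrative assumptions, not part of the entry.

```python
import random

def simulate_markov_chain(transition, start, steps, rng):
    """Walk a discrete-time Markov chain.

    transition maps each state to a dict of next-state probabilities;
    the next state is chosen from the current state's row alone,
    which is exactly the Markov property.
    """
    state = start
    path = [state]
    for _ in range(steps):
        row = transition[state]
        state = rng.choices(list(row), weights=list(row.values()))[0]
        path.append(state)
    return path

# Hypothetical two-state chain (made-up numbers for illustration).
weather = {
    "sunny": {"sunny": 0.9, "rainy": 0.1},
    "rainy": {"sunny": 0.5, "rainy": 0.5},
}
path = simulate_markov_chain(weather, "sunny", 10, random.Random(0))
```

A seeded `random.Random` makes the walk reproducible; `path` holds the start state plus one entry per step.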