21-03-2012, 04:39 PM
Markov Chains
Tutorial05.ppt (Size: 322.5 KB / Downloads: 185)
Markov Process
Markov Property: The state of the system at time t+1 depends only on the state of the system at time t
Stationary Assumption: Transition probabilities are independent of time (t)
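A minimal sketch of both assumptions, using a hypothetical 2-state chain (the matrix values are illustrative, not from the slides): the transition matrix P is the same at every step (stationary assumption), and the next-state distribution is computed from the current distribution alone (Markov property).

```python
# Hypothetical 2-state chain. P[i][j] is the probability of moving
# from state i to state j; the matrix does not change with t
# (stationary assumption).
P = [[0.8, 0.2],
     [0.4, 0.6]]

def step(dist, P):
    """One Markov step: the new distribution depends only on `dist`,
    not on any earlier history (Markov property)."""
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

dist = [1.0, 0.0]        # start in state 0 with certainty
for _ in range(3):
    dist = step(dist, P)
print(dist)              # distribution over states after 3 steps
```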
Markov Process: Gambler’s Example
Gambler starts with $10
- At each play we have one of the following:
• Gambler wins $1 with probability p
• Gambler loses $1 with probability 1-p
– Game ends when gambler goes broke, or gains a fortune of $100
(Both 0 and 100 are absorbing states)
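The gambler's example above can be sketched as a Monte Carlo simulation. The numbers ($10 start, $1 stakes, absorbing states at $0 and $100) follow the slide; the simulation code itself is our own illustration, not part of the tutorial.

```python
import random

def play(start=10, goal=100, p=0.5, rng=random.random):
    """Play one game: win $1 with probability p, lose $1 otherwise,
    until an absorbing state (0 or goal) is reached."""
    money = start
    while 0 < money < goal:
        money += 1 if rng() < p else -1
    return money             # either 0 (broke) or goal (fortune)

random.seed(0)
runs = [play() for _ in range(200)]
broke = sum(1 for m in runs if m == 0)
print(f"went broke in {broke}/200 games")
```

For a fair game (p = 0.5), the classic gambler's-ruin result says the chance of reaching $100 from $10 is 10/100, so roughly 90% of simulated games should end broke.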
Markov process - described by a stochastic finite-state machine (FSM)
Markov chain - a random walk on this graph
(a distribution over paths)
Edge weights give us the transition probabilities: the weight on the edge from state i to state j is the probability of moving from i to j in one step
We can ask more complex questions, like the probability of being in a given state after t steps
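The random-walk view can be sketched directly: sample a path by following edges of a stochastic FSM, and multiply the edge weights along the path to get that path's probability. The 3-state matrix below is a hypothetical example, not taken from the slides.

```python
import random

# Hypothetical 3-state stochastic FSM. P[i][j] is the weight of
# the edge i -> j, i.e. the one-step transition probability.
P = [[0.5, 0.5, 0.0],
     [0.2, 0.3, 0.5],
     [0.0, 0.4, 0.6]]

def sample_walk(P, start, steps, rng):
    """Sample one random walk on the graph: at each step, pick the
    next state according to the current state's edge weights."""
    path = [start]
    for _ in range(steps):
        r, cum = rng.random(), 0.0
        for j, w in enumerate(P[path[-1]]):
            cum += w
            if r < cum:
                path.append(j)
                break
    return path

def path_probability(P, path):
    """The chain assigns each path the product of its edge weights."""
    prob = 1.0
    for i, j in zip(path, path[1:]):
        prob *= P[i][j]
    return prob

rng = random.Random(1)
walk = sample_walk(P, start=0, steps=5, rng=rng)
print(walk, path_probability(P, walk))
```

Summing path probabilities over all paths that end in a given state (equivalently, iterating the one-step update t times) answers questions like "what is the probability of being in state X after t steps?".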