Across
- 4. A distribution that does not change as the process evolves
- 6. The likelihood of a state change occurring
- 8. The movement from one state to another in a Markov model
- 9. The observable output associated with a hidden state
- 10. A type of Markov model where states are not directly visible
Down
- 1. The property that future states depend only on the current state
- 2. A table used to represent state transition probabilities
- 3. A process involving random variables and probabilities
- 5. A sequence of states linked by transitions in a Markov process
- 7. A specific condition or situation at a point in a Markov process
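The terms these clues define fit together in a single small model. A minimal sketch, assuming a hypothetical two-state weather chain (the states, probabilities, and emission table below are illustrative inventions, not part of the puzzle):

```python
import random

# Transition matrix (a table of state transition probabilities):
# rows are the current state, columns the next state.
P = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.5, "rainy": 0.5},
}

# In a hidden Markov model the states above are not directly visible;
# each hidden state instead produces an observable output (an emission).
EMIT = {
    "sunny": {"walk": 0.9, "umbrella": 0.1},
    "rainy": {"walk": 0.3, "umbrella": 0.7},
}

def step(state, rng):
    """One transition. Markov property: the next state depends
    only on the current state, not on the earlier history."""
    r = rng.random()
    cumulative = 0.0
    for nxt, p in P[state].items():
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point rounding

def stationary(P, iterations=1000):
    """Power iteration: the stationary distribution is the one
    that does not change as the process evolves under P."""
    states = list(P)
    dist = {s: 1.0 / len(states) for s in states}
    for _ in range(iterations):
        dist = {t: sum(dist[s] * P[s][t] for s in states) for t in states}
    return dist

pi = stationary(P)
# pi is invariant: applying P to it once more returns the same distribution.
```

For this particular (assumed) matrix the stationary distribution works out to 5/7 sunny and 2/7 rainy, which you can verify by checking that applying `P` to `pi` leaves it unchanged.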
